<html><head></head><body><div class="ydpeba615beyahoo-style-wrap" style="font-family:Helvetica Neue, Helvetica, Arial, sans-serif;font-size:13px;"><div></div>
        <div dir="ltr" data-setdir="false">An exponential curve is probably not the ideal model now.  <br></div><div dir="ltr" data-setdir="false"><br></div><div dir="ltr" data-setdir="false">The math of an exponential curve assumes that all past cases still contribute to growth.  That is exact for compound interest and a close fit for bacteria in a Petri dish.</div><div dir="ltr" data-setdir="false"><br></div><div dir="ltr" data-setdir="false">With COVID-19, cases more than a couple of weeks old are no longer carriers.  Epidemiologists have more complex models that estimate how many cases are active.  <br></div><div dir="ltr" data-setdir="false"><br></div><div dir="ltr" data-setdir="false">Now we have efforts to track cases and isolate them before they spread.<br></div><div dir="ltr" data-setdir="false"><br></div><div dir="ltr" data-setdir="false">The data presented by Michael shows that the death count grew by 100,000:1 from the beginning of February to the end of May.  <br></div><div dir="ltr" data-setdir="false">Since then it has grown by about 16%.</div><div dir="ltr" data-setdir="false"><br></div><div dir="ltr" data-setdir="false">If the growth really were exponential over the last couple of weeks, the doubling time would be several months.  Three weeks of data isn't enough to compute that time accurately.</div><div dir="ltr" data-setdir="false"><br></div><div dir="ltr" data-setdir="false">In any event, the re-opening process will add more variables to the mix.  And it takes 2-3 weeks for a change in policy to show up as a change in the rate of new cases.<br></div><div dir="ltr" data-setdir="false"><br></div><div dir="ltr" data-setdir="false"><br></div><div><br></div>
        
        </div><div id="ydped5829fyahoo_quoted_3344204663" class="ydped5829fyahoo_quoted">
            <div style="font-family:'Helvetica Neue', Helvetica, Arial, sans-serif;font-size:13px;color:#26282a;">
                
                <div>
                    On Thursday, June 18, 2020, 3:08:12 AM PDT, Michael Paoli &lt;michael.paoli@cal.berkeley.edu&gt; wrote:
                </div>
                <div><br></div>
                <div><br></div>
                <div>Good news, bad news.<br clear="none"><br clear="none">So, let's look some more at US numbers,<br clear="none">official COVID-19 deaths.  Sure, probably not 100% accurate, probably<br clear="none">undercounted to some fair/significant percentage, but regardless, let's<br clear="none">work with those numbers, as an at least rough/crude approximation.<br clear="none"><br clear="none">Good news ... doesn't look like 1,000,000 dead by 2020-06-18.<br clear="none"><br clear="none">Bad news - the numbers continue to go up.<br clear="none"><br clear="none">Let's look a bit closer.<br clear="none"><br clear="none">So, from earlier:<br clear="none"><a shape="rect" href="http://linuxmafia.com/pipermail/conspire/2020-May/010796.html" rel="nofollow" target="_blank">http://linuxmafia.com/pipermail/conspire/2020-May/010796.html</a><br clear="none">we have date and US COVID-19 total deaths:<br clear="none">2020-02-06 1<br clear="none">2020-05-27 100,000<br clear="none">And most current:<br clear="none">2020-06-17 115,980<br clear="none"><br clear="none">So, what's that give us - for a rough/gross approximation for ...<br clear="none">exponential bases covering these periods?:<br clear="none">period<br clear="none">2020-02-06--2020-05-27<br clear="none">2020-05-27--2020-06-17<br clear="none">2020-02-06--2020-06-17<br clear="none"><br clear="none">Ddiff(){<br clear="none">   echo \($(TZ=GMT0 date +%s -d "$1"'T12:00:00+00:00') - \<br clear="none">   $(TZ=GMT0 date +%s -d "$2"'T12:00:00+00:00')\)/3600/24 |<br clear="none">   bc -l | sed -e 's/\.00*$//'<br clear="none">}<br clear="none"><br clear="none">Ddiff 2020-05-27 2020-02-06<br clear="none">111<br clear="none">Ddiff 2020-06-18 2020-05-27<br clear="none">22<br clear="none">Ddiff 2020-06-18 2020-02-06<br clear="none">133<br clear="none"><br clear="none">period                 days (increment)<br clear="none">2020-02-06--2020-05-27 111<br clear="none">2020-05-27--2020-06-17  22<br 
clear="none">2020-02-06--2020-06-17 133<br clear="none"><br clear="none">So ... how far into future (at what dates) to hit these numbers?:<br clear="none"><br clear="none">c='200000     500000     1000000    2000000'<br clear="none">echo "base         $c"<br clear="none">Dpredictor(){<br clear="none">   (<br clear="none">     d2="$1"<br clear="none">     d1="$2"<br clear="none">     c2="$3"<br clear="none">     c1="$4"<br clear="none">     # future c #s to predict for:<br clear="none">     set -- $c<br clear="none">     # elapsed days:<br clear="none">     d=$(<br clear="none">       echo \($(TZ=GMT0 date +%s -d "$d2"'T12:00:00+00:00') - \<br clear="none">       $(TZ=GMT0 date +%s -d "$d1"'T12:00:00+00:00')\)/3600/24 |<br clear="none">       bc -l | sed -e 's/\.00*$//'<br clear="none">     )<br clear="none">     b=$(<br clear="none">       # leading TABs in the actual code, NOT spaces<br clear="none">       bc -l <<- __EOT__<br clear="none">         scale=50<br clear="none">         define p(x,y){<br clear="none">           return e(y*l(x))<br clear="none">         }<br clear="none">         p(($c2/$c1),(1/$d))<br clear="none">       __EOT__<br clear="none">     )<br clear="none">     printf "%1.10f" $b<br clear="none">     for c3; do<br clear="none">       dp=$(<br clear="none">         # leading TABs in the actual code, NOT spaces<br clear="none">         bc -l <<- __EOT__<br clear="none">           scale=50<br clear="none">           define p(x,y){<br clear="none">             return e(y*l(x))<br clear="none">           }<br clear="none">           l($c3/$c1)/l($b)<br clear="none">         __EOT__<br clear="none">       )<br clear="none">       # round:<br clear="none">       case "$dp" in<br clear="none">         *.[5-9]*)<br clear="none">           dp=$(echo "$dp" | sed -e 's/\.[0-9]*$//')<br clear="none">           dp=$(expr "$dp" + 1)<br clear="none">         ;;<br clear="none">         *)<br clear="none">           dp=$(echo "$dp" | sed -e 
's/\.[0-9]*$//')<br clear="none">         ;;<br clear="none">       esac<br clear="none">       echo -n " $(date -I -d "${d1}T12:00:00-07:30 + $dp days")"<br clear="none">     done<br clear="none">     echo<br clear="none">   )<br clear="none">}<br clear="none"><br clear="none">Dpredictor 2020-05-27 2020-02-06 100000 1<br clear="none">Dpredictor 2020-06-17 2020-05-27 115980 100000<br clear="none">Dpredictor 2020-06-17 2020-02-06 115980 1<br clear="none"><br clear="none">base         200000     500000     1000000    2000000<br clear="none">1.1092898649 2020-06-03 2020-06-12 2020-06-18 2020-06-25<br clear="none">1.0070843848 2020-09-02 2021-01-10 2021-04-18 2021-07-25<br clear="none">1.0923618862 2020-06-23 2020-07-04 2020-07-11 2020-07-19<br clear="none"><br clear="none">So ... first of all, we have our various base numbers,<br clear="none">based on our data, and crudely presuming same base rate throughout<br clear="none">each period.  We see higher base for earlier period, lowest for most<br clear="none">recent, and something between for period covering the full range.<br clear="none">Reality is likely somewhere between the most recent period,<br clear="none">and overall range, so excluding the other, that would leave us with:<br clear="none">base         200000     500000     1000000    2000000<br clear="none">1.0070843848 2020-09-02 2021-01-10 2021-04-18 2021-07-25<br clear="none">1.0923618862 2020-06-23 2020-07-04 2020-07-11 2020-07-19<br clear="none"><br clear="none">Most recent would almost certainly be artificially low on the base,<br clear="none">as the mortality numbers lag behind current spreading - probably by<br clear="none">roughly 2 weeks (probably somewhere between 5 and 15 days, anyway),<br clear="none">as it covers much of shelter-in-place/lockdown close to its maximum/peak<br clear="none">use, before things started opening up more - and particularly when<br clear="none">also considering the spread --> death lag factor.<br clear="none"><br 
clear="none">Likewise, overall period likely shows somewhat artificially high rate,<br clear="none">as it covers much of period before any lockdown / shelter-in-place.<br clear="none"><br clear="none">And this still doesn't cover exponential --> logistic curve - as the<br clear="none">latter becomes more accurate as the no longer vulnerable to being<br clear="none">infected (immune, or dead) numbers go up.  Realizing too, "immune"<br clear="none">may be partial or temporary - but "enough" and "long enough" to<br clear="none">take as effectively so over the periods examined.  But we're<br clear="none">still just showing simple exponential here, not logistic.<br clear="none"><br clear="none">So, if we drop out the period that's only the earlier, ...<br clear="none">base         200000     500000     1000000    2000000<br clear="none">1.0070843848 2020-09-02 2021-01-10 2021-04-18 2021-07-25<br clear="none">1.0923618862 2020-06-23 2020-07-04 2020-07-11 2020-07-19<br clear="none">It looks like we're still quite tracking towards ominous<br clear="none">milestones of 200,000, 500,000, etc. - though for the<br clear="none">latter ones, the model eventually breaks down (need to go<br clear="none">to logistic, or data greatly changes when safe effective<br clear="none">vaccine becomes widely available and deployed).<br clear="none">So ... 
between Summer, late Summer, early next year and<br clear="none">into Spring next year, we'll much better know what<br clear="none">numbers we're tracking on and towards.<br clear="none"><br clear="none">Some folks may quite think we're done with this virus,<br clear="none">but alas, it's definitely not done with us.<br clear="none"><br clear="none">references/excerpts:<br clear="none"><a shape="rect" href="http://linuxmafia.com/pipermail/conspire/2020-May/010796.html" rel="nofollow" target="_blank">http://linuxmafia.com/pipermail/conspire/2020-May/010796.html</a><br clear="none"><a shape="rect" href="https://en.wikipedia.org/wiki/COVID-19_pandemic_deaths" rel="nofollow" target="_blank">https://en.wikipedia.org/wiki/COVID-19_pandemic_deaths</a><br clear="none"><br clear="none"><div class="ydped5829fyqt0135605495" id="ydped5829fyqtfd38573"><br clear="none">> From: "Michael Paoli" <<a shape="rect" href="mailto:Michael.Paoli@cal.berkeley.edu" rel="nofollow" target="_blank">Michael.Paoli@cal.berkeley.edu</a>><br clear="none">> Subject: 100,000, bc(1), & miscellaneous<br clear="none">> Date: Thu, 28 May 2020 00:41:34 -0700<br clear="none"><br clear="none">> 2020-05-27 100,000 US dead; less than 4 months ago we were at 1 US dead<br clear="none">> Rate may not have been consistent (variations in<br clear="none">> lockdown/shelter-in-place/reopening/...), and theoretically more<br clear="none">> logistic than exponential, but still far from saturation / herd<br clear="none">> immunity, so if we approximate using exponential,<br clear="none">> and do gross approximation of consistent rate, we have ...<br clear="none">><br clear="none">> First US confirmed fatality ...<br clear="none">> was earlier thought to be 2020-02-28 in Seattle, WA, but (autopsies)<br clear="none">> later confirmed first was:<br clear="none">> 2020-02-06 in Santa Clara County, CA<br clear="none">><br clear="none">> So ...<br clear="none">> $ echo \($(TZ=GMT0 date +%s -d '2020-05-27T12:00:00+00:00') 
- \<br clear="none">> $(TZ=GMT0 date +%s -d '2020-02-06T12:00:00+00:00')\)/3600/24 |<br clear="none">> bc -l<br clear="none">> 111.00000000000000000000<br clear="none">> 111 days ago - or<br clear="none">> 111 days from 2020-02-06 (1) to 2020-05-27 (100,000)<br clear="none">><br clear="none">> So, ... we'll make the (gross) approximation of consistent<br clear="none">> exponential growth rate throughout.<br clear="none">><br clear="none">> So, then what per-day growth rate?<br clear="none">> b^111=100,000 = e^(111*ln(b))<br clear="none">> b=100,000^(1/111) (or 111th root of 100,000) = e^((1/111)*ln(100,000))<br clear="none">>  ~=<br clear="none">> 1.10928986489522280772, e.g.: $ echo 'e((1/111)*l(100000))' | bc -l<br clear="none">> Remember, exponentials get big/small fast - unless the base is 1.<br clear="none">> If the base is &gt;1 they grow; if the base is &lt;1 they shrink.<br clear="none">> So, at ~1.11, we go from 1 to 100,000 in 111 days<br clear="none">><br clear="none">> If we presume same exponential, when would we hit 1,000,000?<br clear="none">> b^x=1,000,000<br clear="none">> x=log-base-b of 1,000,000 = ln(1,000,000)/ln(b) =<br clear="none">> ln(1,000,000)/ln(e^((1/111)*ln(100,000))) =<br clear="none">> ln(1,000,000)/((1/111)*ln(100,000)) ~=<br clear="none">> 133.20000000000000014454, e.g.:<br clear="none">> $ echo 'l(1000000)/((1/111)*l(100000))' | bc -l<br clear="none">> So, 10x growth in ~22 (133-111) days<br clear="none">> $ TZ=GMT0 date -I -d '2020-02-06T12:00:00+00:00 + 133 days'<br clear="none">> So,<br clear="none">> 2020-06-18 for 1,000,000 (gross approximation of continuous unchanged<br clear="none">> exponential)<br clear="none">><br clear="none">> And, when for 10,000,000, and 100,000,000 ...?<br clear="none">> The model would break down by/before then, a logistic curve would be a<br clear="none">> much more appropriate fit (and even then, if we do gross approximation<br clear="none">> of no factors changing along the way).<br
clear="none">> Exponential doesn't take into account immune / no longer vulnerable<br clear="none">> to infection or any limits on pool available to be infected, whereas<br clear="none">> logistic does.<br clear="none">> Essentially no longer vulnerable to infection (at least to sufficient<br clear="none">> model approximation) happens one of two ways:<br clear="none">> o immune (vaccine, infected and recovered ... "immune" not necessarily<br clear="none">>   permanent immunity, but "long enough" to cover the period under<br clear="none">>   examination).<br clear="none">> o deceased (not the best way to get removed from pool of vulnerable to<br clear="none">>   infection)<br clear="none">><br clear="none">> So, if we've got ~10x growth in ~22 days, and again, our crude gross<br clear="none">> approximation exponential modeling (presuming consistent rate of<br clear="none">> spread), how many days for, e.g. 2x (doubling), 10x, 100x, 1024x,<br clear="none">> ...?<br clear="none">> Let's call x our multiplier (e.g. for 2x, 10x, etc. 
growth factor).<br clear="none">> Let's use d for number of days.<br clear="none">> From our earlier, we have base (call it b) of exponent,<br clear="none">> for our daily growth factor (b^1 = our 1 day multiplier)<br clear="none">> b=100,000^(1/111) (or 111th root of 100,000) = e^((1/111)*ln(100,000))<br clear="none">>  ~=<br clear="none">> 1.10928986489522280772, e.g.: $ echo 'e((1/111)*l(100000))' | bc -l<br clear="none">> We'll use (relatively) standard notation, and convert to bc(1)<br clear="none">> syntax/format:<br clear="none">> b=e((1/111)*l(100000))<br clear="none">> b^d=x<br clear="none">> d=log-base-b of x<br clear="none">> d=ln(x)/ln(b)<br clear="none">> d=l(x)/l(b)<br clear="none">> And, wee bit 'o shell:<br clear="none">> b='e((1/111)*l(100000))'<br clear="none">> for x in 2 10 100 1024 1048576; do<br clear="none">>   echo -n "x=$x d="<br clear="none">>   echo "l($x)/l($b)" | bc -l<br clear="none">> done<br clear="none">> And we have:<br clear="none">> x=2 d=6.68286590374038254157<br clear="none">> x=10 d=22.20000000000000002616<br clear="none">> x=100 d=44.40000000000000005242<br clear="none">> x=1024 d=66.82865903740382541642<br clear="none">> x=1048576 d=133.65731807480765083285<br clear="none">><br clear="none">> "Of course" ... "reality" ... the base continually changes, and may be<br clear="none">> quite region/locality/country/... specific, altered by factors such as:<br clear="none">> o physical/social distancing<br clear="none">> o shelter-in-place / lockdown / reopenings / social and other<br clear="none">>   gatherings<br clear="none">> o hygiene and other relevant practices, especially as regards<br clear="none">>   SARS-CoV-2 --> COVID-19 infection/spread pathways<br clear="none">> Anyway, the more and longer the base can be pushed and further pushed<br clear="none">> below 1, the quicker the infections get to 0.  
Until then, it<br clear="none">> spreads/grows, notwithstanding immunity(/deaths) numbers becoming so<br clear="none">> large the model starts to substantially differ from exponential and<br clear="none">> better approximates logistic.<br clear="none">><br clear="none">> Bit more on bc(1) ... want a handy power (x^y) function in bc(1)?<br clear="none">> bc(1) goes relatively minimal in some ways, it give one the basic<br clear="none">> necessary functions needed to define most other functions/capabilities<br clear="none">> that may be needed.  E.g. it has natural logarithm/exponentiation, but<br clear="none">> no other bases, as those can be derived from natural base e.<br clear="none">> Likewise trig, it has sine and cosine functions, but no tangent<br clear="none">> function, as tangent can be calculated from sine and cosine.<br clear="none">> So, arbitrary base to arbitrary exponent:<br clear="none">> x^y = e^(y*ln(x))<br clear="none">> So, we can define a function, call it p (for power) in bc:<br clear="none">> define p(x,y){<br clear="none">>   e(y*l(x))<br clear="none">> }<br clear="none">> e.g.:<br clear="none">> $ bc -lq<br clear="none">> define p(x,y){<br clear="none">>   return e(y*l(x))<br clear="none">> }<br clear="none">> p(2,10)<br clear="none">> 1023.99999999999999992594<br clear="none">> p(144,1/2)<br clear="none">> 11.99999999999999999988<br clear="none">> quit<br clear="none">> $<br clear="none">> bc(1) also well handles relatively arbitrary precision, and other<br clear="none">> number bases.  
E.g.:<br clear="none">> $ bc -lq<br clear="none">> scale=66<br clear="none">> 4*a(1)<br clear="none">> 3.141592653589793238462643383279502884197169399375105820974944592304<br clear="none">> obase=16<br clear="none">> 4*a(1)<br clear="none">> 3.243F6A8885A308D313198A2E03707344A4093822299F31D0082EFA3<br clear="none">> ibase=2<br clear="none">> 110001011001001101101000010110100110000000000001011100000010011100000000<br clear="none">> C593685A6001702700<br clear="none">> quit<br clear="none">> $<br clear="none">><br clear="none">> references/excerpts:<br clear="none">> <a shape="rect" href="http://linuxmafia.com/pipermail/conspire/2020-May/010792.html" rel="nofollow" target="_blank">http://linuxmafia.com/pipermail/conspire/2020-May/010792.html</a><br clear="none">> <a shape="rect" href="http://linuxmafia.com/pipermail/conspire/2020-March/010315.html" rel="nofollow" target="_blank">http://linuxmafia.com/pipermail/conspire/2020-March/010315.html</a><br clear="none">> bc(1)<br clear="none">> sh(1)<br clear="none">> <a shape="rect" href="https://www.youtube.com/playlist?list=PLIOESHELJOCnqaaUqq7AzTOGp-k-2KzKY" rel="nofollow" target="_blank">https://www.youtube.com/playlist?list=PLIOESHELJOCnqaaUqq7AzTOGp-k-2KzKY</a><br clear="none">><br clear="none">> And now for slightly cheerier note(s):<br clear="none">> France Musique Le Boléro de Ravel par l'Orchestre national de France en<br clear="none">> <a shape="rect" href="https://youtu.be/Sj4pE_bgRQI" rel="nofollow" target="_blank">https://youtu.be/Sj4pE_bgRQI</a><br clear="none"><br clear="none"><br clear="none">_______________________________________________<br clear="none">conspire mailing list<br clear="none"><a shape="rect" href="mailto:conspire@linuxmafia.com" rel="nofollow" target="_blank">conspire@linuxmafia.com</a><br clear="none"><a shape="rect" href="http://linuxmafia.com/mailman/listinfo/conspire" rel="nofollow" target="_blank">http://linuxmafia.com/mailman/listinfo/conspire</a><br clear="none"></div></div>
            </div>
        </div></body></html>