[conspire] 100,000, bc(1), & miscellaneous

paulz at ieee.org
Sat May 30 11:14:43 PDT 2020


The idea behind exponential growth is that all previous cases can still infect.  That math is exact for compound interest, and pretty close for cells in a Petri dish.  With COVID-19, it no longer holds: some of the past cases are dead, and most have developed antibodies and are no longer carrying the virus.
Immunology uses more complex models to estimate the active cases based on various assumptions.  We currently have a rough balance between new cases added and existing cases becoming inactive.

But if we let our guard down, those active cases would multiply at a new exponential rate and we likely wouldn't detect it for 2-3 weeks as people develop symptoms and call their doctors.

    On Thursday, May 28, 2020, 1:34:10 AM PDT, tom r lopes <tomrlopes at gmail.com> wrote:  
 
If you are trying to see how many days to 1,000,000 deaths (assuming exponential growth), there is an easier way.
Using your notation: b is the exponential rate per day, and after 111 days we have 100,000 deaths.
so (*)  b^111 = 100,000
100,000 is 10^5 and 1,000,000 is 10^6 
Raise both sides of equation (*) to the 6/5 power:
b^(111*6/5) = 100,000^(6/5) = (10^5)^(6/5) = 10^6 = 1,000,000
111*6/5 = 133.2 
133 days.  So 3 weeks from today (June 18th)
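A quick sanity check of that arithmetic with bc(1) (the second result is only approximate, since bc -l truncates at 20 decimal places):
$ echo '111*6/5' | bc -l
133.20000000000000000000
$ echo 'e((6/5)*l(100000))' | bc -l
(prints approximately 1000000, i.e. 100,000^(6/5))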
Thomas

---------- Forwarded message ----------
From: Michael Paoli <Michael.Paoli at cal.berkeley.edu>
To: conspire at linuxmafia.com
Cc: 
Bcc: 
Date: Thu, 28 May 2020 00:41:34 -0700
Subject: [conspire] 100,000, bc(1), & miscellaneous
2020-05-27 100,000 US dead; less than 4 months ago we were at 1 US dead
The rate may not have been consistent (variations in
lockdown/shelter-in-place/reopening/...), and in theory the growth is
more logistic than exponential, but we're still far from saturation /
herd immunity, so if we approximate with an exponential, and make the
gross approximation of a consistent rate, we have ...

First US confirmed fatality ...
was earlier thought to be 2020-02-28 in Seattle, WA, but autopsies
later confirmed the first was:
2020-02-06 in Santa Clara County, CA

So ...
$ echo \($(TZ=GMT0 date +%s -d '2020-05-27T12:00:00+00:00') - \
$(TZ=GMT0 date +%s -d '2020-02-06T12:00:00+00:00')\)/3600/24 |
bc -l
111.00000000000000000000
111 days ago, or
111 days from 2020-02-06 (1 death) to 2020-05-27 (100,000 deaths)

So, ... we'll make the (gross) approximation of consistent
exponential growth rate throughout.

So, then what per-day growth rate?
b^111=100,000 = e^(111*ln(b))
b=100,000^(1/111) (or 111th root of 100,000) = e^((1/111)*ln(100,000))
  ~=
1.10928986489522280772, e.g.: $ echo 'e((1/111)*l(100000))' | bc -l
Remember, exponentials get big/small fast unless the base is 1:
if the base is >1 they grow, if the base is <1 they shrink.
So, at ~1.11, we go from 1 to 100,000 in 111 days.
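As a quick sanity check, raising b back to the 111th power should recover
100,000 (up to the last digits, given bc's truncated scale), e.g.:
$ b='e((1/111)*l(100000))'
$ echo "e(111*l($b))" | bc -l
(prints approximately 100000)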

If we presume the same exponential, when would we hit 1,000,000?
b^x=1,000,000
x=log-base-b of 1,000,000 = ln(1,000,000)/ln(b) =
ln(1,000,000)/ln(e^((1/111)*ln(100,000))) =
ln(1,000,000)/((1/111)*ln(100,000)) ~=
133.20000000000000014454, e.g.:
$ echo 'l(1000000)/((1/111)*l(100000))' | bc -l
So, 10x growth in ~22 (133-111) days
$ TZ=GMT0 date -I -d '2020-02-06T12:00:00+00:00 + 133 days'
2020-06-18
So,
2020-06-18 for 1,000,000 (gross approximation of continuous unchanged
exponential)

And when would we hit 10,000,000, or 100,000,000 ...?
The model would break down by/before then; a logistic curve would be a
much more appropriate fit (and even that with the gross approximation
of no factors changing along the way).
Exponential growth doesn't take into account those who are immune / no
longer vulnerable to infection, nor any limit on the pool available to
be infected, whereas logistic growth does (see the sketch after the
list below).
Becoming no longer vulnerable to infection (at least to a sufficient
model approximation) happens in one of two ways:
o immune (vaccine, infected and recovered ... "immune" not necessarily
   permanent immunity, but "long enough" to cover the period under
   examination).
o deceased (not the best way to get removed from pool of vulnerable to
   infection)
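For comparison, here's a minimal logistic sketch in bc(1), using the same
per-day rate as above but with a purely hypothetical pool/cap of 1,000,000
vulnerable people (k and n0 below are made-up illustration values, not
fitted to anything):
$ bc -lq
define n(t){
   auto k, n0, r
   /* logistic: n(t) = k/(1 + ((k-n0)/n0)*e(-r*t)) */
   k = 1000000            /* hypothetical cap: size of the vulnerable pool */
   n0 = 1                 /* starting count */
   r = (1/111)*l(100000)  /* same per-day rate as above; r = l(b) */
   return k/(1 + ((k-n0)/n0)*e(-r*t))
}
n(111)
quit
$
n(111) prints roughly 90909: with ~10% of the (made-up) pool already
reached, the logistic count already lags the pure exponential's 100,000,
and as t grows it levels off toward k instead of continuing to multiply.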

So, if we've got ~10x growth in ~22 days, then, again with our crude
gross-approximation exponential modeling (presuming a consistent rate
of spread), how many days does it take for, e.g., 2x (doubling), 10x,
100x, 1024x, ... growth?
Let's call x our multiplier (e.g. the 2x, 10x, etc. growth factor).
Let's use d for the number of days.
From our earlier calculation, we have the base (call it b) of the
exponent, our daily growth factor (b^1 = our 1-day multiplier):
b=100,000^(1/111) (or 111th root of 100,000) = e^((1/111)*ln(100,000))
  ~=
1.10928986489522280772, e.g.: $ echo 'e((1/111)*l(100000))' | bc -l
We'll use (relatively) standard notation, and convert to bc(1)
syntax/format:
b=e((1/111)*l(100000))
b^d=x
d=log-base-b of x
d=ln(x)/ln(b)
d=l(x)/l(b)
And a wee bit o' shell:
b='e((1/111)*l(100000))'
for x in 2 10 100 1024 1048576; do
   echo -n "x=$x d="
   echo "l($x)/l($b)" | bc -l
done
And we have:
x=2 d=6.68286590374038254157
x=10 d=22.20000000000000002616
x=100 d=44.40000000000000005242
x=1024 d=66.82865903740382541642
x=1048576 d=133.65731807480765083285
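Spot check of the first row: with b still set as in the loop above,
raising it to that doubling time should give back (approximately) 2:
$ echo "e(6.68286590374038254157*l($b))" | bc -l
(prints approximately 2)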

"Of course" ... "reality" ... the base continually changes, and may be
quite region/locality/country/... specific, altered by factors such as:
o physical/social distancing
o shelter-in-place / lockdown / reopenings / social and other
   gatherings
o hygiene and other relevant practices, especially as regards
   SARS-CoV-2 --> COVID-19 infection/spread pathways
Anyway, the further and longer the base can be pushed below 1, the
quicker the infections get to 0 (a quick illustration follows below).
Until then, it spreads/grows, though eventually the immunity(/death)
numbers become so large that the model departs substantially from
exponential and is better approximated by a logistic curve.
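E.g., with a purely made-up base of .95 (active cases shrinking ~5% per
day), the number of days for the count to fall 1000-fold:
$ echo 'l(1/1000)/l(.95)' | bc -l
(prints roughly 134.7, i.e. about four and a half months of shrinking)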

A bit more on bc(1) ... want a handy power (x^y) function in bc(1)?
bc(1) is relatively minimal in some ways: it gives one the basic
functions needed to define most other functions/capabilities that may
be needed.  E.g. it has natural logarithm/exponentiation, but no other
bases, as those can be derived from natural base e.  Likewise trig: it
has sine and cosine functions, but no tangent function, as tangent can
be calculated from sine and cosine (sketches of both follow the power
example below).
So, arbitrary base to arbitrary exponent:
x^y = e^(y*ln(x))
So, we can define a function, call it p (for power), in bc:
define p(x,y){
   return e(y*l(x))
}
e.g.:
$ bc -lq
define p(x,y){
   return e(y*l(x))
}
p(2,10)
1023.99999999999999992594
p(144,1/2)
11.99999999999999999988
quit
$
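In the same spirit, minimal sketches of the tangent and base-10
logarithm functions mentioned above (the names t and g are arbitrary
choices here, not anything built into bc):
$ bc -lq
define t(x){
   return s(x)/c(x)
}
define g(x){
   return l(x)/l(10)
}
t(a(1))
g(1000)
quit
$
(t(a(1)) prints approximately 1, the tangent of pi/4, and g(1000)
approximately 3; the trailing digits depend on bc's truncated scale.)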
bc(1) also handles essentially arbitrary precision, and other number
bases, quite well.  E.g.:
$ bc -lq
scale=66
4*a(1)
3.141592653589793238462643383279502884197169399375105820974944592304
obase=16
4*a(1)
3.243F6A8885A308D313198A2E03707344A4093822299F31D0082EFA3
ibase=2
110001011001001101101000010110100110000000000001011100000010011100000000
C593685A6001702700
quit
$
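And e.g., going the other way (decimal in, binary out):
$ echo 'obase=2; 100000' | bc -l
11000011010100000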

references/excerpts:
http://linuxmafia.com/pipermail/conspire/2020-May/010792.html
http://linuxmafia.com/pipermail/conspire/2020-March/010315.html
bc(1)
sh(1)
https://www.youtube.com/playlist?list=PLIOESHELJOCnqaaUqq7AzTOGp-k-2KzKY

And now for a slightly cheerier note:
France Musique: Le Boléro de Ravel, performed by the Orchestre national de France
https://youtu.be/Sj4pE_bgRQI



_______________________________________________
conspire mailing list
conspire at linuxmafia.com
http://linuxmafia.com/mailman/listinfo/conspire


