[conspire] Order to Compel Apple to Assist With SB Shooter Unlock

Rick Moen rick at linuxmafia.com
Sat Feb 20 18:14:00 PST 2016


I wrote:

> They might already know all about what was on Farook's iPhone 5c, and
> only be trying to gain the right to compel technology firms as a goal
> in itself.  There are any number of other ways to break into such
> embedded devices -- the common theme among these and other specialists in
> creating security compromise being the use of lateral thinking.  It's
> dumb to (metaphorically) batter down the armoured door, and instead you
> look for the unlatched window.

One popular workaround:  'If the device's data are encrypted, maybe the
data in the backups aren't.'  But:

> FYI, starting with iOS 8, most userspace applications on an iPhone have
> written all their back-end (disk) stored data strongly encrypted using a
> symmetric AES cipher whose key is a combination of a 256-bit key (the
> 'UID') and the user's 4-digit numeric passcode.  [RM: No longer 4-digit.
> See correction below.]  Details here:
> http://www.darthnull.org/2014/10/06/ios-encryption
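
As a purely illustrative sketch of that sort of scheme -- this is not
Apple's actual construction, whose UID entanglement happens inside
dedicated hardware, and the Python (pyca/cryptography) names below are
just stand-ins -- deriving a data key from a device-bound secret plus
the passcode looks roughly like:

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

    # Hypothetical stand-in for the device's fused, non-extractable UID.
    device_uid = os.urandom(32)     # 256-bit device-bound secret
    passcode = b"271828"            # the user's numeric passcode

    # Derive a 256-bit AES key from the passcode, salted with the device
    # secret.  (Apple uses its own hardware-entangled KDF; PBKDF2 here
    # only shows the shape of the idea.)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=device_uid, iterations=200_000)
    aes_key = kdf.derive(passcode)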

In this design, the strength of the implementation resides, in part, in
the 'UID' being available only on the device itself, during device
runtime.  It's believed to be extremely difficult to extract.  So, how
is encryption maintained during off-device backups?  Answer:  It isn't.
iOS devices occasionally get backed up to Apple's iCloud service --
unencrypted.  In fact, Syed Farook's iPhone, issued by the San
Bernardino County Department of Health, would have done a fresh backup
to iCloud automatically following the December 2nd mass shooting, if
investigators had just put it near Farook's workplace WiFi network for
a while, left it alone, and let it do a plaintext backup.  The FBI
wouldn't even have needed a warrant:  The County would gladly have
handed over a copy of the iCloud backup.

But, turns out, a funny thing happened:
https://www.washingtonpost.com/world/national-security/fbi-asked-san-bernardino-to-reset-the-password-for-shooters-phone-backup/2016/02/20/21fe9684-d800-11e5-be55-2cc3c1e4b76b_story.html

  The same Sunday [Dec. 6th], the FBI asked the county for help in
  retrieving data from the phone, Wert said in an interview. "So the
  county said we could get to the information on the cloud if we changed
  the password, or had Apple change the password," he said. "The FBI
  asked us to do that, and we did."

I can easily see someone making this blunder, because rushed
and tired people in crisis situations make mistakes:  The County changing 
Farook's iCloud password (allegedly -- see below -- at FBI's request)
_locked the 'phone out_ of its regular backups.

Oops.

Turns out, Farook's most recent iCloud backup was on October 19, 2015.
So, FBI got from iCloud the full contents of the 'phone as of 1 1/2
months before the mass shooting -- and then someone at either the FBI
or the County clumsily ruined the easiest way to get everything else up
to December 2nd.
 
Quoted text above reflects the County Health Department's story about
the course of events.  It appears the County claims it reset the
password at the FBI's request, but an anonymous Fed speaking for the
FBI seems to dispute that assertion:
http://gizmodo.com/the-san-bernardino-terrorists-icloud-password-was-accid-1760158613
http://gizmodo.com/san-bernardino-county-calls-the-fbi-liars-over-terroris-1760317923
http://gizmodo.com/apple-this-mess-couldve-been-avoided-if-the-government-1760211382

A court filing from one side of this matter speculates that the 1 1/2
month gap 'indicates to the FBI that Farook may have disabled the
automatic iCloud backup function to hide evidence.'  Maybe.  Or the
'phone just might not have happened to sync after that.

Quoting the last of the gizmodo.com links:

  But there is one thing the government and Apple agree on: It is
  technically possible for Apple to write the kind of software in demand.
  In fact, the executive admitted that the Cupertino company would be able
  to write this software not only for its newest phones but also for all
  phones it has in use.

Exactly.  Which brings us back to the situation of individuals seeking
meaningful security and privacy.  Much has been made, post-Snowden, of
big tech companies rushing to roll out alleged end-to-end encryption in
Web services and sundry computer products.  In my opinion, that is
possibly laudable (e.g., Google's various proprietary Internet products
implementing 'perfect forward secrecy',
https://en.wikipedia.org/wiki/Forward_secrecy), but should be assumed
unreliable.
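
For the curious, the core idea behind forward secrecy is to negotiate
each session's keys from throwaway (ephemeral) key pairs, so that a
long-term key captured later can't be used to decrypt recorded past
traffic.  A minimal sketch of that ephemeral exchange (Python,
pyca/cryptography; illustrative only, not any particular vendor's
protocol):

    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Each side generates a fresh, throwaway key pair per session...
    a_eph = X25519PrivateKey.generate()
    b_eph = X25519PrivateKey.generate()

    # ...exchanges public halves, and computes the same shared secret.
    shared = a_eph.exchange(b_eph.public_key())
    assert shared == b_eph.exchange(a_eph.public_key())

    # A session key is derived, then the ephemeral private keys are
    # thrown away -- nothing retained can decrypt old traffic.
    session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                       salt=None, info=b"session").derive(shared)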

I think the writing is on the wall:  if you want real security and
privacy, you should implement and manage your own crypto software and
your own keys, being very careful about implementations, key handling,
and trust relationships.  What I mean here, for starters, is:  ignore
any vendor crypto asserted to exist, assume it's Swiss-cheesed, and
layer your own crypto on top of it.
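
As a trivial sketch of what 'layer your own crypto on top' can mean in
practice (Python, using pyca/cryptography's Fernet recipe; the file
names are made up for illustration), encrypt locally, under a key only
you hold, before anything touches the vendor's storage or backup:

    from cryptography.fernet import Fernet

    # Generate and keep your own key; it never goes to the vendor.
    key = Fernet.generate_key()
    with open("backup.key", "wb") as f:      # hypothetical key file
        f.write(key)

    # Encrypt locally before the data ever reaches cloud sync/backup.
    with open("notes.txt", "rb") as f:       # hypothetical plaintext
        ciphertext = Fernet(key).encrypt(f.read())
    with open("notes.txt.enc", "wb") as f:
        f.write(ciphertext)

Whatever the vendor then does with notes.txt.enc upstream is its
problem; without backup.key, which stays under your control, the
contents are opaque.  The same idea scales up to gpg, LUKS, and
friends.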

Owners of iOS devices are probably totally boned in that department 
(consequence of the walled garden), but I'll admit to being curious
about what might be possible _even_ for such people.





