Interesting story from NYT today titled “Judge Tells Apple to Help Unlock San Bernardino Gunman’s iPhone.”
Here is an excerpt:
WASHINGTON — A judge in California on Tuesday ordered Apple to help the F.B.I. unlock an iPhone used by one of the attackers in the assault in San Bernardino that killed 14 people in December.
The ruling handed the F.B.I. a potentially important victory in its long-running battle with Apple and other Silicon Valley companies over the government’s ability to get access to encrypted data in investigations. Apple has maintained that requiring it to provide the “keys” to its technology would compromise the security of the information of hundreds of millions of users.
The F.B.I. says that its experts have been unable to get into the iPhone 5c used by Syed Rizwan Farook, who was killed by the police along with his wife, Tashfeen Malik, after they attacked Mr. Farook’s co-workers at a holiday gathering.
Prosecutors said in a court filing that Apple had the “exclusive” means to bypass the security features on the phone, but that the company “has declined to provide that assistance voluntarily.” F.B.I. experts say that because of the phone’s security features, they risk losing the data permanently after 10 failed attempts to enter the password.
The Justice Department had secured a search warrant for the phone, which is owned by Mr. Farook’s former employer, the San Bernardino County Department of Public Health. But prosecutors said they saw little choice but to seek the additional order compelling Apple’s assistance.
In an unusually detailed directive, Magistrate Judge Sheri Pym of the Federal District Court for the District of Central California ordered Apple to provide “reasonable technical assistance” to the F.B.I. in unlocking the phone. That assistance should allow investigators to “bypass or erase the auto-erase function” on the phone, among other steps, she wrote.
A spokesman for Apple could not be immediately reached for comment.
…
Eileen M. Decker, the United States attorney in Los Angeles, where the investigation is being handled, said the effort to compel Apple’s technical cooperation marked “another step — a potentially important step — in the process of learning everything we possibly can about the attack in San Bernardino.”
…
James B. Comey, the F.B.I. director, has been at odds with Apple and other technology companies for months over whether they should provide de-encryption technology for their products. Without it, he has argued, the bureau is at risk of “going dark” in its investigations. The Democratic presidential candidate Hillary Clinton and most of the Republican hopefuls support Mr. Comey’s stance.
Apple and other technology companies say that creating an opening in their products for government investigators would also create a vulnerability that Chinese, Iranian, Russian or North Korean hackers could exploit.
I wonder how this story would have been reported if it had been the Chinese government asking Apple to access the phone of some ETIM (East Turkestan independence movement) or Dalai Lama "terrorists." No doubt, there would have been universal condemnation of Chinese oppression! Apple would have been goaded, and then lauded, to stand up to such appalling government interference. There would have been vigils for all the collaborators who might be apprehended were the Chinese to get such intelligence.
But here it is the U.S., and the limelight is focused squarely on the bad guys and the dangers … or, on the flip side, on an enlightened American company such as Apple fighting a good, worthy fight. But ultimately both sides are deemed legitimate … with fears raised, not surprisingly, about how the Chinese might play the same game for their own interests in the future…
This story shows clearly why some in China advocate developing the country's own cell phones and network equipment – as a matter of national security. If its market continues to be flooded with foreign equipment maintained by foreign companies unsympathetic to Chinese security concerns, that is understandably a matter of grave national concern … especially when the foreign companies have the means to cooperate … but choose to systematically stonewall Chinese concerns …
Whether Apple publicly agrees to cooperate with the US government is not the issue: the point is that Western companies will respond to and acknowledge Western concerns one way or another … but not Chinese concerns, or anyone else's.
Allen says
Does this make coherent sense?
The following is a tweet from Google's current CEO, weighing in on Apple vs. FBI after Google came under pressure from activists to say something and to stand with Apple.
So Google is ok with giving “access” to user data and information – without user’s consent – under valid court order … but wholly troubled by enabling government to “hack” customer devices and data?
Is there really a difference?
The gov’t is asking for unimpeded “access” with court order … “in protecting the public against crime and terrorism.” Is that “hacking” or not?
Apparently not … when Google agrees it’s for a ‘good’?
Don’t be evil … so innocent and good sounding …
but embedded in that phrase is the presumption, the scary presumption, that Google knows what is good and evil.
Black Pheonix says
@Allen
“So Google is ok with giving “access” to user data and information – without user’s consent – under valid court order … but wholly troubled by enabling government to “hack” customer devices and data?”
While the result may be similar, I do agree with Google and Apple that the method of this court order is a drastic escalation.
The difference is, this court order requires Apple to create a software tool to hack into data that's not already accessible, whereas traditionally, law enforcement has only requested data that's already accessible with existing tools.
If AT&T can already access your phone data, and a court just asks AT&T for the data it can access, then it's just a matter of AT&T looking up the data.
But if there are no tools that allow AT&T to look up the data, and the court requires AT&T to actually MAKE a new tool to access the data, then that's an order of magnitude more intrusive, and it would be closer to "active hacking".
1 point that’s missed by virtually everyone is that, this court order also violates US Employment laws, (and probably employment laws world wide). Because the court order compels a company like Apple to assist in work that is not within the scope of anyone’s employment contracts.
Imagine that you are an Apple employee who worked on the encryption of the iPhone. That was your employment scope, you build the encryption for iPhone data. Fine. Now, comes along some FBI agents with a court order, which tells Apple to ORDER you to do work to create a tool to break your own encryption. You might validly say, I object, that’s not my job scope, I never signed up or agreed to do this.
You may have your personal ethical, moral, or even security concerns about doing something like that. For one, if you do it, you will be subject to secrecy orders from the US government. You might not want that. For another, if you do it, you might become a large target for terrorists, who might want to kidnap you to get you to tell them how to break the iPhone encryption.
In simpler terms, the Government is making you do government work and making you take the risks associated with it, without giving you any of the associated benefits. Who will protect you for doing that work?
Sure, you might agree with the Government in this, and you might gladly help. But it’s not like the Government is giving you a choice either. They are ORDERING Apple to ORDER you to do what FBI said to do.
You don’t have a choice. They are drafting you into compulsory government service. Sure, you might still be paid by Apple, but you are doing work that you didn’t agree to do.
The government’s rationale is (1) Apple CAN do this, (2) it is important.
But (1) just because Apple CAN do this doesn't mean that its employees agreed to do this. It would be just as illogical to use the same argument to compel FBI agents to deploy to Afghanistan as military police.
And (2) it is not an emergency: the FBI could easily hire its own technical staff to do the same thing.
Allen says
@Black Pheonix
I disagree.
When the government asks a telephone company such as AT&T for information, AT&T has to step outside its normal practices to provide the data. I don't think there is a department within AT&T devoted to snooping for AT&T's own sake. When the gov't asks AT&T to provide info, it is a disruption of the company's regular processes.
The government asks AT&T because AT&T is in the best position to obtain the readily available information. And when that happens, AT&T has to spend money and resources to provide the information the government wants, in a format useful to the government. The resources spent inevitably (in my opinion) would include creating "tools" to collect, aggregate and send info in a form readily usable by the government.
I personally don’t understand how Apple can win here. The U.S. government has recently pried open offshore overseas account for tax accounting purposes – including those reclusive Swedish accounts. There is nothing considered more private than those. If the gov’t can force banks all over the world – in Europe, Cayman Islands, Japan, Taiwan, Singapore, Hong Kong, etc. – to share info, you don’t think they can prevail to force Apple to do the same with cell Phones?
Remember: regarding data stored in the cloud, the U.S. already has (and exercises) the legal authority to force companies under U.S. jurisdiction to share data. The issue is whether the U.S. also has authority to do the same on client devices like cell phones. I don't see why not – especially under an explicit "warrant."
Apple's (and others') argument that an opened backdoor would be prone to exploitation carries little weight. Sure, there is a risk, but the risk can't be so absolute as to be dispositive, especially considering that financial institutions, power grid controls, and many of the nation's defense networks are already linked to the Internet. Just because there is a door doesn't mean the door is open. Just because a door can be broken doesn't mean we should have no doors at all. We take those sorts of risks everywhere else, but we can't with cell phones?
If the U.S. gov’t can keep the nuclear launch code secret from hacking, I am sure they can do well keeping Apple’s backdoor tool secure as well…
Also, your employment law argument makes little sense to me. Lawyers everywhere demand discovery from companies all the time. Sometimes these are truly big undertakings – yet the law places these burdens on companies everywhere. I am sure the companies would prefer not to deal with them, but they do, and their employees do.
With the rise of terrorism and terrorists' use of encryption, here is just another burden the law can and will place on companies like Apple. Apple should get used to it.
Black Pheonix says
@Allen
It's not a matter of the US government getting the data: they have the phone, they have the data, but it's all encrypted. They want Apple to break the encryption, and no tool for that exists right now.
The scenario of how it might play out with Apple (or any other company in the same situation) would be, Apple will go and try to find employees who can do this, and then say, “FBI wants us to break the encryption, can you do it?”
And the individual employees will choose, whether they can or will do it for the FBI. (I suspect many, most if not all will object for various reasons). I for one certainly do not want to be subjected to any government secrecy orders for this kind of work.
At the end of the day, if there isn't enough expertise within Apple to do it, Apple will respond to the FBI and say, "We can't find the necessary people to do this work. We will have to hire a new group of people and train them before they can do this. It will probably take at least a year."
The FBI can live with that, or they can file a contempt of court motion against Apple, which won't get anywhere, because the FBI can't squeeze blood out of a stone. Apple can't force its employees to work on projects that they say they can't do, and there is nothing the court can do to force those employees either. (I haven't seen a court order a company to fire its employees yet.)
Black Pheonix says
@Allen
Also, this is completely different from the traditional "assist" orders that companies get. Traditionally, such orders arise from liabilities that are part of the business, i.e. akin to warranties of service.
For example, AT&T caches its customers' data, so it becomes subject to orders to turn that data over to the government for examination. If AT&T never cached its customers' data, it would actually be impossible to comply; the government would then have to pass a law compelling AT&T to cache the data.
Another example: copyright takedown orders (or other internet service takedown orders). Those come with the territory of ISP services: they put the content up, so they bear the liability to take it down.
But here, Apple only created the phone with the encryption. The encryption is not part of any service; it comes with the hardware. The iPhone doesn't come with a "warranty to serve the Government wherever needed."
This court order would create a new liability for hardware manufacturers that never existed before: the obligation to defeat your own hardware.
It is akin to the US government telling the maker of the M1A1 tank: now you have to create a weapon that can kill your own tank on the battlefield (and you have to do it for free), because a terrorist got their hands on one of the tanks.
* Which brings to mind that all weapons manufacturers should be very worried by this precedent. Indeed, what's to stop the US government from forcing all hardware companies into making perpetual upgrades for free?
Allen says
@Black Pheonix
My understanding of the technology is that it can be done. It's not about breaking the encryption per se (which is mathematically very, very difficult), but about enabling other ways to defeat it.
One example: the passcode on the phone in this case is, I believe, only 4 digits. The FBI could try all 10,000 combinations and get in, but the software will erase the data after 10 successive failures. If Apple builds a tool that allows the phone's contents to be copied without first providing the passcode (this should in principle be doable: it's not about breaking the encryption itself), then the FBI can try codes by brute force and restore the phone each time it "self-destructs" until it gets in.
There may be other variations … such as extracting the data and building virtual-phone software without the self-destruct feature, restoring the encrypted data into it, and letting the FBI try to break in by brute force …
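As a back-of-envelope sketch of why a 4-digit passcode is so weak once the auto-erase is bypassed: Apple's published security documentation has described the passcode key derivation as hardware-bound and taking on the order of 80 ms per attempt; that delay figure is an assumption here for illustration, not a measurement of this particular device.

```python
# Rough estimate of brute-forcing a 4-digit passcode, assuming the
# auto-erase has been disabled and each guess costs ~80 ms of
# hardware-entangled key derivation (an assumed figure, per Apple's
# published iOS security descriptions; purely illustrative).

ATTEMPT_DELAY_S = 0.080   # assumed key-derivation cost per guess
CODES = 10_000            # 4-digit passcodes: 0000 through 9999

worst_case_s = CODES * ATTEMPT_DELAY_S
average_case_s = worst_case_s / 2  # expected time if the code is uniform

print(f"worst case:   {worst_case_s / 60:.1f} minutes")
print(f"average case: {average_case_s / 60:.1f} minutes")
```

Under these assumptions the entire key space falls in roughly a quarter of an hour, which is why the 10-failure wipe, not the encryption itself, is the real obstacle.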
This makes no sense. Employees don't get to "choose" their work. The FBI gives Apple an order; Apple gives employees the project. Most employees will do what's assigned. I suppose an employee could object as a conscientious objector, but I don't see how this case rises to that level. Many would willingly work on the project – assuming it is technically feasible (and it is) – to catch the terrorists. We can bitch and whine about the U.S. gov't, but for many people, it stands for the force of good. So long as Apple sets up the project, it will be done.
If Apple wants to stall, that's fine. If executives want to lie about how technically infeasible it is, fine, too. But if/when the truth comes out, people will go to jail.
Allen says
@Black Pheonix
Again I don’t think your legal theory holds. How is it that AT&T’s “orders rise from liability as part of the business”?
To the extent AT&T “enabled” terrorism or some other crimes to be conducted through its network, so has Apple through its devices….
I highly doubt AT&T is on the hook because of its “liability.” It’s on the hook because it has the means to comply. Gov’t thinks Apple has the means. If it does, it must comply. If it doesn’t, then the order was given in error…
Allen says
@Black Pheonix
Well, there are various self-destruct systems built into various weapons systems. There are auto-tracking features that are demanded. Technically these could fall into the enemy's hands, but they are still built.
As for whether they are "free" – that's accounting. The cost is probably just bundled into the price of the system.
It's part of the cost of doing business… And let's be honest, whatever the man-hour cost, Apple can afford it. Would it want to be liable for enabling the next 9/11?
Allen says
In 2008, Microsoft (that is, Microsoft employees) had already helped law enforcement build a tool that extracts data from BitLocker-encrypted systems.
http://www.pcworld.com/article/145318/article.html
http://www.technewsworld.com/story/62825.html
The tool does not break the encryption per se, but allows law enforcement to extract info while the computer is logged on but protected by a screen saver.
But the way to "break" BitLocker shouldn't be hard. Clone an encrypted drive (simple) … and then somehow break the TPM chip that stores the key that decrypts the drive. There are many ways to attack a TPM. None of this is probably easy, but I am sure the gov't has done it, or can.
https://gcn.com/Articles/2010/02/02/Black-Hat-chip-crack-020210.aspx
http://www.networkworld.com/article/2292465/lan-wan/black-hat-paper-on-breaking-trusted-platform-module-withdrawn.html
I wonder … can the NSA simply just clone a TPM chip?
Back to the iPhone. Perhaps someone can just copy the encrypted internal flash (which should be easy), then copy it into many iPhones and try brute force. If a phone self-erases (after 10 failures), copy the data back in again and proceed.
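The copy-and-restore strategy above can be sketched as a toy simulation. The 10-failure wipe threshold comes from the article; everything else (the search over 10,000 codes, the idealized re-flash that resets the failure counter) is a simplifying assumption for illustration:

```python
# Toy model of the "clone flash, brute-force, restore" strategy: the
# device wipes after 10 consecutive failures, so the attacker re-flashes
# the cloned image and resumes where they left off. Purely illustrative;
# assumes a restore perfectly resets the failure counter.

WIPE_AFTER = 10  # failed attempts allowed before auto-erase (per the article)

def brute_force_with_restores(secret_code: int, total_codes: int = 10_000):
    """Return (attempts_made, restores_needed) to recover secret_code."""
    attempts = 0
    restores = 0
    failures_since_restore = 0
    for guess in range(total_codes):
        attempts += 1
        if guess == secret_code:
            return attempts, restores
        failures_since_restore += 1
        if failures_since_restore == WIPE_AFTER:
            restores += 1              # re-flash the cloned image, continue
            failures_since_restore = 0
    raise ValueError("code not in search space")

attempts, restores = brute_force_with_restores(secret_code=7354)
print(attempts, restores)  # 7355 attempts, 735 restores for this example code
```

Even in the worst case this is only 10,000 guesses and about a thousand restores – tedious, but mechanical – which is the commenter's point: the encryption need not be broken at all.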
Black Pheonix says
This is correct. But this tool is essentially a patch to the iOS on the iPhone, and it's not as simple as the FBI makes it sound.
For one, it's a one-off: such a unique patch tool has never been built before. Technically, there are all kinds of unforeseen problems with doing surgery on iOS like this.
Even normal iOS patches are somewhat hazardous and risky when the ultimate goal is for the FBI to recover data. We all know how iPhone patches can often lead to phone crashes – and those are patches that Apple has tested through many, many months of development.
Worse: such a patch can "brick" the phone completely. (That happens with jailbreak-type patches on iOS.)
Realistically, if this is to be done, it has to be tested on multiple test phones first, before it is used on the actual terrorist's phone. And that dramatically increases the risk of the tool leaking out.
All of this points to the simple FACT: that this is not simple.
Also, there ARE alternatives, as you pointed out. As I suggested, the FBI could hire its own people to do this, since there is no emergency.
That goes against the FBI's argument, because one of the four requirements of the All Writs Act is "the absence of alternative remedies."
Black Pheonix says
Employees of Apple would likely object to this, and they may raise their own arguments that the work is outside the scope of their employment.
A lot of tech guys are backing Apple on this one. And I wouldn’t put it beyond them to refuse to participate.
The scenario is realistic, because Tim Cook will have to find people who are willing and capable of doing this. If they (probably only a dozen people know iOS that well) all object and refuse to do it, then the FBI is SOL.
FBI can very well question Tim Cook’s efforts, and may think he’s stalling, but that’s the precise problem with the nature of this court order.
This court order REQUIRES a company to build something that doesn't exist. It assumes the FBI knows whether that's even possible, and it never asks whether the employees want to do it. Apple is caught in the middle.
Apple is on the hook if things don't work out or the FBI doesn't get what it wants, regardless of who's actually at fault, and regardless of whether the FBI's request is pure fantasy.
Black Pheonix says
I don't think this is being done. We know ISIS actually got its hands on at least one M1A1 tank from the Iraqis. No self-destruct was triggered, and I don't think General Dynamics would like to be subject to a court order to build one now.
Black Pheonix says
I agree with your last bit: YES, the FBI could do that. Except they don't want to, because it is too expensive and would take too long.
Hence, the All Writs Act order fails one of its requirements: that there be no alternative remedies.
And let's face reality: the FBI actually doesn't even know what they are looking for. The FBI already has access to the iCloud account for that phone, so they have all the data up to the last backup. What else is there? The FBI doesn't know. They just want to look.
Well that’s the classic definition of “fishing expedition”. There may be absolutely nothing useful and new in the phone.
The FBI probably also went to their own internal tech people and requested this. The FBI tech team probably came back with "It's going to cost you $1 million-ish and a year to do this" (or some other huge numbers).
And the FBI couldn't justify that expense internally, so they just drop-kicked it to Apple.
Well, if it’s not that important to justify FBI’s own tech expenses, then why pass the buck to Apple?
NSA can also easily do this. But they probably also don’t want to spend the money for this kind of fishing expedition.
Allen says
@Black Pheonix
It is done … maybe not in all systems (there are definite downsides to having self-destructs).
But, see, e.g.:
http://www.slate.com/articles/news_and_politics/explainer/2011/12/downed_cia_drone_was_it_rigged_to_self_destruct_.html
http://spectrum.ieee.org/tech-talk/computing/hardware/us-militarys-chip-self-destructs-on-command
https://fas.org/man/dod-101/sys/land/docs/ch2.pdf
It’s certainly been contemplated.
See, e.g.,
https://books.google.com/books?id=gJHtHlm9r4YC&pg=SL599-PA106&lpg=SL599-PA106&dq=self+destruct+missiles&source=bl&ots=dYHfSxhFx5&sig=dVYSljZt01xre3wHcKnD4WNJV3A&hl=en&sa=X&ved=0ahUKEwj0kNmFq4_LAhWIMGMKHZ9VDPM4ChDoAQhZMAk#v=onepage&q=self%20destruct%20missiles&f=false
Black Pheonix says
Surely a self-destruct mechanism designed in from the start is not the same as demanding that the maker create a new mechanism AFTER the fact to destroy the original machine, which never had a self-destruct.
Sure, the US government passing a law demanding a backdoor from the very beginning is one thing; demanding a backdoor AFTER the fact creates an obligation that never existed.
“Again I don’t think your legal theory holds. How is it that AT&T’s “orders rise from liability as part of the business”?”
“To the extent AT&T “enabled” terrorism or some other crimes to be conducted through its network, so has Apple through its devices….”
We shouldn't generalize "enabled terrorism." Pretty much EVERYTHING in life could "enable terrorism" or crime. Food can enable terrorism, but that's too far removed.
AT&T creates liability when it keeps and records business-related data for its own reasons. The data becomes part of its business records, such as financials, customer data, etc.
In United States v. New York Telephone Co., 434 U.S. 159 (1977) – the All Writs Act precedent relevant to the present case – the phone company already received the dialed phone numbers as part of its telephone operation. The All Writs order simply required the phone company to record that same information for the police.
Now, if in that case the phone company had never had such information, and the court order had required it to BUILD new capability to acquire and record the information, that would have been a completely different magnitude.
This is the "business liability" that obligates companies to "help": such liabilities are created by the nature of their business, i.e. they would already have the information. It's not merely that the demand is related to their business; the "help" coincides with and overlaps the nature of their work – the phone company ALREADY receives the information demanded.
But a phone maker that creates encryption and security does not normally create "hacks" that defeat its own security. (That's not required by law.) So it's no longer just "help" that is demanded, but a tremendous amount of new work that isn't even guaranteed to succeed.
That places a huge burden on the company, which violates the second requirement of the All Writs Act: that there be no unreasonable burden on the recipient of the order.