Apple and the security of personal data (February 18, 2016)
A few days ago the FBI asked Apple to break the security of the iPhone 5c used by Syed Farook, the San Bernardino shooter, by re-engineering iOS to add a backdoor granting unrestricted access to its data.
Apple's refusal has opened a wide-ranging debate over where the line between an individual's privacy and data security should be drawn, and whether this case could set a dangerous precedent.
In 2014, Apple began making iPhones with additional encryption software that they said they couldn't unlock, even if faced with a court order.
[...] Sheri Pym, the federal judge, has ordered Apple not to turn off its encryption but to make it easier for federal agents to randomly guess the suspects' iPhone passcode.
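The order hinges on exactly this guessing step. A numeric passcode has a tiny keyspace, and what makes it safe in practice are iOS's software protections: escalating delays after failed attempts and an optional auto-erase after ten failures. A back-of-the-envelope sketch shows why removing them matters (the ~80 ms per-attempt cost is an illustrative figure for the hardware-bound key derivation, not an exact Apple specification):

```python
# Rough worst-case brute-force time for a numeric passcode once the
# software rate limits are removed. PER_TRY_SECONDS is illustrative:
# it stands for the hardware-enforced cost of one key-derivation
# attempt, not an exact Apple figure.

PER_TRY_SECONDS = 0.08

def worst_case_hours(digits: int) -> float:
    """Time to try every numeric passcode of the given length."""
    return (10 ** digits) * PER_TRY_SECONDS / 3600

print(f"4-digit passcode: {worst_case_hours(4) * 60:.0f} minutes")  # ~13 minutes
print(f"6-digit passcode: {worst_case_hours(6):.0f} hours")         # ~22 hours
```

With the escalating delays and the ten-attempt erase option left in place, the same exhaustive search would take years or destroy the data outright, which is precisely why the order asks Apple to disable those features.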
Tim Cook took a very firm stance on the matter, publishing a letter addressed to all Apple customers.
The United States government has demanded that Apple take an unprecedented step which threatens the security of our customers. We oppose this order, which has implications far beyond the legal case at hand.
This moment calls for public discussion, and we want our customers and people around the country to understand what is at stake.
We have great respect for the professionals at the FBI, and we believe their intentions are good. Up to this point, we have done everything that is both within our power and within the law to help them. But now the U.S. government has asked us for something we simply do not have, and something we consider too dangerous to create. They have asked us to build a backdoor to the iPhone.
Specifically, the FBI wants us to make a new version of the iPhone operating system, circumventing several important security features, and install it on an iPhone recovered during the investigation. In the wrong hands, this software — which does not exist today — would have the potential to unlock any iPhone in someone’s physical possession.
The FBI may use different words to describe this tool, but make no mistake: Building a version of iOS that bypasses security in this way would undeniably create a backdoor. And while the government may argue that its use would be limited to this case, there is no way to guarantee such control.
How the automatic data encryption that Apple and Google introduced in their operating systems starting in 2014 works.
Both Apple and Google follow in the footsteps of the now somewhat beleaguered Blackberry, which has encrypted data by default for some time. The firms' focus on privacy comes after nude photos of celebrities were leaked online earlier this month. The breach, which affected actress Jennifer Lawrence, among others, was linked by some security experts to vulnerabilities in Apple's iCloud storage service.
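The reason Apple says it cannot simply hand over the data is that, in this kind of design, the data key is derived from the user's passcode entangled with a secret unique to the device, so every guess must run on the phone itself. A minimal sketch of the idea, using PBKDF2 from Python's standard library (the names, the iteration count, and the random stand-in for the hardware key are all illustrative, not Apple's actual parameters):

```python
import hashlib
import os

# Illustrative sketch of passcode-entangled key derivation.
# DEVICE_UID stands in for a per-device hardware secret that never
# leaves the chip; the iteration count is tuned so that each guess
# costs real time on the device.

DEVICE_UID = os.urandom(32)   # hypothetical per-device hardware key
ITERATIONS = 100_000          # makes each derivation deliberately slow

def derive_data_key(passcode: str) -> bytes:
    """Derive the data-encryption key from the passcode + device secret."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, ITERATIONS)

key = derive_data_key("1234")
assert key == derive_data_key("1234")   # same passcode, same key
assert key != derive_data_key("1235")   # any wrong guess yields garbage
```

Because the derivation depends on a secret locked inside the hardware, an attacker cannot copy the encrypted storage and brute-force it on a fast server farm: the guesses are bound to the device and its rate limits.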
With its decision, which angered the F.B.I. and other law enforcement agencies, the administration essentially agreed with Apple, Google, Microsoft and a group of the nation's top cryptographers and computer scientists that millions of Americans would be vulnerable to hacking if technology firms and smartphone manufacturers were required to provide the government with "back doors," or access to their source code and encryption keys.
The U.S. debate seemed all but over in October, when the Obama administration said it had decided against pushing for legislation mandating backdoors, at least for the time being. But the issue gained new steam in light of attacks by terrorists in California and Paris last fall, which some reports linked to encrypted communications.
In the United States this debate began in the 1990s, during the Clinton administration.
In 1993, the Clinton White House tried to promote "key escrow" solutions to access encryption, including a microchip known as a "Clipper Chip" developed by the National Security Agency for use by telecommunications companies to encrypt voice data in their phone products. The chip essentially acted as a master key for accessing encryption, and was held by the government or a third party.
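The escrow scheme can be modeled in a few lines: every message carries its session key wrapped under a master key held by the escrow authority, so the authority can recover any session key on demand. The sketch below is a toy structural model only; the XOR keystream "cipher" is deliberately throwaway code to keep the example self-contained, not real cryptography:

```python
import hashlib
import os

# Toy model of Clipper-style key escrow. The keystream_xor "cipher"
# is for illustration only -- it shows the data flow, not a secure
# construction.

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """XOR data against a hash-derived keystream (toy cipher, involutive)."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(a ^ b for a, b in zip(data, stream))

ESCROW_MASTER_KEY = os.urandom(32)   # held by the government or a third party

def send_message(plaintext: bytes, session_key: bytes):
    ciphertext = keystream_xor(session_key, plaintext)
    # Escrow field: the session key, recoverable only with the master key.
    escrow_field = keystream_xor(ESCROW_MASTER_KEY, session_key)
    return ciphertext, escrow_field

def authority_decrypt(ciphertext: bytes, escrow_field: bytes) -> bytes:
    session_key = keystream_xor(ESCROW_MASTER_KEY, escrow_field)
    return keystream_xor(session_key, ciphertext)

ct, leaf = send_message(b"hello", os.urandom(32))
assert authority_decrypt(ct, leaf) == b"hello"
```

The structural weakness the 1990s critics pointed to is visible in the model: `ESCROW_MASTER_KEY` is a single point of failure, and whoever holds or steals it can decrypt all escrowed traffic.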
In other cases, Apple has taken a more conciliatory approach. The Daily Beast reports instances in which Cupertino cooperated with the authorities.
But in a similar case in New York last year, Apple acknowledged that it could extract such data if it wanted to. And according to prosecutors in that case, Apple has unlocked phones for authorities at least 70 times since 2008. (Apple doesn't dispute this figure.) In other words, Apple's stance in the San Bernardino case may not be quite the principled defense that Cook claims it is.
The standoff between the FBI and Apple could end up being decided before the Supreme Court, taking on an unprecedented political significance for privacy and civil rights.
An Apple spokeswoman declined to elaborate on the statement, but the company's most likely next step is to file an appeal. The legal issues are complicated. They involve statutory interpretation, rather than constitutional rights, and they could end up before the Supreme Court.
Civil-liberties organizations and some of the industry's leading players have come out in support of Apple's decision.
Essentially, the government is asking Apple to create a master key so that it can open a single phone. And once that master key is created, we're certain that our government will ask for it again and again, for other phones, and turn this power against any software or device that has the audacity to offer strong security [...]
EFF applauds Apple for standing up for real security and the rights of its customers. We have been fighting to protect encryption, and stop backdoors, for over 20 years. That's why EFF plans to file an amicus brief in support of Apple's position.
WhatsApp CEO Jan Koum.
I have always admired Tim Cook for his stance on privacy and Apple's efforts to protect user data and couldn’t agree more with everything said in their Customer Letter today. We must not allow this dangerous precedent to be set. Today our freedom and our liberty is at stake.
Google CEO Sundar Pichai.
Important post by @tim_cook. Forcing companies to enable hacking could compromise users' privacy. We know that law enforcement and intelligence agencies face significant challenges in protecting the public against crime and terrorism. We build secure products to keep your information safe and we give law enforcement access to data based on valid legal orders. But that’s wholly different than requiring companies to enable hacking of customer devices & data. Could be a troubling precedent. Looking forward to a thoughtful and open discussion on this important issue.