Ok, I definitely get Resolute14's position now, as expressed in his last post. I disagree, because I think it's too extreme – I am willing to compromise some of that impenetrable security to allow for what I think is a necessary investigative function for the authorities. I'm also less cynical than some others about the motivations of government and private actors, and I'm willing to bear the risk of someone else eventually figuring out how it's done – a risk I think is smaller than you guys perhaps do. I think I understand where our perspectives part ways.
As for photon's post: first, your first statement is exactly my view – the problem here is using the judiciary to coerce Apple into creating a new invention that the applicant government thinks would serve the public benefit. In my view, that's a step too far for the judiciary. As I said above, I disagree with your second statement (that the government can't be trusted and that leaks are inevitable, particularly for a company like Apple), as I think it's too cynical and the risks are smaller than you perceive. As for your fourth, I think you're saying that it solved THAT problem (that is, the "can't trust the government" problem), but doesn't solve the leak-related problems you and Resolute14 are concerned about, which, as I've said, concern me less. Again, correct me if I'm wrong, but I think I'm understanding where we agree and where we don't.
Just to be clear, some comments made by Sam Harris on his podcast which I mentioned were the genesis of my "impenetrable locked room" hypothetical earlier, which can be seen here.
After I posted here to see what people thought, I sent him an e-mail last night setting out my position, which is below (spoilered because it's long).
Quote:
Dear Dr. Harris,
I listened with interest to your most recent podcast offering. One of the news items you discussed is the current and ongoing controversy surrounding Apple's decision to oppose a court order requiring that it create a program that would allow the user to circumvent the currently impenetrable security on the iPhone.
For the record, my view is that, first, when one is faced with a court order, one has three options. First, obey the court order; second, appeal the court order and obey it while that appeal is pending; third, appeal the court order and get the court to grant a stay pending the outcome of the appeal. Anything else should result in imprisonment. Regardless of the public sentiment on the substance of the issue, we cannot have individuals or organizations ignoring court orders they don't like – even if they have good reasons for not liking said orders. I suspect that Apple will go with option three.
As to the substance of the issue, I have a problem, in terms of the way power is delineated in society, with giving a Court jurisdiction to order a company to develop a new product that the government will want to use. Regardless of the social utility of that product, this is simply not something that a Court should be empowered to do. No doubt Apple could use its vast resources in a number of ways that would have great social utility, or more narrowly, would be eminently useful to law enforcement. However, I don't think the judiciary should be empowered to force them to use those resources in the way it thinks best. However free you think market actors should be, surely they should at best be limited by statutory regulations, and not dictated to by the whims of judges, whose raison d'etre is to interpret and enforce those statutes.
This is different from asking Apple to supply data that it already has, or to use its expertise to help the government achieve a goal (for example, if the security were already beatable by anyone who happened to have the expertise). What is being sought is the conscription by the Court of Apple's labour force to develop a new idea, meeting all the requirements of a patentable technology. If the government wants a private actor to do something like that, it should be seeking that collaboration contractually. Legislated regulation requiring that future smartphone OSes be developed within pre-set parameters regarding security would be another reasonable way to approach the issue.
The other interesting point you made was with regard to the analogy of building an impenetrable room in your house. You asked, if it were possible to build such a room that no one could get into, should this be allowed? You thought not, and noted that this scenario has never arisen in reality. However, it has in fact arisen in real life – for roughly the first half of the 19th century.
Current locks are incredibly easy to pick, and actually represent a massive downgrade from the quality of the most secure locks, developed about two centuries ago. In the 1800s, there was something of a technological race to achieve an unbreakable, impenetrable lock. The lock developed by Bramah (patented in 1784) was impenetrable for decades. This did in fact attract government attention: the British government offered a £100 reward to anyone who could create an impenetrable lock that would also seize up and become un-openable (even using the key) if tampered with. Jeremiah Chubb managed it, and claimed the prize in 1818. This is actually remarkably analogous to the iPhone's security system, which effectively "seizes up" after ten incorrect passcode attempts.
So, to re-phrase your hypothetical question: should Bramah and Chubb have been forbidden by the British Government to sell the locks they'd designed? In my opinion, no. Nor should government have forced them to build an inferior lock. Through their ingenuity, they created a product that the public, and the government itself, wanted to buy. That's the sort of technological progress that society ought to encourage.
Ultimately, as one would expect, the advent of a better lock yielded a better lockpick. In 1851, an American locksmith named Alfred Hobbs, who made his living demonstrating to banks the weaknesses in their security systems, managed to beat both locks. The result was a permanent downward spiral in lock security. As discussed, modern locks are hilariously easy to pick – the average person could be taught to pick the deadbolt on your front door alarmingly quickly. Once the public became disillusioned with the notion that unbeatable security existed, there was no longer any margin in developing better technology: no one would pay for it anyway. Instead, the focus shifted to locks that were easy to mass-produce cheaply – the modern pin-and-tumbler design.
What would the result be if the government insisted that every device with electronic security could be beaten owing to a built-in security flaw, or "back door"? Possibly the same chilling of progress that occurred above – once someone has figured out how to breach security, it's only a matter of time until that becomes common knowledge. Arguably, this is inevitable anyway, and forcing Apple to do it themselves is merely speeding matters along. Or maybe history won't repeat itself in this case, and technology companies will continue to develop better locks for future phones. In any event, I thought you might be interested to hear that your unrealistic hypothetical thought experiment has, in a sense, already come to pass. The actual real-world implications of these sorts of public policy decisions are likely impossible to predict.
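As an aside, for anyone curious about the mechanics of the analogy: the Chubb detector lock and the iPhone lockout follow the same basic pattern, which can be sketched in a few lines of Python. This is purely illustrative – the attempt limit, the reset behaviour, and all the names here are my own assumptions, not Apple's (or Chubb's) actual implementation.

```python
MAX_ATTEMPTS = 10  # assumed limit, mirroring the iPhone analogy

class DetectorLock:
    """A software 'detector lock': too many bad keys and it seizes,
    refusing even the correct key until explicitly reset."""

    def __init__(self, secret):
        self._secret = secret
        self._failures = 0
        self.seized = False

    def try_open(self, key):
        if self.seized:
            return False  # seized: even the correct key is refused
        if key == self._secret:
            self._failures = 0  # success clears the failure count
            return True
        self._failures += 1
        if self._failures >= MAX_ATTEMPTS:
            self.seized = True  # the "detector" trips
        return False

    def reset(self):
        # Roughly Chubb's "regulator" step: clear the seized state
        self.seized = False
        self._failures = 0

lock = DetectorLock("1234")
for _ in range(10):
    lock.try_open("0000")     # ten wrong attempts
print(lock.seized)            # True
print(lock.try_open("1234"))  # False: correct key, but lock has seized
```

The point of the design, in both centuries, is the same: it converts a brute-force problem from "try every key" into "you get ten tries, period."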
Harris released another podcast today, wherein he lambasted some of the people who responded vociferously to his statements in the above video. I think he was overly hostile: basically he considers people like Resolute14 to be a part of a new religion of perfect privacy which regards all state actors with excessive suspicion, and would offer them no investigative tools to deal with what he thinks is a real problem in the form of jihadism. I guess I fall somewhere in the middle; I see Resolute's concerns, I just think they're overstated.
__________________
"The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
Last edited by CorsiHockeyLeague; 02-24-2016 at 05:09 PM.