Actually, you're missing the point as well, CHL. This is not a question of trading liberty for security. This is an issue where you are sacrificing liberty while also sacrificing security. It is lose-lose for everyone but government snoops, hackers, and anyone else who relies on backdoors and security breaches to profit.
Castle building 101: any crack in the wall is a weakness to be exploited. There is no such thing as "99% secure". Create an exploit, and it will be exploited. Especially by the government. That is precisely why they are fishing for precedent. Because the goal isn't to get information off a single phone in a single case. The goal is to undermine security entirely.
Also, you damage relations with foreign governments. Suppose the Chinese government decides to go on fishing expeditions against US consular staff and orders Apple to use this software to unlock seized phones to check for espionage/"crimes"? Apple can't exactly say no given the circumstances.
Quote:
Originally Posted by Resolute 14
This is not a question of trading liberty for security.
Depends on the circumstances. Consider this scenario: a girl and a guy meet up in a bar and exchange phone numbers. They text each other back and forth and agree to meet up. The guy ends up killing the girl.
On the girl's cell phone is information that could lead to the identity of a suspect. However, because it has a 4-digit PIN to unlock the phone, that info isn't available to police.
Now that this case is public, a less reputable person knows they have less chance of being caught.
I'd even consider a more likely scenario. I normally use my fingerprint to unlock my phone, and I never reboot it so I am never prompted to enter my PIN. Something goes wrong with the fingerprint reader, and now I can't remember my PIN to unlock my phone. All of my pictures are now lost because I can't take it to the Apple store and have them reset my PIN.
In my second scenario, we had that happen this week at work. Fortunately the guy remembered his PIN on attempt #7. I would personally be more concerned with that happening than with a hacker or the gov't getting my info from my phone.
There is such a thing as 99% secure. I've been using the perfect security era as an analogy here (a period of a few decades in the 18th and 19th centuries where locks had been invented that were perfect - no one could open them without the key).
So imagine a perfect lock, like the Bramah lock, which eventually turns out to be pickable, but only by highly skilled lockpicks who have lots of time to do it and the right tools. Now, in almost 100% of cases, you're secure. No one's getting through your lock. Sure, there's a guy out there who could pick it. But it's highly unlikely that he'll happen to come and pick this particular lock - especially given that they only use him when the public interest deems it necessary, e.g. when there's been a murder or something similarly pressing.
Has your liberty really been significantly eroded because that slim possibility exists? I can see the argument that it has in principle, for sure, but for practical purposes I think you're in almost as good a spot as you were before, and the marginal loss in security is offset by the public good that's allowed for.
__________________ "The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
Quote:
Originally Posted by CorsiHockeyLeague
There is such a thing as 99% secure. I've been using the perfect security era as an analogy here.
One big difference is that an expert lockpicker can't pass on their lock picking abilities to others very easily.
But one person who cracks the software to unlock an iPhone can make it usable and available to large numbers in a very short period of time.
Actually, Ken, it does not depend on the circumstances. What the government wants erodes both liberty and security. Your hypothetical scenario is one where you argue the erosion of security can be beneficial in a specific circumstance, but you've still sacrificed the liberty and privacy of everybody in the process.
As to your second example, I do not find your own inability to manage basic computer security to be a compelling argument in favour of sacrificing my own security. This is also an inapplicable scenario as you can reset your password through Apple. What is at issue here is the ability to brute force attack an iPhone, not voluntarily reset its password.
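To put rough numbers on the brute force point: a 4-digit PIN has only 10,000 possible values, so once the retry limit and escalating delays are stripped away, enumerating every candidate is trivial. A minimal sketch, with a hypothetical try_pin() standing in for the device's real check (which of course lives in firmware, not in Python):
Code:
import itertools

def try_pin(pin):
    # Hypothetical stand-in for the device's PIN check;
    # "7351" is an arbitrary example secret.
    return pin == "7351"

def brute_force():
    # Enumerate every 4-digit PIN, 0000 through 9999.
    for digits in itertools.product("0123456789", repeat=4):
        candidate = "".join(digits)
        if try_pin(candidate):
            return candidate
    return None

print(brute_force())  # "7351", found in at most 10,000 attempts
Without the ten-attempt wipe and the forced delays between guesses, that loop finishes in a fraction of a second. The PIN's security lives entirely in the rate limiting the government wants disabled.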
CHL, you're still trying to play along with the government argument that this is a one-use, one-issue case when the facts say otherwise. The FBI especially has been working for a while to undermine all security by demanding backdoors. The more appropriate way to frame your analogy is that the government wants a master key created for it to use on any Bramah lock. And you're just supposed to trust them both that they (1) won't abuse the master key and (2) won't lose control of it.
I think we both know how unlikely both of those scenarios are.
Quote:
Originally Posted by Resolute 14
This is also an inapplicable scenario as you can reset your password through Apple.
I think you are confusing the iTunes account password with the PIN unlock. If Apple had a way that I could request a reset of my PIN, this issue would be moot.
Quote:
Originally Posted by ken0042
Depends on the circumstances. Consider this scenario: a girl and a guy meet up in a bar and exchange phone numbers. They text each other back and forth and agree to meet up. The guy ends up killing the girl.
On the girl's cell phone is information that could lead to the identity of a suspect. However, because it has a 4-digit PIN to unlock the phone, that info isn't available to police.
Now that this case is public, a less reputable person knows they have less chance of being caught.
I'd even consider a more likely scenario. I normally use my fingerprint to unlock my phone, and I never reboot it so I am never prompted to enter my PIN. Something goes wrong with the fingerprint reader, and now I can't remember my PIN to unlock my phone. All of my pictures are now lost because I can't take it to the Apple store and have them reset my PIN.
In my second scenario, we had that happen this week at work. Fortunately the guy remembered his PIN on attempt #7. I would personally be more concerned with that happening than with a hacker or the gov't getting my info from my phone.
In the first instance, you're arguing for increasing the risk of crime against a huge portion of the population in exchange for helping solve a crime against a very small portion of the population. Let's face it: knowing that under a very specific set of circumstances my chances of getting caught for a crime are marginally higher isn't likely to alter my decision-making process.
So you've hugely increased one kind of risk in the hopes of marginally reducing another.
In your second example you've again introduced a huge involuntary risk to everyone else because a minority of people have chosen to improperly use their device. That is akin to saying that door locks should be really easy to pick (like, anyone can do it with a pointy stick) because you sometimes lock yourself out of the house.
"I don't know how to use a product properly" is no excuse for taking away the functionality and security for everyone else.
Quote:
Originally Posted by Resolute 14
And you're just supposed to trust them both that they (1) won't abuse the master key and (2) won't lose control of it. I think we both know how unlikely both of those scenarios are.
Agree with Resolute 100%. Leaks via Snowden, WikiLeaks, etc. should be enough to prove that the government cannot be trusted with the ability to access its citizens' devices at will.
See, there are a number of different levels to this. The one we're currently on is, "the government simply cannot be trusted to be responsible with this tool if it's given to them."
So, to address that, here's another wrinkle: how about if Apple designs a back door, but it remains confidential to Apple (that is, the government doesn't get it)? The only way to have a phone unlocked is to send it, physically, to Apple, and the FBI or other law enforcement have to get a fresh court order each time they want Apple to use it to get past iPhone security. Doesn't that resolve the above concern?
__________________ "The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
It's kind of funny; it used to be (maybe still is?) illegal to export certain types of encryption from the US (I guess to prevent the other guy from getting strong encryption). If those who want back doors in everything get their way, it'll probably become illegal to import strong encryption.
Quote:
Originally Posted by CorsiHockeyLeague
So, to address that, here's another wrinkle: how about if Apple designs a back door, but it remains confidential to Apple (that is, the government doesn't get it), and they have to get a fresh court order each time they want Apple to use it to get past iPhone security?
Why should we trust Apple less or more than we trust the government, both in terms of the capability to protect secrets and in terms of resisting the urge to abuse them for some gain? Though Apple having this doesn't really gain them anything, I guess.
Companies get hacked all the time as well, so that limitation doesn't help with regards to making sure this exploit doesn't make it into the wild.
Apple's a global company, what happens when other countries make the same request for use of the tool?
Well, first, it's obvious that Apple can make this tool if it wants to. It's their product. So whether or not they're trustworthy seems immaterial to me.
But aside from that, you're now moving the goal posts: is your concern that the government can't be trusted, or that if Apple develops this someone's going to get ahold of the records that show how they did it and reproduce their work?
I would suggest that a company that can make an impenetrable phone can at least keep its own information secure, if they really want to.
__________________ "The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
Quote:
Originally Posted by ken0042
I think you are confusing the iTunes account password with the PIN unlock. If Apple had a way that I could request a reset of my PIN, this issue would be moot.
Crap. You're right there. Though I stand by my original point: my security should not be compromised over your failure to remember your PIN.
Quote:
Originally Posted by CorsiHockeyLeague
Well, first, it's obvious that Apple can make this tool if it wants to. It's their product. So whether or not they're trustworthy seems immaterial to me.
Apple can make whatever product it likes, absolutely. But if it trades on its name for security, creating such a backdoor would destroy its reputation.
Quote:
But aside from that, you're now moving the goal posts: is your concern that the government can't be trusted, or that if Apple develops this someone's going to get ahold of the records that show how they did it and reproduce their work?
I don't think we're moving the goalposts so much as you're coming to realize there is more than one issue at play. I neither trust the government nor do I trust that such a tool could be kept secure.
Quote:
I would suggest that a company that can make an impenetrable phone can at least keep its own information secure, if they really want to.
The issue there is scale. One user with one iPhone has a small risk of breach at present, largely from social engineering or from failing to lock the phone in the first place. Move up to the level of a multinational company with thousands of employees, and the risk of exposure grows dramatically, despite the fact that Apple probably spends more money on security than many governments. All it takes is one dummy who doesn't follow internal security protocols, one flaw in one of the systems Apple uses, or even one rogue sysop or developer. It's pretty much the same risk per point, just with many more potential points of failure.
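You can put illustrative numbers on that scale problem. If each of n independent people or systems leaks with some small probability p over a given period, the chance of at least one breach is 1 - (1 - p)^n. The figures below are assumptions picked for illustration, not estimates of Apple's actual risk:
Code:
def breach_probability(p, n):
    # Chance that at least one of n independent points of failure
    # is compromised, each with per-period probability p.
    return 1 - (1 - p) ** n

print(breach_probability(0.001, 1))     # one careful user: ~0.1%
print(breach_probability(0.001, 1000))  # 1,000 employees/systems: ~63%
Even a tiny per-person risk compounds quickly once thousands of people and systems can touch the secret.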
Oh, I'm well aware that there are multiple issues at play. We haven't even discussed my main objection to the court order. I was just presenting a solution to the issue we were talking about at that point. So I take it my solution addressed that problem; that we don't trust the government with this tool. Right?
__________________ "The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
Quote:
Originally Posted by CorsiHockeyLeague
Well, first, it's obvious that Apple can make this tool if it wants to. It's their product. So whether or not they're trustworthy seems immaterial to me.
We're not talking about Apple just creating the tool; we're talking about Apple being forced to create the tool, and about what happens after (i.e., refusing to commit the significant development effort is one thing; declining to use the tool once it exists is another). You raised the scenario, though, where Apple was the one responsible for holding the tool, so their trustworthiness would seem to be relevant. I doubt the FBI would go for a scenario where the tool was held in someone else's hands that they don't control.
To me, if my data is sensitive enough, I'm not going to trust any third party, company or government.
Quote:
Originally Posted by CorsiHockeyLeague
But aside from that, you're now moving the goal posts: is your concern that the government can't be trusted, or that if Apple develops this someone's going to get ahold of the records that show how they did it and reproduce their work?
As mentioned, there are multiple concerns, and they're interdependent. We know that the government can't be trusted, and we know that governments and companies regularly experience leaks; there's zero way to guarantee this custom firmware wouldn't get out into the wild other than to not create it.
Quote:
Originally Posted by CorsiHockeyLeague
I would suggest that a company that can make an impenetrable phone can at least keep its own information secure, if they really want to.
False equivalence; the two are wildly different.
Quote:
Originally Posted by CorsiHockeyLeague
So I take it my solution addressed that problem; that we don't trust the government with this tool. Right?
Not really. Say that were the agreed-upon solution: when someone manages to get their hands on the tool and leaks it to the web, the FBI will be among the first in line to download it and try to make it work for them, but they'll be only one of a large number.
OK, I definitely get Resolute14's position now, as expressed in his last post. I disagree, because I think it's too extreme: I am willing to compromise some of that impenetrable security to allow for what I think is a necessary investigative function of the authorities. I'm also less cynical about the motivations of government and private actors than some others, and I'm willing to bear the risk of someone else eventually figuring out how it's done, a risk I judge to be smaller than perhaps you guys do. I think I understand where our perspectives part ways.
As for photon's post: first, your statement is exactly my view. The problem here is using the judiciary to coerce Apple into creating a new invention that the applicant government thinks would serve the public benefit; in my view, that's a step too far for the judiciary. As I said above, I disagree with your second statement (that the government can't be trusted and that leaks are inevitable, particularly for a company like Apple), as I think it's too cynical and the risks are smaller than you perceive. On your fourth point, I think you're saying that my proposal solved THAT problem (the "can't trust the government" problem) but doesn't solve the leak-related problems you and Resolute14 are concerned about, which, as I've said, I am less concerned about. Again, correct me if I'm wrong, but I think I'm understanding where we agree and where we don't.
Just to be clear, the comments Sam Harris made on his podcast, which I mentioned earlier, were the genesis of my "impenetrable locked room" hypothetical.
After I posted here to see what people thought, I sent him an e-mail last night setting out my position, which is below.
Quote:
Dear Dr. Harris,
I listened with interest to your most recent podcast offering. One of the news items you discussed is the current and ongoing controversy surrounding Apple's decision to oppose a court order requiring that it create a program that would allow its user to circumvent the currently impenetrable security on the iPhone.
For the record, my view is that when one is faced with a court order, one has three options: first, obey the court order; second, appeal the court order and obey it while that appeal is pending; third, appeal the court order and get the court to grant a stay pending the outcome of the appeal. Anything else should result in imprisonment. Regardless of the public sentiment on the substance of the issue, we cannot have individuals or organizations ignoring court orders they don't like – even if they have good reasons for not liking said orders. I suspect that Apple will go with option three.
As to the substance of the issue, I have a problem, in terms of the way power is delineated in society, with giving a Court jurisdiction to order a company to develop a new product that the government will want to use. Regardless of the social utility of that product, this is simply not something that a Court should be empowered to do. No doubt Apple could use its vast resources in a number of ways that would have great social utility, or more narrowly, would be eminently useful to law enforcement. However, I don't think the judiciary should be empowered to force them to use those resources in the way it thinks best. However free you think market actors should be, surely they should at best be limited by statutory regulations, and not dictated to by the whims of judges, whose raison d'être is to interpret and enforce those statutes.
This is different from asking Apple to supply data that it already has, or use its expertise to help the government achieve a goal (for example, if the security was already beatable if you happened to have the expertise to do it). What is being sought is the conscription by the Court of Apple's labour force to develop a new idea, meeting all the requirements of a patentable technology. If the government wants a private actor to do something like that, it should be seeking that collaboration contractually. Legislated regulation of future smartphone OSes being developed such that they are developed within pre-set parameters regarding security would be another reasonable way to approach the issue.
The other interesting point you made was with regard to the analogy of building an impenetrable room in your house. You asked, if it were possible to build such a room that no one could get into, should this be allowed? You thought not, and noted that this scenario has never arisen in reality. However, it has in fact arisen in real life – for the better part of the 19th century.
Current locks are incredibly easy to pick, and actually represent a massive downgrade from the quality of the most secure locks, developed about two centuries ago. In the 1800s, there was something of a technological race to achieve an unbreakable, impenetrable lock. The lock developed by Bramah (in 1777) was impenetrable for decades. This did in fact attract government attention; the government launched a competition, offering a £100 reward to anyone who could create an impenetrable lock that would also seize up and become un-openable (even using the key) if tampered with. Jeremiah Chubb managed it, and claimed the prize in 1818. This is actually remarkably analogous to the iPhone's security system, which effectively "seizes up" after ten incorrect password attempts.
So, to re-phrase your hypothetical question: should Bramah and Chubb have been forbidden by the British Government to sell the locks they'd designed? In my opinion, no. Nor should government have forced them to build an inferior lock. Through their ingenuity, they created a product that the public, and the government itself, wanted to buy. That's the sort of technological progress that society ought to encourage.
Ultimately, as one would expect, the advent of a better lock yielded a better lockpick, and someone eventually discovered how to beat these locks in 1851. An American locksmith named Hobbs, who made his living demonstrating to banks the weaknesses in their security systems, managed to beat both locks. The result of this was a permanent downward spiral in lock security. As discussed, modern locks are hilariously easy to pick – the average person could be taught to pick the bolt lock on your front door alarmingly quickly. Once the public became disillusioned with the notion that unbeatable security existed, there was no longer any margin in developing better technology: no one would pay for it anyway. Instead, the focus shifted to locks that were easy to mass produce cheaply – the modern pin and tumbler design.
What would the result be if the government insisted that every device with electronic security could be beaten owing to a built-in security flaw, or "back door"? Possibly the same progress chill as occurred above – once someone has figured out how to breach security, it's only a matter of time until that becomes common knowledge. Arguably, this is inevitable anyway, and forcing Apple to do it themselves is merely speeding matters along. Or maybe history won't repeat itself in this case, and technology companies will continue to develop better locks for future phones. In any event, I thought you might be interested to hear that your unrealistic hypothetical thought experiment has, in a sense, already come to pass. The actual real-world implications of these sorts of public policy decisions are likely impossible to predict.
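Incidentally, for the technically inclined, the detector-lock parallel from the letter is easy to model. This is a toy illustration only; neither Chubb's mechanism nor Apple's firmware actually works this way:
Code:
class DetectorLock:
    MAX_ATTEMPTS = 10  # the iPhone analogue: ten incorrect PIN entries

    def __init__(self, secret):
        self._secret = secret
        self._failures = 0
        self._seized = False

    def unlock(self, attempt):
        if self._seized:
            return False  # seized up: even the correct key is refused
        if attempt == self._secret:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._seized = True  # tampering detected; lock up for good
        return False

lock = DetectorLock("7351")
for _ in range(10):
    lock.unlock("0000")        # ten bad guesses...
print(lock.unlock("7351"))     # False: the right key no longer works
In both designs, the lock responds to tampering by destroying its own usability rather than yielding.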
Harris released another podcast today, wherein he lambasted some of the people who responded vociferously to his statements in the above video. I think he was overly hostile: basically he considers people like Resolute14 to be a part of a new religion of perfect privacy which regards all state actors with excessive suspicion, and would offer them no investigative tools to deal with what he thinks is a real problem in the form of jihadism. I guess I fall somewhere in the middle; I see Resolute's concerns, I just think they're overstated.
__________________ "The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno
Well, CHL, I will give you some credit. If you can look at the actions of both governments and their investigative bodies and still believe they should not be viewed with "excessive suspicion", then I will credit your ability to keep faith in the face of observable reality. Which, incidentally, is rather ironic given you are citing a guy best known for being an atheist in your defence.
But yes, I think you get the gist of my/our arguments. I am also in IT, with a level of responsibility in security (though it isn't my main job function). I see on a regular basis the ease with which both powerful tools and user ignorance/negligence can create significant consequences. Technologically, the government is demanding that a very dangerous weapon be created. That you think it is also a legal overstep really just underscores my belief that the government itself is a bad actor in this case.
I'm best known for being an atheist? I actually have spent most of my life believing in a higher power, and only in the past few years have shifted into agnosticism.
The point about you being in IT, while I don't think it makes a difference to the validity of your arguments, does immediately have me nodding "ah, that makes sense". I think the technology crowd have developed a very particular perspective on this, which some would attribute to expertise and others to bias. I just don't see it as that dangerous a weapon, depending on how the issue is handled. The majority of my personal information is on my desktop and laptop computers, not my phone, and I have relatively minimal security concerns there in spite of the lack of a perfect lock.
As for a legal overstep, it's sort of procedural. As I said, I'd be fine if the appropriate legislature wanted to pass a law that said, "all OSes from now on must have the following features to allow law enforcement to investigate the contents of smartphones and tablets". If a bunch of people oppose said law, they can make it an election issue and vote those legislators out of office. It's a matter of venue; I don't think this should be within the jurisdiction of the courts. So it's not "the government is a bad actor"; that's not what you should get out of my position. It's more that the Court shouldn't have granted the order, and Apple should prevail on appeal.
All of that being said, my beliefs in this area are as in all things defeasible, which I'd rather people "best know" me for than atheism. A person's non-belief in God says next to nothing about them.
__________________ "The great promise of the Internet was that more information would automatically yield better decisions. The great disappointment is that more information actually yields more possibilities to confirm what you already believed anyway." - Brian Eno