    (Original post by Unkempt_One)
    So I assume your experience means nothing when applied to a large commercial company like Apple.
Why would you assume that? A secure development environment is a secure development environment, no matter what the size of the company owning or using it, as long as it is properly managed and set up.

(Original post by Good bloke)
    Having managed the development of software in secure environments, I can assure you that a properly selected and tight group in such an environment would pose next to no risk to the public. This is scare-mongering of the first water: we aren't talking of hacking into a commercial website here - there would be no need to be on a network at all.
    Yes… because giving a backdoor to the United States Government would have no repercussions at all… :rolleyes:

(Original post by iEthan)
    Yes… because giving a backdoor to the United States Government would have no repercussions at all… :rolleyes:
That is not necessarily what would happen. Apple could create a backdoor, use it to obtain the data, and then destroy it again. And :rolleyes: right back at you.

I think Apple should comply. The police could need vital information about terrorist attacks, for example.

Never liked Apple; they're on a different planet... and if you don't follow their planet they won't accept you on it!

(Original post by Good bloke)
    That is not what need necessarily happen. Apple could create a backdoor, use it to obtain the data, and then destroy it again. And :rolleyes: right back at you.
Why would the security services hand the device over for Apple to take away, extract the data privately, and then return it to them? It wouldn't happen like that. :rolleyes: dutifully returned again

(Original post by iEthan)
Why would the security services hand the device over for Apple to take away, extract the data privately, and then return it to them? It wouldn't happen like that. :rolleyes: dutifully returned again
    If that were not acceptable then an FBI representative - or a judge, even - could babysit the phone while the transfer took place. Apple would not even need possession of it while the software was written and tested, and the data extraction itself would take only moments.

    Your objections are spurious.

(Original post by Good bloke)
    If that were not acceptable then an FBI representative - or a judge, even - could babysit the phone while the transfer took place. Apple would not even need possession of it while the software was written and tested, and the data extraction itself would take only moments.

    Your objections are spurious.
How on earth do you know how long it would take to decrypt the files on the device? My objections are well founded. To do this, and to give the US Gov't of all organisations the opening to potentially do it again, is absolutely the worst idea I have ever heard. My objections are not false :rofl:, they are my opinion, and my opinion is no less valid than yours. Play nicely :giggle:

(Original post by iEthan)
    How on earth do you know how long it would take to decrypt the files on the device?
    Just how large do you think the memory on the phone is? Does it really matter if it takes half an hour? Or five hours? No. The extraction time won't be days or weeks and can be monitored throughout by an appropriate adult.
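For a sense of scale, here's a back-of-envelope sketch. Both figures are assumptions for illustration - the model of phone in question reportedly shipped in capacities up to 32 GB, and ~25 MB/s is a conservative guess at sustained USB 2.0 throughput - but the conclusion holds even if either is off by a fair margin:

```python
# Back-of-envelope: rough time to copy a full flash image off the handset.
# capacity_gb and throughput_mb_s are illustrative assumptions, not specs.
capacity_gb = 32        # largest capacity the model in question shipped with
throughput_mb_s = 25    # conservative sustained USB 2.0 transfer rate

minutes = capacity_gb * 1024 / throughput_mb_s / 60
print(f"~{minutes:.0f} minutes")  # ~22 minutes
```

Even at a tenth of that throughput it is hours, not weeks.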

(Original post by Good bloke)
Why would you assume that? A secure development environment is a secure development environment, no matter what the size of the company owning or using it, as long as it is properly managed and set up.
    "As long as it is properly managed and set up". There's the risk right there. I'd be rich if I had a dollar for every time supposedly foolproof security has been broken by individual incompetence. I'm not trying to assume anything, I think it's highly probable they could make it work without exposing broader security issues, indeed, if it is managed correctly there will be no issues. But the best way to ensure the integrity of the product and their reputation is to never create the software in the first place. Also, the precedent it sets just compounds the risk over time.

(Original post by Good bloke)
    But nobody will be watching them do it, and it isn't just a question of opening a safe: lives are potentially at risk.

    The only people involved would be the Apple software team who would divulge nothing. They could release the phone, dump its data and then close it again and destroy their code for all the FBI cares.
The existence of the "backdoor" creates a defect that, once created, cannot be uncreated.

This creates multiple problems:
1) Protecting the confidentiality of that backdoor becomes the most onerous job in the whole operation - guarding it against insider threats and espionage - and any leak exposes everyone's data to dragnet surveillance by "friendly" and unfriendly intelligence services alike.
2) Doing it once sets a dangerous precedent that will cost Apple (and the rest of the technology industry) in several dimensions: financially, in goodwill and in trust.

If your argument is that the potential protection of a potential life against a potential terrorist attack that may or may not exist is a price worth paying with our privacy, then I think you need to reframe your thinking.

(Original post by Good bloke)
    How? No information useful to hackers would leave Apple.
    You can guarantee that? I certainly couldn't, and it's my job to mitigate these sorts of risks.

    (Original post by Good bloke)
    Having managed the development of software in secure environments, I can assure you that a properly selected and tight group in such an environment would pose next to no risk to the public. This is scare-mongering of the first water: we aren't talking of hacking into a commercial website here - there would be no need to be on a network at all.
Properly selected people... Right, so we're going to vet all of these people to the nth degree, as we do in government with Developed Vetting or TS/SCI+Poly in the US? Because that's never failed to identify someone with ulterior motives... The merest whiff that this capability exists would instantly pique the interest of those who would find it valuable, and I can almost guarantee that efforts would be made, covertly, to acquire it, even from off-band networks.

    (Original post by Good bloke)
Why would you assume that? A secure development environment is a secure development environment, no matter what the size of the company owning or using it, as long as it is properly managed and set up.
    All you're doing is making the environment less vulnerable to network attack. The human factor is by FAR the biggest risk here.

I think if Apple don't comply then new laws are needed to force them to comply, which will have a much worse effect on privacy than Apple complying in one-off cases like this.

Apple telling the FBI they can't investigate the phone is like me telling the police they can't search my home because they don't have a key. It doesn't work that way: if they don't have a key it is within their rights to break the door down. Sure, searching my home is an invasion of privacy, but so is much of the work the police do to investigate crime. They have these powers for a reason.

I just hope the directors at Apple are held to account and tried in a court of law as accomplices if a terrorist attack happens that could have been prevented through access to data on that phone.

(Original post by Sephiroth)
I think if Apple don't comply then new laws are needed to force them to comply, which will have a much worse effect on privacy than Apple complying in one-off cases like this.

Apple telling the FBI they can't investigate the phone is like me telling the police they can't search my home because they don't have a key. It doesn't work that way: if they don't have a key it is within their rights to break the door down. Sure, searching my home is an invasion of privacy, but so is much of the work the police do to investigate crime. They have these powers for a reason.

I just hope the directors at Apple are held to account and tried in a court of law as accomplices if a terrorist attack happens that could have been prevented through access to data on that phone.
    Authoritarianism in the 21st century.

I never thought I'd be wholeheartedly backing a big multinational corporation, but fair play.

(Original post by Sephiroth)
Apple telling the FBI they can't investigate the phone is like me telling the police they can't search my home because they don't have a key. It doesn't work that way: if they don't have a key it is within their rights to break the door down.
Well, if the FBI don't have the code they are well within their rights to try to crack it. If they can't, then tough - just like if the police couldn't (for some reason) break open your door.

(Original post by Sephiroth)
if they don't have a key it is within their rights to break the door down.
    No one is stopping the FBI from trying to break in.


    (Original post by Sephiroth)
I just hope the directors at Apple are held to account and tried in a court of law as accomplices if a terrorist attack happens that could have been prevented through access to data on that phone.
For what crime? Don't try blaming Apple because a buyer of an Apple product was a terrorist, and the mere possibility that data on the phone might have prevented an attack doesn't automatically justify this kind of measure to break into it.

My worry is that if Apple do this, it will be a foot in the door for them to be required to do the same thing in more and more cases, until the police end up using this software in every case where they have seized a phone. That puts the security and privacy of EVERY Apple user at risk, because it could always be YOU it gets used against.

Right now one of Apple's big selling points is data security. If they jump into bed with a government to compromise that security then they will lose that reputation and people will go elsewhere for proper security.

    I'm glad that Apple seem to take data security so seriously.

(Original post by Sephiroth)
Apple telling the FBI they can't investigate the phone is like me telling the police they can't search my home because they don't have a key.
You're entirely missing the point. The FBI are free to investigate; what Apple are refusing to do is build weaknesses into their own security systems - and rightly so. The amount of important data stored on an iPhone (or indeed any smartphone) opens up the possibility of a lot of identity theft, as well as the loss of banking details, should anyone less than savoury get access to the weakness - and they will, if it's there. So the simple question is: do you think the collapse of online shopping and banking, a massive risk of identity theft and the complete loss of privacy is worth it to make the FBI's job that little bit easier?

    Posted from TSR Mobile

(Original post by Stiff Little Fingers)
what Apple are refusing to do is build weaknesses into their own security systems
No. They are refusing, despite a court order, to help the FBI gain access to one particular phone, used by a dead terrorist. They are not being asked to make that access generic at all.

(Original post by Sephiroth)
I think if Apple don't comply then new laws are needed to force them to comply, which will have a much worse effect on privacy than Apple complying in one-off cases like this.

Apple telling the FBI they can't investigate the phone is like me telling the police they can't search my home because they don't have a key. It doesn't work that way: if they don't have a key it is within their rights to break the door down. Sure, searching my home is an invasion of privacy, but so is much of the work the police do to investigate crime. They have these powers for a reason.

I just hope the directors at Apple are held to account and tried in a court of law as accomplices if a terrorist attack happens that could have been prevented through access to data on that phone.
    This is a badly thought out argument.

The example you provide is not analogous to this situation. The Bureau isn't asking Apple for a skeleton key to unlock the door - that doesn't exist. What they're asking for is to flash the phone with a fundamentally weakened version of the operating system, one that doesn't currently exist and would need to be specially developed from source for this purpose.
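To make concrete what "fundamentally weakened" means here, a minimal sketch of the retry protections a stock build enforces, based on public descriptions of iOS at the time - the exact delay values and the wipe threshold below are assumptions for illustration, not Apple's actual code. The request is essentially for a build in which both protections are gone and guesses can be submitted electronically at speed:

```python
# Sketch of the passcode-retry protections a stock build enforces. Delay
# values and the wipe threshold are assumed for illustration only.
ESCALATING_DELAY_S = {5: 60, 6: 300, 7: 900, 8: 900, 9: 3600}
WIPE_THRESHOLD = 10  # with "Erase Data" on, the tenth failure destroys the keys

def handle_failed_guess(failures_so_far: int) -> str:
    """Return what the stock build does after one more failed passcode guess."""
    failures = failures_so_far + 1
    if failures >= WIPE_THRESHOLD:
        return "erase data"                      # protection 1: auto-wipe
    delay = ESCALATING_DELAY_S.get(failures, 0)
    return f"lock out for {delay}s"              # protection 2: escalating delay
```

Strip both out and allow guesses over a cable, and the only thing standing between a seized phone and its contents is the size of the passcode keyspace.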

And to what end? What if Apple do capitulate, or get forced to do it? What kind of tinpot terrorist organisation is going to "stick to the plan" when a person who apparently knows, in great detail, the entire plan has been arrested by a government agency with an alleged history of complicity in methods of torture and unlawful interrogation? Anyone with a bit of common sense would have to assume that the entire operation is compromised and that every element of the plan - people, method, location etc. - is likely now known in detail by the enemy. So what's the first thing you're going to do? Abort, go underground, burn everything and start again. This is counter-intelligence 101 - it's not rocket science. Even if the Bureau got hold of this data, its likely intelligence value at that point would be practically nil.

To think that this request is a "one-off" is frankly naïve - you don't get a capability like this once and say "never again"; you wheel it out repeatedly: "you did it for this terrorist, now you should do it for this terrorist and this bad guy, and while you're at it, all of these criminals' phones..."

    And as for the notion of holding the executives of Apple Inc to account if there is a terrorist attack, how can you prove that their "non-compliance" was the root cause of that intelligence failure? Good luck prosecuting that. Without the contents of the phone available, it's not possible to make that claim.

And besides, if the Supreme Court forces big tech companies to unlock their devices on demand, what are the bad guys going to do? Change their tactics. Everyone else suffers while the bad guys keep doing their thing, just in a different (and uncompromised) way.

    This is a cat and mouse game that traditional law-making cannot outmanoeuvre and any attempt to do so has a high price for the liberty of law-abiding citizens while delivering practically no benefit to our security. It's a fallacy to think that curtailing people's rights and endlessly making things illegal are going to stop people that are willing to take the most extreme illegal courses of action to achieve their objectives.

(Original post by Good bloke)
No. They are refusing, despite a court order, to help the FBI gain access to one particular phone, used by a dead terrorist. They are not being asked to make that access generic at all.
They're being asked to build software that opens an iPhone up to brute-force attacks - the fact that the FBI are looking at the phone of one of the San Bernardino shooters doesn't mean it isn't the building of a global weakness.
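To put rough numbers on "opens up to brute-force attacks": a minimal sketch, assuming the widely reported ~80 ms hardware key-derivation time per guess - an assumed figure from public descriptions of the handset's crypto, and roughly the one delay no software update can remove - once the retry limits and the wipe rule are gone:

```python
# Worst-case brute-force times for numeric passcodes once the retry limits
# and the wipe-after-ten-failures rule are gone. The ~80 ms per guess figure
# is an assumed hardware key-derivation delay, not something measured here.
PER_GUESS_SECONDS = 0.08

for digits in (4, 6):
    keyspace = 10 ** digits                      # all possible numeric codes
    worst_case_h = keyspace * PER_GUESS_SECONDS / 3600
    print(f"{digits}-digit passcode: {keyspace:,} guesses, "
          f"worst case ~{worst_case_h:.1f} h")
# 4-digit passcode: 10,000 guesses, worst case ~0.2 h
# 6-digit passcode: 1,000,000 guesses, worst case ~22.2 h
```

And the software wouldn't care whose phone it was flashed onto - keeping it pointed at one handset is a policy promise, not a technical property.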

    Posted from TSR Mobile
 
 
 