Inb4 Apple implements a checksum-fail anti-defeat mechanism in the key store, whereby if the firmware's checksum changes without the device being unlocked and the upgrade being authorised, it'll destroy the private key.

    That's what I'd be doing right now, if I were Apple
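A minimal sketch of that idea (purely hypothetical — the `KeyStore` class, SHA-256 checksum, and software-visible key are assumptions for illustration; Apple's actual Secure Enclave design is not public at this level of detail):

```python
import hashlib

class KeyStore:
    """Hypothetical anti-defeat key store: if the firmware checksum changes
    without an authorised, owner-unlocked upgrade, the private key is wiped."""

    def __init__(self, firmware: bytes, private_key: bytes):
        self.private_key = private_key
        # Record the checksum of the firmware trusted at provisioning time.
        self.trusted_checksum = hashlib.sha256(firmware).hexdigest()

    def authorise_upgrade(self, new_firmware: bytes, device_unlocked: bool) -> None:
        # A new checksum is only recorded when the owner has unlocked the device.
        if device_unlocked:
            self.trusted_checksum = hashlib.sha256(new_firmware).hexdigest()

    def boot(self, firmware: bytes):
        # On boot, any checksum mismatch destroys the key, making the
        # encrypted data unrecoverable even if the new firmware runs.
        if hashlib.sha256(firmware).hexdigest() != self.trusted_checksum:
            self.private_key = None
        return self.private_key
```

Under this scheme, side-loading modified firmware (as the court order would require) would itself erase the only key that can decrypt the data.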
    (Original post by Good bloke)
    They are not being asked to make that access generic at all.
    Please stop spouting this nonsense. If you think for one moment that the FBI won't ask for more iPhones to be unlocked in the future, you must be seriously delusional.
    (Original post by Dez)
    Please stop spouting this nonsense. If you think for one moment that the FBI won't ask for more iPhones to be unlocked in the future, you must be seriously delusional.
Are you able to comprehend that there is a difference between unlocking all phones and unlocking one phone under a court order? If there are more requests for court orders then each will be dealt with on its own merits, and the FBI obviously cannot get into any old phone without (a) the software (which won't be installed on any other phone) and (b) a court order.

    The originator of the request may be the FBI, but the order for the action comes from a court, after due process, not the FBI.
    (Original post by Good bloke)
Are you able to comprehend that there is a difference between unlocking all phones and unlocking one phone under a court order? If there are more requests for court orders then each will be dealt with on its own merits, and the FBI obviously cannot get into any old phone without (a) the software (which won't be installed on any other phone) and (b) a court order.

    The originator of the request may be the FBI, but the order for the action comes from a court, after due process, not the FBI.
You're missing his point - it may just be this phone they want access to right now (and the cynic in me would suggest they've already got it, since it's really not that hard to get past a 4-digit code - I dare say this is more about setting themselves up for the future), but once they've got it they've got a backdoor into any iPhone and a legal precedent to force companies to hand over our confidential data.

    (Original post by Stiff Little Fingers)
they've got a backdoor into any iPhone and a legal precedent to force companies to hand over our confidential data.
They already have the legal precedent they need, and Apple itself has said it would have no compunction in handing over confidential data (so what price your concerns on that score?). This is not a request from the FBI: this is a court order, and the phone owner is a convicted terrorist.

Interestingly, this is a change of heart on the part of Apple: it has previously complied with similar orders, so principles are not all that important to the company.

http://www.bbc.co.uk/news/technology-34647704

The FBI does not already have access to this particular phone. Apparently it is set up so that the phone's data is erased after too many unsuccessful attempts to unlock it, and attempting to guess passwords would risk losing that data.
    (Original post by Good bloke)
They already have the legal precedent they need, and Apple itself has said it would have no compunction in handing over confidential data (so what price your concerns on that score?). This is not a request from the FBI: this is a court order, and the phone owner is a convicted terrorist.

Interestingly, this is a change of heart on the part of Apple: it has previously complied with similar orders, so principles are not all that important to the company.

http://www.bbc.co.uk/news/technology-34647704

The FBI does not already have access to this particular phone. Apparently it is set up so that the phone's data is erased after too many unsuccessful attempts to unlock it, and attempting to guess passwords would risk losing that data.
Has he actually been convicted of terror offences? I thought he was dead.

    In any case, why should Apple be forced to build something that attacks/weakens the security of their product to help the FBI in an investigation? People aren't going to want to trust Apple enough to use their products if they cooperate with law enforcement on this level. Handing over data they already have on production of a valid warrant is one thing, but it is wrong to compel a company to make something that directly attacks the security of their own system.
    (Original post by KvasirVanir)
Has he actually been convicted of terror offences? I thought he was dead.

    In any case, why should Apple be forced to build something that attacks/weakens the security of their product to help the FBI in an investigation? People aren't going to want to trust Apple enough to use their products if they cooperate with law enforcement on this level. Handing over data they already have on production of a valid warrant is one thing, but it is wrong to compel a company to make something that directly attacks the security of their own system.
    Quite. Where does this ******** stop?

    "Hi Verisign, your country demands of you that you provide us with a Root CA signed Google Inc TLS certificate for our latest classified anti-terror program. It's a critical matter of national security."
    (Original post by KvasirVanir)
Has he actually been convicted of terror offences? I thought he was dead.
You are right, my mistake. This was the radicalised San Bernardino couple who were shot dead after their rampage. There is no doubt about their guilt, though.
    (Original post by Good bloke)
You are right, my mistake. This was the radicalised San Bernardino couple who were shot dead after their rampage. There is no doubt about their guilt, though.
I agree, no doubt at all, but they aren't convicted terrorists. And even if they were, I don't think Apple should be forced to build an OS that weakens their product in this manner. It would destroy trust in the company; people aren't going to want to buy Apple phones if they can be made insecure and accessible to law enforcement like this.
    (Original post by Good bloke)
They already have the legal precedent they need, and Apple itself has said it would have no compunction in handing over confidential data (so what price your concerns on that score?). This is not a request from the FBI: this is a court order, and the phone owner is a convicted terrorist.

Interestingly, this is a change of heart on the part of Apple: it has previously complied with similar orders, so principles are not all that important to the company.

http://www.bbc.co.uk/news/technology-34647704

The FBI does not already have access to this particular phone. Apparently it is set up so that the phone's data is erased after too many unsuccessful attempts to unlock it, and attempting to guess passwords would risk losing that data.
Just going through the ten thousand potential codes (0000 to 9999) is not the only way of getting in, nor is it even the most efficient in the case where there isn't a guess restriction. If the FBI are less competent with computers than a 14-year-old with access to Google, that's their own bloody problem. It doesn't mean that Apple (and, as a result, every other tech company) should be forced to make their software vulnerable to data harvesting, given the inevitable fallout when other groups (both hostile nations and criminal gangs) get their hands on it - and they will; the likes of Assange have shown that governments can't keep things completely under wraps if someone is after them.

    So, is the complete loss of privacy, severe risk to national security and high risk of identity fraud worth it for the illusion of safety?
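The keyspace arithmetic behind that point is easy to check. A quick sketch (illustrative numbers only, assuming a uniformly random 4-digit PIN and the erase-after-10-failures setting mentioned earlier in the thread):

```python
# A 4-digit PIN has only 10,000 candidates, so unthrottled guessing is trivial.
codes = [f"{i:04d}" for i in range(10_000)]

# With no retry limit, the average number of guesses to hit a uniformly
# random PIN is (N + 1) / 2:
average_guesses = (len(codes) + 1) / 2  # 5000.5

# With erase-after-10-failures enabled, blind guessing succeeds with
# probability only 10/10000 before the data is wiped:
p_success = 10 / len(codes)  # 0.001
```

Which is exactly why the FBI wants the retry limit removed: the PIN itself is weak, and the auto-erase policy is the only thing making it strong.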


    (Original post by TheArtofProtest)
    Do you believe Apple should comply with a Court Order issued by Chinese authorities or any authority around the world that demands access to a suspect's iPhone or device?
    I would be surprised if it had not already happened many times. Compliance with local law is part and parcel of doing business in any country.
    (Original post by Good bloke)
    Apple should comply.

    1. This is a convicted terrorist's phone; it is likely to hold data that could save my life (by allowing the FBI to prevent another outrage); 2. the requested fix can only work on this phone (if it is a feasible fix in the first place). There is no public downside to complying, therefore.

    Apple's stance is directly comparable to a bomb disposal expert refusing to defuse a booby-trap on the grounds that doing so would break the privacy of the person whose front door is booby-trapped.

3. I won't be buying any more Apple products.
1. A dead terrorist. There is nothing to gain from getting that phone unlocked. If that phone has the phone numbers of other criminals, it is obvious the criminals will have changed their real numbers by now. Unlocking that phone won't save any lives, any more than decrypting a (hypothetical) recently found diary by Hitler would save any lives.
2. You obviously haven't read Apple's letter. The software to unlock the phone (it's not a fix - there is nothing broken, watch your vocabulary) may only work on that phone, but tweaking the software to unlock any phone could be trivial. Plus, this sets a precedent for any powerful country to do the same and force Apple to provide them with tools to break into any given phone in the name of security. Once you go down the slippery slope, the path of least resistance is the only possible path. For us, that means less privacy.

3. Google will be happy to hear that. They will sell your privacy to the highest bidder.
    (Original post by Good bloke)
    But nobody will be watching them do it, and it isn't just a question of opening a safe: lives are potentially at risk.

The only people involved would be the Apple software team, who would divulge nothing. They could unlock the phone, dump its data, then lock it again and destroy their code for all the FBI cares.
    Naive.

You don't need to watch them do it. The tool they would provide to the FBI would itself be the mechanism for doing it.

Reverse-engineer the tool and you have a nice toy that can literally hack into any Apple phone, causing massive economic damage to Apple by turning its strength into a weakness.

    P.S. Stop being dramatic. The criminals are dead. They can't communicate with anyone. Otherwise, you could say the same of anyone who is a relative of a former Nazi officer or a dead criminal.

Anyway, this "The only people involved would be the Apple software team" claim is naive. You also need testers. And even if they did not want to say anything, there is nothing stopping the FBI or any other intelligence body in the world from hacking into any of these people's phones, emails, etc. and getting valuable information about the unlocking mechanism, or just "paying them a visit" to force them to reveal all the info they have on it.

    It seems you don't understand how software works.
    (Original post by Good bloke)


    How is anyone else's privacy in danger if Apple approaches it as I suggested?
    How is anyone else's privacy in danger if Apple doesn't?
    (Original post by GoldenFang)
    What makes a phone different from an address book? Why should the government be able to read your address book with a lawfully issued warrant, but not the data from your phone?
This sets a precedent for future phone-security violations, any of which could result in someone learning how to break a phone's security and doing it at massive scale for malicious purposes. Plus, this could very well destroy Apple's reputation.

Reading a single address book will never let you read any other address book out there or collect data on them. "Reading" the data in one locked phone could let you read the data in other locked phones and collect data on them. People around the world - politicians and heads of state, as well as children and stalked or abused women - would easily become the top victims of this.
    (Original post by Juichiro)
There is nothing to gain from getting that phone unlocked. If that phone has the phone numbers of other criminals, it is obvious the criminals will have changed their real numbers by now.
I believe they are looking for photos and text messages rather than numbers, which would be potentially useful, but I could be wrong.
    (Original post by Unkempt_One)
English translation of potentially: not. Compared to the risks that creating the software would impose on customers' security, there is a very low probability that information acquired from his phone would lead to preventing a terrorist attack.

    "The only people involved would be the Apple software team who would divulge nothing. They could release the phone, dump its data and then close it again and destroy their code for all the FBI cares."

You're confusing the ideal scenario with the actual risk-benefit landscape, seemingly a confounding problem for over-zealous law enforcement nowadays. Unless there's a 100% foolproof way to ensure that the software is not leaked, the risks are completely unacceptable. Also, it's not simply about this individual case but also the legal precedent set by law enforcement being allowed to break encryption to get access to someone's phone. Apple must challenge this.
Yes, Good bloke is unable to think rationally, preferring instead to consider dramatic scenarios where dead terrorists' phones put lives at risk.
    (Original post by Good bloke)
    Having managed the development of software in secure environments, I can assure you that a properly selected and tight group in such an environment would pose next to no risk to the public. This is scare-mongering of the first water: we aren't talking of hacking into a commercial website here - there would be no need to be on a network at all.
Maybe you could show us actual proof that there is no chance at all of a leak happening. Otherwise, your assurances are worth nothing, and certainly far less than Apple's point.
    (Original post by Juichiro)
Maybe you could show us actual proof that there is no chance at all of a leak happening. Otherwise, your assurances are worth nothing, and certainly far less than Apple's point.
    Nothing in this life is 100% guaranteed, other than death. My assurances are worth nothing anyway.

    However, if Apple is employing untrustworthy people in such a significant privacy-related area, hacks would most likely already have escaped from within Apple into the wild.
    (Original post by Dez)
    Please stop spouting this nonsense. If you think for one moment that the FBI won't ask for more iPhones to be unlocked in the future, you must be seriously delusional.
+1. This is one of the key points of Apple's letter.

    (Original post by Good bloke)
Are you able to comprehend that there is a difference between unlocking all phones and unlocking one phone under a court order? If there are more requests for court orders then each will be dealt with on its own merits, and the FBI obviously cannot get into any old phone without (a) the software (which won't be installed on any other phone) and (b) a court order.

    The originator of the request may be the FBI, but the order for the action comes from a court, after due process, not the FBI.
Technically, the mechanism to unlock one phone might also be the mechanism for unlocking most Apple phones. It only takes one leak from a hypothetical Apple hack of this phone, or a future hypothetical hack of any other Apple phone, to put the safety of many people at risk. After all, software is just information: once you have a copy of the leak, massively distributing it is trivial. Plus, this sets a precedent for governments around the world to force tech companies to break their security open to get data from the phones of people they don't "like". In our case it is a terrorist, but in [insert country with fragile/absent democracy] it could be political dissidents or bloggers.
 
 
 
Updated: March 17, 2016