Apple’s privacy myth does not match reality

In 2021, Apple has portrayed itself as the world’s privacy superhero. Its CEO insists that privacy has “always been at the core of our work…from the beginning” and that it is a “basic human right.” Its latest ads all but boast that privacy and the iPhone are the same thing. Last spring, a software update (iOS 14.5) let users refuse to have apps track their activity across the internet, and it proved something important: when people don’t have to fight to control their information, they choose privacy. Only about 25 percent of users now opt in, where nearly 75 percent previously allowed their information to be used for targeted advertising. As Apple prepares to add further privacy protections in iOS 15, due out next month, it continues to position itself as a force that might just slow the growth of Facebook and of surveillance capitalism more broadly. Unfortunately, Apple’s privacy promises don’t tell the full story.

The company’s most shocking privacy failure may also be one of its most profitable features: iCloud. For years, the cloud-based storage service has further entrenched hundreds of millions of Apple customers in its ecosystem. It is an internet-enabled extension of your hard drive, built to effortlessly offload photos, movies, and other files to an invisible backup drive. Unfortunately, iCloud makes it almost as easy for the police to access all of those files.

In the past, Apple has insisted that it will not weaken the security of its own devices by building in a backdoor. But for older devices, that door is already built. According to Apple’s law enforcement manual, anyone running iOS 7 or earlier is out of luck if they land in the crosshairs of the police or ICE: with a simple warrant, Apple will unlock the phone. That might seem par for the course in Silicon Valley, except that most tech giants’ CEOs have not previously claimed that being compelled to unlock their devices would endanger the data security of “hundreds of millions of law-abiding people… setting a dangerous precedent that threatens everyone’s civil liberties.” The security flaws that make this unlocking possible were eventually fixed in later operating systems, but the service remains available for the older ones.

Since 2015, Apple has drawn the ire of the FBI and the Justice Department with each new round of security enhancements, for building devices so secure that even Apple can’t crack them. But the dirty little secret of nearly all of Apple’s privacy promises is that there has always been a backdoor. Whether it’s iPhone data from Apple’s latest devices or iMessage data that the company endlessly touts as “end-to-end encrypted,” all of it is vulnerable when you use iCloud.

Apple’s seemingly simple design choice to hold on to iCloud encryption keys has complicated consequences. It doesn’t do this with your iPhone (despite government requests). It doesn’t do this with iMessage. Some of the benefits of making an exception for iCloud are obvious: if Apple didn’t hold the keys, account holders who forgot their passwords would be out of luck. Truly secure cloud storage means the company itself is no better placed than a random attacker to reset your password. But retaining that power gives Apple the terrible ability to hand over your entire iCloud backup when ordered to.

iCloud data isn’t limited to photos and files; it also includes location data, such as from “Find My” or AirTags, Apple’s controversial new tracking devices. With a single court order, every Apple device you own can be turned against you, weaponized into a surveillance system. Apple could certainly fix this. Plenty of companies offer secure file-sharing platforms. The Swiss firm Tresorit provides true end-to-end encryption for its cloud service. Tresorit users still see their files upload to the cloud in real time and sync across multiple devices. The difference is that the user holds the encryption keys, not Tresorit. That does mean users who forget their passwords also lose their files. But as long as a provider retains the power to recover or change your password, it retains the power to hand your information to the police.
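To make the key-custody distinction concrete, here is a minimal Python sketch of the user-held-key model described above: the encryption key is derived from the user’s passphrase on their own device, so the server only ever stores ciphertext and a salt. The function names, the PBKDF2/Fernet choices, and the parameters are illustrative assumptions, not Apple’s or Tresorit’s actual implementation.

```python
# A minimal sketch of client-side ("end-to-end") encrypted cloud storage.
# Illustrative only; not Apple's or Tresorit's real design.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    """Derive a symmetric key from the user's passphrase, on the user's device."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase.encode()))


def encrypt_for_upload(path: str, passphrase: str) -> tuple[bytes, bytes]:
    """Encrypt a file locally; only the ciphertext and salt ever reach the server."""
    salt = os.urandom(16)
    key = key_from_passphrase(passphrase, salt)
    with open(path, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    return salt, ciphertext  # the server stores these, never the key


def decrypt_after_download(salt: bytes, ciphertext: bytes, passphrase: str) -> bytes:
    """Recover the file; a forgotten passphrase means the data is gone for good."""
    key = key_from_passphrase(passphrase, salt)
    return Fernet(key).decrypt(ciphertext)
```

The trade-off in this sketch is exactly the one described above: a forgotten passphrase means the files are unrecoverable, but there is also no key for the provider to hand over in response to a court order.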

The threat will only grow. Under a new set of content-moderation tools, Apple will scan iCloud uploads and iMessage communications for suspected child sexual abuse material (CSAM). Where the company previously searched only photos uploaded to iCloud for known CSAM, the new tools can turn any photo and text you send or receive against you. Stopping CSAM is a laudable goal, but when the AI fails, the consequences can be disastrous for the wrongly accused. And even when the software works as intended, it can be deadly. As Harvard Law School lecturer Kendra Albert pointed out on Twitter, these features “can get queer kids kicked out of their homes, beaten, or worse.” Software launched in the name of “child safety” may pose a lethal threat to LGBTQ+ children outed to homophobic and transphobic parents. Just as chilling, the tools used to track CSAM today could be trained to flag political and religious content tomorrow.
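For a sense of how hash-based scanning of this kind works, and why critics worry it could be repointed, the toy Python sketch below reduces an image to a simple perceptual hash and flags it if the hash sits near any entry in a match database. Apple’s actual system (NeuralHash combined with a private set-intersection protocol and human review) is far more elaborate; the average-hash function, the threshold, and the database here are purely illustrative assumptions.

```python
# A toy perceptual-hash matcher, sketched to show the general shape of
# hash-based content scanning. Not Apple's NeuralHash pipeline.
from PIL import Image


def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image, grayscale it, and set one bit per pixel above the mean."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    """Count differing bits between two hashes."""
    return bin(a ^ b).count("1")


def is_flagged(path: str, hash_database: set[int], threshold: int = 5) -> bool:
    """Flag an upload if its hash is 'close enough' to any database entry."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in hash_database)
```

Nothing in `is_flagged` knows why a hash is in the database, which is the crux of the concern: the same pipeline could just as easily be loaded with hashes of political or religious material.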




