Apple defends its new anti-child abuse technology in response to privacy concerns

Following this week's announcement, some experts expect Apple to soon announce that iCloud will be encrypted. If iCloud is encrypted but the company can still identify child abuse material, hand evidence to law enforcement, and suspend offenders' accounts, that may relieve some of the political pressure on Apple executives.

It won't relieve all the pressure: most governments that want Apple to do more about child abuse also want more action on content related to terrorism and other crimes. But child abuse is a real and sizable problem that most large technology companies have so far largely failed to address.

"Apple's approach preserves privacy better than any other I am aware of," said David Forsyth, chair of the Department of Computer Science at the University of Illinois at Urbana-Champaign, who reviewed Apple's system. "In my judgment, this system will likely significantly increase the likelihood that people who own or traffic in [CSAM] are found; this should help protect children. Harmless users should experience minimal or no loss of privacy, because visual derivatives are revealed only if there are enough matches to CSAM pictures, and only for the images that match known CSAM pictures. The accuracy of the matching system, combined with the threshold, makes it very unlikely that pictures that are not known CSAM pictures will be revealed."
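The threshold behavior Forsyth describes can be illustrated with a minimal sketch. This is an assumption-laden toy, not Apple's actual NeuralHash or safety-voucher implementation: the `CSAMMatcher` type, the hash strings, and the threshold value are all hypothetical, and the only idea shown is that nothing about an account is surfaced until enough matches against known hashes accumulate.

```swift
// Hypothetical sketch of threshold-based hash matching; not Apple's actual system.
// All names, hash values, and the threshold are illustrative assumptions.
import Foundation

struct CSAMMatcher {
    let knownHashes: Set<String>  // database of known-image hashes (illustrative)
    let threshold: Int            // matches required before anything is surfaced

    // Count how many of a user's image hashes appear in the known database.
    func matchCount(of imageHashes: [String]) -> Int {
        imageHashes.filter { knownHashes.contains($0) }.count
    }

    // Only once the count crosses the threshold would any "visual derivatives"
    // become reviewable; below it, nothing about the account is revealed.
    func exceedsThreshold(for imageHashes: [String]) -> Bool {
        matchCount(of: imageHashes) >= threshold
    }
}

// Illustrative usage with made-up hash strings.
let matcher = CSAMMatcher(knownHashes: ["h1", "h2", "h3"], threshold: 2)
print(matcher.exceedsThreshold(for: ["h9", "h1", "h2"]))  // true: 2 matches
```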

What about WhatsApp?

Every large technology company faces the grim reality of child abuse material on its platform. None has approached it the way Apple has.

Like iMessage, WhatsApp is an end-to-end encrypted messaging platform with billions of users. Like any platform of that size, it faces a serious abuse problem.

"I read the information Apple put out yesterday and I'm concerned," Will Cathcart, head of WhatsApp, tweeted on Friday. "I think this is the wrong approach and a setback for people's privacy all over the world. People have asked if we'll adopt this system for WhatsApp. The answer is no."

WhatsApp includes reporting capabilities so that any user can report abusive content to WhatsApp. While those capabilities are far from perfect, WhatsApp reported more than 400,000 cases to NCMEC last year.

Cathcart said in his tweets: "This is an Apple-built and operated surveillance system that could very easily be used to scan private content for anything they or a government decides it wants to control. Countries where iPhones are sold will have different definitions of what is acceptable. Will this system be used in China? What content will they consider illegal there, and how will we ever know? How will they manage requests from governments all around the world to add other types of content to the list for scanning?"

In a briefing with reporters, Apple emphasized that the new scanning technology is, for now, launching only in the United States. The company argues that it has a track record of fighting for privacy and hopes to continue to do so. In that sense, much of this comes down to trust in Apple.

The company argued that the new system cannot easily be hijacked by government action, and repeatedly emphasized that opting out is as simple as turning off iCloud backup.

Although iMessage is one of the most popular messaging platforms on earth, it has long been criticized for lacking the kind of reporting capabilities that are now commonplace across the social internet. As a result, Apple has historically reported only a tiny fraction of the cases to NCMEC that companies like Facebook do.

Instead of adopting that solution, Apple built something entirely different, and the end result is an open and worrying question for privacy hawks. For others, it is a welcome and radical change.

"Apple's expanded protection for children is a game changer," John Clark, the president of NCMEC, said in a statement. "The reality is that privacy and child protection can coexist."

High stakes

An optimist would say that enabling full encryption of iCloud accounts while still detecting child abuse material is both an anti-abuse and a privacy triumph, perhaps even a deft political move that blunts anti-encryption rhetoric from officials in the United States, Europe, India, and China.

Realists worry about what the most powerful countries in the world will do instead. It is a virtual guarantee that Apple will get, and probably already has received, calls from capital cities as government officials begin to imagine the surveillance possibilities of this scanning technology. Political pressure is one thing; regulation and authoritarian control are another. But that threat is not new, nor is it specific to this system. As a company with a track record of quiet but profitable compromise with China, Apple has a lot of work to do to persuade users that it can resist draconian governments.

All of the above can be true. What comes next will ultimately define Apple's new technology. If governments use this feature to broaden surveillance, then the company will clearly have failed to deliver on its privacy promises.




