Apple says researchers can vet its child safety features. But it's suing a startup that does just that.


Apple could hand over its code for review, although that is not what it has said it will do. Researchers could also try to reverse-engineer the feature “statically,” that is, without running the actual program in a live environment.

In practice, however, all of these options share at least one major problem: none of them lets researchers watch the code running in real time on an up-to-date iPhone to see how it actually behaves in the field. Instead, they still rely on trust, not only that Apple is being open and honest, but also that the code it has written is free of any significant errors and oversights.

Another option would be to give members of Apple's Security Research Device Program access to the system so they could verify the company's claims. But that group of outside researchers is highly exclusive and tightly restricted, with many rules about what participants can say and do, and it would not necessarily resolve the trust problem.

For researchers who want to look inside the iPhone at this level, there are really only two options. The first is for hackers to jailbreak older iPhones using zero-day vulnerabilities, which is difficult and expensive and can be shut down by a security patch.

“Apple has spent a lot of money trying to prevent people from jailbreaking their phones,” Thiel explained. “They’ve hired people out of the jailbreak community to make jailbreaking more difficult.”

The other is to use a virtualized iPhone in which Apple’s security features can be switched off. In practice, that means Corellium.

There are limits to what any security researcher could observe, but a researcher might, for example, be able to determine whether the scanning extends beyond photos that are shared to iCloud.

If non-child-abuse material were slipped into the database, however, researchers would not be able to see it. To address that problem, Apple says it will require that two independent child-protection organizations in different jurisdictions both have the same CSAM image in their databases. But the company has provided few details, including how that will work, who will run the databases, which jurisdictions will be involved, and what the ultimate sources of the databases are.
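Apple's stated safeguard amounts to an intersection check: an image hash would only be used if independent organizations each supply the same entry. The short Python sketch below illustrates that idea only; the function name, organization labels, and hash values are hypothetical and are not drawn from Apple's actual system.

    # A minimal sketch, assuming the cross-check works as a simple set
    # intersection between two independently supplied hash databases.
    # Everything here (names, values, the intersection itself) is
    # illustrative, not Apple's actual implementation.

    def build_shared_database(org_a_hashes: set[str], org_b_hashes: set[str]) -> set[str]:
        """Keep only the hashes that both organizations independently vouch for."""
        return org_a_hashes & org_b_hashes

    # Hypothetical hash lists from two child-safety organizations
    # operating in different jurisdictions.
    org_a = {"a1b2c3", "d4e5f6", "778899"}
    org_b = {"d4e5f6", "778899", "aabbcc"}

    shared = build_shared_database(org_a, org_b)
    print(shared)  # only entries present in both databases would be eligible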

Thiel pointed out that the problem of child abuse material that Apple is trying to address is real.

“This is not a theoretical problem,” Thiel said. “It is not a pretext people have put forward as an excuse to implement surveillance. It is a widespread, practical problem that needs to be solved. The answer is not to get rid of these mechanisms, but to make them as safe as possible against future abuse.”

Corellium’s Tait, however, argued that Apple is trying to lock down its devices and claim transparency at the same time.

“Apple is trying to have its cake and eat it too,” said Tait, a former information security specialist at the British intelligence agency GCHQ.

“With their left hand, they make jailbreaking hard and sue companies like Corellium to keep them from existing. Now, with their right hand, they say, ‘Oh, we built this really complicated system, and it turns out some people don’t trust that Apple has done it honestly, but that’s okay, because any security researcher can go ahead and prove it to themselves.’”

“I’m sitting here thinking: what do you mean, they can do this? You have designed your system so that they can’t. The only reason people can do this kind of thing at all is in spite of you, not thanks to you.”

Apple did not respond to a request for comment.
