Apple joins AI scene late but raises privacy standard

One of the biggest announcements at Apple’s Worldwide Developers Conference was the Apple Intelligence feature set in iOS and iPadOS 18, which integrates with third-party AI models. At a basic level it is an admission from Apple that it doesn’t make financial sense right now to develop its own multimodal AI models in-house, and a strange lowering of the trellises in its mobile computing walled garden. In practice, if a task cannot be completed on device, Apple will send the data to the cloud.
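
As a very rough illustration of that routing (the names below are invented for this sketch; Apple publishes no such API), the decision amounts to a local-first fallback:

```swift
// Invented names for illustration; Apple publishes no such routing API.
enum ExecutionTarget {
    case onDevice
    case privateCloudCompute
}

func route(fitsOnDevice: Bool) -> ExecutionTarget {
    // Local models are tried first; only when a request exceeds the
    // device's capability does the (encrypted) data leave for the cloud.
    fitsOnDevice ? .onDevice : .privateCloudCompute
}
```
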
Apple Private Cloud Compute is a big step in bringing consumer privacy protection to AI.

If you know anything about Apple, you’ll understand that this approach to computing is new territory for the iPhone, and the company has integrated it in the nerdiest way possible.

The name of the solution, Private Cloud Compute (PCC), predictably communicates privacy above all else to its users.

Apple says that this system extends the robust privacy measures of Apple devices into the cloud environment and “sets a new standard for cloud privacy”.

Traditionally, cloud-based AI processing has carried inherent privacy risks because servers need unencrypted access to user data.

PCC directly addresses these challenges with a new approach that prioritises user privacy.

State security

First there’s stateless computation, where user data is used only to handle the request and is then deleted. This approach could also sidestep data management laws in the US.
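
A minimal sketch of what stateless handling implies, assuming hypothetical request and response types (these are not Apple’s actual PCC interfaces):

```swift
import Foundation

// Hypothetical types; not Apple's actual PCC interfaces.
struct InferenceRequest {
    let prompt: Data  // user data, visible only inside the node
}

struct InferenceResponse {
    let output: Data
}

func handleStateless(_ request: InferenceRequest,
                     runModel: (Data) -> Data) -> InferenceResponse {
    // The user data exists only in this function's scope.
    let output = runModel(request.prompt)
    // Nothing is logged, cached, or written to disk, so when this
    // function returns the request data becomes unreachable: that
    // is the essence of the stateless-computation guarantee.
    return InferenceResponse(output: output)
}
```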

The US Lawful Access to Encrypted Data Act was designed to enhance law enforcement's ability to access encrypted data, both at rest (stored data) and in motion (transmitted data).

For data in motion (data transmitted off the device), the bill mandates that communication service providers with more than a million monthly active users be able to decrypt or decode communications for law enforcement.

Providers must be able to decrypt or decode intercepted communications unless that is technically impossible due to the actions of unaffiliated entities. But since Apple disposes of the data after processing, it cannot be compelled to hand over what no longer exists.

On the customer-protection side, Apple promises enforceable guarantees: the security and privacy properties of PCC are meant to be backed by technical proof rather than policy alone.

Anonymous

PCC blocks anyone, even Apple staff, from accessing user data during processing, and its design makes it difficult for hackers to target specific users' data. Apple will also publish software images of PCC builds, allowing researchers to verify the system's privacy claims.
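
A sketch of how such verification could look, assuming a hypothetical transparency-log format (Apple’s published scheme is more elaborate, but the core check is comparing a measurement against a public record):

```swift
import Foundation
import CryptoKit

// Hypothetical log format, invented for this sketch.
struct PublishedBuild {
    let version: String
    let imageDigest: String  // hex SHA-256 of a released PCC image
}

func isKnownBuild(reportedImage: Data,
                  transparencyLog: [PublishedBuild]) -> Bool {
    let digest = SHA256.hash(data: reportedImage)
        .map { String(format: "%02x", $0) }
        .joined()
    // A client should only talk to nodes whose software appears
    // in the public log that researchers can inspect.
    return transparencyLog.contains { $0.imageDigest == digest }
}
```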

Apple hides the user device’s IP address behind a third-party relay before requests reach external servers. This prevents requests from being linked to specific users, improving privacy.
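
Conceptually, the relay split works like Oblivious HTTP: the relay learns who is asking but not what, while the server learns what is asked but not by whom. A sketch with hypothetical types:

```swift
import Foundation

// Hypothetical types sketching the relay split; none of these names
// come from Apple's real design.
struct EncryptedRequest {
    let ciphertext: Data  // sealed to a PCC node's key, opaque to the relay
}

struct RelayedRequest {
    let payload: EncryptedRequest
    // Note what is absent: no client IP, no device identifier.
}

func relay(_ request: EncryptedRequest,
           from clientIP: String,
           forward: (RelayedRequest) -> Void) {
    // The relay terminates the client connection, so it sees clientIP,
    // but it forwards only the opaque payload and drops the address.
    _ = clientIP
    forward(RelayedRequest(payload: request))
}
```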

Requests are also encrypted with the public keys of verified PCC nodes, ensuring they are encrypted end to end and denying any entity outside the PCC nodes access to the request in transit.
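
Apple’s CryptoKit ships an HPKE implementation that can express this sealing step. The sketch below is illustrative only: the “pcc-request” info string and key handling are assumptions, and a real client would only trust keys from attested nodes.

```swift
import Foundation
import CryptoKit

func sealForNode() throws {
    let suite = HPKE.Ciphersuite.P256_SHA256_AES_GCM_256

    // Stand-in for the key pair of a verified PCC node.
    let nodeKey = P256.KeyAgreement.PrivateKey()

    // Client side: seal the request so only that node can read it,
    // even while it crosses relays and load balancers in transit.
    var sender = try HPKE.Sender(recipientKey: nodeKey.publicKey,
                                 ciphersuite: suite,
                                 info: Data("pcc-request".utf8))
    let sealedRequest = try sender.seal(Data("user prompt".utf8))

    // Node side: open with the private key plus the encapsulated
    // key that travels alongside the ciphertext.
    var recipient = try HPKE.Recipient(privateKey: nodeKey,
                                       ciphersuite: suite,
                                       info: Data("pcc-request".utf8),
                                       encapsulatedKey: sender.encapsulatedKey)
    _ = try recipient.open(sealedRequest)
}
```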

This encryption pipeline is a legitimately exciting step in the evolution of cloud computing and could force the likes of Google to follow suit on Android, prioritising user trust and data protection. With PCC, users can benefit from cloud-based AI while maintaining control over their personal information.

About Lindsey Schutters

Lindsey is the editor for ICT, Construction & Engineering, and Energy & Mining at Bizcommunity.