Meta has consistently found itself at the heart of the privacy debate. There is no doubt that the company uses your data. For example, it trains its AI models (better known as Meta AI) on photos published on Facebook and Instagram. But now it seems Meta is taking things to another level. Recent findings suggest the company wants full access to the camera roll on users' phones, including photos that have never been shared on Facebook (or Instagram).
As reported by TechCrunch, some Facebook users recently came across a curious popup while trying to upload a story. The notification encourages you to opt into a feature called "cloud processing." On the surface, it sounds fair and safe: Facebook says that enabling this setting will automatically scan your phone's camera roll and "regularly" upload images to Meta's cloud. In return, the company promises to offer "creative ideas" such as photo collages, event recaps, AI-generated filters, and theme suggestions for birthdays, graduations, or other milestones.
Sounds cool, right? But wait. If you agree to those terms and conditions, you grant Meta permission to continuously analyze the contents of your private photos.
There is little doubt that this is framed as making Meta AI more useful to you, since AI models may need as much data as possible to understand the real world and respond accordingly to questions and prompts. Meta says this is an opt-in feature, meaning users can choose to disable it whenever they want. That is fair, but given that this is user data we are talking about, and given Facebook's history, some users (and privacy advocates) have reason to be wary.
The tech giant has previously acknowledged that it has trained its generative AI models on all public content uploaded by adult Facebook and Instagram users since 2007. However, Meta has never clearly defined what counts as "public." Likewise, the age at which someone qualifies as an "adult" in a dataset reaching back to 2007 leaves room for varying interpretations, and for concern. Additionally, the updated AI terms in effect since June 23, 2024 do not clarify whether these cloud-processed, unpublished photos are exempt from being used as training material.
The Verge reached out to Meta AI executives, and Meta would only say that it does not currently train AI models on those photos, while declining to answer whether it might do so in the future or what rights it retains over camera roll images.
Thankfully, there is an exit. Facebook users can dive into their settings and disable the cloud processing feature. Once it is off, Meta promises to begin deleting unpublished images from its cloud within 30 days. Still, the way this tool is pitched, as a fun and useful feature, raises questions about how easily users can hand over private data without fully realizing the implications.
As AI reshapes how we interact with technology, companies like Meta are testing the limits of the data they can collect, analyze, and ultimately monetize. This latest move blurs the line between user assistance and data extraction. What was once a conscious decision, posting a photo to share with the world, is now replaced by quiet background uploads, with all the risks of unseen AI eyes attached. We will see how things pan out.