Insecure AI app leaks Android users’ personal data

Not all AI tools you come across in mobile app marketplaces are created equal. In fact, many of them pose a greater privacy risk than previously thought.

Cybersecurity experts have confirmed that billions of records containing personal data have been compromised by the plethora of unlicensed or insecure AI apps in the Google Play Store for Android, including apps sold for identity verification and media editing.

A recent investigation by Cybernews revealed that one such Android app, “Video AI Art Generator & Maker,” leaked 1.5 million user images, over 385,000 videos, and millions of AI-generated media files. Researchers traced the flaw to a misconfigured Google Cloud Storage bucket that left personal files open to outsiders. In total, more than 12 terabytes of users’ media files were accessible through the exposed bucket, the publication said. The app had 500,000 downloads at the time.
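For context on how such an exposure happens: a Cloud Storage bucket whose access policy grants read or list permission to “allUsers” can be enumerated with no credentials at all. The sketch below is a minimal illustration using the google-cloud-storage Python library with an anonymous client; the bucket name is hypothetical.

```python
from google.cloud import storage

# An anonymous client supplies no credentials whatsoever.
client = storage.Client.create_anonymous_client()

# Hypothetical bucket name for illustration; a correctly configured
# bucket would answer this request with a 403 "permission denied".
BUCKET_NAME = "example-app-user-uploads"

# If the bucket's policy grants allUsers read/list access (the kind
# of misconfiguration described above), anyone on the internet can
# walk through every stored file.
for blob in client.list_blobs(BUCKET_NAME, max_results=10):
    print(blob.name, blob.size)
```

The fix is straightforward in principle: restrict the bucket to authenticated service accounts and serve user files through signed URLs instead of public access.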

Another app, IDMerit, leaked customer recognition data and personally identifiable information from users in 25 countries, primarily the United States.

The exposed information included names, addresses, dates of birth, ID documents, and contact details, amounting to terabytes of data. The developers of both apps fixed the vulnerabilities after being notified by the researchers.

Still, cybersecurity experts warn that lax security in these kinds of AI apps poses widespread risks to users. Such apps often store user-uploaded files alongside AI-generated content, and many also rely on a widely criticized practice known as “hard-coding secrets”: embedding sensitive values such as API keys, passwords, and encryption keys directly into the app’s source code. Cybernews found that 72% of the hundreds of Google Play apps its researchers analyzed contained security weaknesses of this kind.
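To make the practice concrete, the snippet below contrasts a hard-coded key with one resolved at runtime. It is a Python sketch, though the pattern looks the same in an Android app’s Java or Kotlin code, and every name and value here is hypothetical.

```python
import os

# Risky pattern: the secret is baked into the source, so it ships in
# every distributed copy of the app and can be extracted by anyone
# who decompiles or unpacks the package.
HARDCODED_API_KEY = "example-key-123"  # hypothetical placeholder value

def risky_auth_header() -> dict:
    return {"Authorization": f"Bearer {HARDCODED_API_KEY}"}

# Safer pattern: resolve the secret at runtime, e.g. from the
# environment or a secrets manager, so it never appears in shipped code.
def safer_auth_header() -> dict:
    api_key = os.environ["API_KEY"]  # hypothetical variable name
    return {"Authorization": f"Bearer {api_key}"}
```

For a mobile client specifically, the usual remedy goes a step further: keep the key on a backend server and have the app call that server, since anything bundled with the app package should be treated as public.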
