If you've ever thought about jailbreaking your iPhone, here's one more reason to reconsider: researchers have discovered a new deepfake tool for iOS that can inject fake video into a device's camera feed to trick banking apps and commit identity theft.
As reported by Cybernews, security researchers at the biometric authentication company iProov have found a new tool that cybercriminals are using to do exactly that on jailbroken iPhones.
The tool works on jailbroken devices running iOS 15 or later. It uses a dedicated RTMP server to link the attacker's computer to the iPhone and hijack the connection between the device's camera and its apps. Instead of showing the app the real camera feed, it delivers a stream of AI-generated deepfake video.
To the user, the camera appears to work normally: point it at any object and the camera app shows that object. Behind the scenes, however, other apps can be shown a fake face instead. This lets criminals trick apps into believing they are dealing with a real person in real time, which means fraud can be committed through banking apps that rely on biometric authentication by impersonating a legitimate customer. The tool can likewise be used to create entirely fake identities.
According to the report, iProov's researchers believe the tool originates from China, and that banks and financial app makers will need to upgrade to stronger systems that perform "active" liveness testing, checking that the person on screen is both realistic and actually present.
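For developers wondering what an "active" liveness check might look like in practice, here is a minimal sketch in Swift using Apple's Vision framework: the app asks the user to turn their head in a chosen direction, then verifies that the detected face yaw actually changed across the capture. This is only an illustration, not iProov's system or any vendor SDK; the `LivenessChallenge` type, the rotation threshold, and the sign convention are assumptions made for the example.

```swift
import Vision
import CoreVideo

// A minimal sketch of an "active" liveness challenge, assuming the app already
// captures camera frames as CVPixelBuffers. The names (LivenessChallenge,
// passesChallenge) are illustrative and not part of any real SDK.

enum LivenessChallenge: CaseIterable {
    case turnLeft, turnRight
}

// Estimate the face yaw (head rotation around the vertical axis) in a frame.
func faceYaw(in frame: CVPixelBuffer) throws -> Double? {
    let request = VNDetectFaceRectanglesRequest()
    let handler = VNImageRequestHandler(cvPixelBuffer: frame, options: [:])
    try handler.perform([request])
    // `yaw` is reported in radians; nil means no face was detected.
    return request.results?.first?.yaw?.doubleValue
}

// The challenge passes only if the head visibly rotated in the requested
// direction between the first and last frame of the capture window.
// The 0.3 rad (~17°) threshold and the sign convention are assumptions.
func passesChallenge(_ challenge: LivenessChallenge,
                     firstFrame: CVPixelBuffer,
                     lastFrame: CVPixelBuffer) throws -> Bool {
    guard let start = try faceYaw(in: firstFrame),
          let end = try faceYaw(in: lastFrame) else { return false }
    let delta = end - start
    switch challenge {
    case .turnLeft:  return delta > 0.3
    case .turnRight: return delta < -0.3
    }
}
```

In a real check the prompt would be randomized per session (for example with `LivenessChallenge.allCases.randomElement()`), since a pre-recorded injected clip cannot anticipate which movement will be requested.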
