Industry
Published June 03, 2025
Last updated June 03, 2025

7 expert takes on stopping AI-based fraud, injection attacks, and the latest fraud trends

Fraud experts offer insights into the new fraud ecosystem: deepfakes, stolen selfies, injection attacks, and more.
Louis DeNicola
6 min
Key takeaways
  • Deepfakes get a lot of attention, and they look more authentic than ever before. But fraudsters are also buying or stealing real information, including selfies, and injecting it into verification flows.
  • Fraudsters can use emulators and injection attacks to make it look like they’re taking a selfie from a mobile phone when they’re actually using a laptop.
  • You don’t need to add new steps to your flow to stop these types of attacks. Gather and analyze signals from your existing process to spot fraudsters without compromising conversion.

The fraud ecosystem is experiencing big changes in 2025. Deepfakes can evade detection if fraud analysts and AI fraud models rely solely on visual clues. And a bad actor can buy a stolen identity, including the victim’s name, Social Security number, date of birth, address, selfie, and pictures of their government ID, for less than the price of a deluxe car wash at your local gas station.

Persona’s Pat Hall and SentiLink’s Dr. David Maimon recently hosted a webinar to dig into fighting fraud under these new circumstances. They opened with a real example of deepfakes getting through onboarding at a financial institution and discussed how Persona ultimately flagged the fraud.

But the conversation quickly pivoted to an equally troubling trend — bad actors stealing or buying real people’s information to inject during identity verification checks. It continued with a lively discussion filled with questions from the audience. 

Here are seven questions and answers that stood out. You can also watch the full webinar on demand.

1. How are fraudsters collecting selfies from real people?

Pat pointed toward a recent trend: fraudsters setting up websites and apps to collect verification information from victims. 

“You're asked to do a verification,” says Pat. “The question is, are you on a secure network, or is the verification that you're doing being intercepted?” For example, someone might upload a selfie and video of themselves turning their head to ‘verify’ their identity on a website for a fake dating service. The fraudster can then use or sell the victim’s information and submissions. 

David monitors a variety of fraud forums and groups and says you can find complete profiles, including these types of images and short videos, for $12. There are even bulk discounts if you want to buy more. 

Alternatively, bad actors sometimes pay people rather than tricking them. “A lot of videos out there are of people turning their head left and right, up and down, and then they sell it for $25,” says David.

Bad actors can similarly buy fake verification documents, such as energy bills and Social Security cards, on fraud marketplaces. 

2. What are virtual cameras and injection attacks?

Whether they’re using AI to create or spoof a face, or using a victim’s real face, fraudsters can then try to inject the selfies and videos during identity verification. 

Let’s break down four common tools and techniques fraudsters rely on:

  • Virtual camera: Software that registers with the operating system as a camera and can stream pre-recorded or generated content as if it were live footage from a physical lens (see the sketch after this list).

  • Emulators: Software that can simulate a different device. For example, you can use an emulator on your laptop to create a virtual smartphone. 

  • Replay attacks: When a fraudster presents previously captured images or pre-recorded videos to an identity verification check. For example, pointing the camera at a printed photo or at another screen displaying a headshot.

  • Injection attacks: When a fraudster bypasses a device’s physical camera and inserts (injects) an image or video directly into the output. 
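
To make these definitions concrete, here's a minimal browser-side sketch in TypeScript (illustrative only, not from the webinar or Persona's product) that enumerates media devices and flags labels matching common virtual camera software. It's a single weak signal: labels can be renamed, and they're only readable after the user grants camera permission. But it shows where this kind of check can live in a web verification flow.

```typescript
// Illustrative substrings for well-known virtual camera software.
// Not exhaustive, and a determined fraudster can rename the device.
const VIRTUAL_CAMERA_HINTS = ["obs virtual", "manycam", "snap camera", "virtual cam"];

async function detectVirtualCameraHints(): Promise<string[]> {
  // Request camera access first; enumerateDevices() only returns
  // human-readable labels after permission is granted.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const devices = await navigator.mediaDevices.enumerateDevices();

  // Release the camera once the device list has been read.
  stream.getTracks().forEach((track) => track.stop());

  return devices
    .filter((device) => device.kind === "videoinput")
    .map((device) => device.label.toLowerCase())
    .filter((label) => VIRTUAL_CAMERA_HINTS.some((hint) => label.includes(hint)));
}

// Usage: treat any hit as one risk signal among many, not a verdict.
detectVirtualCameraHints().then((hits) => {
  if (hits.length > 0) console.warn("Possible virtual camera detected:", hits);
});
```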

Visuals can go a long way here. We shared one example of an emulator and injection attack during the webinar. Here’s another video of someone using an emulator to inject a selfie into an identity verification request. 


Fraudsters can use emulators to inject a screenshot or video into a verification request.

3. Can you detect injection attacks with the human eye?

Not necessarily. Sometimes, a deepfake or other AI-generated or altered image will look off, and some emulators leave visual signals that fraud analysts can spot. But you also might need to rely on additional signals, such as information about the device or the user’s behavior. 

Analyzing system-wide signals can also be especially helpful for spotting sophisticated fraud attempts. For example, you could look for patterns or similarities across users' submissions, such as the same background appearing in multiple selfies.
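
As a rough illustration of that kind of cross-submission analysis, here's a hedged sketch in TypeScript. It assumes a Node.js backend, the open-source sharp image library, and selfies stored as local files (all assumptions for illustration, not Persona's actual pipeline). It computes a simple 64-bit average hash per image and flags pairs that differ in only a few bits, a cheap way to catch reused or near-duplicate selfies.

```typescript
import sharp from "sharp";

// Compute a 64-bit average hash: downscale to 8x8 grayscale, then set each
// bit to whether that pixel is brighter than the mean. Near-identical images
// (e.g., the same background reused across "different" users) produce hashes
// that differ in only a few bits.
async function averageHash(path: string): Promise<bigint> {
  const pixels = await sharp(path)
    .grayscale()
    .resize(8, 8, { fit: "fill" })
    .raw()
    .toBuffer();
  const mean = pixels.reduce((sum, p) => sum + p, 0) / pixels.length;
  let hash = 0n;
  for (const p of pixels) hash = (hash << 1n) | (p > mean ? 1n : 0n);
  return hash;
}

// Hamming distance: the number of bits where two hashes disagree.
function hammingDistance(a: bigint, b: bigint): number {
  let diff = a ^ b;
  let count = 0;
  while (diff > 0n) {
    count += Number(diff & 1n);
    diff >>= 1n;
  }
  return count;
}

// Flag selfie pairs whose hashes are within a small threshold of each other.
async function flagSimilarSelfies(paths: string[], threshold = 6): Promise<void> {
  const hashes = await Promise.all(paths.map(averageHash));
  for (let i = 0; i < paths.length; i++) {
    for (let j = i + 1; j < paths.length; j++) {
      if (hammingDistance(hashes[i], hashes[j]) <= threshold) {
        console.log(`Suspiciously similar: ${paths[i]} and ${paths[j]}`);
      }
    }
  }
}
```

A hash match alone isn't proof of fraud; in practice you'd weight it alongside device and behavioral signals.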

4. Which systems are more vulnerable to injection attacks today?

Pat says iOS is decently secure, while Android systems tend to be more vulnerable. However, every system can get breached.

After the webinar, Pat noted that web-based verifications have always been more vulnerable than app-based verifications, regardless of the operating system. That gap is a growing issue as more bad actors adopt GenAI tools. 

“Web-based verification doesn’t let us detect a lot around the transmission mechanism itself, such as what device a person is using, which we can use to catch deepfakes,” Pat says. “Folks usually aren’t buying multiple devices, but they can switch their browsers easily to hide their tracks with web verifications.”

5. What signals should you collect, and when should you add additional checks?

It’s all about finding a balance between getting the data you need and keeping things easy for users. However, as Pat points out, you can collect a lot of passive signals in the background without adding any friction. “It's actually the best way to do it because the fraudster doesn't know which friction they’re trying to beat,” he says. 

“You may insert a gov ID challenge or a selfie challenge when you detect something weird,” Pat adds. However, you can still look beyond the surface to check a lot of passive and behavioral signals during the submission and evaluation process. 
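
To show what passive signal collection can look like in practice, here's a small browser-side sketch in TypeScript. The specific signals and the mobile-consistency heuristic are illustrative assumptions, not Persona's product logic. It gathers device properties with zero user-facing friction and flags one classic emulator tell: a user agent that claims to be a phone on a device reporting no touch support.

```typescript
// Passively collected device signals; no extra steps for the user.
interface DeviceSignals {
  userAgent: string;
  maxTouchPoints: number;
  screenWidth: number;
  screenHeight: number;
  timezone: string;
  webdriver: boolean; // true when browser automation is driving the page
}

function collectSignals(): DeviceSignals {
  return {
    userAgent: navigator.userAgent,
    maxTouchPoints: navigator.maxTouchPoints,
    screenWidth: screen.width,
    screenHeight: screen.height,
    timezone: Intl.DateTimeFormat().resolvedOptions().timeZone,
    webdriver: navigator.webdriver === true,
  };
}

// One example heuristic: a "mobile" user agent on a device with no touch
// support is a classic emulator tell (a laptop pretending to be a phone).
function looksLikeEmulator(signals: DeviceSignals): boolean {
  const claimsMobile = /Android|iPhone/i.test(signals.userAgent);
  return (claimsMobile && signals.maxTouchPoints === 0) || signals.webdriver;
}

const signals = collectSignals();
if (looksLikeEmulator(signals)) {
  // Don't block outright on one signal; escalate to a step-up check instead.
  console.warn("Device signals inconsistent; consider a gov ID or selfie challenge.");
}
```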

6. How can fraudsters find images of a victim’s face for an account takeover?

Sometimes, stolen PII is sold as a set with the person’s selfies. Fraudsters also might be able to use the personal information they steal or buy to find images of the victim online. 

David also pointed to another scary possibility after the webinar. “You can also buy the remote desktop protocol access that folks establish when they took over the account,” he says. “And if you have remote desktop protocol to the ‘client’ — the victim — you can get everything on the computer … you'll be able to take a picture when they're working on the computer.”

7. Looking long-term, what will be the best options for identity verification in the future?

As the cat-and-mouse game continues, there will always be the “next great” technology. Right now, there’s a lot of excitement around using digital identities, such as mobile driver's licenses, for identity verification. But no system is foolproof.

Pat suggests taking a multi-layered approach to fraud detection. It’s not about finding a single solution; you need a flexible system that can draw meaningful insights from the many signals you collect. 

David agreed, saying, “It's all about a comprehensive approach of finding signals and making sure that there's consistency across the signal.” 

Watch the full webinar for additional Q&As, insights from Pat and David, and another example of an injection attack.

The information provided is not intended to constitute legal advice; all information provided is for general informational purposes only and may not constitute the most up-to-date information. Any links to other third-party websites are only for the convenience of the reader.
Louis DeNicola
Louis DeNicola is a content marketing manager at Persona who focuses on fraud and identity. You can often find him at the climbing gym, in the kitchen (cooking or snacking), or relaxing with his wife and cat in West Oakland.