AI-faked car damage is becoming a real insurance fraud problem, with Admiral linking a sharp rise in cases in 2025 to manipulated images and fake supporting materials. The problem is no longer limited to suspicious documents. Photos of damaged vehicles can now be edited to make damage appear worse or to support duplicate submissions.
According to a BBC report, an AI-edited image of a damaged Land Rover's license plate was used in one claim, while a similar image with a different license plate appeared in a second case.
Another picture made the rear damage appear more serious than it was. Admiral said these submissions were intercepted by its fraud team and rejected before a payout was made.
Admiral also said fraud increased 71% in 2025 compared to the previous year, and attributed some of that increase to easier access to AI tools that can alter images and fabricate documents that never existed. The trend has a clear consumer angle, because the cost of fraud does not fall solely on the fraudster.
How the fake evidence works
Instead of relying only on fake forms or made-up stories, scammers can now submit a convincing image as supposed evidence. In the examples provided, AI was used to alter vehicle photos so that damage could be exaggerated or the same incident could be recycled in a different claim.
This changes the burden on claims teams. They no longer check only paperwork and timelines; they must also verify that the image itself is trustworthy. Admiral said its fraud-detection tools are improving and that the wider industry is sharing tactics as this type of abuse becomes harder to ignore.
Why premiums are part of it
Fraud increases costs across the system, and insurers say those costs can lead to higher premiums across the board.
This makes AI image fraud more than a niche crime. Even drivers with legitimate claims could feel the impact through higher prices and extra scrutiny during the verification process.
In some cases these are opportunistic attempts to inflate a genuine loss; others involve forged documents and fabricated materials designed to support a false claim from the start. AI makes it easier to scale both approaches.
What happens next?
The immediate response is better detection, but the risk to customers is also clear.
Admiral said fabricated or exaggerated evidence can result in claim denial, policy cancellation and, in more serious cases, criminal prosecution. As AI-generated vehicle evidence becomes more widespread, closer inspection of accident photos will likely become a normal part of claims investigation.
Although Google has taken steps to watermark its AI-generated images, this is not yet an industry-wide practice.