
The Evolving Threat of Deepfake Telemedicine Scams

The rapid adoption of telehealth has revolutionized healthcare delivery, but it has also created new opportunities for criminals to exploit. One area of concern is the use of deepfakes in telemedicine scams.

Deepfakes are highly realistic synthetic media created with AI algorithms: manipulated videos and audio recordings that make it appear as if someone is saying or doing something they never did (see the example at the end of the article linked below). By creating deepfake videos or audio recordings of existing patients, scammers can trick healthcare providers into delivering unnecessary or fraudulent services. Scammers can also use deepfakes to impersonate doctors or nurses, gaining access to patient information or convincing patients to share sensitive data. Additionally, AI can be used to create fake medical records to support fraudulent claims or to obtain prescription drugs.

These deepfakes can have profound consequences, including monetary loss for the provider or patient, identity theft, and potential patient harm. Just as they now do with email phishing scams, telehealth providers will want to educate their workforce and patients about deepfakes. Providers and patients can take several steps to protect themselves against deepfake scams: encourage staff and patients to be suspicious of any unusual or unexpected requests, institute multiple methods of verifying patient identity, and avoid public Wi-Fi or unsecured communication channels when conducting telehealth appointments.

As the use of telehealth continues to expand, it is imperative for healthcare providers and patients to remain vigilant against the threat of deepfake scams. By understanding the risks and taking proactive steps to protect themselves, we can help ensure the safety and security of telehealth services.

Today, criminals will go so far as to schedule Teams calls using their impersonations – "and they're on video, and they look exactly like the person that you would normally engage with on video," she said.

Tags

insights, ai and blockchain, health care, ruggio_michael