Zoom woes: ‘Are you real?’ will soon replace ‘You’re on mute’

Make sure that you are using the latest version of video conferencing software in case it incorporates security features to detect deepfakes.

Summary

  • Fast-developing deepfake technology means anyone on a screen may be an AI creation. It’s time to adopt authenticity cues as deepfakes join our meetings.

Is the boss who’s giving you an order real or just realistic? Deepfakes are taking Zoom calls to another level of awkwardness, making us question whether our co-workers are genuine. A finance worker in Hong Kong transferred more than $25 million to scammers after they posed as his chief financial officer and other colleagues on a video conference call, marking perhaps the biggest known corporate fraud using deepfake technology to date. The worker had been suspicious about an email requesting a secret transaction, but the scammers looked and sounded so convincing on the call that he sent the money.

Corporate IT managers have spent more than a decade trying to train office workers to spot phishing emails and resist the urge to click on dodgy attachments. Hackers and fraudsters often need just one person among hundreds to inadvertently download the malware needed to tunnel into a corporate network. With AI-powered video tools, they’re moving into territory we considered safe, underscoring how quickly deepfake technology has developed in just the past year. While it sounds like science fiction, such elaborate frauds are now relatively easy to set up, ushering us into a new age of scepticism.

The fraud in Hong Kong almost certainly used real-time deepfakes, meaning that the fake executive mirrored the scammer as they listened, talked and nodded during the meeting. According to David Maimon, a criminology professor at Georgia State University, online fraudsters have been using real-time deepfakes on video calls since at least last year for smaller-scale fraud including romance scams.

Maimon posted a video on LinkedIn showing a demo from developers who are selling deepfake video tools to potential fraudsters. In it, you can see the real image of a man on the left and, on the right, his fake persona, a beautiful young woman, who is scamming the male victim shown in the middle.

This is uncharted territory for most of us, but here’s what the Hong Kong victim could have done to spot the deepfake, and what we’ll all need to do in the future for sensitive video calls:

Use visual cues to verify who you’re talking to: Deepfakes still can’t do complex movements in real time. If in doubt, ask your video conference counterpart to perform a unique gesture, like touching their ear or waving a hand, which deepfakes find difficult to replicate convincingly in real time.

Watch the mouth: Look out for discrepancies in lip syncing or weird facial expressions that go beyond a typical telecom connection glitch.

Employ multi-factor authentication: For sensitive meetings, add a secondary check via email, SMS or an authenticator app to make sure the participants are who they claim to be.

Use other secure channels: For critical meetings that involve sensitive information or money, you and the other participants could verify your identities through an encrypted messaging app like Signal and confirm decisions, such as financial transactions, through those same channels.

Update your software: Make sure that you are using the latest version of video conferencing software in case it incorporates security features to detect deepfakes.

Avoid unknown video conferencing platforms: Use well-known platforms like Zoom or Google Meet that have relatively strong security measures in place, especially for sensitive meetings.

Look out for suspicious behaviour and activity: Beware of urgent requests for money, last-minute meetings that involve big decisions, or changes in a person’s tone, language or style of speaking. Scammers often use pressure tactics, so be wary of any attempt to rush a decision.

Some of these tips could become outdated over time, especially the visual cues. Last year, you could spot a deepfake by asking the speaker to turn sideways, since the fakes struggled to render a convincing profile view. Now some deepfakes can convincingly move their heads from side to side.

For years, fraudsters have hacked into the computers of wealthy people, hoovering up their personal information to help them get through security checks with their banks. At least in banking, managers can create new processes to force their underlings to tighten up security. The corporate world is far messier, with an array of different approaches to security that allow fraudsters to simply cast their nets wide enough to find vulnerabilities.

The more people wisen up to the possibility of fakery, the less chance scammers will have. We will just have to pay the price as the discomfort of conference calls becomes ever more agonizing, and the old Zoom clichés about your peers being on mute morph into requests for them to scratch their noses. ©Bloomberg
