Deepfake Voice and Video Are Coming for Law Firms. Here's How to Prepare.
In February 2024, a finance worker in Hong Kong was tricked into transferring $25 million after a video call with what appeared to be the company's CFO and other executives. Every other participant on the call was a deepfake: AI-generated video avatars speaking in cloned voices on a live video conference.
For law firms, deepfake technology creates unique and serious risks.
Law-Firm-Specific Deepfake Risks
Client Impersonation
A cloned voice calls your firm claiming to be a client: "I need you to wire the settlement funds to this new account." The voice sounds exactly like the client. It passes the "does this sound right?" test that most people rely on.
Opposing Counsel Impersonation
A deepfake call from "opposing counsel" requesting an emergency extension, a document exchange, or a settlement discussion. Information gathered during the call could be used for competitive advantage or fraud.
Evidence Fabrication
Deepfake video and audio introduced as evidence in litigation. A fabricated confession, a manufactured statement, or an altered surveillance recording. Authenticating evidence is becoming a critical challenge.
Partner Impersonation
A deepfake of a senior partner authorizing a financial transaction, approving a document release, or directing staff to take action. The authority and trust associated with senior partners make this particularly effective.
Defense Strategies
- Code words for financial transactions. Pre-establish verification codes with clients for any instruction involving money movement. AI can clone a voice. It can't guess a code word.
- Callback verification on separate channels. Any significant instruction received by phone or video gets verified through a different channel. Voice call? Verify by email. Video call? Verify by phone to a known number.
- Multi-person authorization. No single person can authorize wire transfers, document releases, or significant actions based on a single communication.
- Evidence authentication protocols. Establish procedures for authenticating audio and video evidence. Engage forensic experts when authenticity is questioned.
- Limit public voice/video exposure. Attorneys' voices and likenesses in webinars, podcasts, and social media videos provide source material for deepfakes. Consider the trade-off between marketing presence and deepfake risk.
- Update your security awareness training. Include deepfake scenarios in training. Show staff examples of deepfake audio and video so they understand the capabilities.
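The code-word control described above needs almost no infrastructure on the technical side. As a minimal sketch in Python (the client record, salt, and code word shown are all hypothetical), a firm would store only a salted hash of each client's code word, never the word itself, and compare attempts in constant time:

```python
import hashlib
import hmac
import os

def hash_code_word(code_word: str, salt: bytes) -> bytes:
    """Derive a salted hash so the plaintext code word is never stored."""
    return hashlib.pbkdf2_hmac("sha256", code_word.encode(), salt, 100_000)

def verify_code_word(attempt: str, salt: bytes, stored_hash: bytes) -> bool:
    """Constant-time comparison avoids leaking information via timing."""
    return hmac.compare_digest(hash_code_word(attempt, salt), stored_hash)

# Hypothetical client record: one random salt per client.
salt = os.urandom(16)
stored = hash_code_word("blue-harbor-1987", salt)  # set during client intake

print(verify_code_word("blue-harbor-1987", salt, stored))  # True
print(verify_code_word("a-cloned-voice-guess", salt, stored))  # False
```

The point of hashing rather than storing the word is that even a compromised client database would not hand an attacker working code words to read back over a deepfaked call.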
The era of "trust but verify" is over. In the age of deepfakes, it's "verify, then trust." Every time.