
Deepfake AI Imitates CZ Of Binance In Disturbing Ways
CZ, the former CEO of Binance, could not tell a deepfake of his own voice from the real thing. Artificial intelligence now replicates faces and voices with dangerous accuracy.
Former Binance CEO Changpeng "CZ" Zhao has recounted an unsettling encounter with a video that perfectly imitated his voice using AI technology.
In a post published on Thursday, Zhao said the Mandarin dubbing in the video was so accurately reproduced that he found it indistinguishable from his real voice.
The video, in which Zhao appears speaking Chinese in a montage of clips and images created with artificial intelligence, raised strong concerns about the unauthorised use of AI to impersonate public figures. CZ described the accuracy of the voice and lip synchronisation as "disturbing".
Digital duplication of crypto industry executives via generative AI tools is on the rise, and this case is a clear example of malicious impersonation.
After stepping down as CEO of Binance in 2023, Zhao has remained influential in the cryptocurrency industry and has previously warned the public about the risks of deepfakes.
In October 2024, Zhao issued a specific warning: do not trust any video asking for cryptocurrency transfers, because of the spread of manipulated content portraying him.
Deepfakes Pose Increasingly Severe Operational Risks for the Crypto Sector
Zhao's latest experience shows how impersonation methods are evolving from simple text and static-image scams to sophisticated audio-visual simulations.
Patrick Hillmann, former Chief Communications Officer of Binance, revealed in 2023 that scammers had created a deepfake video to impersonate him during a Zoom meeting with project representatives.
The video had been generated by collecting years of his public interviews and online content, making the fictitious meeting as believable as a real official call from Binance.
Today, advanced voice cloning technologies can imitate a person so convincingly that not even the person being imitated can recognise the fake. This is a danger that goes far beyond identity theft on social networks.
A Multi-Million-Dollar Scam Shows the Concrete Risks of Deepfakes
An incident in February 2024 highlighted the serious financial risks of deepfakes: employees of the engineering firm Arup in Hong Kong were tricked into transferring roughly $25 million during a video call on Microsoft Teams.
According to the South China Morning Post, every other participant on the call was an AI simulation mimicking the voice and appearance of the company's UK-based chief financial officer and other colleagues.
Voice Cloning: An Increasingly Accessible Technology
The technology behind realistic voice cloning is increasingly accessible and requires very little voice data to work.
Services such as ElevenLabs allow users to create realistic voice clones from less than 60 seconds of recording.
According to a UK financial institution, more than 25% of UK adults encountered a cloned-voice scam in the past year alone.
Reports from CyFlare show that voice cloning APIs can be purchased on darknet marketplaces for as little as $5. Commercial offerings usually include watermarking and require consent, while open-source or illicit alternatives provide no such protections.
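To illustrate how low the barrier has become, the sketch below shows the kind of request a commercial voice-cloning service accepts from roughly a minute of audio. It loosely follows the publicly documented pattern of ElevenLabs' "add voice" endpoint, but the URL, header, and field names should be treated as assumptions rather than a verified, current specification; the API key and audio file are placeholders.

```python
# Minimal sketch: submitting a short voice sample to a cloning API.
# Endpoint, header, and field names are assumptions modelled on the
# documented ElevenLabs pattern; credentials and files are placeholders.
import requests

API_KEY = "YOUR_API_KEY"              # placeholder credential
SAMPLE_PATH = "60_second_sample.wav"  # roughly one minute of recorded speech

with open(SAMPLE_PATH, "rb") as sample:
    response = requests.post(
        "https://api.elevenlabs.io/v1/voices/add",  # assumed endpoint
        headers={"xi-api-key": API_KEY},            # assumed auth header
        data={"name": "cloned-voice-demo"},         # label for the new voice
        files={"files": sample},                    # the short audio sample
    )

print(response.status_code, response.json())
```

The point of the sketch is the input size: a single short recording, scraped from any public interview or social clip, is enough to start the process on services of this kind.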
EU Requires Explicit Labels for Deepfakes, but Enforcement Only Arrives in 2026
The EU Artificial Intelligence Regulation, adopted in March 2024, requires any publicly visible deepfake content to be clearly labelled.
However, full implementation of the regulation is only planned for 2026, leaving a significant period of vulnerability.
In the meantime, hardware manufacturers are starting to build detection systems into consumer devices.
At Mobile World Congress 2025 in Barcelona, on-device tools capable of detecting audio and video manipulation in real time were unveiled, a possible step towards autonomous authenticity verification, though they remain far from commercial availability.
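The tools shown at MWC were not described in technical detail, but a common approach in research-grade spoof detection is to extract spectral features from audio and score them with a trained classifier. The toy sketch below only illustrates that general idea; the model file, feature choice, and file names are hypothetical placeholders, not the systems demonstrated in Barcelona.

```python
# Toy sketch of audio spoof detection: spectral features + a pre-trained
# binary classifier. The model file and inputs are hypothetical placeholders.
import numpy as np
import librosa
import joblib


def extract_features(wav_path: str) -> np.ndarray:
    # Load audio at 16 kHz mono and compute MFCCs, a common speech feature.
    audio, sr = librosa.load(wav_path, sr=16000, mono=True)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=20)
    # Summarise the time axis with per-coefficient means and std deviations.
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])


def is_likely_synthetic(wav_path: str, model_path: str = "spoof_model.joblib") -> bool:
    # Hypothetical pre-trained classifier: label 1 = synthetic, 0 = genuine.
    model = joblib.load(model_path)
    features = extract_features(wav_path).reshape(1, -1)
    return bool(model.predict(features)[0] == 1)


if __name__ == "__main__":
    print(is_likely_synthetic("incoming_call_sample.wav"))
```

Production systems are far more elaborate, combining visual cues, liveness checks, and provenance metadata, but the pipeline shape (feature extraction followed by classification) is the part that can plausibly run on a phone or laptop in real time.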