CSA Issues Alert Over AI-generated Endorsement Videos

The Cyber Security Authority (CSA) has issued an alert about the use of deepfake or Artificial Intelligence (AI)-generated endorsement videos.
According to the Authority, these videos falsely portray publicly exposed persons endorsing fake investment schemes, fraudulent financial platforms, or unapproved medical products and cures, including diabetic and weight-loss drugs.
The videos impersonate well-known figures such as the President, Ministers of State, Members of Parliament, government appointees, media houses, and media personalities.
The fake videos are widely circulated on social media, especially Facebook, with threat actors promising unrealistic or guaranteed high returns on investments.
Other videos advertise “miracle cures” or unapproved medications, often with dangerous health implications, or aim to steal personal and financial information.
Unsuspecting victims are eventually defrauded through advanced fee payments, fake product purchases, or investment traps.
The CSA has warned the public to look out for deepfake red flags such as lip movements that do not match the audio, unnatural eye movements or blinking, a robotic or overly polished voice that does not sound quite right, and unusual lighting, shadows, or background inconsistencies.
The public has also been warned against promotional videos that claim to feature national leaders or officials without verification from official sources, and against sending money or personal information in response to advertisements or messages without confirming their authenticity.
The Authority noted that any investment or health-related claims must be verified with the relevant regulators, including the Bank of Ghana, the Food and Drugs Authority (FDA), or other regulatory bodies.
“The CSA is working closely with social media platforms, law enforcement, and relevant government institutions to identify and remove these malicious videos and hold perpetrators accountable,” the Authority stated.
Source: https://opemsuo.com/author/hajara-fuseini/