KUALA LUMPUR: Victims of scams involving fake videos and images, or deepfakes, created using artificial intelligence (AI) technology are urged to come forward and lodge a report to facilitate further action.
Bukit Aman Commercial Crime Investigation Department director Datuk Seri Ramli Mohamed Yoosuf said this is because only the victims themselves can verify whether their faces and voices were used without their knowledge in fraudulent activities.
‘Such scams are becoming more frequent, and we anticipate that they will continue to occur…deepfakes are indeed very difficult (to tackle) because they require verification from the victims.
‘For instance, when it involves political leaders, only that individual can verify or deny their involvement,’ he said when contacted by Bernama, adding that the police have not received any reports about the case so far.
Ramli said that based on his observations, the public is often deceived by AI-generated deepfakes, particularly when influential persons such as politicians, businessmen and celebrities are involved.
According to him, videos created using the images and voices of influential individuals can be used to commit commercial crimes.
‘This tactic can lead to commercial criminal activities; for instance, there may be videos featuring influential individuals like politicians and celebrities urging people to invest in dubious schemes.
‘The spread of deepfake videos can cause unrest and disharmony if not addressed early,’ he said.
Therefore, Ramli urged the public to exercise caution and verify the authenticity of any information received to avoid falling victim to such scams.
Communications Minister Fahmi Fadzil today urged the public to be wary of AI-generated deepfakes after popular songstress Datuk Seri Siti Nurhaliza fell victim to such tactics.
He said that while the use of AI indeed helps the country’s development, some parties are misusing the technology.
Yesterday, the country’s number-one singer reportedly revealed the latest tactic used by scammers believed to be using AI technology to imitate her voice and face for profit.
Source: BERNAMA News Agency