Premier Scott Moe's face is a frequent sight on social media, but recently his likeness has been spotted in dubious video ads for cryptocurrency schemes he has never endorsed.
The videos take Moe's voice and likeness and use AI to produce a convincing video of him speaking.
This is not the first time the premier's image has been used to push fraudulent ventures. In March, Moe acknowledged similar schemes that were using his face to sell cryptocurrencies.
On August 1, Moe spoke strongly against the videos and denied any affiliation with the websites mentioned in them.
"I want to state unequivocally that when you see myself, and likely any politician, endorsing specific cryptocurrencies or things of that nature, it is most likely and almost certainly a deepfake," he said.
"You should just scroll right past it. They are only there to hurt you."
Saskatchewan's Financial and Consumer Affairs Authority (FCAA) also issued a warning, advising people not to send money to entities that are not registered in the province.
Under The Securities Act, 1988, people and companies must register with the FCAA to trade or sell securities and other financial products in Saskatchewan. The FCAA has set up a website, aretheyregistered.ca, to let people easily check the registration status of anyone offering an investment.
"Do not deal with unregistered entities," said Dean Murrison, executive director of the FCAA's securities division. "Do not make investment decisions based on endorsements by public figures."
According to the FCAA, scammers are creating fraudulent news articles and social media posts, commonly using deepfakes and other methods to imitate real media sources, including the CBC.
Beyond verifying that an investment entity is registered, the FCAA encourages people to always get a second opinion or seek professional advice on investments they see online, and never to make an investment decision based on a notable figure's endorsement.
A global effort to fight back
The rapid development of artificial intelligence tools has given scammers unprecedented new ways to create fake videos of celebrities and politicians appearing to endorse their products.
Countries around the world are grappling with how to stay ahead of these scams. Denmark is considering a law that would allow people to copyright their digital likeness, enabling them to pursue civil cases if it is used without their consent.
Some AI observers are skeptical of that solution. Henry Ajder, an expert on deepfakes, says the copyright classification would force people to proactively pursue AI abusers themselves, rather than relying on the government to police the issue.
"Copyright is treated as a civil matter, not necessarily a criminal one. So violating copyright is not something the state will necessarily prosecute someone for," he said. "It's something you would be expected to bring a civil case over."
Nelson Godfrey, an intellectual property lawyer based in Vancouver, says it is unlikely that Canada will follow the Danish route.
"Trying to characterize someone's likeness as copyright is a bit strange. So to make it work within the existing copyright legislation, you would certainly need to figure out how ownership works, how authorship works, whether there is joint authorship or co-ownership of works," he said. "There are real complications when it comes to those things."
Unlike the European Union, Canada has no laws specific to deepfakes, though it does have digital safety laws and a new minister of artificial intelligence.
The Department of Justice says it is working to criminalize non-consensual sexual deepfakes, something Prime Minister Mark Carney promised while campaigning ahead of the most recent federal election.