China’s DeepSeek AI is watching what you type

DeepSeek, a free artificial intelligence chatbot from China that is undercutting its US counterparts, has raised concerns about whether it is safe to use.

Although cybersecurity researchers say the app does not appear to be immediately dangerous, it still carries substantial privacy risks: as an AI product subject to China's laws, it can collect, and be compelled to hand over, everything people tell it.

All large language models, or LLMs, the type of advanced AI-driven chatbot made famous by OpenAI's ChatGPT, are trained on massive amounts of data and work in part by collecting what people type into them. DeepSeek, although more efficient than ChatGPT, is no different.

Under Chinese law, all companies must cooperate with and assist Chinese intelligence efforts, which can expose data held by Chinese companies to surveillance by the Chinese government. That differs from the US, where, in most cases, American agencies generally need a warrant or court order to access information held by US tech companies.

But it is possible to use DeepSeek while minimizing the amount of data it sends to China. Using the app or the chatbot through DeepSeek.com requires users to register an account, either with an email address or with a Chinese phone number, which most people outside China do not have.

Lukasz Olejnik, an independent consultant and researcher at King's College London, told NBC News that this means people should be careful about sharing any sensitive or personal data with DeepSeek.

"Be cautious when entering confidential personal data, financial details, trade secrets or health care information. Anything you type could be stored, analyzed or handed over to authorities under China's data laws," Olejnik said.

Ron Deibert, director of the University of Toronto's Citizen Lab, said DeepSeek users should be particularly cautious if they have reason to fear the Chinese authorities.

"Users who are high-risk in relation to mainland China, including human rights activists, members of targeted diaspora populations and journalists, should be particularly sensitive to these risks and avoid entering anything into the system," Deibert said.

One way to reduce what you send to China is to register for DeepSeek with a fresh email account, not one you already use for other important services. That could keep the app, or potentially Chinese intelligence services, from easily matching what you tell DeepSeek to who you are elsewhere on the internet.

For the more technically savvy, it is possible to download the DeepSeek AI model and query it directly, without going through the Chinese company that processes those requests. That not only prevents China from seeing anything you give the model, but also means little or no censorship of topics blocked in Beijing, Olejnik said.
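One way to do that, sketched below under the assumption that the Hugging Face `transformers` library is installed and that the publicly released distilled checkpoint `deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B` is used (smaller and larger variants exist, and hardware requirements vary):

```python
# Sketch: querying a locally downloaded DeepSeek model so that no
# prompt text ever leaves the machine. Assumes the Hugging Face
# `transformers` library is installed; the model name below is one
# of DeepSeek's publicly released distilled checkpoints.

def ask_locally(prompt: str,
                model_id: str = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B") -> str:
    # Imported lazily so the rest of a program still runs without the library.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)   # downloaded once, then cached
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=200)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because the weights are downloaded once and inference then runs entirely on the local machine, nothing typed into `ask_locally` reaches DeepSeek's servers.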

DeepSeek has also raised concerns because its privacy policy states that it collects a large amount of sensitive information from users, including what type of device they are using and "keystroke patterns or rhythms." While some people may find that invasive, the collection is limited to what a person types into the app, not what they type in other apps, and it is not unheard of: TikTok and Facebook, for example, have had ways to track users' keystrokes and mouse movements.
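Keystroke-rhythm tracking of the kind the policy describes comes down to recording the timing between key presses: the gaps form a behavioral fingerprint that can help re-identify a user. A minimal illustration of the idea, using hypothetical timestamps rather than anything DeepSeek actually transmits:

```python
# Illustration of keystroke-dynamics profiling: the millisecond gaps
# between consecutive key presses, not the keys themselves, are
# distinctive per user. All timestamps here are hypothetical.

def inter_key_intervals(timestamps_ms):
    """Return the millisecond gaps between consecutive key presses."""
    return [b - a for a, b in zip(timestamps_ms, timestamps_ms[1:])]

# Two typing samples of the same word, hypothetically by the same user:
sample_a = [0, 95, 210, 290, 405]
sample_b = [0, 100, 205, 295, 400]

print(inter_key_intervals(sample_a))  # [95, 115, 80, 115]
print(inter_key_intervals(sample_b))  # [100, 105, 90, 105]
```

The two interval sequences are close to each other, which is what makes typing rhythm usable as a tracking signal across sessions.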

Deibert warned that while there are risks in giving information to a Chinese LLM, American ones carry risks too.

"The same risks apply to all AI platforms, including those based in the United States," Deibert said.

Deibert said that many American tech companies collect similarly sensitive information and have recently courted President Donald Trump. "Anyone who is critical of the administration, acts as a watchdog of the administration, or is part of a vulnerable or at-risk community should exercise serious caution before using, or entering any data into, what are largely 'black boxes.' Remember, as with virtually all social media platforms, user data is part of the raw material used to train those systems," he said.



