ElectronicsGadgetITScienceTechnology

LLMs vulnerable to data poisoning and prompt injection risks, UK authorities warn

The UK’s National Cyber ​​Security Center (NCSC) has warned organizations to be vigilant against the imminent cyber risks associated with integrating Large Language Models (LLMs) such as ChatGPT into their businesses, products and services.

in set of blog postNCSC is a global technology The community is still not fully aware of LLM’s features, weaknesses, and (most importantly) vulnerabilities. “I would say that our understanding of LLM is still ‘beta’,” the official said.

One of the most widely reported security weaknesses in existing LLMs is their vulnerability to malicious “prompt injection” attacks. They occur when the user creates input intended to cause a cause. AI Models that behave in unintended ways, such as generating offensive content or disclosing confidential information.

Additionally, there are two risks to the data on which LLM is trained.First of all, a lot of this data Collected from the open internet, it may contain inaccurate, controversial, or biased content.

Second, cybercriminals can not only distort the data available for malicious acts (also known as “data poisoning”), but can also use it to hide prompt injection attacks. . In this way, for example, an AI assistant for bank account holders can be tricked into transferring money to an attacker.

“The advent of LLM is undoubtedly a very exciting time in technology, with many people and organizations (including NCSC) wanting to explore and benefit from LLM,” the official said.

“However, organizations building services that use LLM should exercise the same caution when using beta products or code libraries,” NCSC added. In other words, be careful.

UK authorities are urging organizations to establish cybersecurity principles to ensure that any “worst case scenario” allowed for LLM-powered applications can be addressed.

https://thenextweb.com/news/llms-data-poisoning-prompt-injection-risks LLMs vulnerable to data poisoning and prompt injection risks, UK authorities warn

Show More
Back to top button