April 20, 2024

Three Samsung employees reportedly leaked sensitive data to ChatGPT

On the surface, ChatGPT might seem like a tool that could come in handy for a range of work tasks. But before you ask the chatbot to summarize important memos or check your work for errors, it’s worth remembering that anything you share with ChatGPT could be used to train the system and perhaps even surface in its responses to other users. That’s something several Samsung employees probably should have kept in mind before they reportedly shared confidential information with the chatbot.

Soon after Samsung’s semiconductor division began allowing engineers to use ChatGPT, workers leaked confidential information to it on at least three occasions, according to a Korean media report. One employee reportedly asked the chatbot to check sensitive database source code for errors, another asked it to optimize code, and a third fed it a recording of an internal meeting and asked it to generate minutes.

The report notes that after learning of the security slip-ups, Samsung attempted to limit the scope of future incidents by restricting the length of employees’ ChatGPT prompts to 1 kilobyte, or 1,024 characters of text. The company is also said to be investigating the three employees involved and is building its own in-house chatbot to prevent similar incidents. Engadget has contacted Samsung for comment.
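For a sense of what such a cap involves, a company-side guard like the reported 1-kilobyte limit is simple to enforce before a prompt ever leaves the corporate network. The sketch below is purely illustrative: the function names and error type are assumptions for the example, not Samsung’s actual implementation.

```python
# Illustrative sketch only: one way an internal proxy might enforce a
# 1 KB (1,024-character) cap on outbound chatbot prompts, as Samsung
# reportedly did. Names and structure are assumptions, not Samsung's code.

MAX_PROMPT_CHARS = 1024  # 1 kilobyte of text, per the reported policy


class PromptTooLongError(ValueError):
    """Raised when an employee prompt exceeds the allowed length."""


def check_prompt(prompt: str) -> str:
    """Reject prompts longer than the cap before they reach the chatbot."""
    if len(prompt) > MAX_PROMPT_CHARS:
        raise PromptTooLongError(
            f"Prompt is {len(prompt)} characters; the limit is {MAX_PROMPT_CHARS}."
        )
    return prompt


if __name__ == "__main__":
    try:
        # Roughly 3,200 characters of source code, well over the cap.
        check_prompt("def add(a, b):\n    return a + b\n" * 100)
    except PromptTooLongError as err:
        print(f"Blocked: {err}")
```

A length cap like this limits how much can leak in any single prompt, though it obviously cannot tell confidential code from harmless text, which may be why Samsung is reportedly building its own chatbot instead.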

ChatGPT’s data policy states that unless users explicitly opt out, their prompts may be used to train its models. OpenAI, the chatbot’s owner, urges users not to share confidential information with ChatGPT in conversations because it is “unable to delete specific prompts from your history.” The only way to remove personally identifiable information from ChatGPT is to delete your account entirely.

The Samsung saga is another example of why it’s worth exercising caution when using chatbots, just as you should with all of your online activity. You never really know where your data will end up.