Do not share sensitive information with ChatGPT

By Time News

Wednesday, February 1, 2023 03:12 PM

Earlier this month, a Microsoft employee asked in an internal forum whether employees were allowed to use ChatGPT at work, CNBC reported.

Microsoft responded that use of the tool is allowed, as long as employees do not share confidential information with it, saying: “Please do not send sensitive data to any of the OpenAI programs, as they may use it to train future models.”

The world’s sudden obsession with ChatGPT has caused some tech companies to scramble to warn employees against interacting freely with the technology.

What makes the warning notable is that Microsoft is a major backer and partner of OpenAI: just last week, the company announced a new round of investment in the startup.


The software giant also plans to integrate OpenAI technology into some of its other products, such as the Bing search engine and Office applications, The Information previously reported.

The close relationship between the two companies could create a conflict of interest for Microsoft, since it stands to benefit from OpenAI gaining more training data, according to Vincent Conitzer, professor of computer science and director of the Artificial Intelligence Lab at Carnegie Mellon University.

The fear is that workers could inadvertently share confidential company information, such as internal software code, when asking the chatbot for advice on how to improve their work.

ChatGPT could, in turn, use that data to train itself and possibly reproduce pieces of the confidential information it receives in future exchanges with other users.

Source: Technology News: Microsoft warns its employees: Do not share sensitive information with ChatGPT
