State Data Protection Commissioner - Roßnagel calls for clear rules for the use of AI
Hessian State Data Protection Commissioner Alexander Roßnagel warns against uncontrolled growth in the use of AI text robots such as ChatGPT. In an interview with Deutsche Presse-Agentur in Wiesbaden, the expert said that he urgently recommends that authorities and companies in particular draw up guidelines for dealing with artificial intelligence. "It would make sense, for example, to set up an official account if the system is used to generate texts." This would protect employees whose data could otherwise be analyzed by the software.
"Companies also need to regulate what these systems can be used for," said Roßnagel. According to the data protection officer's advice, no personal data should be entered and only general questions should be asked.
The results generated by the computer must be checked by an employee before they are used for any further work. "Data protection advocates are not opposed to the use of AI in principle," emphasized the expert. In many cases, assistance functions could be very helpful. However, there are also risks.
AI chatbots such as ChatGPT from the start-up OpenAI can formulate texts at the linguistic level of a human. They work by estimating, word by word, how a sentence should continue. The models are trained with huge amounts of information.
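To illustrate the word-by-word principle described above, here is a minimal sketch in Python. It is not how ChatGPT is actually implemented (real systems use large neural networks over tokens); the probability table and the function name are made up purely for illustration.

```python
import random

# Toy, hand-made table of "which word tends to follow which" - an assumption
# for illustration only, not real model data.
NEXT_WORD_PROBS = {
    "the": {"data": 0.5, "system": 0.3, "company": 0.2},
    "data": {"protection": 0.6, "is": 0.4},
    "protection": {"matters": 0.7, "rules": 0.3},
}

def continue_sentence(start_word: str, max_words: int = 4) -> str:
    """Extend a sentence by repeatedly estimating and sampling the next word."""
    words = [start_word]
    for _ in range(max_words):
        choices = NEXT_WORD_PROBS.get(words[-1])
        if not choices:  # no known continuation: stop generating
            break
        next_word = random.choices(list(choices), weights=list(choices.values()))[0]
        words.append(next_word)
    return " ".join(words)

print(continue_sentence("the"))  # e.g. "the data protection matters"
```

The point of the sketch is only that each new word is chosen from an estimated probability distribution conditioned on what came before; the large training corpora mentioned above are what give real models those estimates.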
Roßnagel urged caution regarding what data AI chatbots are fed - for example, in order to obtain more accurate results. "Are there trade secrets or personal data involved? That could be problematic," he warned. As a rule, the chat histories held with such a system are also evaluated. "It is possible that personal data of customers and clients, for example, could be entered," said Roßnagel.
- The German Press Agency reported from Wiesbaden about Hesse's State Data Protection Commissioner Alexander Roßnagel, who recommended implementing guidelines for AI use, including creating an official account for software like ChatGPT to protect employee data.
- Roßnagel suggested that companies should restrict AI chatbots like ChatGPT to handling only general questions and not enter personal data, as the software's evaluations of chat histories could potentially expose sensitive information.
- In Hesse, authorities and companies have seen uncontrolled growth in the use of AI software such as ChatGPT, which can write at the linguistic level of a human and is trained on vast amounts of data.
- In the interview in Wiesbaden, Roßnagel warned about the data protection and privacy risks of AI chatbots such as ChatGPT, citing the danger of feeding the systems trade secrets or personal data in order to obtain better results.
Source: www.stern.de