Roßnagel calls for clear rules for the use of AI

AI text robots such as ChatGPT are now being used in numerous places - including in companies. Data protection expert Roßnagel warns of potential risks.

Alexander Roßnagel, Hessian data protection officer. Photo: aussiedlerbote.de

Hessian State Data Protection Commissioner Alexander Roßnagel warns against uncontrolled growth in the use of AI text robots such as ChatGPT. In an interview with Deutsche Presse-Agentur in Wiesbaden, the expert said that he urgently recommends that authorities and companies in particular draw up guidelines for dealing with artificial intelligence. "It would make sense, for example, to set up an official account if the system is used to generate texts." This would protect employees whose data could otherwise be analyzed by the software.

"Companies also need to regulate what these systems can be used for," said Roßnagel. According to the data protection officer's advice, no personal data should be entered and only general questions should be asked.

The results generated by the computer must be checked by an employee before they are used for further work. "Data protection advocates are not fundamentally opposed to the use of AI," emphasized the expert. In many cases, assistance functions can be very helpful. However, there are also risks.

AI chatbots such as ChatGPT from the start-up OpenAI can formulate texts at a human linguistic level. The underlying principle is that they estimate, word by word, how a sentence is likely to continue. The models are trained on huge amounts of data.
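To illustrate the word-by-word principle described above, here is a minimal, purely illustrative Python sketch of a toy next-word predictor. It is not how ChatGPT works internally: the corpus, the follows table and the continue_sentence function are hypothetical stand-ins, and a real model learns probabilities over tokens with a neural network rather than by counting word pairs.

    from collections import Counter, defaultdict

    # Hypothetical mini "training data"; real models are trained on vastly more text.
    corpus = (
        "data protection requires clear rules . "
        "clear rules protect employees . "
        "employees enter data . "
        "data protection protects employees ."
    ).split()

    # Count which word tends to follow which -- a crude stand-in for training.
    follows = defaultdict(Counter)
    for current, nxt in zip(corpus, corpus[1:]):
        follows[current][nxt] += 1

    def continue_sentence(start, max_words=6):
        """Extend a prompt one word at a time, always picking the most frequent follower."""
        words = start.split()
        for _ in range(max_words):
            candidates = follows.get(words[-1])
            if not candidates:
                break
            words.append(candidates.most_common(1)[0][0])  # greedy next-word choice
        return " ".join(words)

    print(continue_sentence("clear rules"))  # extends the prompt word by word

The sketch picks the single most frequent continuation at each step; systems like ChatGPT instead sample from a learned probability distribution over a large vocabulary, which is why their output varies between runs.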

Roßnagel urged caution with regard to what data AI chatbots are fed - for example, in order to obtain more accurate results. "Are there trade secrets or personal data involved? That could be problematic," he warned. As a rule, the chat histories users have with such a system are also evaluated. "It is possible that personal data of customers and clients, for example, could be entered," said Roßnagel.

Read also:

  1. The German Press Agency reported from Wiesbaden about Hesse's State Data Protection Commissioner Alexander Roßnagel, who recommended implementing guidelines for AI use, including creating an official account for software like ChatGPT to protect employee data.
  2. Roßnagel suggested that companies should restrict AI chatbots like ChatGPT to handling only general questions and not enter personal data, as the software's evaluations of chat histories could potentially expose sensitive information.
  3. The state of Hesse, along with various companies, has seen uncontrolled growth in the use of AI software such as ChatGPT, which can formulate texts at a human linguistic level and is trained on vast amounts of data.
  4. In the Wiesbaden interview, the expert warned about the potential dangers of AI chatbots such as ChatGPT for data protection and privacy, citing the risk of feeding the systems trade secrets or personal data in pursuit of better results.

Source: www.stern.de
