AI applications under scrutiny by data protection authorities
Concerns about how artificial intelligence (AI) systems use and store data collected from users have recently been a frequent topic of discussion among data protection authorities and experts dealing with data security and privacy. These discussions reach the public through decisions, publications and press releases on the subject.
The Italian data protection authority, Garante per la Protezione dei Dati Personali ("Garante"), put ChatGPT at the center of these debates by temporarily restricting it and later announcing the lifting of that restriction. ChatGPT, a large language model developed by US-based OpenAI, is described as an AI-based chatbot that uses deep learning algorithms trained on large datasets to generate human-like language and produce meaningful responses through natural language processing.
There is concern that applications like ChatGPT are not sufficiently transparent about how they use and store data collected from users during interactions. Underlying these arguments is the view that ChatGPT's privacy policy is not clear enough, that it does not provide sufficient information about the use of collected data, that users do not have enough control over their data, and that there is no clear and transparent disclosure about the processing of personal information. It is also uncertain whether adequate security measures are in place to protect data, and whether user data may be shared with third parties when ChatGPT is integrated with other applications and services.
Garante's assessment of ChatGPT
With ChatGPT's recent rapid growth and increasingly widespread use, data protection authorities in the Netherlands, France, Spain, Italy and the UK have stepped up their scrutiny of AI-based applications and published materials in this field. The most important step came first from Italy. On 20 March 2023, following notice of a data breach affecting ChatGPT users' conversations and subscriber payment information, Garante announced that it had decided to temporarily restrict the US-based company's processing of Italian residents' data pending the findings of its investigation into ChatGPT's privacy practices. Notable violations cited in Garante's decision included the lack of clear and transparent disclosure to users and other data subjects about ChatGPT's processing of personal data; the absence of a legal basis for collecting personal data to train the algorithms underlying the platform; inaccurate data processing, as the information provided does not always correspond to factual data; and the lack of an age-verification filter, which exposes children to inappropriate responses. In particular, Garante found that ChatGPT's statement in its privacy policy that the service is intended for users over the age of 13 is not sufficient to protect child users. Garante's decision is also significant in that it does not distinguish between the use of personal data to create or train an algorithmic model and the user's input of personal data into the developed algorithmic model.
ChatGPT, which had been banned in Italy, resumed service after OpenAI made a series of changes to address data protection concerns, including increased transparency and strengthened data subject rights, following Garante's temporary restriction decision. The platform now offers users a number of rights, including the option to disable the use of their conversations to train ChatGPT's algorithms. With respect to children's data, controls have been put in place to protect children under the age of 13, and age-verification tools have been introduced for users accessing ChatGPT from Italy. Regarding the mishandling of data and the inability to guarantee accurate outputs, a notice has been posted on the site stating that ChatGPT may generate false information about "people, places and facts", and the privacy policy has been updated several times.
ChatGPT's privacy policy and Türkiye
The privacy policy, last updated on 27 April 2023 and published on the site, explains how the data of users accessing ChatGPT from Türkiye is processed. It serves as an important reference for assessing ChatGPT's data processing practices in light of the Personal Data Protection Law No. 6698 ("DPL") and its secondary legislation.
ChatGPT states in its privacy policy that personal data collected from users, such as chat history, search history, user queries, browser information, device information and location information, may be processed to provide, manage, maintain and analyse its services; to improve user experience through natural language processing and artificial intelligence techniques; to enhance its services; to communicate with users; to prevent abuse of the services; to ensure the security of IT systems and networks; to develop new programs and services; to fulfil legal obligations; and to protect the company's legitimate interests. The legal bases for data processing appear to rest, under the DPL, on the establishment and performance of a contract, the necessity of processing for the company's legitimate interests, and compliance with legal obligations. The policy states that the Terms of Use and Privacy Policy are deemed accepted when the user enters the requested information during registration and clicks the continue button.
ChatGPT allows users to control whether their chat history is used in application development; the chat history of users who opt out is deleted from the system within 30 days. The lack of an effective age-verification system at registration is an important issue for users accessing the service from Türkiye, which the Personal Data Protection Board ("Board") is expected to address.
The privacy policy states that additional information within the scope of the GDPR is provided to users in the EU, the UK and Switzerland. Evaluating the processing conditions specified in the privacy policy from the perspective of the DPL, one can see that the conditions under the DPL show similarities with those of the GDPR: the necessity of processing the personal data of the parties to a contract for its establishment or performance, the necessity of processing for the data controller to comply with a legal obligation, and the necessity of processing for the legitimate interests of the data controller, in addition to explicit consent. Moreover, since the Terms of Use and Privacy Policy are published only in English, users accessing the service from Türkiye may not be presented with the content in an easily understandable manner, or may agree to the terms without fully understanding them. Regarding cross-border data transfers, current practice is expected to violate the DPL with respect to users accessing from Türkiye. Furthermore, registration with the Data Controllers' Registry (VERBIS) is a legal obligation to be considered. AI applications should also pay the utmost attention to the right to object to adverse consequences arising from the analysis of data processed exclusively by automated systems, one of the rights granted to data subjects under the DPL. Processing data in this manner may produce adverse consequences for individuals, which is the basis of the objection right under the DPL. This issue is set out in great detail in the GDPR, and EU countries, not content with this, are preparing separate legislation on artificial intelligence.
On the other hand, countries such as the UK, which approach the issue with a focus on innovation, demonstrate the need to take an interdisciplinary stance, adopt common principles, and regulate producers with regard to the opportunities and risks associated with artificial intelligence.
Undoubtedly, AI applications like ChatGPT, and automated processing of personal data more generally, will remain on the agenda for a long time and be widely debated in all their aspects. AI-based applications like ChatGPT should provide clear and transparent information in their privacy policies and take the necessary steps to protect users' data. Data protection authorities, for their part, are expected to regularly audit the correct operation of these systems. In general, all next-generation applications and solutions should undergo prior auditing for personal data protection and privacy, with compliance checks and analysis performed before the products and applications are put into use.
We would like to thank Aslı Naz Güzel for her contribution.
