Why mental health is not for chatbots: Replika and its ban from Italy

That the mental health of the youngest Internet users should be a priority is not exactly news.

There are now many reports addressing the long-term consequences of social networks and their derivatives in the hands of people unanimously considered more fragile, and therefore in greater need of protection under various legal systems.

More interesting, however, is that the mental health of the youngest has been placed front and center by an authority whose remit is privacy. We are used to thinking of privacy authorities as bodies focused on compliance with the formal procedures that allow companies to use the data users provide to them.

It is rarer for them to go so far as to scrutinize the nature of the content of a social network or, as in this case, of an app.

This is the case with Replika, a US company based in California that sells artificial intelligence services, chiefly through an app: the creation of a sort of virtual replica of oneself, or a virtual girlfriend, to interact with.

The app was particularly successful among very young people, and at the beginning of February 2023 the Italian authority decided to ban it from the Italian market, threatening fines (up to about 22 million dollars) if the problems identified were not resolved within 20 days of the ruling.

The most important problem identified is that a minor (a teenager or even a child) can open an account without any age verification by the app and interact with it without limitation.

The authority considered the filter in place to be absolutely inadequate: it collects data from minors and in effect uses that data against the minors themselves. The app has proven dangerous to users' mental health, offers sexually explicit content, and in some cases has reportedly even shown signs of potential incitement to suicide. Not infrequently it has been shown to respond in a worryingly harassing and mean-spirited manner, exposing younger users to potential psychological harm.

The app unlawfully collects large amounts of users' personal data, including personal confessions and problems, joys and sorrows. The avatar, moreover, is designed to gain the user's trust by pushing them to reveal as much as possible about themselves. As often happens, the app declares in its policy that it respects the GDPR, but in practice, even though it claims that users cannot be under the age of 13 (while in the app stores where it is sold it is rated as suitable only for people over 17), it does nothing to prevent minors of that age from interacting with it. The legal basis for the data processing would therefore be the contract between the user and the chatbot itself, but this hardly seems sufficient, given that a minor could scarcely be expected to fully understand its terms.

There is another aspect to consider, often overlooked, which in this case could extend the powers of the privacy authority over the matter. The service offers a type of content that affects people's mental health. If this were established incontrovertibly (and not only in Italy), we would be faced with a much more delicate situation. The app would then be forced to comply with even more stringent rules relating to people's health, like any other health service provider: limits on advertising, content monitoring to ensure users are treated fairly, restrictions on the use and storage of data, and above all full responsibility for users' health if it is compromised by the chatbot's responses. At that point the privacy authority would find itself responsible not only for protecting users' personal data and the procedures by which they are collected, but for doing so above all to protect those users' mental health. Jen Persson, director of the children's privacy advocacy group Defend Digital Me, shares this view: she told Reuters that tools designed to influence a child's mood or mental well-being ought to be classified as health products, and should therefore be subject to stringent safety standards.

Replika therefore seems a typical case in which what looks like “gaming” is actually a mask behind which addiction and abuse are channeled to attract the most vulnerable, under rules that are cited in the company's privacy policies but never applied.

We will see how the company reacts to the ruling, and whether other authorities follow Italy, this time a pioneer in defending the rights of the youngest (and of all those who need support and hope, in good faith, to find it in an app unprepared for the task).


