In a landmark preliminary ruling, the European Court of Justice (ECJ) clarified the extent to which online platforms are responsible for personal data contained in user-generated content and what due diligence they must exercise when publishing it (judgment of 2 December, case no. C-492/23 – Russmedia). The case arose from a legal dispute concerning an advertisement published on a Romanian online marketplace. The advertisement wrongfully portrayed a woman as a provider of sexual services and contained her photos and telephone number, all without her consent. The advertisement was subsequently reproduced on other online classifieds platforms.
In its judgment, the court clarified that, in addition to the advertiser (who has remained unknown), the operator of the online marketplace is a joint controller of the personal data contained in the advertisement under the General Data Protection Regulation (GDPR) if the publication of the advertisement serves the operator’s own (particularly commercial) interest and if the operator influences the processing of this data beyond merely publishing (hosting) it. This can be the case if, for example, the platform operator prescribes the presentation, duration, categorisation, or (algorithmic) ranking of advertisements, and thus does not remain purely neutral from a data protection perspective. The ruling is particularly relevant because these criteria apply to almost all commonly used online platforms that feature user-generated content.
Whenever platform operators act as joint controllers of the personal data in their user-generated content, they now face far-reaching obligations under the ruling: before publishing user-generated content, platform operators must check whether it contains sensitive data (e.g. regarding health or sexuality), verify the identity of the user posting it and, if necessary, obtain consent, and take appropriate technical and organisational measures to at least limit the uncontrolled dissemination of personal data. Without such a prior check, user-generated content may not be published. At the same time, the court clarified that platforms may not invoke the liability privilege for hosting providers, under which they would not be required to review user-generated content. According to the ECJ’s decision, this privilege, which has only recently been confirmed under the new EU Digital Services Act, does not apply where platform operators act as joint controllers of personal data in user-generated content.
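To make the described due diligence steps more tangible, the following minimal sketch shows how such a pre-publication check might be wired into a platform’s upload flow. It is purely illustrative: the Submission data model, the keyword-based sensitive-data detector and the decision rule are assumptions chosen for brevity, not requirements derived from the judgment itself, and a real implementation would need far more robust detection as well as documented consent handling.

```python
# Illustrative sketch of a pre-publication screening step for user-generated
# content. The data model, the keyword-based detector and the decision rules
# are hypothetical assumptions, not a statement of what the ruling requires.
from dataclasses import dataclass

# Placeholder detector: a real platform would use far more reliable means to
# identify special categories of data (health, sexuality, etc.).
SENSITIVE_KEYWORDS = {"health", "medical", "sexual"}


@dataclass
class Submission:
    text: str
    author_identity_verified: bool  # outcome of a (hypothetical) ID check
    consent_documented: bool        # consent of the data subject, if required


def may_publish(submission: Submission) -> bool:
    """Apply the sketched checks before content goes live."""
    contains_sensitive_data = any(
        keyword in submission.text.lower() for keyword in SENSITIVE_KEYWORDS
    )
    if contains_sensitive_data:
        # Sensitive data: block publication unless the poster's identity has
        # been verified and the necessary consent has been documented.
        return submission.author_identity_verified and submission.consent_documented
    # Non-sensitive content: fewer checks apply under this sketch.
    return True


if __name__ == "__main__":
    ad = Submission(
        text="Advertisement referring to sexual services",
        author_identity_verified=False,
        consent_documented=False,
    )
    print(may_publish(ad))  # False: sensitive content without verification/consent
```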
This holding, although it concerns a specific scenario on an online marketplace, can also be applied to various other online platforms. It may, for example, extend to cases of cyberbullying on Facebook, making the ruling highly significant for social networks and online forums. In the future, these platforms may be required to review in advance all posts that (may) contain sensitive data and, if necessary, prevent their publication. In practice, the ruling could therefore mark a shift towards greater control of user-generated content online, which could jeopardise the business model of anonymous online platforms.
However, it should be noted that the new obligations apply only to platforms whose operators directly influence the processing of personal data contained in user-generated content; the ruling does not apply to pure hosting providers that remain neutral in the publication process. Review obligations are also less far-reaching for non-sensitive personal data. Platform operators should therefore examine whether and how they need to adapt their business models and technical processes. In extreme cases, it may be necessary either to establish reliable verification and control mechanisms for user-generated content or to revert to a pure hosting model in order to avoid the new review obligations.
(Dr Lukas Mezger, UNVERZAGT Rechtsanwälte)