Ruling: Platforms can be held responsible for offensive posts only in certain cases.
Operators of social media platforms can be held liable for unlawful content posted by their users only if the infringement is easily identifiable, the Oberlandesgericht (OLG) Frankfurt am Main ruled on Thursday.
The decision arose from a case in which Baden-Württemberg's Anti-Semitism Commissioner Michael Blume complained to Twitter (now known as X) about a series of offensive tweets. He demanded that the platform remove the tweets in question and stop distributing them.
In response, Twitter deleted the account of a user who had published six of the problematic tweets. In the initial ruling, the Landgericht Frankfurt am Main ordered Twitter by preliminary injunction to stop distributing five specific statements the user had made about Blume. Twitter appealed the ruling.
The OLG overturned the injunction. It reasoned that the operator merely provides a forum for third-party statements and becomes responsible for potentially infringing content only after being made aware of it. The affected party must first approach the platform with specific complaints from which the infringement can be easily recognized.
Only after receiving such specific complaints does the provider assume a duty to investigate and assess the reported circumstances. In Blume's case, the court found that his lawyer's letter did not contain enough facts for the platform to recognize an infringement without extensive legal or factual review; it merely referred to "illegal content" without explanation or supporting evidence.
Blume also argued that X's reporting form did not provide a text field for additional information. The OLG found, however, that the reporting form met the requirements of the German Act to Improve Enforcement in Social Networks (NetzDG) and was primarily aimed at reporting criminal content. Further details could have been provided in the "Content" field or attached as an annex.
The decision, handed down in expedited proceedings, is not subject to appeal, the OLG said.
Press release of the OLG Frankfurt am Main.
Read also:
- Despite this ruling, the case in Germany involving Baden-Württemberg Anti-Semitism Commissioner Michael Blume and Twitter (now known as X) highlighted the responsibility of social media platforms in dealing with offensive content.
- The Regional Court (Landgericht) Frankfurt am Main initially ordered Twitter to stop distributing five specific statements by a user after Blume filed a complaint, but Twitter appealed the ruling to the OLG Frankfurt am Main.
- The judgment of the OLG Frankfurt am Main (Higher Regional Court) determined that Twitter was not responsible for the offensive tweets until it became aware of them through sufficiently specific complaints.
- In the underlying proceedings in Hesse, the Regional Court Frankfurt am Main had to consider a complaint that posts by a user on the short-message service violated another person's rights.
- Michael Blume, who is also active on Twitter, has been a vocal advocate for fighting hate speech and promoting tolerance on social media platforms.
- The decision in the Blume case could have implications for other regions in Germany, as the OLG set a precedent for platforms' responsibility in handling offensive content shared by their users.
- This year, Frankfurt am Main, known as the financial hub of Germany and home to the European Central Bank, hosted the Frankfurt Book Fair, which showcased discussions on various issues, including the role and responsibility of social media platforms in dealing with offensive content.