CASE OF SANCHEZ v. FRANCE
(Application no. 45581/15)
JUDGMENT
Art 10 • Freedom of expression • Elected politician fined in criminal proceedings for failing to delete, from his publicly accessible Facebook “wall” used for his election campaign, Islamophobic comments by third parties also convicted • Foreseeability of law • Duties and responsibilities of politicians using social networks for political and election-related purposes • Accentuated impact of hate speech causing greater harm in election period marked by tensions • Need for shared liability between all actors involved • Desirable implementation of minimum degree of moderation or prior filtering by host or account holder to identify and remove unlawful remarks within a reasonable time, even in absence of notification by injured party • Deliberate choice of applicant, experienced in public communication and having knowledge of digital platforms, to allow public access to account • Failure to act despite being informed of impugned comments • No question of account with potentially excessive traffic • Proportionality analysis by Court depending on level of responsibility of person concerned and degree of notoriety and representativeness • Proportionate criminal sentence
STRASBOURG
15 May 2023
This judgment is final but it may be subject to editorial revision.
TABLE OF CONTENTS
THE CIRCUMSTANCES OF THE CASE
RELEVANT LEGAL FRAMEWORK AND PRACTICE
A. Freedom of the Press Act (Law of 29 July 1881)
B. Audiovisual Communication Act (Law no. 82-652 of 29 July 1982)
C. The legal regime applicable to the “producer”
1. The concept of “producer”
2. Establishing the liability of the “producer”
D. Other relevant domestic law material
1. Case-law of the Court of Cassation
2. Legislation subsequent to circumstances of present case
II. INTERNATIONAL MATERIAL
A. Communication on the Internet
2. Other international sources
(a) Committee of Ministers of the Council of Europe
(b) Parliamentary Assembly of the Council of Europe (PACE)
(c) European Commission against Racism and Intolerance (ECRI)
(b) Committee on the Elimination of Racial Discrimination
C. European Union law and case-law of the Court of Justice of the European Union (CJEU)
D. Comparative law material
III. TERMS OF USE OF SOCIAL NETWORK FACEBOOK
ALLEGED VIOLATION OF ARTICLE 10 OF THE CONVENTION
B. The parties’ submissions
3. Observations of third-party interveners
(a) The Government of Slovakia
(b) The Government of the Czech Republic
(c) Media Defence and the Electronic Frontier Foundation
(d) European Information Society Institute (EISi)
1. Whether there has been an interference
2. Whether the interference was lawful
(b) Application of those principles to the present case
3. Whether the interference pursued a legitimate aim
4. Whether the interference was necessary in a democratic society
(b) Application of those principles to the present case
CONCURRING OPINION OF JUDGE KŪRIS
DISSENTING OPINION OF JUDGE RAVARANI
DISSENTING OPINION OF JUDGE BOŠNJAK
JOINT DISSENTING OPINION OF JUDGES WOJTYCZEK AND ZÜND
In the case of Sanchez v. France,
The European Court of Human Rights, sitting as a Grand Chamber composed of:
Georges Ravarani, President,
Marko Bošnjak,
Gabriele Kucsko-Stadlmayer,
Krzysztof Wojtyczek,
Faris Vehabović,
Egidijus Kūris,
Branko Lubarda,
Armen Harutyunyan,
Georgios A. Serghides,
Lətif Hüseynov,
María Elósegui,
Gilberto Felici,
Erik Wennerström,
Saadet Yüksel,
Ana Maria Guerra Martins,
Mattias Guyomar,
Andreas Zünd, judges,
and Marialena Tsirli, Registrar,
Having deliberated in private on 29 June 2022 and 8 February 2023,
Delivers the following judgment, which was adopted on the latter date:
1. The case originated in an application (no. 45581/15) against the French Republic lodged with the Court on 15 September 2015 under Article 34 of the Convention for the Protection of Human Rights and Fundamental Freedoms (“the Convention”) by a French national, Mr Julien Sanchez (“the applicant”).
2. The applicant was represented by Mr D. Dassa-Le Deist, a lawyer practising in Paris. The French Government (“the Government”) were represented by Mr F. Alabrune and later by Mr D. Colas, Director of Legal Affairs at the Ministry of European and Foreign Affairs.
3. The applicant contended that there had been a violation of Article 10 of the Convention, on account of his criminal conviction for the offence of incitement to hatred or violence against a group or an individual on grounds of religion, following his failure to take prompt action to delete comments posted by third parties on the “wall” of his Facebook account.
4. The application was allocated to the Fifth Section of the Court (Rule 52 § 1 of the Rules of Court). On 9 January 2018 notice of the complaint under Article 10 of the Convention was given to the Government and the remainder of the application was declared inadmissible pursuant to Rule 54 § 3.
5. On 2 September 2021 a Chamber of that Section composed of Síofra O’Leary, President, Mārtiņš Mits, Ganna Yudkivska, Stéphanie Mourou‑Vikström, Ivana Jelić, Arnfinn Bårdsen, Mattias Guyomar, judges, and Victor Soloveytchik, Section Registrar, delivered its judgment. It declared, unanimously, the application admissible and found, by six votes to one, that there had been no violation of Article 10 of the Convention.
6. On 29 November 2021 the applicant sought the referral of the case to the Grand Chamber and on 17 January 2022 the panel of the Grand Chamber accepted that request.
7. The composition of the Grand Chamber was then decided in accordance with Article 26 §§ 4 and 5 of the Convention and Rule 24.
8. Both the applicant and the Government submitted written observations on the merits of the case (Rule 59 § 1).
9. Observations were also received from the Slovak and Czech Governments, Media Defence, the Electronic Frontier Foundation and the European Information Society Institute, having been granted leave by the President of the Grand Chamber to submit written comments (Article 36 § 2 of the Convention and Rules 71 § 1 and 44 § 3).
10. A hearing took place in public in the Human Rights Building, Strasbourg, on 29 June 2022.
There appeared before the Court:
(a) for the Government
Mr T. Stehelin, co-Agent;
(b) for the applicant
Mr D. Dassa-Le Deist,
Mr S. Josserand, Counsel.
The Court heard addresses by Mr Stehelin, Mr Dassa-Le Deist and Mr Josserand, and also their replies to questions from judges.
11. The application concerns, with regard to Article 10 of the Convention, the criminal conviction of the applicant, at the time a local councillor who was standing for election to Parliament, for the offence of incitement to hatred or violence against a group or an individual on grounds of religion, following his failure to take prompt action to delete comments posted by third parties on the “wall” of his Facebook account.
12. The applicant was born in 1983 and lives in Beaucaire.
13. He has been the mayor of Beaucaire since 2014 and chairs the group of the Rassemblement national (a political party known as Front national (FN) until 2018) in the Regional Council of Occitanie. The website of Beaucaire town hall contains a page presenting the applicant on which it is also stated that in his “professional life” he was responsible for the “FN’s Internet strategy ... for 7 years”. At the time of the events at issue he was the Front national candidate for the Nîmes constituency in the French parliamentary elections. F.P., then a member of the European Parliament (MEP) and first deputy to the mayor of Nîmes, was one of his political opponents.
14. On 24 October 2011 the applicant wrote a post about F.P. on the publicly accessible “wall” of his Facebook account, which was administered by him personally, reading as follows (translation):
“While the FN has launched its new national website on schedule, spare a thought for the Nîmes UMP [Union for a Popular Movement] MEP [F.P.], whose site, which was supposed to be launched today, is displaying an ominous triple zero on its homepage ...”
15. Fifteen or so comments by third parties appeared in response to that post. Among them was that of S.B., who reacted on the same day by posting the following remarks on the applicant’s Facebook “wall” (translation):
“This great man has transformed Nimes into Algiers ... drug dealers and prostitutes reign supreme ... Thanks Franck and kisses to Leilla.” (sic)
16. Another reader, L.R., added the following three comments (translation):
“Shisha bars all over the town centre and veiled women... Look what’s become of nimes, the so-called roman city... The UMP and the PS are allies of the muslims.” (sic)
“Drug trafficking run by the muslims rue des lombards, it’s been going on for years... even with CCTV in the street... more drug dealing in plain sight on avenue general leclerc where riffraff sell drugs all day long but police never come and even outside schools, stones get thrown at cars belonging to ‘white people’ route d’arles at the lights all the time ... nimes, insecurity capital of languedoc roussillon.” (sic)
“prout, councillor for economic devellopment lol hallal economic devellopment boulevard gambetta and (islamic) republic street.” (sic)
17. On the morning of 25 October 2011, F.P.’s partner Leila T. (who had apparently been designated by the forename “Leilla” in the comment by S.B. - see paragraph 15 above) became aware of the comments. Feeling directly and personally insulted by remarks that she described as “racist”, associating her forename, which “sounded North African”, with the policy of her partner, she immediately went to the hairdressing salon run by S.B., whom she knew personally. S.B., who had been unaware that the applicant’s Facebook “wall” was public, deleted his comment just after Leila T. left, as he subsequently confirmed when he was interviewed by the gendarmes.
18. On 26 October 2011 Leila T. wrote to the Nîmes public prosecutor to lodge a criminal complaint against the applicant, together with S.B. and L.R., on account of the offending comments published on the applicant’s Facebook “wall”. With her letter she attached screen shots as evidence of the comments.
19. On 27 October 2011 the applicant posted a message on the “wall” of his Facebook account asking contributors to “be careful with the content of [their] comments”, but without moderating the comments already posted.
20. Leila T. was interviewed by gendarmes on 6 December 2011. She stated that she had discovered the comments on the morning of 25 October 2011 when she was in the office of her partner F.P., MEP and first deputy to the mayor of Nîmes. She explained that their relationship was public knowledge and that the comments on the applicant’s publicly accessible Facebook “wall”, interspersed with racist remarks, associated her North African‑sounding forename with the name of her partner and his policies. After she had discovered the comments she had immediately gone to the hairdressing salon run by S.B. to express her indignation. According to her, S.B. had been very surprised and had clearly not been aware of the public nature of this Facebook “wall”, but he had confirmed he was talking about her when he wrote “Thanks Franck and kisses to Leilla”. She added that she had then been accompanied to the town hall by the Prefect’s wife, who was just passing by and who had seen how annoyed she was. On the way there she had logged onto Facebook again to find that S.B.’s comment had already been removed. An investigation into the applicant’s Facebook account revealed, on the same day, that the applicant’s original post and the comments by L.R. were still visible, while those posted by S.B. had indeed disappeared.
21. For his part, L.R. was identified by the gendarmes during their investigation as being an employee of the Nîmes municipality. When interviewed by the gendarmes on 23 January 2012 he stated that he had been working as an assistant in the applicant’s election campaign and denied that his comments had been racist or had incited racial hatred. Explaining that he had never intended to target Leila T. with his remarks, he said that in the meantime he had deleted the comments in which F.P. could have recognised himself or have been recognised by others.
22. During his interview on 25 January 2012, S.B. told the gendarmes that he had been unaware that the applicant’s Facebook “wall” was publicly accessible and had deleted his comment immediately after Leila T. had confronted him at his hairdressing salon. He added that he had informed the applicant later that day of his altercation with her.
23. On 28 January 2012 the applicant himself was also interviewed by the investigators. Recalling that he had previously been a candidate in Nîmes, standing against F.P., Leila T.’s partner, he explained that he had been unable to monitor the large number of comments posted every week on the “wall” of his Facebook account. He indicated in particular: that he had not been the author of the impugned comments; that S.B.’s comment had been deleted by its author before he had had the time to do so; that he had only become aware of L.R.’s comments when he was summoned to the gendarmerie, and was prepared to delete them if the courts so requested; that he consulted his Facebook “wall” every day, but did not often read the comments, which were too numerous given that he had more than 1,800 “friends” who could post comments twenty-four hours a day, and that he preferred to post content to inform his readers; that Leila T. had not been mentioned by name and he had discovered her forename only when she had filed a complaint; that Leila T. had once personally taken him to task at a polling station; that she should have telephoned him to ask him to delete the comments, which would have “spared her the trouble” of filing a criminal complaint, but that her aim had clearly been to destabilise his candidature, as he was standing against her partner; that instead, Leila T. had gone to the hairdressing salon of S.B., whom she knew, to insult and threaten him in front of witnesses; lastly, that he knew L.R. and S.B., who were activists in his party but not office holders. Referring to his own foreign origins, he added that he had never displayed any racism or discrimination against anyone, and that he did not perceive any call to murder or violence in the impugned remarks, which in his view remained within the limits of any citizen’s freedom of expression. He emphasised that he had removed public access to his Facebook “wall” a few days before this interview, in order to limit access only to those who chose to be his friends and to avoid any further incidents that were not of his making. After the interview, the investigators were able to confirm that the applicant’s Facebook “wall” was indeed no longer accessible to the public.
24. The applicant, together with S.B. and L.R., was summoned to appear before the Nîmes Criminal Court in connection with the posting of the comments in question on the “wall” of his Facebook account, to answer charges of incitement to hatred or violence against a group, targeting in particular Leila T., on account of their origin or of their belonging, or not belonging, to a specific ethnicity, nation, race or religion. The summons referred to section 23, first paragraph, section 24, eighth paragraph, and section 65-3 of the Law of 29 July 1881, and section 93-3 of Law no. 82-652 of 29 July 1982.
25. In a judgment of 28 February 2013 the Nîmes Criminal Court found the applicant, S.B. and L.R. guilty as charged and ordered each of them to pay a fine of 4,000 euros (EUR). The applicant was convicted under section 23, first paragraph, and section 24, eighth paragraph, of the Law of 29 July 1881, and section 93-3 of Law no. 82-652 of 29 July 1982. S.B. and the applicant were also ordered jointly to pay EUR 1,000 to Leila T., as civil party, in compensation for the non-pecuniary damage she had sustained. However, the court did not see fit to impose the sanction of electoral disqualification that had been called for by the prosecution.
26. In its judgment, the court began by finding as follows:
“... To equate, in the same exchange, the members of the relevant group, i.e. ‘Muslims’, expressly with ‘drug dealers and prostitutes (sic)’ who ‘reign supreme (sic)’, ‘riffraff who sell drugs all day long’ or those responsible for the ‘stones [that] get thrown at cars belonging to white people’, was clearly likely, on account of both the meaning and scope of the words, to arouse a strong feeling of rejection or hostility towards a group of people, namely those of the Muslim faith, or presumed to be of that faith.”
27. The court further took the view that Leila T. could be regarded as having been provoked by the impugned comments, in view of the references to her partner, who was mentioned several times in the exchange, including in the quip “Thanks Franck and kisses to Leilla (sic)”, with the effect of portraying them both as being responsible for the alleged transformation of “Nimes into Algiers” and of arousing hatred or violence against them.
28. As regards the applicant, the court observed that it could be inferred from section 93-3 of Law no. 82-652 of 29 July 1982, as interpreted by the Constitutional Council in its decision of 16 September 2011, that the criminal liability of the producer of a website intended for communication to the general public, including access to comments posted by its users, would only be engaged in respect of such comments where it could be established that the producer had been aware of their content before they were posted, or otherwise where he or she had failed to act promptly to delete the comments at issue upon becoming aware of them. It dismissed the applicant’s argument that he had not had time to read the comments and that he had not been aware of those posted by S.B. and L.R., on the grounds that: first, comments could only be posted on his “wall” once he had given access to his “friends”, of which there were 1,829 at 25 October 2011, and he was responsible for verifying the content of the comments; second, he must have been aware that his “wall” was likely to attract comments with a political, and thus essentially polemical, content, and should have been all the more careful to monitor them. The court concluded that, having set up an electronic service for communication to the public on his own initiative, for the purpose of exchanging views, and having left the offending comments online - still being visible on 6 December 2011 according to the investigators - the applicant had failed to act promptly to put an end to their dissemination. It inferred that the applicant had to be “declared guilty as principal”. It found S.B. and L.R. guilty as accomplices in the offence committed by the applicant, explaining that their status in the proceedings had been debated at the hearing.
29. The applicant and S.B. appealed. The latter subsequently withdrew his appeal.
30. In a judgment of 18 October 2013 the Nîmes Court of Appeal upheld the convictions, while reducing the applicant’s fine to EUR 3,000. It further ordered him to pay Leila T. EUR 1,000 in costs for the appeal proceedings. In its judgment, the Court of Appeal reasoned, in particular, as follows:
“... The legislative provision on which the charges are based refers to discrimination against a person or group. The expression ‘kisses to Leilla’, referring to [L.T.], and her connection with [F.P.], deputy mayor of Nîmes, who is described in the texts as having contributed to an abandonment of the town of Nîmes to the Muslims and thus to insecurity, is such as to associate her with the transformation of the town and thus to arouse hatred or violence against her; on the basis of these elements, the two texts in question constitute incitement to hatred or violence against a person, namely [F.P.]’s partner [L.T.], on account of a presumption, in view of her forename, that she belonged to a Muslim community. The offence provided for in section 24, eighth paragraph, of the Law of 29 July 1881 is thus made out ...”
33. The applicant appealed on points of law to the Court of Cassation, relying in particular on Article 10 of the Convention. In a single ground of appeal, he argued: that, for the offence to be made out, the comments had to contain encouragement or incitement to discrimination, hatred or violence, and not merely give rise to a strong feeling of rejection or hostility towards a group or person; that the mere fear of a risk of racism could not deprive citizens of the freedom to express their views on the consequences of immigration in certain towns or neighbourhoods, the comments having specifically deplored the transformation of the town of Nîmes by immigrants of North African origin and of the Muslim faith; that the summons to appear before the court had been unlawful; and, lastly, that the impugned remarks had in no way targeted Leila T. personally and had been distorted in the Court of Appeal’s judgment.
34. In a judgment of 17 March 2015 the Court of Cassation dismissed his appeal on points of law, in particular with regard to Article 10 of the Convention, with the following reasoning:
“... first, the offence of incitement ... is made out where, as in the present case, the court finds that, by both their meaning and their scope, the impugned texts may arouse a feeling of rejection or hostility, hatred or violence, towards a group or an individual on account of a particular religion; ... second, since the above-mentioned text falls foul of the restrictions provided for in paragraph 2 of Article 10 of the European Convention on Human Rights, the principle of freedom of expression enshrined in paragraph 1 of that Article cannot be relied upon; ...”
RELEVANT LEGAL FRAMEWORK AND PRACTICE
A. Freedom of the Press Act (Law of 29 July 1881)
35. The relevant provisions, in the version of the Act that was applicable at the time of the acts for which the applicant was prosecuted, read as follows:
Section 23
“... This provision shall also be applicable where the incitement has been followed only by an attempt to commit a serious crime (crime) under Article 2 of the Criminal Code.”
Section 24 (eighth and tenth to twelfth paragraphs)
“...
Anyone who, by one of the means referred to in section 23, has incited discrimination, hatred or violence against a person or group on account of their origin or of their belonging, or not belonging, to a given ethnicity, nation, race or religion, shall be liable to a one-year prison term and a fine of 45,000 euros, or only one of those two sanctions.
...
Where a conviction is secured for one of the offences provided for in the two preceding paragraphs, the court may further order:
(1) the deprivation of the rights listed in paragraphs 2 and 3 of Article 131-26 of the Criminal Code for a maximum of five years, save where the offender’s liability is engaged under section 42 and the first paragraph of section 43 hereof, or under the first three paragraphs of section 93-3 of Law no. 82-652 of 29 July 1982 on audiovisual communication;
(2) the display or dissemination of the decision as provided in Article 131-35 of the Criminal Code;
...”
B. Audiovisual Communication Act (Law no. 82-652 of 29 July 1982)
36. Section 93-3 of Law no. 82-652 of 29 July 1982, inserted by Law no. 85-1317 of 13 December 1985, incorporated the so-called “cascading” liability system, as provided for in section 42 of the Freedom of the Press Act of 29 July 1881, into the field of audiovisual communication and subsequently that of “communication to the public by electronic means”. The section was amended by Law no. 92-1336 of 16 December 1992 concerning the entry into force of the New Criminal Code (inserting a reference to Article 121-7 of the Criminal Code, in the place of Article 60 thereof), by Law no. 2004-575 of 21 June 2004 on the promotion of confidence in the digital economy (confiance dans l’économie numérique), known as the “LCEN” Act (substituting the broader concept of “communication to the public by electronic means” for that of “audiovisual communication”) and by Law no. 2009‑669 of 12 June 2009 on the dissemination and protection of creation on the Internet, known as the “HADOPI I” Act. In the preparation of that 2009 legislation, a fifth paragraph was added to section 93-3, in line with an amendment proposed, first, with a view to “creating the status of online press publisher, accompanied by a tailored liability regime” and, secondly, to “concurrently adapt the regime of editorial liability of online communication services” (National Assembly, Amendment No. 201 Rect.):
“The regime of section 93-3 of the Law of 29 July 1982 presumes that the publication director is primarily liable for press offences committed via an online service for communication to the public, where the text in question has undergone ‘prior fixing’. This presumption would appear difficult to implement in the case of personal participation formats (discussion fora, blogs), involving contributions from, and the participation of, Internet users.
It is therefore proposed that user contributions should give rise to a mitigated liability regime, irrespective of the type of moderation adopted, and that they should not engage the liability of the publication director as principal unless he or she had actual knowledge of the content made available to the public.”
37. Section 93-3 of Law no. 82-652 of 29 July 1982, as in force at the material time, read as follows:
“Where one of the offences provided for in chapter IV of the Freedom of the Press Act of 29 July 1881 is committed by an electronic means of communication to the public, the publication director or, in the situation provided for in the second paragraph of section 93-2 hereof, the publication codirector, shall be prosecuted as the principal, when the content of the impugned statement has undergone ‘prior fixing’ before being transmitted to the public.
Failing the above, the author, and failing the author, the producer, shall be prosecuted as principal.
Where charges are brought against the publication director or codirector, the author shall be prosecuted as an accomplice.
Any person to whom Article 121-7 of the Criminal Code is applicable may also be prosecuted as an accomplice.
Where the offence stems from the content of a message addressed by an Internet user to an online service for communication to the public and made publicly accessible by that service in a forum of personal contributions identified as such, the publication director or codirector may not be held criminally liable as principal if it is established that he or she had no actual knowledge of the message before it was posted on line or if, upon becoming aware thereof, he or she acted promptly to ensure the deletion of the said message.”
C. The legal regime applicable to the “producer”
1. The concept of “producer”
38. The Court of Cassation has clarified the concept of “producer”, adopting this characterisation for a person who has taken the initiative of creating an electronic communication service for the exchange of opinions on pre-defined topics (Court of Cassation, Criminal Division, 8 December 1998, published in the reports of judgments of the Criminal Division (“Bull. crim.”), no. 335; see also the two leading judgments of 16 February 2010: Court of Cassation, Criminal Division, appeal no. 08-86.301, Bull. crim., no. 30, concerning the liability, as producer, of the managing director of a company operating a website, on account of the dissemination of a number of texts on a discussion forum; and Court of Cassation, Criminal Division, appeal no. 09‑81.064, Bull. crim., no. 31, concerning the liability, as producer, of the chair of an association for the dissemination of contentious statements on its blog). This definition of “producer” was endorsed by the Constitutional Council, which, in a decision of 16 September 2011 (see paragraph 40 below), observed:
“It follows from these provisions, as interpreted by the Court of Cassation in its judgments of 16 February 2010 ..., that a person who has taken the initiative of creating an online communication service for the exchange of opinions on pre-defined topics may be prosecuted in his or her capacity as producer.”
2. Establishing the liability of the “producer”
39. In its two judgments of 16 February 2010, cited above (see paragraph 38 above), the Court of Cassation further confirmed that, under section 93-3 of Law no. 82-652 of 29 July 1982, where an offence enumerated in Chapter IV of the Law of 29 July 1881 was committed by an electronic means of communication to the public, if not the author then the producer of the service would be prosecuted as principal, even if the statement had not undergone “prior fixing” before being transmitted to the public (Bull. crim., nos. 30 and 31). In addition, in one of those cases, the Court of Cassation quashed the judgment of a Court of Appeal which had acquitted the administrator of a blog, without ascertaining whether he could be prosecuted as producer, in proceedings concerning a comment posted thereon by a third party, even though that author had been identified (Court of Cassation, Criminal Division, 16 February 2010, appeal no. 09-81.064, Bull. crim., no. 31, and see also the further judgment in the same case, Court of Cassation, Criminal Division, 30 October 2012, appeal no. 10-88.825, Bull. crim., no. 233). In his report, the reporting judge at the Court of Cassation, when considering the questions raised in the examination of the first appeal on points of law (no. 09-81.064, giving rise to the judgment of 16 February 2010), expressed the following view on the question of the “autonomy of the proceedings”:
“Does this fluidity of roles in the chain of Internet actors permit the public prosecutor or the victim of a press offence to ‘choose’ the person to be prosecuted, out of those listed in section 93-3?
Taken literally, section 93-3, like sections 42 and 43 of the 1881 Law, assigns a particular status to each actor (principal, accomplice), following a strict mechanism (‘if not...’ meaning ‘in the absence of...’, ‘failing which ...’, without the reasons for the absence being explained: unidentified person, immunity, deliberate passing-over of the previous level...). But the jurisprudence has long espoused a principle of ‘procedural autonomy’ whereby:
‘No statutory provision on freedom of the press requires proceedings first to be brought against the author of comments before proceedings can be brought against the publication director as principal or, under any status whatsoever, against other persons who may be criminally liable in accordance with sections 42 and 43 of that Law’ (see, for example, Court of Cassation, Criminal Division, 16 July 1992, no. 91-86.156; for other applications: Court of Cassation, Criminal Division, 20 January 1987, 20 October 2005, or Court of Cassation, First Civil Division, 12 July 2006).”
40. In addition, a preliminary reference on constitutionality (question prioritaire de constitutionnalité - QPC) was made to the Constitutional Council concerning the difference in treatment between, on the one hand, the publication director, the only actor to be mentioned in the last paragraph of section 93-3 inserted by Law no. 2009-669 of 12 June 2009, and on the other, the producer, who was not mentioned in that paragraph. In a decision of 16 September 2011 (no. 2011-164 QPC), the Constitutional Council declared section 93-3 of Law no. 82-652 of 29 July 1982 on audiovisual communication to be compliant with the Constitution, subject to the following interpretative reservation:
“7. Consequently, taking into account, on the one hand, the specific liability applicable to the publication director under the first and last paragraphs of section 93‑3 and, on the other, the characteristics of the Internet which, as the relevant rules and techniques now stand, allow the author of a comment disseminated on the Internet to preserve his or her anonymity, the provisions under review cannot, without establishing an irrebuttable presumption of criminal liability in breach of the aforementioned constitutional requirements, be interpreted as allowing the creator or administrator of an online website for communication to the public, rendering comments by Internet users publicly accessible, to be held criminally liable as producer solely on account of the content of comments of which he or she had no knowledge before they were posted online. Subject to that reservation, the provisions under review are not incompatible with Article 9 of the Declaration of 1789.”
41. In its case-law, the Criminal Division of the Court of Cassation subsequently drew the appropriate conclusions from the Constitutional Council’s decision of 16 September 2011 (see paragraph 40 above), in a judgment of 31 January 2012 (appeal no. 10-80.010, Bull. crim., no. 233; see also, in the same vein, Court of Cassation, Criminal Division, 30 October 2012, appeal no. 10-88.825):
“It can be inferred from section 93-3 of the Law of 29 July 1982 as amended, interpreted in line with the reservation set forth by the Constitutional Council in its decision QPC no. 2011-164 of 16 September 2011, that the criminal liability of the producer of a website intended for communication to the public, including the transmission of comments posted by its users, will only be engaged in respect of such comments if it can be established that the producer had knowledge of their content before they were posted or otherwise if he or she failed to act promptly to delete the comment at issue upon becoming aware thereof.
It is thus decided to quash the judgment which declared the creator of an online discussion forum guilty of defamation, on account of the text posted on this personal‑contribution forum by a user, without ascertaining whether, in his capacity as producer within the meaning of the above-mentioned legislation, he had knowledge, prior to its posting, of the content of this message, or otherwise whether he had failed to delete the text promptly after becoming aware of it; ...”
D. Other relevant domestic law material
1. Case-law of the Court of Cassation
43. The regime of “cascading liability”, which is to be applied when the publication director or author of remarks cannot be identified, does not exclude the autonomy of the proceedings, which allows proceedings to be brought against all those who are liable, when they are identified, or just one of those actors. Accordingly, the “cascading liability” regime and the principle of the autonomy of the proceedings apply without prejudice to each other. The Court of Cassation has thus taken the view, with regard to the principle of the autonomy of the proceedings, that no provision of the Freedom of the Press Act requires the prior prosecution of the author of the impugned remarks before proceedings can be brought against the publication director as principal or, under any status whatsoever, against other persons who may be criminally liable pursuant to that Act. In a judgment of 16 July 1992, the Criminal Division of the Court of Cassation thus expressly applied this principle in dismissing an appeal against the conviction by a Court of Appeal of the publication director of a periodical, as principal, together with the author of the article in question, as accomplice, for the offence of inciting discrimination, hatred or racial violence, as provided for in section 24 of the Law of 29 July 1881 (appeal no. 91-86.156, Bull. crim., no. 273). Moreover, in a judgment of 20 January 1987, the Court of Cassation similarly quashed the judgment of a Court of Appeal because it had annulled the summons by which civil parties had initiated the relevant proceedings on the grounds, inter alia, that they had not sought to prosecute the author of the article in question and had not specified on what basis the publication director was being prosecuted for the offence of inciting discrimination, hatred or violence against a group on account of their origin or of their belonging to a given ethnicity, nation, race or religion (appeal no. 84-94.444, Bull. crim., no. 30).
44. As regards the offence of incitement to hatred or violence, the Court of Cassation has consistently held that the comments in question must be such as to arouse immediate reactions from the reader, against the persons targeted, of rejection or even hatred and violence (Court of Cassation, Criminal Division, 21 May 1996, Bull. crim., no. 210), or that the courts must find that by both its meaning and scope, the text at issue may either arouse a feeling of hostility or rejection, or incite the public to hatred or violence against a specific person or group (Court of Cassation, Criminal Division, 16 July 1992, Bull. crim., no. 273; Court of Cassation, Criminal Division, 14 May 2002, appeal no. 01-85.482; Court of Cassation, Criminal Division, 30 May 2007, appeal no. 06-84.328; Court of Cassation, Criminal Division, 29 January 2008, appeal no. 07-83.695, and Court of Cassation, Criminal Division, 3 February 2009, appeal nos. 06-83.063 and 08-82.402). Certain remarks may also give rise to sanctions if their meaning is implicit (Court of Cassation, Criminal Division, 16 July 1992, Bull. crim., no. 273).
2. Legislation subsequent to circumstances of present case
45. The “LCEN” Act (see paragraph 36 above) clarifies the conditions in which “hosts”, that is, “natural or legal persons which provide, even free of charge, with a view to transmission to the public via online communication services, the storage of signs, writings, images, sounds or texts of any kind provided by the recipients of such services”, such as Facebook, may be deemed to have had knowledge of offending messages. Hosts cannot be held civilly liable for activities or information stored at the request of a recipient of those services “if they did not have actual knowledge of their manifestly unlawful nature or of facts and circumstances indicating that nature or if, from the time they became aware of it, they acted promptly to remove the data or to render access thereto impossible”, “knowledge of the facts in issue [being] presumed” where the content in question had been notified to them beforehand as provided in section 5 of the Act. Section 6 of the “LCEN” Act provides, however, that hosts “are not producers within the meaning of section 93-3 of Law no. 82-652 of 29 July 1982 on audiovisual communication”. Furthermore, in a decision of 10 June 2004 (no. 2004‑496 DC), the Constitutional Council stated that the provisions of the Law of 21 June 2004 “[could] not have the effect of engaging the liability of a host which ha[d] not removed information that ha[d] been denounced as unlawful by a third party if it [was] not manifestly of such nature or if its removal ha[d] not been ordered by a court”.
46. Furthermore, Law no. 2020-766 of 24 June 2020, on the combat against hateful content on the Internet (which was the subject of Constitutional Council decision no. 2020-801 DC of 18 June 2020, declaring numerous provisions to be unconstitutional), created an online hate “Observatory”. Its mission is to monitor and analyse developments in this area, by involving operators (in particular of social networks such as Facebook), associations, authorities and researchers concerned with the combat against and prevention of such acts. Working groups have been tasked with reflecting on the concept of hateful content, improving knowledge of this phenomenon, analysing the mechanisms of dissemination and the means of combating it, and with ensuring prevention, education and support for Internet users.
47. That Law also led to the creation, within the Paris tribunal judiciaire, of a specialised national unit for combating online hate, which started operating in January 2021. It exercises jurisdiction based on the complexity of a prosecution or the extent of a breach of public order, which may stem in particular from the high media profile or particular sensitivity of a given case (Circular of 24 November 2020 on the combat against online hate - CRIM 2020 23 E1 24.11.2020).
48. Lastly, Law no. 2021-1109 of 24 August 2021 on the securing of respect for the principles of the Republic, which includes a part concerning online hate speech, created a new offence to combat such hate speech (new Article 223-1-1 of the Criminal Code, which makes it an offence to reveal, disseminate or transmit, by any means whatsoever, information relating to the private, family or professional life of a person that makes it possible to identify or locate him or her in order to expose him or her, or his or her family members, to a direct risk of harm to their person or property of which the author could not have been unaware). It also imposed, ahead of the EU “Digital Services Act” (see paragraph 75 below), a new system for moderating illegal content on online platforms until the end of 2023 (procedures for dealing with judicial requests, public information about the moderation mechanism, risk assessment, etc.), under the supervision of an independent administrative authority, the Audiovisual and Digital Communications Regulatory Authority (ARCOM).
II. INTERNATIONAL MATERIAL
A. Communication on the Internet
49. The relevant provisions of the Additional Protocol to the Convention on Cybercrime, concerning the criminalisation of acts of a racist and xenophobic nature committed through computer systems (ETS No. 189), read as follows:
Article 2 - Definition
“1. For the purposes of this Protocol:
‘racist and xenophobic material’ means any written material, any image or any other representation of ideas or theories, which advocates, promotes or incites hatred, discrimination or violence, against any individual or group of individuals, based on race, colour, descent or national or ethnic origin, as well as religion if used as a pretext for any of these factors.
...”
Article 3 - Dissemination of racist and xenophobic material through computer systems
“1. Each Party shall adopt such legislative and other measures as may be necessary to establish as criminal offences under its domestic law, when committed intentionally and without right, the following conduct:
distributing, or otherwise making available, racist and xenophobic material to the public through a computer system.
2. A Party may reserve the right not to attach criminal liability to conduct as defined by paragraph 1 of this article, where the material, as defined in Article 2, paragraph 1, advocates, promotes or incites discrimination that is not associated with hatred or violence, provided that other effective remedies are available.
...”
50. The Explanatory Report accompanying this Additional Protocol provides the following explanations:
“...
2. As technological, commercial and economic developments bring the peoples of the world closer together, racial discrimination, xenophobia and other forms of intolerance continue to exist in our societies. Globalisation carries risks that can lead to exclusion and increased inequality, very often along racial and ethnic lines.
3. In particular, the emergence of international communication networks like the Internet provide certain persons with modern and powerful means to support racism and xenophobia and enables them to disseminate easily and widely expressions containing such ideas.
...
Article 2 - Definition
Paragraph 1 - ‘Racist and xenophobic material’
...
12. The definition contained in Article 2 refers to written material (e.g. texts, books, magazines, statements, messages, etc.), images (e.g. pictures, photos, drawings, etc.) or any other representation of thoughts or theories, of a racist and xenophobic nature, in such a format that it can be stored, processed and transmitted by means of a computer system.
13. The definition contained in Article 2 of this Protocol refers to certain conduct to which the content of the material may lead, rather than to the expression of feelings/belief/aversion as contained in the material concerned. ...
14. The definition requires that such material advocates, promotes, incites hatred, discrimination or violence. ‘Advocates’ refers to a plea in favour of hatred, discrimination or violence, ‘promotes’ refers to an encouragement to or advancing hatred, discrimination or violence and ‘incites’ refers to urging others to hatred, discrimination or violence.
...
16. Whether the treatment is discriminatory or not has to be considered in the light of the specific circumstances of the case. Guidance for interpreting the term ‘discrimination’ can also be found in Article 1 of the CERD [Convention on the Elimination of All Forms of Racial Discrimination], where the term ‘racial discrimination’ means ‘any distinction, exclusion, restriction or preference based on race, colour, descent, or national or ethnic origin which has the purpose or effect of nullifying or impairing the recognition, enjoyment or exercise, on an equal footing, of human rights and fundamental freedoms in the political, economic, social, cultural or any other field of public life’.
...
17. Hatred, discrimination or violence, have to be directed against any individual or group of individuals, for the reason that they belong to a group distinguished by ‘race, colour, descent or national or ethnic origin, as well as religion, if used as a pretext for any of these factors’.
...
Article 3 - Dissemination of racist and xenophobic material in a computer system
27. This article requires States Parties to criminalize distributing or otherwise making available racist and xenophobic material to the public through a computer system. The act of distributing or making available is only criminal if the intent is also directed to the racist and xenophobic character of the material.
28. ‘Distribution’ refers to the active dissemination of racist and xenophobic material, as defined in Article 2 of the Protocol, to others, while ‘making available’ refers to the placing on line of racist and xenophobic material for the use of others. This term also intends to cover the creation or compilation of hyperlinks in order to facilitate access to such material.
...
31. Exchanging racist and xenophobic material in chat rooms, posting similar messages in newsgroups or discussion fora, are examples of making such material available to the public. In these cases the material is accessible to any person. Even when access to the material would require authorisation by means of a password, the material is accessible to the public where such authorisation would be given to anyone or to any person who meets certain criteria. In order to determine whether the making available or distributing was to the public or not, the nature of the relationship between the persons concerned should be taken into account.
...”
51. On 28 May 2003, at the 840th meeting of the Ministers’ Deputies, the Committee of Ministers of the Council of Europe adopted a Declaration on freedom of communication on the Internet. The relevant parts of the Declaration read as follows:
“Principle 7: Anonymity
In order to ensure protection against online surveillance and to enhance the free expression of information and ideas, member states should respect the will of users of the Internet not to disclose their identity. This does not prevent member states from taking measures and co-operating in order to trace those responsible for criminal acts, in accordance with national law, the Convention for the Protection of Human Rights and Fundamental Freedoms and other international agreements in the fields of justice and the police.”
52. In its Recommendation CM/Rec(2007)16 to member States on measures to promote the public service value of the Internet (adopted on 7 November 2007), the Committee of Ministers noted that the Internet could, on the one hand, significantly enhance the exercise of certain human rights and fundamental freedoms while, on the other, it could adversely affect these and other such rights. It recommended that the member States draw up a clear legal framework delineating the boundaries of the roles and responsibilities of all key stakeholders in the field of new information and communication technologies.
53. On 16 April 2014 Recommendation CM/Rec(2014)6 of the Committee of Ministers to member States on a Guide to human rights for Internet users was adopted. The relevant part of the Guide reads as follows:
Freedom of expression and information
“You have the right to seek, receive and impart information and ideas of your choice, without interference and regardless of frontiers. This means:
1. you have the freedom to express yourself online and to access information and the opinions and expressions of others. This includes political speech, views on religion, opinions and expressions that are favourably received or regarded as inoffensive, but also those that may offend, shock or disturb others. You should have due regard to the reputation or rights of others, including their right to privacy;
2. restrictions may apply to expressions which incite discrimination, hatred or violence. These restrictions must be lawful, narrowly tailored and executed with court oversight;
...
6. you may choose not to disclose your identity online, for instance by using a pseudonym. However, you should be aware that measures can be taken, by national authorities, which might lead to your identity being revealed.”
54. On 7 March 2018, Recommendation CM/Rec(2018)2 of the Committee of Ministers to member States on the roles and responsibilities of Internet intermediaries was also adopted. It explains in particular what is meant by “Internet intermediaries”:
“4. A wide, diverse and rapidly evolving range of players, commonly referred to as ‘internet intermediaries’, facilitate interactions on the internet between natural and legal persons by offering and performing a variety of functions and services. Some connect users to the internet, enable the processing of information and data, or host web-based services, including for user-generated content. Others aggregate information and enable searches; they give access to, host and index content and services designed and/or operated by third parties. Some facilitate the sale of goods and services, including audiovisual services, and enable other commercial transactions, including payments.
5. Intermediaries may carry out several functions in parallel. They may also moderate and rank content, including through automated processing of personal data, and may thereby exert forms of control which influence users’ access to information online in ways comparable to media, or they may perform other functions that resemble those of publishers. Intermediary services may also be offered by traditional media, for instance, when space for user-generated content is offered on their platforms. The regulatory framework governing the intermediary function is without prejudice to the frameworks that are applicable to the other functions offered by the same entity.”
2. Other international sources
55. The UN Human Rights Council’s Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression stated the following in his report of 16 May 2011 to the Human Rights Council (A/HRC/17/27):
“25. As such, legitimate types of information which may be restricted include child pornography (to protect the rights of children), hate speech (to protect the rights of affected communities), defamation (to protect the rights and reputation of others against unwarranted attacks), direct and public incitement to commit genocide (to protect the rights of others), and advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence (to protect the rights of others, such as the right to life).
...
43. The Special Rapporteur believes that censorship measures should never be delegated to a private entity, and that no one should be held liable for content on the Internet of which they are not the author. Indeed, no State should use or force intermediaries to undertake censorship on its behalf ...”
56. In his thematic report on online hate speech, submitted at the seventy‑fourth session of the United Nations General Assembly in September 2019 (A/74/486), the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression recommended, in particular, that States:
“(a) Strictly define the terms in their laws that constitute prohibited content under article 20 (2) of the International Covenant on Civil and Political Rights and article 4 of the International Convention on the Elimination of All Forms of Racial Discrimination and resist criminalizing such speech except in the gravest situations, such as advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence, and adopt the interpretations of human rights law contained in the Rabat Plan of Action;
...
(d) Adopt or review intermediary liability rules to adhere strictly to human rights standards and do not demand that companies restrict expression that the States would be unable to do directly, through legislation; ...”
57. In a Joint Declaration of 21 December 2005, the UN Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression, the Representative on Freedom of the Media of the Organization for Security and Co‑operation in Europe and the Special Rapporteur on Freedom of Expression of the Organization of American States, stated as follows:
“No one should be liable for content on the Internet of which they are not the author, unless they have either adopted that content as their own or refused to obey a court order to remove that content.”
58. In a Joint Declaration of 1 June 2011 on freedom of expression and the Internet, the United Nations Special Rapporteur on Freedom of Opinion and Expression, the Organization for Security and Co-operation in Europe (OSCE) Representative on Freedom of the Media, the Organization of American States (OAS) Special Rapporteur on Freedom of Expression and the African Commission on Human and Peoples’ Rights (ACHPR) Special Rapporteur on Freedom of Expression and Access to Information stated, with regard to the liability of intermediaries, that no one who simply provided technical Internet services such as access to, searches for, or transmission or caching of information, should be liable for content generated by others, which was disseminated using those services, as long as they did not specifically intervene in that content or refuse to obey a court order to remove that content, where they had the capacity to do so.
59. In its annual report of 31 December 2013 (OEA/Ser.L/V/II.149 Doc. 50), the Special Rapporteur for Freedom of Expression of the Inter‑American Commission on Human Rights took the view that liability should rest with the authors of the impugned remarks rather than with the intermediaries.
(a) Committee of Ministers of the Council of Europe
60. The Annex to Recommendation R (97) 20 of the Council of Europe’s Committee of Ministers on “hate speech”, adopted on 30 October 1997, provides in particular as follows:
“Scope
The principles set out hereafter apply to hate speech, in particular hate speech disseminated through the media.
For the purposes of the application of these principles, the term ‘hate speech’ shall be understood as covering all forms of expression which spread, incite, promote or justify racial hatred, xenophobia, anti-Semitism or other forms of hatred based on intolerance, including: intolerance expressed by aggressive nationalism and ethnocentrism, discrimination and hostility against minorities, migrants and people of immigrant origin.
...
Principle 1
The governments of the member states, public authorities and public institutions at the national, regional and local levels, as well as officials, have a special responsibility to refrain from statements, in particular to the media, which may reasonably be understood as hate speech, or as speech likely to produce the effect of legitimising, spreading or promoting racial hatred, xenophobia, anti-Semitism or other forms of discrimination or hatred based on intolerance. Such statements should be prohibited and publicly disavowed whenever they occur.
...
Principle 6
National law and practice in the area of hate speech should take due account of the role of the media in communicating information and ideas which expose, analyse and explain specific instances of hate speech and the underlying phenomenon in general as well as the right of the public to receive such information and ideas.
To this end, national law and practice should distinguish clearly between the responsibility of the author of expressions of hate speech, on the one hand, and any responsibility of the media and media professionals contributing to their dissemination as part of their mission to communicate information and ideas on matters of public interest on the other hand.”
61. In Recommendation CM/Rec(2022)16 on combating hate speech, adopted on 20 May 2022, the Committee of Ministers stated as follows:
“...
Stressing that, in order to effectively prevent and combat hate speech, it is crucial to identify and understand its root causes and wider societal context, as well as its various expressions and different impacts on those targeted;
Noting that hate speech is a deep-rooted, complex and multidimensional phenomenon, which takes many dangerous forms and can be disseminated very quickly and widely through the internet, and that the persistent availability of hate speech online exacerbates its impact, including offline;
Realising that hate speech negatively affects individuals, groups and societies in a variety of ways and with different degrees of severity, including by instilling fear in and causing humiliation to those it targets and by having a chilling effect on participation in public debate, which is detrimental to democracy;
Being aware that individuals and groups can be targeted by hate speech on different grounds, or combinations of grounds, and acknowledging that those persons and groups need special protection, without detriment to the rights of other persons or groups;
...
Being aware that hate speech is defined and understood in differing ways at the national, European and international levels and that it is crucial to develop a common understanding of the concept, nature and implications of this phenomenon and to devise more effective policies and strategies to tackle it;
Considering that measures to combat hate speech should be appropriate and proportionate to the level of severity of its expression; some expressions of hate speech warrant a criminal law response, while others call for a civil or administrative law response, or should be dealt with through measures of a non-legal nature, such as education and awareness raising, or a combination of different approaches and measures;
...
Being aware that internet intermediaries can facilitate public debate, in particular through the digital tools and services they make available, while at the same time highlighting that those tools and services can be used to disseminate, quickly and widely, worrying volumes of hate speech, and underlining that internet intermediaries should ensure that their activities do not have or facilitate an adverse impact on human rights online and address such impacts when they occur;
Recognising that legislative and policy measures to prevent and combat online hate speech should be kept under regular review in order to take into account the fast evolution of technology and online services and, more widely, digital technologies and their influence on information and communication flows in contemporary democratic societies; and acknowledging that those reviews should take into account the dominance of certain internet intermediaries, the power asymmetry between some digital platforms and their users, and the influence of these dynamics on democracies;
...
Recommends that the governments of member States:
...
2. take appropriate measures to give encouragement and support to national human rights institutions, equality bodies, civil society organisations, the media, internet intermediaries and other stakeholders to adopt the measures that are outlined for them in the principles and guidelines appended to this recommendation;
3. protect human rights and fundamental freedoms in the digital environment, including by co‑operating with internet intermediaries, in line with Recommendation CM/Rec(2018)2 on the roles and responsibilities of internet intermediaries, and other applicable Council of Europe standards; ...”
62. The Annex to Recommendation CM/Rec(2022)16 provides the following explanations:
“...
2. For the purposes of this recommendation, hate speech is understood as all types of expression that incite, promote, spread or justify violence, hatred or discrimination against a person or group of persons, or that denigrates them, by reason of their real or attributed personal characteristics or status such as ‘race’, colour, language, religion, nationality, national or ethnic origin, age, disability, sex, gender identity and sexual orientation.
3. As hate speech covers a range of hateful expressions which vary in their severity, the harm they cause and their impact on members of particular groups in different contexts, member States should ensure that a range of properly calibrated measures is in place to effectively prevent and combat hate speech. Such a comprehensive approach should be fully aligned with the European Convention on Human Rights and the relevant case law of the European Court of Human Rights (the Court) and should differentiate between:
a. i. hate speech that is prohibited under criminal law; and
ii. hate speech that does not attain the level of severity required for criminal liability, but is nevertheless subject to civil or administrative law; and
b. offensive or harmful types of expression which are not sufficiently severe to be legitimately restricted under the European Convention on Human Rights, but nevertheless call for alternative responses, as set out below, such as: counter-speech and other countermeasures; measures fostering intercultural dialogue and understanding, including via the media and social media; and relevant educational, information-sharing and awareness-raising activities.
...
Legislation regarding online hate speech
...
19. Member States should ensure that mechanisms are in place for the reporting of cases of online hate speech to public authorities and private actors, including internet intermediaries, and clear rules for the processing of such reports.
20. Removal procedures and conditions as well as related responsibilities and liability rules imposed on internet intermediaries should be transparent, clear and predictable and those procedures should be subject to due process. ...
21. Member States should take into account the substantial differences in the size, nature, function and organisational structure of internet intermediaries when devising, interpreting and applying the legislative framework governing the liability of internet intermediaries, ... in order to prevent a possible disproportionate impact on smaller internet intermediaries.
...
24. Member States should have a system in place for the disclosure of subscriber information in cases where competent authorities have assessed that online hate speech is in breach of the law and authors and disseminators are unknown to the competent authorities. ...
...
32. Internet intermediaries should carefully calibrate their responses to content identified as hate speech on the basis of its severity, as outlined in paragraph 4 above, and elaborate and apply alternatives to the removal of content in less severe cases of hate speech.
...
34. Internet intermediaries should appoint a sufficient number of content moderators and ensure that they are impartial, have adequate expertise, are regularly trained and receive appropriate psychological support. ...”
63. The Ministers of Foreign Affairs of the Council of Europe also launched, at their 118th ministerial session, a “White paper on Intercultural Dialogue”, entitled Living Together As Equals in Dignity (2008). This document “responds to an increasing demand to clarify how intercultural dialogue may help appreciate diversity while sustaining social cohesion”. As stated therein:
“[It] emphatically argues in the name of the governments of the 47 member states of the Council of Europe that our common future depends on our ability to safeguard and develop human rights, as enshrined in the European Convention on Human Rights, democracy and the rule of law and to promote mutual understanding. It reasons that the intercultural approach offers a forward-looking model for managing cultural diversity. It proposes a conception based on individual human dignity (embracing our common humanity and common destiny). If there is a European identity to be realised, it will be based on shared fundamental values, respect for common heritage and cultural diversity as well as respect for the equal dignity of every individual.”
(b) Parliamentary Assembly of the Council of Europe (PACE)
“9.1. act strongly against discrimination in all areas;
9.2. condemn and combat Islamophobia;
9.3. act resolutely against hate speech and all other forms of behaviour which run counter to core human rights and democratic values, even when their authors seek to justify them on religious grounds;
...”
“Islam, Islamism and Islamophobia in Europe
...
12. The Assembly deplores that a growing number of political parties in Europe exploit and encourage fear of Islam and organise political campaigns which promote simplistic and negative stereotypes concerning Muslims in Europe and often equate Islam with extremism. It is inadmissible to incite intolerance and sometimes even hatred against Muslims. The Assembly calls on member states to pursue political action in accordance with General Policy Recommendation No. 5 (2000) of the European Commission against Racism and Intolerance (ECRI) on combating intolerance and discrimination against Muslims. It reiterates that it is for the member states to reject political statements that stir up fear and hatred of Muslims and Islam, while complying with the stipulations of the Convention, in particular Article 10.2.
...”
(c) European Commission against Racism and Intolerance (ECRI)
67. The relevant passages of ECRI’s General Policy Recommendation No. 15 on combating hate speech, adopted on 8 December 2015, read as follows:
“The European Commission against Racism and Intolerance (ECRI):
...
Taking note of the differing ways in which hate speech has been defined and is understood at the national and international level as well as of the different forms that it can take;
Considering that hate speech is to be understood for the purpose of the present General Policy Recommendation as the advocacy, promotion or incitement, in any form, of the denigration, hatred or vilification of a person or group of persons, as well as any harassment, insult, negative stereotyping, stigmatization or threat in respect of such a person or group of persons and the justification of all the preceding types of expression, on the ground of ‘race’, colour, descent, national or ethnic origin, age, disability, language, religion or belief, sex, gender, gender identity, sexual orientation and other personal characteristics or status;
...
Recognising also that forms of expression that offend, shock or disturb will not on that account alone amount to hate speech and that action against the use of hate speech should serve to protect individuals and groups of persons rather than particular beliefs, ideologies or religions;
Recognising that the use of hate speech can reflect or promote the unjustified assumption that the user is in some way superior to a person or a group of persons that is or are targeted by it;
Recognising that the use of hate speech may be intended to incite, or reasonably expected to have the effect of inciting others to commit, acts of violence, intimidation, hostility or discrimination against those who are targeted by it and that this is an especially serious form of such speech;
...
Recognising that the use of hate speech appears to be increasing, especially through electronic forms of communication which magnify its impact, but that its exact extent remains unclear because of the lack of systematic reporting and collection of data on its occurrence and that this needs to be remedied, particularly through the provision of appropriate support for those targeted or affected by it;
...
Recognising that politicians, religious and community leaders and others in public life have a particularly important responsibility in this regard because of their capacity to exercise influence over a wide audience;
Conscious of the particular contribution that all forms of media, whether online or offline, can play both in disseminating and combating hate speech;
...
Recommends that the governments of member States:
10. take appropriate and effective action against the use, in a public context, of hate speech which is intended or can reasonably be expected to incite acts of violence, intimidation, hostility or discrimination against those targeted by it through the use of the criminal law provided that no other, less restrictive, measure would be effective and the right to freedom of expression and opinion is respected, and accordingly:
a. ensure that the offences are clearly defined and take due account of the need for a criminal sanction to be applied;
b. ensure that the scope of these offences is defined in a manner that permits their application to keep pace with technological developments;
c. ensure that prosecutions for these offences are brought on a non-discriminatory basis and are not used in order to suppress criticism of official policies, political opposition or religious beliefs;
d. ensure the effective participation of those targeted by hate speech in the relevant proceedings;
e. provide penalties for these offences that take account both of the serious consequences of hate speech and the need for a proportionate response;
f. monitor the effectiveness of the investigation of complaints and the prosecution of offenders with a view to enhancing both of these;
...”
68. In its “Explanatory Memorandum” ECRI provides the following clarifications:
“...
14. The Recommendation further recognises that, in some instances, a particular feature of the use of hate speech is that it may be intended to incite, or can reasonably be expected to have the effect of inciting, others to commit acts of violence, intimidation, hostility or discrimination against those targeted by it. As the definition above makes clear, the element of incitement entails there being either a clear intention to bring about the commission of acts of violence, intimidation, hostility or discrimination or an imminent risk of such acts occurring as a consequence of the particular hate speech used.
15. Intent to incite might be established where there is an unambiguous call by the person using hate speech for others to commit the relevant acts or it might be inferred from the strength of the language used and other relevant circumstances, such as the previous conduct of the speaker. However, the existence of intent may not always be easy to demonstrate, particularly where remarks are ostensibly concerned with supposed facts or coded language is being used.
...”
69. In its revised General Policy Recommendation No. 5 on preventing and combating anti-Muslim racism and discrimination, adopted on 8 December 2021, ECRI stated, in particular, as follows:
“...
Convinced that the peaceful co-existence of religions in a pluralistic society is founded upon respect for equality and for non-discrimination between religions in a democratic state with a clear separation between the laws of the State and religious institutions; ...
Rejecting all deterministic views of Islam and recognising the great diversity intrinsic in the practice of this religion;
Observing the significant increase of anti-Muslim hatred and discrimination in many member States of the Council of Europe, and stressing that this increase is also characterised by contemporary forms of this phenomenon, which has followed closely contemporary world developments, notably the terror attacks of 11 September 2001; the subsequent strengthened efforts in the fight against terrorism; the situation in the Middle East and the growing migration from Muslim majority countries into Europe;
...
Noting that anti-Muslim racism and discrimination often have an intersectional dimension operating on several grounds such as religion, national or ethnic origin and gender;
...
C. Contemporary forms of anti-Muslim racism and discrimination
...
...
22. ... online hate speech targeting Muslims in particular has soared in recent years and remains very prevalent. On social media platforms in particular, inflammatory anti‑Muslim narratives proliferate and include the demonisation of Muslim communities, conspiracy theories referring to Muslims as ‘invaders’ of Europe, those discourses specific to the Covid-19 pandemic, and incitements to violence against Muslims. People who are identifiably Muslim online may also find that their Muslim identity is targeted in the virtual space and subjected to abusive and threatening behaviour, even in the context of issues that have nothing to do with their faith or community, which for some has a chilling effect on online participation. ECRI has observed that surges in online hate speech are mostly sparked or ‘triggered’ by external developments, such as terror attacks or through statements giving rise to tension by failing to make a distinction between the criticism of a religion and offending the followers of that religion.
...
33. ECRI’s monitoring reports have demonstrated the prevalence of hate-motivated violence against Muslims. Anti-Muslim attacks range from the desecration of Muslim cemeteries, religious buildings and mosques, to abusive behaviour, threats, physical assaults, including in public, against Muslim men or men believed to be Muslim, to murder and deadly terrorist attacks. Data from many European countries suggest that Muslim women are frequently the targets of violence that often involves the pulling off of face veils and headscarves or being spat at. ECRI always calls for strong actions to prevent and punish such attacks since public humiliation of this kind undermines human dignity, creates fear and isolation as well as hinders integration and inclusion. As noted above, Muslim men and women have both been subjected to anti-Muslim hate speech on and offline, targeted with abuse and hostility, with evidence suggesting that incidents of anti-Muslim hostility are likely to increase in the aftermath of terrorist attacks carried out by those who claim to do so in the name of Islam.
34. Overall, the extent of violent incidents against Muslims often remains undocumented and under-reported. Victims and witnesses usually refrain from reporting these incidents due to the fear of reprisal or lack of trust in the authorities. ECRI notes that the failure of the authorities to react appropriately to hate crimes against Muslims may lead to the repetition of such acts and lack of prosecution might send a message of impunity. In this context, ECRI has repeatedly emphasised the need to take steps to ensure the effective functioning of the justice system against anti-Muslim hate crime. These include, among others, the effective monitoring and recording of incidents, collecting uniform and reliable data, increasing the capacities of law enforcement agencies and prosecuting services to effectively identify and investigate bias-motivated crimes, developing support mechanisms for victims and implementing confidence-building measures to enhance the relationship between the police and Muslim communities.
...
ECRI recommends that the governments of member States:
...
...
16. encourage political actors, opinion leaders and other public personalities to take a firm public stand against anti-Muslim racism, speaking out against its various manifestations, including all its contemporary forms, and making clear that anti-Muslim racism will never be tolerated;
...
26. regulate internet companies, including social media networks, telecom operators and internet service providers in order to establish effective systems to monitor and stop anti‑Muslim hate speech online, in line with international human rights standards, and engage with social media networks to work together on initiatives, in particular in the field of education, that could help propagate balanced narratives about Muslims and Islam on social media platforms;
27. ensure continuous training at local, regional and national levels for police officers, prosecutors and the judiciary on preventing and combating anti‑Muslim racism, including recognising and recording anti-Muslim hate crime, agreed as best practice by European agencies and other international organisations;
...
D. Prosecution / Law Enforcement
...
51. ensure that criminal law also covers anti-Muslim bias and penalises the following anti-Muslim acts when committed intentionally:
...
f. public insults and defamation of a person or a group of persons because they are Muslims or perceived to be Muslims;
g. threats against a person or group of persons because they are Muslims or perceived to be Muslims;
h. the public expression, with a racist aim, of an ideology which depreciates or denigrates, or which incites hatred against a group of persons because they are Muslims or perceived to be Muslims;
...
...”
70. In its report on France adopted on 8 December 2015 (CRI (2016)1), ECRI noted a substantial rise in intolerance and a worsening of racist behaviour in recent years. It recommended that certain conduct be expressly criminalised: (i) the public expression of an ideology claiming the superiority of or depreciating or denigrating a group of persons; and (ii) the creation or leadership of a group which promotes racism, support for such a group or participation in its activities (§ 10). ECRI noted a decline in the tolerance of diversity since 2009, as the National Consultative Commission for Human Rights had found in its report published on 12 June 2014 on the combating of racism, antisemitism and xenophobia (report for 2013, published by La documentation française), as well as the prevalence of antisemitic stereotypes, especially in various segments of French society (Front national supporters, part of the population of Arab origin, and supporters of the Front de gauche). It pointed out that hate speech had led to acts of racist violence, especially by extremist groups.
71. The Special Representative of the Secretary General of the Council of Europe organised a consultation of Muslim organisations. The results gave rise to a working document (July 2021) containing the following conclusion (unofficial translation):
“In sum, the propagation of discrimination, incitement to violence and death threats on line is a growing concern among minorities in Europe, the Muslim community in particular. Like other types of racist and anti-religious intolerance, the phenomenon of anti-Muslim feeling and hate is complex. It is clear, however, that it is on the rise and that it is dangerous because on-line hatred leads to violence and killing. It must therefore be dealt with urgently.”
(a) Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression
72. In his report submitted in accordance with Human Rights Council resolution 16/4 (A/67/357, 7 September 2012), the Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression made the following observations in particular:
“46. While some of the above concepts may overlap, the Special Rapporteur considers the following elements to be essential when determining whether an expression constitutes incitement to hatred: real and imminent danger of violence resulting from the expression; intent of the speaker to incite discrimination, hostility or violence; and careful consideration by the judiciary of the context in which hatred was expressed, given that international law prohibits some forms of speech for their consequences, and not for their content as such, because what is deeply offensive in one community may not be so in another. Accordingly, any contextual assessment must include consideration of various factors, including the existence of patterns of tension between religious or racial communities, discrimination against the targeted group, the tone and content of the speech, the person inciting hatred and the means of disseminating the expression of hate. For example, a statement released by an individual to a small and restricted group of Facebook users does not carry the same weight as a statement published on a mainstream website. Similarly, artistic expression should be considered with reference to its artistic value and context, given that art may be used to provoke strong feelings without the intention of inciting violence, discrimination or hostility.
47. Moreover, while States are required to prohibit by law any advocacy of national, racial or religious hatred that constitutes incitement to discrimination, hostility or violence under article 20 (2) of the Covenant, there is no requirement to criminalize such expression. The Special Rapporteur underscores that only serious and extreme instances of incitement to hatred, which would cross the seven-part threshold, should be criminalized.
48. In other cases, the Special Rapporteur is of the view that States should adopt civil laws, with the application of diverse remedies, including procedural remedies (for example, access to justice and ensuring effectiveness of domestic institutions) and substantive remedies (for example, reparations that are adequate, prompt and proportionate to the gravity of the expression, which may include restoring reputation, preventing recurrence and providing financial compensation).
49. In addition, while some types of expression may raise concerns in terms of tolerance, civility and respect for others, there are instances in which neither criminal nor civil sanctions are justified. The Special Rapporteur wishes to reiterate that the right to freedom of expression includes forms of expression that are offensive, disturbing and shocking. Indeed, since not all types of inflammatory, hateful or offensive speech amount to incitement, the two should not be conflated.”
(b) Committee on the Elimination of Racial Discrimination
73. General Recommendation No. 35 of 26 September 2013, on combating racist hate speech, provides guidelines on the requirements of the International Convention on the Elimination of All Forms of Racial Discrimination, the aim being to help the States parties to fulfil their obligations. It states, in particular, as follows:
“6. Racist hate speech addressed in Committee practice has included all the specific speech forms referred to in article 4 directed against groups recognized in article 1 of the Convention - which forbids discrimination on grounds of race, colour, descent, or national or ethnic origin - such as indigenous peoples, descent-based groups, and immigrants or non-citizens, including migrant domestic workers, refugees and asylum seekers, as well as speech directed against women members of these and other vulnerable groups. In the light of the principle of intersectionality, and bearing in mind that ‘criticism of religious leaders or commentary on religious doctrine or tenets of faith’ should not be prohibited or punished, the Committee’s attention has also been engaged by hate speech targeting persons belonging to certain ethnic groups who profess or practice a religion different from the majority, including expressions of Islamophobia, anti-Semitism and other similar manifestations of hatred against ethno‑religious groups, as well as extreme manifestations of hatred such as incitement to genocide and to terrorism. Stereotyping and stigmatization of members of protected groups has also been the subject of expressions of concern and recommendations adopted by the Committee.
7. Racist hate speech can take many forms and is not confined to explicitly racial remarks. As is the case with discrimination under article 1, speech attacking particular racial or ethnic groups may employ indirect language in order to disguise its targets and objectives. In line with their obligations under the Convention, States parties should give due attention to all manifestations of racist hate speech and take effective measures to combat them. The principles articulated in the present recommendation apply to racist hate speech, whether emanating from individuals or groups, in whatever forms it manifests itself, orally or in print, or disseminated through electronic media, including the Internet and social networking sites, as well as non-verbal forms of expression such as the display of racist symbols, images and behaviour at public gatherings, including sporting events.
...
15. ... On the qualification of dissemination and incitement as offences punishable by law, the Committee considers that the following contextual factors should be taken into account:
The content and form of speech: whether the speech is provocative and direct, in what form it is constructed and disseminated, and the style in which it is delivered.
The economic, social and political climate prevalent at the time the speech was made and disseminated, including the existence of patterns of discrimination against ethnic and other groups, including indigenous peoples. Discourses which in one context are innocuous or neutral may take on a dangerous significance in another: in its indicators on genocide the Committee emphasized the relevance of locality in appraising the meaning and potential effects of racist hate speech.
The position or status of the speaker in society and the audience to which the speech is directed. The Committee consistently draws attention to the role of politicians and other public opinion-formers in contributing to the creation of a negative climate towards groups protected by the Convention, and has encouraged such persons and bodies to adopt positive approaches directed to the promotion of intercultural understanding and harmony. The Committee is aware of the special importance of freedom of speech in political matters and also that its exercise carries with it special duties and responsibilities.
The reach of the speech, including the nature of the audience and the means of transmission: whether the speech was disseminated through mainstream media or the Internet, and the frequency and extent of the communication, in particular when repetition suggests the existence of a deliberate strategy to engender hostility towards ethnic and racial groups.
The objectives of the speech: speech protecting or defending the human rights of individuals and groups should not be subject to criminal or other sanctions.
...”
C. European Union law and case-law of the Court of Justice of the European Union (CJEU)
74. Framework Decision 2008/913/JHA of 28 November 2008 of the Council of the European Union on combating certain forms and expressions of racism and xenophobia by means of criminal law (OJ L 328, pp. 55-58) is presented in paragraphs 82 et seq. of the Court’s judgment in Perinçek v. Switzerland ([GC], no. 27510/08, 15 October 2015).
75. In addition, in May 2016 the European Commission launched a Code of Conduct involving four major digital technology companies (Facebook, Microsoft, Twitter and YouTube) to prevent and counter the spread of racist and xenophobic hate speech online. The aim of the code is to ensure that content removal notifications are dealt with promptly. To date the Commission has conducted six evaluations of the Code of Conduct, presenting its results every year from 2016 to 2021. On 1 March 2018 the Commission published Recommendation (EU) 2018/334 on measures to combat illegal content online effectively (OJ L 63, 6 March 2018). Lastly, on 15 December 2020, the Commission published, inter alia, the draft “Digital Services Act” (DSA) with the aim of having it adopted in 2022, to enable the implementation of a new regulatory framework, introducing across the European Union a series of new harmonised obligations for digital services (COM/2020/825 final). A provisional agreement on the DSA, between the Council of the European Union and the European Parliament, was reached on 23 April 2022. The DSA entered into force on 16 November 2022.
76. As to the case-law of the CJEU, it ruled in its judgment Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v. Wirtschaftsakademie Schleswig-Holstein GmbH of 5 June 2018 (C-210/16, EU:C:2018:388), that the administrator of a “fan page” hosted on Facebook (such a page being, unlike a personal account such as that used by the applicant in the present case, a professional account for the promotion of a company or organisation on Facebook, operating with a series of specific strategies to improve and measure visitor interaction) had to be characterised as being responsible for the processing of the data of individuals visiting the page and therefore shared joint liability with the operator of the social network, within the meaning of Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data (OJ L 281 of 23 November 1995, pp. 31-50).
77. In its judgment in Fashion ID of 29 July 2019 (C‑40/17, EU:C:2019:629), the CJEU held that the administrator of a website (that of an online retail business selling fashion clothing) who inserted a “like” module from the social network Facebook, could be regarded as responsible, within the meaning of Directive 95/46, for the collection and communication of the personal data of visitors to that website.
78. In Glawischnig-Piesczek v. Facebook Ireland of 3 October 2019 (C-18/18, EU:C:2019:821), the CJEU ruled that Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (OJ L 178 of 17 July 2000, pp. 1-16), in particular Article 15 § 1 thereof, had to be interpreted as not precluding a court of a member State from: ordering a host provider like Facebook to remove information which it stored, the content of which was identical to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information; ordering a host provider to remove information which it stored, the content of which was equivalent to the content of information which was previously declared to be unlawful, or to block access to that information, provided that the monitoring of and search for the information concerned by such an injunction were limited to information conveying a message the content of which remained essentially unchanged compared with the content which gave rise to the finding of illegality and containing the elements specified in the injunction, and provided that the differences in the wording of that equivalent content, compared with the wording characterising the information which was previously declared to be illegal, were not such as to require the host provider to carry out an independent assessment of that content; and ordering a host provider to remove information covered by the injunction, or to block access to it, worldwide within the framework of the relevant international law.
79. From the information in the Court’s possession it can be seen that the liability of individual holders of social media accounts in respect of comments posted by others on their “walls” or accounts is an issue which has not been dealt with specifically in thirty-four member States of the Council of Europe: Albania, Austria, Azerbaijan, Belgium, Bosnia and Herzegovina, Croatia, Cyprus, the Czech Republic, Denmark, Estonia, Germany, Greece, Hungary, Iceland, Italy, Latvia, Liechtenstein, Luxembourg, the Republic of North Macedonia, Malta, the Republic of Moldova, Montenegro, the Netherlands, Norway, Poland, Romania, San Marino, Serbia, the Slovak Republic, Slovenia, Spain, Sweden, Türkiye and the United Kingdom. To date, in only six of these States has the matter been addressed in one way or another (Austria, Croatia, Germany, Romania, Sweden and Türkiye), together with Switzerland (judgment of the Federal Court of 7 April 2022, case no. 6B 1360/2021). Some national courts have interpreted the existing legal norms relating to Internet hosts or intermediary service providers as forming the basis for attribution of such liability. In the other twenty‑eight member States in question there are no explicit legal provisions, regulations or judicial practice expressly dealing with this question. Thus, there does not always seem to be room for the attribution of any form of liability in that context. On the other hand, in some other countries such liability may, in theory, be imposed either with reference to general provisions of civil, administrative or criminal law, or on the basis of more specific provisions relating to the duties and obligations of Internet hosts or intermediary service providers. It should nevertheless be borne in mind that such situations remain hypothetical, given that they have so far never arisen in practice.
80. As regards “hate speech”, although, as such, this concept is defined in the legislation of only three of the member States surveyed (Albania, Montenegro and Serbia), the others also prohibit and penalise certain forms of expression, including “incitement to hatred”. Specific elements need to be present in order for expression of hatred to be punishable. In particular, it has to be made in public; to be directed against a group (or a person belonging to that group) with protected characteristics; to be intentional; and to reach a certain level of seriousness or to be capable of harmful consequences.
III. TERMS OF USE OF SOCIAL NETWORK FACEBOOK
81. At the relevant time a “Statement of rights and responsibilities” governed Facebook’s relations with its users, who were deemed to agree to it upon accessing the network. It provided in particular as follows in point 2.4: “when you publish content or information using the Public setting, it means that you are allowing everyone, including people off of Facebook, to access and use that information and to associate it with you (i.e. your name and profile picture)”. The statement also contained a provision on “hateful” content (replaced by “hate speech”, then “hateful speech” on subsequent amendment - cf. Part III, point 12 “Hate Speech”, in the latest version of “Facebook Community Standards”).
82. In addition, there was no legal provision requiring the holder of a personal account on a social network to set up any automatic filtering of comments posted by others, and the practical possibility of prior content moderation on Facebook did not yet exist at the relevant time. However, Facebook has since enabled the administrators of its pages to implement prior and subsequent moderation of content posted by third parties.
ALLEGED VIOLATION OF ARTICLE 10 OF THE CONVENTION
83. The applicant alleged that his criminal conviction, on account of comments posted by third parties on the “wall” of his Facebook account, had breached Article 10 of the Convention, which reads as follows:
“1. Everyone has the right to freedom of expression. This right shall include freedom to hold opinions and to receive and impart information and ideas without interference by public authority and regardless of frontiers. This Article shall not prevent States from requiring the licensing of broadcasting, television or cinema enterprises.
2. The exercise of these freedoms, since it carries with it duties and responsibilities, may be subject to such formalities, conditions, restrictions or penalties as are prescribed by law and are necessary in a democratic society, in the interests of national security, territorial integrity or public safety, for the prevention of disorder or crime, for the protection of health or morals, for the protection of the reputation or rights of others, for preventing the disclosure of information received in confidence, or for maintaining the authority and impartiality of the judiciary.”
A. The Chamber judgment

86. The Chamber noted in particular that the comments posted on the applicant’s Facebook “wall”, to which the public had access, were clearly unlawful. While acknowledging the election context and taking account of the medium used, the “wall” of a Facebook account, it took the view that the domestic courts’ findings concerning those comments had been fully substantiated.
87. In addition, after noting that the applicant had solely been reproached for his lack of vigilance and reaction with respect to certain comments on his Facebook “wall”, and in view of the specific local political context, the Chamber found, having regard to the margin of appreciation afforded to the respondent State, that the decision of the domestic courts to convict the applicant for not having promptly deleted the unlawful comments posted by third parties on his “wall”, which he was using in support of his election campaign, had been based on relevant and sufficient grounds. Accordingly, it held that the interference could be considered “necessary in a democratic society” within the meaning of Article 10 § 2 of the Convention.
B. The parties’ submissions

1. The applicant
88. The applicant pointed out that he had been convicted in his capacity as “producer” within the meaning of French law, without having received any notification asking him to remove the impugned comments. He submitted that it had not been proven that he had been aware of the comments or of their unlawfulness. He also pointed to the fact that he had been using a Facebook account as a local councillor and that the impugned comments had been posted by authors who had been both identified and convicted, his own conviction thus replicating theirs. At the relevant time, Facebook’s parameters had not provided for any filtering of comments prior to posting. The monitoring obligation imposed, according to the Chamber judgment, on a Facebook account holder would be very burdensome and would present him or her with irreconcilable conflicts of interest.
89. In the applicant’s submission, in view of the scale of the task, the holder of a Facebook account would inevitably be forced to engage in censorship owing to the risk of criminal proceedings, even in respect of remarks that were not manifestly unlawful. He took the view that his case, in reality, concerned the question of remarks which, while virulent, polemical or unpleasant, did not exceed the permissible limits of freedom of expression in political matters, particularly when made during an election campaign.
90. As regards the legality of his criminal conviction, he submitted that the criteria of accessibility, precision and foreseeability were lacking. He pointed out that, although the basis for his conviction had been section 24 of the Law of 29 July 1881, it was in fact section 93-3 of Law no. 82-652 of 29 July 1982 on audiovisual communication, enabling his liability to be engaged as “producer”, within the meaning of that provision, which was at issue in the present case.
91. The applicant stressed that section 93-3 of Law no. 82-652 of 29 July 1982 provided for a duality of actors and a “cascading” hierarchy of liability for the prosecution of an offence, the effect of which was to allow proceedings against the producer only if it was not possible to prosecute the publication director or, failing that, the authors. He noted, however, that there had been no publication director in the present case and that he had been prosecuted as producer even though the two authors of the impugned comments had been identified and convicted. He concluded that the application of the law and his conviction as producer had not been foreseeable. He added that the concept of producer was not defined by law in relation to social media.
92. He further submitted that section 93-3 of Law no. 82-652 of 29 July 1982 did not specify the conditions under which the producer was considered to have had knowledge of the unlawful remarks. It was inconsistent not to require prior notification to the producer, in order to ensure legal certainty, as was provided for in Law no. 2004-575 of 21 June 2004 in the case of hosts.
93. He also disagreed with the argument that his criminal conviction had pursued a legitimate aim, since section 93-3 of Law no. 82-652 of 29 July 1982 was intended to implicate the producer only if proceedings could not be brought against either the publication director or the authors, but that had not been the case.
94. As to the necessity of the interference, the applicant complained that he had been convicted in respect of the first message even though it had been removed by the author himself less than twenty-four hours after being posted. With regard to the comments made by L.R., he submitted that the courts had failed to show that he had been aware of them or that they were manifestly unlawful, merely invoking a presumption of a general duty of enhanced scrutiny based on his status as a politician. He had in any event removed those comments as soon as he had been informed of their existence, upon being summoned by the police.
95. The applicant further observed that the impugned remarks had reflected his party’s political manifesto, which had never been banned, and that they denounced a policy which promoted the establishment of community-specific businesses, thus amounting to political speech and criticism that should be allowed on social media. In his view the remarks were lawful, as the language used had been lively rather than vulgar or insulting, and Internet users were entitled to have recourse to a degree of exaggeration or provocation.
96. The applicant pointed out that he was a politician and thus his environment was a sensitive one, requiring him to strike a balance between the protection of the reputation of others and the interests of free discussion of political issues, particularly during an election campaign, which was a high point in the life of any party or political leader. The Internet and social media helped to simplify speech, and the pitfall of self-censorship had to be avoided at all costs, so as not to eliminate criticism of official policy or political opposition. The Internet provided, in particular, a bottom-up chain of communication from the citizen to the politician, as a means of expressing concerns, positions and criticisms to the latter. The transposition of media law was therefore inadequate and, moreover, with a system of interactive monologues, each speaker should be solely liable for his or her remarks.
97. Furthermore, the applicant argued that filtering would not be desirable in view of the emergence of an electronic democracy. On the other hand, formal notification requesting the removal of a given comment, whether by electronic registered letter or through a reporting mechanism, would be a means of establishing that the account holder was aware of the comment and of verifying that he or she was acting in good faith by immediately deleting it.
98. Lastly, he referred to the need to have recourse to means other than criminal proceedings in order to respond to political criticism.
99. The applicant concluded from the foregoing that the reasons given by the domestic courts to convict him had been neither relevant nor sufficient.
2. The Government
100. The Government acknowledged that there had been an interference with the applicant’s freedom of expression, describing it as indirect, since he had not been convicted on the basis of his own words and he himself had not conveyed the impugned remarks to the public. They concluded that the interference in the present case thus concerned only the possible limitations on the applicant’s ability to open a free discussion forum, allowing third parties to express their views and react on his Facebook “wall”, and therefore a circumscribed area of his means of political expression.
101. They submitted, however, that the interference in question was prescribed by law, pursued legitimate aims and was necessary in a democratic society.
102. As to the lawfulness of that interference, they referred in particular to the terms of section 24(8) of the Law of 29 July 1881 and to the case-law of the Court of Cassation, which illustrated the various situations in which the relevant offence had or had not been considered to have been committed. They also pointed out that racist remarks might not constitute incitement to hatred if they did not, even implicitly, call for or encourage discrimination, hatred or violence, emphasising that the courts would look at the intention behind the speech. The Government submitted that in the present case the domestic courts, in which Leila T. had brought proceedings, had given reasons for their decisions and had applied the usual tests. In their view, the political backdrop and the existence of an election campaign had been expressly taken into account.
103. As regards the imputability of the offence, the Government pointed out that the applicant had been convicted in his capacity as “producer” within the meaning of section 93-3 of Law no. 82-652 of 29 July 1982, which provided for two scenarios, depending on whether or not the message had been “fixed” prior to its transmission to the public. The definition of “producer” did not derive from statute law, but from the case-law of the Court of Cassation and the Constitutional Council; they referred to the Constitutional Council’s decision of 16 September 2011 (QPC, no. 2011‑164) and to the case-law of the Court of Cassation, which had drawn the appropriate conclusions from that decision. They further submitted that the Court had previously held that imputing criminal liability to the “producer” was not incompatible with the Convention (they referred to Radio France and Others v. France, no. 53984/00, § 24, ECHR 2004-II).
104. The Government observed that the applicant had been prosecuted for specific conduct directly linked to his status as producer, being the account holder, whereas the authors had been prosecuted and convicted as accomplices, in accordance with section 93-3 of Law no. 82-652 of 29 July 1982.
105. They submitted that political debate was not exempt from restrictions in the Court’s case-law, as politicians could be held criminally liable for hate speech at political rallies if it was uttered publicly. The liability of a politician, like that of a private individual, for unlawful remarks made by third parties on their Facebook “wall”, could be engaged irrespective of whether the proceedings were criminal or civil. The Government also pointed out that while a political party might be held liable for unlawful remarks posted by third parties in the context of an account that it had created on a social network or medium as part of its activity, a legal entity could not, with certain exceptions, be prosecuted for offences under the Freedom of the Press Act of 29 July 1881. In any event, any potential liability of an entity did not rule out liability on the part of the individuals who were the perpetrators of the alleged offences or their accomplices.
106. The Government further submitted that there were a number of different forms of political expression, each of which was controlled and regulated. As regards social media pages and political communication, they enjoyed wide dissemination, extending beyond the circle of supporters and the context of a political rally, while remaining permanently accessible. They highlighted the specificity of social networks as tools of political communication, as opposed to political meetings or rallies. In French law, unlike the provisions applying to remarks made at such rallies, online hate speech fell within a different set of rules, in that the liability of the author of the hate speech could be engaged only as an alternative to that of the “publication director” or the “producer” of an online communication site. They explained this by the fact that the use of social media differed from certain more traditional means of political communication, entailing broad and sustained dissemination over time to a wide audience that extended well beyond that of a political rally: therefore, in view of the risk of hate speech being spread even more widely, it would be particularly dangerous not to regulate its dissemination on Facebook. They took the view that the applicant’s Facebook account was more akin to a large portal run for professional and commercial purposes than other types of Internet fora, as defined in Delfi AS v. Estonia ([GC], no. 64569/09, ECHR 2015). In that context, the applicant had had only one obligation: to delete unlawful posts promptly after becoming aware of them. No regulation required the automatic filtering of comments and there was no practical possibility of prior content moderation on Facebook. They concluded that the responsibility contemplated by Article 10 § 2 of the Convention should lead producers, in particular when they were candidates for election or elected officials, to open a forum for discussion only if they were able to moderate comments to a minimum degree.
107. Moreover, the Government acknowledged that section 93-3 of Law no. 82-652 of 29 July 1982 did not specify the conditions in which the publication director or producer was deemed to have actual knowledge of the comments, unlike hosts under the provisions of Law no. 2004-575 of 21 June 2004, which laid down rules for the reporting of illegal content. According to the case-law of a number of courts of first instance, the need for prompt deletion required a very rapid reaction, and the Court of Cassation had stated that prior knowledge of the remarks had to be established.
108. As regards the aim pursued by the interference, the Government submitted that it pursued at least one legitimate aim within the meaning of Article 10 of the Convention, namely the protection of the rights of others.
109. As to the necessity of the interference and the context in which the comments had been made, the domestic courts had correctly applied section 24 of the Law of 29 July 1881 in characterising the offence as incitement to hatred, since the applicant’s Facebook “wall” had not only been overtly presented as that of a local Front national politician who was running an election campaign, but had also been voluntarily left open to anyone with a Facebook account. This meant that his responsibility was all the greater in his capacity as a politician.
110. The Government further submitted that the impugned interference had been proportionate and had been the only measure capable in practice of ensuring the removal of the impugned comments, which had remained accessible to the public, as victims had not had the necessary means to achieve this. They submitted that the applicant could have changed the settings to regulate the comments on his Facebook account and that, in order to avoid further harm to Leila T. and without running the risk of hindering his election campaign, it would have been sufficient for him to delete the comments of which he was perfectly aware and which did not form part of the local debate.
111. Lastly, the Government noted that the applicant had been fined a reduced amount by the Court of Appeal, without there being any further consequences for him.
3. Observations of third-party interveners
(a) The Government of Slovakia
112. The Slovak Government observed in particular that the social media era had taken public debate onto the Internet. In addition, repeated attacks on democratic principles, human dignity and private life, hidden under the cloak of freedom of expression, should be excluded from its scope of protection. The State should be allowed to combat such attacks by criminalising hate speech.
113. With regard to the very significant impact of social media and their use by politicians, the Slovak Government observed that they had become a tool for political combat and public influence. To illustrate this argument, they provided statistical data obtained for Slovakia and analysed by a Slovak newspaper, according to which, for a country of 5.45 million inhabitants, the three best known Slovak politicians had, in 2021, collected 11, 4.2 and 4.1 million social media interactions respectively (the second and third being of a far-right political orientation). Similarly, in 2020, the mayor of a Slovak town of 5,000 people, known for his resistance to governmental measures relating to the Covid-19 pandemic, had 125,000 interactions on his Facebook page, and nearly 1.5 million the following year. They added that politicians were also the authors of highly successful social media posts.
114. In that context, the Slovak Government considered that the question of politicians’ criminal liability for hate speech disseminated on social media should be approached with extreme caution.
(b) The Government of the Czech Republic
115. The Czech Government argued in particular that the scope of the liability shared between the author of the content, the social media platform and third parties had to be clarified by the Court, so that the obligations would be reasonably foreseeable for each of them.
116. They took the view that the liability of the social network or platform should not be overlooked, to avoid imposing a disproportionate burden on the holder of an account. They also raised the issue of the scope of the States’ positive obligation where the authors of the offending remarks had been identified.
117. In addition, warning against the chilling effect of criminal sanctions, particularly in an election context, they found it necessary to envisage alternative procedures and less severe measures.
(c) Media Defence and the Electronic Frontier Foundation
118. Media Defence and the Electronic Frontier Foundation submitted inter alia that the principles established in Delfi AS (cited above) should not be applied to the users of digital platforms (such as Facebook) acting as mere intermediaries, who, according to some studies, were among those most affected by erroneous moderation.
119. In their view, the various users of social media should not be obliged to decide whether third-party posts on their accounts were lawful, since that was a matter for the national courts alone, or to monitor content produced by third parties. They should be held liable only in the event of proven knowledge of the illegal content.
(d) European Information Society Institute (EISi)
120. EISi underscored the need to determine the outer limits of the liability of speech facilitators while examining the interaction between the various actors in a complex digital “ecosystem”. Holding a Facebook “wall” owner criminally liable for failing to take prompt, pre-notification action against hate speech by identifiable authors was a disproportionate measure with a potential chilling effect. Any liability should be shared between the authors of the comments if they could be identified and the other actors involved, in line with a “graduated and differentiated” approach.
121. EISi submitted in particular that social media platforms had inherent characteristics that were incompatible with editorial control such as that applied by the press and that it was not possible to require monitoring of all comments in the first twenty-four hours of publication without imposing a disproportionate burden. It advocated a “notice-and-takedown” model of liability enforcement, with the exception of situations where the intermediary had itself incited the unlawful comments in question.
1. Whether there has been an interference
122. It is not in dispute between the parties that the applicant’s criminal conviction constituted an interference with his right to freedom of expression, as guaranteed by Article 10 § 1 of the Convention. The Court sees no reason to hold otherwise (see, in the same vein, Delfi AS, cited above, § 118).
123. Such interference will be in breach of the Convention unless it was “prescribed by law”, pursued one or more of the legitimate aims referred to in the second paragraph of Article 10 and was “necessary in a democratic society”.
2. Whether the interference was lawful
124. The Court reiterates that the expression “prescribed by law” in the second paragraph of Article 10 not only requires that the impugned measure should have a legal basis in domestic law, but also refers to the quality of the law in question, which should be accessible to the person concerned and foreseeable as to its effects (see, among other authorities, NIT S.R.L. v. Republic of Moldova [GC], no. 28470/12, § 158, 5 April 2022; Satakunnan Markkinapörssi Oy and Satamedia Oy v. Finland [GC], no. 931/13, § 142, 27 June 2017; and Delfi AS, cited above, § 120).
125. As regards the requirement of foreseeability, the Court has repeatedly held that a norm cannot be regarded as a “law” within the meaning of Article 10 § 2 unless it is formulated with sufficient precision to enable a person to regulate his or her conduct. That person must be able - if need be with appropriate advice - to foresee, to a degree that is reasonable in the circumstances, the consequences which a given action may entail. Those consequences need not be foreseeable with absolute certainty. A law which confers a discretion is thus not in itself inconsistent with the requirement of foreseeability, provided that the scope of the discretion and the manner of its exercise are indicated with sufficient clarity, having regard to the legitimate aim of the measure in question, to give the individual adequate protection against arbitrary interference (see Magyar Kétfarkú Kutya Párt v. Hungary [GC], no. 201/17, § 94, 20 January 2020, with further references). Whilst certainty is desirable, it may bring in its train excessive rigidity, and the law must be able to keep pace with changing circumstances. Accordingly, many laws are inevitably couched in terms which, to a greater or lesser extent, are vague, and whose interpretation and application are questions of practice (see Satakunnan Markkinapörssi Oy and Satamedia Oy, cited above, § 143; Delfi AS, cited above, § 121; and Lindon, Otchakovsky-Laurens and July v. France [GC], nos. 21279/02 and 36448/02, § 41, ECHR 2007‑I). The level of precision required of domestic legislation - which cannot provide for every eventuality - depends to a considerable degree on the content of the law in question, the field it is designed to cover and the number and status of those to whom it is addressed (see NIT S.R.L., cited above, § 160; Satakunnan Markkinapörssi Oy and Satamedia Oy, cited above, § 144; and Delfi AS, cited above, § 122).
126. A margin of doubt in relation to borderline facts does not therefore by itself make a legal provision unforeseeable in its application. Nor does the mere fact that a provision is capable of more than one construction mean that it fails to meet the requirement of “foreseeability” for the purposes of the Convention. The role of adjudication vested in the courts serves precisely to dissipate such interpretational doubts as remain, taking into account the changes in everyday practice (see Magyar Kétfarkú Kutya Párt, cited above, § 97, and Gorzelik and Others v. Poland [GC], no. 44158/98, § 65, ECHR 2004‑I).
127. At the same time, the Court is aware that there must come a day when a given legal norm is applied for the first time (see NIT S.R.L., cited above, § 159; Magyar Kétfarkú Kutya Párt, cited above, § 97; and Kudrevičius and Others v. Lithuania [GC], no. 37553/05, § 115, ECHR 2015). The novel character of a legal question that has not hitherto been raised, particularly with regard to previous decisions, is not in itself incompatible with the requirements of accessibility and foreseeability of the law, provided the solution adopted is consistent with one of the possible and reasonably foreseeable interpretations (see, mutatis mutandis, X and Y v. France, no. 48158/11, § 61, 1 September 2016; Huhtamäki v. Finland, no. 54468/09, § 51, 6 March 2012; and Soros v. France, no. 50425/06, § 58, 6 October 2011).
128. The Court’s power to review compliance with domestic law is thus limited, as it is primarily for the national authorities, notably the courts, to interpret and apply domestic law (see, among other authorities, NIT S.R.L., cited above, § 160; Satakunnan Markkinapörssi Oy and Satamedia Oy, cited above, § 144; and Kudrevičius and Others, cited above, § 110). Unless the interpretation is arbitrary or manifestly unreasonable, the Court’s role is confined to ascertaining whether the effects of that interpretation are compatible with the Convention (see NIT S.R.L., cited above, § 160; Radomilja and Others v. Croatia [GC], nos. 37685/10 and 22768/12, § 149, 20 March 2018; and Centre for Democracy and the Rule of Law v. Ukraine, no. 10090/16, § 108, 26 March 2020, with further references). In any event, it is not for the Court to express a view on the appropriateness of methods chosen by the legislature of a respondent State to regulate a given field. Its task is confined to determining whether the methods adopted and the effects they entail are in conformity with the Convention (see Delfi AS, cited above, § 127, and Gorzelik and Others, cited above, § 67).
(b) Application of those principles to the present case
129. The Grand Chamber would begin by noting that the applicant’s criminal conviction was handed down on the basis of section 23, first paragraph, and section 24, eighth paragraph, of the Law of 29 July 1881, together with section 93-3 of Law no. 82-652 of 29 July 1982. Like the Chamber (see paragraph 71 of the Chamber judgment), it reiterates that a criminal conviction under sections 23 and 24 of the Law of 29 July 1881 meets the requirement of foreseeability of the law for the purposes of Article 10 of the Convention (see, among other authorities, Le Pen v. France (dec.), no. 18788/09, 20 April 2010; Soulas and Others v. France, no. 15948/03, § 29, 10 July 2008; Garaudy v. France (dec.), no. 65831/01, 24 June 2003; and Bonnet v. France (dec.), no. 35364/19, § 32, 25 January 2022). It does not see any reason to hold otherwise in the present case.
130. With regard more specifically to section 93-3 of Law no. 82-652 of 29 July 1982, it notes that this provision lays down a legal framework which has developed in three stages (see paragraph 36 above).
131. The Government submitted in this connection that section 93-3 of Law no. 82-652 of 29 July 1982 provided for two scenarios, depending on whether or not the statement had undergone “prior fixing” (see paragraph 103 above). The Court observes that an absence of “prior fixing” had precisely been the reason for the amendment introduced by Law no. 2009-669 of 12 June 2009 (see paragraph 36 above), with the addition of a fifth and last paragraph in section 93-3. This paragraph sought specifically to regulate the liability of the publication director in such a situation. Ultimately, in addition to the 2009 reform amending section 93-3, both the Constitutional Council and the Court of Cassation have extended the benefit of the last paragraph of section 93-3 to the producer (see paragraphs 40 and 41 above).
132. The applicant, however, in addition to submitting that the concept of producer was not defined by the law in relation to social networks, argued that the application of section 93-3 of Law no. 82-652 of 29 July 1982 on audiovisual communication and his conviction as producer had not been foreseeable and that prior notification to the producer was required in order to ensure legal certainty (see paragraphs 90 and 91 above).
133. The Court would particularly emphasise the fact that, further to a priority question of constitutionality (QPC) concerning a difference in treatment between the publication director, the only actor referred to in section 93-3 inserted by Law no. 2009-669 of 12 June 2009, and the producer, the Constitutional Council provided some key explanations in its decision no. 2011-164 QPC of 16 September 2011 (see paragraph 40 above). First, as regards the definition of the concept of “producer”, it referred to the interpretation given by the Court of Cassation in its judgments of 16 February 2010 (see paragraph 38 above). Secondly, the Constitutional Council formulated an interpretative reservation whereby section 93-3 of Law no. 82‑652 of 29 July 1982 could not be interpreted as allowing the creator or administrator of a website for online communication to the public, which made comments by Internet users publicly accessible, to be held criminally liable as producer solely on account of the content of comments of which he or she had no knowledge before they were posted online. Accordingly, the effect of its interpretative reservation is to allow the application to the producer of the same mitigated liability regime as that granted to the publication director under the fifth and last paragraph of section 93-3.
134. The Court finds, first, that the definition of producer within the meaning of Law no. 82-652 of 29 July 1982 emerges from a consistent line of decisions of the Court of Cassation, as subsequently endorsed by the Constitutional Council (see paragraphs 38, 40 and 133 above), in terms that are clear and unequivocal. It is therefore of the view that no question arises in this connection concerning the lawfulness of the interference.
135. Secondly, as regards the application of section 93-3 of Law no. 82‑652 of 29 July 1982 and its regime of “cascading” liability, the Court notes at the outset that the case of Radio France and Others (cited above), as relied upon by the Government (see paragraph 103 above), concerned a situation which is not relevant to the present case, namely the presumption of liability of a publication director, in the audiovisual field, where the impugned message had undergone “prior fixing” before being broadcast.
136. The Court would underscore the importance of clearly and precisely defining the scope of criminal offences relating to expression which incites, encourages or justifies violence, and the need to interpret the relevant provisions of criminal law strictly. It further notes the recommendations of the Committee of Ministers, which has emphasised the fact that the responsibilities and liability rules imposed on Internet intermediaries should be “transparent, clear and predictable” (see paragraph 62 above). It is important that the High Contracting Parties bear this in mind when adapting existing regulations or adopting new norms, as and when technologies such as the Internet progress.
137. The Court notes that the “cascading” liability regime, which is intended to solve the problem, for the potential victim of an offence, of the author’s anonymity, was endorsed by the Court of Cassation in its case-law from 2010 onwards (see paragraph 39 above).
138. In the present case, the authors were not only identified but also prosecuted together with the applicant and convicted as his accomplices. In this connection, the Court notes that, prior to the applicant’s conviction, the Court of Cassation’s jurisprudence already allowed the liability of the producer alone to be brought into play in the case of offences made out under the press legislation on account of statements by clearly identified third parties. The principle of the autonomy of the proceedings, which has been applied by different formations of the Court of Cassation in relation to various situations (see paragraph 43 above), applies without prejudice to the “cascading” liability regime (see paragraph 39 above), which is intended for a different scenario, namely where proceedings cannot be brought against the author of the offending message for a variety of reasons. Thus, in a judgment handed down prior to the facts of the present case, on 16 February 2010 (appeal no. 09-81.064, Bull. crim., no. 31), the Court of Cassation had quashed the judgment of a Court of Appeal which had acquitted the administrator of a blog, without ascertaining whether he could be prosecuted as producer, in proceedings concerning a comment posted thereon by a third party, even though that author had been identified (see paragraph 39 above). The Court would further emphasise that, in its decision of 16 September 2011, the Constitutional Council accepted that section 93‑3 of Law no. 82-652 of 29 July 1982, in aligning the regime of liability applicable to the producer with that of the publication director, was in conformity with the Constitution (see paragraphs 40 and 133 above).
139. Consequently, the Court takes note of the interpretation of section 93-3 of Law no. 82-652 of 29 July 1982 and its application by the domestic courts, in the light of the domestic law as it stood at the material time (see paragraphs 35 et seq. above), and considers that they were neither arbitrary nor manifestly unreasonable.
140. Lastly, as to the question of the point in time from which the producer is deemed to have had knowledge of the unlawful remarks, the Court notes that section 93-3 of Law no. 82-652 of 29 July 1982 indeed remains silent (see paragraph 37 above), leaving the matter to be decided by the relevant domestic courts on a case-by-case basis. Moreover, at the material time, the domestic law did not require any prior representation by a victim vis-à-vis the producer, unlike the rule then applying to “hosts” such as Facebook (see paragraph 45 above). The Court would again point out that it is not its task to express a view on the appropriateness of methods chosen by the legislature of a respondent State to regulate a given field (see paragraph 128 above). The lack of a system of prior notification to the producer cannot therefore in itself raise a difficulty in terms of the lawfulness of the interference, regardless of any difference in relation to the regime that may be applicable to hosts (see paragraph 45 above). The Court would, moreover, reiterate that in cases where third-party user comments take the form of hate speech, the rights and interests of others and of society as a whole may entitle Contracting States to impose liability on the relevant Internet news portals, without contravening Article 10 of the Convention, if they fail to take measures to remove clearly unlawful comments without delay, even without notice from the alleged victim or from third parties (see Delfi AS, cited above, § 159). Even though the applicant’s situation cannot be compared to that of an Internet news portal (see paragraph 180 below), the Court sees no reason to hold otherwise in the present case. A situation entailing the judicial interpretation of principles contained in statute law will not in itself necessarily fall foul of the requirement that the law be framed in sufficiently precise terms, as the role of adjudication vested in the courts serves precisely to dissipate such interpretational doubts as remain (see paragraphs 126 et seq. above).
141. The question of the liability of a Facebook account holder, in the present case a politician during an election campaign, for remarks posted on his or her “wall”, particularly in a political and electoral context, had not yet given rise to any specific case-law at the relevant time. However, as the Court has already pointed out, a margin of doubt as to the consequences of applying a law to borderline facts does not in itself mean that its application fails to meet the requirement of foreseeability (see paragraph 126 above), nor does the fact that this was the first case of its kind as such render the interpretation of the law unforeseeable (see paragraph 127 above). The novel character of the legal question raised in the case was not in itself incompatible with the requirements of accessibility and foreseeability of the law. Moreover, as the Chamber rightly observed (see paragraphs 69 and 72 of its judgment), the applicant, even though he was assisted by a lawyer at the Conseil d’État and at the Court of Cassation, did not raise this matter in his appeal on points of law, thus showing that he did not intend to dispute in the domestic courts the quality of the legal basis of the proceedings against him. In any event, the Court notes that the applicant did not substantiate his allegation that the domestic courts’ interpretation had been arbitrary or manifestly unreasonable (see paragraphs 127 and 128 above). On the contrary, having regard to the foregoing, it was one of the possible and reasonably foreseeable interpretations.
142. Having regard to all of the foregoing considerations, the Court finds that section 93-3 of Law no. 82-652 of 29 July 1982 was formulated with sufficient precision, for the purposes of Article 10 of the Convention, to enable the applicant to regulate his conduct in the circumstances of the present case.
3. Whether the interference pursued a legitimate aim
143. The applicant did not agree that his criminal conviction had pursued a legitimate aim, arguing that section 93-3 of Law no. 82-652 of 29 July 1982 was intended as a basis for proceedings to be brought against the producer only where they could not be brought against the publication director and authors.
144. While referring to its findings on the lawfulness of the interference in this connection (see paragraphs 135-139 above), the Court takes the view that there is no doubt, having regard to the reasoning given by the domestic courts in support of the applicant’s conviction (see paragraphs 26-28 and 31 et seq. above), that the interference pursued not only the legitimate aim of protecting the reputation or the rights of others but also that of preventing disorder and crime (contrast Perinçek v. Switzerland [GC], no. 27510/08, § 153, ECHR 2015 (extracts)).
4. Whether the interference was necessary in a democratic society
(i) Freedom of expression
145. The general principles concerning the question whether a given interference is “necessary in a democratic society” are well established in the Court’s case-law and can be summed up as follows (see, among many other authorities, NIT S.R.L., cited above, § 177; Perinçek, cited above, §§ 196‑197; and Delfi AS, cited above, § 131):
“(i) Freedom of expression constitutes one of the essential foundations of a democratic society and one of the basic conditions for its progress and for each individual’s self-fulfilment. Subject to paragraph 2 of Article 10, it is applicable not only to ‘information’ or ‘ideas’ that are favourably received or regarded as inoffensive or as a matter of indifference, but also to those that offend, shock or disturb. Such are the demands of pluralism, tolerance and broadmindedness without which there is no ‘democratic society’. As set forth in Article 10, this freedom is subject to exceptions, which ... must, however, be construed strictly, and the need for any restrictions must be established convincingly ...
(ii) The adjective ‘necessary’, within the meaning of Article 10 § 2, implies the existence of a ‘pressing social need’. The Contracting States have a certain margin of appreciation in assessing whether such a need exists, but it goes hand in hand with European supervision, embracing both the legislation and the decisions applying it, even those given by an independent court. The Court is therefore empowered to give the final ruling on whether a ‘restriction’ is reconcilable with freedom of expression as protected by Article 10.
(iii) The Court’s task, in exercising its supervisory jurisdiction, is not to take the place of the competent national authorities but rather to review under Article 10 the decisions they delivered pursuant to their power of appreciation. This does not mean that the supervision is limited to ascertaining whether the respondent State exercised its discretion reasonably, carefully and in good faith; what the Court has to do is to look at the interference complained of in the light of the case as a whole and determine whether it was ‘proportionate to the legitimate aim pursued’ and whether the reasons adduced by the national authorities to justify it are ‘relevant and sufficient’ ... In doing so, the Court has to satisfy itself that the national authorities applied standards which were in conformity with the principles embodied in Article 10 and, moreover, that they relied on an acceptable assessment of the relevant facts ...”
(ii) Debate in the field of politics
(α) Protection of political debate
146. There is little scope under Article 10 § 2 of the Convention for restrictions on freedom of expression in the field of political speech (see NIT S.R.L., cited above, § 178; Sürek v. Turkey (no. 1) [GC], no. 26682/95, § 61, ECHR 1999‑IV; and Fleury v. France, no. 29784/06, § 43, 11 May 2010). The promotion of free political debate is a very important feature of a democratic society and the Court attaches the highest importance to freedom of expression in the context of such debate (see Feldek v. Slovakia, no. 29032/95, § 83, ECHR 2001‑VIII). The authorities’ margin of appreciation, in assessing the “necessity” of a contested measure in this context, is therefore particularly narrow (see, among other authorities, Tête v. France, no. 59636/16, § 63, 26 March 2020; Willem v. France, no. 10883/05, § 32, 16 July 2009; Mamère v. France, no. 12697/03, § 20, ECHR 2006‑XIII; and Lingens v. Austria, 8 July 1986, § 42, Series A no. 103).
147. Freedom of expression is especially important for an elected representative of the people, political parties and their active members and, accordingly, interference with the freedom of expression of a member of the opposition, who represents his or her electorate, draws attention to their preoccupations and defends their interests, thus calls for the closest scrutiny on the part of the Court (see Selahattin Demirtaş v. Turkey (no. 2) [GC], no. 14305/17, § 242, 22 December 2020; Karácsony and Others v. Hungary [GC], nos. 42461/13 and 44357/13, § 137, 17 May 2016; Otegi Mondragon v. Spain, no. 2034/07, § 50, ECHR 2011; and Féret v. Belgium, no. 15615/07, § 65, 16 July 2009).
(β) The existence of responsibility and limits not to be exceeded
148. While political speech calls for an elevated level of protection, the freedom of political debate is not absolute in nature. A Contracting State may make it subject to certain “restrictions” or “penalties”, but it is for the Court to give a final ruling on the compatibility of such measures with the freedom of expression enshrined in Article 10 (see Selahattin Demirtaş, cited above, § 245; Féret, cited above, § 63; and Castells v. Spain, 23 April 1992, § 46, Series A no. 236).
149. Since tolerance and respect for the equal dignity of all human beings constitute the foundations of a democratic, pluralistic society, it follows that, in principle, it may be considered necessary in certain democratic societies to penalise or even prevent all forms of expression that propagate, encourage, promote or justify hatred based on intolerance (including religious intolerance), provided that any “formalities”, “conditions”, “restrictions” or “penalties” imposed are proportionate to the legitimate aim pursued (see Féret, cited above, § 64). However, while any individual who takes part in a public debate of general concern must not overstep certain limits, particularly with regard to respect for the reputation and the rights of others, a degree of exaggeration, or even provocation, is permitted (see Fleury, cited above, § 45, and Willem, cited above, § 33).
150. Moreover, political figures also have duties and responsibilities. Thus the Court has found that it is crucial for politicians, when expressing themselves in public, to avoid comments that might foster intolerance (see Erbakan v. Turkey, no. 59405/00, § 64, 6 July 2006), and that they should also be particularly careful to defend democracy and its principles, their ultimate aim being to govern (see Féret, cited above, § 75). In particular, to foster the exclusion of foreigners constitutes a fundamental attack on individual rights, and everyone - politicians included - should exercise particular caution in discussing such matters (ibid.). Consequently, remarks capable of arousing a feeling of rejection and hostility towards a community fall outside the protection guaranteed by Article 10 (see Le Pen v. France (dec.), no. 45416/16, §§ 34 et seq., 28 February 2017).
151. Such responsibility does not, of course, rule out any discussion of delicate or sensitive matters, but it must be borne in mind that political parties have the right to defend their opinions in public, even if some may offend, shock or disturb part of the population. They can therefore propose solutions to the problems linked to immigration, but in doing so they must avoid advocating racial discrimination and resorting to vexatious or humiliating remarks or attitudes, as such conduct might trigger reactions among the public that would be detrimental to a peaceful social climate and might undermine confidence in the democratic institutions (see Féret, cited above, § 77).
(γ) The election context
152. In the context of an election campaign, a certain vivacity of comment may be tolerated more than in other circumstances (see Desjardin v. France, no. 22567/03, § 48, 22 November 2007, and Brasilier v. France, no. 71343/01, § 42, 11 April 2006). One of the principal characteristics of democracy is indeed the possibility it offers of resolving problems through public debate (see Dareskizb Ltd v. Armenia, no. 61737/08, § 77, 21 September 2021). Generally speaking, during an election campaign, discussion of the candidates and their programmes contributes to the public’s right to receive information and strengthens voters’ ability to make informed choices between candidates (see Orlovskaya Iskra v. Russia, no. 42911/08, § 130, 21 February 2017).
153. Furthermore, while political parties should enjoy broad freedom of expression in the context of an election, in order to try to convince their electorate, in the case of racist or xenophobic discourse such a context contributes to stirring up hatred and intolerance, as the positions of the candidates will inevitably harden and slogans or catchphrases become more prominent than reasoned arguments. The impact of racist and xenophobic discourse then becomes greater and more harmful (see Féret, cited above, § 76).
(iii) Hate speech
154. In its Perinçek judgment (cited above, §§ 204-208), the Court reiterated the applicable principles concerning calls to violence and hate speech, as summed up in its judgment in Erkizia Almandoz v. Spain (no. 5869/17, §§ 40‑41, 22 June 2021):
“40. For the purpose of identifying hate speech there are a certain number of factors to be taken into account and they have been consolidated in, for example, the Perinçek judgment (cited above, §§ 204-207, with the references cited):
(i) The question whether the statements were made against a tense political or social background. The presence of such a background has generally led the Court to accept that some form of interference with such statements was justified.
(ii) The question whether the statements, being correctly interpreted and assessed in their immediate or more general context, may be regarded as a direct or indirect call to violence, or as justifying violence, hatred or intolerance. Where it examines this question, the Court has been particularly sensitive towards sweeping statements attacking or casting in a negative light entire ethnic, religious or other groups.
(iii) The Court has also paid attention to the manner in which the statements were made, and their capacity - direct or indirect - to lead to harmful consequences.
41. In the context of the above-mentioned cases, it was the interplay between the various factors rather than any one of them taken in isolation that determined the outcome of the case. The Court’s approach to that type of case can thus be described as highly context-specific (Perinçek, cited above, § 208).”
155. Moreover, as the Court pointed out in its Féret judgment (cited above, § 73 - see also Atamanchuk v. Russia, no. 4493/11, § 52, 11 February 2020), where the circumstances had arisen in a political context and specifically that of an election campaign:
“... incitement to hatred does not necessarily require a call for specific acts of violence or other offences. Attacks on persons committed by insulting, holding up to ridicule or slandering certain parts of the population or specific groups thereof, or by inciting discrimination, as was the case in the present instance, will be sufficient for the authorities to seek to combat such racist speech in response to freedom of expression which has been exercised in an irresponsible manner and is harmful to the dignity, or even the safety, of those parts or groups ... Political speeches that stir up hatred based on religious, ethnic or cultural prejudices represent a threat to social peace and political stability in democratic States ...”
156. The question of statements directed at particular groups on account of their origin or religion is nothing new (see, in particular, Le Pen, no. 18788/09, cited above, and Soulas and Others, cited above, §§ 36 et seq.). Where the remarks in question incite violence against an individual or a public official or a sector of the population, the State authorities enjoy a broader margin of appreciation in assessing the “necessity” of a given interference with the right to freedom of expression (see, among other authorities, Sürek v. Turkey (no. 1) [GC], no. 26682/95, § 61, ECHR 1999‑IV and the references cited therein). In addition, expressions that seek to spread, incite or justify hatred based on intolerance, including religious intolerance, do not enjoy the protection afforded by Article 10 of the Convention (see E.S. v. Austria, no. 38450/12, § 43, 25 October 2018).
157. In its Soulas judgment (cited above, § 42), the Court reiterated one of the lessons of Jersild v. Denmark (23 September 1994, § 30, Series A no. 298), namely that it was of the utmost importance to combat racial discrimination in all its forms and manifestations. Moreover, the Court has consistently held that the varying degrees of problems that States may face in the context of immigration and integration policies require that they be afforded a margin of appreciation that is broad enough to determine the existence and extent of the necessity of such interference (see Le Pen, decisions cited above, and Soulas, cited above, § 38). Hate speech is not always openly presented as such. It may take various forms, not only through patently aggressive and insulting remarks that wilfully undermine the values of tolerance, social peace and non-discrimination (which may give rise to the application of Article 17 of the Convention - see, among many other authorities, Ayoub and Others v. France, nos. 77400/14 and 2 others, 8 October 2020, and the numerous authorities cited therein at §§ 92-101), but also implicit statements which, even if expressed guardedly or in a hypothetical form (see Smajić v. Bosnia and Herzegovina (dec.), no. 48657/16, 16 January 2018), prove just as hateful.
(iv) Internet and social media
(α) General principles
158. The Internet has become one of the principal means by which individuals exercise their right to freedom of expression. It provides essential tools for participation in activities and discussions concerning political issues and issues of general interest (see Vladimir Kharitonov v. Russia, no. 10795/14, § 33, 23 June 2020, and Melike v. Turkey, no. 35786/19, § 44, 15 June 2021).
159. The possibility for user-generated expressive activity on the Internet provides an unprecedented platform for the exercise of freedom of expression (see Delfi AS, cited above, § 110; Times Newspapers Ltd v. the United Kingdom (nos. 1 and 2), nos. 3002/03 and 23676/03, § 27, ECHR 2009; and Ahmet Yıldırım v. Turkey, no. 3111/10, § 48, ECHR 2012). Given the important role played by the Internet in enhancing the public’s access to news and in generally facilitating the dissemination of information (see Delfi AS, cited above, § 133), the function of bloggers and popular users of social media may be assimilated to that of a “public watchdog” in so far as the protection afforded by Article 10 is concerned (see Magyar Helsinki Bizottság v. Hungary [GC], no. 18030/11, § 168, 8 November 2016).
160. As the Court has previously observed, the Internet has fostered the “emergence of citizen journalism”, as political content ignored by the traditional media is often disseminated via websites to a large number of users, who are then able to view, share and comment upon the information (see Cengiz and Others v. Turkey, nos. 48226/10 and 14027/11, § 52, ECHR 2015 (extracts)). Generally speaking, the use of new technologies, especially in the political field, is now commonplace, whether it be the Internet or a mobile application “put in place by [a political party] for voters to impart their political opinions”, “but also to convey a political message”; in other words, a mobile application may become a tool “allowing [voters] to exercise their right to freedom of expression” (see Magyar Kétfarkú Kutya Párt, cited above, §§ 88-89).
162. Defamatory and other types of clearly unlawful speech, including hate speech and speech inciting violence, can be disseminated as never before, worldwide, in a matter of seconds, and sometimes remain available online for lengthy periods (see Savva Terentyev v. Russia, no. 10692/09, § 79, 28 August 2018, and Savcı Çengel v. Turkey (dec.), no. 30697/19, § 35, 18 May 2021). Bearing in mind the need to protect the values underlying the Convention, and considering that the rights under Articles 10 and 8 of the Convention deserve equal respect, a balance must be struck that retains the essence of both rights. While the Court acknowledges that important benefits can be derived from the Internet in the exercise of freedom of expression, it has also found that the possibility of imposing liability for defamatory or other types of unlawful speech must, in principle, be retained, constituting an effective remedy for violations of personality rights (see Delfi AS, cited above, § 110).
(β) Liability for third-party comments on the Internet
163. It was in the Delfi AS judgment (cited above, § 111) that the Court was called upon for the first time to examine a complaint in the evolving field of technological innovation that is the Internet. That case concerned the liability - exclusively civil - of a company which owned a major online news portal, on account of unlawful comments posted by third parties following the publication of an article on that portal. In order to resolve the question whether the domestic courts’ decisions holding the applicant company liable for the comments by third parties were in breach of its freedom of expression, the Court relied on the following aspects: first, the context of the comments; second, the measures applied by the applicant company in order to prevent or remove defamatory comments; third, the liability of the actual authors of the comments as an alternative to the applicant company’s liability; and fourth, the consequences of the domestic proceedings for the applicant company (ibid., §§ 142-143; see also, for an application of these criteria in a different context, Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary, no. 22947/13, §§ 69-70, 2 February 2016).
164. In view of the particular nature of the Internet, the “duties and responsibilities” that are to be conferred on a news portal for the purposes of Article 10 may differ to some degree from those of a traditional publisher as regards third-party content (see Delfi AS, cited above, § 113; see also Orlovskaya Iskra, cited above, § 109).
165. Based on the above criteria, the Court found that the domestic award of damages against the Internet news portal for insulting comments posted on its website by anonymous third parties had been justified, under Article 10 of the Convention, taking into account in particular the extreme nature of the comments in question, amounting as they did to hate speech and speech inciting violence (see Delfi AS, cited above, § 162).
166. In the case of a comment posted on an association’s blog it is also important to examine the size of the entity and whether or not it is engaged in a profit-making activity in order to assess the likelihood that it would attract a large number of comments or would be widely read (see Pihl v. Sweden (dec.), no. 74742/14, § 31, 7 February 2017; contrast Delfi AS, cited above, §§ 115-16). In striking a fair balance between an individual’s right to respect for his or her private life under Article 8 and the right to freedom of expression under Article 10, the nature of the comment will have to be taken into consideration, in order to ascertain whether it amounted to hate speech or incitement to violence, together with the steps that were taken after a request for its removal by the person targeted in the impugned remarks (see Pihl, cited above, § 37, and Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt, cited above, §§ 76 and 80-83).
(b) Application of those principles to the present case
167. The Grand Chamber observes that the Chamber, in its judgment, set out as follows the approach upon which it proposed to base its reasoning:
“79. The Court observes that the domestic courts found the applicant guilty of the offence of inciting hatred or violence against a group in general, and the individual L.T. in particular, on account of their origin or the fact of belonging, or not belonging, to a given ethnicity, nation, race or religion ...
80. In the light of the reasoning of the domestic courts, the Court must, in accordance with its settled case-law, examine whether their finding of liability on the part of the applicant was based on relevant and sufficient grounds in the particular circumstances of the case (see, in relation to a major Internet news portal, Delfi AS, cited above, § 142). In doing so, and in assessing the proportionality of the impugned penalty, it will consider the context of the comments, the steps taken by the applicant to remove the comments once posted, the possibility of holding the authors liable instead of the applicant and, lastly, the consequences of the domestic proceedings for the applicant (see, inter alia, Delfi AS, cited above, §§ 142-43, and Jezior v. Poland [Committee], no. 31955/11, § 53, 4 June 2020).”
168. The Grand Chamber sees no reason to depart from that approach and will also follow it for the purposes of its examination of the present case.
(i) Context of the comments at issue
(α) Nature of impugned comments
169. While referring back to its survey of the case-law on this question (see paragraphs 154-157 above), the Court would first note that there is no universal definition of “hate speech” (see, concerning the work of the Committee of Ministers of the Council of Europe, paragraphs 60 et seq. above).
170. The Court would then point out that the present case concerns the posting, by two different authors, of a number of disputed comments. The first comment was posted by S.B., who referred to “Leilla” (sic) and “Franck” (see paragraph 15 above). The Court notes that Leila T., F.P.’s partner, considered that it had targeted her personally. The other three comments were posted by a single author, L.R.
171. The Court finds it necessary to examine the content of the remarks in question, particularly in the light of the reasoning given by the domestic courts.
172. In this connection it would first observe that the Criminal Court, in its judgment of 28 February 2013, began by noting that the remarks had “perfectly” defined a specific group of persons, namely Muslims, using phrases such as “the UMP and the PS are allies of the muslims” or “drug trafficking run by the muslims”, but also in conjunction with words such as “kebab”, “mosque”, “sharia”, “shisha bars” and “hallal economic devellopment” (see paragraphs 15, 16 and 26 above). The Court shares this view, adding that the words “veiled women”, from a comment by L.R., also clearly denoted Muslims (see paragraph 16 above).
173. The Court further observes that the group constituted by Muslims is also associated, unequivocally in view of the way the comments are formulated, with objectively insulting and hurtful language. This is true of the references, after the evocation of the transformation of “Nimes into Algiers”, to “kebab shops”, to the “mosque”, or to “dealers and prostitutes [who] reign supreme”, and the same can be seen in other passages, namely “more drug dealing”, “riffraff sell drugs all day long”, “stones get thrown at cars belonging to ‘white people’” (see paragraphs 15 and 16 above). In the Court’s view, the association is even more obvious where mention is made of “drug trafficking run by the muslims” (emphasis added; see paragraph 16 above); a most revealing choice of words, it accentuates the intended assimilation between a group - taken as a whole on account of its religion - and criminality.
174. As regards the comments by L.R., the applicant argued before the Court that they had not exceeded the permissible limits of freedom of expression in the field of political speech, adding that the impugned remarks had reflected his party’s political manifesto, which had never been banned (see paragraphs 89 and 95 above).
175. The Court acknowledges that those comments were made in a very specific context, since they were posted by a citizen who, in the run-up to an election and on the Facebook “wall” of the candidate whose ideas he supported and for whom he was actually working as campaign assistant (see paragraph 21 above), was complaining about the local situation in terms from which the applicant did not distance himself (see paragraphs 23 and 95 above). Moreover, the Court accepts that the comments reflected a wish to complain of certain local difficulties, or even a degree of social distress that might call for a political response, in particular on account of criminal acts allegedly committed against a section of the population. Nor does it contest the fact that regard must be had to the specificities of the style of communication on certain online portals, where comments are commonly expressed, as in the present case, in conversational language or indeed in a colloquial or vulgar register (see Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt, cited above, § 77).
176. It must nevertheless be said that, in an election context, the impact of racist and xenophobic discourse becomes greater and more harmful, as the Court has already pointed out (see paragraph 153 above). That is particularly true in the present circumstances where the political and social climate was troubled, especially at the local level with “clear tensions within the population, which were evident in particular from the comments at issue, but also between the protagonists”, namely the applicant and F.P., who was his political opponent, as the Chamber pertinently pointed out (see paragraph 91 of its judgment). Indeed, when interpreted and assessed in their immediate context, bearing in mind that the comments were posted on a politician’s Facebook “wall” during an election campaign, they genuinely amounted to hate speech, in view of their content and general tone, together with the virulence and vulgarity of some of the language used. The reach of such remarks and comments was, moreover, not limited to the party’s members and supporters; it can be seen from the reaction of Leila T. that, on the contrary, they spread beyond a strictly partisan readership.
177. Having regard to the foregoing, the Court is of the view that the impugned comments posted by S.B. and L.R. on the applicant’s Facebook “wall” were clearly unlawful.
178. Lastly, the consideration that the comments were in line with his party’s manifesto, as the applicant claimed, is immaterial. The Court reiterates that while political parties have the right to defend their opinions in public, even if some may offend, shock or disturb part of the population, particularly when proposing solutions to problems linked to immigration, in doing so they must avoid advocating racial discrimination or resorting to vexatious or humiliating remarks or attitudes, as such conduct might trigger reactions among members of the public that would be detrimental to a peaceful social climate and might undermine confidence in the democratic institutions (see Féret, cited above, § 77).
(β) The political context and the applicant’s specific liability in respect of comments posted by third parties
179. In the Delfi AS judgment (cited above), when circumscribing its examination in order to define the scope of its assessment, the Court observed that the case concerned a “large professionally managed Internet news portal run on a commercial basis” (ibid., § 115). However, it excluded from its examination “other fora on the Internet where third-party comments can be disseminated”, in particular “a social media platform where the platform provider does not offer any content and where the content provider may be a private person running the website or blog as a hobby” (ibid., § 116).
180. In the present case the Court notes that the applicant’s Facebook “wall” is not comparable to a “large professionally managed Internet news portal run on a commercial basis”, contrary to the respondent Government’s argument (see paragraph 106 above). While there can be little doubt that it falls within the category of “other fora on the Internet where third-party comments can be disseminated”, as formulated in Delfi AS (cited above, § 116), the specific features of the present case prompt the Court to approach this question in the light of the “duties and responsibilities”, within the meaning of Article 10 § 2 of the Convention, which are to be attributed to politicians when they decide to use social media for political purposes, particularly to meet electoral goals, by opening publicly accessible fora on the Internet where the reactions and comments of users can be posted. The applicant was not merely a private individual and he himself pointed out that he was using this account in his capacity as a local councillor (see paragraph 88 above), for political purposes and in the context of an election (see paragraphs 89, 95 and 96 above). In addition, the Court notes that the applicant is not only a professional in politics, but has also had some expertise in the digital services field. The website of Beaucaire town hall contains a page presenting the applicant as its mayor on which it is expressly stated that in his “professional life” he was responsible for the “FN’s Internet strategy ... for 7 years” (see paragraph 13 above).
181. The Court would first note that the applicant’s initial post on his Facebook “wall” did not contain any offensive language and raises no issue on such grounds (see paragraph 14 above). The domestic authorities reproached him solely for his lack of vigilance and failure to react in respect of certain comments posted by third parties.
182. Moreover, it notes that the attribution of liability for acts committed by third parties may vary depending on the moderation or vetting techniques applied by Internet users who are characterised as “producers” and who merely use social networks or accounts for non-commercial purposes. There is no consensus on this issue among the member States (see paragraph 79 above).
183. The Court is of the view, however, that engaging a person’s liability as “producer”, within the meaning of section 93-3 of Law no. 82-652 of 29 July 1982, does not raise any difficulty as a matter of principle, provided that safeguards exist in the apportionment of such liability, which is to be applied in a context of shared liability between various actors, as is the case, for example, with hosts.
184. As the Internet has become one of the principal means by which individuals exercise their right to freedom of expression (see paragraphs 158 et seq. above), the Court is of the view that interferences with the exercise of that right should be examined particularly carefully, since they are likely to have a chilling effect, which carries a risk of self-censorship. Nevertheless, the identification of such a risk must not obscure the existence of other dangers for the exercise and enjoyment of fundamental rights and freedoms, in particular those emanating from unlawful, defamatory, hateful or violence‑inciting remarks, which can be disseminated as never before (see paragraphs 161 and 162 above). For this reason the possibility for individuals complaining of defamatory or other types of unlawful speech to bring an action to establish liability must, in principle, be maintained, to constitute an effective remedy for the alleged violations (see, mutatis mutandis, Delfi AS, cited above, § 110).
185. The Court observes that, at the relevant time, the holder of a Facebook account used for non-commercial purposes was not fully able to control the administration of comments. In addition to the fact that there was no automatic filtering process available - although it had been possible to remove public access (see paragraphs 82 and 106 above) - the effective monitoring of all comments, especially in the case of a very popular account, would have required the availability of, or recourse to, significant, if not considerable, resources. Nevertheless, to exempt producers from all liability might facilitate or encourage abuse and misuse, including hate speech and calls to violence, but also manipulation, lies and misinformation. In the Court’s view, while professional entities which create social networks and make them available to other users necessarily have certain obligations (see, in particular, paragraph 75 above), there should be a sharing of liability between all the actors involved, allowing if necessary for the degree of liability and the manner of its attribution to be graduated according to the objective situation of each one.
186. The Court further notes that French law is consistent with such a view, providing in the case of the “producer” for a shared liability subject to the conditions of the last paragraph of section 93-3 of Law no. 82-652 of 29 July 1982, while in the case of hosts within the meaning of the Law of 21 June 2004 - Facebook being one example - liability remains limited, as confirmed by the Constitutional Council in its decision no. 2004-496 DC of 10 June 2004 (see paragraph 45 above).
187. Moreover, the domestic courts in the present case referred to the applicant’s status as a politician and inferred from this that a special obligation was incumbent upon him (see paragraphs 26 and 28 above). It is certainly true that, in general, a politician has duties and responsibilities (see the case-law summarised in paragraphs 150-151 and 153 above), in addition to the fact that a degree of notoriety and representativeness necessarily lend a certain resonance and authority to the person’s words or deeds. Owing to a politician’s particular status and position in society, he or she is more likely to influence voters, or even to incite them, directly or indirectly, to adopt positions and conduct that may prove unlawful, thus explaining why he or she can be expected to be “all the more vigilant”, to use the words of the Nîmes Court of Appeal (see paragraph 26 above).
188. The Court would, however, emphasise that this finding is not to be understood as entailing an inversion of the principles established in its case‑law hitherto (see paragraphs 146-147 above). Thus, while specific duties may be required of the applicant on account of his status as a politician, such a requirement is indissociable from the principles relating to the rights which come with such status, and the Nîmes Court of Appeal could usefully have referred to those principles in order to strengthen its reasoning. It is only when those principles have been properly taken into account that it becomes possible for the domestic courts, where the facts submitted to them so justify and provided their decision contains the relevant reasoning, to base their decision on the ground that freedom of political expression is not absolute and that a Contracting State may render it subject to certain “restrictions” or “penalties” (see paragraphs 148 et seq. above).
189. The fact remains that the applicant was using his Facebook account in his capacity as a local councillor and for political purposes, during an election campaign to which the impugned comments were directly related (see paragraph 180 above). Referring back to its case-law in such matters, the Court would reiterate that national authorities are better placed than itself to understand and appreciate the specific societal problems faced in particular communities and contexts, or the likely impact of certain acts that they are called upon to adjudicate (see, mutatis mutandis, Wingrove v. the United Kingdom, 25 November 1996, § 63, Reports of Judgments and Decisions 1996-V, and Maguire v. the United Kingdom (dec.), no. 58060/13, § 54, 3 March 2015). In the circumstances of the present case, while referring to its earlier finding that the content of the comments published on the applicant’s “wall” was clearly unlawful (see paragraphs 169‑177 above), the Court considers that the Criminal Court and Nîmes Court of Appeal were best placed to assess the facts in the light of the difficult local context and their acknowledged political dimension (see paragraph 176 above). The Court therefore fully endorses the Chamber’s finding that the language used in the comments at issue clearly incited hatred and violence against a person on account of his or her religion, and this cannot be disguised or minimised by the election context or by a wish to discuss local difficulties (see paragraph 88 of the Chamber judgment).
(ii) Steps taken by the applicant
190. The Court first observes that there can be little doubt that a minimum degree of subsequent moderation or automatic filtering would be desirable in order to identify clearly unlawful comments as quickly as possible and to ensure their deletion within a reasonable time, even where there has been no notification by an injured party, whether this is done by the host itself (in this case Facebook), acting as a professional entity which creates and provides a social network for its users, or by the account holder, who uses the platform to post his or her own articles or views while allowing other users to add their comments. Referring to the principles which emerge from its case-law (see paragraphs 158 et seq. above), the Court would emphasise the fact that an account holder cannot claim any right to impunity in his or her use of electronic resources made available on the Internet and that such a person has a duty to act within the confines of conduct that can reasonably be expected of him or her (see also paragraph 185 above).
191. In the present case, no regulation required the automatic filtering of comments, and there was no practical possibility of prior content moderation on Facebook (see paragraphs 82 and 106 above). Accordingly, the question arises as to what steps the applicant ought to have, or could have, reasonably taken in his capacity as “producer” within the meaning of section 93-3 of Law no. 82-652 of 29 July 1982.
192. In this connection the Court would first point out that in his initial post the applicant had not conveyed a message potentially constituting or encouraging hate speech or a call to violence (see paragraphs 14 and 181 above).
193. It further notes that the applicant had been free to decide whether or not to make access to the “wall” of his Facebook account public. The domestic courts thus took into account the fact that he had deliberately chosen to make it publicly accessible; the Nîmes Court of Appeal inferred that he had “therefore authorised his friends to post comments on it” (see paragraphs 28 and 26 above). The Court, while agreeing with this observation, takes the view that he cannot be reproached for this decision in itself, as it was a technical means made available to him by the platform which enabled him to communicate with voters in his capacity as a politician and as a candidate standing for election. Nevertheless, in view of the local and election-related tensions at the time (see, in particular, paragraph 176 above), that option was clearly not without potentially serious consequences, as the applicant must have been aware in the circumstances. The Court thus finds it legitimate to make a distinction, as the domestic courts did, between limiting access to the Facebook “wall” to certain individuals and making it accessible to the general public. In the latter case, everyone, and especially a politician experienced in public communication, must be aware of the greater risk that excessive and immoderate remarks might appear and necessarily become visible to a wider audience. This is without doubt a major factual element, directly linked to the deliberate choice of the applicant, who was - as the Court has already emphasised - not only a politician campaigning in the run-up to an election but also a professional in matters of online communication strategy (see paragraph 13 above).
194. In addition, the Court would point out that the use of Facebook was subject to the acceptance of certain terms and conditions laid down by the social network, in particular those in the “Statement of rights and responsibilities”, of which the applicant must have been aware (see paragraph 81 above). It further notes that while each Facebook user must individually ensure compliance with the operating rules, the applicant nevertheless saw fit to draw the attention of his “friends” to the need to ensure that their remarks remained lawful, as he posted a message asking them to “be careful with the content of [their] comments” (see paragraph 19 above), thus apparently showing that he was at least aware of the issues raised by some of the comments on his “wall”. The Grand Chamber in fact agrees with the Chamber’s finding that the applicant had posted this warning message without deleting the impugned comments and, above all, without having taken the trouble to check, or to have checked, the content of comments that were then publicly accessible (see paragraph 97 of the Chamber judgment). The lack of such minimal verification appears all the more difficult to explain as, the very next day, the applicant had been informed by S.B. of his confrontation with Leila T. (see paragraph 22 above) and had thus definitely been made aware of the problems that might be caused by the other comments.
195. Turning more specifically to the impugned comments, the Court agrees with the Chamber’s analysis concerning the comment posted by S.B., namely its finding that this comment had been “promptly withdrawn by its author, less than twenty-four hours after being posted [and that], [a]ccordingly, assuming that the applicant had indeed had the time and opportunity to see this comment before its deletion, ... to require him to have acted even more promptly, bearing in mind that the domestic authorities [had been] unable to show the existence of such an obligation in the light of the particular circumstances of the case, would amount to requiring excessive and impracticable responsiveness”.
196. S.B.’s comment, however, is only one of the elements to be taken into consideration in the present case in an examination of all the acts held against the applicant by the domestic authorities. The applicant was in fact prosecuted, and ultimately convicted, not on account of the remarks made by S.B. or L.R., but for failing to proceed with the prompt deletion of all the unlawful comments posted by those authors on his Facebook “wall”. Moreover, those comments did not merely follow on from one another chronologically. Far from being just a “system of interactive monologues”, as suggested by the applicant (see paragraph 96 above), they were responding to and complementing each other following the applicant’s initial post, as shown in particular by the systematic references to F.P., the applicant’s political opponent, in the messages posted both by S.B. and by L.R. For the Court this was not, therefore, simply a discussion thread but clearly a form of ongoing dialogue representing a coherent whole, and it was reasonable for the domestic authorities to view it as such.
197. It may also be inferred from the above, in the Court’s view, that the deletion of S.B.’s remarks by their author within twenty-four hours after they were posted does not suffice to negate the applicant’s liability in respect of Leila T., who joined the criminal proceedings as a civil party. The Court notes in this connection that, in its judgment of 18 October 2013, the Nîmes Court of Appeal upheld the judgment of the Criminal Court as to its civil provisions in favour of Leila T., and in addition to the award of EUR 1,000 at first instance for non-pecuniary damage, it awarded the same amount for the costs she had incurred in the appeal proceedings. Whilst it is true that S.B. promptly deleted his own comment, the only one referring directly to Leila T., that deletion took place only after further comments had been posted by L.R. which, echoing the remarks of S.B., contributed to and thus pursued the same discourse. The applicant’s initial post not only started a dialogue, as the Court has already noted, but also had repercussions which went beyond that post on account of the very nature of social networks on the Internet (see paragraphs 161 et seq. above). Therefore, this form of ongoing dialogue, forming a coherent whole (see paragraph 196 above), was such as to justify the fact that the applicant was ordered to pay certain sums to Leila T., as civil party, even though S.B.’s comment, in response to his initial post, had been deleted. Accordingly, having regard to the foregoing, the Court finds that the Nîmes Court of Appeal was entitled to conclude, by reasoning that was neither arbitrary nor manifestly unreasonable, that the deletion of S.B.’s message was therefore no longer capable of reversing the consequences for the civil party Leila T. It must be emphasised that the applicant’s liability, both criminal and civil, was not engaged on account of any specific comment taken in isolation.
198. The Court reiterates, on this point, that its task, in exercising its supervisory jurisdiction, is not to take the place of the competent national authorities, which moreover enjoy a margin of appreciation, to which the preamble to the Convention now refers expressly, following the entry into force of Protocol No. 15 on 1 August 2021, but rather to review the compatibility with Article 10 of the decisions they have delivered pursuant to their power of appreciation, and that involves assessing the impugned interference in the light of all the circumstances of the case.
199. The Court further observes that the domestic courts gave reasoned decisions and proceeded with a reasonable assessment of the facts, specifically examining the question whether the applicant had been aware of the unlawful comments posted on his Facebook “wall”. While the Criminal Court’s judgment merely noted that the applicant had allowed his “friends” to access his “wall” and that he had not removed the impugned comments, which were “still visible as of 6 December 2011” (see paragraph 28 above), without seeking to ascertain whether he had actually known about them at that time - a question that nevertheless went to the heart of the matter - the Court of Appeal’s judgment provided further factual clarification (see paragraph 26 above), namely: the fact that, during the investigation, the applicant had stated that he would consult his account every day; his failure to delete S.B.’s comment; the fact that S.B. had informed him that he had been confronted by Leila T. after posting his comment; and lastly, the fact that the applicant had justified his position by asserting his view that the impugned comments were compatible with freedom of expression.
200. As regards, more specifically, the consideration that the applicant consulted his account on a daily basis, it is true that the applicant had also told the investigators that the comments posted on his “wall” were too numerous for him to be able to read regularly, given the number of “friends” – more than 1,800 - who could post comments twenty-four hours a day (see paragraph 23 above). The domestic courts did not see fit to give reasons for their decision on this point, even though it was a key question for the purposes of assessing the credibility of the applicant’s statements in terms of the number of comments actually posted on his Facebook “wall” in response to his initial post, in order to ascertain whether or not he could have been reasonably expected to review the content of the comments and if necessary delete them. The Court notes, however, that during the hearing before it, the respondent Government clarified, without being contradicted by the applicant, that about fifteen comments had appeared in response to his post of 24 October 2011 (see paragraphs 14 and 15 above). Accordingly, the question of the difficulties caused by the potentially excessive traffic on a politician’s account and the resources required to ensure its effective monitoring, of which the Slovak Government provided an illustration in their observations (see paragraph 113 above), clearly does not arise in the present case.
201. The Court finds, moreover, that a degree of notoriety and representativeness necessarily lends a certain resonance and authority to the words, deeds or omissions of the person in question. Accordingly, it is appropriate to proceed with a proportionality analysis based on the degree of liability that may be attributed to such a person: a private individual of limited notoriety and representativeness will have fewer duties than a local politician and candidate standing for election to local office, who will in turn bear a lesser burden than a national figure, for whom the requirements will necessarily be heavier still, on account of the weight and scope accorded to his or her words and the greater resources to which he or she will have access in order to intervene effectively on social media platforms (see, mutatis mutandis, Mesić v. Croatia, no. 19362/18, § 104, 5 May 2022, and Melike, cited above, § 51).
(iii) The possibility of holding the authors liable instead of the applicant
202. The Court would first refer to its findings on the lawfulness of the interference (see paragraphs 129-139 above), from which it can clearly be seen that the acts of which the applicant stood accused were both distinct from those committed by the authors of the unlawful comments and governed by a different regime of liability, one that was related to the specific and autonomous status of “producer” within the meaning of section 93-3 of Law no. 82-652 of 29 July 1982, which carried certain requirements. In particular, it would point out that the applicant has failed to show that the interpretation of that provision and its application by the domestic courts were in any way arbitrary or manifestly unreasonable (see paragraph 139 above).
203. Secondly, the Grand Chamber endorses the Chamber’s finding that the applicant was therefore not prosecuted in place of S.B. and L.R., who were themselves also convicted and sentenced (see paragraph 100 of the Chamber judgment). Consequently, any questions relating to anonymity on the Internet and the identification of authors, as examined by the Court in the case of Delfi AS (cited above, §§ 147-51), do not arise in the present case.
204. Lastly, the Court notes that, but for some very rare exceptions (see paragraphs 55 and 57-59 above), international law materials do not address the question whether authors should be prosecuted rather than intermediaries, in particular where the latter are not professional entities in the digital services field engaged in an Internet-based activity for commercial gain, but individuals such as the present applicant who use social networks or other types of online fora on which third-party comments can be posted.
(iv) Consequences of the domestic proceedings for the applicant
205. The Court would first observe that, even in the case of civil-law measures, the attribution of liability for third-party comments may have negative consequences for the comment area of an online portal and may have a chilling effect on freedom of expression on the Internet (see Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt, cited above, § 86, and Pihl, cited above, § 35), an effect which may be particularly detrimental for a non-commercial website (see Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt, cited above, § 86). In cases involving criminal liability, which must be adapted and proportionate to the seriousness of the remarks, such repercussions for freedom of expression may thus be regarded as potentially accentuated (see, in particular, the observations of the Czech Government, paragraph 117 above, and Recommendation CM/Rec(2022)16, paragraph 61 above, with its Annex, points 3 and 4, paragraph 62 above).
206. The Court is aware that a criminal conviction is capable, as the applicant and certain third-party interveners have argued, of having chilling effects for the users of Facebook, other social networks or discussion fora (see paragraphs 89, 117, 118 and 120 above). However, while there is a movement in favour of decriminalising defamation (see, inter alia, Recommendation 1814 (2007) of the Parliamentary Assembly of the Council of Europe on this subject), this does not extend to hate speech or calls to violence. In the Annex to Recommendation CM/Rec(2022)16, the Committee of Ministers proposes, on the contrary, to make a distinction according to the seriousness of the hate speech, without excluding recourse to the criminal law (point 3, paragraph 62 above).
207. The Court would further reiterate that the imposition of a prison sentence for an offence in the area of political speech may be compatible with freedom of expression as guaranteed by Article 10 of the Convention, but only in exceptional circumstances, notably in the case of hate speech or incitement to violence (see Otegi Mondragon, cited above, § 59, and Féret, cited above, §§ 34 and 80; see also, concerning the freedom of expression of journalists, Cumpănă and Mazăre v. Romania [GC], no. 33348/96, § 115, ECHR 2004-XI). In addition, even if a fine of a certain amount may appear harsh in relation to the circumstances, this must be assessed in the light of the fact that a prison sentence could, in principle, have been handed down (see Le Pen, decisions cited above, and Soulas and Others, cited above, § 46).
208. In the present case the Court notes that, at the material time, the maximum penalty faced by the applicant was a one-year prison term and a fine of EUR 45,000 (see paragraph 35 above). He was, however, only sentenced to a fine of EUR 4,000 at first instance, reduced to EUR 3,000 by the Court of Appeal, together with the payment of EUR 1,000 to Leila T. in respect of her costs (see paragraph 30 above). Moreover, as the Chamber rightly noted, there were no other consequences for the applicant (see paragraph 103 of the Chamber judgment). The Court notes in particular that it was not argued by the applicant that he had subsequently been forced to change his conduct, or that his conviction had had a chilling effect on the exercise of his freedom of expression or any negative impact on his subsequent political career and his relations with voters. His conviction by the Criminal Court, upheld by the Nîmes Court of Appeal on 18 October 2013, did not, moreover, prevent him from being elected mayor of the town of Beaucaire in 2014 or from continuing to exercise responsibilities for his political party (see paragraph 13 above).
209. In view of the foregoing, on the basis of an assessment in concreto of the specific circumstances of the present case and having regard to the margin of appreciation afforded to the respondent State, the Court finds that the decisions of the domestic courts were based on relevant and sufficient reasons, both as to the liability attributed to the applicant, in his capacity as a politician, for the unlawful comments posted in the run-up to an election on his Facebook “wall” by third parties, who themselves were identified and prosecuted as accomplices, and as to his criminal conviction. The impugned interference can therefore be considered to have been “necessary in a democratic society”.
210. Accordingly, there has been no violation of Article 10 of the Convention.
FOR THESE REASONS, THE COURT
Holds, by thirteen votes to four, that there has been no violation of Article 10 of the Convention.
Done in English and in French, and delivered at a public hearing in the Human Rights Building, Strasbourg, on 15 May 2023, pursuant to Rule 77 §§ 2 and 3 of the Rules of Court.
Marialena Tsirli, Registrar
Georges Ravarani, President
In accordance with Article 45 § 2 of the Convention and Rule 74 § 2 of the Rules of Court, the following separate opinions are annexed to this judgment:
(a) concurring opinion of Judge Kūris;
(b) dissenting opinion of Judge Ravarani;
(c) dissenting opinion of Judge Bošnjak;
(d) joint dissenting opinion of Judges Wojtyczek and Zünd.
G.R.
M.T.
CONCURRING OPINION OF JUDGE KŪRIS
1. I find a lot of merit in the arguments of Judge Bošnjak expressed in his dissenting opinion regarding the dubious foreseeability of the impugned measure and its rather shaky necessity. I had strong hesitations when debating whether or not to vote with the majority in finding that there had been no violation of Article 10 of the Convention. What finally tilted my vote in that direction was the clearly insufficient persuasiveness of some of the applicant’s submissions, in particular as regards his alleged inability to monitor posts by his friends on his Facebook “wall”, especially given how few comments his message received. No less important was the point that, in assessing the measure in question, one should give adequate consideration to the specific circumstances in which the events evolved - their time and place, as well as their politically and socially sensitive context. No doubt the domestic courts which examined the applicant’s case were much better placed for that purpose than any international court examining these matters more than eleven years later. Still, I am not sure that I would be able to support a finding of no violation of Article 10 in other factual circumstances.
2. With hindsight, I believe that the Court should have taken a tougher stance on hate-speech-inciting language also in other cases, e.g., in Perinçek v. Switzerland ([GC], no. 27510/08, ECHR 2015 (extracts)). I surmise that had that case (in which I was among the dissenters) been decided after the present one, its outcome would have been different.
3. Be that as it may, the statutory regulation of the so-called cascading criminal liability is disconcerting, not only when applied to the “producers” of communication (such as the applicant) but also in and of itself, because it creates preconditions for the indiscriminate penalisation of social media account holders for any “lack of diligence”. However, the Strasbourg Court is not a supranational constitutional court and is therefore not called upon to assess that regulation in abstracto.
DISSENTING OPINION OF JUDGE RAVARANI
(Translation)
1. It is with great regret that I have been unable to vote in favour of the finding in the judgment’s operative provision, even though I am in agreement with most of the Court’s reasoning.
2. Thus, in spite of some reluctance, I can agree with the finding as to the legality of the interference with the applicant’s right to freedom of expression and more specifically the foreseeability of his conviction as producer under section 23, first paragraph, and section 24, eighth paragraph, of the amended Law of 29 July 1881, and section 93-3 of Law no. 82-652 of 29 July 1982. My initial reluctance can be explained by the lack of a statutory definition of the concept of “producer”, which is a creation of domestic jurisprudence, albeit one that was well established at the material time - the aspect which ultimately garnered my support for the finding.
3. I also agree with the majority in finding that no violation of Article 10 of the Convention was committed by the domestic courts as regards the comments posted on the applicant’s Facebook “wall” by L.R. Whilst the law requires, in cases where the producer had no knowledge of the content of an unlawful message before it was posted, that it be deleted “promptly” once he or she has become aware of it, the facts of the case show as follows: that the offending comments were published by L.R. on 24 October 2011 and were found in the investigation to still be visible on 6 December 2011; and, in particular, that the applicant had told the investigators on 28 January 2012 that he would be prepared to remove them if so requested by the courts (see paragraph 23 of the judgment). He thus manifestly failed to meet the requirement of promptness.
4. By contrast, the comment posted by S.B. was deleted by its author the day after it was posted, on 25 October 2011. Admittedly it was not the applicant who deleted it, but it would be difficult to reproach him for any omission, as he would have been materially incapable of doing so. Nor can the promptness of that deletion, less than twenty-four hours after the comment was posted, be a matter of dispute. The majority themselves acknowledge that it could scarcely have been any quicker (see paragraph 195 of the judgment).
So how does the judgment manage to justify its inclusion of S.B.’s comment among those still at stake? By means of the following reasoning (see paragraph 197 of the judgment):
“Whilst it is true that S.B. promptly deleted his own comment, the only one referring directly to Leila T., that deletion took place only after further comments had been posted by L.R. which, echoing the remarks of S.B., contributed to and thus pursued the same discourse. The applicant’s initial post not only started a dialogue, as the Court has already noted, but also had repercussions which went beyond that post on account of the very nature of social networks on the Internet ... Therefore, this form of ongoing dialogue, forming a coherent whole ... was such as to justify the fact that the applicant was ordered ...”
5. Criminal law is, however, to be interpreted strictly. French law requires the producer, where he or she has knowledge of such a comment, to delete it promptly. With all due respect to the majority, it seems to me that they are here engaging in intellectual acrobatics and pure speculation in order to punish the applicant for a comment that was posted on his “wall” and promptly deleted. What is the basis for the statement that the messages “were responding to ... each other” and constituted “an ongoing dialogue” (see paragraph 196), as the authors were not in fact replying to each other’s comments? It is similarly astonishing to read the reference to the statement by the Nîmes Court of Appeal about a “failure to delete S.B.’s comment” (see paragraph 199 of the judgment), as this too misrepresents the facts.
Such reasoning amounts, in my view, to an unacceptable extension of the scope of criminal liability by an international court which constantly repeats that it is not a court of fourth instance.
6. It is true that the judgment does emphasise that “S.B.’s comment, however, is only one of the elements to be taken into consideration in the present case in an examination of all the acts held against the applicant by the domestic authorities” and that the “applicant was in fact prosecuted, and ultimately convicted, not on account of the remarks made by S.B. or L.R., but for failing to proceed with the prompt deletion of all the unlawful comments posted by those authors on his Facebook ‘wall’” (see paragraph 196 of the judgment). But this, quite simply, does not seem to be the case. The domestic courts made specific awards against the applicant in respect of the comment posted by S.B. This can be seen from the fact that the applicant was ordered to pay Leila T. the sum of EUR 1,000, jointly with S.B., in compensation for non-pecuniary damage (see paragraph 25 of the judgment), and a further EUR 1,000 for her costs in the appeal proceedings (see paragraph 30 of the judgment). Leila T.’s name had been mentioned only in the impugned comment of S.B.
7. Taking into account the whole body of facts on which the Court was called upon to adjudicate, not simply the comments of L.R., I felt obliged to distance myself from the majority’s finding that the domestic courts had “proceeded with a reasonable assessment of the facts” (see paragraph 199 of the judgment), although I should once again point out that, aside from the “S.B.” aspect of the case, I would have been able to agree with that finding. In view of the conflation in the operative part of the judgment, where the finding of no violation of Article 10 relates to all the facts of the case, without any distinction being made between the applicant’s impugned conduct as regards S.B.’s comment on the one hand, and as regards the comments of L.R. on the other, I was unable to vote in favour of that general finding of no violation in the operative part.
DISSENTING OPINION OF JUDGE BOŠNJAK
1. I respectfully disagree with the majority in their finding that there has been no violation of Article 10 in this case. I take this position with some unease. Notably, I cannot subscribe to several principal arguments advanced by the applicant, in particular that the remarks posted by L.R. and S.B. allegedly amounted to political speech and criticism that should be allowed on social media, particularly during an election campaign, and that the monitoring obligation imposed on a Facebook account holder in respect of messages posted by third persons would be an excessive burden (see paragraphs 88-89). However, I remain unpersuaded by two positions taken by the majority in the present judgment, namely that (a) the applicant’s conviction on the basis of section 93-3 of Law no. 82-652 of 29 July 1982 (hereinafter referred to as section 93-3) was foreseeable, and that (b) the applicant’s conviction for the message posted by S.B. was proportionate.
1. Whether the applicant’s conviction was foreseeable
2. Before addressing this question, I observe that the applicant failed to raise the issue of the foreseeability of his conviction in his appeal on points of law to the Court of Cassation (paragraph 33 of the present judgment). Furthermore, it appears that the same is true in respect of the earlier stages of the domestic proceedings against the applicant. As the respondent Government did not raise a plea of non-exhaustion in respect of this argument, the Grand Chamber implicitly decided not to take this circumstance into account when adjudicating this case. I would argue that the Grand Chamber could very well have done so, for two reasons. Firstly, I consider that it is high time for the Court to examine, of its own motion and even in the absence of any objection by the respondent State in a case, whether the applicant had raised the matter, at least in substance, in those domestic legal remedies that he or she had used, and had thereby afforded the domestic authorities, in particular the apex courts, a sufficient opportunity to address the alleged violation. Secondly, and even more importantly, the failure of an applicant to assert that a provision as applied to his or her detriment in the domestic proceedings had been unforeseeable casts considerable doubt on whether this was indeed the case. More likely, the plea of unforeseeability is an argument that such an applicant thinks might play out well before the Court, while expecting no such beneficial effect from it domestically.
3. The majority in this case missed a good opportunity to take an important step in the Court’s jurisprudence, deciding not to dismiss an argument of this sort by the present applicant. Instead the Grand Chamber examined its merits. I regretfully express my disagreement with the majority’s findings on this point.
4. The applicant was convicted on the basis of section 93-3, which incorporated the so-called cascading criminal liability regime, as provided for in section 42 of the Freedom of the Press Act of 29 July 1881 (hereinafter referred to as the “1881 Act”), into the field of audiovisual communication and subsequently that of communication to the public by electronic means. Pursuant to the first paragraph of section 93-3, the publication director (or in some cases the publication codirector) is to be prosecuted as the principal, where one of the offences criminalised by the 1881 Act is committed by an electronic means of communication, if the impugned content has undergone prior fixing. Under the fifth paragraph of section 93-3, the publication director may not be held criminally liable as principal if he had no actual knowledge of the impugned message or if, upon becoming aware thereof, he acted promptly to ensure its deletion.
5. The second paragraph of section 93-3 stipulates that in the absence of the publication director, the author, and in the absence thereof the producer, shall be prosecuted as the principal. This appears to be the core of the cascading criminal liability, the purpose of which is to ensure that criminal offences committed in the media do not go unpunished.
6. In the present case, the applicant was convicted as producer. Before the Court, he argued that by virtue of the cascading criminal liability regime, the producer could be prosecuted only if it was not possible to prosecute the publication director, or failing that, the authors. He emphasised that while there was no publication director in the present case, the two authors of the impugned comments, namely S.B. and L.R., had indeed been identified, prosecuted and convicted.
7. Therefore, the main legal question in this case is whether it was foreseeable that the applicant could be prosecuted and convicted when both S.B. and L.R. were prosecuted and convicted too. The majority say that it was. I respectfully disagree.
8. In line with the logic of cascading liability, the provision of the second paragraph of section 93-3 appears to subject the prosecution of the producer to the absence of the author. While the third paragraph of the same section expressly allows for the prosecution of both the publication director and the author (as accomplice), no such or similar solution is provided for the concurrence of the author and the producer. The conclusion, a contrario, that it is impossible to prosecute the producer when the author himself or herself is identified and prosecuted is only a step away.
9. In the absence of any legal provision allowing for criminal liability of both the producer and the author, the majority rely upon the interpretation and application of section 93-3 by the apex domestic courts. The domestic case‑law, as considered relevant by the majority, is reproduced in paragraphs 39‑43 of the judgment. I would argue, however, that the domestic jurisprudence as relied upon by the majority does not support their conclusion.
10. First and foremost, there does not appear to exist a single domestic case, apart from that of the applicant, where a court said, be it in its ratio decidendi or by way of obiter dictum, that a producer may be prosecuted and convicted even though the author has been prosecuted too.
11. Secondly, the majority consider that cascading criminal liability does not prevent the courts from applying the principle of the independence or autonomy of criminal prosecutions, which in their view allows for proceedings to be instituted against various actors in the cascade chain, regardless of whether another actor has or has not been prosecuted. In this respect, they put forward a judgment of the Court of Cassation of 16 July 1992 (appeal no. 91-86.156, Bull. crim., no. 273, cited in paragraph 43 of the judgment) which confirmed the possibility of prosecuting both the publication director and the author (as accomplice). Such a stance of the Court of Cassation is hardly surprising, bearing in mind the explicit provision of the third paragraph of section 93-3. This provision, however, does not govern the concurrence of the author and the producer (see paragraph 8 of this separate opinion above), which is the issue material to this case. In this respect, it is telling that neither the Court of Cassation nor any other French court has referred to the principle of the independence or autonomy of prosecutions as being applicable to the role of the producer.
12. Thirdly, the majority refer to a judgment of the Court of Cassation of 16 February 2010 (appeal no. 09-81.064, Bull. crim. no. 31, reproduced in paragraph 39 of the judgment) where it quashed a judgment of acquittal which had been rendered by a Court of Appeal without examining whether the defendant in that case could be prosecuted as producer, even though the author had been identified but had not been held to account by the civil party. In this respect, I wish to underline the following. In that case, unlike the present one, the author had not been prosecuted. Therefore, the legal position taken in that case is clearly inapplicable to the case of our applicant, who invites us to rule on the foreseeability of the legal basis for his conviction in a case where both the authors and the producer (i.e. the applicant) were prosecuted. Furthermore, the Court is not familiar with the reasons why the author in that case, although identified, had not been prosecuted. There may have existed legal or factual obstacles to a (successful) prosecution, calling in turn for the cascading liability regime to be triggered against the producer.
13. Lastly, the majority draw attention to the decision of the Constitutional Council of 16 September 2011 (no. 2011-164 QPC), where it ruled that the benefit of the first and last paragraphs of section 93-3 as applicable to the publication director (namely that he or she cannot be held criminally liable for the content of comments of which he or she had no knowledge before they were posted online) should also extend to the producer (see paragraph 138). However, the Constitutional Council adopted this decision to protect the fundamental rights of producers, by reference to Article 9 of the Declaration of 1789. In no way has the Constitutional Council, by this or any other decision, generally aligned the legal regime applicable to publication directors and producers or opened a gateway to the prosecution of a producer alongside the author of a message.
14. On the basis of the above-mentioned domestic jurisprudence, the majority conclude (in paragraph 139 of the judgment) that the domestic courts’ interpretation of section 93-3 in the applicant’s case was neither arbitrary nor manifestly unreasonable. However, this is not the yardstick against which the domestic judgments should be measured. The standard here is whether the applicant’s conviction alongside the two authors was foreseeable. Review of foreseeability should be all the stricter when, as in the present case, a criminal conviction is at stake (where the requirement of lex certa is a particularly important safeguard) and the Court should not hide behind the fact that the novel character of the issue at the material time was not in itself incompatible with the requirements of accessibility and foreseeability (paragraph 141 of the judgment).
15. To conclude, the case-law referred to by the majority as supporting the prosecution and conviction of both the authors and the producer for a criminal offence on the basis of section 93-3 is clearly not pertinent in the circumstances of the present case, as it addresses other legal issues and could not therefore, as such, have enabled the applicant at the material time, even with the assistance of competent legal advice, to foresee the possibility of his criminal liability alongside that of the two identified authors. As explained in paragraphs 5 and 8 of this dissenting opinion above, the wording of section 93-3 itself does not lend support to the view that prosecution of both the authors and the producer is possible within the framework of the cascading criminal liability concept. Finally, the domestic courts dealing with the applicant’s case failed to develop any arguments to address the issue.
16. For these reasons, I consider that the applicant’s criminal conviction alongside S.B. and L.R. was not foreseeable and was therefore not prescribed by law within the meaning of Article 10 of the Convention.
17. The considerations I have outlined above would in themselves suffice to find a violation of Article 10 of the Convention. However, I wish to signal another reason which would in my view prompt the same finding. It pertains to the fact that the applicant was convicted for his failure to delete the offending comments posted by L.R. and S.B. and thereby put an end to their dissemination.
18. As explained in the first paragraph of this dissenting opinion, I do not take issue with the majority’s finding that, taking into account the domestic authorities’ margin of appreciation, the nature of the comments was such as to require a repressive response and that, in principle, a holder of a public Facebook account may be held liable for comments posted by third parties, provided that certain safeguards (as stipulated by section 93-3 and the ensuing jurisprudence of the French apex courts, in particular the decision of the Constitutional Council mentioned above) are respected. What I disagree with is the finding that the applicant could and should have deleted the comment posted by S.B., taking into account the factual circumstances of the case as established by the domestic courts. In this respect, I refer to the eloquent dissenting opinion of Vice-President Ravarani and express my full agreement with his views in this respect, without seeing the need to repeat them or develop them further in my dissenting opinion.
19. Therefore, while being aware of several sensitive elements of this case, I could not join the majority and consequently voted against their finding that there has been no violation of Article 10 of the Convention.
JOINT DISSENTING OPINION OF JUDGES WOJTYCZEK AND ZÜND
(Translation)
1. With all due respect to the majority we are unable to adhere to the view that Article 10 has not been breached in the present case. In our view, the French law applicable at the material time did not adequately satisfy the foreseeability test. We also have reservations concerning the very regime of individual criminal liability for failure to ensure prompt deletion of remarks made by third parties.
2. The present case concerns the criminal law governing a crucial aspect of freedom of expression. We disagree with the majority as to their identification of the applicable Convention rules and of the relevant case‑law of the Court. The approach adopted in the judgment focusses on the general standards applied under Article 10, without taking account of the fact that the interference complained of by the applicant was one in which the criminal law was brought to bear. In our view, in order to assess the legality of a criminal-law interference with the sphere of freedom of expression, Article 10 must be read in the light of Article 7 and the standards enunciated under that Article in the Court’s case-law. A criminal-law interference with freedom of expression cannot be assessed by the same standards as one that does not involve a criminal sanction. It is worth recalling, in this context, the following standard as formulated by the Court in Del Río Prada v. Spain ([GC], no. 42750/09, § 79, ECHR 2013):
“It follows that offences and the relevant penalties must be clearly defined by law. This requirement is satisfied where the individual can know from the wording of the relevant provision, if need be with the assistance of the courts’ interpretation of it and after taking appropriate legal advice, what acts and omissions will make him criminally liable and what penalty he faces on that account ...”
These are principles which must equally apply when it comes to assessing a criminal-law interference with freedom of expression under Article 10.
As the majority have rightly reiterated in paragraph 125 of the judgment:
“The level of precision required of domestic legislation - which cannot provide for every eventuality - depends to a considerable degree on the content of the law in question, the field it is designed to cover and the number and status of those to whom it is addressed (see NIT S.R.L., cited above, § 160; Satakunnan Markkinapörssi Oy and Satamedia Oy, cited above, § 144; and Delfi AS, cited above, § 122).”
In seeking to consolidate that approach we would add that, in our view, the foreseeability of a statutory provision should be assessed from the standpoint of the average person to whom that provision is addressed. A rule addressed to a professional should therefore be assessed according to the standard of the average professional, while one addressed to the whole population must be assessed by the standard of the ordinary person (the “man in the street”).
Moreover, we see the need to make a distinction between two points: the content of the applicable law and the nature of any decisions taken in order to apply it. The content of the law must be sufficiently clear and thus render its application foreseeable, while individual decisions taken on the basis of that law must not be arbitrary. On that front, the majority appear to confuse the two standards, assimilating a foreseeable law to one which has been interpreted in a non-arbitrary manner (see paragraphs 128, 139, 141 in fine, 197 and 202). Such confusion is difficult to accept. The fact that a decision to apply a law is not arbitrary does not mean that the law which has been applied is necessarily one of sufficient clarity.
In addition, it has been the Court’s view that the national case-law clarifying the criminal law must conform to the standard of an accessible and reasonably foreseeable interpretation and not merely ensure a lack of arbitrariness, as it explained in Del Río Prada (cited above, § 93, emphasis added):
“The lack of an accessible and reasonably foreseeable judicial interpretation can even lead to a finding of a violation of the accused’s Article 7 rights (see, concerning the constituent elements of the offence, Pessino v. France, no. 40403/02, §§ 35‑36, 10 October 2006, and Dragotoniu and Militaru-Pidhorni v. Romania, nos. 77193/01 and 77196/01, §§ 43-44, 24 May 2007; as regards the penalty, see Alimuçaj v. Albania, no. 20134/05, §§ 154-62, 7 February 2012).”
It should be emphasised that Article 7 also enshrines the principle of lex stricta whereby “the criminal law must not be extensively construed to an accused’s detriment, for instance by analogy” (see Vasiliauskas v. Lithuania [GC], no. 35343/05, § 154, ECHR 2015). Thus it does not suffice for the national jurisprudence clarifying or applying the criminal law merely to be accessible and reasonably foreseeable; it must also refrain from extensive interpretation, and in particular it must not apply analogy to the detriment of a defendant.
3. We note that the majority have expressed the following view at paragraph 129:
“Like the Chamber (see paragraph 71 of the Chamber judgment), [the Grand Chamber] reiterates that a criminal conviction under sections 23 and 24 of the Law of 29 July 1881 meets the requirement of foreseeability of the law for the purposes of Article 10 of the Convention (see, among other authorities, Le Pen v. France (dec.), no. 18788/09, 20 April 2010; Soulas and Others v. France, no. 15948/03, § 29, 10 July 2008; Garaudy v. France (dec.), no. 65831/01, 24 June 2003; and Bonnet v. France (dec.), no. 35364/19, § 32, 25 January 2022). It does not see any reason to hold otherwise in the present case.”
In our view this approach is methodologically unsound. What the Court is called upon to assess is not the quality of the various criminal-law provisions taken separately, one by one, but the normative content of a body of criminal‑law provisions taken as a whole. A criminal-law provision that is not problematic in itself may raise questions in a different context, requiring its application in conjunction with other provisions, particularly on the basis of cross-references between provisions. Thus the question of the foreseeability of the law is not perceived in the same terms by the author of remarks as it is by a person who operates a Facebook account or any other website which is open for commenting. Any ambiguity in the law will be more problematic for the account holder than for the author. In our view the applicable French legislation called for re-examination by the Court, with all of the relevant provisions then being taken into account.
4. We would note that the legislation applicable in the present case was addressed to “everyone” and not only to professionals in the political sphere. Yet, in assessing the quality of the law, the majority point out a number of times that the applicant was a professional in politics and in Internet communication (see paragraphs 180, 190 and 193). In our view this aspect, which goes to an assessment of the proportionality of the interference, is not relevant in assessing the quality of the applicable domestic law. It is the viewpoint of the “man in the street” that should have been adopted for that purpose.
5. The analysis of the applicable domestic law prompts a series of questions. The definition of the criminal offence in question is spread between various legislative instruments: the Law of 29 July 1881 on freedom of the press and Law no. 82-652 of 29 July 1982 on audiovisual communication; and while this is not untoward in itself, it does not facilitate an understanding of the law by an average addressee.
The legislation uses the terms “publication director (or codirector)” and “producer”, without indicating the relationship between the two; and this is indeed lacking in clarity.
The notion of “producer” is not defined in the legislation. As the majority point out in paragraph 38 of the judgment, it has been clarified by the Court of Cassation, which has adopted this characterisation for a person who has taken the initiative of creating an electronic communication service for the exchange of opinions on pre-defined topics. We note the extensive approach to this in domestic case-law: the legislation which was conceived for a given field of activity (audiovisual) has been extended to cover a different field (social networks on the Internet) by jurisprudential interpretation. This conveys the impression that the principle of analogy has been accepted for use in a criminal-law matter.
Contrary to the majority’s finding in paragraph 134 of the judgment, the definition of “producer” emerging from the case-law does raise a certain number of questions. Thus, is an individual who starts a Facebook page (with the possibility of commenting) a “person who has taken the initiative of creating an electronic communication service for the exchange of opinions” or rather a person who has taken the initiative of using a service that already existed? Or would it be the company which has set up Facebook itself which corresponds more to the definition given here? Is a Facebook account an electronic communication service? Or does that correspond to the system through which Facebook is operated? Do the authors of comments on Facebook pages discuss pre-defined topics or undefined topics? To find answers to all these questions would require in-depth jurisprudential research.
The case-law also brings into play the principle of autonomy of prosecution (indépendance des poursuites) against a given defendant (see paragraph 39). This jurisprudence would seem, at first sight, to be based on an interpretation contra legem of Law no. 82-652 of 29 July 1982 on audiovisual communication. Moreover, it is not easy to understand how the principle of cascading liability squares with that of autonomy of prosecution (see also, on this point, the dissenting opinion of Judge Bošnjak).
According to yet other judicial decisions, the producer is liable if he or she fails to act promptly to withdraw offending messages upon becoming aware of them. However, it can be seen from the approach taken in the present case by the national courts that it is not necessary to establish with certainty the exact time when an individual became aware of the comments posted on his or her Facebook account. This aspect, combined with the others mentioned above, further accentuates the “fuzziness” of the applicable law.
It is worthy of note, ad abundantiam, that the respondent Government did not respond to the applicant’s main arguments concerning the question of the foreseeability of the applicable law and did not provide any elements capable of dispelling doubts in this respect. In sum, the Government failed to show that the applicable domestic law had been foreseeable.
For our part, we would point out that an addressee of the legal rules in question is obliged to seek clarification concerning his or her status not only in at least two statutes but also in voluminous and very scattered jurisprudence. A reading of the applicable legislation and relevant court decisions prompts the observation that this collection of rules is difficult to comprehend, even for a lawyer. With such accumulated imperfection in the law, it is difficult to assert that the average citizen will be able to know with sufficient certainty which acts and omissions engage his or her criminal liability and which regime is applicable to that liability. In our view, a field as important as social networks calls for legislation that is rather more accessible to those to whom it is addressed.
6. The French legislation also raises questions in terms of the proportionality principle. Judge Mourou-Vikström, in the dissenting opinion that she appended to the Chamber judgment in the present case, identified the main problems in this area, expressing the following view:
“The application of this ‘projected’ or ‘derived’ liability of the Facebook account holder is, to my mind, capable of breaching the right to free expression of commentators and account holders, especially in the case of public figures or politicians who have a large number of ‘friends’ ...
The finding of no violation of Article 10 of the Convention places a very heavy burden on the account holder in terms of monitoring posts, since he or she would otherwise face criminal prosecution. There is a risk that such a fear will cause the account holder to systematically vet and even to censor certain comments posted on his or her ‘wall’. In case of doubt as to the legal consequence of a comment posted by someone else, the account holder will of course be more inclined to delete or report a message by way of precaution. The chilling effect is self-evident, thus entailing a serious threat to freedom of expression.”
We share those concerns. If the holder of a Facebook account had to spend his or her time monitoring comments - numerous as they may be - it would be difficult to use this tool as a forum for political discussion. Freedom of expression would suffer as a result.
We would add that the very principle of a criminal liability based in some way on the deeds of a third party is open to question. A balanced system should at least comprise a mechanism for the giving of prior notice to the holder of an account on Facebook or another social network, allowing a reasonable time-limit for the deletion of unlawful comments, before that account holder can be held personally liable for any failure to delete such comments.
In any event, while the authorities of the respondent State have demonstrated the need to prosecute the authors of the impugned remarks, they have failed to show that it was necessary to prosecute a Facebook account holder in the circumstances of the case.
7. In conclusion, we are of the opinion that the interference with the applicant’s freedom of expression did not have a legal basis which satisfied all the foreseeability criteria. Furthermore, we find that the rules governing liability for remarks by third parties are difficult to reconcile with the principle of proportionality.