Israel to hold social media companies liable for content - editorial

Communications Minister Yoaz Hendel plans to establish a committee to investigate how social media companies censor and present content and possibly hold them legally accountable.

Minister of Communications Yoaz Hendel speaks during a plenary session at the assembly of the Knesset, the Israeli Parliament in Jerusalem, July 14, 2021. (photo credit: YONATAN SINDEL/FLASH90)

Communications Minister Yoaz Hendel announced last week his intention to establish a committee that will examine ways to rein in social media companies and make them accountable for the content that appears on their platforms.

Hendel, who flew Saturday night to the United States for meetings in Washington DC and New York, is scheduled to meet with top lawmakers, members of the Biden administration and business executives to discuss the issue. As the minister responsible for the development and expansion of Israel’s 5G network, he will also discuss American concerns about China’s encroachment in Israel.

Hendel’s committee will include an assortment of officials and outside experts, including author Micha Goodman, whose latest Hebrew-language book discusses the risks of social media.

Facebook, Twitter and others are carefully watching what this committee does and what conclusions Hendel adopts. According to reports, the committee will consider investigating the way these companies censor and present content and might recommend holding the companies accountable for the content they present. If so, this could make Facebook – for example – potentially susceptible to libel suits for inflammatory content that appears on the platform.

Currently, and unlike newspapers or book publishers, Facebook is not legally liable for the content that appears on its platform.

Facebook CEO Mark Zuckerberg testifies remotely via videoconference in this screengrab made from video during a Senate Judiciary Committee hearing titled “Breaking the News: Censorship, Suppression, and the 2020 Election,” on Facebook and Twitter’s content moderation practices. (credit: US SENATE JUDICIARY COMMITTEE VIA REUTERS/FILE PHOTO)

This question strikes at the identity of these platforms: are they publishers, or merely platforms on which others publish?

The impetus for the establishment of this committee is the recent revelation that Facebook prioritizes hateful and inflammatory content in what it presents to users.

Speaking to the Senate Commerce Subcommittee on Consumer Protection, former Facebook data scientist Frances Haugen called for action to curb the platform.

According to Haugen, Facebook has repeatedly preferred growth and profits over safeguarding its users.

“The result has been more division, more harm, more lies, more threats and more combat. In some cases, this dangerous online talk has led to actual violence that harms and even kills people,” Haugen testified.

The failure of platforms like Facebook to prevent hateful content is not new. For years, Facebook allowed Holocaust deniers to spread their vile lies (that changed in late 2020) and it has also allowed terror groups and supporters to remain active on the site.

Twitter is similar. Iranian Supreme Leader Ali Khamenei has an account with nearly 900,000 followers, which he uses to call for Israel’s destruction – in effect, to urge the mass genocide of the Jewish people.

Does Twitter do anything? No. One official explained last year that Khamenei’s tweets were merely “foreign policy saber-rattling.”

We also know that when they want to, these companies know how to take action. Facebook and Twitter banned former president Donald Trump and took steps to prevent the spread of misinformation about COVID-19 vaccines.

President Joe Biden went so far as to say that social media companies were “killing people” when asked at the White House in July about misinformation and what his message was to social media platforms such as Facebook.

Facebook and Twitter have changed the world by allowing people to connect and interact no matter where they are. That is an incredible feat. At the same time, though, they cannot wash their hands of what appears on their platforms.

Facebook is today the largest publisher of content in the world. It might claim that it is just a facilitator, but it decides what we, the users, see, and it then makes money from that very content. It cannot continue to claim that it bears no responsibility.

And while assigning responsibility is tricky, it can be done. Newspapers and other publishers make these decisions every day and, as a result, are held accountable for what they publish. Facebook, which essentially does the same – exercising discretion and presenting users with what its algorithm determines they should see – is also a publisher and should be treated as one.

After all, most of Facebook’s money comes from advertising, just like traditional publishers.

The time has come for the equation to change and for social media companies – which dominate the way hundreds of millions of people receive their news – to be held accountable for what they publish.