From New Internationalist Easier English Wiki
Latest revision as of 18:55, 7 June 2018

Facebook: the new censor

Social media is taking the role of the state says Jillian York.

[Image: http://www.newint.org/sites/default/files/u2/civil-society-internet.jpg]

With great power comes great responsibility, but for Silicon Valley’s big corporations, that isn’t yet true. The companies that host most of our online activity are mostly only a little over ten years old. They have a lot of power and can influence everything from how we get our news to what we wear.

For many users of social media, it feels a bit like a public market square with conversations and buying and selling. We use them like utilities, but they are not neutral. The laws and rules for these companies are very American and we rely on them to do the job of the state for us.

In a US Senate hearing, senators asked Facebook CEO Mark Zuckerberg about his company’s actions and talked about this problem. Senator Ted Cruz asked Zuckerberg if his company thinks it is neutral. Zuckerberg didn’t really answer the question and said he did not know enough about the law.

There are now very many calls for corporations to start or increase regulation of certain types of speech. People in government, the opinion pages of big newspapers, and NGOs agree that Facebook, Google, and others like them should take on the job of government and censor hate speech, regulate ‘fake news’, and fight extremism. And all of this with few or no checks by the public.

In Europe, this is already happening. In 2016, the European Commission signed a ‘code of conduct’ with four big American tech companies – Microsoft, Google, Facebook, and Twitter – to reduce illegal content online. The code says companies should look at reported content within a certain time and delete hateful speech that goes against their own terms of service. The code does not refer to illegal content itself, but pushes companies to follow their own rules. Civil society groups were part of the talks, but resigned from them in 2016, and said there was not enough open discussion and no ideas from the public.

Similarly, the German Netzwerkdurchsetzungsgesetz (NetzDG) law, which started in late 2017, says companies must delete certain content (such as threats of violence and slander) within 24 hours of a complaint, or, in more complicated legal cases, within seven days. Internet activists criticised the law in Germany and abroad.

Europe is worried about the increase of hate speech. But asking American companies to decide what is or isn’t hateful is a strange way of doing it. After all, these are the same companies whose software identifies profane words as worse than hate speech, and often does not take white supremacist terrorism seriously.

These regulations give us the idea of safety and security but they are taking away democracy by giving more power to companies we cannot hold responsible.

For nearly ten years now, tech companies have been stricter and stricter about what we say online. They restrict the expression of sexuality, and of women’s bodies. They restrict our use of all profane words. We’ve allowed corporate executives, and not elected officials, to say what is acceptable speech, and they’ve done a not very good job.

More recently, there are demands for these same companies to say what is ‘news’. In the past, social media platforms such as Facebook have not really regulated the media and have made partnerships with news organizations. But after the Trump election, there have been more and more calls for the company to fight misinformation.

It is a good idea but can we trust tech companies to do it well? The answer is no! Campbell Brown is Facebook’s Head of News Partnerships. At a meeting hosted by the Financial Times and Facebook, she was asked if we can trust Breitbart. Breitbart is the far-right media site associated with members of the Trump government and it publishes dishonest and hateful content. Brown said, ‘To some people, it is. To some people, it is not.’

Brown’s answer tells us a lot. Facebook does not want to do the work that it says it will. This is not a surprise when one of the company’s board members sometimes shares Breitbart articles on the platform.

One Facebook idea is to ask users to decide if we can trust certain media. This is a dangerous suggestion when we see how popular Breitbart is. Another idea is to have a ‘downvote’ button for users to ‘dislike’ media they think we cannot trust.

And we’ve seen what can happen when companies regulate media. They do not worry about the big media but they worry about small and foreign-language media. In December 2017, a small Ukrainian company was banned for a short time from Facebook, and Facebook told them that they had ‘triggered a malicious ad rule’. And in April 2018, the San Diego City Beat found content censored from its account for not following a content rule. And Facebook’s recent changes to its news algorithm are affecting small publishers the most.

All of these actions have been developed outside of democracy. And they see problem speech as something to hide and not to deal with at its roots.

Perhaps, then, it’s not surprising that some of the best solutions to these problems are coming from others. First Draft is a project of the Harvard Kennedy School’s Shorenstein Center. It plans to fight misinformation through research and education, and it will work with partners to cover many techniques used by journalists to verify content. Other projects, including the Dangerous Speech Project, plan to educate about speech that can cause violence or make it worse.

It’s important to know that misinformation, violent speech, harassment, and other forms of expression can do real harm. But it is also important to understand the harm done when we give control to corporations when their interest is profit.

But they already have the power and so it’s important that we push companies to be open and responsible. This means looking openly at the way they organise their companies and the way they take down users’ content, and being honest with users about how they collect and use their data.

As we look for solutions, we must see democracy as more important than profit.

NOW READ THE ORIGINAL: https://newint.org/features/2018/06/01/facebook-is-the-new-censors-office

(This article has been simplified so the words, text structure and quotes may have been changed)