
Facebook content management controversies

Facebook (now Meta Platforms) has been criticized for its management of content across posts, photos, groups, and profiles. This includes, but is not limited to, allowing violent content (including content related to war crimes), failing to limit the spread of fake news and COVID-19 misinformation on its platform, and allowing incitement of violence against multiple groups.

[Image caption: An example of a Facebook post censored due to an unspecified conflict with "Community Standards".]
[Image caption: An error message generated by Facebook for an attempt to share, in a private chat, a link to a website censored under Community Standards. Messages containing certain links will not be delivered to the recipient.]

Intellectual property infringement

Facebook has been criticized for lax enforcement of third-party copyrights for videos uploaded to the service. In 2015, some Facebook pages were accused of plagiarizing videos from YouTube users and re-posting them as their own content using Facebook's video platform, in some cases achieving higher levels of engagement and views than the original YouTube posts. Videos hosted by Facebook are given higher priority and prominence within the platform and its user experience (including direct embedding within the News Feed and pages), placing links to the original external source at a disadvantage.[1][2] In August 2015, Facebook announced a video-matching technology aimed at identifying reposted videos, and also stated its intention to improve its procedures for removing infringing content faster.[3] In April 2016, Facebook implemented a feature known as "Rights Manager", which allows rights holders to manage and restrict the upload of their content onto the service by third parties.[4]

Violent content

In 2013, Facebook was criticized for allowing users to upload and share videos depicting violent content, including clips of people being decapitated. Having previously refused to delete such clips under the guideline that users have the right to depict the "world in which we live", Facebook changed its stance in May, announcing that it would remove reported videos while evaluating its policy.[5] The following October, Facebook stated that it would allow graphic videos on the platform, as long as the intention of the video was to "condemn, not glorify, the acts depicted",[6] further stating that "Sometimes, those experiences and issues involve graphic content that is of public interest or concern, such as human rights abuses, acts of terrorism, and other violence. When people share this type of graphic content, it is often to condemn it. If it is being shared for sadistic pleasure or to celebrate violence, Facebook removes it."[7]

However, Facebook once again received criticism, with the Family Online Safety Institute saying that such videos "crossed a line" and can potentially cause psychological damage among young Facebook users,[6] and then-Prime Minister of the United Kingdom David Cameron calling the decision "irresponsible", citing the same concerns regarding young users.[7] Two days later, Facebook removed a video of a beheading following "worldwide outrage", and while acknowledging its commitment to allowing people to upload gory material for the purpose of condemnation, it also stated that it would be further strengthening its enforcement to prevent glorification.[7] The company's policies were also criticized as part of these developments, with some drawing particular attention to Facebook's permission of graphic content but potential removal of breastfeeding images.[8] In January 2015, Facebook announced that new warnings would be displayed on graphic content, requiring users to explicitly confirm that they wish to see the material.[9][10]

War crimes

Facebook has been criticized for failing to take down violent content depicting war crimes in Libya. A 2019 investigation by the BBC[11] found evidence of alleged war crimes in Libya being widely shared on Facebook and YouTube. The BBC found images and videos on social media of the bodies of fighters and civilians being desecrated by fighters from the self-styled Libyan National Army. The force, led by General Khalifa Haftar, controls a swathe of territory in the east of Libya and is trying to seize the capital, Tripoli. BBC Arabic found almost one hundred images and videos from Libya shared on Facebook and YouTube, in violation of the companies' guidelines.[12] The UK Foreign Office said it took the allegations extremely seriously and was concerned about the impact the recent violence was having on the civilian population.[13]

In 2017, a Facebook video of Libyan National Army (LNA) special forces commander Mahmoud al-Werfalli was uploaded showing him shooting dead three captured fighters. The video was then shared on YouTube over ten thousand times. The International Criminal Court used it as evidence to indict al-Werfalli for the war crime of murder.[14] The BBC found the original video was still on Facebook two years after his indictment, and also discovered videos showing the bodies of civilians being desecrated.[citation needed] These were taken in Ganfouda, a district of Benghazi which was under siege by the LNA between 2016 and 2017. More than 300 people, including dozens of children, died during the siege. A video uncovered by BBC Arabic showed soldiers mocking a pile of corpses of dead civilians and trampling on bodies. Among them was a 77-year-old woman, Alia Hamza. Her son, Ali Hamza, had five family members killed in Ganfouda.

Ali Hamza told BBC Arabic: "I sent links to lawyers to send to the ICC in The Hague against Khalifa Haftar and his military commanders regarding the massacres of civilians". In the video, the LNA soldiers label the civilians as terrorists. Human rights lawyer and war crimes specialist Rodney Dixon QC reviewed the evidence BBC Arabic found. "If groups are using those platforms to propagate their campaigns then those platforms should seriously look at their role because they could then be assisting in that process of further crimes being committed", he said.[citation needed] After the BBC presented its findings to Facebook, the company removed all the videos showing a suspected war crime taking place, but opted not to suspend any of the accounts linked to the images. Erin Saltman, Facebook's policy manager for counterterrorism in Europe, the Middle East and Africa, told BBC Arabic: "Sometimes there are very conflicting narratives of whether or not the victim is a terrorist, or whether it's a civilian over who's committing that act, we cannot be the pure arbiters of truth."[12] However, Facebook's and YouTube's own community guidelines explicitly prohibit content that promotes or depicts acts of violence.[15]

Facebook Live

Facebook Live, introduced in August 2015 for celebrities[16] and gradually rolled out to regular users starting in January 2016,[17][18] lets users broadcast live videos; Facebook intended the feature to be used for presenting public events or private celebrations.[19] However, the feature has been used to record multiple crimes, deaths, and violent incidents, attracting significant media attention.[20][21][22][23][24][25][26][27]

Facebook has received criticism for not removing videos faster,[28] and Facebook Live has been described as a "monster [Facebook] cannot tame"[29] and "a gruesome crime scene for murders".[30] In response, CEO Mark Zuckerberg announced in May 2017 that the company would hire 3,000 people to review content and invest in tools to remove videos faster.[31][32][33]

Pro-anorexia groups

In 2008, Facebook was criticized for hosting groups dedicated to promoting anorexia. The groups promoted dramatic weight loss programs, shared extreme diet tips, and posted pictures of emaciated girls under "Thinspiration" headlines. Members reported having switched to Facebook from Myspace, another social networking service, due to a perceived higher level of safety and intimacy at Facebook.[34] In a statement to BBC News, a Facebook spokesperson stated that "Many Facebook groups relate to controversial topics; this alone is not a reason to disable a group. In cases where content is reported and found to violate the site's terms of use, Facebook will remove it."[35]

Pro-mafia groups' case

In Italy in 2009, the discovery of pro-mafia groups, one of them claiming Bernardo Provenzano's sainthood, caused alarm in the country[36][37][38] and prompted the government to rapidly issue a law that would force Internet service providers to deny access to entire websites that refused to remove illegal content. The amendment was passed by the Italian Senate and now needs to be passed unchanged by the Chamber of Deputies to become effective.[39][40][41][needs update]

Facebook criticized the government's efforts, telling Bloomberg that it "would be like closing an entire railway network just because of offensive graffiti at one station", and that "Facebook would always remove any content promoting violence and already had a takedown procedure in place."[42]

Trolling

On March 31, 2010, The Today Show ran a segment detailing the deaths of three separate adolescent girls and trolls' subsequent reactions to their deaths. Shortly after the suicide of high school student Alexis Pilkington, anonymous posters began trolling for reactions across various message boards, referring to Pilkington as a "suicidal CUSS", and posting graphic images on her Facebook memorial page. The segment also included an exposé of a 2006 accident, in which an eighteen-year-old student out for a drive fatally crashed her father's car into a highway pylon; trolls emailed her grieving family the leaked pictures of her mutilated corpse.[43]

There have been cases where Facebook "trolls" were jailed for their communications on Facebook, particularly on memorial pages. In autumn 2010, Colm Coss of Ardwick, Britain, was sentenced to 26 weeks in jail under section 127 of the UK Communications Act 2003[44] for "malicious communications", for leaving messages deemed obscene and hurtful on Facebook memorial pages.[45][46]

In April 2011, Bradley Paul Hampson was sentenced to three years in jail after pleading guilty to two counts of using a carriage service (the Internet) to cause offense, for posts on Facebook memorial pages, and one count each of distributing and possessing child pornography when he posted images on the memorial pages of the deceased with phalluses superimposed alongside phrases such as "Woot I'm dead".[47][48]

Rape pages

A series of pro-rape and "rape joke" pages on Facebook drew attention from the media and women's groups.[49] Rape Is No Joke (RINJ), a group opposing the pages, argued that removing "pro-rape" pages from Facebook and other social media was not a violation of free speech in the context of Article 19 of the Universal Declaration of Human Rights and the concepts recognized in international human rights law in the International Covenant on Civil and Political Rights.[50] RINJ repeatedly challenged Facebook to remove the rape pages.[51] RINJ then turned to advertisers on Facebook, telling them not to let their advertising appear on Facebook's "rape pages".[52]

Following a campaign that involved the participation of Women, Action and the Media, the Everyday Sexism Project and the activist Soraya Chemaly, who were among 100 advocacy groups, Facebook agreed to update its policy on hate speech. The campaign highlighted content that promoted domestic and sexual violence against women, and used over 57,000 tweets and more than 4,900 emails to create outcomes such as the withdrawal of advertising from Facebook by 15 companies, including Nissan UK, House of Burlesque and Nationwide UK. The social media website initially responded by stating that "While it may be vulgar and offensive, distasteful content on its own does not violate our policies",[53] but then agreed to take action on May 29, 2013, after it had "become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate".[54]

Child abuse images

In June 2015, the UK National Society for the Prevention of Cruelty to Children raised concerns about Facebook's apparent refusal, when asked, to remove controversial video material which allegedly showed a baby in emotional distress.[55]

In March 2017, BBC News reported in an investigation that Facebook only removed 18 of the 100 groups and posts it had reported for containing child exploitation images. The BBC had been granted an interview with Facebook policy director Simon Milner under the condition that they provide evidence of the activity. However, when presented with the images, Facebook canceled the interview, and told the BBC that it had been reported to the National Crime Agency for illegally distributing child exploitation images (the NCA could not confirm whether the BBC was actually being investigated).[56] Milner later stated to the BBC that the investigation had exposed flaws in its image moderation process that have since been addressed, and that all of the reported content was removed from the service.[57]

According to 2020 data from the National Center for Missing & Exploited Children, there were 20 million reported incidents of child sexual abuse material on Facebook. These accounted for 95% of total incidents recorded by the organization, while Google accounted for half a million incidents, Snapchat for 150,000 and Twitter for 65,000.[58]

Objectification of women

In July 2017, GMA News reported that "a number" of secret Facebook groups that had been engaging in illegal activity of sharing "obscene" photos of women had been exposed, with the Philippine National Bureau of Investigation warning group members of the possibility of being liable for violating child pornography and anti-voyeurism laws. Facebook stated that it would remove the groups as violations of its community guidelines.[59] A few days later, GMA News had an interview with one of the female victims targeted by one of the groups, who stated that she received friend requests from strangers and inappropriate messages. After reporting to authorities, the Philippine National Police's anti-cybercrime unit promised to take action in finding the accounts responsible.[60] Senator Risa Hontiveros responded to the incidents with the proposal of a law that would impose "stiff penalties" on such group members, stating that "These people have no right to enjoy our internet freedom only to abuse our women and children. We will not allow them to shame our young women, suppress their right to express themselves through social media and contribute to a culture of misogyny and hate".[61]

Violation of Palestinian human rights

According to a study commissioned by Meta and carried out by Business for Social Responsibility (BSR), Facebook's and Instagram's policies during the 2021 Israeli attacks on the Gaza Strip harmed the fundamental human rights of Palestinians. The social media giant denied Palestinian users their freedom of expression by erroneously removing their content. BSR's report is yet another indictment of the company's inability to police its global public square and to balance freedom of expression against the potential for harm in a tense international context.[62]

Anti-Semitism

Facebook has been suspected of having a double standard when it comes to pages and posts regarding the Arab–Israeli conflict. When it comes to alleged incitement, Facebook has been accused of being unfair, removing only posts and pages that attack Palestinians, while turning a blind eye to similar posts that are violently antisemitic. The NGO Shurat HaDin Israel Law Center conducted an experiment over the incitement issue, which sought to expose what it viewed as double standards regarding anti-Israel sentiment via the simultaneous launch of two Facebook pages: "Stop Palestinians" and "Stop Israel". Following the launch of the two nearly identical pages, the NGO posted hateful content simultaneously on both. Shurat HaDin then reported both faux-incitement pages to Facebook to see which, if either, would be removed. According to the NGO, despite featuring nearly identical content, only one was removed from the platform: the page inciting against Palestinians was closed by Facebook on the same day it was reported, for "containing credible threat of violence" which "violated our [Facebook's] community standards", but not the page inciting against Israelis, which Facebook reportedly deemed "not in violation of Facebook's rules". Shurat HaDin's staged anti-Israel page "Stop Israel" still remains active on Facebook.[63]

ProPublica stated in September 2017 that a website was able to target ads at Facebook users who were interested in "how to burn Jew" and "Jew hater". Facebook removed the categories and said it would try to stop them from appearing to potential advertisers.[64]

In March 2019, Facebook subsidiary Instagram declined to remove an antisemitic image posted by right-wing conspiracy theorist Alex Jones, saying that it did not violate their community standards.[65][better source needed]

Incitement of violence against Israelis

Facebook has been accused of being a public platform that is used to incite violence. In October 2015, 20,000 Israelis claimed that Facebook was ignoring Palestinian incitement on its platform and filed a class-action suit demanding that Facebook remove all posts "containing incitement to murder Jews".[66]

Israeli politicians have complained that Facebook does not comply or assist with requests from the police for tracking and reporting individuals when they share their intent to kill or commit any other act of violence on their Facebook pages. In June 2016, following the murder of Hallel Ariel, 13, by a terrorist who posted on Facebook, Israeli Minister of Public Security Gilad Erdan charged that "Facebook, which has brought a positive revolution to the world, has become a monster ... The dialogue, the incitement, the lies of the young Palestinian generation are happening on the Facebook platform." Erdan accused Facebook of "sabotaging the work of Israeli police" and "refusing to cooperate" when Israeli police turn to the site for assistance, and said that it "sets a very high bar" for removing inciting content.[67]

In July 2016, a civil action for $1 billion in damages was filed in the United States District Court for the Southern District of New York on behalf of the victims and family members of four Israeli-Americans and one US citizen killed by Hamas militants since June 2014.[68][69] The victims and plaintiffs in the case are the families of Yaakov Naftali Fraenkel, a 16-year-old who was kidnapped and murdered by Hamas operatives in 2014; Taylor Force, a 29-year-old American MBA student and US Army veteran killed in a stabbing spree in Jaffa in 2016; Chaya Braun, a three-month-old thrown from her stroller and slammed into the pavement when a Hamas attacker drove his car into a light rail station in Jerusalem in October 2014; 76-year-old Richard Lakin, who was killed in the October 2015 shooting and stabbing attack on a Jerusalem bus; and Menachem Mendel Rivkin, who was seriously wounded in a January 2016 stabbing attack in Jerusalem.[69]

The plaintiffs claimed that Facebook knowingly provided its social media platform and communication services to Hamas in violation of provisions of US anti-terrorism laws, which prohibit US businesses from providing any material support, including services, to designated terrorist groups and their leaders. The government of the United States has designated Hamas as a "Foreign Terrorist Organization" as defined by US law. The suit claims that Hamas "used and relied on Facebook's online social network platform and communications services to facilitate and carry out its terrorist activity, including the terrorist attacks in which Hamas murdered and injured the victims and their families in this case".[68][69] The legal claim was rejected; the court found that Facebook and other social media companies are not considered to be the publishers of material users post when digital tools used by the company match content with what the tool identifies as interested consumers.[70][71]

In August 2016, Israel's security service, the Shin Bet, reported that it had arrested nine Palestinians who had been recruited by the Lebanon-based Hezbollah terrorist organization. Operatives of Hezbollah in Lebanon and the Gaza Strip recruited residents of the West Bank, Gaza and Israel through Facebook and other social media sites. After recruiting cell leaders on Facebook, Hezbollah and the recruits used encrypted communications to avoid detection, and the leaders continued to recruit other members. The terror cells received Hezbollah funding, planned to conduct suicide bombings and ambushes, and had begun preparing explosive devices for attacks, said the security service, which claimed credit for preventing the attacks. The Shin Bet said it also detected multiple attempts by Hezbollah to recruit Israeli Arabs through a Facebook profile.[72][73][74]

On October 16, 2023, singer and internet personality Dalal Abu Amneh was arrested by the Israeli Police for allegedly promoting hate speech and inciting violence on social media following the massacre perpetrated by Hamas on October 7, 2023.[75]

In 2016, legislation was being prepared in Israel that would allow fines of 300,000 shekels against Facebook and other social media platforms, such as Twitter and YouTube, for every post that incites or praises terrorism, could possibly lead to further acts of terrorism, and is not removed within 48 hours.[76]

Countermeasure efforts

In June 2017, Facebook published a blog post offering insights into how it detects and combats terrorist content. The company claimed that the majority of the terrorism-related accounts that are found are discovered by Facebook itself, while it reviews reports of terrorist content "urgently" and, in cases of imminent harm, "promptly inform[s] authorities". It also develops new tools to aid in its efforts, including the use of artificial intelligence to match terrorist images and videos, detection of content shared across related accounts, and technologies to stop repeat offenders. The company stated that it has 150 people dedicated to terrorism countermeasures, and works with governments and industry in an effort to curb terrorist propaganda. Its blog post stated that "We want Facebook to be a hostile place for terrorists."[77][78]

Employee data leak

In June 2017, The Guardian reported that a software bug had exposed the personal details of 1,000 Facebook workers involved in reviewing and removing terrorism content, by displaying their profiles in the "Activity" logs of Facebook groups related to terrorism efforts. In Facebook's Dublin, Ireland headquarters, six individuals were determined to be "high priority" victims of the error, after the company concluded that their profiles were likely viewed by potential terrorists in groups such as ISIS, Hezbollah and the Kurdistan Workers' Party. The bug itself, discovered in November 2016 and fixed two weeks later, was active for one month, and had also been retroactively exposing censored personal accounts from August 2016.

One affected worker had fled Ireland, gone into hiding, and only returned to Ireland after five months due to a lack of money. Suffering from psychological distress, he filed a legal claim against Facebook and CPL Resources, an outsourcing company, seeking compensation. A Facebook spokesperson stated that "Our investigation found that only a small fraction of the names were likely viewed, and we never had evidence of any threat to the people impacted or their families as a result of this matter", and Craig D'Souza, Facebook's head of global investigations, said: "Keep in mind that when the person sees your name on the list, it was in their activity log, which contains a lot of information ... there is a good chance that they associate you with another admin of the group or a hacker".

Facebook offered to install a home-alarm monitoring system, provide transport to and from work, and counseling through its employee assistance program. As a result of the data leak, Facebook is reportedly testing the use of alternative, administrative accounts for workers reviewing content, rather than requiring workers to sign in with their personal profiles.[79][80]

Fake news

Facebook has been criticized for not doing enough to limit the spread of fake news stories on its site, especially after the 2016 United States presidential election, which some claimed Donald Trump would not have won had Facebook not helped spread what they considered to be fake stories biased in his favor.[81] At a conference called Techonomy, Mark Zuckerberg stated, regarding Donald Trump, "There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news". Zuckerberg affirmed the idea that people do not stray from their own ideals and political leanings. He stated, "I don't know what to do about that" and, "When we started, the north star for us was: We're building a safe community".[82]

Zuckerberg has also been quoted in his own Facebook post: "Of all the content on Facebook, more than 99 percent of what people see is authentic".[83] In addition, the Pew Research Center stated that "62% of Americans obtain some, or all, of their news on social media, the bulk of it from Facebook".[84] A former Facebook editor leaked inflammatory information alleging that the site's trending algorithm promoted certain falsehoods and that bias shaped the news curated within Facebook. Although Facebook initially denied claims of issues with fake news stories and its algorithms, it fired the entire trending team involved with a fake news story about Megyn Kelly being a "closeted liberal".[85]

In 2016, Mark Zuckerberg began to take steps to eliminate the prevalence of fake news on Facebook as a result of criticisms of Facebook's influence on the presidential election.[86] Facebook initially partnered with ABC News, the Associated Press, FactCheck.org, Snopes and PolitiFact for its fact-checking initiative;[87] as of 2018, it had over 40 fact-checking partners across the world, including The Weekly Standard.[88] A May 2017 review by The Guardian found that Facebook's fact-checking initiatives of partnering with third-party fact-checkers and publicly flagging fake news were regularly ineffective, and appeared to be having minimal impact in some cases.[89] In 2018, journalists working as fact-checkers for Facebook criticized the partnership, stating that it had produced minimal results and that the company had ignored their concerns.[88]

Incitement of violence in Sri Lanka

In March 2018, the government of Sri Lanka blocked Facebook and other social media services in an effort to quell the violence in the 2018 anti-Muslim riots, with Harsha de Silva, the Deputy Minister for National Policies and Economic Affairs, tweeting: "Hate speech on Facebook is increasing beyond acceptable levels. Government will have to act immediately to save lives."[90] Sri Lankan telecommunications minister Harin Fernando stated that Facebook had been too slow in removing content and banning users who were using its platforms to facilitate violence during the riots.[91][92] In response, Facebook stated that it had increased the number of Sinhalese speakers it employs to review content.[91]

In April 2019, during the aftermath of the Easter bombings, the Sri Lankan government blocked access to Facebook, Instagram and WhatsApp in an effort to stop the spread of misinformation that could lead to further violence.[93]

Inclusion of Breitbart News as trusted news source

In October 2019, Facebook announced that Breitbart News, an American far-right news and opinion website, would be included as a "trusted source" in its Facebook News feature alongside sources like The New York Times and The Washington Post. The decision sparked controversy due to Breitbart News's status as a platform for the alt-right and its reputation for publishing misinformation.[94][95][96] In October 2021, The Wall Street Journal reported that Facebook executives resisted removing Breitbart News from Facebook's News Tab feature to avoid angering Donald Trump and Republican members of Congress, despite criticism from Facebook employees.[97][98] An August 2019 internal Facebook study had found that Breitbart News was the least trusted news source, and also ranked as low-quality, among the sources it examined across the U.S. and Great Britain.[97]

Uyghur genocide denial

In February 2021, a Press Gazette investigation found that Facebook had accepted promotional content from Chinese state media outlets such as China Daily and China Global Television Network that spread disinformation denying the Uyghur genocide.[99]

Incitement of human rights abuses in Myanmar

The chairman of the U.N. Independent International Fact-Finding Mission on Myanmar stated that Facebook played a "determining role" in the Rohingya genocide.[100] Facebook has been accused of enabling the spread of Islamophobic content which targets the Rohingya people.[101] The United Nations Human Rights Council has called the platform "a useful instrument for those seeking to spread hate".[102]

The internet.org initiative was brought to Myanmar in 2015. Myanmar's relatively recent democratic transition did not provide the country with substantial time to form professional and reliable media outlets free from government intervention. Furthermore, approximately 1% of Myanmar's residents had internet access before internet.org. As a result, Facebook was the primary source of information, and without verifiable professional media options, Facebook became a breeding ground for hate speech and disinformation. "Rumors circulating among family or friends' networks on Facebook were perceived as indistinguishable from verified news by its users."[103] Frequent anti-Rohingya sentiments included claims of high Muslim birthrates, increasing economic influence, and plans to take over the country. Myanmar's Facebook community was also nearly completely unmonitored by Facebook, which at the time had only two Burmese-speaking employees.

In response, Facebook removed accounts owned by the Myanmar Armed Forces, which had previously been used to incite hatred against the Rohingya people[104][105][106] and were "engaging in coordinated inauthentic behavior."[107] In February 2021, Facebook banned the Myanmar military from its platform and set up rules to ban Tatmadaw-linked businesses.[108][109]

The Myanmar military's official accounts were not the only ones found to have incited violence. In a review conducted in 2018, Facebook "banned accounts and pages associated with Myanmar military personnel that were indicated by the UN as being directly responsible for the ethnic cleansing in Rakhine. The banned accounts had a widespread reach in the country, as they were followed by nearly 12 million accounts, which is about half of all Myanmar's Facebook users."[103]

On 6 December 2021, approximately a hundred Rohingya refugees launched a $150 billion lawsuit against Facebook, alleging that it did not do enough to prevent the proliferation of anti-Rohingya hate speech because it was interested in prioritizing engagement.[110] On 10 December 2021, sixteen Rohingya youth living in the Cox's Bazar refugee camp made a complaint against Facebook to the Irish National Contact Point for the OECD Guidelines for Multinational Enterprises, alleging that Facebook had violated the guidelines and owed them a remedy.[111][112] The lead complainants in the case included members of the Rohingya civil society group Arakan Rohingya Society for Peace and Human Rights (ARSPH). Mohibullah, who founded ARSPH and had spearheaded efforts among camp-based Rohingya refugees to hold Facebook accountable, had been murdered just over two months before.[113]

Blue tick

Facebook grants a blue tick to the verified accounts of public figures, brands, and celebrities, including politicians and artists. It has no policy for cases in which the holder of a verified account is convicted in a serious criminal case. In a 2018 case in India, a politician was convicted and sentenced to 10 years in jail in a serious bribery case, yet his Facebook page continued to be verified.[114]

Neo-Nazi and white supremacist content

From circa 2018 until March 27, 2019, Facebook's internal policy was to permit "white nationalist" content but not "white supremacist" content, despite advice that there is no distinction between the two.[115] In practice, it hosted a large amount of white supremacist and neo-Nazi content.[116] On March 27, 2019, Facebook reversed course and stated that white nationalism "cannot be meaningfully separated from white supremacy and organized hate groups".[115]

In 2020, the Centre for Countering Digital Hate (CCDH) found that Facebook was hosting a white supremacist network with more than 80,000 followers and links to the UK far right. The CCDH said: "Facebook's leadership endangered public safety by letting neo-Nazis finance their activities through Facebook and Instagram ... Facebook was first told about this problem two years ago and failed to act."[117]

COVID-19 misinformation

In February 2020, Taiwan's Central News Agency reported that large amounts of misinformation had appeared on Facebook claiming that the pandemic in Taiwan was out of control, that the Taiwanese government had covered up the total number of cases, and that President Tsai Ing-wen had been infected.[118][119] A Taiwanese fact-checking organization suggested that the misinformation originated in mainland China, citing its use of simplified Chinese characters and mainland Chinese vocabulary.[120] The organization warned that the purpose of the misinformation was to attack the government.[121][122][123] The Epoch Times, an anti-Chinese Communist Party (CCP) newspaper affiliated with Falun Gong, has spread misinformation related to the COVID-19 pandemic in print and via social media, including Facebook and YouTube.[124][125]

In April 2020, rumors circulated on Facebook, alleging that the US Government had "just discovered and arrested" Charles Lieber, chair of the Chemistry and Chemical Biology Department at Harvard University for "manufacturing and selling" the novel coronavirus (COVID-19) to China. According to a report from Reuters, posts spreading the rumor were shared in multiple languages over 79,000 times on Facebook.[126][127]

In January 2021, the Bureau of Investigative Journalism found that 430 Facebook pages, followed by a combined 45 million people, were spreading false information about COVID-19 or vaccinations.[128] This was despite a 2020 promise by Facebook that no user or company should directly profit from false information about immunization against COVID-19.[129] A Facebook spokesman said the company had "removed a small number of the pages shared with us for violating our policies".[citation needed] In August 2021, Facebook said that an article raising concerns about potentially fatal effects of a COVID-19 vaccine was the top-performing link in the United States between January and March 2021, and that another site publishing COVID-19 misinformation was among its top 20 most-visited pages.[130]

Marketplace illegal Amazon rainforest sales

In February 2021, BBC investigations revealed that plots of Amazon rainforest on land reserved for indigenous people were being illegally traded on Facebook Marketplace, with sellers admitting that they did not hold title to the land. The BBC reported that Facebook said it was "ready to work with local authorities" but was unwilling to take independent action.[131]

Incitement of ethnic massacres in Ethiopia

In February 2022, Facebook was accused by the Bureau of Investigative Journalism and The Observer of letting activists incite ethnic massacres in the Tigray War by spreading hate and misinformation.[132] Following the report, a lawsuit against Meta was filed in December 2022 in the High Court of Kenya by the son of a Tigrayan academic murdered in November 2021 after receiving racist attacks on the platform.[133]

References

  1. ^ Setalvad, Ariha (August 7, 2015). "Why Facebook's video theft problem can't last". The Verge. Retrieved May 29, 2017.
  2. ^ Oremus, Will (July 8, 2015). "Facebook's Piracy Problem". Slate. The Slate Group. Retrieved May 29, 2017.
  3. ^ Luckerson, Victor (August 28, 2015). "Facebook to Crack Down on Online Video Piracy". Time. Retrieved May 29, 2017.
  4. ^ Constine, Josh (April 12, 2016). "Facebook launches video Rights Manager to combat freebooting". TechCrunch. AOL. Retrieved May 29, 2017.
  5. ^ Kelion, Leo (May 1, 2013). "Facebook U-turn after charities criticise decapitation videos". BBC News. BBC. Retrieved June 3, 2017.
  6. ^ a b Winter, Michael (October 21, 2013). "Facebook again allows violent videos, with caveat". USA Today. Retrieved June 3, 2017.
  7. ^ a b c "Facebook pulls beheading video". The Daily Telegraph. October 23, 2013. Archived from the original on January 12, 2022. Retrieved June 3, 2017.
  8. ^ Harrison, Virginia (October 23, 2013). "Outrage erupts over Facebook's decision on graphic videos". CNNMoney. CNN. Retrieved June 3, 2017.
  9. ^ Gibbs, Samuel (January 13, 2015). "Facebook tackles graphic videos and photos with 'are you sure?' warnings". The Guardian. Retrieved June 3, 2017.
  10. ^ Kelion, Leo (January 13, 2015). "Facebook restricts violent video clips and photos". BBC News. BBC. Retrieved June 3, 2017.
  11. ^ "Libya 'war crimes' videos shared online". BBC News. Retrieved September 23, 2019.
  12. ^ a b "Libyan conflict: Suspected war crimes shared online". BBC Newsnight. Archived from the original on 2021-12-21. Retrieved September 23, 2019.
  13. ^ "BBC: War crimes committed by Haftar's forces shared on Facebook, YouTube". Libyan Express. May 1, 2019. Retrieved October 31, 2020.
  14. ^ "The Prosecutor v. Mahmoud Mustafa Busyf Al-Werfalli" (PDF). International Criminal Court. Retrieved 17 February 2022.
  15. ^ "Community Standards | Facebook". Retrieved September 23, 2019 – via Facebook.
  16. ^ Mangalindan, JP (August 5, 2015). "Facebook launches live streaming, but only for famous people". Mashable. Retrieved June 3, 2017.
  17. ^ Barrett, Brian (January 28, 2016). "Facebook Livestreaming Opens Up to Everyone With an iPhone". Wired. Retrieved June 3, 2017.
  18. ^ Newton, Casey (January 28, 2016). "Facebook rolls out live video streaming to everyone in the United States". The Verge. Retrieved June 3, 2017.
  19. ^ Newton, Casey (December 3, 2015). "Facebook begins testing live video streaming for all users". The Verge. Retrieved June 3, 2017.
  20. ^ Chrisafis, Angelique; Willsher, Kim (June 14, 2016). "French police officer and partner murdered in 'odious terrorist attack'". The Guardian. Retrieved June 3, 2017.
  21. ^ Madden, Justin (June 17, 2016). "Chicago man shot dead while live streaming on Facebook". Reuters. Retrieved June 3, 2017.
  22. ^ Chaykowski, Kathleen (July 7, 2016). "Philando Castile's Death On Facebook Live Highlights Problems For Social Media Apps". Forbes. Retrieved June 3, 2017.
  23. ^ McLaughlin, Eliott C.; Blau, Max; Vercammen, Paul (September 30, 2016). "Police: Man killed by officer pointed vaping device, not gun". CNN. Retrieved June 3, 2017.
  24. ^ Berman, Mark; Hawkins, Derek (January 5, 2017). "Hate crime charges filed after 'reprehensible' video shows attack on mentally ill man in Chicago". The Washington Post. Nash Holdings. Retrieved June 3, 2017.
  25. ^ Steele, Billy (March 22, 2017). "Dozens watched a Facebook Live stream of sexual assault (updated)". Engadget. AOL. Retrieved June 3, 2017.
  26. ^ Gibbs, Samuel (April 25, 2017). "Facebook under pressure after man livestreams killing of his daughter". The Guardian. Retrieved June 3, 2017.
  27. ^ Solon, Olivia (January 27, 2017). "Why a rising number of criminals are using Facebook Live to film their acts". The Guardian. Retrieved June 3, 2017.
  28. ^ Solon, Olivia; Levin, Sam (January 6, 2017). "Facebook refuses to explain why live torture video wasn't removed sooner". The Guardian. Retrieved June 3, 2017.
  29. ^ Krasodomski-Jones, Alex (January 9, 2017). "Facebook has created a monster it cannot tame". CNN. Retrieved June 3, 2017.
  30. ^ Bhattacharya, Ananya (June 18, 2016). "Facebook Live is becoming a gruesome crime scene for murders". Quartz. Retrieved June 3, 2017.
  31. ^ Gibbs, Samuel (May 3, 2017). "Facebook Live: Zuckerberg adds 3,000 moderators in wake of murders". The Guardian. Retrieved June 3, 2017.
  32. ^ Murphy, Mike (May 3, 2017). "Facebook is hiring 3,000 more people to monitor Facebook Live for murders, suicides, and other horrific video". Quartz. Retrieved June 3, 2017.
  33. ^ Ingram, David (May 3, 2017). "Facebook tries to fix violent video problem with 3,000 new workers". Reuters. Retrieved June 3, 2017.
  34. ^ Peng, Tina (November 22, 2008). "Pro-anorexia groups spread to Facebook". Newsweek. Retrieved June 13, 2017.
  35. ^ "Pro-anorexia site clampdown urged". BBC News. BBC. February 24, 2008. Retrieved June 13, 2017.
  36. ^ Masciarelli, Alexis (January 9, 2009). . France 24. Archived from the original on September 6, 2009. Retrieved June 13, 2017.
  37. ^ Donadio, Rachel (January 20, 2009). . The New York Times International Edition. Archived from the original on January 24, 2009. Retrieved June 13, 2017.
  38. ^ Pullella, Philip (January 12, 2009). "Pro-mafia Facebook pages cause alarm in Italy". Reuters. Retrieved June 13, 2017.
  39. ^ Krangel, Eric (February 11, 2009). "Italy Considering National Ban On Facebook, YouTube In Plan To Return To Dark Ages". Business Insider. Axel Springer SE. Retrieved June 13, 2017.
  40. ^ Kington, Tom (February 16, 2009). "Italian bill aims to block mafia Facebook shrines". The Guardian. Retrieved June 13, 2017.
  41. ^ Nicole, Kristen (February 12, 2009). "Mafia Bosses Could Cause Italy's Blocking of Facebook". Adweek. Beringer Capital. Retrieved June 13, 2017.
  42. ^ Oates, John (February 12, 2009). "Facebook hits back at Italian ban". The Register. Situation Publishing. Retrieved June 13, 2017.
  43. ^ "Trolling: The Today Show Explores the Dark Side of the Internet", March 31, 2010. Retrieved April 4, 2010. Archived June 8, 2010, at the Wayback Machine.
  44. ^ s127 of the Communications Act 2003 of Great Britain. Retrieved July 13, 2011.
  45. ^ Murder victim-mocking troll jailed, The Register, November 1, 2010. Retrieved July 13, 2011.
  46. ^ Jade Goody website 'troll' from Manchester jailed, BBC, October 29, 2010. Retrieved July 13, 2011.
  47. ^ Facebook troll Bradley Paul Hampson seeks bail, appeal against jail term, The Courier-Mail, April 20, 2011. Retrieved July 13, 2011.
  48. ^ Facebook urged to ban teens from setting up tribute pages, The Australian, June 5, 2010. Retrieved July 13, 2011.
  49. ^ Sherwell, Philip (October 16, 2011). "Cyber anarchists blamed for unleashing a series of Facebook 'rape pages'". The Daily Telegraph. London. Retrieved May 22, 2012.
  50. ^ "Facebook 'rape page' whitelisted and campaign goes global". Womensviewsonnews.org. Meanwhile, campaigns in other countries have begun, most notably in Canada with the Rape is no joke (RINJ) campaign, which has not only campaigned fiercely but has also put together a YouTube video.
  51. ^ . Albuquerque Express. October 23, 2011. Archived from the original on May 18, 2013. Retrieved May 22, 2012.
  52. ^ "Facebook Refuses to Remove 'Rape Pages' Linked to Australian, British Youth". International Business Times. October 18, 2011. Archived from the original on July 17, 2012. Retrieved May 22, 2012. O'Brien said the campaign is now focusing on Facebook advertisers telling them not to let their advertisements be posted on the "rape pages".
  53. ^ Sara C Nelson (May 28, 2013). "#FBrape: Will Facebook Heed Open Letter Protesting 'Endorsement Of Rape & Domestic Violence'?". The Huffington Post UK. Retrieved May 29, 2013.
  54. ^ Rory Carroll (May 29, 2013). "Facebook gives way to campaign against hate speech on its pages". The Guardian UK. London. Retrieved May 29, 2013.
  55. ^ "Facebook criticised by NSPCC over baby ducking video clip". BBC News. June 5, 2015.
  56. ^ "Facebook failed to remove sexualised images of children". BBC News. March 7, 2017. Retrieved March 9, 2017.
  57. ^ "Facebook, Twitter and Google grilled by MPs over hate speech". BBC News. March 14, 2017. Retrieved March 14, 2017.
  58. ^ Hitt, Tarpley (February 24, 2021). "Facebook a Hotbed of 'Child Sexual Abuse Material' With 20.3 Million Reports, Far More Than Pornhub". The Daily Beast. Retrieved August 23, 2021.
  59. ^ Layug, Margaret Claire (July 3, 2017). "'Pastor Hokage' FB groups trading lewd photos of women exposed". GMA News. Retrieved July 8, 2017.
  60. ^ Layug, Margaret Claire (July 5, 2017). "Victim of 'Pastor' FB reports harassment, indecent proposals". GMA News. Retrieved July 8, 2017.
  61. ^ De Jesus, Julliane Love (July 6, 2017). "Hontiveros wants stiff penalties vs 'Pastor Hokage' FB groups". Philippine Daily Inquirer. Retrieved July 8, 2017.
  62. ^ "FACEBOOK REPORT CONCLUDES COMPANY CENSORSHIP VIOLATED PALESTINIAN HUMAN RIGHTS". The Intercept. Retrieved 21 September 2022.
  63. ^ "When it comes to incitement, is Facebook biased against Israel? – Arab-Israeli Conflict – Jerusalem Post". The Jerusalem Post. Retrieved December 16, 2018.
  64. ^ "Facebook tightens ad policy after 'Jew hater' controversy — J". Jweekly.com. Jewish Telegraphic Agency. September 27, 2016. Retrieved September 29, 2017.
  65. ^ Gagliardo-Silver, Victoria (March 29, 2019). "Instagram refuses to remove Alex Jones' anti-semitic post". The Independent. Retrieved March 30, 2019.
  66. ^ "20,000 Israelis sue Facebook for ignoring Palestinian incitement". The Times of Israel. October 27, 2015. Retrieved July 15, 2016.
  67. ^ "Israel: Facebook's Zuckerberg has blood of slain Israeli teen on his hands". The Times of Israel. July 2, 2016. Retrieved July 15, 2016.
  68. ^ a b Wittes, Benjamin; Bedell, Zoe (July 12, 2016). "Facebook, Hamas, and Why a New Material Support Suit May Have Legs". Lawfare.
  69. ^ a b c Pileggi, Tamar (July 11, 2016). "US terror victims seek $1 billion from Facebook for Hamas posts". The Times of Israel. Retrieved July 15, 2016.
  70. ^ Dolmetsch, Chris (July 31, 2019). "Facebook Isn't Responsible as Terrorist Platform, Court Says". Bloomberg. Retrieved August 7, 2019.
  71. ^ "Facebook Defeats Appeal Claiming It Aided Hamas Attacks". Law360. July 31, 2019. Retrieved August 6, 2019.
  72. ^ "Hezbollah created Palestinian terror cells on Facebook, Israel says after bust". Jewish Telegraphic Agency. August 16, 2016. Retrieved August 17, 2016.
  73. ^ Zitun, Yoav (August 16, 2016). "Shin Bet catches Hezbollah recruitment cell in the West Bank". Ynet News. Retrieved August 17, 2016.
  74. ^ Gross, Judah Ari (August 16, 2016). "Hezbollah terror cells, set up via Facebook in West Bank and Israel, busted by Shin Bet". The Times of Israel. Retrieved August 17, 2016.
  75. ^ "Israel Police arrests Palestinian singer, influencer for hate speech". The Jerusalem Post | JPost.com. 2023-10-17. Retrieved 2023-10-17.
  76. ^ "Knesset approves Facebook bill in preliminary vote". The Times of Israel. July 20, 2016. Retrieved July 24, 2016.
  77. ^ Lecher, Colin (June 15, 2017). "Facebook says it wants 'to be a hostile place for terrorists'". The Verge. Retrieved June 16, 2017.
  78. ^ "Facebook using artificial intelligence to fight terrorism". CBS News. June 15, 2017. Retrieved June 16, 2017.
  79. ^ Solon, Olivia (June 16, 2017). "Revealed: Facebook exposed identities of moderators to suspected terrorists". The Guardian. Retrieved June 18, 2017.
  80. ^ Wong, Joon Ian (June 16, 2017). "The workers who police terrorist content on Facebook were exposed to terrorists by Facebook". Quartz. Retrieved June 18, 2017.
  81. ^ Shahani, Aarti (November 17, 2016). "From Hate Speech To Fake News: The Content Crisis Facing Mark Zuckerberg". NPR.
  82. ^ Shahani, Aarti. Zuckerberg Denies Fake News on Facebook had Impact on the Election. Washington: NPR, 2016. ProQuest.
  83. ^ Kravets, David. Facebook, Google Seek to Gut Fake News Sites' Money Stream. New York: Condé Nast Publications, Inc., 2016. ProQuest. Web. December 5, 2016.
  84. ^ Kravets, David. Facebook, Google Seek to Gut Fake News Sites' Money Stream. New York: Condé Nast Publications, Inc., 2016. ProQuest. Web. December 6, 2016.
  85. ^ Newitz, Annalee. Facebook Fires Human Editors, Algorithm Immediately Posts Fake News. New York: Condé Nast Publications, Inc., 2016. ProQuest. Web. December 6, 2016.
  86. ^ Burke, Samuel (November 19, 2016). "Zuckerberg: Facebook will develop tools to fight fake news". CNN Money. Retrieved November 22, 2016.
  87. ^ Jamieson, Amber; Solon, Olivia (2016-12-15). "Facebook to begin flagging fake news in response to mounting criticism". The Guardian. Retrieved 2022-04-07.
  88. ^ a b Levin, Sam (2018-12-13). "'They don't care': Facebook factchecking in disarray as journalists push to cut ties". The Guardian. San Francisco. Retrieved 2022-04-07.
  89. ^ Levin, Sam (2017-05-16). "Facebook promised to tackle fake news. But the evidence shows it's not working". The Guardian. Retrieved 2022-04-07.
  90. ^ Safi, Michael; Perera, Amantha (2018-03-07). "Sri Lanka blocks social media as deadly violence continues". The Guardian. Retrieved 2022-01-28.
  91. ^ a b Safi, Michael (March 14, 2018). "Sri Lanka accuses Facebook over hate speech after deadly riots". The Guardian. Retrieved January 28, 2022.
  92. ^ Taub, Amanda; Fisher, Max (April 21, 2018). "Where Countries Are Tinderboxes and Facebook Is a Match". The New York Times. Retrieved November 28, 2018.
  93. ^ Ellis-Petersen, Hannah (2019-04-21). "Social media shut down in Sri Lanka in bid to stem misinformation". The Guardian. Retrieved 2022-04-07.
  94. ^ Darcy, Oliver (October 26, 2019). "Facebook News launches with Breitbart as a source". CNN. Archived from the original on October 26, 2019. Retrieved 2021-06-20.
  95. ^ Robertson, Adi (2019-10-25). "Mark Zuckerberg is struggling to explain why Breitbart belongs on Facebook News". The Verge. Archived from the original on October 26, 2019. Retrieved 2021-06-20.
  96. ^ Wong, Julia Carrie (2019-10-25). "Facebook includes Breitbart in new 'high quality' news tab". The Guardian. Archived from the original on October 25, 2019. Retrieved 2021-06-20.
  97. ^ a b Hagey, Keach; Horwitz, Jeff (2021-10-24). "Facebook's Internal Chat Boards Show Politics Often at Center of Decision Making". The Wall Street Journal. ISSN 0099-9660. Retrieved 2021-12-11.
  98. ^ Feinberg, Andrew (2021-10-25). "Facebook protected Breitbart to avoid angering Trump, new leaks reveal". The Independent. Archived from the original on October 25, 2021. Retrieved 2021-12-11.
  99. ^ Turvill, William (February 18, 2021). "Profits from propaganda: Facebook takes China cash to promote Uyghur disinformation". Press Gazette. Retrieved February 27, 2021.
  100. ^ "U.N. investigators cite Facebook role in Myanmar crisis". Reuters. March 12, 2018 – via www.reuters.com.
  101. ^ "In Myanmar, Facebook struggles with a deluge of disinformation". The Economist. ISSN 0013-0613. Retrieved October 27, 2020.
  102. ^ "Report of the independent international fact-finding mission on Myanmar" (PDF).
  103. ^ a b Cosentino, Gabriella Cianciolo. Social Media and the Post-Truth World Order: The Global Dynamics of Disinformation. Palgrave Macmillan, 2020.
  104. ^ Stecklow, Steve. "Why Facebook is losing the war on hate speech in Myanmar". Reuters. Retrieved December 15, 2018.
  105. ^ "Facebook bans Myanmar military accounts for 'enabling human rights abuses'". Social.techcrunch.com. 27 August 2018. Retrieved December 15, 2018.
  106. ^ "Some in Myanmar Fear Fallout From Facebook Removal of Military Pages". Radio Free Asia. Retrieved December 15, 2018.
  107. ^ "Facebook Removes More Pages And Groups Linked to Myanmar Military". Radio Free Asia. Retrieved January 30, 2019.
  108. ^ Scott, Liam (2022-08-04). "Myanmar junta drops propaganda on people from helicopters". Coda Media. Retrieved 2022-08-07.
  109. ^ Frankel, Rafael (2021-02-12). "An Update on the Situation in Myanmar". Meta. Retrieved 2022-08-07.
  110. ^ Gilbert, David (7 December 2021). "'Growth Fueled by Hate': Facebook Sued for $150 Billion Over Myanmar Genocide". Vice News. Archived from the original on 8 December 2021. Retrieved 13 December 2021.
  111. ^ "Rohingya refugees supported by Victim Advocates International vs. Facebook". OECD Watch. Retrieved 2023-03-21.
  112. ^ "Your tech, our tears: Rohingya activists call on Facebook to remedy its role in atrocities". Silicon Republic. 2022-11-23. Retrieved 2023-03-21.
  113. ^ "Myanmar: Facebook's systems promoted violence against Rohingya; Meta owes reparations – new report". Amnesty International. 2022-09-29. Retrieved 2023-03-21.
  114. ^ "'Person of eminence' tag on FB for convict Ajay Chautala". December 17, 2018.
  115. ^ a b Beckett, Lois (March 27, 2019). "Facebook to ban white nationalism and separatism content". The Guardian. Retrieved March 28, 2019.
  116. ^ Dearden, Lizzie (March 24, 2019). "Neo-Nazi groups allowed to stay on Facebook because they 'do not violate community standards'". The Independent. Retrieved March 28, 2019.
  117. ^ "Facebook condemned for hosting neo-Nazi network with UK links". The Guardian. November 22, 2020. Retrieved January 31, 2021.
  118. ^ Lee Y, Blanchard B (3 March 2020). "'Provocative' China pressures Taiwan with fighters, fake news amid virus outbreak". Reuters. Retrieved 5 March 2020. 'We have been told to track if the origins are linked to instructions given by the Communist Party, using all possible means,' the official said, adding that authorities had increased scrutiny on online platforms, including chat rooms.
  119. ^ "Hoax circulates online that an old Indian textbook lists treatments for COVID-19". AFP Fact Check. 9 April 2020.
  120. ^ "Saline solution kills China coronavirus? Experts refute online rumour". AFP Fact Check. 24 January 2020. Archived from the original on 1 April 2020. Retrieved 9 April 2020.
  121. ^ 武漢肺炎疫情謠言多 事實查核中心指3大共同點 [There are many rumors about the Wuhan pneumonia epidemic, the fact-checking center points to 3 common points] (in Chinese (Taiwan)). Central News Agency. 26 February 2020.
  122. ^ "Virus Outbreak: Chinese trolls decried for fake news". Taipei Times. 28 February 2020. Retrieved 12 March 2020.
  123. ^ "Taiwan accuses China of waging cyber 'war' to disrupt virus fight". Reuters. 29 February 2020. Retrieved 12 March 2020.
  124. ^ Manavis, Sarah (22 April 2020). New Statesman.
  125. ^ "Viral video promotes the unsupported hypothesis that SARS-CoV-2 is a bioengineered virus released from a Wuhan research laboratory". Health Feedback. 17 April 2020.
  126. ^ "False headline claim: Harvard Professor arrested for creating and selling the new coronavirus to China". Reuters. 7 April 2020.
  127. ^ "Fact-check: Did US researcher make and sell Covid-19 to China?". Deccan Herald. 11 August 2020.
  128. ^ "Facebook 'still making money from anti-vax sites'". The Guardian. January 30, 2021. Retrieved January 31, 2021.
  129. ^ Wong, Julia Carrie (April 10, 2020). "Tech giants struggle to stem 'infodemic' of false coronavirus claims". The Guardian. ISSN 0261-3077. Retrieved January 31, 2021.
  130. ^ Callery, James; Goddard, Jacqui (August 23, 2021). "Most-clicked link on Facebook spread doubt about Covid vaccine". The Times. ISSN 0140-0460. Retrieved March 21, 2022.
  131. ^ Fellet, Joao; Pamment, Charlotte (February 27, 2021). "Amazon rainforest plots sold via Facebook Marketplace ads". BBC. Retrieved February 27, 2021.
  132. ^ Jackson, Jasper; Kassa, Lucy; Townsend, Mark (20 February 2022). "Facebook 'lets vigilantes in Ethiopia incite ethnic killing'". The Guardian. Retrieved 21 February 2022.
  133. ^ Hern, Alex (2022-12-14). "Meta faces $1.6bn lawsuit over Facebook posts inciting violence in Tigray war". The Guardian. Retrieved 2022-12-16.

facebook, content, management, controversies, this, article, section, need, cleaned, summarized, because, been, split, from, criticism, facebook, facebook, meta, platforms, been, criticized, management, various, content, posts, photos, entire, groups, profiles. This article or section may need to be cleaned up or summarized because it has been split from to Criticism of Facebook Facebook or Meta Platforms has been criticized for its management of various content on posts photos and entire groups and profiles This includes but is not limited to allowing violent content including content related to war crimes and not limiting the spread of fake news and COVID 19 misinformation on their platform as well as allowing incitement of violence against multiple groups An example of a Facebook post censored due to an unspecified conflict with Community Standards Error message generated by Facebook for an attempt to share a link to a website that is censored due to Community Standards in a private chat Messages containing certain links will not be delivered to the recipient Contents 1 Intellectual property infringement 2 Violent content 3 War crimes 3 1 Facebook Live 4 Pro anorexia groups 5 Pro mafia groups case 6 Trolling 7 Rape pages 8 Child abuse images 9 Objectification of women 10 Violation of Palestinian Human Rights 11 Anti Semitism 12 Incitement of violence against Israelis 12 1 Countermeasure efforts 12 2 Employee data leak 13 Fake news 14 Incitement of violence in Sri Lanka 15 Inclusion of Breitbart News as trusted news source 16 Uyghur genocide denial 17 Incitement of human rights abuses in Myanmar 18 Blue tick 19 Neo Nazi and white supremacist content 20 COVID 19 misinformation 21 Marketplace illegal Amazon rainforest sales 22 Incitement of ethnic massacres in Ethiopia 23 See also 24 ReferencesIntellectual property infringement editFacebook has been criticized for having lax enforcement of third party copyrights for videos uploaded to the service In 2015 some 
Facebook pages were accused of plagiarizing videos from YouTube users and re posting them as their own content using Facebook s video platform and in some cases achieving higher levels of engagement and views than the original YouTube posts Videos hosted by Facebook are given a higher priority and prominence within the platform and its user experience including direct embedding within the News Feed and pages giving a disadvantage to posting it as a link to the original external source 1 2 In August 2015 Facebook announced a video matching technology aiming to identify reposted videos and also stated its intention to improve its procedures to remove infringing content faster 3 In April 2016 Facebook implemented a feature known as Rights Manager which allows rights holders to manage and restrict the upload of their content onto the service by third parties 4 Violent content editIn 2013 Facebook was criticized for allowing users to upload and share videos depicting violent content including clips of people being decapitated Having previously refused to delete such clips under the guideline that users have the right to depict the world in which we live Facebook changed its stance in May announcing that it would remove reported videos while evaluating its policy 5 The following October Facebook stated that it would allow graphic videos on the platform as long as the intention of the video was to condemn not glorify the acts depicted 6 further stating that Sometimes those experiences and issues involve graphic content that is of public interest or concern such as human rights abuses acts of terrorism and other violence When people share this type of graphic content it is often to condemn it If it is being shared for sadistic pleasure or to celebrate violence Facebook removes it 7 However Facebook once again received criticism with the Family Online Safety Institute saying that such videos crossed a line and can potentially cause psychological damage among young Facebook 
users 6 and then Prime Minister of the United Kingdom David Cameron calling the decision irresponsible citing the same concerns regarding young users 7 Two days later Facebook removed a video of a beheading following worldwide outrage and while acknowledging its commitment to allowing people to upload gory material for the purpose of condemnation it also stated that it would be further strengthening its enforcement to prevent glorification 7 The company s policies were also criticized as part of these developments with some drawing particular attention to Facebook s permission of graphic content but potential removal of breastfeeding images 8 In January 2015 Facebook announced that new warnings would be displayed on graphic content requiring users to explicitly confirm that they wish to see the material 9 10 War crimes editFacebook has been criticized for failing to take down violent content depicting war crimes in Libya A 2019 investigation by the BBC 11 found evidence of alleged war crimes in Libya being widely shared on Facebook and YouTube The BBC found images and videos on social media of the bodies of fighters and civilians being desecrated by fighters from the self styled Libyan National Army The force led by General Khalifa Haftar controls a swathe of territory in the east of Libya and is trying to seize the capital Tripoli BBC Arabic found almost one hundred images and videos from Libya shared on Facebook and YouTube in violation of their companies guidelines 12 The UK Foreign Office said it took the allegations extremely seriously and is concerned about the impact the recent violence is having on the civilian population 13 In 2017 a Facebook video of Libyan National Army LNA special forces commander Mahmoud al Werfalli was uploaded showing him shooting dead three captured fighters The video was then shared on YouTube over ten thousand times The International Criminal Court used it as evidence to indict al Werfalli for the war crime of murder 14 The BBC 
found the original video was still on Facebook 2 years after his indictment and also discovered videos showing the bodies of civilians being desecrated citation needed These were taken in Ganfouda a district of Benghazi which was under siege by the LNA between 2016 and 2017 More than 300 people including dozens of children died during the siege A video uncovered by BBC Arabic showed soldiers mocking a pile of corpses of dead civilians and trampling on bodies Among them was a 77 year old woman Alia Hamza Her son Ali Hamza had five family members killed in Ganfouda Ali Hamza told BBC Arabic I sent links to lawyers to send to the ICC in the Hague against Khalifa Haftar and his military commanders regarding the massacres of civilians said Hamza In the video the LNA soldiers label the civilians as terrorists Human rights lawyer and war crimes specialist Rodney Dixon QC reviewed the evidence BBC Arabic found If groups are using those platforms to propagate their campaigns then those platforms should seriously look at their role because they could then be assisting in that process of further crimes being committed he said citation needed After presenting our findings to Facebook they removed all the videos that show a suspected war crime taking place However they opted not to suspend any of the accounts which we found linked to the images Erin Saltman Facebook s policy manager for counterterrorism in Europe Middle East and Africa told BBC Arabic Sometimes there are very conflicting narratives of whether or not the victim is a terrorist or whether it s a civilian over who s committing that act we cannot be the pure arbiters of truth 12 But Facebook and YouTube s own community guidelines explicitly prohibit content that promotes or depicts acts of violence 15 Facebook Live edit Facebook Live introduced in August 2015 for celebrities 16 and gradually rolled out for regular users starting in January 2016 17 18 lets users broadcast live videos with Facebook s intention for the 
feature to be the presentation of public events or private celebrations.[19] However, the feature has been used to record multiple crimes, deaths, and violent incidents, causing significant media attention.[20][21][22][23][24][25][26][27] Facebook has received criticism for not removing videos faster,[28] and Facebook Live has been described as "a monster Facebook cannot tame"[29] and "a gruesome crime scene for murders".[30] In response, CEO Mark Zuckerberg announced in May 2017 that the company would hire 3,000 people to review content and invest in tools to remove videos faster.[31][32][33]

Pro-anorexia groups

In 2008, Facebook was criticized for hosting groups dedicated to promoting anorexia. The groups promoted dramatic weight-loss programs, shared extreme diet tips, and posted pictures of emaciated girls under "Thinspiration" headlines. Members reported having switched to Facebook from Myspace, another social networking service, due to a perceived higher level of safety and intimacy at Facebook.[34] In a statement to BBC News, a Facebook spokesperson stated that "many Facebook groups relate to controversial topics; this alone is not a reason to disable a group. In cases where content is reported and found to violate the site's terms of use, Facebook will remove it".[35]

Pro-mafia groups case

In Italy in 2009, the discovery of pro-mafia groups, one of them claiming Bernardo Provenzano's sainthood, caused an alert in the country[36][37][38] and brought the government to rapidly issue a law that would force Internet service providers to deny access to entire websites in case of refused removal of illegal content. The amendment was passed by the Italian Senate and now needs to be passed unchanged by the Chamber of Deputies to become effective.[39][40][41][needs update] Facebook criticized the government's efforts, telling Bloomberg that it "would be like closing an entire railway network just because of offensive graffiti at one station", and that Facebook would always remove any content promoting violence and already had a
takedown procedure in place.[42]

Trolling

On March 31, 2010, The Today Show ran a segment detailing the deaths of three separate adolescent girls and trolls' subsequent reactions to their deaths. Shortly after the suicide of high school student Alexis Pilkington, anonymous posters began trolling for reactions across various message boards, referring to Pilkington as a "suicidal CUSS" and posting graphic images on her Facebook memorial page. The segment also included an exposé of a 2006 accident, in which an eighteen-year-old student, out for a drive, fatally crashed her father's car into a highway pylon; trolls emailed her grieving family the leaked pictures of her mutilated corpse.[43] There have been cases where Facebook trolls were jailed for their communications on Facebook, particularly on memorial pages. In autumn 2010, Colm Coss of Ardwick, Britain, was sentenced to 26 weeks in jail under s127 of the Communications Act 2003 of Great Britain[44] for "malicious communications", for leaving messages deemed obscene and hurtful on Facebook memorial pages.[45][46] In April 2011, Bradley Paul Hampson was sentenced to three years in jail after pleading guilty to two counts of using a carriage service (the Internet) to cause offense, for posts on Facebook memorial pages, and one count each of distributing and possessing child pornography, when he posted images on the memorial pages of the deceased with phalluses superimposed alongside phrases such as "Woot I'm dead".[47][48]

Rape pages

A series of pro-rape and "rape joke" pages on Facebook drew attention from the media and women's groups.[49] Rape Is No Joke (RINJ), a group opposing the pages, argued that removing "pro-rape" pages from Facebook and other social media was not a violation of free speech in the context of Article 19 of the Universal Declaration of Human Rights and the concepts recognized in international human rights law in the International Covenant on Civil and Political Rights.[50] RINJ repeatedly challenged Facebook to remove the rape pages.[51]
RINJ then turned to advertisers on Facebook, telling them not to let their advertising be posted on Facebook's rape pages.[52] Following a campaign that involved the participation of Women, Action and the Media, the Everyday Sexism Project, and the activist Soraya Chemaly, who were among 100 advocacy groups, Facebook agreed to update its policy on hate speech. The campaign highlighted content that promoted domestic and sexual violence against women, and used over 57,000 tweets and more than 4,900 emails to create outcomes such as the withdrawal of advertising from Facebook by 15 companies, including Nissan UK, House of Burlesque, and Nationwide UK. The social media website initially responded by stating that "while it may be vulgar and offensive, distasteful content on its own does not violate our policies",[53] but then agreed to take action on May 29, 2013, after it "had become clear that our systems to identify and remove hate speech have failed to work as effectively as we would like, particularly around issues of gender-based hate".[54]

Child abuse images

In June 2015, the UK National Society for the Prevention of Cruelty to Children raised concerns about Facebook's apparent refusal, when asked, to remove controversial video material which allegedly showed a baby in emotional distress.[55] In March 2017, BBC News reported in an investigation that Facebook had removed only 18 of the 100 groups and posts it had reported for containing child exploitation images. The BBC had been granted an interview with Facebook policy director Simon Milner under the condition that it provide evidence of the activity. However, when presented with the images, Facebook canceled the interview and told the BBC that it had been reported to the National Crime Agency for illegally distributing child exploitation images (the NCA could not confirm whether the BBC was actually being investigated).[56] Milner later stated to the BBC that the investigation had exposed flaws in its image moderation process that have since been
addressed, and that all of the reported content was removed from the service.[57] According to data from the National Center for Missing & Exploited Children, in 2020 there were 20 million reported incidents of child sexual abuse material on Facebook. This accounted for 95% of the total incidents recorded by the organization, while Google accounted for half a million incidents, Snapchat for 150,000, and Twitter for 65,000.[58]

Objectification of women

In July 2017, GMA News reported that a number of secret Facebook groups that had been engaging in the illegal activity of sharing "obscene" photos of women had been exposed, with the Philippine National Bureau of Investigation warning group members of the possibility of being liable for violating child pornography and anti-voyeurism laws. Facebook stated that it would remove the groups as violations of its community guidelines.[59] A few days later, GMA News had an interview with one of the female victims targeted by one of the groups, who stated that she had received friend requests from strangers and inappropriate messages. After she reported the matter to the authorities, the Philippine National Police's anti-cybercrime unit promised to take action in finding the accounts responsible.[60] Senator Risa Hontiveros responded to the incidents with the proposal of a law that would impose "stiff penalties" on such group members, stating that "these people have no right to enjoy our internet freedom only to abuse our women and children. We will not allow them to shame our young women, suppress their right to express themselves through social media, and contribute to a culture of misogyny and hate".[61]

Violation of Palestinian Human Rights

According to a study commissioned by Meta and carried out by Business for Social Responsibility (BSR), Facebook's and Instagram's policies during the Israeli attacks on the Gaza Strip in 2021 harmed the fundamental human rights of Palestinians. The social media giant had denied Palestinian users their freedom of expression by erroneously removing
their content. BSR's report is yet another indictment of the company's ability to police its global public square and to balance freedom of expression against the potential for harm in a tense international context.[62]

Anti-Semitism

Facebook has been suspected of having a double standard when it comes to pages and posts regarding the Arab–Israeli conflict. When it comes to alleged incitement, Facebook has been accused of being unfair, removing only posts and pages that attack Palestinians while turning a blind eye to similar posts that are violently antisemitic. The NGO Shurat Hadin – Israel Law Center conducted an experiment over the incitement issue, which sought to expose what it viewed as double standards regarding anti-Israel sentiment, via the simultaneous launch of two Facebook pages: "Stop Palestinians" and "Stop Israel". Following the launch of the two nearly identical pages, the NGO posted hateful content simultaneously on both. Next, Shurat Hadin reported both faux-incitement pages to Facebook to see which, if either, would be removed. According to the NGO, despite featuring nearly identical content, only one was removed from the platform: the page inciting against Palestinians was closed by Facebook on the same day that it was reported, for containing a "credible threat of violence" which violated "our [Facebook's] community standards", but the page inciting against Israelis was not; Facebook claimed that this page was not in violation of its rules. Shurat Hadin's staged anti-Israel group "Stop Israel" still remains active on Facebook.[63] ProPublica stated in September 2017 that a website was able to target ads at Facebook users who were interested in "how to burn Jew" and "Jew hater". Facebook removed the categories and said it would try to stop them from appearing to potential advertisers.[64] In March 2019, Facebook subsidiary Instagram declined to remove an antisemitic image posted by right-wing conspiracy theorist Alex Jones, saying that it
did not violate their community standards.[65][better source needed]

Incitement of violence against Israelis

Facebook has been accused of being a public platform that is used to incite violence. In October 2015, 20,000 Israelis claimed that Facebook was ignoring Palestinian incitement on its platform and filed a class-action suit demanding that Facebook remove all posts "containing incitement to murder Jews".[66] Israeli politicians have complained that Facebook does not comply or assist with requests from the police for tracking and reporting individuals when they share their intent to kill or to commit any other act of violence on their Facebook pages. In June 2016, following the murder of Hallel Ariel, 13, by a terrorist who had posted on Facebook, Israeli Minister of Public Security Gilad Erdan charged that "Facebook, which has brought a positive revolution to the world, has become a monster ... The dialogue, the incitement, the lies of the young Palestinian generation are happening on the Facebook platform". Erdan accused Facebook of sabotaging the work of the Israeli police and refusing to cooperate when the Israeli police turn to the site for assistance, and of setting "a very high bar" for removing inciting content.[67] In July 2016, a civil action for $1 billion in damages was filed in the United States District Court for the Southern District of New York on behalf of the victims and family members of four Israeli-Americans and one US citizen killed by Hamas militants since June 2014.[68][69] The victims and plaintiffs in the case are the families of Yaakov Naftali Fraenkel, a 16-year-old who was kidnapped and murdered by Hamas operatives in 2014; Taylor Force, a 29-year-old American MBA student and US Army veteran killed in a stabbing spree in Jaffa in 2016; Chaya Braun, a three-month-old thrown from her stroller and slammed into the pavement when a Hamas attacker drove his car into a light rail station in Jerusalem in October 2014; 76-year-old Richard Lakin, who was killed in the October 2015 shooting and
stabbing attack on a Jerusalem bus; and Menachem Mendel Rivkin, who was seriously wounded in a January 2016 stabbing attack in Jerusalem.[69] The plaintiffs claimed that Facebook knowingly provided its social media platform and communication services to Hamas, in violation of provisions of US anti-terrorism laws that prohibit US businesses from providing any material support, including services, to designated terrorist groups and their leaders. The government of the United States has designated Hamas a "Foreign Terrorist Organization" as defined by US law. The suit claimed that Hamas "used and relied on Facebook's online social network platform and communications services to facilitate and carry out its terrorist activity, including the terrorist attacks in which Hamas murdered and injured the victims and their families in this case".[68][69] The legal claim was rejected; the court found that Facebook and other social media companies are not considered to be the publishers of material that users post when digital tools used by the company match content with what the tool identifies as interested consumers.[70][71] In August 2016, Israel's security service, the Shin Bet, reported that it had arrested nine Palestinians who had been recruited by the Lebanon-based Hezbollah terrorist organization. Operatives of Hezbollah in Lebanon and the Gaza Strip recruited residents of the West Bank, Gaza, and Israel through Facebook and other social media sites. After recruiting cell leaders on Facebook, Hezbollah and the recruits used encrypted communications to avoid detection, and the leaders continued to recruit other members. The terror cells received Hezbollah funding and planned to conduct suicide bombings and ambushes, and had begun preparing explosive devices for attacks, said the security service, which claimed credit for preventing the attacks. The Shin Bet said it had also detected multiple attempts by Hezbollah to recruit Israeli Arabs through a Facebook profile.[72][73][74] On October 16, 2023, singer and internet
personality Dalal Abu Amneh was arrested by the Israeli police for allegedly promoting hate speech and inciting violence on social media, following the massacre perpetrated by Hamas on October 7, 2023.[75] In 2016, legislation was being prepared in Israel allowing fines of 300,000 shekels for Facebook and other social media platforms, such as Twitter and YouTube, for every post inciting or praising terrorism that is not removed within 48 hours and could possibly lead to further acts of terrorism.[76]

Countermeasure efforts

In June 2017, Facebook published a blog post offering insights into how it detects and combats terrorism content. The company claimed that the majority of the terrorism accounts that are found are discovered by Facebook itself, while it reviews reports of terrorism content "urgently" and, in cases of imminent harm, "promptly inform[s] authorities". It also develops new tools to aid in its efforts, including the use of artificial intelligence to match terrorist images and videos, detecting when content is shared across related accounts, and developing technologies to stop repeat offenders. The company stated that it has 150 people dedicated to terrorism countermeasures and works with governments and industries in an effort to curb terrorist propaganda. Its blog post stated: "We want Facebook to be a hostile place for terrorists."[77][78]

Employee data leak

In June 2017, The Guardian reported that a software bug had exposed the personal details of 1,000 Facebook workers involved in reviewing and removing terrorism content, by displaying their profiles in the "Activity" logs of Facebook groups related to terrorism efforts. In Facebook's Dublin, Ireland headquarters, six individuals were determined to be "high priority" victims of the error, after the company concluded that their profiles were likely viewed by potential terrorists in groups such as ISIS, Hezbollah, and the Kurdistan Workers' Party. The bug itself, discovered in November 2016 and fixed two weeks later, was active for one month, and
had also been retroactively exposing censored personal accounts from August 2016. One affected worker fled Ireland, went into hiding, and only returned to Ireland after five months due to a lack of money. Suffering from psychological distress, he filed a legal claim against Facebook and CPL Resources, an outsourcing company, seeking compensation. A Facebook spokesperson stated that "our investigation found that only a small fraction of the names were likely viewed, and we never had evidence of any threat to the people impacted or their families as a result of this matter", and Craig D'Souza, Facebook's head of global investigations, said: "Keep in mind that when the person sees your name on the list, it was in their activity log, which contains a lot of information ... there is a good chance that they associate you with another admin of the group or a hacker". Facebook offered to install a home alarm monitoring system, provide transport to and from work, and provide counseling through its employee assistance program. As a result of the data leak, Facebook is reportedly testing the use of alternative, administrative accounts for workers reviewing content, rather than requiring workers to sign in with their personal profiles.[79][80]

Fake news

Main article: Fake news website

Facebook has been criticized for not doing enough to limit the spread of fake news stories on its site, especially after the 2016 United States presidential election, which some have claimed Donald Trump would not have won if Facebook had not helped spread what they claim to have been fake stories that were biased in his favor.[81] At a conference called Techonomy, Mark Zuckerberg stated, in regard to Donald Trump: "There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news". Zuckerberg affirmed the idea that people do not stray from their own ideals and political leanings. He stated, "I don't know what to do about that" and "When we started, the
north star for us was: we're building a safe community".[82] Zuckerberg has also been quoted, in his own Facebook post, as saying that "of all the content on Facebook, more than 99 percent of what people see is authentic".[83] In addition, the Pew Research Center stated that 62% of Americans obtain some or all of their news on social media, the bulk of it from Facebook.[84] A former editor at Facebook leaked inflammatory information about the website's algorithms, pointing to certain falsehoods and bias in the news created within Facebook. Although Facebook initially denied claims of issues with fake news stories and its algorithms, it fired the entire trending team involved with a fake news story about Megyn Kelly being a "closeted liberal".[85] In 2016, Mark Zuckerberg began to take steps to eliminate the prevalence of fake news on Facebook as a result of criticisms of Facebook's influence on the presidential election.[86] Facebook initially partnered with ABC News, the Associated Press, FactCheck.org, Snopes, and PolitiFact for its fact-checking initiative;[87] as of 2018, it had over 40 fact-checking partners across the world, including The Weekly Standard.[88] A May 2017 review by The Guardian found that Facebook's fact-checking initiatives of partnering with third-party fact-checkers and publicly flagging fake news were regularly ineffective and appeared to be having minimal impact in some cases.[89] In 2018, journalists working as fact-checkers for Facebook criticized the partnership, stating that it had produced minimal results and that the company had ignored their concerns.[88]

Incitement of violence in Sri Lanka

In March 2018, the government of Sri Lanka blocked Facebook and other social media services in an effort to quell the violence in the 2018 anti-Muslim riots, with Harsha de Silva, the Deputy Minister for National Policies and Economic Affairs, tweeting: "Hate speech on Facebook is increasing beyond acceptable levels. Government will have to act immediately to save lives."[90] Sri Lankan
telecommunications minister Harin Fernando stated that Facebook had been too slow in removing content and banning users who were using its platforms to facilitate violence during the riots.[91][92] In response, Facebook stated that it had increased the number of Sinhalese speakers it employs to review content.[91] In April 2019, in the aftermath of the Easter bombings, the Sri Lankan government blocked access to Facebook, Instagram, and WhatsApp in an effort to stop the spread of misinformation that could lead to further violence.[93]

Inclusion of Breitbart News as a trusted news source

In October 2019, Facebook announced that Breitbart News, an American far-right news and opinion website, would be included as a "trusted source" in its Facebook News feature, alongside sources such as The New York Times and The Washington Post. The decision sparked controversy due to Breitbart News's status as a platform for the alt-right and its reputation for publishing misinformation.[94][95][96] In October 2021, The Wall Street Journal reported that Facebook executives had resisted removing Breitbart News from Facebook's News Tab feature to avoid angering Donald Trump and Republican members of Congress, despite criticism from Facebook employees.[97][98] An August 2019 internal Facebook study had found that Breitbart News was the least trusted news source, and had also ranked it as "low quality", among the sources it examined across the U.S. and Great Britain.[97]

Uyghur genocide denial

In February 2021, a Press Gazette investigation found that Facebook had accepted promotional content from Chinese state media outlets such as China Daily and China Global Television Network that spread disinformation denying the Uyghur genocide.[99]

Incitement of human rights abuses in Myanmar

See also: Rohingya genocide Facebook controversy

The chairman of the U.N. Independent International Fact-Finding Mission on Myanmar stated that Facebook played a "determining role" in the Rohingya genocide.[100] Facebook has been accused of enabling the
spread of Islamophobic content targeting the Rohingya people.[101] The United Nations Human Rights Council has called the platform "a useful instrument for those seeking to spread hate".[102] The Internet.org initiative was brought to Myanmar in 2015. Myanmar's relatively recent democratic transition did not provide the country with substantial time to form professional and reliable media outlets free from government intervention. Furthermore, approximately 1% of Myanmar's residents had internet access before Internet.org. As a result, Facebook was the primary source of information, and, without verifiable professional media options, it became a breeding ground for hate speech and disinformation. Rumors circulating among family or friend networks on Facebook were perceived by its users as indistinguishable from verified news.[103] Frequent anti-Rohingya sentiments included claims of high Muslim birthrates, increasing economic influence, and plans to take over the country. Myanmar's Facebook community was also nearly completely unmonitored by Facebook, which at the time had only two Burmese-speaking employees. In response, Facebook removed accounts owned by the Myanmar Armed Forces, because they had previously used Facebook to incite hatred against the Rohingya people[104][105][106] and were engaging in "coordinated inauthentic behavior".[107] In February 2021, Facebook banned the Myanmar military from its platform and set up rules to ban Tatmadaw-linked businesses.[108][109] The Myanmar military was not the only party found to have incited violence: in a review conducted in 2018, Facebook banned accounts and pages associated with Myanmar military personnel that were indicated by the UN as being directly responsible for the ethnic cleansing in Rakhine. The banned accounts had a widespread reach in the country; they were followed by nearly 12 million accounts, about half of all of Myanmar's Facebook users.[103] On 6 December 2021, approximately a hundred Rohingya
refugees launched a $150 billion lawsuit against Facebook, alleging that it did not do enough to prevent the proliferation of anti-Rohingya hate speech because it was interested in prioritizing engagement.[110] On 10 December 2021, sixteen Rohingya youth living in the Cox's Bazar refugee camp filed a complaint against Facebook with the Irish National Contact Point for the OECD Guidelines for Multinational Enterprises, alleging that Facebook had violated the guidelines and owed them a remedy.[111][112] The lead complainants in the case included members of the Rohingya civil society group Arakan Rohingya Society for Peace and Human Rights (ARSPH). Mohibullah, who had founded ARSPH and spearheaded efforts among camp-based Rohingya refugees to hold Facebook accountable, had been murdered just over two months before.[113]

Blue tick

Facebook grants a blue tick to verified accounts of public personalities, brands, and celebrities, including politicians and artists. It has no policy for cases in which an individual with a verified blue-tick account is convicted in a serious criminal case. In a 2018 case in India, a politician was convicted and sentenced to 10 years in jail in a serious bribery case, but his Facebook page continued to be verified.[114]

Neo-Nazi and white supremacist content

From circa 2018 until March 27, 2019, Facebook's internal policy was to permit "white nationalist" content but not "white supremacist" content, despite advice stating that there is no distinction.[115] In practice, it hosted much white supremacist and neo-Nazi content.[116] On March 27, 2019, Facebook backtracked, stating that white nationalism "cannot be meaningfully separated from white supremacy and organized hate groups".[115] In 2020, the Centre for Countering Digital Hate (CCDH) found that Facebook was hosting a white supremacist network with more than 80,000 followers and links to the UK far right. The CCDH said: "Facebook's leadership endangered public safety by letting neo-Nazis finance their activities
through Facebook and Instagram ... Facebook was first told about this problem two years ago and failed to act."[117]

COVID-19 misinformation

Further information: COVID-19 misinformation

In February 2020, the Taiwanese Central News Agency reported that large amounts of misinformation had appeared on Facebook claiming that the pandemic in Taiwan was out of control, that the Taiwanese government had covered up the total number of cases, and that President Tsai Ing-wen had been infected.[118][119] The Taiwanese fact-checking organization suggested that the misinformation on Facebook shared similarities with material from mainland China, due to its use of simplified Chinese characters and mainland Chinese vocabulary.[120] The organization warned that the purpose of the misinformation was to attack the government.[121][122][123] The Epoch Times, an anti-Chinese Communist Party (CCP) newspaper affiliated with Falun Gong, has spread misinformation related to the COVID-19 pandemic in print and via social media, including Facebook and YouTube.[124][125] In April 2020, rumors circulated on Facebook alleging that the US government had "just discovered and arrested" Charles Lieber, chair of the Chemistry and Chemical Biology Department at Harvard University, for "manufacturing and selling" the novel coronavirus (COVID-19) to China. According to a report from Reuters, posts spreading the rumor were shared in multiple languages over 79,000 times on Facebook.[126][127] In January 2021, the Bureau of Investigative Journalism found that 430 Facebook pages, followed by 45 million people, were spreading false information about COVID-19 or vaccinations.[128] This was despite a promise by Facebook in 2020 that no user or company should directly profit from false information about immunization against COVID-19.[129] A Facebook spokesman said the company had removed "a small number" of the pages shared with it, for violating its policies.[citation needed] In August 2021, Facebook said that an article raising concerns about potentially fatal effects of a COVID-19
vaccine was the top-performing link in the United States between January and March 2021, and that another site publishing COVID-19 misinformation was among its top 20 most-visited pages.[130]

Marketplace illegal Amazon rainforest sales

In February 2021, BBC investigations revealed that plots of the Amazon rainforest on land reserved for indigenous people were being illegally traded on Facebook Marketplace, with the sellers admitting that they did not have the land title. The BBC reported that Facebook was ready to work with local authorities, but was unwilling to take independent action.[131]

Incitement of ethnic massacres in Ethiopia

In February 2022, Facebook was accused by the Bureau of Investigative Journalism and The Observer of letting activists incite ethnic massacres in the Tigray War by spreading hate and misinformation.[132] Following the report, a lawsuit against Meta was filed in December 2022 in the High Court of Kenya by the son of a Tigrayan academic who had been murdered in November 2021 after receiving racist attacks on the platform.[133]

See also

Censorship by Facebook
Criticism of Facebook

References

1. Setalvad, Ariha (August 7, 2015). "Why Facebook's video theft problem can't last". The Verge. Retrieved May 29, 2017.
2. Oremus, Will (July 8, 2015). "Facebook's Piracy Problem". Slate. The Slate Group. Retrieved May 29, 2017.
3. Luckerson, Victor (August 28, 2015). "Facebook to Crack Down on Online Video Piracy". Time. Retrieved May 29, 2017.
4. Constine, Josh (April 12, 2016). "Facebook launches video Rights Manager to combat freebooting". TechCrunch. AOL. Retrieved May 29, 2017.
5. Kelion, Leo (May 1, 2013). "Facebook U-turn after charities criticise decapitation videos". BBC News. BBC. Retrieved June 3, 2017.
6. Winter, Michael (October 21, 2013). "Facebook again allows violent videos, with caveat". USA Today. Retrieved June 3, 2017.
7. "Facebook pulls beheading video". The Daily Telegraph. October 23, 2013. Archived from the original on January 12, 2022. Retrieved June 3, 2017.
8. Harrison, Virginia (October 23, 2013).
"Outrage erupts over Facebook's decision on graphic videos". CNNMoney. CNN. Retrieved June 3, 2017.
9. Gibbs, Samuel (January 13, 2015). "Facebook tackles graphic videos and photos with 'are you sure?' warnings". The Guardian. Retrieved June 3, 2017.
10. Kelion, Leo (January 13, 2015). "Facebook restricts violent video clips and photos". BBC News. BBC. Retrieved June 3, 2017.
11. "Libya war crimes videos shared online". BBC News. Retrieved September 23, 2019.
12. "Libyan conflict: Suspected war crimes shared online". BBC Newsnight. Archived from the original on December 21, 2021. Retrieved September 23, 2019.
13. "BBC: War crimes committed by Haftar's forces shared on Facebook, YouTube". Libyan Express. May 1, 2019. Retrieved October 31, 2020.
14. "The Prosecutor v. Mahmoud Mustafa Busyf Al-Werfalli" (PDF). International Criminal Court. Retrieved February 17, 2022.
15. "Community Standards". Facebook. Retrieved September 23, 2019 – via Facebook.
16. Mangalindan, JP (August 5, 2015). "Facebook launches live streaming, but only for famous people". Mashable. Retrieved June 3, 2017.
17. Barrett, Brian (January 28, 2016). "Facebook Livestreaming Opens Up to Everyone With an iPhone". Wired. Retrieved June 3, 2017.
18. Newton, Casey (January 28, 2016). "Facebook rolls out live video streaming to everyone in the United States". The Verge. Retrieved June 3, 2017.
19. Newton, Casey (December 3, 2015). "Facebook begins testing live video streaming for all users". The Verge. Retrieved June 3, 2017.
20. Chrisafis, Angelique; Willsher, Kim (June 14, 2016). "French police officer and partner murdered in 'odious terrorist attack'". The Guardian. Retrieved June 3, 2017.
21. Madden, Justin (June 17, 2016). "Chicago man shot dead while live streaming on Facebook". Reuters. Retrieved June 3, 2017.
22. Chaykowski, Kathleen (July 7, 2016). "Philando Castile's Death On Facebook Live Highlights Problems For Social Media Apps". Forbes. Retrieved June 3, 2017.
23. McLaughlin, Eliott C.; Blau, Max; Vercammen, Paul (September 30, 2016). "Police: Man killed by officer pointed vaping device, not gun". CNN. Retrieved June 3, 2017.
24. Berman, Mark; Hawkins, Derek (January 5, 2017). "Hate crime charges filed after 'reprehensible' video shows
attack on mentally ill man in Chicago". The Washington Post. Nash Holdings. Retrieved June 3, 2017.
25. Steele, Billy (March 22, 2017). "Dozens watched a Facebook Live stream of sexual assault (updated)". Engadget. AOL. Retrieved June 3, 2017.
26. Gibbs, Samuel (April 25, 2017). "Facebook under pressure after man livestreams killing of his daughter". The Guardian. Retrieved June 3, 2017.
27. Solon, Olivia (January 27, 2017). "Why a rising number of criminals are using Facebook Live to film their acts". The Guardian. Retrieved June 3, 2017.
28. Solon, Olivia; Levin, Sam (January 6, 2017). "Facebook refuses to explain why live torture video wasn't removed sooner". The Guardian. Retrieved June 3, 2017.
29. Krasodomski-Jones, Alex (January 9, 2017). "Facebook has created a monster it cannot tame". CNN. Retrieved June 3, 2017.
30. Bhattacharya, Ananya (June 18, 2016). "Facebook Live is becoming a gruesome crime scene for murders". Quartz. Retrieved June 3, 2017.
31. Gibbs, Samuel (May 3, 2017). "Facebook Live: Zuckerberg adds 3,000 moderators in wake of murders". The Guardian. Retrieved June 3, 2017.
32. Murphy, Mike (May 3, 2017). "Facebook is hiring 3,000 more people to monitor Facebook Live for murders, suicides, and other horrific video". Quartz. Retrieved June 3, 2017.
33. Ingram, David (May 3, 2017). "Facebook tries to fix violent video problem with 3,000 new workers". Reuters. Retrieved June 3, 2017.
34. Peng, Tina (November 22, 2008). "Pro-anorexia groups spread to Facebook". Newsweek. Retrieved June 13, 2017.
35. "Pro-anorexia site clampdown urged". BBC News. BBC. February 24, 2008. Retrieved June 13, 2017.
36. Masciarelli, Alexis (January 9, 2009). "Anger at pro-Mafia groups on Facebook". France 24. Archived from the original on September 6, 2009. Retrieved June 13, 2017.
37. Donadio, Rachel (January 20, 2009). "Italian authorities wary of Facebook tributes to Mafia". The New York Times International Edition. Archived from the original on January 24, 2009. Retrieved June 13, 2017.
38. Pullella, Philip (January 12, 2009). "Pro-mafia Facebook pages cause alarm in Italy". Reuters. Retrieved June 13, 2017.
39. Krangel, Eric (February 11, 2009). "Italy Considering National Ban On Facebook, YouTube
In Plan To Return To Dark Ages Business Insider Axel Springer SE Retrieved June 13 2017 Kington Tom February 16 2009 Italian bill aims to block mafia Facebook shrines The Guardian Retrieved June 13 2017 Nicole Kristen February 12 2009 Mafia Bosses Could Cause Italy s Blocking of Facebook Adweek Beringer Capital Retrieved June 13 2017 Oates John February 12 2009 Facebook hits back at Italian ban The Register Situation Publishing Retrieved June 13 2017 Trolling The Today Show Explores the Dark Side of the Internet March 31 2010 Retrieved April 4 2010 Archived June 8 2010 at the Wayback Machine s127 of the Communications Act 2003 of Great Britain Retrieved July 13 2011 Murder victim mocking troll jailed The Register November 1 2010 Retrieved July 13 2011 Jade Goody website troll from Manchester jailed BBC October 29 2010 Retrieved July 13 2011 Facebook troll Bradley Paul Hampson seeks bail appeal against jail term The Courier Mail April 20 2011 Retrieved July 13 2011 Facebook urged to ban teens from setting up tribute pages The Australian June 5 2010 Retrieved July 13 2011 Sherwell Philip October 16 2011 Cyber anarchists blamed for unleashing a series of Facebook rape pages The Daily Telegraph London Retrieved May 22 2012 Facebook rape page whitelisted and campaign goes global Womensviewsonnews org Meanwhile campaigns in other countries have begun most notably in Canada with the Rape is no joke RINJ campaign which has not only campaigned fiercely but has also put together a YouTube video Facebook Refuses to Remove Rape Pages Linked to Australian British Youth Albuquerque Express October 23 2011 Archived from the original on May 18 2013 Retrieved May 22 2012 Facebook Refuses to Remove Rape Pages Linked to Australian British Youth International Business Times October 18 2011 Archived from the original on July 17 2012 Retrieved May 22 2012 O Brien said the campaign is now focusing on Facebook advertisers telling them not to let their advertisements be posted on the rape 
pages Sara C Nelson May 28 2013 FBrape Will Facebook Heed Open Letter Protesting Endorsement Of Rape amp Domestic Violence The Huffington Post UK Retrieved May 29 2013 Rory Carroll May 29 2013 Facebook gives way to campaign against hate speech on its pages The Guardian UK London Retrieved May 29 2013 Facebook criticised by NSPCC over baby ducking video clip BBC News June 5 2015 Facebook failed to remove sexualised images of children BBC News March 7 2017 Retrieved March 9 2017 Facebook Twitter and Google grilled by MPs over hate speech BBC News March 14 2017 Retrieved March 14 2017 Hitt Tarpley February 24 2021 Facebook a Hotbed of Child Sexual Abuse Material With 20 3 Million Reports Far More Than Pornhub The Daily Beast Retrieved August 23 2021 Layug Margaret Claire July 3 2017 Pastor Hokage FB groups trading lewd photos of women exposed GMA News Retrieved July 8 2017 Layug Margaret Claire July 5 2017 Victim of Pastor FB reports harassment indecent proposals GMA News Retrieved July 8 2017 De Jesus Julliane Love July 6 2017 Hontiveros wants stiff penalties vs Pastor Hokage FB groups Philippine Daily Inquirer Retrieved July 8 2017 FACEBOOK REPORT CONCLUDES COMPANY CENSORSHIP VIOLATED PALESTINIAN HUMAN RIGHTS The Intercept Retrieved 21 September 2022 When it comes to incitement is Facebook biased against Israel Arab Israeli Conflict Jerusalem Post The Jerusalem Post Retrieved December 16 2018 Facebook tightens ad policy after Jew hater controversy J Jweekly com Jewish Telegraphic Agency September 27 2016 Retrieved September 29 2017 Gagliardo Silver Victoria March 29 2019 Instagram refuses to remove Alex Jones anti semitic post The Independent Retrieved March 30 2019 20 000 Israelis sue Facebook for ignoring Palestinian incitement The Times of Israel October 27 2015 Retrieved July 15 2016 Israel Facebook s Zuckerberg has blood of slain Israeli teen on his hands The Times of Israel July 2 2016 Retrieved July 15 2016 a b Wittes Benjamin Bedell Zoe July 12 2016 Facebook 
Hamas and Why a New Material Support Suit May Have Legs Lawfare a b c Pileggi Tamar July 11 2016 US terror victims seek 1 billion from Facebook for Hamas posts The Times of Israel Retrieved July 15 2016 Dolmetsch Chris July 31 2019 Facebook Isn t Responsible as Terrorist Platform Court Says Bloomberg Retrieved August 7 2019 Facebook Defeats Appeal Claiming It Aided Hamas Attacks Law360 July 31 2019 Retrieved August 6 2019 Hezbollah created Palestinian terror cells on Facebook Israel says after bust Jewish Telegraphic Agency August 16 2016 Retrieved August 17 2016 Zitun Yoav August 16 2016 Shin Bet catches Hezbollah recruitment cell in the West Bank Ynet News Retrieved August 17 2016 Gross Judah Ari August 16 2016 Hezbollah terror cells set up via Facebook in West Bank and Israel busted by Shin Bet The Times of Israel Retrieved August 17 2016 Israel Police arrests Palestinian singer influencer for hate speech The Jerusalem Post JPost com 2023 10 17 Retrieved 2023 10 17 Knesset approves Facebook bill in preliminary vote The Times of Israel July 20 2016 Retrieved July 24 2016 Lecher Colin June 15 2017 Facebook says it wants to be a hostile place for terrorists The Verge Retrieved June 16 2017 Facebook using artificial intelligence to fight terrorism CBS News June 15 2017 Retrieved June 16 2017 Solon Olivia June 16 2017 Revealed Facebook exposed identities of moderators to suspected terrorists The Guardian Retrieved June 18 2017 Wong Joon Ian June 16 2017 The workers who police terrorist content on Facebook were exposed to terrorists by Facebook Quartz Retrieved June 18 2017 Shahani Aarti November 17 2016 From Hate Speech To Fake News The Content Crisis Facing Mark Zuckerberg NPR Shahani Aarti Zuckerberg Denies Fake News on Facebook had Impact on the Election Washington NPR 2016 ProQuest Kravets David Facebook Google Seek to Gut Fake News Sites Money Stream New York Conde Nast Publications Inc 2016 ProQuest Web December 5 2016 Kravets David Facebook Google Seek to Gut 
Fake News Sites Money Stream New York Conde Nast Publications Inc 2016 ProQuest Web December 6 2016 Newitz Annalee Facebook Fires Human Editors Algorithm Immediately Posts Fake News New York Conde Nast Publications Inc 2016 ProQuest Web December 6 2016 Burke Samuel November 19 2016 Zuckerberg Facebook will develop tools to fight fake news CNN Money Retrieved November 22 2016 Jamieson Amber Solon Olivia 2016 12 15 Facebook to begin flagging fake news in response to mounting criticism The Guardian Retrieved 2022 04 07 a b Levin Sam 2018 12 13 They don t care Facebook factchecking in disarray as journalists push to cut ties The Guardian San Francisco Retrieved 2022 04 07 Levin Sam 2017 05 16 Facebook promised to tackle fake news But the evidence shows it s not working The Guardian Retrieved 2022 04 07 Safi Michael Perera Amantha 2018 03 07 Sri Lanka blocks social media as deadly violence continues The Guardian Retrieved 2022 01 28 a b Safi Michael March 14 2018 Sri Lanka accuses Facebook over hate speech after deadly riots The Guardian Retrieved January 28 2022 Taub Amanda Fisher Max April 21 2018 Where Countries Are Tinderboxes and Facebook Is a Match The New York Times Retrieved November 28 2018 Ellis Petersen Hannah 2019 04 21 Social media shut down in Sri Lanka in bid to stem misinformation The Guardian Retrieved 2022 04 07 Darcy Oliver October 26 2019 Facebook News launches with Breitbart as a source CNN Archived from the original on October 26 2019 Retrieved 2021 06 20 Robertson Adi 2019 10 25 Mark Zuckerberg is struggling to explain why Breitbart belongs on Facebook News The Verge Archived from the original on October 26 2019 Retrieved 2021 06 20 Wong Julia Carrie 2019 10 25 Facebook includes Breitbart in new high quality news tab The Guardian Archived from the original on October 25 2019 Retrieved 2021 06 20 a b Hagey Keach Horwitz Jeff 2021 10 24 Facebook s Internal Chat Boards Show Politics Often at Center of Decision Making The Wall Street Journal ISSN 0099 
9660 Retrieved 2021 12 11 Feinberg Andrew 2021 10 25 Facebook protected Breitbart to avoid angering Trump new leaks reveal The Independent Archived from the original on October 25 2021 Retrieved 2021 12 11 Turvill William February 18 2021 Profits from propaganda Facebook takes China cash to promote Uyghur disinformation Press Gazette Retrieved February 27 2021 U N investigators cite Facebook role in Myanmar crisis Reuters March 12 2018 via www reuters com In Myanmar Facebook struggles with a deluge of disinformation The Economist ISSN 0013 0613 Retrieved October 27 2020 Report of the independent international fact finding mission on Myanmar PDF a b Cosentino Gabriella Cianciolo Social Media and the Post Truth World Order The Global Dynamics of Disinformation Palgrave Macmillan 2020 Stecklow Steve Why Facebook is losing the war on hate speech in Myanmar Reuters Retrieved December 15 2018 Facebook bans Myanmar military accounts for enabling human rights abuses Social techcrunch com 27 August 2018 Retrieved December 15 2018 Some in Myanmar Fear Fallout From Facebook Removal of Military Pages Radio Free Asia Retrieved December 15 2018 Facebook Removes More Pages And Groups Linked to Myanmar Military Radio Free Asia Retrieved January 30 2019 Scott Liam 2022 08 04 Myanmar junta drops propaganda on people from helicopters Coda Media Retrieved 2022 08 07 Frankel Rafael 2021 02 12 An Update on the Situation in Myanmar Meta Retrieved 2022 08 07 Gilbert David 7 December 2021 Growth Fueled by Hate Facebook Sued for 150 Billion Over Myanmar Genocide Vice News Archived from the original on 8 December 2021 Retrieved 13 December 2021 Rohingya refugees supported by Victim Advocates International vs Facebook OECD Watch Retrieved 2023 03 21 silicon 2022 11 23 Your tech our tears Rohingya activists call on Facebook to remedy its role in atrocities Silicon Republic Retrieved 2023 03 21 Myanmar Facebook s systems promoted violence against Rohingya Meta owes reparations new report 
Amnesty International 2022 09 29 Retrieved 2023 03 21 Person of eminence tag on FB for convict Ajay Chautala December 17 2018 a b Beckett Lois March 27 2019 Facebook to ban white nationalism and separatism content The Guardian Retrieved March 28 2019 Dearden Lizzie March 24 2019 Neo Nazi groups allowed to stay on Facebook because they do not violate community standards The Independent Retrieved March 28 2019 Facebook condemned for hosting neo Nazi network with UK links The Guardian November 22 2020 Retrieved January 31 2021 Lee Y Blanchard B 3 March 2020 Provocative China pressures Taiwan with fighters fake news amid virus outbreak Reuters Retrieved 5 March 2020 We have been told to track if the origins are linked to instructions given by the Communist Party using all possible means the official said adding that authorities had increased scrutiny on online platforms including chat rooms Hoax circulates online that an old Indian textbook lists treatments for COVID 19 AFP Fact Check 9 April 2020 Saline solution kills China coronavirus Experts refute online rumour AFP Fact Check 24 January 2020 Archived from the original on 1 April 2020 Retrieved 9 April 2020 武漢肺炎疫情謠言多 事實查核中心指3大共同點 There are many rumors about the Wuhan pneumonia epidemic the fact checking center points to 3 common points in Chinese Taiwan Central News Agency 26 February 2020 Virus Outbreak Chinese trolls decried for fake news Taipei Times 28 February 2020 Retrieved 12 March 2020 Taiwan accuses China of waging cyber war to disrupt virus fight Reuters 29 February 2020 Retrieved 12 March 2020 Manavis Sarah 22 April 2020 snapsave app New Statesman Viral video promotes the unsupported hypothesis that SARS CoV 2 is a bioengineered virus released from a Wuhan research laboratory Health Feedback 17 April 2020 False headline claim Harvard Professor arrested for creating and selling the new coronavirus to China Reuters 7 April 2020 Fact check Did US researcher make and sell Covid 19 to China Deccan Herald 11 
August 2020 Facebook still making money from anti vax sites The Guardian January 30 2021 Retrieved January 31 2021 Wong Julia Carrie April 10 2020 Tech giants struggle to stem infodemic of false coronavirus claims The Guardian ISSN 0261 3077 Retrieved January 31 2021 Callery James Goddard Jacqui August 23 2021 Most clicked link on Facebook spread doubt about Covid vaccine The Times ISSN 0140 0460 Retrieved March 21 2022 Fellet Joao Pamment Charlotte February 27 2021 Amazon rainforest plots sold via Facebook Marketplace ads BBC Retrieved February 27 2021 Jackson Jasper Kassa Lucy Townsend Mark 20 February 2022 Facebook lets vigilantes in Ethiopia incite ethnic killing The Guardian Retrieved 21 February 2022 Hern Alex 2022 12 14 Meta faces 1 6bn lawsuit over Facebook posts inciting violence in Tigray war The Guardian Retrieved 2022 12 16 Retrieved from https en wikipedia org w index php title Facebook content management controversies amp oldid 1189907056, wikipedia, wiki, book, books, library,