
Charge: Facebook Pages Spew Blood Libels, Attack Jews and Aborigines, Mock Anne Frank

Tuesday, September 11th, 2012

There is no scientific formula for determining what constitutes hatred, but a Facebook picture of a smiling Anne Frank surrounded by the caption, "What's that burning? Oh it's my family" is an easy call. So is a Facebook picture of a baby on a scale emblazoned with a Jewish Star, where the bottom of the scale is a meat grinder with raw ground meat (presumably, a baby's) oozing out.

Is there any doubt in your mind that those images constitute hate speech (one of the official categories for removal under Facebook’s Terms of Service) and should be removed from Facebook?  That was the basis for the complaints filed by the Online Hate Prevention Institute last month.

Facebook disagreed.  The pictures remain up.

The Australia-based Online Hate Prevention Institute was launched in January of this year. Its mission is to help prevent, or at least control, abusive social media behavior that constitutes racism or other forms of hate speech.

Dr. Andre Oboler is the chief executive officer of OHPI.  Oboler has been involved in analyzing and monitoring online hate for five years.   In the time that he’s been monitoring Facebook, the response time has improved, but the results have not.

“OHPI submitted documented complaints following the Facebook complaint protocol, and, true to their word, we received a response within 48 hours,” Oboler told The Jewish Press.  “It’s quite amazing; the Facebook reviewers took down the images, reviewed them, and put them back up with a ‘no action’ decision within 48 hours.”

Oboler waited until the Facebook reviews were completed before posting OHPI’s findings.  The methodical process and the constructive suggestions OHPI made could be held up as models of what to do when confronted with hate speech on social media, except that at this point the diligence does not appear to have paid off.

The suggestions included:

1. Remove the offensive images

2. Close the offensive pages that are posting them

3. Permanently close the accounts of the users abusing Facebook to spread such hate

4. Review which staff assessed these examples and audit their decision making

5. Take active measures to improve staff training to avoid similar poor decisions in the future

6. Institute an appeal process as part of the online reporting system

7. Institute systematic random checks of rejected complaints

At this point, Oboler is hopeful that if sufficient attention is generated, Facebook will feel compelled to re-examine its procedures. What he would like is a "systematic change to prevent online-generated harm in the future."

One way to generate that attention, Oboler suggested, is for Facebook users who think the images described above are offensive to go to OHPI's Facebook page and "Like" it. Another is to sign the OHPI petition urging Facebook to stop allowing hate speech on its site.

OHPI is also critical of the way in which Facebook has chosen to respond to complaints about offensive Facebook Pages. Its standard response to pages that are entirely devoted to offensive material is to insert the bracketed phrase [Controversial Humor] before the rest of the page title. The phrase functions much like the warning label on a cigarette package. The page remains vile, just as the cigarettes remain carcinogenic, but by slapping on the Controversial Humor disclaimer, it appears Facebook is seeking immunity from liability. Or at least from responsibility.

OHPI discovered this Facebook method when it was engaged in an effort to eradicate hate-filled Facebook Pages dedicated to brutalizing Aborigines. Remember, OHPI is based in Australia. After some initially promising responses to OHPI's complaints, Facebook ultimately replied that "While we do not remove this type of content from the site entirely unless it violates our Statement of Rights and Responsibilities, out of respect for local laws, we have restricted access to this content in Australia via Facebook."

But that just doesn’t make any sense, according to Oboler.  As he pointed out, “Facebook’s ‘Statement of Rights and Responsibilities’ says at 3.7 ‘You will not post content that: is hate speech’. We find it very hard to understand how Facebook can look at this material and decide it is not hate speech. Ultimately, this is where Facebook is going wrong.”

Is there anything Facebook has determined to be sufficiently offensive that it will be removed? Yes, but not much.

Oboler explained that thus far the only content Facebook has permanently removed as hate speech is content directed against an individual, rather than at an entire race or religion. In other words, the same problem that hate speech codes on campuses have encountered plagues complainants hoping for a non-offensive online community. Unless the nastiness is directed at a specific person, Facebook's default position is not to remove it.

But really, is it possible for anyone to consider the words accompanying the Anne Frank picture anything but impermissible hate speech?  Facebook apparently does and will continue to do so unless enough people tell them they are wrong.


YouTube Removes Hundreds of Videos in Response to New Report on Online Anti-Semitism

Wednesday, August 1st, 2012

The Online Hate Prevention Institute (OHPI) on Wednesday released a report documenting extensive anti-Semitism and racial hatred on YouTube, prompting the popular video-sharing website to close the account of the most egregious culprit.

The report, which seeks to address the lacunae in online regulation of hate speech, highlights how one user uploaded 1,710 videos in a single day – the vast majority (87%) of which consisted of blatant hate speech. A substantial number of the videos concerned Holocaust denial and defense of Holocaust deniers. Some videos accused Jews and Israel of masterminding the September 11 terrorist attacks. YouTube closed the user's account within 24 hours of receiving an advance copy of OHPI's report.

Although encouraged that YouTube took the necessary corrective measures, OHPI expressed concern that it took the impending release of the report to prompt YouTube to close the account. OHPI's CEO, Dr. Andre Oboler, said: "YouTube must be commended for its speedy response to OHPI's report, but it is concerning that such hateful content, and in such volume in a single account, was able to remain on the YouTube site for over a month without triggering internal warnings."

More specifically, the report reveals that one video was flagged multiple times within a few days of being uploaded, and that YouTube was notified by e-mail of both this video and the account more generally by a Jewish community organization – the Executive Council of Australian Jewry – within the first week of the video’s upload.

The report recommends “greater sanctions both by the state and by the platform provider” when such grave violations occur, suggesting that “a comparison can be drawn to copyright law, where commercial scale can tip the matter from a civil action into a criminal offence.”

"When the sanction for copyright infringement is greater, and more rigorously enforced, than the sanction for promoting genocide, we need to stop and question our priorities," Dr. Oboler concluded.

OHPI was established in January 2012 in Victoria, Australia for the purpose of combating online hate and facilitating a change in online culture.

Printed from: http://www.jewishpress.com/news/jewish-news/youtube-removes-hundreds-of-videos-in-response-to-new-report-on-online-anti-semitism/2012/08/01/