Honolulu Star-Advertiser




YouTube reverses some restrictions on gay-themed content

ASSOCIATED PRESS

Robert Kyncl, YouTube Chief Business Officer, speaks as YouTube unveils “YouTube Red,” a new subscription service, at YouTube Space LA offices in Los Angeles. YouTube explained why some gay-themed content was restricted for certain users in a tweet on March 19, 2017.

NEW YORK >> The YouTube video shows two women, dressed in suits and ties. They smile; they sniffle back tears; they gaze into each other’s eyes. They are reading their wedding vows to one another.

The four-minute video titled “Her Vows” contains no nudity, violence or swearing. There’s no revealing clothing. No one is engaging in activities that have a “high risk of injury or death.” And yet, YouTube had deemed the video unsuitable for people under 18.

YouTube acknowledged Monday that it might have made a mistake, saying in a tweet, “Some videos have been incorrectly labeled and that’s not right. We’re on it! More to come.” The restriction on the vows video was lifted by Monday afternoon. But others — including one from YouTube celebrity Tyler Oakley titled “8 Black LGBTQ+ Trailblazers Who Inspire Me” — remained on YouTube’s age-restricted list.

Several YouTube users, many of them in the lesbian, gay, bisexual and transgender community, have been complaining that their videos are categorized as “restricted” for no obvious reason.

The “restricted” designation lets parents, schools and libraries filter out content that isn’t appropriate for users under 18. Turning on the restriction makes such videos inaccessible. YouTube calls it “an optional feature used by a very small subset of users.”

It’s unclear whether the types of videos in question are now being categorized as “restricted” for the first time, or whether this is a long-standing policy that is only now getting attention. More likely, it is the latter.

U.K.-based YouTube creator Rowan Ellis made a video criticizing the restrictions last week. That video was itself restricted, though YouTube has since lifted the restriction. In an email, Ellis said YouTube needs to reach out to the LGBTQ community to explain “how this system works, and how it came to flag like this, if it was indeed an error and not a deliberate targeting.”

Companies like Google, Facebook and Twitter rely on humans and computer software to weed out unsuitable content. Mistakes can happen whether a person or a machine is doing the filtering.

This is not the first time, and probably not the last, that an internet company is mired in controversy about what types of content it restricts. Facebook has faced similar complaints, for example, with its removal — and later, reinstatement — of a Pulitzer Prize-winning photo of a naked, screaming girl running from a napalm attack in Vietnam.

The latest complaints spawned the hashtag #YouTubeIsOverParty over the weekend.

YouTube said in a tweet Sunday that LGBTQ videos aren’t automatically filtered out, though some discussing “more sensitive issues” might be restricted. But the company, which is owned by Google, did not specify what it counts as “more sensitive issues.”

In an emailed statement on Monday, YouTube said “some videos that cover subjects like health, politics and sexuality may not appear for users and institutions that choose to use this feature.” In the case of LGBT topics, which are by definition intertwined with health, politics and sexuality, filtering out what is and isn’t appropriate can be difficult.

YouTube followed that statement with another hours later: “We recognize that some videos are incorrectly labeled by our automated system and we realize it’s very important to get this right. We’re working hard to make some improvements.” The statement offered no further explanation.

YouTube content creators can decide to age-restrict their videos themselves. But that’s just one of the ways sensitive content is filtered out. YouTube says it also uses “community flagging,” which means users who have a problem with content in a video can flag it to YouTube for possible restrictions or removal.

But content that is flagged is not automatically removed. Once a video is flagged, YouTube says it reviews it.

“If no violations are found by our review team, no amount of flagging will change that and the video will remain on our site,” YouTube says in its online support page.

What sort of content gets filtered out in restricted mode can vary by region, based on countries’ differing community standards. In general, though, it includes “sexually explicit language or excessive profanity,” violence or disturbing content, according to YouTube’s policies.

YouTube’s rules also state that videos “containing nudity or dramatized sexual conduct may be age-restricted when the context is appropriately educational, documentary, scientific or artistic. Videos featuring individuals in minimal or revealing clothing may also be age-restricted if they’re intended to be sexually provocative, but don’t show explicit content.”

Videos that show adults engaging in “activities that have a high risk of injury or death” may also be age-restricted.
