
In 60 Minutes appearance, YouTube's CEO offers a master class in moral equivalency

Susan Wojcicki may be one of the most powerful women in Silicon Valley, but she also holds the unenviable role of being ultimately responsible for a lot of garbage that we, along with our parents, siblings, friends, neighbors, colleagues, and children, not to mention billions of strangers, now consume on YouTube.
That garbage, along with valuable content, is inevitable on a platform that Wojcicki says sees 500 hours of video uploaded every single minute. But that doesn't mean YouTube can't do more, particularly given the vast financial resources of its parent company, Alphabet, which had a stunning $117 billion in financial reserves as of this summer, more than any company on the planet.
Instead, as Wojcicki explains to reporter Lesley Stahl on tonight's episode of 60 Minutes, the company has broadly drawn a line at taking down videos that cause harm, versus videos that might merely spread hatred and disinformation.
The distinction is laughable. "So if you're saying, 'Don't hire somebody because of their race,' that's discrimination," according to Wojcicki, "and so that would be an example of something that would be a violation against our policies." Meanwhile, as Stahl notes, a video stating that white people are superior but that doesn't explicitly incite action on the part of viewers would be fine with YouTube. "If that video says nothing else, yes," confirms Wojcicki.
It's a horrifying position for the company to take and for Wojcicki to be responsible for, and ultimately Wojcicki's best defense, in her own words, is that she knows she can make the service better. It's exceedingly cold comfort.
If you missed the episode, you can read the transcript below.
[STAHL IN STUDIO:]

TO GRASP THE PHENOMENAL SCALE OF YOUTUBE, CONSIDER THAT PEOPLE SPEND 1 BILLION HOURS WATCHING VIDEOS ON IT EVERY DAY. IT IS THE MOST USED SOCIAL NETWORK IN THE U.S. MORE QUERIES ARE TYPED INTO THE WEBSITE'S SEARCH BAR THAN ANYWHERE ONLINE EXCEPT GOOGLE, WHICH OWNS YOUTUBE.

BUT THE SITE HAS COME UNDER INCREASING SCRUTINY, ACCUSED OF PROPAGATING WHITE SUPREMACY, PEDDLING CONSPIRACIES AND PROFITING FROM IT ALL. THEY RECENTLY AGREED TO PAY A RECORD $170 MILLION TO SETTLE ALLEGATIONS THAT THEY TARGETED CHILDREN WITH ADS. YOUTUBE IS BEING FORCED TO CONCENTRATE ON CLEANSING THE SITE.

WE VISITED THE COMPANY'S HEADQUARTERS IN SAN BRUNO, CALIFORNIA, TO MEET SUSAN WOJCICKI, THE 51-YEAR-OLD CEO, IN CHARGE OF NURTURING THE SITE'S CREATIVITY, TAMING THE HATE AND HANDLING THE CHAOS.

VIDEO:

SUSAN: We have 500 hours of video uploaded every single minute to YouTube.
STAHL: Say that again.
SUSAN: So we have 500 hours of video uploaded every minute to YouTube.
STAHL: That is breathtaking.
SUSAN: It, it is, it is. We have a lot of video.

AND A LOT OF INFLUENCE ON OUR LIVES, AND HOW WE PASS OUR TIME.

SOT: MUSIC

OVER A BILLION PEOPLE LISTEN TO MUSIC ON YOUTUBE EVERY MONTH: IT'S THE PLANET'S TOP MUSIC SITE. THERE'S A CHILDREN'S CHANNEL WITH OVER 44 BILLION VIEWS.

STAHL: Do you let your children watch YouTube, including the young ones?
SUSAN: So I allow my younger kids to use YouTube Kids, but I limit the amount of time that they're on it. I think too much of anything is not a good thing. But there's a lot you can learn on YouTube. I think about how YouTube in many ways is this global library. You wanna see any historical speech, you could see it. You want to be able to learn a language
STAHL: Make a souffle?
SUSAN: Wanna laugh, you just wanna see something funny. A souffle! Oh, yeah, cooking. Cooking's a great example.

SO IS WATCHING PEOPLE BINGE EAT. (NAT) A GROWING NUMBER OF AMERICAN ADULTS ARE TURNING TO IT FOR THEIR NEWS... SPORTS... MEDICAL INFORMATION. IT'S NOW MANKIND'S LARGEST HOW-TO COLLECTION: (NAT) HOW TO TIE A TIE... TIE THE KNOT... OR SPEAK THAI.

THE SITE HAS PRODUCED WHOLE NEW PASTIMES WHERE MILLIONS WATCH STRANGERS OPEN BOXES (NAT) WHISPER... SLEEP... YOUTUBE'S ARTIFICIAL INTELLIGENCE ALGORITHMS KEEP RECOMMENDING NEW VIDEOS SO USERS WATCH MORE AND MORE AND MORE.

STAGE: HAPPY FRIDAY!

WOJCICKI INVITED US TO THE WEEKLY ALL-STAFF MEETING. SHE'S SURPRISINGLY DOWN-TO-EARTH FOR ONE OF THE MOST POWERFUL PEOPLE IN SILICON VALLEY, (NAT) WHERE HER TRAJECTORY STARTED IN AN UNLIKELY WAY.

SUSAN: I owned a garage. And I was worried about covering the mortgage. So I was willing to rent my garage to any student. But then two students appeared. One was named Sergey Brin. The other was named Larry Page. They are the founders of Google.
STAHL: Yes, they are.
SUSAN: But at the time they were just students. They looked like any other students.

LARRY AND SERGEY ENDED UP HIRING HER AS THEIR FIRST MARKETING MANAGER: SHE WAS GOOGLE EMPLOYEE 16. AS THE COMPANY GREW, SO DID HER ROLE AND SO DID HER FAMILY. SHE HAS 5 CHILDREN. GOOGLE BOUGHT YOUTUBE ON HER RECOMMENDATION, FOR OVER $1.6 BILLION, AND 8 YEARS LATER SHE BECAME CEO WITH A MANDATE TO MAKE IT GROW AND MAKE IT PROFITABLE. AND SHE DID! ITS ESTIMATED WORTH IS $160 BILLION.

(SOT POP)

YOUTUBE MAKES MOST OF ITS MONEY FROM ADS (NAT) SPLITTING REVENUE WITH PEOPLE WHO CREATE ALL KINDS OF VIDEOS. (NAT) FROM DO-IT-YOURSELF LESSONS TO HIP-HOP LESSONS. THE MORE POPULAR ONES CAN BECOME MULTI-MILLION DOLLAR ENTREPRENEURS.

[Ad: Joe Biden promised Ukraine a billion dollars if they fired the prosecutor investigating his son's company]

YOUTUBE ALSO MAKES MONEY FROM POLITICAL ADS, A THORNY ISSUE BECAUSE SOME OF THEM HAVE BEEN USED TO SPREAD LIES ON SOCIAL MEDIA.

STAHL: Facebook is facing a lot of controversy because it refuses to take down a President Trump ad about Biden which is not true. Would you run that ad?
SUSAN: So that is an ad that, um, right now would not be a violation of our policies.
STAHL: Is it on YouTube right now?
SUSAN: It has been on YouTube.
STAHL: Can a politician lie on YouTube?
SUSAN: For every single video I think it's really important to look at it. Politicians are always accusing their opponents of lying. That said, it's not okay to have technically manipulated content that would be misleading. For example, there was a video uploaded of Nancy Pelosi. It was slowed down just enough that it was unclear whether or not she was in her full capacity 'cause she was speaking in a slower voice.

PELOSI AD: Why would I work with you if you're investigating me

SUSAN: The title of the video actually said drunk, had that in the title. And we removed that video.
STAHL: How fast did you remove it?
SUSAN: Very fast.

BUT NOT COMPLETELY. WE JUST DID A SEARCH AND THERE IT WAS, STILL AVAILABLE. THE COMPANY KEEPS TRYING TO ERASE THE PURPORTED NAME OF THE IMPEACHMENT WHISTLE-BLOWER, BUT THAT TOO IS STILL THERE, WHICH RAISES DOUBTS ABOUT THEIR SYSTEM'S ABILITY TO CLEANSE THE SITE.

IN THE 2016 ELECTION CYCLE, YOUTUBE FAILED TO DETECT RUSSIAN TROLLS, WHO POSTED OVER 1,100 VIDEOS, ALMOST ALL MEANT TO INFLUENCE AFRICAN-AMERICANS, LIKE THIS VIDEO.

SOT: Please don't vote for Hillary Clinton. She's not our candidate. She's a f**king old racist bitch.

YOUTUBE IS AN OPEN PLATFORM, MEANING ANYONE CAN UPLOAD A VIDEO, AND SO THE SITE HAS BEEN USED TO SPREAD DISINFORMATION, VILE CONSPIRACIES, AND HATE. THIS PAST MARCH A WHITE SUPREMACIST LIVE-STREAMED HIS KILLING OF DOZENS OF MUSLIMS IN CHRISTCHURCH, NEW ZEALAND. HE USED FACEBOOK, BUT FOR THE NEXT 24 HOURS COPIES OF THAT FOOTAGE WERE UPLOADED ON YOUTUBE TENS OF THOUSANDS OF TIMES.

SUSAN: This event was unique because it was really a made-for-Internet type of crisis. Every second there was a new upload. And so our teams around the world were working on this to remove this content. We had just never seen such a huge volume.
STAHL: I can only imagine when you became CEO of YouTube that you thought, Oh, this is gonna be so fun. It's, people are uploading wonderful things like
SUSAN: funny cat videos.
STAHL: funny. And look at what we're talking about here. Are you worried that these dark things are beginning to define YouTube?
SUSAN: I think it's incredibly important that we have a responsibility framework, and that has been my number one priority. We're removing content that violates our policies. We removed, just in the last quarter, 9 million videos.
STAHL: You recently tightened your policy on hate speech.
SUSAN: Uh-huh.
STAHL: Why... why'd you wait so long?
SUSAN: Well, we have had hate policies since the very beginning of YouTube. And we
STAHL: But pretty ineffective.
SUSAN: What we really had to do was tighten our enforcement of that to make sure we were catching everything, and we use a combination of people and machines. So Google as a whole has about 10,000 people that are focused on controversial content.
STAHL: I'm told that it is very stressful to be looking at these questionable videos all the time. And that there's actually counselors to make sure that there aren't mental problems with the people who are doing this work. Is that true?
SUSAN: It's a very important area for us. We try to do everything we can to make sure that this is a good work environment. Our reviewers work 5 hours of the 8-hour day reviewing videos. They have the opportunity to take a break whenever they want.
STAHL: I also heard that these monitors, reviewers, sometimes, they're beginning to buy the conspiracy theories.
SUSAN: I've definitely heard about that. And we work really hard with all of our reviewers to make sure that, you know, we're providing the right services for them.

SUSAN WOJCICKI SHOWED US TWO EXAMPLES OF HOW HARD IT IS TO DETERMINE WHAT'S TOO HATEFUL OR VIOLENT TO STAY ON THE SITE.

SUSAN@DEMO: [SEE KICK] So this is a really hard video to watch.
STAHL: Really hard.
SUSAN: And as you can see, these are prisoners in Syria. So you could look at it and say, Well, should this be removed, because it shows violence, it's graphic, but it's actually uploaded by a group that is trying to expose the violence.

SO SHE LEFT IT UP. THEN SHE SHOWED US THIS WORLD WAR TWO VIDEO.

STAHL: I mean it's totally historical footage that you would see on the History Channel.

BUT SHE TOOK IT DOWN!

STAHL:Why?
SUSAN: There is this word down here that you'll see, 1418.

1418 IS CODE USED BY WHITE SUPREMACISTS TO IDENTIFY ONE ANOTHER.

SUSAN: For every area we work with experts, and we know all the hand signals, the messaging, the flags, the songs, and so there's quite a lot of context that goes into every single video to be able to understand what are they really trying to say with this video.

THE STRUGGLE FOR WOJCICKI IS POLICING THE SITE WHILE KEEPING YOUTUBE AN OPEN PLATFORM.

SUSAN@HALLWAY: You can go too far and that can become censorship. And so we have been working really hard to figure out what's the right way to balance responsibility with freedom of speech.

BUT THE PRIVATE SECTOR IS NOT LEGALLY BEHOLDEN TO THE FIRST AMENDMENT.

STAHL: You're not operating under some freedom of speech mandate. You get to pick.
SUSAN: We do. But we think there's a lot of benefit from being able to hear from groups and underrepresented groups that otherwise we never would have heard from.

[Lauren Southern: But with name calling of Nazi or propagandist]

BUT THAT MEANS HEARING FROM PEOPLE WITH ODIOUS MESSAGES ABOUT GAYS,
[Crowder: Mr. Lispy Queer from Vox.] WOMEN [Naked Ape: Sex robot] AND IMMIGRANTS:

Nick Fuentes: I think the easiest way for Mexicans to not get shot and killed at Walmart

WOJCICKI EXPLAINED THAT VIDEOS ARE ALLOWED AS LONG AS THEY DON'T CAUSE HARM, BUT HER DEFINITION OF HARM CAN SEEM NARROW.

SUSAN: So if you're saying, Don't hire somebody because of their race, that's discrimination. And so that would be an example of something that would be a violation against our policies.
STAHL: But if you just said, White people are superior by itself, that's okay.
SUSAN: And nothing else, yes.

BUT THAT IS HARMFUL IN THAT IT GIVES WHITE EXTREMISTS A PLATFORM TO INDOCTRINATE.

SPENCER: We want a flourishing, healthy white race.

AND WHAT ABOUT MEDICAL QUACKERY ON THE SITE? LIKE TURMERIC CAN REVERSE CANCER; BLEACH CURES AUTISM; VACCINES CAUSE AUTISM.

ONCE YOU WATCH ONE OF THESE, YOUTUBE'S ALGORITHMS MIGHT RECOMMEND YOU WATCH SIMILAR CONTENT. BUT NO MATTER HOW HARMFUL OR UNTRUTHFUL, YOUTUBE CAN'T BE HELD LIABLE FOR ANY CONTENT, DUE TO A LEGAL PROTECTION CALLED SECTION 230.

STAHL: The law under 230 does not hold you responsible for user-generated content. But in that you recommend things, sometimes 1,000 times, sometimes 5,000 times, shouldn't you be held responsible for that material, because you recommend it?
SUSAN: Well, our systems wouldn't work without recommending. And so if
STAHL: I'm not saying don't recommend. I'm just saying be responsible for when you recommend so many times.
SUSAN: If we were held liable for every single piece of content that we recommended, we would have to review it. That would mean there'd be a much smaller set of information that people would be finding. Much, much smaller.

SHE TOLD US THAT EARLIER THIS YEAR YOUTUBE STARTED RE-PROGRAMMING ITS ALGORITHMS IN THE US TO RECOMMEND QUESTIONABLE VIDEOS MUCH LESS AND POINT USERS WHO SEARCH FOR THAT KIND OF MATERIAL TO AUTHORITATIVE SOURCES, LIKE NEWS CLIPS. WITH THESE CHANGES WOJCICKI SAYS THEY HAVE CUT DOWN THE AMOUNT OF TIME AMERICANS WATCH CONTROVERSIAL CONTENT BY 70 PERCENT.

STAHL: Would you be able to say to the public: we are confident we can police our site?

SUSAN: YouTube is always going to be different than something like traditional media where every single piece of content is produced and reviewed. We have an open platform. But I know that I can make it better. And that's why I'm here.