In “60 Minutes” appearance, YouTube’s CEO offers a master class in moral equivalency

Susan Wojcicki may be one of the most powerful women in Silicon Valley, but she also holds the unenviable role of being ultimately responsible for a lot of misinformation that we, along with our parents, siblings, friends, neighbors, colleagues, and children — not to mention billions of strangers — now consume on YouTube.

That garbage, along with valuable content, is inevitable on a platform that Wojcicki says sees 500 hours of video uploaded every single minute. But that doesn’t mean YouTube can’t do considerably more, particularly given the financial muscle of its parent company, Alphabet, which had a stunning $117 billion in cash on its balance sheet as of this summer — more than any other company on the planet.

Instead, as Wojcicki explains to reporter Lesley Stahl on tonight’s episode of “60 Minutes,” the company has broadly drawn a line at taking down videos that cause “harm,” versus videos that might merely spread hatred and disinformation.

The distinction is laughable. “So if you’re saying, ‘Don’t hire somebody because of their race,’ that’s discrimination,” according to Wojcicki, “and so that would be an example of something that would be a violation against our policies.” Meanwhile, as Stahl notes, a video stating that “white people are superior” but that doesn’t explicitly incite action on the part of viewers would be fine with YouTube. If that video says “nothing else, yes,” confirms Wojcicki.

It’s a horrifying position for the company to take and Wojcicki to be responsible for, and worse, Wojcicki’s indirect answer to whether YouTube can capably police its own platform is that she knows she can “make it better,” adding, “and that’s why I’m here.” At this point, thirteen years after Google acquired YouTube and five years into its former ad chief’s tenure as CEO, that’s cold comfort.

If you missed the episode, you can watch it here, or read the transcript below.

[STAHL STUDIO:]  

TO GRASP THE PHENOMENAL SCALE OF YOUTUBE: CONSIDER THAT PEOPLE SPEND 1 BILLION HOURS WATCHING VIDEOS ON IT — EVERY DAY. IT IS THE MOST USED SOCIAL NETWORK IN THE U-S.  MORE QUERIES ARE TYPED INTO THE WEBSITE’S SEARCH-BAR THAN ANYWHERE ONLINE EXCEPT GOOGLE… WHICH OWNS YOUTUBE.

BUT THE SITE HAS COME UNDER INCREASING SCRUTINY, ACCUSED OF PROPAGATING WHITE SUPREMACY, PEDDLING CONSPIRACIES AND PROFITING FROM IT ALL. THE COMPANY RECENTLY AGREED TO PAY A RECORD $170 MILLION TO SETTLE ALLEGATIONS THAT IT TARGETED CHILDREN WITH ADS. YOUTUBE IS BEING FORCED TO CONCENTRATE ON CLEANSING THE SITE.

WE VISITED THE COMPANY’S HEADQUARTERS IN SAN BRUNO, CALIFORNIA, TO MEET SUSAN WOJCICKI, THE 51-YR-OLD CEO, IN CHARGE OF NURTURING THE SITE’S CREATIVITY, TAMING THE HATE AND HANDLING THE CHAOS.

VIDEO:

SUSAN: We have 500 hours of video uploaded every single minute to YouTube.

STAHL: Fi– say that again.

SUSAN: So we have 500 hours of video uploaded every minute to YouTube.

STAHL: That is breathtaking.

SUSAN: It, it is, it is. We have a lot of video.

AND A LOT OF INFLUENCE ON OUR LIVES, AND HOW WE PASS OUR TIME.    

SOT: MUSIC

OVER A BILLION PEOPLE LISTEN TO MUSIC ON YOUTUBE EVERY MONTH: IT’S THE PLANET’S TOP MUSIC SITE. THERE’S A CHILDREN’S CHANNEL WITH OVER 44 BILLION VIEWS.

STAHL: Do you let your children watch YouTube, including the young ones?

SUSAN: So I allow my younger kids to use YouTube Kids, but I limit the amount of time that they’re on it.  I think too much of anything is not a good thing. But there’s a lot you can learn on YouTube. I think about how YouTube in many ways is this global library. You wanna see any historical speech – you could see it. You want to be able to learn a language –

STAHL: Make a soufflé?

SUSAN: – wanna laugh, you just wanna see something funny. A soufflé! Oh, yeah, cooking. Cooking’s a great example.

SO’S WATCHING PEOPLE BINGE EAT. (NAT) A GROWING NUMBER OF AMERICAN ADULTS ARE TURNING TO IT FOR THEIR NEWS… SPORTS… MEDICAL INFORMATION. IT’S NOW MANKIND’S LARGEST “HOW TO” COLLECTION: (NAT)HOW TO TIE A TIE… TIE THE KNOT…OR SPEAK THAI.

THE SITE HAS PRODUCED WHOLE NEW PASTIMES WHERE MILLIONS WATCH STRANGERS OPEN BOXES… (NAT) WHISPER… SLEEP…  YOUTUBE’S ARTIFICIAL INTELLIGENCE ALGORITHMS KEEP RECOMMENDING NEW VIDEOS SO USERS WATCH MORE AND MORE AND MORE.

STAGE: HAPPY FRIDAY!

WOJCICKI INVITED US TO THE WEEKLY ALL-STAFF MEETING. SHE’S SURPRISINGLY DOWN-TO-EARTH FOR ONE OF THE MOST POWERFUL PEOPLE IN SILICON VALLEY, (NAT) WHERE HER TRAJECTORY STARTED IN AN UNLIKELY WAY.

SUSAN: I owned a garage. And I was worried about covering the mortgage. So I was willing to rent my garage to any student. But then two students appeared. One was named Sergey Brin. The other was named Larry Page. They are the founders of Google.

STAHL: Yes, they are.

SUSAN: But at the time they were just students. They looked like any other students.

LARRY AND SERGEY ENDED UP HIRING HER AS THEIR FIRST MARKETING MANAGER: SHE WAS GOOGLE EMPLOYEE 16. AS THE COMPANY GREW, SO DID HER ROLE AND SO DID HER FAMILY. SHE HAS 5 CHILDREN. GOOGLE BOUGHT YOUTUBE ON HER RECOMMENDATION, FOR OVER $1.6 BILLION, AND 8 YEARS LATER SHE BECAME CEO – WITH A MANDATE TO MAKE IT GROW AND MAKE IT PROFITABLE. AND SHE DID! ITS ESTIMATED WORTH IS $160 BILLION.

(SOT POP)

YOUTUBE MAKES MOST OF ITS MONEY FROM ADS – (NAT) SPLITTING REVENUE WITH PEOPLE WHO CREATE ALL KINDS OF VIDEOS(NAT) FROM DO-IT-YOURSELF LESSONS… TO HIP-HOP LESSONS. THE MORE POPULAR ONES CAN BECOME MULTI-MILLION DOLLAR ENTREPRENEURS.

[Ad: Joe Biden promised Ukraine a billion dollars if they fired the prosecutor investigating his son’s company…]

YOUTUBE ALSO MAKES MONEY FROM POLITICAL ADS, A THORNY ISSUE BECAUSE SOME OF THEM HAVE BEEN USED TO SPREAD LIES ON SOCIAL MEDIA. 

STAHL: Facebook is facing a lot of controversy because it refuses to take down a President Trump ad about Biden which is not true. Would you run that ad?

SUSAN: So that is an ad that, um, right now would not be a violation of our policies.

STAHL: Is it on YouTube right now?

SUSAN: It has been on YouTube.

STAHL: Can a politician lie on YouTube?

SUSAN: For every single video I think it’s really important to look at it. Politicians are always accusing their opponents of lying. That said, it’s not okay to have technically manipulated content that would be misleading. For example, there was a video uploaded of Nancy Pelosi. It was slowed down just enough that it was unclear whether or not she was in her full capacity ’cause she was speaking in a slower voice.

PELOSI AD: Why would I work with you if you’re investigating me…

SUSAN: The title of the video actually said drunk, had that in the title. And we removed that video.

STAHL: How fast did you remove it?

SUSAN: Very fast.

BUT NOT COMPLETELY. WE JUST DID A SEARCH AND THERE IT WAS STILL AVAILABLE. THE COMPANY KEEPS TRYING TO ERASE THE PURPORTED NAME OF THE IMPEACHMENT WHISTLE-BLOWER, BUT THAT TOO IS STILL THERE. WHICH RAISES DOUBTS ABOUT THEIR SYSTEM’S ABILITY TO CLEANSE THE SITE. 

IN THE 2016 ELECTION CYCLE, YOUTUBE FAILED TO DETECT RUSSIAN TROLLS, WHO POSTED OVER 1,100 VIDEOS, ALMOST ALL MEANT TO INFLUENCE AFRICAN-AMERICANS – LIKE THIS VIDEO. 

SOT: Please don’t vote for Hillary Clinton.  She’s not our candidate… She’s a f**king old racist bitch.

YOUTUBE IS AN “OPEN PLATFORM” MEANING ANYONE CAN UPLOAD A VIDEO, AND SO THE SITE HAS BEEN USED TO SPREAD DISINFORMATION, VILE CONSPIRACIES, AND HATE. THIS PAST MARCH A WHITE SUPREMACIST LIVE-STREAMED HIS KILLING OF DOZENS OF MUSLIMS IN CHRISTCHURCH, NEW ZEALAND. HE USED FACEBOOK, BUT FOR THE NEXT 24 HOURS COPIES OF THAT FOOTAGE WERE UPLOADED ON YOUTUBE TENS OF THOUSANDS OF TIMES. 

SUSAN: This event was unique because it was really a made-for-Internet type of crisis. Every second there was a new upload. And so our teams around the world were working on this to remove this content. We had just never seen such a huge volume.

STAHL: I can only imagine when you became CEO of YouTube that you thought, “Oh, this is gonna be so fun. People are uploading wonderful things like –

SUSAN: funny cat videos.

STAHL: –funny. And look at what we’re talking about here. Are you worried that these dark things are beginning to define YouTube?

SUSAN: I think it’s incredibly important that we have a responsibility framework, and that has been my number one priority. We’re removing content that violates our policies. We removed, just in the last quarter, 9 million videos.

STAHL: You recently tightened your policy on hate speech.

SUSAN: Uh-huh.

STAHL: Why.. why’d you wait so long?

SUSAN: Well, we have had hate policies since the very beginning of YouTube.  And we–

STAHL: But pretty ineffective.

SUSAN: What we really had to do was tighten our enforcement of that to make sure we were catching everything and we use a combination of people and machines. So Google as a whole has about 10,000 people that are focused on controversial content.

STAHL: I’m told that it is very stressful to be looking at these questionable videos all the time. And that there’s actually counselors to make sure that there aren’t mental problems with the people who are doing this work.  Is that true?

SUSAN: It’s a very important area for us. We try to do everything we can to make sure that this is a good work environment. Our reviewers work 5 hours of the 8 hours reviewing videos.  They have the opportunity to take a break whenever they want.

STAHL: I also heard that these monitors, reviewers, sometimes, they’re beginning to buy the conspiracy theories.

SUSAN: I’ve definitely heard about that. And we work really hard with all of our reviewers to make sure that, you know, we’re providing the right services for them.

SUSAN WOJCICKI SHOWED US TWO EXAMPLES OF HOW HARD IT IS TO DETERMINE WHAT’S TOO HATEFUL OR VIOLENT TO STAY ON THE SITE.

SUSAN@DEMO: [SEE KICK] So this is a really hard video to watch.

STAHL: Really hard.

SUSAN: And as you can see, these are prisoners in Syria. So you could look at it and say, “Well, should this– it be removed, because it shows violence, it’s graphic,” but it’s actually uploaded by a group that is trying to expose the violence.

SO SHE LEFT IT UP. THEN SHE SHOWED US THIS WORLD WAR TWO VIDEO.

STAHL:  I mean it’s totally historical footage that you would see on the History Channel.

BUT SHE TOOK IT DOWN!

STAHL:  Why?

SUSAN: There is this word down here that you’ll see, 1418.

1418 IS CODE USED BY WHITE SUPREMACISTS TO IDENTIFY ONE ANOTHER.

SUSAN: For every area we work with experts, and we know all the hand signals, the messaging, the flags, the songs, and so there’s quite a lot of context that goes into every single video to be able to understand what are they really trying to say with this video.

THE STRUGGLE FOR WOJCICKI IS POLICING THE SITE… WHILE KEEPING YOUTUBE AN OPEN PLATFORM. 

SUSAN@HALLWAY You can go too far and that can become censorship. And so we have been working really hard to figure out what’s the right way to balance responsibility with freedom of speech.

BUT THE PRIVATE SECTOR IS NOT LEGALLY BEHOLDEN TO THE FIRST AMENDMENT. 

STAHL: You’re not operating under some– freedom of speech mandate. You get to pick.

SUSAN: We do. But we think there’s a lot of benefit from being able to hear from groups and underrepresented groups that otherwise we never would have heard from.

[Lauren Southern: But with name calling of Nazi or propagandist…]

BUT THAT MEANS HEARING FROM PEOPLE WITH ODIOUS MESSAGES ABOUT GAYS, 

[Crowder: Mr. Lipsy Queer from Vox.] WOMEN [Naked Ape: Sex robot] AND IMMIGRANTS:

Nick Fuentes: I think the easiest way for Mexicans to not get shot and killed at Walmart —

WOJCICKI EXPLAINED THAT VIDEOS ARE ALLOWED AS LONG AS THEY DON’T CAUSE HARM: BUT HER DEFINITION OF “HARM” CAN SEEM NARROW.

SUSAN: So if you’re saying, “Don’t hire somebody because of their race,” that’s discrimination. And so that would be an example of something that would be a violation against our policies.

STAHL: But if you just said, “White people are superior” by itself, that’s okay.

SUSAN: And nothing else, yes.

BUT THAT IS HARMFUL IN THAT IT GIVES WHITE EXTREMISTS A PLATFORM TO INDOCTRINATE. 

SPENCER:  We want a flourishing, healthy white race.

AND WHAT ABOUT MEDICAL QUACKERY ON THE SITE? LIKE TURMERIC CAN REVERSE CANCER; BLEACH CURES AUTISM; VACCINES CAUSE AUTISM.

ONCE YOU WATCH ONE OF THESE, YOUTUBE’S ALGORITHMS MIGHT RECOMMEND YOU WATCH SIMILAR CONTENT. BUT NO MATTER HOW HARMFUL OR UNTRUTHFUL, YOUTUBE CAN’T BE HELD LIABLE FOR ANY CONTENT, DUE TO A LEGAL PROTECTION CALLED “SECTION 230.”

STAHL: The law under 230 does not hold you responsible for user-generated content. But in that you recommend things, sometimes 1,000 times, sometimes 5,000 times, shouldn’t you be held responsible for that material, because you recommend it?

SUSAN: Well, our systems wouldn’t work without recommending. And so if–

STAHL: I’m not saying don’t recommend. I’m just saying be responsible for when you recommend so many times.

SUSAN: If we were held liable for every single piece of content that we recommended, we would have to review it. That would mean there’d be a much smaller set of information that people would be finding. Much, much smaller.

SHE TOLD US THAT EARLIER THIS YEAR YOUTUBE STARTED RE-PROGRAMMING ITS ALGORITHMS IN THE US TO RECOMMEND QUESTIONABLE VIDEOS MUCH LESS… AND POINT USERS WHO SEARCH FOR THAT KIND OF MATERIAL TO AUTHORITATIVE SOURCES, LIKE NEWS CLIPS. WITH THESE CHANGES WOJCICKI SAYS THEY HAVE CUT DOWN THE AMOUNT OF TIME AMERICANS WATCH CONTROVERSIAL CONTENT BY 70 PERCENT.

STAHL: Would you be able to say to the public: we are confident we can police our site?

SUSAN: YouTube is always going to be different than something like traditional media where every single piece of content is produced and reviewed.  We have an open platform. But I know that I can make it better.  And that’s why I’m here.
