The CEOs We Need Now
From Zuckerberg to Dorsey, one thing all Tech Revolutionaries learn is that “Tear down the old system” might be a great rallying cry for building a company in the early days…but what do you do when you win?
Today with Chris Anderson (former Editor-in-Chief, WIRED) and James Currier (General Partner, NFX), we’re looking at the core problems underlying Facebook, Twitter, and many other social networks as tech continues tearing down old media:
- How could the government regulate social media when the thing they need to regulate – content – is happening in real-time, billions of times a day?
- Follow the algorithm, not the sideshow — and why what you’re seeing in the news is a distraction, and where the real solution lies.
- Why are several of today’s CEOs not making the transition from CEO to “Steward of Society,” as many of their predecessors did?
- Network Problems demand Network Solutions – How can we as technologists fix problems that satisfy society yet don’t cripple businesses?
The inconvenient truth is that Silicon Valley created this situation, and it’s going to take Silicon Valley to get us out of it. Unless the few of us who know how to build these networks embrace our civic duty to be part of the solution, it won’t just be technology that suffers — it will be society.
This Op-Ed was first published in Bloomberg Opinion on March 25, 2021, as “How to Provide Real Oversight of Social Media,” and is reprinted with permission.
As you watch yet another congressional hearing where social media CEOs awkwardly put on suits and ties to defend the indefensible to the uncomprehending, you couldn’t be blamed for feeling hopeless. Our long-standing policies for regulating traditional media have collapsed in the face of user-generated content, with networks of tens of millions of people creating it in real-time at no cost. And it’s not just members of Congress who are ineffectually shaking their fists at the spread of misinformation and extremism on social media platforms; the executives of these companies themselves also seem powerless to do much more than keep abuse to a dull roar, offending defenders of free speech and defenders of civil discourse in equal measure.
Most policy proposals to keep abuses of social media in check range from bad to worse. Breaking up Big Tech may appeal to antitrust lawyers, but a YouTube that’s no longer part of Google isn’t less likely to host anti-vaccination propaganda. Revoking Section 230 protections, which shield companies that host user content from legal liability for what those users post, would essentially kill those companies.
The usual Silicon Valley response to regulation is to offer self-regulation instead — only tech companies have the skills and speed to fix what they broke. They’re not wrong, but even Facebook Inc., Twitter Inc., and Google have struggled to find techniques that are both good at dampening the worst abuses at scale and still trusted enough to keep Congress off their backs.
We have a solution that resurrects a concept from the golden age of newspapers, back when they were the vaunted Fourth Estate that provided the necessary information, accountability and counterbalance to government that makes for a well-functioning democracy: the ombudsman.
Newspapers typically employed a quasi-independent ombudsman who served the broader interests of readers — of society, if you will. Inspired by similar roles in government, a newspaper ombudsman had two main functions: a reactive one (handling complaints from the public) and a proactive one (watching the organization from both inside and outside and flagging bad behavior).
Today, ombudsmen are an endangered species in the media world as newspapers lose their central role in the public discourse. The New York Times retired the role, known as the public editor, in 2017, and it was one of the last to do so. But it’s ripe for reinvention in the digital economy. If social media companies are the “networked Fourth Estate,” can they accept the independent oversight responsibilities that come with that status?
Facebook has nodded at this with its Oversight Board, whose globally diverse members include lawyers, activists and academics (not required for membership: having ever used Facebook). It is now more than two years old and has made decisions on just seven cases out of the more than 220,000 complaints that have been referred to it. Putting aside whether these were the right decisions (its members say they’re already frustrated by the binary “leave it up/take it down” nature of their charter), it’s clear that this board is too slow, too reactive, too opaque and too removed from the operations of the company to make an appreciable dent in Facebook’s acceptable-speech problem. (1)
Independent ombudsmen would be much better. First, they could be proactive, not just reactive to complaints. Second, because they would have access to a company’s engineers and technology, they could address root algorithmic causes rather than just consequences. Finally, because they would hold full-time positions with budgets for full-time staff, rather than sitting on a committee that meets quarterly, they could be fast and nimble, finding problematic patterns within hundreds of thousands of complaints. That beats counting on existing internal ethics and policy teams, which, lacking independence, risk conflicts of interest and operate with no requirement of transparency and no assurance of action.
Here’s how this could work:
- Once a social media company hits 25 million monthly active users in the U.S., congrats! Your success means that you’ve triggered the regulatory “public interest” threshold.
- Within 90 days, you must hire an independent board member who will serve as the people’s representative, aka the ombudsman. Call it “director of social governance.”
- The ombudsman, who would be hired for a set term (say, two years) and could not be fired during it, must be free of financial conflicts of interest.
- The ombudsman will have full access to the company and its technology, as any board member does. Unlike the other board members, this one issues public reports and recommendations, similar to the roles of inspectors general and the Government Accountability Office in government.
- Like traditional ombudsmen, this representative performs two key roles: (1) reviewing public complaints and (2) initiating investigations of structural root-cause problems in the company’s algorithmic designs and business model.
- Once the representative issues a public recommendation, the company must respond within 48 hours if the problem can be fixed with existing software, or within 30 business days if it requires new software development or algorithmic testing.
Let’s stop kidding ourselves that the old-world, hierarchical approaches to regulating media outlets will work with social media companies. These are networks; they require network solutions. Let’s augment them with network-style pieces so that they function better. These networks are important, and they are going to be with us in one form or another forever. We have to help them learn to self-manage, the way the old Fourth Estate used to.
1. In one case, the board members were confounded by the subtle racial politics of ethnic minorities in Myanmar, and the discussions at issue were in a language none of them spoke.