
Hey, Mark Zuckerberg: A software CEO describes how Facebook can fix its ugly ad problem

Advertising Software CEO Justin Choi says Facebook must more rigorously disclose the sources of its information and more clearly convey the promotional nature of the content. Displaying brand logos is a good start – but more proactive disclosures are necessary.

Facebook CEO Mark Zuckerberg (center) leaves a meeting with Sen. John Thune (R., S.D.) on Capitol Hill in Washington, Monday, April 9, 2018. Zuckerberg testified Tuesday before a joint hearing of the Commerce and Judiciary Committees about the use of Facebook data to target American voters in the 2016 election. (Photo: Andrew Harnik)

Testimony provided to the Senate and House of Representatives commerce committees by Justin Choi, CEO of Nativo, an advertising software maker with news publisher clients and U.S. offices in Los Angeles, Chicago, and New York.

By Justin Choi

The Cambridge Analytica scandal is not merely a case of data abuse by a single political player on Facebook.

Although Facebook's recent admissions have exposed a number of vulnerabilities and questionable practices at the social media giant, the true concerns for the advertising industry – and indeed society itself – go much deeper. The real threat lies at the core of Facebook's business in the form of its main advertising product, which has played an alarming role in the spread of misinformation across the United States and around the world.

Facebook is under such intense scrutiny today for good reason. The platform provides a monetary incentive for the spread of fake news while serving, at the same time, as a primary source of news for the majority of Americans. And to this day, Facebook lacks an effective mechanism for curtailing the spread of such toxic misinformation in our public discourse.

What would such a mechanism look like? It would seem that this question – above all others – is the one we need to answer today.

Does Facebook bear responsibility for the content that it helps to spread? Yes. Does that make it a media company? Yes. We've already had those debates; what we really need to debate now is what practical steps Facebook should take to fight fake news.

The search for a solution begins with one essential observation: the ads Facebook makes its money from aren't just any ads. The units Facebook sells are native ads. That is, they are made to look and feel like the content around them, creating a seamless and non-interruptive experience for the user.

They are also targeted ads, directed with precision to specific users based on individualized profiles, a practice that Facebook pioneered, perfected, and popularized. The ability to target ads based on user preferences, and the fact that the ads render identically to everything around them, is what makes them so effective. That's made them an incredibly attractive prospect for benign marketers and sinister propagandists alike. It is, after all, what attracted Cambridge Analytica to Facebook in the first place.

But Facebook isn't the only company selling such native ad units to marketers. Most of the leading news publishers do it, too.

The New York Times, Wall Street Journal, and Washington Post make money from them; so does Hearst, Tronc, The Atlantic – the list goes on. In the age of ad blocking, where consumers respond negatively and decisively to ads that impinge on the user experience, native has become an important — if not critical — revenue source for the rest of the media ecosystem. It is one of the last viable revenue sources propping up the publications driving public discourse and producing journalism in the public interest.

In other words, native ads are a critical support line to the fourth estate. The primary news publications driving public discourse will rely on them more and more as interruptive formats become less and less viable.

That makes the need to oversee the implementation of native even more critical. Native requires a higher level of oversight and disclosure than perhaps any other ad unit. That's because native can blur the line between advertising and editorial. If proper disclosures are not made, and if the advertisers themselves are not vetted, reckless abuse of native ads can seriously undermine the editorial product.

Publishers like The New York Times and The Wall Street Journal understand this risk – and they've dealt with it better than Facebook.

I write this opinion as a member of that value chain. Nativo, the company that I founded and where I currently serve as CEO, is a platform where qualified advertisers buy native advertising inventory from such premium publishers. More than 400 publishers on our platform open up their editorial feed – and editorial integrity – to the demand of more than 1,000 advertisers, who place hundreds of thousands of ads per month that render similarly to Facebook ads.

The difference is that they are all subjected to a much more rigorous vetting process than what occurs on Facebook. I believe that the protocols for how the Nativo platform reviews and processes native advertising provide a clear direction for where Facebook's platform needs to go. Those include:

Vetting and verifying the source – Every transaction begins with humans on both sides, who can confirm the identity of the advertiser.

Quality control – We review each ad to ensure its claims are substantiated and there is no false or misleading messaging.

Disclosure – We require that ads are labeled as sponsored content, meeting and exceeding the standards set forth by the Federal Trade Commission.
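To make the shape of that review concrete, here is a minimal sketch, in Python, of a three-step gate like the one described above. It is purely illustrative: the AdSubmission record, the review_ad function, and the accepted disclosure labels are hypothetical stand-ins, not Nativo's actual system.

    # Hypothetical sketch of the three-step native ad review described above.
    # All names here are illustrative; this is not Nativo's actual API.
    from dataclasses import dataclass

    ACCEPTED_LABELS = {"sponsored content", "paid post", "advertisement"}

    @dataclass
    class AdSubmission:
        advertiser: str
        identity_verified: bool     # confirmed by humans on both sides of the transaction
        claims_substantiated: bool  # reviewed for false or misleading messaging
        disclosure_label: str       # e.g. "Sponsored Content"

    def review_ad(ad: AdSubmission) -> list:
        """Return the reasons an ad fails review; an empty list means it passes."""
        problems = []
        # 1. Vetting and verifying the source
        if not ad.identity_verified:
            problems.append("advertiser identity not verified")
        # 2. Quality control
        if not ad.claims_substantiated:
            problems.append("claims not substantiated")
        # 3. Disclosure, in the spirit of FTC labeling guidance
        if ad.disclosure_label.strip().lower() not in ACCEPTED_LABELS:
            problems.append("missing or inadequate sponsorship disclosure")
        return problems

    ad = AdSubmission(
        advertiser="Example Brand",
        identity_verified=True,
        claims_substantiated=True,
        disclosure_label="Sponsored Content",
    )
    issues = review_ad(ad)
    print("approved" if not issues else "rejected: " + "; ".join(issues))

The point of the sketch is that every check is a hard gate applied before an ad runs, not a cleanup applied after it spreads.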

Compared with these stringent standards of oversight, the measures Facebook has undertaken to date are not nearly enough. A thousand ad checkers might sound like a lot, but it's a pittance compared with Facebook's overall ad load. It's an afterthought, at best. It falls well short of the measures put in place by publications like The New York Times and The Wall Street Journal for their own native advertising.

Those publications take their editorial responsibility seriously. They're joined by Nativo and many of the other technology players that power their native offering.

Effective policing goes beyond screening for bad content. Facebook must more rigorously disclose the source of the information and more clearly convey the promotional nature of the content. Displaying brand logos is a good start – but more proactive disclosures are necessary.

Many in tech hope that AI [artificial intelligence software] can one day effectively exercise this type of editorial judgment at scale. I don't think anyone seriously believes it can be done that way now. For the time being, the important interventions Facebook needs can be made only through human oversight and direct relationships between the advertiser and the platform, mediated by active human judgment.

At the moment, native inventory will have to do something retro: rely on human intelligence and judgment.

This is how media worked when it was still a profitable business: there were direct relationships with advertisers. Native advertising adds a new dimension by blurring the line between advertising and editorial. That makes disclosure and direct relationships more important than ever.

Facebook needs to take greater responsibility. Simply being reactive to scandals such as the one sparked by Cambridge Analytica is not enough. The way premium publishers handle native advertising outside of Facebook provides a rubric for how Facebook can start to repair the damage.

Doing the right thing might cut into Facebook's [profit] margin. But if Facebook understands how much it and publishers depend on each other for revenue, and how it has become a key conduit for fake news in our society through its business practices, it will make the smart choice.