Tim Mullaney: It’s up to us to separate the truth from the lies on Facebook


House Intelligence Committee Exhibit

This 2016 ad on Facebook was created by and paid for by the Russian government.

Few American companies like President Donald Trump less than Facebook, and very few, however unwittingly, saw their products do more to get him elected.

Two-thirds of Facebook-related campaign donations went to Democrats in 2016. And yet, on election night, Russian intelligence agents popped bubbly in St. Petersburg and gloated over how they had used Facebook to Make America Great Again.

Now Facebook is under fire for accepting the Donald Trump campaign ad that CNN refused to air, citing its unsubstantiated and apparently false claims about Joe Biden and Ukraine. Naturally, Facebook snapped it right up, prompting charges that it hasn’t learned 2016’s lessons and is setting itself up to be used once again.

So, the dilemma: The Internet, and especially social media, wants to be free. The whole beauty of Facebook is that it’s published by essentially everyone, saying anything they want. The downside is that it’s published by essentially everyone, a large number (and small percentage) of whom want to spread disinformation, libel anyone they disagree with, and even undermine democracy itself.

What should Facebook do? What, more importantly, should you do?

Since I’m not a politician, I’ll talk real. Help’s not on the way. Facebook won’t be regulated by Donald Trump and Republicans, both because they don’t control the House and because they won’t crack down on foreign interference in U.S. elections (Trump’s impending impeachment might be a clue here).

Democrats like Elizabeth Warren are so furious about 2016 that their only response to Facebook is to break it up. Which has nothing to do with anything, really.

So, what can you do?

First, you can think.

The most striking thing about most troll-written and bot-generated content is that it’s amateurish. Much ink is spilled about sophisticated pro-Trump bot armies, but most of their output makes the average e-mail claiming you can have millions of Nigerian naira, if you’ll only send thousands of U.S. greenbacks for “expenses,” look pretty smart.

The moral: If something looks too “good” to be true, it likely is.

Especially if “news” comes from some off-brand place, consider waiting for it to bubble up through outlets that report before they publish. If it’s on Facebook, that’s nice; so is a lot of junk. Trump’s ad is like this: a set of lies the press has already covered.

We forget so much about 2016, in our partisan fury. Like, we knew about the Russian interference before the election. The secret was out. We knew that much of what was online about Clinton was garbage. If we believed it, that’s not on Facebook — it’s on us.

Either way, the election came down to 78,000 votes in three states. It never should have been close enough for chicanery to make a difference (if it did, which is in dispute).

That’s on us (and on former FBI chief Jim Comey’s stop-and-start investigation of Clinton’s e-mails). And it won’t be next time.

As for Facebook, sure, it should change. A little.

The core of the company’s platform — letting regular people say what they want on their own pages — won’t change much. And that’s the vast majority of what’s on Facebook.

The company has thousands of people reviewing posts every day, which will continue, but reviews are complaint-driven, subjective and scattershot. We all know that content one person doesn’t like will turn up on another person’s page. The answer is to argue with it, or to ignore it. That’s up to each user.

But screening ads is a far more manageable task.

If Facebook can collect money for an ad, it can devote the effort needed to know what’s in it, especially for political campaigns, whose public significance is greater than a soap commercial’s.

Facebook’s political advertising business is small enough that the company has considered dropping it altogether. It can hold itself to common-sense, voluntary standards of accuracy and fairness, as broadcast networks do and as it failed to do in this case. Or it can decide it should drop political ads after all.

Mostly, controlling the Facebook effect is up to the audience.

A sprawling community will have many voices, some of them wrong, and it’s up to us to know the difference. We had dumb, dishonest and gas-baggy neighbors long before we had Facebook.

But we either trust the marketplace of ideas or we don’t. If we don’t, Putin’s work is already done.