By Bob Gaydos
Algorithms are cool. I get it. I mean, I get that they’re cool, not how they work. I like to think that, if I had to, I could probably work really hard to understand them, but I dropped out of engineering school to do this. No regrets.
In fact, writing about life in all its complexities has given me an appreciation for what people -- real people, not some numbers-crunched algorithm people -- have to deal with on a daily basis. It has exposed me to the value of compassion, compromise and common sense.
Our universal dictionary, Wikipedia, defines an algorithm as “an unambiguous specification of how to solve a class of problems. Algorithms can perform calculation, data processing and automated reasoning tasks.”
But they can’t, obviously, do ambiguous.
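For the curious, here is roughly what an “unambiguous specification” looks like in practice. This is a toy Python sketch of my own -- not Facebook’s actual code, and the word limit is invented -- but it makes the point: the rule is perfectly clear, and perfectly blind to anything it wasn’t told to count.

# A toy, made-up ad-review rule -- not Facebook's real code.
# It is unambiguous: count the words, apply a hard cutoff.
def review_ad(ad_text, word_limit=90):
    if len(ad_text.split()) > word_limit:
        return "Rejected: too much copy"
    return "Approved"

# It can tell you an ad is wordy. It cannot tell you the ad is
# a troll in Moscow stirring up the American heartland.
print(review_ad("Protect your Second Amendment rights! " * 20))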
I’m thinking about algorithms because Facebook, an Internet empire built on them, recently said it would hire 1,000 people to review ads. The move came in response to the embarrassing revelation that users’ news feeds during the 2016 U.S. presidential election were awash in political ads run by Russians, who were undoubtedly using algorithms of their own to target various groups and influence the outcome. Facebook said the Russians bought about $100,000 worth of ads -- paid for in rubles -- but apparently the social media giant’s algorithms detected no ambiguity afoot when Russians argued for protecting Americans’ Second Amendment rights or stirred up anti-gay feelings, not in Moscow, but in the American heartland.
Congress is investigating. That’s good. It should do something this year. But Facebook has more than a Russia problem. It has become the major source of news for millions of Americans, yet its news feeds have been shown to be awash in fake news. Lots of really fake news, not Trump “fake news,” which is real news.
Facebook -- actually Mark Zuckerberg -- is talking about becoming a more responsible source of reliable news, hiring “content moderators” to review, well, content, and bringing on a lot of additional people to look out for violent content on the site. Swell.
If you will permit me a self-serving observation, he’s talking about hiring people to exercise judgment over what appears publicly on Facebook because: (1) algorithms can’t think or feel like people and (2) this is how responsible newspapers have operated forever. Just saying.
In the interest of full disclosure, I’ll also say I have had my own experiences with Facebook’s algorithms. Recently, I received an e-mail telling me that an ad I wanted to run, boosting a column on a Facebook page I administer, had been rejected because it had too much copy. It didn’t say the copy was boring or poorly written or even offensive. Just too much of it.
OK, I’ve had editors tell me the same thing, but I was also never prepared to give an editor ten bucks just to run the column. Oh yeah, the ad in question was proposed in July. I got the rejection e-mail on Halloween.
Then there’s the friendly way Facebook greets me every day with news of the weather in Phillipsport. “Rain is in the forecast today, Robert.” Thank you. If I lived in Phillipsport it would matter a lot more, but it’s a half-hour drive away, there’s a big mountain range between us, and my page unambiguously says where I live. Can’t the algorithm read?
But the incident that really convinced me that Facebook had an algorithm problem was its response to a complaint I filed regarding a post that was being sarcastic about the dotard-in-chief. I am guilty as charged of leveling (much-deserved) sarcasm at the Trump, but this cartoon had him in a coffin with a bystander saying to Melania, “Sorry about the assassination, Mrs. Trump, but he knew what he signed up for.”
As a “content moderator” for newspapers for several decades, I would never have let such a tasteless, provocative, potentially dangerous item be published. I told Facebook the same thing. I said they should delete it. It encouraged violence at a violent time in our history.
The algorithm replied that the post did not violate Facebook’s standard of, I don’t know: Acceptability? Appropriateness? Decency? Who sets this pathetic standard?
I use Facebook a lot. It has many wonderful benefits; it’s the best way there is to connect people with people. But “automated reasoning” is not a substitute for good old, gut-instinct common sense. Maybe people cost a little more than algorithms, but I think Zuck can afford it, and there are a lot of laid-off editors looking for work. If it’s not fake news that he’s serious about running for president some day, he’ll be glad he did it.
I’m also curious to know what Facebook says if I decide I want to pay to boost this post. I wonder if they’ll let me run a picture of Zuck. Can I even call him Zuck?
Stay tuned.
bobgaydos.blogspot.com