Welcome back to Day 2 of Robotics Week! If you missed it, check out my Tuesday review of Wired’s feature, in which an algorithm helps a science fiction writer write a short story.
Today I’m writing about a short report put out in March of this year by Francesco Marconi and Kourosh Houshmand for the Associated Press. The report is titled “The role of journalists in the era of algorithms: A guide to preparing newsrooms for humans and machines.” And if that subtitle didn’t get your attention, consider that the researchers wrote this report on behalf of one of the most automated newsrooms in the country. Working with AI firm Automated Insights, the AP has increased its output of articles like quarterly earnings reports and sports recaps by 12x.
The tone of the report, at times, seems optimistic in a professionally forced way, the way a longtime friend and co-worker might deliver news of a “reassignment” to you. “Cheer up!” it says at times, “humans can still do things!” Remember, these researchers are consultants, and therefore have to report with caution. But much of this report does chart a constructive way forward for writers.
More on that later. For now, here are some takeaways from the report:
- Now that machines not only consume and store information but produce it as well, new roles are necessary for editors and journalists.
There was a time when computers simply gathered info, but that time is long gone. Computers are now producing content, and that content needs human management.
Specifically, because algorithms feed on rules, carrying them out efficiently but without much imagination, human labor is needed to design the rules they follow and to properly categorize the data they produce.
Example: if an algorithm is designed to produce an article with violent content, a human editor needs to make sure it is tagged in a way that keeps it out of places where children can access it. Or, if an algorithm is assembling an article for children, it needs an editor to design its rules so that it gathers, say, colorful, humorous content that doesn’t include weapons or bare bodies, etc. The human editor is necessary to shape the sort of content the algorithm is able to produce. This isn’t writing, per se, but it is a kind of narrative design.
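To make that concrete, here is a minimal sketch of editor-designed rules in Python. Everything here is a hypothetical illustration (the keyword lists, the tag names, the function itself are mine, not the report's); the point is simply that a human writes the rules and the machine applies them.

```python
# Hypothetical illustration: a human editor authors these rule sets,
# and the algorithm mechanically applies them to generated content.
BLOCKED_FOR_KIDS = {"weapon", "gun", "knife", "nudity", "gore"}
KID_FRIENDLY_SIGNALS = {"colorful", "funny", "animals", "playful"}

def tag_article(text: str) -> list[str]:
    """Apply editor-designed rules to tag a piece of generated content."""
    words = set(text.lower().split())
    tags = []
    if words & BLOCKED_FOR_KIDS:
        tags.append("adult-only")      # keep it out of children's feeds
    elif words & KID_FRIENDLY_SIGNALS:
        tags.append("kid-friendly")
    return tags

print(tag_article("A funny story about colorful animals"))     # ['kid-friendly']
print(tag_article("Police recovered the weapon at the scene"))  # ['adult-only']
```

The interesting labor isn't in the twelve lines of code; it's in deciding what belongs in those two sets. That's the narrative-design work the report says stays human.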
A key point to note here is that, once again, the ability to relate imaginatively to rules (see my blogpost about the Wired article) is an important asset to the future of human writers. Instead of carrying out rules in blind obedience, humans can serve an irreplaceable function because they can envision and redesign the rules that make the algorithm do better work.
An additional, and very important, example as I close the first point: automation editors (yes, there is already one working at the AP) can even help fight the racism that can be baked into algorithm design.
- Automation frees up reporters to “create complex, impactful stories.”
That quote is from the report, and I’m aware that it sounds a little like spin. You won’t lose your jobs! You’ll just do more fun, important jobs!
And make no mistake: when there are 12x fewer reports for humans to write while traditional media approaches an advanced state of financial decomposition, some human beings will lose their current jobs.
However, remember Gutenberg. When he invented the printing press, it threatened a class of people for whom literacy was their only job security (priests, scribes, politicians, etc.). The upside was that it raised the bar of human achievement: people were pushed to accomplish more, because the new technology made certain tasks easier and faster. Hence the Reformation, hence the Enlightenment. Every new technology displaces the jobs of the old economy while slowly making new work possible.
The journalists who remain after automation will be the ones able to “create complex, impactful stories.” In the same way that having a bachelor’s degree will be meaningless in a decade or so, so will being able to crank out 10 reports by the end of the week. Machines can do it better. What to do, then? Stop writing like a machine!
Embrace tone, emotion, subjectivity—learn how to handle these matters really well on the page, and you will become irreplaceable.
Also: humor. Which is really important, because it’s so human. In the report, the authors mention sarcasm as a big problem for automated writing programs; algorithms simply can’t parse it.
Takeaway: you need to cultivate your sarcasm skills. Seriously.
I’m sure you’re so happy to hear that.
- Correcting biases is another important human function in the future automated newsroom.
What humans see as prejudice, an algorithm just sees as a pattern. Robots LOVE patterns. Because an algorithm feeds on its ability to pattern-detect, it has the tendency to amplify prejudice. An easy example of this: Fake News.
As with other kinds of lazy work: set an algorithm to serve a particular interest, then forget about it, and it will produce more and more biased material. The report puts the problem mildly, as a “heighten[ing]…existing points of view.” But let’s be honest: racism, sexism, and elitism would worsen if algorithms wrote stories without human direction.
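Here's a toy simulation of that feedback loop (my illustration, not the report's): a pattern-detector that always promotes whatever is already most common, where each promotion makes that pattern more common still.

```python
# Toy illustration of bias amplification: the "algorithm" below just
# detects the dominant pattern and publishes more of it, round after round.
from collections import Counter

def amplify(story_counts: dict[str, int], rounds: int) -> dict[str, int]:
    counts = Counter(story_counts)
    for _ in range(rounds):
        top, _ = counts.most_common(1)[0]  # the dominant "pattern"
        counts[top] += 1                   # publish more of the same
    return dict(counts)

# Start with a mild 60/40 skew and let it run unattended...
print(amplify({"viewpoint_a": 6, "viewpoint_b": 4}, rounds=10))
# ...viewpoint_a gets every single new story, and the gap only widens.
```

Nothing in that loop is malicious; it's just pattern-following with no one checking the output. That's the job the report carves out for humans.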
Robots don’t see color. In a world of disproportionate privilege, that’s not a good thing.
Human writers, on the other hand, can learn the nuances of language in a way that better represents particular human experience. Robots are not at all good with particularities, but humans have the ability to become more and more attuned to empathy and nuance.
The virtues that will outlast automation are design, empathy, nuance, and humor. Basically all of the virtues that make for a great friend or companion. Working hard for work’s sake doesn’t work anymore—let the robots do that, and get on to the trickier task of being human.