If the first online news story about a recent Los Angeles earthquake seemed to appear extremely fast and feel slightly metallic, it’s because the Los Angeles Times report was written by a robot.
Developed by Los Angeles Times database producer Ken Schwencke, an algorithm called Quakebot automatically retrieves data from the U.S. Geological Survey's Earthquake Notification Service on earthquakes of magnitude 3.0 and above. The bot translates the data into a text report, adds a map, writes a headline and pings Schwencke, who reviews the story and authorizes its publication under his byline. For the magnitude 4.4 earthquake on March 17, this process took all of 8 minutes.
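The core of this approach is simpler than it sounds: slot structured data into a fixed sentence template. Here is a minimal sketch of that pattern in Python; the field names and wording are illustrative assumptions, not the L.A. Times' actual schema or copy.

```python
def quake_report(quake):
    """Render a one-paragraph earthquake report from a structured record.

    The keys (mag, when, dist, place, depth) are hypothetical field
    names standing in for whatever a feed like the USGS notification
    service actually provides.
    """
    return (
        "A magnitude {mag:.1f} earthquake was reported {when} "
        "{dist} miles from {place}, according to the U.S. Geological "
        "Survey. The temblor occurred at a depth of {depth:.1f} miles."
    ).format(**quake)

# Example record, loosely modeled on the March 17 quake described above.
report = quake_report({
    "mag": 4.4,
    "when": "Monday morning",
    "dist": 6,
    "place": "Westwood, California",
    "depth": 5.0,
})
print(report)
```

A real bot would sit between a data feed and an editor: fetch the record, render the template, and queue the draft for human review before publication, which is exactly the gate Schwencke describes.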
The L.A. Times has other bots gathering data and writing stories, or at least pieces of them. One analyzes data on every homicide committed in the Los Angeles area and writes the opening for the paper's Homicide Report. Another scans crime and arrest reports and alerts journalists to high-profile names and potentially newsworthy stories.
News organizations that don't want to write their own algorithms, as the L.A. Times did, can use software developed by companies like Narrative Science, which offers a bot that writes short reports on sporting events.
Should journalists be nervous about robo-journalists? Not yet. The bots can gather data and organize it into readable form by following a specific set of rules, but their stories still have to be checked by humans. Consider the 120 research papers published by the Institute of Electrical and Electronics Engineers and Springer that were later determined to be fakes generated by an algorithm written by MIT students.
What the bots can't do is interview witnesses or experts, relate stories to broader issues, or form opinions. As a result, major news organizations view them as tools for reporters, not as replacements, and have not made a major commitment to them.