“I’m afraid robots will end up taking up too much space. We have to think about the loss of human nature and be frightened by it,” Noemi, a 21-year-old student engaged in politics, anxiously shared with me. It is one thing to collect facts; interpreting them and giving them meaning is another. Global concern about misinformation is at a high: according to the Digital News Report, 55% of people worry about being unable to identify what is real and what is true, and only 44% trust the news. With algorithms bringing more automatically generated stories into the world of journalism, those numbers might not improve.
The thought of being overruled by AI is increasingly making its way into our worst nightmares. Journalism is one of the latest victims of this trend, raising serious concerns about the news we are exposed to daily. Robojournalism, the use of software programs to generate content, is becoming common practice among journalistic institutions, yet very little knowledge about it is shared. According to research conducted by the author of this feature, based on 35 interviewees, only 34% of people interviewed knew of the existence of automatically generated stories, and only a small percentage could clearly define it. Such low awareness contrasts with the growth of the phenomenon, as many large players like Google have already invested heavily in this field. Google itself has put £622,000 into the Reporters and Data and Robots (RADAR) project in Britain, through the Press Association (PA), which has already started to produce computer-generated news stories. The trend is already embedded internationally: the French paper Le Monde used automated writing to help report French election results in 2015, and in 2016 the US election forecasting site Pollyvote generated stories about the daily election polls.
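At its simplest, the software behind robojournalism fills a story template with structured data, such as election results. A minimal illustrative sketch in Python follows; the template, names and figures are invented for illustration and are not taken from any real system:

```python
# Minimal sketch of template-based story generation, the simplest
# form of robojournalism: structured data in, prose out.
# All names and figures below are invented for illustration.

TEMPLATE = (
    "{winner} won the {district} seat with {share:.1f}% of the vote, "
    "beating {runner_up} by {margin:.1f} points."
)

def generate_story(result: dict) -> str:
    """Turn one structured election result into a sentence of copy."""
    margin = result["winner_share"] - result["runner_up_share"]
    return TEMPLATE.format(
        winner=result["winner"],
        district=result["district"],
        share=result["winner_share"],
        runner_up=result["runner_up"],
        margin=margin,
    )

if __name__ == "__main__":
    result = {
        "district": "Northtown",
        "winner": "A. Dupont",
        "winner_share": 52.4,
        "runner_up": "B. Martin",
        "runner_up_share": 41.1,
    }
    print(generate_story(result))
```

Real systems layer statistical analysis and varied phrasing on top, but the principle is the same: the machine writes nothing it is not handed as data.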
The fear of losing the human touch on serious matters raises doubts about what we are being told. Issues of writing quality and trust have been raised by several news readers, while others stumbled on issues of misrepresentation and filtering. The rise of the digital era has drastically amplified business competition and the pressure for efficiency, leading journalistic institutions to embark on this automated path.
The questions most often raised about automatically generated stories concern their ability to replace human journalistic skill in producing constructive and striking stories. In several cases where participants had access to both types of article, they generally preferred human-written news for quality and readability. In the research conducted by the author of this feature, 77% of interviewees expressed a preference for human journalism because of the writing qualities a human can bring to a story. The former social media and UGC editor at AP argued that Narrative Science, one of the established companies that has worked out a way of teaching machines how to write journalism, can efficiently turn data into stories but will never be winning Pulitzer Prizes. Automatically generated stories still lack persuasion, emotion or argumentation of any kind. Their inability to draw any proximity to a story kills the desire to keep reading, and feedback suggests that automated articles turn out repetitive and monotonous. Today, experts like Manyika estimate that only about 15% of a reporter’s job can be automated using the current level of artificial intelligence.
Automated content production is still at a preliminary stage, which leaves room for quality gains through technological innovation. Companies like Narrative Science are still working to perfect their systems so that descriptions start to read more like narrative. If automatically generated stories can win on textual accuracy, they could greatly improve the reader’s experience and perhaps ultimately reproduce human-written stories effectively. Hammond, the founder of Narrative Science, strongly dismissed scepticism towards such improvements as he fiercely stated that “In five years a computer programme will win a Pulitzer Prize – and I’ll be damned if it’s not our technology”.
The journalism industry has always had to deal with issues of trust. With the rise of social media and technology, several fake news scandals exploded, drastically hurting people’s trust in journalistic institutions. Yes, an algorithm, being a sequence of computational steps that transforms input into output, might be understood as more reliable than a human when it comes to silly mistakes. However, even clever robots sometimes misuse sets of data or interpret them differently, leaking fake news to the public. In 2017, the Los Angeles Times fell victim to its own automated system, which reported an earthquake nearly a century old as a present event. Data mis-transferred by the US Geological Survey led the automated system to publish the story, and the paper suffered heavily for it.
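The failure mode is easy to reproduce in miniature: an automated pipeline that trusts its feed will publish whatever it receives. The sketch below is hypothetical, not the LA Times’ actual system, and shows the kind of recency check that would have rejected a mis-dated, decades-old record:

```python
from datetime import datetime, timedelta

# Only auto-publish events reported within the last day.
MAX_AGE = timedelta(days=1)

def should_publish(event_time: datetime, now: datetime) -> bool:
    """Sanity-check an incoming event before auto-publishing.

    A pipeline without a check like this will happily report a
    mis-dated historical earthquake as breaking news. Rejects
    events that are too old or dated in the future.
    """
    age = now - event_time
    return timedelta(0) <= age <= MAX_AGE

now = datetime(2017, 6, 22)
print(should_publish(datetime(2017, 6, 21, 12), now))  # fresh event
print(should_publish(datetime(1925, 6, 29), now))      # century-old record
```

The point is not that such checks are hard to write, but that a fully automated pipeline only applies the checks its makers thought to include.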
A dominant fear is that automated news generation tools could also be used by propagandists to spread fake news for their own ends. Algorithms are far from impartial, as every technological system reflects the conscious and unconscious biases of its makers. Trolling and fake news have been at the centre of several recent controversies, and the introduction of automation in journalism could give such phenomena a new opportunity to expand exponentially. It is a concern shared by many, including Juliette Garside, an investigations correspondent at The Guardian, who shared with me that the simple idea of robojournalism makes her uncomfortable as it could “amplify the work of troll farms and fake news operations, allowing more fabricated stories to be produced.”
The fear of robots overruling human nature that Noemi shared with me has to be addressed urgently, as Hammond recently claimed that by 2025 close to 90% of the news read by the general public will be generated by computers. The urgency of the matter raises serious questions about readers’ enjoyment and trust, so the pressing question is now whether to accept a full merger of robots and humans, or, for the sake of our safety and experience, to make extensive use of them strictly for research purposes.