AI startup seeks to create unbiased news by combining human intuition with algorithm-based research
Knowhere attempts to address biased news by using AI to create news that is unbiased or shows which way a story leans upfront. Monster Ztudio/ShutterStock

It’s hard to trust the news in 2018. Readers depend on journalists for unbiased reporting. If it’s not slanted local news coloring people’s opinions, it’s social networks serving exactly the kind of news they think people want to see.

The AI startup Knowhere says it knows how to disrupt news: by letting readers choose their preferred political leaning while reading an article, or offering a computer-assembled “impartial” view.

“We see it as a partnership between people and machines,” Nathaniel Barling, Knowhere’s co-founder and CEO, said in a phone interview. “Machines are good at analyzing vast amounts of data quickly and unemotionally, but humans are better at editorial decisions like fact-checking, narrative-building and general common sense,” he said.

Barling said the software that he, Dylan Rhodes and Alexandre Elkrief built scours the web for information on topics the Knowhere News site seeks to cover. Each day, the software looks at which stories were shared the most and which publications covered them. The algorithm can do things many news-writing humans who aren’t data journalists can’t, like quickly examining sources in multiple languages. Unfortunately, the computers can’t do everything real journalists can.

Once the computers rank which stories are worth covering, humans step in. “Story prioritization is then reviewed by our editorial team who make the final call,” Barling said. The founder admits bias can certainly creep in at this step.

The algorithm looks at the chosen stories and can spit out a draft, which, according to Barling, is then checked by human editors who hail from the news industry. The service has seven people working on the tech side and nine people on the editorial news side, the CEO told Mic.
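Knowhere hasn’t published its code, but the pipeline the founders describe — rank candidate stories by how widely they were shared and covered, then hand the shortlist to human editors — can be sketched in a few lines of Python. Everything here (the `Story` fields, the scoring formula, the sample numbers) is an illustrative assumption, not Knowhere’s actual system.

```python
from dataclasses import dataclass


@dataclass
class Story:
    topic: str
    shares: int        # total social shares observed for the story
    outlets: set       # publications that covered it


def prioritize(stories, top_n=5):
    """Rank candidate stories for editorial review (illustrative scoring).

    Weights breadth of coverage alongside raw share counts, since a story
    covered by many outlets is a stronger signal than one viral post.
    """
    def score(s):
        return s.shares + 1000 * len(s.outlets)
    return sorted(stories, key=score, reverse=True)[:top_n]


candidates = [
    Story("census citizenship question", 12000, {"AP", "Reuters", "Breitbart"}),
    Story("local zoning vote", 300, {"Town Gazette"}),
]
ranked = prioritize(candidates)
print([s.topic for s in ranked])  # most newsworthy candidates first
```

In the real system this ranking is only a suggestion — as Barling notes above, the editorial team makes the final call on what gets covered.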

The result is a news story like this, which Knowhere marks at the top with “Impartial.”

An impartial story is marked as such at the top of the page. Knowhere

When a story may contain bias, Knowhere offers a choice at the top to see the left-leaning, right-leaning or impartial version of the article. The impartial headline reads “US to add citizenship question to 2020 census,” while the left-leaning headline says “California sues Trump administration over census citizenship question.” When switched to the right-leaning view, the headline changes to “Liberals object to inclusion of citizenship question on 2020 census.”

In some articles, the Knowhere site allows you to choose which political leaning you’d prefer to see: Left, Right or Impartial. Knowhere

The left-leaning version of the story quotes the National Association of Latino Elected and Appointed Officials Education Fund, saying that the addition of a citizenship question “would have catastrophic consequences for Latinos and all Americans.” Meanwhile the right-leaning article sources Breitbart News to say “the move will result in a more accurate estimate of the illegal alien population and better unemployment data.”

Software is biased by those who create it, but Knowhere thinks it has the answer

It’s well noted that news media affects voting and that the news itself is biased. As Pew Research notes, people’s faith in news media can affect other things like “trust in one’s national government and a sense that the economy is doing well.”

Knowhere’s real value may stem from its ability to present both sides at once, with an impartial version tossed in to boot. But artificial intelligence tends to be as biased as the humans who mold it. The same is true for algorithms that gather news, a problem Knowhere says it addressed early. “To train the algorithms we started by labeling stories ourselves,” Barling said. “We soon realized that, if we couldn’t tell a story’s political leaning, we would look at the publication name or the author. To fix this, we looked at just the story — not the byline or name of the source.”
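Barling’s fix — judging a story’s leaning from the text alone, with the byline and publication stripped out — might look something like the sketch below. The field names and the cleanup patterns are hypothetical; this only illustrates the idea of removing source-identifying signals before labeling or training.

```python
import re


def strip_identifying_fields(article: dict) -> str:
    """Return only the body text, dropping byline and outlet cues so a
    leaning label can't shortcut on the source's reputation.
    (Hypothetical field names and patterns, for illustration only.)
    """
    text = article["body"]
    # Remove a leading "By <name>" credit line, if present.
    text = re.sub(r"^By [A-Z][\w. -]+\n", "", text)
    # Remove wire-service tags like "(AP)" or "(Reuters)".
    text = re.sub(r"\((?:AP|Reuters)\)\s*", "", text)
    return text


article = {
    "outlet": "Example Daily",
    "byline": "By Jane Doe",
    "body": "By Jane Doe\n(AP) Lawmakers debated the measure on Tuesday.",
}
print(strip_identifying_fields(article))
# → Lawmakers debated the measure on Tuesday.
```

The point of the design is that whatever model or human labeler comes next sees only the prose, not the masthead — the same discipline Barling says his team imposed on itself.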

The prevalence of fake news and AI created to mislead voters is enough to make anyone want to give up on news, but Knowhere sees it as an opportunity. “Facebook showed us that we can’t trust the current way, it just doesn’t fit in the modern world,” Barling said. “Social media uses news algorithms to optimize capturing your attention, but AI is better used to optimize telling the truth.”