This year’s Edelman Trust Barometer indicates that we are living in a world in which distrust has become the default: 59 percent of people distrust until provided with evidence that something is trustworthy. People lack the willingness to consider new ideas, compromise with others and trust that which is not homegrown, and that reluctance erodes our collective ability to address complex challenges. The report shows an 8-point net decline in how much people trust those from other countries and a 2-point net decline in trust in people in other states or regions of the same country. Conversely, people feel closer bonds with their coworkers (up 12 points) and neighbors (up 7 points).

Essentially, people have reverted to a hyper-partisan framework for decision-making in which information and ideas are subjected to a values-based “are you one of us” litmus test before they are allowed in and considered. But while people’s decision-making has become more simplistic, the machines that define our world (what we read, who we interact with and which decision paths are available to us) have grown ever more complex and sophisticated. The irony is that these technological advances have largely served to reinforce and exacerbate the values-based orthodoxies that threaten our future.

There are two forces at work here. The first is that there is far too much information available; people cannot possibly process it all. Humans, who are well known to be cognitive misers even on their best days, need tools and shortcuts to get to the information they need to make decisions. This is where machines come in. They are indispensable tools for navigating information in the modern world. They collect, curate and serve up much of the information we end up consuming, sifting through terabytes so that we only have to navigate megabytes.

The second force plays off the first. Machines do not collect, curate and serve up information randomly. They are programmed to do so in ways that maximize audience size and time of engagement. This is done by telling people what they want to hear, reinforcing their beliefs, and exploiting their fears and anxieties, all things that encourage rather than disrupt hyper-partisanship and mutual distrust. The consequences are visible: 46 percent of people see media as a divisive force, an 11-point margin over those who see it as unifying.
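
To make that mechanism concrete, here is a minimal, hypothetical sketch of a ranker that optimizes only for predicted engagement, where engagement is modeled, purely for illustration, as agreement with the user’s existing stance. The names, fields and scoring model are assumptions for the sake of the example, not a description of any real platform’s system.

```python
# Illustrative sketch only: a toy engagement-driven ranker. The assumption that
# predicted engagement rises with agreement between an item's stance and the
# user's prior beliefs is a simplification for illustration, not a real model.
from dataclasses import dataclass


@dataclass
class Item:
    title: str
    stance: float  # -1.0 .. 1.0, position on some contested issue


def predicted_engagement(item: Item, user_stance: float) -> float:
    """Toy model: the closer an item's stance is to the user's,
    the more engagement we predict (confirmation gets the click)."""
    return 1.0 - abs(item.stance - user_stance) / 2.0


def build_feed(items: list[Item], user_stance: float) -> list[Item]:
    # Ranking purely by predicted engagement pushes the feed toward
    # items that mirror the user's existing views.
    return sorted(items, key=lambda i: predicted_engagement(i, user_stance), reverse=True)


if __name__ == "__main__":
    items = [Item("strongly pro", 0.9), Item("neutral explainer", 0.0), Item("strongly anti", -0.9)]
    for item in build_feed(items, user_stance=0.8):
        print(item.title)  # the like-minded item tops the feed
```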

With 42 percent of people saying they want business to do more about information quality, businesses have a unique opportunity and responsibility to reverse the erosion of trust. The first step is to incorporate societal benefit and an ethic of fairness into the algorithms that machines use to manage the flow of information to individuals. That means tempering the goal of maximizing audience size and engagement with an eye toward a greater good: a society capable of having constructive and civil debates, one in which veracity, rather than adherence to tribal values, determines what people are willing to believe is true. Machines, however close they may be to the singularity, are not going to make these changes on their own.
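
What “tempering” could mean in practice is easiest to see in a sketch. The toy re-ranker below blends a predicted engagement score with a separate societal-benefit signal (for example, accuracy or viewpoint diversity) under an explicit weight; every field, score and weight here is an illustrative assumption, not a description of Edelman’s or any platform’s actual algorithm.

```python
# Illustrative sketch only: blending engagement with a societal-benefit signal.
# All fields, scores and weights are hypothetical assumptions for illustration.
from dataclasses import dataclass


@dataclass
class ContentItem:
    item_id: str
    engagement_score: float  # predicted clicks / watch time, normalized to 0..1
    benefit_score: float     # e.g., accuracy or viewpoint-diversity signal, 0..1


def rank_feed(items: list[ContentItem], benefit_weight: float = 0.4) -> list[ContentItem]:
    """Rank by a blended objective instead of engagement alone:
    blended = (1 - w) * engagement + w * societal benefit."""
    def blended(item: ContentItem) -> float:
        return (1 - benefit_weight) * item.engagement_score + benefit_weight * item.benefit_score

    return sorted(items, key=blended, reverse=True)


if __name__ == "__main__":
    feed = [
        ContentItem("outrage-bait", engagement_score=0.95, benefit_score=0.10),
        ContentItem("balanced-report", engagement_score=0.70, benefit_score=0.85),
        ContentItem("local-explainer", engagement_score=0.55, benefit_score=0.75),
    ]
    for item in rank_feed(feed):
        print(item.item_id)  # the balanced item outranks the pure outrage-bait
```

The point of the sketch is not the arithmetic but the design choice: the trade-off between engagement and societal benefit becomes an explicit, auditable parameter rather than an unstated side effect of maximizing attention.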

Yannis Kotziagkiaouridis is the Global Chief Data & AI Officer of Edelman Data & Intelligence.