Humans are hypersocial animals; as soon as we invent something new, from signal fires to electricity, we use it to communicate. Users of the very first computers would leave notes on the system for friends to read; today, Internet users spend on average over two hours per day using social media. As a cognitive scientist, I’m interested in how we use online tools to communicate, adjust our views, and make decisions; and I’m concerned about the effects of for-profit social media on conversation and debate. My generation is the last to have grown up writing letters - handwritten, meaningful messages from friends - and for-profit social media is psychologically very different.

In October 2018 three academics, concerned about WhatsApp’s influence on the Brazilian election, called for the Facebook-owned company to make it harder to share messages. Instant sharing means that an individual can reach many more people and that events, rumours and viewpoints can go viral in a few hours, sweeping across a country as a wave of copies is made. Importantly, the reader has no sense that the sender has taken the time to craft a communication; sharing is not an act of expression. Oversharing also swamps readers with the cognitive demand of reading hundreds of low-effort posts per day. Luckily for big social media, there is a solution at hand: the algorithm.

Facebook, Snapchat and Twitter originally showed new posts in chronological order. Today, they use news-feed algorithms to select the posts you see. The details of how these algorithms work and what they prioritise are murky, but two things are certain: they collect huge amounts of data on our behaviour, and they aim to maximise the time we spend logged in - not to give us the most interesting or meaningful material. If we argue with someone over an offensive post, the algorithm may counterproductively show us more posts like it. If you turn Facebook’s algorithm off, it swiftly switches itself back on.

The tools we use to communicate online play a huge role in our lives: they help us choose our friends, fix our political and moral beliefs, and construct our personalities. I study attention, the set of brain processes which decide what details of the outside world are important. When you notice a bright light, screen out a distracting noise, or select the most trustworthy panellist in a debate, your attention system is at work. In the modern world, online communication tools enable and support social trends; they host flurries of political discourse during elections; and they allow the viral spread of ideas, from memes to movements. They are the attention system of society.

Before sharing and before the news-feed algorithm, individual people played a huge role in society’s online attention system. Millions of small decisions by individuals combined to select the topics that would dominate the headlines. But we have given up the job. With our news feeds curated by algorithms, we no longer decide what to read. We can still choose which groups to subscribe to or which friends to follow, but we have completely abdicated the most basic and central decision - what messages are put in front of our eyes.

So, using social media is a very different experience from reading or writing a letter. There is no natural end to the experience; there is little incentive to put time and care into writing a message; and there is no control over what you read. For-profit social media, being free, is not a product. Nor is it a service; it does not give us the options we need to control it. Social media is an experience designed to attract and retain us - so that we can provide the attention which earns Facebook £20 per year per user in advertising revenue, and the behaviour data which is its most valuable asset.

The consequences of our society’s new attention system are extremely serious. Before the 2016 referendum on EU membership, over £2.7m was spent on often-misleading Facebook adverts by an unregulated consortium of lobbying organisations which conspired to break the Electoral Commission’s rules on data sharing. Shortly before the last US election, $70 million per month was spent on social media advertising by the Trump campaign, using voter data stolen through a Facebook loophole to predict personalities and target political ads.

I believe that the best way to highlight the harmful effects of oversharing and news-feed algorithms is to build a platform that supports conversation, discussion, and independent thought rather than prioritising sharing and screen time. A communication tool should not be an experience; it should be a true service, one which puts readers’ and writers’ needs first by giving them the tools to control what they read and who reads their conversations.

We all have a right to freedom of speech - but we don’t automatically have the right to broadcast. We need to think carefully about public groups - spaces which can go viral and take on a life of their own. A closed group can’t go viral or expand into a huge community. Public groups can connect you with like-minded people and show you interesting material, but when people disagree on topics close to their heart - human rights, economic policy - public groups descend into chaos or censorship. Their owners or controllers may be unclear; they must use moderators, whose rules are often unfair; and by suppressing conflicting opinions they encourage the development of filter bubbles. What we read online should be selected by individuals’ decisions, not by the moderators of anonymous groups or by algorithms trained for profit.

It is certainly easier to immerse yourself in the experience of for-profit social media. But I believe that conversation and debate should be more than a passive experience: they require effort, engagement and attention. Every day, we spend hours reading and conversing online. Just as we are careful with what we eat and drink, we should be careful with what we read, what we allow to influence us, and to whom we delegate the responsibility of choosing what is put in front of our eyes. The days of letter-writing may have passed, but the experience of reading a message carefully written for you, conveyed to you by a system whose only purpose is to support communication, should live on.