I never really bought into the whole influencing thing, until I realised… I had.
Waiting for chemo to do its thing, I started scrolling through Instagram and paused at a video that popped into my feed of a woman walking down an Instagram-perfect Parisian street. There was nothing to say it was Paris, but what could be nicer than letting my imagination go on a few hours’ ramble far away from the hospital. The whole image in my mind was unmistakeable. Probably the Left Bank, the smell of fresh coffee in the air, and the woman - jeans, loafers with thick socks, cashmere jumper and another flung over her shoulders - was someone who bought her food in local markets and her clothes in vintage shops, and spent hours in book shops. Everything about her was stylish, effortless, capable of breaking the rules yet unpretentious. A woman I could happily sit and chat to for hours. I pressed “follow”.
If spending the last decade in the world of science and technology has given me any insight, it’s into the remarkable scientific minds that are digging society out of some seriously deep holes. A group who spend their entire careers searching in the dark for the scientific needle in a haystack, often oblivious to the positive change they are creating for human beings.
There’s nothing like having cancer to make you grateful that scientific research and technology innovation is gathering pace.
But all that glitters is not gold, and there is a flip side as we move deeper into a future driven by data-intensive research and technology.
If you haven’t come across it already, I recommend a fascinating podcast by Tortoise Media, the slow news people, whom I first met when I spoke about some of the challenges facing the future of quantum technology at one of their newsroom events. The podcast is about Amber Heard, someone about whom, until the courtroom battle with Johnny Depp, I knew absolutely nothing.
During a train journey through the Sussex countryside, I listened to a futuristic-sounding horror story that threatened to undermine the independence of a legal process and shatter a person’s reputation until they were broken. It’s easy to think that the villains of the story were the bots manipulating the algorithms to attack. Tortoise examined more than a million tweets and found that half were automated attacks, or came from people hired to damage Amber Heard. But if the bots were the villains, who sent them, and why? There’s been plenty of speculation, and I’ll leave the evidence and theories to the Tortoise podcast, but the result was an attack on a scale rarely seen even on X.
It would be easy for us to dismiss this as a problem caused by authoritarian states, to be solved through geopolitical pressure, economic sanctions or, worse, by conflict. Amber Heard, however utterly repellent the damage done to her, would be viewed as loose change compared to a risk to national security. Or to fall into the traps that regulation would leave in its haste to control or moderate content on X, TikTok, Facebook and others. These are the challenges we must grapple with at a time when the bots are already out of the bag, and in the knowledge that not all technology innovation is bad. Far from it. It would be wrong if fear were to hold back the incredible societal advances in medical science, for example, that come through the use of artificial intelligence.
And while there are plenty of other good examples, the reality is that we are living in an age where the ability to analyse vast amounts of data goes well beyond scientific skill and straight into ethical questions about the future of society and how we live. Something that always surprised me, coming from a career in the legal profession and social justice into science and technology, was the infrequency in the latter of one word. Ethics.
I’m not suggesting that the legal profession or social justice organisations have a monopoly on ethics, but if ever there was a place today where the discussion is needed, top of my list would be deep in the core of education and professional development for scientists. A constant reminder of their role as scientists and citizens.
So when I placed it at the centrepiece of how a modern science organisation thinks about its collective and individual responsibility to society, I might have expected some resistance from the Board, or at least lengthy discussion. But it was quite the opposite: a reminder, perhaps, that the importance of asking complex ethical questions in group decision making sometimes needs nothing more than a firm nudge. I’m sure there are some topics here that we’ll return to.
For now, and as our community thinking here develops, I hope there is also something we can each do. Restricting our use of social media platforms so we are not feeding the beast with our data would be more than a tad hypocritical, particularly as we are all here on a platform with its own algorithms munching away in the background. It would also be incredibly dull - after all, the street turned out to be Paris, and the woman an American former model and owner of a small Parisian shop, who is every bit as lovely in person as her Instagram image.
What we can do is remind ourselves, as citizens, that the bots can only take our freedom of thought and expression if we let them. Here, I hope we can create a place where we actively encourage each other to listen to other people’s opinions, take the time to reflect, know it’s ok to agree and disagree, hold each other and the technology platforms to account, and share our own thoughts. Slowing the pace right down and benefitting from the immense value that comes from hearing different opinions.
Technology has the ability to far outpace the human mind. Human beings still have choice. Choose wisely.
Let’s see where this takes us.