As the sun sets on another SXSW, I reflect on just how eventful it was, from its official programming to all of the activity surrounding it. What a time to have an industry conference that explores the intersection of technology with society and culture.
Day One, Friday. You could almost hear the oxygen being sucked out of the conference as attendees scrambled to figure out whether they would be paid the following week, or hurried to move their reserves from Silicon Valley Bank to the too-big-to-fail banks (if they’d have them). It was chaotic.
Despite the frenetic energy outside the conference sessions, and the low mood it brought with it, the discussions continued unabated, with founders, technologists and academics all speaking in some way about the future and the role technology will play in improving it, in accelerating its potential. I couldn’t help but wonder how many of these companies will still exist next year, who will make it through the next 12 months of fiscal drought, and how much of a hit innovation will take from this “near extinction” moment for SVB.
There is so much promise, and so much need, for technology innovation to help us with our grand challenges, the moonshots and the dreamers, all of which now find themselves in jeopardy. Cut it any which way you like: there will be an innovation slow-down, and a lag that follows. And it comes just when we may need this innovation the most, with a climate emergency to address.
What isn’t showing any signs of a slow-down is AI, and specifically generative AI. It really was the salted caramel of the show, in everything. There is a realisation now that much, much more of what we thought were uniquely human characteristics are in fact patterns, and can therefore be made mechanical: writing, language, art.
Most notable was Greg Brockman, co-founder of OpenAI, on the main stage talking about the technology, its potential and how much more there is to come. Taking a humble approach, he admitted they won’t get everything right the first time, but said they are adapting, learning and improving, with a particular focus on improving accuracy and rectifying bias. He called for proper regulation, and indeed invites it, but was clear that it needs to focus on harms rather than the means, otherwise this important technology may be stifled. It is a technology he believes will ultimately give us “a way out” of some of the grand challenges of our time, like climate change.
Kevin Kelly, a co-founder of Wired magazine, spoke of the age of cognition we are now entering and how we should think of these technologies as a resource of "unlimited personal interns" (UPIs) rather than just another technology. He also made it clear that the path forward won't be one tool to rule them all, but rather a suite of intelligent tools, each going deep into one aspect of intelligence. The worker of tomorrow will be a master of multiple intelligence tools, able to operate across a much wider spectrum of lines of business and expertise.
This point was well made in the session that followed, by Ian Beacraft of Signal & Cipher, who argued that the exciting opportunity for people in the future will be to stay broad and general rather than go overly deep. The intelligence tools will handle depth so that we don't need to; the art will be having a broad enough understanding to know which tool to use when. We are entering, as he proclaimed, the age of the creative generalist.
What was apparent, and not just because I work for Edelman, was just how much of the macro-conversation was about trust. The SVB fallout further, and in my opinion unfairly, cast doubt on trust in technology; the run on the bank was largely about confidence and a loss of trust. Trust that tech will save us, and the faith people want to place in the innovations meant to tackle climate change or the underpopulation and ageing-population crises we face. Trust in AI, and that these companies and the tools they’re creating will ultimately be a force for good rather than a source of new harms. The speed at which they’re moving creates its own lack of trust, as people worry this is all happening too fast. And concerns about the quality and volume of synthetic media being created, and whether this will simply accelerate the liar’s dividend, leading to a world where the default is to trust nothing.
To succeed, technology companies can no longer rely on blind trust. They must actively court trust, build it and protect it.
There is too much at stake not to.
Justin Westcott is Edelman’s Global Technology Chair.