Melis Senova
Design leadership in an AI-enabled world.
If you are a design leader and haven’t yet watched this video from the Nobel Prize Summit 2023 by Tristan Harris, please do. In fact, if you are a leader of any type, watch it.
It is essential viewing for anyone who is engaged in the conception, design and deployment of digital products, services and experiences.
Tristan starts by quoting one of my favourite authors, Edward O. Wilson, who said this:
The real problem facing humanity is that we have Palaeolithic brains and emotions, medieval institutions and God-like technology.
Why do I think this is essential viewing? Because this 15-minute video touches on the three areas I believe all design leaders should be thinking about and acting upon.
Question 1: How do we honour our humanity?
This question will be the most familiar to human-centred designers, although it takes on a different flavour when we consider it in the context of humane technology and AI.
Tristan questions how well we did from the first contact with AI via social media adoption. These platforms, features, filters and share buttons were all conceived, designed, tested and deployed, with a certain level of care, awareness and wisdom available to us at the time.
At that time, we were largely unaware of the power we were wielding. This power outstripped our ability to foresee the long term consequences of the design decisions being made. We cannot allow that same pattern to repeat itself with AI.
So it is time that we acknowledge a few things:
We must acknowledge that the complexity we face is beyond our cognitive capacity.
We have brains that have never evolved exponentially, and certainly not in the last 100,000 years. We need to admit that we do NOT have the capacity to ‘think through’ the potential complexity and systemic impact and harm (and benefit, of course) that AI makes possible. So we need to adopt different practices that acknowledge this. I will cover this in a separate article.
Most design practices that I’m aware of cannot point to the ethical framework that informs their design judgments, or still suffer from being ‘left out’ of the rooms where these decisions are being made.
It would be the height of hubris to assume we know exactly what we are doing and how this is going to go. When we make choices about how others are going to behave, we are designing. So we need to do so in a way that is informed by ethics, values and a vision of a world we want to live in with a deep respect for the power that is now literally at our fingertips.
We must resist the pressure to compete in bad races.
As Tristan mentions, there are bad races, and there are good races. Just because a competitor has deployed AI doesn’t mean your design team needs to deploy it as quickly and in the same way.
Take a pause, check in with your values, make a call on what constitutes a sound, ethical design decision with as much future foresight as you can muster. That is better (and safer) than succumbing to the pressures of keeping up.
Better still, get serious about re-tooling yourself and your team, both internally (by working on your character, values and ethics) and externally (by getting up to speed with the humane tech movement).
Tristan refers to this ‘bad race’ as the ‘race to the bottom of the brain stem’. Let’s put a neuroscience lens on this.
The brainstem is the part of your brain that connects to your spinal cord. It is part of the central nervous system and controls all of your essential functions, like breathing, heart rate and sleep.
Within the brainstem sits the Reticular Activating System (RAS), which is responsible for wakefulness and awareness. It lets the brain know we need to attend to something, and the brain (neocortex) then allocates appropriate cognitive resources in response.
The ‘bad race’ is the race to get more and more of your attention. How your attention is captured matters, and the ‘how’ is the domain of design.
We must acknowledge that where attention goes, energy flows.
What we pay attention to matters and it matters in the most tangible ways.
As designers, what you pay attention to gets designed.
What directs your attention are your values: what you value, you tend to pay attention to. So if you are a designer or a leader and you cannot confidently explain what values drive your attention, being deliberate about what gets your attention, and why, becomes much harder. The time of faithfully following a brief without critical review is over. The risks are too high, and the power you hold is too great.
How you bring something into reality is governed by your ethics.
The foundation of how you design and lead must be built on your own ethical practice. In other words, you need to know clearly what is OK and what is not. A deep understanding of these attributes, and a keen appreciation for their dynamics, creates wisdom. When you have wisdom on board, you have a better chance of wielding the god-like power within AI with appropriate foresight and accountability. Without it, we are replicating the same pattern as first contact with social media, but with greater pace and impact.
So now let’s consider how social media grabs our attention, while AI learns how to keep it. If we are not practising wisdom ourselves, in terms of what we watch and how we use this technology, we can very quickly find ourselves creating patterns of harm within our own lives. This is a design challenge, and it leads perfectly to our next big question.
Question 2: How do we upgrade our institutions?
Institutions are a collection of people, organised by process, enabled by technology, within a culture. They exist to perform a function in the world, and that function can take on almost any shape.
As collectives of people, they are powerful forces in shaping how society operates, what it values and what it pays attention to. So how do we upgrade them?
Institutions must pay attention to the diffuse, long term, cumulative harms.
For hundreds of thousands of years, the implications of a choice you made didn’t outlive you. Back then, the action you took resulted in consequences you largely understood and would live to see. Now we are making choices, enabled by technology, that will have ramifications well beyond our lifetimes in inconceivable ways.
It’s not surprising, then, that our institutions are geared toward dealing with acute, discrete and short-term issues rather than the slowly evolving, chronic, long-term and cumulative ones. You know the ones I’m referring to: world population increase, climate change, the vulnerability of socioeconomic structures, and the rate of change and adoption of emerging technologies without an understanding of future consequences, to name a few.
Further reading:
Here’s an OECD report if you want to dig deeper into systemic global challenges.
Design teams within institutions need to elevate their voices, their impact and their accountability.
Design as a mindset, knowledge set, skill set and tool set is a necessary and powerful combination in highly complex and ambiguous contexts. Why? Because at its core it holds multiple perspectives simultaneously, is comfortable with ambiguity, and delivers iteratively to help control risk. This provides a resilient framework to bring to diffuse, complex, long-term challenges.
But designers often feel excluded from the rooms where significant decisions are being made, usually with a few other key stakeholders also absent, like the future, or the planet, or all living systems.
Understand the power you wield. Designers are powerful creatures as they create shared realities. They direct sense making and choice making with their work, and with that comes responsibility, and ideally accountability.
The example given in Tristan’s talk is astonishing. Fake news spreads many times faster than truth. Extreme voices often go viral, moderate voices don’t. Extreme voices post more than moderate voices do.
Below is a screenshot from his talk that speaks volumes. Look at the difference between how many times the correction from the source of truth gets retweeted (58) compared to the original ‘twisted’ perspective (8,000).
Here are some questions we need to be asking:
- As designers and design leaders, what are the lessons we have learnt from our first contact with AI via social media?
- Knowing now that human behaviour is biased toward sharing fake news more than truth, how do we design for that?
- How do we bring this experience, let’s call it wisdom, into our roles as designers with AI?
The complexity of our challenges has exceeded the capacity of our institutions to respond. What we need to cultivate to close this gap is wisdom. And what is wisdom? Wisdom is the knowledge and sound judgment that comes from experience.
The quality of our judgments needs to match the complexity of the choices we face. We need to design in a much richer way.
Question 3: How do we match power to wisdom?
Tristan spoke about this as ‘binding the technology’. Based on E. O. Wilson’s quote, we need to:
Embrace our Palaeolithic brains, upgrade our medieval institutions and bind the race dynamics of our god-like technology.
The only way to build wisdom is through experience and reflexivity. We need to learn from what we’ve done, integrate it and let it inform different action. Wisdom does not build if we keep making the same choices in exponentially dynamic contexts.
Systemic designers are concerned with building insight into the whole system within which they are designing. This includes understanding non-human actors as well, considering the relational effects of each actor in the system and designing (making choices and sound judgments) something new.
We are limited in our capacity to see far into the future; we cannot hold the kind of complexity required to know with certainty the future implications of what we are designing. But we can do better than we are doing right now. We can build our resilience and resistance to organisational and market pressures for more growth, more profit and more power. We have the ability to design a different way. We must utilise it.
I’ll leave you with this final statement from Tristan. It’s close, but I don’t think I’ve quoted him perfectly, so make sure you watch the video to get the accurate picture:
Tristan says that if you have more power than you do wisdom, by definition you will create effects in the world that you can’t see. He asks the audience, “How do we limit the power of AI that we hand out to meet the level of wisdom and accountability?” These technologies can be beautiful and magical when tied to appropriate levels of wisdom.
You cannot have the power of gods, without the love, prudence and wisdom of gods. — Daniel Schmachtenberger
To my audience of creative, reflexive, ethical and committed designers and design leaders: I know you are all here to grow and deepen your understanding of yourselves to further strengthen your practice. Thank you for doing this challenging work. The fact that you are here, reading this, assures me that together, we can do this.
Who is Melis Senova?
I am a coach and advisor to design leaders, C-level executives and leaders in government. My work in This Human is dedicated to the next generation of designers and leaders.
When you’re ready, here’s how I can help you:
Building confidence in your practice is essential for progress. Get started for free with this workbook.
This human community is a place for you to land, connect and learn. It’s free, and it’s yours.
A 3-week guide to uncover the essentials of impactful design and leadership.
An 8-week deep dive into your character as a designer. Build your design character blueprint and find your voice as a designer and leader.
If you want to have a chat with me about some coaching, book some time directly.