How to make tech for good: Four lessons from the field
Technology must always be good for people. Why?
Because even if you discount basic ethics (which, let’s face it, you shouldn’t), any business that puts short-term profits over longer-term impact risks a backlash from policymakers and customers alike.
But what exactly is good tech and how do you design it? At Smart Design, our work with clients innovating at the bleeding edge of technology – VR, AI algorithms, and Generative AI – has helped us understand just how tricky it is to draw the right lines around new technology, and how frequently the same questions come up, such as: How should technology be applied – and crucially – how should it not be applied? When should we make decisions on behalf of our users and when should users be in charge? All while, of course, maintaining a competitive edge in an increasingly fast-paced market.
Here’s our advice.
01 Find the canary in the coal mine
In fast-moving spaces like GenAI and VR, every company is vying to win mindshare and build a user base – so the tendency is to focus on short-term hooks and dopamine hits, like the one you get from generating huge amounts of content in seconds. But these often come paired with longer-term downsides, which take time to surface and can be hard to correct.
Happily, there are steps you can take to help predict these downsides. At Smart Design, we carry out research on the impact of any given new technology with the first cohorts of people to experience it. We do this to help identify any negative consequences that might become more widespread as more people use the technology.
GenAI is being rapidly adopted in specific professional contexts such as law, media, finance, and education. In our own work in education, we spoke with high school students who were using it to help with their studies and discovered that many of them assumed the answers GenAI created were factually watertight – and they were shocked when they noticed mistakes (also referred to as “hallucinations”).
Ask yourself
What healthy friction can we build into an experience to ward off longer-term consequences? With GenAI in education, for example, consider building in a prompt that nudges users to fact-check the content.
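To make that idea concrete, here is a minimal, hypothetical sketch of what such healthy friction could look like in a GenAI study tool – not a Smart Design implementation, and generate_answer() simply stands in for whichever model call a product actually uses.

```python
# Hypothetical sketch of "healthy friction" around a GenAI study tool.
# generate_answer() is a placeholder for a real GenAI call.

FACT_CHECK_NUDGE = (
    "This answer was generated by AI and may contain mistakes "
    "(\"hallucinations\"). Check at least one claim against a trusted source."
)

def generate_answer(question: str) -> str:
    # Placeholder: a real product would call its GenAI model here.
    return f"Generated answer to: {question}"

def answer_with_friction(question: str) -> dict:
    """Return the generated answer together with a fact-check nudge,
    so the UI can require an explicit 'I checked this' step before saving."""
    return {
        "answer": generate_answer(question),
        "nudge": FACT_CHECK_NUDGE,
        "requires_confirmation": True,
    }

print(answer_with_friction("When did the French Revolution begin?"))
```

The point of the sketch is the pause it creates: the answer arrives with a visible reminder and a confirmation step, rather than silently passing itself off as fact.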
02 Identify the jobs NOT to be done
Tech that helps us do things faster and better can be good, but it can also risk undermining our sense of meaning. It’s important to know where new features add value and also where they definitely do not. We call this “the jobs NOT to be done.”
Ask yourself
What are the anti-use cases? How can we design the experience to steer people away from using our product in those ways?
At Smart Design, we’ve carried out research with designers, musicians, and artists to understand how working with emerging technology can boost their creativity. We discovered that technology can be used to protect the creative headspace from other life demands, rather than take over the creative act. This insight enabled our client to develop features that help creatives stay in the flow, experiment with new concepts, and find new ways to collaborate with other artists.
03 Understand the trade-offs
Every new technology comes with dilemmas – choices with no simple answer because they involve difficult trade-offs. A classic example is automation coming at the expense of people’s jobs. Our work with teenagers and their guardians from all over the world shows us that social media can be a great way to connect with others and build a community, especially for those living in places where it’s hard to find like-minded souls. But it can also be a catalyst for unhealthy social comparisons, leading to understandable concerns about teen use.
Ask yourself
For any new technology or feature, explore the pros and cons for different user types and use cases. What information do we need to help us navigate these trade-offs and then make decisions about what to build and launch?
My colleague Cameron Hanson recently explored the design trade-offs around using GenAI to conduct research. To do so, she ran three experiments: one where she interviewed a synthetic user to see if it strengthened our understanding of research participants; a second where she used GenAI to query a local database of our research transcripts to see if it could surface any deeper insights; and a third where she used text-to-image generation to see if it could enrich in-person ideation sessions. The result? She discovered that while GenAI can be a helpful tool for research, it can’t yet replicate the value of human participants and researchers.
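For readers curious what the second experiment might look like in practice, here is a minimal, hypothetical sketch of querying a local folder of transcripts: crude keyword retrieval plus an assembled prompt. It illustrates the general pattern only – it is not Cameron’s actual setup, and ask_model() is a placeholder for whichever GenAI API a team uses.

```python
# Hypothetical sketch: querying local research transcripts with GenAI.
# Crude keyword retrieval + prompt assembly; ask_model() is a placeholder.
from pathlib import Path

def load_transcripts(folder: str) -> dict[str, str]:
    """Read every .txt transcript in a local folder."""
    return {p.name: p.read_text() for p in Path(folder).glob("*.txt")}

def top_matches(transcripts: dict[str, str], question: str, k: int = 3) -> list[str]:
    """Rank transcripts by how many question words they contain
    (a crude stand-in for proper embedding-based retrieval)."""
    words = set(question.lower().split())
    scored = sorted(
        transcripts.items(),
        key=lambda item: sum(w in item[1].lower() for w in words),
        reverse=True,
    )
    return [text for _, text in scored[:k]]

def ask_model(prompt: str) -> str:
    # Placeholder: a real workflow would send the prompt to a GenAI model.
    return f"[model response to a {len(prompt)}-character prompt]"

def query_transcripts(folder: str, question: str) -> str:
    """Build a question-plus-context prompt from the best-matching transcripts."""
    context = "\n---\n".join(top_matches(load_transcripts(folder), question))
    prompt = (
        "Using only these interview transcripts, what patterns or deeper "
        f"insights relate to the question below?\n\n{context}\n\nQuestion: {question}"
    )
    return ask_model(prompt)

# Example usage:
# query_transcripts("transcripts/", "How do students verify AI-generated answers?")
```

Even a sketch like this makes the trade-off visible: the model can only reason over whatever context the retrieval step hands it, which is one reason it can’t yet replace human researchers reading the transcripts themselves.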
04 Don’t put all the responsibility on your users
Taking a position on the human implications of technology is a great way for companies to differentiate themselves in the tech space, where many other firms are driven by profit or regulation alone. Our work with people whose industries are being significantly impacted by the emergence of AI tells us that they clearly believe some choices about how technology is used, and the role it plays, should not be left to individuals.
Ask yourself
When it comes to how AI is used, what should be left to users to decide and what should companies decide? How can the guardrails we build around tech become a differentiator in a crowded marketplace? Consider crediting or offering compensation to the people whose work is being used to train the models, giving them ways to opt in or opt out.
When it comes to generative AI, perhaps more than anything, people desperately want companies to take a stand. Creatives, educators, parents, and students aren’t asking for everything to be done for them, but they do feel that some things are simply out of their hands. As designers, innovators, and company leaders, we have the opportunity – and a responsibility – to make a real difference in the world. Let’s make it a good one!
About Jamie Munger
Jamie Munger leads the strategy practice at Smart Design, including global projects for Meta, Google, CVS, and Mercedes Benz. Her book on human-centered public policy design was published in November 2020. She holds a BA in Sociology from Emory University, an MDES in Design Research, and an MBA from the Illinois Institute of Technology.