Emotions are more important than you think.
Considering a solution's emotional resonance can determine its success in unexpected ways. Thinking about delighting the user is not a new phenomenon in the private sector (particularly in tech), but it is rarely a primary consideration in public sector solutions. Sometimes, things simply have to work; they don't have to make people happy. Still, thinking about how emotions can support a policy or nudge people into action is an important part of sustaining a solution.
Indeed, emotions can be a powerful lever when designing for sticky solutions. A good example is IDEO's work on increasing the uptake of HIV prevention products among women in South Africa. These drugs had already proven highly effective in preventing HIV, so why were women not taking them? From speaking to women, the team realised that the packaging was stigmatising the products' use and even had the potential to provoke domestic violence from partners. Framing the products as medicine was a no-go, because male partners became suspicious.
The team's solution was to re-package the drug with a fresh brand, much like cosmetics. This branding allowed women to be more discreet around their partners and also made them feel empowered and attractive, making taking HIV drugs far easier and more desirable.
One of the hackathon groups experienced how an emotional response can hamper a policy idea when they proposed an automated solution for gathering performance data in a health center. Speaking to health professionals, the group found that the professionals did not want to spend precious treatment time reporting data. They therefore suggested placing 'robots' in each room that could monitor conversations, track the amount of time spent with each patient, and collect other relevant data about each health visit.
The team quickly modeled their solution with cardboard and pipe cleaners and invited others to give feedback. While many agreed with automating the data collection process, a physical 'robot' made people feel slightly uncomfortable. What if something more subtle were used, like Alexa? It was clear from the participants' emotional reactions that an alternative solution was needed to address their visceral concern about a data-collecting entity that might violate their privacy.