Putting Problems Before Solutions in Development

Problem solving, in any context, begins with discovery, research, and scoping. It’s no different in international development. Yet too often, in the early stages of programme development, organizations turn to familiar methods with equally familiar shortcomings (questionnaires, focus groups), rely too much on existing data (literature reviews), or place too much emphasis on conventional wisdom (expert consultations, key informant interviews). Development workers -- and I have been guilty of this -- quantify responses, prioritize needs, and conceptualize solutions prematurely. And we often start with solutions in mind -- whether that’s more schools, more health clinics, or more investment in ICT4X -- eager for evidence from the field to confirm their viability.
While success metrics vary by programme, indicators are often determined by organizational and political priorities. (Serve the most marginalized! Reach the hardest to reach! All the time, whether it makes sense in the current context!) And when things go awry, we scratch our collective head and wonder where it all went wrong. Rinse and repeat.
Where the Status Quo Falls Short
Better development requires better programme design. Assuming we all agree that programmes should, first and foremost, respond to beneficiary needs, we need to re-examine how we plan, scope, and develop interventions. Here, good design research is foundational.
Design research is critical to creating services, programmes, and systems that respond to human needs and satisfy environmental constraints. But what is ‘design research’? Is that some fancy name for what organizations already do? Yes and no. Development organizations, government agencies, and NGOs already engage in significant amounts of information gathering and needs analysis. But there are two main flaws in their process.
First, too often, the data is largely quantitative or based on conventional wisdom. Facts and figures and expert opinions are relatively more accessible and can be analyzed using established frameworks. But quantitative data often lack nuance and context, and expert opinions are often shaped by political priorities. Why do parents consistently refuse to send their children to school? Why does a health education campaign fail to change behaviour? National survey data or education experts aren’t always able to answer such questions.
Second, the information collected in the project planning stage is ‘just good enough’ for building out programmes. Organizations whiz through the research stage on the way to implementation -- after all, research doesn’t win political points. A lack of deliberation and formality in the research process limits its value, and thus the utility of the data collected. Passable data leads to passable programmes.
Enter Design Research
Design research, on the other hand, makes no assumptions about what the solutions might be at project inception. Even suggesting ‘health clinic’ or ‘primary school’ as potential end results is too limiting. Sure, design researchers do the literature reviews and consult the experts, but then our assumptions, hypotheses, and any solutions dreamt up in the office are put to the ultimate test: the real world.
Starting with human needs, we use ethnographic interview and interaction methods to uncover human behaviours, dig deep into them, map the relationships between them, and identify the patterns they form. We place these behaviours, and the larger human patterns they represent, in their cultural, social, political, and economic contexts by immersing ourselves in our users’ lives, observing and shadowing those we seek to serve.
Once we have a firm understanding of user needs and environmental constraints, we map them against institutional capacities, political priorities, and development theory. It’s a complex, involved, and often messy process involving lots of inspired systems mapping and frustrated hair tearing. Through these exercises, however, possible solutions emerge and each one is tested against all that the research team witnessed, experienced, and felt -- there’s no better way to develop a strong gut-check mechanism than several weeks living among those you serve.
Instead of a gaggle of new health clinics, maybe a roaming healthcare worker with a network of sub-agents is a better rural health solution. Instead of a school, perhaps a distance-learning programme staffed by a fleet of non-traditional educators is better for nomadic communities. There is no one picture of ‘good development’, and it sure has nothing to do with the number of wells dug, schools built, or community healthcare workers trained.
Case: Financial Inclusion in Pakistan
Recently, I led a project to expand access to financial services among marginalized populations in Pakistan. There, 90 percent of people lack access to affordable basic banking. Public, private, and non-profit collaborations -- including our partners -- are working to change this reality for Pakistan’s marginalized communities. Solutions previously proposed included giving rural Pakistanis more mobile phones, training more mobile banking agents, or airing more television ads to educate the population about branchless banking. All of these possible solutions assumed that the right answer must be additive. That we needed more of something to make things better.
While previous suggestions informed our work, we spent the bulk of our time with the low-income and rural Pakistanis who stand to gain from accessible financial services. Sitting alongside them, we found answers to questions like: What are the distinct market segments that are too often lumped into one feeble ‘bottom of the pyramid’ designation? How might illiteracy — textual, numerical, financial, and technological — impact access to and use of financial services? And how can the formal financial sector allow a new class of entrepreneurs to create greater social value through its supply chains?
Our work took us to disparate contexts, from embeds at the Karachi headquarters of our client, a bank, to visits to flood refugee camps. We learned about and borrowed practices from service delivery models in the fields of insurance, employment, development, healthcare, technology, and travel. Being realistic about potential solutions, we critically examined the operations of the facilitating bank to understand how to connect the dots between user needs and organizational capacity.
All this allowed us to develop solutions that leveraged the scale of social, political, economic, and technological forces to deliver better services. And guess what? Turns out the optimal solution wasn’t more phones, more agents, or more ad buys. It was a recognition that ‘the bottom of the pyramid’ isn’t a realistic client but that Adil, Hina, and Aqsa are. It was a strengthened business model based on selective, strategic investments for the rural, the poor, and women. It was a complete reorientation of the bank from being product-focused to being service-focused.
Without design research, we never would have gotten there. More phones, more agents, or more TV ads might have been decent short-term fixes, but we believed a good long-term solution would serve people better.
The fruits of design research have, to some extent, remained unclaimed among institutions that serve the world’s underprivileged. Perhaps greater care and thought in gathering, using, and processing information for programming will help international development organizations better serve those they work for.