Human-Centered Futures

Introduction

I’m not against innovation, not by a long shot.  But I do like to ask questions.  How will this new invention help people?  Who will it harm?  Who was it designed for?  How can it contribute to a more inclusive future?

As a history of technology PhD, I have developed a healthy skepticism of shiny new things.  Too many times, past innovations were heralded as progress only to make some people’s lives worse or to fail completely; I’m looking at you, high-fructose corn syrup and hydrogen-filled zeppelins.

It isn’t just that some innovations fail or have unintended consequences—which most indeed do—it’s that many were simply not designed with the social good in mind.

While working through “Introduction to Human-Centered Design,” the Design Kit course sponsored by IDEO.org and the Acumen Fund, I’ve been thinking about historical instances of innovation that consciously embraced human-centered thinking to build more socially equitable futures.  The examples below range from imagining physical technology to designing methods of culture change, and together they show what is possible when creative people design for a human-centered future.

Human-Centered Machines

Generally, machines are celebrated for their efficiency, not for how engaging they are to operate.  But during the Industrial Revolution of the nineteenth century, Gérard-Joseph Christian disagreed.  He thought machines should be designed for both efficiency and the well-being of the operator.

As an industrial philosopher of sorts, Christian did not design machines, but as the Director of the Conservatoire des Arts et Métiers, the locus of technical knowledge in France at the time, he had an influential platform from which to discuss the social merits of machines.

In the opening decades of the nineteenth century, Christian observed the violent riots in England and France fueled by resistance to industrial machinery.  While he firmly believed mechanical production was the way of the future, he also believed the workers resisting machines had a point.  Tending many of the newly invented machines was little more than pure drudgery.  To Christian, if a machine performed its productive function well but failed to allow intellectual engagement and personal growth for its operator, it was a bad machine.

For example, Christian found certain weaving machines more satisfactory than others because they required both nimble handwork and progressively keener judgment from the operator, not mere rote physical force.  In discussing potential mechanical designs, Christian preferred plans that limited the motions of the machine rather than the freedom of the worker.

In her book, The Mantra of Efficiency, historian Jennifer Alexander writes, “To Christian the most effective machines were dynamic agents of social transformation.  They economized on human labor, freeing it for other work and replacing strenuous, machinelike, and repetitive tasks with ones requiring intelligence and judicious movement.”

Unfortunately, Christian’s ideas did not become mainstream, and industrial machinery more often constrained the operator’s experience than expanded it.  However, his industrial philosophy is an early example of an explicit focus on the human experience of engaging with technology.

Human-Centered Networks

At the height of the Cold War, computer systems were involved in everything from encoding university student information on punch cards to automating US bombing runs in Vietnam.  Historian Paul Edwards has discussed how computer systems framed a tense “closed world” political discourse from the 1950s to the 1980s, one in which control and conflict seemed impossible to escape.  The creation of these systems exemplified a distinct lack of human-centered design.

At the same time, however, a much more open and human-centered vision of technology was spearheaded by the noted cultural critic Stewart Brand.  In From Counterculture to Cyberculture, Fred Turner chronicles how Brand developed his dream of open information networks built to bring people together.  Turner shows how Brand’s interactions with anti-authoritarian communes and drug culture in the late 1960s, Bay Area avant-garde artists and computer science experts in the 1970s, and tech entrepreneurs and environmentalists in the 1980s and 90s shaped his ideas and innovations.

Brand’s first innovation was the Whole Earth Catalog, first published in 1968.  A printed booklet that went through numerous editions, the catalog was an effort to provide a vast quantity of useful information, technical knowledge, and product reviews in one document accessible to anyone.  Its contents made it valuable to all kinds of people, from gardeners looking for cultivation techniques to entrepreneurs looking for advice on the latest telecommunications products.  Brand saw a need for people to access the increasingly powerful information and technology blossoming into existence, and the Whole Earth Catalog was his solution.

Though the internet’s invention was still decades away, Brand’s publication was foundational to ideas about what it could be.  In 2005, Steve Jobs compared the Whole Earth Catalog to “Google in paperback form, 35 years before Google came along.”

In 1985 Brand combined his idea of accessible information sharing with advances in digital technology by founding one of the first online communities ever created: the Whole Earth ‘Lectronic Link, or the WELL.  The idea behind the WELL was similar to that of the catalog—to leverage technology to share information openly—but with the added benefit of a digital infrastructure offering even greater flexibility and reach than paper could allow.

The WELL became a popular e-destination where people of various cyber-subcultures could bond, and it served as the progenitor of many later virtual communities, including Reddit.

By embracing an ethos of openness, Brand pioneered the ideas of accessible information and social networking long before Wikipedia and Facebook existed.  While early computer systems funded by the US Department of Defense were designed to control people and information, Brand’s vision saw technology freeing information to help people build a better, more connected world.  

Human-Centered Solutions

According to a 2010 British Medical Journal estimate, more than 2.5 billion people globally lacked proper sanitation.  In places without toilets or latrines, locals were forced to practice open defecation.  Parasites and other infections became rampant in those communities due to the uncontrolled waste.  For decades, relief organizations had sponsored the construction of sophisticated latrines in rural African and Asian villages to curb hygiene problems via technology, a classic approach to social innovation.

Some of the latrines were used.  Many were not.  Villagers were often hesitant to fill a structure that was nicer than their house with stinky, dirty waste.  Some of the latrines were even broken down for parts to augment the villagers’ meager dwellings.  

Upon visiting villages in Bangladesh with such latrines installed, sanitation expert Kamal Kar found that open defecation was still a common practice.  The technological approach wasn’t working.  Dismayed but determined, Kar developed a decidedly non-technical, yet innovative approach: engage people on their own terms to change behavior.  

Kar’s method, called Community-Led Total Sanitation (CLTS), is now used in more than 60 countries worldwide and has led to staggering progress in global rural sanitation.  

His insight was brilliant, but it was not complicated.  He simply worked to understand the social forces that shaped hygiene behaviors and to center his approach on them.  This helped villagers come to their own realizations about the risks of open defecation and take communal action to address the problem themselves.

By asking questions about sanitation habits (e.g., how close the places where villagers defecated were to where they washed or prepared food), CLTS workers prompted the local people to discuss some of the uncomfortable truths about food and water contamination.  Kar’s method almost invariably inspired each village to dig and maintain its own latrines, giving the villagers ownership over sanitation efforts.

In other words, Kar didn’t impose a technical future onto the villagers; he helped them understand the future they wanted and what they could change to get there themselves.

Conclusion

Building a future that improves life for all people will require innovation with a human-centered approach.  Such an approach is not always easy and requires more effort and creativity than simply designing for efficiency, control, or profit.  But the results are systems and technologies that work and that celebrate our shared humanity.  As technological sophistication in automation, biometrics, and artificial intelligence increases, it is more important than ever to ground ourselves in human-centered principles.  By doing so, we ensure that the future is an improvement for all people, not just those behind the algorithm.

