All watched over by machines of loving grace

In health and care tech people aren’t the problem: they’re the purpose

Mark Brown
11 min read · Oct 16, 2017
A photo of Mark Brown on stage by @RForbister

The following is the text of an opening speech made by Mark Brown at the TSA International Technology Enabled Care Conference at the ICC Birmingham on 16th October 2017.

When I think about the history of technology and care I always picture something like the end of an episode of Scooby Doo, where the ghost haunting the abandoned amusement arcade has their scary ghost mask ripped from their face only to reveal the caretaker who had found gold in the basement and would have gotten away with selling the land if it wasn’t for those pesky kids. Instead though it’s the developers of health and care technologies shaking their fists going ‘we would have got away with this product if it wasn’t for those pesky users!’

The title of this talk is ‘people aren’t the problem: they’re the purpose’. I’m not really a tech person, but I am a person who looks at the future and I am a person who likes to dig into working out whether problems really are problems, and then working out how people might like to have their problems solved. I am always on the side of the user; even when what they say is contradictory, frustrating, cynical and hopelessly optimistic all at the same time.

The focus of this conference is care. Care, yes, enabled by technology, but primarily care. We all desire care sometimes; to be looked after, to be watched out for, to feel that someone, somewhere has our back. In the UK the NHS and our social care system exist on two planes; one of practical worry and terror and horror and problem solving and nuts and bolts and the other of a kind of collective dream of what it means to be looked after and safe from death and disease and discomfort and frailty and being trapped alone in our bodies. As such health and care is always a conflict between our desire to be looked after and our desire for autonomy.

Health and care is for all of us, when we experience health or care need, a space of desire and fear and hope and despair. It is the most fundamentally human of experiences. Health and care is an emotional space, even if the industries that service it are not. In our fear of death and of disease and of infirmity we created a false opposition between technology, science and medicine. The history of technology enabled care has always been a tension between wipe clean technological solutions and the real and messy lives of people.

A digital friend doesn’t screw you over

It’s a myth that all human endeavours are equally likely to be completely disrupted by digital encroachment. Some things will always need direct human labour. We must make digital technology a friend that makes it possible for us to do more of the things that we actually think are important.

But what is friendship based upon? It’s based upon repeated exposure: do I see you regularly enough to recognise you? Friendship is based upon affinity: do you and I think and act in a way that complements each other? It’s based on a common or shared endeavour: are we doing something together that’s meaningful for both of us? A final element of friendship is this: do I feel like I can rely on you? Trust you not to screw me over or turn against me? Do I feel like, when it comes to it, you’ll put into the relationship as much as I will and not just try to take from me or order me about or betray my best interests?

If we truly want technology enabled care to be a beneficial reality not a technocratic nightmare then it really has to be an actual friend; not a pretend one, not one that says one thing to your face and then one thing behind your back. It has to feel real and meaningful and trustworthy. It can’t be a con trick or sticking a shiny smiley face on something that isn’t acting in our best interests.

Too much work in technology enabled care has focused on sterile tests of efficacy measured in trials, or on user acceptance over user experience. I know that the majority of you in this room are well aware of the difference between user acceptance and user experience, but it’s worth exploring a little.

User acceptance is about whether someone can or does do a particular thing in relation to an interaction or service or device. ‘Can you boil water with this kettle?’ ‘Can you read your heart rate from this monitor?’ ‘Can you talk to your doctor via this screen?’

User experience is about ‘if you had a choice of kettles, would you choose this one?’ ‘Is this readout of your heart rate the right thing for you right now?’ ‘How did it feel to talk to your doctor via this screen? Was the experience what you expected? Did you get what you needed from it in a way that suited you and made sense in your life? What was missing?’

‘Would you choose this as the way to do what you need or want to do, and if not, why not?’

Giving people the right thing in the wrong way is the same as giving them the wrong thing.

User experience is everything

We care about user experience when we want the user to do something for us; less so when the user wants us to do something for them. When user experience is vital for the furtherance of our obvious business model we focus upon it; other times we’re less fussed. This must change. Too much work has been put into, say, massaging the user experience of ordering a pizza or hailing a cab, because the relationship between user experience and revenue is direct. In health and care the lines of revenue are less direct, measured in overall health or prevention or in transactions. We assume that user experience is secondary, that it doesn’t matter if the doctor is rude to you as long as the scalpels are clean and they don’t remove the wrong organ.

But that doesn’t work with technology enabled care. To quote the science fiction luminary William Gibson, ‘the street finds its own uses for things’. Whatever you build will be used in ways that suit the person using it or, in the very worst case, won’t be used at all. I’m put in mind of a photo I saw on Twitter of a great big fat book. A book with a Kindle. A book with a Kindle sticking out of it, being used as a rather nifty bookmark. Needless to say, the Kindle had been a gift from a well-intentioned son or daughter and its owner had managed to find a way of fitting it into their busy reading schedule.

There’s a strong case to be made that people find it extremely difficult to calibrate their trust in technology. As Arthur C. Clarke said, any sufficiently advanced technology is indistinguishable from magic. And the wider technological world is entering the world of magic. The technology that you are buying and selling and showing today is already obsolete, already more rooted in the past than it is in the future. The people you have designed for are not the same people who will use your technology in five or ten or twenty years’ time. There will always be space for pure research and development, but the wins from now on will be in user experience. People will no longer be grateful for the implementation of bad and clumsy technology that they have to change their life to fit around, especially when their life is being changed around enough by their experience of ill health or of care need. Many more of us will live much longer and much of our life will be lived in the company of chronic conditions, but just as the baby boomers are refusing to retire their freedoms in the face of ageing or illness, so the coming generations will expect your technology on their terms. It will be impossible to create the required relationship between user and tech to guarantee the outcomes desired if the tech is like a Catherine wheel fallen off its pole when the user was expecting a fireworks display.

Just as owners look like their dogs, tech looks like its buyers

Technology enabled care must be rooted in a value proposition for the user. Often the voice of the end user of technology enabled care products or services is the quietest, the least examined. If the demand is coming from a system, we talk to the system. If the demand is coming from a business model, we listen to the business model. The needs and wishes and desires of the end user become secondary to the final purpose: to make money. When systems and profit motives are misaligned, the end user suffers; especially in conditions where technology is deployed to make up for the reduction of directly human-powered intervention, support or care.

Health and social care as provided in UK public sector spaces is not taught to see little problems. It’s taught to see big problems. It’s taught to think of big systemic changes, incremental changes, glacial changes that will unfold over decades. It’s taught to think ‘go big or go home’.

What has been lacking, and this is where people as citizens have been shut out of this market, is the actual aligning of what is developed with the actual lives people live. Your average punter in the UK in public sector spaces is closed out of defining what the market offers because, in the end, they aren’t needed by the industries that make public sector tech happen. In the NHS, for example, the NHS is the purchaser and the tech maker is the supplier. As long as the NHS is happy, then so is the supplier. And, if it doesn’t quite work for people? Well, we just blame them for being non-compliant, or expect some other body or person to step in and make that online form comprehensible or to make sure that they read the insulin levels or similar. The average punter is in the same position as someone who doesn’t have enough money to take part: no one cares.

We are confused about who our customers are. If we draw an actual set of concentric circles of who we listen to, at the centre is nearly always the person or organisation that puts up the money. They’re the people who make the buying decisions. They’re the people who front up the cash. The business models available shape what the technology looks, feels and smells like. Like owners that begin to look like their dogs, technology in care looks like the people who do the buying. The things that succeed are a physical map of the purchaser’s decisions, of what they deem to be important or desirable. And that’s where we have a problem.

In care the person putting up the money isn’t always the person who is receiving the service. This means that the people who hold the purse strings aren’t the direct end beneficiaries of the things we make and sell.

People don’t know the future until they taste it, like the first frost of the year on the tip of their tongue. There is a feeling of something shifting, moving just out of sight, something just out of reach but undeniably there, tantalising, invigorating. You begin to trust in the future when you can begin to see how your desires, hopes and dreams might be met in the new things that are coming into being. Users may not know what they want directly from your technology, but they know what they value and they know what pisses them off about being them.

We’re all moving into the future whether we like it or not, whether we drag our heels or run headlong into as yet uncharted paths. We carry our expectations and our preferences and our past experiences with us like a map of what we’re looking for.

There are no boundaries around health and care

The overall direction of healthcare is that any boundaries that divided it from everyday life are dissolving. Chronic conditions and health and wellbeing maintenance happen not in medical settings but out there in the real world of people’s lives. There is no longer a division between real-life medical or care situations and the average, mundane, amazing, confusing, fantastic life of people. The health and care technology of the future no longer controls the context of an intervention. What it controls is the quality of the experience and the benefit which it provides.

Ubiquitous computing, always-on internet and the squeezing of a huge array of sensors into smartphones and other devices at low cost and high processing power have made a world where technology is now seamlessly integrated into the lives of people where they choose it to be.

Such technology could become the interface between people and health services, either automatic, elective or a mixture of the two. It could also be the mechanism for the delivery of treatments, the management of treatments, the monitoring of treatments. It could become the third that walks beside you; the manifestation of care and concern and fundamental respect for autonomy in the face of suffering or frailty. Add to that algorithms, machine learning and other forms of AI, and the space is there for a new kind of relationship between individuals and care; a real-time relationship where automation forms the interface between people who give care and people who need care. This is the future that is unfolding, for those rich enough to afford it at present; those healthy enough and cared for enough to make the money to bend technology to their will.

Stop caring for people while not caring about them

There is an old maxim that when you make services for poor people, you make poor services. I have seen devices and services that make my skin creep because of the assumptions baked into their design. I’ve seen transmitters embedded in adhesive tape sold as an answer to people who might get confused and lost. Yes, it’s an answer, but it’s a horrible one in its assumptions. I’ve seen a cacophony of devices to record data without the same noise and volume applied to what might happen when that data flags something worrying. I’ve seen technologies that haven’t been tested against the real lives of people but against the stated constraints of services. User-centred design is one route. There are many others. None can be done without just hanging out with people and showing that you, the people who make technologies of care, actually care about what it’s like to be them.

We should be making technology that helps people live; not technology that just keeps people alive. We should be making technology that doesn’t just tell us where people are; but technology that helps people to go where they want to go. We shouldn’t be making technology that tells people what we want them to know but technology that helps them to tell us what we need to know.

If I want you to do anything, it is to think about users and where they are over the next two days and how they’d feel about what you say on their behalf. We need to be where the user is if we want to save their health without killing their soul.

To quote a famous poem by Richard Brautigan:

“I like to think (right now, please!) of a cybernetic forest filled with pines and electronics where deer stroll peacefully past computers as if they were flowers with spinning blossoms.

I like to think (it has to be!) of a cybernetic ecology where we are free of our labors and joined back to nature, returned to our mammal brothers and sisters, and all watched over by machines of loving grace.”

You are the people who make the machines, who hold the power as to whether they liberate or they oppress. I want you all to think about how you bring that love and that grace.

@markoneinfour


Mark Brown edited One in Four, mental health mag 2007–14. Does mental health/tech stuff for cash (or not). Writes for money. Loves speaking. Get in touch
