When a piece of technology has the potential to change how someone recovers from a stroke, every design decision carries real weight.

In this episode of the Sonin Podcast, Paul is joined by James Coleman, former Sonin iOS developer and founder of Facial Dynamics by Life Analytics, to explore the story behind one of the most human-centred digital health products to emerge from the UK in recent years.

The conversation traces the journey from a late-night coding experiment inspired by Apple’s Face ID sensor all the way through to real-world deployment in NHS settings. Along the way, James and Paul unpack the challenges of building for a sensitive clinical context, the realities of user research when direct access to end users is limited, and what it actually takes to turn a compelling idea into a product that clinicians trust and patients want to use.

At its core, this episode is not just about facial rehabilitation software. It is about what happens when a developer with deep curiosity, technical skill, and genuine care for people decides to build something that matters.

What is Facial Dynamics?

Facial Dynamics is the world’s first remote monitoring system for facial movement. It is designed to support people living with facial palsy, a condition that can arise from strokes, Bell’s palsy, and Parkinson’s disease, by helping them carry out and record their rehabilitation exercises at home.

Traditional facial retraining requires patients to practise repetitive movements in front of a mirror. The problem is that many people living with facial palsy find that experience deeply distressing. Confronting your own reflection when your face is not moving the way it used to is not a small ask, and clinicians have long known that low compliance with this approach means slower, and sometimes incomplete, recovery.

Facial Dynamics addresses this by replacing the mirror with an augmented reality mask displayed on an iPhone or iPad. Patients get the feedback they need to do the exercises correctly, without being confronted with a direct likeness of their own face. Their movements are recorded and sent to their clinician, providing objective, measurable data about progress for the first time.
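To give a feel for what "objective, measurable data" about facial movement can look like, here is a minimal sketch of a left/right symmetry score computed from tracked facial landmarks. This is purely illustrative and is not Facial Dynamics' actual algorithm; the landmark names, coordinate frame, and scoring formula are all assumptions made for the example.

```python
# Illustrative only: NOT Facial Dynamics' actual method.
# A crude symmetry score from hypothetical 3D facial landmarks,
# comparing each left-side point with its mirrored right-side twin.

import math

def symmetry_score(landmarks: dict) -> float:
    """Return a score in (0, 1]: 1.0 = perfectly mirror-symmetric face.

    `landmarks` maps names like 'mouth_left' / 'mouth_right' to
    (x, y, z) points in a face-centred frame where x = 0 is the midline.
    """
    pairs = [(name, name.replace("_left", "_right"))
             for name in landmarks if name.endswith("_left")]
    if not pairs:
        raise ValueError("no left/right landmark pairs found")
    total = 0.0
    for left_name, right_name in pairs:
        lx, ly, lz = landmarks[left_name]
        # Mirror the left point across the x = 0 midline, then measure
        # how far it lands from its right-side counterpart.
        total += math.dist((-lx, ly, lz), landmarks[right_name])
    mean_error = total / len(pairs)
    # Map the mean mismatch (in landmark units) onto a 0-1 scale.
    return 1.0 / (1.0 + mean_error)

# A perfectly symmetric toy face scores 1.0; drooping one mouth
# corner (as in facial palsy) lowers the score.
symmetric = {
    "mouth_left": (-2.0, 0.0, 0.0), "mouth_right": (2.0, 0.0, 0.0),
    "brow_left": (-1.5, 3.0, 0.5), "brow_right": (1.5, 3.0, 0.5),
}
drooped = dict(symmetric, mouth_left=(-2.0, -0.8, 0.0))

print(symmetry_score(symmetric))  # 1.0
print(symmetry_score(drooped))    # lower than 1.0
```

Tracked over weeks of exercises, a metric like this is the kind of number a clinician could chart, which a mirror alone can never provide.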

As James explains in the episode, the timing of that retraining matters enormously. For Bell’s palsy, starting exercises in the first three months gives patients the best chance of regaining full symmetry and movement. Anything that improves compliance in that critical window could meaningfully change outcomes.

From Face ID to the NHS

The idea for Facial Dynamics grew out of James’s early experimentation with the TrueDepth sensor introduced in the iPhone X in 2017. Working with augmented reality at Sonin, he began to explore what the sensor’s infrared depth data could actually be used for beyond unlocking a phone.

The connection to facial rehabilitation came from a familiar source: the FAST campaign, long plastered on the side of ambulances, which uses facial weakness as a primary indicator of stroke. If facial movement could be detected and measured with a device already in millions of people’s pockets, could that data be genuinely useful in stroke recovery?

Years of quiet development followed, alongside conversations with friends in the industry; with clinicians at Queen Victoria Hospital in East Grinstead, the UK’s largest facial palsy clinic; and with researchers at universities already working in this space. The feedback was consistently positive, and seed funding eventually allowed James to pursue the project full time.

Building for a Sensitive Audience

One of the most thoughtful parts of this conversation is the discussion around user research in a medical context. Getting direct access to patients living with facial palsy is not straightforward. Ethical considerations, clinical intermediaries, and the emotional vulnerability of the people involved all create real constraints.

James’s approach has been to work through the channels that do exist: charities like the Stroke Association, Facial Palsy UK, and Different Strokes. He shows up, speaks at events, and lets people come to him. That approach has generated not just feedback but genuine advocates: people who want to see the product succeed and are willing to help shape it.

At the same time, James has had to balance what patients want with what clinicians are willing to adopt. These are not always the same thing, and navigating that gap (making the case to clinicians that the features patients ask for will ultimately improve compliance) has been one of the project’s ongoing challenges.

His answer to the risk of building the wrong thing? A tech demo. Early on, James built a scrappy prototype that let him quickly try out ideas and show them to clinicians and patients alike. That demo has since become an invaluable reference point: when people ask today whether Facial Dynamics could do something, he can often pull up the old prototype and show them it has already been tried.

What Comes Next

The long-term ambition for Facial Dynamics goes well beyond rehabilitation. The goal is automated diagnosis, a system capable of scanning someone’s face, evaluating the type and extent of facial palsy, and predicting likely recovery trajectories over weeks and months.

There is already interest from University College London Hospitals in deploying the technology in ambulances to support triage of suspected strokes before patients reach hospital. In areas with poor signal, video calls back to hospital stroke specialists are unreliable. A diagnostic tool running locally on the device could make a significant difference to outcomes in those cases.

The immediate next step is regulatory approval as a medical device and wider deployment into NHS pathways.

What This Episode Covers

  • How a feature of the iPhone X became the foundation of a healthcare product
  • Why mirror-based facial rehabilitation fails so many patients — and what Facial Dynamics does differently
  • The realities of user research in a clinical context and how to work around the constraints
  • How to build trust with clinicians while still advocating for what patients actually need
  • Why a quick-and-dirty tech demo is often the most valuable product you can build
  • The long-term vision for automated facial palsy diagnosis in ambulances and A&E

Watch & Subscribe