In 2025, a team at the University of Cambridge and University College London published something that sounds deceptively simple: a robotic glove made from conductive gelatin. What David Hardman, Thomas George Thuruthel, and Fumiya Iida actually built is far more interesting: a multi-modal sensing system that reads over 860,000 distinct conductive pathways through just 32 electrodes at the wrist. That's not a sensor. That's a network.
The paper, published in Science Robotics, frames this as a tactile problem. But read it closely and you see the deeper question: what if sensing isn’t about placing sensors in the right spots — what if the material itself is the computation?
That reframe is why this matters to anyone watching the convergence of biology and computation. Skin doesn’t compute the way silicon does. It computes the way a distributed biological system does — through pattern, redundancy, and emergent signal processing across a medium that is simultaneously structural and informational.
The Problem with Traditional Robotic Touch — Fragmentation Is a Bug
Every conventional approach to robotic skin suffers from the same architectural flaw: sensor fragmentation. You need one type of sensor for pressure, another for temperature, another for shear force. Embed them all in a flexible substrate, and you’ve created a wiring nightmare that breaks easily, costs a lot to fabricate, and produces signals that actively interfere with each other.
This is analogous to the early days of computing, when analog circuits handled each function discretely. The breakthrough came when engineers stopped asking “how do I build a better adder?” and started asking “what if one substrate could represent all states?” That substrate became silicon. For robotics and biohybrid systems, the equivalent question is finally being asked about soft matter.
The Cambridge-UCL solution uses a single gelatin-based conductive hydrogel — a soft, stretchy material that is electrically active throughout its entire volume, not just at discrete points. When any part of the surface is stimulated — thermally, mechanically, electrically — the impedance patterns across the entire material shift. Those shifts are the data.
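The intuition that a single touch perturbs measurements everywhere can be sketched with a toy resistor-grid simulation. This is not the authors' model: the grid size, conductance values, "touch" location, and electrode placement below are all invented for illustration. The point is only that changing one local conductance in a continuous conductive medium shifts every boundary measurement, which is what makes volumetric sensing from edge electrodes possible.

```python
import numpy as np

def edges(n):
    """Yield the edges of an n x n resistor grid."""
    for i in range(n):
        for j in range(n):
            if i + 1 < n:
                yield ((i, j), (i + 1, j))
            if j + 1 < n:
                yield ((i, j), (i, j + 1))

def grid_laplacian(n, conductances):
    """Weighted graph Laplacian of the grid; conductances maps edge -> value."""
    N = n * n
    L = np.zeros((N, N))
    for (a, b), g in conductances.items():
        ia, ib = a[0] * n + a[1], b[0] * n + b[1]
        L[ia, ia] += g
        L[ib, ib] += g
        L[ia, ib] -= g
        L[ib, ia] -= g
    return L

def boundary_voltages(L, src, sink, probes):
    """Inject unit current src -> sink with the sink grounded; return probe voltages."""
    N = L.shape[0]
    I = np.zeros(N)
    I[src], I[sink] = 1.0, -1.0
    keep = [k for k in range(N) if k != sink]  # ground the sink node
    v = np.zeros(N)
    v[keep] = np.linalg.solve(L[np.ix_(keep, keep)], I[keep])
    return v[probes]

n = 6
base = {e: 1.0 for e in edges(n)}
pressed = dict(base)
pressed[((2, 2), (2, 3))] = 5.0  # a "touch": one interior edge becomes more conductive

probes = list(range(n))  # "electrodes" along the top edge of the grid
v0 = boundary_voltages(grid_laplacian(n, base), src=0, sink=n * n - 1, probes=probes)
v1 = boundary_voltages(grid_laplacian(n, pressed), src=0, sink=n * n - 1, probes=probes)
print(np.abs(v1 - v0))  # every boundary measurement shifts, not just one near the touch
```

Every probe voltage changes, however slightly, because current redistributes through the whole network. EIT exploits exactly this: many drive/sense combinations at the boundary over-determine where the interior change happened.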
860,000 Pathways from 32 Electrodes — The Math Is the Point
Here is the key number: 32 electrodes placed at the wrist can read over 860,000 conductive pathways, from which the team collected more than 1.7 million measurements across the whole hand. This is possible because the team used high-density electrical impedance tomography (EIT) — a technique borrowed from medical imaging — to reconstruct what's happening across the entire volume of the material, not just where the electrodes sit.
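The paper's exact drive-and-sense protocol isn't reproduced here, but one plausible back-of-envelope reading of the 860,000 figure assumes standard four-terminal impedance measurements: an ordered pair of electrodes injects current while an ordered pair of the remaining electrodes senses voltage. That combinatorial count lands strikingly close to the reported number — treat this as an illustration of why small electrode counts explode into huge measurement spaces, not as the paper's confirmed protocol.

```python
# Back-of-envelope count of four-terminal EIT configurations with 32 electrodes.
# Assumption (not from the paper): ordered drive pairs and ordered sense pairs,
# with sensing restricted to electrodes not currently injecting current.
n = 32
drive_pairs = n * (n - 1)        # 992 ordered current-injection pairs
sense_pairs = (n - 2) * (n - 3)  # 870 ordered voltage-sensing pairs per injection
total = drive_pairs * sense_pairs
print(total)  # 863040 — on the order of the "over 860,000" pathways reported
```

The quadratic-times-quadratic scaling is the structural insight: doubling the electrode count multiplies the measurement space roughly sixteenfold.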
Biological skin works on a similar principle. Your fingertip doesn’t have a discrete pressure sensor under each ridge. It has a distributed network of mechanoreceptors — Meissner corpuscles, Merkel discs, Ruffini endings — whose collective firing patterns the nervous system interprets as texture, temperature, or pain. The hydrogel skin is a crude but functionally honest analog of that architecture.
What the material can already distinguish:
- The tap of a human finger vs. a robotic arm
- Gentle contact vs. firm pressure
- Hot vs. cold surfaces
- Puncture or cutting damage
- Multi-point simultaneous contact
All from one material. No sensor switching. No signal routing.
Machine Learning as the Decoder Ring
The raw impedance data is meaningless without interpretation. This is where the collaboration between material design and machine learning becomes the real story. The researchers trained a model on physical test data — heat gun blasts, finger pressure, scalpel cuts — to learn which of the 860,000 pathway patterns correspond to which type of contact.
This is not a gimmick. It mirrors how biological nervous systems develop. A newborn’s skin doesn’t arrive pre-calibrated to distinguish hot metal from a mother’s touch. That discrimination is learned through experience and feedback. The Cambridge team’s model is doing the same thing — using physical experience to encode a functional map of the material’s response space.
The implications go beyond robotics. If a material’s entire bulk is a sensing surface, and machine learning can decode the signals, then the design question shifts from where do I place sensors? to what material properties create the richest signal space? That is a biology question. Living tissues have spent 500 million years solving exactly this optimization problem.
Biohybrid Skin Is the Next Move
The current hydrogel is synthetic. But the architecture it demonstrates — distributed volumetric sensing decoded by learned pattern recognition — is precisely the template that biohybrid systems research is converging toward.
Groups working on living materials and muscle-actuated robots already know that synthetic actuators are the bottleneck. The Cambridge result suggests that synthetic sensing is the parallel bottleneck. The two problems share a solution space: replace discrete, fragile engineered components with continuous biological or bio-inspired materials that handle complexity through their intrinsic physics.
At BioComputer, we track this as one of the cleaner proofs that biology-as-computation is not metaphor. When a material’s electrical behavior across 860,000 pathways encodes physical reality, and a neural network learns to read that encoding — you have computation in the literal sense. The substrate happens to be gelatin today. It won’t be next decade.
The Durability Problem Is Real — and Solvable
Hardman and Thuruthel are candid about the current limitations: the skin is not yet as sensitive as human skin, and durability remains a work in progress. Gelatin-based hydrogels are not known for robustness under industrial conditions.
But this is the right stage for the right limitation. Sensitivity and durability in biological systems are not solved by making the material harder — they’re solved by adding self-repair mechanisms. Human skin heals. The next generation of this technology, likely involving hydrogels seeded with living cells or bio-inspired polymer networks that reform after damage, will address this through biology rather than around it.
The Samsung Global Research Outreach Program and the Royal Society funded this work — both organizations with clear commercial and institutional interest in where soft robotics is heading. This is not a curiosity project.
Skin Has Always Been a Computer — We Just Forgot
The moment you accept that skin is not a passive barrier but an active information-processing surface, the biohybrid future becomes legible. Cambridge’s conductive hydrogel is a proof of principle for something biology has demonstrated for hundreds of millions of years: you don’t need discrete components to compute. You need the right medium, the right signals, and a system capable of learning the mapping between them.
When the medium is biological — or bio-inspired enough to self-repair, adapt, and grow — the distinction between sensor and computer dissolves entirely.
References
- Hardman, D., Thuruthel, T.G., Iida, F. (2025). Multimodal information structuring using single layer soft sensory skins and high-density electrical impedance tomography. Science Robotics. https://doi.org/10.1126/scirobotics.adq2303
- University of Cambridge. (2025). Single-material electronic skin gives robots the human touch. University of Cambridge Research Stories. https://www.cam.ac.uk/stories/robotic-skin
- Thuruthel, T.G., et al. (2019). Soft robot perception using embedded soft sensors and recurrent neural networks. Science Robotics. https://doi.org/10.1126/scirobotics.aav1488
- Tee, B.C.K., et al. (2012). An electrically and mechanically self-healing composite with pressure- and flexion-sensitive properties for electronic skin applications. Nature Nanotechnology. https://doi.org/10.1038/nnano.2012.192
Related: Biohybrid Robots: When Machines Grow Their Own Muscles and Brains · Programmable Biology: When Cells Become Living Software
Feature image: AI-generated using Grok.