Haptics Cyborg

Haptics Cyborg collaboration is funded by Cyborg Foundation, Pioneer Works, and New Inc.

“Art and science have their meeting point in method.” — Edward G. Bulwer-Lytton


Stephanie Dinkins is an artist whose body is failing. As a result of major spinal surgery, she has impaired proprioception in her lower left leg. This means her body does not always know where her leg is in space, and her foot sometimes feels like little more than dead weight she drags around. Balance is no longer a given. She sometimes stumbles, she sometimes falls.

As an experienced patient, a medical explorer, and an artist interested in creating platforms for ongoing dialog about cybernetics as it intersects aging and our future histories, she offers her views and her body as a testing ground for such explorations.



We started out with a broad concept to develop a cybernetic sense: “Cyborg Futures will team together artists of any discipline with technologists and scientists to help build new senses, or expand those that already exist, in order to pursue new perceptive possibilities. These unique forms of experience are the basis for the creation of a new type of cybernetic art – one that redefines the relationship between technology and the human organism.”


Our team’s aim is to alter perceptual awareness of personal space – improving the slow and sometimes non-existent communication between Stephanie’s brain and her legs.




The first period of exploration involved observing and analyzing Stephanie's body movement, then brainstorming the possible haptic feedback we might build. It is a divergent process: asking a lot of questions before narrowing down to specific solutions.


Acquiring advanced knowledge

Even with good concepts and execution skills, when it came to the cutting-edge cyborg community, we still lacked domain knowledge. So, in the meantime, we reached out and connected with artists who practice in the same field to get insights and advice for building our own cybernetic sense.

From L to R: Zack Freedman – Fat Cat Labs / Liviu Babitz – Co-Founder at Cyborg Nest / Trevor Goodman – BDYHAX / Viktoria Modesta – MIT Media Lab Fellow and bionic artist





Iteration & Exploration

For our early prototype, we attempted to reappropriate the Myo device (https://www.myo.com/) for use on the leg. However, after repeated testing and trials, we found that Myo's built-in gesture detection system was not as responsive as we expected, especially on legs.


This led to a decision to build our own system. Based on the limited exploration phase, we proposed a specific set of devices for motion capture and feedback transmission — in this case, the Adafruit BNO055 IMU.

We then made a rough motion capture of Stephanie's gait in order to figure out where we wanted to locate the sensors, and what we wanted to do with the data collected from them.

As shown in the motion capture video, Stephanie jerks her entire body into a tilted position when she walks, so the legs, shoulders, and head are all important points. We decided that we needed at least one sensor on each leg (at the knees for the first experiment), so we could compare the positions of the sensors to one another. From there, by adding more sensors, we can compare more factors, such as how the left ankle moves in relation to the left and/or right knees and ankles.
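The comparison described above — reading each knee sensor's orientation and looking at the difference between the two sides — can be sketched as follows. This is an illustrative sketch, not the project's actual code; it assumes the IMUs report orientation as unit quaternions (as the BNO055 can), and the function names are placeholders.

```python
import math

def quat_to_euler(w, x, y, z):
    """Convert a unit quaternion (as reported by a BNO055-style IMU)
    to roll, pitch, yaw in degrees."""
    roll = math.degrees(math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y)))
    # Clamp the asin argument to guard against floating-point drift.
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x)))))
    yaw = math.degrees(math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z)))
    return roll, pitch, yaw

def knee_asymmetry(left_quat, right_quat):
    """Per-axis orientation difference between the two knee sensors —
    a crude proxy for gait asymmetry."""
    left = quat_to_euler(*left_quat)
    right = quat_to_euler(*right_quat)
    return tuple(l - r for l, r in zip(left, right))
```

With identical readings on both knees the asymmetry is zero on every axis; a persistent non-zero difference during the stance phase would flag the kind of tilt visible in the motion capture.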


  • IMU -> Feather


  • IMU -> Feather -> iPhone

We worked on the technical side to get the Feather transmitting data from the IMU to a computer or phone via Bluetooth. We were then able to use the working sensors on the body to start collecting, reading, and comparing values between the gyroscopes.
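On the receiving side, comparing gyroscope values between legs comes down to parsing the streamed samples and differencing them axis by axis. A minimal sketch follows; the comma-separated line format and sensor IDs are assumptions for illustration, not the project's actual wire protocol.

```python
def parse_imu_line(line):
    """Parse one streamed line, e.g. 'L,12.5,-3.0,0.8' ->
    (sensor id, gyro x/y/z in deg/s). Format is a placeholder assumption."""
    parts = line.strip().split(",")
    return parts[0], tuple(float(v) for v in parts[1:4])

def gyro_delta(sample_a, sample_b):
    """Axis-wise difference between two gyro readings, used to compare
    left- vs right-leg rotation rates."""
    return tuple(a - b for a, b in zip(sample_a, sample_b))

# Example: compare one left-knee and one right-knee sample.
_, left = parse_imu_line("L,12.5,-3.0,0.8")
_, right = parse_imu_line("R,10.0,-3.0,0.8")
delta = gyro_delta(left, right)  # (2.5, 0.0, 0.0)
```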


We visualized and interpreted IMU data (absolute orientation, acceleration vector, and linear acceleration vector) in MATLAB.
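A useful sanity check when interpreting those three streams: the BNO055's linear-acceleration vector is the raw accelerometer reading with the gravity estimate removed. The sketch below reproduces that relation in Python (the MATLAB work above used the same data; this is just an illustrative cross-check, not the project's analysis code).

```python
def linear_acceleration(accel, gravity):
    """Recover linear acceleration (m/s^2) by subtracting the sensor's
    gravity estimate from the raw acceleration vector, component-wise."""
    return tuple(a - g for a, g in zip(accel, gravity))

# A sensor at rest reads pure gravity, so its linear acceleration is zero:
at_rest = linear_acceleration((0.0, 0.0, 9.81), (0.0, 0.0, 9.81))
```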


  • IMU -> Feather -> iPhone -> transducers

As we continued our research and experiments on the output side, we landed on the idea of using the sensors to inform the body via bone-conduction transducers.

The idea is for the transducer to generate aural, haptic, and neural activity that triggers muscle movement as fast as, or faster than, a normally functioning body.

Having succeeded in sending data from the sensors to a Bluetooth receiver attached to the bone transducers, we are currently working on building a seamless connection model (using the tools above to build a phone app) and creating an effective mapping system (based on the detected speed and displacement of the legs) for turning input into output.
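One way to think about that mapping system is as a transfer function from a detected gait quantity to a transducer drive level. The sketch below maps a tilt angle to a 0–1 intensity with a dead zone (so normal sway produces no feedback) and a clamp. All thresholds here are illustrative placeholders, not calibrated values from the project.

```python
def map_tilt_to_intensity(tilt_deg, dead_zone=2.0, max_tilt=20.0):
    """Map detected leg/torso tilt (degrees) to a 0..1 drive level
    for the bone-conduction transducer. Thresholds are hypothetical."""
    magnitude = abs(tilt_deg)
    if magnitude <= dead_zone:
        return 0.0  # ignore normal sway
    if magnitude >= max_tilt:
        return 1.0  # clamp at full intensity
    # Linear ramp between the dead zone and the clamp point.
    return (magnitude - dead_zone) / (max_tilt - dead_zone)
```

A dead zone matters here: without one, the transducers would buzz constantly during ordinary walking and the signal would lose meaning.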

Current Stage

We are treating the current prototype as a wearable design challenge. We want to explore alternative approaches to form and design, such as the materials of manufacture, and we are considering building a project website for future iterations and publication.