Update: You can follow the discussion of this article around the web on HN, Adafruit, Hackaday, and Hackster.io!
This article explores the research and development journey behind my new sensor system, OptiGap, a key component of my PhD research. I’m writing this in a storytelling format to offer insights into my decision-making process and the evolution leading to the final implementation. It should hopefully provide a glimpse into the sometimes-shrouded world of PhD research and may appeal to those curious about the process. For a deeper dive into technical specifics, simulations, and existing research on this subject, my dissertation is available online here.
What does it do?
In very general terms, this sensor is essentially a rope that, when bent, can tell you where along its length the bend occurred. The fancy term for that is “bend localization.”
OptiGap’s application is mainly within the realm of soft robotics, which typically involves compliant (or ‘squishy’) systems, where the use of traditional sensors is often not practical. The name OptiGap, a fusion of “optical” and “gap,” reflects its core principle of utilizing air gaps within flexible optical light pipes to generate coded patterns essential for bend localization.
How the OptiGap Sensor System Started
The idea for OptiGap came about while I was experimenting with light transmission through various light pipes (optical cables) for use as a bend detection sensor. I was initially trying to see how I could effectively “slow down” light through the fiber…a seemingly straightforward task, right?
During this process, I attached a section of clear 3D printer filament (1.75mm TPU) to a piece of tape measure with electrical tape for an experiment and incidentally discovered that when I bent the tape measure (and filament) at the spot where the electrical tape was attached, there was a significant drop in light transmission. I hypothesized that this was because the sticky residue of the electrical tape was causing the filament to stretch, which in turn reduced the light transmission.
To verify this hypothesis, I attached a longer piece of TPU to a tape measure and began bending it at various points to observe how light transmission would change.
I wrote a small Linux I2C driver for the VL53L0X ToF sensor to run on a Raspberry Pi and push the data to a socket using ZeroMQ. I then created a rough GUI in Python to pull the sensor data from the socket and visualize the light transmission data in real time, shown in the GIF below, which very quickly validated my hypothesis. This validation marked the “Eureka!” moment that sparked the eventual development of the OptiGap sensor.
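For those curious, the general shape of that plumbing is sketched below: a ZeroMQ PUB socket on the Pi streams samples to the GUI. This is a minimal illustration, not the original driver code; read_intensity(), the port number, and the sample rate are all placeholders.

```python
# Publisher side (Raspberry Pi) -- minimal sketch, not the original driver
import time
import zmq

def read_intensity():
    # Placeholder: the real version read the VL53L0X over I2C via a
    # custom driver. Substitute your own sensor read here.
    return 0.0

ctx = zmq.Context()
pub = ctx.socket(zmq.PUB)
pub.bind("tcp://*:5556")  # port number is arbitrary here

while True:
    pub.send_json({"t": time.time(), "intensity": read_intensity()})
    time.sleep(0.02)  # ~50 Hz, also a placeholder rate
```

On the GUI side, a SUB socket connected to the same port with an empty subscription filter (sub.setsockopt(zmq.SUBSCRIBE, b"")) pulls each JSON sample off the socket for plotting.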
The OptiGap Realization
I realized that since I could control where the light was being attenuated, I could use this to encode information about the position of the bend on the sensor. Using electrical tape was not a practical solution, so I started looking for a more reliable and consistent way to create these attenuations. This led me to the idea of cutting the filament and then reattaching it together using a flexible rubber (silicone) sleeve, leaving a small air gap, as shown in the image below.
The main working principle of the air gap is that translation and/or rotation of one light pipe face relative to the other changes the fraction of light transmitted across the gap. The greater the bend angle, the more light escapes across the gap. The resulting change in intensity of the optical signal can then be correlated with known patterns for use as a sensor.
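To build some intuition for why this works, consider a toy geometric model (my own simplification here, not the full analysis from the dissertation): treat bending as a small lateral offset between the two fiber faces, so the transmitted fraction is roughly the overlap area of two equal circles. This ignores beam divergence, Fresnel losses, and angular misalignment, but it captures the monotonic intensity drop:

```python
import math

def circle_overlap_fraction(r, d):
    """Fraction of a circular fiber face of radius r still covered by an
    identical face laterally offset by d (toy model only)."""
    if d >= 2 * r:
        return 0.0
    # Standard lens-area formula for two equal circles at center distance d
    overlap = 2 * r**2 * math.acos(d / (2 * r)) - (d / 2) * math.sqrt(4 * r**2 - d**2)
    return overlap / (math.pi * r**2)

# Example: a 0.25 mm offset between 0.5 mm radius faces
print(circle_overlap_fraction(0.5, 0.25))  # ~0.68, a noticeable attenuation
```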
The Big Idea
I then proceeded to test this idea by creating multiple air gaps in a row and bending the filament to measure the attenuation.
As depicted in the GIF below, the optical intensity decreases at each air gap, with a more noticeable decrease as the bend angle increases. This initial experimentation served as proof of concept, demonstrating the feasibility of the idea. It led to the formulation of my final hypothesis: using a pattern of these air gaps to encode information about the sensor’s bending, and employing a naive Bayes classifier on a microcontroller to decode the bend location.
This concept resembles the functionality of a linear encoder. Linear encoders gauge an object’s linear movement, typically comprising a slider rail with a coded scale akin to a measuring ruler and a sensing head that moves across this scale to read it. Linear (absolute) encoders emit a distinct code at each position, ensuring consistent identification of displacement.
The OptiGap system, functioning like an absolute encoder, would encode absolute positions using patterns of bend-sensitive air gaps along parallel light pipes, effectively serving as a singular fiber optic sensor.
Encoding the Bend Location using Inverse Gray Code
Absolute encoders commonly employ Gray code, a binary system where two successive values differ in only one bit. This property allows for various applications, including error checking. However, Gray code isn’t optimal for the OptiGap sensor system. Here, we aim for consecutive values to differ by the maximum number of bits to facilitate easier differentiation. This necessity gave rise to Inverse Gray code.
Inverse Gray code is a binary code where two successive values differ by the maximum (n-1) number of bits. To implement this, I simply create cuts in the filament wherever there’s a “1” in the Inverse Gray code sequence. This approach scales to any number of bits. For the prototype, I used 3 bits, providing 8 possible positions.
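As a concrete illustration, the small Python sketch below finds such a sequence by backtracking: it orders all 2^n codewords so that successive codes differ in at least n-1 bits. This is just one way to produce a valid sequence, not necessarily the construction used in the dissertation.

```python
def inverse_gray_sequence(n):
    """Order all 2**n codewords so each successive pair differs in at
    least n-1 bits, via simple backtracking (one valid construction)."""
    def extend(seq, remaining):
        if not remaining:
            return seq
        for c in sorted(remaining):
            # Hamming distance between the last code and candidate c
            if bin(seq[-1] ^ c).count("1") >= n - 1:
                result = extend(seq + [c], remaining - {c})
                if result is not None:
                    return result
        return None  # dead end; caller tries the next candidate

    return extend([0], set(range(1, 2**n)))

for code in inverse_gray_sequence(3):
    print(f"{code:03b}")
# 000 011 100 001 110 101 010 111 -- each neighbor pair differs in >= 2 bits
```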
Visualization of the OptiGap Sensor System
The illustration below depicts the signal patterns of the OptiGap sensor system for each bend position using three fibers. By employing a naive Bayes classifier, the sensor system can discern bend positions based on signal patterns. The third graph represents actual sensor data from the prototype system, utilized for training the classifier on the microcontroller.
The OptiGap Prototype
I proceeded to construct a prototype of the OptiGap sensor system, utilizing 3 strands of clear TPU 3D printer filament, each featuring a distinct pattern of air gaps. The image below showcases the filament just before cutting, with the cut pattern indicated on a piece of tape.
For the prototype, I employed a commercial 3:1 fiber optic coupler to merge the light from the 3 strands into a single fiber optic cable, resulting in the completion of the sensor prototype, as depicted below.
This marked the final phase of validating the hypothesis and operational theory behind the OptiGap sensor.
Reducing the Physical Size
The initial prototype proved to be large and bulky, primarily due to the size of the 3D printer filament used. Drawing from previous experience, I recognized that PMMA (plastic) optical fiber offered a smaller and more flexible alternative suitable for this application. Consequently, I assessed 500, 750, and 1000 micron unjacketed PMMA optical fibers from Industrial Fiber Optics, Inc. for the sensor strands, resulting in a significant reduction in sensor size.
I conducted tests on all three types of fibers to evaluate their light transmission and flexibility. Among them, the 500 micron fiber emerged as the optimal choice overall, although all three exhibited sufficient flexibility for this application.
Reducing the Optical Transceiver Complexity
I decided to switch from using the complex VL53L0X ToF sensor to a simple photodiode and IR LED setup to reduce the complexity of the system and to increase modularity. This also allowed me to use a microcontroller to read the sensor data, which was a significant improvement over the initial prototype.
I then created a demo system for the sensor based around an STM32 microcontroller and a photodiode/IR LED setup.
Realtime Machine Learning on a Microcontroller
The final stage in developing the OptiGap sensor system involved integrating a naive Bayes classifier onto the STM32 microcontroller to decode the bend location from the sensor data. I opted for a naive Bayes classifier due to its efficiency compared to if-statements or lookup tables, its capability to handle new or previously unseen data, and its potential for increased accuracy by considering relationships between multiple input variables.
Implementing the naive Bayes classifier proved to be relatively straightforward. This classifier is a probabilistic model based on applying Bayes’ theorem to determine how a measurement can be assigned to a particular class, with the class representing the bend location in this context. I utilized the Arm CMSIS-DSP library for the classifier implementation.
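For reference, the decision rule itself is compact. The sketch below is a plain-Python rendering of Gaussian naive Bayes scoring in log space; the on-target version used the CMSIS-DSP implementation, and the parameter layout here is illustrative.

```python
import math

def gnb_predict(x, means, variances, priors):
    """Return the most likely class for feature vector x under a Gaussian
    naive Bayes model. means/variances are per-class lists of per-feature
    parameters; priors are the class prior probabilities."""
    best_class, best_score = None, -math.inf
    for k, (mu, var, prior) in enumerate(zip(means, variances, priors)):
        # Log prior plus the sum of per-feature log Gaussian likelihoods
        score = math.log(prior)
        for xi, m, v in zip(x, mu, var):
            score += -0.5 * math.log(2 * math.pi * v) - (xi - m) ** 2 / (2 * v)
        if score > best_score:
            best_class, best_score = k, score
    return best_class
```

Working in log space avoids the numerical underflow you would get from multiplying many small probabilities on a microcontroller's single-precision floats.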
Fitting the Sensor Data
The initial step in integrating the classifier was to fit the sensor data to a Gaussian distribution for each air gap pattern. To expedite this process, I developed a Python GUI for rapid labeling and fitting of the data using GNB (Gaussian Naive Bayes) from the scikit-learn library.
I later improved this UI to be more general and to allow for more complex data fitting.
The probabilities for each class were computed and saved as a header for use on the microcontroller.
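A minimal sketch of that export step, assuming scikit-learn's GaussianNB and made-up file and array names, might look like this:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB

# X: rows of per-fiber intensity features; y: labeled bend positions
# (file names are placeholders)
X = np.load("sensor_features.npy")
y = np.load("bend_labels.npy")

clf = GaussianNB().fit(X, y)

def c_array(name, arr):
    vals = ", ".join(f"{v:.6f}f" for v in np.ravel(arr))
    return f"static const float {name}[] = {{ {vals} }};\n"

with open("gnb_params.h", "w") as f:
    f.write("/* Auto-generated Gaussian naive Bayes parameters */\n")
    f.write(c_array("gnb_means", clf.theta_))        # per-class feature means
    f.write(c_array("gnb_vars", clf.var_))           # variances (sigma_ in older sklearn)
    f.write(c_array("gnb_priors", clf.class_prior_)) # class priors
```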
Filtering the Sensor Data
To enhance the accuracy of the classifier, I implemented a two-stage filtering process on the STM32. The first stage was a simple moving average filter, followed by a Kalman filter in the second stage.
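In Python terms, the two stages look roughly like the sketch below; the window size and noise covariances are illustrative placeholders, not the tuned values from the firmware.

```python
from collections import deque

class TwoStageFilter:
    """Moving average followed by a 1-D Kalman filter (constant-level model)."""
    def __init__(self, window=8, q=1e-4, r=1e-2):
        self.buf = deque(maxlen=window)
        self.q, self.r = q, r       # process / measurement noise variances
        self.x, self.p = 0.0, 1.0   # state estimate and its variance

    def update(self, z):
        # Stage 1: moving average smooths high-frequency ADC noise
        self.buf.append(z)
        z_avg = sum(self.buf) / len(self.buf)
        # Stage 2: scalar Kalman filter (predict, then correct)
        self.p += self.q
        k = self.p / (self.p + self.r)   # Kalman gain
        self.x += k * (z_avg - self.x)
        self.p *= (1.0 - k)
        return self.x
```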
The OptiGap Sensor System Demo
The GIFs provided below illustrate various stages of the OptiGap sensor system, encompassing assembly and the operational demonstration of the final sensor system.
System Overview
Assembly of an OptiGap Sensor using TPU Filament
Attenuation of Light through the OptiGap Sensor
Fitting of the Sensor Data
Segment Classification using PMMA Optical Fiber
Segment Classification using TPU Filament
Underwater Operation
OptiGap Design Specifications
Key Properties & Parameters
Material Recommendations
Next Steps
I’ve made significant progress on the OptiGap system beyond what’s documented here, including its integration into another modular actuation and sensing system I developed called EneGate.
This has involved custom PCB design and systems integration, detailed in my dissertation. Additionally, I’ve prototyped miniature PCB versions of the optics to interface with the PCBs for the EneGate system.
I’ve also validated OptiGap on a real-world soft robotic system, with full details set to be presented in an upcoming RoboSoft paper titled “Embedded Optical Waveguide Sensors for Dynamic Behavior Monitoring in Twisted-Beam Structures.”
Commercialization
There’s an ongoing commercialization aspect to this research as well. Feel free to reach out if you’re interested in further details.
That’s it for now!
I don’t want to make this too long so I’ll end here. I hope this provided some insight into the research and development process involved in something like this. If you have any questions or would like to learn more, don’t hesitate to contact me!
Looks good! I can definitely see use for this. Have you characterised the stiffness of this?
This reminds me of an optical time-domain reflectometer.
https://en.wikipedia.org/wiki/Optical_time-domain_reflectometer
Great progress here! Thank you for sharing the details.
I would love to catch up!
Best, Tim Reha
Some ideas for mapping infrastructures – (OIL+GAS)
Assuming there are no segment/length limitations, or at least the same as for regular fibers:
A tracing line along undersea cables and pipes – a tracing line integrated into the housing of an undersea cable could give a precise mapping of the cable, which could also act as a line measurement for the profile of the sea floor.
These could also run inside oil pipelines; coupled with other sensors/data, they could give extremely precise locations for cuts or breaches in a line or pipe.
With an alerting threshold, it could confirm compliance with bend-radius requirements for a lot of things, and perhaps also measure the tension applied to lines used for various rigging.
It would be interesting to see it spiraled around a pipe/tube in a helix/double helix to give an even better full measurement of the bends/tensions/orientations of the objects it’s measuring.
A very short one on hinges could potentially very cheaply detect any state of being ajar.
Put into gloves, it could be the easiest way to translate the movements of a human hand to the movements of a mechanical hand – such that if you’re wearing a glove that controls the hand of a mech, you could potentially have precision the same as your physical digits. Coupled with haptics, it could make driving huge mech hands very sensitive and dexterous.
Woven into fabrics, it could be placed into the soles of athletic shoes to precisely monitor how the foot is working in all aspects of running, walking, etc. – this could then be mapped to algos to better mimic the human foot with a robot foot.
Woven into fishing-trawler nets, it could be used to accurately map the ocean floor where trawling occurs.
Sandwiched into carbon fiber layers used in, say, spacecraft designs, it could provide hull/skin integrity measurements with minimal cost/weight. If there is a deformation in the hull or skin of a craft, it could pinpoint exactly where the deformation is happening – or where a hole/crack has appeared.
In a body suit, it could remove the need for visual mocap – you could have it on an athlete/soldier/whatever to get full detail of the physical position of a remote person you are tracking, such that you could get the full spatial positioning of a team as they go through a place where you can’t have visuals on them.
It could also record human movements so that humanoid robots can learn those movements easily.
Lining aircraft wings, it could continuously measure flex/fatigue of an airframe and give a better long-term lifecycle picture of the stresses on the metal – or more accurately measure the overall stresses in certain high-performance maneuvers to give real-time feedback on the stresses imposed by various actions for more refined flight control.
Basically, if you can integrate it into certain domains, it could be the same as adding a layer of “fluid/aero dynamics” to a real object in real time.
The list can go on.
It would be best to see how thin and tightly woven you can make these…
I have many more applications… but the above are all fairly obvious ones.
This is great work, son. You’ve always given your best to whatever you apply yourself to. Thanks for your contribution to the wealth of knowledge out there. Stave’s feedback has fully shown what your research will do for the world we live in. Congratulations.
This reminds me of the joke I have always shared with you each time you make something I do not fully understand, i.e.:
I do not understand what your research is all about but I am glad that such a great man like you is on our side.
Your Dad,
Bishop Paul Bupe (Dmin.)