We developed Interhaptics with the aim of standardizing the creation and delivery of rich haptics experiences for all haptics devices.
At first, the objective seemed daunting, and many people in the haptics community were not convinced it was possible.
There were multiple reasons for that:
- Human tactile perception is complex
- Actuators are wildly different from one another
- Adding surface haptics and XR to the mix makes it a huge mess
We started this journey back in 2017 with a large research effort into the available haptics technologies and the human perceptions they could target. We realized that these technologies had much more in common than was commonly believed. There was a layer of similarity between the human perception of a spatial texture delivered with an electroadhesion device and a similar sensation delivered with a wideband vibrotactile actuator, even though the driving software and circuitry were vastly different.
This work allowed us to build a rendering and encoding model for perceptual haptics, which we brought to market at the end of 2019.
We tested this model and technology on the market by developing partnerships with haptics technology and device manufacturers, which allowed us to understand the limits of our first implementation. Some examples are:
- Partnership with SenseGlove
- Partnership with Nanoport
We delivered several customer projects using Interhaptics. Being our own customers, and iterating early with the developer and integrator community, helped us iteratively refine the product and understand what needed to be modified at the core level to make it scalable to every haptics device in the world.
The breakthrough happened with the MPEG Call for Proposals (CfP). The objective was to create the MP3 of haptics! That was something to reflect upon, and it forced us to challenge our basic assumptions.
The technological and scientific challenge was hard: meet, with the same encoding and perceptual model, both the needs of wideband PCM encoding (for rich, music-like signals) and faithful transcoding of widely used descriptive codecs such as AHAP from Apple and IVS from Immersion.
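To make that dual requirement concrete, here is a minimal sketch of what a unified model could look like: one track holding both sampled (PCM-style) bands and descriptive keyframe bands, rendered through a single interface. All names and structures below are hypothetical illustrations for this post, not the Interhaptics or MPEG data model.

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class PcmBand:
    """A sampled signal band, e.g. a wideband vibrotactile track."""
    sample_rate_hz: int
    samples: List[float]          # normalized amplitude in [-1.0, 1.0]

@dataclass
class Keyframe:
    """A (time, value) control point, the style descriptive formats use."""
    time_s: float
    value: float                  # e.g. an intensity envelope in [0.0, 1.0]

@dataclass
class DescriptiveBand:
    """A parametric band, e.g. transcoded from an AHAP/IVS-style effect."""
    parameter: str                # hypothetical label, e.g. "intensity"
    keyframes: List[Keyframe] = field(default_factory=list)

def _interpolate(keyframes: List[Keyframe], t: float) -> float:
    """Linear interpolation between descriptive keyframes."""
    if not keyframes or t < keyframes[0].time_s or t > keyframes[-1].time_s:
        return 0.0
    for prev, nxt in zip(keyframes, keyframes[1:]):
        if prev.time_s <= t <= nxt.time_s:
            span = (nxt.time_s - prev.time_s) or 1e-9
            frac = (t - prev.time_s) / span
            return prev.value + frac * (nxt.value - prev.value)
    return keyframes[-1].value

@dataclass
class HapticTrack:
    """One perceptual channel composed of heterogeneous bands."""
    bands: List[Union[PcmBand, DescriptiveBand]] = field(default_factory=list)

    def value_at(self, t: float) -> float:
        """Render the track at time t by summing each band's contribution."""
        total = 0.0
        for band in self.bands:
            if isinstance(band, PcmBand):
                idx = int(t * band.sample_rate_hz)
                if 0 <= idx < len(band.samples):
                    total += band.samples[idx]
            else:
                total += _interpolate(band.keyframes, t)
        return total

# Usage: mix a short PCM burst with a descriptive intensity ramp.
track = HapticTrack(bands=[
    PcmBand(sample_rate_hz=8000, samples=[0.0, 0.5, -0.5, 0.3]),
    DescriptiveBand("intensity", [Keyframe(0.0, 0.0), Keyframe(0.5, 1.0)]),
])
print(track.value_at(0.00025))    # both band types feed one renderer
```

The point of the sketch is the shape of the problem, not the solution: once every band, sampled or descriptive, answers the same rendering query, the same codec and perceptual model can serve both a music-like PCM signal and a transcoded AHAP/IVS effect.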
The team did their best, and in September, at MPEG 136, the Interhaptics proposal was chosen as the foundational architecture for the future MP3 of haptics. Read more about it here.
We are deeply aware that without great creator-facing content tools and simple SDKs, software implementers will hardly adopt the standard.
For this reason, in parallel with the CfP, we restructured the overall Interhaptics architecture to meet the challenging needs of content creators, haptics technology manufacturers, and haptics OEMs alike.
We are glad to announce that the wait is over!
Our Haptics Composer 2.0 and the Interhaptics Engine are now ready to enter their beta-testing phase and will launch on the market shortly, in line with our product release roadmap.
Subscribe at this link if you want to participate in the beta.
To celebrate this milestone, we have decided to offer our Haptics for Mobile SDK for free on the Unity Asset Store until the new year.
We would love to hear your thoughts: leave feedback and join our Discord community.