This past Wednesday, I was super excited to share my most recent haptics project with the rest of my class.
I made and edited all of the models myself. For this application the models needed to be low poly, which allows for faster loading and easier interactivity. The stomach and the probe are polygon models built in 3DS Max. The knee was derived from the Visible Human DICOM data set; it is a female's left tibia. Our class extracted the model using a program called Mimics, which segments DICOM data: a layer mask is applied to a range of values the user defines. After I applied the value range to select bone (which, for some reason, is done in Mimics with the 'soft tissue' preset; yeah, I didn't understand that either), I manually went through the data to double-check the selection. After some fine tuning and the press of an 'export' button, I had an *.stl of the tibia!
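I don't know exactly what Mimics does under the hood, but the core idea of a value-range mask is simple thresholding over the voxel intensities. Here's a minimal Python/NumPy sketch of that idea; the toy "scan" values and the cutoff range are made up purely for illustration:

```python
import numpy as np

def threshold_mask(volume, lo, hi):
    """Return a boolean mask selecting voxels whose intensity falls
    inside [lo, hi] -- the same idea as a value-range layer mask."""
    volume = np.asarray(volume)
    return (volume >= lo) & (volume <= hi)

# Toy 2x2x2 "scan": values above ~300 are roughly bone-like here.
scan = np.array([[[-100, 400], [250, 1200]],
                 [[  30, 700], [-50,  900]]])

bone = threshold_mask(scan, 300, 2000)
print(int(bone.sum()))  # number of voxels selected by the mask
```

The real workflow still needs the manual pass afterward, since a pure threshold happily picks up anything else in the same intensity range.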
After I finished modeling, I exported the geometry from 3DS Max as VRML files. The VRMLs are read by the haptic code through .iv (Inventor) files. In the Inventor file I link to the VRML, and I also define the haptic properties of the geometry. Haptic properties are the properties that you feel, which makes this part difficult to convey without you playing with the device. I applied two effects to the stomach and three effects to the tibia.
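For the curious, a scene file along these lines might look something like the sketch below. The `Separator`, `Material`, and `File` nodes are standard Open Inventor; the `HapticMaterial` node and its field names are hypothetical stand-ins, since the real node names depend on the haptic library used in class:

```
#Inventor V2.1 ascii

Separator {
    # Visual surface properties (standard Inventor)
    Material {
        diffuseColor 0.8 0.6 0.5
    }

    # Hypothetical haptic node -- the actual node and field
    # names come from the haptic library, not core Inventor
    HapticMaterial {
        stiffness       0.7
        staticFriction  0.3
        dynamicFriction 0.2
        popThrough      0.5
    }

    # Pull in the geometry exported from 3DS Max as VRML
    File { name "tibia.wrl" }
}
```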
The following is some video of a classmate testing out my project:
To the stomach I applied viscosity as well as a fulcrum effect. Viscosity makes it feel 'rubbery' and hard to move through, while the fulcrum effect pulls the probe toward a specified point in the program's world (where it can rotate about that point freely). To the knee I applied vibration, friction, and line effects. I wanted to make the tibia feel like the user was drilling through bone (yes, drilling with a probe). There are other standard (Coin3D) effects each geometry has, which are shown in the GUI in the picture above: stiffness, static friction, dynamic friction, and pop-through. The pop-through is fun; it defines how hard you need to press before the probe enters the mesh and the haptic effects begin.
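To give a rough feel for what these effects compute, here's a small NumPy sketch of the two stomach effects as force laws: viscosity as drag opposing the probe's velocity, and the fulcrum as a spring pulling the probe back toward a pivot point. The gains `b` and `k` are made-up numbers; a real haptic loop evaluates something like this at around 1 kHz:

```python
import numpy as np

def viscosity_force(velocity, b=2.0):
    """Viscous drag: push back against the probe's velocity,
    making the material feel thick and 'rubbery'."""
    return -b * np.asarray(velocity, dtype=float)

def fulcrum_force(position, pivot, k=50.0):
    """Spring toward a fixed pivot: the probe can rotate freely
    about the point but gets pulled back if it drifts away."""
    return k * (np.asarray(pivot, dtype=float) -
                np.asarray(position, dtype=float))

# Probe moving +x at 1 unit/s, sitting slightly off the pivot
print(viscosity_force([1.0, 0.0, 0.0]))           # drag opposes motion
print(fulcrum_force([0.1, 0.0, 0.0], [0, 0, 0]))  # pull back to pivot
```

Both forces point opposite the probe's motion/offset, which is exactly why the stomach resists you instead of letting the probe slide through freely.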
An extra feature I worked hard on was changing the orientation of one object at a time. I understood that you need to find and change the transform node assigned to the individual object. But once this was done, if you moved the world and then moved the objects, things got a little funky: when you moved the probe, the object would move away from it, as opposed to with it. So Cristian Luciano, our professor, explained that two other matrix multiplications have to be applied in order to undo the world matrix, which is affecting the position/movement of the objects. After 3-4 explanations, I finally understood what was going on conceptually. But in the end I am glad I took the time.
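The idea, as I understand it, is that the two extra multiplications sandwich the world-space motion so it gets re-expressed in the object's local frame. Here's a NumPy sketch of that conjugation using column-vector convention (note that Open Inventor itself uses row vectors, so the multiplication order flips there; the rotation and translation values below are arbitrary examples):

```python
import numpy as np

def rot_z(deg):
    """4x4 homogeneous rotation about z (column-vector convention)."""
    t = np.radians(deg)
    m = np.eye(4)
    m[:2, :2] = [[np.cos(t), -np.sin(t)],
                 [np.sin(t),  np.cos(t)]]
    return m

def translate(x, y, z):
    """4x4 homogeneous translation matrix."""
    m = np.eye(4)
    m[:3, 3] = [x, y, z]
    return m

W = rot_z(90)            # world transform sitting above the object
D = translate(1, 0, 0)   # motion we want, in world (probe) space

# Applied naively under W, D would act in the object's local frame.
# Conjugating by W "undoes" the world matrix around the motion:
D_local = np.linalg.inv(W) @ D @ W

p = np.array([0.0, 0.0, 0.0, 1.0])       # an object-space point
moved = (W @ D_local @ p) - (W @ p)
print(np.round(moved, 6))                 # +1 in world x, as intended
```

Without the conjugation, the 90-degree world rotation would turn the intended +x push into a sideways move, which matches the "object moves away from the probe" behavior I was seeing.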
This project was exciting for me and helped tie together everything we are doing, i.e. understanding how the different libraries work. I enjoyed using models I created myself and applying material properties as well as haptic effects to them.
It amazes me how fast technology develops, and it would be really cool if you had the device at home and were playing with the program I just made. Just think: one day your tablet will come with a glove that you will wear to interact with the geometry on your screen. Actually, I might even go so far as to bet someone out there has already figured this out. . .
For the different libraries we used for this project, please visit my resources page!