
THINK.FEEL.HAPTICS

Haptic technology is evolving rapidly, moving out of research centers and into real products and solutions. We strongly believe haptics is entering mainstream technology. At GotouchVR we bring haptics into VR, and for the past few years we have been helping clients touch their imagination.

Haptic communication is the way in which people communicate with one another through touch. With VR devices, haptic rendering is the next-generation methodology, and here at GotouchVR we have made it simple and user-friendly. We believe this technology should reach everyone, even people who are not tech-savvy.

By using an external device called VRTouch, users can receive feedback from applications through physical touch.

The detection of collision or penetration is the first step in most haptic rendering algorithms. In general, solid objects are rendered by detecting the penetration point on the surface, computing the surface normal vector, and scaling it according to a given force law.
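
As a minimal sketch of that idea (the function name and the stiffness value are illustrative, not the actual VRTouch implementation):

```python
import numpy as np

def penalty_force(depth, normal, k=300.0):
    """Penalty-based force response: push the avatar back out along the
    surface normal, proportionally to how deep it has penetrated.
    k is an arbitrary stiffness in N/m, chosen for illustration only."""
    if depth <= 0.0:
        return np.zeros(3)          # no penetration, no force
    n = np.asarray(normal, dtype=float)
    n /= np.linalg.norm(n)          # make sure the normal is unit length
    return k * depth * n            # Hooke-like law: F = k * depth * n
```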

Our Haptic Rendering Architecture

Haptic rendering is the process by which stimuli are applied to a user to convey information about a virtual haptic object. This information represents the object's attributes: shape, elasticity, texture, mass, etc.

 

The collision detection block provides the contact information occurring between avatars and virtual objects. The force response block yields the interaction force, temperature, etc. between avatars and virtual objects. The control algorithm then adjusts the computed force to the capability of the haptic device. For instance, if the maximum force a haptic device can apply to a user is 1 N, the force calculated by the force response block must be restrained to within 1 N.
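
A sketch of such a limiter, assuming a plain magnitude cap (the 1 N constant mirrors the example above; real limits are device-specific):

```python
import numpy as np

MAX_FORCE = 1.0   # N, matching the 1 N example; a real device limit differs

def limit_to_device(force):
    """Restrain a commanded force vector to the device's capability,
    preserving its direction while capping its magnitude."""
    force = np.asarray(force, dtype=float)
    magnitude = np.linalg.norm(force)
    if magnitude > MAX_FORCE:
        return force * (MAX_FORCE / magnitude)
    return force
```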

Block diagram of the haptic rendering architecture

Collision Detection
--whether, where and when contacts happen

Collision detection is fundamental in VR. It detects whether, where and when a collision or contact occurs, and it determines how many objects are involved. Collision detection is especially important in haptic rendering, since it provides the information needed to compute interaction forces.

x(t) and F(t) are the continuous-time position and force signals exchanged between the user and the haptic device; x(k) and F(k) are their sampled, discrete-time counterparts.

Some simple algorithms:

1) Axis-aligned bounding box: It is computationally expensive to verify whether a point is inside an object (e.g. a polygon or polyhedron) by scanning all of the object's coordinates. Instead, we compute an axis-aligned bounding box (AABB) that encloses the object by its minimum and maximum coordinates on the X, Y and Z axes.
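
For illustration, building and testing an AABB takes only a few lines (helper names are ours, not part of any particular engine):

```python
def compute_aabb(vertices):
    """Enclose an object by its minimum and maximum X, Y, Z coordinates."""
    xs, ys, zs = zip(*vertices)
    return (min(xs), min(ys), min(zs)), (max(xs), max(ys), max(zs))

def point_in_aabb(p, box_min, box_max):
    """A point is inside the box iff it is inside on every axis."""
    return all(lo <= c <= hi for c, lo, hi in zip(p, box_min, box_max))
```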


2) Oriented bounding box: The axis-aligned bounding box is simple, but it is not the optimal solution. Because its edges stay parallel to the global coordinate axes, the box can enclose a lot of empty space, which makes collision detection inaccurate. In an oriented bounding box (OBB), the box edges adopt the local coordinate axes of the object, so the box fits the object tightly. The figure below illustrates the difference between an AABB and an OBB.

Difference between AABB & OBB
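
A point-in-OBB test then amounts to expressing the point in the box's local frame. A sketch, with the data layout as an assumption:

```python
import numpy as np

def point_in_obb(p, center, axes, half_extents):
    """Point-in-OBB test.

    axes         : 3x3 array whose rows are the box's local unit axes
    half_extents : half the box size along each local axis
    """
    offset = np.asarray(p, dtype=float) - np.asarray(center, dtype=float)
    # Project the offset onto each local axis; the point is inside only
    # if every projection fits within the matching half extent.
    return all(abs(np.dot(axis, offset)) <= h
               for axis, h in zip(axes, half_extents))
```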

3) Binary space partition: A binary space partition (BSP) is useful for collision detection among multiple objects, or within a single object made of many triangles. Space is divided into several regions, each associated with an object or a group of objects. This forms a binary tree that can significantly reduce search time, since objects can be skipped entirely when their associated node is not hit.

Binary space partition for collision detection
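
A minimal sketch of how such a tree prunes the search (the node layout and the sphere query are our illustrative choices, not a specific engine's API):

```python
class BSPNode:
    """A splitting plane (n . x = d, with n a unit normal) plus the
    objects stored in this node's region."""
    def __init__(self, normal, d, front=None, back=None, objects=()):
        self.normal, self.d = normal, d
        self.front, self.back = front, back
        self.objects = list(objects)

def query(node, point, radius):
    """Return candidate objects a sphere (point, radius) could touch.

    Subtrees entirely on the far side of a plane are never visited,
    which is where the reduction in search time comes from."""
    if node is None:
        return []
    hits = list(node.objects)
    dist = sum(n * c for n, c in zip(node.normal, point)) - node.d
    if dist > -radius:                 # sphere reaches the front half-space
        hits += query(node.front, point, radius)
    if dist < radius:                  # sphere reaches the back half-space
        hits += query(node.back, point, radius)
    return hits
```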
Rendering attributes

Texture and Stiffness Rendering

The attributes of an object include temperature, stiffness, texture, etc. Here we are concerned only with stiffness and texture.

Stiffness is the rigidity of an object, i.e. how strongly it resists deformation in response to an applied force. Stiffness rendering is widely used in medical simulation, for example the simulation of soft tissues.
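
In its simplest form this is a Hooke-like law; the sketch below uses made-up stiffness values only to show how a rigid and a compliant material would feel different:

```python
def stiffness_force(depth, k):
    """Hooke-like restoring force magnitude: F = k * depth (depth > 0)."""
    return k * max(depth, 0.0)

# Illustrative values only: a 2 mm indentation into "bone" vs "soft tissue".
print(stiffness_force(0.002, 2000.0))   # stiff material   -> 4.0 N
print(stiffness_force(0.002, 100.0))    # compliant tissue -> 0.2 N
```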


Texture rendering is composed of tactile and kinesthetic rendering. The simplest way to render texture with a point-based device in a 3D environment is to compute a repulsive force whose direction and magnitude vary with position.
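
As a toy illustration, with a sinusoidal height field standing in for a real texture model:

```python
import math

def textured_magnitude(x, y, base_force, wavelength=0.002, strength=0.3):
    """Modulate a repulsive force with a sinusoidal height field so the
    probe feels periodic bumps while sliding across the surface.
    wavelength and strength are illustrative parameters."""
    ripple = math.sin(2 * math.pi * x / wavelength) \
           * math.sin(2 * math.pi * y / wavelength)     # in [-1, 1]
    return base_force * (1.0 + strength * ripple)       # +/- 30% variation
```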


One example of texture rendering: record real-world texture information.


Recorded data: position, orientation, force, acceleration, etc.

Frequency Mismatch

Frequency mismatch makes haptic rendering unstable.

•Refresh rate of the graphics engine: 30 to 60 Hz

•Update rate of the haptic device: 1 kHz

 

Solutions:

• Multiple threads/processes

• Interpolation (a sketch follows this list)

• Virtual coupling etc.
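
Here is that interpolation sketch, assuming the graphics side publishes (position, timestamp) pairs and the 1 kHz haptic loop asks for the pose at its own clock (extrapolation is a common variant):

```python
def interpolate_pose(x_prev, x_curr, t_prev, t_curr, t_now):
    """Fill in 1 kHz haptic samples between two ~60 Hz graphics updates.

    Without this, the force jumps every time the slow side publishes a
    new state; linear interpolation smooths those steps out."""
    if t_curr <= t_prev:
        return x_curr
    alpha = (t_now - t_prev) / (t_curr - t_prev)
    return x_prev + alpha * (x_curr - x_prev)
```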

Virtual coupling

•An intermediate layer between the client and the server

•A spring-damper system to exchange forces

•One spring-damper on each side

•Forces computed on both sides (see the sketch below)
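
A minimal sketch of that spring-damper exchange (the gains k and b are illustrative placeholders, not our production tuning):

```python
def coupling_force(x_device, v_device, x_proxy, v_proxy, k=500.0, b=5.0):
    """Spring-damper virtual coupling between the haptic device and its
    virtual proxy. This force is sent to the user; its opposite is
    applied to the virtual object. k in N/m, b in N*s/m."""
    spring = k * (x_proxy - x_device)    # pull the device toward the proxy
    damper = b * (v_proxy - v_device)    # dissipate relative motion
    return spring + damper
```

Because each side only needs the other's latest state to evaluate its own end of the spring, the 1 kHz haptic loop never has to wait for the slower simulation.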


In our products, frequency mismatch is no longer an issue.

Cutting-edge technology for VRTouch

  • Interactions with your fingertips

  • 6 haptic devices working simultaneously 

  • Rendering either stiffness or texture

  • Supports Unity

We Deliver

•Not limited to fingertips: we can support the complete human body

•Rendering of different stiffnesses, textures, etc. simultaneously

•Support for other haptic hardware

•Support for different game engines (e.g. Unity, Unreal)

•Solutions for haptic products

