The Sense of Torque
With
A Single Phantom Haptics Device


Dennis Hancock
Dennis@DennisHancock.com



The Phantom is an example of a haptics I/O device that provides 6 degrees of freedom (DOF) of input control (with the optional gimbal attachment) but is limited by design to only 3 degrees of freedom of (linear) force output. In real life, when we move three-dimensional objects around an environment with our hands, we feel stopping forces and twisting torques during object collision. Humans use this 6 DOF feedback to intuitively accomplish path planning (e.g., removing the milk pitcher from the back of the refrigerator shelf). It might be supposed that the lack of output torque from the Phantom would limit its use to only the linear placement of generalized 3D objects in a synthetic scene. However, this is not entirely true. Recent experiences with a single Phantom reveal that the user experiences a sense of torque which aids in the placement of 3D objects. To be sure, there is no real torque in the classical-mechanics sense. The implication of this observation is that many path planning and CAD synthetic assembly tasks can be accomplished with a single Phantom, saving the cost of a second Phantom and computer system.

We have written software that computes collision detection/force-torque generation for a 3D polygonalized object moving in a similarly characterized static world. Objects in this world can be convex or concave. Currently, no surface friction models or material compliance models have been implemented. Because the Phantom device lacks output torque capability, we cannot negotiate object placement/orientation in our synthetic world in a way that fully mimics the experience in the real physical world. Two Phantoms used together can display torques along the two directions perpendicular to the moment arm described by the line connecting the endpoints of the two arms. But this architecture effectively doubles the cost of the hardware, both for the haptics display and the computers. Our experience with a single Phantom suggests there are some classes of placement tasks that can be effectively performed even though no output torque can be applied.
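
As a concrete illustration of the kind of per-contact computation such a haptics loop performs, the C sketch below converts a detected collision into a restoring force using a simple penalty model (a force proportional to penetration depth along the contact normal). This is a minimal sketch under that assumption only; the stiffness value, the vec3 type, and the function name are hypothetical and are not taken from our implementation.

    /* Illustrative sketch only: a penalty-style contact force.  The
     * stiffness constant and the names here are hypothetical and do not
     * describe the formulation actually used in our implementation. */
    typedef struct { double x, y, z; } vec3;

    #define STIFFNESS 500.0   /* assumed spring constant, N per unit depth */

    /* Force pushing the moving object back out of the static one, given
     * the unit surface normal at the contact and the penetration depth. */
    vec3 contact_force(vec3 normal, double depth)
    {
        vec3 f;
        f.x = STIFFNESS * depth * normal.x;
        f.y = STIFFNESS * depth * normal.y;
        f.z = STIFFNESS * depth * normal.z;
        return f;
    }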

Consider the simple scene shown in Figure 1a. The stick object encounters the solid block at a point contact, and a torque t is generated by the vector force F, provided by the user's hand, operating on the moment arm R formed from the point of contact to the location of force application. In the Figure, we see that the moving object initially pivots around the point of contact due to this torque, and a user of a haptics device would expect to feel a twist in their hand. The evolution of the forces and torques in this example depends on many factors, and several scenarios could develop. But the important point is that someone experiencing these forces can involuntarily negotiate a realistic path around the obstructing object.
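
For reference, the torque in Figure 1a is simply the cross product of the moment arm with the applied force, t = R x F. The short C fragment below computes it; when the force is applied at the point of contact itself, as in Figures 1b and 1c, R is zero and the computed torque vanishes. The vec3 type repeats the illustrative one from the previous sketch.

    /* Same illustrative 3-vector type as in the previous sketch. */
    typedef struct { double x, y, z; } vec3;

    /* Torque about the point of contact: t = R x F, where R runs from
     * the contact point to the point where the hand's force is applied.
     * If R is zero (Figures 1b and 1c), the torque is identically zero. */
    vec3 torque(vec3 R, vec3 F)
    {
        vec3 t;
        t.x = R.y * F.z - R.z * F.y;
        t.y = R.z * F.x - R.x * F.z;
        t.z = R.x * F.y - R.y * F.x;
        return t;
    }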

Figure 1b shows the situation in which no torque is generated because there is no moment arm to act on. The user would feel only a force of opposition to motion and no torque. This comment pertains not only to a real-world physical situation but also to a synthetic situation where one could have a fully 6 DOF haptics display. But we all know that an applied torque can make this object rotate about this point of contact. In the synthetic-world experience where a person is using a haptics device to move objects, he can voluntarily provide this torque with his hand and rotate the object. The force/torque diagram for this case is shown in Figure 1c and should be compared with the diagram of Figure 1a. The only difference between these figures is that the force vector in 1c is at the point of contact. This is the force/torque diagram that arises with a haptics device that has only a linear force output and no torque output. No matter where the stick collides with the object, there will be no moment arm.

It has been our observation that there is a sense of torque a person experiences in these situations that aids in placing parts in a 3D world. To be sure, there is no real torque--there can't be. But there is the kinesthetic sense that one can roll an object around another in a believable way. We make the following observations about this phenomenon:

  1. When the user of a haptics device with only 3 DOF of linear force output voluntarily supplies the torque of collision with his hand, there is a sense of torque that aids in the negotiated placement of objects in a 3D world.
  2. This condition is greatly aided when the colliding objects can be visualized and the person's experience with the real world can be applied.
  3. This sense of torque can only be experienced for the orientations consistent with the force vector applied by the user's hand. Thus, no apparent twisting forces can be felt, and current Phantoms cannot be expected to mimic screwdrivers and wrenches. (However, devices equipped with 3 DOF torque output instead of 3 DOF linear force output could mimic the displacement of rotated objects with the voluntary application of linear forces. These two display options are complementary.)

Our current hardware system consists of two Hewlett-Packard 735 UNIX workstations running asynchronously with respect to each other. The machines run in real-time priority mode and exchange information with each other using Berkeley sockets over an Ethernet connection. One machine is dedicated solely to stereoscopic rendering of our graphics scene. This 125 MHz computer runs code utilizing the Starbase graphics library and renders a 3300-polygon, flat-shaded scene of the Ford Mustang at 20 Hz. Stereoscopic viewing of the CRT is accomplished with non-head-tracked CrystalEyes eyewear. The second workstation is a 100 MHz, 50 Mflop machine that runs our OS-nonspecific collision detection/force-torque generation code. Both codes are written in C to avoid any execution-time penalties currently associated with C++. This machine is currently executing at the 1 hapton* level due to an I/O bottleneck to the Phantom. We estimate that this hardware-software system is potentially capable of 10 haptons.
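
For concreteness, the fragment below sketches one way the haptics workstation might push the moving object's pose to the rendering workstation over a Berkeley TCP socket, with the connection opened once at startup and reused every cycle. The port number, the six-double message layout, and the function names are illustrative assumptions and do not describe our actual protocol.

    /* Illustrative only: how the haptics host might push the moving
     * object's pose to the rendering host over a Berkeley TCP socket.
     * The port, message layout, and names are assumptions. */
    #include <string.h>
    #include <unistd.h>
    #include <netdb.h>
    #include <netinet/in.h>
    #include <sys/socket.h>

    #define RENDER_PORT 5000                /* hypothetical port number */

    /* Connect once at startup; the descriptor is then reused every
     * servo cycle so no connection cost is paid in the haptics loop. */
    int connect_to_renderer(const char *host)
    {
        struct hostent *h = gethostbyname(host);
        struct sockaddr_in addr;
        int s;

        if (h == NULL)
            return -1;
        s = socket(AF_INET, SOCK_STREAM, 0);
        if (s < 0)
            return -1;
        memset(&addr, 0, sizeof(addr));
        addr.sin_family = AF_INET;
        addr.sin_port = htons(RENDER_PORT);
        memcpy(&addr.sin_addr, h->h_addr_list[0], h->h_length);
        if (connect(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
            close(s);
            return -1;
        }
        return s;
    }

    /* Send the object's pose: x, y, z plus three rotation angles. */
    int send_pose(int s, const double pose[6])
    {
        ssize_t n = write(s, pose, 6 * sizeof(double));
        return (n == (ssize_t)(6 * sizeof(double))) ? 0 : -1;
    }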

Acknowledgements. I gratefully acknowledge the financial and materiel support of Fred Kitson of the Hewlett-Packard Laboratories for this project. And hats off to Mike Goss for his assistance and PC wizardry. I would also like to thank the Hewlett-Packard Company for past support on other unreported haptics projects. Finally, I would like to thank NASA Goddard for initially funding my collision detection algorithm work in support of the Hi Gain Antenna Study for the Hubble Telescope.

* We define the hapton as the number of triangle-pair collisions that can be computed 1000 times per second. The spatial accuracy of the collision computation must be a precision of 0.1% or better, and the triangles must be intersecting on edge. The hapton is a measure of haptics computer system performance, in analogy to the polygon/sec as a unit of graphics system rendering performance.
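For example, under this definition a system sustaining 10 haptons (the estimated potential of the hardware described above) checks 10 edge-intersecting triangle pairs every millisecond, i.e., 10,000 triangle-pair collision computations per second at 0.1% spatial precision.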