{"id":353,"date":"2013-04-17T15:15:23","date_gmt":"2013-04-17T22:15:23","guid":{"rendered":"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/cgi-bin\/salisbury_lab\/?page_id=353"},"modified":"2014-07-01T08:55:04","modified_gmt":"2014-07-01T15:55:04","slug":"six-dof-haptic-rendering-of-volumetric-data-pictures","status":"publish","type":"page","link":"https:\/\/sr.stanford.edu\/?page_id=353","title":{"rendered":"Six-DoF Haptic Rendering of Volumetric Data pictures"},"content":{"rendered":"<h1>Project Description<\/h1>\n<p>A method for six degree-of-freedom haptic rendering of isosurface geometry embedded within sampled volume data is presented. The algorithm uses a quasi-static formulation of motion constrained by multiple contacts to simulate rigid-body interaction between a haptically controlled virtual tool, represented as a point-sampled surface, and volumetric isosurfaces. Unmodified volume data, such as CT or MR images, can be rendered directly with this approach, making it particularly suitable for applications in medical or surgical simulation.<\/p>\n<h2>The Algorithm<\/h2>\n<p>The distinguishing characteristics of the presented method are:<\/p>\n<ul>\n<li>The algorithm executes at haptic update rates of 1000 Hz.<\/li>\n<li>A constraint-based approach allows for distributed contact using a massless proxy, enabling the rendering of very stiff contacts.<\/li>\n<li>Isosurfaces within volumetric data of any type (eg. CT scans) can be rendered directly at sub-voxel resolution without any preprocessing.<\/li>\n<\/ul>\n<p>Key functional components are depicted in the diagram below.<\/p>\n<p><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/c8dcca36422d8c52a8395cb080c4b05b.jpg\" border=\"0\" \/><\/p>\n<h2>Data Representation<\/h2>\n<p>The geometry of the virtual environment exists as an isosurface within a sampled volume. 
A central differencing scheme is used to estimate the normals on the surface for computing contact constraints and for shading in the visual rendering. The virtual tool is represented as a point shell derived from its polygonal model.<\/p>\n<table cellspacing=\"0\">\n<tbody>\n<tr>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/3fba76e29bd9ff06b3bd4d39fdfe47aa.jpg\" border=\"0\" \/><\/td>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/a2e55237265c74973595ca2028d21fd4.jpg\" border=\"0\" \/><\/td>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/466e4e62223e263ed31f68cc499ecce8.jpg\" border=\"0\" \/><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h2>Collision Detection<\/h2>\n<p>Every vertex of the point shell surface is queried against the volume intensity field during collision detection. A point\u2019s path is subdivided into feature-sized segments to detect and enforce non-penetration. Interval bisection is used to refine the contact position.<\/p>\n<p><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/a5e37f882422704bdd983059e1657d66.jpg\" border=\"0\" \/><\/p>\n<h2>Configuration Solver<\/h2>\n<p>Each contact imposes a constraint on the proxy\u2019s motion:<\/p>\n<p><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/cc351a1e21970c5a800047e59a618a8a.jpg\" border=\"0\" \/><\/p>\n<p>Minimization of the \u201cacceleration energy\u201d (Gauss\u2019 Principle) subject to the contact constraints yields the correct motion path for the proxy.<\/p>\n<p><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/646c62235c781b6f4323e3391c5464ea.jpg\" border=\"0\" \/><\/p>\n<h2>Results<\/h2>\n<p>The method was tested using a variety of haptic devices, including the Force Dimension sigma.7 and our custom-built 6-DOF \u00b5Haptic device depicted below. 
Performance characteristics of the algorithm were measured on four different data sets.<\/p>\n<table cellspacing=\"0\">\n<tbody>\n<tr>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/88bb3999f3538d7ff56c12c6f4c05124.jpg\" border=\"0\" \/><\/td>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/3110dbdb228527a521292f2a59b3f1c9.jpg\" border=\"0\" \/><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<p>The simulation remained stable throughout interactions that included hooking, wedging, and prying, even with a coupling stiffness set as high as 5000 N\/m. The tool could be moved quickly in free space or in contact without perceptible artificial mass, inertia, or viscosity.<\/p>\n<table cellspacing=\"0\">\n<tbody>\n<tr>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/1600f646d6c464cae0c5d96970e04180.jpg\" border=\"0\" \/><\/td>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/c50063a433bb810c338ef5cccf93e64d.jpg\" border=\"0\" \/><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<table cellspacing=\"0\">\n<tbody>\n<tr>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/8832883044b7752ce009297cc9e41453.jpg\" border=\"0\" \/><\/td>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/e19df2023b4b3c2c193ae24c2d6be8fb.jpg\" border=\"0\" \/><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<table cellspacing=\"0\">\n<tbody>\n<tr>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/c373f89c1613f58d0dba4b71a3f3a218.jpg\" border=\"0\" \/><\/td>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/0f56feb0bc01db307ebf07abaf4d9582.jpg\" border=\"0\" \/><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<table cellspacing=\"0\">\n<tbody>\n<tr>\n<td><img alt=\"\" 
src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/af84801030f5a7cff7d187426c029400.jpg\" border=\"0\" \/><\/td>\n<td><img alt=\"\" src=\"http:\/\/www.stanford.edu\/group\/sailsbury_robotx\/images\/5c3e74a3e0c48660dfc7db24763271cd.jpg\" border=\"0\" \/><\/td>\n<\/tr>\n<\/tbody>\n<\/table>\n<h2>Conclusion<\/h2>\n<p>Physical interaction with the world using a rigid tool is inherently a six degree-of-freedom task. Our haptic rendering algorithm provides a means for exploring isosurfaces embedded within volumetric data using an arbitrarily shaped virtual instrument. The algorithm prevents object interpenetration and allows quick movements of the tool without conveying artificial mass or inertia, thus enhancing the perceived realism of the virtual object interaction.<\/p>\n<h2>Related Publications<\/h2>\n<p>Chan, S., Conti, F., Blevins, N. H., &amp; Salisbury, K. <strong>Constraint-based six degree-of-freedom haptic rendering of volume-embedded isosurfaces.<\/strong> <em>Proc. IEEE World Haptics Conference<\/em> (2011).<\/p>\n<h2>Project Staff<\/h2>\n<ul>\n<li>Sonny Chan<\/li>\n<li>Nikolas Blevins<\/li>\n<li>J. Kenneth Salisbury<\/li>\n<\/ul>\n<h2>Status<\/h2>\n<p>Active since 2002.<\/p>\n<h2>Funding Sources<\/h2>\n<p>This project was funded in part by NIH Grant 5R01LM010673-02<br \/>\nand in part by the Veterans Administration.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Project Description A method for six degree-of-freedom haptic rendering of isosurface geometry embedded within sampled volume data is presented. The algorithm uses a quasi-static formulation of motion constrained by multiple contacts to simulate rigid-body interaction between a haptically controlled virtual tool, represented as a point-sampled surface, and volumetric isosurfaces. 
Unmodified volume data, such as CT &hellip;<\/p>\n<p class=\"read-more\"> <a class=\"\" href=\"https:\/\/sr.stanford.edu\/?page_id=353\"> <span class=\"screen-reader-text\">Six-DoF Haptic Rendering of Volumetric Data pictures<\/span> Read More &raquo;<\/a><\/p>\n","protected":false},"author":3,"featured_media":0,"parent":205,"menu_order":0,"comment_status":"open","ping_status":"open","template":"","meta":[],"_links":{"self":[{"href":"https:\/\/sr.stanford.edu\/index.php?rest_route=\/wp\/v2\/pages\/353"}],"collection":[{"href":"https:\/\/sr.stanford.edu\/index.php?rest_route=\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/sr.stanford.edu\/index.php?rest_route=\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/sr.stanford.edu\/index.php?rest_route=\/wp\/v2\/users\/3"}],"replies":[{"embeddable":true,"href":"https:\/\/sr.stanford.edu\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=353"}],"version-history":[{"count":7,"href":"https:\/\/sr.stanford.edu\/index.php?rest_route=\/wp\/v2\/pages\/353\/revisions"}],"predecessor-version":[{"id":2327,"href":"https:\/\/sr.stanford.edu\/index.php?rest_route=\/wp\/v2\/pages\/353\/revisions\/2327"}],"up":[{"embeddable":true,"href":"https:\/\/sr.stanford.edu\/index.php?rest_route=\/wp\/v2\/pages\/205"}],"wp:attachment":[{"href":"https:\/\/sr.stanford.edu\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=353"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}