{"id":17341,"date":"2025-11-14T05:16:35","date_gmt":"2025-11-13T20:16:35","guid":{"rendered":"https:\/\/aireviewirush.com\/?p=17341"},"modified":"2025-11-14T05:16:35","modified_gmt":"2025-11-13T20:16:35","slug":"corl2025-robustdexgrasp-dexterous-robotic-hand-greedy-of-practically-any-object","status":"publish","type":"post","link":"https:\/\/aireviewirush.com\/?p=17341","title":{"rendered":"CoRL2025 \u2013 RobustDexGrasp: dexterous robotic hand grasping of practically any object"},"content":{"rendered":"<p> <br \/>\n<\/p>\n<div style=\" \">\n<p>     <img fetchpriority=\"high\" decoding=\"async\" src=\"https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/teaser_new-Medium.jpeg\" alt=\"\" width=\"640\" height=\"293\" class=\"alignnone size-full wp-image-218206\" srcset=\"https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/teaser_new-Medium.jpeg 640w, https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/teaser_new-Medium-425x195.jpeg 425w\" sizes=\"(max-width: 640px) 100vw, 640px\"><\/p>\n<h2><span class=\"ez-toc-section\" id=\"The_dexterity_hole_from_human_hand_to_robotic_hand\"><\/span>The dexterity gap: from human hand to robotic hand<span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p>Look at your own hand. As you read this, it\u2019s holding your phone or clicking your mouse with seemingly effortless grace. With over 20 degrees of freedom, human hands possess extraordinary dexterity: they can grip a heavy hammer, rotate a screwdriver, or instantly adjust when something slips.<\/p>\n<p>With a structure similar to human hands, dexterous robotic hands offer great potential:<\/p>\n<p><strong>Universal adaptability:<\/strong> Handling diverse objects from delicate needles to basketballs, adapting to each unique challenge in real time.<\/p>\n<p><strong>Fine manipulation:<\/strong> Executing complex tasks like key rotation, scissor use, and surgical procedures that are impossible with simple grippers.<\/p>\n<p><strong>Skill transfer:<\/strong> Their similarity to human hands makes them ideal for learning from vast human demonstration data.<\/p>\n<p>Despite this potential, most current robots still rely on simple \u201cgrippers\u201d because of the difficulties of dexterous manipulation. These pliers-like grippers are capable only of repetitive tasks in structured environments. This \u201cdexterity gap\u201d severely limits robots\u2019 role in our daily lives.<\/p>\n<p>Among all manipulation skills,\u00a0grasping stands as the most fundamental. 
It&#8217;s the gateway through which many other capabilities emerge. Without reliable grasping, robots can&#8217;t pick up tools, manipulate objects, or perform complex tasks. Therefore, in this work we focus on equipping dexterous robots with the capability to robustly grasp diverse objects.<\/p>\n<h2><span class=\"ez-toc-section\" id=\"The_problem_why_dexterous_greedy_stays_elusive\"><\/span>The challenge: why dexterous grasping remains elusive<span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p>While humans can grasp almost any object with minimal conscious effort, the path to dexterous robotic grasping is fraught with fundamental challenges that have stymied researchers for decades:<\/p>\n<p><strong>High-dimensional control complexity.<\/strong>\u00a0With 20+ degrees of freedom, dexterous hands present an astronomically large control space. Each finger\u2019s motion affects the entire grasp, making it extremely difficult to determine optimal finger trajectories and force distributions in real time. Which finger should move? How much force should be applied? How to adjust in real time? These seemingly simple questions reveal the extraordinary complexity of dexterous grasping.<\/p>\n<p><strong>Generalization across diverse object shapes.<\/strong>\u00a0Different objects demand fundamentally different grasp strategies. For example, spherical objects require enveloping grasps, while elongated objects need precision grips. The system must generalize across this vast diversity of shapes, sizes, and materials without explicit programming for each category.<\/p>\n<p><strong>Shape uncertainty under monocular vision.<\/strong>\u00a0For practical deployment in daily life, robots must rely on single-camera systems\u2014the most accessible and cost-effective sensing solution. 
Moreover, we can&#8217;t assume prior knowledge of object meshes, CAD models, or detailed 3D information. This creates fundamental uncertainty: depth ambiguity, partial occlusions, and perspective distortions make it challenging to accurately perceive object geometry and plan appropriate grasps.<\/p>\n<h2><span class=\"ez-toc-section\" id=\"Our_strategy_RobustDexGrasp\"><\/span>Our approach: RobustDexGrasp<span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p>To address these fundamental challenges, we present\u00a0<strong>RobustDexGrasp<\/strong>, a novel framework that tackles each challenge with targeted solutions:<\/p>\n<p><strong>Teacher-student curriculum for high-dimensional control.<\/strong>\u00a0We trained our system through a two-stage reinforcement learning process: first, a \u201cteacher\u201d policy learns ideal grasping strategies with privileged information (complete object shape and tactile sensing) through extensive exploration in simulation. Then, a \u201cstudent\u201d policy learns from the teacher using only real-world perception (single-view point cloud, noisy joint positions) and adapts to real-world disturbances.<\/p>\n<p><strong>Hand-centric \u201cintuition\u201d for shape generalization.<\/strong>\u00a0Instead of capturing full 3D shape features, our method builds a simple \u201cmental map\u201d that answers only one question: \u201cWhere are the surfaces relative to my fingers right now?\u201d This intuitive approach ignores irrelevant details (like color or decorative patterns) and focuses only on what matters for the grasp. It\u2019s the difference between memorizing every detail of a chair versus simply knowing where to place your hands to lift it\u2014one is efficient and adaptable, the other unnecessarily complicated. 
<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/shape-1024x884.jpeg\" alt=\"\" width=\"1024\" height=\"884\" class=\"alignnone size-large wp-image-218213\" srcset=\"https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/shape-1024x884.jpeg 1024w, https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/shape-425x367.jpeg 425w, https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/shape-768x663.jpeg 768w, https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/shape-1536x1326.jpeg 1536w, https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/shape.jpeg 1948w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\"><\/p>\n<p><strong>Multi-modal perception for uncertainty reduction.<\/strong>\u00a0Instead of relying on vision alone, we combine the camera\u2019s view with the hand\u2019s \u201cbody awareness\u201d (proprioception\u2014knowing where its joints are) and reconstructed \u201ctouch sensation\u201d to cross-check and verify what it is seeing. It\u2019s like how you might squint at something unclear, then reach out to touch it to be sure. 
This multi-sense approach allows the robot to handle difficult objects that would confuse vision-only systems\u2014grasping a transparent glass becomes possible because the hand \u201cknows\u201d it\u2019s there, even when the camera struggles to see it clearly.<\/p>\n<h2><span class=\"ez-toc-section\" id=\"The_outcomes_from_laboratory_to_actuality\"><\/span>The results: from laboratory to reality<span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/Screenshot-2025-11-06-at-13.17.08-1024x578.jpeg\" alt=\"\" width=\"1024\" height=\"578\" class=\"size-large wp-image-218212\" srcset=\"https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/Screenshot-2025-11-06-at-13.17.08-1024x578.jpeg 1024w, https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/Screenshot-2025-11-06-at-13.17.08-425x240.jpeg 425w, https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/Screenshot-2025-11-06-at-13.17.08-768x433.jpeg 768w, https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/Screenshot-2025-11-06-at-13.17.08.jpeg 1102w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\"><\/p>\n<p>Trained on just 35 simulated objects, our system demonstrates remarkable real-world capabilities:<\/p>\n<p><strong>Generalization:<\/strong> It achieved a 94.6% success rate across a diverse test set of 512 real-world objects, including challenging items like thin boxes, heavy tools, transparent bottles, and soft toys.<\/p>\n<p><strong>Robustness:<\/strong> The robot could maintain a secure grip even when a significant external force (equivalent to a 250g weight) was applied to the grasped object, showing far greater resilience than previous state-of-the-art methods.<\/p>\n<p><strong>Adaptation:<\/strong> When objects were accidentally bumped or slipped from its grasp, the policy dynamically adjusted finger positions and forces in real time to recover, showcasing 
a level of closed-loop control previously difficult to achieve.<\/p>\n<h2><span class=\"ez-toc-section\" id=\"Past_selecting_issues_up_enabling_a_brand_new_period_of_robotic_manipulation\"><\/span>Beyond picking things up: enabling a new era of robotic manipulation<span class=\"ez-toc-section-end\"><\/span><\/h2>\n<p>RobustDexGrasp represents a crucial step toward closing the dexterity gap between humans and robots. By enabling robots to grasp nearly any object with human-like reliability, we\u2019re unlocking new possibilities for robotic applications beyond grasping itself. We demonstrated how it can be seamlessly integrated with other AI modules to perform complex, long-horizon manipulation tasks:<\/p>\n<p><strong>Grasping in clutter:<\/strong> Using an object segmentation model to identify the target object, our method enables the hand to pick a specific item from a crowded pile despite interference from other objects.<\/p>\n<p><strong>Task-oriented grasping:<\/strong> With a vision-language model as the high-level planner and our method providing the low-level grasping skill, the robot hand can execute grasps for specific tasks, such as cleaning up a desk or playing chess with a human.<\/p>\n<p><strong>Dynamic interaction:<\/strong> Using an object tracking module, our method can successfully control the robot hand to grasp objects moving on a conveyor belt.<\/p>\n<p>Looking ahead, we aim to overcome current limitations, such as handling very small objects (which requires a smaller, more anthropomorphic hand) and performing non-prehensile interactions like pushing. 
The journey to true robotic dexterity is ongoing, and we&#8217;re excited to be part of it.<\/p>\n<h4><span class=\"ez-toc-section\" id=\"Learn_the_work_in_full\"><\/span>Read the work in full<span class=\"ez-toc-section-end\"><\/span><\/h4>\n<hr class=\"xh2\"\/>\n<div class=\"pdxv  printshow\">\n<p><img decoding=\"async\" src=\"https:\/\/robohub.org\/wp-content\/uploads\/2025\/11\/figure-150x150.jpg\" class=\"grayscale\" style=\"float:left; margin: 0px 1.5em 0em 0px;  width:100px  \" alt=\"\"><\/p>\n<p>Hui Zhang is a PhD candidate at ETH Zurich.<\/p>\n<\/div>\n<\/div>\n\n","protected":false},"excerpt":{"rendered":"<p>The dexterity gap: from human hand to robotic hand Look at your own hand. As you read this, it\u2019s holding your phone or clicking your mouse with seemingly effortless grace. With over 20 degrees of freedom, human hands possess extraordinary dexterity that can grip a heavy hammer, rotate a screwdriver, or instantly adjust when something 
[&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":17343,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[21],"tags":[],"class_list":{"0":"post-17341","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-robotics"},"_links":{"self":[{"href":"https:\/\/aireviewirush.com\/index.php?rest_route=\/wp\/v2\/posts\/17341","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/aireviewirush.com\/index.php?rest_route=\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/aireviewirush.com\/index.php?rest_route=\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/aireviewirush.com\/index.php?rest_route=\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/aireviewirush.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcomments&post=17341"}],"version-history":[{"count":1,"href":"https:\/\/aireviewirush.com\/index.php?rest_route=\/wp\/v2\/posts\/17341\/revisions"}],"predecessor-version":[{"id":17342,"href":"https:\/\/aireviewirush.com\/index.php?rest_route=\/wp\/v2\/posts\/17341\/revisions\/17342"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/aireviewirush.com\/index.php?rest_route=\/wp\/v2\/media\/17343"}],"wp:attachment":[{"href":"https:\/\/aireviewirush.com\/index.php?rest_route=%2Fwp%2Fv2%2Fmedia&parent=17341"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/aireviewirush.com\/index.php?rest_route=%2Fwp%2Fv2%2Fcategories&post=17341"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/aireviewirush.com\/index.php?rest_route=%2Fwp%2Fv2%2Ftags&post=17341"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}