Nikolaus Correll /cs en A delicate touch: teaching robots to handle the unknown /cs/2024/04/02/delicate-touch-teaching-robots-handle-unknown <span>A delicate touch: teaching robots to handle the unknown</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2024-04-02T13:39:31-06:00" title="Tuesday, April 2, 2024 - 13:39">Tue, 04/02/2024 - 13:39</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/web-ex-presizes_23.png?h=53fe9e2d&amp;itok=Xf7QKT9T" width="1200" height="600" alt="to the left is a robotic gripper with strawberry, right is William Xie"> </div> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/511" hreflang="en">Graduate Student Stories</a> <a href="/cs/taxonomy/term/473" hreflang="en">Nikolaus Correll</a> <a href="/cs/taxonomy/term/439" hreflang="en">Research</a> </div> <a href="/cs/grace-wilson">Grace Wilson</a> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> <div><p>William Xie, a first-year PhD student in computer science, is teaching a robot to reason about how gently it should grasp previously unknown&nbsp;objects by using large language models (LLMs).&nbsp;</p> <p><a href="https://deligrasp.github.io/" rel="nofollow">DeliGrasp</a>, Xie's project, is an intriguing step beyond the custom, piecemeal solutions currently used to avoid pinching or crushing
novel objects.&nbsp;</p> <p>In addition, DeliGrasp helps the robot translate what it can 'touch' into meaningful information for people.&nbsp;</p> <p>"William has gotten some neat results by leveraging common sense information from large language models. For example, the robot can estimate and explain the ripeness of various fruits after touching them," said his advisor, <a href="/lab/correll" rel="nofollow">Professor Nikolaus Correll</a>.&nbsp;</p> <p>Let's learn more about DeliGrasp, Xie's journey to robotics, and his plans for the conference in Japan and beyond.&nbsp;</p> <p>[video:https://www.youtube.com/watch?v=OMzTgY1gxLw]</p> <h2>How would you describe this research?&nbsp;</h2> <p>As humans, we’re able to quickly intuit how exactly we need to pick up a variety of objects, including delicate produce or unwieldy, heavy objects. We’re informed by the visual appearance of an object, what prior knowledge we may have about it, and most importantly, how it feels to the touch when we initially grasp it.&nbsp;</p> <p>Robots don’t have this all-encompassing intuition though, and they don’t have end-effectors (grippers/hands) as effective as human hands. So solutions are piecemeal: the community has researched “hands” across the spectrum of mechanical construction, sensing capabilities (tactile, force, vibration, velocity), and material (soft, rigid, hybrid, woven, etc.), and the corresponding machine learning models and/or control methods that enable “appropriately forceful” gripping are bespoke for each of these architectures.</p> <p>Embedded in LLMs, which are trained on an internet’s worth of data, is common-sense physical reasoning that crudely approximates a human’s (as the saying goes: “all models are wrong, some are useful”). We use the LLM-estimated mass and friction to simplify the grasp controller and deploy it on a two-finger gripper, a prevalent and relatively simple architecture.
Key to the controller working is the force feedback sensed by the gripper as it grasps an object, and knowing at what force threshold to stop—the LLM-estimated values directly determine this threshold for any arbitrary object, and our initial results are quite promising.</p> <h2>How did you get inspired to pursue this research?</h2> <p>I wouldn’t say that I was inspired to pursue this specific project. I think, like a lot of robotics research, I had been working away at a big problem for a while, and stumbled into a solution for a much smaller problem. My goal since I arrived here has been to research techniques for assistive robots and devices that restore agency for the elderly and/or mobility-impaired in their everyday lives. I’m particularly interested in shopping (but eventually generalist) robots—one problem we found is that it is really hard to determine, let alone pick, ripe fruits and produce with a typical robot gripper and just a camera. In early February, I took a day to try out picking up variably sized objects by hand-tuning the force sensing of our MAGPIE gripper (an affordable, open-source gripper developed by the Correll Lab). It worked well; I let ChatGPT calibrate the gripper, which worked even better, and it evolved very quickly into DeliGrasp.</p> <h2>What would you say is one of your most interesting findings so far?</h2> <p>LLMs do a reasonable job of estimating an arbitrary object’s mass (friction, not as well) from just a text description. This isn’t in the paper, but when paired with a picture, they can extend this reasoning for oddballs—gigantic paper airplanes, or miniature (plastic) fruits and vegetables.</p> <p>With our grasping method, we can sense the contact forces on the gripper as it closes around an object—this is a really good measure of ripeness, it turns out.
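</p> <p>The stopping rule described above (close the gripper until the sensed force crosses an LLM-informed threshold) can be sketched roughly as follows. This is a minimal illustration, not DeliGrasp's actual code: the gripper interface (<code>read_force</code>, <code>step_close</code>, <code>width</code>) and the safety factor are assumptions, and the threshold uses the standard two-finger friction-grasp bound.</p>

```python
# Hedged sketch of a force-threshold grasp controller.
# `mass_kg` and `friction_coeff` stand in for the LLM-estimated values;
# the gripper API used here is hypothetical, not the MAGPIE/DeliGrasp API.

G = 9.81  # gravitational acceleration, m/s^2


def force_threshold(mass_kg: float, friction_coeff: float, safety: float = 1.2) -> float:
    """Minimum grip force for a two-finger friction grasp: the friction
    from each finger (mu * F) must support half the object's weight."""
    return safety * mass_kg * G / (2.0 * friction_coeff)


def grasp(gripper, mass_kg: float, friction_coeff: float, step_mm: float = 0.5) -> float:
    """Close in small increments until the sensed contact force reaches
    the threshold, then stop. Returns the force at which we stopped."""
    threshold = force_threshold(mass_kg, friction_coeff)
    while gripper.read_force() < threshold and gripper.width > 0:
        gripper.step_close(step_mm)
    return gripper.read_force()
```

<p>A lighter or slipperier object yields a lower or higher threshold respectively, which is how a single controller can handle both a strawberry and a water bottle without per-object tuning.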
We can then further employ LLMs to reason about these contact forces to pick out ripe fruit and vegetables!</p> <h2>What does the day-to-day of this research look like?</h2> <p>Leading up to submission, I was running experiments on the robot and picking up different objects with different strategies pretty much every day. A little repetitive, but also exciting. Prior to that, and now that I’m trying to improve the project for the next conference, I spend most of my time reading papers, thinking/coming up with ideas, and setting up small, one-off experiments to try out those ideas.</p> <h2>How did you come to study at ŷڱƵ Boulder?&nbsp;</h2> <p>For a few years, I’ve known that I really wanted to build robots that could directly, immediately help my loved ones and community. I had a very positive first research experience in my last year of undergrad and learned what it felt like to have true personal agency in pursuing work that I cared about. At the same time I knew I’d be relocating to Boulder after graduation. I was very fortunate that Nikolaus accepted me and let me keep pursuing this goal of mine.</p> <p>It’d be unfathomable if I could keep doing this research in academia or industry, though of course that would be ideal. But I’m biased toward academia, particularly teaching. I’ve been teaching high school robotics for five years now, and I’m now teaching/mentoring undergrads at ŷڱƵ Boulder—each day is as fulfilling as the first. I have great mentors among the robotics faculty and senior PhD students. We work in ECES 111, a giant, well-equipped space that three robotics labs share, and it’s great for collaboration and brainstorming.&nbsp;</p> <h2>What are your hopes for this international conference (and what conference is it?)</h2> <p>The venue is a workshop at the 2024 International Conference on Robotics and Automation (ICRA 2024), happening in Yokohama, Japan, from May 13-17.
The name of the workshop is a mouthful: Vision-Language Models for Navigation and Manipulation (VLMNM).</p> <p>A workshop is detached from the main conference, and is kind of its own little bubble (like a big supermarket—the conference—hosting a pop-up food tasting event—the workshop). I'm really excited to meet other researchers and pick their brains. As a first-year, I’ve spent the past year reading papers from practically everyone on the workshop panel, and from their students. I’ll probably also spend half my time exploring (eating) around the Tokyo area.<br> &nbsp;</p></div> </div> </div> </div> </div> <div>William Xie, a first-year PhD student in computer science, is teaching a robot to reason about how gently it should grasp previously unknown objects by using large language models (LLMs).&nbsp;</div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 02 Apr 2024 19:39:31 +0000 Anonymous 2440 at /cs VIDEO: 3D display could soon bring touch to the digital world /cs/2023/08/01/video-3d-display-could-soon-bring-touch-digital-world <span>VIDEO: 3D display could soon bring touch to the digital world</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2023-08-01T10:59:12-06:00" title="Tuesday, August 1, 2023 - 10:59">Tue, 08/01/2023 - 10:59</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/screenshot_2023-08-01_at_10.57.02_am.png?h=866e7246&amp;itok=8UxRgwiM" width="1200" height="600" alt="Screenshot of rippling display"> </div> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/473"
hreflang="en">Nikolaus Correll</a> <a href="/cs/taxonomy/term/439" hreflang="en">Research</a> </div> <span>Daniel Strain</span> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> </div> </div> </div> </div> <div>Nikolaus Correll, associate professor in the Department of Computer Science, helped create the basis for a new rippling 3D display that can bring touch to digital experiences. </div> <script> window.location.href = `/today/2023/07/31/3d-display-could-soon-bring-touch-digital-world`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 01 Aug 2023 16:59:12 +0000 Anonymous 2313 at /cs Correll and Mickelson receive fifth Open Educator Award /cs/2022/03/07/correll-and-mickelson-receive-fifth-open-educator-award <span>Correll and Mickelson receive fifth Open Educator Award</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2022-03-07T00:00:00-07:00" title="Monday, March 7, 2022 - 00:00">Mon, 03/07/2022 - 00:00</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/cu_open_educators_2022.jpeg?h=f214e3e9&amp;itok=_8L9A7Dd" width="1200" height="600" alt="ŷڱƵ Open Educator"> </div> </div> <div role="contentinfo" class="container ucb-article-categories" itemprop="about"> <span class="visually-hidden">Categories:</span> <div class="ucb-article-category-icon" aria-hidden="true"> <i class="fa-solid 
fa-folder-open"></i> </div> <a href="/cs/taxonomy/term/465"> News </a> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/475" hreflang="en">Alan Mickelson</a> <a href="/cs/taxonomy/term/469" hreflang="en">News</a> <a href="/cs/taxonomy/term/473" hreflang="en">Nikolaus Correll</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> </div> </div> </div> </div> <div>The University Libraries and ŷڱƵ Boulder Student Government are pleased to recognize Nikolaus Correll and Alan Mickelson as recipients of the 2022 Open Educator Award at the University of ŷڱƵ Boulder.
Both associate professors were selected to receive this award for using open educational practices in their teaching.</div> <script> window.location.href = `/libraries/2022/03/07/correll-and-mickelson-receive-fifth-open-educator-award`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Mon, 07 Mar 2022 07:00:00 +0000 Anonymous 2073 at /cs Robotics team scores an award in Tokyo /cs/2018/10/25/robotics-team-scores-award-tokyo <span>Robotics team scores an award in Tokyo</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2018-10-25T14:36:44-06:00" title="Thursday, October 25, 2018 - 14:36">Thu, 10/25/2018 - 14:36</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/img_4863.jpg?h=206d5ebd&amp;itok=A7aFIbRu" width="1200" height="600" alt="The team poses with their award in front of a colorful conference banner."> </div> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/473" hreflang="en">Nikolaus Correll</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> </div> </div> </div> </div> <div>The ŷڱƵ Boulder team was formed around the ŷڱƵ Boulder spinoff company Robotics Materials Inc., which was founded by Associate
Professor Nikolaus Correll. </div> <script> window.location.href = `/engineering/2018/10/25/cu-boulder-robotics-team-scores-award-tokyo`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Thu, 25 Oct 2018 20:36:44 +0000 Anonymous 1077 at /cs Correll: To really help U.S. workers, invest in robots /cs/2017/04/04/correll-really-help-us-workers-invest-robots <span>Correll: To really help U.S. workers, invest in robots</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2017-04-04T09:13:40-06:00" title="Tuesday, April 4, 2017 - 09:13">Tue, 04/04/2017 - 09:13</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/conversation-bots.jpg?h=aed7d2b0&amp;itok=aK3LSKFy" width="1200" height="600" alt="Three students experiment with human-robot interaction and autonomous manipulation in the Correll Lab. 
"> </div> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/473" hreflang="en">Nikolaus Correll</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> </div> </div> </div> </div> <div>Robotics researcher advocates for job growth by "building on our existing strengths, remaining a leader in manufacturing efficiency and doing the hard work to further improve our educational and social systems to cope with a changing workforce."</div> <script> window.location.href = `http://robohub.org/to-really-help-us-workers-invest-in-robots/`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 04 Apr 2017 15:13:40 +0000 Anonymous 656 at /cs Gaugewear Inc. to commercialize wearable technology prototype from Correll Lab /cs/2015/07/28/gaugewear-inc-commercialize-wearable-technology-prototype-correll-lab <span>Gaugewear Inc. 
to commercialize wearable technology prototype from Correll Lab</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2015-07-28T15:13:56-06:00" title="Tuesday, July 28, 2015 - 15:13">Tue, 07/28/2015 - 15:13</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/cue_wearable_technology_0164pc.jpg.jpg?h=e2f5ab81&amp;itok=LFlXYdx6" width="1200" height="600" alt="A close-up view of the wearable device being controlled with a finger swipe."> </div> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/473" hreflang="en">Nikolaus Correll</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> </div> </div> </div> </div> <div>The new technology can be worn as an adjustable sleeve on the user’s arm or leg or sewn into everything from clothing and backpack straps to dog leashes, allowing “eyes-free” control of a tablet, phone or media player that doesn’t require any precise button touching. 
The new prototype joins popular wearable high-tech devices ranging from smart watches and fitness bands to headsets and smart gloves.</div> <script> window.location.href = `http://www.colorado.edu/today/2015/07/27/gaugewear-inc-commercialize-wearable-technology-prototype`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Tue, 28 Jul 2015 21:13:56 +0000 Anonymous 364 at /cs 'Science' article outlines advances, challenges for robotic materials /cs/2015/03/20/science-article-outlines-advances-challenges-robotic-materials <span>'Science' article outlines advances, challenges for robotic materials</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2015-03-20T00:00:00-06:00" title="Friday, March 20, 2015 - 00:00">Fri, 03/20/2015 - 00:00</time> </span> <div> <div class="imageMediaStyle focal_image_wide"> <img loading="lazy" src="/cs/sites/default/files/styles/focal_image_wide/public/article-thumbnail/f1.large_.jpg?h=b7bee6e5&amp;itok=S63PN38p" width="1200" height="600" alt="A collage showing things that have inspired robotic materials research, including banyan trees, eagles and the cuttlefish. 
"> </div> </div> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/473" hreflang="en">Nikolaus Correll</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> <div> <div class="imageMediaStyle large_image_style"> <img loading="lazy" src="/cs/sites/default/files/styles/large_image_style/public/article-image/f1.large_1.jpg?itok=xFbwvqkn" width="1500" height="517" alt="At top: Biological systems that tightly integrate sensing, actuation, computation and communication. At bottom: The engineering applications that could be enabled by materials that take advantage of similar principles. From left: The cuttlefish (camouflage), an eagle’s wings (shape change), the banyan tree (adaptive load support) and human skin (tactile sensing)."> </div> </div> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> <div><p></p> <p>Prosthetics with a realistic sense of touch. Bridges that detect and repair their own damage. 
Vehicles with camouflaging capabilities.</p> <p>Advances in materials science, distributed algorithms and manufacturing processes are bringing all of these things closer to reality every day, says a&nbsp;<a href="http://www.sciencemag.org/content/347/6228/1261689" target="_blank" rel="nofollow">review published today in the journal&nbsp;<em>Science</em></a>&nbsp;by Nikolaus Correll, assistant professor of computer science, and research assistant Michael McEvoy, both of the University of ŷڱƵ Boulder.</p> <p>The "robotic materials" being developed by Correll Lab and others are often inspired by nature, Correll said.</p> <p>"We looked at organisms like the cuttlefish, which change their appearance depending on their environment, and the banyan tree, which grows above-ground roots to support the increasing weight of the trunk," Correll said. "We asked what it would take to engineer such systems."</p> <p>Robotic materials require tight integration between sensing, computation and the ability to change the properties of the underlying material. While materials can already be programmed to change some of their properties in response to specific stimuli, robotic materials can sense stimuli and determine how to respond on their own.</p> <p>Correll and McEvoy use the example of artificial skin equipped with microphones that would analyze the sounds of a texture rubbing the skin and route information back to the central computer only when important events occurred.</p> <p>"The human sensory system automatically filters out things like the feeling of clothing rubbing on the skin," Correll said.
"An artificial skin with possibly thousands of sensors could do the same thing, and only report to a central 'brain' if it touches something new."</p> <p>While all of these materials are possible, the authors caution that manufacturing techniques remain a challenge.</p> <p>"Right now, we're able to make these things in the lab on a much larger scale, but we can't scale them down," Correll said. "The same is true for nano- and microscale manufacturing, which can't be scaled up to things like a building façade."</p> <p>The field also faces an education gap, the authors say. Developing robotic materials requires interdisciplinary knowledge that currently isn't provided by materials science, computer science or robotics curricula alone.</p> <p>At ŷڱƵ-Boulder, Correll is addressing that gap with a freshman-level engineering projects class called "Materials That Think."</p> <p>"We expose engineering students to both materials and computing, no matter what their background is," he said.</p> <p>In the long run, Correll believes robotic materials will be used in everyday items, like shoe insoles that could sense pressure and adapt their stiffness to adjust to walking or running.</p> <p>"While we can imagine such a material consisting of little patches that each include an actuator, sensor and little computer, we have a hard time imagining such a complex piece of technology could ever be affordable," he said.
"I think the last 10 years of advances in smartphones have demonstrated the opposite."</p></div> </div> </div> </div> </div> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Fri, 20 Mar 2015 06:00:00 +0000 Anonymous 414 at /cs Swiss newspaper interviews Nikolaus Correll about swarm robotics /cs/2013/07/03/swiss-newspaper-interviews-nikolaus-correll-about-swarm-robotics <span>Swiss newspaper interviews Nikolaus Correll about swarm robotics</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2013-07-03T00:00:00-06:00" title="Wednesday, July 3, 2013 - 00:00">Wed, 07/03/2013 - 00:00</time> </span> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/473" hreflang="en">Nikolaus Correll</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> </div> </div> </div> </div> <script> window.location.href = `http://www.nzz.ch/wissen/wissenschaft/der-automat-als-herdentier-1.18105390`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Wed, 03 Jul 2013 06:00:00 +0000 Anonymous 458 at /cs ŷڱƵ Boulder team develops swarm of pingpong ball-sized robots
/cs/2012/12/14/cu-boulder-team-develops-swarm-pingpong-ball-sized-robots <span>ŷڱƵ Boulder team develops swarm of pingpong ball-sized robots</span> <span><span>Anonymous (not verified)</span></span> <span><time datetime="2012-12-14T00:00:00-07:00" title="Friday, December 14, 2012 - 00:00">Fri, 12/14/2012 - 00:00</time> </span> <div role="contentinfo" class="container ucb-article-tags" itemprop="keywords"> <span class="visually-hidden">Tags:</span> <div class="ucb-article-tag-icon" aria-hidden="true"> <i class="fa-solid fa-tags"></i> </div> <a href="/cs/taxonomy/term/473" hreflang="en">Nikolaus Correll</a> </div> <div class="ucb-article-content ucb-striped-content"> <div class="container"> <div class="paragraph paragraph--type--article-content paragraph--view-mode--default"> <div class="ucb-article-content-media ucb-article-content-media-above"> <div> <div class="paragraph paragraph--type--media paragraph--view-mode--default"> </div> </div> </div> <div class="ucb-article-text d-flex align-items-center" itemprop="articleBody"> </div> </div> </div> </div> <div>University of ŷڱƵ Boulder Assistant Professor Nikolaus Correll likes to think in multiples. If one robot can accomplish a singular task, think how much more could be accomplished if you had hundreds of them.</div> <script> window.location.href = `http://www.colorado.edu/today/2012/12/14/cu-boulder-team-develops-swarm-pingpong-ball-sized-robots`; </script> <h2> <div class="paragraph paragraph--type--ucb-related-articles-block paragraph--view-mode--default"> <div>Off</div> </div> </h2> <div>Traditional</div> <div>0</div> <div>On</div> <div>White</div> Fri, 14 Dec 2012 07:00:00 +0000 Anonymous 482 at /cs