ATLAS@CHI2022
Thu, 04/28/2022 - Categories: News

ATLAS researchers will present six published works and two workshops at CHI 2022, the ACM Conference on Human Factors in Computing Systems, the world’s preeminent forum for the field of human-computer interaction. The conference, organized by ACM's Special Interest Group on Computer-Human Interaction (SIGCHI), will be held in hybrid-onsite format April 30-May 6, 2022, in New Orleans.

Researchers affiliated with Laura Devendorf’s Unstable Design Lab will present two workshops, one full paper and one journal article; Mirela Alistar’s Living Matter Lab authored two papers, one of which received a Best Paper Honorable Mention award. The ACME Lab collaborated with the VisuaLab (formerly part of the ATLAS Institute) on one paper, and ATLAS-associated PhD students will present one paper as well.

CHI papers are publications of original research in the field of human-computer interaction that are read and cited worldwide and have a broad impact on the development of HCI theory, method and practice. Acceptance to CHI is a prestigious honor; over the last decade, the conference's overall acceptance rate has been only 20-27 percent.

 

CHI 2022 papers, journal articles and workshops by ATLAS faculty and students

Living Matter Lab

[Best Paper Honorable Mention Award]
Fiona Bell (PhD student, ATLAS); Netta Ofer (research master’s student, ATLAS); Mirela Alistar (faculty, ATLAS/Computer Science).
This paper presents ReClaym, a clay-like material made from the makers’ own compost that reflects the makers’ relationships with food. Through a process of Intimate Making with an intimate material, the researchers applied manual fabrication techniques and design explorations, using ReClaym to create a collection of applications including garden paraphernalia, games and personal household items.

 (interactivity paper)
Fiona Bell (PhD student, ATLAS); Netta Ofer (research master’s student, ATLAS); Hyelin Choi (undergraduate student, Molecular, Cellular and Developmental Biology); Ella S. McQuaid (undergraduate student, Mechanical Engineering); Ethan Frier (MS, CTD—Creative Industries '21); Mirela Alistar (faculty, ATLAS/Computer Science).
In this work, researchers introduce a range of sustainable biomaterials used to create unique interactive interfaces: ReClaym, a clay-like material made from compost; Alganyl, an algae-based bioplastic; dinoflagellates, bioluminescent algae; SCOBY, symbiotic cultures of bacteria and yeast; and Spirulina, a nutrient-dense blue-green alga. The researchers will present the biomaterials at CHI, where conference participants can engage with them directly.

 

ACME Lab—Workshop Papers

Augmented Personification of Intelligent Music Tools for Creativity and Collaboration
ACM CHI 2022 Workshop 47: When Interactive Assistance and Augmentation Meet Musical Instruments.
Torin Hopkins (ATLAS PhD student), Rishi Vanukuru (ATLAS PhD student), Suibi Che-Chuan Weng (Creative Industries master's student), Amy Banic (Visiting Associate Professor, Computer Science), Ellen Yi-Luen Do (Professor, ATLAS Institute & Computer Science).

Designing and Studying Social Interactions in Shared Virtual Spaces using Mobile Augmented Reality
ACM CHI 2022 Workshop 46:
Rishi Vanukuru (ATLAS PhD student), Amarnath Murugan, Jayesh Pillai and Ellen Yi-Luen Do (Professor, ATLAS Institute & Computer Science).

What to Design Next: Actuated Materials and Soft Robots for Children
ACM CHI 2022 Workshop 39: Actuated Materials and Strategies for Human Computer Interaction Design.
Chris Hill (ATLAS PhD student), Ruojia Sun (ATLAS PhD student), Ellen Yi-Luen Do (Professor, ATLAS Institute & Computer Science).


 

ACME Lab and VisuaLab* collaboration


S. Sandra Bae (ATLAS PhD student); Clement Zheng (ATLAS post-doctoral research associate; PhD, Technology, Media & Society ‘20); Mary Etta West (PhD student, Computer Science); Ellen Yi-Luen Do (faculty, ATLAS/Computer Science); Samuel Huron (faculty, Télécom Paris, Institut Polytechnique de Paris); Danielle Albers Szafir (UNC Chapel Hill, former ATLAS faculty).
Physicalizations are more than just physical representations of data. Each physicalization is also (un)consciously a product of the different research communities it is part of, specifically of their research perspectives and values. But research currently lacks a synthesis across the different communities that data physicalization sits upon, including their approaches, theories and even terminologies. To bridge these communities synergistically, ATLAS researchers present a design space that describes and analyzes physicalizations according to three facets: context (end-user considerations), structure (the physical structure of the artifact) and interactions (interactions with both the artifact and the data).
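To make the three facets concrete, here is a minimal, hypothetical encoding of one physicalization as a data structure. The facet names (context, structure, interactions) come from the paper's description above; the fields and example values are invented for illustration only.

```python
from dataclasses import dataclass, field

@dataclass
class Physicalization:
    """One point in the design space; the three facet names follow the
    paper, while the field contents below are invented for illustration."""
    context: dict        # end-user considerations (audience, setting, ...)
    structure: dict      # the physical structure of the artifact
    interactions: list = field(default_factory=list)  # with artifact and data

# A made-up example: a touchable bar-chart sculpture in a museum.
example = Physicalization(
    context={"audience": "museum visitors", "setting": "public exhibit"},
    structure={"material": "3D-printed plastic", "encoding": "bar height"},
    interactions=["touch", "rearrange", "compare"],
)
print(example.interactions)
```

Describing artifacts in a shared schema like this is one way the design space could let researchers from different communities compare physicalizations facet by facet.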

*Following Danielle Szafir's departure last summer, the ATLAS VisuaLab was closed.

 

Unstable Design Lab 


Maya Livio (PhD student, Intermedia Art, Writing and Performance); Laura Devendorf (faculty, ATLAS/Information Science).
This paper introduces the concept of the eco-technical interface, the sites at which human, non-human and technological interfaces overlap, as a critical zone where designers can surface and subvert issues of multispecies relations, such as nonhuman instrumentalization.

  (journal article)
Jordan Wirfs-Brock (PhD candidate, Information Science); Alli Fam (reporter, New Hampshire Public Radio); Laura Devendorf (faculty, ATLAS/Information Science); Brian C Keegan (faculty, Information Science).
This first-person, retrospective exploration of two radio sonification pieces illuminates the role of narrative in designing to support listeners as they learn to listen to data.

(workshop)
Jordan Wirfs-Brock (PhD candidate, Information Science); Maxene Graze (Data Visualization Engineer, MURAL); Laura Devendorf (faculty, ATLAS/Information Science); Audrey Desjardins (faculty, University of Washington); Visda Goudarzi (faculty, Columbia College Chicago); Mikhaila Friske (PhD student, Information Science); Brian C. Keegan (faculty, Information Science).
This workshop engages synesthesia to explore how translating between sensory modalities might uncover new ways to experience and represent data. 

(workshop)
Verena Fuchsberger (postdoc, Center for Human-Computer Interaction, University of Salzburg); Dorothé Smit (research fellow, Center for Human-Computer Interaction, University of Salzburg); Nathalia Campreguer França (research fellow, Center for Human-Computer Interaction, University of Salzburg); Georg Regal (scientist, AIT Austrian Institute of Technology); Stefanie Wuschitz (Mz. Baltazar’s Lab); Barbara Huber (Mz. Baltazar’s Lab); Joanna Kowolik (project manager, Happylab); Laura Devendorf (faculty, ATLAS/Information Science); Elisa Giaccardi (faculty, Delft University of Technology); Ambra Trotto (Research Institute of Sweden).
In this one-day workshop, organizers aim to counteract the unequal distribution of access to making (e.g., in makerspaces and fab labs), where certain groups of people, such as women*, are underrepresented.

 

Associated PhD Students

 
Ryo Suzuki (ATLAS/PhD Computer Science '20; assistant professor, University of Calgary); Adnan Karim (MS student, University of Calgary); Tian Xia (BS, Computer Science, University of Calgary); Hooman Hedayati (ATLAS/PhD Computer Science ‘21); Nicolai Marquardt (faculty, University College London).
Researchers surveyed 460 research papers, formulating key challenges and opportunities that guide and inform future research in AR and robotics.


 


ATLAS researchers' algorithm helps robots detect everyone in social gatherings
Wed, 04/06/2022 - Categories: News

To effectively participate in a group discussion, it's important to be able to identify who is present and direct your attention accordingly. For most people, this is not hard, but designing robots able to do the same thing is quite challenging. While robots are equipped with sensors for detecting the number of people in a group, they are not always accurate and, to date, there has been little research into how robots can confirm their assumptions and correct any errors they may have made.

However, last year two researchers in ATLAS Institute's IRON Lab* developed a solution to this problem, described in a paper published in the March proceedings of the International Conference on Human-Robot Interaction (HRI '22). The authors, Hooman Hedayati (PhD, computer science '20) and Daniel Szafir, assistant professor of computer science at UNC Chapel Hill and former director of the ATLAS IRON Lab, proposed a method for overcoming situations in which conversational group (F-formation) detection algorithms fail.

By studying different conversational group data sets, the researchers observed that, given the size of a group, people tend to stand in predictable locations relative to each other. Hedayati and Szafir then developed a system for identifying high-probability regions where people are likely to stand relative to a single participant. Using that system, a robot can reason about whether another person in the conversation has gone undetected and correct its error.

The approach uses two models. The first estimates the true size of a conversational group when only some participants have been detected. The second predicts the locations where any undetected participants are likely to be standing. Together, these models may improve detection algorithms, strengthening a robot's ability to detect all members of a group and participate more seamlessly in a conversation.
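The core intuition, that the detected members of a conversational group constrain where undetected members are likely to stand, can be sketched in a few lines. The following toy Python example is not the authors' implementation; the circular-arrangement assumption and the gap threshold are invented for illustration. It flags unusually large angular gaps around a group's center as candidate positions for undetected participants:

```python
import math

def predict_missing_positions(detected, group_radius=1.0,
                              min_gap=math.radians(120)):
    """Toy sketch: given (x, y) positions of the detected members of a
    conversational group, assume members stand roughly on a circle around
    the group's center and treat large angular gaps as likely locations
    of undetected participants."""
    # Approximate the group's center as the mean of detected positions.
    cx = sum(x for x, _ in detected) / len(detected)
    cy = sum(y for _, y in detected) / len(detected)
    # Angle of each detected person around the center, sorted.
    angles = sorted(math.atan2(y - cy, x - cx) for x, y in detected)
    candidates = []
    for i, a in enumerate(angles):
        gap = (angles[(i + 1) % len(angles)] - a) % (2 * math.pi)
        if gap > min_gap:  # an unusually large gap: someone may be missing
            mid = a + gap / 2
            candidates.append((cx + group_radius * math.cos(mid),
                               cy + group_radius * math.sin(mid)))
    return candidates

# Three people detected on a circle, with an obvious gap at the bottom:
print(predict_missing_positions([(1.0, 0.0), (0.0, 1.0), (-1.0, 0.0)]))
```

The published models are statistical and learned from data sets; this sketch only illustrates the kind of spatial reasoning involved.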

*Following Szafir's departure last summer, the ATLAS IRON Lab was closed.

 

Publication

Hooman Hedayati and Daniel Szafir. 2022. . In Proceedings of the 2022 ACM/IEEE International Conference on Human-Robot Interaction (HRI '22). IEEE Press, 402–411 (March 7-10, 2022—virtual, originally Hokkaido, Japan).


RoomShift research: IEEE Computer Graphics and Applications magazine's cover story
Tue, 08/10/2021 - Categories: News
THING Lab researchers, led by recent PhD graduate Ryo Suzuki, developed a swarm of shape-changing robots that move furniture around a room, opening up new haptic ideas for virtual reality.

AR Drum Circle research envisions enjoyable remote jamming experiences despite latency
Wed, 01/13/2021 - Categories: News

Long before the pandemic sent people scrambling into isolation, musicians longed to jam virtually with others across the globe. Now researchers from CU Boulder’s ATLAS Institute’s ACME Lab and Ericsson Research are developing ways for musicians to play together remotely through the AR Drum Circle project.

The difficulty with online jamming has always been latency, the tiny delay that occurs when data is transmitted from one point to the next, says Torin Hopkins, an ATLAS PhD student who leads the ATLAS team. Video conferencing participants don’t detect the delay because they generally take turns when speaking, but any lag greater than 20 milliseconds makes synchronous singing or performing unworkable, he says.

“There’s no room for delay in musical collaborations,” says Hopkins, adding that the virtual choir videos popular during the pandemic were mixed in post-production. “Yet real-time music-making with zero lag and a consistent video stream currently doesn’t exist.”

In the AR Drum Circle project, ATLAS researchers and Ericsson project collaborators are exploring ways in which remote drumming experiences can be made more enjoyable despite the latency, says Colin Soguero, the project’s app developer and an undergraduate student studying Creative Technology & Design.

“Latency is one of the biggest issues with remote collaboration, and it can be very frustrating for musicians who rely so heavily on precise coordination,” he says. 

Jamming with Avatars
Some of AR Drum Circle’s research focuses on avatars, computer-generated figures that in this case replicate the actions of real drummers participating remotely in drum circles. The avatars appear in another musician’s surroundings using augmented reality (AR), a technology that superimposes a computer-generated image on a user's view of the real world.

Using the AR Drum Circle application, Musician A prints a QR code and places it where Musician B’s avatar should appear in the augmented reality view. Musician A’s Android phone runs the application and displays B’s avatar where the coded picture was placed; Musician B does the same. Each drum pad acts as a Musical Instrument Digital Interface (MIDI) controller: when either musician strikes a pad connected to their computer, the hit information is sent over the internet, the corresponding avatar drummer strikes its drum, and the drum beat is heard in both locations.

While live video of the drumming partners might be best, using avatars mediates the perception of latency and, potentially, provides visual and audio information for a more satisfying musical exchange, Hopkins says. It takes just small bits of data to trigger an avatar's hand to move, whereas rendering video requires transmitting large amounts of data for every pixel of the moving images.

Adding to this, the core idea of this project is not merely collaboration, but how to minimize or leverage the effects of the inevitable latency and jitter (the deviation from true periodicity of a presumably periodic signal) in order to make collaborations that are highly sensitive to timing more successful or fun, says Mark Gross, professor of computer science, ATLAS director and a member of the project’s advisory team.

“Latency cannot be avoided, but its effects can be mitigated by being clever in portraying avatars and by anticipating future actions,” Gross says.

Sending the drum pad information over the internet to a receiving computer is “incredibly complex,” Hopkins adds. The data travels a long journey and encounters many checkpoints along the way, and small packets of information travel much faster across the network than video with sound. Because the avatar's motion needs to be realistic, the complex animation information is kept on the receiving device and only “start animation” messages are sent over the internet.
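As a rough illustration of why trigger messages beat video here, consider how small a single drum-hit event can be on the wire. This sketch uses a hypothetical message format, invented for illustration rather than taken from the project's actual protocol:

```python
import struct
import time

# Hypothetical wire format for a single drum hit (invented for illustration,
# not the project's actual protocol): an 8-bit MIDI note number, an 8-bit
# velocity, and a 64-bit millisecond timestamp, in network byte order.
HIT_FORMAT = "!BBQ"

def pack_hit(note, velocity, timestamp_ms):
    """Serialize one drum hit into a tiny binary trigger message."""
    return struct.pack(HIT_FORMAT, note, velocity, timestamp_ms)

def unpack_hit(payload):
    """Recover (note, velocity, timestamp_ms) on the receiving side."""
    return struct.unpack(HIT_FORMAT, payload)

msg = pack_hit(38, 100, int(time.time() * 1000))
print(len(msg), "bytes per hit")  # 10 bytes, versus kilobytes per video frame
```

A 10-byte trigger fits easily into a single network packet; keeping the heavy animation data on the receiving device means only these tiny events cross the network, which is exactly the trade-off the team describes.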

Enjoying the experience
Just watching and hearing an avatar strike a drum doesn’t provide adequate information for remote drummers to synchronize, says Ellen Do, professor of computer science with ATLAS, who also participates in several drum circles. Drummers often use gestures, such as head motions and eye contact, to indicate part changes, turn taking and solos, she says. They also use striking force to control volume and hand position to control the timbre; they need to recognize the patterns of the rhythms (e.g., focusing on the down beats, space in-between the beats, the speed, embellishments, harmony, etc.) to play with others, she says.

A large part of the team’s research focuses on determining which of those gestures and expressive features might help remote drummers feel immersed in the collaborative musical experience and experience the enjoyment of feeling connected with each other, she says.

Hopkins, who plays guitar, piano, ukulele, bass, and drums, as well as sings, has missed jamming with other musicians during the pandemic.

“Meeting new people, sharing new ideas and the audiences: those are the things that I really, really miss,” Hopkins says. “Part of the project is figuring out how to incorporate that. Every time I hit the drum, is that enough to make you feel like I’m listening to you? That you feel connected, and that we feel in-sync with each other?”

Connecting in an isolated world


Over time the researchers plan to expand the study to include different types of musical jams: more drummers, musicians playing different instruments, and even dancers who would interact with drummers, as might happen in a physically co-located drum circle, says Do.

Soguero adds that the researchers are also exploring looping, which allows a player to record a drum beat and play it back later, as well as pseudo-haptics, visual effects created in a virtual environment that trick the brain to believe that it’s receiving information about touch and feel.

Regardless of the pandemic, connecting with people who are geographically distant allows for rich, connected, experiences with others who have a variety of talents, come from different cultures and have different perspectives, Hopkins says. 

Lessons learned from the AR Drum Circle study about human-human communication, or human-agent communication (with an avatar, agent or robot) could also possibly inform other computer-supported collaborative work scenarios, such as remotely collaborating in medical procedures or auto-repairs, Do says.  

“Our research raises the question, ‘Why collaborative musical experiences?’ ” says Hopkins. “Are we doing it to enjoy the company of others or because we enjoy music? How much can you strip away from either experience before you realize they are so intimately connected that designing for collaboration or musical expression alone feels disingenuous?

“Therefore, when designing the AR Drum Circle application, we focus on player-centered design strategies. Maximizing play, given the constraints of the mediating technology (augmented reality) and activity (drum circles), enables the players to feel a sense of contribution in a musical collective, giving us a much needed sense of connection in an isolated world.” 

 

 

 

AR Drum Circle's ATLAS Team: Torin Hopkins, ATLAS PhD student, is the project manager; Darren Sholes, ATLAS PhD student, is the technical lead/network engineer; Peter Gyory, ATLAS PhD student, was the former technical lead; Hooman Hedayati, PhD student in computer science, is the project's network engineer and advisor for human-robot (avatar) interaction; and Colin Soguero, an undergraduate student studying Creative Technology & Design, is the app developer. The advisory team consists of Mark D. Gross, ATLAS director and professor of computer science; Ellen Do, ATLAS and computer science professor; Amy Banic, associate professor of computer science at the University of Wyoming and visiting ATLAS professor; and Dan Szafir, ATLAS and computer science assistant professor.

Ericsson Research Project Collaborators: Amir Gomroki, head for 5G, North America; Héctor Caltenco, senior researcher; Per-Erik Brodin, research engineer;  Ali El Essaili, senior research engineer; Chris Phillips, master researcher; Alvin Jude Hari Haran, senior researcher; Per Karlsson, director, media technology research at Ericsson and head of Ericsson Research in Silicon Valley; Gunilla Berndtsson, senior researcher at Ericsson Research, Media Technologies.

 

[video:https://www.youtube.com/watch?v=e7PhJRmLt1w&feature=youtu.be]

ATLAS researchers and Ericsson Research project collaborators are exploring ways in which remote drumming experiences can be made more enjoyable despite the latency, including drumming with avatars.

Pufferfish-inspired robot could improve drone safety
Tue, 10/20/2020 - Categories: News
Pufferbot is an aerial robot with an expandable protective structure that deploys to encircle the drone and prevent the drone's rotors from coming in contact with obstacles or people.

RoomShift: A room-scale haptic and dynamic environment for VR applications
Wed, 09/30/2020 - Categories: News
RoomShift is a haptic and dynamic environment that could be used to support a variety of virtual reality (VR) experiences.

PufferBot: A flying robot with an expandable body
Mon, 08/24/2020 - Categories: News

PufferBot, a flying robot with an expandable body, receives worldwide media coverage

Research about PufferBot, a pufferfish-inspired aerial robot developed in ATLAS Institute's IRON and THING labs, has received worldwide attention, with the project article translated into many languages. In a paper to be presented at the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), the researchers, led by PhD student Hooman Hedayati, detail the project's aim of improving drone safety by providing a plastic shield that expands in size at a moment’s notice, forming a robotic airbag that could prevent dangerous collisions between people and machines.

The research drew press coverage and was featured on several podcasts, including Amelia's Weekly Fish Fry (EE Journal, Oct. 30, 2020) and The UAV Digest (Oct. 29, 2020), a weekly audio podcast covering unmanned aerial vehicles and systems.

TechXplore writes about PufferBot, an actuated, expandable structure that can be used to fabricate shape-changing aerial robots.

ATLAS research helps define the future of human-computer interaction
Fri, 05/01/2020 - Categories: News

 

Helping robots behave tactfully in group situations, pinpointing ways social media can avoid reminding the bereaved of their losses, blending modern technology with ancient weaving practices to improve smart textiles, encouraging visually impaired children and sighted family members to learn Braille together through tangible blocks and computer games—these are some of the topics covered in the nine papers and two workshops by researchers at CU Boulder’s ATLAS Institute that were accepted to CHI 2020, the world’s preeminent conference for the field of human-computer interaction.

Like so many other events, CHI 2020, also known as ACM’s Conference on Human Factors in Computing Systems, isn’t taking place this year, but the proceedings are published and faculty and students remain tremendously proud of their contributions. Commenting on their work, ATLAS Director Mark Gross said, “The interactions we all have with hardware and software range from the absurd to the sublime. The field of human-computer interaction has more impact today than ever before, and ATLAS students and faculty are contributing at the highest levels. I’m immensely proud of this work.”

Researchers in the Unstable Design Lab authored a remarkable four of the nine papers accepted to the conference, two of which earned honorable mention, an accolade reserved for the top 5 percent of accepted conference papers. The THING, Superhuman Computing, Living Matter, ACME and IRON labs also had papers accepted to the conference.

"Each of these papers is unique and forward-thinking," said Laura Devendorf, director of the Unstable Design Lab. "They show new ways of both designing, engaging, but also recycling wearable tech devices. They not only present interesting design work, but present it in a way that ties in theories and practices from inside and outside our research community: from design for disassembly to ASMR channels on YouTube."

CHI 2020 was scheduled to take place April 25 – 30, in Hawaii. “I’m particularly disappointed for our students. It’s a big opportunity for them and their careers to get that kind of exposure,” said Devendorf.

In all, CHI 2020 received 3,126 submissions and accepted 760. In 2019, CHI accepted five ATLAS papers, including three from the Unstable Design Lab and two from the Superhuman Computing Lab.
 

CHI 2020 papers, position papers and workshops by ATLAS faculty and students


Unstable Design Lab

[Honorable Mention Award]
Laura Devendorf (ATLAS/INFO Faculty), Katya Arquilla (Aerospace PhD Student), Sandra Wirtanen,  Allison Anderson (Aerospace Faculty), Steven Frost (Media Studies Faculty) 
By broadening the idea of who and what is considered “technical,” this paper examines the ways HCI practitioners, engineers and craftspeople can productively collaborate. 

[Honorable Mention Award]
Laura Devendorf (ATLAS/INFO Faculty), Kristina Andersen, Aisling Kelliher
How can we design for difficult emotional experiences without reducing a person’s experience? In this paper three researchers design objects that illustrate their personal experiences as mothers to gain a deeper understanding of their individual struggles.

  
Shanel Wu (ATLAS), Laura Devendorf (ATLAS/INFO)
Being mindful of the massive waste streams for digital electronics and textiles, HCI researchers address sustainability and waste in smart textiles development through designing smart textile garments with reuse in mind.

  
Josephine Klefeker (ATLAS, TAM undergraduate), Libi Striegl (Intermedia Art, Writing and Performance), Laura Devendorf (ATLAS/INFO)
Researchers introduce the online subculture of autonomous sensory meridian response (ASMR) videos, which show people slowly interacting with objects and whispering into microphones, triggering a tingling bodily sensation in viewers and listeners, as a source of inspiration for wearables and experiences of enchantment that cultivate deeper connections with our mundane, everyday environments.


IRON Lab

 
Hooman Hedayati (PhD student, Computer Science), James Kennedy, Daniel Szafir
While humans learn to interpret social situations and adjust their behavior accordingly, robots must be programmed to do so. This paper explores ways for robots to detect and predict the positions of individuals in human conversational groups in order to interact and participate in conversation more fluidly.

THING Lab & ACME Lab


Ryo Suzuki, Hooman Hedayati (both PhD students, CS), Clement Zheng (ATLAS PhD candidate), James Bohn (undergraduate, CS), Daniel Szafir, Ellen Yi-Luen Do, Mark D. Gross, Daniel Leithinger (all ATLAS faculty)
With applications in virtual tours and architectural design, this project dynamically synchronizes virtual reality with the physical environment by rearranging objects using a small swarm of robots able to elevate and relocate tables, chairs and other objects. When users can sit on, lean against, touch and otherwise interact with objects in a virtual scene, the experience is far more immersive than purely visual VR.

Living Matter Lab 


Mirela Alistar (ATLAS), Margherita Pevere
An exploration of the potential of DNA molecules to enable new ways for humans to interact with their stories and memories via a physical interface. The project involved encoding an elderly woman's written memories into precisely sequenced DNA and then splicing the code into the genome of a microorganism. The transformed bacteria then replicated, creating billions of facsimiles of the woman's memories. The resulting biofilm was presented in an exhibition as a sculpture. (CHI '20: Extended Abstracts)

Superhuman Computing Lab 

BrailleBlocks: Computational Braille Toys for Collaborative Learning
Vinitha Gadiraju, Annika Muehlbradt, and Shaun K. Kane (ATLAS/CS)
The BrailleBlocks tactile gaming system encourages visually impaired children and their sighted family members to learn Braille together through tangible blocks and pegs and an iPad application with interactive educational games.

ATLAS PhD Student in External Labs


Katie Z. Gach (ATLAS PhD Student), Jed Brubaker (INFO Faculty)
Managing Facebook pages for loved ones after their death is fraught with difficulty, according to this paper. While Facebook has created the ability for users to appoint post-mortem managers, called legacy contacts, Facebook gives them limited authority over the content, making them feel distrusted by the social network. (Published in Transactions on Social Computing; invited for presentation at CHI 2020.)

Workshops Organized


Robert Soden (ATLAS alumnus), Laura Devendorf (ATLAS/INFO faculty), Richmond Y. Wong, Lydia B. Chilton, Ann Light, Yoko Akama
This workshop explores the many ways uncertainty appears in research and the different types of responses that HCI has to offer. Outcomes of the workshop include exercises designed to evoke uncertainty in participants, concept mappings and a collection of essays developed by participants.

 
Ellen Yi-Luen Do (ATLAS faculty), among many others listed
This symposium showcases the latest HCI work from Asia, as well as research that incorporates Asian sociocultural factors into its design and implementation. In addition to circulating ideas and envisioning future research in human-computer interaction, the symposium aims to foster social networks among researchers and practitioners and to grow the Asian research community.

Workshop Papers


Matt Whitlock (CS student), Daniel Leithinger (ATLAS faculty), Danielle Albers Szafir (ATLAS faculty/INFO affiliate faculty)
This paper, which envisions the future of productivity in immersive analytics, was accepted to the Immersive Analytics workshop at CHI 2020.

Virtual and Augmented Reality for Public Safety
Cassandra Goodby (CTD student)
This paper explores potential applications of AR and VR technologies, haptics and voice recognition for first responders. It was accepted to the Everyday Proxy Objects for Virtual Reality workshop at CHI 2020.

Mental Health Survey and Synthesis
Cassandra Goodby (CTD student)
This paper on tools and technologies available through mental health applications was accepted to the Technology Ecosystems: Rethinking Resources for Mental Health workshop at CHI 2020.

 

At a time when the field of human-computer interaction is becoming more important than ever, ATLAS researchers are making substantial contributions: nine papers and two workshops at CHI '20.

Published May 1, 2020
Globalive Media's "Beyond Innovation": IRON Lab research featured on globally broadcast program

Research from the ATLAS Institute's IRON Lab, which uses augmented reality to convey a robot's intended path of motion, was featured on "Beyond Innovation," a globally broadcast program covering the latest business and technology trends. Watch the segment: https://youtu.be/bHlfPPCzMOs?t=57

Published May 22, 2019
Hooman Hedayati lands prestigious Microsoft Research internship

Hooman Hedayati, a computer science PhD student based in the ATLAS Institute’s Interactive Robotics and Novel Technologies (IRON) Lab, participated in a prestigious summer internship at Microsoft Research in Redmond, Washington, where he worked on teaching robots social skills in group conversations. His research focused on helping robots detect F-formations, group conversations that happen when “two or more people sustain a spatial and orientational relationship in which the space between them is one to which they have equal, direct and exclusive access.”

“For us, detecting F-formations is easy, and we don’t think about it,” says Hedayati. “You know how many people are in your conversational group, and you know how to position yourself with respect to others. But this task is not easy for robots.”

During the internship, Hedayati and his Microsoft Research mentor, Sean Andrist, worked on developing an algorithm to help a robot detect which people are in the same conversational group as the robot. The two plan to publish a paper about their findings.
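Their algorithm isn't described here; as a rough sketch of the geometry involved, one naive way to test for an F-formation is to check whether every participant is oriented toward a shared "o-space" at the group's center (the function names and the centroid heuristic below are illustrative assumptions, not the published method):

```python
import math

def facing_center(pos, heading, center, tol_deg=40.0):
    """True if a person at `pos` with facing direction `heading`
    (radians) is oriented toward `center` within `tol_deg` degrees."""
    to_center = math.atan2(center[1] - pos[1], center[0] - pos[0])
    # Smallest signed angle between heading and the direction to center.
    diff = abs((heading - to_center + math.pi) % (2 * math.pi) - math.pi)
    return math.degrees(diff) <= tol_deg

def is_f_formation(people, tol_deg=40.0):
    """Naive F-formation test: everyone faces the group's centroid,
    the shared o-space. `people` is a list of (x, y, heading)."""
    cx = sum(p[0] for p in people) / len(people)
    cy = sum(p[1] for p in people) / len(people)
    return all(facing_center((x, y), h, (cx, cy), tol_deg)
               for x, y, h in people)

# Two people one meter apart, facing each other: an F-formation.
face_to_face = [(0.0, 0.0, 0.0), (1.0, 0.0, math.pi)]
# Same positions, both facing away from each other: not one.
back_to_back = [(0.0, 0.0, math.pi), (1.0, 0.0, 0.0)]
print(is_f_formation(face_to_face))  # True
print(is_f_formation(back_to_back))  # False
```

Real detectors work from noisy perception rather than clean poses, which is a large part of what makes the problem hard for robots.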

“It was a great feeling to be surrounded by top scientists and legends in my field,” Hedayati says.

 


Published September 11, 2018