By Joe Arney
Not many computer scientists have signs reading “Rage Against the Machine Learning” in their offices.
But in Evan Peck’s case, it’s a perfect symbol of why he was so excited to join the information science department of the College of Media, Communication and Information this fall.
“I started to believe that some of the most pressing problems our society is wrestling with don’t require deeper technical solutions, but a reimagining of the ways we’re using technology,” he said. “I was looking for deeper connections to social sciences and community-focused work—and I think that’s what information science excels at, shifting the lens of the technical in service to the community and society.”
Peck joined the University of Colorado Boulder this fall from Bucknell University, meaning he’s gone from being a Bison to a Buffalo. More than that, the move gave him a chance to join a college and department more closely aligned with his evolving research interests, which center on information visualization—especially the way data is communicated to the public.
Establishing trust around data
He already appreciates being surrounded by faculty and students who are experts in fields like media studies and communication.
“I’m fascinated by how we encourage people to trust data, understand it and respond to it,” Peck said. “While we can advance science enough to offer compelling solutions to societal problems, we continue to share those insights with the public without an understanding of people’s cultures, beliefs and backgrounds. That’s a recipe for failure.”
If you think about some of the public health messaging you saw during the pandemic, you’ll probably remember the frustration of getting information that wasn’t helpful or didn’t reflect reality. Peck, for instance, lived in central Pennsylvania during the lockdowns. In the summer of 2020, his rural county hadn’t seen a day in which more than two people tested positive, but because most COVID maps reported risk at the state level, high caseloads in Philadelphia and Pittsburgh made all of Pennsylvania look more infectious than it was.
That degrades trust in experts, he said, “and when cases spiked in my county about a month later, I believe it had eroded trust and willingness to react to that data.”
He has taken this interest into some unexpected arenas, including extensive interviews with rural Pennsylvanians at construction sites and farmers markets, to better understand how they interpreted charts and what information mattered to them. The resulting research received a best paper award at the premier human-computer interaction conference, has been cited by the Urban Institute and others, and helped cement his interest in information science.
“I had a moment of realization,” Peck said. “I could spend my whole career as a visualization researcher and still have zero impact on my community. So how do we engage in research that has a positive impact on the people and community around the university?”
It’s not the only area where he’s looking to create impact. Peck describes himself as an advocate for undergraduate research opportunities, especially for students searching for a sense of place within their degree programs.
“It’s a mechanism for helping students explore areas that aren’t strongly represented in their core academic programs,” Peck said. “I saw this as an advisor in computer science for nearly a decade—I advised students who wanted to think deeply about how their designs impacted people, but in a curriculum in which people were a side story to their technical depth.”
An eye to ethics
He also created an initiative around ethics and computing curricula at Bucknell that has since been adopted by other computer science programs. He had noticed that when a question was framed in an ethics context, students came up with thoughtful answers, but that reasoning did not carry over into other assignments or their careers. It’s a story that’s familiar to anyone thinking about the addictiveness of social media platforms or the disruptive potential of artificial intelligence.
Some computer science programs offered a single ethics course, “but it was so isolated from the rest of their technical content that students wouldn’t put them together,” Peck said.
In response, he added more ethical and critical thinking components to the core technical curriculum, and developed a set of programming assignments in which students wrestle with a societal design question in order to accomplish their programming goals. He currently has a grant through Mozilla’s Responsible Computing Challenge to continue that work at CU Boulder.
“It’s about connecting the dots and building habits. Students need to understand that the system I’m programming is going to have implications beyond Silicon Valley,” he said. “How can we get you to think about the human tradeoffs beyond the aggregated rules you’re creating?”
It’s the kind of question he feels renewed vigor about pursuing in the Department of Information Science.
“I love being here because CMCI draws students who want to use technology in service of something they already care deeply about, and not for its own sake,” Peck said.
“Computer science knows how to build marvelous systems, but not always how to make them work fairly or responsibly for diverse people and communities,” he added. “I think our department goes beyond the idea of ‘how do we build it,’ to think critically about who we’re designing for, who technology empowers, who it privileges, who it disadvantages.”