Exploring New Technology Across Disciplines

Rising to New Heights of Innovation

From extended reality to flying drones, a journalism professor guides students as they research new technologies for storytelling.

Professor Dan Pacheco working with students in class, holding a drone in his hands.

Professor Dan Pacheco is a recognized leader in the use of emerging media platforms and extended reality. As the faculty advisor to the student organization Orange Aerial Productions, Pacheco is working to get students FAA-certified to fly drones so they can capture aerial footage for stories.

In the Alan Gerry Center for Media Innovation at Syracuse University’s SI Newhouse School of Public Communications, Professor Dan Pacheco encourages students to experiment and play around with ideas. There are Oculus headsets, gaming controllers and 3D cameras. A stockpile of drones rests on a bookcase, representing the evolution of aerial image-takers. It’s a place where new technologies meet journalism, sparking creativity and innovation. “I often tell my students, it’s like we’re the super collider of ideas in the innovation lab,” says Pacheco, who holds the Peter A. Horvitz Endowed Chair in Journalism Innovation and is a professor of practice in the magazine, news and digital journalism department.

Pacheco is a pioneer in applying technologies to journalistic storytelling—from employing drones for aerial photography to creating virtual spaces that put people inside a story. He’s an entrepreneur and a recognized leader in the use of emerging media platforms and extended reality (XR), now often referred to as “the metaverse,” which includes the mix of user experiences in the simulated digital world. One of the most recent additions to Pacheco’s repertoire is photogrammetry, which enables photographers to capture entire spaces with 3D cameras and recreate them for different media platforms. While it’s easy to get caught up in the razzle-dazzle of new technology, he emphasizes “it’s not about the technology, it’s about ideas and what problems they solve for people.”

I often tell my students, it’s like we’re the super collider of ideas in the innovation lab.

—Professor Dan Pacheco, Peter A. Horvitz Endowed Chair in Journalism Innovation

A Passion for Exploration

Professor Dan Pacheco directing a student next to him to scan a small elephant object on the table.

Ryan Baker ’22 (left), a dual major in broadcast and digital journalism and marketing management, gets tips from Professor Dan Pacheco on how to 3D scan objects using the iPhone 12’s LiDAR sensors.

Pacheco preaches experimentation. In his Emerging Media Platforms and Virtual Reality Storytelling courses, he welcomes any interested students. His only prerequisite: a passion to explore and learn. “When it comes to innovation with totally new technologies, exploratory learning is the way you have to do it,” he says.

In the virtual reality (VR) class, for instance, Pacheco encourages students to draw ideas from games and use gaming software systems to deliver information through unique approaches. Case in point: One of his students created a VR language-tutorial game to help people learn Korean by interacting with objects—sharing their names, pronunciations and characters. “It was a fun game, but what you see is that by doing something in a different way, it served the mission better,” he says.

Robotics in Motion

These four-legged robots are helping to shape the future of science and technology.

Assistant Professor Zhenyu Gan holding his robotic dog in front of a whiteboard covered in calculations.

Zhenyu Gan, assistant professor of mechanical and aerospace engineering, with one of his quadruped robots.

On the second floor of Link Hall, Zhenyu Gan, an assistant professor of mechanical and aerospace engineering in the College of Engineering and Computer Science, designs medium-sized robots—namely, bipeds (two-legged) and quadrupeds (four-legged)—as well as wearable robotic devices. Gan’s work reflects the growing convergence of the technological, digital and biological worlds.

Gan oversees the Dynamic Locomotion and Robotics Lab, where he builds simplified models for legged locomotion. The lab is part of the University’s dynamic response to the rise of autonomous systems.

I want to build a robot that helps people accomplish really hard tasks.

—Jing Cheng, Ph.D. student

His research draws on science, engineering and technology. Using motion-capture data to isolate the movements of animals, Gan’s team develops simple spring-mass models. These models imitate different gait patterns, producing locomotion through a sequence of foot contacts with the ground.
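The spring-mass idea can be sketched in a few lines of code. The model below is a toy version of the spring-loaded inverted pendulum (SLIP) family of gait models, with illustrative parameters chosen for this example rather than values from Gan’s lab: a point mass rides a massless leg spring, and “locomotion” emerges as a repeating sequence of touchdown and liftoff events.

```python
# Minimal vertical spring-mass hopper: the leg spring pushes only while the
# mass is low enough for the foot to touch the ground (stance phase).
# All parameters are illustrative assumptions, not values from the lab.
def simulate_hopper(y0=1.2, v0=0.0, rest_len=1.0, k=2000.0, m=10.0,
                    g=9.81, dt=1e-4, t_end=2.0):
    """Integrate vertical motion and count ground contacts."""
    y, v, t = y0, v0, 0.0
    contacts, in_stance = 0, False
    while t < t_end:
        spring = k * (rest_len - y) if y < rest_len else 0.0  # stance only
        v += (spring / m - g) * dt   # semi-implicit Euler step
        y += v * dt
        if y < rest_len and not in_stance:    # touchdown event
            in_stance, contacts = True, contacts + 1
        elif y >= rest_len and in_stance:     # liftoff event
            in_stance = False
        t += dt
    return contacts

hops = simulate_hopper()  # repeated ground contacts emerge from the dynamics
```

Even this stripped-down model produces the hallmark of legged locomotion the paragraph describes: a rhythmic alternation of flight and stance driven purely by the spring and gravity, with no motor commands at all.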

While people with disabilities and the elderly are his primary focus, nurses and caregivers, shipping and industrial workers, combat troops and first responders can also benefit from wearables. “Our robotic exoskeletons prevent damage to injury-prone areas of the body while minimizing fatigue,” Gan says.

Finding Practical Applications

Ph.D students standing behind a robotic dog outside.

Gan flanked by Ph.D. students Cheng (left) and Yasser Alqaham.

Motion control is a challenge for any legged robot, especially one with two feet. Because their gait is based on an inverted pendulum model (where the center of mass sits above the pivot point), two-legged robots are physically unstable. Humans are another matter, having seemingly mastered bipedal locomotion through evolution and natural selection.
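That instability shows up in a back-of-the-envelope simulation. The linearized pendulum and feedback gains below are textbook illustrations, not the team’s actual controller: without feedback, a tiny tilt grows exponentially; a simple PD controller (a hypothetical stand-in for the kind of control scheme the lab designs) pulls it back upright.

```python
# Linearized inverted pendulum: theta'' = (g/l)*theta - u, where u is a
# control acceleration. Gains and parameters are illustrative assumptions.
def simulate(kp=0.0, kd=0.0, theta0=0.01, g=9.81, l=1.0, dt=1e-3, t_end=2.0):
    """Return |tilt angle| after t_end seconds, starting from a small lean."""
    theta, omega = theta0, 0.0
    for _ in range(int(t_end / dt)):
        u = kp * theta + kd * omega      # PD feedback (zero if gains are 0)
        omega += ((g / l) * theta - u) * dt
        theta += omega * dt
    return abs(theta)

uncontrolled = simulate()                      # tilt blows up without control
controlled = simulate(kp=30.0, kd=10.0)        # feedback keeps it upright
```

The uncontrolled run diverges because the upright position is an unstable equilibrium, which is exactly why balance must be actively computed at every step for a bipedal robot, while humans do it without conscious effort.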

Jing Cheng, a second-year Ph.D. student, is designing a controller framework for a four-legged robot. “I’m figuring out a control scheme that helps robots move like animals,” he says.

An aspiring professor, Cheng is building on Gan’s research into movement planning and control. “I want to build a robot that helps people accomplish really hard tasks,” he says.

Turning Mistakes into Miracles

Product shot of a robot dog.

Robot dogs come in all shapes and sizes and, at Syracuse, are used to advance teaching, research and public outreach.

Gan laces his research with AI and machine learning. AI, he explains, draws on computer science and robust databases to help machines “think, work and react like humans.”

Machine learning, on the other hand, is a type of AI that helps a robot perform tasks without being explicitly programmed for them. This is accomplished via algorithms and statistical models that draw inferences from data patterns.
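As a toy illustration of that idea, inferring a quantity from data patterns rather than hard-coding it, the sketch below fits a spring constant from noisy force measurements by ordinary least squares. All numbers are made up for the example.

```python
# Hooke's law says F = k * x. Instead of programming in k, "learn" it from
# noisy (compression, force) samples. Data values are fabricated examples.
def fit_slope(xs, ys):
    """Ordinary least squares through the origin: k = sum(x*y) / sum(x*x)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

compressions = [0.01, 0.02, 0.03, 0.04, 0.05]   # leg compressions (m)
forces = [19.8, 40.5, 59.2, 81.1, 99.6]         # measured forces (N)
k_est = fit_slope(compressions, forces)         # close to 2000 N/m
```

The same principle, replace a hand-tuned constant with a model estimated from data, scales up to the far richer statistical models used in robot learning.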

Whether developing legged robots or wearable devices, Gan compares his research to a baby learning to walk—one step forward, two steps back, literally and figuratively. “We make mistakes, but we don’t consider them failures,” says Gan. “Our only mistake would be to stop trying.”

Exploring Extended Reality in Architecture

A professor and her students are pushing the boundaries of architecture.

Portrait of Professor Amber Bartosh, smiling with one hand on her hip.

School of Architecture professor Amber Bartosh researches and teaches how extended reality is used by designers to create and present spaces.

Close your eyes and imagine a glass building on a city street. Now open the front door and walk through. What is the lighting like? Will structural columns block your view? These are things that architects consider when creating designs, and these days, extended reality (XR) is changing the way designers create and present spaces. It’s a subject area School of Architecture professor Amber Bartosh researches and teaches her students. “I’m trying to push the boundaries of XR to explore how these tools visualize buildings and to demonstrate critical performance factors like ventilation, lighting and thermal control,” she says.

I’m working towards a career that combines XR technology and design, similar to the research I’m doing with Professor Bartosh and on my graduate thesis.

—Onkar Joshi, graduate student

XR describes the technologies that merge the real and virtual worlds. It relates to architecture through its capacity to enhance the physical environment with digital augmentation, or to immerse people in entirely virtual environments. The most obvious example is that virtual reality allows a client or designer to walk through a design before it’s constructed, a particularly powerful tool for anyone who can’t read floor plans or visualize how something might look. Bartosh believes this creates an entirely new material palette and building ground for architects.

Teaching for Tomorrow and Beyond

Professor Amber Bartosh assisting a student working on a computer.

Graduate student Onkar Joshi works with Bartosh in the Interactive Design and Visualization Lab.

At Syracuse, Bartosh teaches courses that explore the design potential and spatial output of emergent technologies. As a collaborative teacher, she looks for students who might be interested in assisting with research, which she says is driven by an ambition to create more accessible, sustainable and inspirational environments.

Since 2021, graduate student Onkar Joshi has been Bartosh’s research intern in the Interactive Design and Visualization Lab—a collaborative design space focused on creating better living environments. He’s writing his graduate thesis on advertising in virtual reality spaces. “I’m working towards a career that combines XR technology and design, similar to the research I’m doing with Professor Bartosh and on my graduate thesis,” Joshi says.

While traditional architecture may lead graduates to jobs as interior designers or project managers, architecture students studying XR often find careers in movies, video game design or related fields where environments need to be created for characters. “XR is becoming a widely used tool in conventional practice as well because it supports designers in testing their own work as well as communicating it to others,” Bartosh says.