When you think of virtual reality, or VR, you might conjure up images of action-packed video games or immersive tours of deep ocean waters and other exotic locales. But, in recent years, scientists have started to don VR goggles too—not for entertainment, but for analyzing and comprehending their data.
At Caltech, efforts to design VR tools for the future are underway, with prototypes in development for studying everything from worms to ocean waters to biomolecules and more.
“We are thinking about what doing science will look like 10 to 15 years from now,” says Santiago Lombeyda, a computational scientist at the Center for Data-Driven Discovery (CD3), the group behind the virtual reality research, as well as other data science projects. CD3 is a joint partnership between Caltech and JPL, which is managed by Caltech for NASA. “In the future, a scientist might be working on their desktop, and then they could just grab a pair of virtual reality glasses, or they may even be already wearing regular glasses that enable VR, and then start manipulating their data in the same shared visual context of their actual work area.”
“Billions of dollars have been poured into VR in the gaming industry. We can leverage their software and ask what it can do for science and scholarship,” says George Djorgovski, director of CD3 and a professor of astronomy at Caltech. Djorgovski has been working on developing VR tools for data analysis for more than a decade and, in 2015, developed a related startup company called Virtualitics, which combines VR with machine learning.
The CD3 group’s latest project, in collaboration with the National Cancer Institute, is to develop better tools for finding tumors in diagnostic imaging scans. To that end, they have developed a virtual reality environment, using a Vive VR headset, in which a user can visualize computed tomography (CT) scans from patients and identify possible tumors. According to the Caltech and JPL researchers, the 3-D virtual environment allows radiologists to visualize and identify potential tumors more effectively than the standard 2-D imaging methods available now.
“The 3-D environment can really have an impact in understanding three-dimensional structures,” says Lombeyda. “And that’s why it applies so well to looking for tumors. By walking around the data and seeing it from all sides, you might find things you wouldn’t otherwise see by just looking at 2-D slices of tissues.”
The group has begun working with several radiologists to identify possible tumors in patients. The idea is to then take this VR training data and feed it into machine-learning, or artificial intelligence (AI), programs, an effort being managed by Caltech’s Ashish Mahabal, lead computational and data scientist for CD3. Once those programs have learned how to better identify tumors, they could be used in the future to help medical professionals find candidate tumors for follow-up. “The better the training data, the better the machine-learning program,” says Djorgovski.
“There is a synergy between machine learning and virtual reality,” says Dan Crichton, director of the Center for Data Science and Technology at JPL, a partner organization to CD3. “We are training AI to see things, such as tumors, that otherwise might have been missed. This can be a useful aid to medical practitioners.”
Crichton says that components of the software they are using for their VR programs were developed originally for astronomy and planetary imaging. “Anytime you look at data in three or more dimensions, it’s challenging,” he says. “When we study planetary surfaces, for example, we want to see more than just a one-dimensional slice. We want to see the depth and how other variables change. The principles we’ve learned from this kind of imaging apply to our VR programs.”
According to the scientists, the virtual reality environment offers a more intuitive way for researchers to understand their data, whether the data represent an object such as a tumor or a molecule, or take the form of a graph with many variables plotted out.
“In a 3-D virtual environment, scientists have a more intuitive and longer-lasting grasp of the spatial relations of objects. In today’s world, more and more multidimensional data are becoming available, and these tools are improving the ability of humans to comprehend them,” says Djorgovski. “In the same way that we went from black-and-white photography to color photography, or from phone calls to Skype chats, people will not want to go back from virtual reality.”
In another application of the VR program developed by the CD3 group, in collaboration with Paul Sternberg, Bren Professor of Biology, a user has the ability to manipulate a model of a tiny transparent worm called Caenorhabditis elegans (C. elegans) using a tool that seemingly grasps the worm like a pair of tongs. When the program turns on, the worm first appears on a desk surface in a virtual office environment, at a fairly small scale. The user can then enlarge the worm by throwing it on the floor. And if the user wants to see a huge model of the worm, they can toss it out of a virtual window, where it blows up to a scale bigger than a human.
“We want researchers to come to us with data that we can then quickly prep for viewing in a VR space,” says Lombeyda. “Whatever they are studying can be seen at desktop scale, or they can drop it on the floor and walk around it.”
One of the challenges of developing VR tools that scientists will actually use lies in making the experience smooth, without jerky motions. VR programs can leave people feeling dizzy, and avoiding this is something that the CD3 group continues to work on. Lombeyda says that their overall goal is to make the experience as natural and seamless as possible for a scientist working at an office desk, so that they might be checking email and then could pop on a pair of VR glasses to quickly examine new data.
“Most of the time, scientists work at desks, so we want to optimize that experience,” says Lombeyda, adding that augmented reality, or AR, glasses, in which a viewer is only partially immersed in a virtual environment, may also become a common tool for scientists. But, he says, they are focusing on VR now in this early phase of development.
The group is also working on building VR classrooms, and last year they taught a course in a virtual space. VR classrooms can bring together students in different locations, and even across the globe, to understand and manipulate data in 3-D. According to Djorgovski, this kind of setup could also be used to improve teaching skills. For example, students in a VR classroom could anonymously hit a button if they were not comprehending a certain topic. “If the teacher were to get a few alerts, they would know at that point the students were confused and they needed to backtrack. This is something that can be done more easily and more quickly in VR versus traditional classrooms.”
Recently, the scientists presented their VR tumor program at SIGGRAPH, a computer graphics conference, to favorable reviews. “Some of the comments we kept hearing were how people were excited to see VR for something other than gaming, and also how natural the experience was in our VR setup,” says Lombeyda.
Though the technology may still be in its early phases, the researchers are excited to press on. The future of VR, they say, is not just for fun and games but for collectively making sense of our world. “All sciences are undergoing the same transformation of having to deal with larger and larger data sets, and traditional tools won’t work,” says Djorgovski. “Instead of reinventing the wheel everywhere, scientific groups are developing new methodologies for big and complex data sets. Everybody faces the same problems, so we have a great opportunity for sharing solutions and ideas.”