
WVU research merges virtual reality and artificial intelligence to reveal and analyze gigantic images in stunning detail

Created by a West Virginia University faculty member and his former graduate student, syGlass — virtual reality software that displays scientific images in immersive 3D — will soon offer AI analysis and cloud-based streaming for extremely large datasets required by scientists like structural biologists.

Two related National Institutes of Health grants support the newest stage of the research collaboration between Gianfranco Doretto, a professor in the Lane Department of Computer Science and Electrical Engineering, and Michael Morehead, an alumnus of the WVU Benjamin M. Statler College of Engineering and Mineral Resources, where he worked with Doretto as a student.

Over the past 15 years, new microscope technologies and imaging methods have revolutionized the possibilities for three-dimensional images, Doretto said. However, the immense file sizes of those now incredibly detailed images pose analysis challenges. The new iteration of syGlass will address those challenges by pairing virtual reality's stereoscopic visualization with AI models built to recognize patterns and to count and track objects in the data.

“Biologists and neuroscientists can capture data at a speed and resolution such that they can end up with hundreds of gigabytes of data that quickly become terabytes — even petabyte-sized data in the case of electron microscope scans of the brain,” Doretto said. “With mind-boggling data like that, you don’t even think about observing it. You write a computer program to analyze it and tell you something about it.”

Doretto offered the example of a neuroscientist attempting to find all the neurons present in a brain and map how those neurons connect and how the various circuits function.

“To do that, the person is working with such a large network of data that no observational tool will be helpful,” he said. “There have to be AI algorithms that help them understand the functions a certain area or circuit of the brain is attempting to perform, because the complexity is beyond our ability to observe and pinpoint. We need automated tools, AI models, that help us get there.”

When the idea of syGlass first emerged, Doretto said the motivation “was to build a tool that would allow scientists to navigate and observe very large amounts of data in three-dimensional form. We felt, if the data is in 3D, we should enable scientists to look at it in 3D just like humans observe the world in 3D, because that will allow them to make better connections and formulate more meaningful questions.”

Biologists, for example, often observe biological tissue samples to differentiate between and count various types of cells in the tissue.

“Doing something like counting cells is a lot less time-consuming and a lot more fun in virtual reality than it is in 2D,” Doretto said. “It’s also a lot less prone to error, so originally we focused on tools for helping the user count in virtual reality.”

Soon, however, he and Morehead began to ask, “Why not help the user do even less work?”

“This is where AI comes in. If you’re counting things, the AI can learn what you’re counting and try to help you count more quickly by predicting how many there are,” Doretto said. “If you find the answer satisfactory, you can begin to rely on the AI. Otherwise, you can improve it and make it better as you keep using the tool.”

In addition to incorporating AI capabilities for quantitative analysis, Doretto said the new version of syGlass will “lift the burden of data storage and data processing” by enabling users to stream images in virtual reality directly from the cloud.

“Right now, to visualize a large amount of data, you need to have a big, powerful computer next to you that can load that data and render it in virtual reality. But it has become possible to take that big fat computer and put it in the cloud, so virtual reality can be streamed to you.”

While syGlass’ early adopters were mainly neuroscientists and structural biologists, other kinds of scientists are now on board, such as geologists “who look at data in 3D in order to make predictions about different soil characteristics,” Doretto said. Even Procter & Gamble has used syGlass in pharmaceutical research.

But, he added, “This thing is not going to be locked into laboratories. With high schools around the country already using syGlass, now we’re going to enhance our tools for allowing teachers to produce lectures and narrations in virtual reality. When students look at a drawing of something like a biological organ in a book versus experiencing it in virtual reality, it’s the moving 3D version that sticks in their minds.”

The syGlass technology was licensed by WVU in 2017 to IstoVisio, Inc., a company founded by Doretto, Morehead and former WVU neuroscience faculty member George Spirou.

Morehead, IstoVisio’s CEO, recalled that “when we came up with the idea for syGlass, the Spirou and Doretto labs were already working together on advanced 3D visualization systems. In general, the data was all three-dimensional, but we could only explore it on two-dimensional computer monitors, which was very frustrating. We hypothesized that seeing this data in 3D would unlock the mind’s inherent capabilities for understanding relationships in structure, and years later that has been validated as many of our customers report finding things previously unseen.”

For the past two years, syGlass has been deployed in schools.

“Students love seeing the real data in VR, saying they can finally focus on their lessons when isolated in the headsets,” Morehead said. “With student engagement nationally at an all-time low, I’m very excited to bring these wonderful datasets into classrooms and nurture the spark of curiosity in the next generation of scientists.”
