A team in George Mason University’s computer science department is taking part in the National Science Foundation’s (NSF) I-Corps program to study the effectiveness of 3D streaming in telehealth environments.
Bo Han, an associate professor of computer science and an expert in computer networks, security, and privacy, received $50,000 for his project, Translation Potential of Next Generation Telepresence Enriched by Immersive Technologies. Nan Wu, the entrepreneurial lead on the team, said, “The idea is to build a system for immersive telepresence over the internet. It’s essentially a 3D video stream, but the technical improvement is how it saves bandwidth by using user-motion prediction and content sampling, improving the quality of the experience.”

Wu explained that the system adjusts the stream based on the receiver’s viewpoint and movements. “Since I’m looking at the front side of you, the streaming only needs to send me the front side, not the back. I don’t need to see all around you, even though it’s 3D. Moreover, when you’re farther away, a lower-resolution stream is enough since I won’t notice the difference. And if you’re hidden behind other content, there’s no need to send what I can’t see.”
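To illustrate the kind of decision Wu describes, here is a minimal sketch of per-chunk streaming logic: it skips content the viewer cannot see and lowers the resolution of distant content. The class names, thresholds, and occlusion flag are illustrative assumptions for this example, not the team’s actual system, and the prototype Wu describes also uses user-motion prediction, which is not modeled here.

```python
import math
from dataclasses import dataclass

@dataclass
class Viewer:
    position: tuple        # receiver position in the shared scene, (x, y, z)
    view_direction: tuple  # unit vector the receiver is facing

@dataclass
class ContentBlock:
    center: tuple          # centroid of one chunk of the 3D capture
    occluded: bool         # hidden behind other content from this viewpoint

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def select_stream(blocks, viewer, near=2.0, far=8.0):
    """Decide, chunk by chunk, whether to send it and at what resolution."""
    plan = []
    for block in blocks:
        to_block = tuple(c - p for c, p in zip(block.center, viewer.position))
        # Skip chunks behind the viewer or hidden behind other content.
        if dot(to_block, viewer.view_direction) <= 0 or block.occluded:
            plan.append((block, "skip"))
            continue
        # Pick resolution from distance: nearby chunks get full detail,
        # distant ones a lower level the viewer will not notice.
        d = distance(block.center, viewer.position)
        if d < near:
            plan.append((block, "high"))
        elif d < far:
            plan.append((block, "medium"))
        else:
            plan.append((block, "low"))
    return plan
```

In practice, a plan like this would be recomputed as the receiver moves, with motion prediction used to request the right chunks slightly ahead of time.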
I-Corps is, according to the NSF website, “an immersive, entrepreneurial training program that facilitates the transformation of invention to impact. This seven-week experiential training program prepares scientists and engineers to extend their focus beyond the university laboratory — accelerating the economic and societal benefits of NSF-funded and other basic research projects that are ready to move toward commercialization.”
I-Corps has supported 2,500 teams since the program began in 2012, and nearly 1,400 of them have launched startups that cumulatively raised $3.16 billion in subsequent funding.
Teams must interview at least 100 potential customers to determine market interest. According to Wu, the team didn’t find a robust market for the technology as they initially envisioned it, because the biggest pain point in the field is staff shortages, not technical shortcomings. “We thought nurses could monitor patients in 3D and save time, but there are many AI tools already doing that, monitoring 40 patients at once,” he said.
Wu said they’ve shifted toward using the technology in emergency situations, believing the tools will be useful in high-pressure trauma or emergency care. “We are thinking of a situation where there is a remote expert connected to a patient in a rural area, for example, where there is no expert on site.” Wu said that more interviews are needed to determine the market demand for such a use.