Dwight Coleman, URI Graduate School of Oceanography
Jay Ferguson, Rite-Solutions
Discoveries made along the ocean floor hold interest for oceanographers, geologists, biologists, archaeologists and ocean enthusiasts alike. The University of Rhode Island’s Inner Space Center (ISC) is one of the world’s premier oceanographic centers. Central to its mission is the careful and rapid dissemination of oceanographic data to a worldwide, online audience of educators and researchers. An early adopter of cutting-edge web-sharing technology, the ISC already provides scientists and educators ready access to streaming HD video of its exploratory ocean work. While these video streams are useful in enhancing curricula and research projects, they do not yet carry simultaneous reporting of the data being collected alongside the undersea video. Immediate access to such data could greatly improve a remote audience’s ability to understand what it is watching.
In 2012, a group of staff from the University of Rhode Island and Middletown’s Rite-Solutions was awarded a STAC Collaborative Research grant for a project that is changing the face of oceanographic data-sharing. Entitled “The Inner Space Center Information Access System (ISCIAS): Innovation for Research and Education in the Ocean State,” the project centers on software built to work with the ISC’s existing website hardware. The collaborators have made full use of their $199,000 STAC award and, in a relatively short time, have devised a pioneering method of sharing oceanographic data in tandem with video of the data’s collection.
Dwight Coleman is a Marine Scientist at the URI Graduate School of Oceanography and one of the project’s collaborators. Coleman explains that during a typical exploration mission, each of the ISC’s two exploration vessels, the NOAA Ship Okeanos Explorer and E/V Nautilus, generates roughly a terabyte of data per day. These data include still images as well as information such as side-scan sonar readings. It is a vast amount of information, and it includes the multiple streaming HD videos collected from cameras mounted on the two ships and on their ROVs (remotely operated vehicles).
Coleman was already working with Rite-Solutions’ Jay Ferguson on projects related to data handling and management when the two began discussing the need for something like ISCIAS. How could they improve the user experience for scientists and researchers watching the Inner Space Center’s live undersea video feeds? Knowing how much the associated data could enhance the streaming video experience, Coleman and Ferguson wondered how they might successfully match the two. Together they came up with a solution: a software program that links the different elements in real time.
The next step was to develop a smart user interface for displaying the data-video pairings. ISCIAS presents the video and data in a format not unlike that of popular video editing software. The video footage displays at the top of the screen. Rather than a footage timeline, a scrolling graph at the bottom represents whatever data are being collected in the recorded activity, precisely synchronized to the moment of collection shown on screen. Viewing the elements together, knowledgeable users will readily see the value of ISCIAS’ innovative presentation format.
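The core idea, matching each sensor reading to the video frame captured at the same moment, can be illustrated with a short sketch. This is not the actual ISCIAS software; the sensor log and function names here are hypothetical, and it simply shows one common way to look up the nearest-in-time data sample for a given video timestamp.

```python
import bisect

# Hypothetical sensor log: (timestamp_in_seconds, reading) pairs, sorted by time.
sensor_log = [(0.0, 4.1), (0.5, 4.3), (1.0, 4.6), (1.5, 4.4), (2.0, 4.9)]
timestamps = [t for t, _ in sensor_log]

def reading_at(video_time):
    """Return the sensor reading closest in time to a video timestamp."""
    i = bisect.bisect_left(timestamps, video_time)
    if i == 0:
        return sensor_log[0][1]          # before the first sample
    if i == len(timestamps):
        return sensor_log[-1][1]         # after the last sample
    before, after = sensor_log[i - 1], sensor_log[i]
    # Pick whichever sample is nearer in time to the video frame.
    if video_time - before[0] <= after[0] - video_time:
        return before[1]
    return after[1]

print(reading_at(1.2))  # nearest sample is at t=1.0, so this prints 4.6
```

As the video plays, a display like the one described above would call such a lookup for each frame time and scroll the matching data point into view beneath the footage.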
Sara Hickox is another member of the ISCIAS team. A University of Rhode Island educator, Hickox will work with other educators to collect evaluations of the new system’s interface. “We need to adapt it to user needs,” says Coleman, referring to the researchers, scientists, academics and students who make up ISCIAS’s intended audience.
While the Inner Space Center’s stated mission is to “share the excitement of undersea discovery as it happens,” the Center is also an important source of research. Thus its emphasis on telepresence, the use of virtual reality technology to allow participation in distant events, is not entirely unselfish. ISC missions have frequently benefited from sharing their undersea explorations with a worldwide audience. Coleman offers the story of one ISC exploratory mission in the Aegean Sea where a shipwreck was discovered. In a stunning example of telepresence’s value, a knowledgeable Greek viewer watching the exploration streaming online was able to provide the researchers with the ship’s name and details of its disappearance. The STAC-awarded ISCIAS project will certainly increase the chances of such fruitful global collaborations.
As the project winds up its initial development phase, Ferguson and Coleman’s efforts remain on schedule. They are now working to automate data logging and analysis, helping to ensure quality control of the data that ISCIAS records.
STAC funding is intended not only to support work that will lead to future discoveries, but also to support work that might lead to follow-on funding and/or commercialization. Among future goals, the ISCIAS team has identified a need to deliver data in standard formats that are useful to a varied audience. Already, thanks to STAC, this collaborative project has greatly improved the Inner Space Center’s ability to share data with scientists and educators around the world.