UCSD will play a leading role in the recently approved, multimillion-dollar Ocean Observatories Initiative, a nationwide project meant to boost the public’s scientific knowledge of the oceans.
The project, which has been in the works since the early 1990s, will place high-tech, unmanned instruments along the sea floor, in the water column and across the surface of the ocean. Television cameras, remote-controlled robots and data-gathering buoys will constantly transmit data directly back to an onshore computer, located at the Scripps Institution of Oceanography.
Since the 1800s, ocean research has typically been conducted by teams of scientists who sail out for only a few hours at a time, deploying instruments and recording measurements before heading back to the lab to analyze the data.
“Imagine trying to forecast the weather with only 10 weather stations that only transmit for a couple of hours each day,” said Steve Bohlen, president of Joint Oceanographic Institutions in Washington, D.C., and the leader of the initiative. “We’re looking to employ instrumentation in the ocean that will work just as modern weather forecasting does, relying on continuous data recording.”
The team of UCSD scientists will assist in the development of the project’s essential cyberinfrastructure.
“This is [how] students, scientists and policymakers will view the data collected in real time,” said John Orcutt, the leading scientist for the campus initiative and a Scripps professor of geophysics, in an e-mail.
Scientists from the Jacobs School of Engineering, the California Institute for Telecommunications and Information Technology, the San Diego Supercomputer Center and the UCSD School of Medicine have united to contribute their expertise to the initiative, in the hope that studying the ocean will lead to new discoveries about natural phenomena such as hurricanes, earthquakes and global warming, as well as to advances in the physical, chemical and biological sciences.
The campus was awarded $29 million in federal funding to begin work on the cyberinfrastructure, with the promise that equal amounts will be awarded in each of six subsequent years until completion. Afterward, it will cost between $2 million and $3 million a year to maintain.
“Computer hardware has a lifetime of about three years and software changes even more quickly,” Orcutt said. “If we’re as successful as we hope, there will be growing demands for new capabilities that may well expand the current estimate.”
Other oceanographic institutes nationwide have been granted millions of dollars to support additional aspects of the project. The initiative is expected to cost $350 million to build and between $15 million and $50 million a year to maintain afterward.
Bohlen predicted the entire system would last only about 20 years before technological advances and increased understanding would require a wide-scale overhaul. At that point, he hopes, the American infrastructure will be able to combine with networks currently under development in Japan and Europe.
Such a network would extend at least as far as the continental shelf and connect all the oceans in what Bohlen called a “prototype global ocean-observing system.”
Such a network, combined with the cyberinfrastructure, would open a new range of possibilities for involving the public in the research process.
“Scientists, students, citizens and policymakers can all obtain access to [our] data,” Orcutt said. “The intent is to greatly democratize access through methods [like] YouTube.com, GoogleEarth.com and blogs. We hope this approach will significantly increase the size of the oceanographic community.”
Ideally, the project would spark general interest in understanding Earth’s oceans, Bohlen said.
“The public is usually surprised to learn how little we actually know,” Bohlen said. “We know more about the moon than about our oceans. This is starting to capture people’s imaginations. It is prying into the last frontier on the planet.”