AI-Assisted Marine Monitoring System Provides Long-Term Deep Dive
MarineSitu builds integrated machine vision systems for monitoring marine life and underwater environments.
MarineSitu (Seattle, WA, USA) launched in 2017 as a spinoff from the University of Washington. Founder and President James Joslin did his graduate studies at UW and worked there for several years as a research engineer before starting the company. With funding from the U.S. Department of Energy, MarineSitu first developed the technology to monitor conditions around devices such as wave energy converters and underwater turbines, which are used in the young renewable marine energy industry. That industry is part of the “blue economy,” a term that refers, generally, to sustainable economic development of marine resources.
Using cameras for various underwater vision system applications is not new. However, Joslin notes that most cameras designed for underwater applications, often installed on underwater drones and vehicles, can only be deployed for limited periods of time before they need to be removed from the water to be cleaned and maintained.
MarineSitu’s system was initially designed to monitor fish population trends and help ensure that an apparatus such as an underwater turbine does not endanger or otherwise harm important or protected species.
“Marine energy is a new and growing industry that needs a way to ensure that it is environmentally sensitive,” Joslin says. “There really weren’t any camera systems well designed for long term deployment.”
MarineSitu set out to develop a system that can be deployed underwater for months, even years, at a time, Joslin says. Specifically, the company has targeted marine energy companies and has systems deployed in several locations across the U.S.
“We have a system currently deployed in Hawaii at the wave energy center in Oahu,” he says. “This is a facility operated by the U.S. Navy that is intended for wave energy developers to demonstrate and test their systems. Our monitoring system has been in the water for 20 months and still looks very good.”
The monitoring system has two distinct components: the hardware, consisting of cameras, housings, lighting systems, and controllers; and the computing and software side, which builds and trains AI models to gather specific types of data for analysis.
Joslin says MarineSitu chose Allied Vision (Exton, PA, USA) Alvium GigE cameras. The Alvium line offers models with up to 5 Gigabit bandwidth, 34.1 MPixel resolution, and 464 fps frame rates. The cameras are integrated into housings MarineSitu designed specifically for long-term, low-maintenance underwater use and are equipped with Kowa (Torrance, CA, USA) LM5JCM lenses.
“We did a lot of design,” Joslin says. “They have to be resistant to corrosion, so most of the housings are made from plastic components.”
Each housing is made from a PVC plastic cylinder with acetal endcaps and a borosilicate viewport held by a copper retaining ring. The copper has some inherent anti-fouling properties, and it is paired with a mechanical wiper that can be swept across the lens to keep it clear, Joslin says.
Lighting and visibility can be a challenge. Each deployed system may have one camera or two cameras operating as a stereo pair, and each camera is paired with up to eight high-power LED lights. The lights and cameras are linked to and synchronized by a camera controller, also built by MarineSitu. The computer that controls the cameras typically sits on land in an accessible location and is connected to the internet for remote access. If a system includes more than one camera and/or lights, a microcontroller is deployed alongside it to synchronize the cameras and lights using hardware triggers; the microcontroller must be within a few meters of each individual component for the triggers to work.
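The hardware-trigger scheme Joslin describes, in which a controller pulses LEDs in sync with camera exposures, can be sketched as a simple timing calculation. This is an illustrative pure-Python sketch, not MarineSitu's firmware; the function name and the LED lead-time parameter are assumptions.

```python
def trigger_schedule(fps, exposure_ms, led_lead_ms=1.0):
    """Compute one frame's trigger timing for a synchronized LED/camera pair.

    The LED pulse starts at frame start, the camera exposure opens after a
    short lead time so the light is fully on, and the LED switches off when
    the exposure closes. All times are in milliseconds relative to frame start.
    Returns (frame_period_ms, camera_trigger_ms, led_off_ms).
    """
    frame_period_ms = 1000.0 / fps
    if exposure_ms + led_lead_ms > frame_period_ms:
        raise ValueError("exposure plus LED lead time exceeds the frame period")
    camera_trigger_ms = led_lead_ms          # exposure opens after LED settles
    led_off_ms = led_lead_ms + exposure_ms   # LED spans the whole exposure
    return frame_period_ms, camera_trigger_ms, led_off_ms

# At 10 fps with a 20 ms exposure: 100 ms frame period,
# camera triggered 1 ms in, LED off at 21 ms.
schedule = trigger_schedule(fps=10, exposure_ms=20)  # -> (100.0, 1.0, 21.0)
```

In a real deployment this schedule would be driven by the microcontroller's hardware timers rather than computed in Python, which is why the trigger lines must stay within a few meters of each component.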
The images are digitized directly on the camera, typically a GigE machine vision camera transmitting over an Ethernet connection. Because standard wireless connections have very limited range underwater, the systems are typically connected to shore via fiber optic cable.
The number and deployment pattern of the lights can vary, depending on the depth of deployment, field of view, and other factors.
“Ultimately, it depends on the clarity of the water,” Joslin says. “In Puget Sound, you might have 5-10 feet of visibility; in Hawaii, you can see up to 30 meters to the ocean floor from the top of the water down.”
Adding AI to the Mix
One issue that needed to be addressed was deciding what information to keep and analyze. Cameras running 24/7 for months at a time in an ever-changing aquatic environment can capture enormous quantities of data, quickly gathering far more than can be stored, much less used.
MarineSitu recently partnered with AI solutions developer Plainsight (San Francisco, CA, USA) to develop AI models that can effectively parse vast amounts of data, recognizing and saving only that which is needed for a given monitoring scenario.
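The "save only what is needed" idea can be illustrated with a small detection-gated filter: keep any frame where the model reports a confident detection, plus a few neighboring frames for context, and discard the rest. This is a minimal sketch under assumed names and thresholds, not Plainsight's actual pipeline.

```python
def frames_to_keep(detections_per_frame, min_confidence=0.5, pad=2):
    """Select frame indices worth archiving.

    detections_per_frame: list where entry i holds the detection confidences
    the model reported for frame i (empty list means nothing detected).
    A frame is kept if any detection meets min_confidence, along with `pad`
    frames on either side for context; all other frames can be discarded.
    """
    keep = set()
    for i, confidences in enumerate(detections_per_frame):
        if any(c >= min_confidence for c in confidences):
            for j in range(max(0, i - pad), i + pad + 1):
                keep.add(j)
    return sorted(k for k in keep if k < len(detections_per_frame))

# One confident detection at frame 2; frames 0-4 are kept, the rest dropped.
sample = [[], [], [0.9], [], [], [0.2], []]
kept = frames_to_keep(sample)  # -> [0, 1, 2, 3, 4]
```

Even a crude gate like this cuts storage dramatically for a camera that sees empty water most of the day, which is what makes continuous multi-month deployments practical.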
However, until fairly recently, the default monitoring process largely consisted of people watching video screens in real time for designated periods, manually recording what they observed and attempting to extrapolate the needed data from those observations.
For example, the University of Alaska participated in a study regarding installation of two power generating turbines in the Kvichak River, near the town of Igiugig, AK. The remote town had for years produced all its energy with diesel fuel that had to be flown into the town, making the cost of electricity very expensive. The town, and the state of Alaska, wanted to know if power generated by underwater turbines would be a viable, cost-efficient solution.
However, the river, which is pristine, is one of the largest salmon runs in the world, and many people in the state rely on the fishing economy as a vital part of their livelihood. They wanted to ensure that the turbines would not negatively impact the fish, or otherwise cause any unintended environmental consequences.
Two turbines were placed under the water on the riverbed. MarineSitu deployed a camera system near the turbines, and UA scientists physically monitored and gathered data.
“They basically had people working round the clock, monitoring a computer screen for six-hour shifts at a time, and they literally hand counted fish they observed on the screen,” Joslin says.
Both Joslin and Plainsight Co-Founder and Chief Product Officer Elizabeth Spears note that, while the data gathering and analysis was conducted as thoroughly as possible, it seems clear that such a process will benefit greatly from AI-assisted models.
“It is very labor intensive. In the past, for marine environments, monitoring was available only in snapshots,” Spears says. “With the AI models, it’s a night and day difference in the quality and level of monitoring you can do.”
To develop an AI model, the Plainsight team collects the data, labels or annotates it, and then trains the model to recognize whatever parameters and assets are of interest, in this case a certain fish species. As the models learn, they can extract more detailed data and therefore support more detailed analyses.
“It can easily progress from identifying and counting individuals to identifying behaviors and migration patterns, even detecting environmental changes such as potential pollution events,” she says. “There really is no limit to what it could ultimately be able to do.”
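The collect, label, train loop described above can be illustrated with a deliberately tiny pure-Python example: a nearest-centroid classifier over hand-labeled feature vectors. The feature values, labels, and function names here are invented for illustration; real pipelines like Plainsight's use deep detection models trained on annotated imagery.

```python
def train_centroids(labeled_examples):
    """'Training': average the feature vectors for each species label."""
    sums, counts = {}, {}
    for features, label in labeled_examples:
        total = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            total[i] += value
        counts[label] = counts.get(label, 0) + 1
    return {lbl: [v / counts[lbl] for v in vec] for lbl, vec in sums.items()}

def classify(model, features):
    """Assign the label whose centroid is nearest (squared Euclidean)."""
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(model, key=lambda lbl: sq_dist(model[lbl]))

# Hypothetical hand-labeled examples: (length_cm, girth_cm) -> species.
labeled = [([60.0, 20.0], "salmon"), ([65.0, 22.0], "salmon"),
           ([15.0, 5.0], "smelt"), ([12.0, 4.0], "smelt")]
model = train_centroids(labeled)
prediction = classify(model, [58.0, 19.0])  # -> "salmon"
```

The same structure scales up: more labeled examples and richer features (or learned ones) yield the progressively finer distinctions, from counts to behaviors, that Spears describes.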
What’s next?
The Kvichak River study is still ongoing, but those involved in the study are already seeing the benefits of AI assistance.
“The feedback we have received so far is good; it will no doubt serve as a good case study for the benefits of AI-assisted underwater monitoring,” Joslin says.
MarineSitu has inspection systems deployed in several locations, including one the University of Washington is using to monitor an experimental turbine system in Puget Sound, and the system deployed at the Navy’s wave energy center in Hawaii. In addition, MarineSitu is working with the Pacific Northwest National Laboratory, which will be using the AI assisted system for a ten-year infrastructure project, Joslin says.
Joslin says the system can be used for inspection/monitoring applications in other areas of the blue economy, including infrastructure projects, analyzing fisheries and fish hatcheries, and agricultural processes.
The system itself contains basically the same components; however, configurations, models, and deployments vary based on each project’s and customer’s needs, so each system requires some customization and assistance during integration and launch. For example, one system is being used at a hydroelectric dam to count the fish using “fish ladders,” a series of platforms installed on a dam that allow migrating fish to travel up and over it. In that situation, the camera does not need to be placed underwater.
But challenges remain, not only in building systems that can withstand the rigors of long-term deployment but in maintaining them. And achieving the goal of a fully turnkey system, from cameras and data acquisition to machine vision solutions and cloud-based databases, requires a diligent, multidisciplinary approach, Joslin says.
However, Joslin is confident that MarineSitu’s future is bright.
“I think we are well positioned to grow with the blue economy,” Joslin says.
About the Author
Jim Tatum
Senior Editor
VSD Senior Editor Jim Tatum has more than 25 years’ experience in print and digital journalism, covering business, industry, and economic development issues; regional and local government and regulatory issues; and more. In 2019, he transitioned from newspapers to business media full time, joining VSD in 2023.