The move toward the Internet of Things is being fueled by several technological advances:
• Ever-shrinking sensors that can function on incredibly small amounts of power.
• The spread of chip-enabled Internet connections.
• Wireless communications that let sensors send data and receive instructions without being wired into a cumbersome network or requiring that someone visit each sensor to download and upload data.
• New methods of energy harvesting, including solar, that could power remote sensors indefinitely.
• Updated algorithms and computing methods for pulling useful information from piles of raw data (aka Big Data and analytics).
All of these technologies will inevitably continue to improve: higher performance, smaller size, lower power needs, and more sophisticated programming and communications.
Although these advances are being made in the name of the IoT, which promises benefits in manufacturing and the interoperability of home appliances and infotainment devices, they will also benefit researchers in biology and earth sciences. A group of biologists is already taking advantage of IoT technology and hardware to study Lake George, a 32-mile-long, 200-ft.-deep lake in New York.
The group, a collaboration between IBM Research, Rensselaer Polytechnic Institute, The Fund for Lake George, and over 60 scientists from around the world, has already made some unexpected findings. After seeding the lake and its shoreline with an array of different sensors, including underwater sonar, the researchers discovered stronger than expected currents and countercurrents deep in the lake when it is frozen over in the winter. They also recently confirmed that a “ghost wave” 100 ft. high runs across the lake 30 ft. below the surface.
The team has also used its data to improve its model of the lake. For example, by taking 468 million depth measurements with high-resolution bathymetry, the researchers now have a far more detailed picture of what the bottom of the lake looks like. The previous model was based on just 564 depth measurements taken over the entire lake.
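Turning millions of scattered soundings into a continuous bottom map is, at heart, a spatial-interpolation problem. The sketch below is purely illustrative (it is not the project's actual method, and the coordinates and depths are made up): it uses simple inverse-distance weighting to estimate depth at any point on a grid from a handful of hypothetical soundings.

```python
import math

def idw_depth(x, y, soundings, power=2):
    """Estimate depth at (x, y) by inverse-distance weighting.

    soundings: list of (x, y, depth) tuples from hypothetical sonar passes.
    Nearby soundings get larger weights; an exact hit returns that depth.
    """
    num = den = 0.0
    for sx, sy, depth in soundings:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return depth  # point coincides with a sounding
        w = 1.0 / d2 ** (power / 2)
        num += w * depth
        den += w
    return num / den

# Hypothetical soundings: (x, y) in km, depth in feet
soundings = [(0.0, 0.0, 50.0), (1.0, 0.0, 120.0),
             (0.0, 1.0, 80.0), (1.0, 1.0, 200.0)]

# Fill a small 3x3 grid at 0.5 km spacing from the scattered points
grid = [[idw_depth(i * 0.5, j * 0.5, soundings) for i in range(3)]
        for j in range(3)]
```

Real bathymetric processing would use far more sophisticated gridding (kriging, splines) and account for sonar geometry, but the basic idea is the same: dense measurements let the interpolated surface track the true bottom instead of guessing between a few hundred points.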
They’ve also refined their weather model to the point that it generates two-day forecasts twice a day, at half-mile intervals across the lake. Despite the much higher resolution, the predictions of precipitation, temperature, wind speed and direction, humidity, and visibility are more accurate than ever.
Biologists and ecologists are champing at the bit to use their newfound knowledge to identify and track invasive species in Lake George. They also want to track pollutants such as road salt and fertilizer runoff from surrounding farms, and determine where they come from and how they get into the lake. The ultimate goal is to better understand the interplay among organisms in the lake and to protect them by developing effective ways to combat pollution and keep out invasive species.
This same type of intensive, blanket monitoring will undoubtedly be employed to gather and analyze more useful data on volcanoes and earthquakes (including those underwater), as well as on such critical, if more mundane, biomes as forests, deserts, tundra, and farmland. The potential benefits are staggering, not to mention how much all this would expand our knowledge of Earth and kickstart new areas of biological research. For example, field technicians could tag and track hundreds of animals such as mice and birds in real time. The possibilities are endless.
And on a side note, once analytics have a handle on Big Data, we could turn them loose on the decades of astronomical data and star charts stored in Russian and NASA files and databases.