NOAA Great Lakes Environmental Research Laboratory

The latest news and information about NOAA research in and around the Great Lakes

Aerial photo survey improves NOAA GLERL’s Lake Erie ice model

Understanding the duration, extent, and movement of Great Lakes ice is important for the Great Lakes maritime industry, public safety, and the recreational economy. Lake Erie is ice-prone, with maximum cover surpassing 80% many winters.

Multiple times a day, GLERL’s 3D ice model predicts ice thickness and concentration on the surface of Lake Erie. The output is available to the public, but the model is still under development: our modelers are continuing to refine it so that it better reflects real conditions.

As our scientists make adjustments to the model, they need to compare its output with actual conditions so they know that it’s getting more accurate. So, on January 13th of this year, they sent a plane with a photographer to fly the edge of the lake and take photos of the ice.

The map below shows the ice model output for that day, along with the plane’s flight path and the location of the 172 aerial photos that were captured.

NOAA GLERL Lake Erie ice model output with all aerial photo survey locations — January 13, 2017. Map Credit NOAA GLERL/Kaye LaFond.

These photos provide a detailed look at the sometimes complex ice formations on the lake, and let our scientists know if there are places where the model is falling short.

The model output can often be checked against images and surface temperature measurements taken from satellites as well; that information goes into the GLSEA product on our website (which is separate from the ice model). Satellite data isn’t always available, though, so the aerial photos provide extra information that GLSEA can’t.

“These photographs not only enable us to visualize the ice field when satellite data is not available, but also allow us to recognize the spatial scale or limit below which the model has difficulty in simulating the ice structures,” says Eric Anderson, an oceanographer at GLERL and one of the modelers.

“This is particularly evident near the Canadian coastline just east of the Detroit River mouth, where shoreline ice and detached ice floes just beyond the shoreline are not captured by the model. These floes are not only often at a smaller spatial scale than the model grid, but also the fine scale mechanical processes that affect ice concentration and thickness in this region are not accurately represented by the model physics.”

Click through the images below to see how select photos compared to the model output. To see all 172 photos, check out our album on Flickr. The photos were taken by Zachary Haslick of Aerial Associates.


Vertical Water Temperature in Southern Lake Michigan

Since 1990, GLERL scientists have been measuring temperature in the middle of southern Lake Michigan (at approximately 42.68, -87.07). They’ve been using a vertical chain of instruments that measure temperature from top to bottom. This is one of the longest vertical temperature records in existence anywhere in the Great Lakes, and it reveals some interesting patterns about lake temperature and the seasons. We’ve created a static infographic as well as an interactive chart that allows you to zoom in on the data and get individual measurement values.

Below, check out our infographic explaining seasonal temperature profiles in Lake Michigan.

Click here to interactively explore Lake Michigan temperature data.

Lake Michigan temperature data infographic.


Scientists Work Around the Clock During Seasonal Lake Michigan Cruise

Last month, scientists from GLERL, the Cooperative Institute for Limnology and Ecosystems Research (CILER), and other university partners took the research vessel Laurentian out for a multi-day cruise on Lake Michigan as part of seasonal sampling to assess the spatial organization of the lower food web. Spatial organization simply means the vertical and horizontal locations where organisms hang out at different times of day, and the lower food web refers to the small organisms at the bottom of the food chain.

The research goes on around the clock. Scientists work in shifts, taking turns sleeping and sampling. The Laurentian spends a full 24 hours at each monitoring station, sampling vertical slices of the water column. Sampling at these same stations has been going on since 2010, providing a long-term dataset that is essential for studying the impact of things like climate change and the establishment of invasive species.

Sampling focuses on planktonic (floating) organisms such as bacteria, phytoplankton (tiny plants), zooplankton (tiny animals), and larval fishes, which feed on zooplankton. Many of the zooplankton migrate down into deep, dark, cold layers of the water column during the day to escape predators such as fish and other zooplankton. They return unseen to warm surface waters at night to feed on abundant phytoplankton. Knowing where everything is and who eats whom is important for understanding the system.

Our researchers use different sampling tools to study life at different scales. For example, our MOCNESS (Multiple Opening Closing Net Environmental Sampling System) is pretty good at catching larger organisms like larval fish, Mysis (opossum shrimp), and the like. The MOCNESS has a strobe flash system that stuns the organisms, making it easier to bring them into its multiple nets.

The PSS (Plankton Survey System) is a submersible V-Fin (vehicle for instrumentation) that is dragged behind the boat and measures zooplankton, chlorophyll (a measure of phytoplankton), dissolved oxygen, temperature, and light levels. Measurements are made at a very high spatial resolution from the top to the bottom of the water. At the same time, fishery acoustics show where the fish are. Together, these two techniques allow us to see where much of the food web is located.

Water samples are taken at various depths and analyzed right on the boat. This is a good way to study microbes such as bacteria and very small phytoplankton. The lower food web has been pretty heavily altered by the grazing of quagga and zebra mussels. Specifically, the microbial food web (consisting of microbes such as bacteria and very small phytoplankton) makes up a larger component of the food web than before mussel invasion, and scientists are working to find out exactly how this has happened.

Check out the photos below for a glimpse of life in the field!

Central Michigan University students Anthony and Allie are all smiles as they prepare to head out!

Getting the MOCNESS ready.

Chief scientist Hank Vanderploeg looks at some data.

Filtering a water sample—filtering out the big stuff makes it easier to see microbes.

Paul prepares the fluoroprobe.

Taking a water sample in the presence of a beautiful sunset!


Tracking Changes in Great Lakes Temperature and Ice: New Approaches

In a new study, scientists from GLERL, the University of Michigan, and other institutions take a new look at changing ice cover and surface water temperature in the Great Lakes. The paper, set to be published in Climatic Change, is novel in two ways.

While previous research focused on changes in ice cover and temperature for each lake as a whole, this study reveals how different regions of the lakes are changing at different rates.

While many scientists agree that, over the long term, climate change will reduce ice cover in the Great Lakes, this paper shows that changes in ice cover since the 1970s may have been dominated by an abrupt decline in the late 1990s (coinciding with the strong 1997-1998 winter El Niño), rather than gradually declining over the whole period.

NOAA tracks ice cover and water surface temperature of the Great Lakes at a pretty fine spatial scale. Visit our CoastWatch site and you’ll see detailed maps of surface temperature and/or ice cover updated daily.

However, when studying long-term changes in temperature and ice cover on the lakes, the scientific community has in the past used either lakewide average temperature data or data from just a few buoys. We knew how each lake was changing overall, but not much more.

Now, for the first time, researchers are using our detailed data to look at the changes happening in different parts of each lake.

Using GIS (geographic information system) analysis tools, researchers calculated how fast ice cover and temperature were changing on average for each of thousands of small, square areas of the lakes (1.3 km² for ice cover, and 1.8 km² for temperature).
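For readers curious what that per-cell calculation looks like in practice, here is a minimal sketch in Python. The array shapes and values are invented for illustration; this is not the study’s actual code or data.

```python
import numpy as np

# Hypothetical sketch of the per-cell trend calculation described above;
# the grid size and ice values are made up, not the study's data.
rng = np.random.default_rng(0)
years = np.arange(1973, 2014)                        # 41 winters
ice_days = rng.uniform(20, 120, (years.size, 5, 5))  # days of ice per cell

# Least-squares slope of ice duration vs. year for every cell at once:
# slope = sum((t - t_mean) * (y - y_mean)) / sum((t - t_mean)^2).
t = years - years.mean()
slopes = np.tensordot(t, ice_days - ice_days.mean(axis=0), axes=(0, 0)) / (t ** 2).sum()
print(slopes.shape)  # one trend, in days of ice cover per year, per grid cell
```

Mapping each slope back to its grid cell is what produces trend maps like the ones described here.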

The maps below show the results. Changes in ice, on the left, are reported in the number of days of ice cover lost each year. Temperature changes are reported in degrees Celsius gained per year.

Panel a shows the change in seasonal ice cover duration (d/yr) from 1973 to 2013, and panel b shows the change in summer surface water temperature (°C/yr) from 1994 to 2013. Maps from Mason, L.A., Riseng, C.M., Gronewold, A.D. et al. Climatic Change (2016). doi:10.1007/s10584-016-1721-2. Click image to enlarge.

The researchers also averaged these values across major subbasins of the lakes. Maps of those results are below. The color coding is the same, and again, ice cover is on the left while temperature is on the right.

Note: These subbasins aren’t random, and were outlined by scientists as a part of the Great Lakes Aquatic Habitat Framework (GLAHF), which is meeting a need (among other things) for lake study at intermediate spatial scales.

The panel on the left shows the change in seasonal ice cover duration (d/yr) from 1973 to 2013, and the panel on the right shows the change in summer surface water temperature (°C/yr) from 1994 to 2013. Maps created by Kaye LaFond for NOAA GLERL. Click image to enlarge.

Depth, prevailing winds, and currents all play a role in why some parts of the lakes are warming faster than others. A lot of information is lost if each lake is treated as a homogenous unit. With so much variation, it may not make sense for every region of the Great Lakes to use lakewide averages. Studying changes at a smaller scale could yield more useful information for local and regional decision makers.

The second part of the story has to do with how ice cover has changed in the lakes. Previous studies typically represent changes in ice cover as a long, slow decline from 1973 until today (that would be called a ‘linear trend’). However, when looking at the data more carefully, it seems the differences between the ’70s and today in many regions of the Great Lakes are better explained by a sudden jump (called a ‘change point’).

The figure below shows yearly data on ice cover for the central Lake Superior basin. It is overlaid with a linear trendline (the long, slow decline approach) as well as two flat lines, which represent the averages of the data before and after a certain point, the ‘change point’.

Annual ice cover duration (d/yr) for the central Lake Superior basin, overlaid on the left with a linear trend-line, and overlaid on the right with a change-point analysis. Graphic created by Kaye LaFond for NOAA GLERL. Click image to enlarge.

Statistical analyses show that the change point approach is a much better fit for most subbasins of the Great Lakes.
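As a rough illustration of the two approaches, here is a sketch on synthetic data. The numbers are invented, and this raw residual comparison stands in for the formal statistical tests a real analysis would use.

```python
import numpy as np

# Synthetic ice-cover series (made-up values, not the study's data):
# roughly level before 1998, an abrupt drop after, plus noise.
rng = np.random.default_rng(1)
years = np.arange(1973, 2014)
ice = np.where(years < 1998, 110.0, 80.0) + rng.normal(0, 8, years.size)

# Linear trend: sum of squared residuals around the best-fit line.
coef = np.polyfit(years, ice, 1)
sse_linear = np.sum((ice - np.polyval(coef, years)) ** 2)

# Change point: try every possible split year; model the series as one
# mean before the split and another mean after it.
def sse_step(split):
    before, after = ice[years < split], ice[years >= split]
    return np.sum((before - before.mean()) ** 2) + np.sum((after - after.mean()) ** 2)

best_split = min(years[1:], key=sse_step)
print(best_split, sse_step(best_split) < sse_linear)
```

On data with a genuine jump, the step model leaves much smaller residuals than the line, which is the sense in which the change point is a “better fit.”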

So what caused this sudden jump? Scientists aren’t sure, but the change points of the northernmost basins line up with the year 1998, which was a year with a very strong winter El Niño. This implies that changes in ice cover are due, at least in part, to the cyclical influence of the El Niño Southern Oscillation (ENSO).

All of this by no means implies that climate change didn’t have a hand in the overall decline, or that when there is a cyclical shift back upwards (this may have already happened in 2014) that pre-1998 ice cover conditions will be restored. The scientific consensus is that climate change is happening, and that it isn’t good for ice cover.

This research just asserts that within the larger and longer-term context of climate change, we need to recognize the smaller and shorter-term cycles that are likely to occur.


UPDATE: GLERL Releases Drifter Buoys into Lake Erie

Update 08/09/2016: The buoys have drifted ashore and are being collected! The map below shows their full journey.

This map shows the journey of the drifters from July 5, 2016 to August 5, 2016. Created by Kaye LaFond for NOAA GLERL. Click image to enlarge.

Original post 07/13/2016:

Last week, GLERL scientists released two mobile buoys with GPS tracking capabilities, known as ‘Lagrangian drifters’, into Lake Erie. We are now watching the buoys move around the lake with interest, and not just because it’s fun. The drifters help us test the accuracy of our Lake Erie hydrodynamics model, known as the Lake Erie Operational Forecasting System (LEOFS).

This map shows the progress of the drifters as of July 13, 2016 08:19:00. Created by Kaye LaFond for NOAA GLERL. Click image to enlarge.

LEOFS is driven by meteorological data from a network of buoys, airports, coastal land stations, and weather forecasts which provide air temperatures, dew points, winds, and cloud cover.  The mathematical model then predicts water levels, temperatures, and currents (see below).

An example of outputs from the Lake Erie Operational Forecast System (LEOFS)


We use these modeled currents to predict the path that something like, say, an algae bloom would take around the lake. In fact, this is the basis of our HAB tracker tool.

The strength of LEOFS is in how well the modeled currents match reality. While there are a number of stationary buoys in Lake Erie, none provide real-time current measurements. The drifters allow us to see how close we are getting to predicting the actual path an object would take.

Researchers will compare the actual paths of the drifters to the paths predicted by our model. This is a process known pretty universally as ‘in-situ validation’ (in-situ means “in place”). Comparing our models to reality helps us to continually improve them.
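As a hypothetical sketch of how such a comparison might be scored (this is not GLERL’s actual validation code, and the coordinates below are invented Lake Erie points), one could compute the separation distance between each observed drifter fix and the model’s predicted position at the same time:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    earth_radius_km = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * earth_radius_km * math.asin(math.sqrt(a))

observed = [(41.80, -82.60), (41.83, -82.51), (41.88, -82.40)]   # drifter fixes (invented)
predicted = [(41.80, -82.60), (41.85, -82.55), (41.93, -82.47)]  # model positions (invented)
errors = [haversine_km(*o, *p) for o, p in zip(observed, predicted)]
print([round(e, 1) for e in errors])  # separation in km at each fix
```

Growing separation over time would point to where the modeled currents drift away from reality.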

For more information and forecasts, see our Great Lakes Coastal Forecasting homepage.

For an up-to-date kmz file of the drifters (that opens as an animation in Google Earth), click here.


The tricky business of predicting climate change impacts on Great Lakes water levels

An early online release of GLERL researcher Brent Lofgren’s paper entitled “Physically Plausible Methods for Projecting Changes in Great Lakes Water Levels Under Climate Change Scenarios” can be found on the American Meteorological Society’s Journal of Hydrometeorology website.

In the paper, Dr. Lofgren and his co-author, Jonathan Rouhana, explore two different ways to model the effects of climate change on evapotranspiration (the movement of water from the land to the atmosphere as the combined result of evaporation and transpiration), and, subsequently, on the water levels of the Great Lakes.

Predicting how climate change will affect the water levels of the Great Lakes is a tricky business. To answer questions like this, it is often best to use models. Modeling is central to what scientists do, both in their research and when communicating their explanations. Within their models, scientists study relationships between variables in nature and then apply those relationships to possible future scenarios with one or more tweaked variables.

However, earth systems are so complex and have so many moving parts that it’s almost impossible to capture them completely in an equation or series of equations. The beauty of modeling is that it allows scientists to start with a small amount of data and, as time goes on, build up a better and better representation of the phenomenon they are explaining or using for prediction.

Sometimes, particularly when modeling climate change, problems arise with so-called empirically-based models. Empirically-based models are created by making observations about two or more variables over a certain time period and under certain conditions, and inferring relationships from those observations. Often, those models don’t hold up when conditions change.

An alternative is physically-based models, which use the laws of physics (like conservation of mass, energy, etc.) to make predictions. Complexity is still a hurdle, but the laws of physics hold up no matter what—even when the climate changes.

Dr. Lofgren’s paper details issues with an empirically-based model widely used in Great Lakes research, the Large Basin Runoff Model (LBRM). From the abstract:

This model uses near-surface air temperature as a primary predictor of evapotranspiration (ET); as in previous published work, we show here that its very high sensitivity to temperature makes it overestimate ET in a way that is greatly at variance with the fundamental principle of conservation of energy at the land surface. The traditional formulation is characterized here as being equivalent to having several suns in the virtual sky created by LBRM.

Several suns in the sky – wow! In the most extreme case, this method of calculating evapotranspiration behaves as though there were 565 suns. 
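Where does a number like “565 suns” come from? Conceptually, the latent heat needed to evaporate the predicted water can be converted to watts per square meter and compared with the energy actually available at the surface. The sketch below illustrates that kind of energy-balance check; the exponential ET formula and every constant here are assumptions for demonstration, not LBRM’s actual equations.

```python
import math

# Deliberately simplified illustration -- NOT LBRM's actual equations.
# Suppose an empirical rule scales ET exponentially with air temperature
# (all constants invented). Extrapolated under warming, it soon demands
# more latent-heat energy than the surface actually receives.
LATENT_HEAT = 2.45e6    # J per kg of water evaporated
NET_RADIATION = 150.0   # W/m^2, a typical net surface radiation value

def et_mm_per_day(temp_c, base_mm=1.0, sensitivity=0.10):
    """Hypothetical empirical ET: base rate scaled exponentially with T."""
    return base_mm * math.exp(sensitivity * temp_c)

for warming in (0, 4, 20, 40):
    et = et_mm_per_day(15 + warming)  # mm/day at the warmed temperature
    # Energy needed to evaporate that much water, in W/m^2:
    # mm/day -> m/day -> kg/m^2/day (x1000 kg/m^3) -> kg/m^2/s -> W/m^2.
    power = et / 1000 * 1000 / 86400 * LATENT_HEAT
    print(f"+{warming:2d} C: {et:7.1f} mm/day of ET needs {power / NET_RADIATION:5.1f} 'suns'")
```

Whenever the implied latent-heat demand exceeds the available net radiation, the empirical relationship is effectively asking for more than one sun’s worth of energy, which is the physical inconsistency the paper points out.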

In the context of climate modeling, “The LBRM oversimplifies the physics of the interaction between the earth and the atmosphere,” says Dr. Lofgren.

This doesn’t mean the LBRM isn’t useful in specific instances (e.g. short-term forecasting), or that you shouldn’t ever trust empirically-based models. It just means that different types of models have their place in different circumstances, and that the LBRM probably isn’t the best choice for modeling hydrologic response under climate change conditions.

Scientists often argue about the rightness of their model, and in the process, the model can evolve or even be rejected. Consequently, models are central to the process of knowledge-building.

Scientists who dare to create models know that their models will be scrutinized and tested. Research like Dr. Lofgren’s ensures not only that models are used appropriately with an acknowledgment of their limitations, but that they are continually improved upon.


2016 Lake Erie HABs Forecast Has Arrived

Earlier today, NOAA and partners released their forecast of Harmful Algal Blooms (HABs) for the summer of 2016. The official predicted bloom severity came in at a 5.5, far milder than last year’s 10.5, although still significant.

This spring has been relatively dry, sporting a 4-inch rain deficit since May 2016, and flows in the Maumee River are down. Consequently, the amount of total bioavailable phosphorus flowing into Lake Erie that could feed blooms is lower than in the past three years.

This doesn’t mean the source of the nutrients – mainly agricultural runoff – has been addressed. Heavy, intense rainfall in the future could pick up excess nutrients and create severe blooms again.

There is high uncertainty associated with this summer’s forecast (ranging from 3 to 7) because we don’t know for sure what the overwinter effect from last summer’s bloom is going to be — phosphorus and algae material could remain in the water and boost this year’s bloom.

2016 HABs Forecast

NOAA GLERL and partners will be keeping an eye on Lake Erie all summer, and in September, we’ll be sending our Environmental Sample Processor (ESPniagara) on its first mission to monitor algal toxins in real-time near the Toledo water intake.

For more information, check out our new and improved HABs and Hypoxia homepage.