Google says its ‘physics-free’ weather tool will help nowcasting of rainfall 


Google researchers are developing machine-learning nowcasting models for rainfall that are solely data driven.

The ‘physics-free’ models rely on image-analysis techniques and can generate forecasts at 1 km resolution in under 10 minutes, “allowing forecasts that are nearly instantaneous” and “outperforming traditional models, even at these early stages of development,” said Google.

“We use a data-driven ‘physics-free’ approach, meaning that the neural network will learn to approximate the atmospheric physics from the training examples alone, not by incorporating a priori knowledge of how the atmosphere actually works,” wrote Google senior software engineer Jason Hickey in a blog about the research.

This is in contrast to current weather forecasters, Hickey said, who infer future weather from physical models grounded in scientific understanding of the atmosphere, an approach that, according to Hickey, can lead to inaccuracies.

To illustrate his point he compares radar imagery of precipitation over the continental USA with a satellite image of the same area at the same time showing cloud cover.

The side-by-side of the two images reveals that “the existence of rain is related to, but not perfectly correlated with, the existence of clouds, so inferring precipitation from satellite images alone is challenging,” Hickey said.

Hickey also noted that weather forecasters are hampered by their reliance on supercomputers processing hundreds of terabytes of weather data every day from around the world.

The computational demands of processing so much data limit the spatial resolution of these large-scale models to about 5km, which Hickey said “is not sufficient for resolving weather patterns within urban areas and agricultural land.”

In contrast, Hickey’s team’s approach, undertaken in partnership with Cornell University, requires far less computational power because it treats weather prediction as “an image-to-image translation problem”. The researchers leveraged a state-of-the-art image-analysis technique, convolutional neural networks (CNNs), which apply learned filters to images to detect meaningful patterns.
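The core operation a CNN applies is a convolution: a small filter slid across the image, producing a new image of the same size. A minimal NumPy sketch of that filtering step (toy radar frame and a hand-picked kernel; in a trained network like Google's, the kernel weights would be learned from data, and this is not Google's actual model):

```python
import numpy as np

def conv2d(image, kernel):
    """Slide a small filter over the image ('same' zero-padding),
    producing an output image of identical shape."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(image, ((ph, ph), (pw, pw)))
    out = np.zeros_like(image, dtype=float)
    for i in range(image.shape[0]):
        for j in range(image.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

# Toy "radar frame": a single bright precipitation cell.
frame = np.zeros((8, 8))
frame[3:5, 3:5] = 1.0

# A hand-picked Laplacian-style kernel that responds to edges;
# a CNN would learn many such kernels during training.
kernel = np.array([[0.,  1., 0.],
                   [1., -4., 1.],
                   [0.,  1., 0.]])

response = conv2d(frame, kernel)  # same shape as the input frame
```

Stacking many such learned filters in layers is what lets the network map one image (recent radar) to another (predicted rainfall) without any explicit atmospheric physics.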

The team found that their neural network forecast outperformed three widely used forecast models, including NOAA’s High Resolution Rapid Refresh (HRRR) model, though Hickey noted that HRRR “begins to outperform our current results when the prediction horizon reaches roughly five to six hours.”
