maps for developers

The (former) official Mapbox blog. See mapbox.com/blog for current content.


3D Weather with SceneKit: aka Karl the Fog in AR

By: Jim Martin

Weather is all about volumes of air interacting, and weather forecasting happens with 3D simulations. However, those forecasts are often sliced into 2D snapshots at different altitudes for display on a flat map, which means you're missing a lot of crucial detail. We worked with The Weather Company to show how SceneKit and ARKit can provide a better understanding of 3D weather patterns.

Left: NASA Satellite Imagery | Right: Our volumetric clouds

How we built it

San Francisco's famous "Karl the Fog" and its cloud cover are our test case for 3D weather visualization. We wanted a benchmark for our final result, so we used the weather pattern in the satellite imagery above as we built. We set three goals that, when combined, give us an interactive 3D version of these impressive satellite images:

  1. Use real data
  2. Visualize weather effects realistically
  3. Make the app performant enough for mobile AR

Data sources

First, we needed to find an image source for cloud cover to power our visualization. NOAA maintains a public source of imagery from GOES-16 (the Geostationary Operational Environmental Satellite) that covers the entire United States. Explore some of their amazing imagery, updated dozens of times a day.

Even better, RealEarth provides raster tiles of GOES-16 imagery. It isn’t a scalable data source, but it’s quick to add a tileset as a layer using our Maps SDK for iOS, so we used this as the source of ground truth for our clouds.

Rendering 3D volumes from map styles in SceneKit

To create performant 3D visuals from our imagery, we used a rendering method called “ray marching.” With this method, we can create 3D volumes from 2D map data by capturing how ‘opaque’ different regions of the map are in our map style.

We go from a 2D style (raster or vector) to a 3D volume using a custom shader.

To implement ray marching in SceneKit, we created a shader program using Metal. For every pixel rendered in the final image, the shader creates a ray that "marches" through the scene, starting at the camera. As the ray moves, it accumulates color from the map view at that position in space. The idea is to use the map's opacity and color to control where volumes appear, along with their height and thickness. Brighter colors on the map mean denser clouds at that position in the volume:

Sampling the 2D map at each step to determine the ‘cloudiness’ at each pixel.

You can see how we implemented this concept by downloading our sample project, and check out Metalkit.org if you're interested in learning the fundamentals of writing custom shaders on iOS.

Visualizing other datasets

Our examples showed clouds over San Francisco, but anything you can see in a Mapbox style can be used to create these volumes. Check out some other examples of this method in action:

Download the Mapbox SceneKit SDK, or try out the source code for our volumetric weather visualizations. If you want hands-on help, join our 3D visualizations with SceneKit live session coming up on August 23rd or participate in our Explore Outdoors challenge to show off your new skills and win amazing prizes.

