Testing sensors in fog makes autonomous vehicles and drones safer
Self-flying drones and autonomous taxis that can safely operate in fog may sound futuristic, but new research at Sandia National Laboratories’ fog facility is bringing the future closer.
Fog can make travel by water, air and land hazardous when it becomes hard for both people and sensors to detect objects. Researchers at Sandia’s fog facility are addressing that challenge through new optical research in computational imaging and by partnering with NASA researchers working on Advanced Air Mobility, Teledyne FLIR and others to test sensors in customized fog that can be measured and repeatedly produced on demand.
“It’s important to improve optical sensors to better perceive and identify objects through fog to protect human life, prevent property damage and enable new technologies and capabilities,” said Jeremy Wright, optical engineer.
Built in 2014, Sandia’s fog chamber is 180 feet long, 10 feet tall and 10 feet wide. The chamber is lined with plastic sheeting to entrap the fog.
When the team begins a test, 64 nozzles hiss as they spray a custom mixture of water and salt. As the spray spreads, the humidity builds and thick fog forms. Soon, an observer inside won’t be able to see the walls, ceiling or entrance through the aerosol, and people and objects a few feet away will be obscured or completely hidden.
Sandiaβs researchers carefully measure properties of fog over time to understand how it forms and changes. By adjusting environmental parameters, the researchers can change the fog properties to better match naturally occurring fog.
“Our team can measure and completely characterize the fog that we produce at the facility, and we can repeatedly generate similar fog on different days,” said Andres Sanchez, chemical engineer. “Having consistent and measurable conditions is important when we’re testing how sensors perform in fog.”
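As an illustration of the kind of optical characterization involved, two standard relations connect a fog’s density to what a sensor sees: the Koschmieder relation ties meteorological visibility to an extinction coefficient, and the Beer-Lambert law gives the fraction of unscattered light surviving a path through the fog. The sketch below is a minimal example of those textbook formulas; the specific visibility and range values are hypothetical, not Sandia measurements.

```python
import math

def extinction_from_visibility(visibility_m: float,
                               contrast_threshold: float = 0.05) -> float:
    """Koschmieder relation: beta = -ln(threshold) / V.
    With the conventional 5% contrast threshold, beta is about 3.0 / V (per meter)."""
    return -math.log(contrast_threshold) / visibility_m

def transmittance(beta_per_m: float, path_m: float) -> float:
    """Beer-Lambert law: fraction of unscattered light left after path length L."""
    return math.exp(-beta_per_m * path_m)

# Hypothetical example: fog with 50 m visibility, sensor viewing a target 30 m away.
beta = extinction_from_visibility(50.0)
t = transmittance(beta, 30.0)
print(f"extinction = {beta:.3f} 1/m, transmittance over 30 m = {t:.3f}")
```

Measuring droplet sizes and water content, as the Sandia team does, pins down the extinction coefficient directly instead of inferring it from visibility, which is part of what makes the chamber’s fog repeatable and quantifiable.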

Enabling safe all-weather operations for self-flying vehicles, planes and drones
Researchers from NASA’s Ames Research Center recently visited Sandia to perform a series of experiments to test how commercially available sensors perceive obstacles in fog. The Revolutionary Aviation Mobility group is part of the NASA Transformational Tools and Technologies project.
“We tested perception technologies that might go into autonomous air vehicles,” said Nick Cramer, the lead NASA engineer for this project. “We want to make sure these vehicles are able to operate safely in our airspace. This technology will replace a pilot’s eyes, and we need to be able to do that in all types of weather.”
The team set up a stationary drone in the chamber as a target and then tested various sensors to see how well they could perceive the drone in the fog.
“The fog chamber at Sandia National Laboratories is incredibly important for this test,” Cramer said. “It allows us to really tune in the parameters and look at variations over long distances. We can replicate long distances and various types of fog that are relevant to the aerospace environment.”
Cramer said one of the challenges of self-flying technology is that there would be a lot of small vehicles flying in close proximity.
“We need to be able to detect and avoid these small vehicles,” Cramer said. “The results of these tests will allow us to dig into the current gaps in perception technology on the path to autonomous vehicles.”

Fog facility helps prove technology
Teledyne FLIR has tested its own infrared cameras at Sandia’s fog facility to determine how well they detect and classify pedestrians and other objects. Chris Posch, engineering director, automotive, for Teledyne FLIR, said the cameras could improve safety both in today’s vehicles, through advanced driver-assistance features such as automatic emergency braking, and in the autonomous vehicles of the future.
“Fog testing is very difficult to do in nature because it is so fleeting and there are many inherent differences typically seen in water droplet sizes, consistency and repeatability of fog or mist,” said Posch. “As the Sandia fog facility can repeatedly create fog with various water content and size, the facility was critical in gathering the test data in a thorough scientific manner.”
Sandia and Teledyne FLIR conducted multiple performance tests with vehicle safety sensors including visible cameras, longwave infrared cameras, midwave infrared cameras, shortwave infrared cameras and lidar.
Posch said the results showed that Teledyne FLIR’s longwave infrared cameras can accurately detect and classify pedestrians and other objects in most fog, where visible cameras are challenged.

New research to detect, locate and image objects through fog
A team of Sandia researchers recently published a paper in Optics Express describing current results from a three-year project to use computational imaging and the science behind how light propagates and scatters in fog to create algorithms that enable sensors to detect, locate and image objects in fog.
“Current methods to see through fog and with scattered light are costly and can be limited,” said Brian Bentz, electrical engineer and project lead. “We are using what we know about how light propagates and scatters in fog to improve sensing and situational awareness capabilities.”
Bentz said the team has modeled how light propagates through fog to an object and on to a detector, usually a pixel in a camera, and then inverted that model to estimate where the light came from and the characteristics of the object. By changing the model, this approach can be used with either visible or thermal light.
Bentz said the team has used the model to detect, locate and characterize objects in fog and will be working on imaging objects during the project’s final year. The team has been using Sandia’s fog facility for experimental validations.
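The team’s actual forward model of scattering is far more sophisticated than anything shown here, but the core idea of inverting a forward model can be sketched with a toy example. Assume, purely for illustration, that the return from an object follows simple Beer-Lambert round-trip attenuation; measuring that return under several known, repeatable fog densities (exactly what the chamber enables) then lets a least-squares fit recover the object’s range and reflectance. All numbers below are made up.

```python
import numpy as np

# Toy forward model (an illustrative stand-in for the published scattering model):
# return from an object of reflectance R at range d, in fog with extinction beta,
# attenuated over the round trip: I = R * exp(-2 * beta * d).
def forward(reflectance: float, range_m: float, betas: np.ndarray) -> np.ndarray:
    return reflectance * np.exp(-2.0 * betas * range_m)

# "Measurements" under several known fog densities (hypothetical values).
true_R, true_d = 0.7, 40.0
betas = np.array([0.01, 0.02, 0.03, 0.04])        # extinction coefficients, 1/m
intensities = forward(true_R, true_d, betas)

# Inversion: ln(I) = ln(R) - 2*d*beta is linear in beta, so a degree-1
# least-squares fit recovers range from the slope and reflectance from
# the intercept.
slope, intercept = np.polyfit(betas, np.log(intensities), 1)
est_d = -slope / 2.0
est_R = np.exp(intercept)
print(f"estimated range = {est_d:.1f} m, reflectance = {est_R:.2f}")
```

In this noiseless toy case the fit recovers the true values exactly; the real problem is harder because multiply scattered light does not obey a single-path attenuation law, which is why the published work builds and inverts a full propagation model instead.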
Parallel to this research, the Sandia team created two bench-top fog chambers to support a project at academic alliance partner Purdue University in West Lafayette, Indiana.
Sandia is studying and characterizing the fog generated by its new bench-top fog chamber, while Purdue is using its twin system to perform experiments.
Purdue professor Kevin Webb is leading research to develop an imaging technology based on how light interferes with itself when it scatters and using those effects to detect objects.
The Sandia team has recently presented its work at SPIE and CLEO. The computational imaging and academic alliance research was funded by Laboratory Directed Research and Development.