Air Sensor Toolbox

Air Quality Exchange: Delivering High Value Air Quality Information to the Public – Workshop Summary

On June 12-13, 2019, EPA hosted a two-day workshop on “Air Quality Exchange: Delivering High Value Air Quality Information to the Public.” The purpose of the meeting was to discuss the increasing amount of air quality information being shared by various public and private entities. This information is often communicated in an inconsistent manner.

Confusion over air quality information exists because data are being generated for different purposes, needs, and users. EPA brought together stakeholders, specifically providers and interpreters of the data, to share information about their air quality products, roles, and perspectives on delivering air quality information to the public, and to provide a forum for discussion. Participants came from:

  • EPA and other federal agencies
  • Private sector, including big data management companies and sensor developers
  • State and local agencies

The information below includes a workshop summary, presentation overviews, conclusions, next steps, and a list of conferences that participants at the workshop identified as opportunities to continue the discussion.

Workshop Summary

Many common themes arose during the meeting. In particular, there was a strong desire among the group to align more closely in how air quality is messaged to the public. As individuals described their efforts to meet consumer demand for trusted, real-time, localized, actionable information, it became apparent that most web apps produce output that looks like EPA's Air Quality Index (AQI). However, the data are actually generated on different temporal and spatial scales and produced by different sources (e.g. modeled vs. monitored outputs). To address the issue, conversation centered on the use of air quality terms such as the AQI, NowCast, design value, current conditions, hourly concentration, real-time air quality, forecasts, air quality alert, air quality action day, and reporting area, and on the need to develop standardized terminology.

Improvement in what the user sees when clicking the “source” of the data is also important and should include uncertainty in the output. Finally, individuals stated the need to include more people in the conversation. While a small group was a good place to start for the initial dialogue, the group proposed that the conversation should involve others, including, but not limited to, sensor end users (nongovernmental organizations and communities), academics, air quality modelers, health professionals, community groups, other data providers and manufacturers, international colleagues, and communication specialists.

Presentation Overviews 

EPA -- EPA presentations provided an overview of air quality and health, the U.S. ambient air monitoring networks, health considerations when communicating air quality data, AirNow, non-regulatory measurements from sensors, and a perspective from an EPA regional office. The overview described general themes, including: 

  • Air pollution is a complex mixture of primary (directly emitted) and secondary (formed through atmospheric reactions) pollutants
  • Spatial scale is critical (e.g. hyper-local monitoring platform data at the street level vs. the national ambient monitoring network vs. global satellite imagery)
  • What is measured may change interpretation (consider impacts from local sources)
  • Health effects vary across groups of people and duration of exposure (e.g. older adults, children, and people with pre-existing heart or lung disease are at greater risk from some pollutants; health effects from sub-hourly exposures are unclear)
  • Extreme events such as wildfires are a “stress test” on the communication of air quality due to the complex mixture of pollutants, small-scale variability vs. long-range transport, acute as well as season-long exposures, and messaging for at-risk groups.

EPA explained that the purpose of the monitoring networks is for comparison with the National Ambient Air Quality Standards (NAAQS), development and assessment of emission control strategies, long-term air quality trends, public awareness, and research. NAAQS are set at levels that protect public health and are based on extensive assessments of health studies published in peer-reviewed journals. Consistent with this evidence, the NAAQS are based on the most health-relevant averaging periods (e.g. 8-hour averaging time for ozone and 24-hour averaging time for daily PM2.5 standards).

EPA’s color-coded tool for communicating air quality and health information to the public is called the Air Quality Index (AQI). It is used to communicate both current air quality conditions (calculated using the NowCast algorithm) as well as the forecast for the next day. AirNow is the platform that shows the current, or NowCast, AQI from monitored concentrations delivered hourly by state, local, and federal air agencies. Maps are interpolated surfaces defined by a group of air quality monitors in a “reporting area” that the submitting agency defines. In addition to traditional monitoring data, millions of new data points are being generated by non-regulatory consumer devices or mid-tier research equipment, often referred to as air sensors. Work is ongoing to better understand the data quality, interpretation, and management issues associated with emerging technologies, including answering the frequent question of why regulatory, AirNow, and sensor data indices do not match.
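The two calculations mentioned above can be sketched briefly. In EPA's published PM2.5 method, the NowCast weights the last 12 hourly concentrations by a factor derived from their range (so the average responds faster when conditions change quickly), and the AQI is a piecewise-linear mapping over concentration breakpoints. The sketch below is a minimal illustration, not AirNow's production code; the breakpoint table assumed here is the PM2.5 table in effect at the time of the workshop.

```python
import math

# PM2.5 AQI breakpoints (ug/m3) in effect circa 2019: (C_lo, C_hi, I_lo, I_hi)
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50), (12.1, 35.4, 51, 100), (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200), (150.5, 250.4, 201, 300), (250.5, 500.4, 301, 500),
]

def nowcast_pm(hourly):
    """hourly: last 12 hourly PM2.5 concentrations, most recent first;
    None marks a missing hour. Returns the NowCast concentration."""
    if sum(c is not None for c in hourly[:3]) < 2:
        return None  # need at least 2 of the 3 most recent hours
    valid = [c for c in hourly if c is not None]
    # Weight factor: ratio of min to max over the window, floored at 0.5
    w = max(min(valid) / max(valid), 0.5) if max(valid) > 0 else 1.0
    num = sum(w ** i * c for i, c in enumerate(hourly) if c is not None)
    den = sum(w ** i for i, c in enumerate(hourly) if c is not None)
    return num / den

def pm25_aqi(conc):
    """Map a PM2.5 concentration (ug/m3) to an AQI value."""
    c = math.floor(conc * 10) / 10  # truncate to 0.1 ug/m3
    for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if c_lo <= c <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (c - c_lo) + i_lo)
    return None  # concentration beyond the table
```

For a steady 10 ug/m3 over 12 hours, the NowCast equals the hourly value and the resulting AQI lands in the “Good” range; when concentrations swing rapidly, the weight factor drops and recent hours dominate the average.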

Private Sector -- Private sector presentations from big data companies and sensor manufacturers described the development of global solutions for characterizing forecasted and real-time air quality conditions. Big data platforms are using machine learning and artificial intelligence (AI) to combine large data sets, including regulatory and non-regulatory measurements, satellite data, model outputs, and other relevant big data sets (e.g. meteorology, traffic, health). Specifically, many companies are incorporating air quality information as part of their weather packages, and the private sector is beginning to recognize sensitivities around non-regulatory air quality data that do not typically arise with meteorological data.

The private sector expressed needs to better understand how interpolation of measurement data is carried out and for communication of all air quality pollutants, including hazardous air pollutants and black carbon – not just the criteria pollutants communicated through the AQI. Many of the projects presented by sensor developers and big data providers produce data at the hyper-local scale within seconds to minutes. The purpose of such data is to inform individual or city-wide decisions and inspire action that results in improved environmental literacy and outcomes. Advantages of sensors include high spatial coverage, low-cost instruments, and devices that are easy to deploy and access. However, uncertainty remains in the accuracy and representativeness of the measurements. Addressing the lack of data quality objectives, real-time calibration, and validation methods would be useful.
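To illustrate the interpolation question raised here: one common way to build a surface from point monitors is inverse-distance weighting, in which each monitor's influence falls off with distance from the query point. This is purely an illustrative sketch; the workshop did not specify which methods providers actually use, and production methods vary and may be proprietary.

```python
def idw_interpolate(x, y, monitors, power=2.0):
    """Estimate a concentration at (x, y) from point monitors, each given
    as (mx, my, concentration), using inverse-distance weighting."""
    num = den = 0.0
    for mx, my, conc in monitors:
        d2 = (x - mx) ** 2 + (y - my) ** 2
        if d2 == 0.0:
            return conc  # query point coincides with a monitor
        weight = d2 ** (-power / 2.0)  # i.e. 1 / distance**power
        num += weight * conc
        den += weight
    return num / den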

Local and State Agencies -- Local and state monitoring agencies presented on the operation of their regulatory monitoring networks consistent with EPA Federal Reference Method and Federal Equivalent Method (FRM/FEM) requirements; the development of air quality forecasts, including AQI values for ozone and fine particulate matter (PM2.5); and examples of public confusion over the information they deliver to the public compared to model outputs or sensor products developed by the private sector.

Local and state agencies presented multiple examples of public confusion over data fusion products from trusted weather providers, including:

  • A case in which an indoor sensor triggered a “red” alert at the county level for an area 80 miles away and 5,000 feet higher in elevation, which showed the same AQI as the nearest city,
  • A model output that interpolated hazardous conditions across the state when monitors were down to protect equipment during an extreme weather event,
  • Use of sensor and model data during wildfire events that over- or under-predicted air quality conditions,
  • Difficulty integrating all the different data sources (e.g. different spatial scales, averaging times, calculation methods, forecasts, units, validation criteria) when trying to use new information and measurement technology tools.

Federal Partners -- U.S. federal partners provided perspectives on tools ranging from satellite monitoring to probabilistic forecast models. They described limitations in the use and interpretation of the data but stated that spatial resolution and information technology capabilities are getting better and data is either available or will be available soon. Globally, federal partners expressed a need for air quality information in parts of the world that don’t have traditional air monitoring networks and where pollution is worse. They stated that people look to the U.S. to provide quality information.

Partners are exploring data fusion products that use monitoring, satellite, weather, and other data to improve the international understanding of air pollution levels. Domestically, the impact of wildfire smoke is of great interest but communication with rural communities is difficult and characterizing what’s going on at the source (i.e. fire) is complex. For fires, forecasting tools are being used to look at weather, emissions, and geography data including data from traditional monitoring networks and temporary monitoring instruments. Challenges in communicating air quality to the public include variations of indices, sharing and integration of monitoring data, awareness and training on appropriate use of sensor data, and messaging high risk situations like smoke and hazardous air quality.


Conclusions and Next Steps

Emerging information and measurement technologies provide new tools for storing, processing, and analyzing data and for reaching more people with air quality information. However, large data sets and differing approaches to displaying outputs of varying spatial and temporal scales can confuse the general public. This workshop was designed to start a dialogue on current challenges in using these new technologies for existing applications.

The meeting participants agreed to pursue the following steps: 

  • Work toward the development of standardized air quality terminology  
  • Create a list of air quality websites, apps, and related products including documenting the source of the data and uncertainties in outputs
  • Facilitate webinars on air quality products to better understand information being presented to the public and create a forum for questions and answers
  • Expand the conversation by coordinating sessions at upcoming meetings or conferences