Technology and analytics for unprecedented insights
New, cost-effective sensing technologies and data analytics can identify areas of dirty air that a sparse network of traditional monitors might miss, filling in gaps where modeled data isn’t available and painting a more holistic picture of air pollution sources and impacts.
There is a lot of momentum to fill in data gaps on air pollution, but we still need better technologies and more advanced analytics strategies for mobile mapping.
What customers are looking for in monitoring sensors and instruments
With a growing number of lower-cost air pollution sensors and sensor systems continuing to come into the market, regulatory agencies and researchers have been evaluating their performance. We encourage innovators to become familiar with these evaluations (see section 2 of the guide, “Nuts & Bolts”), which offers resources and key criteria for consideration.
Air pollutants have different physical properties, as well as different health impacts, which have implications for how best to measure and map them. Customers will select a monitoring instrument appropriate for their specific air pollution needs and goals – the pollutants that need to be measured and the data quality required for their monitoring objectives. Some may also combine different kinds of monitoring in sequence – broad-based mobile monitoring, short-term stationary monitoring, or fixed sensors (see page 26 of the guide for more potential use cases).
When developing monitoring sensors or instruments, innovators must be familiar with the key performance indicators under consideration, as well as the operating parameters of the instrument. At times, customers may want to measure more than one pollutant species, in which case a customized package of sensors will be requested.
Innovators should also be familiar with the costs associated with a typical program at scale, to determine whether their costs fall above or below the average. For cost ranges, see the AQ-SPEC resource center, which compiles cost estimates for a wide range of PM and gaseous sensor systems, as well as EDF’s cost of hyperlocal mapping tool (see page 33 of the guide for more information on costs).
It’s important for tech providers to understand the entire data management process, from collection to final delivery, and what role they play at each step. Ensuring a smooth transfer of data to storage and onward to its final data platform is critical. In some cases, customers may want data management included in the service package (see page 37 of the guide for considerations and potential pitfalls).
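One way to picture the handoff from sensor to storage is a minimal serialization step that captures each reading, with units and a UTC timestamp, before any downstream processing touches it. The sketch below is illustrative only: the record layout, field names, and `SensorReading` type are assumptions, not any vendor's actual API.

```python
"""Minimal sketch of the sensor-to-storage leg of a data pipeline.
The record layout and names here are illustrative assumptions."""
import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone


@dataclass
class SensorReading:
    sensor_id: str
    pollutant: str   # e.g. "PM2.5", "NO2"
    value: float     # concentration in the sensor's native units
    units: str
    timestamp: str   # ISO 8601, UTC


def to_storage_record(reading: SensorReading) -> str:
    # Serialize each reading to one line of JSON so raw data lands
    # in durable storage before any cleaning or aggregation.
    return json.dumps(asdict(reading))


reading = SensorReading(
    sensor_id="unit-007",
    pollutant="PM2.5",
    value=12.4,
    units="ug/m3",
    timestamp=datetime(2023, 1, 1, tzinfo=timezone.utc).isoformat(),
)
record = to_storage_record(reading)
```

Keeping units and timezone explicit in every record is one way to avoid ambiguity when the data later moves to its final platform.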
On top of that, customers will need to establish robust quality assurance and quality control (QA/QC) procedures for cleaning data. To ensure transparency, they may request access to all raw and processed data so others can trace the path from raw to final. The Code Repository can be used to help ensure efficient data processing (see page 39 of the guide for more information on quality assurance and quality control). Once data is collected and rendered into a map, it’s important to be familiar with what is required to analyze it (see page 41 of the guide for a 4-part analytical approach).
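A common QA/QC pattern that preserves the raw-to-final trail is to flag suspect readings rather than delete them, so both the raw values and the reasons for exclusion stay auditable. The sketch below assumes a simple plausibility range for PM2.5; the thresholds and field names are illustrative, not regulatory values.

```python
"""Sketch of a QA/QC step: flag rather than delete suspect readings,
so the path from raw to final data stays traceable.
Thresholds are illustrative assumptions, not regulatory limits."""

RAW = [
    {"timestamp": "2023-01-01T00:00:00Z", "pm25": 12.4},
    {"timestamp": "2023-01-01T01:00:00Z", "pm25": -3.0},    # negative: sensor fault
    {"timestamp": "2023-01-01T02:00:00Z", "pm25": 1500.0},  # implausible spike
    {"timestamp": "2023-01-01T03:00:00Z", "pm25": 15.1},
]


def qc_flag(row: dict, lo: float = 0.0, hi: float = 1000.0) -> dict:
    """Return a copy of the row with a QC flag; never mutate raw data."""
    flagged = dict(row)
    flagged["qc_flag"] = "valid" if lo <= row["pm25"] <= hi else "out_of_range"
    return flagged


# Processed dataset keeps every row, flagged; the final dataset filters,
# so a reviewer can reproduce the filter from raw + flags alone.
processed = [qc_flag(r) for r in RAW]
final = [r for r in processed if r["qc_flag"] == "valid"]
```

Publishing both `processed` (all rows with flags) and `final` (valid rows only) is one way to give customers the transparency described above.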