
A novel application combines smartphone hardware with crowdsourcing to let phone users help monitor air quality.
By downloading the Visibility app, which runs on smartphones using the Android system and soon will be widely available through Android app sources (an iPhone version is in the works), the user takes a picture of the sky while the sun is shining. The app then compares the image to established models of sky luminance to estimate visibility, according to researchers at the University of Southern California Viterbi School of Engineering. Visibility relates directly to the concentration of harmful "haze aerosols," tiny airborne particles from dust, engine exhaust, mining and other sources. Such aerosols turn the blue of a sunlit clear sky gray.
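The link between visibility and aerosol concentration can be illustrated with Koschmieder's rule, a standard atmospheric-optics relation (the app's own luminance models are more sophisticated; this sketch and its function name are illustrative only):

```python
def visual_range_km(extinction_per_km: float) -> float:
    """Koschmieder's rule: for a ~2% contrast threshold, visual range
    is roughly 3.912 divided by the atmospheric extinction coefficient.
    More haze aerosols -> larger extinction -> shorter visibility."""
    return 3.912 / extinction_per_km

print(visual_range_km(0.1))  # very clean air: roughly 39 km of visibility
print(visual_range_km(1.0))  # hazy air: visibility drops to about 4 km
```

Because extinction grows with aerosol load, an estimate of visibility doubles as a rough proxy for particulate pollution.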
The basic principle of the Visibility app is simple, according to a paper documenting the work by USC computer science professor Gaurav Sukhatme.

Smartphone app allows users to test for air pollution. Top: the opening screen; middle: the app directing the user; bottom: ready to report.


There is one caveat: It has to be the right picture. The researchers base the visibility/pollution models on the viewing geometry of the image and the position of the sun.
The Visibility app works because modern smartphones contain a rich set of sensors that include cameras, GPS systems, compasses and accelerometers, in addition to the powerful communication capabilities that are inspiring a slew of intelligent phone applications.
Sameera Poduri, a postdoctoral researcher in Sukhatme's lab, said the accelerometer in the phone (the sensor that detects how the user is holding the phone and determines whether it displays information vertically or horizontally) can "guide the user to point the camera in exactly the right direction."
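The idea behind accelerometer-guided aiming is that, when the phone is held still, the accelerometer measures gravity projected onto the phone's axes, which reveals the tilt of the camera. A minimal sketch (the function name and axis convention are assumptions, not the app's actual code):

```python
import math

def camera_pitch_deg(ax: float, ay: float, az: float) -> float:
    """Estimate camera pitch (degrees above horizontal) from raw
    accelerometer readings in m/s^2, assuming the phone is held still
    so gravity dominates the signal and the camera looks along -z."""
    # Gravity projects onto the phone's axes; the z component relative
    # to the total magnitude gives the elevation of the optical axis.
    g = math.sqrt(ax * ax + ay * ay + az * az)
    return math.degrees(math.asin(max(-1.0, min(1.0, az / g))))

# A phone lying flat on its back (camera at the zenith) reads ~(0, 0, 9.81);
# a phone held upright (camera at the horizon) reads ~(0, 9.81, 0).
print(camera_pitch_deg(0.0, 0.0, 9.81))  # 90.0 (pointing straight up)
print(camera_pitch_deg(0.0, 9.81, 0.0))  # 0.0 (pointing at the horizon)
```

By comparing this live pitch (plus the compass heading) against the desired viewing geometry, the app can prompt the user to tilt or turn until the camera is aimed correctly.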
The picture must be all or mostly sky, which makes human judgment a critical part of the process.
“Several computer vision problems that are extremely challenging to automate are trivially solved by a human,” according to the paper. “In our system, segmenting sky pixels in an arbitrary image is one such problem. When the user captures an image, we ask him [or her] to select a part of the image that is sky.”
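Once the user has marked a sky region, the system only needs to read pixel statistics inside it. A minimal sketch of that step, assuming a grayscale image as rows of 0–255 values and a user-drawn rectangle (both the representation and the function name are illustrative):

```python
def mean_sky_luminance(image, sky_box):
    """Average luminance over a user-selected sky region.

    `image` is a list of rows of grayscale values (0-255); `sky_box` is
    (top, left, bottom, right), half-open, as drawn by the user. The human
    does the hard part (deciding what is sky); the code just averages.
    """
    top, left, bottom, right = sky_box
    pixels = [image[r][c] for r in range(top, bottom)
                          for c in range(left, right)]
    return sum(pixels) / len(pixels)

# Toy 3x3 image: bright sky in the upper-left 2x2, dark ground elsewhere.
image = [[200, 200, 10],
         [200, 200, 10],
         [5,   5,   5]]
print(mean_sky_luminance(image, (0, 0, 2, 2)))  # 200.0
```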
The accelerometers and the compass on the phone capture its orientation in three dimensions, while the GPS data and time are used to compute the exact position of the sun. The application automatically computes the camera and solar orientation, uploading this data, along with the image (a small, 100KB black-and-white file), to a central computer. The central computer analyzes the image to estimate pollutant content and returns a message to the user, as well as registering the information (user identities are kept anonymous).
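Computing the sun's position from GPS coordinates and a timestamp is a standard astronomical calculation. A simplified sketch of the elevation part, using a declination/hour-angle model accurate to within a degree or so (the system presumably uses a more precise ephemeris; the function name and the neglect of the equation of time are simplifying assumptions):

```python
import math
from datetime import datetime, timezone

def solar_elevation_deg(lat_deg: float, lon_deg: float, when: datetime) -> float:
    """Approximate solar elevation angle from latitude, longitude,
    and a UTC timestamp, via solar declination and the hour angle."""
    day = when.timetuple().tm_yday
    # Approximate solar declination for this day of the year (degrees).
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day + 10)))
    # Local solar time from UTC and longitude (15 degrees per hour),
    # ignoring the equation of time for simplicity.
    utc_hours = when.hour + when.minute / 60.0 + when.second / 3600.0
    solar_time = utc_hours + lon_deg / 15.0
    hour_angle = math.radians(15.0 * (solar_time - 12.0))
    lat, d = math.radians(lat_deg), math.radians(decl)
    sin_elev = (math.sin(lat) * math.sin(d)
                + math.cos(lat) * math.cos(d) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_elev))

# Los Angeles (34.05 N, 118.24 W) near local solar noon at the June solstice:
noon_utc = datetime(2024, 6, 21, 20, 0, tzinfo=timezone.utc)
print(round(solar_elevation_deg(34.05, -118.24, noon_utc), 1))  # ~79 degrees
```

With the sun's position and the camera's orientation both known, the server can place every sky pixel in the correct viewing geometry before comparing the image against the luminance models.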
The system potentially can help fill in the many blanks in the existing maps of air pollution. Conventional air pollution monitors are expensive and thinly deployed, Sukhatme said. There are a few California counties that have no monitors at all. Even Los Angeles County has only a few.
The system has been tested in several locations, including Los Angeles (on a rooftop at USC and in Altadena) and in Phoenix, Arizona. The USC rooftop camera has a built-in "ground truth" test: it is near a conventional air pollution monitoring station.
So far the results are promising, but they indicate that several improvements are possible.
“We’re sure we can improve it if we get people trying it and testing it and sending data,” Sukhatme said.
The Visibility application is now available for download.
