airQ

airQ - Air quality monitoring data collection system ( for India ), written in Python 3.


Keywords
air-quality, airquality, airqualitymonitoringsystem, india, pollution-levels, python3
License
MIT
Install
pip install airQ==0.3.3

Documentation

airQ v0.3.3

A near real-time Air Quality Indication Data Collection Service ( for India ), made with ❤️

Consider putting a ⭐️ to show love & support

Companion repo: airQ-insight, which powers the visualization

what does it do ?

  • Air quality data collector, collecting from 180+ ground monitoring stations spread across India
  • An unreliable JSON dataset is fetched from here, giving the current hour's pollutant statistics from all monitoring stations across India; these records are then objectified, cleaned, processed & restructured into a proper format and pushed into a *.json file
  • Air quality data is recorded as the minimum, maximum & average presence of pollutants such as PM2.5, PM10, CO, NH3, SO2, OZONE & NO2, along with a timeStamp, grouped under the stations from where they were collected ( see the sketch after this list )
  • Automated hourly data collection using systemd
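
Below is a minimal sketch of how such a collected *.json could be read back & summarized. It's only an illustration: the per-station grouping & the min / max / avg / timeStamp fields follow the description above, but the exact key names are assumptions, so inspect your own sink file for the actual schema.

# summarize.py - quick per-station summary of an airQ-collected sink file
# NOTE: the layout assumed here ( station -> list of hourly records, each
# holding min / max / avg per pollutant plus a timeStamp ) is an assumption
# based on this README, not a guaranteed schema
import json

def summarize(sink_path, pollutant="pm2.5"):
    with open(sink_path) as fp:
        data = json.load(fp)
    for station, records in data.items():
        if not records:
            continue
        latest = records[-1]  # assuming records are appended chronologically
        stats = latest.get(pollutant, {})
        print("{}: {} min={} max={} avg={} @ {}".format(
            station, pollutant, stats.get("min"), stats.get("max"),
            stats.get("avg"), latest.get("timeStamp")))

if __name__ == "__main__":
    summarize("./data/data.json")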

installation

airQ can easily be installed from PyPI using pip.

$ pip install airQ --user # or maybe use pip3
$ python3 -m pip install airQ --user # if previous one doesn't work

usage

After installing airQ, run it using the following command

$ cd # currently at $HOME
$ airQ # improper invocation
airQ - Air Quality Data Collector

	$ airQ `sink-file-path_( *.json )_`

 For making modifications on airQ-collected data
 ( collected prior to this run ),
 pass that JSON path, while invoking airQ ;)

Bad Input
$ airQ ./data/data.json # proper invocation
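
As the help text above says, passing an already-collected JSON path lets airQ extend that dataset instead of starting from scratch. Conceptually, the update step looks roughly like the sketch below; it's a simplified illustration, not airQ's actual implementation, and fresh_records ( this hour's station-wise records ) is a hypothetical input.

# update_sink.py - conceptual sketch of merging this hour's records into the sink file
# NOTE: fresh_records ( station -> record ) is a hypothetical input; airQ's real
# internals and field names may differ
import json
import os

def update_sink(sink_path, fresh_records):
    dataset = {}
    if os.path.exists(sink_path):
        # reuse previously collected data, if the sink file already exists
        with open(sink_path) as fp:
            dataset = json.load(fp)
    # append this hour's record under each station's name
    for station, record in fresh_records.items():
        dataset.setdefault(station, []).append(record)
    # write the refreshed dataset back to the sink file
    with open(sink_path, "w") as fp:
        json.dump(dataset, fp, indent=2)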

automation

  • Well, my plan was to automate this data collection service, so that it keeps running hourly & keeps refreshing the dataset
  • For that, I've used systemd, which uses a systemd.timer to trigger execution of airQ every hour, i.e. after a delay of 1h counted from the last execution of airQ
  • This requires adding two files, a *.service & a *.timer ( placed in ./systemd/ )

airQ.service

Our service isn't supposed to run all the time; it runs only when the timer triggers it. So in the [Unit] section, it declares that it Wants airQ.timer

[Unit]
Description=Air Quality Data collection service
Wants=airQ.timer

You need to set the absolute path of the current working directory in the WorkingDirectory field of the [Service] unit declaration

ExecStart is the command to be executed when this service unit is invoked by airQ.timer, so the absolute installation path of airQ and the absolute sink path ( *.json ) are required

Make sure you update the User field to reflect your system.

If you add a Restart field under the [Service] unit and give it the value always, the service can be kept running continuously, which is helpful for servers. Here, though, execution is triggered by a systemd.timer, pretty much like cron, but natively supported on almost all systemd-based Linux distros

[Service]
User=anjan
WorkingDirectory=/absolute-path-to-current-working-directory/
ExecStart=/absolute-path-to-airQ /home/user/data/data.json

This declaration makes this service a dependency of multi-user.target, so that it can be enabled

[Install]
WantedBy=multi-user.target

airQ.timer

Pretty much the same as airQ.service, except it Requires airQ.service as a strong dependency, because that's the service to be run when this timer expires

[Unit]
Description=Air Quality Data collection service
Requires=airQ.service

The Unit field specifies which service unit to activate when the timer expires. You can skip this field if the ./systemd/*.service file has the same name as the ./systemd/*.timer

As we're interested in running this service every 1h ( relative to the last activation of airQ.service ), we've set the OnUnitActiveSec field to 1h

[Timer]
Unit=airQ.service
OnUnitActiveSec=1h

Makes it a dependency of timers.target, so that this timer can be enabled

[Install]
WantedBy=timers.target

automation in ACTION

Place the files under ./systemd/ into /etc/systemd/system/, so that systemd can find the service & timer units.

$ sudo cp ./systemd/* /etc/systemd/system/

Reload the systemd daemon, so that it picks up the newly added service & timer units.

$ sudo systemctl daemon-reload

Let's enable our timer, which ensures it will be started automatically even after a system reboot

$ sudo systemctl enable airQ.timer

Time to start this timer

$ sudo systemctl start airQ.timer

This triggers an immediate execution of our service; after it completes, it'll be executed again 1h later, so that we get a refreshed dataset.

Check status of this timer

$ sudo systemctl status airQ.timer

Check status of this service

$ sudo systemctl status airQ.service

Consider running your instance of airQ in the cloud; mine is running on AWS LightSail

visualization

This service only collects data & structures it properly; the visualization part is done at airQ-insight

Hoping it helps 😉