Reusable Python classes for JAWS plus admin scripts to set up and manage a deployment.
- Grab project
  ```
  git clone https://github.com/JeffersonLab/jaws-libp
  cd jaws-libp
  ```
- Launch Compose
  ```
  docker compose up
  ```
- Monitor active alarms
  ```
  docker exec -it cli list_activations --monitor
  ```
- Trip an alarm
  ```
  docker exec cli set_activation alarm1
  ```
Note: The Docker Compose services require significant system resources; they were tested with 4 CPUs and 4GB of memory.
Requires Python 3.9+

```
pip install jaws-libp
```
Note: Using newer versions of Python may be problematic because the dependency confluent-kafka uses librdkafka, which often does not have a pre-built wheel file for the latest Python releases; in that case setuptools attempts to compile it for you, and that often fails (especially on Windows). Python 3.9 does have a wheel file for confluent-kafka, so that's your safest bet. Wheel files are generally only prepared for Windows, macOS, and Linux, only for the x86_64 and arm64 architectures, and only for glibc. If you use musl libc or linux-aarch64 then you'll likely have to compile librdkafka yourself from source.
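For example, here is a minimal sketch of installing into an isolated Python 3.9 virtual environment (assuming a `python3.9` interpreter is already available on your PATH):

```
# Create and activate a Python 3.9 virtual environment, then install jaws-libp
python3.9 -m venv .venv
source .venv/bin/activate
pip install jaws-libp
```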
Environment variables are used to configure jaws-libp:
| Name | Description |
|---|---|
| BOOTSTRAP_SERVERS | Host and port pair pointing to a Kafka server to bootstrap the client connection to a Kafka Cluster; example: `kafka:9092` |
| SCHEMA_REGISTRY | URL to Confluent Schema Registry; example: `http://registry:8081` |
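As a rough illustration, these variables can be exported in the shell before invoking one of the package's console scripts; the host names below simply reuse the examples from the table, and the script name comes from the quick start above:

```
# Point the client at a Kafka broker and Schema Registry, then monitor alarms
export BOOTSTRAP_SERVERS=kafka:9092
export SCHEMA_REGISTRY=http://registry:8081
list_activations --monitor
```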
The Docker container can optionally handle the following environment variables as well:
| Name | Description |
|---|---|
| ALARM_LOCATIONS | Path to an alarm locations file to import (example file), else an https URL to a file, else a comma-separated list of location definitions with fields separated by the pipe symbol. Example Inline CSV: `name\|parent` |
| ALARM_SYSTEMS | Path to an alarm systems file to import (example file), else an https URL to a file, else a comma-separated list of system definitions. Example Inline CSV: `name` |
| ALARM_ACTIONS | Path to an alarm classes file to import (example file), else an https URL to a file, else a comma-separated list of class definitions with fields separated by the pipe symbol. Example Inline CSV: `name\|category\|priority\|rationale\|correctiveaction\|latching\|filterable\|ondelayseconds\|offdelayseconds` |
| ALARMS | Path to an alarm registration instances file to import (example file), else an https URL to a file, else a comma-separated list of instance definitions with fields separated by the pipe symbol. Leave the epicspv field empty for SimpleProducer. Example Inline CSV: `name\|action\|epicspv\|location\|maskedby\|screencommand` |
| ALARMS_URL_CSV | If provided, a comma-separated list of file names to append to ALARMS; ignored if ALARMS doesn't start with https. |
| ALARM_OVERRIDES | Path to an alarm overrides file to import (example file), else an https URL to a file. |
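Here is a hedged sketch of supplying these variables at container start-up with `docker run`; the image name and the CSV URL are assumptions for illustration only, not values confirmed by this README:

```
# Hypothetical example: start the demo container with registration data pulled from a URL
docker run --rm \
  -e BOOTSTRAP_SERVERS=kafka:9092 \
  -e SCHEMA_REGISTRY=http://registry:8081 \
  -e ALARM_LOCATIONS=https://example.com/locations.csv \
  jeffersonlab/jaws-libp   # assumed demo image name
```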
This Python 3.9+ project is built with setuptools and may be run inside a Python virtual environment to isolate dependencies. The pip tool can be used to download dependencies.
```
git clone https://github.com/JeffersonLab/jaws-libp
cd jaws-libp
python -m venv .venv_dev --upgrade-deps
```
Activate the virtual environment using your shell-specific command, then install the project in editable mode with the dev dependencies and run the build:
```
# Windows
.venv_dev\Scripts\activate.bat

# UNIX (SH Shell)
source .venv_dev/bin/activate

# UNIX (CSH Shell)
source .venv_dev/bin/activate.csh

pip install -e ."[dev]"
python -m build
pylint --recursive=y src/*
```
Note for JLab On-Site Users: Jefferson Lab has an intercepting proxy
Set up the build environment following the Build instructions.
To iterate rapidly when making changes it's often useful to run the Python scripts directly on the local workstation, perhaps leveraging an IDE. In this scenario, run the service dependencies with Docker Compose:
```
docker compose -f deps.yaml up
```
Note: The environment variable defaults work in this scenario and are defined as `BOOTSTRAP_SERVERS=localhost:9094` and `SCHEMA_REGISTRY=http://localhost:8081`.
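So, as a quick sketch, a console script from the quick start can be run directly against the local dependency containers without exporting anything:

```
# deps.yaml containers expose Kafka on localhost:9094, which matches the defaults,
# so no environment variables need to be set
list_activations --monitor
```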
The integration tests depend on a running Kafka instance, generally in Docker. The tests run automatically via the CI GitHub Action on every commit (unless [no ci] is included in the commit message). The tests can be run locally during development. Set up the development environment following the Develop instructions. Then, with the deps.yaml Docker containers running and the build virtual environment activated, run:

```
pytest
```
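For quicker iteration, standard pytest selection flags can narrow the run; the keyword below is a hypothetical example rather than a test name from this repository:

```
# Stop at the first failure and run only tests whose names match the keyword
pytest -x -k "activation"
```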
- Bump the version number in the VERSION file and commit and push to GitHub (using Semantic Versioning).
- The CD GitHub Action should run automatically, invoking:
  - The Create release GitHub Action to tag the source and create release notes summarizing any pull requests. Edit the release notes to add any missing details.
  - The Publish artifact GitHub Action to create a deployment artifact on PyPI.
  - The Publish docs GitHub Action to create Sphinx docs.
  - The Publish docker image GitHub Action to create a new demo Docker image.