# Statistics GUI for bwLehrpool

This software provides a GUI dashboard for bwLehrpool statistics. It consists of an import script, which reads the JSON reports and puts them into a database, and the dashboard itself, written with Dash.

## Setup

I recommend installing the software in a fresh Python 3 virtualenv. This requires the Python virtualenv package; on Ubuntu or Debian it can be installed from the package `python3-venv`.

Create a new virtualenv and activate it:

```bash
$ python3 -m venv {VENVNAME}
$ cd {VENVNAME}
$ source bin/activate
```

Then clone the git repository into the virtualenv and install all requirements:

```bash
$ git clone https://git.openslx.org/bwlp/bwlp-statistics.git/
$ cd bwlp-statistics
$ pip3 install -r requirements.txt
```

### Database initialization

This software requires an empty MySQL database. Once it has been created, import the database structure from `db_structure.sql`.

### Configuration

Create a config file for the database importer in `import/config.ini` as follows:

```ini
[db]
host=mysqlhost
username=mysqlusername
password=mysqlpassword
name=mysqldbname
```

Do the same for the dashboard in `dash/config.ini`; the config file has the exact same structure.

---

**Note**: For improved security, the importer should have a DB user with full privileges, whereas the dashboard only requires read access. However, if both of them run on the same machine under the same user account, this is not much of an advantage: an attacker exploiting the dashboard would most likely have filesystem access and could simply read the importer's config file. Therefore, at the very least the two parts should be separated into different user accounts.

---

## Usage

### Importer

The import script imports one statistics file at a time. It must be run with its own folder as the working directory and is called as follows:

```bash
$ python3 import.py report_file.json
```

Be sure to either activate the venv before running this command (as shown above) or use the explicit path to `python3` in the `bin` directory of the venv.

Since the timestamp and IP address are parsed from the filename, it must have the format used in the example datasets.

In a real-world deployment one would write a shell script that calls the importer for every file in a directory. The importer detects whether a report has already been imported and skips it in that case. The script could therefore run every week on all reports from the current and the previous year, although more advanced logic would be possible here. There is also a sample script, `import-batch.sh`, which takes one folder as its argument. Its working directory must also be the directory of the script.

### Dashboard

The dashboard can be run using systemd with the following unit file:

```ini
[Unit]
Description=BW Lehrpool Statistics

[Service]
Type=simple
ExecStart={VENVPATH}/{VENVNAME}/bin/gunicorn -w 4 -b 0.0.0.0:8080 index:server
User={User to run as}
Group={Group to run as}
WorkingDirectory={VENVPATH}/{VENVNAME}/bwlp-statistics/dash

[Install]
WantedBy=multi-user.target
```

In `ExecStart`, the Gunicorn option `-w` sets the number of workers; 2-4 workers per available CPU core is a reasonable choice. The dashboard is then available on port 8080. Putting a reverse proxy in front of the application is supported.
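
For example, nginx can be used as such a reverse proxy. The following is only a minimal sketch, assuming nginx runs on the same machine and uses the hypothetical hostname `stats.example.org`; adjust names and paths to your environment.

```nginx
# Minimal sketch of an nginx site config (hostname is a placeholder).
server {
    listen 80;
    server_name stats.example.org;

    location / {
        # Forward all requests to the Gunicorn workers started by the unit file above.
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

If you go this route, it is usually a good idea to change the Gunicorn bind address in the unit file from `0.0.0.0:8080` to `127.0.0.1:8080`, so that the dashboard is only reachable through the proxy.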