Grady - will correct you!
The intention of this tool is to simplify the exam correction process at the University of Goettingen. It is deployed as a web application consisting of a Django REST backend and a Vue.js frontend.
Overview
Grady has basic functions for each of the three types of users:
- Reviewers can
  - edit feedback that has been provided by tutors
  - mark feedback as final if it should not be modified (only final feedback is shown to students)
  - delete feedback (the submission will be reassigned)
- Tutors can
  - request a submission that they have to correct and submit feedback for it
  - delete their own feedback
  - review feedback of other tutors
  - they do not see which student submitted the solution
- Students can
  - review their final feedback and score in the post exam review
An overview of the database can be found in the docs folder.
Contributing
Feature proposals are welcome! If you experience any bugs or otherwise unexpected behavior, please submit an issue using the issue templates.
It is of course possible to contribute, but currently there is no standardized way to do so since the project is at a very early stage and fairly small. If you feel the need to help us out anyway, please contact us via our university email addresses.
Development
Dependencies
Make sure the following packages and tools are installed:
- Python 3.6
- Docker or a local installation of Postgres
- npm or yarn (you can use npm to install yarn)
- make
These are required to set up the project. All other application dependencies are
listed in the requirements.txt
and the package.json
files. These will be
installed automatically during the installation process.
Installing
To set up a new development instance, perform the following steps:
- Create a virtual environment with a Python 3.6 interpreter and activate it. It works like this:

      make .venv
      source .venv/bin/activate

  Just type deactivate to get out.
- Set the environment variable DJANGO_DEV to True like this:

      export DJANGO_DEV=True
- Install backend dependencies with:

      make install
- Set up a Postgres 9.5 database. If you have Docker installed, the easiest way is to run it in a container like this:

      docker run -d --rm --name postgres -p 5432:5432 postgres:9.5

  Alternatively, take a look at the Makefile targets that should make your life easier, e.g. make db. Once the database is up, apply the database migrations:

      python manage.py migrate
- Create a superuser if necessary:

      python manage.py createsuperuser

  More users can be added in the admin interface. You should be able to reach it via http://localhost:8000/admin.
- Everything is set. You can start the development server with:

      make run
- Congratulations! Your backend should now be up and running. To set up the frontend, see the README in the frontend folder; a rough sketch of a typical workflow follows below.
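For orientation only, a typical Vue.js development workflow might look like the following. The actual commands and script names are defined in the frontend README and in frontend/package.json, so treat this as an assumption rather than the project's documented setup:

cd frontend
yarn install   # or npm install; installs the dependencies from package.json
yarn serve     # assumed name of the dev-server script; check package.json for the real one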
Testing
"Code without tests is broken by design." -- (Jacob Kaplan-Moss, Django core developer)
While we try to keep test coverage as high as possible, there are still some parts that lack tests. If you contribute to the project, please always make sure to run the full test suite. You can run the backend tests with:

make test

or, if you want a coverage report as well:

make coverage
The frontend is tested with unit and end-to-end tests, with the focus more on end-to-end than unit testing. The end-to-end tests use Selenium in combination with Mozilla's geckodriver. To run the end-to-end tests, run the following command:

make teste2e

Notice that this will always issue a complete rebuild of the frontend. If you want to run the tests without building the frontend anew, use:

make teste2e-nc
Production
In order to run the app in production, a server with
Docker is needed. To make routing to the
respective instances easier, we recommend running traefik
as a reverse proxy on the server. For easier configuration of the containers
we recommend using docker-compose
. The following guide will assume both these
dependencies are available.
Setting up a new instance
Simply copy the following docker-compose.yml
onto your production server:
version: "3"
services:
postgres:
image: postgres:9.6
labels:
traefik.enable: "false"
networks:
- internal
volumes:
- ./database:/var/lib/postgresql/data
grady:
image: docker.gitlab.gwdg.de/j.michal/grady:master
restart: always
entrypoint:
- ./deploy.sh
volumes:
- ./secret:/code/secret
environment:
GRADY_INSTANCE: ${INSTANCE}
SCRIPT_NAME: ${URLPATH}
networks:
- internal
- proxy
labels:
traefik.backend: ${INSTANCE}
traefik.enable: "true"
traefik.frontend.rule: Host:${GRADY_HOST};PathPrefix:${URLPATH}
traefik.docker.network: proxy
traefik.port: "8000"
depends_on:
- postgres
networks:
proxy:
external: true
internal:
external: false
and set the INSTANCE, URLPATH and GRADY_HOST variables either directly in the compose file or within an .env file in the same directory as the docker-compose.yml (it will be loaded automatically by docker-compose).
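For example, an .env file for a hypothetical instance served at https://grady.example.com/exam1 might look like this (the values are placeholders, not settings taken from a real deployment):

INSTANCE=exam1
URLPATH=/exam1
GRADY_HOST=grady.example.com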
Log in to the GWDG GitLab Docker registry by entering:

docker login docker.gitlab.gwdg.de
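The compose file above also references an external Docker network named proxy that the reverse proxy is attached to. If traefik is not already running on the server, a rough sketch of setting it up could look like the following; the traefik 1.x flags are an assumption based on the v1-style labels used above, so adjust version and options to your actual setup:

docker network create proxy
docker run -d --name traefik --network proxy -p 80:80 \
  -v /var/run/docker.sock:/var/run/docker.sock \
  traefik:1.7 --docker --docker.exposedbydefault=false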
Running
docker-compose pull
docker-compose up -d
will download the latest postgres and grady images and run them in the background.
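To check that both services came up, the usual docker-compose commands can be used (service names as defined in the compose file above):

docker-compose ps
docker-compose logs -f grady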
Importing exam data
Exam data structure
In order to import the exam data it must be in a specific format.
You need the following:
- A .json file containing the output of the converted ILIAS export, which is generated by hektor
- A plain text file containing one username per line. A new reviewer account will be created for each username with a randomly generated password. The passwords are written to a .importer_passwords file.
This step should not be skipped because a reviewer account is necessary in order to activate the tutor accounts.
Importing exam data
In order to create reviewer accounts, open an interactive shell session in the running container:
$ docker exec -it <container_id> /bin/sh
While in the shell, create a new file containing one username per line:
$ echo "user1\nuser2" > reviewers
After creating the file, call the importer script:
$ python manage.py importer
Keep in mind that you can import exam data in two ways. You can either import the .json file using the importer or you can use the frontend to import data in a more user-friendly way. In either case, you will have to use the importer in order to create the reviewer accounts.
When logging in to an instance that has no data imported you will automatically be prompted to import some data. If you are on an instance that already has data, you can find the import dialog in the dropdown menu next to the logout button. In the import dialog, simply select the .json file and upload it. This procedure may take a while depending on the file size.