# Grady - will correct you!
The intention of this tool is to simplify the exam correcting process at the University of Goettingen. It is deployed as a web application consisting of a Django-Rest backend and a Vue.js frontend.
## Overview
Grady has three basic functions for the three types of users:
- Reviewers can
  - edit feedback that has been provided by tutors
  - mark feedback as final if it should not be modified (only final feedback is shown to students)
  - delete feedback (the submission will be reassigned)
- Tutors can
  - request a submission that they have to correct and submit feedback for it
  - delete their own feedback
  - review feedback of other tutors
    - they do not see which student submitted the solution
- Students can
  - review their final feedback and score in the post exam review
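The role rules above can be sketched as a simple permission map. This is a hypothetical illustration only; the names below are not grady's actual data model or API:

```python
# Hypothetical sketch of the role rules described above -- illustrative only,
# not grady's actual implementation.
ROLE_PERMISSIONS = {
    "reviewer": {"edit_feedback", "finalize_feedback", "delete_feedback"},
    "tutor": {"request_submission", "submit_feedback",
              "delete_own_feedback", "review_feedback"},
    "student": {"view_final_feedback"},
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in ROLE_PERMISSIONS.get(role, set())
```

In the real application these checks are enforced server-side per request; the map above only summarizes the intent of the list.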
An overview of the database can be found in the docs folder.
## Contributing
Feature proposals are welcome! If you experienced any bugs or otherwise unexpected behavior please submit an issue using the issue templates.
It is of course possible to contribute, but currently there is no standardized way to do so, since the project is at a very early stage and fairly small. If you feel the need to help us out anyway, please contact us via our university email addresses.
## Development

### Dependencies
Make sure the following packages and tools are installed:
- Python 3.6
- Pipenv
- Docker or a local installation of Postgres
- npm or yarn (you can use npm to install yarn)
- make
These are required to set up the project. All other application dependencies are listed in the Pipfile and will be installed automatically during the installation process.
### Installing
To set up a new development instance perform the following steps:
- Create a virtual environment with a Python 3.6 interpreter and install all relevant dependencies:

  ```shell
  pipenv install --dev
  ```
- Set the environment variable `DJANGO_DEV` to `True` like this:

  ```shell
  export DJANGO_DEV=True
  ```
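A settings module can branch on this variable roughly as follows. This is a minimal sketch assuming the flag is read as a string from the environment; the actual grady settings may differ:

```python
import os

# Hypothetical helper -- the real settings module may read the flag differently.
def is_dev(environ=os.environ):
    """Return True when DJANGO_DEV is set to a truthy string like "True"."""
    return environ.get("DJANGO_DEV", "").lower() in ("1", "true", "yes")

# A settings module could then toggle debug behaviour:
DEBUG = is_dev()
```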
- Enter a shell in the virtual environment:

  ```shell
  pipenv shell
  ```
- Set up a Postgres 13 database. If you have Docker installed, the easiest way is to run it in a container. Note that recent Postgres images refuse to start without a superuser password, so set one that matches your database settings:

  ```shell
  docker run -d --rm --name postgres -e POSTGRES_PASSWORD=postgres -p 5432:5432 postgres:13
  ```

  Alternatively, take a look at the Makefile targets that should make your life easier, e.g. `make db`.

  Once the database is up, apply the migrations:

  ```shell
  python manage.py migrate
  ```
- Create a superuser if necessary:

  ```shell
  python manage.py createsuperuser
  ```
More users can be added in the admin interface. You should be able to reach it via http://localhost:8000/admin.
- Everything is set. You can start the development server with:

  ```shell
  python manage.py runserver
  ```
- Congratulations! Your backend should now be up and running. To set up the frontend, see the README in the frontend folder.
### Testing
> "Code without tests is broken by design." -- Jacob Kaplan-Moss, Django core developer
Well, currently this repository lacks tests, that's true. But that will change, as the work until now is merely a prototype that will be developed further. However, the few existing tests can serve as examples; they can be found in the tests.py file of each app (currently only core). You can run those tests with:

```shell
make test
```
or, if you want a coverage report as well, run:

```shell
make coverage
```
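For orientation, a unit test in an app's tests.py might look roughly like the sketch below. Real tests in a Django app would typically subclass django.test.TestCase; plain unittest is used here so the sketch runs standalone, and the `clamp_score` helper is purely hypothetical, not part of grady:

```python
import unittest

def clamp_score(score, max_score):
    """Hypothetical helper: keep a feedback score within [0, max_score]."""
    return min(max(score, 0), max_score)

class FeedbackScoreTest(unittest.TestCase):
    def test_score_is_clamped_to_valid_range(self):
        self.assertEqual(clamp_score(-5, 10), 0)
        self.assertEqual(clamp_score(15, 10), 10)
        self.assertEqual(clamp_score(7, 10), 7)
```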
If you'd like to run the functional tests, simply run:

```shell
make teste2e path=functional_tests
```

or, for headless mode (note: you might need to install additional dependencies):

```shell
make teste2e path=functional_tests headless=True
```

Notice that `make teste2e` will always issue a complete rebuild of the frontend. If you want to run tests without building the frontend anew, use:

```shell
make teste2e-nc
```
## Production
In order to run the app in production, a server with Docker is needed. To make routing to the respective instances easier, we recommend running traefik as a reverse proxy on the server. For easier configuration of the containers we recommend using docker-compose. The following guide will assume both these dependencies are available.
### Setting up a new instance
Simply copy the following `docker-compose.yml` onto your production server:
```yaml
version: "3"

services:
  postgres:
    image: postgres:13
    # NOTE: recent postgres images refuse to start without a superuser
    # password; set POSTGRES_PASSWORD here to match the grady configuration.
    labels:
      traefik.enable: "false"
    networks:
      - internal
    volumes:
      - ./database:/var/lib/postgresql/data

  grady:
    image: docker.gitlab.gwdg.de/j.michal/grady:master
    restart: always
    entrypoint:
      - ./deploy.sh
    volumes:
      - ./secret:/code/secret
    environment:
      GRADY_INSTANCE: ${INSTANCE}
      SCRIPT_NAME: ${URLPATH}
    networks:
      - internal
      - proxy
    labels:
      traefik.backend: ${INSTANCE}
      traefik.enable: "true"
      traefik.frontend.rule: Host:${GRADY_HOST};PathPrefix:${URLPATH}
      traefik.docker.network: proxy
      traefik.port: "8000"
    depends_on:
      - postgres

networks:
  proxy:
    external: true
  internal:
    external: false
```
and set the `INSTANCE`, `URLPATH`, and `GRADY_HOST` variables either directly in the compose file or within an `.env` file in the same directory as the `docker-compose.yml` (it will be loaded automatically by docker-compose).
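For example, an `.env` file could look like this; all values below are placeholders, not defaults shipped with grady:

```shell
INSTANCE=exam-ws2021
URLPATH=/exam-ws2021
GRADY_HOST=grady.example.com
```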
Log in to the GWDG GitLab Docker registry by entering:

```shell
docker login docker.gitlab.gwdg.de
```
### Running
```shell
docker-compose pull
docker-compose up -d
```

This will download the latest postgres and grady images and run them in the background.
### Importing exam data
#### Exam data structure
In order to import the exam data it must be in a specific format. You need the following:
- A .json file containing the output of the converted ILIAS export, which is generated by hektor
- A plain text file containing one username per line. A new reviewer account will be created for each username with a randomly generated password. The passwords are written to a `.importer_passwords` file. This step should not be skipped, because a reviewer account is necessary in order to activate the tutor accounts.
#### Importing the data
In order to create reviewer accounts, open an interactive shell session in the running container:

```shell
$ docker exec -it <container_id> /bin/sh
```
While in the shell, create a new file containing one username per line. printf is preferable to echo here, since echo's handling of \n escapes varies between shells:

```shell
$ printf 'user1\nuser2\n' > reviewers
```
After creating the file, call the importer script:

```shell
$ python manage.py importer
```
Keep in mind that you can import exam data in two ways: either import the .json file using the importer, or use the frontend to import data in a more user-friendly way. In either case, you will have to use the importer to create the reviewer accounts.
When logging in to an instance that has no data imported you will automatically be prompted to import some data. If you are on an instance that already has data, you can find the import dialog in the dropdown menu next to the logout button. In the import dialog, simply select the .json file and upload it. This procedure may take a while depending on the file size.