About This Website

Why?
I always looked at the exam statistics and thought the data was pretty interesting, and I wondered whether I could do something cool with it. So I took it as a challenge. It was also a good opportunity for me to get some practice in statistical data analysis with NumPy and pandas.
How?
Collecting the Data
Since the data on this website changes at most twice a year for a couple of years, and then does not change anymore, I chose to implement it as a static website. This keeps the website a lot simpler, and I think it would be a huge waste to query the university system on every visit. It would also mean I would have to fix the site every time something changed in the university's systems. Instead, I wrote scripts, in Python and Bash, that scrape the data from the university website once.
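To give a rough idea of the scraping step, here is a minimal sketch using only Python's standard library. The table layout and the column meanings (course, average, participants) are hypothetical assumptions for illustration, not the university's actual markup:

```python
import json
from html.parser import HTMLParser


class ExamStatsParser(HTMLParser):
    """Collect the text of every <td> on a page, grouped by table row."""

    def __init__(self):
        super().__init__()
        self.rows = []       # finished rows, each a list of cell strings
        self._row = None     # cells of the row currently being parsed
        self._in_td = False  # whether we are inside a <td> element

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag == "td":
            self._in_td = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_td = False
        elif tag == "tr" and self._row:  # skip header rows (<th> only)
            self.rows.append(self._row)
            self._row = None

    def handle_data(self, data):
        if self._in_td:
            self._row.append(data.strip())


def parse_exam_stats(html):
    """Turn the hypothetical stats table into a list of records."""
    parser = ExamStatsParser()
    parser.feed(html)
    return [
        {"course": course, "average": float(avg), "participants": int(n)}
        for course, avg, n in parser.rows
    ]


# Stand-in for a page fetched from the university website.
sample = """
<table>
  <tr><th>Course</th><th>Average</th><th>Participants</th></tr>
  <tr><td>Calculus 1</td><td>74.5</td><td>312</td></tr>
</table>
"""
print(json.dumps(parse_exam_stats(sample)))
```

In a real scraper the HTML would come from an HTTP request rather than a string literal, and the cell positions would match whatever the university pages actually contain.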
Analysing the Data
I did the data analysis in an interactive Jupyter Notebook, which produces JSON files that the website then uses to display the data. The notebook's code is also available as a plain Python file. So when a new semester brings new data that I want to import into the website, but I don't need to go through the interactive analysis again, I can simply run the Python script, which updates the website's JSON files.
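The notebook-to-JSON step could look something like the sketch below. The DataFrame contents, column names, and output filename are made up for illustration; only the general pattern (aggregate with pandas, write JSON for the static site) reflects the description above:

```python
import json

import pandas as pd

# Hypothetical raw data; the real columns come from the scraped pages.
grades = pd.DataFrame({
    "course": ["Calculus 1", "Calculus 1", "Physics 1"],
    "semester": ["2023A", "2023B", "2023A"],
    "average": [71.2, 74.5, 68.0],
})

# Aggregate per course -- the kind of summary the charts display.
summary = (
    grades.groupby("course")["average"]
    .agg(["mean", "min", "max"])
    .round(2)
    .reset_index()
)

# Write the JSON file that the static website reads at page load.
summary.to_json("summary.json", orient="records")
print(summary.to_json(orient="records"))
```

Because this is an ordinary script, re-running it after a new semester's data is scraped regenerates the JSON files without opening the notebook.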
Building a Website
I am not a frontend engineer; I am only familiar with some basics like HTML and JavaScript. My goal was to find the easiest and simplest way to present the data nicely. I found a template that does all the heavy lifting for this frontend and adapted it to my content. I wrote features like populating the data and sorting the charts in JavaScript, and the charts are made with Chart.js.