The exported data is already essentially tabular; all you need to do is parse it and load it into a database. You can even import the tab files directly (though you lose the label data).
The advantage of parsing the files yourself is that you can organize the data into logical tables specific to your survey, rather than the generic tables Survey Solutions uses so that it can support any type of survey.
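As a minimal sketch of the "parse and load into a database" idea, here is one way to pull a tab-delimited export file into SQLite using only the Python standard library. The file path, table name, and column names are hypothetical; in practice you would map each export file to whatever logical table fits your survey, and pick proper column types rather than storing everything as text.

```python
import csv
import sqlite3

def load_tab_export(tab_path, db_path, table_name):
    """Parse a tab-delimited export file and load it into a SQLite table.

    A deliberately simple sketch: every column is stored as TEXT, and the
    first row of the file is assumed to hold the variable names.
    """
    with open(tab_path, newline="", encoding="utf-8") as f:
        reader = csv.reader(f, delimiter="\t")
        header = next(reader)          # first row: variable names
        rows = list(reader)

    con = sqlite3.connect(db_path)
    cols = ", ".join(f'"{c}" TEXT' for c in header)
    con.execute(f'CREATE TABLE IF NOT EXISTS "{table_name}" ({cols})')
    placeholders = ", ".join("?" for _ in header)
    con.executemany(f'INSERT INTO "{table_name}" VALUES ({placeholders})', rows)
    con.commit()
    con.close()
```

A real loader would also handle the roster files (which carry extra key columns linking rows back to the parent interview), but the overall shape is the same.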
My code is survey-specific rather than generic: I read the *.sav files into my own C# data structures and save them to a database. At the bottom level it just uses SPSSLib, whose interfaces make it easy to read the values and labels.
Then you can decide what you want your database tables to look like and store the survey data accordingly.
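The core of the "values and labels" step is language-agnostic, so here is a small Python sketch of the same idea: applying SPSS-style value labels to the raw coded answers before storing them in your own logical table. The variable names and label sets are made up for illustration; in the real pipeline they come from the .sav metadata (via SPSSLib in my case).

```python
# Hypothetical value-label metadata, as it would be read from a .sav file.
value_labels = {
    "sex":    {1: "Male", 2: "Female"},
    "region": {1: "North", 2: "South", 3: "East", 4: "West"},
}

def labelled(variable, code):
    """Return the human-readable label for a coded answer,
    falling back to the raw code if no label is defined."""
    return value_labels.get(variable, {}).get(code, code)

record = {"sex": 2, "region": 3, "age": 34}   # one interview's raw answers
row = {var: labelled(var, val) for var, val in record.items()}
# row now holds {"sex": "Female", "region": "East", "age": 34}
```

Whether you store the label, the code, or both is a design choice: keeping the raw code plus a lookup table is more robust if the questionnaire's answer options change between versions.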
I’m pretty sure R can also read SPSS files and write to a database; you just have to write the code yourself so that the database tables are logical for your own use case.
Engineering some software to parse the data and write it to a database is a fairly trivial task for a programmer, and you can even automate it using the API so that it polls the server (e.g. once per day) and updates the database when there are new interviews.
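The polling side of that automation boils down to "fetch the current interview list, insert the ones you have not seen yet". A sketch of that logic, with the actual API client and database layer replaced by hypothetical callables (the real endpoints and authentication are not shown here):

```python
def new_interviews(seen_ids, fetched):
    """Given the set of interview ids already in the database and the
    list of interviews returned by an API poll, return only the ones
    that still need to be inserted."""
    return [iv for iv in fetched if iv["id"] not in seen_ids]

def poll_once(fetch, seen_ids, insert):
    """One polling cycle. `fetch` and `insert` are hypothetical
    callables standing in for the real API client and database layer.
    Returns the number of interviews inserted this cycle."""
    fresh = new_interviews(seen_ids, fetch())
    for iv in fresh:
        insert(iv)
        seen_ids.add(iv["id"])
    return len(fresh)
```

Run it from cron (or a similar scheduler) once per day and the database stays current without any manual exports.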
It sounds like this could be a good way forward: once the data is in a database, you can use it for all of the other tasks you’ve been describing.