Database growth

We have created 5.2 million assignments so far in SuSo. After the assignments were created, the database grew by more than 30 GB. We haven't started interview collection yet, and we still need to create 2.5 million more assignments. Is this growth normal?

An assignment contains two kinds of information: fixed data, like the responsible's name and the assignment number, which is determined by the design of the software, and variable data, like the content of the preloaded fields, which of course depends on the questionnaire and the values you preload. Without knowing the second part it is pointless to attempt an overall assessment, but I am not surprised to see magnitudes of this size (30 GB divided by 5.2 million is only about 6 KB per assignment).
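The back-of-the-envelope arithmetic, using the figures from this thread:

```python
# Rough storage cost per assignment, using the numbers reported above.
db_growth_bytes = 30 * 1024**3   # ~30 GB of database growth observed
assignments = 5_200_000          # 5.2 million assignments created

bytes_per_assignment = db_growth_bytes / assignments
print(f"{bytes_per_assignment / 1024:.1f} KB per assignment")  # ~6.0 KB
```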

Expect more when you actually start the census.

You might also have dead objects. Did you run VACUUM?
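If it helps, one way to check for dead tuples (SuSo stores its data in PostgreSQL) is to query the statistics views; the connection string here is a placeholder:

```shell
# Show the tables with the most dead tuples and when autovacuum last ran.
# $SUSO_DB_URL is a placeholder for your actual connection string.
psql "$SUSO_DB_URL" -c "
  SELECT relname, n_live_tup, n_dead_tup, last_autovacuum
  FROM pg_stat_user_tables
  ORDER BY n_dead_tup DESC
  LIMIT 10;"
```

Note that a plain VACUUM marks dead space as reusable but does not shrink the files on disk; only VACUUM FULL returns space to the operating system, and it takes an exclusive lock on the table while it runs.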

Dear @vitalii ,

Yes, we have the vacuum routine activated on our database instance. The instance is provided by Google Cloud as a PaaS service.

We need help creating indexes for the tables involved in the queries mentioned before, and possibly for those involved in the data export process.

The SuSo database is growing by 40 GB per day (right now the database size is 180 GB), and the database engine has a storage limit of 1 TB; at this daily growth rate we will hit the maximum in a couple of weeks. The collected data itself is not the problem, because the exported ZIP files for one day of collection weigh only about 120 MB. The problem is the size of specific tables inside the SuSo database, such as the event tables. Here are some current details about the event table size per workspace:
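To make the timeline concrete, here is the projection implied by those figures:

```python
# Days remaining until the managed instance hits its storage cap,
# using the growth figures reported above.
current_gb = 180        # current database size
daily_growth_gb = 40    # observed growth per day
limit_gb = 1024         # 1 TB storage limit on the instance

days_left = (limit_gb - current_gb) / daily_growth_gb
print(f"~{days_left:.0f} days until the 1 TB limit")  # ~21 days
```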

Can you provide us with maintenance routines for cleaning up the events table information in order to save storage?