Number of columns in the .tab files

A Suso designer of ours has a questionnaire with multiple rosters within its sections, but the output .tab file comes out with 1203 columns, which SQL Server refuses to import because of its 1024-column limit.
How does Suso determine what constitutes a new .tab file?
The questionnaire is Continuous Population Survey 2021 Test_V2.


Hi Danie,

If multiple rosters are triggered by the same question (and are therefore on the same ‘level’), they are exported into the same file.
This was originally done to minimize the number of files and the manipulations needed to merge them together. For example, if you have a members roster listing all household members, and later a separate roster in the education section filtered to members above 15 years old, Survey Solutions would output all of the questions, including education, in the members file.

This behavior was chosen on the understanding that for most use cases, when the data are taken into a statistical analysis package, such ‘wide’ files are more convenient to work with. This differs from traditional relational database design, where you’d prefer not to have overly wide tables.

Unfortunately, users have no control over which ‘flavor’ to export, so your only option is to split the file into two (or more) tables. If you’re using the data import wizard, I believe there is an advanced mode that lets you select the column mapping (unfortunately I don’t have SQL Server available to test this myself); if you script the data import, you could do the splitting in that scripting language before importing…
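To illustrate the splitting idea, here is a minimal Python sketch (not an official Survey Solutions tool; the column limit and the number of leading identifier columns are assumptions you'd adjust for your own export). It cuts a wide tab-delimited file into several narrower parts, repeating the first few key columns in every part so the tables can be joined back together after import:

```python
import csv

def split_wide_tab(src, max_cols=1000, key_cols=2):
    """Split a wide tab-delimited export into several narrower files.

    Each part repeats the first `key_cols` identifier columns so the
    parts can be re-joined after import. `max_cols` should stay below
    SQL Server's 1024-column table limit.
    """
    with open(src, newline="", encoding="utf-8") as f:
        rows = list(csv.reader(f, delimiter="\t"))

    header = rows[0]
    payload = list(range(key_cols, len(header)))  # non-key column indices
    chunk = max_cols - key_cols                   # payload columns per part

    parts = []
    for p, start in enumerate(range(0, len(payload), chunk), start=1):
        cols = list(range(key_cols)) + payload[start:start + chunk]
        out = src.rsplit(".", 1)[0] + f"_part{p}.tab"
        with open(out, "w", newline="", encoding="utf-8") as f:
            w = csv.writer(f, delimiter="\t")
            for row in rows:
                w.writerow([row[i] for i in cols])
        parts.append(out)
    return parts
```

After importing the parts, a join on the repeated key columns reconstructs the original wide record. This loads the whole file into memory for simplicity; a streaming version would be preferable for very long files.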

Knowing that @daniedk most likely needs to run this on a Windows machine (although it runs perfectly on my Ubuntu as well), I tried to write a little cmdlet in PowerShell to split a CSV file into multiple parts, each with a fixed number of columns. I haven’t done much testing on how it performs on huge files (in terms of many rows), but for most cases it should be fast enough.

Thank you so much, Zurab.
Much appreciated!