Is there a size limit for the dataset?

I’m trying to upload a pretty large dataset that compresses down to only a few megabytes. Is there a limit on the size of a dataset that DLS can process?

No, there is no limit, but if your dataset is larger than 1 GB then you have to upload it via the file browser.


Thanks. But when I tried to upload a large dataset to the server, it gave me an error: “Error tokenizing data. C error: out of memory.” EDIT: This was by uploading through the browser, by the way.
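For what it’s worth, that message looks like the one pandas’ C parser raises when a CSV is too big to parse in one go; whatever tool reads the file, streaming it row by row keeps memory flat regardless of file size. A minimal stdlib sketch (the inline sample stands in for a real multi-gigabyte file):

```python
import csv
import io

def count_rows_streamed(fileobj):
    """Stream the CSV row by row so memory use stays flat
    instead of loading the whole file at once."""
    return sum(1 for _ in csv.reader(fileobj))

# io.StringIO stands in for a real multi-gigabyte file handle.
print(count_rows_streamed(io.StringIO("a,1\nb,2\nc,3\n")))  # → 3
```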

Hi, so what’s the size of your data? And just to mention: via the file browser you have to upload the folder and files directly, not the zip file of the data.

It was 14.3 GB. And yeah, I did upload it directly to the dataset folder (inside another folder, as in the instructions).

If it helps, I can upload a small sample of the file (which is CSV) so you can inspect it.

Yeah, sure, mail us at


Thanks. I’ve sent you a sample of the file.

The dataset you mailed us contains null values: the last 3 columns are empty.
I just hid the first column, and here is the rest of the CSV.
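All-empty columns like that can be spotted with a few lines of stdlib Python; a sketch (the sample rows here are made up, with three empty trailing comma-separated columns, not the actual data):

```python
import csv
import io

# Hypothetical rows: the trailing ",,," means the last three columns are empty.
sample = "1;2;3,4;5,,,\n6;7,8;9;0,,,\n"

empty = None  # per-column flag: stays True only if every row's cell is empty
for row in csv.reader(io.StringIO(sample)):
    flags = [cell == "" for cell in row]
    empty = flags if empty is None else [a and b for a, b in zip(empty, flags)]

print([i for i, e in enumerate(empty) if e])  # indices of all-empty columns
```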

That’s strange. I don’t remember there being any null values in my file.

Also, I have a question. Is it strictly necessary to pad the columns to a fixed length? If it is, then how can I pad odd-length sequences? Since the padding is typically done with instances of “0;” (I presume), which has a length of 2, is it possible to pad odd-length sequences to even lengths? Or is the semicolon not counted?

Yes, each column should have a fixed number of values,
and they should be separated by ;.
The ; is not counted in the length.
What counts is the number of values in a column, so to fix the length, append however many 0s are needed at the end to make it the same.
For example, if the fixed length is 5
and your value is 1;2;3;4, then add one “0” at the end to make it 1;2;3;4;0.
For a value like 1;2;3, the output will be 1;2;3;0;0.
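The rule above can be sketched as a small helper (the function name and fixed length are mine for illustration, not part of DLS):

```python
def pad_column(value: str, fixed_length: int) -> str:
    """Pad a ';'-separated value with zeros until it has fixed_length entries."""
    parts = value.split(";")
    parts += ["0"] * (fixed_length - len(parts))
    return ";".join(parts)

print(pad_column("1;2;3;4", 5))  # → 1;2;3;4;0
print(pad_column("1;2;3", 5))    # → 1;2;3;0;0
```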

Gotcha. And btw, different columns can have different lengths, yes?

Yes, they can.