I am uploading a CSV file with 500k rows and 181 columns. It all goes OK until I try to explore the data, at which point only 204k records are loaded. I get an exclamation mark in the UI telling me I have hit the memory limit and that the dataset has been truncated. The associated log entry is in the title. I have given the application 6 GB of RAM, so I can't imagine it is hitting that for a CSV file that is 411 MB. I am the only user on the system right now.

Is there some other memory limit not mentioned in the Documentation?

1 Answer

When you look at the contents of a dataset, the sampled rows are loaded into DSS's memory.

The memory allocated to sampling has its own limit, set by the administrator. This is to prevent crashing the browser with excessively large sample sizes.

You can find this setting under Administration -> Settings -> Limits.
As of v5.1, it is under Administration -> Settings -> Resources control.
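As a rough illustration of why a 411 MB CSV can trip an in-memory limit well below 6 GB: tabular data typically expands several-fold when parsed into typed, in-memory columns. This sketch uses pandas (not DSS's actual loader, just an assumption for demonstration) to compare on-disk CSV size with in-memory size:

```python
# Sketch: CSV files often expand several-fold when loaded into memory,
# which is why a 411 MB file can exceed a sampling memory limit.
# pandas is used here only for illustration; DSS uses its own loader.
import io

import pandas as pd

# Build a small synthetic CSV with numeric and string columns.
csv_text = "id,name,value\n" + "\n".join(
    f"{i},row_{i},{i * 0.5}" for i in range(10_000)
)
on_disk_bytes = len(csv_text.encode("utf-8"))

df = pd.read_csv(io.StringIO(csv_text))
# deep=True accounts for the actual size of string objects.
in_memory_bytes = int(df.memory_usage(deep=True).sum())

print(f"CSV size:       {on_disk_bytes:,} bytes")
print(f"In-memory size: {in_memory_bytes:,} bytes")
print(f"Expansion:      {in_memory_bytes / on_disk_bytes:.1f}x")
```

The exact expansion factor depends on the column types, but string-heavy tables with many columns (like the 181-column dataset here) routinely occupy several times their on-disk size once loaded.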

©Dataiku 2012-2018