I am uploading a CSV file with 500k rows and 181 columns. Everything goes fine until I try to explore the data, at which point only 204k records are loaded. An exclamation mark on the UI tells me I have hit the memory limit and the dataset has been truncated; the associated log entry is in the title. I have given the application 6 GB of RAM, so I can't imagine a 411 MB CSV file is hitting that. I am the only user on the system right now.

Is there some other memory limit not mentioned in the documentation?
asked by Woody

1 Answer

When you explore the contents of a dataset, its rows are loaded into DSS's memory.

The memory allocated to sampling has its own limit, set by the administrator. This prevents excessively large sample sizes from crashing the browser.

You can find this setting under Administration > Settings > Limits.
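As background for why a 411 MB CSV can hit an in-memory limit well before 6 GB: data loaded from CSV typically occupies far more memory than it does on disk, especially with many string columns. A minimal pandas sketch (illustrative only; the column names and sizes here are made up and this is not how DSS measures its sample internally) showing the expansion:

```python
import pandas as pd

# Build a small, wide frame of string values, loosely mimicking a
# many-column CSV import. (Hypothetical data, much smaller than the
# questioner's 500k rows x 181 columns.)
df = pd.DataFrame(
    {f"col_{i}": [f"value_{i}_{j}" for j in range(10_000)] for i in range(20)}
)

# Size of the same data serialized as CSV (bytes).
csv_bytes = len(df.to_csv(index=False).encode("utf-8"))

# Size once loaded into memory, counting string object overhead (bytes).
mem_bytes = int(df.memory_usage(deep=True).sum())

print(f"CSV size:       {csv_bytes / 1e6:.1f} MB")
print(f"In-memory size: {mem_bytes / 1e6:.1f} MB")
print(f"Expansion:      {mem_bytes / csv_bytes:.1f}x")
```

On typical text-heavy data the in-memory size is several times the CSV size, which is why the sample limit can trigger long before the JVM heap is exhausted.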
answered by