+1 vote
The DSS Data Directory is huge (several GB). Is there a way to optimize it without any data loss?

1 Answer

0 votes
Best answer

The DSS data directory (or DATA_DIR) stores all of the configuration, settings, dataset and recipe definitions, logs, etc. of your DSS installation. You can read more about the DATA_DIR in the DSS documentation.

This is a critical directory. You should make regular backups of it.

If you really need to free up some space, you have the following options:

  • Look at the managed datasets in DATA_DIR/managed_datasets and check for datasets that are no longer used but have not yet been deleted.
  • Delete the logs of old jobs that are in DATA_DIR/jobs.

For instance:

# open the jobs directory
cd DATA_DIR/jobs
# list all directories containing logs of jobs older than 30 days (dry run)
find . -mindepth 2 -maxdepth 2 -type d -mtime +30
# delete these directories
find . -mindepth 2 -maxdepth 2 -type d -mtime +30 -exec rm -rf {} \;

Keep in mind that this is a risky operation! Run the find command without -exec first to review exactly what will be deleted.
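For the first option, a quick way to see which managed datasets occupy the most space is to sort them by disk usage. This is only a sketch: it assumes each dataset lives in its own subdirectory of DATA_DIR/managed_datasets, as above.

```shell
# list the contents of the managed datasets folder by disk usage,
# largest first, to spot the biggest candidates for cleanup
du -sh DATA_DIR/managed_datasets/* | sort -rh | head -20
```

Review the output against your projects before deleting anything, and prefer deleting datasets from the DSS interface so that the configuration stays consistent.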
