The DSS Data Directory is huge (several GB). Is there a way to optimize it without any data loss?

1 Answer

Best answer

The DSS data directory (or DATA_DIR) stores all the configuration, settings, dataset definitions, recipes, logs, etc. of your DSS installation. You can read more about the DATA_DIR in the DSS documentation.

This is a critical directory, so you should make regular backups.
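For example, here is a minimal backup sketch, assuming the data directory lives at /data/dataiku/dss_data and that /backups has enough free space (both paths are illustrative); ideally run it while DSS is stopped so nothing is writing to the directory:

# archive the whole data directory into a dated, compressed tarball
tar czf /backups/dss_data_$(date +%Y%m%d).tar.gz -C /data/dataiku dss_data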

If you really need to free up some space, you have the following options:

  • Look at the managed datasets in DATA_DIR/managed_datasets and check whether any of them correspond to datasets that are no longer used but were never deleted (a sketch for spotting the largest ones is given at the end of this answer).
  • Delete the logs of old jobs that are in DATA_DIR/jobs.

For instance:

# open the jobs directory (replace DATA_DIR with the actual path)
cd DATA_DIR/jobs
# list all job log directories older than 30 days
find . -mindepth 2 -maxdepth 2 -type d -mtime +30
# delete these directories
find . -mindepth 2 -maxdepth 2 -type d -mtime +30 -exec rm -rf {} \;

Keep in mind that this is a touchy operation: run the first find command on its own to review exactly which directories will be removed before running the version with -exec rm -rf.
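
For the first option, a quick way to spot the heaviest managed datasets is to list the subdirectories of DATA_DIR/managed_datasets by size. The sketch below is read-only and assumes GNU coreutils (du, sort); cross-check the directory names against your projects in the DSS UI before deleting anything:

# show the largest subdirectories under managed_datasets (read-only)
cd DATA_DIR/managed_datasets
du -h --max-depth=2 . | sort -rh | head -n 20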
