

I'm asking about how DSS monitors issues during dataset processing. I see two kinds of potential issues:

  • Volume: an inconsistent number of records in a dataset (e.g., I expect at least 1k records per day for my "webtraffic" dataset).
  • Schema / values: one or more rows have fields that don't respect the defined schema or expected values (e.g., in the "webtraffic" dataset, IP addresses are not valid or values of a date field are not what's expected).

Is there a way to monitor / handle those errors in DSS and be notified by email or otherwise?






1 Answer

Best answer
Hi Romain,

These features are on our roadmap. You can get in touch with our Sales team if you'd like more details.

As of today, you could, for instance, write a custom Python recipe that runs your own tests.
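A minimal sketch of what such a custom check could look like, using pandas and the standard library. The column name `ip`, the threshold, and the function name are assumptions for illustration, not part of any DSS API; in a real recipe you would load the dataframe from your dataset and wire the result to an email alert (e.g., via `smtplib`).

```python
import ipaddress

import pandas as pd


def check_webtraffic(df: pd.DataFrame, min_rows: int = 1000) -> list:
    """Return a list of data-quality issues found in a webtraffic dataframe.

    Hypothetical example: assumes an 'ip' column; adapt to your schema.
    """
    issues = []

    # Volume check: expect at least min_rows records (e.g., per daily partition).
    if len(df) < min_rows:
        issues.append("expected at least %d rows, got %d" % (min_rows, len(df)))

    # Value check: every 'ip' value must parse as a valid IPv4/IPv6 address.
    def is_valid_ip(value):
        try:
            ipaddress.ip_address(str(value))
            return True
        except ValueError:
            return False

    bad_ips = df[~df["ip"].map(is_valid_ip)]
    if not bad_ips.empty:
        issues.append("%d rows with invalid IP addresses" % len(bad_ips))

    return issues
```

If the returned list is non-empty, the recipe can raise an exception to fail the job, or send the issues by email.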
Good news!

Thanks for the quick reply :)

©Dataiku 2012-2018 - Privacy Policy