
Hello,

I have a MySQL table with around 2.8 million rows. I just created a recipe to parse a date+time column. This recipe usually works fine, but in this case, after reading around 890,000 rows, the job fails with the following error:

Failed to read data from table

Failed to read data from table, caused by: BatchUpdateException: Data truncation: Data too long for column 'text' at row 509, caused by: MysqlDataTruncation: Data truncation: Data too long for column 'text' at row 509

HTTP code: , type: com.dataiku.dip.exceptions.DataStoreIOException

Investigating the log a bit, I noticed that a column that is a "text" column in MySQL is transformed into a "varchar(500)" column. I am not sure that this is the source of the error (since the problem occurs while reading the data, not while writing it).
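One way to check whether varchar(500) is actually too small is to look for the longest value in the source column. A minimal sketch (the table name my_table is hypothetical; replace it and the column name with your own):

    -- Length of the longest value in the column, in characters
    SELECT MAX(CHAR_LENGTH(`text`)) AS max_len
    FROM my_table;

If max_len comes back above 500, some rows would not fit in the generated varchar(500).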

I have tried several things, including specifying a maximum of 1000 characters for the column in the original table, and changing the auto-detected meaning (which gives "Natural Language" for this column) to "text". After those changes I asked DSS to propagate the schema from the original table (it does not see any change anyway).
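For what it's worth, the declared type and length of the source column can also be verified directly in MySQL. A sketch, assuming hypothetical schema/table/column names:

    -- Declared type and maximum length of the source column
    SELECT DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
    FROM information_schema.COLUMNS
    WHERE TABLE_SCHEMA = 'my_database'
      AND TABLE_NAME = 'my_table'
      AND COLUMN_NAME = 'text';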

Do you have any idea of the source of this error?

Best,

Frédéric 

closed with the note: Solved my problem: see my comments.
OK, just saw that in the settings of the database, I can change varchar(500) to varchar(1000). Trying that now.
OK, it worked with varchar(2000). Problem solved.
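For anyone who manages the table outside of DSS, the equivalent change can also be made directly in MySQL. A sketch (the table name my_table is hypothetical):

    -- Widen the column so long values no longer get truncated
    ALTER TABLE my_table
      MODIFY COLUMN `text` VARCHAR(2000);

Widening a VARCHAR is a metadata-only change in recent MySQL versions as long as the new length stays in the same length-prefix class, so it is usually quick even on large tables.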
