Run recipe

fornanthu
Level 2

Hi Team,



I downloaded a file from AWS S3 and created a recipe (rename a column and add a new column), but I am getting an error while executing the script:



com.dataiku.dip.datasets.fs.HTTPDatasetHandler cannot be cast to com.dataiku.dip.datasets.fs.AbstractFSDatasetHandler



Logs may contain additional information



Additional technical details




  • Error type: java.lang.ClassCastException



Could you please help?



Regards,



Nantha.



 



The detailed log follows:




[2017/12/16-10:08:56.982] [ActivityExecutor-28] [INFO] [dku.flow.jobrunner] running compute_churn_prepared_NP - Allocated a slot for this activity!
[2017/12/16-10:08:56.993] [ActivityExecutor-28] [INFO] [dku.flow.jobrunner] running compute_churn_prepared_NP - Run activity
[2017/12/16-10:08:57.085] [ActivityExecutor-28] [INFO] [dku.flow.activity] running compute_churn_prepared_NP - Executing default pre-activity lifecycle hook
[2017/12/16-10:08:57.217] [ActivityExecutor-28] [INFO] [dku.flow.activity] running compute_churn_prepared_NP - Checking if sources are ready
[2017/12/16-10:08:58.108] [ActivityExecutor-28] [INFO] [dku.fsproviders.http] running compute_churn_prepared_NP - Enumerate HTTP URI: https://s3-eu-west-1.amazonaws.com/dataiku-partnerships/onboarding/data/churn.csv
[2017/12/16-10:08:59.944] [ActivityExecutor-28] [DEBUG] [dku.fsproviders.http] running compute_churn_prepared_NP - HTTP response code: 200
[2017/12/16-10:09:00.001] [ActivityExecutor-28] [DEBUG] [dku.flow.activity] running compute_churn_prepared_NP - Computing hashes to propagate BEFORE activity
[2017/12/16-10:09:00.200] [ActivityExecutor-28] [INFO] [dku.fsproviders.http] running compute_churn_prepared_NP - Enumerate HTTP URI: https://s3-eu-west-1.amazonaws.com/dataiku-partnerships/onboarding/data/churn.csv
[2017/12/16-10:09:02.227] [ActivityExecutor-28] [DEBUG] [dku.fsproviders.http] running compute_churn_prepared_NP - HTTP response code: 200
[2017/12/16-10:09:02.236] [ActivityExecutor-28] [DEBUG] [dku.flow.activity] running compute_churn_prepared_NP - Recorded 1 hashes before activity run
[2017/12/16-10:09:02.240] [ActivityExecutor-28] [DEBUG] [dku.flow.activity] running compute_churn_prepared_NP - Building recipe runner of type
[2017/12/16-10:09:02.337] [ActivityExecutor-28] [INFO] [dku.recipes.engines] running compute_churn_prepared_NP - Resolved preferences projectKey=CHURN recipeType=shaker global={"forbiddenEngines":[],"enginesPreferenceOrder":[],"forbiddenByRecipeType":{},"preferenceByRecipeType":{}} project={"forbiddenEngines":[],"enginesPreferenceOrder":[],"forbiddenByRecipeType":{},"preferenceByRecipeType":{}} pplusg={"forbiddenEngines":[],"enginesPreferenceOrder":[],"forbiddenByRecipeType":{},"preferenceByRecipeType":{}} recipe=null resolved={"forbiddenEngines":[],"enginesPreferenceOrder":[],"forbiddenByRecipeType":{},"preferenceByRecipeType":{}}
[2017/12/16-10:09:02.338] [ActivityExecutor-28] [INFO] [dku.recipes.shaker] running compute_churn_prepared_NP - User-selected engine: null - used engine: DSS
[2017/12/16-10:09:02.417] [ActivityExecutor-28] [INFO] [dku.flow.shaker] running compute_churn_prepared_NP - SET PAYLOAD {
"columnsSelection": {
"mode": "ALL"
},
"explorationSampling": {
"_refreshTrigger": 0,
"selection": {
"filter": {
"distinct": false,
"enabled": false
},
"latestPartitionsN": 1,
"maxRecords": 10000,
"ordering": {
"rules": [],
"enabled": false
},
"withinFirstN": -1,
"partitionSelectionMethod": "ALL",
"maxStoredBytes": 104857600,
"targetRatio": 0.02,
"maxReadUncompressedBytes": -1,
"samplingMethod": "HEAD_SEQUENTIAL"
},
"autoRefreshSample": false
},
"explorationFilters": [],
"origin": "PREPARE_RECIPE",
"exploreUIParams": {
"autoRefresh": true
},
"steps": [
{
"preview": false,
"metaType": "PROCESSOR",
"disabled": false,
"type": "ColumnRenamer",
"params": {
"renamings": [
{
"from": "Churn?",
"to": "Churn"
}
]
},
"alwaysShowComment": false
},
{
"preview": false,
"metaType": "PROCESSOR",
"disabled": false,
"type": "FindReplace",
"params": {
"output": "",
"mapping": [
{
"from": ".",
"to": ""
}
],
"normalization": "EXACT",
"columns": [
"Churn"
],
"appliesTo": "SINGLE_COLUMN",
"stopAfterFirstMatch": false,
"matching": "SUBSTRING"
},
"alwaysShowComment": false
},
{
"preview": false,
"metaType": "PROCESSOR",
"disabled": false,
"type": "CreateColumnWithGREL",
"params": {
"expression": "rand()",
"column": "splitter"
},
"alwaysShowComment": false
},
{
"preview": false,
"metaType": "PROCESSOR",
"disabled": false,
"type": "CreateColumnWithGREL",
"params": {
"expression": "format(\u0027%.2f\u0027,splitter)",
"column": "Round"
},
"alwaysShowComment": false
},
{
"preview": false,
"metaType": "PROCESSOR",
"disabled": false,
"type": "ColumnsSelector",
"params": {
"columns": [
"splitter"
],
"keep": false,
"appliesTo": "SINGLE_COLUMN"
},
"alwaysShowComment": false
}
],
"maxProcessedMemTableBytes": -1,
"previewMode": "ALL_ROWS",
"vizSampling": {
"_refreshTrigger": 0,
"autoRefreshSample": false
},
"analysisColumnData": {},
"sorting": [],
"globalSearchQuery": "",
"coloring": {
"scheme": "MEANING_AND_STATUS",
"individualColumns": [],
"valueColoringMode": "HASH"
}
}
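For readers parsing the payload above: the five steps rename "Churn?" to "Churn", strip "." substrings from that column, add a random "splitter" column via `rand()`, format it into a "Round" column via `format('%.2f', splitter)`, and finally drop "splitter". A minimal Java sketch of that row-level logic, with illustrative names only (none of these are actual DSS APIs):

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class PrepareSketch {
    // Illustrative re-implementation of the five prepare steps in the
    // payload above; class and method names are hypothetical, not DSS APIs.
    public static Map<String, String> prepare(Map<String, String> row) {
        Map<String, String> out = new LinkedHashMap<>(row);
        // Step 1: ColumnRenamer -- rename "Churn?" to "Churn"
        if (out.containsKey("Churn?")) {
            out.put("Churn", out.remove("Churn?"));
        }
        // Step 2: FindReplace -- remove "." substrings from the Churn column
        if (out.containsKey("Churn")) {
            out.put("Churn", out.get("Churn").replace(".", ""));
        }
        // Step 3: CreateColumnWithGREL -- splitter = rand()
        double splitter = Math.random();
        out.put("splitter", Double.toString(splitter));
        // Step 4: CreateColumnWithGREL -- Round = format('%.2f', splitter)
        out.put("Round", String.format("%.2f", splitter));
        // Step 5: ColumnsSelector -- drop the splitter column
        out.remove("splitter");
        return out;
    }
}
```

The recipe itself is fine; as the stack trace below shows, the failure happens before any step runs, while the runner initializes its input handler.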
[2017/12/16-10:09:02.617] [ActivityExecutor-28] [INFO] [dku.flow.shaker] running compute_churn_prepared_NP - Shaker recipe, from churn of type HTTP
[2017/12/16-10:09:02.625] [ActivityExecutor-28] [DEBUG] [dku.job.activity] running compute_churn_prepared_NP - Filling source sizes
[2017/12/16-10:09:02.658] [ActivityExecutor-28] [DEBUG] [dku.job.activity] running compute_churn_prepared_NP - Done filling source sizes
[2017/12/16-10:09:02.688] [ActivityExecutor-28] [INFO] [dku.datasets.file] running compute_churn_prepared_NP - Building Filesystem handler config: {"connection":"filesystem_managed","path":"CHURN/churn_prepared","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"ALL","excludeRules":[],"includeRules":[],"explicitFiles":[]}}
[2017/12/16-10:09:02.689] [ActivityExecutor-28] [INFO] [dku.datasets.ftplike] running compute_churn_prepared_NP - Clear partitions
[2017/12/16-10:09:02.772] [ActivityExecutor-28] [WARN] [dku.fs.local] running compute_churn_prepared_NP - File does not exist: /home/fornanthu/DATA_DIR/managed_datasets/CHURN/churn_prepared
[2017/12/16-10:09:02.775] [ActivityExecutor-28] [INFO] [dku.datasets.ftplike] running compute_churn_prepared_NP - Clearing partition as a folder : 'NP'
[2017/12/16-10:09:02.784] [ActivityExecutor-28] [WARN] [dku.fs.local] running compute_churn_prepared_NP - File does not exist: /home/fornanthu/DATA_DIR/managed_datasets/CHURN/churn_prepared
[2017/12/16-10:09:02.797] [ActivityExecutor-28] [INFO] [dku.datasets.ftplike] running compute_churn_prepared_NP - Done clearing partition 'NP'
[2017/12/16-10:09:02.824] [ActivityExecutor-28] [ERROR] [dku.flow.jobrunner] running compute_churn_prepared_NP - Activity unexpectedly failed
java.lang.ClassCastException: com.dataiku.dip.datasets.fs.HTTPDatasetHandler cannot be cast to com.dataiku.dip.datasets.fs.AbstractFSDatasetHandler
at com.dataiku.dip.recipes.shaker.ShakerRecipeRunner.init(ShakerRecipeRunner.java:288)
at com.dataiku.dip.dataflow.jobrunner.ExecutionRunnablesBuilder.getRunnables(ExecutionRunnablesBuilder.java:84)
at com.dataiku.dip.dataflow.jobrunner.ActivityRunner.runActivity(ActivityRunner.java:569)
at com.dataiku.dip.dataflow.jobrunner.JobRunner.runActivity(JobRunner.java:123)
at com.dataiku.dip.dataflow.jobrunner.JobRunner.access$900(JobRunner.java:35)
at com.dataiku.dip.dataflow.jobrunner.JobRunner$ActivityExecutorThread.run(JobRunner.java:312)
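For context, a ClassCastException of this kind arises when an object is cast to a type it does not actually extend: here the recipe runner assumes its input is a filesystem-backed dataset handler, but an HTTP-backed handler is a sibling type, not a subtype. A minimal, illustrative sketch (the class names below are stand-ins, not the actual DSS classes):

```java
// Stand-in hierarchy: two sibling subclasses of a common parent.
class DatasetHandler {}
class FSDatasetHandler extends DatasetHandler {}   // filesystem-backed
class HTTPDatasetHandler extends DatasetHandler {} // HTTP-backed, not an FS handler

public class CastDemo {
    public static void main(String[] args) {
        DatasetHandler handler = new HTTPDatasetHandler();
        try {
            // Fails at runtime: HTTPDatasetHandler does not extend FSDatasetHandler.
            FSDatasetHandler fs = (FSDatasetHandler) handler;
        } catch (ClassCastException e) {
            System.out.println("ClassCastException: " + e.getMessage());
        }
    }
}
```

Such a cast compiles (a downcast from the common parent is legal at compile time) but throws at runtime, which is why the job only fails when the activity actually starts.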
3 Replies
Clément_Stenac
Dataiker
Hi,

This issue was fixed in DSS 4.1.1. You can upgrade to the latest DSS version (currently 4.1.2).
BalaEnjamoori
Level 1

Hi @Clément_Stenac 

We are using DSS 9.0.3 but are still seeing this error when running recipes that use the Spark engine with Snowflake input.

Any suggestions?

 

Manuel
Dataiker Alumni

Hi @BalaEnjamoori, this error is generic and can result from several different conditions. Please download your full job log and submit a ticket to our support team, as described on this page: https://doc.dataiku.com/dss/latest/troubleshooting/obtaining-support.html.

