0 votes

Hey,

I'm working with DSS 2.2.1. I needed to reboot my Dataiku server and restart Dataiku. Since then, none of my recipes (Hive and Python) work. I get the famous java.lang.NullPointerException. Here is the error:

java.lang.NullPointerException
	at java.util.Hashtable.put(Hashtable.java:514)
	at java.util.Hashtable.putAll(Hashtable.java:587)
	at org.apache.hadoop.conf.CoreDefaultProperties.<init>(CoreDefaultProperties.java:76)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at java.lang.Class.newInstance(Class.java:374)
	at org.apache.hadoop.conf.Configuration.getProperties(Configuration.java:2064)
	at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2272)
	at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2224)
	at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2141)
	at org.apache.hadoop.conf.Configuration.get(Configuration.java:1081)
	at org.apache.hadoop.fs.FileSystem.getDefaultUri(FileSystem.java:177)
	at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:169)
	at com.dataiku.dctc.HadoopLoader.getFS(HadoopLoader.java:42)
	at com.dataiku.dip.datasets.fs.HDFSDatasetHandler.loadFS(HDFSDatasetHandler.java:479)
	at com.dataiku.dip.datasets.fs.HDFSDatasetHandler.enumerateFilesystem(HDFSDatasetHandler.java:149)
	at com.dataiku.dip.datasets.fs.AbstractFSDatasetHandler.enumeratePartition(AbstractFSDatasetHandler.java:384)
	at com.dataiku.dip.datasets.fs.AbstractFSDatasetHandler.getContentHash(AbstractFSDatasetHandler.java:482)
	at com.dataiku.dip.dataflow.ComputableHashComputer.getCurrentContentHash(ComputableHashComputer.java:153)
	at com.dataiku.dip.dataflow.ComputableHashComputer.getCurrentContentHash(ComputableHashComputer.java:102)
	at com.dataiku.dip.dataflow.jobrunner.ActivityRunner.checkSourcesReady(ActivityRunner.java:337)
	at com.dataiku.dip.dataflow.jobrunner.ActivityRunner.runActivity(ActivityRunner.java:434)
	at com.dataiku.dip.dataflow.jobrunner.JobRunner.runActivity(JobRunner.java:105)
	at com.dataiku.dip.dataflow.jobrunner.JobRunner.access$700(JobRunner.java:29)
	at com.dataiku.dip.dataflow.jobrunner.JobRunner$ActivityExecutorThread.run(JobRunner.java:281)

When I start DSS, I get this message on stdout:

$ ./dss start
Waiting for Data Science Studio supervisor to start ...
backend                          STARTING
hproxy                           BACKOFF   Exited too quickly (process log may have details)
ipython                          STARTING
nginx                            STARTING

Do you have any idea what's going on? Thanks in advance.

Gautier


1 Answer

0 votes
Hi,

There's definitely a problem with your HDFS. You can test it with the following shell command: "hdfs dfs -ls /".
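For example, something like this (run as the Unix user that runs DSS; "/" is just an example path):

# Check that the HDFS client can reach the NameNode:
hdfs dfs -ls /

# If that fails or hangs, check which default filesystem the client resolves:
hdfs getconf -confKey fs.defaultFS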

(If it's a local Hadoop cluster, you may need to restart your Hadoop services.)
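On a plain Apache Hadoop install, that could look like the sketch below ($HADOOP_HOME is an assumption about your setup; on CDH or HDP, restart through Cloudera Manager or Ambari instead):

# Restart the HDFS daemons, then re-run the connectivity check:
$HADOOP_HOME/sbin/stop-dfs.sh
$HADOOP_HOME/sbin/start-dfs.sh
hdfs dfs -ls /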

Joel