I have some code where I need to run an HDFS command in Python to check if a file is present. See below for an example:
import subprocess
command = 'hdfs dfs -ls /sandbox'
output = subprocess.Popen(command, shell=True, stdout=subprocess.PIPE).communicate()
print(output)
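As an aside, since the goal is only to check whether a file is present, a sketch of an alternative that relies on the exit code of `hdfs dfs -test -e` rather than parsing `ls` output might look like this (the `hdfs_cmd` parameter is purely illustrative, added so the helper can be exercised without a Hadoop client):

```python
import subprocess

def hdfs_path_exists(path, hdfs_cmd=("hdfs", "dfs")):
    """Return True if `path` exists on HDFS.

    `hdfs dfs -test -e <path>` exits with 0 when the path exists,
    non-zero otherwise, so no output parsing is needed.
    """
    result = subprocess.run(
        [*hdfs_cmd, "-test", "-e", path],
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
    )
    return result.returncode == 0

if __name__ == "__main__":
    print(hdfs_path_exists("/sandbox"))
```

This would still fail with the same Kerberos error in a recipe, since it shells out to the same `hdfs` client, but it avoids `shell=True` and gives a clean boolean result.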
When I run this in a Jupyter notebook in Dataiku, the command completes without any problems. However, when I run the same notebook as a Python recipe, I get the following error message multiple times:
java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)];
It looks as if there is a problem with Kerberos authentication when the notebook runs as a recipe. What is the reason for this? Is there a Dataiku setting I can change to make sure the Kerberos ticket is generated properly?