While going through the tutorials, I am getting the following error when training the model:

```
[2018/06/17-18:42:07.394] [MRT-164] [ERROR] []  - Processing failed
$SecretKernelTimeoutException: Subprocess failed to connect, it probably crashed at startup. Check the logs.
Caused by: Socket closed
	at Method)
	... 3 more
```
> On which machine are you running Dataiku DSS? Have you reproduced the error? Could you please send us the full log located here:

**System Info:**
CPU: Dual Core Intel Core i5-7200U (-MT MCP-)
speed/min/max: 900/400/3100 MHz
Kernel: 4.14.47-1-MANJARO x86_64 Up: 10m
Mem: 2660.0/7866.2 MiB (33.8%)
HDD: 931.51 GiB (8.2% used) Procs: 165
Shell: bash 4.4.19 inxi: 3.0.10

**Can I reproduce the error?**
This error always appears when I try to train the model. I cannot train the model at all.

**Link to full logs:** [logs](

1 Answer

+1 vote
Best answer

A bit above in the logs, you have a "Missing required dependencies ['numpy']", which means that the DSS builtin python environment has an issue.

Possibly a package has been installed that caused a change in some core dependency. The recommended way of installing packages is to use code environments; see Installing Python packages for more details.

To fix your current issue, try rebuilding DSS' builtin python environment.
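Before rebuilding, you can confirm what the builtin interpreter can and cannot import with a small script (a sketch; the `DATA_DIR/bin/python` path comes from this thread, and the package list is an assumption based on the error message):

```python
# Check which packages this interpreter can actually import.
# Run it with DSS' builtin env, e.g.: DATA_DIR/bin/python check_deps.py
import importlib

# Packages mentioned in this thread; extend as needed (assumption).
REQUIRED = ["numpy", "pandas"]

def missing_packages(names):
    """Return the subset of `names` that this interpreter cannot import."""
    missing = []
    for name in names:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing

if __name__ == "__main__":
    bad = missing_packages(REQUIRED)
    if bad:
        print("Missing required dependencies %s" % bad)
    else:
        print("All required dependencies importable")
```

If this prints missing dependencies when run with the builtin interpreter, the environment itself is broken, which matches the error in your training logs.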

Thank you so much! So, the problem was missing packages. Whenever I face the same problem, I need to go to Settings --> Administration --> Code Env --> Packages to Install, and add all the required packages.
Thank you once again.
You're very welcome :)
I am facing the same issue on Arch Linux. I can't manage to solve this "numpy" dependency error... I tried to build the environment in Settings > Administration > Code Env. numpy is there in my system installation and in the DSS code env, but for some reason the training of my algo is failing and the log shows: xxx/DATA_DIR/bin/python: Missing required dependencies ['numpy']

For information, my code_env page was empty before I created a Python code env manually; I don't know if that is normal.

Any idea how I could solve this issue and make DSS run on Arch Linux?

Thanks for your support,
Have you tried the solution mentioned above about rebuilding DSS's **builtin** python environment?
Code Envs are specific environments that you can use in your Python/R recipes (in Advanced) and ML trainings (under "Python environment" in the Design part of the ML task), when you want to use specific packages.
By default and if you don't need to use additional packages, the builtin environment is the one that gets used. If it is lacking numpy, it means it's broken, and you should rebuild it.
If you need to use additional packages, then you want to create a Code Env and use that code env in your ML task or recipe.

thanks for the reply. Here is how I solved my issue:
1) in administration, create an environment as explained above
2) install the following additional packages in the custom env:
3) in your model, select the created environment

thanks again for the help, the journey with DSS can continue ;)
Yes, that also works. You will need to use this environment with all your python and ML recipes though.
I would still advise also fixing the builtin DSS python environment though :)
thanks Adrien.

I didn't understand what you meant, or what I can do to "fix the built-in environment". It refuses to run because of missing numpy, and I have no idea how to fix numpy without creating my own environment. As a reminder: with a fresh install of DSS, there is no environment in the admin tab, so nothing to rebuild.

I am missing something ;)

The Code Env tab only shows the additional environments created by the DSS administrator/user.
On a fresh DSS install, there is also the built-in python environment, which is used by default on all things running on Python.
To fix it, check the link that I referenced in my first answer above.
OK, I'm starting to understand the logic... I performed the steps from the documentation, but the dss-local-packages.txt was empty. I added into it the list of packages from my custom env, to "install" them in the built-in environment. The training is still crashing because numpy is not found. When I run "DATA_DIR/bin/pip list", I get a full list of packages that includes "numpy".
thanks for your help ;)
ps: I am on archlinux
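When `DATA_DIR/bin/pip list` shows numpy but `DATA_DIR/bin/python` still can't import it, the pip and the interpreter may be bound to different environments. A small sketch to see which environment the failing interpreter actually searches (run it with `DATA_DIR/bin/python`, then compare against where pip installed numpy):

```python
# Report where this interpreter lives and where it looks for packages,
# so it can be compared against the environment pip installed into.
import sys

def env_report():
    """Return this interpreter's executable path and its module search paths."""
    return sys.executable, list(sys.path)

if __name__ == "__main__":
    exe, paths = env_report()
    print("interpreter: %s" % exe)
    for p in paths:
        print("searches: %s" % p)
```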
The dss-local-packages.txt file can be empty; it lists the _additional_ packages that you installed on top of the base packages. The DSS installer takes care of installing the base packages under the hood.
You should not manually specify the packages from your repository, as that will override the base packages with different versions that may not be expected by the built-in Python env.
Can you try with just the instructions from the documentation?
I tried following the steps of the documentation: the learning step crashed and the logs show that numpy is missing.
ArchLinux is not officially supported, but quick tests seem to show that we can't reproduce your issue. With a base ArchLinux from AWS, DSS starts normally and Pandas is functional after:
- installing the packages extra/jdk8-openjdk, extra/nginx, extra/freetype2, community/gcc54
- adding /usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1 to DSS' LD_LIBRARY_PATH

If you can't solve this, you can continue with your code env as long as it's not blocking you, or try installing the builtin code env using either conda or a fully custom Python environment.
Also, providing full installation logs may help us have other ideas.
thanks Adrien for your investigations. After installing gcc54 and adding /usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1 to LD_LIBRARY_PATH, I don't have the issue anymore.

For the record for people reading this later, just run:
```
export LD_LIBRARY_PATH="/usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1"
```
before launching the installation.
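If your shell already has entries in LD_LIBRARY_PATH, a variant that prepends instead of overwriting is slightly safer (the GCC path is the one from the answer above; adjust it to your actual gcc54 install location):

```shell
# Prepend GCC 5.4's lib dir, keeping any existing LD_LIBRARY_PATH entries.
export LD_LIBRARY_PATH="/usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
echo "$LD_LIBRARY_PATH"
```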

Thanks again.
Glad to hear that. Thanks for the update