
While going through the tutorials, I am getting the following error when training the model:

```
[2018/06/17-18:42:07.394] [MRT-164] [ERROR] [dku.analysis.ml.python]  - Processing failed
com.dataiku.dip.io.SocketBlockLink$SecretKernelTimeoutException: Subprocess failed to connect, it probably crashed at startup. Check the logs.
    at com.dataiku.dip.io.SocketBlockLink.waitForConnection(SocketBlockLink.java:82)
    at com.dataiku.dip.io.SecretProtectedKernelLink.waitForProcess(SecretProtectedKernelLink.java:38)
    at com.dataiku.dip.io.PythonSecretProtectedKernel.start(PythonSecretProtectedKernel.java:84)
    at com.dataiku.dip.analysis.ml.shared.PRNSTrainThread.run(PRNSTrainThread.java:87)
Caused by: java.net.SocketException: Socket closed
    at java.net.PlainSocketImpl.socketAccept(Native Method)
    at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:409)
    at java.net.ServerSocket.implAccept(ServerSocket.java:545)
    at java.net.ServerSocket.accept(ServerSocket.java:513)
    at com.dataiku.dip.io.SocketBlockLink.waitForConnection(SocketBlockLink.java:78)
    ... 3 more
```
asked by
Hello,
On which machine are you running Dataiku DSS? Can you reproduce the error? Could you please send us the full log? Its location is shown in this screenshot: http://res.cloudinary.com/alexcbs/image/upload/v1529271883/Screen_Shot_2018-06-17_at_22.39.59_lx6lni.png
**System Info:**
```
CPU: Dual Core Intel Core i5-7200U (-MT MCP-)
speed/min/max: 900/400/3100 MHz
Kernel: 4.14.47-1-MANJARO x86_64 Up: 10m
Mem: 2660.0/7866.2 MiB (33.8%)
HDD: 931.51 GiB (8.2% used) Procs: 165
Shell: bash 4.4.19 inxi: 3.0.10
```

**Can I reproduce the error?**
This error always appears when I try to train the model. I cannot train the model at all.

**Link to full logs:** [logs](https://github.com/virajpai/temp)

1 Answer

Best answer

A bit above in the logs, there is a "Missing required dependencies ['numpy']" error, which means that the DSS builtin Python environment has an issue.

Possibly a package was installed that changed one of the core dependencies. The recommended way to install packages is to use a code environment; see Installing Python packages for more details.

To fix your current issue, try rebuilding DSS's builtin Python environment.
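A quick way to confirm this is to check whether numpy imports from the builtin environment. The snippet below is only a sketch: DATA_DIR stands for your DSS data directory, and the DATA_DIR/bin/python wrapper is assumed to run on the builtin environment.

```bash
# Check that numpy is importable from the DSS builtin Python environment.
# DATA_DIR is a placeholder for the DSS data directory of your install.
DATA_DIR/bin/python -c "import numpy; print(numpy.__version__)"

# If this prints a version, numpy is fine; if it raises an ImportError or
# a "Missing required dependencies" message, the builtin environment
# needs to be rebuilt.
```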

answered by
selected by
Hi,

Thanks for the reply. Here is how I solved my issue:
1) In Administration, create a code environment as explained above.
2) Install the following additional packages in the custom env (see the sketch after this list):
scipy
sklearn
statsmodels
xgboost
jinja2
3) In your model, select the created environment.
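Normally these packages are added through the code environment's package settings in the DSS UI; purely as an illustration, the command-line equivalent would look roughly like the sketch below, where the environment name my-ml-env and the code-envs path are assumptions to adapt to your install.

```bash
# Hypothetical sketch: installing the extra packages with the code
# environment's own pip. "my-ml-env" is a made-up environment name and
# the code-envs path may differ on your install.
# Note: on pip, "sklearn" resolves to the scikit-learn package.
DATA_DIR/code-envs/python/my-ml-env/bin/pip install \
    scipy scikit-learn statsmodels xgboost jinja2
```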

Thanks again for the help; the journey with DSS can continue ;)
Salvatore
Yes, that also works. You will need to use this environment for all your Python and ML recipes, though.
I would still advise fixing the builtin DSS Python environment as well :)
Thanks Adrien.

I didn't understand what you mean, or what I can do, to "fix the built-in environment". It refuses to run because numpy is missing, and I have no idea how to fix numpy without creating my own environment. As a reminder, on a fresh install of DSS there is no environment in the admin tab, so there is nothing to rebuild.

I am missing something ;)

Salvatore
The Code Env tab only shows the additional environments created by the DSS administrator/user.
On a fresh DSS install there is also the built-in Python environment, which is used by default for everything that runs on Python.
To fix it, check the link that I referenced in my first answer above, i.e.
https://doc.dataiku.com/dss/latest/installation/python.html#rebuilding-the-builtin-python-environment
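For reference, the procedure on that page boils down to roughly the outline below. This is only a sketch: the exact steps, the pyenv folder name and the installer kit path depend on your DSS version, so follow the linked documentation rather than this summary.

```bash
# Rough outline of rebuilding the builtin Python environment (sketch only;
# DATA_DIR, the pyenv folder and the installer kit path are assumptions
# to adapt to your installation -- the linked documentation is authoritative).
DATA_DIR/bin/dss stop

# Set the old builtin environment aside
mv DATA_DIR/pyenv DATA_DIR/pyenv.old

# Re-run the installer in update mode to recreate the builtin environment
dataiku-dss-VERSION/installer.sh -d DATA_DIR -u

# Reinstall only the *additional* packages you had added on top of the
# base ones, if any (listed in DATA_DIR/dss-local-packages.txt)
DATA_DIR/bin/pip install -r DATA_DIR/dss-local-packages.txt

DATA_DIR/bin/dss start
```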
OK, I'm starting to understand the logic... I performed the steps from the documentation, but dss-local-packages.txt was empty. I added the list of packages from my custom env to it, to "install" them in the built-in environment. Then the training is still crashing because numpy is not found, yet when I run `DATA_DIR/bin/pip list`, I get a full list of packages that includes numpy.
Thanks for your help ;)
Salvatore
PS: I am on Arch Linux.
The dss-local-packages.txt file can be empty; it lists the _additional_ packages that you installed on top of the base packages. The DSS installer takes care of installing the base packages under the hood.
You should not manually specify the packages from your repository, as that overrides the base packages with different versions that may not be expected in the built-in Python env.
Can you try with just the instructions from the documentation?
Hi,
I tried following the steps of the documentation: the training step crashed, and the logs show that numpy is missing.
Salvatore
ArchLinux is not officially supported, but in quick tests we could not reproduce your issue. With a base ArchLinux image on AWS, DSS starts normally and Pandas is functional after:
- installing the packages extra/jdk8-openjdk, extra/nginx, extra/freetype2, community/gcc54
- adding /usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1 to DSS' LD_LIBRARY_PATH

If you can't solve this, you can continue with your code env as long as it's not blocking you, or try either installing the builtin code env using conda (https://doc.dataiku.com/dss/latest/installation/python.html#using-anaconda-python) or a fully custom Python environment (https://doc.dataiku.com/dss/latest/installation/python.html#advanced-using-a-fully-custom-python-environment).
Also, providing the full installation logs may help us come up with other ideas.
Thanks Adrien for your investigation. After installing gcc54 and adding /usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1 to LD_LIBRARY_PATH, I don't have the issue anymore.

For the record, for people reading this later, just run
`export LD_LIBRARY_PATH="/usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1"`
before launching the installation.
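If your shell already has an LD_LIBRARY_PATH set, an appending form keeps the existing entries; this is plain shell syntax, nothing DSS-specific:

```bash
# Variant that appends the gcc 5.4.1 library directory instead of
# replacing LD_LIBRARY_PATH, in case it is already set.
export LD_LIBRARY_PATH="/usr/lib/gcc/x86_64-pc-linux-gnu/5.4.1${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```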

Thanks again.
Salvatore
Glad to hear that. Thanks for the update!