Known Issues¶
Cannot open shared object file …¶
If you are getting an error message like libpython3.6m.so.1.0: cannot open shared object file: No such file or directory when trying to import AutoNLU, something is wrong with your Python setup. This mainly affects compiled packages (like AutoNLU), which is why you might not notice the problem with most other packages.
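A quick way to see which libpython file your interpreter expects is to ask Python itself; on most Linux builds this prints a name like the one from the error message:
python3 -c "import sysconfig; print(sysconfig.get_config_var('LDLIBRARY'))"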
Solution 1¶
In most cases, the error can be fixed by installing the development package, which ships the shared libpython library for your Python version. For example, on Ubuntu, the package to install for Python 3.6 would be libpython3.6-dev:
sudo apt install libpython3.6-dev
Exactly which package you need depends on your Python version and Linux distribution.
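If you are unsure which version you are running, check it first and install the matching package (the 3.8 below is only an example; on non-Debian distributions the package name will differ, e.g. python3-devel on Fedora/CentOS):
python3 --version
sudo apt install libpython3.8-dev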
Solution 2¶
If you are not able to find the appropriate package for your Python version, or the problem persists even after installing it (e.g. if you have installed your Python version through other means than your system's package manager), then you have to manually point Python to the path where the library can be found.
First, we need to find where these libraries are stored on your system. Search for the file name that was part of your original error message using locate, e.g.:
sudo updatedb
locate libpython3.6m.so.1.0
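If locate is not available on your system, a plain find works as well (it just takes longer because it scans the whole file system):
find / -name "libpython3.6m.so.1.0" 2>/dev/null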
This might give you output like the following:
/home/peter/mypython/lib/libpython3.6m.so.1.0
which means the library can be found in /home/peter/mypython/lib. We can now add this path to the LD_LIBRARY_PATH environment variable:
export LD_LIBRARY_PATH=/home/peter/mypython/lib:$LD_LIBRARY_PATH
After this command, you should be able to import AutoNLU correctly. To make this change permanent, you have to add this line to your .bashrc file, which can also be done with the following command:
echo "export LD_LIBRARY_PATH=/home/peter/mypython/lib:\$LD_LIBRARY_PATH" >> ~/.bashrc
Solution for Azure Machine Learning Instances¶
Azure Machine Learning compute instances have exactly this problem, and we have to use Solution 2. More specifically:
export LD_LIBRARY_PATH=/anaconda/envs/azureml_py36/lib:$LD_LIBRARY_PATH
And to make the change permanent:
echo "export LD_LIBRARY_PATH=/anaconda/envs/azureml_py36/lib:\$LD_LIBRARY_PATH" >> ~/.bashrc
Solution for Azure Notebooks¶
If you want to use the Notebooks that are directly integrated into the Azure Machine Learning Studio, the previous solution does not work, because the changed LD_LIBRARY_PATH will be ignored. Instead, we have to set the variable directly in the Jupyter kernel configuration.
Open a terminal on the compute instance you are running your notebook on and use the command jupyter kernelspec list to find the path of the Jupyter kernel you are using for your notebooks. That path contains a kernel.json file you will have to edit (e.g. using nano kernel.json). You have to add the line "env": {"LD_LIBRARY_PATH":"/anaconda/envs/azureml_py36/lib"}, to this file.
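If you prefer not to edit the file by hand, the same change can be made with a small Python one-liner run from the kernel directory (a convenience sketch; editing the file manually works just as well):
python -c "import json; p='kernel.json'; d=json.load(open(p)); d['env']={'LD_LIBRARY_PATH':'/anaconda/envs/azureml_py36/lib'}; json.dump(d, open(p,'w'), indent=1)"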
After this it should look similar to this:
{
  "argv": [
    "/anaconda/envs/azureml_py36/bin/python",
    "-m",
    "ipykernel_launcher",
    "-f",
    "{connection_file}"
  ],
  "env": {"LD_LIBRARY_PATH":"/anaconda/envs/azureml_py36/lib"},
  "display_name": "Python 3.6 - AzureML",
  "language": "python"
}
You have to restart your notebook after these changes; importing AutoNLU should then also work correctly in the notebooks.
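To confirm that the kernel picked up the variable, you can run the following in a notebook cell (the ! prefix tells Jupyter to run the line as a shell command, which inherits the kernel's environment):
!echo $LD_LIBRARY_PATH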