by scikit-learn Python Version: 1.2.0rc1 License: BSD-3-Clause
scikit-learn: machine learning in Python
QUESTION
Installing scipy and scikit-learn on apple m1
Asked 2022-Mar-22 at 06:21

The installation of the following packages on the M1 chip works fine for me: Numpy 1.21.1, pandas 1.3.0, torch 1.9.0, and a few others. They also seem to work properly when I test them. However, when I try to install scipy or scikit-learn via pip, this error appears:
ERROR: Failed building wheel for numpy
Failed to build numpy
ERROR: Could not build wheels for numpy which use PEP 517 and cannot be installed directly
Why should Numpy be built again when I already have the latest version from pip installed?
Every previous installation was done using python3.9 -m pip install ...
on macOS 11.3.1 with the Apple M1 chip.
Maybe somebody knows how to deal with this error, or whether it's just a matter of time.
ANSWER
Answered 2021-Aug-02 at 14:33

Please see this note from scikit-learn about installing on Apple Silicon M1 hardware:

The recently introduced macos/arm64 platform (sometimes also known as macos/aarch64) requires the open source community to upgrade the build configuration and automation to properly support it. At the time of writing (January 2021), the only way to get a working installation of scikit-learn on this hardware is to install scikit-learn and its dependencies from the conda-forge distribution, for instance using the miniforge installers:
https://github.com/conda-forge/miniforge
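For reference, here is a minimal sketch of that conda-forge route, assuming a miniforge-based conda is already installed (the environment name sklearn-env is arbitrary):
conda create -n sklearn-env -c conda-forge python=3.9 scikit-learn scipy
conda activate sklearn-env
python -c "import sklearn; print(sklearn.__version__)"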
The following issue tracks progress on making it possible to install scikit-learn from PyPI with pip:
QUESTION
negative values for mean squared errors in sae package for R
Asked 2022-Feb-25 at 14:28

I have been using the "sae" package for R to produce small area estimates with spatial Fay-Herriot (SFH) models. Using different distance matrices, I occasionally obtained negative values for the Mean Squared Error (MSE).
The following link may reference a similar behavior:
scikit-learn cross validation, negative values with mean squared error
In any case, here is a working example:
library(sae)
v1 <- c(0.000,0.089,0.081,0.082,0.058,0.075,0.062,0.043,0.000,0.037,0.065,0.056,
0.046,0.055,0.034,0.043,0.043,0.027,0.013,0.011,0.036,0.029,0.017,0.081,
0.000,0.093,0.081,0.062,0.077,0.066,0.046,0.000,0.036,0.063,0.054,0.044,
0.053,0.033,0.041,0.041,0.026,0.012,0.010,0.035,0.028,0.016,0.073,0.091,
0.000,0.080,0.066,0.085,0.070,0.048,0.000,0.036,0.062,0.053,0.043,0.053,
0.032,0.041,0.041,0.025,0.012,0.010,0.034,0.028,0.016,0.071,0.076,0.077,
0.000,0.053,0.083,0.065,0.043,0.000,0.039,0.071,0.059,0.047,0.057,0.035,
0.044,0.044,0.027,0.013,0.011,0.037,0.030,0.017,0.060,0.070,0.075,0.065,
0.000,0.070,0.084,0.076,0.000,0.032,0.053,0.051,0.041,0.065,0.041,0.039,
0.055,0.023,0.011,0.009,0.030,0.031,0.019,0.065,0.074,0.083,0.084,0.060,
0.000,0.076,0.050,0.000,0.037,0.066,0.056,0.045,0.055,0.034,0.042,0.042,
0.026,0.013,0.010,0.035,0.029,0.017,0.056,0.067,0.072,0.069,0.077,0.079,
0.000,0.065,0.000,0.033,0.057,0.054,0.044,0.071,0.040,0.041,0.055,0.024,
0.011,0.009,0.032,0.030,0.017,0.051,0.060,0.063,0.062,0.084,0.067,0.079,
0.000,0.000,0.030,0.052,0.049,0.041,0.063,0.051,0.038,0.067,0.024,0.011,
0.009,0.031,0.040,0.027,0.015,0.018,0.019,0.026,0.004,0.022,0.013,0.000,
0.000,0.064,0.036,0.045,0.057,0.030,0.051,0.057,0.032,0.077,0.097,0.110,
0.070,0.066,0.089,0.024,0.029,0.029,0.041,0.009,0.035,0.021,0.000,0.018,
0.000,0.059,0.071,0.098,0.045,0.050,0.077,0.028,0.082,0.045,0.040,0.099,
0.048,0.054,0.054,0.059,0.059,0.072,0.039,0.065,0.051,0.031,0.000,0.050,
0.000,0.077,0.060,0.069,0.043,0.056,0.051,0.034,0.016,0.013,0.045,0.036,
0.021,0.042,0.047,0.047,0.059,0.033,0.053,0.045,0.024,0.000,0.058,0.079,
0.000,0.075,0.067,0.054,0.071,0.048,0.040,0.018,0.015,0.055,0.045,0.026,
0.028,0.033,0.033,0.046,0.015,0.040,0.030,0.005,0.000,0.094,0.068,0.089,
0.000,0.053,0.053,0.099,0.033,0.062,0.027,0.022,0.083,0.049,0.037,0.046,
0.050,0.051,0.060,0.055,0.056,0.070,0.046,0.000,0.043,0.071,0.069,0.053,
0.000,0.051,0.050,0.074,0.031,0.014,0.012,0.041,0.037,0.020,0.018,0.023,
0.023,0.035,0.023,0.029,0.033,0.032,0.000,0.053,0.053,0.071,0.065,0.061,
0.000,0.080,0.089,0.045,0.016,0.035,0.066,0.095,0.057,0.024,0.030,0.030,
0.043,0.012,0.036,0.027,0.002,0.000,0.075,0.063,0.086,0.102,0.050,0.070,
0.000,0.039,0.063,0.028,0.023,0.094,0.066,0.038,0.038,0.042,0.042,0.052,
0.050,0.047,0.058,0.058,0.000,0.035,0.062,0.060,0.046,0.086,0.078,0.049,
0.000,0.030,0.011,0.021,0.042,0.057,0.035,0.018,0.022,0.022,0.031,0.005,
0.027,0.016,0.000,0.039,0.091,0.045,0.057,0.076,0.037,0.051,0.075,0.031,
0.000,0.069,0.063,0.095,0.052,0.076,0.016,0.019,0.019,0.027,0.004,0.023,
0.014,0.000,0.076,0.070,0.038,0.048,0.062,0.031,0.045,0.062,0.027,0.085,
0.000,0.104,0.078,0.061,0.090,0.014,0.018,0.018,0.025,0.004,0.021,0.013,
0.000,0.084,0.063,0.035,0.043,0.056,0.028,0.058,0.056,0.039,0.076,0.099,
0.000,0.070,0.076,0.105,0.020,0.024,0.024,0.035,0.005,0.029,0.017,0.000,
0.022,0.096,0.050,0.064,0.085,0.040,0.058,0.091,0.035,0.084,0.050,0.045,
0.000,0.056,0.069,0.005,0.011,0.011,0.026,0.000,0.019,0.010,0.010,0.016,
0.058,0.047,0.068,0.070,0.041,0.114,0.090,0.068,0.052,0.039,0.067,0.074,
0.000,0.103,0.006,0.010,0.010,0.019,0.000,0.014,0.007,0.006,0.057,0.071,
0.032,0.045,0.061,0.026,0.070,0.061,0.042,0.086,0.082,0.106,0.091,0.097,
0.000)
dmat <- data.frame(matrix(v1, byrow = TRUE, nrow = 23))  # 23 x 23 proximity matrix
y <- c(0.057,0.074,0.067,0.071,0.031,0.070,0.067,0.047,0.075,0.028,0.051,0.085,
0.037,0.070,0.082,0.084,0.063,0.070,0.085,0.070,0.059,0.050,0.064)
x <- c(0.032,0.041,0.053,0.056,0.060,0.055,0.083,0.060,0.074,0.035,0.041,0.044,
0.034,0.048,0.045,0.038,0.047,0.043,0.057,0.062,0.041,0.062,0.045)
vary <- c(0.00018,0.00014,0.00016,0.00003,0.00029,0.00015,0.00029,0.00039,
0.00005,0.00008,0.00013,0.00017,0.00010,0.00027,0.00114,0.00051,
0.00031,0.00002,0.00038,0.00024,0.00016,0.00019,0.00014)
fit1 <- mseSFH(y ~ x, vardir = vary, proxmat = dmat)
fit1$mse[fit1$mse < 0]  # the MSE estimates that come out negative
I'm not sure if this is the appropriate forum for the question.
Thanks in advance,
Joao
ANSWER
Answered 2022-Feb-25 at 14:28

I'm pretty sure this is due to the bias correction that generally takes place when estimating the MSE. You can read about the formula for the bias correction in the references provided in ?sae::mseSFH
. In one of the articles, they provided a case study where the average MSE is negative. (I found this in Molina et al., 2009. They identify the bias correction in a few places, but it's very clear on pp. 452-453.)
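Schematically, a bias-corrected MSE estimator has the form below (a sketch of the general mechanism only, not the exact formula used by mseSFH; see the cited references for the precise terms):

\widehat{\mathrm{MSE}}_i \;=\; \widehat{\mathrm{MSE}}_i^{\,\text{naive}} \;-\; \widehat{\mathrm{bias}}_i

Nothing constrains the right-hand side to be non-negative: when the true MSE is close to zero, the estimated bias can exceed the naive plug-in estimate and the reported value dips below zero.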
You can visualize the errors and see how very close they are to zero.
library(dplyr)    # provides %>%
library(ggplot2)  # plotting

data.frame(UnbiasedMSE = fit1$mse) %>%
  ggplot(aes(x = seq_along(UnbiasedMSE),
             y = UnbiasedMSE)) +
  geom_line() + geom_point() +
  scale_y_continuous(labels = scales::comma) +
  theme_bw()
QUESTION
Colab: (0) UNIMPLEMENTED: DNN library is not found
Asked 2022-Feb-08 at 19:27

I have a pretrained model for object detection (TensorFlow) inside Google Colab, and I run it two or three times per week on new images. Everything was fine for the last year, until this week. Now when I try to run the model I get this message:
Graph execution error:
2 root error(s) found.
(0) UNIMPLEMENTED: DNN library is not found.
[[{{node functional_1/conv1_conv/Conv2D}}]]
[[StatefulPartitionedCall/SecondStagePostprocessor/BatchMultiClassNonMaxSuppression/MultiClassNonMaxSuppression/Reshape_5/_126]]
(1) UNIMPLEMENTED: DNN library is not found.
[[{{node functional_1/conv1_conv/Conv2D}}]]
0 successful operations.
0 derived errors ignored. [Op:__inference_restored_function_body_27380] ***
This never happened before.
Before I can run my model I have to install the TensorFlow Object Detection API with these commands:
import os
os.chdir('/project/models/research')  # TF Models research directory
# Compile the object detection protocol buffers
!protoc object_detection/protos/*.proto --python_out=.
# Copy the TF2 setup file and install the Object Detection API package
!cp object_detection/packages/tf2/setup.py .
!python -m pip install .
This is the output of the command:
Processing /content/gdrive/MyDrive/models/research
DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default.
pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555.
Collecting avro-python3
Downloading avro-python3-1.10.2.tar.gz (38 kB)
Collecting apache-beam
Downloading apache_beam-2.35.0-cp37-cp37m-manylinux2010_x86_64.whl (9.9 MB)
|████████████████████████████████| 9.9 MB 1.6 MB/s
Requirement already satisfied: pillow in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (7.1.2)
Requirement already satisfied: lxml in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (4.2.6)
Requirement already satisfied: matplotlib in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (3.2.2)
Requirement already satisfied: Cython in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (0.29.27)
Requirement already satisfied: contextlib2 in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (0.5.5)
Collecting tf-slim
Downloading tf_slim-1.1.0-py2.py3-none-any.whl (352 kB)
|████████████████████████████████| 352 kB 50.5 MB/s
Requirement already satisfied: six in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (1.15.0)
Requirement already satisfied: pycocotools in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (2.0.4)
Collecting lvis
Downloading lvis-0.5.3-py3-none-any.whl (14 kB)
Requirement already satisfied: scipy in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (1.4.1)
Requirement already satisfied: pandas in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (1.3.5)
Collecting tf-models-official>=2.5.1
Downloading tf_models_official-2.8.0-py2.py3-none-any.whl (2.2 MB)
|████████████████████████████████| 2.2 MB 38.3 MB/s
Collecting tensorflow_io
Downloading tensorflow_io-0.24.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (23.4 MB)
|████████████████████████████████| 23.4 MB 1.7 MB/s
Requirement already satisfied: keras in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (2.7.0)
Collecting opencv-python-headless
Downloading opencv_python_headless-4.5.5.62-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (47.7 MB)
|████████████████████████████████| 47.7 MB 74 kB/s
Collecting sacrebleu
Downloading sacrebleu-2.0.0-py3-none-any.whl (90 kB)
|████████████████████████████████| 90 kB 10.4 MB/s
Requirement already satisfied: kaggle>=1.3.9 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (1.5.12)
Requirement already satisfied: psutil>=5.4.3 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (5.4.8)
Requirement already satisfied: oauth2client in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (4.1.3)
Collecting tensorflow-addons
Downloading tensorflow_addons-0.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB)
|████████████████████████████████| 1.1 MB 37.8 MB/s
Requirement already satisfied: gin-config in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (0.5.0)
Requirement already satisfied: tensorflow-datasets in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (4.0.1)
Collecting sentencepiece
Downloading sentencepiece-0.1.96-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.2 MB)
|████████████████████████████████| 1.2 MB 37.5 MB/s
Collecting tensorflow-model-optimization>=0.4.1
Downloading tensorflow_model_optimization-0.7.0-py2.py3-none-any.whl (213 kB)
|████████████████████████████████| 213 kB 42.7 MB/s
Collecting pyyaml<6.0,>=5.1
Downloading PyYAML-5.4.1-cp37-cp37m-manylinux1_x86_64.whl (636 kB)
|████████████████████████████████| 636 kB 53.3 MB/s
Collecting tensorflow-text~=2.8.0
Downloading tensorflow_text-2.8.1-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (4.9 MB)
|████████████████████████████████| 4.9 MB 46.1 MB/s
Requirement already satisfied: google-api-python-client>=1.6.7 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (1.12.10)
Requirement already satisfied: numpy>=1.15.4 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (1.19.5)
Requirement already satisfied: tensorflow-hub>=0.6.0 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (0.12.0)
Collecting seqeval
Downloading seqeval-1.2.2.tar.gz (43 kB)
|████████████████████████████████| 43 kB 2.1 MB/s
Collecting tensorflow~=2.8.0
Downloading tensorflow-2.8.0-cp37-cp37m-manylinux2010_x86_64.whl (497.5 MB)
|████████████████████████████████| 497.5 MB 28 kB/s
Collecting py-cpuinfo>=3.3.0
Downloading py-cpuinfo-8.0.0.tar.gz (99 kB)
|████████████████████████████████| 99 kB 10.1 MB/s
Requirement already satisfied: google-auth<3dev,>=1.16.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (1.35.0)
Requirement already satisfied: uritemplate<4dev,>=3.0.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.0.1)
Requirement already satisfied: httplib2<1dev,>=0.15.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.17.4)
Requirement already satisfied: google-auth-httplib2>=0.0.3 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.0.4)
Requirement already satisfied: google-api-core<3dev,>=1.21.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (1.26.3)
Requirement already satisfied: setuptools>=40.3.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (57.4.0)
Requirement already satisfied: pytz in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2018.9)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.6.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (1.54.0)
Requirement already satisfied: requests<3.0.0dev,>=2.18.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2.23.0)
Requirement already satisfied: packaging>=14.3 in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (21.3)
Requirement already satisfied: protobuf>=3.12.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.17.3)
Requirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.7/dist-packages (from google-auth<3dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.2.8)
Requirement already satisfied: rsa<5,>=3.1.4 in /usr/local/lib/python3.7/dist-packages (from google-auth<3dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (4.8)
Requirement already satisfied: cachetools<5.0,>=2.0.0 in /usr/local/lib/python3.7/dist-packages (from google-auth<3dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (4.2.4)
Requirement already satisfied: certifi in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (2021.10.8)
Requirement already satisfied: urllib3 in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (1.24.3)
Requirement already satisfied: python-dateutil in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (2.8.2)
Requirement already satisfied: tqdm in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (4.62.3)
Requirement already satisfied: python-slugify in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (5.0.2)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.7/dist-packages (from packaging>=14.3->google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.0.7)
Requirement already satisfied: pyasn1<0.5.0,>=0.4.6 in /usr/local/lib/python3.7/dist-packages (from pyasn1-modules>=0.2.1->google-auth<3dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.4.8)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0dev,>=2.18.0->google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2.10)
Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0dev,>=2.18.0->google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.0.4)
Requirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.1.0)
Requirement already satisfied: libclang>=9.0.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (13.0.0)
Requirement already satisfied: h5py>=2.9.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.1.0)
Requirement already satisfied: astunparse>=1.6.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.6.3)
Requirement already satisfied: gast>=0.2.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.4.0)
Requirement already satisfied: google-pasta>=0.1.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.2.0)
Requirement already satisfied: typing-extensions>=3.6.6 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.10.0.2)
Requirement already satisfied: wrapt>=1.11.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.13.3)
Requirement already satisfied: tensorflow-io-gcs-filesystem>=0.23.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.23.1)
Collecting tf-estimator-nightly==2.8.0.dev2021122109
Downloading tf_estimator_nightly-2.8.0.dev2021122109-py2.py3-none-any.whl (462 kB)
|████████████████████████████████| 462 kB 49.5 MB/s
Requirement already satisfied: keras-preprocessing>=1.1.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.1.2)
Collecting tensorboard<2.9,>=2.8
Downloading tensorboard-2.8.0-py3-none-any.whl (5.8 MB)
|████████████████████████████████| 5.8 MB 41.2 MB/s
Requirement already satisfied: flatbuffers>=1.12 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (2.0)
Collecting keras
Downloading keras-2.8.0-py2.py3-none-any.whl (1.4 MB)
|████████████████████████████████| 1.4 MB 41.2 MB/s
Requirement already satisfied: opt-einsum>=2.3.2 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.3.0)
Collecting numpy>=1.15.4
Downloading numpy-1.21.5-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.7 MB)
|████████████████████████████████| 15.7 MB 41.4 MB/s
Requirement already satisfied: absl-py>=0.4.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.0.0)
Requirement already satisfied: grpcio<2.0,>=1.24.3 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.43.0)
Requirement already satisfied: wheel<1.0,>=0.23.0 in /usr/local/lib/python3.7/dist-packages (from astunparse>=1.6.0->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.37.1)
Requirement already satisfied: cached-property in /usr/local/lib/python3.7/dist-packages (from h5py>=2.9.0->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.5.2)
Requirement already satisfied: tensorboard-data-server<0.7.0,>=0.6.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.6.1)
Requirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.7/dist-packages (from tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.0.1)
Requirement already satisfied: google-auth-oauthlib<0.5,>=0.4.1 in /usr/local/lib/python3.7/dist-packages (from tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.4.6)
Requirement already satisfied: tensorboard-plugin-wit>=1.6.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.8.1)
Requirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.7/dist-packages (from tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.3.6)
Requirement already satisfied: requests-oauthlib>=0.7.0 in /usr/local/lib/python3.7/dist-packages (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.3.1)
Requirement already satisfied: importlib-metadata>=4.4 in /usr/local/lib/python3.7/dist-packages (from markdown>=2.6.8->tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (4.10.1)
Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.7/dist-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.7.0)
Requirement already satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.7/dist-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.2.0)
Requirement already satisfied: dm-tree~=0.1.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow-model-optimization>=0.4.1->tf-models-official>=2.5.1->object-detection==0.1) (0.1.6)
Requirement already satisfied: crcmod<2.0,>=1.7 in /usr/local/lib/python3.7/dist-packages (from apache-beam->object-detection==0.1) (1.7)
Collecting fastavro<2,>=0.21.4
Downloading fastavro-1.4.9-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.3 MB)
|████████████████████████████████| 2.3 MB 38.1 MB/s
Requirement already satisfied: pyarrow<7.0.0,>=0.15.1 in /usr/local/lib/python3.7/dist-packages (from apache-beam->object-detection==0.1) (6.0.1)
Requirement already satisfied: pydot<2,>=1.2.0 in /usr/local/lib/python3.7/dist-packages (from apache-beam->object-detection==0.1) (1.3.0)
Collecting proto-plus<2,>=1.7.1
Downloading proto_plus-1.19.9-py3-none-any.whl (45 kB)
|████████████████████████████████| 45 kB 3.2 MB/s
Collecting requests<3.0.0dev,>=2.18.0
Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB)
|████████████████████████████████| 63 kB 1.8 MB/s
Collecting dill<0.3.2,>=0.3.1.1
Downloading dill-0.3.1.1.tar.gz (151 kB)
|████████████████████████████████| 151 kB 44.4 MB/s
Collecting numpy>=1.15.4
Downloading numpy-1.20.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.3 MB)
|████████████████████████████████| 15.3 MB 21.1 MB/s
Collecting orjson<4.0
Downloading orjson-3.6.6-cp37-cp37m-manylinux_2_24_x86_64.whl (245 kB)
|████████████████████████████████| 245 kB 53.2 MB/s
Collecting hdfs<3.0.0,>=2.1.0
Downloading hdfs-2.6.0-py3-none-any.whl (33 kB)
Collecting pymongo<4.0.0,>=3.8.0
Downloading pymongo-3.12.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (508 kB)
|████████████████████████████████| 508 kB 44.3 MB/s
Requirement already satisfied: docopt in /usr/local/lib/python3.7/dist-packages (from hdfs<3.0.0,>=2.1.0->apache-beam->object-detection==0.1) (0.6.2)
Collecting protobuf>=3.12.0
Downloading protobuf-3.19.4-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.1 MB)
|████████████████████████████████| 1.1 MB 47.3 MB/s
Requirement already satisfied: charset-normalizer~=2.0.0 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0dev,>=2.18.0->google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2.0.11)
Requirement already satisfied: opencv-python>=4.1.0.25 in /usr/local/lib/python3.7/dist-packages (from lvis->object-detection==0.1) (4.1.2.30)
Requirement already satisfied: cycler>=0.10.0 in /usr/local/lib/python3.7/dist-packages (from lvis->object-detection==0.1) (0.11.0)
Requirement already satisfied: kiwisolver>=1.1.0 in /usr/local/lib/python3.7/dist-packages (from lvis->object-detection==0.1) (1.3.2)
Requirement already satisfied: text-unidecode>=1.3 in /usr/local/lib/python3.7/dist-packages (from python-slugify->kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (1.3)
Requirement already satisfied: regex in /usr/local/lib/python3.7/dist-packages (from sacrebleu->tf-models-official>=2.5.1->object-detection==0.1) (2019.12.20)
Requirement already satisfied: tabulate>=0.8.9 in /usr/local/lib/python3.7/dist-packages (from sacrebleu->tf-models-official>=2.5.1->object-detection==0.1) (0.8.9)
Collecting portalocker
Downloading portalocker-2.3.2-py2.py3-none-any.whl (15 kB)
Collecting colorama
Downloading colorama-0.4.4-py2.py3-none-any.whl (16 kB)
Requirement already satisfied: scikit-learn>=0.21.3 in /usr/local/lib/python3.7/dist-packages (from seqeval->tf-models-official>=2.5.1->object-detection==0.1) (1.0.2)
Requirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.7/dist-packages (from scikit-learn>=0.21.3->seqeval->tf-models-official>=2.5.1->object-detection==0.1) (1.1.0)
Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.7/dist-packages (from scikit-learn>=0.21.3->seqeval->tf-models-official>=2.5.1->object-detection==0.1) (3.1.0)
Requirement already satisfied: typeguard>=2.7 in /usr/local/lib/python3.7/dist-packages (from tensorflow-addons->tf-models-official>=2.5.1->object-detection==0.1) (2.7.1)
Requirement already satisfied: promise in /usr/local/lib/python3.7/dist-packages (from tensorflow-datasets->tf-models-official>=2.5.1->object-detection==0.1) (2.3)
Requirement already satisfied: future in /usr/local/lib/python3.7/dist-packages (from tensorflow-datasets->tf-models-official>=2.5.1->object-detection==0.1) (0.16.0)
Requirement already satisfied: attrs>=18.1.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow-datasets->tf-models-official>=2.5.1->object-detection==0.1) (21.4.0)
Requirement already satisfied: importlib-resources in /usr/local/lib/python3.7/dist-packages (from tensorflow-datasets->tf-models-official>=2.5.1->object-detection==0.1) (5.4.0)
Requirement already satisfied: tensorflow-metadata in /usr/local/lib/python3.7/dist-packages (from tensorflow-datasets->tf-models-official>=2.5.1->object-detection==0.1) (1.6.0)
Collecting tensorflow-io-gcs-filesystem>=0.23.1
Downloading tensorflow_io_gcs_filesystem-0.24.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.1 MB)
|████████████████████████████████| 2.1 MB 40.9 MB/s
Building wheels for collected packages: object-detection, py-cpuinfo, dill, avro-python3, seqeval
Building wheel for object-detection (setup.py) ... done
Created wheel for object-detection: filename=object_detection-0.1-py3-none-any.whl size=1686316 sha256=775b8c34c800b3b3139d1067abd686af9ce9158011fccfb5450ccfd9bf424a5a
Stored in directory: /tmp/pip-ephem-wheel-cache-rmw0fvil/wheels/d0/e3/e9/b9ffe85019ec441e90d8ff9eddee9950c4c23b7598204390b9
Building wheel for py-cpuinfo (setup.py) ... done
Created wheel for py-cpuinfo: filename=py_cpuinfo-8.0.0-py3-none-any.whl size=22257 sha256=ac956c4c039868fdba78645bea056754e667e8840bea783ad2ca75e4d3e682c6
Stored in directory: /root/.cache/pip/wheels/d2/f1/1f/041add21dc9c4220157f1bd2bd6afe1f1a49524c3396b94401
Building wheel for dill (setup.py) ... done
Created wheel for dill: filename=dill-0.3.1.1-py3-none-any.whl size=78544 sha256=d9c6cdfd69aea2b4d78e6afbbe2bc530394e4081eb186eb4f4cd02373ca739fd
Stored in directory: /root/.cache/pip/wheels/a4/61/fd/c57e374e580aa78a45ed78d5859b3a44436af17e22ca53284f
Building wheel for avro-python3 (setup.py) ... done
Created wheel for avro-python3: filename=avro_python3-1.10.2-py3-none-any.whl size=44010 sha256=4eca8b4f30e4850d5dabccee36c40c8dda8a6c7e7058cfb7f0258eea5ce7b2b3
Stored in directory: /root/.cache/pip/wheels/d6/e5/b1/6b151d9b535ee50aaa6ab27d145a0104b6df02e5636f0376da
Building wheel for seqeval (setup.py) ... done
Created wheel for seqeval: filename=seqeval-1.2.2-py3-none-any.whl size=16180 sha256=0ddfa46d0e36e9be346a90833ef11cc0d38cc7e744be34c5a0d321f997a30cba
Stored in directory: /root/.cache/pip/wheels/05/96/ee/7cac4e74f3b19e3158dce26a20a1c86b3533c43ec72a549fd7
Successfully built object-detection py-cpuinfo dill avro-python3 seqeval
Installing collected packages: requests, protobuf, numpy, tf-estimator-nightly, tensorflow-io-gcs-filesystem, tensorboard, keras, tensorflow, portalocker, dill, colorama, tf-slim, tensorflow-text, tensorflow-model-optimization, tensorflow-addons, seqeval, sentencepiece, sacrebleu, pyyaml, pymongo, py-cpuinfo, proto-plus, orjson, opencv-python-headless, hdfs, fastavro, tf-models-official, tensorflow-io, lvis, avro-python3, apache-beam, object-detection
Attempting uninstall: requests
Found existing installation: requests 2.23.0
Uninstalling requests-2.23.0:
Successfully uninstalled requests-2.23.0
Attempting uninstall: protobuf
Found existing installation: protobuf 3.17.3
Uninstalling protobuf-3.17.3:
Successfully uninstalled protobuf-3.17.3
Attempting uninstall: numpy
Found existing installation: numpy 1.19.5
Uninstalling numpy-1.19.5:
Successfully uninstalled numpy-1.19.5
Attempting uninstall: tensorflow-io-gcs-filesystem
Found existing installation: tensorflow-io-gcs-filesystem 0.23.1
Uninstalling tensorflow-io-gcs-filesystem-0.23.1:
Successfully uninstalled tensorflow-io-gcs-filesystem-0.23.1
Attempting uninstall: tensorboard
Found existing installation: tensorboard 2.7.0
Uninstalling tensorboard-2.7.0:
Successfully uninstalled tensorboard-2.7.0
Attempting uninstall: keras
Found existing installation: keras 2.7.0
Uninstalling keras-2.7.0:
Successfully uninstalled keras-2.7.0
Attempting uninstall: tensorflow
Found existing installation: tensorflow 2.7.0
Uninstalling tensorflow-2.7.0:
Successfully uninstalled tensorflow-2.7.0
Attempting uninstall: dill
Found existing installation: dill 0.3.4
Uninstalling dill-0.3.4:
Successfully uninstalled dill-0.3.4
Attempting uninstall: pyyaml
Found existing installation: PyYAML 3.13
Uninstalling PyYAML-3.13:
Successfully uninstalled PyYAML-3.13
Attempting uninstall: pymongo
Found existing installation: pymongo 4.0.1
Uninstalling pymongo-4.0.1:
Successfully uninstalled pymongo-4.0.1
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
yellowbrick 1.3.post1 requires numpy<1.20,>=1.16.0, but you have numpy 1.20.3 which is incompatible.
multiprocess 0.70.12.2 requires dill>=0.3.4, but you have dill 0.3.1.1 which is incompatible.
google-colab 1.0.0 requires requests~=2.23.0, but you have requests 2.27.1 which is incompatible.
datascience 0.10.6 requires folium==0.2.1, but you have folium 0.8.3 which is incompatible.
albumentations 0.1.12 requires imgaug<0.2.7,>=0.2.5, but you have imgaug 0.2.9 which is incompatible.
Successfully installed apache-beam-2.35.0 avro-python3-1.10.2 colorama-0.4.4 dill-0.3.1.1 fastavro-1.4.9 hdfs-2.6.0 keras-2.8.0 lvis-0.5.3 numpy-1.20.3 object-detection-0.1 opencv-python-headless-4.5.5.62 orjson-3.6.6 portalocker-2.3.2 proto-plus-1.19.9 protobuf-3.19.4 py-cpuinfo-8.0.0 pymongo-3.12.3 pyyaml-5.4.1 requests-2.27.1 sacrebleu-2.0.0 sentencepiece-0.1.96 seqeval-1.2.2 tensorboard-2.8.0 tensorflow-2.8.0 tensorflow-addons-0.15.0 tensorflow-io-0.24.0 tensorflow-io-gcs-filesystem-0.24.0 tensorflow-model-optimization-0.7.0 tensorflow-text-2.8.1 tf-estimator-nightly-2.8.0.dev2021122109 tf-models-official-2.8.0 tf-slim-1.1.0
I noticed that this command uninstalls tensorflow 2.7 and installs tensorflow 2.8. I am not sure this was happening before. Maybe that's the reason the DNN library link is missing or something?
I can confirm these:
Can somebody help? Thanks.
ANSWER
Answered 2022-Feb-07 at 09:19

The same thing happened to me last Friday. I think it has something to do with the CUDA installation in Google Colab, but I don't know the exact reason.
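A workaround consistent with the asker's observation (an assumption on my part, not a confirmed fix: the idea is to keep TensorFlow at the version that Colab's bundled CUDA/cuDNN stack was built against) would be to reinstall the previous release after the object-detection setup upgrades it:
!pip install tensorflow==2.7.0
Then restart the runtime so the downgraded package is picked up.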
QUESTION
How to install local package with conda
Asked 2022-Feb-05 at 04:16

I have a local Python project called jive that I would like to use in another project. My current method of using jive in other projects is to activate the conda env for the project, then move to my jive directory and use python setup.py install. This works fine, and when I use conda list, I see everything installed in the env including jive, with a note that jive was installed using pip.

But what I really want is to do this with full conda. When I want to use jive in another project, I want to just put jive in that project's environment.yml.
So I did the following:

1. Made a meta.yaml so I could use conda-build to build jive locally.
2. Ran conda build ., which built the jive source as expected.
3. Put jive in the environment.yml, and added 'local' to the list of channels.

When I activate the environment and use conda list, it lists all the dependencies including jive, as desired. But when I open a Python interpreter, I cannot import jive; it says there is no such package. (If I use python setup.py install, I can import it.)
How can I fix the build/install so that this works?
Here is the meta.yaml, which lives in the jive project's top-level directory:
package:
  name: jive
  version: "0.2.1"

source:
  path: .

build:
  script: python -m pip install --no-deps --ignore-installed .

requirements:
  host:
    - python>=3.5
    - pip
    - setuptools
  run:
    - python>=3.5
    - numpy
    - pandas
    - scipy
    - seaborn
    - matplotlib
    - scikit-learn
    - statsmodels
    - joblib
    - bokeh

test:
  imports: jive
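For context, a hypothetical environment.yml for the consuming project, matching the workflow described in the question (the environment name and channel order here are assumptions):

name: other-project
channels:
  - local
  - defaults
dependencies:
  - jive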
And here is the output of conda build .
No numpy version specified in conda_build_config.yaml. Falling back to default numpy value of 1.16
WARNING:conda_build.metadata:No numpy version specified in conda_build_config.yaml. Falling back to default numpy value of 1.16
Adding in variants from internal_defaults
INFO:conda_build.variants:Adding in variants from internal_defaults
Adding in variants from /Users/thomaskeefe/.conda/conda_build_config.yaml
INFO:conda_build.variants:Adding in variants from /Users/thomaskeefe/.conda/conda_build_config.yaml
Attempting to finalize metadata for jive
INFO:conda_build.metadata:Attempting to finalize metadata for jive
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done
BUILD START: ['jive-0.2.1-py310_0.tar.bz2']
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done
## Package Plan ##
environment location: /opt/miniconda3/conda-bld/jive_1642185595622/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pla
The following NEW packages will be INSTALLED:
bzip2: 1.0.8-h1de35cc_0
ca-certificates: 2021.10.26-hecd8cb5_2
certifi: 2021.5.30-py310hecd8cb5_0
libcxx: 12.0.0-h2f01273_0
libffi: 3.3-hb1e8313_2
ncurses: 6.3-hca72f7f_2
openssl: 1.1.1m-hca72f7f_0
pip: 21.2.4-py310hecd8cb5_0
python: 3.10.0-hdfd78df_3
readline: 8.1.2-hca72f7f_1
setuptools: 58.0.4-py310hecd8cb5_0
sqlite: 3.37.0-h707629a_0
tk: 8.6.11-h7bc2e8c_0
tzdata: 2021e-hda174b7_0
wheel: 0.37.1-pyhd3eb1b0_0
xz: 5.2.5-h1de35cc_0
zlib: 1.2.11-h4dc903c_4
Preparing transaction: ...working... done
Verifying transaction: ...working... done
Executing transaction: ...working... done
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done
Copying /Users/thomaskeefe/Documents/py_jive to /opt/miniconda3/conda-bld/jive_1642185595622/work/
source tree in: /opt/miniconda3/conda-bld/jive_1642185595622/work
export PREFIX=/opt/miniconda3/conda-bld/jive_1642185595622/_h_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_pla
export BUILD_PREFIX=/opt/miniconda3/conda-bld/jive_1642185595622/_build_env
export SRC_DIR=/opt/miniconda3/conda-bld/jive_1642185595622/work
Processing $SRC_DIR
DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default.
pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555.
Building wheels for collected packages: jive
Building wheel for jive (setup.py): started
Building wheel for jive (setup.py): finished with status 'done'
Created wheel for jive: filename=jive-0.2.1-py3-none-any.whl size=46071 sha256=b312955cb2fd917bc4e684a575407b884190680f2dddad7fcb9ac25e5b290fc9
Stored in directory: /private/tmp/pip-ephem-wheel-cache-rbpkt2an/wheels/15/68/82/4ed7cd246fbc4c72cf764b425a03230247589bd2394a7e457b
Successfully built jive
Installing collected packages: jive
Successfully installed jive-0.2.1
Resource usage statistics from building jive:
Process count: 3
CPU time: Sys=0:00:00.3, User=0:00:00.5
Memory: 53.7M
Disk usage: 50.4K
Time elapsed: 0:00:06.1
Packaging jive
INFO:conda_build.build:Packaging jive
INFO conda_build.build:build(2289): Packaging jive
Packaging jive-0.2.1-py310_0
INFO:conda_build.build:Packaging jive-0.2.1-py310_0
INFO conda_build.build:bundle_conda(1529): Packaging jive-0.2.1-py310_0
compiling .pyc files...
number of files: 70
Fixing permissions
INFO :: Time taken to mark (prefix)
0 replacements in 0 files was 0.06 seconds
TEST START: /opt/miniconda3/conda-bld/osx-64/jive-0.2.1-py310_0.tar.bz2
Adding in variants from /var/folders/dd/t85p2jdn3sd11bsdnl7th6p00000gn/T/tmp4o3im7d1/info/recipe/conda_build_config.yaml
INFO:conda_build.variants:Adding in variants from /var/folders/dd/t85p2jdn3sd11bsdnl7th6p00000gn/T/tmp4o3im7d1/info/recipe/conda_build_config.yaml
INFO conda_build.variants:_combine_spec_dictionaries(234): Adding in variants from /var/folders/dd/t85p2jdn3sd11bsdnl7th6p00000gn/T/tmp4o3im7d1/info/recipe/conda_build_config.yaml
Renaming work directory '/opt/miniconda3/conda-bld/jive_1642185595622/work' to '/opt/miniconda3/conda-bld/jive_1642185595622/work_moved_jive-0.2.1-py310_0_osx-64'
INFO:conda_build.utils:Renaming work directory '/opt/miniconda3/conda-bld/jive_1642185595622/work' to '/opt/miniconda3/conda-bld/jive_1642185595622/work_moved_jive-0.2.1-py310_0_osx-64'
INFO conda_build.utils:shutil_move_more_retrying(2091): Renaming work directory '/opt/miniconda3/conda-bld/jive_1642185595622/work' to '/opt/miniconda3/conda-bld/jive_1642185595622/work_moved_jive-0.2.1-py310_0_osx-64'
shutil.move(work)=/opt/miniconda3/conda-bld/jive_1642185595622/work, dest=/opt/miniconda3/conda-bld/jive_1642185595622/work_moved_jive-0.2.1-py310_0_osx-64)
INFO:conda_build.utils:shutil.move(work)=/opt/miniconda3/conda-bld/jive_1642185595622/work, dest=/opt/miniconda3/conda-bld/jive_1642185595622/work_moved_jive-0.2.1-py310_0_osx-64)
INFO conda_build.utils:shutil_move_more_retrying(2098): shutil.move(work)=/opt/miniconda3/conda-bld/jive_1642185595622/work, dest=/opt/miniconda3/conda-bld/jive_1642185595622/work_moved_jive-0.2.1-py310_0_osx-64)
Collecting package metadata (repodata.json): ...working... done
Solving environment: ...working... done
## Package Plan ##
environment location: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol
The following NEW packages will be INSTALLED:
blas: 1.0-mkl
bokeh: 2.4.2-py39hecd8cb5_0
bottleneck: 1.3.2-py39he3068b8_1
brotli: 1.0.9-hb1e8313_2
ca-certificates: 2021.10.26-hecd8cb5_2
certifi: 2021.10.8-py39hecd8cb5_2
cycler: 0.11.0-pyhd3eb1b0_0
fonttools: 4.25.0-pyhd3eb1b0_0
freetype: 2.11.0-hd8bbffd_0
giflib: 5.2.1-haf1e3a3_0
intel-openmp: 2021.4.0-hecd8cb5_3538
jinja2: 3.0.2-pyhd3eb1b0_0
jive: 0.2.1-py310_0 local
joblib: 1.1.0-pyhd3eb1b0_0
jpeg: 9d-h9ed2024_0
kiwisolver: 1.3.1-py39h23ab428_0
lcms2: 2.12-hf1fd2bf_0
libcxx: 12.0.0-h2f01273_0
libffi: 3.3-hb1e8313_2
libgfortran: 3.0.1-h93005f0_2
libpng: 1.6.37-ha441bb4_0
libtiff: 4.2.0-h87d7836_0
libwebp: 1.2.0-hacca55c_0
libwebp-base: 1.2.0-h9ed2024_0
llvm-openmp: 12.0.0-h0dcd299_1
lz4-c: 1.9.3-h23ab428_1
markupsafe: 2.0.1-py39h9ed2024_0
matplotlib: 3.5.0-py39hecd8cb5_0
matplotlib-base: 3.5.0-py39h4f681db_0
mkl: 2021.4.0-hecd8cb5_637
mkl-service: 2.4.0-py39h9ed2024_0
mkl_fft: 1.3.1-py39h4ab4a9b_0
mkl_random: 1.2.2-py39hb2f4e1b_0
munkres: 1.1.4-py_0
ncurses: 6.3-hca72f7f_2
numexpr: 2.8.1-py39h2e5f0a9_0
numpy: 1.21.2-py39h4b4dc7a_0
numpy-base: 1.21.2-py39he0bd621_0
olefile: 0.46-pyhd3eb1b0_0
openssl: 1.1.1m-hca72f7f_0
packaging: 21.3-pyhd3eb1b0_0
pandas: 1.3.5-py39h743cdd8_0
patsy: 0.5.2-py39hecd8cb5_0
pillow: 8.4.0-py39h98e4679_0
pip: 21.2.4-py39hecd8cb5_0
pyparsing: 3.0.4-pyhd3eb1b0_0
python: 3.9.7-h88f2d9e_1
python-dateutil: 2.8.2-pyhd3eb1b0_0
pytz: 2021.3-pyhd3eb1b0_0
pyyaml: 6.0-py39hca72f7f_1
readline: 8.1.2-hca72f7f_1
scikit-learn: 1.0.2-py39hae1ba45_0
scipy: 1.7.3-py39h8c7af03_0
seaborn: 0.11.2-pyhd3eb1b0_0
setuptools: 58.0.4-py39hecd8cb5_0
six: 1.16.0-pyhd3eb1b0_0
sqlite: 3.37.0-h707629a_0
statsmodels: 0.13.0-py39hca72f7f_0
threadpoolctl: 2.2.0-pyh0d69192_0
tk: 8.6.11-h7bc2e8c_0
tornado: 6.1-py39h9ed2024_0
typing_extensions: 3.10.0.2-pyh06a4308_0
tzdata: 2021e-hda174b7_0
wheel: 0.37.1-pyhd3eb1b0_0
xz: 5.2.5-h1de35cc_0
yaml: 0.2.5-haf1e3a3_0
zlib: 1.2.11-h4dc903c_4
zstd: 1.4.9-h322a384_0
Preparing transaction: ...working... done
Verifying transaction: ...working...
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::intel-openmp-2021.4.0-hecd8cb5_3538, defaults/osx-64::llvm-openmp-12.0.0-h0dcd299_1
path: 'lib/libiomp5.dylib'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'bin/webpinfo'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'bin/webpmux'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'include/webp/decode.h'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'include/webp/encode.h'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'include/webp/mux.h'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'include/webp/mux_types.h'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'include/webp/types.h'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/libwebp.7.dylib'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/libwebp.a'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/libwebp.dylib'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/libwebpdecoder.3.dylib'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/libwebpdecoder.a'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/libwebpdecoder.dylib'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/libwebpmux.3.dylib'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/libwebpmux.a'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/libwebpmux.dylib'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/pkgconfig/libwebp.pc'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/pkgconfig/libwebpdecoder.pc'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'lib/pkgconfig/libwebpmux.pc'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'share/man/man1/cwebp.1'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'share/man/man1/dwebp.1'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'share/man/man1/webpinfo.1'
ClobberWarning: This transaction has incompatible packages due to a shared path.
packages: defaults/osx-64::libwebp-base-1.2.0-h9ed2024_0, defaults/osx-64::libwebp-1.2.0-hacca55c_0
path: 'share/man/man1/webpmux.1'
done
Executing transaction: ...working...
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/llvm-openmp-12.0.0-h0dcd299_1/lib/libiomp5.dylib
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/libiomp5.dylib
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/bin/webpinfo
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/bin/webpinfo
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/bin/webpmux
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/bin/webpmux
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/include/webp/decode.h
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/include/webp/decode.h
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/include/webp/encode.h
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/include/webp/encode.h
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/include/webp/mux.h
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/include/webp/mux.h
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/include/webp/mux_types.h
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/include/webp/mux_types.h
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/include/webp/types.h
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/include/webp/types.h
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/lib/libwebp.7.dylib
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/libwebp.7.dylib
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/lib/libwebp.a
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/libwebp.a
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/lib/libwebp.dylib
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/libwebp.dylib
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/lib/libwebpdecoder.3.dylib
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/libwebpdecoder.3.dylib
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/lib/libwebpdecoder.a
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/libwebpdecoder.a
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/lib/libwebpdecoder.dylib
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/libwebpdecoder.dylib
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/lib/libwebpmux.3.dylib
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/libwebpmux.3.dylib
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/lib/libwebpmux.a
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/libwebpmux.a
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/lib/libwebpmux.dylib
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/libwebpmux.dylib
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/.condatmp/1018f8ab-87a7-4fa8-a41c-4c14cc77cfff
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/pkgconfig/libwebp.pc
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/.condatmp/e3701fae-f2cd-44e9-9dc6-c71f499cd2c2
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/pkgconfig/libwebpdecoder.pc
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/.condatmp/0f4bcf50-01e5-404d-b1a4-8a87d45c22c5
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/lib/pkgconfig/libwebpmux.pc
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/share/man/man1/cwebp.1
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/share/man/man1/cwebp.1
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/share/man/man1/dwebp.1
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/share/man/man1/dwebp.1
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/share/man/man1/webpinfo.1
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/share/man/man1/webpinfo.1
ClobberWarning: Conda was asked to clobber an existing path.
source path: /opt/miniconda3/pkgs/libwebp-1.2.0-hacca55c_0/share/man/man1/webpmux.1
target path: /opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol/share/man/man1/webpmux.1
Installed package of scikit-learn can be accelerated using scikit-learn-intelex.
More details are available here: https://intel.github.io/scikit-learn-intelex
For example:
$ conda install scikit-learn-intelex
$ python -m sklearnex my_application.py
done
export PREFIX=/opt/miniconda3/conda-bld/jive_1642185595622/_test_env_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehold_placehol
export SRC_DIR=/opt/miniconda3/conda-bld/jive_1642185595622/test_tmp
Traceback (most recent call last):
File "/opt/miniconda3/conda-bld/jive_1642185595622/test_tmp/run_test.py", line 2, in <module>
import jive
ModuleNotFoundError: No module named 'jive'
import: 'jive'
Tests failed for jive-0.2.1-py310_0.tar.bz2 - moving package to /opt/miniconda3/conda-bld/broken
WARNING:conda_build.build:Tests failed for jive-0.2.1-py310_0.tar.bz2 - moving package to /opt/miniconda3/conda-bld/broken
WARNING conda_build.build:tests_failed(2970): Tests failed for jive-0.2.1-py310_0.tar.bz2 - moving package to /opt/miniconda3/conda-bld/broken
TESTS FAILED: jive-0.2.1-py310_0.tar.bz2
EDIT: I added a test: section to the meta.yaml as merv suggested.
ANSWER
Answered 2022-Feb-05 at 04:16
The immediate error is that the build is generating a Python 3.10 version, but when testing, Conda doesn't recognize any constraint on the Python version and creates a Python 3.9 environment.
I think the main issue is that python >=3.5 is only a valid constraint when doing noarch builds, which this is not. That is, once a package builds with a given Python version, the version must be constrained to exactly that version (up through minor). So, in this case, the package is built with Python 3.10, but it reports in its metadata that it is compatible with all versions of Python 3.5+, which simply isn't true, because Conda Python packages install the modules into Python-version-specific site-packages (e.g., lib/python3.10/site-packages/jive).
Typically, Python versions are controlled either by the --python argument given to conda-build or by a matrix supplied in the conda_build_config.yaml file (see the documentation on "Build variants").
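For illustration, a minimal conda_build_config.yaml defining such a build matrix might look like this (the version list is a hypothetical example, not taken from the question):
# conda_build_config.yaml -- hypothetical Python build matrix
python:
  - "3.9"
  - "3.10"
conda-build then renders and builds the recipe once per listed version.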
Try adjusting the meta.yaml to something like
package:
  name: jive
  version: "0.2.1"

source:
  path: .

build:
  script: python -m pip install --no-deps --ignore-installed .

requirements:
  host:
    - python
    - pip
    - setuptools
  run:
    - python
    - numpy
    - pandas
    - scipy
    - seaborn
    - matplotlib
    - scikit-learn
    - statsmodels
    - joblib
    - bokeh
If you want to use it in a Python 3.9 environment, then use conda build --python 3.9 . (the trailing dot is the recipe directory).
QUESTION
Cannot find conda info. Please verify your conda installation on EMR
Asked 2022-Feb-05 at 00:17
I am trying to install conda on EMR; below is my bootstrap script. It looks like conda is getting installed, but it is not getting added to the environment variable. When I manually update the $PATH variable on the EMR master node, it can identify conda. I want to use conda on Zeppelin.
I also tried adding the config below while launching my EMR instance; however, I still get the same error.
"classification": "spark-env",
"properties": {
"conda": "/home/hadoop/conda/bin"
}
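For reference, spark-env environment variables on EMR are conventionally set through a nested export classification, so a fuller version of this configuration might look like the following sketch (the PYSPARK_PYTHON value is an assumption based on the bootstrap script below):
[
  {
    "Classification": "spark-env",
    "Configurations": [
      {
        "Classification": "export",
        "Properties": {
          "PYSPARK_PYTHON": "/home/hadoop/conda/bin/python"
        }
      }
    ]
  }
]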
[hadoop@ip-172-30-5-150 ~]$ PATH=/home/hadoop/conda/bin:$PATH
[hadoop@ip-172-30-5-150 ~]$ conda
usage: conda [-h] [-V] command ...
conda is a tool for managing and deploying applications, environments and packages.
#!/usr/bin/env bash
# Install conda
wget https://repo.continuum.io/miniconda/Miniconda3-4.2.12-Linux-x86_64.sh -O /home/hadoop/miniconda.sh \
&& /bin/bash ~/miniconda.sh -b -p $HOME/conda
conda config --set always_yes yes --set changeps1 no
conda install conda=4.2.13
conda config -f --add channels conda-forge
rm ~/miniconda.sh
echo bootstrap_conda.sh completed. PATH now: $PATH
export PYSPARK_PYTHON="/home/hadoop/conda/bin/python3.5"
echo -e '\nexport PATH=$HOME/conda/bin:$PATH' >> $HOME/.bashrc && source $HOME/.bashrc
conda create -n zoo python=3.7 # "zoo" is conda environment name, you can use any name you like.
conda activate zoo
sudo pip3 install tensorflow
sudo pip3 install boto3
sudo pip3 install botocore
sudo pip3 install numpy
sudo pip3 install pandas
sudo pip3 install scipy
sudo pip3 install s3fs
sudo pip3 install matplotlib
sudo pip3 install -U tqdm
sudo pip3 install -U scikit-learn
sudo pip3 install -U scikit-multilearn
sudo pip3 install xlutils
sudo pip3 install natsort
sudo pip3 install pydot
sudo pip3 install python-pydot
sudo pip3 install python-pydot-ng
sudo pip3 install pydotplus
sudo pip3 install h5py
sudo pip3 install graphviz
sudo pip3 install recmetrics
sudo pip3 install openpyxl
sudo pip3 install xlrd
sudo pip3 install xlwt
sudo pip3 install tensorflow.io
sudo pip3 install Cython
sudo pip3 install ray
sudo pip3 install zoo
sudo pip3 install analytics-zoo
sudo pip3 install analytics-zoo[ray]
#sudo /usr/bin/pip-3.6 install -U imbalanced-learn
ANSWER
Answered 2022-Feb-05 at 00:17
I got conda working by modifying the script as below; the EMR Python versions were colliding with the conda version:
wget https://repo.anaconda.com/miniconda/Miniconda3-py37_4.9.2-Linux-x86_64.sh -O /home/hadoop/miniconda.sh \
&& /bin/bash ~/miniconda.sh -b -p $HOME/conda
echo -e '\n export PATH=$HOME/conda/bin:$PATH' >> $HOME/.bashrc && source $HOME/.bashrc
conda config --set always_yes yes --set changeps1 no
conda config -f --add channels conda-forge
conda create -n zoo python=3.7 # "zoo" is conda environment name
conda init bash
source activate zoo
conda install python 3.7.0 -c conda-forge orca
sudo /home/hadoop/conda/envs/zoo/bin/python3.7 -m pip install virtualenv
and setting the Zeppelin python and pyspark parameters to:
"spark.pyspark.python": "/home/hadoop/conda/envs/zoo/bin/python3",
"spark.pyspark.virtualenv.enabled": "true",
"spark.pyspark.virtualenv.type": "native",
"spark.pyspark.virtualenv.bin.path": "/home/hadoop/conda/envs/zoo/bin/",
"zeppelin.pyspark.python": "/home/hadoop/conda/bin/python",
"zeppelin.python": "/home/hadoop/conda/bin/python"
Orca only supports TF up to 1.5, hence it was not working, as I am using TF2.
QUESTION
Updating Python sklearn Lasso(normalize=True) to Use Pipeline
Asked 2021-Dec-28 at 10:34
I am new to Python. I am trying to practice basic regularization by following along with a DataCamp exercise using this CSV: https://assets.datacamp.com/production/repositories/628/datasets/a7e65287ebb197b1267b5042955f27502ec65f31/gm_2008_region.csv
# Import numpy and pandas
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
# Read the CSV file into a DataFrame: df
df = pd.read_csv('gm_2008_region.csv')
# Create arrays for features and target variable
X = df.drop(['life','Region'], axis=1)
y = df['life'].values.reshape(-1,1)
df_columns = df.drop(['life','Region'], axis=1).columns
The code that I use for the DataCamp exercise is as follows:
# Import Lasso
from sklearn.linear_model import Lasso
# Instantiate a lasso regressor: lasso
lasso = Lasso(alpha=0.4, normalize=True)
# Fit the regressor to the data
lasso.fit(X, y)
# Compute and print the coefficients
lasso_coef = lasso.coef_
print(lasso_coef)
# Plot the coefficients
plt.plot(range(len(df_columns)), lasso_coef)
plt.xticks(range(len(df_columns)), df_columns.values, rotation=60)
plt.margins(0.02)
plt.show()
This produces a plot of the coefficients indicating that child_mortality is the most important feature in predicting life expectancy, but the code also results in a deprecation warning due to the use of "normalize."
I'd like to update this code using the current best practice. I have tried the following, but I get a different output. I am hoping someone can help identify what I need to modify in the updated code in order to produce the same output.
# Modified based on https://scikit-learn.org/stable/modules/preprocessing.html#preprocessing-scaler
# and https://stackoverflow.com/questions/28822756/getting-model-attributes-from-pipeline
# Import Lasso
from sklearn.linear_model import Lasso
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
# Instantiate a lasso regressor: lasso
#lasso = Lasso(alpha=0.4, normalize=True)
pipe = Pipeline(steps=[
('scaler',StandardScaler()),
('lasso',Lasso(alpha=0.4))
])
# Fit the regressor to the data
#lasso.fit(X, y)
pipe.fit(X, y)
# Compute and print the coefficients
#lasso_coef = lasso.coef_
#print(lasso_coef)
lasso_coef = pipe.named_steps['lasso'].coef_
print(lasso_coef)
# Plot the coefficients
plt.plot(range(len(df_columns)), lasso_coef)
plt.xticks(range(len(df_columns)), df_columns.values, rotation=60)
plt.margins(0.02)
plt.show()
With the pipeline I draw the same conclusion, but I'd be more comfortable that I was doing this correctly if the output plots were more similar. What am I doing wrong with the Pipeline?
ANSWER
Answered 2021-Nov-24 at 09:45
When you set Lasso(normalize=True), the normalization is different from that in StandardScaler(). It divides by the l2-norm instead of the standard deviation. If you read the help page:
normalize bool, default=False This parameter is ignored when fit_intercept is set to False. If True, the regressors X will be normalized before regression by subtracting the mean and dividing by the l2-norm. If you wish to standardize, please use StandardScaler before calling fit on an estimator with normalize=False.
Deprecated since version 1.0: normalize was deprecated in version 1.0 and will be removed in 1.2.
It is also touched upon in this post. Since it is deprecated, I think it's better to just use the StandardScaler normalization. You can see the result is reproducible as long as you scale it in the same way:
lasso = Lasso(alpha=0.4,random_state=99)
lasso.fit(StandardScaler().fit_transform(X),y)
print(lasso.coef_)
[-0. -0.30409556 -2.33203165 -0. 0.51040194 1.45942351
-1.02516505 -4.57678764]
pipe = Pipeline(steps=[
('scaler',StandardScaler()),
('lasso',Lasso(alpha=0.4,random_state=99))
])
pipe.fit(X, y)
lasso_coef = pipe.named_steps['lasso'].coef_
print(lasso_coef)
[-0. -0.30409556 -2.33203165 -0. 0.51040194 1.45942351
-1.02516505 -4.57678764]
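As a side note on why the two scalings give different coefficients in the first place: for a centered column, the l2-norm that normalize=True divides by equals the standard deviation used by StandardScaler times sqrt(n), and Lasso is not scale-invariant, so this per-column factor matters. A quick numeric check of that relationship (a new sketch, not part of the original answer):
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100)
x = x - x.mean()                 # center the column

l2 = np.linalg.norm(x)           # what normalize=True divides by
sd = x.std()                     # what StandardScaler divides by (ddof=0)
print(np.isclose(l2, sd * np.sqrt(x.size)))  # True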
QUESTION
Can't deploy streamlit app on share.streamlit.io
Asked 2021-Dec-25 at 14:42
I am working with a simple ML model with streamlit. It runs fine on my local machine inside a conda environment, but it shows "Error installing requirements" when I try to deploy it on share.streamlit.io.
The error message is the following:
ERROR: Could not find a version that satisfies the requirement pywin32==303 (from versions: none)
ERROR: No matching distribution found for pywin32==303
This is the requirements.txt file for my model:
altair==4.1.0
argon2-cffi==21.3.0
argon2-cffi-bindings==21.2.0
astor==0.8.1
attrs==21.2.0
backcall==0.2.0
base58==2.1.1
bleach==4.1.0
blinker==1.4
cachetools==5.0.0
certifi==2021.10.8
cffi==1.15.0
charset-normalizer==2.0.9
click==7.1.2
colorama==0.4.4
cycler==0.11.0
debugpy==1.5.1
decorator==5.1.0
defusedxml==0.7.1
entrypoints==0.3
fonttools==4.28.5
gitdb==4.0.9
GitPython==3.1.24
idna==3.3
ipykernel==6.6.0
ipython==7.30.1
ipython-genutils==0.2.0
ipywidgets==7.6.5
jedi==0.18.1
Jinja2==3.0.3
joblib==1.1.0
jsonschema==4.3.2
jupyter-client==7.1.0
jupyter-core==4.9.1
jupyterlab-pygments==0.1.2
jupyterlab-widgets==1.0.2
kiwisolver==1.3.2
MarkupSafe==2.0.1
matplotlib==3.5.1
matplotlib-inline==0.1.3
mistune==0.8.4
nbclient==0.5.9
nbconvert==6.3.0
nbformat==5.1.3
nest-asyncio==1.5.4
notebook==6.4.6
numpy==1.21.5
packaging==21.3
pandas==1.3.5
pandocfilters==1.5.0
parso==0.8.3
pickleshare==0.7.5
Pillow==8.4.0
prometheus-client==0.12.0
prompt-toolkit==3.0.24
protobuf==3.19.1
pyarrow==6.0.1
pycparser==2.21
pydeck==0.7.1
Pygments==2.10.0
Pympler==1.0.1
pyparsing==3.0.6
pyrsistent==0.18.0
python-dateutil==2.8.2
pytz==2021.3
pytz-deprecation-shim==0.1.0.post0
pywin32==303
pywinpty==1.1.6
pyzmq==22.3.0
requests==2.26.0
scikit-learn==1.0.1
scipy==1.7.3
seaborn==0.11.2
Send2Trash==1.8.0
six==1.16.0
smmap==5.0.0
streamlit==1.3.0
terminado==0.12.1
testpath==0.5.0
threadpoolctl==3.0.0
toml==0.10.2
toolz==0.11.2
tornado==6.1
traitlets==5.1.1
typing_extensions==4.0.1
tzdata==2021.5
tzlocal==4.1
urllib3==1.26.7
validators==0.18.2
watchdog==2.1.6
wcwidth==0.2.5
webencodings==0.5.1
widgetsnbextension==3.5.2
wincertstore==0.2
What should I do to resolve this error?
ANSWER
Answered 2021-Dec-25 at 14:42
Streamlit sharing runs the app in a Linux environment, meaning there is no pywin32, because that package is Windows-only.
Delete pywin32 from the requirements file, and also pywinpty==1.1.6 for the same reason.
After deleting these requirements, re-deploy your app and it will work.
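More generally, requirement errors like this usually come from exporting a full local Windows environment with pip freeze. Keeping requirements.txt limited to the packages the app directly imports avoids shipping platform-specific dependencies; a minimal sketch for an app like this one, reusing a few of the pins above, might be:
streamlit==1.3.0
numpy==1.21.5
pandas==1.3.5
scikit-learn==1.0.1
matplotlib==3.5.1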
QUESTION
Sklearn: Calibrate a multi-label classification with CalibratedClassifierCV
Asked 2021-Dec-18 at 17:38
I have built a number of sklearn classifier models to perform multi-label classification, and I would like to calibrate their predict_proba outputs so that I can obtain confidence scores. I would also like to use metrics such as sklearn.metrics.recall_score to evaluate them.
I have 4 labels to predict and the true labels are multi-hot encoded (e.g. [0, 1, 1, 1]). As a result, CalibratedClassifierCV does not directly accept my data:
clf = tree.DecisionTreeClassifier(max_depth=15)
clf = clf.fit(train_X, train_Y)
calibrated_clf = CalibratedClassifierCV(clf, cv="prefit", method="sigmoid")
calibrated_clf.fit(dev_X, dev_Y)
This would return an error:
ValueError: classes [[0 1]
[0 1]
[0 1]
[0 1]] mismatch with the labels [0 1 2 3] found in the data
Thus, I tried to wrap it in a OneVsRestClassifier:
clf = OneVsRestClassifier(tree.DecisionTreeClassifier(max_depth=15), n_jobs=4)
clf = clf.fit(train_X, train_Y)
calibrated_clf = CalibratedClassifierCV(clf, cv="prefit", method="sigmoid")
calibrated_clf.fit(dev_X, dev_Y)
Note that MultiOutputClassifier and ClassifierChain do not work, even though they possibly suit my problem better.
It works, but the predict output of the calibrated classifier is multi-class instead of multi-label because of its implementation. There are four classes ([0 1 2 3]), but if there is no need to put a label, it still predicts a 0.
Upon further inspection by means of calibration curves, it turns out the base estimator wrapped inside the calibrated classifier is not calibrated at all. That is, (calibrated_clf.calibrated_classifiers_)[0].base_estimator returns the same clf as before calibration.
I would like to observe the performance of my (calibrated) models doing deterministic (predict) and probabilistic (predict_proba) predictions. How should I design my model/wrap things in other containers to get both calibrated probabilities for each label and comprehensible label predictions?
ANSWER
Answered 2021-Dec-17 at 15:33
In your example, you're using a DecisionTreeClassifier, which by default supports targets of dimension (n, m) where m > 1.
However, if you want the marginal probability of each class as the result, then use the OneVsRestClassifier.
Notice that CalibratedClassifierCV expects the target to be 1d, so the "trick" is to extend it to support multilabel classification with MultiOutputClassifier.
Full Example
from sklearn.datasets import make_multilabel_classification
from sklearn.tree import DecisionTreeClassifier
from sklearn.multiclass import OneVsRestClassifier
from sklearn.multioutput import MultiOutputClassifier
from sklearn.calibration import CalibratedClassifierCV
from sklearn.model_selection import train_test_split, StratifiedKFold
import numpy as np
# Generate a sample multilabel target
X, y = make_multilabel_classification(n_classes=4, random_state=0)
y
>>>
array([[1, 0, 1, 0],
[0, 0, 0, 0],
[1, 0, 1, 0],
...
[0, 0, 0, 0],
[0, 1, 1, 1],
[1, 1, 0, 1]])
# Split in train/test
X_train, X_test, y_train, y_test = train_test_split(
X, y, test_size=0.9, random_state=42
)
# Splits Stratify target variable
cv = StratifiedKFold(n_splits=2)
# Decision trees support multilabel targets by default; use OneVsRest for marginal probabilities
clf = OneVsRestClassifier(DecisionTreeClassifier(max_depth=10))
# Calibrate estimator probabilities
calibrated_clf = CalibratedClassifierCV(base_estimator=clf, cv=cv)
# calibrated_clf target is one dimensional, extend classifier to multi-target classification.
multioutput_clf = MultiOutputClassifier(calibrated_clf).fit(X_train, y_train)
# Check predict
multioutput_clf.predict(X_test[-5:])
>>>
array([[0, 0, 1, 1],
[0, 0, 0, 1],
[0, 0, 0, 1],
[0, 0, 0, 1],
[0, 0, 0, 1]])
# Check predict_proba
multioutput_clf.predict_proba(X_test[-5:])
>>>
[array([[0.78333315, 0.21666685],
[0.78333315, 0.21666685],
[0.78333315, 0.21666685],
[0.78333315, 0.21666685],
[0.78333315, 0.21666685]]),
array([[0.59166537, 0.40833463],
[0.59166537, 0.40833463],
[0.40833361, 0.59166639],
[0.59166537, 0.40833463],
[0.59166537, 0.40833463]]),
array([[0.61666922, 0.38333078],
[0.61666427, 0.38333573],
[0.80000098, 0.19999902],
[0.61666427, 0.38333573],
[0.61666427, 0.38333573]]),
array([[0.26874774, 0.73125226],
[0.26874774, 0.73125226],
[0.45208444, 0.54791556],
[0.26874774, 0.73125226],
[0.26874774, 0.73125226]])]
Notice that the result from predict_proba is a list with 4 arrays; array i holds the probabilities of belonging to class i. For example, the first row of the first array contains the probability that the first sample belongs to class 1, and so on.
Regarding the calibration curves, scikit-learn provides examples of plotting the probability path for two- and three-dimensional targets.
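Since the question also mentions sklearn.metrics.recall_score, here is a short sketch of scoring the multi-label predictions (assuming multioutput_clf, X_test, and y_test from the example above):
from sklearn.metrics import recall_score

y_pred = multioutput_clf.predict(X_test)
# For multilabel targets, pick an averaging strategy: "micro", "macro", or "samples"
print(recall_score(y_test, y_pred, average="macro"))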
QUESTION
understanding sklearn calibratedClassifierCV
Asked 2021-Dec-03 at 13:03
Hi all, I am having trouble understanding how to use the output of sklearn.calibration.CalibratedClassifierCV.
I have calibrated my binary classifier using this method, and the results are greatly improved. However, I am not sure how to interpret them. The sklearn guide states that, after calibration,
the output of the predict_proba method can be directly interpreted as a confidence level. For instance, a well calibrated (binary) classifier should classify the samples such that among the samples to which it gave a predict_proba value close to 0.8, approximately 80% actually belong to the positive class.
Now I would like to reduce false positives by applying a cutoff at .6 for the model to predict the label True. Without the calibration, I would have simply used my_model.predict_proba() > .6. However, it seems that after calibration the meaning of predict_proba has changed, so I am not sure if I can do that anymore.
From quick testing, it seems that predict and predict_proba follow the same logic I would expect before calibration. The output of:
pred = my_model.predict(valid_x)
proba= my_model.predict_proba(valid_x)
pd.DataFrame({"label": pred, "proba": proba[:,1]})
Everything that has a probability above .5 is classified as True, and everything below .5 as False.
Can you confirm that, after calibration, I can still use predict_proba to apply a different cutoff to identify my labels?
https://scikit-learn.org/stable/modules/calibration.html#calibration
ANSWER
Answered 2021-Dec-03 at 13:03
For me, you can actually use predict_proba() after calibration to apply a different cutoff.
What happens within the class CalibratedClassifierCV (as you noticed) is effectively that the output of predict() is based on the output of predict_proba() (see here for reference), i.e. np.argmax(self.predict_proba(X), axis=1) == self.predict(X).
On the other side, for the non-calibrated classifier that you're passing to CalibratedClassifierCV (depending on whether it is a probabilistic classifier or not), the above equality may or may not hold (e.g. it does not for an SVC() classifier - see here, for instance, for some other details on this).
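Concretely, applying the stricter cutoff from the question then amounts to thresholding the positive-class column of the calibrated probabilities (a sketch assuming the my_model and valid_x from the question):
proba = my_model.predict_proba(valid_x)
labels = proba[:, 1] > 0.6  # predict True only above the .6 cutoff, instead of the default .5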
QUESTION
Meaning of `penalty` and `loss` in LinearSVC
Asked 2021-Nov-18 at 18:08
Anti-closing preamble: I have read the question "difference between penalty and loss parameters in Sklearn LinearSVC library" but I find the answer there not to be specific enough. Therefore, I'm reformulating the question:
I am familiar with SVM theory and I'm experimenting with the LinearSVC class in Python. However, the documentation is not quite clear regarding the meaning of the penalty and loss parameters. I reckon that loss refers to the penalty for points violating the margin (usually denoted by the Greek letter xi or zeta in the objective function), while penalty is the norm of the vector determining the class boundary, usually denoted by w. Can anyone confirm or deny this?
If my guess is right, then penalty = 'l1' would lead to minimisation of the L1-norm of the vector w, like in LASSO regression. How does this relate to the maximum-margin idea of the SVM? Can anyone point me to a publication regarding this question? In the original paper describing LIBLINEAR I could not find any reference to the L1 penalty.
Also, if my guess is right, why doesn't LinearSVC support the combination of penalty='l2' and loss='hinge' (the standard combination in SVC) when dual=False? When trying it, I get the
ValueError: Unsupported set of arguments
ANSWER
Answered 2021-Nov-18 at 18:08
Though very late, I'll try to give my answer. According to the doc, here's the considered primal optimization problem for LinearSVC:

min_{w,b} (1/2) w^T w + C * sum_i max(0, 1 - y_i (w^T phi(x_i) + b))

with phi being the identity, given that LinearSVC only solves linear problems.
Effectively, this is just one of the possible problems that LinearSVC admits (it is the L2-regularized, L1-loss in the terms of the LIBLINEAR paper) and not the default one (which is the L2-regularized, L2-loss).
The LIBLINEAR paper gives a more general formulation for what's referred to as loss in Chapter 2, then it further elaborates on what's referred to as penalty within the Appendix (A2+A4).
Basically, it states that LIBLINEAR is meant to solve the following unconstrained optimization problem with different loss functions xi(w;x,y) (which are hinge and squared_hinge); the default setting of the model in LIBLINEAR does not consider the bias term, which is why you won't see any reference to b from now on (there are many posts on SO on this):

min_w (1/2) w^T w + C * sum_i xi(w; x_i, y_i)

- hinge or L1-loss: xi(w; x_i, y_i) = max(0, 1 - y_i w^T x_i)
- squared_hinge or L2-loss: xi(w; x_i, y_i) = max(0, 1 - y_i w^T x_i)^2
, basically this represents the norm of the vector w
used. The appendix elaborates on the different problems:
penalty='l2'
, loss='hinge'
): penalty='l2'
, loss='squared_hinge'
), default in LinearSVC
:penalty='l1'
, loss='squared_hinge'
):Instead, as stated within the documentation, LinearSVC
does not support the combination of penalty='l1'
and loss='hinge'
. As far as I see the paper does not specify why, but I found a possible answer here (within the answer by Arun Iyer).
Eventually, the combination of penalty='l2', loss='hinge', dual=False is not supported, as specified here (it is just not implemented in LIBLINEAR) or here; I am not sure whether that's the case, but from Appendix B onwards the LIBLINEAR paper specifies the optimization problem that is solved (which, in the case of L2-regularized, L1-loss, seems to be the dual).
For a theoretical discussion on SVC problems in general, I found that chapter really useful; it shows how the minimization of the norm of w relates to the idea of the maximum margin.
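To make the supported combinations concrete, here is a small sketch on toy data (a hypothetical example; the commented-out call is the one that raises the ValueError from the question):
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(random_state=0)

# L2-regularized, L2-loss: the default
LinearSVC(penalty="l2", loss="squared_hinge").fit(X, y)
# L2-regularized, L1-loss: only solvable in the dual
LinearSVC(penalty="l2", loss="hinge", dual=True).fit(X, y)
# L1-regularized, L2-loss: only solvable in the primal
LinearSVC(penalty="l1", loss="squared_hinge", dual=False).fit(X, y)
# Unsupported combination: raises "ValueError: Unsupported set of arguments"
# LinearSVC(penalty="l2", loss="hinge", dual=False).fit(X, y)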
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
No vulnerabilities reported
PyPI: pip install scikit-learn
HTTPS: https://github.com/scikit-learn/scikit-learn.git
CLI: gh repo clone scikit-learn/scikit-learn
SSH: git@github.com:scikit-learn/scikit-learn.git