easydict | Access dict values as attributes
kandi X-RAY | easydict Summary
- Initialize the object.
- Recursively set attributes.
- Update the attributes of a dictionary.
- Remove a key from the dict (see the usage sketch below).
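A minimal usage sketch of these four operations, assuming only that easydict is installed (the names and values below are illustrative):
from easydict import EasyDict
# Initialize the object from a plain dict; nested dicts are converted recursively.
conf = EasyDict({'model': {'depth': 50}, 'lr': 0.1})
print(conf.model.depth)  # 50 -- attribute access instead of conf['model']['depth']
# Recursively set attributes: assigning a dict wraps it in an EasyDict too.
conf.optimizer = {'name': 'sgd', 'momentum': 0.9}
print(conf.optimizer.name)  # sgd
# Update the attributes from another dictionary.
conf.update({'lr': 0.01})
# Remove a key from the dict.
conf.pop('lr')
print('lr' in conf)  # False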
easydict Key Features
easydict Examples and Code Snippets
parser = argparse.ArgumentParser()
parser.add_argument('--batch_size', default=100, type=int, help='batch size')
parser.add_argument('--train_steps', default=1000, type=int,
help='number of training steps')
args = parser.parse_args(argv[1:])
args = easydict.EasyDict({
"batch_size": 100,
"train_steps": 1000
})
function(batch_size=100, train_steps=1000)
args = (100, 1000)
function(*args)
args = {"batch_size": 100,
"train_steps": 1000}
function(**args)
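A short sketch (not part of the original snippets) showing that the EasyDict built above can be read with the same attribute syntax as an argparse Namespace, while still behaving as a plain dict:
import easydict
args = easydict.EasyDict({'batch_size': 100, 'train_steps': 1000})
print(args.batch_size)      # 100 -- attribute access, like argparse's args.batch_size
print(args['train_steps'])  # 1000 -- key access still works, since EasyDict subclasses dict
# It can also be unpacked into a call, e.g. function(**args), as in the snippet above.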
yaml.load(fo.read(), Loader=yaml.FullLoader)
sudo apt update
sudo apt install ffmpeg imagemagick
def __setattr__(self, name, value):
    if isinstance(value, (list, tuple)):
        value = [self.__class__(x)
                 if isinstance(x, dict) else x for x in value]
    elif isinstance(value, dict) and not isinstance(value, self.__class__):
        value = self.__class__(value)
conf = EasyDict()
conf.name = 'CD Group'
conf.name2 = ['CD Group']
print(conf.name , conf.name2, sep='\n')
>>
CD Group
['CD Group']
# Dataset.
desc += '-dataset'; dataset = EasyDict(tfrecord_dir='dataset', resolution=128); train.mirror_augment = False
conda install easydict -c conda-forge
pytorch 1.5.1 py3.6_cpu_0 [cpuonly] pytorch
Trending Discussions on easydict
QUESTION
I use the easydict library to set configuration options, but when I store a configuration value as a string, it turns the string into a list containing that string, as follows:
from easydict import EasyDict
conf = EasyDict()
conf.name = 'CD Group'
print(conf.name)
> ['CD Group'] # a list
But what I hope to achieve is:
> CD Group # string type
Why is it like this? Thanks for any feedback!
An interesting point is that when I pass the configuration from argparse into the EasyDict, the string type is preserved without any problem, as follows:
conf.file_name = args.file_name # --file_name input.xlsx
print(conf.file_name)
> input.xlsx
ANSWER
Answered 2022-Feb-13 at 03:44
Looking at __setattr__:
def __setattr__(self, name, value):
    if isinstance(value, (list, tuple)):
        value = [self.__class__(x)
                 if isinstance(x, dict) else x for x in value]
    elif isinstance(value, dict) and not isinstance(value, self.__class__):
        value = self.__class__(value)
As you can see in the __setattr__ method, if you assign a list or tuple, any dict elements inside it are converted and you still get a list back; otherwise the value keeps its original type.
I also can't reproduce the problem you are describing:
conf = EasyDict()
conf.name = 'CD Group'
conf.name2 = ['CD Group']
print(conf.name , conf.name2, sep='\n')
output:
>>
CD Group
['CD Group']
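To illustrate the rule described in this answer, a small illustrative sketch (not from the original answer; the values are made up): dict elements inside an assigned list are converted to EasyDict, while strings and other scalars are stored unchanged:
from easydict import EasyDict
conf = EasyDict()
conf.name = 'CD Group'                # a plain string stays a string
print(type(conf.name))                # <class 'str'>
conf.members = [{'id': 1}, 'guest']   # a list stays a list; dicts inside it are converted
print(type(conf.members))             # <class 'list'>
print(type(conf.members[0]))          # <class 'easydict.EasyDict'>
print(conf.members[0].id)             # 1 -- attribute access on the converted element
print(type(conf.members[1]))          # <class 'str'>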
QUESTION
I am trying to run this GitHub repo: https://github.com/HowieMa/DeepSORT_YOLOv5_Pytorch after installing the requirements via pip install -r requirements.txt. I am running this in a Python 3.8 virtual environment on a DJI Manifold 2-G, which runs on an NVIDIA Jetson TX2.
The following is the terminal output.
$ python main.py --cam 0 --display
Namespace(agnostic_nms=False, augment=False, cam=0, classes=[0], conf_thres=0.5, config_deepsort='./configs/deep_sort.yaml', device='', display=True, display_height=600, display_width=800, fourcc='mp4v', frame_interval=2, img_size=640, input_path='input_480.mp4', iou_thres=0.5, save_path='output/', save_txt='output/predict/', weights='yolov5/weights/yolov5s.pt')
Initialize DeepSORT & YOLO-V5
Using CPU
Using webcam 0
Traceback (most recent call last):
File "main.py", line 259, in
with VideoTracker(args) as vdo_trk:
File "main.py", line 53, in __init__
cfg.merge_from_file(args.config_deepsort)
File "/home/dji/Desktop/targetTrackers/howieMa/DeepSORT_YOLOv5_Pytorch/utils_ds/parser.py", line 23, in merge_from_file
self.update(yaml.load(fo.read()))
TypeError: load() missing 1 required positional argument: 'Loader'
I have found some suggestions on GitHub, such as in TypeError: load() missing 1 required positional argument: 'Loader' in Google Colab, which suggests changing yaml.load to yaml.safe_load.
This is the code block to modify:
class YamlParser(edict):
    """
    This is yaml parser based on EasyDict.
    """
    def __init__(self, cfg_dict=None, config_file=None):
        if cfg_dict is None:
            cfg_dict = {}
        if config_file is not None:
            assert(os.path.isfile(config_file))
            with open(config_file, 'r') as fo:
                cfg_dict.update(yaml.load(fo.read()))
        super(YamlParser, self).__init__(cfg_dict)

    def merge_from_file(self, config_file):
        with open(config_file, 'r') as fo:
            self.update(yaml.load(fo.read()))

    def merge_from_dict(self, config_dict):
        self.update(config_dict)
However, changing yaml.load into yaml.safe_load leads me to this error instead
$ python main.py --cam 0 --display
Namespace(agnostic_nms=False, augment=False, cam=0, classes=[0], conf_thres=0.5, config_deepsort='./configs/deep_sort.yaml', device='', display=True, display_height=600, display_width=800, fourcc='mp4v', frame_interval=2, img_size=640, input_path='input_480.mp4', iou_thres=0.5, save_path='output/', save_txt='output/predict/', weights='yolov5/weights/yolov5s.pt')
Initialize DeepSORT & YOLO-V5
Using CPU
Using webcam 0
Done..
Camera ...
Done. Create output file output/results.mp4
Illegal instruction (core dumped)
Has anyone encountered anything similar? Thank you!
ANSWER
Answered 2021-Nov-11 at 05:39
Try this:
yaml.load(fo.read(), Loader=yaml.FullLoader)
It seems that pyyaml>=5.1 requires a Loader argument.
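As a sketch of that fix against the YamlParser class quoted above (only the changed method is shown; the rest of the class stays as in the question), the call can pass a Loader explicitly, or use yaml.safe_load for ordinary config files:
import yaml
from easydict import EasyDict as edict

class YamlParser(edict):
    """YAML parser based on EasyDict, mirroring the snippet in the question."""
    def merge_from_file(self, config_file):
        with open(config_file, 'r') as fo:
            # Passing a Loader satisfies pyyaml>=5.1; yaml.safe_load(fo.read())
            # is an equivalent, safer alternative for plain config files.
            self.update(yaml.load(fo.read(), Loader=yaml.FullLoader))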
QUESTION
I downloaded a requirements.txt file from a GitHub repository, but it appears to be a little different than the normal format of a requirements.txt file.
- Can you tell me how the author generated this kind of requirements.txt file? Which tools did they use?
- How can I use this particular file format to instantiate the Python environment? I have tried executing the commands conda install --file requirements.txt and pip install -r requirements.txt on a Windows machine, but to no avail.
https://github.com/wvangansbeke/Unsupervised-Classification/blob/master/requirements.txt
""" This file contains a list of packages and their versions that were used to produce the results. """
- _libgcc_mutex=0.1=main
- blas=1.0=mkl
- bzip2=1.0.8=h7b6447c_0
- ca-certificates=2020.1.1=0
- cairo=1.14.12=h8948797_3
- certifi=2020.4.5.1=py37_0
- cffi=1.14.0=py37h2e261b9_0
- cmake=3.14.0=h52cb24c_0
- cudatoolkit=10.0.130=0
- cycler=0.10.0=py37_0
- dbus=1.13.12=h746ee38_0
- easydict=1.9=py_0
- expat=2.2.6=he6710b0_0
- faiss-gpu=1.6.3=py37h1a5d453_0
- ffmpeg=4.0=hcdf2ecd_0
- fontconfig=2.13.0=h9420a91_0
- freeglut=3.0.0=hf484d3e_5
- freetype=2.9.1=h8a8886c_1
- glib=2.63.1=h5a9c865_0
- graphite2=1.3.13=h23475e2_0
- gst-plugins-base=1.14.0=hbbd80ab_1
- gstreamer=1.14.0=hb453b48_1
- h5py=2.8.0=py37h989c5e5_3
- harfbuzz=1.8.8=hffaf4a1_0
- hdf5=1.10.2=hba1933b_1
- icu=58.2=h9c2bf20_1
- imageio=2.8.0=py_0
- intel-openmp=2020.0=166
- jasper=2.0.14=h07fcdf6_1
- joblib=0.14.1=py_0
- jpeg=9b=h024ee3a_2
- kiwisolver=1.1.0=py37he6710b0_0
- krb5=1.17.1=h173b8e3_0
- ld_impl_linux-64=2.33.1=h53a641e_7
- libcurl=7.69.1=h20c2e04_0
- libedit=3.1.20181209=hc058e9b_0
- libffi=3.2.1=hd88cf55_4
- libgcc-ng=9.1.0=hdf63c60_0
- libgfortran-ng=7.3.0=hdf63c60_0
- libglu=9.0.0=hf484d3e_1
- libopencv=3.4.2=hb342d67_1
- libopus=1.3.1=h7b6447c_0
- libpng=1.6.37=hbc83047_0
- libprotobuf=3.11.4=hd408876_0
- libssh2=1.9.0=h1ba5d50_1
- libstdcxx-ng=9.1.0=hdf63c60_0
- libtiff=4.1.0=h2733197_0
- libuuid=1.0.3=h1bed415_2
- libvpx=1.7.0=h439df22_0
- libxcb=1.13=h1bed415_1
- libxml2=2.9.9=hea5a465_1
- matplotlib=3.1.3=py37_0
- matplotlib-base=3.1.3=py37hef1b27d_0
- mkl=2020.0=166
- mkl-service=2.3.0=py37he904b0f_0
- mkl_fft=1.0.15=py37ha843d7b_0
- mkl_random=1.1.0=py37hd6b4f25_0
- ncurses=6.2=he6710b0_0
- ninja=1.9.0=py37hfd86e86_0
- numpy=1.18.1=py37h4f9e942_0
- numpy-base=1.18.1=py37hde5b4d6_1
- olefile=0.46=py_0
- opencv=3.4.2=py37h6fd60c2_1
- openssl=1.1.1g=h7b6447c_0
- pcre=8.43=he6710b0_0
- pillow=7.0.0=py37hb39fc2d_0
- pip=20.0.2=py37_1
- pixman=0.38.0=h7b6447c_0
- protobuf=3.11.4=py37he6710b0_0
- py-opencv=3.4.2=py37hb342d67_1
- pycparser=2.20=py_0
- pyparsing=2.4.6=py_0
- pyqt=5.9.2=py37h05f1152_2
- python=3.7.7=hcf32534_0_cpython
- python-dateutil=2.8.1=py_0
- pytorch=1.4.0=py3.7_cuda10.0.130_cudnn7.6.3_0
- pyyaml=5.3.1=py37h7b6447c_0
- qt=5.9.7=h5867ecd_1
- readline=8.0=h7b6447c_0
- rhash=1.3.8=h1ba5d50_0
- scikit-learn=0.22.1=py37hd81dba3_0
- scipy=1.4.1=py37h0b6359f_0
- setuptools=46.1.3=py37_0
- sip=4.19.8=py37hf484d3e_0
- six=1.14.0=py37_0
- sqlite=3.31.1=h7b6447c_0
- swig=3.0.12=h38cdd7d_3
- tensorboardx=2.0=py_0
- termcolor=1.1.0=py37_1
- tk=8.6.8=hbc83047_0
- torchvision=0.5.0=py37_cu100
- tornado=6.0.4=py37h7b6447c_1
- typing=3.6.4=py37_0
- wheel=0.34.2=py37_0
- xz=5.2.4=h14c3975_4
- yaml=0.1.7=had09818_2
- zlib=1.2.11=h7b6447c_3
- zstd=1.3.7=h0b5b093_0
- pip:
- blis==0.4.1
- catalogue==1.0.0
- chardet==3.0.4
- cymem==2.0.3
- en-core-web-sm==2.2.5
- idna==2.9
- importlib-metadata==1.6.0
- murmurhash==1.0.2
- plac==1.1.3
- preshed==3.0.2
- requests==2.23.0
- spacy==2.2.4
- srsly==1.0.2
- thinc==7.4.0
- tqdm==4.45.0
- urllib3==1.25.8
- wasabi==0.6.0
- zipp==3.1.0
ANSWER
Answered 2021-Oct-17 at 01:46
This looks like a conda environment.yml file. It can be used to create a conda environment, like so:
conda env create --file requirements.txt
QUESTION
data source: https://catalog.data.gov/dataset/nyc-transit-subway-entrance-and-exit-data
I tried looking for a similar problem, but I can't find an answer and the error does not help much. I'm kind of frustrated at this point. Thanks for the help. I'm calculating the closest distance from a point.
df_subway = pd.read_csv('/content/drive/MyDrive/Despliegue_de_modelos/NYC_Transit_Subway_Entrance_And_Exit_Data.csv')
geometry = [Point(xy) for xy in zip(df_subway['Station Longitude'], df_subway['Station Latitude'])]
# Coordinate reference system :
crs = {'init': 'EPSG:4326'}
# Creating a Geographic data frame
gdf_subway_entrance_geometry = gpd.GeoDataFrame(df_subway, crs=crs, geometry=geometry).to_crs('EPSG:5234')
gdf_subway_entrance_geometry
df_yes_entry = gdf_subway_entrance_geometry[gdf_subway_entrance_geometry.Entry=='YES']
df_yes_entry
from shapely.geometry import Point, MultiPoint
from shapely.ops import nearest_points
pts = MultiPoint(df_yes_entry['geometry']) #it fails in this line
pt = Point(gpdPoint.x, gpdPoint.y)
#[o.wkt for o in nearest_points(pt, pts)]
for o in nearest_points(pt, pts):
print(o)
The problem is that if I do the same with gdf_subway_entrance_geometry instead of df_yes_entry it works, but I need to apply some filters first!
This is the error:
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
/usr/local/lib/python3.7/dist-packages/pandas/core/indexes/base.py in get_loc(self, key, method, tolerance)
2897 try:
-> 2898 return self._engine.get_loc(casted_key)
2899 except KeyError as err:
pandas/_libs/index.pyx in pandas._libs.index.IndexEngine.get_loc()
pandas/_libs/index.pyx in pandas._libs.index.IndexEngine.get_loc()
pandas/_libs/hashtable_class_helper.pxi in pandas._libs.hashtable.Int64HashTable.get_item()
pandas/_libs/hashtable_class_helper.pxi in pandas._libs.hashtable.Int64HashTable.get_item()
KeyError: 13
The above exception was the direct cause of the following exception:
KeyError Traceback (most recent call last)
7 frames
in ()
1 from shapely.geometry import Point, MultiPoint
2 from shapely.ops import nearest_points
----> 3 pts = MultiPoint(df_yes_entry['geometry'])
4 pt = Point(gpdPoint.x, gpdPoint.y)
5 #[o.wkt for o in nearest_points(pt, pts)]
/usr/local/lib/python3.7/dist-packages/shapely/geometry/multipoint.py in __init__(self, points)
56 pass
57 else:
---> 58 self._geom, self._ndim = geos_multipoint_from_py(points)
59
60 def shape_factory(self, *args):
/usr/local/lib/python3.7/dist-packages/shapely/geometry/multipoint.py in geos_multipoint_from_py(ob)
169 # add to coordinate sequence
170 for i in range(m):
--> 171 coords = ob[i]
172 geom, ndims = point.geos_point_from_py(coords)
173
/usr/local/lib/python3.7/dist-packages/geopandas/geoseries.py in __getitem__(self, key)
606
607 def __getitem__(self, key):
--> 608 return self._wrapped_pandas_method("__getitem__", key)
609
610 @doc(pd.Series)
/usr/local/lib/python3.7/dist-packages/geopandas/geoseries.py in _wrapped_pandas_method(self, mtd, *args, **kwargs)
599 def _wrapped_pandas_method(self, mtd, *args, **kwargs):
600 """Wrap a generic pandas method to ensure it returns a GeoSeries"""
--> 601 val = getattr(super(), mtd)(*args, **kwargs)
602 if type(val) == Series:
603 val.__class__ = GeoSeries
/usr/local/lib/python3.7/dist-packages/pandas/core/series.py in __getitem__(self, key)
880
881 elif key_is_scalar:
--> 882 return self._get_value(key)
883
884 if is_hashable(key):
/usr/local/lib/python3.7/dist-packages/pandas/core/series.py in _get_value(self, label, takeable)
988
989 # Similar to Index.get_value, but we do not fall back to positional
--> 990 loc = self.index.get_loc(label)
991 return self.index._get_values_for_loc(self, loc, label)
992
/usr/local/lib/python3.7/dist-packages/pandas/core/indexes/base.py in get_loc(self, key, method, tolerance)
2898 return self._engine.get_loc(casted_key)
2899 except KeyError as err:
-> 2900 raise KeyError(key) from err
2901
2902 if tolerance is not None:
KeyError: 13
I am working with Colab; these are my packages:
Package Version
----------------------------- --------------
absl-py 0.12.0
alabaster 0.7.12
albumentations 0.1.12
altair 4.1.0
appdirs 1.4.4
argcomplete 1.12.3
argon2-cffi 21.1.0
arviz 0.11.4
astor 0.8.1
astropy 4.3.1
astunparse 1.6.3
atari-py 0.2.9
atomicwrites 1.4.0
attrs 21.2.0
audioread 2.1.9
autograd 1.3
Babel 2.9.1
backcall 0.2.0
beautifulsoup4 4.6.3
bleach 4.1.0
blis 0.4.1
bokeh 2.4.0
Bottleneck 1.3.2
branca 0.4.2
bs4 0.0.1
CacheControl 0.12.6
cached-property 1.5.2
cachetools 4.2.4
Cartopy 0.19.0.post1
catalogue 1.0.0
certifi 2021.5.30
cffi 1.14.6
cftime 1.5.1
chardet 3.0.4
charset-normalizer 2.0.6
clang 5.0
click 7.1.2
click-plugins 1.1.1
cligj 0.7.2
cloudpickle 1.3.0
cmake 3.12.0
cmdstanpy 0.9.5
colorcet 2.0.6
colorlover 0.3.0
community 1.0.0b1
contextlib2 0.5.5
convertdate 2.3.2
coverage 3.7.1
coveralls 0.5
crcmod 1.7
cufflinks 0.17.3
cvxopt 1.2.7
cvxpy 1.0.31
cycler 0.10.0
cymem 2.0.5
Cython 0.29.24
daft 0.0.4
dask 2.12.0
datascience 0.10.6
debugpy 1.0.0
decorator 4.4.2
defusedxml 0.7.1
descartes 1.1.0
dill 0.3.4
distributed 1.25.3
dlib 19.18.0
dm-tree 0.1.6
docopt 0.6.2
docutils 0.17.1
dopamine-rl 1.0.5
earthengine-api 0.1.284
easydict 1.9
ecos 2.0.7.post1
editdistance 0.5.3
en-core-web-sm 2.2.5
entrypoints 0.3
ephem 4.1
et-xmlfile 1.1.0
fa2 0.3.5
fastai 1.0.61
fastdtw 0.3.4
fastprogress 1.0.0
fastrlock 0.6
fbprophet 0.7.1
feather-format 0.4.1
filelock 3.3.0
Fiona 1.8.20
firebase-admin 4.4.0
fix-yahoo-finance 0.0.22
Flask 1.1.4
flatbuffers 1.12
folium 0.8.3
future 0.16.0
gast 0.4.0
GDAL 2.2.2
gdown 3.6.4
gensim 3.6.0
geographiclib 1.52
geopandas 0.10.1
geopy 1.17.0
geoviews 1.9.2
gin-config 0.4.0
glob2 0.7
google 2.0.3
google-api-core 1.26.3
google-api-python-client 1.12.8
google-auth 1.35.0
google-auth-httplib2 0.0.4
google-auth-oauthlib 0.4.6
google-cloud-bigquery 1.21.0
google-cloud-bigquery-storage 1.1.0
google-cloud-core 1.0.3
google-cloud-datastore 1.8.0
google-cloud-firestore 1.7.0
google-cloud-language 1.2.0
google-cloud-storage 1.18.1
google-cloud-translate 1.5.0
google-colab 1.0.0
google-pasta 0.2.0
google-resumable-media 0.4.1
googleapis-common-protos 1.53.0
googledrivedownloader 0.4
graphviz 0.10.1
greenlet 1.1.2
grpcio 1.41.0
gspread 3.0.1
gspread-dataframe 3.0.8
gym 0.17.3
h5py 3.1.0
HeapDict 1.0.1
hijri-converter 2.2.2
holidays 0.10.5.2
holoviews 1.14.6
html5lib 1.0.1
htmlmin 0.1.12
httpimport 0.5.18
httplib2 0.17.4
httplib2shim 0.0.3
humanize 0.5.1
hyperopt 0.1.2
ideep4py 2.0.0.post3
idna 2.10
ImageHash 4.2.1
imageio 2.4.1
imagesize 1.2.0
imbalanced-learn 0.4.3
imblearn 0.0
imgaug 0.2.9
importlib-metadata 4.8.1
importlib-resources 5.2.2
imutils 0.5.4
inflect 2.1.0
iniconfig 1.1.1
intel-openmp 2021.4.0
intervaltree 2.1.0
ipykernel 4.10.1
ipython 5.5.0
ipython-genutils 0.2.0
ipython-sql 0.3.9
ipywidgets 7.6.5
itsdangerous 1.1.0
jax 0.2.21
jaxlib 0.1.71+cuda111
jdcal 1.4.1
jedi 0.18.0
jieba 0.42.1
Jinja2 2.11.3
joblib 1.0.1
jpeg4py 0.1.4
jsonschema 2.6.0
jupyter 1.0.0
jupyter-client 5.3.5
jupyter-console 5.2.0
jupyter-core 4.8.1
jupyterlab-pygments 0.1.2
jupyterlab-widgets 1.0.2
kaggle 1.5.12
kapre 0.3.5
keras 2.6.0
Keras-Preprocessing 1.1.2
keras-vis 0.4.1
kiwisolver 1.3.2
korean-lunar-calendar 0.2.1
librosa 0.8.1
lightgbm 2.2.3
llvmlite 0.34.0
lmdb 0.99
LunarCalendar 0.0.9
lxml 4.2.6
Markdown 3.3.4
MarkupSafe 2.0.1
matplotlib 3.2.2
matplotlib-inline 0.1.3
matplotlib-venn 0.11.6
missingno 0.5.0
mistune 0.8.4
mizani 0.6.0
mkl 2019.0
mlxtend 0.14.0
more-itertools 8.10.0
moviepy 0.2.3.5
mpmath 1.2.1
msgpack 1.0.2
multimethod 1.6
multiprocess 0.70.12.2
multitasking 0.0.9
munch 2.5.0
murmurhash 1.0.5
music21 5.5.0
natsort 5.5.0
nbclient 0.5.4
nbconvert 5.6.1
nbformat 5.1.3
nest-asyncio 1.5.1
netCDF4 1.5.7
networkx 2.6.3
nibabel 3.0.2
nltk 3.2.5
notebook 5.3.1
numba 0.51.2
numexpr 2.7.3
numpy 1.19.5
nvidia-ml-py3 7.352.0
oauth2client 4.1.3
oauthlib 3.1.1
okgrade 0.4.3
opencv-contrib-python 4.1.2.30
opencv-python 4.1.2.30
openpyxl 2.5.9
opt-einsum 3.3.0
osqp 0.6.2.post0
packaging 21.0
palettable 3.3.0
pandas 1.1.5
pandas-datareader 0.9.0
pandas-gbq 0.13.3
pandas-profiling 3.1.0
pandocfilters 1.5.0
panel 0.12.4
param 1.11.1
parso 0.8.2
pathlib 1.0.1
patsy 0.5.2
pep517 0.11.0
pexpect 4.8.0
phik 0.12.0
pickleshare 0.7.5
Pillow 7.1.2
pip 21.1.3
pip-tools 6.2.0
plac 1.1.3
plotly 4.4.1
plotnine 0.6.0
pluggy 0.7.1
pooch 1.5.1
portpicker 1.3.9
prefetch-generator 1.0.1
preshed 3.0.5
prettytable 2.2.1
progressbar2 3.38.0
prometheus-client 0.11.0
promise 2.3
prompt-toolkit 1.0.18
protobuf 3.17.3
psutil 5.4.8
psycopg2 2.7.6.1
ptyprocess 0.7.0
py 1.10.0
pyarrow 3.0.0
pyasn1 0.4.8
pyasn1-modules 0.2.8
pycocotools 2.0.2
pycparser 2.20
pyct 0.4.8
pydantic 1.8.2
pydata-google-auth 1.2.0
pydot 1.3.0
pydot-ng 2.0.0
pydotplus 2.0.2
PyDrive 1.3.1
pyemd 0.5.1
pyerfa 2.0.0
pyglet 1.5.0
Pygments 2.6.1
pygobject 3.26.1
pymc3 3.11.4
PyMeeus 0.5.11
pymongo 3.12.0
pymystem3 0.2.0
PyOpenGL 3.1.5
pyparsing 2.4.7
pyproj 3.2.1
pyrsistent 0.18.0
pyshp 2.1.3
pysndfile 1.3.8
PySocks 1.7.1
pystan 2.19.1.1
pytest 3.6.4
python-apt 0.0.0
python-chess 0.23.11
python-dateutil 2.8.2
python-louvain 0.15
python-slugify 5.0.2
python-utils 2.5.6
pytz 2018.9
pyviz-comms 2.1.0
PyWavelets 1.1.1
PyYAML 5.4.1
pyzmq 22.3.0
qdldl 0.1.5.post0
qtconsole 5.1.1
QtPy 1.11.2
regex 2019.12.20
requests 2.26.0
requests-oauthlib 1.3.0
resampy 0.2.2
retrying 1.3.3
rpy2 3.4.5
rsa 4.7.2
scikit-image 0.16.2
scikit-learn 0.22.2.post1
scipy 1.7.1
screen-resolution-extra 0.0.0
scs 2.1.4
seaborn 0.11.2
semver 2.13.0
Send2Trash 1.8.0
setuptools 57.4.0
setuptools-git 1.2
Shapely 1.7.1
simplegeneric 0.8.1
six 1.15.0
sklearn 0.0
sklearn-pandas 1.8.0
smart-open 5.2.1
snowballstemmer 2.1.0
sortedcontainers 2.4.0
SoundFile 0.10.3.post1
spacy 2.2.4
Sphinx 1.8.5
sphinxcontrib-serializinghtml 1.1.5
sphinxcontrib-websupport 1.2.4
SQLAlchemy 1.4.25
sqlparse 0.4.2
srsly 1.0.5
statsmodels 0.10.2
sympy 1.7.1
tables 3.4.4
tabulate 0.8.9
tangled-up-in-unicode 0.1.0
tblib 1.7.0
tensorboard 2.6.0
tensorboard-data-server 0.6.1
tensorboard-plugin-wit 1.8.0
tensorflow 2.6.0
tensorflow-datasets 4.0.1
tensorflow-estimator 2.6.0
tensorflow-gcs-config 2.6.0
tensorflow-hub 0.12.0
tensorflow-metadata 1.2.0
tensorflow-probability 0.14.1
termcolor 1.1.0
terminado 0.12.1
testpath 0.5.0
text-unidecode 1.3
textblob 0.15.3
Theano-PyMC 1.1.2
thinc 7.4.0
tifffile 2021.8.30
toml 0.10.2
tomli 1.2.1
toolz 0.11.1
torch 1.9.0+cu111
torchsummary 1.5.1
torchtext 0.10.0
torchvision 0.10.0+cu111
tornado 5.1.1
tqdm 4.62.3
traitlets 5.1.0
tweepy 3.10.0
typeguard 2.7.1
typing-extensions 3.10.0.2
tzlocal 1.5.1
uritemplate 3.0.1
urllib3 1.24.3
vega-datasets 0.9.0
visions 0.7.4
wasabi 0.8.2
wcwidth 0.2.5
webencodings 0.5.1
Werkzeug 1.0.1
wheel 0.37.0
widgetsnbextension 3.5.1
wordcloud 1.5.0
wrapt 1.12.1
xarray 0.18.2
xgboost 0.90
xkit 0.0.0
xlrd 1.1.0
xlwt 1.3.0
yellowbrick 0.9.1
zict 2.0.0
zipp 3.6.0
ANSWER
Answered 2021-Oct-11 at 14:21 (geopandas 0.10.1)
- Have noted that your data is on Kaggle, so start by sourcing it.
- There really is only one issue: the shapely.geometry.MultiPoint() constructor does not work with a filtered GeoSeries. Pass it a numpy array instead and it works.
- Full code below; a point has been randomly selected to serve as gpdPoint.
# https://www.kaggle.com/new-york-state/nys-nyc-transit-subway-entrance-and-exit-data
import kaggle.cli
import sys, requests, urllib
import pandas as pd
from pathlib import Path
from zipfile import ZipFile
# fmt: off
# download data set
url = "https://www.kaggle.com/new-york-state/nys-nyc-transit-subway-entrance-and-exit-data"
sys.argv = [sys.argv[0]] + f"datasets download {urllib.parse.urlparse(url).path[1:]}".split(" ")
kaggle.cli.main()
zfile = ZipFile(f'{urllib.parse.urlparse(url).path.split("/")[-1]}.zip')
dfs = {f.filename: pd.read_csv(zfile.open(f)) for f in zfile.infolist() if Path(f.filename).suffix in [".csv"]}
# fmt: on
df_subway = dfs['nyc-transit-subway-entrance-and-exit-data.csv']
from shapely.geometry import Point, MultiPoint
from shapely.ops import nearest_points
import geopandas as gpd
geometry = [Point(xy) for xy in zip(df_subway['Station Longitude'], df_subway['Station Latitude'])]
# Coordinate reference system :
crs = {'init': 'EPSG:4326'}
# Creating a Geographic data frame
gdf_subway_entrance_geometry = gpd.GeoDataFrame(df_subway, crs=crs, geometry=geometry).to_crs('EPSG:5234')
gdf_subway_entrance_geometry
df_yes_entry = gdf_subway_entrance_geometry
df_yes_entry = gdf_subway_entrance_geometry[gdf_subway_entrance_geometry.Entry=='YES']
df_yes_entry
# randomly select a point....
gpdPoint = gdf_subway_entrance_geometry.sample(1).geometry.tolist()[0]
pts = MultiPoint(df_yes_entry['geometry'].values) # does not work with a geopandas series, works with a numpy array
pt = Point(gpdPoint.x, gpdPoint.y)
#[o.wkt for o in nearest_points(pt, pts)]
for o in nearest_points(pt, pts):
print(o)
QUESTION
Having trouble with CUDA + PyTorch; this is the error. I reinstalled CUDA and cuDNN multiple times.
The conda env is detecting the GPU, but it's giving errors with PyTorch and certain CUDA libraries. I tried with CUDA 10.1 and 10.0, and cuDNN versions 8 and 7.6.5, added CUDA to the PATH and everything.
However, Anaconda is showing that CUDA toolkit 9.0 is installed, whilst I clearly installed 10.0, so I am not entirely sure what the deal is with that.
=> loading model from models/pytorch/pose_coco/pose_hrnet_w32_256x192.pth
Traceback (most recent call last):
File "hydroman2.py", line 580, in
pose_model.load_state_dict(torch.load(cfg.TEST.MODEL_FILE), strict=False)
File "C:\Users\Fardin\anaconda3\envs\myenv\lib\site-packages\torch\serialization.py", line 593, in load
return _legacy_load(opened_file, map_location, pickle_module, **pickle_load_args)
File "C:\Users\Fardin\anaconda3\envs\myenv\lib\site-packages\torch\serialization.py", line 773, in _legacy_load
result = unpickler.load()
File "C:\Users\Fardin\anaconda3\envs\myenv\lib\site-packages\torch\serialization.py", line 729, in persistent_load
deserialized_objects[root_key] = restore_location(obj, location)
File "C:\Users\Fardin\anaconda3\envs\myenv\lib\site-packages\torch\serialization.py", line 178, in default_restore_location
result = fn(storage, location)
File "C:\Users\Fardin\anaconda3\envs\myenv\lib\site-packages\torch\serialization.py", line 154, in _cuda_deserialize
device = validate_cuda_device(location)
File "C:\Users\Fardin\anaconda3\envs\myenv\lib\site-packages\torch\serialization.py", line 138, in validate_cuda_device
raise RuntimeError('Attempting to deserialize object on a CUDA '
RuntimeError: Attempting to deserialize object on a CUDA device but torch.cuda.is_available() is False. If you are running on a CPU-only machine, please use torch.load with map_location=torch.device('cpu') to map your storages to the CPU.
System info
System info:
--------------------------------------------------------------------------------
__Time Stamp__
Report started (local time) : 2021-03-19 19:59:06.957967
UTC start time : 2021-03-19 15:59:06.957967
Running time (s) : 4.003899
__Hardware Information__
Machine : AMD64
CPU Name : znver1
CPU Count : 12
Number of accessible CPUs : 12
List of accessible CPUs cores : 0 1 2 3 4 5 6 7 8 9 10 11
CFS Restrictions (CPUs worth of runtime) : None
CPU Features : 64bit adx aes avx avx2 bmi bmi2
clflushopt clzero cmov cx16 cx8
f16c fma fsgsbase fxsr lzcnt mmx
movbe mwaitx pclmul popcnt prfchw
rdrnd rdseed sahf sha sse sse2
sse3 sse4.1 sse4.2 sse4a ssse3
xsave xsavec xsaveopt xsaves
Memory Total (MB) : 16334
Memory Available (MB) : 8787
__OS Information__
Platform Name : Windows-10-10.0.19041-SP0
Platform Release : 10
OS Name : Windows
OS Version : 10.0.19041
OS Specific Version : 10 10.0.19041 SP0 Multiprocessor Free
Libc Version : ?
__Python Information__
Python Compiler : MSC v.1916 64 bit (AMD64)
Python Implementation : CPython
Python Version : 3.8.5
Python Locale : en_US.cp1252
__LLVM Information__
LLVM Version : 10.0.1
__CUDA Information__
CUDA Device Initialized : True
CUDA Driver Version : 11020
CUDA Detect Output:
Found 1 CUDA devices
id 0 b'GeForce GTX 1070' [SUPPORTED]
compute capability: 6.1
pci device id: 0
pci bus id: 6
Summary:
1/1 devices are supported
CUDA Librairies Test Output:
Finding cublas from
named cublas.dll
trying to open library... ERROR: failed to open cublas:
Could not find module 'cublas.dll' (or one of its dependencies). Try using the full path with constructor syntax.
Finding cusparse from
named cusparse.dll
trying to open library... ERROR: failed to open cusparse:
Could not find module 'cusparse.dll' (or one of its dependencies). Try using the full path with constructor syntax.
Finding cufft from
named cufft.dll
trying to open library... ERROR: failed to open cufft:
Could not find module 'cufft.dll' (or one of its dependencies). Try using the full path with constructor syntax.
Finding curand from
named curand.dll
trying to open library... ERROR: failed to open curand:
Could not find module 'curand.dll' (or one of its dependencies). Try using the full path with constructor syntax.
Finding nvvm from
named nvvm.dll
trying to open library... ERROR: failed to open nvvm:
Could not find module 'nvvm.dll' (or one of its dependencies). Try using the full path with constructor syntax.
Finding cudart from
named cudart.dll
trying to open library... ERROR: failed to open cudart:
Could not find module 'cudart.dll' (or one of its dependencies). Try using the full path with constructor syntax.
Finding libdevice from
searching for compute_20... ERROR: can't open libdevice for compute_20
searching for compute_30... ERROR: can't open libdevice for compute_30
searching for compute_35... ERROR: can't open libdevice for compute_35
searching for compute_50... ERROR: can't open libdevice for compute_50
__ROC information__
ROC Available : False
ROC Toolchains : None
HSA Agents Count : 0
HSA Agents:
None
HSA Discrete GPUs Count : 0
HSA Discrete GPUs : None
__SVML Information__
SVML State, config.USING_SVML : True
SVML Library Loaded : True
llvmlite Using SVML Patched LLVM : True
SVML Operational : True
__Threading Layer Information__
TBB Threading Layer Available : False
+--> Disabled due to Unknown import problem.
OpenMP Threading Layer Available : True
+-->Vendor: MS
Workqueue Threading Layer Available : True
+-->Workqueue imported successfully.
__Numba Environment Variable Information__
None found.
__Conda Information__
Conda Build : 3.20.5
Conda Env : 4.9.2
Conda Platform : win-64
Conda Python Version : 3.8.5.final.0
Conda Root Writable : True
__Installed Packages__
_pytorch_select 1.1.0 cpu anaconda
_tflow_select 2.3.0 mkl anaconda
absl-py 0.12.0 pypi_0 pypi
alabaster 0.7.12 pypi_0 pypi
appdirs 1.4.3 py36h28b3542_0 anaconda
argparse 1.4.0 pypi_0 pypi
asn1crypto 1.3.0 py36_0 anaconda
astor 0.8.1 pyh9f0ad1d_0 conda-forge
astunparse 1.6.3 pypi_0 pypi
atomicwrites 1.4.0 py_0 anaconda
attrs 19.3.0 py_0 anaconda
babel 2.9.0 pypi_0 pypi
backcall 0.2.0 py_0 anaconda
backports 1.0 py_2 anaconda
backports.weakref 1.0.post1 py36h9f0ad1d_1001 conda-forge
blas 1.0 mkl anaconda
bleach 1.5.0 py36_0 conda-forge
blinker 1.4 py_1 conda-forge
brotlipy 0.7.0 py36he774522_1000 anaconda
bzip2 1.0.8 he774522_0 anaconda
ca-certificates 2020.10.14 0 anaconda
cachetools 4.1.1 py_0 anaconda
certifi 2020.6.20 py36_0 anaconda
cffi 1.14.0 py36h7a1dbc1_0 anaconda
chardet 3.0.4 py36_1003 anaconda
click 7.1.2 pyh9f0ad1d_0 conda-forge
cloudpickle 1.4.1 py_0 anaconda
colorama 0.4.3 py_0 anaconda
contextlib2 0.6.0.post1 py_0 anaconda
cpuonly 1.0 0 pytorch
cryptography 2.9.2 py36h7a1dbc1_0 anaconda
cudatoolkit 9.0 1 anaconda
cudnn 7.6.5 cuda9.0_0 anaconda
curl 7.71.0 h2a8f88b_0 anaconda
cycler 0.10.0 py36h009560c_0 anaconda
cython 0.29.22 pypi_0 pypi
cytoolz 0.10.1 py36he774522_0 anaconda
dask-core 2.19.0 py_0 anaconda
decorator 4.4.2 py_0 anaconda
defusedxml 0.6.0 py_0 anaconda
dlib 19.20 py36h5653133_1 conda-forge
docker-py 4.2.1 py36h9f0ad1d_0 conda-forge
docker-pycreds 0.4.0 py_0 anaconda
docutils 0.16 pypi_0 pypi
easydict 1.7 pypi_0 pypi
entrypoints 0.3 py36_0 anaconda
ffmpeg 2.7.0 0 menpo
flake8 3.8.3 py_0 anaconda
flake8-polyfill 1.0.2 py36_0 anaconda
flake8-quotes 3.0.0 pyh9f0ad1d_0 conda-forge
flatbuffers 1.12 pypi_0 pypi
freetype 2.10.2 hd328e21_0 anaconda
gast 0.2.2 pypi_0 pypi
geos 3.8.1 h33f27b4_0 anaconda
gettext 0.19.8.1 hb01d8f6_1002 conda-forge
git 2.23.0 h6bb4b03_0 anaconda
glib 2.58.3 py36h04c7ab9_1004 conda-forge
google-auth 1.28.0 pypi_0 pypi
google-auth-oauthlib 0.4.3 pypi_0 pypi
google-pasta 0.2.0 pyh8c360ce_0 conda-forge
grpcio 1.32.0 pypi_0 pypi
h5py 2.10.0 py36h5e291fa_0 anaconda
hdf5 1.10.4 h7ebc959_0 anaconda
html5lib 0.9999999 py36_0 conda-forge
icc_rt 2019.0.0 h0cc432a_1 anaconda
icu 58.2 ha925a31_3 anaconda
idna 2.10 py_0 anaconda
imageio 2.8.0 py_0 anaconda
imageio-ffmpeg 0.4.2 py_0 conda-forge
imagesize 1.2.0 pypi_0 pypi
imgaug 0.4.0 pypi_0 pypi
importlib-metadata 1.7.0 py36_0 anaconda
importlib_metadata 1.7.0 0 anaconda
intel-openmp 2019.4 245 anaconda
ipykernel 5.3.0 py36h5ca1d4c_0 anaconda
ipyparallel 6.3.0 pypi_0 pypi
ipython 7.16.1 py36h5ca1d4c_0 anaconda
ipython_genutils 0.2.0 py36_0 anaconda
ipywidgets 7.5.1 py_0 anaconda
jedi 0.17.1 py36_0 anaconda
jinja2 2.11.2 py_0 anaconda
joblib 0.15.1 py_0 anaconda
jpeg 9d he774522_0 conda-forge
json-tricks 3.15.5 pypi_0 pypi
jsonschema 3.2.0 py36_0 anaconda
jupyter 1.0.0 py36_7 anaconda
jupyter_client 6.1.3 py_0 anaconda
jupyter_console 6.1.0 py_0 anaconda
jupyter_core 4.6.3 py36_0 anaconda
keras-applications 1.0.8 py_1 anaconda
keras-preprocessing 1.1.2 pypi_0 pypi
kiwisolver 1.2.0 py36h74a9793_0 anaconda
krb5 1.18.2 hc04afaa_0 anaconda
leptonica 1.78.0 h919f142_2 conda-forge
libarchive 3.3.3 h0643e63_5 anaconda
libcurl 7.71.0 h2a8f88b_0 anaconda
libffi 3.2.1 h6538335_1007 conda-forge
libgpuarray 0.7.6 hfa6e2cd_1003 conda-forge
libiconv 1.15 vc14h29686d3_5 [vc14] anaconda
libmklml 2019.0.5 0 anaconda
libpng 1.6.37 h2a8f88b_0 anaconda
libprotobuf 3.12.3 h7bd577a_0 anaconda
libsodium 1.0.18 h62dcd97_0 anaconda
libssh2 1.9.0 h7a1dbc1_1 anaconda
libtiff 4.1.0 h56a325e_0 anaconda
libwebp 1.0.2 hfa6e2cd_5 conda-forge
libxml2 2.9.10 h464c3ec_1 anaconda
libxslt 1.1.34 he774522_0 anaconda
lxml 4.5.0 py36h1350720_0 anaconda
lz4-c 1.8.1.2 h2fa13f4_0 anaconda
lzo 2.10 he774522_2 anaconda
m2w64-gcc-libgfortran 5.3.0 6 conda-forge
m2w64-gcc-libs 5.3.0 7 conda-forge
m2w64-gcc-libs-core 5.3.0 7 conda-forge
m2w64-gmp 6.1.0 2 conda-forge
m2w64-libwinpthread-git 5.0.0.4634.697f757 2 conda-forge
mako 1.1.0 py_0 anaconda
markdown 3.3.4 pypi_0 pypi
markupsafe 1.1.1 py36he774522_0 anaconda
matplotlib 3.1.3 py36_0 anaconda
matplotlib-base 3.1.3 py36h64f37c6_0 anaconda
mccabe 0.6.1 py36_1 anaconda
mistune 0.8.4 py36he774522_0 anaconda
mkl 2018.0.3 1 anaconda
mkl_fft 1.0.6 py36hdbbee80_0 anaconda
mkl_random 1.0.1 py36h77b88f5_1 anaconda
mock 4.0.3 pypi_0 pypi
more-itertools 8.4.0 py_0 anaconda
moviepy 1.0.1 py_0 conda-forge
msys2-conda-epoch 20160418 1 conda-forge
nbconvert 5.6.1 py36_0 anaconda
nbformat 5.0.7 py_0 anaconda
networkx 2.4 py_0 anaconda
ninja 1.9.0 py36h74a9793_0 anaconda
nose 1.3.7 pypi_0 pypi
notebook 6.0.3 py36_0 anaconda
numpy 1.19.5 pypi_0 pypi
oauthlib 3.1.0 py_0 anaconda
olefile 0.46 py36_0 anaconda
opencv-python 3.4.1.15 pypi_0 pypi
openjpeg 2.3.1 h57dd2e7_3 conda-forge
openssl 1.1.1h he774522_0 anaconda
opt-einsum 3.3.0 pypi_0 pypi
packaging 20.4 py_0 anaconda
pandas 1.0.3 py36h47e9c7a_0 anaconda
pandoc 2.9.2.1 0 anaconda
pandocfilters 1.4.2 py36_1 anaconda
parso 0.7.0 py_0 anaconda
pcre 8.44 ha925a31_0 anaconda
pep8-naming 0.8.2 py36_0 anaconda
pickleshare 0.7.5 py36_0 anaconda
pillow 7.1.2 py36hcc1f983_0 anaconda
pip 20.2.4 py36_0 anaconda
pluggy 0.13.1 py36_0 anaconda
poppler 0.87.0 hdbe765f_0 conda-forge
poppler-data 0.4.9 1 conda-forge
proglog 0.1.9 py_0 conda-forge
prometheus_client 0.8.0 py_0 anaconda
prompt-toolkit 3.0.5 py_0 anaconda
prompt_toolkit 3.0.5 0 anaconda
protobuf 3.12.3 py36h33f27b4_0 anaconda
psutil 5.8.0 pypi_0 pypi
py 1.9.0 py_0 anaconda
pyasn1 0.4.8 py_0 anaconda
pyasn1-modules 0.2.8 pypi_0 pypi
pycocotools 2.0 pypi_0 pypi
pycodestyle 2.6.0 py_0 anaconda
pycparser 2.20 py_0 anaconda
pyflakes 2.2.0 py_0 anaconda
pygments 2.6.1 py_0 anaconda
pygpu 0.7.6 py36h7725771_1001 conda-forge
pyjwt 1.7.1 py_0 conda-forge
pyopenssl 19.1.0 py36_0 anaconda
pyparsing 2.4.7 py_0 anaconda
pyqt 5.9.2 py36h6538335_2 anaconda
pyreadline 2.1 py36_1001 conda-forge
pyrsistent 0.16.0 py36he774522_0 anaconda
pysocks 1.7.1 py36_0 anaconda
pytesseract 0.3.3 pyh8c360ce_0 conda-forge
pytest 5.4.3 py36_0 anaconda
python 3.6.10 h9f7ef89_1 anaconda
python-dateutil 2.8.1 py_0 anaconda
python_abi 3.6 1_cp36m conda-forge
pytorch 1.5.1 py3.6_cpu_0 [cpuonly] pytorch
pytz 2020.1 py_0 anaconda
pywavelets 1.1.1 py36he774522_0 anaconda
pywin32 223 py36hfa6e2cd_1 anaconda
pywinpty 0.5.7 py36_0 anaconda
pyyaml 5.3.1 py36he774522_0 anaconda
pyzmq 19.0.1 py36ha925a31_1 anaconda
qt 5.9.7 vc14h73c81de_0 [vc14] anaconda
qtconsole 4.7.5 py_0 anaconda
qtpy 1.9.0 py_0 anaconda
requests 2.24.0 py_0 anaconda
requests-oauthlib 1.3.0 pyh9f0ad1d_0 conda-forge
rsa 4.6 pyh9f0ad1d_0 conda-forge
scikit-image 0.16.2 py36h47e9c7a_0 anaconda
scikit-learn 0.20.1 py36hb854c30_0 anaconda
scipy 1.4.1 pypi_0 pypi
send2trash 1.5.0 py36_0 anaconda
setuptools 50.3.0 py36h9490d1a_1 anaconda
shapely 1.6.4 pypi_0 pypi
simplejson 3.17.0 py36he774522_0 anaconda
sip 4.19.8 py36h6538335_0 anaconda
six 1.15.0 py_0 anaconda
sklearn 0.0 pypi_0 pypi
slidingwindow 0.0.14 pypi_0 pypi
snowballstemmer 2.1.0 pypi_0 pypi
sphinx 3.5.2 pypi_0 pypi
sphinxcontrib-applehelp 1.0.2 pypi_0 pypi
sphinxcontrib-devhelp 1.0.2 pypi_0 pypi
sphinxcontrib-htmlhelp 1.0.3 pypi_0 pypi
sphinxcontrib-jsmath 1.0.1 pypi_0 pypi
sphinxcontrib-qthelp 1.0.3 pypi_0 pypi
sphinxcontrib-serializinghtml 1.1.4 pypi_0 pypi
sqlite 3.32.3 h2a8f88b_0 anaconda
swig 3.0.12 h047fa9f_3 anaconda
tbb 2020.0 h74a9793_0 anaconda
tbb4py 2020.0 py36h74a9793_0 anaconda
tensorboard 1.13.1 pypi_0 pypi
tensorboard-plugin-wit 1.8.0 pypi_0 pypi
tensorboardx 1.6 py_0 conda-forge
tensorflow 2.4.1 pypi_0 pypi
tensorflow-estimator 1.13.0 pypi_0 pypi
tensorflow-gpu 1.13.1 pypi_0 pypi
tensorflow-gpu-estimator 2.1.0 pypi_0 pypi
termcolor 1.1.0 pypi_0 pypi
terminado 0.8.3 py36_0 anaconda
testpath 0.4.4 py_0 anaconda
theano 1.0.4 py36h003fed8_1002 conda-forge
threadpoolctl 2.1.0 pyh5ca1d4c_0 anaconda
tk 8.6.10 he774522_0 anaconda
toolz 0.10.0 py_0 anaconda
torchfile 0.1.0 py_0 conda-forge
torchvision 0.6.1 py36_cpu [cpuonly] pytorch
tornado 6.0.4 py36he774522_1 anaconda
tqdm 4.47.0 py_0 anaconda
traitlets 4.3.3 py36_0 anaconda
typing-extensions 3.7.4.3 pypi_0 pypi
urllib3 1.25.11 py_0 anaconda
vc 14.1 h0510ff6_4 anaconda
visdom 0.1.8.9 0 conda-forge
vs2015_runtime 14.16.27012 hf0eaf9b_3 anaconda
vs2017_win-64 19.16.27038 h2e3bad8_2 conda-forge
vswhere 2.7.1 h21ff451_0 anaconda
wcwidth 0.2.5 py_0 anaconda
webencodings 0.5.1 py36_1 anaconda
websocket-client 0.57.0 py36_1 anaconda
werkzeug 1.0.1 pyh9f0ad1d_0 conda-forge
wget 1.16.3 0 menpo
wheel 0.35.1 py_0 anaconda
widgetsnbextension 3.5.1 py36_0 anaconda
win_inet_pton 1.1.0 py36_0 anaconda
wincertstore 0.2 py36h7fe50ca_0 anaconda
winpty 0.4.3 4 anaconda
wrapt 1.12.1 py36h68a101e_1 conda-forge
xz 5.2.5 h62dcd97_0 anaconda
yacs 0.1.8 pypi_0 pypi
yaml 0.1.7 hc54c509_2 anaconda
zeromq 4.3.2 ha925a31_2 anaconda
zipp 3.3.1 py_0 anaconda
zlib 1.2.11 h62dcd97_4 anaconda
zstd 1.3.7 h508b16e_0 anaconda
No errors reported.
ANSWER
Answered 2021-Mar-20 at 10:44
From the list of libraries, it looks like you've installed the CPU-only version of PyTorch.
pytorch 1.5.1 py3.6_cpu_0 [cpuonly] pytorch
You can see the available conda packages for different CUDA + Python versions here: https://anaconda.org/pytorch/pytorch/files. When you install PyTorch, make sure it also matches the CUDA version of your machine.
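A quick way to confirm which build is active, and to load a GPU-trained checkpoint on a CPU-only install in the meantime (a hedged sketch; the checkpoint path below is hypothetical):
import torch
print(torch.__version__)          # a CPU-only build has no usable CUDA backend
print(torch.version.cuda)         # None on a CPU-only build
print(torch.cuda.is_available())  # False here, which is what triggers the RuntimeError above
# Until a matching CUDA build is installed, the checkpoint can still be loaded onto the CPU,
# as the error message suggests:
state_dict = torch.load('models/pose_hrnet_w32_256x192.pth',
                        map_location=torch.device('cpu'))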
QUESTION
When I type conda env create -f environment.yml, I constantly get:
Collecting package metadata (repodata.json): done
Solving environment: failed
ResolvePackageNotFound:
- tk==8.6.8=hbc83047_0
- zlib==1.2.11=h7b6447c_3
- av==8.0.2=py37h06622b3_4
- lame==3.100=h7f98852_1001
- xz==5.2.4=h14c3975_4
- mkl_random==1.0.2=py37hd81dba3_0
- x264==1!152.20180806=h14c3975_0
- numpy-base==1.16.4=py37hde5b4d6_0
- certifi==2020.12.5=py37h06a4308_0
- _openmp_mutex==4.5=1_llvm
- llvm-openmp==11.0.0=hfc4b9b4_1
- freetype==2.9.1=h8a8886c_1
- scikit-learn==0.22.1=py37hd81dba3_0
- libgfortran-ng==7.3.0=hdf63c60_0
- readline==7.0=h7b6447c_5
- mkl_fft==1.0.12=py37ha843d7b_0
- libpng==1.6.37=hbc83047_0
- libedit==3.1.20181209=hc058e9b_0
- libffi==3.2.1=hd88cf55_4
- nettle==3.6=he412f7d_0
- gnutls==3.6.13=h85f3911_1
- python==3.7.3=h0371630_0
- gmp==6.2.1=h58526e2_0
- _libgcc_mutex==0.1=conda_forge
- libgcc-ng==9.3.0=h5dbcf3e_17
- mkl-service==2.3.0=py37he904b0f_0
- ffmpeg==4.3.1=h3215721_1
- openh264==2.1.1=h8b12597_0
- mkl==2019.4=243
- numpy==1.16.4=py37h7e9f1db_0
- ca-certificates==2020.12.8=h06a4308_0
- libiconv==1.16=h516909a_0
- intel-openmp==2019.4=243
- libstdcxx-ng==9.1.0=hdf63c60_0
- zstd==1.3.7=h0b5b093_0
- ncurses==6.1=he6710b0_1
- jpeg==9b=h024ee3a_2
- openssl==1.1.1i=h27cfd23_0
- bzip2==1.0.8=h7f98852_4
- sqlite==3.28.0=h7b6447c_0
- libtiff==4.0.10=h2733197_2
What should I do?
My yml file is:
name: StyleFlow
channels:
- anaconda
- defaults
- conda-forge
dependencies:
- _libgcc_mutex=0.1=conda_forge
- _openmp_mutex=4.5=1_llvm
- av=8.0.2=py37h06622b3_4
- blas=1.0=mkl
- bzip2=1.0.8=h7f98852_4
- ca-certificates=2020.12.8=h06a4308_0
- certifi=2020.12.5=py37h06a4308_0
- ffmpeg=4.3.1=h3215721_1
- freetype=2.9.1=h8a8886c_1
- gmp=6.2.1=h58526e2_0
- gnutls=3.6.13=h85f3911_1
- intel-openmp=2019.4=243
- joblib=0.14.1=py_0
- jpeg=9b=h024ee3a_2
- lame=3.100=h7f98852_1001
- libedit=3.1.20181209=hc058e9b_0
- libffi=3.2.1=hd88cf55_4
- libgcc-ng=9.3.0=h5dbcf3e_17
- libgfortran-ng=7.3.0=hdf63c60_0
- libiconv=1.16=h516909a_0
- libpng=1.6.37=hbc83047_0
- libstdcxx-ng=9.1.0=hdf63c60_0
- libtiff=4.0.10=h2733197_2
- llvm-openmp=11.0.0=hfc4b9b4_1
- mkl=2019.4=243
- mkl-service=2.3.0=py37he904b0f_0
- mkl_fft=1.0.12=py37ha843d7b_0
- mkl_random=1.0.2=py37hd81dba3_0
- natsort=6.0.0=py_0
- ncurses=6.1=he6710b0_1
- nettle=3.6=he412f7d_0
- numpy=1.16.4=py37h7e9f1db_0
- numpy-base=1.16.4=py37hde5b4d6_0
- olefile=0.46=py37_0
- openh264=2.1.1=h8b12597_0
- openssl=1.1.1i=h27cfd23_0
- pip=19.1.1=py37_0
- python=3.7.3=h0371630_0
- python_abi=3.7=1_cp37m
- readline=7.0=h7b6447c_5
- scikit-learn=0.22.1=py37hd81dba3_0
- setuptools=41.0.1=py37_0
- sqlite=3.28.0=h7b6447c_0
- tk=8.6.8=hbc83047_0
- wheel=0.33.4=py37_0
- x264=1!152.20180806=h14c3975_0
- xz=5.2.4=h14c3975_4
- zlib=1.2.11=h7b6447c_3
- zstd=1.3.7=h0b5b093_0
- pip:
- absl-py==0.7.1
- appdirs==1.4.4
- astor==0.8.0
- astunparse==1.6.3
- attrs==19.1.0
- backcall==0.1.0
- bleach==3.1.0
- cachetools==4.1.0
- cffi==1.12.3
- chardet==3.0.4
- cloudpickle==1.2.1
- cycler==0.10.0
- cytoolz==0.9.0.1
- dask==2.1.0
- decorator==4.4.0
- defusedxml==0.6.0
- deprecated==1.2.6
- dill==0.2.9
- dlib==19.21.0
- dominate==2.3.5
- easydict==1.9
- entrypoints==0.3
- gast==0.2.2
- google-auth==1.14.3
- google-auth-oauthlib==0.4.1
- google-pasta==0.2.0
- grpcio==1.22.0
- h5py==2.10.0
- helpdev==0.6.10
- idna==2.8
- imageio==2.5.0
- importlib-metadata==0.18
- imutils==0.5.3
- ipykernel==5.1.1
- ipython==7.6.0
- ipython-genutils==0.2.0
- ipywidgets==7.4.2
- jedi==0.13.3
- jinja2==2.10.1
- jsonschema==3.0.1
- jupyter==1.0.0
- jupyter-client==5.2.4
- jupyter-console==6.0.0
- jupyter-core==4.5.0
- keras==2.2.4
- keras-applications==1.0.8
- keras-preprocessing==1.1.0
- kiwisolver==1.1.0
- mako==1.1.2
- markdown==3.1.1
- markupsafe==1.1.1
- matplotlib==3.1.0
- mistune==0.8.4
- nbconvert==5.5.0
- nbformat==4.4.0
- networkx==2.3
- notebook==5.7.8
- oauthlib==3.1.0
- opencv-python==4.1.0.25
- opt-einsum==3.2.1
- pandocfilters==1.4.2
- parso==0.5.0
- pexpect==4.7.0
- pickleshare==0.7.5
- pillow==6.0.0
- prometheus-client==0.7.1
- prompt-toolkit==2.0.9
- protobuf==3.8.0
- psutil==5.6.3
- ptyprocess==0.6.0
- pyasn1==0.4.8
- pyasn1-modules==0.2.8
- pycparser==2.19
- pycuda==2019.1.2
- pygments==2.4.2
- pyparsing==2.4.0
- pyqt5==5.13.0
- pyqt5-sip==4.19.18
- pyrsistent==0.14.11
- pyside2==5.13.0
- python-dateutil==2.8.0
- pytools==2020.1
- pytz==2019.1
- pywavelets==1.0.3
- pyyaml==5.1.1
- pyzmq==18.0.0
- qdarkgraystyle==1.0.2
- qdarkstyle==2.7
- qtconsole==4.5.1
- requests==2.22.0
- requests-oauthlib==1.3.0
- rsa==4.0
- scikit-image==0.15.0
- scikit-video==1.1.11
- scipy==1.2.1
- send2trash==1.5.0
- shiboken2==5.13.0
- six==1.12.0
- tensorboard==1.15.0
- tensorboard-plugin-wit==1.6.0.post3
- tensorflow-estimator==1.15.1
- tensorflow-gpu==1.15.0
- termcolor==1.1.0
- terminado==0.8.2
- testpath==0.4.2
- toolz==0.9.0
- torch==1.1.0
- torchdiffeq==0.0.1
- torchvision==0.3.0
- tornado==6.0.3
- tqdm==4.32.1
- traitlets==4.3.2
- urllib3==1.25.3
- wcwidth==0.1.7
- webencodings==0.5.1
- werkzeug==0.15.4
- widgetsnbextension==3.4.2
- wrapt==1.11.2
- zipp==0.5.2
ANSWER
Answered 2021-Jan-15 at 14:57
Conda does not work well with large environments in which everything is pinned to specific versions (in contrast to other ecosystems in which pinning everything is the standard). The result of conda env export, which is probably what this is, also includes the build numbers, which are almost always too specific (and often platform-specific) for the purpose of installing the right version of the software. That is great for things like reproducibility of scientific work (where specific versions and builds of everything need to be known), but not great for installing software (there is plenty of flexibility in which versions will work with any given package).
I'd start by removing the build pins (dropping everything after the second = in each line) so that only the versions are pinned. After that, I'd start removing version pins.
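One mechanical way to drop the build pins (a hedged sketch, not from the original answer; the file names are assumptions) is to strip everything after the second = on each conda dependency line:
# strip_builds.py -- e.g. "  - tk=8.6.8=hbc83047_0" becomes "  - tk=8.6.8"
import re
with open('environment.yml') as src, open('environment.nobuilds.yml', 'w') as dst:
    for line in src:
        # Only rewrite conda lines of the form "- name=version=build"; pip entries
        # ("name==version"), channels, and other lines pass through unchanged.
        dst.write(re.sub(r'^(\s*- [^=\s]+=[^=\s]+)=\S+$', r'\1', line))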
QUESTION
Got the DLC-GPU.yaml from here: https://github.com/DeepLabCut/DeepLabCut/blob/master/conda-environments/DLC-GPU.yaml
(base) mona@mona:~/research$ conda env create --name dlc --file=DLC-GPU.yaml
Collecting package metadata (repodata.json): done
Solving environment: done
Downloading and Extracting Packages
gettext-0.19.8.1 | 2.9 MB | ################################################################################################################################################################ | 100%
ipython-7.18.1 | 989 KB | ################################################################################################################################################################ | 100%
jupyter_console-6.2. | 26 KB | ################################################################################################################################################################ | 100%
qtconsole-4.7.7 | 96 KB | ################################################################################################################################################################ | 100%
prompt_toolkit-3.0.7 | 12 KB | ################################################################################################################################################################ | 100%
geos-3.8.0 | 961 KB | ################################################################################################################################################################ | 100%
pango-1.45.3 | 361 KB | ################################################################################################################################################################ | 100%
shapely-1.7.1 | 390 KB | ################################################################################################################################################################ | 100%
pyzmq-19.0.2 | 438 KB | ################################################################################################################################################################ | 100%
nb_conda_kernels-2.3 | 26 KB | ################################################################################################################################################################ | 100%
notebook-6.1.1 | 4.0 MB | ################################################################################################################################################################ | 100%
harfbuzz-2.4.0 | 850 KB | ################################################################################################################################################################ | 100%
jedi-0.17.2 | 912 KB | ################################################################################################################################################################ | 100%
gstreamer-1.14.0 | 3.1 MB | ################################################################################################################################################################ | 100%
argon2-cffi-20.1.0 | 46 KB | ################################################################################################################################################################ | 100%
pyrsistent-0.17.3 | 89 KB | ################################################################################################################################################################ | 100%
ipykernel-5.3.4 | 179 KB | ################################################################################################################################################################ | 100%
nb_conda-2.2.1 | 32 KB | ################################################################################################################################################################ | 100%
cffi-1.14.3 | 223 KB | ################################################################################################################################################################ | 100%
glib-2.65.0 | 2.9 MB | ################################################################################################################################################################ | 100%
fribidi-1.0.10 | 103 KB | ################################################################################################################################################################ | 100%
Preparing transaction: done
Verifying transaction: done
Executing transaction: \ Enabling nb_conda_kernels...
Status: enabled
/ Config option `kernel_spec_manager_class` not recognized by `EnableNBExtensionApp`.
Enabling notebook extension nb_conda/main...
- Validating: OK
Enabling tree extension nb_conda/tree...
- Validating: OK
Config option `kernel_spec_manager_class` not recognized by `EnableServerExtensionApp`.
Enabling: nb_conda
- Writing config: /home/mona/anaconda3/envs/dlc/etc/jupyter
- Validating...
nb_conda 2.2.1 OK
done
Installing pip dependencies: / Ran pip subprocess with arguments:
['/home/mona/anaconda3/envs/dlc/bin/python', '-m', 'pip', 'install', '-U', '-r', '/home/mona/research/condaenv.i4wb4gx_.requirements.txt']
Pip subprocess output:
Collecting deeplabcut
Using cached deeplabcut-2.1.8.2-py3-none-any.whl (400 kB)
Requirement already satisfied, skipping upgrade: certifi in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (2020.6.20)
Collecting matplotlib==3.0.3
Using cached matplotlib-3.0.3-cp37-cp37m-manylinux1_x86_64.whl (13.0 MB)
Collecting numpy==1.16.4
Using cached numpy-1.16.4-cp37-cp37m-manylinux1_x86_64.whl (17.3 MB)
Collecting scikit-image
Downloading scikit_image-0.17.2-cp37-cp37m-manylinux1_x86_64.whl (12.5 MB)
Collecting scikit-learn
Downloading scikit_learn-0.23.2-cp37-cp37m-manylinux1_x86_64.whl (6.8 MB)
Requirement already satisfied, skipping upgrade: ipython-genutils in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (0.2.0)
Collecting ruamel.yaml~=0.15
Using cached ruamel.yaml-0.16.12-py2.py3-none-any.whl (111 kB)
Collecting tqdm
Using cached tqdm-4.49.0-py2.py3-none-any.whl (69 kB)
Requirement already satisfied, skipping upgrade: six in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (1.15.0)
Collecting tables
Using cached tables-3.6.1-cp37-cp37m-manylinux1_x86_64.whl (4.3 MB)
Collecting tensorpack>=0.9.7.1
Using cached tensorpack-0.10.1-py2.py3-none-any.whl (291 kB)
Collecting statsmodels
Downloading statsmodels-0.12.0-cp37-cp37m-manylinux1_x86_64.whl (9.5 MB)
Requirement already satisfied, skipping upgrade: ipython in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (7.18.1)
Requirement already satisfied, skipping upgrade: setuptools in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (49.6.0.post20200814)
Collecting pyyaml>=5.1
Using cached PyYAML-5.3.1.tar.gz (269 kB)
Collecting intel-openmp
Using cached intel_openmp-2020.0.133-py2.py3-none-manylinux1_x86_64.whl (919 kB)
Collecting opencv-python~=3.4
Downloading opencv_python-3.4.11.43-cp37-cp37m-manylinux2014_x86_64.whl (49.1 MB)
Requirement already satisfied, skipping upgrade: python-dateutil in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (2.8.1)
Processing /home/mona/.cache/pip/wheels/db/37/b8/b3785332f8246f1306c2863553860ca65e1824fc4c8251c7f1/moviepy-1.0.1-py3-none-any.whl
Collecting patsy
Using cached patsy-0.5.1-py2.py3-none-any.whl (231 kB)
Processing /home/mona/.cache/pip/wheels/88/96/68/c2be18e7406804be2e593e1c37845f2dd20ac2ce1381ce40b0/easydict-1.9-py3-none-any.whl
Collecting pandas>=1.0.
Using cached pandas-1.1.2-cp37-cp37m-manylinux1_x86_64.whl (10.5 MB)
Requirement already satisfied, skipping upgrade: wheel in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (0.35.1)
Collecting click
Using cached click-7.1.2-py2.py3-none-any.whl (82 kB)
Collecting imgaug
Using cached imgaug-0.4.0-py2.py3-none-any.whl (948 kB)
Collecting chardet
Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Requirement already satisfied, skipping upgrade: scipy in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (1.5.2)
Requirement already satisfied, skipping upgrade: h5py~=2.7 in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (2.10.0)
Collecting requests
Using cached requests-2.24.0-py2.py3-none-any.whl (61 kB)
Collecting kiwisolver>=1.0.1
Downloading kiwisolver-1.2.0-cp37-cp37m-manylinux1_x86_64.whl (88 kB)
Requirement already satisfied, skipping upgrade: pyparsing!=2.0.4,!=2.1.2,!=2.1.6,>=2.0.1 in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from matplotlib==3.0.3->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (2.4.7)
Collecting cycler>=0.10
Using cached cycler-0.10.0-py2.py3-none-any.whl (6.5 kB)
Collecting pillow!=7.1.0,!=7.1.1,>=4.3.0
Using cached Pillow-7.2.0-cp37-cp37m-manylinux1_x86_64.whl (2.2 MB)
Collecting imageio>=2.3.0
Using cached imageio-2.9.0-py3-none-any.whl (3.3 MB)
Collecting tifffile>=2019.7.26
Using cached tifffile-2020.9.22-py3-none-any.whl (153 kB)
Collecting networkx>=2.0
Using cached networkx-2.5-py3-none-any.whl (1.6 MB)
Collecting PyWavelets>=1.1.1
Downloading PyWavelets-1.1.1-cp37-cp37m-manylinux1_x86_64.whl (4.4 MB)
Collecting threadpoolctl>=2.0.0
Using cached threadpoolctl-2.1.0-py3-none-any.whl (12 kB)
Collecting joblib>=0.11
Using cached joblib-0.16.0-py3-none-any.whl (300 kB)
Collecting ruamel.yaml.clib>=0.1.2; platform_python_implementation == "CPython" and python_version < "3.9"
Using cached ruamel.yaml.clib-0.2.2-cp37-cp37m-manylinux1_x86_64.whl (547 kB)
Collecting numexpr>=2.6.2
Using cached numexpr-2.7.1-cp37-cp37m-manylinux1_x86_64.whl (162 kB)
Collecting psutil>=5
Using cached psutil-5.7.2.tar.gz (460 kB)
Collecting msgpack>=0.5.2
Downloading msgpack-1.0.0-cp37-cp37m-manylinux1_x86_64.whl (275 kB)
Collecting msgpack-numpy>=0.4.4.2
Using cached msgpack_numpy-0.4.7-py2.py3-none-any.whl (6.4 kB)
Collecting tabulate>=0.7.7
Using cached tabulate-0.8.7-py3-none-any.whl (24 kB)
Requirement already satisfied, skipping upgrade: termcolor>=1.1 in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from tensorpack>=0.9.7.1->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (1.1.0)
Requirement already satisfied, skipping upgrade: pyzmq>=16 in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from tensorpack>=0.9.7.1->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (19.0.2)
Requirement already satisfied, skipping upgrade: traitlets>=4.2 in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (4.3.3)
Requirement already satisfied, skipping upgrade: pygments in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (2.7.1)
Requirement already satisfied, skipping upgrade: prompt-toolkit!=3.0.0,!=3.0.1,<3.1.0,>=2.0.0 in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (3.0.7)
Requirement already satisfied, skipping upgrade: pexpect>4.3; sys_platform != "win32" in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (4.8.0)
Requirement already satisfied, skipping upgrade: decorator in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (4.4.2)
Requirement already satisfied, skipping upgrade: jedi>=0.10 in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (0.17.2)
Requirement already satisfied, skipping upgrade: pickleshare in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (0.7.5)
Requirement already satisfied, skipping upgrade: backcall in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (0.2.0)
Processing /home/mona/.cache/pip/wheels/12/36/1f/dc61e6ac10781d63cf6fa045eb09fa613a667384e12cb6e6e0/proglog-0.1.9-py3-none-any.whl
Collecting imageio-ffmpeg>=0.2.0; python_version >= "3.4"
Using cached imageio_ffmpeg-0.4.2-py3-none-manylinux2010_x86_64.whl (26.9 MB)
Collecting pytz>=2017.2
Using cached pytz-2020.1-py2.py3-none-any.whl (510 kB)
Requirement already satisfied, skipping upgrade: Shapely in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from imgaug->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (1.7.1)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
Using cached urllib3-1.25.10-py2.py3-none-any.whl (127 kB)
Collecting idna<3,>=2.5
Using cached idna-2.10-py2.py3-none-any.whl (58 kB)
Requirement already satisfied, skipping upgrade: wcwidth in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from prompt-toolkit!=3.0.0,!=3.0.1,<3.1.0,>=2.0.0->ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (0.2.5)
Requirement already satisfied, skipping upgrade: ptyprocess>=0.5 in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from pexpect>4.3; sys_platform != "win32"->ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (0.6.0)
Requirement already satisfied, skipping upgrade: parso<0.8.0,>=0.7.0 in /home/mona/anaconda3/envs/dlc/lib/python3.7/site-packages (from jedi>=0.10->ipython->deeplabcut->-r /home/mona/research/condaenv.i4wb4gx_.requirements.txt (line 1)) (0.7.0)
Building wheels for collected packages: pyyaml, psutil
Building wheel for pyyaml (setup.py): started
Building wheel for pyyaml (setup.py): finished with status 'done'
Created wheel for pyyaml: filename=PyYAML-5.3.1-cp37-cp37m-linux_x86_64.whl size=44619 sha256=bb81132d4bd4786a05057991336e112ed3ff6dde5653560f264e447e3b2f0e9c
Stored in directory: /home/mona/.cache/pip/wheels/5e/03/1e/e1e954795d6f35dfc7b637fe2277bff021303bd9570ecea653
Building wheel for psutil (setup.py): started
Building wheel for psutil (setup.py): finished with status 'done'
Created wheel for psutil: filename=psutil-5.7.2-cp37-cp37m-linux_x86_64.whl size=292666 sha256=2b7326dd5f5f648c6f167599c04470dc27949e7d56211052d7231aa9318db886
Stored in directory: /home/mona/.cache/pip/wheels/2d/43/97/00701864a7bee6d9e1a52dd682537dcbf1d013d0e2e6f0c1f1
Successfully built pyyaml psutil
Installing collected packages: kiwisolver, cycler, numpy, matplotlib, pillow, imageio, tifffile, networkx, PyWavelets, scikit-image, threadpoolctl, joblib, scikit-learn, ruamel.yaml.clib, ruamel.yaml, tqdm, numexpr, tables, psutil, msgpack, msgpack-numpy, tabulate, tensorpack, pytz, pandas, patsy, statsmodels, pyyaml, intel-openmp, opencv-python, proglog, chardet, urllib3, idna, requests, imageio-ffmpeg, moviepy, easydict, click, imgaug, deeplabcut
Attempting uninstall: numpy
Found existing installation: numpy 1.19.1
Uninstalling numpy-1.19.1:
Successfully uninstalled numpy-1.19.1
Successfully installed PyWavelets-1.1.1 chardet-3.0.4 click-7.1.2 cycler-0.10.0 deeplabcut-2.1.8.2 easydict-1.9 idna-2.10 imageio-2.9.0 imageio-ffmpeg-0.4.2 imgaug-0.4.0 intel-openmp-2020.0.133 joblib-0.16.0 kiwisolver-1.2.0 matplotlib-3.0.3 moviepy-1.0.1 msgpack-1.0.0 msgpack-numpy-0.4.7 networkx-2.5 numexpr-2.7.1 numpy-1.16.4 opencv-python-3.4.11.43 pandas-1.1.2 patsy-0.5.1 pillow-7.2.0 proglog-0.1.9 psutil-5.7.2 pytz-2020.1 pyyaml-5.3.1 requests-2.24.0 ruamel.yaml-0.16.12 ruamel.yaml.clib-0.2.2 scikit-image-0.17.2 scikit-learn-0.23.2 statsmodels-0.12.0 tables-3.6.1 tabulate-0.8.7 tensorpack-0.10.1 threadpoolctl-2.1.0 tifffile-2020.9.22 tqdm-4.49.0 urllib3-1.25.10
done
#
# To activate this environment, use
#
# $ conda activate dlc
#
# To deactivate an active environment, use
#
# $ conda deactivate
(base) mona@mona:~/research$ conda activate dlc
(dlc) mona@mona:~/research$ pip install deeplabcut==2.2b8
(dlc) mona@mona:~/research$ ipython
Python 3.7.9 (default, Aug 31 2020, 12:42:55)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.18.1 -- An enhanced Interactive Python. Type '?' for help.
In [1]: import deeplabcut as dlc
---------------------------------------------------------------------------
CalledProcessError Traceback (most recent call last)
in
----> 1 import deeplabcut as dlc
~/anaconda3/envs/dlc/lib/python3.7/site-packages/deeplabcut/__init__.py in
36 else: # standard use [wxpython supported]
37 mpl.use("WxAgg")
---> 38 from deeplabcut import generate_training_dataset
39 from deeplabcut import refine_training_dataset
40 from deeplabcut.generate_training_dataset import (
~/anaconda3/envs/dlc/lib/python3.7/site-packages/deeplabcut/generate_training_dataset/__init__.py in
16 else:
17 from deeplabcut.generate_training_dataset.auxfun_drag_label import *
---> 18 from deeplabcut.generate_training_dataset.labeling_toolbox import *
19 from deeplabcut.generate_training_dataset.multiple_individuals_labeling_toolbox import *
20 from deeplabcut.generate_training_dataset.frame_extraction_toolbox import *
~/anaconda3/envs/dlc/lib/python3.7/site-packages/deeplabcut/generate_training_dataset/labeling_toolbox.py in
31
32 from deeplabcut.generate_training_dataset import auxfun_drag_label
---> 33 from deeplabcut.utils import auxiliaryfunctions
34
35 # ###########################################################################
~/anaconda3/envs/dlc/lib/python3.7/site-packages/deeplabcut/utils/__init__.py in
4 from deeplabcut.utils.conversioncode import *
5 from deeplabcut.utils.frameselectiontools import *
----> 6 from deeplabcut.utils.make_labeled_video import *
7 from deeplabcut.utils.plotting import *
8 from deeplabcut.utils.video_processor import *
~/anaconda3/envs/dlc/lib/python3.7/site-packages/deeplabcut/utils/make_labeled_video.py in
26 import matplotlib.pyplot as plt
27 import numpy as np
---> 28 from matplotlib.animation import FFMpegWriter
29 from matplotlib.collections import LineCollection
30 from skimage.draw import circle, line_aa
~/anaconda3/envs/dlc/lib/python3.7/site-packages/matplotlib/animation.py in
735 # Combine ImageMagick options with pipe-based writing
736 @writers.register('imagemagick')
--> 737 class ImageMagickWriter(ImageMagickBase, MovieWriter):
738 '''Pipe-based animated gif.
739
~/anaconda3/envs/dlc/lib/python3.7/site-packages/matplotlib/animation.py in wrapper(writerClass)
118 def wrapper(writerClass):
119 self._registered[name] = writerClass
--> 120 if writerClass.isAvailable():
121 self.avail[name] = writerClass
122 return writerClass
~/anaconda3/envs/dlc/lib/python3.7/site-packages/matplotlib/animation.py in isAvailable(cls)
728 def isAvailable(cls):
729 try:
--> 730 return super().isAvailable()
731 except FileNotFoundError: # May be raised by get_executable_info.
732 return False
~/anaconda3/envs/dlc/lib/python3.7/site-packages/matplotlib/animation.py in isAvailable(cls)
425 Check to see if a MovieWriter subclass is actually available.
426 '''
--> 427 return shutil.which(cls.bin_path()) is not None
428
429
~/anaconda3/envs/dlc/lib/python3.7/site-packages/matplotlib/animation.py in bin_path(cls)
722 binpath = super().bin_path()
723 if binpath == 'convert':
--> 724 binpath = mpl._get_executable_info('magick').executable
725 return binpath
726
~/anaconda3/envs/dlc/lib/python3.7/site-packages/matplotlib/__init__.py in _get_executable_info(name)
383 raise FileNotFoundError(
384 "Failed to find an ImageMagick installation")
--> 385 return impl([path, "--version"], r"^Version: ImageMagick (\S*)")
386 elif name == "pdftops":
387 info = impl(["pdftops", "-v"], "^pdftops version (.*)",
~/anaconda3/envs/dlc/lib/python3.7/site-packages/matplotlib/__init__.py in impl(args, regex, min_ver, ignore_exit_code)
328 output = _cpe.output
329 else:
--> 330 raise _cpe
331 match = re.search(regex, output)
332 if match:
~/anaconda3/envs/dlc/lib/python3.7/site-packages/matplotlib/__init__.py in impl(args, regex, min_ver, ignore_exit_code)
323 output = subprocess.check_output(
324 args, stderr=subprocess.STDOUT,
--> 325 universal_newlines=True, errors="replace")
326 except subprocess.CalledProcessError as _cpe:
327 if ignore_exit_code:
~/anaconda3/envs/dlc/lib/python3.7/subprocess.py in check_output(timeout, *popenargs, **kwargs)
409
410 return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
--> 411 **kwargs).stdout
412
413
~/anaconda3/envs/dlc/lib/python3.7/subprocess.py in run(input, capture_output, timeout, check, *popenargs, **kwargs)
510 if check and retcode:
511 raise CalledProcessError(retcode, process.args,
--> 512 output=stdout, stderr=stderr)
513 return CompletedProcess(process.args, retcode, stdout, stderr)
514
CalledProcessError: Command '['convert', '--version']' returned non-zero exit status 1.
In [2]:
$ lsb_release -a
LSB Version: core-11.1.0ubuntu2-noarch:security-11.1.0ubuntu2-noarch
Distributor ID: Ubuntu
Description: Ubuntu 20.04.1 LTS
Release: 20.04
Codename: focal
Just for the sake of future reproducibility, I am also dumping DLC-GPU.yaml here:
# DLC-GPU.yaml
#DeepLabCut2.0 Toolbox (deeplabcut.org)
#© A. & M. Mathis Labs
#https://github.com/AlexEMG/DeepLabCut
#Please see AUTHORS for contributors.
#https://github.com/AlexEMG/DeepLabCut/blob/master/AUTHORS
#Licensed under GNU Lesser General Public License v3.0
#
# DeepLabCut environment
# FIRST: INSTALL CORRECT DRIVER for GPU, see https://stackoverflow.com/questions/30820513/what-is-the-correct-version-of-cuda-for-my-nvidia-driver/30820690
#
# Suggested by Jan Eglinger see https://github.com/AlexEMG/DeepLabCut/issues/112
#
# install: conda env create -f DLC-GPU.yaml
# update: conda env update -f DLC-GPU.yaml
name: DLC-GPU
dependencies:
- python=3.7
- pip
- tensorflow-gpu==1.13.1
- cudnn=7
- wxpython<4.1.0
- jupyter
- nb_conda
- Shapely
- pip:
- deeplabcut
ANSWER
Answered 2020-Sep-24 at 21:01
matplotlib.animation requires ffmpeg for saving movies and ImageMagick for saving animated gifs.
See https://matplotlib.org/users/installing.html#install-requirements
Install them with your system package manager:
sudo apt update
sudo apt install ffmpeg imagemagick
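As a quick sanity check after installing them (a minimal sketch, assuming matplotlib itself is already installed), you can confirm that the external binaries are visible to matplotlib:
import shutil
from matplotlib import animation

# matplotlib.animation's movie writers depend on external binaries being on PATH.
print("ffmpeg on PATH:", shutil.which("ffmpeg"))
print("convert (ImageMagick) on PATH:", shutil.which("convert"))
print("available animation writers:", animation.writers.list())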
QUESTION
Hi there!
When running test_net.py in pytorch 1.0 Faster R-CNN and demo.py on the coco dataset with faster_rcnn_1_10_9771.pth (the pretrained resnet101 model on the coco dataset provided by jwyang), I encounter the same errors below:
Called with args:
Namespace(batch_size=1, cfg_file='cfgs/res101.yml', checkepoch=10, checkpoint=9771, checksession=1, class_agnostic=True, cuda=True, dataset='coco', image_dir='/home/ubuntu/users/fasterrcnn/faster-rcnn.pytorch-pytorch-1.0/images', load_dir='/home/ubuntu/users/fasterrcnn/faster-rcnn.pytorch-pytorch-1.0/models', mGPUs=True, net='res101', parallel_type=0, set_cfgs=None, vis=True, webcam_num=-1)
/home/ubuntu/users/fasterrcnn/faster-rcnn.pytorch-pytorch-1.0/lib/model/utils/config.py:374: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
yaml_cfg = edict(yaml.load(f))
Using config:
{'ANCHOR_RATIOS': [0.5, 1, 2],
'ANCHOR_SCALES': [4, 8, 16, 32],
'CROP_RESIZE_WITH_MAX_POOL': False,
'CUDA': False,
'DATA_DIR': '/home/ubuntu/users/fasterrcnn/faster-rcnn.pytorch-pytorch-1.0/data',
'DEDUP_BOXES': 0.0625,
'EPS': 1e-14,
'EXP_DIR': 'res101',
'FEAT_STRIDE': [16],
'GPU_ID': 0,
'MATLAB': 'matlab',
'MAX_NUM_GT_BOXES': 20,
'MOBILENET': {'DEPTH_MULTIPLIER': 1.0,
'FIXED_LAYERS': 5,
'REGU_DEPTH': False,
'WEIGHT_DECAY': 4e-05},
'PIXEL_MEANS': array([[[102.9801, 115.9465, 122.7717]]]),
'POOLING_MODE': 'align',
'POOLING_SIZE': 7,
'RESNET': {'FIXED_BLOCKS': 1, 'MAX_POOL': False},
'RNG_SEED': 3,
'ROOT_DIR': '/home/ubuntu/users/fasterrcnn/faster-rcnn.pytorch-pytorch-1.0',
'TEST': {'BBOX_REG': True,
'HAS_RPN': True,
'MAX_SIZE': 1000,
'MODE': 'nms',
'NMS': 0.3,
'PROPOSAL_METHOD': 'gt',
'RPN_MIN_SIZE': 16,
'RPN_NMS_THRESH': 0.7,
'RPN_POST_NMS_TOP_N': 300,
'RPN_PRE_NMS_TOP_N': 6000,
'RPN_TOP_N': 5000,
'SCALES': [600],
'SVM': False},
'TRAIN': {'ASPECT_GROUPING': False,
'BATCH_SIZE': 128,
'BBOX_INSIDE_WEIGHTS': [1.0, 1.0, 1.0, 1.0],
'BBOX_NORMALIZE_MEANS': [0.0, 0.0, 0.0, 0.0],
'BBOX_NORMALIZE_STDS': [0.1, 0.1, 0.2, 0.2],
'BBOX_NORMALIZE_TARGETS': True,
'BBOX_NORMALIZE_TARGETS_PRECOMPUTED': True,
'BBOX_REG': True,
'BBOX_THRESH': 0.5,
'BG_THRESH_HI': 0.5,
'BG_THRESH_LO': 0.0,
'BIAS_DECAY': False,
'BN_TRAIN': False,
'DISPLAY': 20,
'DOUBLE_BIAS': False,
'FG_FRACTION': 0.25,
'FG_THRESH': 0.5,
'GAMMA': 0.1,
'HAS_RPN': True,
'IMS_PER_BATCH': 1,
'LEARNING_RATE': 0.001,
'MAX_SIZE': 1000,
'MOMENTUM': 0.9,
'PROPOSAL_METHOD': 'gt',
'RPN_BATCHSIZE': 256,
'RPN_BBOX_INSIDE_WEIGHTS': [1.0, 1.0, 1.0, 1.0],
'RPN_CLOBBER_POSITIVES': False,
'RPN_FG_FRACTION': 0.5,
'RPN_MIN_SIZE': 8,
'RPN_NEGATIVE_OVERLAP': 0.3,
'RPN_NMS_THRESH': 0.7,
'RPN_POSITIVE_OVERLAP': 0.7,
'RPN_POSITIVE_WEIGHT': -1.0,
'RPN_POST_NMS_TOP_N': 2000,
'RPN_PRE_NMS_TOP_N': 12000,
'SCALES': [600],
'SNAPSHOT_ITERS': 5000,
'SNAPSHOT_KEPT': 3,
'SNAPSHOT_PREFIX': 'res101_faster_rcnn',
'STEPSIZE': [30000],
'SUMMARY_INTERVAL': 180,
'TRIM_HEIGHT': 600,
'TRIM_WIDTH': 600,
'TRUNCATED': False,
'USE_ALL_GT': True,
'USE_FLIPPED': True,
'USE_GT': False,
'WEIGHT_DECAY': 0.0001},
'USE_GPU_NMS': True}
load checkpoint /home/ubuntu/users/fasterrcnn/faster-rcnn.pytorch-pytorch-1.0/models/res101/coco/faster_rcnn_1_10_9771.pth
Traceback (most recent call last):
File "/home/ubuntu/users/fasterrcnn/faster-rcnn.pytorch-pytorch-1.0/demo.py", line 205, in
fasterRCNN.load_state_dict(checkpoint['model'])
File "/home/ubuntu/users/anaconda3/envs/fasterrcnn/lib/python3.6/site-packages/torch/nn/modules/module.py", line 769, in load_state_dict
self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for resnet:
**size mismatch for RCNN_bbox_pred.weight:** copying a param with shape torch.Size([324, 2048]) from checkpoint, the shape in current model is torch.Size([4, 2048]).
**size mismatch for RCNN_bbox_pred.bias**: copying a param with shape torch.Size([324]) from checkpoint, the shape in current model is torch.Size([4]).
Process finished with exit code 1
And here is my environment:
# Name Version Build Channel
_libgcc_mutex 0.1 main https://mirrors.ustc.edu.cn/anaconda/pkgs/main
blas 1.0 mkl https://mirrors.ustc.edu.cn/anaconda/pkgs/main
bzip2 1.0.8 h7b6447c_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
ca-certificates 2020.1.1 0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
cairo 1.14.12 h8948797_3 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
certifi 2020.4.5.1 py36_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
cffi 1.14.0 py36he30daa8_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
cuda100 1.0 0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/pytorch
cycler 0.10.0 py36_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
cython 0.29.17 py36he6710b0_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
dbus 1.13.14 hb2f20db_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
easydict 1.9 py_0 https://mirrors.ustc.edu.cn/anaconda/cloud/conda-forge
expat 2.2.6 he6710b0_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
faster-rcnn 0.1 dev_0
ffmpeg 4.0 hcdf2ecd_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
fontconfig 2.13.0 h9420a91_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
freeglut 3.0.0 hf484d3e_5 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
freetype 2.9.1 h8a8886c_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
glib 2.63.1 h3eb4bd4_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
graphite2 1.3.13 h23475e2_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
gst-plugins-base 1.14.0 hbbd80ab_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
gstreamer 1.14.0 hb31296c_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
harfbuzz 1.8.8 hffaf4a1_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
hdf5 1.10.2 hba1933b_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
icu 58.2 he6710b0_3 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
intel-openmp 2020.1 217 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
jasper 2.0.14 h07fcdf6_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
jpeg 9b h024ee3a_2 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
kiwisolver 1.2.0 py36hfd86e86_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
ld_impl_linux-64 2.33.1 h53a641e_7 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
libedit 3.1.20181209 hc058e9b_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
libffi 3.3 he6710b0_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
libgcc-ng 9.1.0 hdf63c60_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
libgfortran-ng 7.3.0 hdf63c60_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
libglu 9.0.0 hf484d3e_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
libopencv 3.4.2 hb342d67_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
libopus 1.3.1 h7b6447c_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
libpng 1.6.37 hbc83047_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
libprotobuf 3.11.4 hd408876_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
libstdcxx-ng 9.1.0 hdf63c60_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
libtiff 4.1.0 h2733197_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
libuuid 1.0.3 h1bed415_2 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
libvpx 1.7.0 h439df22_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
libxcb 1.13 h1bed415_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
libxml2 2.9.9 hea5a465_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
matplotlib 3.1.3 py36_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
matplotlib-base 3.1.3 py36hef1b27d_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
mkl 2020.1 217 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
mkl-service 2.3.0 py36he904b0f_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
mkl_fft 1.0.15 py36ha843d7b_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
mkl_random 1.1.1 py36h0573a6f_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
ncurses 6.2 he6710b0_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
ninja 1.9.0 py36hfd86e86_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
numpy 1.18.1 py36h4f9e942_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
numpy-base 1.18.1 py36hde5b4d6_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
olefile 0.46 py36_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
opencv 3.4.2 py36h6fd60c2_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
openssl 1.1.1g h7b6447c_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
pcre 8.43 he6710b0_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
pillow 7.1.2 py36hb39fc2d_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
pip 20.0.2 py36_3 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
pixman 0.38.0 h7b6447c_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
protobuf 3.11.4 py36he6710b0_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
py-opencv 3.4.2 py36hb342d67_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
pycparser 2.20 py_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
pyparsing 2.4.7 py_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
pyqt 5.9.2 py36h05f1152_2 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
python 3.6.10 h7579374_2 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
python-dateutil 2.8.1 py_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
pytorch 1.0.0 py3.6_cuda10.0.130_cudnn7.4.1_1 [cuda100] https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/pytorch
pyyaml 5.3.1 py36h7b6447c_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
qt 5.9.7 h5867ecd_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
readline 8.0 h7b6447c_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
scipy 1.2.1 py36h7c811a0_0 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
setuptools 46.4.0 py36_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
sip 4.19.8 py36hf484d3e_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
six 1.14.0 py36_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
sqlite 3.31.1 h62c20be_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
tensorboardx 2.0 py_0 conda-forge
tk 8.6.8 hbc83047_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
torchvision 0.2.0 py36h17b6947_1 https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/pytorch
tornado 6.0.4 py36h7b6447c_1 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
wheel 0.34.2 py36_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
xz 5.2.5 h7b6447c_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
yaml 0.1.7 had09818_2 https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/main
zlib 1.2.11 h7b6447c_3 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
zstd 1.3.7 h0b5b093_0 https://mirrors.ustc.edu.cn/anaconda/pkgs/main
I have already tried setting __C.ANCHOR_SCALES = [4,8,16,32] in the config.
I also tried modifying the classes of the pascal voc dataset to the classes of the coco dataset:
coco_classes = np.asarray(["__background__",
"person", "bicycle", "car", "motorcycle", "airplane", "bus", "train", "truck", "boat",
"traffic light", "fire hydrant", "stop sign", "parking meter", "bench", "bird", "cat", "dog", "horse", "sheep",
"cow", "elephant", "bear", "zebra", "giraffe", "backpack", "umbrella", "handbag", "tie", "suitcase", "frisbee",
"skis", "snowboard", "sports ball", "kite", "baseball bat", "baseball glove", "skateboard", "surfboard",
"tennis racket", "bottle", "wine glass", "cup", "fork", "knife", "spoon", "bowl", "banana", "apple", "sandwich",
"orange", "broccoli", "carrot", "hot dog", "pizza", "donut", "cake", "chair", "couch", "potted plant", "bed",
"dining table", "toilet", "tv", "laptop", "mouse", "remote", "keyboard", "cell phone", "microwave", "oven",
"toaster", "sink", "refrigerator", "book", "clock", "vase", "scissors", "teddy bear", "hair drier",
"toothbrush"])
But it still doesn't work for me.
Can anyone help?
ANSWER
Answered 2020-Jun-08 at 03:36
It says your model doesn't fit the pre-trained parameters you want to load.
Maybe check the model you're using and the .pth file and find out whether they match.
Or post the code of your model and let's see what's going wrong.
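As a hedged illustration of that suggestion (the file name is taken from the question above; the key filter is only an assumption about which layers matter), you can print the shapes stored in the checkpoint and compare them with the model you instantiate:
import torch

# Load the checkpoint on CPU and list the detection-head parameters whose
# shapes must match the instantiated model (e.g. RCNN_bbox_pred, RCNN_cls_score).
checkpoint = torch.load("faster_rcnn_1_10_9771.pth", map_location="cpu")
for name, tensor in checkpoint["model"].items():
    if "bbox_pred" in name or "cls_score" in name:
        print(name, tuple(tensor.shape))
A shape of (324, 2048) for RCNN_bbox_pred.weight (81 COCO classes x 4 box coordinates) against (4, 2048) in the current model suggests the checkpoint was trained with class-specific box regression, so the flags used to build the model (for example class_agnostic) likely need to match the ones used for training.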
QUESTION
I use EasyDict and want to assign a dict with a key of type int to it:
from easydict import EasyDict as edict
cfg = edict()
cfg.tt = {'0': 'aeroplane'} # this is ok
cfg.tt = {0: 'aeroplane'} # this raises error, but this is what I want to use!
How can I assign the dict I want? Thanks.
ANSWER
Answered 2020-Jan-31 at 13:04
This is because EasyDict converts any dict value to an EasyDict and makes an attribute out of every key. int values cannot be attribute names, so this won't work. You can install PermissiveDict, which does much the same as EasyDict but does not try to convert values to its own type.
pip install permissive-dict
Your example:
from permissive_dict import PermissiveDict
cfg = PermissiveDict()
cfg.tt = {'0': 'aeroplane'} # this is ok
cfg.tt = {0: 'aeroplane'} # this no longer raises an error
cfg.tt[0] == cfg.TT[0] == cfg.tT[0] == cfg.Tt[0] == 'aeroplane'
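To illustrate why the int key fails in the first place (a minimal sketch of the attribute-dict pattern, not EasyDict's actual source), the recursive wrapping turns every key into an attribute name, and setattr() only accepts strings:
class MiniDict(dict):
    # Toy version of the attribute-dict pattern EasyDict uses.
    def __init__(self, d=None):
        super().__init__()
        for k, v in (d or {}).items():
            setattr(self, k, v)           # raises TypeError when k is an int
    def __setattr__(self, name, value):
        if isinstance(value, dict) and not isinstance(value, MiniDict):
            value = MiniDict(value)       # recursively wrap nested dicts
        super().__setattr__(name, value)
        self[name] = value

try:
    MiniDict({0: 'aeroplane'})
except TypeError as err:
    print("int keys cannot become attributes:", err)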
QUESTION
I want to use TensorFlow 2.0 or later in Docker, in order to run srgan (https://github.com/tensorlayer/srgan).
My Dockerfile is:
FROM tensorflow/tensorflow:latest-gpu-py3
ENV HOME=/home
ENV user=hogehoge
WORKDIR $HOME
RUN useradd -u 1000 -m -d /home/${user} ${user} \
&& chown -R ${user} /home/${user}
RUN pip install tensorlayer easydict
USER ${USER}
I build the container with:
docker build -t tensorflow .
sudo docker run --rm --gpus all -it -v /media/hikarukondo/Workspace/BLUE_TAG/workspace/:/home/ tensorflow
In the container, I run:
python train.py
And then I get:
2020-01-14 05:39:56.390997: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libnvinfer.so.6
2020-01-14 05:39:56.392064: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libnvinfer_plugin.so.6
2020-01-14 05:40:00.523011: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcuda.so.1
2020-01-14 05:40:00.542402: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-01-14 05:40:00.542772: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1555] Found device 0 with properties:
pciBusID: 0000:01:00.0 name: GeForce RTX 2070 computeCapability: 7.5
coreClock: 1.62GHz coreCount: 36 deviceMemorySize: 7.79GiB deviceMemoryBandwidth: 417.29GiB/s
2020-01-14 05:40:00.542794: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2020-01-14 05:40:00.542831: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2020-01-14 05:40:00.543925: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2020-01-14 05:40:00.544139: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2020-01-14 05:40:00.545110: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2020-01-14 05:40:00.545615: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2020-01-14 05:40:00.545639: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-01-14 05:40:00.545738: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-01-14 05:40:00.546108: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-01-14 05:40:00.546413: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1697] Adding visible gpu devices: 0
2020-01-14 05:40:00.546665: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2020-01-14 05:40:00.567683: I tensorflow/core/platform/profile_utils/cpu_utils.cc:94] CPU Frequency: 3696000000 Hz
2020-01-14 05:40:00.567909: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x5795ae0 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2020-01-14 05:40:00.567922: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
2020-01-14 05:40:00.626426: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-01-14 05:40:00.626828: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x5776b10 initialized for platform CUDA (this does not guarantee that XLA will be used). Devices:
2020-01-14 05:40:00.626856: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): GeForce RTX 2070, Compute Capability 7.5
2020-01-14 05:40:00.627044: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-01-14 05:40:00.627339: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1555] Found device 0 with properties:
pciBusID: 0000:01:00.0 name: GeForce RTX 2070 computeCapability: 7.5
coreClock: 1.62GHz coreCount: 36 deviceMemorySize: 7.79GiB deviceMemoryBandwidth: 417.29GiB/s
2020-01-14 05:40:00.627360: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2020-01-14 05:40:00.627368: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcublas.so.10
2020-01-14 05:40:00.627382: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcufft.so.10
2020-01-14 05:40:00.627392: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcurand.so.10
2020-01-14 05:40:00.627402: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusolver.so.10
2020-01-14 05:40:00.627412: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcusparse.so.10
2020-01-14 05:40:00.627419: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-01-14 05:40:00.627460: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-01-14 05:40:00.627732: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-01-14 05:40:00.628005: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1697] Adding visible gpu devices: 0
2020-01-14 05:40:00.628040: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudart.so.10.1
2020-01-14 05:40:00.801827: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1096] Device interconnect StreamExecutor with strength 1 edge matrix:
2020-01-14 05:40:00.801853: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1102] 0
2020-01-14 05:40:00.801858: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1115] 0: N
2020-01-14 05:40:00.802029: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-01-14 05:40:00.802406: I tensorflow/stream_executor/cuda/cuda_gpu_executor.cc:981] successful NUMA node read from SysFS had negative value (-1), but there must be at least one NUMA node, so returning NUMA node zero
2020-01-14 05:40:00.802727: I tensorflow/core/common_runtime/gpu/gpu_device.cc:1241] Created TensorFlow device (/job:localhost/replica:0/task:0/device:GPU:0 with 6664 MB memory) -> physical GPU (device: 0, name: GeForce RTX 2070, pci bus id: 0000:01:00.0, compute capability: 7.5)
2020-01-14 05:40:01.135124: I tensorflow/stream_executor/platform/default/dso_loader.cc:44] Successfully opened dynamic library libcudnn.so.7
2020-01-14 05:40:01.604467: E tensorflow/stream_executor/cuda/cuda_dnn.cc:329] Could not create cudnn handle: CUDNN_STATUS_INTERNAL_ERROR
2020-01-14 05:40:01.609256: E tensorflow/stream_executor/cuda/cuda_dnn.cc:329] Could not create cudnn handle: CUDNN_STATUS_INTERNAL_ERROR
Traceback (most recent call last):
File "train.py", line 204, in
evaluate()
File "train.py", line 171, in evaluate
G = get_G([1, None, None, 3])
File "/home/srgan/model.py", line 14, in get_G
n = Conv2d(64, (3, 3), (1, 1), act=tf.nn.relu, padding='SAME', W_init=w_init)(nin)
File "/usr/local/lib/python3.6/dist-packages/tensorlayer/layers/core.py", line 225, in __call__
outputs = self.forward(input_tensors, *args, **kwargs)
File "/usr/local/lib/python3.6/dist-packages/tensorlayer/layers/convolution/simplified_conv.py", line 271, in forward
name=self.name,
File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/nn_ops.py", line 1914, in conv2d_v2
name=name)
File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/nn_ops.py", line 2011, in conv2d
name=name)
File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/ops/gen_nn_ops.py", line 937, in conv2d
_ops.raise_from_not_ok_status(e, name)
File "/usr/local/lib/python3.6/dist-packages/tensorflow_core/python/framework/ops.py", line 6606, in raise_from_not_ok_status
six.raise_from(core._status_to_exception(e.code, message), None)
File "", line 3, in raise_from
tensorflow.python.framework.errors_impl.UnknownError: Failed to get convolution algorithm. This is probably because cuDNN failed to initialize, so try looking to see if a warning log message was printed above. [Op:Conv2D] name: conv2d_1
Docker version is 19.03.5. I have 1 GeForce RTX 2070 installed and available on my machine. My current driver version is 440.33.01.
Am I doing something wrong, or is there an issue with the Docker build?
ANSWER
Answered 2020-Jan-14 at 05:57
Can you try setting:
config.gpu_options.allow_growth = True
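For reference, allow_growth belongs to the TF 1.x session config; since the tensorflow/tensorflow:latest-gpu-py3 image used in the question ships TensorFlow 2.x, a hedged sketch of both ways to enable incremental GPU memory allocation is:
import tensorflow as tf

# TF 2.x: grab GPU memory incrementally instead of reserving it all at once.
# This must run before any op allocates GPU memory.
for gpu in tf.config.experimental.list_physical_devices('GPU'):
    tf.config.experimental.set_memory_growth(gpu, True)

# TF 1.x (or tf.compat.v1) equivalent, using the option named in the answer:
# config = tf.compat.v1.ConfigProto()
# config.gpu_options.allow_growth = True
# sess = tf.compat.v1.Session(config=config)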
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.
Vulnerabilities
No vulnerabilities reported
Install easydict
You can use easydict like any standard Python library. You will need to make sure that you have a development environment consisting of a Python distribution including header files, a compiler, pip, and git installed. Make sure that your pip, setuptools, and wheel are up to date. When using pip it is generally recommended to install packages in a virtual environment to avoid changes to the system.
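For example, assuming pip is available, a typical installation into a fresh virtual environment looks like this:
python -m venv easydict-env
source easydict-env/bin/activate
pip install --upgrade pip setuptools wheel
pip install easydict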