urllib3 | Python HTTP library with thread-safe connection pooling | HTTP library
kandi X-RAY | urllib3 Summary
- Open a URL on a given connection.
- Create a new SSL (urllib3) context.
- Connect to the remote server.
- Parse a URL and return a Url object (see the short parse_url sketch after this list).
- Increment the number of retries.
- Perform a handshake.
- Wrap a socket.
- Read data from the underlying connection.
- Generate the headers to use.
- Encode the request body.
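To make the parse helper above concrete, here is a small illustrative sketch (not part of the kandi summary) using urllib3's public parse_url utility; the URL is an arbitrary example:

from urllib3.util import parse_url

# parse_url returns a Url object whose components are available as attributes.
url = parse_url("https://example.org:8443/search?q=urllib3")
print(url.scheme)       # 'https'
print(url.host)         # 'example.org'
print(url.port)         # 8443
print(url.request_uri)  # '/search?q=urllib3'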
urllib3 Key Features
urllib3 Examples and Code Snippets
>>> import httpx
>>> from urllib3_transport import URLLib3Transport
>>> client = httpx.Client(transport=URLLib3Transport())
>>> client.get("https://example.org")
Read more in this issue.
import urllib3.contrib.securetransport
urllib3.contrib.securetransport.inject_into_urllib3()
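For comparison with the transport and SecureTransport snippets above, a minimal sketch of plain urllib3 usage (example.org is just a placeholder host):

import urllib3

# PoolManager provides the thread-safe connection pooling the library is known for.
http = urllib3.PoolManager()
resp = http.request("GET", "https://example.org/")
print(resp.status)                          # HTTP status code
print(resp.headers.get("Content-Type"))     # a response header
print(resp.data[:100])                      # first bytes of the response body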
Trending Discussions on urllib3
QUESTION
I'm trying to install eth-brownie using 'pipx install eth-brownie' but I get an error saying
pip failed to build package: cytoolz
Some possibly relevant errors from pip install:
build\lib.win-amd64-3.10\cytoolz\functoolz.cp310-win_amd64.pyd : fatal error LNK1120: 1 unresolved externals
error: command 'C:\\Program Files (x86)\\Microsoft Visual Studio\\2019\\BuildTools\\VC\\Tools\\MSVC\\14.29.30133\\bin\\HostX86\\x64\\link.exe' failed with exit code 1120
I've had a look at the log file and it shows that it failed to build cytoolz. It also mentions "ALERT: Cython not installed. Building without Cython." From my limited understanding, cytoolz is a part of Cython, so I think the eth-brownie installation failed because it could not build cytoolz without Cython. The thing is, I already have Cython installed:
C:\Users\alaiy>pip install cython
Requirement already satisfied: cython in c:\python310\lib\site-packages (0.29.24)
Extract from the log file (I can paste the whole thing but it's lengthy):
Building wheels for collected packages: bitarray, cytoolz, lru-dict, parsimonious, psutil, pygments-lexer-solidity, varint, websockets, wrapt
Building wheel for bitarray (setup.py): started
Building wheel for bitarray (setup.py): finished with status 'done'
Created wheel for bitarray: filename=bitarray-1.2.2-cp310-cp310-win_amd64.whl size=55783 sha256=d4ae97234d659ed9ff1f0c0201e82c7e321bd3f4e122f6c2caee225172e7bfb2
Stored in directory: c:\users\alaiy\appdata\local\pip\cache\wheels\1d\29\a8\5364620332cc833df35535f54074cf1e51f94d07d2a660bd6d
Building wheel for cytoolz (setup.py): started
Building wheel for cytoolz (setup.py): finished with status 'error'
Running setup.py clean for cytoolz
Building wheel for lru-dict (setup.py): started
Building wheel for lru-dict (setup.py): finished with status 'done'
Created wheel for lru-dict: filename=lru_dict-1.1.7-cp310-cp310-win_amd64.whl size=12674 sha256=6a7e7b2068eb8481650e0a2ae64c94223b3d2c018f163c5a0e7c1d442077450a
Stored in directory: c:\users\alaiy\appdata\local\pip\cache\wheels\47\0a\dc\b156cb52954bbc1c31b4766ca3f0ed9eae9b218812bca89d7b
Building wheel for parsimonious (setup.py): started
Building wheel for parsimonious (setup.py): finished with status 'done'
Created wheel for parsimonious: filename=parsimonious-0.8.1-py3-none-any.whl size=42724 sha256=f9235a9614af0f5204d6bb35b8bd30b9456eae3021b5c2a9904345ad7d07a49d
Stored in directory: c:\users\alaiy\appdata\local\pip\cache\wheels\b1\12\f1\7a2f39b30d6780ae9f2be9a52056595e0d97c1b4531d183085
Building wheel for psutil (setup.py): started
Building wheel for psutil (setup.py): finished with status 'done'
Created wheel for psutil: filename=psutil-5.8.0-cp310-cp310-win_amd64.whl size=246135 sha256=834ab1fd1dd0c18e574fc0fbf07922e605169ac68be70b8a64fb90c49ad4ae9b
Stored in directory: c:\users\alaiy\appdata\local\pip\cache\wheels\12\a3\6d\615295409067d58a62a069d30d296d61d3ac132605e3a9555c
Building wheel for pygments-lexer-solidity (setup.py): started
Building wheel for pygments-lexer-solidity (setup.py): finished with status 'done'
Created wheel for pygments-lexer-solidity: filename=pygments_lexer_solidity-0.7.0-py3-none-any.whl size=7321 sha256=46355292f790d07d941a745cd58b64c5592e4c24357f7cc80fe200c39ab88d32
Stored in directory: c:\users\alaiy\appdata\local\pip\cache\wheels\36\fd\bc\6ff4fe156d46016eca64c9652a1cd7af6411070c88acbeabf5
Building wheel for varint (setup.py): started
Building wheel for varint (setup.py): finished with status 'done'
Created wheel for varint: filename=varint-1.0.2-py3-none-any.whl size=1979 sha256=36b744b26ba7534a494757e16ab6e171d9bb60a4fe4663557d57034f1150b678
Stored in directory: c:\users\alaiy\appdata\local\pip\cache\wheels\39\48\5e\33919c52a2a695a512ca394a5308dd12626a40bbcd288de814
Building wheel for websockets (setup.py): started
Building wheel for websockets (setup.py): finished with status 'done'
Created wheel for websockets: filename=websockets-9.1-cp310-cp310-win_amd64.whl size=91765 sha256=a00a9c801269ea2b86d72c0b0b654dc67672519721afeac8f912a157e52901c0
Stored in directory: c:\users\alaiy\appdata\local\pip\cache\wheels\79\f7\4e\873eca27ecd6d7230caff265283a5a5112ad4cd1d945c022dd
Building wheel for wrapt (setup.py): started
Building wheel for wrapt (setup.py): finished with status 'done'
Created wheel for wrapt: filename=wrapt-1.12.1-cp310-cp310-win_amd64.whl size=33740 sha256=ccd729b6e3915164ac4994aef731f21cd232466b3f6c4823c9fda14b07e821c3
Stored in directory: c:\users\alaiy\appdata\local\pip\cache\wheels\8e\61\d3\d9e7053100177668fa43216a8082868c55015f8706abd974f2
Successfully built bitarray lru-dict parsimonious psutil pygments-lexer-solidity varint websockets wrapt
Failed to build cytoolz
Installing collected packages: toolz, eth-typing, eth-hash, cytoolz, six, pyparsing, eth-utils, varint, urllib3, toml, rlp, pyrsistent, pycryptodome, py, pluggy, parsimonious, packaging, netaddr, multidict, iniconfig, idna, hexbytes, eth-keys, colorama, charset-normalizer, certifi, base58, attrs, atomicwrites, yarl, typing-extensions, requests, python-dateutil, pytest, multiaddr, jsonschema, inflection, eth-rlp, eth-keyfile, eth-abi, chardet, bitarray, async-timeout, websockets, wcwidth, tomli, sortedcontainers, semantic-version, regex, pywin32, pytest-forked, pyjwt, pygments, protobuf, platformdirs, pathspec, mythx-models, mypy-extensions, lru-dict, ipfshttpclient, execnet, eth-account, dataclassy, click, asttokens, aiohttp, wrapt, web3, vyper, vvm, tqdm, pyyaml, pythx, python-dotenv, pytest-xdist, pygments-lexer-solidity, py-solc-x, py-solc-ast, psutil, prompt-toolkit, lazy-object-proxy, hypothesis, eth-event, eip712, black, eth-brownie
Running setup.py install for cytoolz: started
Running setup.py install for cytoolz: finished with status 'error'
PIP STDERR
----------
WARNING: The candidate selected for download or install is a yanked version: 'protobuf' candidate (version 3.18.0 at https://files.pythonhosted.org/packages/74/4e/9f3cb458266ef5cdeaa1e72a90b9eda100e3d1803cbd7ec02f0846da83c3/protobuf-3.18.0-py2.py3-none-any.whl#sha256=615099e52e9fbc9fde00177267a94ca820ecf4e80093e390753568b7d8cb3c1a (from https://pypi.org/simple/protobuf/))
Reason for being yanked: This version claims to support Python 2 but does not
ERROR: Command errored out with exit status 1:
command: 'C:\Users\alaiy\.local\pipx\venvs\eth-brownie\Scripts\python.exe' -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'C:\\Users\\alaiy\\AppData\\Local\\Temp\\pip-install-d1bskwa2\\cytoolz_f765f335272241adba2138f1920a35cd\\setup.py'"'"'; __file__='"'"'C:\\Users\\alaiy\\AppData\\Local\\Temp\\pip-install-d1bskwa2\\cytoolz_f765f335272241adba2138f1920a35cd\\setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(__file__) if os.path.exists(__file__) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d 'C:\Users\alaiy\AppData\Local\Temp\pip-wheel-pxzumeav'
cwd: C:\Users\alaiy\AppData\Local\Temp\pip-install-d1bskwa2\cytoolz_f765f335272241adba2138f1920a35cd\
Complete output (70 lines):
ALERT: Cython not installed. Building without Cython.
running bdist_wheel
running build
running build_py
creating build
creating build\lib.win-amd64-3.10
creating build\lib.win-amd64-3.10\cytoolz
copying cytoolz\compatibility.py -> build\lib.win-amd64-3.10\cytoolz
copying cytoolz\utils_test.py -> build\lib.win-amd64-3.10\cytoolz
Any help would be appreciated!
Edit: Found a solution. Cython appears not to be supported on Python 3.10 (ref https://github.com/eth-brownie/brownie/issues/1300 and https://github.com/cython/cython/issues/4046). I downgraded to Python 3.9.7 and the eth-brownie installation worked!
ANSWER
Answered 2022-Jan-02 at 09:59
I used pip install eth-brownie and it worked fine; I didn't need to downgrade. I'm new to this, so I could be wrong, but it worked fine for me.
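One detail worth checking, as a hedged aside: pipx builds eth-brownie inside its own isolated virtual environment, so a Cython installed into the system interpreter is not necessarily visible to that build. A tiny diagnostic sketch (illustrative only) that reports which interpreter is running and whether Cython is importable from it:

import sys

# Shows which interpreter is executing and whether Cython is importable from it.
print("Interpreter:", sys.executable, "| Python", sys.version.split()[0])
try:
    import Cython
    print("Cython", Cython.__version__, "is available to this interpreter.")
except ImportError:
    print("Cython is NOT importable from this interpreter.")

Run it with the system Python and again inside the environment that does the build; as the question's edit notes, the underlying issue here was Python 3.10 support rather than a missing Cython.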
QUESTION
I want to install packages from my poetry.lock file, using poetry install.
However, the majority of packages throw the exact same error, indicating a shared fundamental problem.
What is causing this? What is the standard fix?
Specification:
- Windows 10,
- Visual Studio Code,
- Python 3.8.10 & Poetry 1.1.11,
- Ubuntu Bash.
Terminal:
rm poetry.lock
poetry update
poetry install
me@PF2DCSXD:/mnt/c/Users/user/Documents/GitHub/workers-python/workers/composite_key$ poetry update
Updating dependencies
Resolving dependencies... (217.2s)
Writing lock file
Package operations: 55 installs, 8 updates, 0 removals
• Updating pyparsing (3.0.4 -> 2.4.7)
• Updating pyyaml (5.4.1 -> 6.0)
• Installing arrow (1.2.1)
• Installing chardet (4.0.0)
• Updating itsdangerous (1.1.0 -> 2.0.1)
• Updating jinja2 (2.11.3 -> 3.0.2)
• Updating packaging (20.9 -> 21.2)
• Installing text-unidecode (1.3)
• Updating werkzeug (1.0.1 -> 2.0.2)
• Installing binaryornot (0.4.4)
• Installing bokeh (2.4.1): Failed
AttributeError
'Link' object has no attribute 'name'
at ~/.local/share/pypoetry/venv/lib/python3.8/site-packages/poetry/installation/executor.py:632 in _download_link
628│ raise RuntimeError(
629│ "Invalid hashes ({}) for {} using archive {}. Expected one of {}.".format(
630│ ", ".join(sorted(archive_hashes)),
631│ package,
→ 632│ archive.name,
633│ ", ".join(sorted(hashes)),
634│ )
635│ )
636│
• Updating flask (1.1.4 -> 2.0.2)
• Installing jinja2-time (0.2.0)
• Installing poyo (0.5.0)
• Installing python-slugify (5.0.2)
me@PF2DCSXD:/mnt/c/Users/user/Documents/GitHub/workers-python/workers/composite_key$ ls
Dockerfile azure-pipeline-composite_key.yaml compositekey docs poetry.lock pyproject.toml
me@PF2DCSXD:/mnt/c/Users/user/Documents/GitHub/workers-python/workers/composite_key$ poetry install
Installing dependencies from lock file
Package operations: 48 installs, 1 update, 3 removals
• Removing cffi (1.15.0)
• Removing colorama (0.4.4)
• Removing pycparser (2.20)
• Installing bokeh (2.4.1): Failed
AttributeError
'Link' object has no attribute 'name'
at ~/.local/share/pypoetry/venv/lib/python3.8/site-packages/poetry/installation/executor.py:632 in _download_link
628│ raise RuntimeError(
629│ "Invalid hashes ({}) for {} using archive {}. Expected one of {}.".format(
630│ ", ".join(sorted(archive_hashes)),
631│ package,
→ 632│ archive.name,
633│ ", ".join(sorted(hashes)),
634│ )
635│ )
636│
Suggested Solution Failed:
danielbellhv@PF2DCSXD:/mnt/c/Users/dabell/Documents/GitHub/workers-python/workers/data_simulator$ pip install poetry==1.1.7
Defaulting to user installation because normal site-packages is not writeable
Collecting poetry==1.1.7
Downloading poetry-1.1.7-py2.py3-none-any.whl (173 kB)
|████████████████████████████████| 173 kB 622 kB/s
Requirement already satisfied: cachy<0.4.0,>=0.3.0 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (0.3.0)
Requirement already satisfied: requests-toolbelt<0.10.0,>=0.9.1 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (0.9.1)
Requirement already satisfied: pkginfo<2.0,>=1.4 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (1.7.1)
Requirement already satisfied: shellingham<2.0,>=1.1 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (1.4.0)
Requirement already satisfied: tomlkit<1.0.0,>=0.7.0 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (0.7.2)
Requirement already satisfied: cachecontrol[filecache]<0.13.0,>=0.12.4 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (0.12.8)
Requirement already satisfied: html5lib<2.0,>=1.0 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (1.1)
Requirement already satisfied: poetry-core<1.1.0,>=1.0.3 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (1.0.4)
Requirement already satisfied: crashtest<0.4.0,>=0.3.0 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (0.3.1)
Requirement already satisfied: clikit<0.7.0,>=0.6.2 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (0.6.2)
Requirement already satisfied: keyring<22.0.0,>=21.2.0 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (21.8.0)
Requirement already satisfied: pexpect<5.0.0,>=4.7.0 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (4.8.0)
Requirement already satisfied: cleo<0.9.0,>=0.8.1 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (0.8.1)
Requirement already satisfied: virtualenv<21.0.0,>=20.0.26 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (20.10.0)
Requirement already satisfied: packaging<21.0,>=20.4 in /home/danielbellhv/.local/lib/python3.8/site-packages (from poetry==1.1.7) (20.9)
Requirement already satisfied: requests<3.0,>=2.18 in /usr/lib/python3/dist-packages (from poetry==1.1.7) (2.22.0)
Requirement already satisfied: msgpack>=0.5.2 in /home/danielbellhv/.local/lib/python3.8/site-packages (from cachecontrol[filecache]<0.13.0,>=0.12.4->poetry==1.1.7) (1.0.2)
Requirement already satisfied: lockfile>=0.9 in /home/danielbellhv/.local/lib/python3.8/site-packages (from cachecontrol[filecache]<0.13.0,>=0.12.4->poetry==1.1.7) (0.12.2)
Requirement already satisfied: pastel<0.3.0,>=0.2.0 in /home/danielbellhv/.local/lib/python3.8/site-packages (from clikit<0.7.0,>=0.6.2->poetry==1.1.7) (0.2.1)
Requirement already satisfied: pylev<2.0,>=1.3 in /home/danielbellhv/.local/lib/python3.8/site-packages (from clikit<0.7.0,>=0.6.2->poetry==1.1.7) (1.4.0)
Requirement already satisfied: webencodings in /home/danielbellhv/.local/lib/python3.8/site-packages (from html5lib<2.0,>=1.0->poetry==1.1.7) (0.5.1)
Requirement already satisfied: six>=1.9 in /usr/lib/python3/dist-packages (from html5lib<2.0,>=1.0->poetry==1.1.7) (1.14.0)
Requirement already satisfied: jeepney>=0.4.2 in /home/danielbellhv/.local/lib/python3.8/site-packages (from keyring<22.0.0,>=21.2.0->poetry==1.1.7) (0.7.1)
Requirement already satisfied: SecretStorage>=3.2 in /home/danielbellhv/.local/lib/python3.8/site-packages (from keyring<22.0.0,>=21.2.0->poetry==1.1.7) (3.3.1)
Requirement already satisfied: pyparsing>=2.0.2 in /home/danielbellhv/.local/lib/python3.8/site-packages (from packaging<21.0,>=20.4->poetry==1.1.7) (2.4.7)
Requirement already satisfied: ptyprocess>=0.5 in /home/danielbellhv/.local/lib/python3.8/site-packages (from pexpect<5.0.0,>=4.7.0->poetry==1.1.7) (0.7.0)
Requirement already satisfied: platformdirs<3,>=2 in /home/danielbellhv/.local/lib/python3.8/site-packages (from virtualenv<21.0.0,>=20.0.26->poetry==1.1.7) (2.4.0)
Requirement already satisfied: distlib<1,>=0.3.1 in /home/danielbellhv/.local/lib/python3.8/site-packages (from virtualenv<21.0.0,>=20.0.26->poetry==1.1.7) (0.3.3)
Requirement already satisfied: filelock<4,>=3.2 in /home/danielbellhv/.local/lib/python3.8/site-packages (from virtualenv<21.0.0,>=20.0.26->poetry==1.1.7) (3.3.2)
Requirement already satisfied: backports.entry-points-selectable>=1.0.4 in /home/danielbellhv/.local/lib/python3.8/site-packages (from virtualenv<21.0.0,>=20.0.26->poetry==1.1.7) (1.1.0)
Requirement already satisfied: cryptography>=2.0 in /usr/lib/python3/dist-packages (from SecretStorage>=3.2->keyring<22.0.0,>=21.2.0->poetry==1.1.7) (2.8)
Installing collected packages: poetry
Successfully installed poetry-1.1.7
danielbellhv@PF2DCSXD:/mnt/c/Users/dabell/Documents/GitHub/workers-python/workers/data_simulator$ pip install poetry-core==1.0.4
Defaulting to user installation because normal site-packages is not writeable
Requirement already satisfied: poetry-core==1.0.4 in /home/danielbellhv/.local/lib/python3.8/site-packages (1.0.4)
danielbellhv@PF2DCSXD:/mnt/c/Users/dabell/Documents/GitHub/workers-python/workers/data_simulator$ poetry install
Installing dependencies from lock file
Package operations: 82 installs, 0 updates, 0 removals
• Installing certifi (2021.5.30): Pending...
• Installing charset-normalizer (2.0.3): Failed
JSONDecodeError
Expecting value: line 1 column 1 (char 0)
at /usr/lib/python3/dist-packages/simplejson/decoder.py:400 in raw_decode
396│ if ord0 == 0xfeff:
397│ idx += 1
398│ elif ord0 == 0xef and s[idx:idx + 3] == '\xef\xbb\xbf':
399│ idx += 3
→ 400│ return self.scan_once(s, idx=_w(s, idx).end())
401│
• Installing idna (3.2): Failed
JSONDecodeError
Expecting value: line 1 column 1 (char 0)
at /usr/lib/python3/dist-packages/simplejson/decoder.py:400 in raw_decode
396│ if ord0 == 0xfeff:
397│ idx += 1
398│ elif ord0 == 0xef and s[idx:idx + 3] == '\xef\xbb\xbf':
399│ idx += 3
→ 400│ return self.scan_once(s, idx=_w(s, idx).end())
401│
• Installing pyasn1 (0.4.8): Failed
JSONDecodeError
Expecting value: line 1 column 1 (char 0)
at /usr/lib/python3/dist-packages/simplejson/decoder.py:400 in raw_decode
396│ if ord0 == 0xfeff:
397│ idx += 1
398│ elif ord0 == 0xef and s[idx:idx + 3] == '\xef\xbb\xbf':
399│ idx += 3
→ 400│ return self.scan_once(s, idx=_w(s, idx).end())
401│
• Installing certifi (2021.5.30): Failed
...
• Installing charset-normalizer (2.0.3): Failed
...
• Installing idna (3.2): Failed
...
• Installing pyasn1 (0.4.8): Failed
...
• Installing urllib3 (1.26.6): Failed
...
Please let me know if there is anything else I can add to the post.
ANSWER
Answered 2022-Mar-23 at 10:22
This looks to be an active issue relating to Poetry. See here - Issue #4085. Some suggest a workaround by downgrading poetry-core to 1.0.4.
There is an active PR to fix the issue.
QUESTION
I have been trying out an open-sourced personal AI assistant script. The script works fine, but I want to create an executable so that I can gift it to one of my friends. However, when I try to create the executable using auto-py-to-exe, it shows the error below:
Running auto-py-to-exe v2.10.1
Building directory: C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x
Provided command: pyinstaller --noconfirm --onedir --console --no-embed-manifest "C:/Users/Tarun/AppData/Local/Programs/Python/Python310/AI_Ass.py"
Recursion Limit is set to 5000
Executing: pyinstaller --noconfirm --onedir --console --no-embed-manifest C:/Users/Tarun/AppData/Local/Programs/Python/Python310/AI_Ass.py --distpath C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x\application --workpath C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x\build --specpath C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x
42681 INFO: PyInstaller: 4.6
42690 INFO: Python: 3.10.0
42732 INFO: Platform: Windows-10-10.0.19042-SP0
42744 INFO: wrote C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x\AI_Ass.spec
42764 INFO: UPX is not available.
42772 INFO: Extending PYTHONPATH with paths
['C:\\Users\\Tarun\\AppData\\Local\\Programs\\Python\\Python310']
43887 INFO: checking Analysis
43891 INFO: Building Analysis because Analysis-00.toc is non existent
43895 INFO: Initializing module dependency graph...
43915 INFO: Caching module graph hooks...
43975 INFO: Analyzing base_library.zip ...
54298 INFO: Processing pre-find module path hook distutils from 'C:\\Users\\Tarun\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\PyInstaller\\hooks\\pre_find_module_path\\hook-distutils.py'.
54306 INFO: distutils: retargeting to non-venv dir 'C:\\Users\\Tarun\\AppData\\Local\\Programs\\Python\\Python310\\lib'
57474 INFO: Caching module dependency graph...
58088 INFO: running Analysis Analysis-00.toc
58132 INFO: Adding Microsoft.Windows.Common-Controls to dependent assemblies of final executable
required by C:\Users\Tarun\AppData\Local\Programs\Python\Python310\python.exe
58365 INFO: Analyzing C:\Users\Tarun\AppData\Local\Programs\Python\Python310\AI_Ass.py
59641 INFO: Processing pre-safe import module hook urllib3.packages.six.moves from 'C:\\Users\\Tarun\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\PyInstaller\\hooks\\pre_safe_import_module\\hook-urllib3.packages.six.moves.py'.
An error occurred while packaging
Traceback (most recent call last):
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\auto_py_to_exe\packaging.py", line 131, in package
run_pyinstaller()
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\__main__.py", line 124, in run
run_build(pyi_config, spec_file, **vars(args))
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\__main__.py", line 58, in run_build
PyInstaller.building.build_main.main(pyi_config, spec_file, **kwargs)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\building\build_main.py", line 782, in main
build(specfile, kw.get('distpath'), kw.get('workpath'), kw.get('clean_build'))
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\building\build_main.py", line 714, in build
exec(code, spec_namespace)
File "C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x\AI_Ass.spec", line 7, in
a = Analysis(['C:/Users/Tarun/AppData/Local/Programs/Python/Python310/AI_Ass.py'],
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\building\build_main.py", line 277, in __init__
self.__postinit__()
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\building\datastruct.py", line 155, in __postinit__
self.assemble()
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\building\build_main.py", line 439, in assemble
priority_scripts.append(self.graph.add_script(script))
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 265, in add_script
self._top_script_node = super().add_script(pathname)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1433, in add_script
self._process_imports(n)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
target_module = self._safe_import_hook(*import_info, **kwargs)[0]
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
target_modules = self.import_hook(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1505, in import_hook
target_package, target_module_partname = self._find_head_package(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1684, in _find_head_package
target_package = self._safe_import_module(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
return super()._safe_import_module(module_basename, module_name, parent_package)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
self._process_imports(n)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
target_module = self._safe_import_hook(*import_info, **kwargs)[0]
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
target_modules = self.import_hook(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1505, in import_hook
target_package, target_module_partname = self._find_head_package(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1684, in _find_head_package
target_package = self._safe_import_module(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
return super()._safe_import_module(module_basename, module_name, parent_package)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
self._process_imports(n)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
target_module = self._safe_import_hook(*import_info, **kwargs)[0]
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
target_modules = self.import_hook(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1505, in import_hook
target_package, target_module_partname = self._find_head_package(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1684, in _find_head_package
target_package = self._safe_import_module(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
return super()._safe_import_module(module_basename, module_name, parent_package)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
self._process_imports(n)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
target_module = self._safe_import_hook(*import_info, **kwargs)[0]
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
target_modules = self.import_hook(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1505, in import_hook
target_package, target_module_partname = self._find_head_package(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1684, in _find_head_package
target_package = self._safe_import_module(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
return super()._safe_import_module(module_basename, module_name, parent_package)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
self._process_imports(n)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
target_module = self._safe_import_hook(*import_info, **kwargs)[0]
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
target_modules = self.import_hook(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1518, in import_hook
submodule = self._safe_import_module(head, mname, submodule)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
return super()._safe_import_module(module_basename, module_name, parent_package)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
self._process_imports(n)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
target_module = self._safe_import_hook(*import_info, **kwargs)[0]
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
target_modules = self.import_hook(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1518, in import_hook
submodule = self._safe_import_module(head, mname, submodule)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
return super()._safe_import_module(module_basename, module_name, parent_package)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
self._process_imports(n)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
target_module = self._safe_import_hook(*import_info, **kwargs)[0]
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
target_modules = self.import_hook(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1518, in import_hook
submodule = self._safe_import_module(head, mname, submodule)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
return super()._safe_import_module(module_basename, module_name, parent_package)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2061, in _safe_import_module
n = self._scan_code(module, co, co_ast)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2645, in _scan_code
self._scan_bytecode(
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2749, in _scan_bytecode
for inst in util.iterate_instructions(module_code_object):
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\util.py", line 147, in iterate_instructions
yield from iterate_instructions(constant)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\util.py", line 139, in iterate_instructions
yield from get_instructions(code_object)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\dis.py", line 338, in _get_instructions_bytes
argval, argrepr = _get_const_info(arg, constants)
File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\dis.py", line 292, in _get_const_info
argval = const_list[const_index]
IndexError: tuple index out of range
Project output will not be moved to output folder
Complete.
I understand that there is already a thread about a similar issue, but it still doesn't solve the problem, hence seeking help here.
I really have no idea why the error occurs or how to resolve it. I am pasting the script below for your reference. Can someone please help? Thank you in advance.
#importing libraries
import speech_recognition as sr
import pyttsx3
import datetime
import wikipedia
import webbrowser
import os
import time
import subprocess
from ecapture import ecapture as ec
import wolframalpha
import json
import requests

#setting up speech engine
engine=pyttsx3.init('sapi5')
voices=engine.getProperty('voices')
engine.setProperty('voice','voices[1].id')

def speak(text):
    engine.say(text)
    engine.runAndWait()

#Greet user
def wishMe():
    hour=datetime.datetime.now().hour
    if hour>=0 and hour<12:
        speak("Hello,Good Morning")
        print("Hello,Good Morning")
    elif hour>=12 and hour<18:
        speak("Hello,Good Afternoon")
        print("Hello,Good Afternoon")
    else:
        speak("Hello,Good Evening")
        print("Hello,Good Evening")

#Setting up the command function for your AI assistant
def takeCommand():
    r=sr.Recognizer()
    with sr.Microphone() as source:
        print("Listening...")
        audio=r.listen(source)
        try:
            statement=r.recognize_google(audio,language='en-in')
            print(f"user said:{statement}\n")
        except Exception as e:
            speak("Pardon me, please say that again")
            return "None"
        return statement

print("Loading your AI personal assistant Friday")
speak("Loading your AI personal assistant Friday")
wishMe()

#main function
if __name__=='__main__':
    while True:
        speak("Tell me how can I help you now?")
        statement = takeCommand().lower()
        if statement==0:
            continue
        if "good bye" in statement or "ok bye" in statement or "stop" in statement:
            speak('your personal assistant Friday is shutting down,Good bye')
            print('your personal assistant Friday is shutting down,Good bye')
            break
        if 'wikipedia' in statement:
            speak('Searching Wikipedia...')
            statement =statement.replace("wikipedia", "")
            results = wikipedia.summary(statement, sentences=10)
            webbrowser.open_new_tab("https://en.wikipedia.org/wiki/"+ statement)
            speak("According to Wikipedia")
            print(results)
            speak(results)
        elif 'open youtube' in statement:
            webbrowser.register('chrome', None,
                webbrowser.BackgroundBrowser("C://Program Files (x86)//Google//Chrome//Application//chrome.exe"))
            webbrowser.get('chrome').open_new_tab("https://www.youtube.com")
            #webbrowser.open_new_tab("https://www.youtube.com")
            speak("youtube is open now")
            time.sleep(5)
        elif 'open google' in statement:
            webbrowser.open_new_tab("https://www.google.com")
            speak("Google chrome is open now")
            time.sleep(5)
        elif 'open gmail' in statement:
            webbrowser.open_new_tab("gmail.com")
            speak("Google Mail open now")
            time.sleep(5)
        elif 'time' in statement:
            strTime=datetime.datetime.now().strftime("%H:%M:%S")
            speak(f"the time is {strTime}")
        elif 'news' in statement:
            news = webbrowser.open_new_tab("https://timesofindia.indiatimes.com/home/headlines")
            speak('Here are some headlines from the Times of India,Happy reading')
            time.sleep(6)
        elif "camera" in statement or "take a photo" in statement:
            ec.capture(0,"robo camera","img.jpg")
        elif 'search' in statement:
            statement = statement.replace("search", "")
            webbrowser.open_new_tab(statement)
            time.sleep(5)
        elif 'who are you' in statement or 'what can you do' in statement:
            speak('I am Friday version 1 point O your personal assistant. I am programmed to minor tasks like'
                'opening youtube,google chrome, gmail and stackoverflow ,predict time,take a photo,search wikipedia,predict weather'
                'In different cities, get top headline news from times of india and you can ask me computational or geographical questions too!')
        elif "who made you" in statement or "who created you" in statement or "who discovered you" in statement:
            speak("I was built by Mirthula")
            print("I was built by Mirthula")
        elif "log off" in statement or "sign out" in statement:
            speak("Ok , your pc will log off in 10 sec make sure you exit from all applications")
            subprocess.call(["shutdown", "/l"])
            time.sleep(3)
ANSWER
Answered 2021-Nov-05 at 02:20
42681 INFO: PyInstaller: 4.6
42690 INFO: Python: 3.10.0
There's the issue. Python 3.10.0 has a bug with PyInstaller 4.6. The problem isn't you or PyInstaller. Try converting it using Python 3.9.7 instead. Ironic, considering 3.10.0 was supposed to be a bugfix update.
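As a hedged companion to this answer, a small check that prints the combination actually in use before packaging (the problematic pairing is taken from the answer above, not from PyInstaller's documentation):

import sys
import PyInstaller

# Print the interpreter and PyInstaller versions the build would use.
print("Python:", sys.version.split()[0], "| PyInstaller:", PyInstaller.__version__)
if sys.version_info[:2] == (3, 10) and PyInstaller.__version__.startswith("4.6"):
    print("This is the combination the answer flags as problematic; "
          "consider building under Python 3.9.x or upgrading PyInstaller.")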
QUESTION
I read a ton of articles, but still can't figure out what I'm missing. I'm running a Django website from a virtualenv. Here's my config file. The website address has been removed; I can't share it here.
Config
ServerAdmin sidharth@collaboration-management
ServerName
ServerAlias
DocumentRoot /home/sidharth/Downloads/gmcweb
ErrorLog ${APACHE_LOG_DIR}/error.log
CustomLog ${APACHE_LOG_DIR}/access.log combined
Alias /static /home/sidharth/Downloads/gmcweb/static
Require all granted
Require all granted
WSGIDaemonProcess gmcweb python-home=/home/sidharth/Downloads/gmcwebenvlin python-path=/home/sidharth/Downloads/gmcweb
WSGIProcessGroup gmcweb
WSGIScriptAlias / /home/sidharth/Downloads/gmcweb/gmcweb/wsgi.py
Here's my wsgi.py file; I didn't change anything, never had to earlier.
import os
from django.core.wsgi import get_wsgi_application
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'gmcweb.settings')
application = get_wsgi_application()
Python Versions
My virtualenv Python version is 3.9.5. The default Google VM Python version is 3.6.9.
Python Installed Libraries
Package Version
------------------------ ---------
asgiref 3.4.0
attrs 21.2.0
autopep8 1.5.7
beautifulsoup4 4.9.3
certifi 2021.5.30
cffi 1.14.5
chardet 4.0.0
cryptography 3.4.7
defusedxml 0.7.1
Django 3.2.4
django-allauth 0.44.0
django-livereload-server 0.3.2
idna 2.10
jsonschema 3.2.0
oauthlib 3.1.1
pip 21.2.3
pycodestyle 2.7.0
pycparser 2.20
PyJWT 2.1.0
pyrsistent 0.18.0
python3-openid 3.2.0
pytz 2021.1
requests 2.25.1
requests-oauthlib 1.3.0
setuptools 57.4.0
six 1.16.0
soupsieve 2.2.1
sqlparse 0.4.1
toml 0.10.2
tornado 6.1
urllib3 1.26.6
I installed Apache and mod_wsgi as well: sudo apt-get install python3-pip apache2 libapache2-mod-wsgi-py3
Error Log File
[Thu Sep 23 15:05:06.554545 2021] [mpm_event:notice] [pid 32077:tid 140392561593280] AH00489: Apache/2.4.29 (Ubuntu) mod_wsgi/4.5.17 Python/3.6 configured -- resuming normal operations
[Thu Sep 23 15:05:06.554594 2021] [core:notice] [pid 32077:tid 140392561593280] AH00094: Command line: '/usr/sbin/apache2'
[Thu Sep 23 15:05:19.081581 2021] [wsgi:error] [pid 32617:tid 140392409851648] [remote 103.206.177.13:49604] mod_wsgi (pid=32617): Target WSGI script '/home/sidharth/Downloads/gmcweb/gmcweb/wsgi.py' c$
[Thu Sep 23 15:05:19.081638 2021] [wsgi:error] [pid 32617:tid 140392409851648] [remote 103.206.177.13:49604] mod_wsgi (pid=32617): Exception occurred processing WSGI script '/home/sidharth/Downloads/g$
[Thu Sep 23 15:05:19.081828 2021] [wsgi:error] [pid 32617:tid 140392409851648] [remote 103.206.177.13:49604] Traceback (most recent call last):
[Thu Sep 23 15:05:19.081849 2021] [wsgi:error] [pid 32617:tid 140392409851648] [remote 103.206.177.13:49604] File "/home/sidharth/Downloads/gmcweb/gmcweb/wsgi.py", line 12, in
[Thu Sep 23 15:05:19.081853 2021] [wsgi:error] [pid 32617:tid 140392409851648] [remote 103.206.177.13:49604] from django.core.wsgi import get_wsgi_application
[Thu Sep 23 15:05:19.081867 2021] [wsgi:error] [pid 32617:tid 140392409851648] [remote 103.206.177.13:49604] ModuleNotFoundError: No module named 'django'
[Thu Sep 23 15:05:32.244779 2021] [wsgi:error] [pid 32617:tid 140392325842688] [remote 103.206.177.13:52916] mod_wsgi (pid=32617): Target WSGI script '/home/sidharth/Downloads/gmcweb/gmcweb/wsgi.py' c$
[Thu Sep 23 15:05:32.244845 2021] [wsgi:error] [pid 32617:tid 140392325842688] [remote 103.206.177.13:52916] mod_wsgi (pid=32617): Exception occurred processing WSGI script '/home/sidharth/Downloads/g$
[Thu Sep 23 15:05:32.244924 2021] [wsgi:error] [pid 32617:tid 140392325842688] [remote 103.206.177.13:52916] Traceback (most recent call last):
[Thu Sep 23 15:05:32.244946 2021] [wsgi:error] [pid 32617:tid 140392325842688] [remote 103.206.177.13:52916] File "/home/sidharth/Downloads/gmcweb/gmcweb/wsgi.py", line 12, in
[Thu Sep 23 15:05:32.244951 2021] [wsgi:error] [pid 32617:tid 140392325842688] [remote 103.206.177.13:52916] from django.core.wsgi import get_wsgi_application
[Thu Sep 23 15:05:32.244966 2021] [wsgi:error] [pid 32617:tid 140392325842688] [remote 103.206.177.13:52916] ModuleNotFoundError: No module named 'django'
ANSWER
Answered 2021-Sep-23 at 15:28
The error says that either you haven't got Django installed, or you didn't activate the virtual environment in which Django was installed. Make sure that you check the list of installed packages and confirm Django is in there, via:
$ pip list
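A hedged way to verify the same thing against the virtualenv's own interpreter (the virtualenv path comes from the question's WSGIDaemonProcess line and may differ on your machine; the script name is just an example):

import sys

# Run this with the virtualenv's interpreter, e.g.
#   /home/sidharth/Downloads/gmcwebenvlin/bin/python check_django.py
try:
    import django
    print("Django", django.get_version(), "is importable from", sys.executable)
except ImportError:
    print("Django is NOT importable from", sys.executable,
          "- install it inside this virtualenv.")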
QUESTION
I have a pretrained model for object detection (Google Colab + TensorFlow), and I run it two or three times per week for new images I have. Everything was fine for the last year, till this week. Now when I try to run the model I get this message:
Graph execution error:
2 root error(s) found.
(0) UNIMPLEMENTED: DNN library is not found.
[[{{node functional_1/conv1_conv/Conv2D}}]]
[[StatefulPartitionedCall/SecondStagePostprocessor/BatchMultiClassNonMaxSuppression/MultiClassNonMaxSuppression/Reshape_5/_126]]
(1) UNIMPLEMENTED: DNN library is not found.
[[{{node functional_1/conv1_conv/Conv2D}}]]
0 successful operations.
0 derived errors ignored. [Op:__inference_restored_function_body_27380] ***
Never happened before.
Before I can run my model I have to install the TensorFlow Object Detection API with these commands:
import os
os.chdir('/project/models/research')
!protoc object_detection/protos/*.proto --python_out=.
!cp object_detection/packages/tf2/setup.py .
!python -m pip install .
This is the output of command:
Processing /content/gdrive/MyDrive/models/research
DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default.
pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555.
Collecting avro-python3
Downloading avro-python3-1.10.2.tar.gz (38 kB)
Collecting apache-beam
Downloading apache_beam-2.35.0-cp37-cp37m-manylinux2010_x86_64.whl (9.9 MB)
|████████████████████████████████| 9.9 MB 1.6 MB/s
Requirement already satisfied: pillow in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (7.1.2)
Requirement already satisfied: lxml in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (4.2.6)
Requirement already satisfied: matplotlib in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (3.2.2)
Requirement already satisfied: Cython in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (0.29.27)
Requirement already satisfied: contextlib2 in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (0.5.5)
Collecting tf-slim
Downloading tf_slim-1.1.0-py2.py3-none-any.whl (352 kB)
|████████████████████████████████| 352 kB 50.5 MB/s
Requirement already satisfied: six in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (1.15.0)
Requirement already satisfied: pycocotools in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (2.0.4)
Collecting lvis
Downloading lvis-0.5.3-py3-none-any.whl (14 kB)
Requirement already satisfied: scipy in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (1.4.1)
Requirement already satisfied: pandas in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (1.3.5)
Collecting tf-models-official>=2.5.1
Downloading tf_models_official-2.8.0-py2.py3-none-any.whl (2.2 MB)
|████████████████████████████████| 2.2 MB 38.3 MB/s
Collecting tensorflow_io
Downloading tensorflow_io-0.24.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (23.4 MB)
|████████████████████████████████| 23.4 MB 1.7 MB/s
Requirement already satisfied: keras in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (2.7.0)
Collecting opencv-python-headless
Downloading opencv_python_headless-4.5.5.62-cp36-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (47.7 MB)
|████████████████████████████████| 47.7 MB 74 kB/s
Collecting sacrebleu
Downloading sacrebleu-2.0.0-py3-none-any.whl (90 kB)
|████████████████████████████████| 90 kB 10.4 MB/s
Requirement already satisfied: kaggle>=1.3.9 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (1.5.12)
Requirement already satisfied: psutil>=5.4.3 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (5.4.8)
Requirement already satisfied: oauth2client in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (4.1.3)
Collecting tensorflow-addons
Downloading tensorflow_addons-0.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB)
|████████████████████████████████| 1.1 MB 37.8 MB/s
Requirement already satisfied: gin-config in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (0.5.0)
Requirement already satisfied: tensorflow-datasets in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (4.0.1)
Collecting sentencepiece
Downloading sentencepiece-0.1.96-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.2 MB)
|████████████████████████████████| 1.2 MB 37.5 MB/s
Collecting tensorflow-model-optimization>=0.4.1
Downloading tensorflow_model_optimization-0.7.0-py2.py3-none-any.whl (213 kB)
|████████████████████████████████| 213 kB 42.7 MB/s
Collecting pyyaml<6.0,>=5.1
Downloading PyYAML-5.4.1-cp37-cp37m-manylinux1_x86_64.whl (636 kB)
|████████████████████████████████| 636 kB 53.3 MB/s
Collecting tensorflow-text~=2.8.0
Downloading tensorflow_text-2.8.1-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (4.9 MB)
|████████████████████████████████| 4.9 MB 46.1 MB/s
Requirement already satisfied: google-api-python-client>=1.6.7 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (1.12.10)
Requirement already satisfied: numpy>=1.15.4 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (1.19.5)
Requirement already satisfied: tensorflow-hub>=0.6.0 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (0.12.0)
Collecting seqeval
Downloading seqeval-1.2.2.tar.gz (43 kB)
|████████████████████████████████| 43 kB 2.1 MB/s
Collecting tensorflow~=2.8.0
Downloading tensorflow-2.8.0-cp37-cp37m-manylinux2010_x86_64.whl (497.5 MB)
|████████████████████████████████| 497.5 MB 28 kB/s
Collecting py-cpuinfo>=3.3.0
Downloading py-cpuinfo-8.0.0.tar.gz (99 kB)
|████████████████████████████████| 99 kB 10.1 MB/s
Requirement already satisfied: google-auth<3dev,>=1.16.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (1.35.0)
Requirement already satisfied: uritemplate<4dev,>=3.0.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.0.1)
Requirement already satisfied: httplib2<1dev,>=0.15.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.17.4)
Requirement already satisfied: google-auth-httplib2>=0.0.3 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.0.4)
Requirement already satisfied: google-api-core<3dev,>=1.21.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (1.26.3)
Requirement already satisfied: setuptools>=40.3.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (57.4.0)
Requirement already satisfied: pytz in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2018.9)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.6.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (1.54.0)
Requirement already satisfied: requests<3.0.0dev,>=2.18.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2.23.0)
Requirement already satisfied: packaging>=14.3 in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (21.3)
Requirement already satisfied: protobuf>=3.12.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.17.3)
Requirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.7/dist-packages (from google-auth<3dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.2.8)
Requirement already satisfied: rsa<5,>=3.1.4 in /usr/local/lib/python3.7/dist-packages (from google-auth<3dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (4.8)
Requirement already satisfied: cachetools<5.0,>=2.0.0 in /usr/local/lib/python3.7/dist-packages (from google-auth<3dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (4.2.4)
Requirement already satisfied: certifi in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (2021.10.8)
Requirement already satisfied: urllib3 in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (1.24.3)
Requirement already satisfied: python-dateutil in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (2.8.2)
Requirement already satisfied: tqdm in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (4.62.3)
Requirement already satisfied: python-slugify in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (5.0.2)
Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in /usr/local/lib/python3.7/dist-packages (from packaging>=14.3->google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.0.7)
Requirement already satisfied: pyasn1<0.5.0,>=0.4.6 in /usr/local/lib/python3.7/dist-packages (from pyasn1-modules>=0.2.1->google-auth<3dev,>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.4.8)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0dev,>=2.18.0->google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2.10)
Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0dev,>=2.18.0->google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.0.4)
Requirement already satisfied: termcolor>=1.1.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.1.0)
Requirement already satisfied: libclang>=9.0.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (13.0.0)
Requirement already satisfied: h5py>=2.9.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.1.0)
Requirement already satisfied: astunparse>=1.6.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.6.3)
Requirement already satisfied: gast>=0.2.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.4.0)
Requirement already satisfied: google-pasta>=0.1.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.2.0)
Requirement already satisfied: typing-extensions>=3.6.6 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.10.0.2)
Requirement already satisfied: wrapt>=1.11.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.13.3)
Requirement already satisfied: tensorflow-io-gcs-filesystem>=0.23.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.23.1)
Collecting tf-estimator-nightly==2.8.0.dev2021122109
Downloading tf_estimator_nightly-2.8.0.dev2021122109-py2.py3-none-any.whl (462 kB)
|████████████████████████████████| 462 kB 49.5 MB/s
Requirement already satisfied: keras-preprocessing>=1.1.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.1.2)
Collecting tensorboard<2.9,>=2.8
Downloading tensorboard-2.8.0-py3-none-any.whl (5.8 MB)
|████████████████████████████████| 5.8 MB 41.2 MB/s
Requirement already satisfied: flatbuffers>=1.12 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (2.0)
Collecting keras
Downloading keras-2.8.0-py2.py3-none-any.whl (1.4 MB)
|████████████████████████████████| 1.4 MB 41.2 MB/s
Requirement already satisfied: opt-einsum>=2.3.2 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.3.0)
Collecting numpy>=1.15.4
Downloading numpy-1.21.5-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.7 MB)
|████████████████████████████████| 15.7 MB 41.4 MB/s
Requirement already satisfied: absl-py>=0.4.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.0.0)
Requirement already satisfied: grpcio<2.0,>=1.24.3 in /usr/local/lib/python3.7/dist-packages (from tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.43.0)
Requirement already satisfied: wheel<1.0,>=0.23.0 in /usr/local/lib/python3.7/dist-packages (from astunparse>=1.6.0->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.37.1)
Requirement already satisfied: cached-property in /usr/local/lib/python3.7/dist-packages (from h5py>=2.9.0->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.5.2)
Requirement already satisfied: tensorboard-data-server<0.7.0,>=0.6.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.6.1)
Requirement already satisfied: werkzeug>=0.11.15 in /usr/local/lib/python3.7/dist-packages (from tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.0.1)
Requirement already satisfied: google-auth-oauthlib<0.5,>=0.4.1 in /usr/local/lib/python3.7/dist-packages (from tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (0.4.6)
Requirement already satisfied: tensorboard-plugin-wit>=1.6.0 in /usr/local/lib/python3.7/dist-packages (from tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.8.1)
Requirement already satisfied: markdown>=2.6.8 in /usr/local/lib/python3.7/dist-packages (from tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.3.6)
Requirement already satisfied: requests-oauthlib>=0.7.0 in /usr/local/lib/python3.7/dist-packages (from google-auth-oauthlib<0.5,>=0.4.1->tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (1.3.1)
Requirement already satisfied: importlib-metadata>=4.4 in /usr/local/lib/python3.7/dist-packages (from markdown>=2.6.8->tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (4.10.1)
Requirement already satisfied: zipp>=0.5 in /usr/local/lib/python3.7/dist-packages (from importlib-metadata>=4.4->markdown>=2.6.8->tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.7.0)
Requirement already satisfied: oauthlib>=3.0.0 in /usr/local/lib/python3.7/dist-packages (from requests-oauthlib>=0.7.0->google-auth-oauthlib<0.5,>=0.4.1->tensorboard<2.9,>=2.8->tensorflow~=2.8.0->tf-models-official>=2.5.1->object-detection==0.1) (3.2.0)
Requirement already satisfied: dm-tree~=0.1.1 in /usr/local/lib/python3.7/dist-packages (from tensorflow-model-optimization>=0.4.1->tf-models-official>=2.5.1->object-detection==0.1) (0.1.6)
Requirement already satisfied: crcmod<2.0,>=1.7 in /usr/local/lib/python3.7/dist-packages (from apache-beam->object-detection==0.1) (1.7)
Collecting fastavro<2,>=0.21.4
Downloading fastavro-1.4.9-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.3 MB)
|████████████████████████████████| 2.3 MB 38.1 MB/s
Requirement already satisfied: pyarrow<7.0.0,>=0.15.1 in /usr/local/lib/python3.7/dist-packages (from apache-beam->object-detection==0.1) (6.0.1)
Requirement already satisfied: pydot<2,>=1.2.0 in /usr/local/lib/python3.7/dist-packages (from apache-beam->object-detection==0.1) (1.3.0)
Collecting proto-plus<2,>=1.7.1
Downloading proto_plus-1.19.9-py3-none-any.whl (45 kB)
|████████████████████████████████| 45 kB 3.2 MB/s
Collecting requests<3.0.0dev,>=2.18.0
Downloading requests-2.27.1-py2.py3-none-any.whl (63 kB)
|████████████████████████████████| 63 kB 1.8 MB/s
Collecting dill<0.3.2,>=0.3.1.1
Downloading dill-0.3.1.1.tar.gz (151 kB)
|████████████████████████████████| 151 kB 44.4 MB/s
Collecting numpy>=1.15.4
Downloading numpy-1.20.3-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (15.3 MB)
|████████████████████████████████| 15.3 MB 21.1 MB/s
Collecting orjson<4.0
Downloading orjson-3.6.6-cp37-cp37m-manylinux_2_24_x86_64.whl (245 kB)
|████████████████████████████████| 245 kB 53.2 MB/s
Collecting hdfs<3.0.0,>=2.1.0
Downloading hdfs-2.6.0-py3-none-any.whl (33 kB)
Collecting pymongo<4.0.0,>=3.8.0
Downloading pymongo-3.12.3-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (508 kB)
|████████████████████████████████| 508 kB 44.3 MB/s
Requirement already satisfied: docopt in /usr/local/lib/python3.7/dist-packages (from hdfs<3.0.0,>=2.1.0->apache-beam->object-detection==0.1) (0.6.2)
Collecting protobuf>=3.12.0
Downloading protobuf-3.19.4-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.1 MB)
|████████████████████████████████| 1.1 MB 47.3 MB/s
Requirement already satisfied: charset-normalizer~=2.0.0 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0dev,>=2.18.0->google-api-core<3dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2.0.11)
Requirement already satisfied: opencv-python>=4.1.0.25 in /usr/local/lib/python3.7/dist-packages (from lvis->object-detection==0.1) (4.1.2.30)
Requirement already satisfied: cycler>=0.10.0 in /usr/local/lib/python3.7/dist-packages (from lvis->object-detection==0.1) (0.11.0)
Requirement already satisfied: kiwisolver>=1.1.0 in /usr/local/lib/python3.7/dist-packages (from lvis->object-detection==0.1) (1.3.2)
Requirement already satisfied: text-unidecode>=1.3 in /usr/local/lib/python3.7/dist-packages (from python-slugify->kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (1.3)
Requirement already satisfied: regex in /usr/local/lib/python3.7/dist-packages (from sacrebleu->tf-models-official>=2.5.1->object-detection==0.1) (2019.12.20)
Requirement already satisfied: tabulate>=0.8.9 in /usr/local/lib/python3.7/dist-packages (from sacrebleu->tf-models-official>=2.5.1->object-detection==0.1) (0.8.9)
Collecting portalocker
Downloading portalocker-2.3.2-py2.py3-none-any.whl (15 kB)
Collecting colorama
Downloading colorama-0.4.4-py2.py3-none-any.whl (16 kB)
Requirement already satisfied: scikit-learn>=0.21.3 in /usr/local/lib/python3.7/dist-packages (from seqeval->tf-models-official>=2.5.1->object-detection==0.1) (1.0.2)
Requirement already satisfied: joblib>=0.11 in /usr/local/lib/python3.7/dist-packages (from scikit-learn>=0.21.3->seqeval->tf-models-official>=2.5.1->object-detection==0.1) (1.1.0)
Requirement already satisfied: threadpoolctl>=2.0.0 in /usr/local/lib/python3.7/dist-packages (from scikit-learn>=0.21.3->seqeval->tf-models-official>=2.5.1->object-detection==0.1) (3.1.0)
Requirement already satisfied: typeguard>=2.7 in /usr/local/lib/python3.7/dist-packages (from tensorflow-addons->tf-models-official>=2.5.1->object-detection==0.1) (2.7.1)
Requirement already satisfied: promise in /usr/local/lib/python3.7/dist-packages (from tensorflow-datasets->tf-models-official>=2.5.1->object-detection==0.1) (2.3)
Requirement already satisfied: future in /usr/local/lib/python3.7/dist-packages (from tensorflow-datasets->tf-models-official>=2.5.1->object-detection==0.1) (0.16.0)
Requirement already satisfied: attrs>=18.1.0 in /usr/local/lib/python3.7/dist-packages (from tensorflow-datasets->tf-models-official>=2.5.1->object-detection==0.1) (21.4.0)
Requirement already satisfied: importlib-resources in /usr/local/lib/python3.7/dist-packages (from tensorflow-datasets->tf-models-official>=2.5.1->object-detection==0.1) (5.4.0)
Requirement already satisfied: tensorflow-metadata in /usr/local/lib/python3.7/dist-packages (from tensorflow-datasets->tf-models-official>=2.5.1->object-detection==0.1) (1.6.0)
Collecting tensorflow-io-gcs-filesystem>=0.23.1
Downloading tensorflow_io_gcs_filesystem-0.24.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (2.1 MB)
|████████████████████████████████| 2.1 MB 40.9 MB/s
Building wheels for collected packages: object-detection, py-cpuinfo, dill, avro-python3, seqeval
Building wheel for object-detection (setup.py) ... done
Created wheel for object-detection: filename=object_detection-0.1-py3-none-any.whl size=1686316 sha256=775b8c34c800b3b3139d1067abd686af9ce9158011fccfb5450ccfd9bf424a5a
Stored in directory: /tmp/pip-ephem-wheel-cache-rmw0fvil/wheels/d0/e3/e9/b9ffe85019ec441e90d8ff9eddee9950c4c23b7598204390b9
Building wheel for py-cpuinfo (setup.py) ... done
Created wheel for py-cpuinfo: filename=py_cpuinfo-8.0.0-py3-none-any.whl size=22257 sha256=ac956c4c039868fdba78645bea056754e667e8840bea783ad2ca75e4d3e682c6
Stored in directory: /root/.cache/pip/wheels/d2/f1/1f/041add21dc9c4220157f1bd2bd6afe1f1a49524c3396b94401
Building wheel for dill (setup.py) ... done
Created wheel for dill: filename=dill-0.3.1.1-py3-none-any.whl size=78544 sha256=d9c6cdfd69aea2b4d78e6afbbe2bc530394e4081eb186eb4f4cd02373ca739fd
Stored in directory: /root/.cache/pip/wheels/a4/61/fd/c57e374e580aa78a45ed78d5859b3a44436af17e22ca53284f
Building wheel for avro-python3 (setup.py) ... done
Created wheel for avro-python3: filename=avro_python3-1.10.2-py3-none-any.whl size=44010 sha256=4eca8b4f30e4850d5dabccee36c40c8dda8a6c7e7058cfb7f0258eea5ce7b2b3
Stored in directory: /root/.cache/pip/wheels/d6/e5/b1/6b151d9b535ee50aaa6ab27d145a0104b6df02e5636f0376da
Building wheel for seqeval (setup.py) ... done
Created wheel for seqeval: filename=seqeval-1.2.2-py3-none-any.whl size=16180 sha256=0ddfa46d0e36e9be346a90833ef11cc0d38cc7e744be34c5a0d321f997a30cba
Stored in directory: /root/.cache/pip/wheels/05/96/ee/7cac4e74f3b19e3158dce26a20a1c86b3533c43ec72a549fd7
Successfully built object-detection py-cpuinfo dill avro-python3 seqeval
Installing collected packages: requests, protobuf, numpy, tf-estimator-nightly, tensorflow-io-gcs-filesystem, tensorboard, keras, tensorflow, portalocker, dill, colorama, tf-slim, tensorflow-text, tensorflow-model-optimization, tensorflow-addons, seqeval, sentencepiece, sacrebleu, pyyaml, pymongo, py-cpuinfo, proto-plus, orjson, opencv-python-headless, hdfs, fastavro, tf-models-official, tensorflow-io, lvis, avro-python3, apache-beam, object-detection
Attempting uninstall: requests
Found existing installation: requests 2.23.0
Uninstalling requests-2.23.0:
Successfully uninstalled requests-2.23.0
Attempting uninstall: protobuf
Found existing installation: protobuf 3.17.3
Uninstalling protobuf-3.17.3:
Successfully uninstalled protobuf-3.17.3
Attempting uninstall: numpy
Found existing installation: numpy 1.19.5
Uninstalling numpy-1.19.5:
Successfully uninstalled numpy-1.19.5
Attempting uninstall: tensorflow-io-gcs-filesystem
Found existing installation: tensorflow-io-gcs-filesystem 0.23.1
Uninstalling tensorflow-io-gcs-filesystem-0.23.1:
Successfully uninstalled tensorflow-io-gcs-filesystem-0.23.1
Attempting uninstall: tensorboard
Found existing installation: tensorboard 2.7.0
Uninstalling tensorboard-2.7.0:
Successfully uninstalled tensorboard-2.7.0
Attempting uninstall: keras
Found existing installation: keras 2.7.0
Uninstalling keras-2.7.0:
Successfully uninstalled keras-2.7.0
Attempting uninstall: tensorflow
Found existing installation: tensorflow 2.7.0
Uninstalling tensorflow-2.7.0:
Successfully uninstalled tensorflow-2.7.0
Attempting uninstall: dill
Found existing installation: dill 0.3.4
Uninstalling dill-0.3.4:
Successfully uninstalled dill-0.3.4
Attempting uninstall: pyyaml
Found existing installation: PyYAML 3.13
Uninstalling PyYAML-3.13:
Successfully uninstalled PyYAML-3.13
Attempting uninstall: pymongo
Found existing installation: pymongo 4.0.1
Uninstalling pymongo-4.0.1:
Successfully uninstalled pymongo-4.0.1
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
yellowbrick 1.3.post1 requires numpy<1.20,>=1.16.0, but you have numpy 1.20.3 which is incompatible.
multiprocess 0.70.12.2 requires dill>=0.3.4, but you have dill 0.3.1.1 which is incompatible.
google-colab 1.0.0 requires requests~=2.23.0, but you have requests 2.27.1 which is incompatible.
datascience 0.10.6 requires folium==0.2.1, but you have folium 0.8.3 which is incompatible.
albumentations 0.1.12 requires imgaug<0.2.7,>=0.2.5, but you have imgaug 0.2.9 which is incompatible.
Successfully installed apache-beam-2.35.0 avro-python3-1.10.2 colorama-0.4.4 dill-0.3.1.1 fastavro-1.4.9 hdfs-2.6.0 keras-2.8.0 lvis-0.5.3 numpy-1.20.3 object-detection-0.1 opencv-python-headless-4.5.5.62 orjson-3.6.6 portalocker-2.3.2 proto-plus-1.19.9 protobuf-3.19.4 py-cpuinfo-8.0.0 pymongo-3.12.3 pyyaml-5.4.1 requests-2.27.1 sacrebleu-2.0.0 sentencepiece-0.1.96 seqeval-1.2.2 tensorboard-2.8.0 tensorflow-2.8.0 tensorflow-addons-0.15.0 tensorflow-io-0.24.0 tensorflow-io-gcs-filesystem-0.24.0 tensorflow-model-optimization-0.7.0 tensorflow-text-2.8.1 tf-estimator-nightly-2.8.0.dev2021122109 tf-models-official-2.8.0 tf-slim-1.1.0
I noticed that this command uninstalls TensorFlow 2.7 and installs TensorFlow 2.8. I am not sure this was happening before. Maybe that is the reason the DNN library is missing or something?
I can confirm these:
- Nothing was changed inside the pretrained model, the already installed model, or the object_detection source files I downloaded a year ago.
- I tried to run the command !pip install dnn - not working
- I tried to restart the runtime (without disconnecting) - not working
Can somebody help? Thanks.
ANSWER
Answered 2022-Feb-07 at 09:19
The same thing happened to me last Friday. I think it has something to do with the CUDA installation in Google Colab, but I don't know the exact reason.
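If you want to narrow this down, a small diagnostic sketch like the one below checks whether the Colab runtime still sees a GPU and which CUDA/cuDNN versions the installed TensorFlow wheel was built against (these are standard TensorFlow APIs; the build-info keys can be absent on CPU-only builds, so treat the output as a hint rather than a definitive answer):
import tensorflow as tf
print(tf.__version__)                              # e.g. 2.8.0 after the reinstall shown above
print(tf.config.list_physical_devices("GPU"))      # an empty list means no GPU is visible
build = tf.sysconfig.get_build_info()
print(build.get("cuda_version"), build.get("cudnn_version"))  # CUDA/cuDNN the wheel was built against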
QUESTION
I'm trying to install the pygame package on my computer, which is not connected to the internet.
(env: Windows 10, Python 3.9 (Anaconda))
So I downloaded a "pygame-2.1.2.tar" file from www.pypi.org and then tried to install it from cmd with the "python setup.py install" command.
Then it shows an error message like the one below.
===============================================================================
(base) C:\pythonpackage\pygame-2.1.2>python setup.py install
WARNING, No "Setup" File Exists, Running "buildconfig/config.py"
Using WINDOWS configuration...
Making dir :prebuilt_downloads:
Downloading... https://www.libsdl.org/release/SDL2-devel-2.0.18-VC.zip
d561079ec622b0bab5a9e02976f5d540b0622da
---
For help with compilation see:
https://www.pygame.org/wiki/CompileWindows
To contribute to pygame development see:
https://www.pygame.org/contribute.html
---
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connection.py", line 174, in
_new_conn
conn = connection.create_connection(
File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\util\connection.py", line 73,
in create_connection
for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
File "C:\ProgramData\Anaconda3\lib\socket.py", line 954, in getaddrinfo
for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno 11002] getaddrinfo failed
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 699,
in urlopen
httplib_response = self._make_request(
File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 382,
in _make_request
self._validate_conn(conn)
File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 1010,
in _validate_conn
conn.connect()
File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connection.py", line 358, in
connect
conn = self._new_conn()
File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connection.py", line 186, in
_new_conn
raise NewConnectionError(
urllib3.exceptions.NewConnectionError: : Failed to establish a new connection: [Errno 11002] getaddrinfo
failed
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\ProgramData\Anaconda3\lib\site-packages\requests\adapters.py", line 439, in
send
resp = conn.urlopen(
File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\connectionpool.py", line 755,
in urlopen
retries = retries.increment(
File "C:\ProgramData\Anaconda3\lib\site-packages\urllib3\util\retry.py", line 574, in
increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='www.libsdl.org', port=443):
Max retries exceeded with url: /release/SDL2-devel-2.0.18-VC.zip (Caused by
NewConnectionError(':
Failed to establish a new connection: [Errno 11002] getaddrinfo failed'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "C:\pythonpackage\pygame-2.1.2\setup.py", line 359, in
buildconfig.config.main(AUTO_CONFIG)
File "C:\pythonpackage\pygame-2.1.2\buildconfig\config.py", line 225, in main
deps = CFG.main(**kwds)
File "C:\pythonpackage\pygame-2.1.2\buildconfig\config_win.py", line 497, in main
and download_win_prebuilt.ask(**download_kwargs):
File "C:\pythonpackage\pygame-2.1.2\buildconfig\download_win_prebuilt.py", line 290, in
ask
update(x86=x86, x64=x64)
File "C:\pythonpackage\pygame-2.1.2\buildconfig\download_win_prebuilt.py", line 273, in
update
download_prebuilts(download_dir, x86=x86, x64=x64)
File "C:\pythonpackage\pygame-2.1.2\buildconfig\download_win_prebuilt.py", line 124, in
download_prebuilts
download_sha1_unzip(url, checksum, temp_dir, 1)
File "C:\pythonpackage\pygame-2.1.2\buildconfig\download_win_prebuilt.py", line 47, in
download_sha1_unzip
response = requests.get(url)
File "C:\ProgramData\Anaconda3\lib\site-packages\requests\api.py", line 75, in get
return request('get', url, params=params, **kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\requests\api.py", line 61, in request
return session.request(method=method, url=url, **kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\requests\sessions.py", line 542, in
request
resp = self.send(prep, **send_kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\requests\sessions.py", line 655, in
send
r = adapter.send(request, **kwargs)
File "C:\ProgramData\Anaconda3\lib\site-packages\requests\adapters.py", line 516, in
send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPSConnectionPool(host='www.libsdl.org', port=443):
Max retries exceeded with url: /release/SDL2-devel-2.0.18-VC.zip (Caused by
NewConnectionError(':
Failed to establish a new connection: [Errno 11002] getaddrinfo failed'))
===============================================================================
Please let me know how to install pygame in an offline environment. I need your help, please!
ANSWER
Answered 2022-Feb-08 at 04:49
You can run the command: pip install pygame
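Note that a plain pip install pygame needs internet access; on an offline machine the same command works if you point it at a file you copied over yourself. A hedged sketch, assuming you first download a prebuilt pygame wheel for Windows/Python 3.9 on an online machine and copy it across (the wheel filename and path below are assumptions, adjust them to your file):
import subprocess
import sys
# Wheel copied over from an online machine (assumed path/filename).
wheel = r"C:\pythonpackage\pygame-2.1.2-cp39-cp39-win_amd64.whl"
# --no-index stops pip from trying to reach PyPI on the offline machine.
subprocess.run([sys.executable, "-m", "pip", "install", "--no-index", wheel], check=True)
Installing a prebuilt wheel also avoids the setup.py build step that tried to download SDL in the traceback above.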
QUESTION
I created a Dockerfile for tensorflow-serving as follows:
FROM tensorflow/serving
COPY /model_dir /models/model/
and I docker-compose it this way:
tensorflow-servings:
container_name: tfserving_classifier
build: ./some_model_dir
ports:
- 8501:8501
In the tensorflow-container, the model is located in /models/model/1
Here is how I tried to serve it
# server URL
url = 'http://localhost:8501/v1/models/model/1:predict'
def make_prediction(instances):
data = json.dumps({"signature_name": "serving_default", "instances": instances.tolist()})
headers = {"content-type": "application/json"}
json_response = requests.post(url, data=data, headers=headers)
predictions = json.loads(json_response.text)['predictions']
return predictions
Here is the message from the Python client container:
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8501): Max retries exceeded with url: /v1/models/model/1:predict (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 111] Connection refused'))
I believe this is due to an incorrect URL; how can I get the correct URL for my tensorflow-serving container?
Here is the tensorflow-serving container message:
I tensorflow_serving/model_servers/server.cc:393] Running gRPC ModelServer at 0.0.0.0:8500 ...
I tensorflow_serving/model_servers/server.cc:414] Exporting HTTP/REST API at:localhost:8501 ...
ANSWER
Answered 2021-Sep-30 at 15:45
localhost only reaches inside the container; use the service name or container name of the tensorflow-serving container to reach it from the script container:
http://tensorflow-servings:8501/v1/models/model/1:predict
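Applied to the make_prediction function from the question, that gives something like the sketch below (hedged: the service name tensorflow-servings comes from the docker-compose file above, and the URL targets the latest model version; a specific version would use the /versions/1 form of the REST path):
import json
import requests
# Reach TF Serving through the compose service name, not localhost.
URL = "http://tensorflow-servings:8501/v1/models/model:predict"
def make_prediction(instances):
    data = json.dumps({"signature_name": "serving_default", "instances": instances.tolist()})
    headers = {"content-type": "application/json"}
    response = requests.post(URL, data=data, headers=headers)
    return response.json()["predictions"]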
QUESTION
I am working with a simple ML model with streamlit. It runs fine on my local machine inside a conda environment, but it shows "Error installing requirements" when I try to deploy it on share.streamlit.io.
The error message is the following:
ERROR: Could not find a version that satisfies the requirement pywin32==303 (from versions: none)
ERROR: No matching distribution found for pywin32==303
This is the requirements.txt file for my model:
altair==4.1.0
argon2-cffi==21.3.0
argon2-cffi-bindings==21.2.0
astor==0.8.1
attrs==21.2.0
backcall==0.2.0
base58==2.1.1
bleach==4.1.0
blinker==1.4
cachetools==5.0.0
certifi==2021.10.8
cffi==1.15.0
charset-normalizer==2.0.9
click==7.1.2
colorama==0.4.4
cycler==0.11.0
debugpy==1.5.1
decorator==5.1.0
defusedxml==0.7.1
entrypoints==0.3
fonttools==4.28.5
gitdb==4.0.9
GitPython==3.1.24
idna==3.3
ipykernel==6.6.0
ipython==7.30.1
ipython-genutils==0.2.0
ipywidgets==7.6.5
jedi==0.18.1
Jinja2==3.0.3
joblib==1.1.0
jsonschema==4.3.2
jupyter-client==7.1.0
jupyter-core==4.9.1
jupyterlab-pygments==0.1.2
jupyterlab-widgets==1.0.2
kiwisolver==1.3.2
MarkupSafe==2.0.1
matplotlib==3.5.1
matplotlib-inline==0.1.3
mistune==0.8.4
nbclient==0.5.9
nbconvert==6.3.0
nbformat==5.1.3
nest-asyncio==1.5.4
notebook==6.4.6
numpy==1.21.5
packaging==21.3
pandas==1.3.5
pandocfilters==1.5.0
parso==0.8.3
pickleshare==0.7.5
Pillow==8.4.0
prometheus-client==0.12.0
prompt-toolkit==3.0.24
protobuf==3.19.1
pyarrow==6.0.1
pycparser==2.21
pydeck==0.7.1
Pygments==2.10.0
Pympler==1.0.1
pyparsing==3.0.6
pyrsistent==0.18.0
python-dateutil==2.8.2
pytz==2021.3
pytz-deprecation-shim==0.1.0.post0
pywin32==303
pywinpty==1.1.6
pyzmq==22.3.0
requests==2.26.0
scikit-learn==1.0.1
scipy==1.7.3
seaborn==0.11.2
Send2Trash==1.8.0
six==1.16.0
smmap==5.0.0
streamlit==1.3.0
terminado==0.12.1
testpath==0.5.0
threadpoolctl==3.0.0
toml==0.10.2
toolz==0.11.2
tornado==6.1
traitlets==5.1.1
typing_extensions==4.0.1
tzdata==2021.5
tzlocal==4.1
urllib3==1.26.7
validators==0.18.2
watchdog==2.1.6
wcwidth==0.2.5
webencodings==0.5.1
widgetsnbextension==3.5.2
wincertstore==0.2
What should I do to resolve this error?
ANSWER
Answered 2021-Dec-25 at 14:42
Streamlit sharing runs the app in a Linux environment, which means there is no pywin32 there because that package is Windows-only.
Delete pywin32 from the requirements file, and also pywinpty==1.1.6 for the same reason.
After deleting these requirements, re-deploy your app and it will work.
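If you prefer to do this programmatically instead of editing the file by hand, a small hedged sketch like the one below strips the Windows-only pins before deployment (which names count as Windows-only is an assumption based on the list above; wincertstore is included because it is also Windows-specific):
# Assumed Windows-only entries in this requirements.txt.
WINDOWS_ONLY = {"pywin32", "pywinpty", "wincertstore"}
with open("requirements.txt") as f:
    lines = f.readlines()
with open("requirements.txt", "w") as f:
    for line in lines:
        name = line.split("==")[0].strip().lower()
        if name not in WINDOWS_ONLY:
            f.write(line)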
QUESTION
This is my first question on Stack Overflow. I hope my question is clear; otherwise let me know and don't hesitate to ask me for more details.
I'm trying to package a streamlit app for a personal project. I'm developing under Linux but I have to deploy the app on Windows. I want it to be a standalone executable which, once run, opens a browser tab to display the app and exits when the tab is closed. I would like to use the pynsist library to package the app (I already used it for another project and it worked fine).
I followed the suggestion found in this discussion. It worked fine on Ubuntu, and apparently also on Windows after packaging the app with pynsist. "Apparently" because the executable ran, but no browser tab opened to display the app.
Here is some snippets of my code.
Project structure
|- installer.cfg
|- src
|- main.py
|- run_app.py
main.py
import streamlit as st
st.title("Test")
st.title("My first app deployed with Pynsist!")
run_app.py (EDIT 2 after comment by Thomas K)
import os
import subprocess
import sys
from src.config import EnvironmentalVariableNames as EnvVar, get_env
def main():
executable = sys.executable
result = subprocess.run(
f"{executable} -m streamlit run {os.path.join(get_env(EnvVar.EMPORIO_VESTIARIO_DASHBOARD_WORKING_DIR), 'src', 'main.py')}",
shell=True,
capture_output=True,
text=True,
)
if __name__ == "__main__":
main()
EMPORIO_VESTIARIO_DASHBOARD_WORKING_DIR is an environment variable used to make the app work on both Linux and Windows (on Windows, it is set to the installation directory).
pynsist installer.cfg
EDIT: including dependencies of streamlit discovered through pip list
EDIT 2: added MarkupSafe as dependency of Jinja2
[Application]
name=Emporio Vestiario Dashboard
version=0.1.0
# How to launch the app - this calls the 'main' function from the 'myapp' package:
entry_point=src.run_app:main
icon=resources/caritas-logo.ico
[Python]
version=3.8.10
bitness=64
[Include]
# Packages from PyPI that your application requires, one per line
# These must have wheels on PyPI:
pypi_wheels = altair==4.1.0
astor==0.8.1
attrs==21.2.0
backcall==0.2.0
backports.zoneinfo==0.2.1
base58==2.1.0
bleach==4.1.0
blinker==1.4
cachetools==4.2.2
certifi==2021.5.30
cffi==1.14.6
charset-normalizer==2.0.6
click==7.1.2
decorator==5.1.0
defusedxml==0.7.1
distlib==0.3.3
entrypoints==0.3
idna==3.2
jsonschema==3.2.0
mistune==0.8.4
mypy-extensions==0.4.3
numpy==1.21.1
packaging==21.0
pandas==1.3.3
pandocfilters==1.5.0
parso==0.8.2
pillow==8.3.2
platformdirs==2.4.0
prompt-toolkit==3.0.20
protobuf==3.18.0
pyarrow==5.0.0
pycparser==2.20
pydeck==0.7.0
pyparsing==2.4.7
pyrsistent==0.18.0
python-dateutil==2.8.2
pytz==2021.1
requests==2.26.0
requests-download==0.1.2
send2trash==1.8.0
setuptools==57.0.0
six==1.14.0
smmap==4.0.0
streamlit==0.89.0
terminado==0.12.1
testpath==0.5.0
toml==0.10.2
tomli==1.2.1
toolz==0.11.1
tornado==6.1
traitlets==5.1.0
typing-extensions==3.10.0.2
tzlocal==3.0
urllib3==1.26.7
validators==0.18.2
Jinja2==3.0.1
MarkupSafe==2.0.1
Looking at the executable output on Windows, the current working directory is correctly printed, but no other output (streamlit app initialization messages or error messages) is printed. I tried to open the browser and go to localhost:8501, but I got a connection error.
Any hints on how to make the code execute and automatically open the browser tab? Any help is greatly appreciated!
EDIT: as pointed out in the comment to the last package in installer.cfg, the app (with the Jinja2 dependency) is correctly installed on Windows, but when launched, the app still cannot find the Jinja2 dependency. This is the traceback:
Traceback (most recent call last):
File "Emporio_Vestiario_Dashboard.launch.pyw", line 34, in
from src.run_app import main
File "C:\Users\tantardini\develop\caritas\pkgs\src\run_app.py", line 6, in
import streamlit
File "C:\Users\tantardini\develop\caritas\pkgs\streamlit\__init__.py", line 75, in
from streamlit.delta_generator import DeltaGenerator as _DeltaGenerator
File "C:\Users\tantardini\develop\caritas\pkgs\streamlit\delta_generator.py", line 70, in
from streamlit.elements.arrow import ArrowMixin
File "C:\Users\tantardini\develop\caritas\pkgs\streamlit\elements\arrow.py", line 20, in
from pandas.io.formats.style import Styler
File "C:\Users\tantardini\develop\caritas\pkgs\pandas\io\formats\style.py", line 49, in
jinja2 = import_optional_dependency("jinja2", extra="DataFrame.style requires jinja2.")
File "C:\Users\tantardini\develop\caritas\pkgs\pandas\compat\_optional.py", line 118, in import_optional_dependency
raise ImportError(msg) from None
ImportError: Missing optional dependency 'Jinja2'. DataFrame.style requires jinja2. Use pip or conda to install Jinja2.
EDIT 2: thanks to the helpful hints by Thomas K, I came up with half a solution. The app runs and streamlit is started.
But.
These are the log messages:
Welcome to Streamlit!
If you're one of our development partners or you're interested in getting
personal technical support or Streamlit updates, please enter your email
address below. Otherwise, you may leave the field blank.
Email:
2021-10-11 20:56:53.202 WARNING streamlit.config:
Warning: the config option 'server.enableCORS=false' is not compatible with 'server.enableXsrfProtection=true'.
As a result, 'server.enableCORS' is being overridden to 'true'.
More information:
In order to protect against CSRF attacks, we send a cookie with each request.
To do so, we must specify allowable origins, which places a restriction on
cross-origin resource sharing.
If cross origin resource sharing is required, please disable server.enableXsrfProtection.
2021-10-11 20:56:53.202 DEBUG streamlit.logger: Initialized tornado logs
2021-10-11 20:56:53.202 ERROR streamlit.credentials:
It seems that the execution of the app stops because it is waiting for some credentials. I found here that a .streamlit/credentials.toml file can be added, but I'm not sure of the exact location on Windows. I've also tried to explicitly add --server.headless=false to the subprocess.run command, but again with no effect.
Why doesn't the app start automatically like on Linux? Is there a way to start the app without additional configuration by the user?
ANSWER
Answered 2021-Nov-25 at 09:40
EDIT: a streamlit example was added to the examples of the pynsist repo. There you can find a minimal and refined example of a working application (which also includes plotly).
ORIGINAL ANSWER
Finally I got it to work. In my last attempt, I made a mistake by setting --server.headless=false, while it must be true instead. I found that an additional flag to the streamlit run command is needed: --global.developmentMode=false. This makes the deployment work, even though I could not find any reference to this option in the streamlit configuration documentation.
Working code follows.
Project structure
|- wheels/
|- installer.cfg
|- src
|- main.py
|- run_app.py
main.py
import streamlit as st
st.title("Test")
st.title("My first app deployed with Pynsist!")
run_app.py
import os
import subprocess
import sys
import webbrowser
from src.config import EnvironmentalVariableNames as EnvVar, get_env
def main():
# Getting path to python executable (full path of deployed python on Windows)
executable = sys.executable
# Open browser tab. May temporarily display error until streamlit server is started.
webbrowser.open("http://localhost:8501")
# Run streamlit server
path_to_main = os.path.join(
get_env(EnvVar.EMPORIO_VESTIARIO_DASHBOARD_WORKING_DIR), "src", "app.py"
)
result = subprocess.run(
f"{executable} -m streamlit run {path_to_main} --server.headless=true --global.developmentMode=false",
shell=True,
capture_output=True,
text=True,
)
# These are printed only when server is stopped.
# NOTE: you have to manually stop streamlit server killing process.
print(result.stdout)
print(result.stderr)
if __name__ == "__main__":
main()
Some notes:
- webbrowser.open is necessary to automatically open a new tab in the browser to show the streamlit app. The subprocess.run line only starts a new streamlit server.
- As I pointed out in the comments, after exiting the streamlit tab in the browser, the streamlit server is still there and active. You can access the dashboard again by simply typing localhost:8501 in the address bar. If you click multiple times on the Windows app icon, multiple streamlit servers are started. I've tried with only two active at the same time, and they do not show conflicting behaviour. To stop them you have to manually end the tasks, for instance through the task manager (see the sketch at the end of this answer for a variant that keeps a handle to the server process).
installer.cfg
[Application]
name=Emporio Vestiario Dashboard
version=0.1.0
# How to launch the app - this calls the 'main' function from the 'myapp' package:
entry_point=src.run_app:main
icon=resources/caritas-logo.ico
[Python]
version=3.8.10
bitness=64
[Include]
# Packages from PyPI that your application requires, one per line
# These must have wheels on PyPI:
pypi_wheels = altair==4.1.0
astor==0.8.1
attrs==21.2.0
backcall==0.2.0
backports.zoneinfo==0.2.1
base58==2.1.0
bleach==4.1.0
blinker==1.4
cachetools==4.2.2
certifi==2021.5.30
cffi==1.14.6
charset-normalizer==2.0.6
click==7.1.2
decorator==5.1.0
defusedxml==0.7.1
distlib==0.3.3
entrypoints==0.3
idna==3.2
jsonschema==3.2.0
mistune==0.8.4
mypy-extensions==0.4.3
numpy==1.21.1
packaging==21.0
pandas==1.3.3
pandocfilters==1.5.0
parso==0.8.2
pillow==8.3.2
platformdirs==2.4.0
prompt-toolkit==3.0.20
protobuf==3.18.0
pyarrow==5.0.0
pycparser==2.20
pydeck==0.7.0
pyparsing==2.4.7
pyrsistent==0.18.0
python-dateutil==2.8.2
pytz==2021.1
requests==2.26.0
requests-download==0.1.2
send2trash==1.8.0
setuptools==57.0.0
six==1.14.0
smmap==4.0.0
streamlit==0.89.0
terminado==0.12.1
testpath==0.5.0
toml==0.10.2
tomli==1.2.1
toolz==0.11.1
tornado==6.1
traitlets==5.1.0
typing-extensions==3.10.0.2
tzlocal==3.0
urllib3==1.26.7
validators==0.18.2
Jinja2==3.0.1
MarkupSafe==2.0.1
extra_wheel_sources = ./wheels
Note: an extra wheel for blinker is required.
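Not part of the original answer, but if you want to avoid killing the streamlit server from the task manager, one option is to start it with subprocess.Popen instead of subprocess.run, so the launcher keeps a handle it can terminate. A hedged sketch under the same assumptions as the run_app.py above (path and flags copied from it, simplified to a relative path):
import subprocess
import sys
import webbrowser
def main():
    # Start the streamlit server without blocking, keeping a handle to the process.
    server = subprocess.Popen(
        [sys.executable, "-m", "streamlit", "run", "src/main.py",
         "--server.headless=true", "--global.developmentMode=false"]
    )
    webbrowser.open("http://localhost:8501")
    try:
        server.wait()        # block until the server exits on its own...
    except KeyboardInterrupt:
        server.terminate()   # ...or stop it when the launcher is interrupted
if __name__ == "__main__":
    main()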
QUESTION
I am trying to install the TensorFlow Object Detection API on Google Colab, and the part that installs the API, shown below, takes a very long time to execute (in excess of one hour) and eventually fails to install.
# Install the Object Detection API
%%bash
cd models/research/
protoc object_detection/protos/*.proto --python_out=.
cp object_detection/packages/tf2/setup.py .
python -m pip install
To discover what I was doing wrong, I reverted to the "Eager Few Shot Object Detection Colab" example available at https://github.com/tensorflow/models/blob/master/research/object_detection/colab_tutorials/eager_few_shot_od_training_tf2_colab.ipynb in a Google Colab Pro notebook, and the "python -m pip install" part hangs as well. Normally, this Colab runs in under 10 minutes, but in Google Colab Pro it is not running at all.
I can't seem to pinpoint what is causing this installation to fail. Does anyone have any idea why the Object Detection API is no longer installing on Google Colab notebooks?
Update: yesterday the installation took over two hours and failed, and this is the output:
Processing /content/models/research
Collecting avro-python3
Using cached avro-python3-1.10.2.tar.gz (38 kB)
Collecting apache-beam
Using cached apache_beam-2.34.0-cp37-cp37m-manylinux2010_x86_64.whl (9.8 MB)
Requirement already satisfied: pillow in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (7.1.2)
Requirement already satisfied: lxml in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (4.2.6)
Requirement already satisfied: matplotlib in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (3.2.2)
Requirement already satisfied: Cython in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (0.29.24)
Requirement already satisfied: contextlib2 in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (0.5.5)
Collecting tf-slim
Using cached tf_slim-1.1.0-py2.py3-none-any.whl (352 kB)
Requirement already satisfied: six in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (1.15.0)
Requirement already satisfied: pycocotools in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (2.0.2)
Collecting lvis
Using cached lvis-0.5.3-py3-none-any.whl (14 kB)
Requirement already satisfied: scipy in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (1.4.1)
Requirement already satisfied: pandas in /usr/local/lib/python3.7/dist-packages (from object-detection==0.1) (1.1.5)
Collecting tf-models-official>=2.5.1
Using cached tf_models_official-2.7.0-py2.py3-none-any.whl (1.8 MB)
Collecting tensorflow_io
Using cached tensorflow_io-0.22.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (22.7 MB)
Collecting keras==2.6.0
Using cached keras-2.6.0-py2.py3-none-any.whl (1.3 MB)
Collecting tensorflow-addons
Using cached tensorflow_addons-0.15.0-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (1.1 MB)
Requirement already satisfied: kaggle>=1.3.9 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (1.5.12)
Requirement already satisfied: gin-config in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (0.5.0)
Collecting sacrebleu
Using cached sacrebleu-2.0.0-py3-none-any.whl (90 kB)
Requirement already satisfied: psutil>=5.4.3 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (5.4.8)
Collecting py-cpuinfo>=3.3.0
Using cached py-cpuinfo-8.0.0.tar.gz (99 kB)
Collecting tensorflow-text>=2.7.0
Using cached tensorflow_text-2.7.0-cp37-cp37m-manylinux2010_x86_64.whl (4.9 MB)
Requirement already satisfied: oauth2client in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (4.1.3)
Collecting seqeval
Using cached seqeval-1.2.2.tar.gz (43 kB)
Requirement already satisfied: numpy>=1.15.4 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (1.19.5)
Collecting sentencepiece
Using cached sentencepiece-0.1.96-cp37-cp37m-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (1.2 MB)
Requirement already satisfied: tensorflow-datasets in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (4.0.1)
Collecting tensorflow-model-optimization>=0.4.1
Using cached tensorflow_model_optimization-0.7.0-py2.py3-none-any.whl (213 kB)
Requirement already satisfied: tensorflow-hub>=0.6.0 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (0.12.0)
Collecting opencv-python-headless
Using cached opencv_python_headless-4.5.4.58-cp37-cp37m-manylinux2014_x86_64.whl (47.6 MB)
Collecting tensorflow>=2.7.0
Using cached tensorflow-2.7.0-cp37-cp37m-manylinux2010_x86_64.whl (489.6 MB)
Requirement already satisfied: google-api-python-client>=1.6.7 in /usr/local/lib/python3.7/dist-packages (from tf-models-official>=2.5.1->object-detection==0.1) (1.12.8)
Collecting pyyaml>=5.1
Using cached PyYAML-6.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (596 kB)
Requirement already satisfied: google-auth>=1.16.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (1.35.0)
Requirement already satisfied: uritemplate<4dev,>=3.0.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.0.1)
Requirement already satisfied: google-api-core<2dev,>=1.21.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (1.26.3)
Requirement already satisfied: httplib2<1dev,>=0.15.0 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.17.4)
Requirement already satisfied: google-auth-httplib2>=0.0.3 in /usr/local/lib/python3.7/dist-packages (from google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.0.4)
Requirement already satisfied: packaging>=14.3 in /usr/local/lib/python3.7/dist-packages (from google-api-core<2dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (21.2)
Requirement already satisfied: requests<3.0.0dev,>=2.18.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<2dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2.23.0)
Requirement already satisfied: googleapis-common-protos<2.0dev,>=1.6.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<2dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (1.53.0)
Requirement already satisfied: protobuf>=3.12.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<2dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.17.3)
Requirement already satisfied: pytz in /usr/local/lib/python3.7/dist-packages (from google-api-core<2dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2018.9)
Requirement already satisfied: setuptools>=40.3.0 in /usr/local/lib/python3.7/dist-packages (from google-api-core<2dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (57.4.0)
Requirement already satisfied: rsa<5,>=3.1.4 in /usr/local/lib/python3.7/dist-packages (from google-auth>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (4.7.2)
Requirement already satisfied: pyasn1-modules>=0.2.1 in /usr/local/lib/python3.7/dist-packages (from google-auth>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.2.8)
Requirement already satisfied: cachetools<5.0,>=2.0.0 in /usr/local/lib/python3.7/dist-packages (from google-auth>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (4.2.4)
Requirement already satisfied: urllib3 in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (1.24.3)
Requirement already satisfied: python-slugify in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (5.0.2)
Requirement already satisfied: tqdm in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (4.62.3)
Requirement already satisfied: certifi in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (2021.10.8)
Requirement already satisfied: python-dateutil in /usr/local/lib/python3.7/dist-packages (from kaggle>=1.3.9->tf-models-official>=2.5.1->object-detection==0.1) (2.8.2)
Requirement already satisfied: pyparsing<3,>=2.0.2 in /usr/local/lib/python3.7/dist-packages (from packaging>=14.3->google-api-core<2dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2.4.7)
Requirement already satisfied: pyasn1<0.5.0,>=0.4.6 in /usr/local/lib/python3.7/dist-packages (from pyasn1-modules>=0.2.1->google-auth>=1.16.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (0.4.8)
Requirement already satisfied: idna<3,>=2.5 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0dev,>=2.18.0->google-api-core<2dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (2.10)
Requirement already satisfied: chardet<4,>=3.0.2 in /usr/local/lib/python3.7/dist-packages (from requests<3.0.0dev,>=2.18.0->google-api-core<2dev,>=1.21.0->google-api-python-client>=1.6.7->tf-models-official>=2.5.1->object-detection==0.1) (3.0.4)
INFO: pip is looking at multiple versions of six to determine which version is compatible with other requirements. This could take a while.
Collecting six
Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Using cached six-1.15.0-py2.py3-none-any.whl (10 kB)
Using cached six-1.14.0-py2.py3-none-any.whl (10 kB)
Using cached six-1.13.0-py2.py3-none-any.whl (10 kB)
INFO: pip is looking at multiple versions of scipy to determine which version is compatible with other requirements. This could take a while.
Collecting scipy
Using cached scipy-1.7.2-cp37-cp37m-manylinux_2_12_x86_64.manylinux2010_x86_64.whl (38.2 MB)
INFO: pip is looking at multiple versions of six to determine which version is compatible with other requirements. This could take a while.
Using cached scipy-1.7.1-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (28.5 MB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell us what happened here: https://pip.pypa.io/surveys/backtracking
Using cached scipy-1.7.0-cp37-cp37m-manylinux_2_5_x86_64.manylinux1_x86_64.whl (28.5 MB)
Using cached scipy-1.6.3-cp37-cp37m-manylinux1_x86_64.whl (27.4 MB)
Using cached scipy-1.6.2-cp37-cp37m-manylinux1_x86_64.whl (27.4 MB)
Using cached scipy-1.6.1-cp37-cp37m-manylinux1_x86_64.whl (27.4 MB)
Using cached scipy-1.6.0-cp37-cp37m-manylinux1_x86_64.whl (27.4 MB)
INFO: pip is looking at multiple versions of scipy to determine which version is compatible with other requirements. This could take a while.
Using cached scipy-1.5.4-cp37-cp37m-manylinux1_x86_64.whl (25.9 MB)
Using cached scipy-1.5.3-cp37-cp37m-manylinux1_x86_64.whl (25.9 MB)
Using cached scipy-1.5.2-cp37-cp37m-manylinux1_x86_64.whl (25.9 MB)
Using cached scipy-1.5.1-cp37-cp37m-manylinux1_x86_64.whl (25.9 MB)
Using cached scipy-1.5.0-cp37-cp37m-manylinux1_x86_64.whl (25.9 MB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell us what happened here: https://pip.pypa.io/surveys/backtracking
Using cached scipy-1.4.1-cp37-cp37m-manylinux1_x86_64.whl (26.1 MB)
Using cached scipy-1.4.0-cp37-cp37m-manylinux1_x86_64.whl (26.1 MB)
Using cached scipy-1.3.3-cp37-cp37m-manylinux1_x86_64.whl (25.2 MB)
Using cached scipy-1.3.2-cp37-cp37m-manylinux1_x86_64.whl (25.2 MB)
Using cached scipy-1.3.1-cp37-cp37m-manylinux1_x86_64.whl (25.2 MB)
Using cached scipy-1.3.0-cp37-cp37m-manylinux1_x86_64.whl (25.2 MB)
Using cached scipy-1.2.3-cp37-cp37m-manylinux1_x86_64.whl (24.8 MB)
Using cached scipy-1.2.2-cp37-cp37m-manylinux1_x86_64.whl (24.8 MB)
Using cached scipy-1.2.1-cp37-cp37m-manylinux1_x86_64.whl (24.8 MB)
Using cached scipy-1.2.0-cp37-cp37m-manylinux1_x86_64.whl (26.6 MB)
Using cached scipy-1.1.0-cp37-cp37m-manylinux1_x86_64.whl (31.2 MB)
Using cached scipy-1.0.1.tar.gz (15.5 MB)
Using cached scipy-1.0.0.tar.gz (15.2 MB)
Using cached scipy-0.19.1.tar.gz (14.1 MB)
INFO: pip is looking at multiple versions of rsa to determine which version is compatible with other requirements. This could take a while.
Collecting rsa<5,>=3.1.4
Using cached rsa-4.7.2-py3-none-any.whl (34 kB)
Using cached rsa-4.7.1-py3-none-any.whl (36 kB)
Using cached rsa-4.7-py3-none-any.whl (34 kB)
Using cached rsa-4.6-py3-none-any.whl (47 kB)
Using cached rsa-4.5-py2.py3-none-any.whl (36 kB)
Using cached rsa-4.4.1-py2.py3-none-any.whl (33 kB)
Using cached rsa-4.3-py2.py3-none-any.whl (36 kB)
INFO: pip is looking at multiple versions of rsa to determine which version is compatible with other requirements. This could take a while.
Using cached rsa-4.2.tar.gz (46 kB)
Using cached rsa-4.1-py3-none-any.whl (32 kB)
Using cached rsa-4.0-py2.py3-none-any.whl (38 kB)
Using cached rsa-3.4.2-py2.py3-none-any.whl (46 kB)
Using cached rsa-3.4.1-py2.py3-none-any.whl (46 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell us what happened here: https://pip.pypa.io/surveys/backtracking
Using cached rsa-3.4-py2.py3-none-any.whl (46 kB)
Using cached rsa-3.3-py2.py3-none-any.whl (44 kB)
Using cached rsa-3.2.3-py2.py3-none-any.whl (44 kB)
Using cached rsa-3.2.2-py2.py3-none-any.whl (44 kB)
Using cached rsa-3.2-py2.py3-none-any.whl (43 kB)
Using cached rsa-3.1.4.tar.gz (36 kB)
INFO: pip is looking at multiple versions of idna to determine which version is compatible with other requirements. This could take a while.
Collecting idna<3,>=2.5
Using cached idna-2.10-py2.py3-none-any.whl (58 kB)
Using cached idna-2.9-py2.py3-none-any.whl (58 kB)
Using cached idna-2.8-py2.py3-none-any.whl (58 kB)
Using cached idna-2.7-py2.py3-none-any.whl (58 kB)
Using cached idna-2.6-py2.py3-none-any.whl (56 kB)
Using cached idna-2.5-py2.py3-none-any.whl (55 kB)
INFO: pip is looking at multiple versions of chardet to determine which version is compatible with other requirements. This could take a while.
Collecting chardet<4,>=3.0.2
Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
INFO: pip is looking at multiple versions of idna to determine which version is compatible with other requirements. This could take a while.
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell us what happened here: https://pip.pypa.io/surveys/backtracking
Downloading chardet-3.0.3-py2.py3-none-any.whl (133 kB)
Downloading chardet-3.0.2-py2.py3-none-any.whl (133 kB)
INFO: pip is looking at multiple versions of certifi to determine which version is compatible with other requirements. This could take a while.
Collecting certifi
Downloading certifi-2021.10.8-py2.py3-none-any.whl (149 kB)
INFO: pip is looking at multiple versions of chardet to determine which version is compatible with other requirements. This could take a while.
Downloading certifi-2021.5.30-py2.py3-none-any.whl (145 kB)
Downloading certifi-2020.12.5-py2.py3-none-any.whl (147 kB)
INFO: This is taking longer than usual. You might need to provide the dependency resolver with stricter constraints to reduce runtime. If you want to abort this run, you can press Ctrl + C to do so. To improve how pip performs, tell us what happened here: https://pip.pypa.io/surveys/backtracking
Downloading certifi-2020.11.8-py2.py3-none-any.whl (155 kB)
Downloading certifi-2020.6.20-py2.py3-none-any.whl (156 kB)
Downloading certifi-2020.4.5.2-py2.py3-none-any.whl (157 kB)
Downloading certifi-2020.4.5.1-py2.py3-none-any.whl (157 kB)
INFO: pip is looking at multiple versions of certifi to determine which version is compatible with other requirements. This could take a while.
Downloading certifi-2020.4.5-py2.py3-none-any.whl (156 kB)
Downloading certifi-2019.11.28-py2.py3-none-any.whl (156 kB)
Downloading certifi-2019.9.11-py2.py3-none-any.whl (154 kB)
Downloading certifi-2019.6.16-py2.py3-none-any.whl (157 kB)
Downloading certifi-2019.3.9-py2.py3-none-any.whl (158 kB)
DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default.
pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555.
ERROR: Exception:
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/pip/_internal/cli/base_command.py", line 180, in _main
status = self.run(options, args)
File "/usr/local/lib/python3.7/dist-packages/pip/_internal/cli/req_command.py", line 199, in wrapper
return func(self, options, args)
File "/usr/local/lib/python3.7/dist-packages/pip/_internal/commands/install.py", line 319, in run
reqs, check_supported_wheels=not options.target_dir
File "/usr/local/lib/python3.7/dist-packages/pip/_internal/resolution/resolvelib/resolver.py", line 128, in resolve
requirements, max_rounds=try_to_avoid_resolution_too_deep
File "/usr/local/lib/python3.7/dist-packages/pip/_vendor/resolvelib/resolvers.py", line 473, in resolve
state = resolution.resolve(requirements, max_rounds=max_rounds)
File "/usr/local/lib/python3.7/dist-packages/pip/_vendor/resolvelib/resolvers.py", line 384, in resolve
raise ResolutionTooDeep(max_rounds)
pip._vendor.resolvelib.resolvers.ResolutionTooDeep: 2000000
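The repeated INFO messages in the log above already point at the usual way out of this backtracking loop: give the resolver stricter constraints so it has fewer candidate versions to try. A minimal sketch of that approach — the file name, the pinned versions, and the package name below are illustrative assumptions, not taken from this log:

# constraints.txt - hypothetical pins for the packages pip kept backtracking over
scipy==1.7.2
six==1.16.0
rsa==4.7.2

# pass the file to pip via its -c/--constraint option
python -m pip install -c constraints.txt <package-you-are-installing>

Pinning the packages named in the "looking at multiple versions of ..." messages shrinks the version space the resolver has to explore, which is exactly what those INFO hints are asking for.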
Ivan
ANSWER
Answered 2021-Nov-19 at 00:16
Community Discussions, Code Snippets contain sources that include Stack Exchange Network
Vulnerabilities
No vulnerabilities reported
Install urllib3
You can use urllib3 like any standard Python library. You will need a development environment consisting of a Python distribution (including header files), a compiler, pip, and git. Make sure that your pip, setuptools, and wheel are up to date. When using pip, it is generally recommended to install packages in a virtual environment to avoid changes to the system.
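Once it is installed, urllib3 is used like any other Python package. A minimal usage sketch with PoolManager (the target URL is only an example):

import urllib3

# PoolManager handles thread-safe connection pooling
http = urllib3.PoolManager()

# issue a GET request; https://example.org is just a placeholder target
resp = http.request("GET", "https://example.org")

print(resp.status)     # HTTP status code, e.g. 200
print(resp.data[:80])  # first bytes of the response body

Running this inside the virtual environment recommended above keeps urllib3 and its dependencies isolated from the system Python.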
Support
Find, review, and download reusable Libraries, Code Snippets, Cloud APIs from over 650 million Knowledge Items