Popular New Releases in Caching
caffeine · 3.0.6
bigcache · v3.0.2
freecache · v1.2.1
cache · v5.4.7
jetcache
Popular Libraries in Caching
by ben-manes (Java) · 11391 stars · Apache-2.0
A high-performance caching library for Java.
by golang (Go) · 10602 stars · Apache-2.0
groupcache is a caching and cache-filling library, intended as a replacement for memcached in many cases.
by allegro (Go) · 5617 stars · Apache-2.0
Efficient cache for gigabytes of data written in Go.
by JakeWharton (Java) · 5583 stars · Apache-2.0
Java implementation of a disk-based LRU cache that specifically targets Android compatibility.
by Haneke (Swift) · 5103 stars · Apache-2.0
A lightweight generic cache for iOS written in Swift, with extra love for images.
by danikula (Java) · 4749 stars · Apache-2.0
Cache support for any video player with the help of a single line.
by php-fig (PHP) · 4283 stars · MIT
by isaacs (JavaScript) · 4006 stars · ISC
by coocood (Go) · 3693 stars · MIT
A cache library for Go with zero GC overhead.
Trending New libraries in Caching
by rjyo (TypeScript) · 391 stars
Add a cache layer for server-side-rendered pages with stale-while-revalidate. Can be considered an implementation of Next.js's Incremental Static Regeneration that works with getServerSideProps.
by moka-rs (Rust) · 383 stars · NOASSERTION
A high-performance concurrent caching library for Rust.
by pmndrs (TypeScript) · 305 stars
A promise caching strategy for React Suspense.
by patrixr (JavaScript) · 261 stars · MIT
A cache middleware for https://strapi.io
by yitd (PHP) · 198 stars
Any-Proxy helps you anonymously reverse-proxy and browse any website using PHP.
by GoogleChromeLabs (JavaScript) · 185 stars · Apache-2.0
Native File System API with legacy fallback in the browser.
by asyncins (Go) · 183 stars · MIT
An ultra-high-performance globally unique ID generation service based on Golang, Redis, and the Mist algorithm.
by benbjohnson (Go) · 168 stars · MIT
Implementation of io/fs.FS that appends SHA256 hashes to filenames to allow for aggressive HTTP caching.
by rec (Python) · 166 stars · MIT
Safer writing in Python.
Top Authors in Caching
1 · 19 Libraries · 643
2 · 10 Libraries · 679
3 · 8 Libraries · 128
4 · 7 Libraries · 175
5 · 7 Libraries · 38
6 · 7 Libraries · 348
7 · 7 Libraries · 77
8 · 6 Libraries · 551
9 · 6 Libraries · 64
10 · 6 Libraries · 41
Trending Kits in Caching
Here are some popular Python data caching libraries. Typical use cases include pre-computing expensive calculations, caching API responses, data warehouse optimization, and caching web pages.
Python data caching libraries provide a way to store data temporarily and retrieve it quickly. They can cache the results of expensive operations, such as database queries or remote API calls, so those results can be reused later. This improves a program's performance by reducing the time spent waiting on slow operations.
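Before reaching for a third-party library, note that this basic pattern, caching the result of an expensive call, already ships in the standard library as functools.lru_cache. A minimal sketch, where slow_square is a hypothetical stand-in for an expensive operation:

```python
from functools import lru_cache

call_count = 0

@lru_cache(maxsize=128)  # keep up to the 128 most recently used results
def slow_square(n):
    """Stand-in for an expensive operation, e.g. a database query."""
    global call_count
    call_count += 1
    return n * n

result = slow_square(12)   # first call: the function body runs
again = slow_square(12)    # second call: answered from the cache
assert result == again == 144
assert call_count == 1     # the body ran only once
```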
Let us have a look at some of the Python Data Caching Libraries in detail.
redis-py
- Provides a robust set of data structures for caching and managing data.
- Provides a high-performance server-side cache for Python applications.
- Offers built-in support for transactions and atomic operations
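redis-py itself needs a running Redis server, so the cache-aside pattern it is commonly used for is sketched below with a plain dict standing in for the Redis connection. The names fetch_user and db are hypothetical, and the comments show where redis-py's real get/setex calls would go:

```python
import json

cache = {}                  # stands in for a Redis connection (redis.Redis())
db = {1: {"name": "Ada"}}   # hypothetical slow backing store

def fetch_user(user_id):
    key = f"user:{user_id}"
    hit = cache.get(key)             # with redis-py: r.get(key)
    if hit is not None:
        return json.loads(hit)       # cache hit: skip the database entirely
    row = db[user_id]                # cache miss: query the slow store
    cache[key] = json.dumps(row)     # with redis-py: r.setex(key, 300, ...)
    return row

assert fetch_user(1) == {"name": "Ada"}   # miss: populates the cache
assert fetch_user(1) == {"name": "Ada"}   # hit: served from the cache
assert "user:1" in cache
```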
django-cacheops
- Automatically invalidates caches when the underlying models are changed.
- Seamlessly integrates with the Django ORM, allowing for easy integration of caching into existing Django applications.
- Allows for custom caching rules to be defined, allowing for more complex caching behaviors.
cachetools
- Provides dict-like cache classes with LRU, LFU, TTL, and random-replacement eviction policies.
- Offers memoizing decorators similar to functools.lru_cache, with pluggable caches and key functions.
- Supports a variety of expiration policies, such as time-based (TTLCache), size-based (LRUCache, LFUCache), or manual eviction.
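The time-based expiration that cachetools exposes through TTLCache can be sketched in a few lines of pure Python. This is a simplified illustration of the idea, not the library's implementation:

```python
import time

class TinyTTLCache:
    """Minimal time-to-live cache: entries expire ttl seconds after insertion."""

    def __init__(self, ttl):
        self.ttl = ttl
        self._data = {}  # key -> (expiry_timestamp, value)

    def set(self, key, value):
        self._data[key] = (time.monotonic() + self.ttl, value)

    def get(self, key, default=None):
        entry = self._data.get(key)
        if entry is None:
            return default
        expires_at, value = entry
        if time.monotonic() >= expires_at:
            del self._data[key]       # lazily evict the expired entry
            return default
        return value

c = TinyTTLCache(ttl=0.05)
c.set("k", 42)
assert c.get("k") == 42    # still fresh
time.sleep(0.06)
assert c.get("k") is None  # expired and evicted
```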
flask-caching
- Allows you to configure the cache behavior, such as the type of caching to use.
- Supports caching backends such as Redis, Memcached, SimpleCache, and more.
- Allows you to use multiple levels of caching, such as page, application, and session caching.
pickledb
- Persists data to disk as JSON, with automatic or manual dumping.
- Provides a simple, dict-like get/set interface.
- Flexible about value types: strings, lists, and dictionaries can all be stored.
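The core idea behind pickledb, a dict persisted to a file, can be sketched as follows. JsonStore is a hypothetical stand-in used for illustration, not pickledb's actual API:

```python
import json
import os
import tempfile

class JsonStore:
    """Tiny key-value store persisted to a JSON file on every write."""

    def __init__(self, path):
        self.path = path
        self.data = {}
        if os.path.exists(path):          # reload previously dumped data
            with open(path) as f:
                self.data = json.load(f)

    def set(self, key, value):
        self.data[key] = value
        with open(self.path, "w") as f:   # dump after each write
            json.dump(self.data, f)

    def get(self, key, default=None):
        return self.data.get(key, default)

path = os.path.join(tempfile.mkdtemp(), "store.json")
JsonStore(path).set("lang", "python")
# A fresh instance reads the file back, so the value survives a "restart".
assert JsonStore(path).get("lang") == "python"
```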
beaker
- Supports multiple caching regions, allowing data to be cached separately in different regions.
- Supports both simple key/value caching and more complex object caching.
- Built-in support for plugins, allowing developers to easily extend the library to support additional caching backends.
python-memcached
- Easy to deploy and written in pure Python, making it highly portable.
- Communicates with memcached servers using the memcached text protocol.
- Supports sharding keys across multiple memcached servers, making it suitable for both development and production deployments.
shelve
- Persistent, dictionary-like storage: data is kept in a file and remains there after the program exits.
- Accepts any picklable Python object as a value, keyed by strings.
- Offers an optional writeback mode that caches accessed entries in memory and writes them back when the shelf is synced or closed.
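Since shelve is part of the standard library, a small runnable example can show the persistence behavior directly:

```python
import os
import shelve
import tempfile

path = os.path.join(tempfile.mkdtemp(), "cache_shelf")

# Write: a shelf behaves like a dict whose values are pickled to disk.
with shelve.open(path) as shelf:
    shelf["config"] = {"retries": 3, "timeout": 2.5}

# Read back in a separate session: the data was persisted on disk.
with shelve.open(path) as shelf:
    assert shelf["config"]["retries"] == 3
```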
Trending Discussions on Caching
IndexError: tuple index out of range when I try to create an executable from a python script using auto-py-to-exe
Display customer specific information on product detail page - what about the caching?
Netlify says, "error Gatsby requires Node.js 14.15.0 or higher (you have v12.18.0)", yet I have the newest Node version?
Unable to build and deploy Rails 6.0.4.1 app on heroku - Throws gyp verb cli error
How can I dynamically allocate cyclic data?
Command failed with exit code 134: npm run generate
CircleCI (Started 11/1/2021) Can't find Python executable "python", you can set the PYTHON env variable
how to change create-react-app PWA to network-first
Does .NET Framework have an OS-independent global DNS cache?
Spark "first" Window function is taking much longer than "last"
QUESTION
IndexError: tuple index out of range when I try to create an executable from a python script using auto-py-to-exe
Asked 2022-Feb-24 at 15:03

I have been trying out an open-source personal AI assistant script. The script works fine, but I want to create an executable so that I can gift it to one of my friends. However, when I try to create the executable using auto-py-to-exe, it reports the error below:
Running auto-py-to-exe v2.10.1
Building directory: C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x
Provided command: pyinstaller --noconfirm --onedir --console --no-embed-manifest "C:/Users/Tarun/AppData/Local/Programs/Python/Python310/AI_Ass.py"
Recursion Limit is set to 5000
Executing: pyinstaller --noconfirm --onedir --console --no-embed-manifest C:/Users/Tarun/AppData/Local/Programs/Python/Python310/AI_Ass.py --distpath C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x\application --workpath C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x\build --specpath C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x

42681 INFO: PyInstaller: 4.6
42690 INFO: Python: 3.10.0
42732 INFO: Platform: Windows-10-10.0.19042-SP0
42744 INFO: wrote C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x\AI_Ass.spec
42764 INFO: UPX is not available.
42772 INFO: Extending PYTHONPATH with paths
['C:\\Users\\Tarun\\AppData\\Local\\Programs\\Python\\Python310']
43887 INFO: checking Analysis
43891 INFO: Building Analysis because Analysis-00.toc is non existent
43895 INFO: Initializing module dependency graph...
43915 INFO: Caching module graph hooks...
43975 INFO: Analyzing base_library.zip ...
54298 INFO: Processing pre-find module path hook distutils from 'C:\\Users\\Tarun\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\PyInstaller\\hooks\\pre_find_module_path\\hook-distutils.py'.
54306 INFO: distutils: retargeting to non-venv dir 'C:\\Users\\Tarun\\AppData\\Local\\Programs\\Python\\Python310\\lib'
57474 INFO: Caching module dependency graph...
58088 INFO: running Analysis Analysis-00.toc
58132 INFO: Adding Microsoft.Windows.Common-Controls to dependent assemblies of final executable
  required by C:\Users\Tarun\AppData\Local\Programs\Python\Python310\python.exe
58365 INFO: Analyzing C:\Users\Tarun\AppData\Local\Programs\Python\Python310\AI_Ass.py
59641 INFO: Processing pre-safe import module hook urllib3.packages.six.moves from 'C:\\Users\\Tarun\\AppData\\Local\\Programs\\Python\\Python310\\lib\\site-packages\\PyInstaller\\hooks\\pre_safe_import_module\\hook-urllib3.packages.six.moves.py'.
An error occurred while packaging
Traceback (most recent call last):
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\auto_py_to_exe\packaging.py", line 131, in package
    run_pyinstaller()
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\__main__.py", line 124, in run
    run_build(pyi_config, spec_file, **vars(args))
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\__main__.py", line 58, in run_build
    PyInstaller.building.build_main.main(pyi_config, spec_file, **kwargs)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\building\build_main.py", line 782, in main
    build(specfile, kw.get('distpath'), kw.get('workpath'), kw.get('clean_build'))
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\building\build_main.py", line 714, in build
    exec(code, spec_namespace)
  File "C:\Users\Tarun\AppData\Local\Temp\tmpjaw1ky1x\AI_Ass.spec", line 7, in <module>
    a = Analysis(['C:/Users/Tarun/AppData/Local/Programs/Python/Python310/AI_Ass.py'],
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\building\build_main.py", line 277, in __init__
    self.__postinit__()
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\building\datastruct.py", line 155, in __postinit__
    self.assemble()
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\building\build_main.py", line 439, in assemble
    priority_scripts.append(self.graph.add_script(script))
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 265, in add_script
    self._top_script_node = super().add_script(pathname)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1433, in add_script
    self._process_imports(n)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
    target_module = self._safe_import_hook(*import_info, **kwargs)[0]
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
    target_modules = self.import_hook(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1505, in import_hook
    target_package, target_module_partname = self._find_head_package(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1684, in _find_head_package
    target_package = self._safe_import_module(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
    return super()._safe_import_module(module_basename, module_name, parent_package)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
    self._process_imports(n)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
    target_module = self._safe_import_hook(*import_info, **kwargs)[0]
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
    target_modules = self.import_hook(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1505, in import_hook
    target_package, target_module_partname = self._find_head_package(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1684, in _find_head_package
    target_package = self._safe_import_module(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
    return super()._safe_import_module(module_basename, module_name, parent_package)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
    self._process_imports(n)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
    target_module = self._safe_import_hook(*import_info, **kwargs)[0]
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
    target_modules = self.import_hook(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1505, in import_hook
    target_package, target_module_partname = self._find_head_package(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1684, in _find_head_package
    target_package = self._safe_import_module(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
    return super()._safe_import_module(module_basename, module_name, parent_package)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
    self._process_imports(n)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
    target_module = self._safe_import_hook(*import_info, **kwargs)[0]
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
    target_modules = self.import_hook(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1505, in import_hook
    target_package, target_module_partname = self._find_head_package(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1684, in _find_head_package
    target_package = self._safe_import_module(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
    return super()._safe_import_module(module_basename, module_name, parent_package)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
    self._process_imports(n)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
    target_module = self._safe_import_hook(*import_info, **kwargs)[0]
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
    target_modules = self.import_hook(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1518, in import_hook
    submodule = self._safe_import_module(head, mname, submodule)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
    return super()._safe_import_module(module_basename, module_name, parent_package)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
    self._process_imports(n)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
    target_module = self._safe_import_hook(*import_info, **kwargs)[0]
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
    target_modules = self.import_hook(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1518, in import_hook
    submodule = self._safe_import_module(head, mname, submodule)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
    return super()._safe_import_module(module_basename, module_name, parent_package)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2062, in _safe_import_module
    self._process_imports(n)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2850, in _process_imports
    target_module = self._safe_import_hook(*import_info, **kwargs)[0]
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2301, in _safe_import_hook
    target_modules = self.import_hook(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 1518, in import_hook
    submodule = self._safe_import_module(head, mname, submodule)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\depend\analysis.py", line 387, in _safe_import_module
    return super()._safe_import_module(module_basename, module_name, parent_package)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2061, in _safe_import_module
    n = self._scan_code(module, co, co_ast)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2645, in _scan_code
    self._scan_bytecode(
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\modulegraph.py", line 2749, in _scan_bytecode
    for inst in util.iterate_instructions(module_code_object):
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\util.py", line 147, in iterate_instructions
    yield from iterate_instructions(constant)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\site-packages\PyInstaller\lib\modulegraph\util.py", line 139, in iterate_instructions
    yield from get_instructions(code_object)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\dis.py", line 338, in _get_instructions_bytes
    argval, argrepr = _get_const_info(arg, constants)
  File "C:\Users\Tarun\AppData\Local\Programs\Python\Python310\lib\dis.py", line 292, in _get_const_info
    argval = const_list[const_index]
IndexError: tuple index out of range

Project output will not be moved to output folder
Complete.
I understand that there is already a thread about a similar issue, but it doesn't solve my problem, so I am seeking help here.
I really have no idea why the error occurs or how to resolve it. Can someone please help? Thank you in advance.
140 argval = const_list[const_index]
141IndexError: tuple index out of range
142
143Project output will not be moved to output folder
144Complete.
#importing libraries

import speech_recognition as sr
import pyttsx3
import datetime
import wikipedia
import webbrowser
import os
import time
import subprocess
from ecapture import ecapture as ec
import wolframalpha
import json
import requests

#setting up speech engine
engine=pyttsx3.init('sapi5')
voices=engine.getProperty('voices')
engine.setProperty('voice','voices[1].id')

def speak(text):
    engine.say(text)
    engine.runAndWait()

#Greet user
def wishMe():
    hour=datetime.datetime.now().hour
    if hour>=0 and hour<12:
        speak("Hello,Good Morning")
        print("Hello,Good Morning")
    elif hour>=12 and hour<18:
        speak("Hello,Good Afternoon")
        print("Hello,Good Afternoon")
    else:
        speak("Hello,Good Evening")
        print("Hello,Good Evening")

#Setting up the command function for your AI assistant
def takeCommand():
    r=sr.Recognizer()
    with sr.Microphone() as source:
        print("Listening...")
        audio=r.listen(source)

    try:
        statement=r.recognize_google(audio,language='en-in')
        print(f"user said:{statement}\n")

    except Exception as e:
        speak("Pardon me, please say that again")
        return "None"
    return statement

print("Loading your AI personal assistant Friday")
speak("Loading your AI personal assistant Friday")
wishMe()

#main function
if __name__=='__main__':

    while True:
        speak("Tell me how can I help you now?")
        statement = takeCommand().lower()
        if statement==0:
            continue

        if "good bye" in statement or "ok bye" in statement or "stop" in statement:
            speak('your personal assistant Friday is shutting down,Good bye')
            print('your personal assistant Friday is shutting down,Good bye')
            break

        if 'wikipedia' in statement:
            speak('Searching Wikipedia...')
            statement =statement.replace("wikipedia", "")
            results = wikipedia.summary(statement, sentences=10)
            webbrowser.open_new_tab("https://en.wikipedia.org/wiki/"+ statement)
            speak("According to Wikipedia")
            print(results)
            speak(results)

        elif 'open youtube' in statement:
            webbrowser.register('chrome', None,
                webbrowser.BackgroundBrowser("C://Program Files (x86)//Google//Chrome//Application//chrome.exe"))
            webbrowser.get('chrome').open_new_tab("https://www.youtube.com")
            #webbrowser.open_new_tab("https://www.youtube.com")
            speak("youtube is open now")
            time.sleep(5)

        elif 'open google' in statement:
            webbrowser.open_new_tab("https://www.google.com")
            speak("Google chrome is open now")
            time.sleep(5)

        elif 'open gmail' in statement:
            webbrowser.open_new_tab("gmail.com")
            speak("Google Mail open now")
            time.sleep(5)

        elif 'time' in statement:
            strTime=datetime.datetime.now().strftime("%H:%M:%S")
            speak(f"the time is {strTime}")

        elif 'news' in statement:
            news = webbrowser.open_new_tab("https://timesofindia.indiatimes.com/home/headlines")
            speak('Here are some headlines from the Times of India,Happy reading')
            time.sleep(6)

        elif "camera" in statement or "take a photo" in statement:
            ec.capture(0,"robo camera","img.jpg")

        elif 'search' in statement:
            statement = statement.replace("search", "")
            webbrowser.open_new_tab(statement)
            time.sleep(5)

        elif 'who are you' in statement or 'what can you do' in statement:
            speak('I am Friday version 1 point O your personal assistant. I am programmed to minor tasks like'
            'opening youtube,google chrome, gmail and stackoverflow ,predict time,take a photo,search wikipedia,predict weather'
            'In different cities, get top headline news from times of india and you can ask me computational or geographical questions too!')

        elif "who made you" in statement or "who created you" in statement or "who discovered you" in statement:
            speak("I was built by Mirthula")
            print("I was built by Mirthula")

        elif "log off" in statement or "sign out" in statement:
            speak("Ok , your pc will log off in 10 sec make sure you exit from all applications")
            subprocess.call(["shutdown", "/l"])

time.sleep(3)
ANSWER
Answered 2021-Nov-05 at 02:20
42681 INFO: PyInstaller: 4.6
42690 INFO: Python: 3.10.0
There's the issue. Python 3.10.0 has a bug with PyInstaller 4.6. The problem isn't you or PyInstaller. Try converting it using Python 3.9.7 instead. Ironic, considering 3.10.0 was supposed to be a bugfix update.
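The incompatibility the log reveals can be summed up as a simple version check. The sketch below is a hypothetical helper, not part of PyInstaller; the 4.7 cutoff is an assumption based on PyInstaller's release notes at the time (4.7 was the first release advertising Python 3.10 support), so verify it against your installed version:

```python
# Hypothetical helper: flags the known-bad combination shown in the log
# (PyInstaller 4.6 analyzing Python 3.10 bytecode). The (4, 7) cutoff is
# an assumption taken from PyInstaller's release notes at the time.
def pyinstaller_supports(python_version, pyinstaller_version):
    """Return True if this PyInstaller release is known to handle this Python."""
    if python_version[:2] >= (3, 10):
        return pyinstaller_version >= (4, 7)
    return True

print(pyinstaller_supports((3, 10, 0), (4, 6)))  # the failing setup in the log
print(pyinstaller_supports((3, 9, 7), (4, 6)))   # the suggested workaround
```

Either downgrading the interpreter (as suggested) or upgrading PyInstaller moves you out of the unsupported cell of that matrix.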
QUESTION
Display customer specific information on product detail page - what about the caching?
Asked 2022-Jan-28 at 10:57
We want to display customer (actually customer-group) specific information on product detail pages in Shopware 6.
Shopware has an HTTP cache, and we are afraid that the page would be cached while a specific customer group views it, leaking that group's information to non-customers.
Is this assumption correct?
The documentation does not reveal much information about this.
Is there a way to set specific cache tags, so that the information is only displayed to the correct customer group?
Or do we need to fetch the data dynamically via AJAX?
Bonus question: Can the HTTP cache be simulated in automatic tests to ensure the functionality works?
What I found out so far:
There is the annotation @httpCache for controllers, which seems to control whether a page is cached or not.
The cache key is generated in \Shopware\Storefront\Framework\Cache\HttpCacheKeyGenerator::generate. It takes the full request URI into account, plus some cacheHash which is injected. I believe it does not take the customer group into account.
Maybe this generate() method could be decorated, but I am not sure if that is the right way.
There is a cookie being set, sw-cache-hash, which influences the caching. It takes the customer into account. sw-cache-hash is created here:
if ($context->getCustomer() || $cart->getLineItems()->count() > 0) {
    $cookie = Cookie::create(self::CONTEXT_CACHE_COOKIE, $this->buildCacheHash($context));
    $cookie->setSecureDefault($request->isSecure());

    $response->headers->setCookie($cookie);
} else {
    $response->headers->removeCookie(self::CONTEXT_CACHE_COOKIE);
    $response->headers->clearCookie(self::CONTEXT_CACHE_COOKIE);
}
So as soon as you are logged in or have some items in the cart, a different cache hash is used. It depends on the following, but not on the customer group itself:
private function buildCacheHash(SalesChannelContext $context): string
{
    return md5(json_encode([
        $context->getRuleIds(),
        $context->getContext()->getVersionId(),
        $context->getCurrency()->getId(),
    ]));
}
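The effect of buildCacheHash can be modeled in a few lines of Python (a minimal sketch, not Shopware's actual PHP code): two sales-channel contexts that differ only in their active rule IDs produce different hashes, and therefore hit different cache entries.

```python
import hashlib
import json

# Minimal Python model of Shopware's buildCacheHash (the real code is PHP):
# the hash covers active rule IDs, the version ID and the currency ID.
def build_cache_hash(rule_ids, version_id, currency_id):
    payload = json.dumps([rule_ids, version_id, currency_id])
    return hashlib.md5(payload.encode()).hexdigest()

# Two customers matched by different rules (e.g. one rule per customer group):
hash_a = build_cache_hash(["rule-group-a"], "v1", "EUR")
hash_b = build_cache_hash(["rule-group-b"], "v1", "EUR")
assert hash_a != hash_b  # different active rules -> different cache entries
```

This is why a rule that fires only for a given customer group is enough to separate the cached pages, even though the group ID itself never enters the hash.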
ANSWER
Answered 2022-Jan-28 at 10:51
As you can see in the last code snippet, the hash takes the active rule IDs into account. This means that if you create a rule (through Settings > Rule Builder) that is active for one customer group but not for another (or for no group), it will be taken into account and produce a different cache hash for the different customer groups.
QUESTION
Netlify says, "error Gatsby requires Node.js 14.15.0 or higher (you have v12.18.0)", yet I have the newest Node version?
Asked 2022-Jan-08 at 07:21
After migrating from Remark to MDX, my builds on Netlify are failing.
I get this error when trying to build:
10:13:28 AM: $ npm run build
10:13:29 AM: > blog-gatsby@0.1.0 build /opt/build/repo
10:13:29 AM: > gatsby build
10:13:30 AM: error Gatsby requires Node.js 14.15.0 or higher (you have v12.18.0).
10:13:30 AM: Upgrade Node to the latest stable release: https://gatsby.dev/upgrading-node-js
Yet when I run node -v in my terminal, it says v17.2.0.
I assume it's not a coincidence that this happened after migrating. Can the problem be caused by my node_modules folder? Or is there something in my gatsby-config.js or package.json files I need to change?
My package.json file:
{
  "name": "blog-gatsby",
  "private": true,
  "description": "A starter for a blog powered by Gatsby and Markdown",
  "version": "0.1.0",
  "author": "Magnus Kolstad <kolstadmagnus@gmail.com>",
  "bugs": {
    "url": "https://kolstadmagnus.no"
  },
  "dependencies": {
    "@mdx-js/mdx": "^1.6.22",
    "@mdx-js/react": "^1.6.22",
    "gatsby": "^4.3.0",
    "gatsby-plugin-feed": "^4.3.0",
    "gatsby-plugin-gatsby-cloud": "^4.3.0",
    "gatsby-plugin-google-analytics": "^4.3.0",
    "gatsby-plugin-image": "^2.3.0",
    "gatsby-plugin-manifest": "^4.3.0",
    "gatsby-plugin-mdx": "^3.4.0",
    "gatsby-plugin-offline": "^5.3.0",
    "gatsby-plugin-react-helmet": "^5.3.0",
    "gatsby-plugin-sharp": "^4.3.0",
    "gatsby-remark-copy-linked-files": "^5.3.0",
    "gatsby-remark-images": "^6.3.0",
    "gatsby-remark-prismjs": "^6.3.0",
    "gatsby-remark-responsive-iframe": "^5.3.0",
    "gatsby-remark-smartypants": "^5.3.0",
    "gatsby-source-filesystem": "^4.3.0",
    "gatsby-transformer-sharp": "^4.3.0",
    "prismjs": "^1.25.0",
    "react": "^17.0.1",
    "react-dom": "^17.0.1",
    "react-helmet": "^6.1.0",
    "typeface-merriweather": "0.0.72",
    "typeface-montserrat": "0.0.75"
  },
  "devDependencies": {
    "prettier": "^2.4.1"
  },
  "homepage": "https://kolstadmagnus.no",
  "keywords": [
    "blog"
  ],
  "license": "0BSD",
  "main": "n/a",
  "repository": {
    "type": "git",
    "url": "git+https://github.com/gatsbyjs/gatsby-starter-blog.git"
  },
  "scripts": {
    "build": "gatsby build",
    "develop": "gatsby develop",
    "format": "prettier --write \"**/*.{js,jsx,ts,tsx,json,md}\"",
    "start": "gatsby develop",
    "serve": "gatsby serve",
    "clean": "gatsby clean",
    "test": "echo \"Write tests! -> https://gatsby.dev/unit-testing\" && exit 1"
  }
}
What am I doing wrong here?
Update #1
7:11:59 PM: failed Building production JavaScript and CSS bundles - 20.650s
7:11:59 PM: error Generating JavaScript bundles failed
7:11:59 PM: Module build failed (from ./node_modules/url-loader/dist/cjs.js):
7:11:59 PM: Error: error:0308010C:digital envelope routines::unsupported
7:11:59 PM: at new Hash (node:internal/crypto/hash:67:19)
7:11:59 PM: at Object.createHash (node:crypto:130:10)
7:11:59 PM: at getHashDigest (/opt/build/repo/node_modules/file-loader/node_modules/loader-utils/lib/getHashDigest.js:46:34)
7:11:59 PM: at /opt/build/repo/node_modules/file-loader/node_modules/loader-utils/lib/interpolateName.js:113:11
7:11:59 PM: at String.replace (<anonymous>)
7:11:59 PM: at interpolateName (/opt/build/repo/node_modules/file-loader/node_modules/loader-utils/lib/interpolateName.js:110:8)
7:11:59 PM: at Object.loader (/opt/build/repo/node_modules/file-loader/dist/index.js:29:48)
7:11:59 PM: at Object.loader (/opt/build/repo/node_modules/url-loader/dist/index.js:127:19)
7:11:59 PM:
7:11:59 PM: ────────────────────────────────────────────────────────────────
7:11:59 PM: "build.command" failed
7:11:59 PM: ────────────────────────────────────────────────────────────────
7:11:59 PM:
7:11:59 PM: Error message
7:11:59 PM: Command failed with exit code 1: npm run build
7:11:59 PM:
7:11:59 PM: Error location
7:11:59 PM: In Build command from Netlify app:
7:11:59 PM: npm run build
7:11:59 PM:
7:11:59 PM: Resolved config
7:11:59 PM: build:
7:11:59 PM:   command: npm run build
7:11:59 PM:   commandOrigin: ui
7:11:59 PM:   publish: /opt/build/repo/public
7:11:59 PM:   publishOrigin: ui
7:11:59 PM: plugins:
7:11:59 PM:   - inputs: {}
7:11:59 PM:     origin: ui
7:11:59 PM:     package: '@netlify/plugin-gatsby'
7:11:59 PM: redirects:
7:12:00 PM:   - from: /api/*
      status: 200
      to: /.netlify/functions/gatsby
    - force: true
      from: https://magnuskolstad.com
      status: 301
      to: https://kolstadmagnus.no
  redirectsOrigin: config
Caching artifacts
ANSWER
Answered 2022-Jan-08 at 07:21
The problem is that you have Node 17.2.0 locally, but Netlify's build environment runs a lower version (by default it is not set to 17.2.0). So your local environment is fine, while the Netlify environment breaks because of this mismatch of Node versions.
When Netlify deploys your site, it installs your dependencies and builds your site again, so you should ensure that both environments work under the same conditions. Otherwise, the two node_modules trees will differ, and your application will behave differently or eventually won't even build because of dependency errors.
You can set the Node version in multiple ways, but I'd recommend using a .nvmrc
file. Just run the following command in the root of your project:
node -v > .nvmrc
This should create a .nvmrc
file containing your current Node version (the output of node -v
). When Netlify finds this file during the build process, it uses it as the base Node version and installs all the dependencies accordingly.
The file is also useful for telling other contributors which Node version you are using.
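As an aside not covered by the original answer, Netlify also lets you pin the Node version through a build environment variable in netlify.toml; a minimal sketch, assuming the questioner's local version of 17.2.0:

```toml
# Hypothetical netlify.toml fragment: NODE_VERSION is the environment
# variable Netlify's build image reads to pick the Node release.
[build.environment]
  NODE_VERSION = "17.2.0"
```

Either mechanism works; .nvmrc has the advantage of also being understood by nvm locally.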
QUESTION
Unable to build and deploy Rails 6.0.4.1 app on heroku - Throws gyp verb cli error
Asked 2022-Jan-02 at 10:07
Hi, I was deploying a branch on Heroku and it threw this error. I also tried deploying a branch that had previously deployed perfectly, but it now shows the same error.
Local yarn version: 1.22.17. Local node version: v12.22.7. Please help!
I tried building without yarn.lock and package-lock.json; same result.
This is how the Heroku deployment build log (through the CLI) starts:
yarn install v1.22.17
remote: warning package-lock.json found. Your project contains lock files generated by tools other than Yarn. It is advised not to mix package managers in order to avoid resolution inconsistencies caused by unsynchronized lock files. To clear this warning, remove package-lock.json.
remote: [1/5] Validating package.json...
remote: [2/5] Resolving packages...
remote: [3/5] Fetching packages...
remote: [4/5] Linking dependencies...
remote: warning " > webpack-dev-server@4.6.0" has unmet peer dependency "webpack@^4.37.0 || ^5.0.0".
remote: warning "webpack-dev-server > webpack-dev-middleware@5.2.1" has unmet peer dependency "webpack@^4.0.0 || ^5.0.0".
remote: [5/5] Building fresh packages...
remote: error /tmp/build_df192222/node_modules/@rails/webpacker/node_modules/node-sass: Command failed.
remote: Exit code: 1
remote: Command: node scripts/build.js
remote: Arguments:
remote: Directory: /tmp/build_df192222/node_modules/@rails/webpacker/node_modules/node-sass
remote: Output:
remote: Building: /tmp/build_df192222/bin/node /tmp/build_df192222/node_modules/@rails/webpacker/node_modules/node-gyp/bin/node-gyp.js rebuild --verbose --libsass_ext= --libsass_cflags= --libsass_ldflags= --libsass_library=
remote: gyp info it worked if it ends with ok
remote: gyp verb cli [
remote: gyp verb cli '/tmp/build_df192222/bin/node',
remote: gyp verb cli '/tmp/build_df192222/node_modules/@rails/webpacker/node_modules/node-gyp/bin/node-gyp.js',
remote: gyp verb cli 'rebuild',
remote: /app/.node-gyp/16.13.1/include/node/v8-internal.h: In function ‘void v8::internal::PerformCastCheck(T*)’:
remote: /app/.node-gyp/16.13.1/include/node/v8-internal.h:492:38: error: ‘remove_cv_t’ is not a member of ‘std’; did you mean ‘remove_cv’?
remote: 492 | !std::is_same<Data, std::remove_cv_t<T>>::value>::Perform(data);
remote: | ^~~~~~~~~~~
remote: | remove_cv
remote: /app/.node-gyp/16.13.1/include/node/v8-internal.h:492:38: error: ‘remove_cv_t’ is not a member of ‘std’; did you mean ‘remove_cv’?
remote: 492 | !std::is_same<Data, std::remove_cv_t<T>>::value>::Perform(data);
remote: | ^~~~~~~~~~~
remote: | remove_cv
remote: /app/.node-gyp/16.13.1/include/node/v8-internal.h:492:50: error: template argument 2 is invalid
remote: 492 | !std::is_same<Data, std::remove_cv_t<T>>::value>::Perform(data);
remote: | ^
remote: /app/.node-gyp/16.13.1/include/node/v8-internal.h:492:63: error: ‘::Perform’ has not been declared
remote: 492 | !std::is_same<Data, std::remove_cv_t<T>>::value>::Perform(data);
remote: | ^~~~~~~
remote: ../src/binding.cpp: In function ‘Nan::NAN_METHOD_RETURN_TYPE render(Nan::NAN_METHOD_ARGS_TYPE)’:
remote: ../src/binding.cpp:284:98: warning: cast between incompatible function types from ‘void (*)(uv_work_t*)’ {aka ‘void (*)(uv_work_s*)’} to ‘uv_after_work_cb’ {aka ‘void (*)(uv_work_s*, int)’} [-Wcast-function-type]
remote: 284 | int status = uv_queue_work(uv_default_loop(), &ctx_w->request, compile_it, (uv_after_work_cb)MakeCallback);
remote: | ^~~~~~~~~~~~
remote: ../src/binding.cpp: In function ‘Nan::NAN_METHOD_RETURN_TYPE render_file(Nan::NAN_METHOD_ARGS_TYPE)’:
remote: ../src/binding.cpp:320:98: warning: cast between incompatible function types from ‘void (*)(uv_work_t*)’ {aka ‘void (*)(uv_work_s*)’} to ‘uv_after_work_cb’ {aka ‘void (*)(uv_work_s*, int)’} [-Wcast-function-type]
remote: 320 | int status = uv_queue_work(uv_default_loop(), &ctx_w->request, compile_it, (uv_after_work_cb)MakeCallback);
remote: | ^~~~~~~~~~~~
remote: In file included from ../../../../../nan/nan.h:58,
remote: from ../src/binding.cpp:1:
remote: ../src/binding.cpp: At global scope:
remote: /app/.node-gyp/16.13.1/include/node/node.h:821:43: warning: cast between incompatible function types from ‘void (*)(Nan::ADDON_REGISTER_FUNCTION_ARGS_TYPE)’ {aka ‘void (*)(v8::Local<v8::Object>)’} to ‘node::addon_register_func’ {aka ‘void (*)(v8::Local<v8::Object>, v8::Local<v8::Value>, void*)’} [-Wcast-function-type]
remote: 821 | (node::addon_register_func) (regfunc), \
remote: | ^
remote: /app/.node-gyp/16.13.1/include/node/node.h:855:3: note: in expansion of macro ‘NODE_MODULE_X’
remote: 855 | NODE_MODULE_X(modname, regfunc, NULL, 0) // NOLINT (readability/null_usage)
remote: | ^~~~~~~~~~~~~
remote: ../src/binding.cpp:358:1: note: in expansion of macro ‘NODE_MODULE’
remote: 358 | NODE_MODULE(binding, RegisterModule);
remote: | ^~~~~~~~~~~
remote: make: *** [binding.target.mk:133: Release/obj.target/binding/src/binding.o] Error 1
remote: make: Leaving directory '/tmp/build_df192222/node_modules/@rails/webpacker/node_modules/node-sass/build'
remote: gyp ERR! build error
remote: gyp ERR! stack Error: `make` failed with exit code: 2
remote: gyp ERR! stack at ChildProcess.onExit (/tmp/build_df192222/node_modules/@rails/webpacker/node_modules/node-gyp/lib/build.js:262:23)
remote: gyp ERR! stack at ChildProcess.emit (node:events:390:28)
remote: gyp ERR! stack at Process.ChildProcess._handle.onexit (node:internal/child_process:290:12)
remote: gyp ERR! System Linux 4.4.0-1097-aws
remote: gyp ERR! command "/tmp/build_df192222/bin/node" "/tmp/build_df192222/node_modules/@rails/webpacker/node_modules/node-gyp/bin/node-gyp.js" "rebuild" "--verbose" "--libsass_ext=" "--libsass_cflags=" "--libsass_ldflags=" "--libsass_library="
remote: gyp ERR! cwd /tmp/build_df192222/node_modules/@rails/webpacker/node_modules/node-sass
remote: gyp ERR! node -v v16.13.1
remote: gyp ERR! node-gyp -v v3.8.0
remote: gyp ERR! not ok
remote: Build failed with error code: 1
remote:
remote: !
remote: ! Precompiling assets failed.
remote: !
remote: ! Push rejected, failed to compile Ruby app.
remote:
remote: ! Push failed
Though it is a Rails app, I added node to engines in package.json:
{
  "name": "travel_empire",
  "private": true,
  "dependencies": {
    "@fortawesome/fontawesome-free": "^5.15.4",
    "@popperjs/core": "^2.10.2",
    "@rails/actioncable": "^6.0.0",
    "@rails/activestorage": "^6.0.0",
    "@rails/ujs": "^6.0.0",
    "@rails/webpacker": "4.3.0",
    "bootstrap": "4.3.1",
    "bootstrap-icons": "^1.5.0",
    "easy-autocomplete": "^1.3.5",
    "jquery": "^3.6.0",
    "jquery-ui-dist": "^1.12.1",
    "js-autocomplete": "^1.0.4",
    "node-sass": "^7.0.0",
    "popper.js": "^1.16.1",
    "turbolinks": "^5.2.0"
  },
  "version": "0.1.0",
  "devDependencies": {
    "webpack-dev-server": "^4.6.0"
  },
  "engines": {
    "node": "16.x"
  }
}
Gemfile
source 'https://rubygems.org'
git_source(:github) { |repo| "https://github.com/#{repo}.git" }

ruby '2.7.3'

# Bundle edge Rails instead: gem 'rails', github: 'rails/rails'
gem 'rails', '~> 6.0.3', '>= 6.0.3.7'

gem 'mongoid', git: 'https://github.com/mongodb/mongoid.git'

# Use Puma as the app server
gem 'puma', '~> 4.1'
# Use SCSS for stylesheets
gem 'sass-rails', '>= 6'
# Transpile app-like JavaScript. Read more: https://github.com/rails/webpacker
gem 'webpacker', '~> 4.0'
# Turbolinks makes navigating your web application faster. Read more: https://github.com/turbolinks/turbolinks
gem 'turbolinks', '~> 5'
# Build JSON APIs with ease. Read more: https://github.com/rails/jbuilder
gem 'jbuilder', '~> 2.7'
# Use Redis adapter to run Action Cable in production
# gem 'redis', '~> 4.0'
# Use Active Model has_secure_password

# Use Active Storage variant
# gem 'image_processing', '~> 1.2'

gem 'axlsx'
gem 'caxlsx_rails'

# Bootstrap for UI
gem 'bootstrap', '~> 5.1.0'
gem 'bootstrap-timepicker-rails', '~> 0.1.3'
gem 'bootstrap-select-rails', '~> 1.6', '>= 1.6.3'
# jQuery Rails
gem 'jquery-rails'

gem 'rails_12factor', group: :production
# Reduces boot times through caching; required in config/boot.rb
gem 'bootsnap', '>= 1.4.2', require: false

group :development, :test do
  # Call 'byebug' anywhere in the code to stop execution and get a debugger console
  gem 'byebug', platforms: [:mri, :mingw, :x64_mingw]
end

group :development do
  # Access an interactive console on exception pages or by calling 'console' anywhere in the code.
  gem 'web-console', '>= 3.3.0'
  gem 'listen', '~> 3.2'
  gem 'pry'
  # Spring speeds up development by keeping your application running in the background. Read more: https://github.com/rails/spring
  gem 'spring'
  gem 'spring-watcher-listen', '~> 2.0.0'
end

group :test do
  # Adds support for Capybara system testing and selenium driver
  gem 'capybara', '>= 2.15'
  gem 'selenium-webdriver'
  # Easy installation and use of web drivers to run system tests with browsers
  gem 'webdrivers'
  gem 'cucumber-rails', require: false
  gem 'database_cleaner'
end

# Windows does not include zoneinfo files, so bundle the tzinfo-data gem
gem 'tzinfo-data', platforms: [:mingw, :mswin, :x64_mingw, :jruby]

# HTTParty for RESTful API calls
gem 'httparty'

# Paperclip for storing files
gem 'paperclip'
gem "mongoid-paperclip", :require => "mongoid_paperclip"

gem "letter_opener", :group => :development
ANSWER
Answered 2021-Dec-18 at 14:32
I had a similar problem and resolved it with the following steps:
- Run the following command:
heroku buildpacks:add heroku/nodejs --index 1
- Update the node version from 16.x
to 12.16.2
in package.json.
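A minimal sketch of that second step, assuming the engines field from the question's package.json (all other fields omitted here for brevity):

```json
{
  "engines": {
    "node": "12.16.2"
  }
}
```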
QUESTION
How can I dynamically allocate cyclic data?
Asked 2021-Dec-24 at 10:14
For the sake of example, let's define a toy automaton type:
data Automaton =
  Auto
    { success :: Automaton
    , failure :: Automaton
    }
This structure is designed to be cyclic: we can imagine each Automaton
as a state with success and failure transitions to other states. So finite automata must be defined recursively. For example, here is the simplest automaton:
sink = Auto sink sink
It consists of a single state that always transitions to itself. If we want, we can make more complex automata:
-- Transitions to a sink once it encounters a failure
otto1 = Auto otto1 sink

-- Mutually recursive automata
otto2 = Auto otto2 otto3

otto3 = Auto otto3 otto2
These are nice, but it would also be nice to take user input and construct an automaton from it, for example by building one out of a transition matrix. Here's a naive implementation:
fromTransition :: [(Int, Int)] -> Automaton
fromTransition tMatrix =
  go 0
  where
    go n =
      let
        (succ, fail) = tMatrix !! n
      in
        Auto (go succ) (go fail)
However, this approach introduces a problem. In our previous examples, following a transition was O(1). Automata produced this way take O(n) per transition, since, barring caching, the list must be indexed every time we make a transition. In addition, the input list must be kept in memory as long as the automaton is, making this essentially worse than just using the transition matrix as the automaton.
What I'd really like is for dynamically built automata to be just as performant as statically built ones like those shown earlier. I'd like some way to analyze the input, construct an automaton, and then free the input up.
In a language with mutation this is easy to do, because we can create the structure bit by bit, leaving holes behind to fill in later.
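For contrast, here is a quick sketch of that mutate-and-patch approach in JavaScript (the node shape with success/failure fields mirrors the Haskell record; fromTransition and the node shape are illustrative):

```javascript
// Build a cyclic automaton from a transition matrix by first allocating
// every node with "holes" (null transitions), then patching the holes to
// point at the already-allocated nodes. Transitions are O(1) field reads.
function fromTransition(matrix) {
  const nodes = matrix.map(() => ({ success: null, failure: null }));
  matrix.forEach(([succ, fail], i) => {
    nodes[i].success = nodes[succ]; // patch the hole left earlier
    nodes[i].failure = nodes[fail];
  });
  return nodes[0]; // the matrix itself can now be garbage-collected
}
```

For instance, fromTransition([[0, 1], [1, 0]]) builds the otto2/otto3 pair above: state 0 loops to itself on success and moves to state 1 on failure, and vice versa.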
I'd also really like not to drag IO into this, because once introduced it can't be contained.
Is there a nice way to allocate cyclic structures dynamically like I want?
ANSWER
Answered 2021-Dec-24 at 00:37

Laziness to the rescue. We can recursively define a list of all the sub-automata, such that their transitions index into that same list:
fromTransition :: [(Int, Int)] -> Automaton
fromTransition m = a !! 0 where
  a = map (\(succ, fail) -> Auto (a !! succ) (a !! fail)) m
After all the transitions have been traversed at least once, the resulting automaton will be the cyclic graph you expect, without any reference to the matrix (and in particular, transitions will be taken in constant time).
We can also force the automaton ahead of time using seq:
fromTransition :: [(Int, Int)] -> Automaton
fromTransition m = forced `seq` (a !! 0) where
  a = map (\(succ, fail) -> Auto (a !! succ) (a !! fail)) m
  forced = foldr (\(Auto x y) r -> x `seq` y `seq` r) () a
QUESTION
Command failed with exit code 134: npm run generate
Asked 2021-Nov-11 at 18:11

I'm trying to deploy my Nuxt.js project to Netlify. The installation part works fine, but it returns an error in the build process.
I searched Google but couldn't find any solution to this problem.
I also tried this command: CI= npm run generate
3:16:42 PM: $ npm run generate
3:16:43 PM: > portfolio@1.0.0 generate
3:16:43 PM: > nuxt generate
3:16:50 PM: node: ../src/coroutine.cc:134: void* find_thread_id_key(void*): Assertion `thread_id_key != 0x7777' failed.
Aborted
3:16:50 PM:
3:16:50 PM: ────────────────────────────────────────────────────────────────
3:16:50 PM:   "build.command" failed
3:16:50 PM: ────────────────────────────────────────────────────────────────
3:16:50 PM:
3:16:50 PM:   Error message
3:16:50 PM:   Command failed with exit code 134: npm run generate
3:16:50 PM:
3:16:50 PM:   Error location
3:16:50 PM:   In Build command from Netlify app:
3:16:50 PM:   npm run generate
3:16:50 PM:
3:16:50 PM:   Resolved config
3:16:50 PM:   build:
3:16:50 PM:     command: npm run generate
3:16:50 PM:     commandOrigin: ui
3:16:50 PM:     publish: /opt/build/repo/dist
3:16:50 PM:     publishOrigin: ui
3:16:50 PM: Caching artifacts
3:16:50 PM: Started saving node modules
3:16:50 PM: Finished saving node modules
3:16:50 PM: Started saving build plugins
3:16:50 PM: Finished saving build plugins
3:16:50 PM: Started saving pip cache
3:16:50 PM: Finished saving pip cache
3:16:50 PM: Started saving emacs cask dependencies
3:16:50 PM: Finished saving emacs cask dependencies
3:16:50 PM: Started saving maven dependencies
3:16:50 PM: Finished saving maven dependencies
3:16:50 PM: Started saving boot dependencies
3:16:50 PM: Finished saving boot dependencies
3:16:50 PM: Started saving rust rustup cache
3:16:50 PM: Finished saving rust rustup cache
3:16:50 PM: Started saving go dependencies
3:16:50 PM: Finished saving go dependencies
3:16:52 PM: Build failed due to a user error: Build script returned non-zero exit code: 2
3:16:52 PM: Creating deploy upload records
3:16:52 PM: Failing build: Failed to build site
3:16:52 PM: Failed during stage 'building site': Build script returned non-zero exit code: 2
3:16:52 PM: Finished processing build request in 1m34.230804646s
Here is the Nuxt config. My target is to build a static site; as you may guess, this is my portfolio site, and I'm working on it to get a better job.
nuxt.config.js:
export default {
  // Target: https://go.nuxtjs.dev/config-target
  target: 'static',

  // Global page headers: https://go.nuxtjs.dev/config-head
  head: {
    title: 'Hasibur',
    htmlAttrs: {
      lang: 'en'
    },
    meta: [
      { charset: 'utf-8' },
      { name: 'viewport', content: 'width=device-width, initial-scale=1' },
      { hid: 'description', name: 'description', content: '' },
      { name: 'format-detection', content: 'telephone=no' }
    ],
    link: [
      { rel: 'icon', type: 'image/x-icon', href: '/favicon.ico' }
    ]
  },

  // Global CSS: https://go.nuxtjs.dev/config-css
  css: [
    '@/assets/scss/main.scss'
  ],

  // Plugins to run before rendering page: https://go.nuxtjs.dev/config-plugins
  plugins: [
    { src: '~/plugins/components.js', mode: 'client' },
    { src: '~/plugins/fontawesome.js', mode: 'client' },
  ],

  // Auto import components: https://go.nuxtjs.dev/config-components
  components: true,

  // Modules for dev and build (recommended): https://go.nuxtjs.dev/config-modules
  buildModules: [
    // https://go.nuxtjs.dev/tailwindcss
    '@nuxtjs/tailwindcss',
  ],

  // Modules: https://go.nuxtjs.dev/config-modules
  modules: [
    // https://go.nuxtjs.dev/axios
    '@nuxtjs/axios',
  ],

  // Axios module configuration: https://go.nuxtjs.dev/config-axios
  axios: {},

  // Build Configuration: https://go.nuxtjs.dev/config-build
  build: {
  }
}
ANSWER
Answered 2021-Nov-11 at 15:42

I just had the same problem and solved it thanks to this question. The problem seems to be fibers.
The steps I took to fix it:
- Uninstall fibers: npm uninstall fibers
- Delete package-lock.json & node_modules/
- Install the packages again: npm install
Simply removing fibers from package.json isn't enough, as Netlify seems to still find the package in package-lock.json.
QUESTION
CircleCI (Started 11/1/2021) Can't find Python executable "python", you can set the PYTHON env variable
Asked 2021-Nov-08 at 09:06

As of this morning, CircleCI is failing for me with this strange build error:
Can't find Python executable "python", you can set the PYTHON env variable
I noticed it on a new commit. Of course, thinking it was my new commit, I force-pushed my last known passing commit onto the main branch.
In particular, this seems to have started for me this morning (11/1), and the build is now failing on the very same commit that passed 16 hours ago (isn't that fun).
The full error is:
Can't find Python executable "python", you can set the PYTHON env variable
#!/bin/bash -eo pipefail
if [ ! -f "package.json" ]; then
  echo
  echo "---"
  echo "Unable to find your package.json file. Did you forget to set the app-dir parameter?"
  echo "---"
  echo
  echo "Current directory: $(pwd)"
  echo
  echo
  echo "List directory: "
  echo
  ls
  exit 1
fi
case yarn in
  npm)
    if [[ "false" == "true" ]]; then
      npm install
    else
      npm ci
    fi
    ;;
  yarn)
    if [[ "false" == "true" ]]; then
      yarn install
    else
      yarn install --frozen-lockfile
    fi
    ;;
esac

yarn install v1.22.15
[1/4] Resolving packages...
[2/4] Fetching packages...
info fsevents@2.3.2: The platform "linux" is incompatible with this module.
info "fsevents@2.3.2" is an optional dependency and failed compatibility check. Excluding it from installation.
info fsevents@1.2.13: The platform "linux" is incompatible with this module.
info "fsevents@1.2.13" is an optional dependency and failed compatibility check. Excluding it from installation.
[3/4] Linking dependencies...
warning " > @babel/preset-react@7.13.13" has unmet peer dependency "@babel/core@^7.0.0-0".
warning "@babel/preset-react > @babel/plugin-transform-react-display-name@7.14.2" has unmet peer dependency "@babel/core@^7.0.0-0".
warning "@babel/preset-react > @babel/plugin-transform-react-jsx@7.13.12" has unmet peer dependency "@babel/core@^7.0.0-0".
warning "@babel/preset-react > @babel/plugin-transform-react-jsx-development@7.12.17" has unmet peer dependency "@babel/core@^7.0.0-0".
warning "@babel/preset-react > @babel/plugin-transform-react-pure-annotations@7.12.1" has unmet peer dependency "@babel/core@^7.0.0-0".
warning "@babel/preset-react > @babel/plugin-transform-react-jsx > @babel/plugin-syntax-jsx@7.12.13" has unmet peer dependency "@babel/core@^7.0.0-0".
warning " > @reactchartjs/react-chart.js@1.0.0-rc.4" has incorrect peer dependency "chart.js@^2.3".
warning " > styled-components@5.3.0" has unmet peer dependency "react-is@>= 16.8.0".
warning " > webpack-dev-server@3.11.2" has unmet peer dependency "webpack@^4.0.0 || ^5.0.0".
warning "webpack-dev-server > webpack-dev-middleware@3.7.3" has unmet peer dependency "webpack@^4.0.0 || ^5.0.0".
[4/4] Building fresh packages...
error /home/circleci/project/node_modules/node-sass: Command failed.
Exit code: 1
Command: node scripts/build.js
Arguments:
Directory: /home/circleci/project/node_modules/node-sass
Output:
Building: /usr/local/bin/node /home/circleci/project/node_modules/node-gyp/bin/node-gyp.js rebuild --verbose --libsass_ext= --libsass_cflags= --libsass_ldflags= --libsass_library=
gyp info it worked if it ends with ok
gyp verb cli [
gyp verb cli '/usr/local/bin/node',
gyp verb cli '/home/circleci/project/node_modules/node-gyp/bin/node-gyp.js',
gyp verb cli 'rebuild',
gyp verb cli '--verbose',
gyp verb cli '--libsass_ext=',
gyp verb cli '--libsass_cflags=',
gyp verb cli '--libsass_ldflags=',
gyp verb cli '--libsass_library='
gyp verb cli ]
gyp info using node-gyp@3.8.0
gyp info using node@16.13.0 | linux | x64
gyp verb command rebuild []
gyp verb command clean []
gyp verb clean removing "build" directory
gyp verb command configure []
gyp verb check python checking for Python executable "python2" in the PATH
gyp verb `which` failed Error: not found: python2
gyp verb `which` failed at getNotFoundError (/home/circleci/project/node_modules/which/which.js:13:12)
gyp verb `which` failed at F (/home/circleci/project/node_modules/which/which.js:68:19)
gyp verb `which` failed at E (/home/circleci/project/node_modules/which/which.js:80:29)
gyp verb `which` failed at /home/circleci/project/node_modules/which/which.js:89:16
gyp verb `which` failed at /home/circleci/project/node_modules/isexe/index.js:42:5
gyp verb `which` failed at /home/circleci/project/node_modules/isexe/mode.js:8:5
gyp verb `which` failed at FSReqCallback.oncomplete (node:fs:198:21)
gyp verb `which` failed python2 Error: not found: python2
gyp verb `which` failed at getNotFoundError (/home/circleci/project/node_modules/which/which.js:13:12)
gyp verb `which` failed at F (/home/circleci/project/node_modules/which/which.js:68:19)
gyp verb `which` failed at E (/home/circleci/project/node_modules/which/which.js:80:29)
gyp verb `which` failed at /home/circleci/project/node_modules/which/which.js:89:16
gyp verb `which` failed at /home/circleci/project/node_modules/isexe/index.js:42:5
gyp verb `which` failed at /home/circleci/project/node_modules/isexe/mode.js:8:5
gyp verb `which` failed at FSReqCallback.oncomplete (node:fs:198:21) {
gyp verb `which` failed code: 'ENOENT'
gyp verb `which` failed }
gyp verb check python checking for Python executable "python" in the PATH
gyp verb `which` failed Error: not found: python
gyp verb `which` failed at getNotFoundError (/home/circleci/project/node_modules/which/which.js:13:12)
gyp verb `which` failed at F (/home/circleci/project/node_modules/which/which.js:68:19)
gyp verb `which` failed at E (/home/circleci/project/node_modules/which/which.js:80:29)
gyp verb `which` failed at /home/circleci/project/node_modules/which/which.js:89:16
gyp verb `which` failed at /home/circleci/project/node_modules/isexe/index.js:42:5
gyp verb `which` failed at /home/circleci/project/node_modules/isexe/mode.js:8:5
gyp verb `which` failed at FSReqCallback.oncomplete (node:fs:198:21)
gyp verb `which` failed python Error: not found: python
gyp verb `which` failed at getNotFoundError (/home/circleci/project/node_modules/which/which.js:13:12)
gyp verb `which` failed at F (/home/circleci/project/node_modules/which/which.js:68:19)
gyp verb `which` failed at E (/home/circleci/project/node_modules/which/which.js:80:29)
gyp verb `which` failed at /home/circleci/project/node_modules/which/which.js:89:16
gyp verb `which` failed at /home/circleci/project/node_modules/isexe/index.js:42:5
gyp verb `which` failed at /home/circleci/project/node_modules/isexe/mode.js:8:5
gyp verb `which` failed at FSReqCallback.oncomplete (node:fs:198:21) {
gyp verb `which` failed code: 'ENOENT'
gyp verb `which` failed }
gyp ERR! configure error
gyp ERR! stack Error: Can't find Python executable "python", you can set the PYTHON env variable.
gyp ERR! stack at PythonFinder.failNoPython (/home/circleci/project/node_modules/node-gyp/lib/configure.js:484:19)
gyp ERR! stack at PythonFinder.<anonymous> (/home/circleci/project/node_modules/node-gyp/lib/configure.js:406:16)
gyp ERR! stack at F (/home/circleci/project/node_modules/which/which.js:68:16)
gyp ERR! stack at E (/home/circleci/project/node_modules/which/which.js:80:29)
gyp ERR! stack at /home/circleci/project/node_modules/which/which.js:89:16
gyp ERR! stack at /home/circleci/project/node_modules/isexe/index.js:42:5
gyp ERR! stack at /home/circleci/project/node_modules/isexe/mode.js:8:5
gyp ERR! stack at FSReqCallback.oncomplete (node:fs:198:21)
gyp ERR! System Linux 4.15.0-1110-aws
gyp ERR! command "/usr/local/bin/node" "/home/circleci/project/node_modules/node-gyp/bin/node-gyp.js" "rebuild" "--verbose" "--libsass_ext=" "--libsass_cflags=" "--libsass_ldflags=" "--libsass_library="
gyp ERR! cwd /home/circleci/project/node_modules/node-sass
gyp ERR! node -v v16.13.0
gyp ERR! node-gyp -v v3.8.0
gyp ERR! not ok
Build failed with error code: 1
info Visit https://yarnpkg.com/en/docs/cli/install for documentation about this command.

Exited with code exit status 1
CircleCI received exit code 1
My circleci config file (which has not changed) is:
version: 2.1 # Use 2.1 to enable using orbs and other features.

# Declare the orbs that we'll use in our config.
# read more about orbs: https://circleci.com/docs/2.0/using-orbs/
orbs:
  ruby: circleci/ruby@1.0
  node: circleci/node@2

jobs:
  build: # our first job, named "build"
    docker:
      - image: circleci/ruby:2.7.4-node-browsers # use a tailored CircleCI docker image.
        auth:
          username: mydockerhub-user
          password: $DOCKERHUB_PASSWORD # context / project UI env-var reference
    steps:
      - checkout # pull down our git code.
      - ruby/install-deps # use the ruby orb to install dependencies
      # use the node orb to install our packages
      # specifying that we use `yarn` and to cache dependencies with `yarn.lock`
      # learn more: https://circleci.com/docs/2.0/caching/
      - node/install-packages:
          pkg-manager: yarn
          cache-key: "yarn.lock"

  test: # our next job, called "test"
ANSWER
Answered 2021-Nov-08 at 09:06

Try using a next-generation Ruby image. In your case, change circleci/ruby:2.7.4-node-browsers to cimg/ruby:2.7.4-browsers. You can find the full list of images here.
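Concretely, the change is just the image line in the job's docker section; the rest of the config stays as posted (sketch below, auth fields copied from the question):

```yaml
jobs:
  build:
    docker:
      - image: cimg/ruby:2.7.4-browsers   # was: circleci/ruby:2.7.4-node-browsers
        auth:
          username: mydockerhub-user
          password: $DOCKERHUB_PASSWORD
```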
QUESTION
how to change create-react-app PWA to network-first
Asked 2021-Oct-30 at 21:41

I have a React app with PWA support, and I want to change the caching strategy to network-first, but I have no idea how to do so. I have read many articles about it, but none of them actually tells you how to do it. Here is my code below, and I appreciate any help with it:
index.js:
import React from 'react';
import ReactDOM from 'react-dom';

import App from './App';
import * as serviceWorkerRegistration from './serviceWorkerRegistration';
import reportWebVitals from './reportWebVitals';

ReactDOM.render(
  <React.StrictMode>
    <App />
  </React.StrictMode>,
  document.getElementById('root')
);

// If you want your app to work offline and load faster, you can change
// unregister() to register() below. Note this comes with some pitfalls.
// Learn more about service workers: https://cra.link/PWA
serviceWorkerRegistration.register();

// If you want to start measuring performance in your app, pass a function
// to log results (for example: reportWebVitals(console.log))
// or send to an analytics endpoint. Learn more:
reportWebVitals();
service-worker.js:
/* eslint-disable no-restricted-globals */

// This service worker can be customized!
// See https://developers.google.com/web/tools/workbox/modules
// for the list of available Workbox modules, or add any other
// code you'd like.
// You can also remove this file if you'd prefer not to use a
// service worker, and the Workbox build step will be skipped.

import { clientsClaim } from 'workbox-core';
import { ExpirationPlugin } from 'workbox-expiration';
import { precacheAndRoute, createHandlerBoundToURL } from 'workbox-precaching';
import { registerRoute } from 'workbox-routing';
import { StaleWhileRevalidate } from 'workbox-strategies';

clientsClaim();

// Precache all of the assets generated by your build process.
// Their URLs are injected into the manifest variable below.
// This variable must be present somewhere in your service worker file,
// even if you decide not to use precaching. See https://cra.link/PWA
precacheAndRoute(self.__WB_MANIFEST);

// Set up App Shell-style routing, so that all navigation requests
// are fulfilled with your index.html shell. Learn more at
// https://developers.google.com/web/fundamentals/architecture/app-shell
const fileExtensionRegexp = new RegExp('/[^/?]+\\.[^/]+$');
registerRoute(
  // Return false to exempt requests from being fulfilled by index.html.
  ({ request, url }) => {
    // If this isn't a navigation, skip.
    if (request.mode !== 'navigate') {
      return false;
    }
    // If this is a URL that starts with /_, skip.
    if (url.pathname.startsWith('/_')) {
      return false;
    }
    // If this looks like a URL for a resource, because it contains
    // a file extension, skip.
    if (url.pathname.match(fileExtensionRegexp)) {
      return false;
    }
    // Return true to signal that we want to use the handler.
    return true;
  },
  createHandlerBoundToURL(process.env.PUBLIC_URL + '/index.html')
);

// An example runtime caching route for requests that aren't handled by the
// precache, in this case same-origin .png requests like those from in public/
registerRoute(
  // Add in any other file extensions or routing criteria as needed.
  ({ url }) => url.origin === self.location.origin && url.pathname.endsWith('.png'),
  // Customize this strategy as needed, e.g., by changing to CacheFirst.
  new StaleWhileRevalidate({
    cacheName: 'images',
    plugins: [
      // Ensure that once this runtime cache reaches a maximum size the
      // least-recently used images are removed.
      new ExpirationPlugin({ maxEntries: 50 }),
    ],
  })
);

// This allows the web app to trigger skipWaiting via
// registration.waiting.postMessage({type: 'SKIP_WAITING'})
self.addEventListener('message', (event) => {
  if (event.data && event.data.type === 'SKIP_WAITING') {
    self.skipWaiting();
  }
});

// Any other custom service worker logic can go here.
index.js, service-worker.js, and serviceWorkerRegistration.js:
// index.js
import React from 'react';
import ReactDOM from 'react-dom';

import App from './App';
import * as serviceWorkerRegistration from './serviceWorkerRegistration';
import reportWebVitals from './reportWebVitals';

ReactDOM.render(
  <React.StrictMode>
    <App />
  </React.StrictMode>,
  document.getElementById('root')
);

// If you want your app to work offline and load faster, you can change
// unregister() to register() below. Note this comes with some pitfalls.
// Learn more about service workers: https://cra.link/PWA
serviceWorkerRegistration.register();

// If you want to start measuring performance in your app, pass a function
// to log results (for example: reportWebVitals(console.log))
// or send to an analytics endpoint. Learn more:
reportWebVitals();

// service-worker.js
/* eslint-disable no-restricted-globals */

// This service worker can be customized!
// See https://developers.google.com/web/tools/workbox/modules
// for the list of available Workbox modules, or add any other
// code you'd like.
// You can also remove this file if you'd prefer not to use a
// service worker, and the Workbox build step will be skipped.

import { clientsClaim } from 'workbox-core';
import { ExpirationPlugin } from 'workbox-expiration';
import { precacheAndRoute, createHandlerBoundToURL } from 'workbox-precaching';
import { registerRoute } from 'workbox-routing';
import { StaleWhileRevalidate } from 'workbox-strategies';

clientsClaim();

// Precache all of the assets generated by your build process.
// Their URLs are injected into the manifest variable below.
// This variable must be present somewhere in your service worker file,
// even if you decide not to use precaching. See https://cra.link/PWA
precacheAndRoute(self.__WB_MANIFEST);

// Set up App Shell-style routing, so that all navigation requests
// are fulfilled with your index.html shell. Learn more at
// https://developers.google.com/web/fundamentals/architecture/app-shell
const fileExtensionRegexp = new RegExp('/[^/?]+\\.[^/]+$');
registerRoute(
  // Return false to exempt requests from being fulfilled by index.html.
  ({ request, url }) => {
    // If this isn't a navigation, skip.
    if (request.mode !== 'navigate') {
      return false;
    }
    // If this is a URL that starts with /_, skip.
    if (url.pathname.startsWith('/_')) {
      return false;
    }
    // If this looks like a URL for a resource, because it contains
    // a file extension, skip.
    if (url.pathname.match(fileExtensionRegexp)) {
      return false;
    }
    // Return true to signal that we want to use the handler.
    return true;
  },
  createHandlerBoundToURL(process.env.PUBLIC_URL + '/index.html')
);

// An example runtime caching route for requests that aren't handled by the
// precache, in this case same-origin .png requests like those from in public/
registerRoute(
  // Add in any other file extensions or routing criteria as needed.
  ({ url }) => url.origin === self.location.origin && url.pathname.endsWith('.png'),
  // Customize this strategy as needed, e.g., by changing to CacheFirst.
  new StaleWhileRevalidate({
    cacheName: 'images',
    plugins: [
      // Ensure that once this runtime cache reaches a maximum size the
      // least-recently used images are removed.
      new ExpirationPlugin({ maxEntries: 50 }),
    ],
  })
);

// This allows the web app to trigger skipWaiting via
// registration.waiting.postMessage({type: 'SKIP_WAITING'})
self.addEventListener('message', (event) => {
  if (event.data && event.data.type === 'SKIP_WAITING') {
    self.skipWaiting();
  }
});

// Any other custom service worker logic can go here.

// serviceWorkerRegistration.js
// This optional code is used to register a service worker.
// register() is not called by default.

// This lets the app load faster on subsequent visits in production, and gives
// it offline capabilities. However, it also means that developers (and users)
// will only see deployed updates on subsequent visits to a page, after all the
// existing tabs open on the page have been closed, since previously cached
// resources are updated in the background.

// To learn more about the benefits of this model and instructions on how to
// opt-in, read https://cra.link/PWA

const isLocalhost = Boolean(
  window.location.hostname === 'localhost' ||
    // [::1] is the IPv6 localhost address.
    window.location.hostname === '[::1]' ||
    // 127.0.0.0/8 are considered localhost for IPv4.
    window.location.hostname.match(/^127(?:\.(?:25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)){3}$/)
);

export function register(config) {
  if (process.env.NODE_ENV === 'production' && 'serviceWorker' in navigator) {
    // The URL constructor is available in all browsers that support SW.
    const publicUrl = new URL(process.env.PUBLIC_URL, window.location.href);
    if (publicUrl.origin !== window.location.origin) {
      // Our service worker won't work if PUBLIC_URL is on a different origin
      // from what our page is served on. This might happen if a CDN is used to
      // serve assets; see https://github.com/facebook/create-react-app/issues/2374
      return;
    }

    window.addEventListener('load', () => {
      const swUrl = `${process.env.PUBLIC_URL}/service-worker.js`;

      if (isLocalhost) {
        // This is running on localhost. Let's check if a service worker still exists or not.
        checkValidServiceWorker(swUrl, config);

        // Add some additional logging to localhost, pointing developers to the
        // service worker/PWA documentation.
        navigator.serviceWorker.ready.then(() => {
          console.log(
            'This web app is being served cache-first by a service ' +
              'worker. To learn more, visit https://cra.link/PWA'
          );
        });
      } else {
        // Is not localhost. Just register service worker
        registerValidSW(swUrl, config);
      }
    });
  }
}

function registerValidSW(swUrl, config) {
  navigator.serviceWorker
    .register(swUrl)
    .then((registration) => {
      registration.onupdatefound = () => {
        const installingWorker = registration.installing;
        if (installingWorker == null) {
          return;
        }
        installingWorker.onstatechange = () => {
          if (installingWorker.state === 'installed') {
            if (navigator.serviceWorker.controller) {
              // At this point, the updated precached content has been fetched,
              // but the previous service worker will still serve the older
              // content until all client tabs are closed.
              console.log(
                'New content is available and will be used when all ' +
                  'tabs for this page are closed. See https://cra.link/PWA.'
              );

              // Execute callback
              if (config && config.onUpdate) {
                config.onUpdate(registration);
              }
            } else {
              // At this point, everything has been precached.
              // It's the perfect time to display a
              // "Content is cached for offline use." message.
              console.log('Content is cached for offline use.');

              // Execute callback
              if (config && config.onSuccess) {
                config.onSuccess(registration);
              }
            }
          }
        };
      };
    })
    .catch((error) => {
      console.error('Error during service worker registration:', error);
    });
}

function checkValidServiceWorker(swUrl, config) {
  // Check if the service worker can be found. If it can't, reload the page.
  fetch(swUrl, {
    headers: { 'Service-Worker': 'script' },
  })
    .then((response) => {
      // Ensure the service worker exists, and that we really are getting a JS file.
      const contentType = response.headers.get('content-type');
      if (
        response.status === 404 ||
        (contentType != null && contentType.indexOf('javascript') === -1)
      ) {
        // No service worker found. Probably a different app. Reload the page.
        navigator.serviceWorker.ready.then((registration) => {
          registration.unregister().then(() => {
            window.location.reload();
          });
        });
      } else {
        // Service worker found. Proceed as normal.
        registerValidSW(swUrl, config);
      }
    })
    .catch(() => {
      console.log('No internet connection found. App is running in offline mode.');
    });
}

export function unregister() {
  if ('serviceWorker' in navigator) {
    navigator.serviceWorker.ready
      .then((registration) => {
        registration.unregister();
      })
      .catch((error) => {
        console.error(error.message);
      });
  }
}
ANSWER
Answered 2021-Oct-30 at 21:41
The solution to my problem was covered in this article about PWA caching strategies: https://jakearchibald.com/2014/offline-cookbook/#network-falling-back-to-cache
What I had to do was add this piece of code to the end of my service-worker.js file:
self.addEventListener('fetch', (event) => {
  event.respondWith(async function () {
    try {
      // Network first...
      return await fetch(event.request);
    } catch (err) {
      // ...falling back to the cache when the network is unavailable.
      return caches.match(event.request);
    }
  }());
});
You can also find code implementing the other strategies in the article.
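For instance, the inverse pattern from the same cookbook, "cache falling back to network", is an equally small handler. The sketch below factors the decision logic into a plain function (the function name and injected lookups are illustrative) so it can be reasoned about outside a service worker context:

```javascript
// Sketch of "cache falling back to network": try the cache first and only
// hit the network on a miss. The cache/network lookups are injected as
// plain async functions so the strategy itself stays testable.
async function cacheFallingBackToNetwork(request, matchCache, fetchNetwork) {
  const cached = await matchCache(request);
  // caches.match() resolves to undefined on a miss.
  return cached !== undefined ? cached : fetchNetwork(request);
}

// In a real service worker it would be wired up like this:
// self.addEventListener('fetch', (event) => {
//   event.respondWith(
//     cacheFallingBackToNetwork(event.request, (r) => caches.match(r), fetch)
//   );
// });
```

Whether network-first or cache-first is right depends on how stale a response you can tolerate; the article discusses the trade-off in detail.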
QUESTION
Does .NET Framework have an OS-independent global DNS cache?
Asked 2021-Oct-15 at 12:00
First of all, I've tried all recommendations from C# DNS-related SO threads and other internet articles - messing with ServicePointManager/ServicePoint settings, setting automatic request connection close via HTTP headers, changing connection lease times - nothing helped. It seems like all those settings are intended for fixing DNS issues in long-running processes (like web services). It would even make sense for a process to have its own DNS cache to minimize DNS queries or OS DNS cache reads. But that's not my case.
The problem
Our production infrastructure uses HA (high-availability) DNS for swapping server nodes during maintenance or functional problems. And it's built in such a way that in some places we have multiple CNAME records which in fact point to the same HA A record, like this:
- eu.site1.myprodserver.com (CNAME) > eu.ha.myprodserver.com (A)
- eu.site2.myprodserver.com (CNAME) > eu.ha.myprodserver.com (A)
The TTL of all these records is 60 seconds. So when the European node is in trouble or maintenance, the A-record switches to the IP address of some other node.
Then we have a monitoring utility which is executed once every 5 minutes and uses both site1 and site2. For it to work properly, both names must point to the same DC, because data sync between DCs doesn't happen that fast. Since both CNAMEs are in fact linked to the same A record with a short TTL, at first glance it seems like nothing can go wrong. But it turns out it can.
The utility is written in C# for .NET Framework 4.7.2 and uses the HttpClient class to perform requests to both sites. Yeah, it's that class again.
We have noticed that when a server node switch occurs, the utility often starts acting as if site1 and site2 were in different DCs. The pattern of its behavior in such moments is completely deterministic, so it's not like it gets confused somewhere in the middle of the process - it incorrectly resolves one or both of these addresses from the very start.
I've made another much simpler utility which just sends one GET-request to site1 and then started intentionally switching nodes on and off and running this utility to see which DC would serve its request. And the results were very frustrating.
Despite the Windows DNS cache already being updated (checked via ipconfig and the Get-DnsClientCache cmdlet), and despite the records' overall TTL of 60 seconds, HttpClient keeps sending requests to the old IP address, sometimes for another 15-20 minutes. Even when I completely shut down the "outdated" application server, the utility kept trying to connect to it, so even connection failures don't wake it up.
It becomes even more frustrating if you start running ipconfig /flushdns in between utility runs. Sometimes after flushdns the utility realizes that the IP has changed. But as soon as you run another flushdns (or maybe it's not even needed - I haven't figured this out with 100% certainty) and run the utility again, it goes back to the old address! Unbelievable!
And it gets even more frustrating. If you resolve the IP address from within the same utility using the Dns.GetHostEntry method (which uses the cache, as per this comment) right before calling HttpClient, the resolve result is correct... But HttpClient would still connect to an IP address seemingly of its own independent choice. So HttpClient somehow does not seem to rely on built-in .NET Framework DNS resolving.
So the questions are:
- Where does a newly created .NET Framework process take those cached DNS results from?
- Even if there is some kind of a mystical global .NET-specific DNS cache, then why does it absolutely ignore TTL?
- How is it possible at all that it reverts to the outdated old IP address after it has already once "understood" that the address has changed?
P.S. I have worked this all around by implementing a custom HttpClientHandler which performs DNS queries on each hostname's first usage thus it's independent from external DNS caches (except for caching at intermediate DNS servers which also affects things to some extent). But that was a little tricky in terms of TLS certificates validation and the final solution does not seem to be production ready - but we use it for monitoring only so for us it's OK. If anyone is interested in this, I'll show the class code which somewhat resembles this answer's example.
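The core idea of that workaround - resolve each hostname yourself and honor the record's TTL rather than trusting an opaque intermediate cache - can be sketched language-agnostically. The class name and the lookup callback below are illustrative, not the actual C# implementation; the clock is injectable so expiry is testable:

```javascript
// A TTL-respecting resolver cache. `lookup(host)` stands in for the real DNS
// query and must resolve to { address, ttlSeconds }; it is only invoked when
// no fresh cached entry exists for the hostname.
class TtlDnsCache {
  constructor(now = () => Date.now()) {
    this.now = now;           // injectable clock, handy for testing
    this.entries = new Map(); // hostname -> { address, expiresAt }
  }

  async resolve(host, lookup) {
    const entry = this.entries.get(host);
    if (entry && entry.expiresAt > this.now()) {
      return entry.address; // still within the record's TTL: reuse
    }
    const { address, ttlSeconds } = await lookup(host);
    this.entries.set(host, {
      address,
      expiresAt: this.now() + ttlSeconds * 1000,
    });
    return address;
  }
}
```

The point is that the cache lifetime comes from the DNS record itself, which is exactly what the misbehaving proxies (below) were overriding with their own 1800-second minimum.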
Update 2021-10-08
The utility works from behind a corporate proxy. In fact there are multiple proxies for load balancing. So I am now also in the process of verifying the following:
- If the DNS resolving is performed by the proxies and they don't respect the TTL or if they cache (keep alive) TCP connections by hostnames - this would explain the whole problem
- If it's possible that different proxies handle HTTP requests on different runs of the utility - this would answer the most frustrating question #3
The answer to "Does .NET Framework have an OS-independent global DNS cache?" is NO. The HttpClient class, and .NET Framework in general, had nothing to do with any of this. I've posted my investigation results as an accepted answer.
ANSWER
Answered 2021-Oct-14 at 21:32
HttpClient, please forgive me! It was not your fault!
Well, this investigation was huge. And I'll have to split the answer into two parts since there turned out to be two unconnected problems.
1. The proxy server problem
As I said, the utility was being tested from behind a corporate proxy. In case you didn't know (I didn't until recently), when you use a proxy server it's not your machine performing DNS queries - it's the proxy server doing it for you.
I made some measurements to find out how long the utility keeps connecting to the wrong DC after the DNS record switch. The answer was a fantastically exact 30 minutes. This experiment also clearly showed that the local Windows DNS cache has nothing to do with it: those 30 minutes started at exactly the point when the proxy server was "waking up" (finally starting to send HTTP requests to the right DC).
The exact number of 30 minutes has helped one of our administrators to finally figure out that the proxy servers have a configuration parameter of minimal DNS TTL which is set to 1800 seconds by default. So the proxies have their own DNS cache. These are hardware Cisco proxies and the admin has also noted that this parameter is "hidden quite deeply" and is not even mentioned in the user manual.
As soon as the minimal proxies' DNS TTL was changed from 1800 seconds to 1 second (yeah, admins have no mercy) the issue stopped reproducing on my machine.
But what about "forgetting" the just-learned correct IP address and falling back to the old one?
Well. As I also said, there are several proxies. There is a single corporate proxy DNS name, but if you run nslookup for it, it shows multiple IPs behind it. Each time the proxy server's IP address is resolved (for example, when the local cache expires) there's a fair chance you'll jump onto another proxy server.
And that's exactly what ipconfig /flushdns had been doing to me. As soon as I started addressing the proxy servers by their direct IP addresses instead of their common DNS name, I found that different proxies may easily route identical requests to different DCs. That's because some of them have those 30-minute-cached DNS records while others have to perform resolving.
Unfortunately, after the proxies theory has been proven, another news came in: the production monitoring servers are placed outside of the corporate network and they do not use any proxy servers. So here we go...
2. The short TTL and public DNS servers problem
The monitoring servers are configured to use Google's 8.8.8.8 and 8.8.4.4 DNS servers. Resolve responses for our short-lived DNS records from these servers are somewhat weird:
- The returned TTL of CNAME records swings around the 1-hour mark. It gradually decreases for several minutes and then jumps back to 3600 seconds, and so on.
- The returned TTL of the root A record is almost always exactly 60 seconds. I occasionally received various numbers less than 60, but there was no obvious, humanly perceivable logic to them. So it seems like these IP addresses in fact point to balancers that distribute requests between multiple similar DNS servers which are not synced with each other (and each of which has its own cache).
Windows is not stupid and according to my experiments it doesn't care about CNAME's TTL and only cares about the root A-record TTL, so its client cache even for CNAME records is never assigned a TTL higher than 60 seconds.
But due to the inconsistency (or in some sense over-consistency?) of the A-record TTL which Google's servers return (unpredictable 0-60 seconds) the Windows local cache gets confused. There were two facts which demonstrated it:
- Multiple calls to Resolve-DnsName for site1 and site2 over several minutes, with random pauses between them, eventually led to Get-DnsClientCache showing local cache TTLs for the two site names that diverged by up to 15 seconds. This is a big enough difference to sometimes mess things up. And that was just a short experiment, so I'm quite sure the gap can actually get bigger.
- Executing Invoke-WebRequest against each of the sites one right after another, once every 3-5 seconds while switching the DNS records, twice put me in a situation where the requests went to different DCs.
Conclusion?
The latter experiment had one strange detail I can't explain: calling Get-DnsClientCache after Invoke-WebRequest shows no records in the local cache for the just-requested site names. But the problem has clearly been reproduced anyway.
It will take time to see whether my workaround with real-time DNS resolving brings any improvement. Unfortunately, I don't believe it will - the DNS servers used in production (which would eventually be used by the monitoring utility for real-time IP resolving) are public Google DNS, which are not reliable in my case.
And one thing which is worse than an intermittently failing monitoring utility is that real-world users are also relying on public DNS servers and they definitely do face problems during our maintenance works or significant failures.
So have we learned anything out of all this?
- Maybe a short DNS TTL is generally a bad practice?
- Maybe we should install additional routers, assign them static IPs, attach the DNS names to them and then route traffic internally between our DCs to finally stop relying on DNS records changing?
- Or maybe public DNS servers are doing a bad job?
- Or maybe the technological singularity is closer than we think?
I have no idea. But it's quite possible that "yes" is the right answer to all of these questions.
However, there is one thing we surely have learned: network hardware manufacturers should write their documentation better.
QUESTION
Spark "first" Window function is taking much longer than "last"
Asked 2021-Sep-24 at 02:46
I'm working on a pyspark routine to interpolate the missing values in a configuration table.
Imagine a table of configuration values whose ranks go from 0 to 50,000. The user specifies a few data points in between (say at 0, 50, 100, 500, 2000, 50,000) and we interpolate the remainder. My solution follows this blog post quite closely, except I'm not using any UDFs.
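Outside Spark, the interpolation itself is simple; here is a plain-JavaScript sketch over a hypothetical list of anchor points (illustrative only - the actual routine does this with window functions):

```javascript
// Linearly interpolate a value at `rank` from sorted user-supplied anchors.
// anchors: array of { rank, value }, sorted ascending by rank.
function interpolateAt(anchors, rank) {
  let prev = anchors[0];
  for (const a of anchors) {
    if (a.rank === rank) return a.value; // exact anchor: no interpolation
    if (a.rank > rank) {
      // Standard linear interpolation: y = y1 + ((y2-y1)/(x2-x1)) * (x-x1)
      return (
        prev.value +
        ((a.value - prev.value) / (a.rank - prev.rank)) * (rank - prev.rank)
      );
    }
    prev = a;
  }
  return prev.value; // past the last anchor: hold the last value
}
```

The Spark version has to express the "find the previous and next anchor" lookups as window functions, which is where the trouble begins.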
In troubleshooting the performance of this routine (it takes ~3 minutes), I found that one particular window function is taking all of the time, and everything else I'm doing takes mere seconds.
Here is the main area of interest - where I use window functions to fill in the previous and next user-supplied configuration values:
from pyspark.sql import Window, functions as F

# Create partition windows that are required to generate new rows from the ones provided
win_last = Window.partitionBy('PORT_TYPE', 'loss_process').orderBy('rank').rowsBetween(Window.unboundedPreceding, 0)
win_next = Window.partitionBy('PORT_TYPE', 'loss_process').orderBy('rank').rowsBetween(0, Window.unboundedFollowing)

# Join back in the provided config table to populate the "known" scale factors
df_part1 = (df_scale_factors_template
    .join(df_users_config, ['PORT_TYPE', 'loss_process', 'rank'], 'leftouter')
    # Add computed columns that can look up the prior config and next config for each missing value
    .withColumn('last_rank', F.last(F.col('rank'), ignorenulls=True).over(win_last))
    .withColumn('last_sf', F.last(F.col('scale_factor'), ignorenulls=True).over(win_last))
).cache()
debug_log_dataframe(df_part1, 'df_part1')  # Force a .count() and time Part 1

df_part2 = (df_part1
    .withColumn('next_rank', F.first(F.col('rank'), ignorenulls=True).over(win_next))
    .withColumn('next_sf', F.first(F.col('scale_factor'), ignorenulls=True).over(win_next))
).cache()
debug_log_dataframe(df_part2, 'df_part2')  # Force a .count() and time Part 2

df_part3 = (df_part2
    # Implements standard linear interpolation: y = y1 + ((y2-y1)/(x2-x1)) * (x-x1)
    .withColumn('scale_factor',
        F.when(F.col('last_rank') == F.col('next_rank'), F.col('last_sf'))  # Handle div/0 case
        .otherwise(F.col('last_sf') + ((F.col('next_sf') - F.col('last_sf')) / (F.col('next_rank') - F.col('last_rank'))) * (F.col('rank') - F.col('last_rank'))))
    .select('PORT_TYPE', 'loss_process', 'rank', 'scale_factor')
).cache()
debug_log_dataframe(df_part3, 'df_part3', explain=True)  # Force a .count() and time Part 3
The above used to be a single chained dataframe statement, but I've since split it into 3 parts so that I could isolate the part that's taking so long. The results are:
Part 1: Generated 8 columns and 300006 rows in 0.65 seconds
Part 2: Generated 10 columns and 300006 rows in 189.55 seconds
Part 3: Generated 4 columns and 300006 rows in 0.24 seconds
Why do my calls to first() over Window.unboundedFollowing take so much longer than last() over Window.unboundedPreceding?
Some notes to head off questions / concerns:
- debug_log_dataframe is just a helper function to force the dataframe execution/cache with a .count() and time it to yield the above logs.
- We're actually operating on 6 config tables of 50001 rows at once (hence the partitioning and row count).
- As a sanity check, I've ruled out the effects of cache() reuse by explicitly unpersist()ing before timing subsequent runs - I'm quite confident in the above measurements.
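The helper itself isn't shown; a minimal sketch consistent with its described behavior and the log format above (the exact signature and wording are my assumptions) could be:

```python
import time

def debug_log_dataframe(df, name, explain=False):
    """Hypothetical reconstruction: force execution with .count() and time it."""
    if explain:
        df.explain()  # print the physical plan before timing
    start = time.perf_counter()
    rows = df.count()  # materializes the (cached) dataframe
    elapsed = time.perf_counter() - start
    print(f"{name}: Generated {len(df.columns)} columns "
          f"and {rows} rows in {elapsed:.2f} seconds")
```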
Physical Plan:
To help answer this question, I call explain() on the result of part3 to confirm, among other things, that caching is having the desired effect. Here it is annotated to highlight the problem area:
The only differences I can see are that:
- The first two calls (to last) show RunningWindowFunction, whereas the calls for the next_* columns just read Window.
- Part 1 had a *(3) next to it, but Part 2 does not.
Some things I tried:
- I tried further splitting part 2 into separate dataframes - the result is that each first statement takes half of the total time (~98 seconds).
- I tried reversing the order in which I generate these columns (e.g. placing the calls to 'last' after the calls to 'first') but there's no difference. Whichever dataframe ends up containing the calls to first is the slow one.
I feel like I've done as much digging as I can and am kind of hoping a spark expert will take one look and know where this time is coming from.
ANSWER
Answered 2021-Sep-24 at 02:46
The solution that doesn't answer the question
In trying various things to speed up my routine, it occurred to me to try rewriting my usages of first() to just be usages of last() with a reversed sort order.
So rewriting this:
win_next = (Window.partitionBy('PORT_TYPE', 'loss_process')
            .orderBy('rank').rowsBetween(0, Window.unboundedFollowing))

df_part2 = (df_part1
    .withColumn('next_rank', F.first(F.col('rank'), ignorenulls=True).over(win_next))
    .withColumn('next_sf', F.first(F.col('scale_factor'), ignorenulls=True).over(win_next))
)
As this:
win_next = (Window.partitionBy('PORT_TYPE', 'loss_process')
            .orderBy(F.desc('rank')).rowsBetween(Window.unboundedPreceding, 0))

df_part2 = (df_part1
    .withColumn('next_rank', F.last(F.col('rank'), ignorenulls=True).over(win_next))
    .withColumn('next_sf', F.last(F.col('scale_factor'), ignorenulls=True).over(win_next))
)
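The equivalence being exploited here can be sanity-checked outside Spark (my own plain-Python sketch): a running last-non-null computed over the descending-rank order, read back in ascending order, matches first-non-null over the rows that follow in ascending order.

```python
def forward_fill(values):
    # running last-non-null over rowsBetween(unboundedPreceding, 0)
    out, last = [], None
    for v in values:
        if v is not None:
            last = v
        out.append(last)
    return out

def next_fill(values):
    # first-non-null over rowsBetween(0, unboundedFollowing)
    out, nxt = [None] * len(values), None
    for i in range(len(values) - 1, -1, -1):
        if values[i] is not None:
            nxt = values[i]
        out[i] = nxt
    return out

data = [None, 2, None, None, 5, None]  # scale factors ordered by ascending rank
# last() over the reversed (descending-rank) order, un-reversed again,
# gives the same column as first() over unboundedFollowing:
assert forward_fill(data[::-1])[::-1] == next_fill(data)
```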
Much to my amazement, this actually solved the performance problem, and now the entire dataframe is generated in just 3 seconds. I'm pleased, but still vexed.
As I somewhat predicted, the query plan now includes a new SORT step before creating these next two columns, and they've changed from Window to RunningWindowFunction, like the first two. Here's the new plan (without the code broken up into 3 separate cached parts anymore, because that was just to troubleshoot performance):
As for the question:
Why do my calls to first() over Window.unboundedFollowing take so much longer than last() over Window.unboundedPreceding?
I'm hoping someone can still answer this, for academic reasons.
Community Discussions contain sources that include Stack Exchange Network