Popular New Releases in Compression
zstd: Zstandard v1.5.2
brotli: v1.0.9
lz4: LZ4 v1.9.3
Compressor
snappy: Snappy 1.1.9
Popular Libraries in Compression
by facebook (C) | 16713 stars | NOASSERTION | Zstandard - Fast real-time compression algorithm
by Curzibn (Java) | 12531 stars | Apache-2.0 | Luban (鲁班): image compression with efficiency very close to WeChat Moments (probably the closest to WeChat Moments' image compression algorithm)
by google (C) | 10397 stars | MIT | Brotli compression format
by upx (C) | 8801 stars | NOASSERTION | UPX - the Ultimate Packer for eXecutables
by Stuk (JavaScript) | 7745 stars | NOASSERTION | Create, read and edit .zip files with JavaScript
by ImageOptim (HTML) | 7240 stars | GPL-2.0 | GUI image optimizer for Mac
by lz4 (C) | 6730 stars | NOASSERTION | Extremely Fast Compression algorithm
by zetbaitsu (Kotlin) | 6104 stars | An android image compression library.
by google (C++) | 5052 stars | NOASSERTION | A fast compressor/decompressor
Trending New libraries in Compression
by 101arrowz (TypeScript) | 949 stars | MIT | High performance (de)compression in an 8kB package
by caoscott (Python) | 884 stars | MIT | PyTorch Implementation of "Lossless Image Compression through Super-Resolution"
by funbox (JavaScript) | 703 stars | MIT | CLI image optimization tool
by szq0214 (Python) | 563 stars | MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks
by mhx (C++) | 457 stars | GPL-3.0 | A fast high compression read-only file system
by ouch-org (Rust) | 444 stars | NOASSERTION | Painless compression and decompression for your terminal
by BuzonIO (Python) | 366 stars | MIT | Writing large ZIP archives without memory inflation
by jinfeihan57 (C) | 318 stars | A new p7zip fork with additional codecs and improvements (forked from https://sourceforge.net/projects/p7zip/).
by libjxl (C++) | 306 stars | NOASSERTION | JPEG XL image format reference implementation
Top Authors in Compression
1. 13 Libraries (2373)
2. 11 Libraries (21410)
3. 8 Libraries (74)
4. 7 Libraries (85)
5. 6 Libraries (800)
6. 5 Libraries (378)
7. 5 Libraries (19)
8. 5 Libraries (2129)
9. 5 Libraries (60)
10. 5 Libraries (1261)
Trending Kits in Compression
No Trending Kits are available at this moment for Compression
Trending Discussions on Compression
Fixing git HTTPS Error: "bad key length" on macOS 12
git gc: error: Could not read 0000000000000000000000000000000000000000
Vuejs Webpack Compression Plugin not compressing
Is Shannon-Fano coding ambiguous?
Why does this .c file #include itself?
APL Fork/Train with Compression
.NET 6 failing at Decompress large gzip text
angular 13: Module not found: Error: Can't resolve 'rxjs/operators'
JavaScript: V8 question: are small integers pooled?
Paramiko authentication fails with "Agreed upon 'rsa-sha2-512' pubkey algorithm" (and "unsupported public key algorithm: rsa-sha2-512" in sshd log)
QUESTION
Fixing git HTTPS Error: "bad key length" on macOS 12
Asked 2022-Mar-29 at 17:34
I am using a company-hosted (Bitbucket) git repository that is accessible via HTTPS. Accessing it (e.g. git fetch) worked using macOS 11 (Big Sur), but broke after an update to macOS 12 Monterey.
After the update of macOS to 12 Monterey my previous git setup broke. Now I am getting the following error message:
$ git fetch
fatal: unable to access 'https://.../':
error:06FFF089:digital envelope routines:CRYPTO_internal:bad key length
For what it's worth, using curl does not work either:
$ curl --insecure -L -v https://...
* Trying ...
* Connected to ... (...) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
* CAfile: /etc/ssl/cert.pem
* CApath: none
* TLSv1.2 (OUT), TLS handshake, Client hello (1):
* TLSv1.2 (IN), TLS handshake, Server hello (2):
* TLSv1.2 (IN), TLS handshake, Certificate (11):
* TLSv1.2 (IN), TLS handshake, Server key exchange (12):
* TLSv1.2 (IN), TLS handshake, Server finished (14):
* TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
* TLSv1.2 (OUT), TLS change cipher, Change cipher spec (1):
* TLSv1.2 (OUT), TLS handshake, Finished (20):
* error:06FFF089:digital envelope routines:CRYPTO_internal:bad key length
* Closing connection 0
curl: (35) error:06FFF089:digital envelope routines:CRYPTO_internal:bad key length
Accessing the same HTTPS source via Safari or Firefox works.
As far as I understand, the underlying "bad key length" error is coming from OpenSSL/LibreSSL; this would be consistent with both git and curl failing after an OS upgrade.
This is the output from openssl:
$ openssl s_client -servername ... -connect ...:443
CONNECTED(00000005)
depth=2 C = US, O = DigiCert Inc, OU = www.digicert.com, CN = DigiCert Global Root G2
verify return:1
depth=1 C = US, O = DigiCert Inc, OU = www.digicert.com, CN = Thawte TLS RSA CA G1
verify return:1
depth=0 ...
4593010348:error:06FFF089:digital envelope routines:CRYPTO_internal:bad key length:
/System/Volumes/Data/SWE/macOS/BuildRoots/b8ff8433dc/Library/Caches/com.apple.xbs
/Sources/libressl/libressl-75/libressl-2.8/crypto/apple/hmac/hmac.c:188:
---
Certificate chain
 ...
---
No client certificate CA names sent
Server Temp Key: DH, 2048 bits
---
SSL handshake has read 4105 bytes and written 318 bytes
---
New, TLSv1/SSLv3, Cipher is DHE-RSA-AES256-GCM-SHA384
Server public key is 4096 bit
Secure Renegotiation IS supported
Compression: NONE
Expansion: NONE
No ALPN negotiated
SSL-Session:
    Protocol  : TLSv1.2
    Cipher    : DHE-RSA-AES256-GCM-SHA384
    Session-ID: 1FA062DC9EEC9A310FF8231F1EB11A3BD6E0778F7AB6E98EAD1020A44CF1A407
    Session-ID-ctx:
    Master-Key:
    Start Time: 1635319904
    Timeout   : 7200 (sec)
    Verify return code: 0 (ok)
---
I did try to add the server's certificates into a custom pem file and setting http.sslCAInfo, but that didn't work. As a workaround, I am currently using a proxy that decrypts/re-encrypts HTTPS traffic.
How do I configure git (or all LibreSSL users) to accept the server's certificate?
ANSWER
Answered 2021-Nov-02 at 07:12
Unfortunately I can't provide you with a fix, but I've found a workaround for that exact same problem (company-hosted Bitbucket resulting in the exact same error).
I also don't know exactly why the problem occurs, but my best guess would be that the libressl library shipped with Monterey has some sort of problem with specific (TLSv1.3?) certs. This guess is based on the fact that the brew-installed openssl v1.1 and v3 don't throw that error when executed with /opt/homebrew/opt/openssl/bin/openssl s_client -connect ...:443.
To get around that error, I've built git from source against different openssl and curl implementations:
- install autoconf, openssl and curl with brew (I think you can select the openssl lib you like, i.e. v1.1 or v3; I chose v3)
- clone the git version you like, i.e. git clone --branch v2.33.1 https://github.com/git/git.git
- cd git
- make configure (that is why autoconf is needed)
- execute LDFLAGS="-L/opt/homebrew/opt/openssl@3/lib -L/opt/homebrew/opt/curl/lib" CPPFLAGS="-I/opt/homebrew/opt/openssl@3/include -I/opt/homebrew/opt/curl/include" ./configure --prefix=$HOME/git (here LDFLAGS and CPPFLAGS include the libs git will be built against; the right flags are emitted by brew on install success of curl and openssl; --prefix is the install directory of git, which defaults to /usr/local but can be changed)
- make install
- ensure to add the install directory's subfolder /bin to the front of your $PATH to "override" the default git shipped by Monterey
- restart terminal
- check that git version shows the new version
This should help for now, but as I already said, this is only a workaround, hopefully Apple fixes their libressl fork ASAP.
QUESTION
git gc: error: Could not read 0000000000000000000000000000000000000000
Asked 2022-Mar-28 at 14:18
git gc
error: Could not read 0000000000000000000000000000000000000000
Enumerating objects: 147323, done.
Counting objects: 100% (147323/147323), done.
Delta compression using up to 4 threads
Compressing objects: 100% (36046/36046), done.
Writing objects: 100% (147323/147323), done.
Total 147323 (delta 91195), reused 147323 (delta 91195), pack-reused 0
What is going on here? Should I worry or ignore the problem?
For example, git gc --help and similar have nothing that appears to explain the problem.
I am running git version 2.35.1 on Lubuntu 20.04.
This issue was forwarded to the git mailing list (https://public-inbox.org/git/571c0796-66d4-e8c7-c5a5-2e7a28132aa9@kdbg.org/) and is being solved.
(Ideally this would be added to the answer, but the edit queue is full.)
ANSWER
Answered 2022-Mar-28 at 14:18
This error is harmless in the sense that it does not indicate a broken repository. It is a bug that was introduced in Git 2.35 and that should be fixed in later releases.
The worst that can happen is that git gc does not prune all objects that are referenced from reflogs.
The error is triggered by an invocation of git reflog expire --all that git gc does behind the scenes.
The trigger is empty reflog files in the .git/logs directory structure that were left behind after a branch was deleted. As a workaround you can remove these empty files. This command lets you find them and check their size:
find .git/logs -type f -size 0c | xargs ls -ld
Pick only the files that do not correspond to a branch.
(Also, I am uncertain about the operation of -size 0c, hence, do make sure not to remove all the listed files blindly, but only those that have no corresponding branch and are actually empty.)
This issue was forwarded to git mailing list based on this SO question and is being solved.
QUESTION
Vuejs Webpack Compression Plugin not compressing
Asked 2022-Mar-28 at 12:53
I need help debugging Webpack's Compression Plugin.
SUMMARY OF PROBLEM
- Goal is to enable asset compression and reduce my app's bundle size. Using the Brotli algorithm as the default, and gzip as a fallback for unsupported browsers.
- I expected a content-encoding field within an asset's Response Headers. Instead, they're loaded without the field. I used the Chrome dev tools' network tab to confirm this. For context, see the following snippet:
- No errors show in my browser or IDE when running locally.
WHAT I TRIED
- Using different implementations for the compression plugin. See below list of approaches:
- (With Webpack Chain API)
config
  .plugin('brotliCompress')
  .use(CompressionWebpackPlugin, [{
    exclude: /.map$/,
    cache: true,
    algorithm: 'brotliCompress',
    test: /\.(js|css|html|svg)$/,
    threshold: 10240,
    minRatio: 0.8,
  }])
- (With Webpack Chain API)
config
  .plugin('gzip')
  .use(CompressionWebpackPlugin, [{
    algorithm: 'gzip',
    test: new RegExp('\\.(' + ['js', 'css'].join('|') + ')$'),
    threshold: 8192, // Assets larger than 8192 bytes are not processed
    minRatio: 0.8, // Assets compressing worse that this ratio are not processed
  }])
- (With Webpack Chain API)
config
  .plugin('CompressionPlugin')
  .use(CompressionWebpackPlugin)
- (Using vue-cli-plugin: compression) This fails due to a Missing generator error when I use vue invoke compression in response to an IDE console message after I run vue add compression, as an alternative to using the Webpack Chain API for compression configuration.
pluginOptions: {
  compression: {
    brotli: {
      filename: '[file].br[query]',
      algorithm: 'brotliCompress',
      include: /\.(js|css|html|svg|json)(\?.*)?$/i,
      minRatio: 0.8,
    },
    gzip: {
      filename: '[file].gz[query]',
      algorithm: 'gzip',
      include: /\.(js|css|html|svg|json)(\?.*)?$/i,
      minRatio: 0.8
    }
  }
},
- Lastly, I tried setting the threshold field to 0 as well as raising it larger than 10k bytes.
POINTS OF SIGNIFICANCE
- The above attempts didn't achieve the goal I stated in the first summary bullet and were used in place of the previous approaches tested.
- I prioritized my efforts with Webpack Chain API since it resulted in no errors when rebuilding and running the app.
REFERENCED LINKS/DOCS
- https://webpack.js.org/plugins/compression-webpack-plugin/
- https://github.com/neutrinojs/webpack-chain/tree/main
- https://neutrinojs.org/webpack-chain/#config-plugins-adding
- https://github.com/nklayman/vue-cli-plugin-electron-builder/issues/500 (similar generator issue with another plugin)
- https://webpack.js.org/plugins/compression-webpack-plugin/
- Use webpack-chain to do webpack configuration in vue.config.js, so how to use speed-measure-webpack-plugin plugin? (not a valid answer, but referenced syntax nonetheless)
- https://github.com/vuejs/vue-cli/issues/6091#issuecomment-738536334
- Webpack prerender-spa-plugin with compression-webpack-plugin. index.html not compressed
CODE
vue.config.js
const path = require('path')
const CompressionWebpackPlugin = require('compression-webpack-plugin')

function resolve (dir) {
  return path.join(__dirname, dir)
}

module.exports = {
  /* ....shortened for brevity */

  // Compress option VI (with vue cli plugin, generator bug when invoked)
  // pluginOptions: {
  //   compression: {
  //     brotli: {
  //       filename: '[file].br[query]',
  //       algorithm: 'brotliCompress',
  //       include: /\.(js|css|html|svg|json)(\?.*)?$/i,
  //       minRatio: 0.8,
  //     },
  //     gzip: {
  //       filename: '[file].gz[query]',
  //       algorithm: 'gzip',
  //       include: /\.(js|css|html|svg|json)(\?.*)?$/i,
  //       minRatio: 0.8
  //     }
  //   }
  // },

  chainWebpack: config => {
    config
      .resolve.alias
      .set('@', resolve('src'))

    config
      .plugins.delete('prefetch')

    config
      .optimization.splitChunks()

    config
      .output
      .chunkFilename('[id].js')

    // The below configurations are recommeneded only in prod.
    // config.when(process.env.NODE_ENV === 'production', config => { config... })

    // Compress option VII
    // config
    //   .plugin('gzip')
    //   .use(CompressionWebpackPlugin, [{
    //     algorithm: 'gzip',
    //     test: new RegExp('\\.(' + ['js', 'css'].join('|') + ')$'),
    //     threshold: 8192, // Assets larger than 8192 bytes are not processed
    //     minRatio: 0.8, // Assets compressing worse that this ratio are not processed
    //   }])

    // Compress option VIII
    // config
    //   .plugin('CompressionPlugin')
    //   .use(CompressionWebpackPlugin)

    config
      .plugin('brotliCompress')
      .use(CompressionWebpackPlugin, [{
        exclude: /.map$/,
        // deleteOriginalAssets: true,
        cache: true,
        algorithm: 'brotliCompress',
        test: /\.(js|css|html|svg)$/,
        threshold: 10240,
        minRatio: 0.8,
      }])
  },
}
package.json
"dependencies": {
  "@auth0/auth0-spa-js": "^1.15.0",
  "audio-recorder-polyfill": "^0.4.1",
  "compression-webpack-plugin": "^6.0.0",
  "core-js": "^3.6.5",
  "dotenv": "^8.2.0",
  "dotenv-expand": "^5.1.0",
  "moment": "^2.29.1",
  "register-service-worker": "^1.7.1",
  "uuid": "^3.4.0",
  "vue": "^2.6.11",
  "vue-loader": "^15.9.8",
  "vue-router": "^3.5.1",
  "vuex": "^3.6.2"
},
"devDependencies": {
  "@vue/cli-plugin-babel": "~4.5.0",
  "@vue/cli-plugin-eslint": "~4.5.0",
  "@vue/cli-plugin-pwa": "~4.5.0",
  "@vue/cli-service": "~4.5.0",
  "babel-eslint": "^10.1.0",
  "eslint": "^6.7.2",
  "eslint-plugin-vue": "^6.2.2",
  "vue-cli-plugin-compression": "~1.1.5",
  "vue-template-compiler": "^2.6.11",
  "webpack": "^4.46.0"
}
I appreciate all input. Thanks.
ANSWER
Answered 2021-Sep-30 at 14:59
It's not clear which server is serving up these assets. If it's Express, looking at the screenshot with the X-Powered-By header, https://github.com/expressjs/compression/issues/71 shows that Brotli support hasn't been added to Express yet.
There might be a way to just specify the content-encoding header manually though.
QUESTION
Is Shannon-Fano coding ambiguous?
Asked 2022-Mar-08 at 19:38
Is the Shannon-Fano coding as described in Fano's paper The Transmission of Information (1952) really ambiguous?
In Detail: 3 papers
Claude E. Shannon published his famous paper A Mathematical Theory of Communication in July 1948. In this paper he invented the term bit as we know it today, and he also defined what we call Shannon entropy today. And he also proposed an entropy-based data compression algorithm in this paper. But Shannon's algorithm was so weak that under certain circumstances the "compressed" messages could be even longer than in fixed-length coding. A few months later (March 1949) Robert M. Fano published an improved version of Shannon's algorithm in the paper The Transmission of Information. 3 years after Fano (in September 1952) his student David A. Huffman published an even better version in his paper A Method for the Construction of Minimum-Redundancy Codes. Huffman coding is more efficient than its two predecessors and it is still used today. But my question is about the algorithm published by Fano, which usually is called Shannon-Fano coding.
The algorithm
This description is based on the description from Wikipedia. Sorry, I did not fully read Fano's paper. I only browsed through it. It is 37 pages long and I really tried hard to find a passage where he talks about the topic of my question, but I could not find it. So, here is how Shannon-Fano encoding works:
- Count how often each character appears in the message.
- Sort all characters by frequency, characters with highest frequency on top of the list
- Divide the list into two parts, such that the sums of frequencies in both parts are as equal as possible. Add the bit 0 to one part and the bit 1 to the other part.
- Repeat step 3 on each part that contains 2 or more characters until all parts consist of only 1 character.
- Concatenate all bits from all rounds. This is the Shannon-Fano-code of that character.
An example
Let's execute this on a really tiny example (I think it's the smallest message where the problem appears). Here is the message to encode:
aaabcde
Steps 1 and 2 produce the first 2 columns of both tables shown below. But if Wikipedia's explanation of Fano's algorithm is correct, then step 3 is ambiguous. If you apply this step to my example, you have two possibilities to split the list in 2 parts (see below). These possibilities produce different codes, which by itself would not be worth mentioning. But the point is: the two possibilities produce codes of different lengths.
possibility 1
If there are 2 ways to split the list such that both parts are as equal to each other as possible, then put the character that stands at the splitting point (this is character b in my example) into the part containing the low-frequency characters.
+------+-------+-----+-----+-----+-----+-----+-----+------+
|      |       |   round1  |   round2  |   round3  |      |
| char | frequ | sum | bit | sum | bit | sum | bit | code |
+------+-------+-----+-----+-----+-----+-----+-----+------+
|  a   |   3   |  3  |  0  |                       |  0   |
|      |       +-----+-----+-----+-----+-----+-----+------+
|  b   |   1   |     |     |     |     |  1  |  0  | 100  |
|      |       |     |     |  2  |  0  +-----+-----+------+
|  c   |   1   |     |     |     |     |  1  |  1  | 101  |
|      |       |  4  |  1  +-----+-----+-----+-----+------+
|  d   |   1   |     |     |     |     |  1  |  0  | 110  |
|      |       |     |     |  2  |  1  +-----+-----+------+
|  e   |   1   |     |     |     |     |  1  |  1  | 111  |
+------+-------+-----+-----+-----+-----+-----+-----+------+
The encoded message is
000100101110111   length = 15 bit
aaab  c  d  e
possibility 2
If there are 2 ways to split the list such that both parts are as equal to each other as possible, then put the character that stands at the splitting point into the part containing the high-frequency characters.
+------+-------+-----+-----+-----+-----+-----+-----+------+
|      |       |   round1  |   round2  |   round3  |      |
| char | frequ | sum | bit | sum | bit | sum | bit | code |
+------+-------+-----+-----+-----+-----+-----+-----+------+
|  a   |   3   |     |     |  3  |  0  |           |  00  |
|      |       |  4  |  0  +-----+-----+           +------+
|  b   |   1   |     |     |  1  |  1  |           |  01  |
|      |       +-----+-----+-----+-----+-----+-----+------+
|  c   |   1   |     |     |     |     |  1  |  0  | 100  |
|      |       |     |     |  2  |  0  |-----+-----+------+
|  d   |   1   |  3  |  1  |     |     |  1  |  1  | 101  |
|      |       |     |     +-----+-----+-----+-----+------+
|  e   |   1   |     |     |  1  |  1  |           |  11  |
+------+-------+-----+-----+-----+-----+-----+-----+------+
The encoded message is
0000000110010111   length = 16 bit
a a a b c  d  e
So, it is one bit longer.
So, here are my questions:
- Is Wikipedia's description of Shannon-Fano Coding really correct and complete? If this is the case, then Shannon-Fano Coding is ambiguous.
- Or did Fano in his paper add another step that is missing in Wikipedia's description? If this is the case: how did Fano solve the problem described here? Which of the two versions is compatible with Fano's original description?
ANSWER
Answered 2022-Mar-08 at 19:00
To directly answer your question: without further elaboration about how to break ties, two different implementations of Shannon-Fano could produce different codes of different lengths for the same inputs.
As @MattTimmermans noted in the comments, Shannon-Fano does not always produce optimal prefix-free codings the way that, say, Huffman coding does. It might therefore be helpful to think of it less as an algorithm and more of a heuristic - something that likely will produce a good code but isn't guaranteed to give an optimal solution. Many heuristics suffer from similar issues, where minor tweaks in the input or how ties are broken could result in different results. A good example of this is the greedy coloring algorithm for finding vertex colorings of graphs. The linked Wikipedia article includes an example in which changing the order in which nodes are visited by the same basic algorithm yields wildly different results.
Even algorithms that produce optimal results, however, can sometimes produce different optimal results based on tiebreaks. Take Huffman coding, for example, which works by repeatedly finding the two lowest-weight trees assembled so far and merging them together. In the event that there are three or more trees at some intermediary step that are all tied for the same weight, different implementations of Huffman coding could produce different prefix-free codes based on which two they join together. The resulting trees would all be equally "good," though, in that they'd all produce outputs of the same length. (That's largely because, unlike Shannon-Fano, Huffman coding is guaranteed to produce an optimal encoding.)
That being said, it's easy to adjust Shannon-Fano so that it always produces a consistent result. For example, you could say "in the event of a tie, choose the partition that puts fewer items into the top group," at which point you would always consistently produce the same coding. It wouldn't necessarily be an optimal encoding, but, then again, since Shannon-Fano was never guaranteed to do so, this is probably not a major concern.
If, on the other hand, you're interested in the question of "when Shannon-Fano has to break a tie, how do I decide how to break the tie to produce the optimal solution?", then I'm not sure of a way to do this other than recursively trying both options and seeing which one is better, which in the worst case leads to exponentially slow runtimes. But perhaps someone else here can find a way to do that.
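To make the tie-break concrete, here is a minimal Python sketch of the splitting procedure described in the question (my own illustration, not code from the thread; the function name shannon_fano and the flag prefer_small_top are mine). The flag chooses between the two splitting rules from the question; on "aaabcde" it reproduces the 15-bit and 16-bit encodings shown in the tables above.
from collections import Counter

def shannon_fano(freqs, prefer_small_top=True):
    # freqs: list of (symbol, count) pairs, sorted by count, highest first.
    # prefer_small_top decides, on a tie, whether the splitting-point symbol goes to
    # the lower-frequency part (True, "possibility 1") or to the higher-frequency
    # part (False, "possibility 2"). This is the only ambiguous step.
    if len(freqs) == 1:
        return {freqs[0][0]: ""}
    total = sum(count for _, count in freqs)
    best_i, best_diff, running = None, None, 0
    for i in range(1, len(freqs)):            # candidate split between i-1 and i
        running += freqs[i - 1][1]
        diff = abs(running - (total - running))
        if best_diff is None or diff < best_diff or (diff == best_diff and not prefer_small_top):
            best_i, best_diff = i, diff
    codes = {}
    for part, bit in ((freqs[:best_i], "0"), (freqs[best_i:], "1")):
        for symbol, code in shannon_fano(part, prefer_small_top).items():
            codes[symbol] = bit + code
    return codes

message = "aaabcde"
freqs = sorted(Counter(message).items(), key=lambda kv: -kv[1])
for flag in (True, False):
    codes = shannon_fano(freqs, flag)
    print(codes, sum(len(codes[ch]) for ch in message), "bits")   # 15 bits vs 16 bits
Making the rule explicit either way is exactly the kind of adjustment the answer suggests for getting a consistent, though not necessarily optimal, code.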
QUESTION
Why does this .c file #include itself?
Asked 2022-Feb-18 at 07:48Why does this .c
file #include
itself?
vsimple.c
#define USIZE 8
#include "vsimple.c"
#undef USIZE

#define USIZE 16
#include "vsimple.c"
#undef USIZE

#define USIZE 32
#include "vsimple.c"
#undef USIZE

#define USIZE 64
#include "vsimple.c"
#undef USIZE
ANSWER
Answered 2022-Feb-18 at 07:48
The file includes itself so the same source code can be used to generate 4 different sets of functions for specific values of the macro USIZE.
The #include directives are actually enclosed in an #ifndef, which limits the recursion to a single level:
#ifndef USIZE

// common definitions
...
//

#define VSENC vsenc
#define VSDEC vsdec

#define USIZE 8
#include "vsimple.c"
#undef USIZE

#define USIZE 16
#include "vsimple.c"
#undef USIZE

#define USIZE 32
#include "vsimple.c"
#undef USIZE

#define USIZE 64
#include "vsimple.c"
#undef USIZE

#else // defined(USIZE)

// macro expanded size specific functions using token pasting

...

#define uint_t TEMPLATE3(uint, USIZE, _t)

unsigned char *TEMPLATE2(VSENC, USIZE)(uint_t *__restrict in, size_t n, unsigned char *__restrict out) {
  ...
}

unsigned char *TEMPLATE2(VSDEC, USIZE)(unsigned char *__restrict ip, size_t n, uint_t *__restrict op) {
  ...
}

#endif
The functions defined in this module are
// vsencNN: compress array with n unsigned (NN bits in[n]) values to the buffer out. Return value = end of compressed output buffer out
unsigned char *vsenc8( unsigned char *__restrict in, size_t n, unsigned char *__restrict out);
unsigned char *vsenc16(unsigned short *__restrict in, size_t n, unsigned char *__restrict out);
unsigned char *vsenc32(unsigned *__restrict in, size_t n, unsigned char *__restrict out);
unsigned char *vsenc64(uint64_t *__restrict in, size_t n, unsigned char *__restrict out);

// vsdecNN: decompress buffer into an array of n unsigned values. Return value = end of compressed input buffer in
unsigned char *vsdec8( unsigned char *__restrict in, size_t n, unsigned char *__restrict out);
unsigned char *vsdec16(unsigned char *__restrict in, size_t n, unsigned short *__restrict out);
unsigned char *vsdec32(unsigned char *__restrict in, size_t n, unsigned *__restrict out);
unsigned char *vsdec64(unsigned char *__restrict in, size_t n, uint64_t *__restrict out);
They are all expanded from the two function definitions in vsimple.c:
unsigned char *TEMPLATE2(VSENC, USIZE)(uint_t *__restrict in, size_t n, unsigned char *__restrict out) {
  ...
}

unsigned char *TEMPLATE2(VSDEC, USIZE)(unsigned char *__restrict ip, size_t n, uint_t *__restrict op) {
  ...
}
The TEMPLATE2 and TEMPLATE3 macros are defined in conf.h as
#define TEMPLATE2_(_x_, _y_) _x_##_y_
#define TEMPLATE2(_x_, _y_) TEMPLATE2_(_x_,_y_)

#define TEMPLATE3_(_x_,_y_,_z_) _x_##_y_##_z_
#define TEMPLATE3(_x_,_y_,_z_) TEMPLATE3_(_x_, _y_, _z_)
These macros are classic preprocessor constructions to create identifiers via token pasting. TEMPLATE2 and TEMPLATE2_ are more commonly called GLUE and XGLUE.
The function template starts as:
unsigned char *TEMPLATE2(VSENC, USIZE)(uint_t *__restrict in, size_t n, unsigned char *__restrict out) ...
It is expanded in the first recursive inclusion, with USIZE defined as 8, into:
unsigned char *vsenc8(uint8_t *__restrict in, size_t n, unsigned char *__restrict out) ...
The second recursive inclusion, with USIZE defined as 16, expands the template as:
unsigned char *vsenc16(uint16_t *__restrict in, size_t n, unsigned char *__restrict out) ...
and 2 more inclusions define vsenc32 and vsenc64.
This usage of preprocessed source code is more common with separate files: one for the instantiating part that has all the common definitions, especially the macros, and a separate file for the code and data templates, which is included multiple times with different macro definitions.
A good example is the generation of enums, string and structures arrays from atom and opcode definitions in QuickJS.
QUESTION
APL Fork/Train with Compression
Asked 2022-Feb-18 at 07:42
I want to select elements from an array based on some test. Currently, I am trying to do that with a compression, and I would like to write it as a tacit function. (I'm very new to APL, so feel free to suggest other options.) Below is a minimal (not-)working example.
The third line below shows that I can use the testing function f on vec and then do the compression, and the fifth line shows I can apply the identity function to vec (as expected). So based on my understanding of the train documentation, I should be able to make a fork from f and ⊢ with / as the center prong. Below shows that this does not work, and I presume it is because Dyalog is interpreting the sixth and eighth lines as doing an f-reduce. Is there a way to indicate that I want a compression train and not a reduce? (and/or is there a better way to do this altogether?)
 vec ← 10⍴⍳3
 f ← {⍵≤2}
 (f vec) / vec
1 2 1 2 1 2 1
 (f vec) / (⊢ vec)
1 2 1 2 1 2 1
 (f/⊢) vec
1
 (f(/)⊢) vec
1
ANSWER
Answered 2022-Feb-18 at 07:42
Yes, by making / an operand, it is forced to behave as a function. As per APL Wiki, applying ⊢ atop the result of / solves the problem:
 vec ← 10⍴⍳3
 f ← {⍵≤2}
 (f⊢⍤/⊢) vec
1 2 1 2 1 2 1
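As a cross-language aside (not from the original thread): the same mask-then-select operation exists in Python as itertools.compress, which may help when reading the APL session above.
from itertools import compress

vec = [1, 2, 3, 1, 2, 3, 1, 2, 3, 1]   # analogue of: vec ← 10⍴⍳3
mask = [x <= 2 for x in vec]           # analogue of applying f ← {⍵≤2} to vec
print(list(compress(vec, mask)))       # [1, 2, 1, 2, 1, 2, 1]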
QUESTION
.NET 6 failing at Decompress large gzip text
Asked 2022-Feb-01 at 10:43
I have to decompress some gzip text in a .NET 6 app; however, on a string that is 20,627 characters long, it only decompresses about 1/3 of it. The code I am using works for this string in .NET 5 or .NET Core 3.1, as well as for smaller compressed strings.
public static string Decompress(this string compressedText)
{
    var gZipBuffer = Convert.FromBase64String(compressedText);
    using var memoryStream = new MemoryStream();
    int dataLength = BitConverter.ToInt32(gZipBuffer, 0);
    memoryStream.Write(gZipBuffer, 4, gZipBuffer.Length - 4);
    var buffer = new byte[dataLength];
    memoryStream.Position = 0;
    using (var gZipStream = new GZipStream(memoryStream, CompressionMode.Decompress))
    {
        gZipStream.Read(buffer, 0, buffer.Length);
    }
    return Encoding.UTF8.GetString(buffer);
}
The results look something like this:
Start of amazing text..... ...Text is fine till 33,619; after that it is all NULLNULLNULLNULL
The rest of the file after the 33,618 characters is just nulls.
I have no idea why this is happening.
Edit: I updated this when I found the issue was not Blazor but in fact .NET 6. I took a project that was working in .NET Core 3.1, changed nothing other than compiling for .NET 6, and got the same error. The update reflects this.
Edit 2: Just tested and it works in .NET 5, so it is just .NET 6 where this error happens.
ANSWER
Answered 2022-Feb-01 at 10:43
Just confirmed that the article linked in the comments below the question contains a valid clue on the issue.
Corrected code would be:
string Decompress(string compressedText)
{
    var gZipBuffer = Convert.FromBase64String(compressedText);

    using var memoryStream = new MemoryStream();
    int dataLength = BitConverter.ToInt32(gZipBuffer, 0);
    memoryStream.Write(gZipBuffer, 4, gZipBuffer.Length - 4);

    var buffer = new byte[dataLength];
    memoryStream.Position = 0;

    using var gZipStream = new GZipStream(memoryStream, CompressionMode.Decompress);

    int totalRead = 0;
    while (totalRead < buffer.Length)
    {
        int bytesRead = gZipStream.Read(buffer, totalRead, buffer.Length - totalRead);
        if (bytesRead == 0) break;
        totalRead += bytesRead;
    }

    return Encoding.UTF8.GetString(buffer);
}
This approach changes
gZipStream.Read(buffer, 0, buffer.Length);
to
int totalRead = 0;
while (totalRead < buffer.Length)
{
    int bytesRead = gZipStream.Read(buffer, totalRead, buffer.Length - totalRead);
    if (bytesRead == 0) break;
    totalRead += bytesRead;
}
which takes the Read's return value into account correctly.
Without the change, the issue is easily repeatable on any string random enough to produce a gzip of length > ~10kb.
Here's the compressor, if anyone's interested in testing this on your own
string Compress(string plainText)
{
    var buffer = Encoding.UTF8.GetBytes(plainText);
    using var memoryStream = new MemoryStream();

    var lengthBytes = BitConverter.GetBytes((int)buffer.Length);
    memoryStream.Write(lengthBytes, 0, lengthBytes.Length);

    using var gZipStream = new GZipStream(memoryStream, CompressionMode.Compress);

    gZipStream.Write(buffer, 0, buffer.Length);
    gZipStream.Flush();

    var gZipBuffer = memoryStream.ToArray();

    return Convert.ToBase64String(gZipBuffer);
}
QUESTION
angular 13: Module not found: Error: Can't resolve 'rxjs/operators'
Asked 2022-Jan-22 at 05:29I have upgraded my Angular app to Angular 13. When I run the SSR build, it gives me the following error.
1ERROR in ./node_modules/@angular/common/fesm2015/http.mjs 12:0-56
2Module not found: Error: Can't resolve 'rxjs/operators' in '/Users/nr/aws/jobsaf-website-staging/application/node_modules/@angular/common/fesm2015'
3Did you mean 'index.js'?
4BREAKING CHANGE: The request 'rxjs/operators' failed to resolve only because it was resolved as fully specified
5(probably because the origin is strict EcmaScript Module, e. g. a module with javascript mimetype, a '*.mjs' file, or a '*.js' file where the package.json contains '"type": "module"').
6The extension in the request is mandatory for it to be fully specified.
7Add the extension to the request.
8 @ ./src/app/app.server.module.ts 6:0-57 16:25-42
9 @ ./src/main.server.ts 3:0-58 3:0-58
10 @ ./server.ts 32:0-52 40:15-30 44:0-34 44:0-34
11
12ERROR in ./node_modules/@angular/core/fesm2015/core.mjs 8:0-39
13Module not found: Error: Can't resolve 'rxjs/operators' in '/Users/nr/aws/jobsaf-website-staging/application/node_modules/@angular/core/fesm2015'
14Did you mean 'index.js'?
15BREAKING CHANGE: The request 'rxjs/operators' failed to resolve only because it was resolved as fully specified
16(probably because the origin is strict EcmaScript Module, e. g. a module with javascript mimetype, a '*.mjs' file, or a '*.js' file where the package.json contains '"type": "module"').
17The extension in the request is mandatory for it to be fully specified.
18Add the extension to the request.
19 @ ./server.ts 30:0-47 35:0-14
20
21ERROR in ./node_modules/@angular/forms/fesm2015/forms.mjs 11:0-37
22Module not found: Error: Can't resolve 'rxjs/operators' in '/Users/nr/aws/jobsaf-website-staging/application/node_modules/@angular/forms/fesm2015'
23Did you mean 'index.js'?
24BREAKING CHANGE: The request 'rxjs/operators' failed to resolve only because it was resolved as fully specified
25(probably because the origin is strict EcmaScript Module, e. g. a module with javascript mimetype, a '*.mjs' file, or a '*.js' file where the package.json contains '"type": "module"').
26The extension in the request is mandatory for it to be fully specified.
27Add the extension to the request.
28 @ ./src/app/app.module.ts 12:0-45 78:12-23
29 @ ./src/app/app.server.module.ts 3:0-41 12:18-27
30 @ ./src/main.server.ts 3:0-58 3:0-58
31 @ ./server.ts 32:0-52 40:15-30 44:0-34 44:0-34
32
33ERROR in ./node_modules/@angular/platform-server/fesm2015/platform-server.mjs 21:0-39
34Module not found: Error: Can't resolve 'rxjs/operators' in '/Users/nr/aws/jobsaf-website-staging/application/node_modules/@angular/platform-server/fesm2015'
35Did you mean 'index.js'?
36BREAKING CHANGE: The request 'rxjs/operators' failed to resolve only because it was resolved as fully specified
37(probably because the origin is strict EcmaScript Module, e. g. a module with javascript mimetype, a '*.mjs' file, or a '*.js' file where the package.json contains '"type": "module"').
38The extension in the request is mandatory for it to be fully specified.
39Add the extension to the request.
40 @ ./src/main.server.ts 4:0-77 4:0-77 4:0-77
41 @ ./server.ts 32:0-52 40:15-30 44:0-34 44:0-34
42
43ERROR in ./node_modules/@angular/router/fesm2015/router.mjs 10:0-180
44Module not found: Error: Can't resolve 'rxjs/operators' in '/Users/nr/aws/jobsaf-website-staging/application/node_modules/@angular/router/fesm2015'
45Did you mean 'index.js'?
46BREAKING CHANGE: The request 'rxjs/operators' failed to resolve only because it was resolved as fully specified
47(probably because the origin is strict EcmaScript Module, e. g. a module with javascript mimetype, a '*.mjs' file, or a '*.js' file where the package.json contains '"type": "module"').
48The extension in the request is mandatory for it to be fully specified.
49Add the extension to the request.
50 @ ./src/app/app.component.ts 2:0-48 35:31-44
51 @ ./src/app/app.server.module.ts 2:0-47 13:20-32
52 @ ./src/main.server.ts 3:0-58 3:0-58
53 @ ./server.ts 32:0-52 40:15-30 44:0-34 44:0-34
54
my package.json file:
54{
55 "name": "admin-panel",
56 "version": "0.0.0",
57 "license": "MIT",
58 "angular-cli": {},
59 "scripts": {
60 "start": "DEBUG=jobsaf-website:* nodemon --inspect --trace-warnings --legacy-watch --trace-warnings ./bin/www",
61 "seed": "node ./seeds/static-tables.js",
62 "test-jobsaf": "mocha --timeout 10000",
63 "rm-web": "rm -rf ./public/web/*",
64 "ng": "node ./node_modules/@angular/cli/bin/ng serve --host 0.0.0.0",
65 "ng:build": "node --max_old_space_size=5048 ./node_modules/@angular/cli/bin/ng build --configuration production --aot",
66 "build:server:prod": "node --max_old_space_size=4048 ./node_modules/@angular/cli/bin/ng run jobsaf-website:server:prod && webpack --config webpack.server.config.js",
67 "build:browser:prod": "node --max_old_space_size=4048 ./node_modules/@angular/cli/bin/ng build --configuration production --aot --vendor-chunk --deleteOutputPath=true --buildOptimizer --progress=true",
68 "build:server:staging": "node --max_old_space_size=4048 ./node_modules/@angular/cli/bin/ng run jobsaf-website:server:staging && webpack --config webpack.server.config.js",
69 "build:browser:staging": "node --max_old_space_size=4048 ./node_modules/@angular/cli/bin/ng build --configuration production --configuration=staging --aot --vendor-chunk --deleteOutputPath=true --buildOptimizer",
70 "build:stats": "node --max_old_space_size=3192 node_modules/@angular/cli/bin/ng build --configuration production --aot --vendor-chunk --deleteOutputPath=true --buildOptimizer --progress=true --configuration production --stats-json",
71 "build:prod": "npm run rm-web && npm run build:server:prod && npm run build:browser:prod",
72 "build:staging": "npm run rm-web && npm run build:server:staging && npm run build:browser:staging",
73 "server": "node local.js",
74 "file:migration": "APP_FILE_MIGRATION=true node ./migration/file-migration.js",
75 "test_env": "set NODE_ENV=test",
76 "jest": "jest --detectOpenHandles --watchAll --config ./jest.config.js",
77 "coverage": "jest -i --coverage",
78 "jest:ci": "jest --detectOpenHandles --forceExit --config ./jest.config.js",
79 "test": "npm run test_env && npm run jest",
80 "test:ci": "npm run test_env && npm run seed && npm run jest:ci",
81 "dev:ssr": "ng run jobsaf-website:serve-ssr",
82 "serve:ssr": "node public/web/server/main.js",
83 "build:ssr": "ng build --configuration production && ng run jobsaf-website:server:prod",
84 "prerender": "ng run jobsaf-website:prerender",
85 "postinstall": "ngcc"
86 },
87 "private": true,
88 "napa": {
89 "jquery.flot.spline": "miloszfalinski/jquery.flot.spline",
90 "ika.jvectormap": "kakirigi/ika.jvectormap"
91 },
92 "dependencies": {
93 "@angular/animations": "^13.0.2",
94 "@angular/common": "^13.0.2",
95 "@angular/compiler": "^13.0.2",
96 "@angular/compiler-cli": "^13.0.2",
97 "@angular/core": "^13.0.2",
98 "@angular/forms": "^13.0.2",
99 "@angular/material": "^13.0.2",
100 "@angular/platform-browser": "^13.0.2",
101 "@angular/platform-browser-dynamic": "^13.0.2",
102 "@angular/platform-server": "^13.0.2",
103 "@angular/pwa": "^13.0.3",
104 "@angular/router": "^13.0.2",
105 "@angular/service-worker": "^13.0.2",
106 "@fortawesome/angular-fontawesome": "^0.10.1",
107 "@fortawesome/fontawesome-svg-core": "^1.2.36",
108 "@fortawesome/free-brands-svg-icons": "^5.15.4",
109 "@fortawesome/free-solid-svg-icons": "^5.15.4",
110 "@fullcalendar/core": "^5.10.1",
111 "@hapi/joi": "^15.1.0",
112 "@ng-select/ng-select": "^8.1.1",
113 "@nguniversal/common": "^13.0.1",
114 "@nguniversal/express-engine": "^13.0.1",
115 "@ngx-loading-bar/core": "^5.1.2",
116 "@ngxs/store": "^3.7.3-dev.master-1e7127b",
117 "@schematics/angular": "^13.0.3",
118 "@sindresorhus/slugify": "^1.1.0",
119 "@trademe/ng-defer-load": "^8.2.1",
120 "@types/jquery": "^3.5.8",
121 "angular-archwizard": "^7.0.0",
122 "angular2-uuid": "^1.1.1",
123 "apicache": "^1.6.3",
124 "archiver": "^5.3.0",
125 "aws-sdk": "^2.1031.0",
126 "bluebird": "^3.7.2",
127 "bootstrap": "5.1.3",
128 "compression": "^1.7.4",
129 "compromise": "^13.11.4",
130 "cookie-parser": "^1.4.6",
131 "core-js": "3.19.1",
132 "cors": "~2.8.5",
133 "debug": "^4.3.2",
134 "dotenv": "^10.0.0",
135 "easyimage": "^3.1.1",
136 "ejs": "^3.1.6",
137 "exceljs": "^4.3.0",
138 "express": "^4.17.1",
139 "express-jwt": "^6.1.0",
140 "express-mongo-sanitize": "^2.1.0",
141 "express-rate-limit": "^5.5.1",
142 "express-useragent": "^1.0.15",
143 "express-validator": "^6.13.0",
144 "feed": "^4.2.2",
145 "file-saver": "^2.0.5",
146 "firebase-admin": "^10.0.0",
147 "font-awesome": "^4.7.0",
148 "generate-password": "^1.7.0",
149 "google-auth-library": "^7.10.2",
150 "hammerjs": "^2.0.8",
151 "helmet": "^4.6.0",
152 "html-pdf": "^3.0.1",
153 "http-status": "^1.5.0",
154 "intl-tel-input": "^17.0.13",
155 "izitoast": "1.4.0",
156 "joi-objectid": "^4.0.2",
157 "jquery": "^3.6.0",
158 "jsonwebtoken": "^8.5.1",
159 "jwt-decode": "^3.1.2",
160 "keyword-extractor": "0.0.20",
161 "kickbox": "^2.0.4",
162 "libphonenumber-js": "^1.9.43",
163 "localstorage-polyfill": "^1.0.1",
164 "lodash": "^4.17.21",
165 "lodash.uniq": "^4.5.0",
166 "md5": "^2.3.0",
167 "moment": "^2.29.1",
168 "mongoose": "5.8.11",
169 "mongoose-history": "^0.8.0",
170 "mongoose-unique-validator": "^2.0.3",
171 "mongoose-url-slugs": "^1.0.2",
172 "multer": "^1.4.3",
173 "multer-s3": "^2.10.0",
174 "multer-s3-transform": "^2.10.3",
175 "mysql": "^2.18.1",
176 "ng-recaptcha": "^9.0.0",
177 "ng2-file-upload": "^1.4.0",
178 "ngx-auth": "^5.4.0",
179 "ngx-bootstrap": "^6.1.0",
180 "ngx-facebook": "^3.0.0-0",
181 "ngx-img-cropper": "^11.0.0",
182 "ngx-infinite-scroll": "^10.0.1",
183 "ngx-moment": "^5.0.0",
184 "ngx-pagination": "^5.1.1",
185 "ngx-quill-editor": "^2.2.2",
186 "ngx-toastr": "^14.2.0",
187 "node-schedule": "^2.0.0",
188 "nodemailer": "^6.7.1",
189 "passport": "^0.5.0",
190 "passport-facebook-token": "^4.0.0",
191 "passport-google-id-token": "^0.4.7",
192 "passport-google-token": "^0.1.2",
193 "passport-linkedin-token": "^0.1.1",
194 "passport-local": "^1.0.0",
195 "pdf-to-text": "0.0.7",
196 "phantomjs-prebuilt": "^2.1.16",
197 "phone": "^3.1.10",
198 "phpass": "^0.1.1",
199 "rand-token": "^1.0.1",
200 "request": "^2.88.2",
201 "request-ip": "^2.1.3",
202 "rxjs": "^6.5.5",
203 "sharp": "^0.29.3",
204 "showdown": "^1.9.1",
205 "simple-line-icons": "^2.5.5",
206 "socket.io": "^4.3.2",
207 "socket.io-client": "^4.3.2",
208 "socket.io-redis": "^5.4.0",
209 "socketio-auth": "^0.1.1",
210 "textract": "^2.5.0",
211 "ts-loader": "9.2.6",
212 "underscore": "^1.13.1",
213 "unique-random-array": "^2.0.0",
214 "url": "^0.11.0",
215 "util": "^0.12.4",
216 "uuid": "^8.3.2",
217 "winston": "^3.3.3",
218 "xlsx": "^0.17.4",
219 "xss-clean": "^0.1.1",
220 "zone.js": "~0.11.4",
221 "zxcvbn": "^4.4.2"
222 },
223 "devDependencies": {
224 "@angular-devkit/build-angular": "~13.0.3",
225 "@angular/cli": "^13.0.3",
226 "@types/express": "^4.17.13",
227 "@types/hammerjs": "^2.0.40",
228 "@types/mocha": "^9.0.0",
229 "@types/node": "^16.11.7",
230 "@types/underscore": "^1.11.3",
231 "husky": "^7.0.0",
232 "jasmine-core": "~3.10.1",
233 "jasmine-spec-reporter": "~7.0.0",
234 "jest": "^27.3.1",
235 "karma": "^6.3.9",
236 "karma-chrome-launcher": "~3.1.0",
237 "karma-coverage-istanbul-reporter": "^3.0.3",
238 "karma-jasmine": "~4.0.1",
239 "karma-jasmine-html-reporter": "^1.7.0",
240 "lint-staged": "^12.0.2",
241 "mocha": "^9.1.3",
242 "ng-diff-match-patch": "^3.0.1",
243 "nodemon": "^2.0.15",
244 "protractor": "^7.0.0",
245 "supertest": "^6.1.6",
246 "tslib": "^2.3.1",
247 "tslint": "^6.1.3",
248 "typescript": "4.4.3",
249 "webpack": "^5.64.1",
250 "webpack-cli": "^4.9.1"
251 }
252}
253
254
Any idea?
ANSWER
Answered 2022-Jan-22 at 05:29I just solved this issue by correcting the RxJS version to 7.4.0
(the package.json above still lists "rxjs": "^6.5.5"). I hope this can solve others' issues as well.
QUESTION
JavaScript: V8 question: are small integers pooled?
Asked 2022-Jan-17 at 12:37I was looking at this V8 design doc, which has a section on Constant Pool Entries.
It says:
Constant pools are used to store heap objects and small integers that are referenced as constants in generated bytecode.
and
... Small integers and the strong referenced oddball type’s have bytecodes to load them directly and do not go into the constant pool.
So I am confused: are small integers pooled or not?
My understanding is that it is not worth pooling small integers if sizeof(int) < sizeof(int *)
, because it is cheaper to just copy the actual integer than to copy a pointer that points to the integer in the constant pool. Also, variables that hold integers can be optimised to be stored directly in CPU registers and skip being allocated in memory first.
Also, if you take a heap snapshot using Chrome DevTools you cannot find smis in the snapshot - only heap numbers such as big integers or doubles like 3.14 appear on the heap. That was my understanding, until I saw this article: https://v8.dev/blog/pointer-compression#value-tagging-in-v8
JavaScript values in V8 are represented as objects and allocated on the V8 heap, no matter if they are objects, arrays, numbers or strings. This allows us to represent any value as a pointer to an object.
Now I am just baffled - are smis also allocated on the heap?
ANSWER
Answered 2022-Jan-17 at 12:37V8 developer here.
are small integers pooled or not?
They are not (at least not right now). That said, this is a small implementation detail and could be done either way: it would totally be possible to use the constant pool for Smis. I suppose the decision to build special machinery for Smis (instead of reusing the general-purpose constant pool) was made because things turned out to be more efficient that way.
it is not worth it pooling small integers if
sizeof(int) < sizeof(int *)
The details are different (a Smi is not an int
, and constant pool slots are referenced by index rather than C++ pointer), but this reasoning does go in the right direction: avoiding indirections can save time and memory.
are smis also allocated on the heap?
Yes, everything is allocated on the heap. The stack is only useful for temporary (and sufficiently small) things; that's largely unrelated to the type of thing.
The "trick" of Smis is that they're not stored as separate objects: when you have an object that refers to a Smi, such as let foo = {smi: 42}
, then the value 42
can be smi-encoded and stored directly inside the "foo" object (whereas if the value was 42.5
, then the object would store a pointer to a separate "HeapNumber"). But since the object is on the heap, so is the Smi.
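To make the smi-encoding idea concrete, here is a rough model of the tagging trick in Python (purely an illustration with a simplified bit layout, not V8's actual implementation):

# Simplified model: the lowest bit of a tagged word says what the word holds.
def tag_smi(value):
    return value << 1          # low bit 0: an immediate small integer

def tag_heap_pointer(address):
    return address | 1         # low bit 1: a pointer to a heap object

def is_smi(word):
    return (word & 1) == 0

def untag_smi(word):
    return word >> 1

word = tag_smi(42)
assert is_smi(word) and untag_smi(word) == 42

ptr = tag_heap_pointer(0x1000)   # some hypothetical, aligned object address
assert not is_smi(ptr)

In this model a field like foo.smi holds the tagged word directly, while a value like 42.5 would need a real HeapNumber object and the field would hold a tagged pointer to it.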
@DanielCruz
What I understand [...] is that constant small integers are pooled. Variable small integers are not.
Nope. Any literal that occurs in source code is "constant". Whether you use let
or const
for your variables has nothing to do with this.
QUESTION
Paramiko authentication fails with "Agreed upon 'rsa-sha2-512' pubkey algorithm" (and "unsupported public key algorithm: rsa-sha2-512" in sshd log)
Asked 2022-Jan-13 at 14:49I have a Python 3 application running on CentOS Linux 7.7 executing SSH commands against remote hosts. It works properly but today I encountered an odd error executing a command against a "new" remote server (server based on RHEL 6.10):
encountered RSA key, expected OPENSSH key
Executing the same command from the system shell (using the same private key of course) works perfectly fine.
On the remote server I discovered in /var/log/secure
that when the SSH connection and commands are issued from the source server with Python (using Paramiko), sshd complains about an unsupported public key algorithm:
userauth_pubkey: unsupported public key algorithm: rsa-sha2-512
Note that target servers with newer RHEL/CentOS releases such as 7.x don't encounter the issue.
It seems like Paramiko picks/offers the wrong algorithm when negotiating with the remote server, whereas the SSH shell client performs the negotiation properly against this "old" target server. How can I get the Python program to work as expected?
Python code
1import paramiko
2import logging
3
4ssh_user = "my_user"
5ssh_keypath = "/path/to/.ssh/my_key.rsa"
6server = "server.tld"
7
8ssh_client = paramiko.SSHClient()
9ssh_client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
10ssh_client.connect(server,port=22,username=ssh_user, key_filename=ssh_keypath)
11
12# SSH command
13cmd = "echo TEST : $(hostname)"
14
15stdin, stdout, stderr = ssh_client.exec_command(cmd, get_pty=True)
16exit_code = stdout.channel.recv_exit_status()
17
18cmd_raw_output = stdout.readlines()
19out = "".join(cmd_raw_output)
20out_msg = out.strip()
21
22# Output (logger code omitted)
23logger.debug(out_msg)
24
25if ssh_client is not None:
26 ssh_client.close()
27
Shell command equivalent
27ssh -i /path/to/.ssh/my_key.rsa my_user@server.tld "echo TEST : $(hostname)"
28
Paramiko logs (DEBUG)
28DEB [YYYYmmdd-HH:MM:30.475] thr=1 paramiko.transport: starting thread (client mode): 0xf6054ac8
29DEB [YYYYmmdd-HH:MM:30.476] thr=1 paramiko.transport: Local version/idstring: SSH-2.0-paramiko_2.9.1
30DEB [YYYYmmdd-HH:MM:30.490] thr=1 paramiko.transport: Remote version/idstring: SSH-2.0-OpenSSH_5.3
31INF [YYYYmmdd-HH:MM:30.490] thr=1 paramiko.transport: Connected (version 2.0, client OpenSSH_5.3)
32DEB [YYYYmmdd-HH:MM:30.498] thr=1 paramiko.transport: === Key exchange possibilities ===
33DEB [YYYYmmdd-HH:MM:30.498] thr=1 paramiko.transport: kex algos: diffie-hellman-group-exchange-sha256, diffie-hellman-group-exchange-sha1, diffie-hellman-group14-sha1, diffie-hellman-group1-sha1
34DEB [YYYYmmdd-HH:MM:30.498] thr=1 paramiko.transport: server key: ssh-rsa, ssh-dss
35DEB [YYYYmmdd-HH:MM:30.498] thr=1 paramiko.transport: client encrypt: aes128-ctr, aes192-ctr, aes256-ctr, arcfour256, arcfour128, aes128-cbc, 3des-cbc, blowfish-cbc, cast128-cbc, aes192-cbc, aes256-cbc, arcfour, rijndael-cbc@lysator.liu.se
36DEB [YYYYmmdd-HH:MM:30.498] thr=1 paramiko.transport: server encrypt: aes128-ctr, aes192-ctr, aes256-ctr, arcfour256, arcfour128, aes128-cbc, 3des-cbc, blowfish-cbc, cast128-cbc, aes192-cbc, aes256-cbc, arcfour, rijndael-cbc@lysator.liu.se
37DEB [YYYYmmdd-HH:MM:30.499] thr=1 paramiko.transport: client mac: hmac-md5, hmac-sha1, umac-64@openssh.com, hmac-sha2-256, hmac-sha2-512, hmac-ripemd160, hmac-ripemd160@openssh.com, hmac-sha1-96, hmac-md5-96
38DEB [YYYYmmdd-HH:MM:30.499] thr=1 paramiko.transport: server mac: hmac-md5, hmac-sha1, umac-64@openssh.com, hmac-sha2-256, hmac-sha2-512, hmac-ripemd160, hmac-ripemd160@openssh.com, hmac-sha1-96, hmac-md5-96
39DEB [YYYYmmdd-HH:MM:30.499] thr=1 paramiko.transport: client compress: none, zlib@openssh.com
40DEB [YYYYmmdd-HH:MM:30.499] thr=1 paramiko.transport: server compress: none, zlib@openssh.com
41DEB [YYYYmmdd-HH:MM:30.499] thr=1 paramiko.transport: client lang: <none>
42DEB [YYYYmmdd-HH:MM:30.499] thr=1 paramiko.transport: server lang: <none>.
43DEB [YYYYmmdd-HH:MM:30.499] thr=1 paramiko.transport: kex follows: False
44DEB [YYYYmmdd-HH:MM:30.500] thr=1 paramiko.transport: === Key exchange agreements ===
45DEB [YYYYmmdd-HH:MM:30.500] thr=1 paramiko.transport: Kex: diffie-hellman-group-exchange-sha256
46DEB [YYYYmmdd-HH:MM:30.500] thr=1 paramiko.transport: HostKey: ssh-rsa
47DEB [YYYYmmdd-HH:MM:30.500] thr=1 paramiko.transport: Cipher: aes128-ctr
48DEB [YYYYmmdd-HH:MM:30.500] thr=1 paramiko.transport: MAC: hmac-sha2-256
49DEB [YYYYmmdd-HH:MM:30.501] thr=1 paramiko.transport: Compression: none
50DEB [YYYYmmdd-HH:MM:30.501] thr=1 paramiko.transport: === End of kex handshake ===
51DEB [YYYYmmdd-HH:MM:30.548] thr=1 paramiko.transport: Got server p (2048 bits)
52DEB [YYYYmmdd-HH:MM:30.666] thr=1 paramiko.transport: kex engine KexGexSHA256 specified hash_algo <built-in function openssl_sha256>
53DEB [YYYYmmdd-HH:MM:30.667] thr=1 paramiko.transport: Switch to new keys ...
54DEB [YYYYmmdd-HH:MM:30.669] thr=2 paramiko.transport: Adding ssh-rsa host key for server.tld: b'caea********************.'
55DEB [YYYYmmdd-HH:MM:30.674] thr=2 paramiko.transport: Trying discovered key b'b49c********************' in /path/to/.ssh/my_key.rsa
56DEB [YYYYmmdd-HH:MM:30.722] thr=1 paramiko.transport: userauth is OK
57DEB [YYYYmmdd-HH:MM:30.722] thr=1 paramiko.transport: Finalizing pubkey algorithm for key of type 'ssh-rsa'
58DEB [YYYYmmdd-HH:MM:30.722] thr=1 paramiko.transport: Our pubkey algorithm list: ['rsa-sha2-512', 'rsa-sha2-256', 'ssh-rsa']
59DEB [YYYYmmdd-HH:MM:30.723] thr=1 paramiko.transport: Server-side algorithm list: ['']
60DEB [YYYYmmdd-HH:MM:30.723] thr=1 paramiko.transport: Agreed upon 'rsa-sha2-512' pubkey algorithm
61INF [YYYYmmdd-HH:MM:30.735] thr=1 paramiko.transport: Authentication (publickey) failed.
62DEB [YYYYmmdd-HH:MM:30.739] thr=2 paramiko.transport: Trying SSH agent key b'9d37********************'
63DEB [YYYYmmdd-HH:MM:30.747] thr=1 paramiko.transport: userauth is OK.
64DEB [YYYYmmdd-HH:MM:30.748] thr=1 paramiko.transport: Finalizing pubkey algorithm for key of type 'ssh-rsa'
65DEB [YYYYmmdd-HH:MM:30.748] thr=1 paramiko.transport: Our pubkey algorithm list: ['rsa-sha2-512', 'rsa-sha2-256', 'ssh-rsa']
66DEB [YYYYmmdd-HH:MM:30.748] thr=1 paramiko.transport: Server-side algorithm list: ['']
67DEB [YYYYmmdd-HH:MM:30.748] thr=1 paramiko.transport: Agreed upon 'rsa-sha2-512' pubkey algorithm
68INF [YYYYmmdd-HH:MM:30.868] thr=1 paramiko.transport: Authentication (publickey) failed...
69
Shell command logs
69OpenSSH_7.4p1, OpenSSL 1.0.2k-fips 26 Jan 2017
70debug1: Reading configuration data /etc/ssh/ssh_config
71debug1: /etc/ssh/ssh_config line 58: Applying options for *
72debug2: resolving "server.tld" port 22
73debug2: ssh_connect_direct: needpriv 0
74debug1: Connecting to server.tld [server.tld] port 22.
75debug1: Connection established.
76debug1: permanently_set_uid: 0/0
77debug1: key_load_public: No such file or directory
78debug1: identity file /path/to/.ssh/my_key.rsa type -1
79debug1: key_load_public: No such file or directory
80debug1: identity file /path/to/.ssh/my_key.rsa-cert type -1
81debug1: Enabling compatibility mode for protocol 2.0
82debug1: Local version string SSH-2.0-OpenSSH_7.4
83debug1: Remote protocol version 2.0, remote software version OpenSSH_5.3
84debug1: match: OpenSSH_5.3 pat OpenSSH_5* compat 0x0c000000
85debug2: fd 3 setting O_NONBLOCK
86debug1: Authenticating to server.tld:22 as 'my_user'
87debug3: hostkeys_foreach: reading file "/path/to/.ssh/known_hosts"
88debug3: record_hostkey: found key type RSA in file /path/to/.ssh/known_hosts:82
89debug3: load_hostkeys: loaded 1 keys from server.tld
90debug3: order_hostkeyalgs: prefer hostkeyalgs: ssh-rsa-cert-v01@openssh.com,rsa-sha2-512,rsa-sha2-256,ssh-rsa
91debug3: send packet: type 20
92debug1: SSH2_MSG_KEXINIT sent
93debug3: receive packet: type 20
94debug1: SSH2_MSG_KEXINIT received
95debug2: local client KEXINIT proposal
96debug2: KEX algorithms: curve25519-sha256,curve25519-sha256@libssh.org,ecdh-sha2-nistp256,ecdh-sha2-nistp384,ecdh-sha2-nistp521,diffie-hellman-group-exchange-sha256,diffie-hellman-group16-sha512,diffie-hellman-group18-sha512,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha256,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1,ext-info-c
97debug2: host key algorithms: ssh-rsa-cert-v01@openssh.com,rsa-sha2-512,rsa-sha2-256,ssh-rsa,ecdsa-sha2-nistp256-cert-v01@openssh.com,ecdsa-sha2-nistp384-cert-v01@openssh.com,ecdsa-sha2-nistp521-cert-v01@openssh.com,ssh-ed25519-cert-v01@openssh.com,ssh-dss-cert-v01@openssh.com,ecdsa-sha2-nistp256,ecdsa-sha2-nistp384,ecdsa-sha2-nistp521,ssh-ed25519,ssh-dss
98debug2: ciphers ctos: chacha20-poly1305@openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes128-gcm@openssh.com,aes256-gcm@openssh.com,aes128-cbc,aes192-cbc,aes256-cbc
99debug2: ciphers stoc: chacha20-poly1305@openssh.com,aes128-ctr,aes192-ctr,aes256-ctr,aes128-gcm@openssh.com,aes256-gcm@openssh.com,aes128-cbc,aes192-cbc,aes256-cbc
100debug2: MACs ctos: umac-64-etm@openssh.com,umac-128-etm@openssh.com,hmac-sha2-256-etm@openssh.com,hmac-sha2-512-etm@openssh.com,hmac-sha1-etm@openssh.com,umac-64@openssh.com,umac-128@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1
101debug2: MACs stoc: umac-64-etm@openssh.com,umac-128-etm@openssh.com,hmac-sha2-256-etm@openssh.com,hmac-sha2-512-etm@openssh.com,hmac-sha1-etm@openssh.com,umac-64@openssh.com,umac-128@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-sha1
102debug2: compression ctos: none,zlib@openssh.com,zlib
103debug2: compression stoc: none,zlib@openssh.com,zlib
104debug2: languages ctos:
105debug2: languages stoc:
106debug2: first_kex_follows 0
107debug2: reserved 0
108debug2: peer server KEXINIT proposal
109debug2: KEX algorithms: diffie-hellman-group-exchange-sha256,diffie-hellman-group-exchange-sha1,diffie-hellman-group14-sha1,diffie-hellman-group1-sha1
110debug2: host key algorithms: ssh-rsa,ssh-dss
111debug2: ciphers ctos: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se
112debug2: ciphers stoc: aes128-ctr,aes192-ctr,aes256-ctr,arcfour256,arcfour128,aes128-cbc,3des-cbc,blowfish-cbc,cast128-cbc,aes192-cbc,aes256-cbc,arcfour,rijndael-cbc@lysator.liu.se
113debug2: MACs ctos: hmac-md5,hmac-sha1,umac-64@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-ripemd160,hmac-ripemd160@openssh.com,hmac-sha1-96,hmac-md5-96
114debug2: MACs stoc: hmac-md5,hmac-sha1,umac-64@openssh.com,hmac-sha2-256,hmac-sha2-512,hmac-ripemd160,hmac-ripemd160@openssh.com,hmac-sha1-96,hmac-md5-96
115debug2: compression ctos: none,zlib@openssh.com
116debug2: compression stoc: none,zlib@openssh.com
117debug2: languages ctos:
118debug2: languages stoc:
119debug2: first_kex_follows 0
120debug2: reserved 0
121debug1: kex: algorithm: diffie-hellman-group-exchange-sha256
122debug1: kex: host key algorithm: ssh-rsa
123debug1: kex: server->client cipher: aes128-ctr MAC: umac-64@openssh.com compression: none
124debug1: kex: client->server cipher: aes128-ctr MAC: umac-64@openssh.com compression: none
125debug1: kex: diffie-hellman-group-exchange-sha256 need=16 dh_need=16
126debug1: kex: diffie-hellman-group-exchange-sha256 need=16 dh_need=16
127debug3: send packet: type 34
128debug1: SSH2_MSG_KEX_DH_GEX_REQUEST(1024<3072<8192) sent
129debug3: receive packet: type 31
130debug1: got SSH2_MSG_KEX_DH_GEX_GROUP
131debug2: bits set: 1502/3072
132debug3: send packet: type 32
133debug1: SSH2_MSG_KEX_DH_GEX_INIT sent
134debug3: receive packet: type 33
135debug1: got SSH2_MSG_KEX_DH_GEX_REPLY
136debug1: Server host key: ssh-.:**************************************************
137debug3: hostkeys_foreach: reading file "/path/to/.ssh/known_hosts"
138debug3: record_hostkey: found key type RSA in file /path/to/.ssh/known_hosts:8..2
139debug3: load_hostkeys: loaded 1 keys from server.tld
140debug1: Host 'server.tld' is known and matches the RSA host key.
141debug1: Found key in /path/to/.ssh/known_hosts:82
142debug2: bits set: 1562/3072
143debug3: send packet: type 21
144debug2: set_newkeys: mode 1
145debug1: rekey after 4294967296 blocks
146debug1: SSH2_MSG_NEWKEYS sent
147debug1: expecting SSH2_MSG_NEWKEYS
148debug3: receive packet: type 21
149debug1: SSH2_MSG_NEWKEYS received
150debug2: set_newkeys: mode 0
151debug1: rekey after 4294967296 blocks
152debug2: key: <foo> (0x55bcf6d1d320), agent
153debug2: key: /path/to/.ssh/my_key.rsa ((nil)), explicit
154debug3: send packet: type 5
155debug3: receive packet: type 6
156debug2: service_accept: ssh-userauth
157debug1: SSH2_MSG_SERVICE_ACCEPT received
158debug3: send packet: type 50
159debug3: receive packet: type 51
160debug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic,password
161debug3: start over, passed a different list publickey,gssapi-keyex,gssapi-with-mic,password
162debug3: preferred gssapi-keyex,gssapi-with-mic,publickey,keyboard-interactive,password
163debug3: authmethod_lookup gssapi-keyex
164debug3: remaining preferred: gssapi-with-mic,publickey,keyboard-interactive,password
165debug3: authmethod_is_enabled gssapi-keyex
166debug1: Next authentication method: gssapi-keyex
167debug1: No valid Key exchange context
168debug2: we did not send a packet, disable method
169debug3: authmethod_lookup gssapi-with-mic
170debug3: remaining preferred: publickey,keyboard-interactive,password
171debug3: authmethod_is_enabled gssapi-with-mic
172debug1: Next authentication method: gssapi-with-mic
173debug1: Unspecified GSS failure. Minor code may provide more information
174No Kerberos credentials available (default cache: KEYRING:persistent:0)
175
176debug1: Unspecified GSS failure. Minor code may provide more information
177No Kerberos credentials available (default cache: KEYRING:persistent:0)
178
179debug2: we did not send a packet, disable method
180debug3: authmethod_lookup publickey
181debug3: remaining preferred: keyboard-interactive,password
182debug3: authmethod_is_enabled publickey
183debug1: Next authentication method: publickey
184debug1: Offering RSA public key: <foo>
185debug3: send_pubkey_test
186debug3: send packet: type 50
187debug2: we sent a publickey packet, wait for reply
188debug3: receive packet: type 51
189debug1: Authentications that can continue: publickey,gssapi-keyex,gssapi-with-mic,password
190debug1: Trying private key: /path/to/.ssh/my_key.rsa
191debug3: sign_and_send_pubkey: RSA SHA256:**********************************
192debug3: send packet: type 50
193debug2: we sent a publickey packet, wait for reply
194debug3: receive packet: type 52
195debug1: Authentication succeeded (publickey).
196Authenticated to server.tld ([server.tld]:22).
197debug1: channel 0: new [client-session]
198debug3: ssh_session2_open: channel_new: 0
199debug2: channel 0: send open
200debug3: send packet: type 90
201debug1: Requesting no-more-sessions@openssh.com
202debug3: send packet: type 80
203debug1: Entering interactive session.
204debug1: pledge: network
205debug3: receive packet: type 91
206debug2: callback start
207debug2: fd 3 setting TCP_NODELAY
208debug3: ssh_packet_set_tos: set IP_TOS 0x08
209debug2: client_session2_setup: id 0
210debug1: Sending environment.
211debug3: Ignored env XDG_SESSION_ID
212debug3: Ignored env HOSTNAME
213debug3: Ignored env SELINUX_ROLE_REQUESTED
214debug3: Ignored env TERM
215debug3: Ignored env SHELL
216debug3: Ignored env HISTSIZE
217debug3: Ignored env SSH_CLIENT
218debug3: Ignored env SELINUX_USE_CURRENT_RANGE
219debug3: Ignored env SSH_TTY
220debug3: Ignored env CDPATH
221debug3: Ignored env USER
222debug3: Ignored env LS_COLORS
223debug3: Ignored env SSH_AUTH_SOCK
224debug3: Ignored env MAIL
225debug3: Ignored env PATH
226debug3: Ignored env PWD
227debug1: Sending env LANG = xx_XX.UTF-8
228debug2: channel 0: request env confirm 0
229debug3: send packet: type 98
230debug3: Ignored env SELINUX_LEVEL_REQUESTED
231debug3: Ignored env HISTCONTROL
232debug3: Ignored env SHLVL
233debug3: Ignored env HOME
234debug3: Ignored env LOGNAME
235debug3: Ignored env SSH_CONNECTION
236debug3: Ignored env LESSOPEN
237debug3: Ignored env XDG_RUNTIME_DIR
238debug3: Ignored env _
239debug1: Sending command: echo TEST : $(hostname)
240debug2: channel 0: request exec confirm 1
241debug3: send packet: type 98
242debug2: callback done
243debug2: channel 0: open confirm rwindow 0 rmax 32768
244debug2: channel 0: rcvd adjust 2097152
245debug3: receive packet: type 99
246debug2: channel_input_status_confirm: type 99 id 0
247debug2: exec request accepted on channel 0
248TEST : server.tld
249debug3: receive packet: type 96
250debug2: channel 0: rcvd eof
251debug2: channel 0: output open -> drain
252debug2: channel 0: obuf empty
253debug2: channel 0: close_write
254debug2: channel 0: output drain -> closed
255debug3: receive packet: type 98
256debug1: client_input_channel_req: channel 0 rtype exit-status reply 0
257debug3: receive packet: type 98
258debug1: client_input_channel_req: channel 0 rtype eow@openssh.com reply 0
259debug2: channel 0: rcvd eow
260debug2: channel 0: close_read
261debug2: channel 0: input open -> closed
262debug3: receive packet: type 97
263debug2: channel 0: rcvd close
264debug3: channel 0: will not send data after close
265debug2: channel 0: almost dead
266debug2: channel 0: gc: notify user
267debug2: channel 0: gc: user detached
268debug2: channel 0: send close
269debug3: send packet: type 97
270debug2: channel 0: is dead
271debug2: channel 0: garbage collecting
272debug1: channel 0: free: client-session, nchannels 1
273debug3: channel 0: status: The following connections are open:
274 #0 client-session (t4 r0 i3/0 o3/0 fd -1/-1 cc -1)
275
276debug3: send packet: type 1
277Transferred: sent 3264, received 2656 bytes, in 0.0 seconds.
278Bytes per second: sent 92349.8, received 75147.4
279debug1: Exit status 0
280.
281
ANSWER
Answered 2022-Jan-13 at 14:49Imo, it's a bug in Paramiko. It does not correctly handle the absence of the server-sig-algs
extension on the server side.
Try disabling rsa-sha2-*
on the Paramiko side altogether:
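A sketch of what that looks like, reusing the variables from the question's snippet (this assumes Paramiko >= 2.9 as shown in the log above; the relevant knob is the disabled_algorithms parameter of SSHClient.connect):

ssh_client.connect(
    server,
    port=22,
    username=ssh_user,
    key_filename=ssh_keypath,
    # drop the SHA-2 RSA signature variants so only plain ssh-rsa is offered
    disabled_algorithms=dict(pubkeys=["rsa-sha2-512", "rsa-sha2-256"]),
)

With the rsa-sha2-* entries removed from the offer, the client should fall back to plain ssh-rsa, which the OpenSSH 5.3 server in the logs above does understand.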
236debug3: Ignored env LESSOPEN
237debug3: Ignored env XDG_RUNTIME_DIR
238debug3: Ignored env _
239debug1: Sending command: echo TEST : $(hostname)
240debug2: channel 0: request exec confirm 1
241debug3: send packet: type 98
242debug2: callback done
243debug2: channel 0: open confirm rwindow 0 rmax 32768
244debug2: channel 0: rcvd adjust 2097152
245debug3: receive packet: type 99
246debug2: channel_input_status_confirm: type 99 id 0
247debug2: exec request accepted on channel 0
248TEST : server.tld
249debug3: receive packet: type 96
250debug2: channel 0: rcvd eof
251debug2: channel 0: output open -> drain
252debug2: channel 0: obuf empty
253debug2: channel 0: close_write
254debug2: channel 0: output drain -> closed
255debug3: receive packet: type 98
256debug1: client_input_channel_req: channel 0 rtype exit-status reply 0
257debug3: receive packet: type 98
258debug1: client_input_channel_req: channel 0 rtype eow@openssh.com reply 0
259debug2: channel 0: rcvd eow
260debug2: channel 0: close_read
261debug2: channel 0: input open -> closed
262debug3: receive packet: type 97
263debug2: channel 0: rcvd close
264debug3: channel 0: will not send data after close
265debug2: channel 0: almost dead
266debug2: channel 0: gc: notify user
267debug2: channel 0: gc: user detached
268debug2: channel 0: send close
269debug3: send packet: type 97
270debug2: channel 0: is dead
271debug2: channel 0: garbage collecting
272debug1: channel 0: free: client-session, nchannels 1
273debug3: channel 0: status: The following connections are open:
274 #0 client-session (t4 r0 i3/0 o3/0 fd -1/-1 cc -1)
275
276debug3: send packet: type 1
277Transferred: sent 3264, received 2656 bytes, in 0.0 seconds.
278Bytes per second: sent 92349.8, received 75147.4
279debug1: Exit status 0
280.
ssh_client.connect(
    server, username=ssh_user, key_filename=ssh_keypath,
    disabled_algorithms=dict(pubkeys=["rsa-sha2-512", "rsa-sha2-256"]))
(Note that there's no need to specify port=22, as that's the default.)
I've found a related Paramiko issue:
RSA key auth failing from paramiko 2.9.x client to dropbear server
Though it refers to the Paramiko 2.9.0 changelog, which seems to imply that the behavior is deliberate:
When the server does not send server-sig-algs, Paramiko will attempt the first algorithm in the above list. Clients connecting to legacy servers should thus use disabled_algorithms to turn off SHA2.
Since 2.9.2, Paramiko will say:
DEB [20220113-14:46:13.882] thr=1 paramiko.transport: Server did not send a server-sig-algs list; defaulting to our first preferred algo ('rsa-sha2-512')
DEB [20220113-14:46:13.882] thr=1 paramiko.transport: NOTE: you may use the 'disabled_algorithms' SSHClient/Transport init kwarg to disable that or other algorithms if your server does not support them!
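If you want to see those messages from your own script, Paramiko's transport logging can be written to a file. A minimal sketch, assuming a local log file is acceptable; the filename paramiko.log is arbitrary and not something from the question:

import logging
import paramiko

# Write Paramiko's transport-level debug output (including the
# "Server did not send a server-sig-algs list" message quoted above)
# to a local file.
paramiko.util.log_to_file("paramiko.log", level=logging.DEBUG)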
Obligatory warning: Do not use AutoAddPolicy – you are losing protection against MITM attacks by doing so. For a correct solution, see Paramiko "Unknown Server".
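For illustration only, a minimal sketch of connecting with host-key verification instead of AutoAddPolicy, combined with the disabled_algorithms workaround. The host name and key path are taken from the question's log output; the user name is a placeholder:

import paramiko

client = paramiko.SSHClient()
# Trust the host keys already recorded by OpenSSH in ~/.ssh/known_hosts
# instead of blindly accepting whatever key the server presents.
client.load_system_host_keys()
# Refuse to connect to hosts whose key is not known (this is also the default policy).
client.set_missing_host_key_policy(paramiko.RejectPolicy())

client.connect(
    "server.tld",                             # host name as recorded in known_hosts
    username="user",                          # placeholder user name
    key_filename="/path/to/.ssh/my_key.rsa",  # key path as in the question's log
    disabled_algorithms=dict(pubkeys=["rsa-sha2-512", "rsa-sha2-256"]))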
Your code for waiting for the command to complete and reading its output is flawed too. See Wait to finish command executed with Python Paramiko. And for most purposes, get_pty=True is not a good idea either.
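As a hedged illustration of the usual pattern (not the asker's original code): read the command's output to the end, which also waits for the command to finish, then fetch its exit status. Here ssh_client is assumed to be the already-connected client from the snippet above, and the command is the one from the question's log:

# Run the command without get_pty=True.
stdin, stdout, stderr = ssh_client.exec_command("echo TEST : $(hostname)")

output = stdout.read().decode()   # reading to EOF also waits for the command to finish
errors = stderr.read().decode()
exit_status = stdout.channel.recv_exit_status()  # remote exit code, available after EOF

print(output, end="")
if exit_status != 0:
    raise RuntimeError(f"Command failed with status {exit_status}: {errors}")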
Community Discussions contain sources that include Stack Exchange Network