lz4 | LZ4 compression and decompression in pure Go | Compression library
lz4 Examples and Code Snippets
// 1. Create config object
Config config = new Config();
config.useClusterServers()
      // use "rediss://" for SSL connection
      .addNodeAddress("redis://127.0.0.1:7181");
// or read config from file
config = Config.fromYAML(new File("config-file.yaml"));

// 2. Create Redisson instance
// Sync and Async API
RedissonClient redisson = Redisson.create(config);
// Reactive API
RedissonReactiveClient redissonReactive = redisson.reactive();
// RxJava3 API
RedissonRxClient redissonRx = redisson.rxJava();

// 3. Get Redis based implementation of java.util.concurrent.ConcurrentMap
RMap map = redisson.getMap("myMap");
RMapReactive mapReactive = redissonReactive.getMap("myMap");
RMapRx mapRx = redissonRx.getMap("myMap");

// 4. Get Redis based implementation of java.util.concurrent.locks.Lock
RLock lock = redisson.getLock("myLock");
RLockReactive lockReactive = redissonReactive.getLock("myLock");
RLockRx lockRx = redissonRx.getLock("myLock");

// 5. Get Redis based implementation of java.util.concurrent.ExecutorService
RExecutorService executor = redisson.getExecutorService("myExecutorService");
// over 50 Redis based Java objects and services ...
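Note that the snippets above are Redisson usage examples rather than LZ4 ones. For LZ4 itself, the essential operation is a compress/decompress round trip; the following is a minimal sketch of one using the python-lz4 package's lz4.frame module (an illustrative assumption: this page's library is a Go implementation, but the LZ4 frame format is the same across implementations):

import lz4.frame  # pip install lz4 (assumed for this sketch)

# Compress a payload into a single LZ4 frame.
original = b"repetitive data " * 1024
compressed = lz4.frame.compress(original)

# Decompress the frame back to the original bytes.
restored = lz4.frame.decompress(compressed)

assert restored == original
print(len(original), "bytes ->", len(compressed), "bytes")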
Trending Discussions on lz4
QUESTION
What is the difference between Arrow IPC and Feather?
The official documentation says:
Version 2 (V2), the default version, which is exactly represented as the Arrow IPC file format on disk. V2 files support storing all Arrow data types as well as compression with LZ4 or ZSTD. V2 was first made available in Apache Arrow 0.17.0.
While vaex, a pandas alternative, has two different functions, one for Arrow IPC and one for Feather, polars, another pandas alternative, indicates that Arrow IPC and Feather are the same.
ANSWER
Answered 2021-Jun-09 at 20:18

TL;DR: There is no difference between the Arrow IPC file format and Feather V2.
There's some confusion because of the two versions of Feather, and because of the Arrow IPC file format vs the Arrow IPC stream format.
For the two versions of Feather, see the FAQ entry:
What about the “Feather” file format?
The Feather v1 format was a simplified custom container for writing a subset of the Arrow format to disk prior to the development of the Arrow IPC file format. “Feather version 2” is now exactly the Arrow IPC file format and we have retained the “Feather” name and APIs for backwards compatibility.
So IPC == Feather (V2). Some places use "Feather" to mean Feather (V1), which is different from the IPC file format. However, that doesn't seem to be the issue here: Polars and Vaex appear to use "Feather" to mean Feather (V2) (though Vaex slightly misleadingly says "Feather is exactly represented as the Arrow IPC file format on disk, but also supports compression").
Vaex exposes both export_arrow and export_feather. This relates to another aspect of Arrow: it defines both an IPC stream format and an IPC file format. They differ in that the file format has a magic string (for file identification) and a footer (to support random access reads) (see the documentation).

export_feather always writes the IPC file format (== Feather V2), while export_arrow lets you choose between the IPC file format and the IPC stream format. Looking at where export_feather was added, I think the confusion might stem from the PyArrow APIs making it obvious how to enable compression with the Feather API methods (which are a user-friendly convenience) but not with the IPC file writer (which is what export_arrow uses). But ultimately, the format being written is the same.
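As a minimal sketch of that point (assuming a recent pyarrow is installed; the file names are arbitrary): the Feather API takes compression as a plain keyword, while the IPC file writer takes it through IpcWriteOptions, yet both produce the same on-disk format.

import pyarrow as pa
import pyarrow.feather as feather
import pyarrow.ipc as ipc

table = pa.table({"x": [1, 2, 3]})

# Feather API: compression is a simple keyword argument.
feather.write_feather(table, "data.feather", compression="zstd")

# IPC file writer: the same on-disk format, but compression is passed via options.
opts = ipc.IpcWriteOptions(compression="zstd")
writer = ipc.new_file("data.arrow", table.schema, options=opts)
writer.write_table(table)
writer.close()

# Because Feather V2 == the IPC file format, either reader opens either file.
print(feather.read_table("data.arrow"))
print(ipc.open_file("data.feather").read_all())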
QUESTION
I am getting this error while building Docker images on macOS Big Sur with an M1 chip.
What I've tried: installed Docker for Apple silicon (M1) from the Docker site.
It fails while trying to install RocksDB from the Dockerfile:
# docker.local
FROM golang:1.12.4-alpine3.9
RUN apk add bash build-base grep git
# Install RocksDB
RUN apk add coreutils linux-headers perl zlib-dev bzip2-dev lz4-dev snappy-dev zstd-libs zstd-dev && \
cd /tmp && \
wget -O - https://github.com/facebook/rocksdb/archive/v5.18.3.tar.gz | tar xz && \
cd /tmp/rocksdb* && \
make -j $(nproc) install-shared OPT=-g0 USE_RTTI=1 && \
rm -R /tmp/rocksdb* && \
apk del coreutils linux-headers perl
Errors:
#6 9.903 cc1plus: error: unknown value 'armv8-a-march=armv8-a' for -march
#6 9.903 cc1plus: note: valid arguments are: armv8-a armv8.1-a armv8.2-a armv8.3-a armv8.4-a native
#6 9.906 cc1plus: error: unknown value 'armv8-a-march=armv8-a' for -march
#6 9.906 cc1plus: note: valid arguments are: armv8-a armv8.1-a armv8.2-a armv8.3-a armv8.4-a native
#6 9.907 install -d /usr/local/lib
#6 9.908 CC shared-objects/cache/clock_cache.o
#6 9.908 CC shared-objects/cache/lru_cache.o
#6 9.909 CC shared-objects/cache/sharded_cache.o
#6 9.909 for header_dir in `find "include/rocksdb" -type d`; do \
#6 9.909 install -d /usr/local/$header_dir; \
#6 9.909 done
#6 9.911 cc1plus: error: unknown value 'armv8-a-march=armv8-a' for -march
#6 9.911 cc1plus: note: valid arguments are: armv8-a armv8.1-a armv8.2-a armv8.3-a armv8.4-a native
#6 9.912 make: *** [Makefile:684: shared-objects/cache/clock_cache.o] Error 1
#6 9.912 make: *** Waiting for unfinished jobs....
#6 9.912 make: *** [Makefile:684: shared-objects/cache/lru_cache.o] Error 1
#6 9.913 make: *** [Makefile:684: shared-objects/cache/sharded_cache.o] Error 1
#6 9.914 for header in `find "include/rocksdb" -type f -name *.h`; do \
#6 9.914 install -C -m 644 $header /usr/local/$header; \
#6 9.914 done
ANSWER
Answered 2021-May-31 at 17:35

There are a couple of issues to address. The Dockerfile as you have it will download an ARM base golang image and try to use that to build. That's fine, as long as the required libs know how to build on an ARM architecture. If they don't (as seems to be the case here), you may want to try building under an AMD64 image of golang.
Intel/AMD containers will run under ARM Docker on an M1. There are a few ways to build AMD64 containers on an M1. You can use buildkit and then run: docker buildx build --platform linux/amd64 .
Or, you can set the arch of the source image by modifying the Dockerfile to include something like:
FROM --platform=linux/amd64 golang:1.12.4-alpine3.9
which would use the amd64 arch of the golang image (assuming one exists). This is what I often use to build an image on ARM, and it works even if Docker is running natively on ARM.
QUESTION
I am learning C programming from "Learn C the Hard Way" by Zed Shaw. He asks the learner to try to break their own code.
So I tried the following C code, thinking that printing more values than I passed in argv would break it, but it did not until much later.
#include <stdio.h>

int main(int argc, char *argv[])
{
    int i = 0;
    printf("This is argc: %d\n", argc);
    printf("This is argv[argc]: %s\n", argv[argc]);
    printf("This is argv[0]: %s\n", argv[0]);
    for (i = argc; i < 100; i++)
        printf("This is argv[%d]: %s\n", i, argv[i]);
    for (i = 1; i < argc; i++)  /* this loop was truncated in the post; body reconstructed */
        printf("This is argv[%d]: %s\n", i, argv[i]);
    return 0;
}
When I try to print argv up to 100, I see the following, where I was expecting some kind of out-of-bounds error or segmentation fault:
./exp10_so These are cmd args
This is argc: 5
This is argv[argc]: (null)
This is argv[0]: ./exp10_so
This is argv[5]: (null)
This is argv[6]: TERMINATOR_DBUS_NAME=net.tenshu.Terminator21a9d5db22c73a993ff0b42f64b396873
This is argv[7]: GTK_RC_FILES=/etc/gtk/gtkrc:/home/ab/.gtkrc:/home/ab/.config/gtkrc
This is argv[8]: _=/home/ab/Projects/learn_c_the_hard_way/./exp10_so
This is argv[9]: LANG=en_IN
This is argv[10]: GTK3_MODULES=xapp-gtk3-module
This is argv[11]: XDG_CURRENT_DESKTOP=KDE
This is argv[12]: QT_LINUX_ACCESSIBILITY_ALWAYS_ON=1
This is argv[13]: LC_IDENTIFICATION=en_IN
This is argv[14]: XCURSOR_THEME=breeze_cursors
This is argv[15]: XDG_SESSION_CLASS=user
This is argv[16]: XDG_SESSION_TYPE=x11
This is argv[17]: SHLVL=1
This is argv[18]: TERMINATOR_UUID=urn:uuid:4496f24b-8a64-43af-ab5a-03fc7e722242
This is argv[19]: DESKTOP_SESSION=plasma
This is argv[20]: LC_MEASUREMENT=en_IN
This is argv[21]: OLDPWD=/home/ab/Projects
This is argv[22]: HOME=/home/ab
This is argv[23]: KDE_SESSION_VERSION=5
This is argv[24]: USER=ab
This is argv[25]: TERMINATOR_DBUS_PATH=/net/tenshu/Terminator2
This is argv[26]: SESSION_MANAGER=local/tgh:@/tmp/.ICE-unix/2372,unix/tgh:/tmp/.ICE-unix/2372
This is argv[27]: XDG_SESSION_PATH=/org/freedesktop/DisplayManager/Session1
This is argv[28]: DBUS_SESSION_BUS_ADDRESS=unix:path=/run/user/1000/bus
This is argv[29]: XDG_VTNR=1
This is argv[30]: XDG_SEAT=seat0
This is argv[31]: LC_NUMERIC=en_IN
This is argv[32]: BROWSER=/usr/bin/firefox
This is argv[33]: GTK_MODULES=canberra-gtk-module
This is argv[34]: XDG_SEAT_PATH=/org/freedesktop/DisplayManager/Seat0
This is argv[35]: XDG_DATA_DIRS=/home/ab/.local/share/flatpak/exports/share:/var/lib/flatpak/exports/share:/usr/local/share:/usr/share:/var/lib/snapd/desktop
This is argv[36]: XDG_SESSION_DESKTOP=KDE
This is argv[37]: VTE_VERSION=6401
This is argv[38]: KDE_SESSION_UID=1000
This is argv[39]: LC_TIME=en_IN
This is argv[40]: MAIL=/var/spool/mail/ab
This is argv[41]: LOGNAME=ab
This is argv[42]: QT_AUTO_SCREEN_SCALE_FACTOR=0
This is argv[43]: LC_PAPER=en_IN
This is argv[44]: PATH=/usr/local/nginx/sbin:/home/ab/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/bin:/usr/lib/jvm/default/bin:/usr/bin/site_perl:/usr/bin/vendor_perl:/usr/bin/core_perl:/var/lib/snapd/snap/bin
This is argv[45]: QT_SCREEN_SCALE_FACTORS=LVDS1=1;DP1=1;HDMI1=1;VGA1=1;VIRTUAL1=1;
This is argv[46]: XDG_RUNTIME_DIR=/run/user/1000
This is argv[47]: SHELL=/bin/zsh
This is argv[48]: XDG_SESSION_ID=2
This is argv[49]: LC_MONETARY=en_IN
This is argv[50]: GTK2_RC_FILES=/etc/gtk-2.0/gtkrc:/home/ab/.gtkrc-2.0:/home/ab/.config/gtkrc-2.0
This is argv[51]: LC_TELEPHONE=en_IN
This is argv[52]: EDITOR=/usr/bin/nano
This is argv[53]: COLORTERM=truecolor
This is argv[54]: MOTD_SHOWN=pam
This is argv[55]: KDE_APPLICATIONS_AS_SCOPE=1
This is argv[56]: PAM_KWALLET5_LOGIN=/run/user/1000/kwallet5.socket
This is argv[57]: KDE_FULL_SESSION=true
This is argv[58]: XAUTHORITY=/home/ab/.Xauthority
This is argv[59]: LC_NAME=en_IN
This is argv[60]: DISPLAY=:0
This is argv[61]: LC_ADDRESS=en_IN
This is argv[62]: PWD=/home/ab/Projects/learn_c_the_hard_way
This is argv[63]: XCURSOR_SIZE=24
This is argv[64]: TERM=xterm-256color
This is argv[65]: ZSH=/home/ab/.oh-my-zsh
This is argv[66]: PAGER=less
This is argv[67]: LESS=-R
This is argv[68]: LSCOLORS=Gxfxcxdxbxegedabagacad
This is argv[69]: LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=00:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arc=01;31:*.arj=01;31:*.taz=01;31:*.lha=01;31:*.lz4=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.tzo=01;31:*.t7z=01;31:*.zip=01;31:*.z=01;31:*.dz=01;31:*.gz=01;31:*.lrz=01;31:*.lz=01;31:*.lzo=01;31:*.xz=01;31:*.zst=01;31:*.tzst=01;31:*.bz2=01;31:*.bz=01;31:*.tbz=01;31:*.tbz2=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.war=01;31:*.ear=01;31:*.sar=01;31:*.rar=01;31:*.alz=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.cab=01;31:*.wim=01;31:*.swm=01;31:*.dwm=01;31:*.esd=01;31:*.jpg=01;35:*.jpeg=01;35:*.mjpg=01;35:*.mjpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.webm=01;35:*.webp=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=00;36:*.au=00;36:*.flac=00;36:*.m4a=00;36:*.mid=00;36:*.midi=00;36:*.mka=00;36:*.mp3=00;36:*.mpc=00;36:*.ogg=00;36:*.ra=00;36:*.wav=00;36:*.oga=00;36:*.opus=00;36:*.spx=00;36:*.xspf=00;36:
This is argv[70]: LD_LIBRARY_PATH=/usr/local/lib
This is argv[71]: (null)
AddressSanitizer:DEADLYSIGNAL
=================================================================
==69851==ERROR: AddressSanitizer: SEGV on unknown address 0x000000000021 (pc 0x7f3c30d7b4c6 bp 0x7ffe273b2ba0 sp 0x7ffe273b22e8 T0)
==69851==The signal is caused by a READ memory access.
==69851==Hint: address points to the zero page.
#0 0x7f3c30d7b4c6 in __sanitizer::internal_strlen(char const*) /build/gcc/src/gcc/libsanitizer/sanitizer_common/sanitizer_libc.cpp:167
#1 0x7f3c30d0d057 in printf_common /build/gcc/src/gcc/libsanitizer/sanitizer_common/sanitizer_common_interceptors_format.inc:545
#2 0x7f3c30d0d41c in __interceptor_vprintf /build/gcc/src/gcc/libsanitizer/sanitizer_common/sanitizer_common_interceptors.inc:1639
#3 0x7f3c30d0d517 in __interceptor_printf /build/gcc/src/gcc/libsanitizer/sanitizer_common/sanitizer_common_interceptors.inc:1697
#4 0x562c5e03f290 in main /home/ab/Projects/learn_c_the_hard_way/exp10_so.c:13
#5 0x7f3c30b0ab24 in __libc_start_main (/usr/lib/libc.so.6+0x27b24)
#6 0x562c5e03f0bd in _start (/home/ab/Projects/learn_c_the_hard_way/exp10_so+0x10bd)
AddressSanitizer can not provide additional info.
SUMMARY: AddressSanitizer: SEGV /build/gcc/src/gcc/libsanitizer/sanitizer_common/sanitizer_libc.cpp:167 in __sanitizer::internal_strlen(char const*)
==69851==ABORTING
ANSWER
Answered 2021-May-30 at 09:48

A segmentation fault happens when code tries to access a memory region that is not available.

Accessing an array out of bounds doesn't mean that the memory before or after the area occupied by the array is unavailable: the compiler or the runtime usually puts all variables, or data in general, into a given block of memory. If your array is the last item of such a block, accessing it with a too-large index will produce a segmentation fault, but if the array is in the middle of the block, you will just read memory used for other data, giving unexpected results and undefined behavior. That is what happens here: on Linux the environment pointers (envp) sit right after argv's NULL terminator, so indexing past argv[argc] walks straight into the environment block, which is why the program prints environment variables before it finally crashes.

If the array (an array in my example, but this is valid for anything) is written to, writing to available memory will not produce a segmentation fault but will overwrite something else. It may produce unexpected results, or a crash or segmentation fault later! This kind of bug is frequently very difficult to find, because the unexpected behavior looks completely independent of the root cause.
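A rough demonstration of that "middle of a block" case (a sketch using Python's ctypes; the out-of-bounds reads are themselves undefined behavior and shown only for illustration):

import ctypes

arr = (ctypes.c_int * 4)(1, 2, 3, 4)
base = ctypes.addressof(arr)
step = ctypes.sizeof(ctypes.c_int)

# Read twice the array's length: indices 0-3 are the real elements,
# indices 4-7 read adjacent memory that belongs to something else,
# typically printing garbage values rather than crashing.
for i in range(8):
    value = ctypes.c_int.from_address(base + i * step).value
    print(i, value)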
QUESTION
I have a Spring Boot application running with Cassandra 4.0, Cassandra Java driver 4.11.1, and Spark 3.1.1 on Ubuntu 20.04 with JDK 8u292 and Python 3.6.
When I run a function that calls CQL through Spark, Tomcat gives me the error below.
Stack trace:
java.io.IOException: Failed to open native connection to Cassandra at {127.0.0.1:9042} :: Error instantiating class com.datastax.spark.connector.cql.LocalNodeFirstLoad>
at com.datastax.spark.connector.cql.CassandraConnector$.createSession(CassandraConnector.scala:182) ~[spark-cassandra-connector_2.12-3.0.0.jar:3.0.0]
at com.datastax.spark.connector.cql.CassandraConnector$.$anonfun$sessionCache$1(CassandraConnector.scala:170) ~[spark-cassandra-connector_2.12-3.0.0.jar:3.0.0]
at com.datastax.spark.connector.cql.RefCountedCache.createNewValueAndKeys(RefCountedCache.scala:32) ~[spark-cassandra-connector_2.12-3.0.0.jar:3.0.0]
at com.datastax.spark.connector.cql.RefCountedCache.syncAcquire(RefCountedCache.scala:69) ~[spark-cassandra-connector_2.12-3.0.0.jar:3.0.0]
at com.datastax.spark.connector.cql.RefCountedCache.acquire(RefCountedCache.scala:57) ~[spark-cassandra-connector_2.12-3.0.0.jar:3.0.0]
at com.datastax.spark.connector.cql.CassandraConnector.openSession(CassandraConnector.scala:90) ~[spark-cassandra-connector_2.12-3.0.0.jar:3.0.0]
at com.datastax.spark.connector.cql.CassandraConnector.withSessionDo(CassandraConnector.scala:112) ~[spark-cassandra-connector_2.12-3.0.0.jar:3.0.0]
at com.datastax.spark.connector.datasource.CassandraCatalog$.com$datastax$spark$connector$datasource$CassandraCatalog$$getMetadata(CassandraCatalog.scala:455) >
at com.datastax.spark.connector.datasource.CassandraCatalog$.getTableMetaData(CassandraCatalog.scala:421) ~[spark-cassandra-connector_2.12-3.0.0.jar:3.0.0]
at org.apache.spark.sql.cassandra.DefaultSource.getTable(DefaultSource.scala:68) ~[spark-cassandra-connector_2.12-3.0.0.jar:3.0.0]
at org.apache.spark.sql.cassandra.DefaultSource.inferSchema(DefaultSource.scala:72) ~[spark-cassandra-connector_2.12-3.0.0.jar:3.0.0]
at org.apache.spark.sql.execution.datasources.v2.DataSourceV2Utils$.getTableFromProvider(DataSourceV2Utils.scala:81) ~[spark-sql_2.12-3.1.1.jar:3.1.1]
at org.apache.spark.sql.DataFrameReader.$anonfun$load$1(DataFrameReader.scala:296) ~[spark-sql_2.12-3.1.1.jar:3.1.1]
at scala.Option.map(Option.scala:230) ~[scala-library-2.12.11.jar:na]
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:266) ~[spark-sql_2.12-3.1.1.jar:3.1.1]
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:226) ~[spark-sql_2.12-3.1.1.jar:3.1.1]
at br.com.genesis.configuration.DataSourceCassandraConfig.setParametros(DataSourceCassandraConfig.java:40) ~[classes/:2.4.2]
at br.com.genesis.service.EstTaxaFalhaService.verificaAmostragemMinima(EstTaxaFalhaService.java:109) ~[classes/:2.4.2]
at br.com.genesis.controller.estatistico.EstTaxaFalhaController.geraAmostraMinima(EstTaxaFalhaController.java:306) ~[classes/:2.4.2]
The "caused by" is:
Caused by: java.lang.IllegalArgumentException: Error instantiating class com.datastax.spark.connector.cql.LocalNodeFirstLoadBalancingPolicy (specified by basic.load-ba>
at com.datastax.oss.driver.internal.core.util.Reflection.buildFromConfig(Reflection.java:253) ~[java-driver-core-4.11.1.jar:na]
at com.datastax.oss.driver.internal.core.util.Reflection.buildFromConfigProfiles(Reflection.java:162) ~[java-driver-core-4.11.1.jar:na]
at com.datastax.oss.driver.internal.core.context.DefaultDriverContext.buildLoadBalancingPolicies(DefaultDriverContext.java:338) ~[java-driver-core-4.11.1.jar:n>
at com.datastax.oss.driver.internal.core.util.concurrent.LazyReference.get(LazyReference.java:55) ~[java-driver-core-4.11.1.jar:na]
at com.datastax.oss.driver.internal.core.context.DefaultDriverContext.getLoadBalancingPolicies(DefaultDriverContext.java:687) ~[java-driver-core-4.11.1.jar:na]
at com.datastax.oss.driver.internal.core.session.DefaultSession$SingleThreaded.init(DefaultSession.java:338) ~[java-driver-core-4.11.1.jar:na]
at com.datastax.oss.driver.internal.core.session.DefaultSession$SingleThreaded.access$1100(DefaultSession.java:300) ~[java-driver-core-4.11.1.jar:na]
at com.datastax.oss.driver.internal.core.session.DefaultSession.lambda$init$0(DefaultSession.java:146) ~[java-driver-core-4.11.1.jar:na]
at io.netty.util.concurrent.PromiseTask.runTask(PromiseTask.java:98) ~[netty-common-4.1.58.Final.jar:4.1.58.Final]
at io.netty.util.concurrent.PromiseTask.run(PromiseTask.java:106) ~[netty-common-4.1.58.Final.jar:4.1.58.Final]
at io.netty.channel.DefaultEventLoop.run(DefaultEventLoop.java:54) ~[netty-all-4.1.58.Final.jar:4.1.58.Final]
at io.netty.util.concurrent.SingleThreadEventExecutor$4.run(SingleThreadEventExecutor.java:989) ~[netty-common-4.1.58.Final.jar:4.1.58.Final]
at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) ~[netty-common-4.1.58.Final.jar:4.1.58.Final]
at io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ~[netty-common-4.1.58.Final.jar:4.1.58.Final]
... 1 common frames omitted
Caused by: java.lang.NoSuchMethodError: com.datastax.oss.driver.internal.core.context.InternalDriverContext.getNodeFilter(Ljava/lang/String;)Ljava/util/function/Predic>
at com.datastax.spark.connector.cql.LocalNodeFirstLoadBalancingPolicy.(LocalNodeFirstLoadBalancingPolicy.scala:40) ~[spark-cassandra-connector-driver_2.1>
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[na:1.8.0_292]
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) ~[na:1.8.0_292]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[na:1.8.0_292]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[na:1.8.0_292]
at com.datastax.oss.driver.internal.core.util.Reflection.buildFromConfig(Reflection.java:247) ~[java-driver-core-4.11.1.jar:na]
... 14 common frames omitted
First, I checked the nodetool status and it is ok:
~$ nodetool status
Datacenter: datacenter1
=======================
Status=Up/Down
|/ State=Normal/Leaving/Joining/Moving
-- Address Load Tokens Owns (effective) Host ID Rack
UN 127.0.0.1 556.74 KiB 16 100.0% f457b508-1b91-456c-85bc-1a621c5c1d78 rack1
Then I checked the cqlsh prompt and it's OK:
~$cqlsh
Connected to SSP at 127.0.0.1:9042
[cqlsh 6.0.0 | Cassandra 4.0 | CQL spec 3.4.5 | Native protocol v5]
Use HELP for help.
cqlsh>
I'm using Cassandra as localhost (127.0.0.1). The cassandra.yaml and cassandra-env.sh have the standard configuration.
I ran ps -ef and it returned Cassandra's PID:
~$ ps -ef | grep cassandra
ubuntu 11139 1 1 18:28 ? 00:02:22 /usr/lib/jvm/java-8-openjdk-amd64/bin/java -ea -da:net.openhft... -XX:+UseThreadPriorities -XX:+HeapDumpOnOutOfMemoryError -Xss256k -XX:+AlwaysPreTouch -XX:-UseBiasedLocking -XX:+UseTLAB -XX:+ResizeTLAB -XX:+UseNUMA -XX:+PerfDisableSharedMem -Djava.net.preferIPv4Stack=true -XX:ThreadPriorityPolicy=42 -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:+CMSParallelRemarkEnabled -XX:SurvivorRatio=8 -XX:MaxTenuringThreshold=1 -XX:CMSInitiatingOccupancyFraction=75 -XX:+UseCMSInitiatingOccupancyOnly -XX:CMSWaitDuration=10000 -XX:+CMSParallelInitialMarkEnabled -XX:+CMSEdenChunksRecordAlways -XX:+CMSClassUnloadingEnabled -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+PrintHeapAtGC -XX:+PrintTenuringDistribution -XX:+PrintGCApplicationStoppedTime -XX:+PrintPromotionFailure -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=10 -XX:GCLogFileSize=10M -Xloggc:/var/log/cassandra/gc.log -Xms4002M -Xmx4002M -Xmn400M -XX:+UseCondCardMark -XX:CompileCommandFile=/etc/cassandra/hotspot_compiler -javaagent:/usr/share/cassandra/lib/jamm-0.3.2.jar -Djava.rmi.server.hostname=127.0.0.1 -Dcassandra.jmx.local.port=7198 -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.password.file=/etc/cassandra/jmxremote.password -Djava.library.path=/usr/share/cassandra/lib/sigar-bin -XX:OnOutOfMemoryError=kill -9 %p -Dlogback.configurationFile=logback.xml -Dcassandra.logdir=/var/log/cassandra -Dcassandra.storagedir=/var/lib/cassandra -cp /etc/cassandra:/usr/share/cassandra/lib/HdrHistogram-2.1.9.jar:/usr/share/cassandra/lib/ST4-4.0.8.jar:/usr/share/cassandra/lib/airline-0.8.jar:/usr/share/cassandra/lib/antlr-runtime-3.5.2.jar:/usr/share/cassandra/lib/asm-7.1.jar:/usr/share/cassandra/lib/caffeine-2.3.5.jar:/usr/share/cassandra/lib/cassandra-driver-core-3.11.0-shaded.jar:/usr/share/cassandra/lib/chronicle-bytes-2.20.111.jar:/usr/share/cassandra/lib/chronicle-core-2.20.126.jar:/usr/share/cassandra/lib/chronicle-queue-5.20.123.jar:/usr/share/cassandra/lib/chronicle-threads-2.20.111.jar:/usr/share/cassandra/lib/chronicle-wire-2.20.117.jar:/usr/share/cassandra/lib/commons-cli-1.1.jar:/usr/share/cassandra/lib/commons-codec-1.9.jar:/usr/share/cassandra/lib/commons-lang3-3.11.jar:/usr/share/cassandra/lib/commons-math3-3.2.jar:/usr/share/cassandra/lib/concurrent-trees-2.4.0.jar:/usr/share/cassandra/lib/ecj-4.6.1.jar:/usr/share/cassandra/lib/guava-27.0-jre.jar:/usr/share/cassandra/lib/high-scale-lib-1.0.6.jar:/usr/share/cassandra/lib/hppc-0.8.1.jar:/usr/share/cassandra/lib/j2objc-annotations-1.3.jar:/usr/share/cassandra/lib/jackson-annotations-2.9.10.jar:/usr/share/cassandra/lib/jackson-core-2.9.10.jar:/usr/share/cassandra/lib/jackson-databind-2.9.10.8.jar:/usr/share/cassandra/lib/jamm-0.3.2.jar:/usr/share/cassandra/lib/java-cup-runtime-11b-20160615.jar:/usr/share/cassandra/lib/javax.inject-1.jar:/usr/share/cassandra/lib/jbcrypt-0.3m.jar:/usr/share/cassandra/lib/jcl-over-slf4j-1.7.25.jar:/usr/share/cassandra/lib/jcommander-1.30.jar:/usr/share/cassandra/lib/jctools-core-3.1.0.jar:/usr/share/cassandra/lib/jflex-1.8.2.jar:/usr/share/cassandra/lib/jna-5.6.0.jar:/usr/share/cassandra/lib/json-simple-1.1.jar:/usr/share/cassandra/lib/jvm-attach-api-1.5.jar:/usr/share/cassandra/lib/log4j-over-slf4j-1.7.25.jar:/usr/share/cassandra/lib/logback-classic-1.2.3.jar:/usr/share/cassandra/lib/logback-core-1.2.3.jar:/usr/share/cassandra/lib/lz4-java-1.7.1.jar:/usr/share/cassandra/lib/metrics-core-3.1.5.jar:/usr/share/cassandra/lib/metrics-jvm-3.1.5.jar:/usr/share/cassandra/lib/metrics-logback-3.1.5.jar:
/usr/share/cassandra/lib/mxdump-0.14.jar:/usr/share/cassandra/lib/netty-all-4.1.58.Final.jar:/usr/share/cassandra/lib/netty-tcnative-boringssl-static-2.0.36.Final.jar:/usr/share/cassandra/lib/ohc-core-0.5.1.jar:/usr/share/cassandra/lib/ohc-core-j8-0.5.1.jar:/usr/share/cassandra/lib/psjava-0.1.19.jar:/usr/share/cassandra/lib/reporter-config-base-3.0.3.jar:/usr/share/cassandra/lib/reporter-config3-3.0.3.jar:/usr/share/cassandra/lib/sigar-1.6.4.jar:/usr/share/cassandra/lib/sjk-cli-0.14.jar:/usr/share/cassandra/lib/sjk-core-0.14.jar:/usr/share/cassandra/lib/sjk-json-0.14.jar:/usr/share/cassandra/lib/sjk-stacktrace-0.14.jar:/usr/share/cassandra/lib/slf4j-api-1.7.25.jar:/usr/share/cassandra/lib/snakeyaml-1.26.jar:/usr/share/cassandra/lib/snappy-java-1.1.2.6.jar:/usr/share/cassandra/lib/snowball-stemmer-1.3.0.581.1.jar:/usr/share/cassandra/lib/stream-2.5.2.jar:/usr/share/cassandra/lib/zstd-jni-1.3.8-5.jar:/usr/share/cassandra/lib/jsr223/*/*.jar:/usr/share/cassandra/apache-cassandra-4.0.jar:/usr/share/cassandra/apache-cassandra.jar:/usr/share/cassandra/fqltool.jar:/usr/share/cassandra/stress.jar: org.apache.cassandra.service.CassandraDaemon
Tomcat 9 connects to Cassandra as well.
pom.xml:
org.springframework.boot
spring-boot-starter-parent
2.4.2
UTF-8
1.8
1.8
junit
junit
test
org.springframework.boot
spring-boot-starter-web
org.springframework.boot
spring-boot-starter-tomcat
provided
org.springframework.boot
spring-boot
javax.persistence
javax.persistence-api
org.springframework.boot
spring-boot-devtools
org.springframework.boot
spring-boot-starter-data-cassandra
javax.servlet
javax.servlet-api
provided
org.springframework.boot
spring-boot-starter-thymeleaf
org.python
jython
2.7.2
org.slf4j
slf4j-api
com.datastax.spark
spark-cassandra-connector_2.12
3.0.0
org.apache.spark
spark-core_2.12
3.1.1
org.apache.spark
spark-sql_2.12
3.1.1
org.apache.commons
commons-exec
1.3
org.springframework
spring-core
org.javassist
javassist
3.27.0-GA
javax.xml.bind
jaxb-api
org.apache.commons
commons-math3
3.6.1
org.yaml
snakeyaml
com.google.code.gson
gson
org.apache.commons
commons-rng-parent
1.3
pom
org.hamcrest
hamcrest-core
test
org.json
json
20200518
com.opencsv
opencsv
4.2
com.datastax.oss
java-driver-core
4.11.1
com.datastax.oss
java-driver-query-builder
joda-time
joda-time
2.10.10
org.springframework.boot
spring-boot-configuration-processor
org.codehaus.janino
commons-compiler
org.codehaus.janino
janino
com.datastax.oss
native-protocol
1.5.0
calculosSSP
CalculosSSP
1.2.8
Does anyone have an idea about what is happening?
best,
ANSWER
Answered 2021-May-25 at 23:23

I opened two JIRA tickets to understand this problem. See the links below:
QUESTION
I am trying to build a docker image. This is the full dockerfile:
FROM ubuntu
ENV LANG=C.UTF-8 LC_ALL=C.UTF-8
ENV PATH /opt/conda/bin:$PATH
ENV TZ=Europe/Athens
RUN ln -snf /usr/share/zoneinfo/$TZ /etc/localtime && echo $TZ > /etc/timezone
RUN apt-get update --fix-missing && apt-get install -y wget bzip2 ca-certificates \
libglib2.0-0 libxext6 libsm6 libxrender1 \
git mercurial subversion
RUN wget --quiet https://repo.anaconda.com/miniconda/Miniconda2-4.5.11-Linux-x86_64.sh -O ~/miniconda.sh && \
/bin/bash ~/miniconda.sh -b -p /opt/conda && \
rm ~/miniconda.sh && \
ln -s /opt/conda/etc/profile.d/conda.sh /etc/profile.d/conda.sh && \
echo ". /opt/conda/etc/profile.d/conda.sh" >> ~/.bashrc && \
echo "conda activate base" >> ~/.bashrc
RUN apt-get install -y curl grep sed dpkg && \
TINI_VERSION=`curl https://github.com/krallin/tini/releases/latest | grep -o "/v.*\"" | sed 's:^..\(.*\).$:\1:'` && \
curl -L "https://github.com/krallin/tini/releases/download/v${TINI_VERSION}/tini_${TINI_VERSION}.deb" > tini.deb && \
dpkg -i tini.deb && \
rm tini.deb && \
apt-get clean
ENTRYPOINT [ "/usr/bin/tini", "--" ]
CMD [ "/bin/bash" ]
#SECOND PART
RUN apt install -y libgl1-mesa-glx
RUN conda install conda-build
RUN apt-get install -y git
WORKDIR /
RUN git clone https://github.com/cadquery/cadquery.git
WORKDIR /cadquery
RUN conda env create -n cq -f environment.yml
RUN echo "source activate cq" > ~/.bashrc
ENV PATH /opt/conda/envs/cq/bin:$PATH
WORKDIR /testing
However, when the build reaches step 12, this line:
RUN conda install conda-build
I get errors. It seems to install the packages normally, and then it fails:
Proceed ([y]/n)?
ruamel_yaml-0.15.100 | 268 KB | ########## | 100%
readline-8.1 | 464 KB | ########## | 100%
bzip2-1.0.8 | 105 KB | ########## | 100%
tzdata-2020f | 123 KB | ########## | 100%
xz-5.2.5 | 438 KB | ########## | 100%
tk-8.6.10 | 3.2 MB | ########## | 100%
conda-build-3.21.4 | 585 KB | ########## | 100%
cffi-1.14.5 | 227 KB | ########## | 100%
ld_impl_linux-64-2.3 | 645 KB | ########## | 100%
urllib3-1.26.4 | 99 KB | ########## | 100%
pyyaml-5.4.1 | 180 KB | ########## | 100%
pip-21.1.1 | 2.0 MB | ########## | 100%
lz4-c-1.9.3 | 216 KB | ########## | 100%
beautifulsoup4-4.9.3 | 86 KB | ########## | 100%
python-3.9.5 | 22.7 MB | ########## | 100%
pkginfo-1.7.0 | 42 KB | ########## | 100%
tqdm-4.59.0 | 90 KB | ########## | 100%
setuptools-52.0.0 | 880 KB | ########## | 100%
python-libarchive-c- | 50 KB | ########## | 100%
cryptography-3.4.7 | 1.0 MB | ########## | 100%
icu-58.2 | 22.7 MB | ########## | 100%
pysocks-1.7.1 | 31 KB | ########## | 100%
libxml2-2.9.10 | 1.3 MB | ########## | 100%
certifi-2020.12.5 | 143 KB | ########## | 100%
openssl-1.1.1k | 3.8 MB | ########## | 100%
libgcc-ng-9.1.0 | 8.1 MB | ########## | 100%
patchelf-0.12 | 92 KB | ########## | 100%
glob2-0.7 | 12 KB | ########## | 100%
idna-2.10 | 52 KB | ########## | 100%
liblief-0.10.1 | 2.0 MB | ########## | 100%
pycparser-2.20 | 94 KB | ########## | 100%
chardet-4.0.0 | 198 KB | ########## | 100%
py-lief-0.10.1 | 1.3 MB | ########## | 100%
markupsafe-2.0.1 | 22 KB | ########## | 100%
zlib-1.2.11 | 120 KB | ########## | 100%
wheel-0.36.2 | 31 KB | ########## | 100%
conda-4.10.1 | 3.1 MB | ########## | 100%
libffi-3.3 | 54 KB | ########## | 100%
yaml-0.2.5 | 87 KB | ########## | 100%
libarchive-3.4.2 | 1.6 MB | ########## | 100%
ca-certificates-2021 | 120 KB | ########## | 100%
conda-package-handli | 967 KB | ########## | 100%
filelock-3.0.12 | 10 KB | ########## | 100%
requests-2.25.1 | 51 KB | ########## | 100%
ncurses-6.2 | 1.1 MB | ########## | 100%
pytz-2021.1 | 244 KB | ########## | 100%
pycosat-0.6.3 | 108 KB | ########## | 100%
psutil-5.8.0 | 342 KB | ########## | 100%
sqlite-3.35.4 | 1.4 MB | ########## | 100%
zstd-1.4.9 | 809 KB | ########## | 100%
jinja2-3.0.0 | 99 KB | ########## | 100%
brotlipy-0.7.0 | 349 KB | ########## | 100%
ripgrep-12.1.1 | 1.5 MB | ########## | 100%
_libgcc_mutex-0.1 | 3 KB | ########## | 100%
six-1.15.0 | 13 KB | ########## | 100%
soupsieve-2.2.1 | 30 KB | ########## | 100%
pyopenssl-20.0.1 | 48 KB | ########## | 100%
Downloading and Extracting Packages
UnicodeDecodeError('ascii', '/info/test/tests/data/\xed\x94\x84\xeb\xa1\x9c\xea\xb7\xb8\xeb\x9e\xa8.zip.json', 22, 23, 'ordinal not in range(128)')
The command '/bin/sh -c conda install conda-build' returned a non-zero code: 1
ANSWER
Answered 2021-May-25 at 22:50

I replicated this error with the continuumio/miniconda2:4.5.11 Docker image:
$ docker run --rm -it continuumio/miniconda2:4.5.11 bash
(base) root@a285050719ad:/# conda install -y conda-build
# ... similar output as OP ...
UnicodeDecodeError('ascii', '/info/test/tests/data/\xed\x94\x84\xeb\xa1\x9c\xea\xb7\xb8\xeb\x9e\xa8.zip.json', 22, 23, 'ordinal not in range(128)')
Additionally, attempting to upgrade the conda package fails with some extra advice:
(base) root@a285050719ad:/# conda update conda
Solving environment: done
EncodingError: A unicode encoding or decoding error has occurred.
Python 2 is the interpreter under which conda is running in your base environment.
Replacing your base environment with one having Python 3 may help resolve this issue.
If you still have a need for Python 2 environments, consider using 'conda create'
and 'conda activate'. For example:
$ conda create -n py2 python=2
$ conda activate py2
Error details: UnicodeDecodeError('ascii', '/info/test/tests/data/\xed\x94\x84\xeb\xa1\x9c\xea\xb7\xb8\xeb\x9e\xa8.zip.json', 22, 23, 'ordinal not in range(128)')
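The failing path in the error message points at the root cause: its bytes are valid UTF-8 (a Korean filename) but not ASCII, which is all that Python 2's default codec handles. A quick sketch of this in Python 3:

# The byte string from the conda error message.
path = b"/info/test/tests/data/\xed\x94\x84\xeb\xa1\x9c\xea\xb7\xb8\xeb\x9e\xa8.zip.json"

print(path.decode("utf-8"))   # fine: the bytes are valid UTF-8
path.decode("ascii")          # UnicodeDecodeError: ordinal not in range(128), at byte 22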
That is, you really shouldn't be using these old Miniconda2 images, because their conda is no longer compatible with the Anaconda Cloud repository.

A clean solution is to install a newer Miniconda (or Miniforge or Mambaforge); the latest ones all have Python 3 in the base. If for some reason one must have Python 2 in the base, which means you can't have the latest conda nor the latest conda-build, then it seems Miniconda up to 4.8.3 supported Python 2.
If possible, use the latest version, i.e., Python 3. One can always create a Python 2 environment if needed; it's just better that it not be in the base. Suggested solution:
RUN wget --quiet https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O ~/miniconda.sh && \
...
Also, consider whether you can more simply start from an existing Docker image with Conda preinstalled.
Update Conda in Place (Not Recommended)

A dirty version would be to keep the same 4.5.11 installer but upgrade immediately. In the Docker image I can get it to upgrade to 4.8 and keep Python 2.7; then conda-build will install at 3.18.11 (current as of May 2021 is 3.21.4).
This could similarly be done in the Dockerfile with something like
RUN conda install -y python=2.7 conda=4.8 && \
conda clean -qafy && \
conda install -y conda-build && \
conda clean -qafy
Note: I needed the first clean to get conda-build to install.
QUESTION
We have an existing application that works fine with Spring Boot 2.2.2.RELEASE. We tried to upgrade it to Spring Boot 2.4.2, and now the application does not start and throws the following error. In the classpath I can see only one spring-webmvc-5.3.2.jar file.
Below is the pom.xml for reference:
4.0.0
org.springframework.boot
spring-boot-starter-parent
2.4.2
com.test
test-api
1.0
Microservice
Microservice
jar
13
org.springframework.cloud
spring-cloud-sleuth
3.0.0
pom
import
org.springframework.boot
spring-boot-starter-validation
org.springframework.boot
spring-boot-starter-web
org.springframework.boot
spring-boot-starter-logging
org.springframework.boot
spring-boot-starter-log4j2
org.springframework.boot
spring-boot-configuration-processor
org.springframework.boot
spring-boot-starter-data-jpa
org.mariadb.jdbc
mariadb-java-client
com.fasterxml.jackson.core
jackson-databind
com.fasterxml.jackson.datatype
jackson-datatype-jsr310
org.springframework.boot
spring-boot-starter-test
test
org.junit.vintage
junit-vintage-engine
org.springframework.kafka
spring-kafka
io.grpc
grpc-netty
1.25.0
io.grpc
grpc-protobuf
1.25.0
io.grpc
grpc-stub
1.25.0
org.springframework.boot
spring-boot-starter-actuator
org.springframework.boot
spring-boot-devtools
org.hibernate
hibernate-jpamodelgen
org.springdoc
springdoc-openapi-ui
1.5.2
org.springdoc
springdoc-openapi-data-rest
1.5.2
com.datadoghq
dd-trace-api
0.66.0
org.springframework.cloud
spring-cloud-starter-sleuth
org.redisson
redisson
3.13.2
commons-codec
commons-codec
1.15
test-api
org.springframework.boot
spring-boot-maven-plugin
repackage
exec
***************************
APPLICATION FAILED TO START
***************************
Description:
An attempt was made to call a method that does not exist. The attempt was made from the following location:
org.springframework.boot.autoconfigure.web.servlet.WebMvcAutoConfiguration$EnableWebMvcConfiguration.lambda$addResourceHandlers$0(WebMvcAutoConfiguration.java:411)
The following method did not exist:
'org.springframework.web.servlet.config.annotation.ResourceHandlerRegistration org.springframework.web.servlet.config.annotation.ResourceHandlerRegistration.addResourceLocations(org.springframework.core.io.Resource[])'
The method's class, org.springframework.web.servlet.config.annotation.ResourceHandlerRegistration, is available from the following locations:
jar:file:/C:/Users/test/.m2/repository/org/springframework/spring-webmvc/5.3.2/spring-webmvc-5.3.2.jar!/org/springframework/web/servlet/config/annotation/ResourceHandlerRegistration.class
The class hierarchy was loaded from the following locations:
org.springframework.web.servlet.config.annotation.ResourceHandlerRegistration: file:/C:/Users/test/.m2/repository/org/springframework/spring-webmvc/5.3.2/spring-webmvc-5.3.2.jar
Action:
Correct the classpath of your application so that it contains a single, compatible version of org.springframework.web.servlet.config.annotation.ResourceHandlerRegistration
Dependency Tree:
C:\Users\test-api>mvn dependency:tree
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------< com.test:test-api >------------------
[INFO] Microservice 1.0
[INFO] --------------------------------[ jar ]---------------------------------
[INFO]
[INFO] --- maven-dependency-plugin:3.1.2:tree (default-cli) @ test-api ---
[WARNING] The POM for org.apache.maven:maven-artifact:jar:3.0 is invalid, transitive dependencies (if any) will not be available, enable debug logging for more details
[INFO] com.fmr.AP135913:test-api:jar:1.0
[INFO] +- org.springframework.boot:spring-boot-starter-validation:jar:2.4.2:compile
[INFO] | +- org.springframework.boot:spring-boot-starter:jar:2.4.2:compile
[INFO] | | +- org.springframework.boot:spring-boot-starter-logging:jar:2.4.2:compile
[INFO] | | | +- ch.qos.logback:logback-classic:jar:1.2.3:compile
[INFO] | | | | \- ch.qos.logback:logback-core:jar:1.2.3:compile
[INFO] | | | \- org.apache.logging.log4j:log4j-to-slf4j:jar:2.13.3:compile
[INFO] | | \- jakarta.annotation:jakarta.annotation-api:jar:1.3.5:compile
[INFO] | +- org.glassfish:jakarta.el:jar:3.0.3:compile
[INFO] | \- org.hibernate.validator:hibernate-validator:jar:6.1.7.Final:compile
[INFO] | +- jakarta.validation:jakarta.validation-api:jar:2.0.2:compile
[INFO] | \- com.fasterxml:classmate:jar:1.5.1:compile
[INFO] +- org.springframework.boot:spring-boot-starter-web:jar:2.4.2:compile
[INFO] | +- org.springframework.boot:spring-boot-starter-json:jar:2.4.2:compile
[INFO] | | +- com.fasterxml.jackson.datatype:jackson-datatype-jdk8:jar:2.11.3:compile
[INFO] | | \- com.fasterxml.jackson.module:jackson-module-parameter-names:jar:2.11.3:compile
[INFO] | +- org.springframework.boot:spring-boot-starter-tomcat:jar:2.4.2:compile
[INFO] | | +- org.apache.tomcat.embed:tomcat-embed-core:jar:9.0.41:compile
[INFO] | | \- org.apache.tomcat.embed:tomcat-embed-websocket:jar:9.0.41:compile
[INFO] | +- org.springframework:spring-web:jar:5.3.2:compile
[INFO] | | \- org.springframework:spring-beans:jar:5.3.2:compile
[INFO] | \- org.springframework:spring-webmvc:jar:5.3.2:compile
[INFO] | +- org.springframework:spring-aop:jar:5.3.2:compile
[INFO] | \- org.springframework:spring-expression:jar:5.3.2:compile
[INFO] +- org.springframework.boot:spring-boot-starter-log4j2:jar:2.4.2:compile
[INFO] | +- org.apache.logging.log4j:log4j-slf4j-impl:jar:2.13.3:compile
[INFO] | | \- org.apache.logging.log4j:log4j-api:jar:2.13.3:compile
[INFO] | +- org.apache.logging.log4j:log4j-core:jar:2.13.3:compile
[INFO] | +- org.apache.logging.log4j:log4j-jul:jar:2.13.3:compile
[INFO] | \- org.slf4j:jul-to-slf4j:jar:1.7.30:compile
[INFO] +- org.springframework.boot:spring-boot-configuration-processor:jar:2.4.2:compile
[INFO] +- org.springframework.boot:spring-boot-starter-data-jpa:jar:2.4.2:compile
[INFO] | +- org.springframework.boot:spring-boot-starter-aop:jar:2.4.2:compile
[INFO] | | \- org.aspectj:aspectjweaver:jar:1.9.6:compile
[INFO] | +- org.springframework.boot:spring-boot-starter-jdbc:jar:2.4.2:compile
[INFO] | | +- com.zaxxer:HikariCP:jar:3.4.5:compile
[INFO] | | \- org.springframework:spring-jdbc:jar:5.3.2:compile
[INFO] | +- jakarta.transaction:jakarta.transaction-api:jar:1.3.3:compile
[INFO] | +- jakarta.persistence:jakarta.persistence-api:jar:2.2.3:compile
[INFO] | +- org.hibernate:hibernate-core:jar:5.4.27.Final:compile
[INFO] | | +- org.javassist:javassist:jar:3.27.0-GA:compile
[INFO] | | +- antlr:antlr:jar:2.7.7:compile
[INFO] | | +- org.jboss:jandex:jar:2.1.3.Final:compile
[INFO] | | +- org.dom4j:dom4j:jar:2.1.3:compile
[INFO] | | \- org.hibernate.common:hibernate-commons-annotations:jar:5.1.2.Final:compile
[INFO] | +- org.springframework.data:spring-data-jpa:jar:2.4.2:compile
[INFO] | | +- org.springframework.data:spring-data-commons:jar:2.4.2:compile
[INFO] | | \- org.springframework:spring-orm:jar:5.3.2:compile
[INFO] | \- org.springframework:spring-aspects:jar:5.3.2:compile
[INFO] +- org.mariadb.jdbc:mariadb-java-client:jar:2.7.1:compile
[INFO] +- com.fasterxml.jackson.core:jackson-databind:jar:2.11.3:compile
[INFO] | +- com.fasterxml.jackson.core:jackson-annotations:jar:2.11.3:compile
[INFO] | \- com.fasterxml.jackson.core:jackson-core:jar:2.11.3:compile
[INFO] +- com.fasterxml.jackson.datatype:jackson-datatype-jsr310:jar:2.11.3:compile
[INFO] +- org.springframework.boot:spring-boot-starter-test:jar:2.4.2:test
[INFO] | +- org.springframework.boot:spring-boot-test:jar:2.4.2:test
[INFO] | +- org.springframework.boot:spring-boot-test-autoconfigure:jar:2.4.2:test
[INFO] | +- com.jayway.jsonpath:json-path:jar:2.4.0:compile
[INFO] | | \- net.minidev:json-smart:jar:2.3:compile
[INFO] | | \- net.minidev:accessors-smart:jar:1.2:compile
[INFO] | | \- org.ow2.asm:asm:jar:5.0.4:compile
[INFO] | +- jakarta.xml.bind:jakarta.xml.bind-api:jar:2.3.3:compile
[INFO] | | \- jakarta.activation:jakarta.activation-api:jar:1.2.2:compile
[INFO] | +- org.assertj:assertj-core:jar:3.18.1:test
[INFO] | +- org.hamcrest:hamcrest:jar:2.2:test
[INFO] | +- org.junit.jupiter:junit-jupiter:jar:5.7.0:test
[INFO] | | +- org.junit.jupiter:junit-jupiter-api:jar:5.7.0:test
[INFO] | | | +- org.apiguardian:apiguardian-api:jar:1.1.0:test
[INFO] | | | +- org.opentest4j:opentest4j:jar:1.2.0:test
[INFO] | | | \- org.junit.platform:junit-platform-commons:jar:1.7.0:test
[INFO] | | +- org.junit.jupiter:junit-jupiter-params:jar:5.7.0:test
[INFO] | | \- org.junit.jupiter:junit-jupiter-engine:jar:5.7.0:test
[INFO] | | \- org.junit.platform:junit-platform-engine:jar:1.7.0:test
[INFO] | +- org.mockito:mockito-core:jar:3.6.28:test
[INFO] | | +- net.bytebuddy:byte-buddy-agent:jar:1.10.19:test
[INFO] | | \- org.objenesis:objenesis:jar:3.0.1:test
[INFO] | +- org.mockito:mockito-junit-jupiter:jar:3.6.28:test
[INFO] | +- org.skyscreamer:jsonassert:jar:1.5.0:test
[INFO] | | \- com.vaadin.external.google:android-json:jar:0.0.20131108.vaadin1:test
[INFO] | +- org.springframework:spring-core:jar:5.3.2:compile
[INFO] | | \- org.springframework:spring-jcl:jar:5.3.2:compile
[INFO] | +- org.springframework:spring-test:jar:5.3.2:test
[INFO] | \- org.xmlunit:xmlunit-core:jar:2.7.0:test
[INFO] +- org.springframework.kafka:spring-kafka:jar:2.6.5:compile
[INFO] | +- org.springframework:spring-context:jar:5.3.2:compile
[INFO] | +- org.springframework:spring-messaging:jar:5.3.2:compile
[INFO] | +- org.springframework:spring-tx:jar:5.3.2:compile
[INFO] | +- org.springframework.retry:spring-retry:jar:1.3.1:compile
[INFO] | \- org.apache.kafka:kafka-clients:jar:2.6.0:compile
[INFO] | +- com.github.luben:zstd-jni:jar:1.4.4-7:compile
[INFO] | +- org.lz4:lz4-java:jar:1.7.1:compile
[INFO] | \- org.xerial.snappy:snappy-java:jar:1.1.7.3:compile
[INFO] +- io.grpc:grpc-netty:jar:1.25.0:compile
[INFO] | +- io.grpc:grpc-core:jar:1.25.0:compile (version selected from constraint [1.25.0,1.25.0])
[INFO] | | +- com.google.code.gson:gson:jar:2.8.6:compile
[INFO] | | +- com.google.android:annotations:jar:4.1.1.4:compile
[INFO] | | +- io.perfmark:perfmark-api:jar:0.19.0:compile
[INFO] | | +- io.opencensus:opencensus-api:jar:0.21.0:compile
[INFO] | | \- io.opencensus:opencensus-contrib-grpc-metrics:jar:0.21.0:compile
[INFO] | +- io.netty:netty-codec-http2:jar:4.1.55.Final:compile
[INFO] | | \- io.netty:netty-codec-http:jar:4.1.55.Final:compile
[INFO] | \- io.netty:netty-handler-proxy:jar:4.1.55.Final:compile
[INFO] | \- io.netty:netty-codec-socks:jar:4.1.55.Final:compile
[INFO] +- io.grpc:grpc-protobuf:jar:1.25.0:compile
[INFO] | +- io.grpc:grpc-api:jar:1.25.0:compile
[INFO] | | +- io.grpc:grpc-context:jar:1.25.0:compile
[INFO] | | +- com.google.errorprone:error_prone_annotations:jar:2.3.3:compile
[INFO] | | +- com.google.code.findbugs:jsr305:jar:3.0.2:compile
[INFO] | | \- org.codehaus.mojo:animal-sniffer-annotations:jar:1.17:compile
[INFO] | +- com.google.protobuf:protobuf-java:jar:3.10.0:compile
[INFO] | +- com.google.guava:guava:jar:28.1-android:compile
[INFO] | | +- com.google.guava:failureaccess:jar:1.0.1:compile
[INFO] | | +- com.google.guava:listenablefuture:jar:9999.0-empty-to-avoid-conflict-with-guava:compile
[INFO] | | +- org.checkerframework:checker-compat-qual:jar:2.5.5:compile
[INFO] | | \- com.google.j2objc:j2objc-annotations:jar:1.3:compile
[INFO] | +- com.google.api.grpc:proto-google-common-protos:jar:1.12.0:compile
[INFO] | \- io.grpc:grpc-protobuf-lite:jar:1.25.0:compile
[INFO] +- io.grpc:grpc-stub:jar:1.25.0:compile
[INFO] +- org.springframework.boot:spring-boot-starter-actuator:jar:2.4.2:compile
[INFO] | +- org.springframework.boot:spring-boot-actuator-autoconfigure:jar:2.4.2:compile
[INFO] | | \- org.springframework.boot:spring-boot-actuator:jar:2.4.2:compile
[INFO] | \- io.micrometer:micrometer-core:jar:1.6.2:compile
[INFO] | +- org.hdrhistogram:HdrHistogram:jar:2.1.12:compile
[INFO] | \- org.latencyutils:LatencyUtils:jar:2.0.3:runtime
[INFO] +- org.springframework.boot:spring-boot-devtools:jar:2.4.2:compile
[INFO] | +- org.springframework.boot:spring-boot:jar:2.4.2:compile
[INFO] | \- org.springframework.boot:spring-boot-autoconfigure:jar:2.4.2:compile
[INFO] +- org.hibernate:hibernate-jpamodelgen:jar:5.4.27.Final:compile
[INFO] | +- org.jboss.logging:jboss-logging:jar:3.4.1.Final:compile
[INFO] | +- javax.xml.bind:jaxb-api:jar:2.3.1:compile
[INFO] | | \- javax.activation:javax.activation-api:jar:1.2.0:compile
[INFO] | \- org.glassfish.jaxb:jaxb-runtime:jar:2.3.3:compile
[INFO] | +- org.glassfish.jaxb:txw2:jar:2.3.3:compile
[INFO] | +- com.sun.istack:istack-commons-runtime:jar:3.0.11:compile
[INFO] | \- com.sun.activation:jakarta.activation:jar:1.2.2:runtime
[INFO] +- org.springdoc:springdoc-openapi-ui:jar:1.5.2:compile
[INFO] | +- org.springdoc:springdoc-openapi-webmvc-core:jar:1.5.2:compile
[INFO] | | \- org.springdoc:springdoc-openapi-common:jar:1.5.2:compile
[INFO] | | +- io.swagger.core.v3:swagger-models:jar:2.1.6:compile
[INFO] | | +- io.swagger.core.v3:swagger-annotations:jar:2.1.6:compile
[INFO] | | +- io.swagger.core.v3:swagger-integration:jar:2.1.6:compile
[INFO] | | | \- io.swagger.core.v3:swagger-core:jar:2.1.6:compile
[INFO] | | +- io.github.classgraph:classgraph:jar:4.8.69:compile
[INFO] | | \- org.apache.commons:commons-lang3:jar:3.11:compile
[INFO] | +- org.webjars:swagger-ui:jar:3.38.0:compile
[INFO] | \- org.webjars:webjars-locator-core:jar:0.46:compile
[INFO] +- org.springdoc:springdoc-openapi-data-rest:jar:1.5.2:compile
[INFO] | +- org.springdoc:springdoc-openapi-hateoas:jar:1.5.2:compile
[INFO] | | \- org.springframework.hateoas:spring-hateoas:jar:1.2.3:compile
[INFO] | \- org.springframework.data:spring-data-rest-core:jar:3.4.2:compile
[INFO] | +- org.springframework.plugin:spring-plugin-core:jar:2.0.0.RELEASE:compile
[INFO] | \- org.atteo:evo-inflector:jar:1.2.2:compile
[INFO] +- com.datadoghq:dd-trace-api:jar:0.66.0:compile
[INFO] | \- org.slf4j:slf4j-api:jar:1.7.30:compile
[INFO] +- org.springframework.cloud:spring-cloud-starter-sleuth:jar:3.0.0:compile
[INFO] | +- org.springframework.cloud:spring-cloud-starter:jar:3.0.0:compile
[INFO] | | +- org.springframework.cloud:spring-cloud-context:jar:3.0.0:compile
[INFO] | | | \- org.springframework.security:spring-security-crypto:jar:5.4.2:compile
[INFO] | | +- org.springframework.cloud:spring-cloud-commons:jar:3.0.0:compile
[INFO] | | \- org.springframework.security:spring-security-rsa:jar:1.0.9.RELEASE:compile
[INFO] | | \- org.bouncycastle:bcpkix-jdk15on:jar:1.64:compile
[INFO] | | \- org.bouncycastle:bcprov-jdk15on:jar:1.64:compile
[INFO] | +- org.springframework.cloud:spring-cloud-sleuth-autoconfigure:jar:3.0.0:compile
[INFO] | | +- org.springframework.cloud:spring-cloud-sleuth-instrumentation:jar:3.0.0:compile
[INFO] | | | \- org.springframework.cloud:spring-cloud-sleuth-api:jar:3.0.0:compile
[INFO] | | \- org.aspectj:aspectjrt:jar:1.9.6:compile
[INFO] | \- org.springframework.cloud:spring-cloud-sleuth-brave:jar:3.0.0:compile
[INFO] | +- io.zipkin.brave:brave:jar:5.13.2:compile
[INFO] | +- io.zipkin.brave:brave-context-slf4j:jar:5.13.2:compile
[INFO] | +- io.zipkin.brave:brave-instrumentation-messaging:jar:5.13.2:compile
[INFO] | +- io.zipkin.brave:brave-instrumentation-rpc:jar:5.13.2:compile
[INFO] | +- io.zipkin.brave:brave-instrumentation-spring-rabbit:jar:5.13.2:compile
[INFO] | +- io.zipkin.brave:brave-instrumentation-kafka-clients:jar:5.13.2:compile
[INFO] | +- io.zipkin.brave:brave-instrumentation-kafka-streams:jar:5.13.2:compile
[INFO] | +- io.zipkin.brave:brave-instrumentation-httpclient:jar:5.13.2:compile
[INFO] | | \- io.zipkin.brave:brave-instrumentation-http:jar:5.13.2:compile
[INFO] | +- io.zipkin.brave:brave-instrumentation-httpasyncclient:jar:5.13.2:compile
[INFO] | +- io.zipkin.brave:brave-instrumentation-jms:jar:5.13.2:compile
[INFO] | +- io.zipkin.brave:brave-instrumentation-mongodb:jar:5.13.2:compile
[INFO] | +- io.zipkin.aws:brave-propagation-aws:jar:0.21.3:compile
[INFO] | \- io.zipkin.reporter2:zipkin-reporter-metrics-micrometer:jar:2.16.1:compile
[INFO] | \- io.zipkin.reporter2:zipkin-reporter:jar:2.16.1:compile
[INFO] | \- io.zipkin.zipkin2:zipkin:jar:2.23.0:compile
[INFO] +- org.redisson:redisson:jar:3.13.2:compile
[INFO] | +- io.netty:netty-common:jar:4.1.55.Final:compile
[INFO] | +- io.netty:netty-codec:jar:4.1.55.Final:compile
[INFO] | +- io.netty:netty-buffer:jar:4.1.55.Final:compile
[INFO] | +- io.netty:netty-transport:jar:4.1.55.Final:compile
[INFO] | | \- io.netty:netty-resolver:jar:4.1.55.Final:compile
[INFO] | +- io.netty:netty-resolver-dns:jar:4.1.55.Final:compile
[INFO] | | \- io.netty:netty-codec-dns:jar:4.1.55.Final:compile
[INFO] | +- io.netty:netty-handler:jar:4.1.55.Final:compile
[INFO] | +- javax.cache:cache-api:jar:1.1.1:compile
[INFO] | +- io.projectreactor:reactor-core:jar:3.4.1:compile
[INFO] | | \- org.reactivestreams:reactive-streams:jar:1.0.3:compile
[INFO] | +- io.reactivex.rxjava2:rxjava:jar:2.2.20:compile
[INFO] | +- org.jboss.marshalling:jboss-marshalling-river:jar:2.0.9.Final:compile
[INFO] | | \- org.jboss.marshalling:jboss-marshalling:jar:2.0.9.Final:compile
[INFO] | +- org.yaml:snakeyaml:jar:1.27:compile
[INFO] | +- com.fasterxml.jackson.dataformat:jackson-dataformat-yaml:jar:2.11.3:compile
[INFO] | +- net.bytebuddy:byte-buddy:jar:1.10.19:compile
[INFO] | \- org.jodd:jodd-bean:jar:5.0.13:compile
[INFO] | \- org.jodd:jodd-core:jar:5.0.13:compile
[INFO] \- commons-codec:commons-codec:jar:1.15:compile
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 6.008 s
[INFO] Finished at: 2021-01-25T11:32:30-05:00
[INFO] ------------------------------------------------------------------------
C:\Users\test-api>
ANSWER
Answered 2021-Jan-29 at 14:01

Importing spring-cloud-sleuth as a BOM for dependency management is very suspect. After replacing this:

<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-sleuth</artifactId>
    <version>3.0.0</version>
    <type>pom</type>
    <scope>import</scope>
</dependency>

with this one, it works fine:

<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-dependencies</artifactId>
    <version>${spring-cloud.version}</version>
    <type>pom</type>
    <scope>import</scope>
</dependency>
QUESTION
I added this to my Cargo.toml file, following the instructions here:
[dependencies]
clickhouse = "0.6.3"
reflection = "0.1.3"
but when I run cargo build, I get a failure saying:
Compiling clickhouse v0.6.3
error[E0433]: failed to resolve: could not find `test` in `tokio`
--> /Users/gudjonragnar/.cargo/registry/src/github.com-1ecc6299db9ec823/clickhouse-0.6.3/src/compression/lz4.rs:163:10
|
163 | #[tokio::test]
| ^^^^ could not find `test` in `tokio`
error: aborting due to previous error
For more information about this error, try `rustc --explain E0433`.
error: could not compile `clickhouse`
To learn more, run the command again with --verbose.
warning: build failed, waiting for other jobs to finish...
error: build failed
I am quite new to Rust, so I don't know what to do here; any thoughts?
I am running on macOS Big Sur, if that is relevant.
ANSWER
Answered 2021-May-13 at 20:49

I am getting this error on Linux as well. This appears to be an issue in the clickhouse crate, but it can be fixed in your Cargo.toml. #[tokio::test] refers to a macro which requires both the "rt" and "macros" features, but the Cargo.toml in the clickhouse crate only enables the "rt" feature. To add the missing feature so that the crate will compile, add a line to your Cargo.toml for tokio that enables both:

tokio = { version = "1.0.1", features = ["rt", "macros"] }

Adding this line fixed the compiler error for me.

I noticed there is another clickhouse crate, which might also be helpful.
QUESTION
Previously I reported this in the kafkacat tracker, but the issue was closed as related to cyrus-sasl/krb5.
podman run --rm -it --name kafkacat-DEV \
-v$(pwd)/conf/integration:/conf -v$(pwd)/conf/integration/krb5.conf:/etc/krb5.conf \
localhost/kafkacat_gssapi:1 \
kafkacat \
-b kafka-int.epm-eco.projects.epam.com:9095 \
-Xssl.ca.location=/conf/epm-eco-int.ca.crt \
-Xsecurity.protocol=SASL_SSL \
-Xsasl.mechanisms=GSSAPI \
'-Xsasl.kerberos.kinit.cmd=cat /conf/paswd | /usr/bin/kinit Pavel_Alexeev@PETERSBURG.EPAM.COM' \
-Xsasl.kerberos.service.name=kafka \
-m30 -L
Error:
Password for Pavel_Alexeev@PETERSBURG.EPAM.COM:
%5|1620820968.991|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
%2|1620820969.336|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI Error: Miscellaneous failure (see text) (Matching credential (kafka/ecsc00a09ead.epam.com@EPAM.COM) not found)
%3|1620820969.336|FAIL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: Failed to initialize SASL authentication: SASL handshake failed (start (-1)): SASL(-1): generic failure: GSSAPI Error: Miscellaneous failure (see text) (Matching credential (kafka/ecsc00a09ead.epam.com@EPAM.COM) not found) (after 345ms in state AUTH_REQ)
%5|1620820970.006|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
%2|1620820970.137|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI Error: Miscellaneous failure (see text) (Matching credential (kafka/ecsc00a09ead.epam.com@EPAM.COM) not found)
%3|1620820970.137|FAIL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: Failed to initialize SASL authentication: SASL handshake failed (start (-1)): SASL(-1): generic failure: GSSAPI Error: Miscellaneous failure (see text) (Matching credential (kafka/ecsc00a09ead.epam.com@EPAM.COM) not found) (after 131ms in state AUTH_REQ, 1 identical error(s) suppressed)
%5|1620820971.431|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
%2|1620820972.935|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI Error: Miscellaneous failure (see text) (Matching credential (kafka/ecsc00a09ead.epam.com@EPAM.COM) not found)
%5|1620820976.319|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
%2|1620820976.745|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI Error: Miscellaneous failure (see text) (Matching credential (kafka/ecsc00a09ead.epam.com@EPAM.COM) not found)
%5|1620820987.183|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
%2|1620820987.651|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI Error: Miscellaneous failure (see text) (Matching credential (kafka/ecsc00a09ead.epam.com@EPAM.COM) not found)
%5|1620820998.114|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
%2|1620820998.480|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI Error: Miscellaneous failure (see text) (Matching credential (kafka/ecsc00a09ead.epam.com@EPAM.COM) not found)
% ERROR: Failed to acquire metadata: Local: Broker transport failure
The image localhost/kafkacat_gssapi:1 is built from this Dockerfile:
FROM docker.io/edenhill/kafkacat:1.6.0
RUN apk add --no-cache cyrus-sasl cyrus-sasl-gssapiv2 krb5 openssl ca-certificates
$ podman run -it --rm localhost/kafkacat_gssapi:1 -V
kafkacat - Apache Kafka producer and consumer tool
https://github.com/edenhill/kafkacat
Copyright (c) 2014-2019, Magnus Edenhill
Version 1.6.0 (JSON, Avro, Transactions, librdkafka 1.5.0 builtin.features=gzip,snappy,ssl,sasl,regex,lz4,sasl_gssapi,sasl_plain,sasl_scram,plugins,zstd,sasl_oauthbearer)
The same run on Fedora 33 (where everything works):
kafkacat \
-b kafka-int.epm-eco.projects.epam.com:9095 \
-Xssl.ca.location=conf/integration/epm-eco-int.ca.crt \
-Xsecurity.protocol=SASL_SSL \
-Xsasl.mechanisms=GSSAPI \
'-Xsasl.kerberos.kinit.cmd=cat conf/paswd | /usr/bin/kinit Pavel_Alexeev@PETERSBURG.EPAM.COM' \
-Xsasl.kerberos.service.name=kafka \
-m 30 \
-L
Password for Pavel_Alexeev@PETERSBURG.EPAM.COM:
%5|1620821374.957|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
%5|1620821384.957|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
%5|1620821385.027|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 2
Metadata for all topics (from broker -1: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap):
3 brokers:
broker 202 at ecsc00a09eab.epam.com:9095
broker 201 at ecsc00a09eaa.epam.com:9095 (controller)
broker 203 at ecsc00a09eac.epam.com:9095
1440 topics:
topic "test-topic" with 1 partitions:
partition 0, leader 202, replicas: 202,203,201, isrs: 201,203,202
topic "datahub.epm_prj.EffectiveFrom" with 1 partitions:
partition 0, leader 201, replicas: 201,203, isrs: 201,203
...
What's interesting: if I run with KRB5_TRACE=/dev/stdout, in the directly working case I see (full log):
[2616356] 1620807906.324107: Sending DNS URI query for _kerberos.PETERSBURG.EPAM.COM.
[2616356] 1620807906.324108: No URI records found
[2616356] 1620807906.324109: Sending DNS SRV query for _kerberos._tcp.PETERSBURG.EPAM.COM.
[2616356] 1620807906.324110: SRV answer: 0 100 88 "evrupetsa0001.petersburg.epam.com."
[2616356] 1620807906.324111: SRV answer: 0 100 88 "evrupetsa0007.petersburg.epam.com."
[2616356] 1620807906.324112: SRV answer: 0 100 88 "evbyminsa0007.petersburg.epam.com."
[2616356] 1620807906.324113: SRV answer: 0 100 88 "evusprisa0049.petersburg.epam.com."
[2616356] 1620807906.324114: SRV answer: 0 100 88 "evhubudsa0001.petersburg.epam.com."
[2616356] 1620807906.324115: Resolving hostname evrupetsa0001.petersburg.epam.com.
[2616356] 1620807906.324116: Initiating TCP connection to stream 10.66.110.11:88
[2616356] 1620807906.324117: Sending TCP request to stream 10.66.110.11:88
[2616356] 1620807906.324118: Received answer (4491 bytes) from stream 10.66.110.11:88
[2616356] 1620807906.324119: Terminating TCP connection to stream 10.66.110.11:88
[2616356] 1620807906.324120: Sending DNS URI query for _kerberos.PETERSBURG.EPAM.COM.
[2616356] 1620807906.324121: No URI records found
[2616356] 1620807906.324122: Sending DNS SRV query for _kerberos-master._tcp.PETERSBURG.EPAM.COM.
[2616356] 1620807906.324123: No SRV records found
[2616356] 1620807906.324124: Response was not from master KDC
[2616356] 1620807906.324125: Processing preauth types: PA-ETYPE-INFO2 (19)
[2616356] 1620807906.324126: Selected etype info: etype aes256-cts, salt "PETERSBURG.EPAM.COMPavel_Alexeev", params ""
[2616356] 1620807906.324127: Produced preauth for next request: (empty)
[2616356] 1620807906.324128: AS key determined by preauth: aes256-cts/83C9
[2616356] 1620807906.324129: Decrypted AS reply; session key is: aes256-cts/CCEB
[2616356] 1620807906.324130: FAST negotiation: unavailable
[2616356] 1620807906.324131: Initializing FILE:/tmp/krb5cc_1000 with default princ Pavel_Alexeev@PETERSBURG.EPAM.COM
[2616356] 1620807906.324132: Storing Pavel_Alexeev@PETERSBURG.EPAM.COM -> krbtgt/PETERSBURG.EPAM.COM@PETERSBURG.EPAM.COM in FILE:/tmp/krb5cc_1000
[2616356] 1620807906.324133: Storing config in FILE:/tmp/krb5cc_1000 for krbtgt/PETERSBURG.EPAM.COM@PETERSBURG.EPAM.COM: pa_type: 2
[2616356] 1620807906.324134: Storing Pavel_Alexeev@PETERSBURG.EPAM.COM -> krb5_ccache_conf_data/pa_type/krbtgt\/PETERSBURG.EPAM.COM\@PETERSBURG.EPAM.COM@X-CACHECONF: in FILE:/tmp/krb5cc_1000
%5|1620807906.993|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
[2616353] 1620807906.996539: ccselect can't find appropriate cache for server principal kafka/kafka-int.epm-eco.projects.epam.com@EPAM.COM
[2616353] 1620807906.996540: Getting credentials Pavel_Alexeev@PETERSBURG.EPAM.COM -> kafka/kafka-int.epm-eco.projects.epam.com@EPAM.COM using ccache FILE:/tmp/krb5cc_1000
[2616353] 1620807906.996541: Retrieving Pavel_Alexeev@PETERSBURG.EPAM.COM -> kafka/kafka-int.epm-eco.projects.epam.com@EPAM.COM from FILE:/tmp/krb5cc_1000 with result: -1765328243/Matching credential not found (filename: /tmp/krb5cc_1000)
[2616353] 1620807906.996542: Retrieving Pavel_Alexeev@PETERSBURG.EPAM.COM -> krbtgt/EPAM.COM@EPAM.COM from FILE:/tmp/krb5cc_1000 with result: -1765328243/Matching credential not found (filename: /tmp/krb5cc_1000)
[2616353] 1620807906.996543: Retrieving Pavel_Alexeev@PETERSBURG.EPAM.COM -> krbtgt/PETERSBURG.EPAM.COM@PETERSBURG.EPAM.COM from FILE:/tmp/krb5cc_1000 with result: 0/Success
[2616353] 1620807906.996544: Starting with TGT for client realm: Pavel_Alexeev@PETERSBURG.EPAM.COM -> krbtgt/PETERSBURG.EPAM.COM@PETERSBURG.EPAM.COM
[2616353] 1620807906.996545: Retrieving Pavel_Alexeev@PETERSBURG.EPAM.COM -> krbtgt/EPAM.COM@EPAM.COM from FILE:/tmp/krb5cc_1000 with result: -1765328243/Matching credential not found (filename: /tmp/krb5cc_1000)
[2616353] 1620807906.996546: Requesting TGT krbtgt/EPAM.COM@PETERSBURG.EPAM.COM using TGT krbtgt/PETERSBURG.EPAM.COM@PETERSBURG.EPAM.COM
[2616353] 1620807906.996547: Generated subkey for TGS request: aes256-cts/69EE
[2616353] 1620807906.996548: etypes requested in TGS request: aes256-cts, aes128-cts, aes256-sha2, aes128-sha2, rc4-hmac, camellia128-cts, camellia256-cts
[2616353] 1620807906.996550: Encoding request body and padata into FAST request
[2616353] 1620807906.996551: Sending request (4611 bytes) to PETERSBURG.EPAM.COM
[2616353] 1620807906.996552: Sending DNS URI query for _kerberos.PETERSBURG.EPAM.COM.
[2616353] 1620807907.003430: No URI records found
[2616353] 1620807907.003431: Sending DNS SRV query for _kerberos._udp.PETERSBURG.EPAM.COM.
[2616353] 1620807907.003432: SRV answer: 0 100 88 "evrupetsa0007.petersburg.epam.com."
[2616353] 1620807907.003433: SRV answer: 0 100 88 "evrupetsa0001.petersburg.epam.com."
[2616353] 1620807907.003434: SRV answer: 0 100 88 "evusprisa0049.petersburg.epam.com."
[2616353] 1620807907.003435: SRV answer: 0 100 88 "evhubudsa0001.petersburg.epam.com."
[2616353] 1620807907.003436: SRV answer: 0 100 88 "evbyminsa0007.petersburg.epam.com."
[2616353] 1620807907.003437: Sending DNS SRV query for _kerberos._tcp.PETERSBURG.EPAM.COM.
[2616353] 1620807907.003438: SRV answer: 0 100 88 "evrupetsa0007.petersburg.epam.com."
[2616353] 1620807907.003439: SRV answer: 0 100 88 "evbyminsa0007.petersburg.epam.com."
[2616353] 1620807907.003440: SRV answer: 0 100 88 "evusprisa0049.petersburg.epam.com."
[2616353] 1620807907.003441: SRV answer: 0 100 88 "evhubudsa0001.petersburg.epam.com."
[2616353] 1620807907.003442: SRV answer: 0 100 88 "evrupetsa0001.petersburg.epam.com."
[2616353] 1620807907.003443: Resolving hostname evrupetsa0007.petersburg.epam.com.
[2616353] 1620807907.003444: Resolving hostname evrupetsa0001.petersburg.epam.com.
[2616353] 1620807907.003445: Resolving hostname evusprisa0049.petersburg.epam.com.
[2616353] 1620807907.003446: Resolving hostname evhubudsa0001.petersburg.epam.com.
[2616353] 1620807907.003447: Resolving hostname evbyminsa0007.petersburg.epam.com.
[2616353] 1620807907.003448: Resolving hostname evrupetsa0007.petersburg.epam.com.
[2616353] 1620807907.003449: Initiating TCP connection to stream 10.66.110.17:88
[2616353] 1620807907.003450: Sending TCP request to stream 10.66.110.17:88
[2616353] 1620807907.003451: Received answer (4542 bytes) from stream 10.66.110.17:88
[2616353] 1620807907.003452: Terminating TCP connection to stream 10.66.110.17:88
[2616353] 1620807907.003453: Sending DNS URI query for _kerberos.PETERSBURG.EPAM.COM.
[2616353] 1620807907.003454: No URI records found
[2616353] 1620807907.003455: Sending DNS SRV query for _kerberos-master._tcp.PETERSBURG.EPAM.COM.
[2616353] 1620807907.003456: No SRV records found
[2616353] 1620807907.003457: Response was not from master KDC
[2616353] 1620807907.003458: Decoding FAST response
[2616353] 1620807907.003459: FAST reply key: aes256-cts/9007
[2616353] 1620807907.003460: TGS reply is for Pavel_Alexeev@PETERSBURG.EPAM.COM -> krbtgt/EPAM.COM@PETERSBURG.EPAM.COM with session key rc4-hmac/7459
[2616353] 1620807907.003461: TGS request result: 0/Success
[2616353] 1620807907.003462: Storing Pavel_Alexeev@PETERSBURG.EPAM.COM -> krbtgt/EPAM.COM@PETERSBURG.EPAM.COM in FILE:/tmp/krb5cc_1000
[2616353] 1620807907.003463: Received TGT for service realm: krbtgt/EPAM.COM@PETERSBURG.EPAM.COM
[2616353] 1620807907.003464: Requesting tickets for kafka/kafka-int.epm-eco.projects.epam.com@EPAM.COM, referrals on
[2616353] 1620807907.003465: Generated subkey for TGS request: rc4-hmac/F9B4
[2616353] 1620807907.003466: etypes requested in TGS request: aes256-cts, aes128-cts, aes256-sha2, aes128-sha2, rc4-hmac, camellia128-cts, camellia256-cts
[2616353] 1620807907.003468: Encoding request body and padata into FAST request
[2616353] 1620807907.003469: Sending request (4615 bytes) to EPAM.COM
[2616353] 1620807907.003470: Sending DNS URI query for _kerberos.EPAM.COM.
[2616353] 1620807907.003471: No URI records found
[2616353] 1620807907.003472: Sending DNS SRV query for _kerberos._udp.EPAM.COM.
[2616353] 1620807907.003473: SRV answer: 0 100 88 "evusbossa0000.epam.com."
[2616353] 1620807907.003474: SRV answer: 0 100 88 "evkzastsa0000.epam.com."
...
[2616353] 1620807907.003762: Resolving hostname EVUAVINSA0000.epam.com.
[2616353] 1620807907.003763: Resolving hostname EVUSCONSA0000.epam.com.
[2616353] 1620807907.003764: Initiating TCP connection to stream 10.22.128.2:88
[2616353] 1620807907.003765: Sending TCP request to stream 10.22.128.2:88
[2616353] 1620807908.732538: Received answer (4631 bytes) from stream 10.22.128.2:88
[2616353] 1620807908.732539: Terminating TCP connection to stream 10.22.128.2:88
[2616353] 1620807908.732540: Sending DNS URI query for _kerberos.EPAM.COM.
[2616353] 1620807908.732541: No URI records found
[2616353] 1620807908.732542: Sending DNS SRV query for _kerberos-master._tcp.EPAM.COM.
[2616353] 1620807908.732543: No SRV records found
[2616353] 1620807908.732544: Response was not from master KDC
[2616353] 1620807908.732545: Decoding FAST response
[2616353] 1620807908.732546: FAST reply key: rc4-hmac/2A6B
[2616353] 1620807908.732547: TGS reply is for Pavel_Alexeev@PETERSBURG.EPAM.COM -> kafka/kafka-int.epm-eco.projects.epam.com@EPAM.COM with session key aes256-cts/F35A
[2616353] 1620807908.732548: TGS request result: 0/Success
[2616353] 1620807908.732549: Received creds for desired service kafka/kafka-int.epm-eco.projects.epam.com@EPAM.COM
[2616353] 1620807908.732550: Storing Pavel_Alexeev@PETERSBURG.EPAM.COM -> kafka/kafka-int.epm-eco.projects.epam.com@EPAM.COM in FILE:/tmp/krb5cc_1000
[2616353] 1620807908.732552: Creating authenticator for Pavel_Alexeev@PETERSBURG.EPAM.COM -> kafka/kafka-int.epm-eco.projects.epam.com@EPAM.COM, seqnum 691537013, subkey aes256-cts/A6FD, session key aes256-cts/F35A
%5|1620807908.873|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
In the failing case in the Alpine container (full log):
[4] 1620808362.090471: Sending DNS URI query for _kerberos.PETERSBURG.EPAM.COM.
[4] 1620808362.090472: No URI records found
[4] 1620808362.090473: Sending DNS SRV query for _kerberos._tcp.PETERSBURG.EPAM.COM.
[4] 1620808362.090474: SRV answer: 0 100 88 "evusprisa0049.petersburg.epam.com."
[4] 1620808362.090475: SRV answer: 0 100 88 "evhubudsa0001.petersburg.epam.com."
[4] 1620808362.090476: SRV answer: 0 100 88 "evrupetsa0001.petersburg.epam.com."
[4] 1620808362.090477: SRV answer: 0 100 88 "evrupetsa0007.petersburg.epam.com."
[4] 1620808362.090478: SRV answer: 0 100 88 "evbyminsa0007.petersburg.epam.com."
[4] 1620808362.090479: Resolving hostname evusprisa0049.petersburg.epam.com.
[4] 1620808362.090480: Initiating TCP connection to stream 10.244.110.7:88
[4] 1620808362.090481: Sending TCP request to stream 10.244.110.7:88
[4] 1620808362.090482: Received answer (4491 bytes) from stream 10.244.110.7:88
[4] 1620808362.090483: Terminating TCP connection to stream 10.244.110.7:88
[4] 1620808362.090484: Sending DNS URI query for _kerberos.PETERSBURG.EPAM.COM.
[4] 1620808362.090485: No URI records found
[4] 1620808362.090486: Sending DNS SRV query for _kerberos-master._tcp.PETERSBURG.EPAM.COM.
[4] 1620808362.090487: No SRV records found
[4] 1620808362.090488: Response was not from master KDC
[4] 1620808362.090489: Processing preauth types: PA-ETYPE-INFO2 (19)
[4] 1620808362.090490: Selected etype info: etype aes256-cts, salt "PETERSBURG.EPAM.COMPavel_Alexeev", params ""
[4] 1620808362.090491: Produced preauth for next request: (empty)
[4] 1620808362.090492: AS key determined by preauth: aes256-cts/83C9
[4] 1620808362.090493: Decrypted AS reply; session key is: aes256-cts/9506
[4] 1620808362.090494: FAST negotiation: unavailable
[4] 1620808362.090495: Initializing FILE:/tmp/krb5cc_0 with default princ Pavel_Alexeev@PETERSBURG.EPAM.COM
[4] 1620808362.090496: Storing Pavel_Alexeev@PETERSBURG.EPAM.COM -> krbtgt/PETERSBURG.EPAM.COM@PETERSBURG.EPAM.COM in FILE:/tmp/krb5cc_0
[4] 1620808362.090497: Storing config in FILE:/tmp/krb5cc_0 for krbtgt/PETERSBURG.EPAM.COM@PETERSBURG.EPAM.COM: pa_type: 2
[4] 1620808362.090498: Storing Pavel_Alexeev@PETERSBURG.EPAM.COM -> krb5_ccache_conf_data/pa_type/krbtgt\/PETERSBURG.EPAM.COM\@PETERSBURG.EPAM.COM@X-CACHECONF: in FILE:/tmp/krb5cc_0
%5|1620808363.004|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI client step 1
%2|1620808363.250|LIBSASL|rdkafka#producer-1| [thrd:sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap]: sasl_ssl://kafka-int.epm-eco.projects.epam.com:9095/bootstrap: GSSAPI Error: Miscellaneous failure (see text) (Matching credential (kafka/ecsc00a09ead.epam.com@EPAM.COM) not found)
What also looks very interesting and strange to me: on the host machine, where everything works, klist shows me 3 tickets right after the kafkacat run:
$ klist
Ticket cache: FILE:/tmp/krb5cc_1000
Default principal: Pavel_Alexeev@PETERSBURG.EPAM.COM
Valid starting Expires Service principal
12/05/21 19:53:26 13/05/21 03:53:26 krbtgt/PETERSBURG.EPAM.COM@PETERSBURG.EPAM.COM
renew until 19/05/21 19:53:26
12/05/21 19:53:27 13/05/21 03:53:26 krbtgt/EPAM.COM@PETERSBURG.EPAM.COM
renew until 19/05/21 19:53:26
12/05/21 19:53:28 13/05/21 03:53:26 kafka/kafka-int.epm-eco.projects.epam.com@EPAM.COM
renew until 13/05/21 05:53:28
But at the same time, if I change the parameter to '-Xsasl.kerberos.kinit.cmd=/usr/bin/kinit --use-referrals --password-file=/conf/paswd Pavel_Alexeev@PETERSBURG.EPAM.COM; klist', I see only one ticket there! For example:
$ podman run --rm -it --name kafkacat-DEV \
-v$(pwd)/conf/integration:/conf -v$(pwd)/conf/integration/krb5.conf:/etc/krb5.conf \
localhost/kafkacat_gssapi_heimdal:3 \
kafkacat \
-b kafka-int.epm-eco.projects.epam.com:9095 \
-Xssl.ca.location=/conf/epm-eco-int.ca.crt \
-Xsecurity.protocol=SASL_SSL \
-Xsasl.mechanisms=GSSAPI \
'-Xsasl.kerberos.kinit.cmd=/usr/bin/kinit --use-referrals --password-file=/conf/paswd Pavel_Alexeev@PETERSBURG.EPAM.COM; klist' \
-Xsasl.kerberos.service.name=kafka \
-m30 -L
...
Credentials cache: FILE:/tmp/krb5cc_0
Principal: Pavel_Alexeev@PETERSBURG.EPAM.COM
Issued Expires Principal
May 12 18:54:50 2021 May 13 02:54:49 2021 krbtgt/PETERSBURG.EPAM.COM@PETERSBURG.EPAM.COM
...
So I can't understand where the other two come from. Running klist after executing kafkacat in the container also shows only a single principal.
I will appreciate any help.
P.S. I have also reported this as a cyrus-sasl issue.
ANSWER
Answered 2021-May-13 at 11:50
A very strange issue, and honestly I can't say why, but adding the following to krb5.conf:
[libdefaults]
dns_canonicalize_hostname = false
solves the problem.
P.S. I've posted this as the solution because it works. But if someone can explain in the comments why it is the solution, and especially how that could be deduced from the provided logs, it would be very helpful. (One plausible reading, offered only as a guess: with dns_canonicalize_hostname enabled, the resolver in the container canonicalizes the broker name kafka-int.epm-eco.projects.epam.com to ecsc00a09ead.epam.com before the service principal is built, so GSSAPI looks for a ticket for kafka/ecsc00a09ead.epam.com@EPAM.COM, exactly the principal named in the "Matching credential not found" error, rather than the kafka/kafka-int.epm-eco.projects.epam.com@EPAM.COM ticket that kinit actually obtained.)
QUESTION
This question appears to have been answered before, but none of the answers helped in my case. First I should say that I've followed the OSMnx Installation steps exactly. Then I tried to run the following code in a Jupyter Notebook:
place = "San Francisco, California, USA"
g = ox.graph_from_place(place, network_type="bike")
After trying this simple code, the following error message was returned:
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
in ()
1 #Get Bay Area Bike Network
2 place = "San Francisco, California, USA"
----> 3 g = ox.graph_from_place(place, network_type="bike")
/Users/jcroff/anaconda3/envs/ox/lib/python3.6/site-packages/osmnx/core.py in graph_from_place(query, network_type, simplify, retain_all, truncate_by_edge, name, which_result, buffer_dist, timeout, memory, max_query_area_size, clean_periphery, infrastructure)
1809 name=name, timeout=timeout, memory=memory,
1810 max_query_area_size=max_query_area_size,
-> 1811 clean_periphery=clean_periphery, infrastructure=infrastructure)
1812
1813 log('graph_from_place() returning graph with {:,} nodes and {:,} edges'.format(len(list(G.nodes())), len(list(G.edges()))))
/Users/jcroff/anaconda3/envs/ox/lib/python3.6/site-packages/osmnx/core.py in graph_from_polygon(polygon, network_type, simplify, retain_all, truncate_by_edge, name, timeout, memory, max_query_area_size, clean_periphery, infrastructure)
1678 # create a new buffered polygon 0.5km around the desired one
1679 buffer_dist = 500
-> 1680 polygon_utm, crs_utm = project_geometry(geometry=polygon)
1681 polygon_proj_buff = polygon_utm.buffer(buffer_dist)
1682 polygon_buffered, _ = project_geometry(geometry=polygon_proj_buff, crs=crs_utm, to_latlong=True)
/Users/jcroff/anaconda3/envs/ox/lib/python3.6/site-packages/osmnx/projection.py in project_geometry(geometry, crs, to_crs, to_latlong)
51 gdf['geometry'] = None
52 gdf.loc[0, 'geometry'] = geometry
---> 53 gdf_proj = project_gdf(gdf, to_crs=to_crs, to_latlong=to_latlong)
54 geometry_proj = gdf_proj['geometry'].iloc[0]
55 return geometry_proj, gdf_proj.crs
/Users/jcroff/anaconda3/envs/ox/lib/python3.6/site-packages/osmnx/projection.py in project_gdf(gdf, to_crs, to_latlong)
100 # else, project the gdf to UTM
101 # if GeoDataFrame is already in UTM, just return it
--> 102 if (gdf.crs is not None) and ('proj' in gdf.crs) and (gdf.crs['proj'] == 'utm'):
103 return gdf
104
TypeError: argument of type 'CRS' is not iterable
For reference, here are the packages in my conda environment:
# packages in environment at /Users/jcroff/anaconda3/envs/ox:
#
# Name Version Build Channel
appnope 0.1.2 py36h79c6626_1 conda-forge
argon2-cffi 20.1.0 py36h20b66c6_2 conda-forge
async_generator 1.10 py_0 conda-forge
attrs 21.2.0 pyhd8ed1ab_0 conda-forge
backports 1.0 py_2 conda-forge
backports.functools_lru_cache 1.6.4 pyhd8ed1ab_0 conda-forge
bleach 3.3.0 pyh44b312d_0 conda-forge
boost-cpp 1.74.0 h43a636a_2 conda-forge
branca 0.4.2 pyhd8ed1ab_0 conda-forge
brotlipy 0.7.0 py36h20b66c6_1001 conda-forge
bzip2 1.0.8 h0d85af4_4 conda-forge
c-ares 1.17.1 h0d85af4_1 conda-forge
ca-certificates 2020.12.5 h033912b_0 conda-forge
cairo 1.16.0 he43a7df_1008 conda-forge
certifi 2020.12.5 py36h79c6626_1 conda-forge
cffi 1.14.5 py36hfaecaff_0 conda-forge
cfitsio 3.470 h01dc385_7 conda-forge
chardet 4.0.0 py36h79c6626_1 conda-forge
click 7.1.2 pyh9f0ad1d_0 conda-forge
click-plugins 1.1.1 py_0 conda-forge
cligj 0.7.1 pyhd8ed1ab_0 conda-forge
cryptography 3.4.7 py36h3d45be8_0 conda-forge
curl 7.76.1 h06286d4_1 conda-forge
cycler 0.10.0 py_2 conda-forge
decorator 5.0.7 pyhd8ed1ab_0 conda-forge
defusedxml 0.7.1 pyhd8ed1ab_0 conda-forge
descartes 1.1.0 py_4 conda-forge
entrypoints 0.3 pyhd8ed1ab_1003 conda-forge
expat 2.3.0 he49afe7_0 conda-forge
fiona 1.8.19 py36hba155ba_0 conda-forge
folium 0.12.0 pyhd8ed1ab_1 conda-forge
fontconfig 2.13.1 h10f422b_1005 conda-forge
freetype 2.10.4 h4cff582_1 conda-forge
freexl 1.0.6 h0d85af4_0 conda-forge
gdal 3.2.2 py36h99bc8e5_3 conda-forge
geographiclib 1.50 py_0 conda-forge
geopandas 0.9.0 pyhd8ed1ab_0 conda-forge
geopy 2.1.0 pyhd3deb0d_0 conda-forge
geos 3.9.1 he49afe7_2 conda-forge
geotiff 1.6.0 hba2ba3e_5 conda-forge
gettext 0.19.8.1 h7937167_1005 conda-forge
giflib 5.2.1 hbcb3906_2 conda-forge
hdf4 4.2.13 hefd3b78_1005 conda-forge
hdf5 1.10.6 nompi_hc5d9132_1114 conda-forge
icu 68.1 h74dc148_0 conda-forge
idna 2.10 pyh9f0ad1d_0 conda-forge
importlib-metadata 4.0.1 py36h79c6626_0 conda-forge
ipykernel 5.5.4 py36h495a4c6_0 conda-forge
ipython 5.8.0 py36_1 conda-forge
ipython_genutils 0.2.0 py_1 conda-forge
jinja2 2.11.3 pyh44b312d_0 conda-forge
jpeg 9d hbcb3906_0 conda-forge
json-c 0.15 hcb556a6_0 conda-forge
jsonschema 3.2.0 pyhd8ed1ab_3 conda-forge
jupyter_client 6.1.12 pyhd8ed1ab_0 conda-forge
jupyter_core 4.7.1 py36h79c6626_0 conda-forge
jupyterlab_pygments 0.1.2 pyh9f0ad1d_0 conda-forge
kealib 1.4.14 h31dd65d_2 conda-forge
kiwisolver 1.3.1 py36h615c93b_1 conda-forge
krb5 1.17.2 h60d9502_0 conda-forge
libblas 3.9.0 9_openblas conda-forge
libcblas 3.9.0 9_openblas conda-forge
libcurl 7.76.1 h8ef9fac_1 conda-forge
libcxx 11.1.0 habf9029_0 conda-forge
libdap4 3.20.6 h3e144a0_2 conda-forge
libedit 3.1.20191231 h0678c8f_2 conda-forge
libev 4.33 haf1e3a3_1 conda-forge
libffi 3.3 h046ec9c_2 conda-forge
libgdal 3.2.2 h9a52621_3 conda-forge
libgfortran 5.0.0 9_3_0_h6c81a4c_22 conda-forge
libgfortran5 9.3.0 h6c81a4c_22 conda-forge
libglib 2.68.2 hd556434_0 conda-forge
libiconv 1.16 haf1e3a3_0 conda-forge
libkml 1.3.0 h8fd9edb_1013 conda-forge
liblapack 3.9.0 9_openblas conda-forge
libnetcdf 4.8.0 nompi_h81fa352_101 conda-forge
libnghttp2 1.43.0 h07e645a_0 conda-forge
libopenblas 0.3.15 openmp_h5e1b9a4_0 conda-forge
libpng 1.6.37 h7cec526_2 conda-forge
libpq 13.2 h052a64a_2 conda-forge
librttopo 1.1.0 h5413771_6 conda-forge
libsodium 1.0.18 hbcb3906_1 conda-forge
libspatialindex 1.9.3 h1c7c35f_3 conda-forge
libspatialite 5.0.1 heb715ac_4 conda-forge
libssh2 1.9.0 h52ee1ee_6 conda-forge
libtiff 4.2.0 h7c11950_1 conda-forge
libwebp-base 1.2.0 h0d85af4_2 conda-forge
libxml2 2.9.10 h93ec3fd_4 conda-forge
libzip 1.7.3 hbc046b2_0 conda-forge
llvm-openmp 11.1.0 hda6cdc1_1 conda-forge
lz4-c 1.9.3 h046ec9c_0 conda-forge
markupsafe 1.1.1 py36h20b66c6_3 conda-forge
matplotlib 3.2.2 1 conda-forge
matplotlib-base 3.2.2 py36h83d3ec1_1 conda-forge
mistune 0.8.4 py36h20b66c6_1003 conda-forge
munch 2.5.0 py_0 conda-forge
nbclient 0.5.3 pyhd8ed1ab_0 conda-forge
nbconvert 6.0.7 py36h79c6626_3 conda-forge
nbformat 5.1.3 pyhd8ed1ab_0 conda-forge
ncurses 6.2 h2e338ed_4 conda-forge
nest-asyncio 1.5.1 pyhd8ed1ab_0 conda-forge
networkx 2.3 py_0 conda-forge
notebook 6.3.0 py36h79c6626_0 conda-forge
numpy 1.19.5 py36h08dc641_1 conda-forge
openjpeg 2.4.0 h6cbf5cd_0 conda-forge
openssl 1.1.1k h0d85af4_0 conda-forge
osmnx 0.7.3 py36_0 conda-forge
packaging 20.9 pyh44b312d_0 conda-forge
pandas 1.1.5 py36h2be6da3_0 conda-forge
pandoc 2.13 h0d85af4_0 conda-forge
pandocfilters 1.4.2 py_1 conda-forge
pcre 8.44 hb1e8313_0 conda-forge
pexpect 4.8.0 pyh9f0ad1d_2 conda-forge
pickleshare 0.7.5 py_1003 conda-forge
pip 21.1.1 pyhd8ed1ab_0 conda-forge
pixman 0.40.0 hbcb3906_0 conda-forge
poppler 21.03.0 h640f9a4_0 conda-forge
poppler-data 0.4.10 0 conda-forge
postgresql 13.2 ha63e576_2 conda-forge
proj 8.0.0 h1512c50_0 conda-forge
prometheus_client 0.10.1 pyhd8ed1ab_0 conda-forge
prompt_toolkit 1.0.15 py_1 conda-forge
ptyprocess 0.7.0 pyhd3deb0d_0 conda-forge
pycparser 2.20 pyh9f0ad1d_2 conda-forge
pygments 2.9.0 pyhd8ed1ab_0 conda-forge
pyopenssl 20.0.1 pyhd8ed1ab_0 conda-forge
pyparsing 2.4.7 pyh9f0ad1d_0 conda-forge
pyproj 3.0.1 py36hc662631_1 conda-forge
pyrsistent 0.17.3 py36h20b66c6_2 conda-forge
pysocks 1.7.1 py36h79c6626_3 conda-forge
python 3.6.13 h7728216_0_cpython conda-forge
python-dateutil 2.8.1 py_0 conda-forge
python_abi 3.6 1_cp36m conda-forge
pytz 2021.1 pyhd8ed1ab_0 conda-forge
pyzmq 22.0.3 py36h50cd92c_1 conda-forge
readline 8.1 h05e3726_0 conda-forge
requests 2.25.1 pyhd3deb0d_0 conda-forge
rtree 0.9.7 py36h49c2f37_1 conda-forge
send2trash 1.5.0 py_0 conda-forge
setuptools 49.6.0 py36h79c6626_3 conda-forge
shapely 1.7.1 py36h7f0d9e5_4 conda-forge
simplegeneric 0.8.1 py_1 conda-forge
six 1.16.0 pyh6c4a22f_0 conda-forge
sqlite 3.35.5 h44b9ce1_0 conda-forge
terminado 0.9.4 py36h79c6626_0 conda-forge
testpath 0.4.4 py_0 conda-forge
tiledb 2.2.9 he9a4fb4_0 conda-forge
tk 8.6.10 h0419947_1 conda-forge
tornado 6.1 py36h20b66c6_1 conda-forge
traitlets 4.3.3 py36h9f0ad1d_1 conda-forge
typing_extensions 3.7.4.3 py_0 conda-forge
tzcode 2021a h0d85af4_1 conda-forge
tzdata 2021a he74cb21_0 conda-forge
urllib3 1.26.4 pyhd8ed1ab_0 conda-forge
wcwidth 0.2.5 pyh9f0ad1d_2 conda-forge
webencodings 0.5.1 py_1 conda-forge
wheel 0.36.2 pyhd3deb0d_0 conda-forge
xerces-c 3.2.3 h379762d_2 conda-forge
xz 5.2.5 haf1e3a3_1 conda-forge
zeromq 4.3.4 h1c7c35f_0 conda-forge
zipp 3.4.1 pyhd8ed1ab_0 conda-forge
zlib 1.2.11 h7795811_1010 conda-forge
zstd 1.4.9 h582d3a0_0 conda-forge
In reviewing the answers from the question mentioned above, it looks like the developer does not support reverting to an older version of GeoPandas, as the past couple of releases require geopandas>=0.7. Any help in resolving this issue would be very much appreciated! Thank you.
ANSWER
Answered 2021-May-13 at 04:04
You have installed an extremely old version of OSMnx. Your conda list output shows you have version 0.7.3 installed, and that was released 3 or 4 years ago. It's so old that it's incompatible with the modern features of GeoPandas and pyproj, including the modern CRS object that's causing your error. I'm not clear how you did it! My best guess is you installed using one of the old tags on this page, which do point to version 0.7.3.
This should be fixed by removing the old environment and then following the installation instructions here, like:
conda env remove -n ox
conda clean --all --yes
conda config --prepend channels conda-forge
conda create -n ox --strict-channel-priority osmnx
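After recreating the environment, a quick sanity check (a minimal sketch, reusing the query from the question) confirms that a modern OSMnx is active and the original call succeeds:
import osmnx as ox

# A modern release should be reported here, not 0.7.3
print(ox.__version__)

# The original call from the question should now work
place = "San Francisco, California, USA"
g = ox.graph_from_place(place, network_type="bike")
print(len(g.nodes), len(g.edges))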
QUESTION
I have a problem similar to this RStudio community post and to this stack overflow post.
I have tried the solutions presented in both cases. I still cannot get arrow installed with lz4 support. I am trying to be able to use arrow::read_feather(), which requires lz4 support.
After following the instructions in the first solution, I get the following error when trying to load the arrow package.
> library(arrow)
Error: package or namespace load failed for ‘arrow’ in dyn.load(file, DLLpath = DLLpath, ...):
unable to load shared object '/home/rstudio-user/R/x86_64-pc-linux-gnu-library/4.0/arrow/libs/arrow.so':
libcrypto.so.1.0.0: cannot open shared object file: No such file or directory
Any ideas on how to install arrow with lz4 support in RStudio Cloud?
ANSWER
Answered 2021-May-11 at 20:07
Thanks to the comment from @JonKeane and his answer to my Jira issue, I was able to use
Sys.setenv(LIBARROW_BINARY = TRUE); install.packages('arrow', type = "source")
to solve this issue.
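To confirm that the rebuilt package really has lz4 support, a minimal round-trip sketch (writing a throwaway file from the built-in mtcars data) can help:
library(arrow)

# codec_is_available() reports whether a compression codec was compiled in
codec_is_available("lz4")

# Round-trip a Feather V2 file compressed with lz4
write_feather(mtcars, "mtcars.feather", compression = "lz4")
head(read_feather("mtcars.feather"))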
Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.