common-utils | common java tools

by stefzhlg | Java | Updated: 2 years ago | Current License: No License

kandi X-RAY | common-utils REVIEW AND RATINGS

Common Java utility classes.

Support

  • common-utils has a low-activity ecosystem.
  • It has 15 stars and 11 forks.
  • It had no major release in the last 12 months.
  • It has a neutral sentiment in the developer community.

Quality

  • common-utils has 0 bugs and 0 code smells.

Security

  • common-utils has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
  • common-utils code analysis shows 0 unresolved vulnerabilities.
  • There are 0 security hotspots that need review.

License

  • common-utils does not have a standard license declared.
  • Check the repository for any license declaration and review the terms closely.
  • Without a license, all rights are reserved, and you cannot use the library in your applications.

Reuse

  • common-utils releases are not available. You will need to build from source code and install.
  • Build file is available. You can build the component from source.
  • common-utils saves you 2409 person hours of effort in developing the same functionality from scratch.
  • It has 5251 lines of code, 562 functions, and 49 files, with 0% test coverage.
  • It has high code complexity, which directly impacts the maintainability of the code.

Top functions reviewed by kandi - BETA

kandi has reviewed common-utils and identified the following as its top functions. This list is intended to give you an instant insight into the functionality common-utils implements and to help you decide whether it suits your requirements; a hedged sketch of what one such helper might look like follows the list.

  • Converts a number to a Chinese numeral.
  • Compresses an old icon.
  • Main entry point.
  • Uploads a file to a remote directory.
  • Converts binary to hexadecimal.
  • Uploads a file to a destination path.
  • Gets the real IP address.
  • Encrypts a string using AES.
  • Initializes response headers.
  • Encodes the filename in the request.
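
As an illustration of what one of these helpers might look like, here is a minimal, hypothetical sketch of an AES string-encryption utility. It is not taken from common-utils; the class name, key handling, cipher mode, and Base64 output are all assumptions:

import java.nio.charset.StandardCharsets;
import java.util.Base64;
import javax.crypto.Cipher;
import javax.crypto.spec.SecretKeySpec;

public class AesUtilSketch {

    // Hypothetical helper: encrypts plain text with AES and returns Base64 text.
    // The key must be 16, 24, or 32 bytes long for AES-128/192/256.
    public static String encrypt(String plainText, String key) throws Exception {
        SecretKeySpec keySpec = new SecretKeySpec(
                key.getBytes(StandardCharsets.UTF_8), "AES");
        Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
        cipher.init(Cipher.ENCRYPT_MODE, keySpec);
        byte[] encrypted = cipher.doFinal(plainText.getBytes(StandardCharsets.UTF_8));
        return Base64.getEncoder().encodeToString(encrypted);
    }
}

A production implementation should prefer an authenticated mode such as AES/GCM with a random IV; ECB is used here only to keep the sketch short.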

common-utils Key Features

common java tools

common-utils examples and code snippets

  • How to parse JWT token in unit tests SpringBoot
  • How to install npm package from private git repository using a token in GitHub Actions
  • Pattern matching and get multiple values from URL using java
  • Private package was created and pip installed but cannot import with python
  • kafka-connect-jdbc : SQLException: No suitable driver only when using distributed mode
  • Unable to run a JDBC Source connector with Confluent REST API
  • Errors while trying to build GDB for ARM
  • Spring boot application - Tomcat deployment

How to parse JWT token in unit tests SpringBoot

interface SecurityUtils {
    String getUsername();
    ...
}

@Service
class MySecurityUtils implements SecurityUtils {
    private JwtToken getJwtToken() {
        return MySecurityContextHolder.getContext().getJwtToken();
    }

    public String getUsername() {
        return getJwtToken().getUsername();
    }
    ...
}

How to install npm package from private git repository using a token in GitHub Actions

- name: Checkout
  uses: actions/checkout@master
  with:
    persist-credentials: false

Pattern matching and get multiple values from URL using java

    public static void main(String[] args)
    {
        Pattern p = Pattern.compile("http[s]?:.+/books/(?<bookId>[^/]+)/author/(?<authorId>[^/]+)/(?<isbn>[^/]+)/media/(?<mediaId>[^/]+)/(?<filename>.+)");
        Matcher m = p.matcher("https:/<baseurl>/v1/files/library/books/1234-4567/author/56784589/32475622347586/media/324785643257567/507f1f77bcf86cd799439011_400.png");
        if (m.matches())
        {
            System.out.println("bookId = " + m.group("bookId"));
            System.out.println("authorId = " + m.group("authorId"));
            System.out.println("isbn = " + m.group("isbn"));
            System.out.println("mediaId = " + m.group("mediaId"));
            System.out.println("filename = " + m.group("filename"));
        }
    }
bookId = 1234-4567
authorId = 56784589
isbn = 32475622347586
mediaId = 324785643257567
filename = 507f1f77bcf86cd799439011_400.png

Private package was created and pip installed but cannot import with python

    packages=['anomaly', 'batch_transform', 'hive_table_checker', 'metadata_io',
              'parquet_converter', 'pyspark_visualizer'],
    packages=['charter_common_utils',
              'charter_common_utils.anomaly',
              'charter_common_utils.batch_transform',
              'charter_common_utils.hive_table_checker',
              'charter_common_utils.metadata_io',
              'charter_common_utils.parquet_converter',
              'charter_common_utils.pyspark_visualizer',
    ], 
from setuptools import find_packages

…

    packages=find_packages(),

kafka-connect-jdbc : SQLException: No suitable driver only when using distributed mode

CLASSPATH=/Users/christian/kafka/confluent-5.3.1/share/java/kafka-connect-jdbc/mysql-connector-java-8.0.18.jar connect-standalone.sh worker.properties etc/kafka-connect-jdbc/mysql-jdbc-fp.properties

Unable to run a JDBC Source connector with Confluent REST API

com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: 
Could not create connection to database server

Errors while trying to build GDB for ARM

./configure ... LD=/path/to/arm-XXX-ld CC=/path/to/arm-XXX-gcc CXX=/path/to/arm-XXX-g++
-----------------------
PATH="$PATH:$HOME/tmp/gcc-linaro-7.2.1-2017.11-x86_64_arm-linux-gnueabi/bin"
./configure --host=arm-linux-gnueabi
make
checking for arm-linux-gnueabi-gcc...  found arm-linux-gnueabi-gcc

Spring boot application - Tomcat deployment

<modules>
  <module>core</module>
  <module>api</module>
  <module>tests</module>
  <module>web</module>
</modules>
-----------------------
<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <executions>
                <execution>
                    <goals>
                        <goal>repackage</goal>
                    </goals>
                    <configuration>
                        <classifier>exec</classifier>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>

COMMUNITY DISCUSSIONS

Top Trending Discussions on common-utils
  • How to parse JWT token in unit tests SpringBoot
  • IntelliJ debug breakpoints not working for Tomcat Run Configuration
  • How to install npm package from private git repository using a token in GitHub Actions
  • Pattern matching and get multiple values from URL using java
  • Private package was created and pip installed but cannot import with python
  • kafka-connect-jdbc : SQLException: No suitable driver only when using distributed mode
  • Docker Build Kitura Swift Container - Shim.h mysql.h file not found
  • Kafka Connect java.lang.NoSuchMethodError: com.google.common.collect.Sets$SetView.iterator()Lcom/google/common/collect/UnmodifiableIterator;
  • Unable to run a JDBC Source connector with Confluent REST API
  • Why my kafka connects to mysql8.0 always encounters problem?

QUESTION

How to parse JWT token in unit tests SpringBoot

Asked 2021-May-20 at 07:49

I have a microservice setup with Spring boot and OAuth 2 with JWT. I have additional fields in my JWT token.

Most of my services call a static method that has a thread local of the additional fields in the token.
How can I write unit tests for such services?
Even when I try to inject a mock user it doesn't work, and I couldn't find a way of sending the JWT because I am not testing the controllers.

Code:

SecurityUtils static class (also check the package for other relevant JWT handlers).

Example of a method that calls the static class (Line 79).

Method:

public CommonResponse saveUpdateProfile(ProfileRequest request) {

    String authUsername = SecurityUtils.getUsername();

    Optional<ProfileEntity> optionalProfile = findProfileByUsername(authUsername);

    ProfileEntity profile;
    if (optionalProfile.isPresent()) {
        profile = optionalProfile.get();
    } else {
        profile = ProfileEntity.builder()
                .username(authUsername)
                .build();
    }

    profile.setFirstName(request.getFirstName());
    profile.setLastName(request.getLastName());

    ProfileEntity savedProfile = profileRepository.save(profile);

    if (savedProfile == null) {
        throw new RuntimeException("Failed to save user in database");
    }

    return CommonResponse.ok(savedProfile);
}

I appreciate all the help.

ANSWER

Answered 2021-May-20 at 07:49

OK, so that's a common problem with static methods: you can't easily override them, e.g. in tests. What I would do is turn your SecurityUtils class into a service and make it implement an interface. Then inject this interface into any other service that needs it, instead of calling static methods. That way you can easily provide another implementation in your tests.

So you would have something like that:

interface SecurityUtils {
    String getUsername();
    ...
}

@Service
class MySecurityUtils implements SecurityUtils {
    private JwtToken getJwtToken() {
        return MySecurityContextHolder.getContext().getJwtToken();
    }

    public String getUsername() {
        return getJwtToken().getUsername();
    }
    ...
}

Then in the unit test you can just inject a mock implementation of SecurityUtils to any class you're testing.
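
For example, a minimal JUnit 5 + Mockito sketch (assuming a hypothetical ProfileService that owns the saveUpdateProfile method from the question and receives SecurityUtils and ProfileRepository via injection, and assuming a two-argument ProfileRequest constructor) could look like this:

import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.mockito.ArgumentMatchers.any;
import static org.mockito.Mockito.verify;
import static org.mockito.Mockito.when;

import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.extension.ExtendWith;
import org.mockito.InjectMocks;
import org.mockito.Mock;
import org.mockito.junit.jupiter.MockitoExtension;

@ExtendWith(MockitoExtension.class)
class ProfileServiceTest {

    @Mock
    SecurityUtils securityUtils;          // mock of the interface, no real JWT needed

    @Mock
    ProfileRepository profileRepository;  // repository used by saveUpdateProfile

    @InjectMocks
    ProfileService profileService;        // hypothetical service under test

    @Test
    void savesProfileForAuthenticatedUser() {
        when(securityUtils.getUsername()).thenReturn("jdoe");
        when(profileRepository.save(any(ProfileEntity.class)))
                .thenAnswer(invocation -> invocation.getArgument(0));

        // ProfileRequest(firstName, lastName) is an assumed constructor
        CommonResponse response = profileService.saveUpdateProfile(
                new ProfileRequest("John", "Doe"));

        assertNotNull(response);
        verify(profileRepository).save(any(ProfileEntity.class));
    }
}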

Source https://stackoverflow.com/questions/67591875

QUESTION

IntelliJ debug breakpoints not working for Tomcat Run Configuration

Asked 2021-Feb-10 at 20:48

Context

I have a small application with an endpoint which calls some converter library. My Run Configuration is of type Tomcat and deploys an exploded war which is my application.

In the pom.xml's <dependencies> of that application, I have an external library I need to debug. That library is called within my application, obviously.

When I launch the Tomcat Run Configuration in Debug mode, the logs indicate that the Agent seems to have been set up properly and the artefact is deployed successfully (the following is a subset of the logs which I thought were relevant):

C:\Apps\apache-tomcat-8.5.56\bin\catalina.bat run
[2021-02-09 01:32:03,448] Artifact crs-classic-conv-endpoint:war exploded: Waiting for server connection to start artifact deployment...
Using CATALINA_BASE:   "C:\Users\me\AppData\Local\JetBrains\IntelliJIdea2020.3\tomcat\ac2f062f-8f6d-4769-8de5-120d70232ac9"
Using CATALINA_HOME:   "C:\Apps\apache-tomcat-8.5.56"
Using CATALINA_TMPDIR: "C:\Apps\apache-tomcat-8.5.56\temp"
Using JRE_HOME:        "C:\Apps\jdk-1.8.0_181_pki18"
Using CLASSPATH:       "C:\Apps\apache-tomcat-8.5.56\bin\bootstrap.jar;C:\Apps\apache-tomcat-8.5.56\bin\tomcat-juli.jar"
Connected to the target VM, address: '127.0.0.1:61873', transport: 'socket'
Connected to server
[2021-02-09 01:32:05,611] Artifact crs-classic-conv-endpoint:war exploded: Artifact is being deployed, please wait...
Root WebApplicationContext: initialization started 
09-Feb-2021 13:32:30.964 INFOS [RMI TCP Connection(5)-127.0.0.1] com.sun.xml.ws.server.MonitorBase.createRoot Metro monitoring rootname successfully set to: com.sun.metro:pp=/,type=WSEndpoint,name=-RetrieveCRSClassicContractService-RetrieveCRSClassicContractPort
Registering beans for JMX exposure on startup 
Bean with name 'loggingConfiguration' has been autodetected for JMX exposure 
Bean with name 'mailConfiguration' has been autodetected for JMX exposure 
Bean with name 'performanceConfiguration' has been autodetected for JMX exposure 
Bean with name 'propertiesConfiguration' has been autodetected for JMX exposure 
Located managed bean 'propertiesConfiguration': registering with JMX server as MBean [Foo.${service.name}.Configuration:name=ws-common-utils.PropertiesConfiguration] 
Located managed bean 'mailConfiguration': registering with JMX server as MBean [Foo.${service.name}.Configuration:name=ws-common-utils.MailConfiguration] 
Located managed bean 'performanceConfiguration': registering with JMX server as MBean [Foo.${service.name}.Configuration:name=ws-common-utils.PerformanceConfiguration] 
Located managed bean 'loggingConfiguration': registering with JMX server as MBean [Foo.${service.name}.Configuration:name=ws-common-utils.LoggingConfiguration] 
Registering beans for JMX exposure on startup 
Bean with name 'loggingConfiguration' has been autodetected for JMX exposure 
Bean with name 'mailConfiguration' has been autodetected for JMX exposure 
Bean with name 'performanceConfiguration' has been autodetected for JMX exposure 
Bean with name 'propertiesConfiguration' has been autodetected for JMX exposure 
Located managed bean 'propertiesConfiguration': registering with JMX server as MBean [Foo.${service.name}.Configuration:name=ws-common-utils.PropertiesConfiguration] 
Located managed bean 'mailConfiguration': registering with JMX server as MBean [Foo.${service.name}.Configuration:name=ws-common-utils.MailConfiguration] 
Located managed bean 'performanceConfiguration': registering with JMX server as MBean [Foo.${service.name}.Configuration:name=ws-common-utils.PerformanceConfiguration] 
Located managed bean 'loggingConfiguration': registering with JMX server as MBean [Foo.${service.name}.Configuration:name=ws-common-utils.LoggingConfiguration] 
Root WebApplicationContext: initialization completed in 15487 ms 
09-Feb-2021 13:32:32.585 INFOS [RMI TCP Connection(5)-127.0.0.1] com.sun.xml.ws.transport.http.servlet.WSServletDelegate.<init> WSSERVLET14 : initialisation du servlet JAX-WS
Refreshing org.springframework.context.support.ClassPathXmlApplicationContext@167fa5b2: startup date [Tue Feb 09 13:32:32 EST 2021]; root of context hierarchy 
[2021-02-09 01:32:33,359] Artifact crs-classic-conv-endpoint:war exploded: Artifact is deployed successfully
[2021-02-09 01:32:33,359] Artifact crs-classic-conv-endpoint:war exploded: Deploy took 27,748 milliseconds

Then, I can see that my breakpoints are registered, as shown by the white checkmark (screenshot omitted).

However, that is only the case for code in the module's src/main/java directory. The library I'm trying to debug, which is found under External Libraries, will not have its breakpoints registered (screenshot omitted).

I've added a few lines of logging, and modified the version in that library's pom.xml and ran mvn clean install in both the library and my application to make sure I was pointing to the right version of the code.


Diagnosis

Now here comes the weird part. When I run the test which calls the endpoint of my application, I see the following logs:

com.my.app.ExecutionServiceException: java.lang.NullPointerException: TheConverter.java, notWithinYearBoundaries line 1110
    at com.my.app.handleTransaction(MyAppEndpointHandler.java:143)

This proves to me that the library code where I had a breakpoint on a logging statement (L1109) is indeed being run (and I see the actual logs in the Tomcat logs).

Moreover, when I click on MyAppEndpointHandler.java:143, IntelliJ does open the class in which I had set my breakpoint, which was marked as registered (and yes, L61 is within the same function as L143).


Problems

  1. IntelliJ doesn't even interrupt the running application to show me the debugging window when the code reaches the registered breakpoint on L61.
  2. IntelliJ refuses to register my breakpoint in the external library.

Things I tried

This is quite puzzling, and I've tried many different things (including a bunch of suggestions I saw in that other SO question):

  1. clean & rebuild
  2. close all projects, and delete target, .idea/, and *.iml
  3. update my IntelliJ version (from 2020.3 to 2020.3.2)
  4. try another version of Tomcat
  5. Invalidate Caches & Restart
  6. and more...

It might be worth mentioning that two other developers on my team report that they are not having this problem on their machines, and we appear to have the same IntelliJ configuration and settings.


Settings and Run Configuration

The Tomcat Run Configuration and my Debugger Settings were shared as screenshots (omitted here).


IntelliJ debug console output

Here is a partial version of the console logs:

...

[JDI: Sending Command(id=16) JDWP.VirtualMachine.TopLevelThreadGroups]
[JDI: Receiving Command(id=16) JDWP.VirtualMachine.TopLevelThreadGroups]
[JDI: Receiving:                groups(ThreadGroupReferenceImpl[]): ]
[JDI: Creating new com.jetbrains.jdi.ThreadGroupReferenceImpl (id = 336)]
[JDI: Receiving:                    groups[i](ThreadGroupReferenceImpl): ref=336]
[JDI: Sending Command(id=18) JDWP.EventRequest.Set]
[JDI: Sending:                 eventKind(byte): 6]
[JDI: Sending:                 suspendPolicy(byte): 0]
[JDI: Sending:                 modifiers(Modifier[]): ]
[JDI: Receiving Command(id=18) JDWP.EventRequest.Set]
[JDI: Receiving:                requestID(int): 4]
[JDI: Sending Command(id=20) JDWP.EventRequest.Set]
[JDI: Sending:                 eventKind(byte): 7]
[JDI: Sending:                 suspendPolicy(byte): 0]
[JDI: Sending:                 modifiers(Modifier[]): ]
[JDI: Receiving Command(id=20) JDWP.EventRequest.Set]
[JDI: Receiving:                requestID(int): 5]
[JDI: Sending Command(id=22) JDWP.EventRequest.Set]
[JDI: Sending:                 eventKind(byte): 8]
[JDI: Sending:                 suspendPolicy(byte): 1]
[JDI: Sending:                 modifiers(Modifier[]): ]
[JDI: Sending:                     modifiers[i](Modifier): ]
[JDI: Sending:                     modKind(byte): 5]
[JDI: Sending:                         classPattern(String): sun.instrument.InstrumentationImpl]
[JDI: Receiving Command(id=22) JDWP.EventRequest.Set]
[JDI: Receiving:                requestID(int): 6]
[JDI: Sending Command(id=24) JDWP.EventRequest.Set]
[JDI: Sending:                 eventKind(byte): 8]
[JDI: Sending:                 suspendPolicy(byte): 1]
[JDI: Sending:                 modifiers(Modifier[]): ]
[JDI: Sending:                     modifiers[i](Modifier): ]
[JDI: Sending:                     modKind(byte): 5]
[JDI: Sending:                         classPattern(String): sun.instrument.InstrumentationImpl]
[JDI: Receiving Command(id=24) JDWP.EventRequest.Set]
[JDI: Receiving:                requestID(int): 7]
[JDI: Sending Command(id=26) JDWP.EventRequest.Set]
[JDI: Sending:                 eventKind(byte): 8]
[JDI: Sending:                 suspendPolicy(byte): 1]
[JDI: Sending:                 modifiers(Modifier[]): ]
[JDI: Sending:                     modifiers[i](Modifier): ]
[JDI: Sending:                     modKind(byte): 5]
[JDI: Sending:                         classPattern(String): com.my.app.RetrieveCRSClassicContractEndpointHandler]
[JDI: Receiving Command(id=26) JDWP.EventRequest.Set]
[JDI: Receiving:                requestID(int): 8]
[JDI: Sending Command(id=28) JDWP.EventRequest.Set]
[JDI: Sending:                 eventKind(byte): 8]
[JDI: Sending:                 suspendPolicy(byte): 1]
[JDI: Sending:                 modifiers(Modifier[]): ]
[JDI: Sending:                     modifiers[i](Modifier): ]
[JDI: Sending:                     modKind(byte): 5]
[JDI: Sending:                         classPattern(String): com.my.app.RetrieveCRSClassicContractEndpointHandler]
[JDI: Receiving Command(id=28) JDWP.EventRequest.Set]
[JDI: Receiving:                requestID(int): 9]
[JDI: Sending Command(id=30) JDWP.EventRequest.Set]
[JDI: Sending:                 eventKind(byte): 8]
[JDI: Sending:                 suspendPolicy(byte): 1]
[JDI: Sending:                 modifiers(Modifier[]): ]
[JDI: Sending:                     modifiers[i](Modifier): ]
[JDI: Sending:                     modKind(byte): 5]
[JDI: Sending:                         classPattern(String): com.the.library.TheConverter]
[JDI: Receiving Command(id=30) JDWP.EventRequest.Set]
[JDI: Receiving:                requestID(int): 10]
[JDI: Sending Command(id=32) JDWP.EventRequest.Set]
[JDI: Sending:                 eventKind(byte): 8]
[JDI: Sending:                 suspendPolicy(byte): 1]
[JDI: Sending:                 modifiers(Modifier[]): ]
[JDI: Sending:                     modifiers[i](Modifier): ]
[JDI: Sending:                     modKind(byte): 5]
[JDI: Sending:                         classPattern(String): com.my.app.SomeException]
[JDI: Receiving Command(id=32) JDWP.EventRequest.Set]
[JDI: Receiving:                requestID(int): 11]
[JDI: Sending Command(id=34) JDWP.VirtualMachine.Resume]
[JDI: Clearing VM suspended cache]
[JDI: Clearing temporary cache for ThreadReference 1]
[JDI: Receiving Command(id=1) JDWP.Event.Composite]
[JDI: Clearing temporary cache for ThreadGroupReference 336]
[JDI: Receiving:                suspendPolicy(byte): 0]
[JDI: Receiving Command(id=34) JDWP.VirtualMachine.Resume]
[JDI: Receiving:                events(Events[]): ]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 8]
[JDI: Receiving:                        requestID(int): 2]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=1]
[JDI: Receiving:                        refTypeTag(byte): 1]
[JDI: Receiving:                        typeID(long): ref=337]
[JDI: Receiving:                        signature(String): Ljava/lang/InternalError;]
[JDI: Receiving:                        status(int): 7]
[JDI: EventSet: SUSPEND_NONE]
[JDI: Looking up Class, signature='Ljava/lang/InternalError;', id=337]
[JDI: Caching new ReferenceType, sig=Ljava/lang/InternalError;, id=337]
[JDI: Sending Command(id=39) JDWP.ThreadReference.Name]
[JDI: Sending:                 thread(ThreadReferenceImpl): ref=1]
[JDI: Receiving Command(id=39) JDWP.ThreadReference.Name]
[JDI: Receiving:                threadName(String): main]
[JDI: Event: ClassPrepareEvent in thread main]
[JDI: Receiving Command(id=2) JDWP.Event.Composite]
[JDI: Receiving:                suspendPolicy(byte): 0]
[JDI: Receiving:                events(Events[]): ]
[JDI: Handled Prepare Event for java.lang.InternalError]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 8]
[JDI: Receiving:                        requestID(int): 2]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=1]
[JDI: Receiving:                        refTypeTag(byte): 2]
[JDI: Receiving:                        typeID(long): ref=338]
[JDI: Receiving:                        signature(String): Ljava/lang/instrument/Instrumentation;]
[JDI: Receiving:                        status(int): 3]
[JDI: EventSet: SUSPEND_NONE]
[JDI: Looking up Interface, signature='Ljava/lang/instrument/Instrumentation;', id=338]
[JDI: Caching new ReferenceType, sig=Ljava/lang/instrument/Instrumentation;, id=338]
[JDI: Sending Command(id=41) JDWP.ThreadReference.Name]
[JDI: Sending:                 thread(ThreadReferenceImpl): ref=1]
[JDI: Receiving Command(id=41) JDWP.ThreadReference.Name]
[JDI: Receiving:                threadName(String): main]
[JDI: Event: ClassPrepareEvent in thread main]
[JDI: Receiving Command(id=3) JDWP.Event.Composite]
[JDI: Receiving:                suspendPolicy(byte): 1]
[JDI: Handled Prepare Event for java.lang.instrument.Instrumentation]
[JDI: Receiving:                events(Events[]): ]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 8]
[JDI: Receiving:                        requestID(int): 7]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=1]
[JDI: Receiving:                        refTypeTag(byte): 1]
[JDI: Receiving:                        typeID(long): ref=339]
[JDI: Receiving:                        signature(String): Lsun/instrument/InstrumentationImpl;]
[JDI: Receiving:                        status(int): 3]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 8]
[JDI: Receiving:                        requestID(int): 6]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=1]
[JDI: Receiving:                        refTypeTag(byte): 1]
[JDI: Receiving:                        typeID(long): ref=339]
[JDI: Receiving:                        signature(String): Lsun/instrument/InstrumentationImpl;]
[JDI: Receiving:                        status(int): 3]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 8]
[JDI: Receiving:                        requestID(int): 2]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=1]
[JDI: Receiving:                        refTypeTag(byte): 1]
[JDI: Receiving:                        typeID(long): ref=339]
[JDI: Receiving:                        signature(String): Lsun/instrument/InstrumentationImpl;]
[JDI: Receiving:                        status(int): 3]
[JDI: EventSet: SUSPEND_EVENT_THREAD]

...

[JDI: EventSet: SUSPEND_NONE]
[JDI: Looking up Class, signature='Lcom/another/library/AbstractServiceRequestHandler;', id=7241]
[JDI: Caching new ReferenceType, sig=Lcom/another/library/AbstractServiceRequestHandler;, id=7241]
[JDI: Sending Command(id=20792) JDWP.ThreadReference.Name]
[JDI: Sending:                 thread(ThreadReferenceImpl): ref=2559]
[JDI: Receiving Command(id=20792) JDWP.ThreadReference.Name]
[JDI: Receiving:                threadName(String): RMI TCP Connection(4)-127.0.0.1]
[JDI: Event: ClassPrepareEvent in thread RMI TCP Connection(4)-127.0.0.1]
[JDI: Receiving Command(id=6914) JDWP.Event.Composite]
[JDI: Receiving:                suspendPolicy(byte): 1]
[JDI: Receiving:                events(Events[]): ]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 8]
[JDI: Receiving:                        requestID(int): 9]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=2559]
[JDI: Receiving:                        refTypeTag(byte): 1]
[JDI: Receiving:                        typeID(long): ref=7242]
[JDI: Receiving:                        signature(String): Lcom/my/app/RetrieveCRSClassicContractEndpointHandler;]
[JDI: Receiving:                        status(int): 3]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 8]
[JDI: Receiving:                        requestID(int): 8]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=2559]
[JDI: Receiving:                        refTypeTag(byte): 1]
[JDI: Receiving:                        typeID(long): ref=7242]
[JDI: Receiving:                        signature(String): Lcom/my/app/RetrieveCRSClassicContractEndpointHandler;]
[JDI: Receiving:                        status(int): 3]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 8]
[JDI: Receiving:                        requestID(int): 2]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=2559]
[JDI: Receiving:                        refTypeTag(byte): 1]
[JDI: Receiving:                        typeID(long): ref=7242]
[JDI: Receiving:                        signature(String): Lcom/my/app/RetrieveCRSClassicContractEndpointHandler;]
[JDI: Receiving:                        status(int): 3]
[JDI: Handled Prepare Event for com.another.library.AbstractServiceRequestHandler]
[JDI: EventSet: SUSPEND_EVENT_THREAD]
[JDI: Looking up Class, signature='Lcom/my/app/RetrieveCRSClassicContractEndpointHandler;', id=7242]
[JDI: Caching new ReferenceType, sig=Lcom/my/app/RetrieveCRSClassicContractEndpointHandler;, id=7242]
[JDI: Sending Command(id=20794) JDWP.ThreadReference.Name]
[JDI: Sending:                 thread(ThreadReferenceImpl): ref=2559]
[JDI: Receiving Command(id=20794) JDWP.ThreadReference.Name]
[JDI: Receiving:                threadName(String): RMI TCP Connection(4)-127.0.0.1]
[JDI: Event: ClassPrepareEvent in thread RMI TCP Connection(4)-127.0.0.1]
[JDI: Looking up Class, signature='Lcom/my/app/RetrieveCRSClassicContractEndpointHandler;', id=7242]
[JDI: Sending Command(id=20796) JDWP.ThreadReference.Name]
[JDI: Sending:                 thread(ThreadReferenceImpl): ref=2559]
[JDI: Receiving Command(id=20796) JDWP.ThreadReference.Name]
[JDI: Receiving:                threadName(String): RMI TCP Connection(4)-127.0.0.1]
[JDI: Event: ClassPrepareEvent in thread RMI TCP Connection(4)-127.0.0.1]
[JDI: Looking up Class, signature='Lcom/my/app/RetrieveCRSClassicContractEndpointHandler;', id=7242]
[JDI: Sending Command(id=20798) JDWP.ThreadReference.Name]
[JDI: Sending:                 thread(ThreadReferenceImpl): ref=2559]
[JDI: Receiving Command(id=20798) JDWP.ThreadReference.Name]
[JDI: Receiving:                threadName(String): RMI TCP Connection(4)-127.0.0.1]
[JDI: Event: ClassPrepareEvent in thread RMI TCP Connection(4)-127.0.0.1]
[JDI: Handled Prepare Event for com.my.app.RetrieveCRSClassicContractEndpointHandler]
[JDI: Sending Command(id=20800) JDWP.ReferenceType.MethodsWithGeneric]
[JDI: Sending:                 refType(ReferenceTypeImpl): ref=7242]
[JDI: Receiving Command(id=20800) JDWP.ReferenceType.MethodsWithGeneric]
[JDI: Receiving:                declared(MethodInfo[]): ]
[JDI: Receiving:                    declared[i](MethodInfo): ]
[JDI: Receiving:                    methodID(long): 748485344]
[JDI: Receiving:                    name(String): <init>]
[JDI: Receiving:                    signature(String): ()V]
[JDI: Receiving:                    genericSignature(String): ]
[JDI: Receiving:                    modBits(int): 1]
[JDI: Receiving:                    declared[i](MethodInfo): ]
[JDI: Receiving:                    methodID(long): 748485368]
[JDI: Receiving:                    name(String): validateRequest]
[JDI: Receiving:                    signature(String): (Lcom/my/app/IRetrieveCRSClassicContractRequest;Lcom/my/app/IRetrieveCRSClassicContractResponse;)Z]
[JDI: Receiving:                    genericSignature(String): ]
[JDI: Receiving:                    modBits(int): 1]
[JDI: Receiving:                    declared[i](MethodInfo): ]
[JDI: Receiving:                    methodID(long): 748485384]
[JDI: Receiving:                    name(String): handleTransaction]
[JDI: Receiving:                    signature(String): (Lcom/my/app/IRetrieveCRSClassicContractRequest;Lcom/my/app/IRetrieveCRSClassicContractResponse;)V]

...

[JDI: Sending Command(id=20820) JDWP.ReferenceType.SourceFile]
[JDI: Sending:                 refType(ReferenceTypeImpl): ref=7242]
[JDI: Receiving Command(id=20820) JDWP.ReferenceType.SourceFile]
[JDI: Receiving:                sourceFile(String): RetrieveCRSClassicContractEndpointHandler.java]
[JDI: Sending Command(id=20822) JDWP.EventRequest.Set]
[JDI: Sending:                 eventKind(byte): 2]
[JDI: Sending:                 suspendPolicy(byte): 2]
[JDI: Sending:                 modifiers(Modifier[]): ]
[JDI: Sending:                     modifiers[i](Modifier): ]
[JDI: Sending:                     modKind(byte): 7]
[JDI: Sending:                         loc(Location): com.my.app.RetrieveCRSClassicContractEndpointHandler:35]
[JDI: Receiving Command(id=20822) JDWP.EventRequest.Set]
[JDI: Receiving:                requestID(int): 14]
[JDI: Sending Command(id=20824) JDWP.EventRequest.Set]
[JDI: Sending:                 eventKind(byte): 2]
[JDI: Sending:                 suspendPolicy(byte): 2]
[JDI: Sending:                 modifiers(Modifier[]): ]
[JDI: Sending:                     modifiers[i](Modifier): ]
[JDI: Sending:                     modKind(byte): 7]
[JDI: Sending:                         loc(Location): com.my.app.RetrieveCRSClassicContractEndpointHandler:61]
[JDI: Receiving Command(id=20824) JDWP.EventRequest.Set]
[JDI: Receiving:                requestID(int): 15]
[JDI: Sending Command(id=20826) JDWP.ThreadReference.Resume]
[JDI: Sending:                 thread(ThreadReferenceImpl): ref=2559]
[JDI: Receiving Command(id=20826) JDWP.ThreadReference.Resume]
[JDI: Receiving Command(id=6915) JDWP.Event.Composite]
[JDI: Receiving:                suspendPolicy(byte): 0]
[JDI: Receiving:                events(Events[]): ]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 8]
[JDI: Receiving:                        requestID(int): 2]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=2559]
[JDI: Receiving:                        refTypeTag(byte): 2]
[JDI: Receiving:                        typeID(long): ref=7243]
[JDI: Receiving:                        signature(String): Lcom/my/app/core/repository/IRepository;]
[JDI: Receiving:                        status(int): 3]
[JDI: EventSet: SUSPEND_NONE]

...

[JDI: Sending Command(id=24396) JDWP.ThreadReference.Name]
[JDI: Sending:                 thread(ThreadReferenceImpl): ref=3191]
[JDI: Receiving Command(id=24396) JDWP.ThreadReference.Name]
[JDI: Receiving:                threadName(String): RMI TCP Connection(idle)]
[JDI: Event: ThreadDeathEvent in thread RMI TCP Connection(idle)]
[JDI: Receiving Command(id=8105) JDWP.Event.Composite]
[JDI: Receiving:                suspendPolicy(byte): 0]
[JDI: Receiving:                events(Events[]): ]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 7]
[JDI: Receiving:                        requestID(int): 5]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=1805]
[JDI: EventSet: SUSPEND_NONE]
[JDI: Sending Command(id=24399) JDWP.ThreadReference.Name]
[JDI: Sending:                 thread(ThreadReferenceImpl): ref=1805]
[JDI: Receiving Command(id=24399) JDWP.ThreadReference.Name]
[JDI: Receiving:                threadName(String): RMI TCP Connection(idle)]
[JDI: Event: ThreadDeathEvent in thread RMI TCP Connection(idle)]
2021-02-10 11:08:47,714 [ 204386]   WARN - n.process.BaseOSProcessHandler - Process hasn't generated any output for a long time.
If it's a long-running mostly idle daemon process, consider overriding OSProcessHandler#readerOptions with 'BaseOutputReader.Options.forMostlySilentProcess()' to reduce CPU usage.
Command line: C:\Apps\apache-tomcat-8.5.56\bin\catalina.bat run 
java.lang.Throwable: Process creation:
    at com.intellij.execution.process.BaseOSProcessHandler.<init>(BaseOSProcessHandler.java:32)
    at com.intellij.execution.process.OSProcessHandler.<init>(OSProcessHandler.java:91)
    at com.intellij.execution.process.OSProcessHandler.<init>(OSProcessHandler.java:84)
    at com.intellij.javaee.appServers.run.execution.LocalJavaeeServerProcessHandler.<init>(LocalJavaeeServerProcessHandler.java:40)
    at com.intellij.javaee.appServers.run.execution.PatchedLocalState$ScriptBasedLocalJavaeeServerProcessHandler.<init>(PatchedLocalState.java:190)
    at com.intellij.javaee.appServers.run.execution.PatchedLocalState.startJ2EEProcess(PatchedLocalState.java:98)
    at com.intellij.javaee.appServers.run.execution.J2EEProcessHandlerWrapper.lambda$new$0(J2EEProcessHandlerWrapper.java:97)
    at com.intellij.util.concurrency.BoundedTaskExecutor.doRun(BoundedTaskExecutor.java:216)
    at com.intellij.util.concurrency.BoundedTaskExecutor.access$200(BoundedTaskExecutor.java:27)
    at com.intellij.util.concurrency.BoundedTaskExecutor$1.execute(BoundedTaskExecutor.java:195)
    at com.intellij.util.ConcurrencyUtil.runUnderThreadName(ConcurrencyUtil.java:208)
    at com.intellij.util.concurrency.BoundedTaskExecutor$1.run(BoundedTaskExecutor.java:184)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1$1.run(Executors.java:668)
    at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1$1.run(Executors.java:665)
    at java.base/java.security.AccessController.doPrivileged(Native Method)
    at java.base/java.util.concurrent.Executors$PrivilegedThreadFactory$1.run(Executors.java:665)
    at java.base/java.lang.Thread.run(Thread.java:834)
[JDI: Receiving Command(id=8106) JDWP.Event.Composite]
[JDI: Receiving:                suspendPolicy(byte): 0]
[JDI: Receiving:                events(Events[]): ]
[JDI: Receiving:                    events[i](Events): ]
[JDI: Receiving:                    eventKind(byte): 7]
[JDI: Receiving:                        requestID(int): 5]
[JDI: Receiving:                        thread(ThreadReferenceImpl): ref=2559]
[JDI: EventSet: SUSPEND_NONE]
[JDI: Sending Command(id=24402) JDWP.ThreadReference.Name]
[JDI: Sending:                 thread(ThreadReferenceImpl): ref=2559]
[JDI: Receiving Command(id=24402) JDWP.ThreadReference.Name]
[JDI: Receiving:                threadName(String): RMI TCP Connection(idle)]
[JDI: Event: ThreadDeathEvent in thread RMI TCP Connection(idle)]
[JDI: Sending Command(id=24404) JDWP.VirtualMachine.Resume]
[JDI: Receiving Command(id=24404) JDWP.VirtualMachine.Resume]
[JDI: Sending Command(id=24406) JDWP.VirtualMachine.Dispose]
[JDI: Receiving Command(id=24406) JDWP.VirtualMachine.Dispose]
[JDI: Target VM i/f closing event queues]
[JDI: Internal event handler exiting]
[JDI: Target VM interface thread exiting]
[JDI: Sending Command(id=24408) JDWP.VirtualMachine.Dispose]
shutdown on Thread[AWT-EventQueue-0,6,Idea Thread Group]

The breakpoint properties (screenshot omitted)

ANSWER

Answered 2021-Feb-10 at 20:48

Turns out there was a misunderstanding. I thought the Test class was making a network call to the localhost Tomcat instance, when in fact it called the source code directly.

That's why when I was launching the Tomcat Run Configuration in Debug mode I would get a registered breakpoint in my source code, but it would never suspend the application, despite me seeing the logs.

And since the application was configured to output logs in a file in target, I thought the logs were coming from Tomcat.

Stepping in from the Test class launched in Debug mode brought me to the application code and eventually reached the library code. That is also why I was seeing logs coming from the library but not hitting the registered breakpoint when I ran the Test class normally after launching the Tomcat Run Configuration in Debug mode.

Ugh. That was an embarrassing and time-consuming journey.

Source https://stackoverflow.com/questions/66125604

QUESTION

How to install npm package from private git repository using a token in GitHub Actions

Asked 2020-Jun-16 at 16:46

I'm trying to install npm packages for my application within a Dockerfile. However, I get the following error when it comes to installing a package from a private git repository.

 Step 9/10 : RUN npm ci
 ---> Running in 57960fe4df81
npm ERR! Error while executing:
npm ERR! /usr/bin/git ls-remote -h -t ***github.com/<redacted-private-org>/<redacted-private-repo>.git
npm ERR! 
npm ERR! remote: Repository not found.
npm ERR! fatal: repository 'https://github.com/<redacted-private-org>/<redacted-private-repo>.git/' not found
npm ERR! 
npm ERR! exited with error code: 128

Dockerfile

FROM node:12.18.0-alpine3.10

RUN apk update && apk upgrade && \
    apk add --no-cache bash git openssh

RUN mkdir -p /home/dev

WORKDIR /home/dev

COPY . /home/dev

RUN npm ci

CMD ["node", "api/api.js"]

package.json

{
  "name": "api",
  "version": "0.1.0",
  "author": "me",
  "license": "",
  "scripts": {
    "prestart": "",
    "start": "NODE_ENV=development nodemon ./api/server.js",
  },
  "dependencies": {
    "bcrypt-nodejs": "^0.0.3",
    "body-parser": "^1.18.2",
    "org-common-utils": "git+https://<redacted-username>:<redacted-token>@github.com/<redacted-private-org>/<redacted-private-repo>.git",
    "cors": "^2.8.4",
    "dotenv": "^8.2.0",
    "express": "^4.16.3",
    "express-routes-mapper": "^1.1.0",
    "helmet": "^3.12.0",
    "igdb-api-node": "^3.1.7",
    "jsonwebtoken": "^8.2.1",
    "mysql": "^2.16.0",
    "mysql2": "^1.6.4",
    "node-cache": "^4.2.0",
    "sequelize": "^5.21.3",
    "sqlite3": "^4.0.0",
  },
  "devDependencies": {
    "cross-env": "^5.1.4",
    "eslint": "^4.19.1",
    "eslint-config-airbnb-base": "^12.1.0",
    "eslint-plugin-import": "^2.18.0",
    "husky": "^0.14.3",
    "jest": "^22.4.3",
    "nodemon": "^1.17.3",
    "shx": "^0.2.2",
    "supertest": "^3.0.0"
  }
}

To install from a private Github repository I'm using a username and token combination as you can see in my package.json.

The repository exists: if I navigate to the URL while logged in, it loads: https://github.com/redacted-private-org/redacted-private-repo

This issue only occurs in the GitHub Actions pipeline.

ANSWER

Answered 2020-Jun-16 at 16:45

This issue was only occurring in the GitHub Actions pipeline. It's solved by setting persist-credentials to false; otherwise the checkout uses the GitHub Actions token, which does not have the necessary permissions to pull/install the repository.

- name: Checkout
  uses: actions/checkout@master
  with:
    persist-credentials: false

https://github.com/actions/checkout

Source https://stackoverflow.com/questions/62407913

QUESTION

Pattern matching and get multiple values from URL using java

Asked 2020-May-07 at 11:18

I am using Java 8, and I would like to check whether a URL is valid based on a pattern. If it is valid, I should get the attributes bookId, authorId, category, and mediaId.

Pattern: <basepath>/books/<bookId>/author/<authorId>/<isbn>/<category>/mediaId/<filename>

And this is the sample URL

URL => https:/<baseurl>/v1/files/library/books/1234-4567/author/56784589/32475622347586/media/324785643257567/507f1f77bcf86cd799439011_400.png

Here the base path is /v1/files/library.

I have seen some pattern-matching examples but couldn't relate them to my use case; I am probably not good at regex. I am also using apache-common-utils, but I am not sure how to achieve it with that either.

Any help or hint would be really appreciated.

ANSWER

Answered 2020-May-07 at 11:18

Try this solution (uses named capture groups in regex):

    public static void main(String[] args)
    {
        Pattern p = Pattern.compile("http[s]?:.+/books/(?<bookId>[^/]+)/author/(?<authorId>[^/]+)/(?<isbn>[^/]+)/media/(?<mediaId>[^/]+)/(?<filename>.+)");
        Matcher m = p.matcher("https:/<baseurl>/v1/files/library/books/1234-4567/author/56784589/32475622347586/media/324785643257567/507f1f77bcf86cd799439011_400.png");
        if (m.matches())
        {
            System.out.println("bookId = " + m.group("bookId"));
            System.out.println("authorId = " + m.group("authorId"));
            System.out.println("isbn = " + m.group("isbn"));
            System.out.println("mediaId = " + m.group("mediaId"));
            System.out.println("filename = " + m.group("filename"));
        }
    }

prints:

bookId = 1234-4567
authorId = 56784589
isbn = 32475622347586
mediaId = 324785643257567
filename = 507f1f77bcf86cd799439011_400.png
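
If the check needs to be reused across many requests, the compiled Pattern can be kept as a shared constant, since Pattern instances are immutable and thread-safe. The following is a small, hypothetical variant of the accepted answer; the BookUrlParser class name and the Optional/Map return shape are illustrative and not part of the original answer:

import java.util.HashMap;
import java.util.Map;
import java.util.Optional;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

final class BookUrlParser {

    // Compiled once and reused for every URL that is checked.
    private static final Pattern BOOK_URL = Pattern.compile(
            "http[s]?:.+/books/(?<bookId>[^/]+)/author/(?<authorId>[^/]+)/"
            + "(?<isbn>[^/]+)/media/(?<mediaId>[^/]+)/(?<filename>.+)");

    // Returns the named groups as a map, or Optional.empty() if the URL does not match.
    static Optional<Map<String, String>> parse(String url) {
        Matcher m = BOOK_URL.matcher(url);
        if (!m.matches()) {
            return Optional.empty();
        }
        Map<String, String> parts = new HashMap<>();
        for (String group : new String[] {"bookId", "authorId", "isbn", "mediaId", "filename"}) {
            parts.put(group, m.group(group));
        }
        return Optional.of(parts);
    }
}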

Source https://stackoverflow.com/questions/61655737

QUESTION

Private package was created and pip installed but cannot import with python

Asked 2020-Mar-20 at 00:29

I created a private package in TestPyPI

The package has successfully pip installed:

(base) my_user:Desktop$ python3 -m pip install --index-url https://test.pypi.org/simple/ --no-deps charter-common-utils==0.0.1
Looking in indexes: https://test.pypi.org/simple/
    Requirement already satisfied: charter-common-utils==0.0.1 in /Users/my_id/opt/anaconda3/lib/python3.7/site-packages (0.0.1)

I start Python in a terminal:

>>> import charter_common_utils
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'charter_common_utils'

I've read about Python path issues, but that does not seem to be the problem, since the last path listed is the one referred to in the 'Requirement already satisfied' message above:

(base) SR-C02XT71WJG5J:Desktop p2929612$ python3
Python 3.7.6 (default, Jan  8 2020, 13:42:34) 
[Clang 4.0.1 (tags/RELEASE_401/final)] :: Anaconda, Inc. on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import sys
>>> sys.path
['', '/Users/my_id/opt/anaconda3/lib/python37.zip', '/Users/my_id/opt/anaconda3/lib/python3.7', '/Users/my_id/opt/anaconda3/lib/python3.7/lib-dynload', '/Users/my_id/.local/lib/python3.7/site-packages', '/Users/my_id/opt/anaconda3/lib/python3.7/site-packages']

When I look in /Users/my_id/opt/anaconda3/lib/python3.7/site-packages, I am able to see charter_common_utils-0.0.1.dist-info.

Why am I not able to import the package? Any help is much appreciated.

ANSWER

Answered 2020-Mar-20 at 00:29

Your setup.py lists a lot of top-level packages:

    packages=['anomaly', 'batch_transform', 'hive_table_checker', 'metadata_io',
              'parquet_converter', 'pyspark_visualizer'],

After installation you could import anomaly or parquet_converter but not charter_common_utils; the latter is nowhere mentioned. To import charter_common_utils you have to:

1) create a new directory charter_common_utils at the top of your source directory (where setup.py resides);

2) create a new empty file charter_common_utils/__init__.py;

3) move all your top-level directories (anomaly, batch_transform, hive_table_checker, metadata_io, parquet_converter, pyspark_visualizer) into charter_common_utils;

4) change your setup.py:

    packages=['charter_common_utils',
              'charter_common_utils.anomaly',
              'charter_common_utils.batch_transform',
              'charter_common_utils.hive_table_checker',
              'charter_common_utils.metadata_io',
              'charter_common_utils.parquet_converter',
              'charter_common_utils.pyspark_visualizer',
    ], 

Or change setup.py this way:

from setuptools import find_packages

…

    packages=find_packages(),

Source https://stackoverflow.com/questions/60766744

QUESTION

kafka-connect-jdbc : SQLException: No suitable driver only when using distributed mode

Asked 2020-Mar-13 at 08:20

We have successfully used MySQL-to-Kafka data ingestion with the JDBC standalone connector, but we are now facing an issue using the same setup in distributed mode (as a Kafka Connect service).

connect-distributed.properties file-

bootstrap.servers=IP1:9092,IP2:9092
group.id=connect-cluster
key.converter.schemas.enable=true
value.converter.schemas.enable=true
offset.storage.topic=connect-offsets
offset.storage.replication.factor=2
config.storage.topic=connect-configs
config.storage.replication.factor=2
status.storage.topic=connect-status
status.storage.replication.factor=2
offset.flush.interval.ms=10000
plugin.path=/usr/share/java,/usr/share/java/kafka-connect-jdbc

I have my connector jars here-

/usr/share/java/kafka-connect-jdbc

-rw-r--r-- 1 root root  906708 Jul 29 01:18 zookeeper-3.4.13.jar
-rw-r--r-- 1 root root   74798 Jul 29 01:18 zkclient-0.10.jar
-rw-r--r-- 1 root root 5575351 Jul 29 01:18 sqlite-jdbc-3.8.11.2.jar
-rw-r--r-- 1 root root   41203 Jul 29 01:18 slf4j-api-1.7.25.jar
-rw-r--r-- 1 root root  658466 Jul 29 01:18 postgresql-9.4-1206-jdbc41.jar
-rw-r--r-- 1 root root 1292696 Jul 29 01:18 netty-3.10.6.Final.jar
-rw-r--r-- 1 root root  489884 Jul 29 01:18 log4j-1.2.17.jar
-rw-r--r-- 1 root root  211219 Jul 29 01:18 kafka-connect-jdbc-5.0.0.jar
-rw-r--r-- 1 root root  317816 Jul 29 01:18 jtds-1.3.1.jar
-rw-r--r-- 1 root root   87325 Jul 29 01:18 jline-0.9.94.jar
-rw-r--r-- 1 root root   20844 Jul 29 01:18 common-utils-5.0.0.jar
-rw-r--r-- 1 root root   20437 Jul 29 01:18 audience-annotations-0.5.0.jar
-rw-r----- 1 root root 2132635 Nov 11 16:31 mysql-connector-java-8.0.13.jar

I am able to run the standalone mode by running the script in this way-

/usr/bin/connect-standalone /etc/kafka/connect-standalone.properties /etc/kafka-connect-jdbc/source-quickstart-mysql.properties

But when I try to invoke the REST API to run a distributed-mode connector, I get the error:

curl -X POST -H "Accept:application/json" -H "Content-Type:application/json" X.X.X.X:8083/connectors/ -d '{"name": "linuxemp-connector", "config": { "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector", "tasks.max": "1", "connection.url": "jdbc:mysql://Y.Y.Y.Y:3306/linux_db?user=groot&password=pwd","table.whitelist": "emp","mode": "timestamp","incrementing.column.name":"empid","topic.prefix": "mysqlconnector-" } }'

error-

{"error_code":400,"message":"Connector configuration is invalid and contains the following 2 error(s):\nInvalid value java.sql.SQLException: No suitable driver found for jdbc:mysql://Y.Y.Y.Y:3306/linux_db?user=groot&password=pwd for configuration Couldn't open connection to jdbc:mysql://Y.Y.Y.Y:3306/linux_db?user=groot&password=pwd\nInvalid value java.sql.SQLException: No suitable driver found for jdbc:mysql://Y.Y.Y.Y:3306/linux_db?user=groot&password=pwd for configuration Couldn't open connection to jdbc:mysql://Y.Y.Y.Y:3306/linux_db?user=groot&password=pwd\nYou can also find the above list of errors at the endpoint `/{connectorType}/config/validate`"}

Note: connector jars are placed on all Connect nodes, plugin.path is the same on all Connect nodes, and the kafka-connect service is up and running.

What am I missing? Why am I not able to submit a REST call to start a distributed Connect worker/task for this MySQL pipeline? This works absolutely fine in standalone mode but throws an error in distributed mode.

Please help!

Thanks !

ANSWER

Answered 2019-Mar-05 at 13:29

The issue got resolved by fixing the things below:

1. Changed permissions of /usr/share/java/kafka-connect-jdbc/mysql-connector-java-8.0.13.jar to 755.

2. Kept only /usr/share/java in the plugin path.

3. Changed the MySQL table structure to have one primary key and one column with an incremental or timestamp nature.

Source https://stackoverflow.com/questions/53321726

QUESTION

Docker Build Kitura Swift Container - Shim.h mysql.h file not found

Asked 2019-Aug-19 at 13:10

I am trying to move my current Kitura dev setup into a real running environment by first moving it into a Docker container, so that I can later migrate it to a cloud provider.

However, while trying to build my Docker setup for Kitura, I run into problems for which I have not found a proper solution.

I am building my Docker container from the instructions on this page: https://www.kitura.io/docs/deploying/docker.html

But I am also using SwiftKuery with MySQL in the package.

My Dockerfile-tools file looks like the following:

FROM ibmcom/swift-ubuntu:5.0.2
##FROM swift:5.0.2

LABEL maintainer="IBM Swift Engineering at IBM Cloud"
LABEL Description="Template Dockerfile that extends the ibmcom/swift-ubuntu image."

# We can replace this port with what the user wants
EXPOSE 8080 1024 1025

# Default user if not provided
ARG bx_dev_user=root
ARG bx_dev_userid=1000

# Install system level packages
RUN apt-get update && apt-get dist-upgrade -y
##RUN apt-get update && apt-get install -y sudo libcurl4-openssl-dev openssl libssl-dev pkg-config libmysqlclient-dev

# Add utils files
ADD https://raw.githubusercontent.com/IBM-Swift/swift-ubuntu-docker/master/utils/tools-utils.sh /swift-utils/tools-utils.sh
ADD https://raw.githubusercontent.com/IBM-Swift/swift-ubuntu-docker/master/utils/common-utils.sh /swift-utils/common-utils.sh
RUN chmod -R 555 /swift-utils

# Create user if not root
RUN if [ "$bx_dev_user" != root ]; then useradd -ms /bin/bash -u $bx_dev_userid $bx_dev_user; fi

# Make password not required for sudo.
# This is necessary to run 'tools-utils.sh debug' script when executed from an interactive shell.
# This will not affect the deploy container.
RUN echo "$bx_dev_user ALL=NOPASSWD: ALL" > /etc/sudoers.d/user && \
    chmod 0440 /etc/sudoers.d/user

#Install some further SSL related flaws
##RUN  find / -name libssl.so.1.1 -type f -print

# Bundle application source & binaries
COPY . /swift-project

I get the following error:

[5/24] Compiling Swift Module 'SwiftKueryMySQL' (3 sources)
[6/24] Compiling Swift Module 'KituraContracts' (9 sources)
<module-includes>:1:10: note: in file included from <module-includes>:1:
#include "shim.h"
         ^
/swift-project/.build-ubuntu/checkouts/SwiftKueryMySQL/Sources/CMySQL/shim.h:3:10: error: 'mysql.h' file not found
#include <mysql.h>
         ^
/swift-project/.build-ubuntu/checkouts/SwiftKueryMySQL/Sources/SwiftKueryMySQL/MySQLConnection.swift:21:8: error: could not build C module 'CMySQL'
import CMySQL
       ^

after using these commands:

docker build -t myapp-build -f Dockerfile-tools .

The first one still works:

docker run -v $PWD:/swift-project -w /swift-project myapp-build /swift-utils/tools-utils.sh build release

The error appears while building the container.

Does anyone have an idea what I could do to fix this issue?

UPDATE:

I tried one suggestion, which was to add a .swift-build-linux file with one entry:

swift build -Xcc -I/usr/include/mysql/

into the source files of my project.

Upon running the command:

docker run -v $PWD:/swift-project -w /swift-project myapp-build /swift-utils/tools-utils.sh build release

It seems to pick it up, as you can see below, but I still run into the same issue:

Current folder: /swift-project
Command: build
Build configuration: release
Build folder: /swift-project/.build-ubuntu
Compiling the project...
Build configuration: release
Custom build command: swift build -Xcc -I/usr/include/mysql/££
warning: you may be able to install mysqlclient using your system-packager:
     apt-get install libmysqlclient-dev

[1/27] Compiling agentcore ibmras/common/Logger.cpp
[2/27] Compiling agentcore ibmras/common/MemoryManager.cpp
[3/27] Compiling agentcore ibmras/common/Properties.cpp
[4/27] Compiling agentcore ibmras/common/LogManager.cpp
[5/26] Compiling Swift Module 'SwiftKueryMySQL' (3 sources)
[6/26] Compiling Swift Module 'KituraContracts' (9 sources)
[7/26] Compiling Swift Module 'CloudFoundryEnv' (6 sources)
[8/26] Compiling CHTTPParser utils.c
[9/26] Linking ./.build-ubuntu/x86_64-unknown-linux/release/libagentcore.so
[10/26] Compiling CHTTPParser http_parser.c
<module-includes>:1:10: note: in file included from <module-includes>:1:
#include "shim.h"
         ^
/swift-project/.build-ubuntu/checkouts/SwiftKueryMySQL/Sources/CMySQL/shim.h:3:10: error: 'mysql.h' file not found
#include <mysql.h>
         ^
/swift-project/.build-ubuntu/checkouts/SwiftKueryMySQL/Sources/SwiftKueryMySQL/MySQLConnection.swift:21:8: error: could not build C module 'CMySQL'
import CMySQL
       ^
Kais-MacBook-Pro:beautylivery_server_mqsql kaibaier$ docker run -v $PWD:/swift-project -w /swift-project myapp-build /swift-utils/tools-utils.sh build release
Current folder: /swift-project
Command: build
Build configuration: release
Build folder: /swift-project/.build-ubuntu
Compiling the project...
Build configuration: release
Custom build command: swift build -Xcc -I/usr/include/mysql/
warning: you may be able to install mysqlclient using your system-packager:
     apt-get install libmysqlclient-dev

[1/33] Compiling CHTTPParser http_parser.c
[2/39] Compiling CHTTPParser utils.c
[3/65] Compiling Swift Module 'TypeDecoder' (2 sources)
[4/65] Compiling Swift Module 'Socket' (3 sources)
[5/65] Compiling Swift Module 'Signals' (1 sources)
[6/65] Compiling Swift Module 'Logging' (3 sources)
[7/65] Compiling Swift Module 'KituraTemplateEngine' (1 sources)
[8/65] Compiling Swift Module 'Cryptor' (11 sources)
[9/65] Compiling memplugin MemoryPlugin.cpp
[10/65] Compiling Swift Module 'LoggerAPI' (1 sources)
[11/65] Compiling hcapiplugin APIConnector.cpp
[12/65] Linking ./.build-ubuntu/x86_64-unknown-linux/release/libmemplugin.so
[13/65] Compiling envplugin envplugin.cpp
[14/65] Linking ./.build-ubuntu/x86_64-unknown-linux/release/libhcapiplugin.so
[15/65] Compiling cpuplugin cpuplugin.cpp
[16/65] Compiling Swift Module 'SwiftKuery' (49 sources)
[17/65] Compiling Swift Module 'KituraContracts' (9 sources)
[18/65] Compiling Swift Module 'HeliumLogger' (2 sources)
[19/65] Compiling Swift Module 'SSLService' (2 sources)
[20/65] Compiling Swift Module 'Health' (3 sources)
[21/65] Compiling Swift Module 'FileKit' (1 sources)
[22/65] Compiling Swift Module 'Configuration' (5 sources)
[23/65] Compiling agentcore ibmras/monitoring/connector/configuration/ConfigurationConnector.cpp
[24/65] Linking ./.build-ubuntu/x86_64-unknown-linux/release/libenvplugin.so
[25/65] Linking ./.build-ubuntu/x86_64-unknown-linux/release/libcpuplugin.so
[26/65] Compiling agentcore ibmras/monitoring/connector/ConnectorManager.cpp
[27/65] Compiling agentcore ibmras/monitoring/agent/threads/WorkerThread.cpp
[28/65] Compiling agentcore ibmras/monitoring/agent/threads/ThreadPool.cpp
[29/65] Compiling agentcore ibmras/monitoring/agent/SystemReceiver.cpp
[30/65] Compiling agentcore ibmras/monitoring/agent/BucketList.cpp
[31/65] Compiling agentcore ibmras/monitoring/agent/Bucket.cpp
[32/65] Compiling Swift Module 'CloudFoundryEnv' (6 sources)
[33/65] Compiling agentcore ibmras/monitoring/agent/Agent.cpp
[34/65] Compiling agentcore ibmras/monitoring/Plugin.cpp
[35/65] Compiling agentcore ibmras/common/util/sysUtils.cpp
[36/65] Compiling agentcore ibmras/common/util/strUtils.cpp
[37/65] Compiling agentcore ibmras/common/util/LibraryUtils.cpp
[38/65] Compiling agentcore ibmras/common/util/FileUtils.cpp
[39/65] Compiling agentcore ibmras/common/port/linux/Thread.cpp
[40/65] Compiling Swift Module 'SwiftKueryMySQL' (3 sources)
[41/65] Compiling agentcore ibmras/common/port/linux/Process.cpp
[42/65] Compiling agentcore ibmras/common/port/ThreadData.cpp
<module-includes>:1:10: note: in file included from <module-includes>:1:
#include "shim.h"
         ^
/swift-project/.build-ubuntu/checkouts/SwiftKueryMySQL/Sources/CMySQL/shim.h:3:10: error: 'mysql.h' file not found
#include <mysql.h>
         ^
/swift-project/.build-ubuntu/checkouts/SwiftKueryMySQL/Sources/SwiftKueryMySQL/MySQLConnection.swift:21:8: error: could not build C module 'CMySQL'
import CMySQL
       ^

UPDATE:

It seems to work after I added this line to my Dockerfile-tools:

RUN apt-get update && apt-get install -y sudo libmysqlclient-dev 

... but now I get a new error that I first have to investigate ...

[19/20] Compiling Swift Module 'Beautylivery_Server_New' (1 sources)
[20/20] Linking ./.build-ubuntu/x86_64-unknown-linux/release/Beautylivery_Server_New
clang-7: error: unable to execute command: Bus error
clang-7: error: linker command failed due to signal (use -v to see invocation)
<unknown>:0: error: link command failed with exit code 254 (use -v to see invocation)

UPDATE - Resolution:

OK, I added the entry

swift build -Xcc -I/usr/include/mysql/

to the newly created .swift-build-linux file in the root directory of my project (previously I had a small spelling mistake in that line).

Now I was able to get past this step of the Dockerisation process :)

UPDATE - NEW Problem:

Hello everyone, I have run into a new problem related to the tutorial, this time when running the container after building it. As I assume it is a different topic, I have raised a new question: Run Kitura Docker Image causes libmysqlclient.so.18 Error

Upon running the container, I get this error message:

error while loading shared libraries: libmysqlclient.so.18: cannot open shared object file: No such file or directory

Thanks a lot for the help!

ANSWER

Answered 2019-Aug-14 at 09:04

The problem here is that the ibmcom/swift-ubuntu:5.0.2 image is built on top of Ubuntu 14.04. The version of libmysqlclient-dev supplied with 14.04 does not include the pkg-config information that allows the Swift compiler to find the headers without help.

There are two solutions to this problem:

1: You could add -Xcc -I/usr/include/mysql/ arguments to the swift build command. Either:

  • replace the command you are executing in the build container with swift build -Xcc -I/usr/include/mysql/, or
  • if you'd like to keep using the tools-utils.sh script, you can create a file in your project called .swift-build-linux which contains a single line: swift build -Xcc -I/usr/include/mysql/ - this will be picked up by the tools-utils.sh script when it runs the build (a short shell sketch follows after these options).

2: You can replace your base images with FROM swift:5.0.2 (for the build image) and FROM swift:5.0.2-slim (for the run image) - these are the official Swift-maintained images which are based on Ubuntu 18.04, and as of the Swift 5.0.2 release, provide a 'slim' image similar to ibmcom's 'runtime' image.

  • Note that these images do not bundle the libssl-dev or libcurl4-openssl-dev dependencies, so you will need to include those in your Dockerfile-tools.
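
As a rough sketch of option 1, using the image name, paths and tools-utils.sh workflow from the question (the libmysqlclient-dev install mirrors the line the asker added to Dockerfile-tools):

# Create .swift-build-linux in the project root so tools-utils.sh
# picks up the custom build command
echo 'swift build -Xcc -I/usr/include/mysql/' > .swift-build-linux

# The build image also needs the MySQL headers, e.g. in Dockerfile-tools:
#   RUN apt-get update && apt-get install -y sudo libmysqlclient-dev

# Rebuild the tools image and run the release build inside it
docker build -t myapp-build -f Dockerfile-tools .
docker run -v "$PWD":/swift-project -w /swift-project \
    myapp-build /swift-utils/tools-utils.sh build release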

Source https://stackoverflow.com/questions/57486811

QUESTION

Kafka Connect java.lang.NoSuchMethodError: com.google.common.collect.Sets$SetView.iterator()Lcom/google/common/collect/UnmodifiableIterator;

Asked 2019-Jun-30 at 03:29

I am trying to set up kafka-connect-cassandra on an AWS instance.

I have set plugin.path in the connect-avro-distributed.properties file:

plugin.path=/home/ubuntu/kafka_2.11-1.0.0/plugins

And I have kafka-connect-cassandra-1.0.0-1.0.0-all.jar in:

/home/ubuntu/kafka_2.11-1.0.0/plugins/lib

This is the traceback:

[2018-02-18 10:28:33,268] INFO Kafka Connect distributed worker initializing ... (org.apache.kafka.connect.cli.ConnectDistributed:60)
[2018-02-18 10:28:33,278] INFO WorkerInfo values: 
    jvm.args = -Xmx256M, -XX:+UseG1GC, -XX:MaxGCPauseMillis=20, -XX:InitiatingHeapOccupancyPercent=35, -XX:+ExplicitGCInvokesConcurrent, -Djava.awt.headless=true, -Dcom.sun.management.jmxremote, -Dcom.sun.management.jmxremote.authenticate=false, -Dcom.sun.management.jmxremote.ssl=false, -Dkafka.logs.dir=/var/log/kafka, -Dlog4j.configuration=file:/etc/kafka/connect-log4j.properties
    jvm.spec = Oracle Corporation, Java HotSpot(TM) 64-Bit Server VM, 1.8.0_151, 25.151-b12
    jvm.classpath = /home/ubuntu/kafka_2.11-1.0.0/plugins:/usr/share/java/kafka/jackson-jaxrs-json-provider-2.9.1.jar:/usr/share/java/kafka/paranamer-2.7.jar:/usr/share/java/kafka/rocksdbjni-5.7.3.jar:/usr/share/java/kafka/jackson-jaxrs-base-2.9.1.jar:/usr/share/java/kafka/jetty-servlets-9.2.22.v20170606.jar:/usr/share/java/kafka/kafka-clients-1.0.0-cp1.jar:/usr/share/java/kafka/xz-1.5.jar:/usr/share/java/kafka/commons-lang3-3.1.jar:/usr/share/java/kafka/jetty-security-9.2.22.v20170606.jar:/usr/share/java/kafka/httpclient-4.5.2.jar:/usr/share/java/kafka/jackson-core-asl-1.9.13.jar:/usr/share/java/kafka/connect-json-1.0.0-cp1.jar:/usr/share/java/kafka/connect-runtime-1.0.0-cp1.jar:/usr/share/java/kafka/jersey-container-servlet-2.25.1.jar:/usr/share/java/kafka/commons-collections-3.2.1.jar:/usr/share/java/kafka/connect-transforms-1.0.0-cp1.jar:/usr/share/java/kafka/jersey-common-2.25.1.jar:/usr/share/java/kafka/zookeeper-3.4.10.jar:/usr/share/java/kafka/scala-library-2.11.11.jar:/usr/share/java/kafka/jackson-core-2.9.1.jar:/usr/share/java/kafka/argparse4j-0.7.0.jar:/usr/share/java/kafka/maven-artifact-3.5.0.jar:/usr/share/java/kafka/kafka_2.11-1.0.0-cp1-test-sources.jar:/usr/share/java/kafka/commons-codec-1.9.jar:/usr/share/java/kafka/kafka-connect-cassandra-1.0.0-1.0.0-all.jar:/usr/share/java/kafka/httpcore-4.4.4.jar:/usr/share/java/kafka/hk2-utils-2.5.0-b32.jar:/usr/share/java/kafka/connect-api-1.0.0-cp1.jar:/usr/share/java/kafka/javassist-3.21.0-GA.jar:/usr/share/java/kafka/kafka_2.11-1.0.0-cp1-sources.jar:/usr/share/java/kafka/support-metrics-client-4.0.0.jar:/usr/share/java/kafka/commons-compress-1.8.1.jar:/usr/share/java/kafka/kafka_2.11-1.0.0-cp1-scaladoc.jar:/usr/share/java/kafka/javax.servlet-api-3.1.0.jar:/usr/share/java/kafka/jersey-media-jaxb-2.25.1.jar:/usr/share/java/kafka/kafka-streams-1.0.0-cp1.jar:/usr/share/java/kafka/zkclient-0.10.jar:/usr/share/java/kafka/hk2-locator-2.5.0-b32.jar:/usr/share/java/kafka/slf4j-api-1.7.25.jar:/usr/share/java/kafka/support-metrics-common-4.0.0.jar:/usr/share/java/kafka/kafka.jar:/usr/share/java/kafka/jersey-server-2.25.1.jar:/usr/share/java/kafka/jackson-module-jaxb-annotations-2.9.1.jar:/usr/share/java/kafka/jetty-io-9.2.22.v20170606.jar:/usr/share/java/kafka/kafka-log4j-appender-1.0.0-cp1.jar:/usr/share/java/kafka/avro-1.8.2.jar:/usr/share/java/kafka/jackson-annotations-2.9.1.jar:/usr/share/java/kafka/guava-20.0.jar:/usr/share/java/kafka/hk2-api-2.5.0-b32.jar:/usr/share/java/kafka/lz4-java-1.4.jar:/usr/share/java/kafka/reflections-0.9.11.jar:/usr/share/java/kafka/commons-digester-1.8.1.jar:/usr/share/java/kafka/slf4j-log4j12-1.7.25.jar:/usr/share/java/kafka/jersey-client-2.25.1.jar:/usr/share/java/kafka/commons-lang3-3.5.jar:/usr/share/java/kafka/jackson-mapper-asl-1.9.13.jar:/usr/share/java/kafka/javax.annotation-api-1.2.jar:/usr/share/java/kafka/snappy-java-1.1.4.jar:/usr/share/java/kafka/javax.inject-2.5.0-b32.jar:/usr/share/java/kafka/jackson-databind-2.9.1.jar:/usr/share/java/kafka/jetty-http-9.2.22.v20170606.jar:/usr/share/java/kafka/kafka-streams-examples-1.0.0-cp1.jar:/usr/share/java/kafka/plexus-utils-3.0.24.jar:/usr/share/java/kafka/metrics-core-2.2.0.jar:/usr/share/java/kafka/connect-file-1.0.0-cp1.jar:/usr/share/java/kafka/kafka-tools-1.0.0-cp1.jar:/usr/share/java/kafka/jersey-guava-2.25.1.jar:/usr/share/java/kafka/commons-logging-1.2.jar:/usr/share/java/kafka/jetty-server-9.2.22.v20170606.jar:/usr/share/java/kafka/validation-api-1.1.0.Final.jar:/usr/share/java/kafka/jetty-continuation-9.2.22.v20170606.jar:/usr/share/java/kafka/o
sgi-resource-locator-1.0.1.jar:/usr/share/java/kafka/httpmime-4.5.2.jar:/usr/share/java/kafka/log4j-1.2.17.jar:/usr/share/java/kafka/jopt-simple-5.0.4.jar:/usr/share/java/kafka/kafka_2.11-1.0.0-cp1-javadoc.jar:/usr/share/java/kafka/javax.ws.rs-api-2.0.1.jar:/usr/share/java/kafka/jersey-container-servlet-core-2.25.1.jar:/usr/share/java/kafka/jetty-servlet-9.2.22.v20170606.jar:/usr/share/java/kafka/jetty-util-9.2.22.v20170606.jar:/usr/share/java/kafka/commons-validator-1.4.1.jar:/usr/share/java/kafka/kafka_2.11-1.0.0-cp1.jar:/usr/share/java/kafka/javax.inject-1.jar:/usr/share/java/kafka/commons-beanutils-1.8.3.jar:/usr/share/java/kafka/javassist-3.20.0-GA.jar:/usr/share/java/kafka/aopalliance-repackaged-2.5.0-b32.jar:/usr/share/java/kafka/kafka_2.11-1.0.0-cp1-test.jar:/usr/share/java/confluent-common/zookeeper-3.4.10.jar:/usr/share/java/confluent-common/common-metrics-4.0.0.jar:/usr/share/java/confluent-common/build-tools-4.0.0.jar:/usr/share/java/confluent-common/zkclient-0.10.jar:/usr/share/java/confluent-common/slf4j-api-1.7.25.jar:/usr/share/java/confluent-common/common-utils-4.0.0.jar:/usr/share/java/confluent-common/netty-3.10.5.Final.jar:/usr/share/java/confluent-common/common-config-4.0.0.jar:/usr/share/java/confluent-common/jline-0.9.94.jar:/usr/share/java/confluent-common/log4j-1.2.17.jar:/usr/share/java/kafka-serde-tools/kafka-json-serializer-4.0.0.jar:/usr/share/java/kafka-serde-tools/paranamer-2.7.jar:/usr/share/java/kafka-serde-tools/xz-1.5.jar:/usr/share/java/kafka-serde-tools/jackson-core-asl-1.9.13.jar:/usr/share/java/kafka-serde-tools/jackson-core-2.9.1.jar:/usr/share/java/kafka-serde-tools/kafka-connect-avro-converter-4.0.0.jar:/usr/share/java/kafka-serde-tools/commons-compress-1.8.1.jar:/usr/share/java/kafka-serde-tools/avro-1.8.2.jar:/usr/share/java/kafka-serde-tools/jackson-annotations-2.9.1.jar:/usr/share/java/kafka-serde-tools/kafka-schema-registry-client-4.0.0.jar:/usr/share/java/kafka-serde-tools/kafka-avro-serializer-4.0.0.jar:/usr/share/java/kafka-serde-tools/jackson-mapper-asl-1.9.13.jar:/usr/share/java/kafka-serde-tools/jackson-databind-2.9.1.jar:/usr/share/java/kafka-serde-tools/snappy-java-1.1.1.3.jar:/usr/bin/../share/java/kafka/jackson-jaxrs-json-provider-2.9.1.jar:/usr/bin/../share/java/kafka/paranamer-2.7.jar:/usr/bin/../share/java/kafka/rocksdbjni-5.7.3.jar:/usr/bin/../share/java/kafka/jackson-jaxrs-base-2.9.1.jar:/usr/bin/../share/java/kafka/jetty-servlets-9.2.22.v20170606.jar:/usr/bin/../share/java/kafka/kafka-clients-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/xz-1.5.jar:/usr/bin/../share/java/kafka/commons-lang3-3.1.jar:/usr/bin/../share/java/kafka/jetty-security-9.2.22.v20170606.jar:/usr/bin/../share/java/kafka/httpclient-4.5.2.jar:/usr/bin/../share/java/kafka/jackson-core-asl-1.9.13.jar:/usr/bin/../share/java/kafka/connect-json-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/connect-runtime-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/jersey-container-servlet-2.25.1.jar:/usr/bin/../share/java/kafka/commons-collections-3.2.1.jar:/usr/bin/../share/java/kafka/connect-transforms-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/jersey-common-2.25.1.jar:/usr/bin/../share/java/kafka/zookeeper-3.4.10.jar:/usr/bin/../share/java/kafka/scala-library-2.11.11.jar:/usr/bin/../share/java/kafka/jackson-core-2.9.1.jar:/usr/bin/../share/java/kafka/argparse4j-0.7.0.jar:/usr/bin/../share/java/kafka/maven-artifact-3.5.0.jar:/usr/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1-test-sources.jar:/usr/bin/../share/java/kafka/commons-codec-1.9.jar:/usr/bin/../share/java/kafka/kafka-connect-ca
ssandra-1.0.0-1.0.0-all.jar:/usr/bin/../share/java/kafka/httpcore-4.4.4.jar:/usr/bin/../share/java/kafka/hk2-utils-2.5.0-b32.jar:/usr/bin/../share/java/kafka/connect-api-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/javassist-3.21.0-GA.jar:/usr/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1-sources.jar:/usr/bin/../share/java/kafka/support-metrics-client-4.0.0.jar:/usr/bin/../share/java/kafka/commons-compress-1.8.1.jar:/usr/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1-scaladoc.jar:/usr/bin/../share/java/kafka/javax.servlet-api-3.1.0.jar:/usr/bin/../share/java/kafka/jersey-media-jaxb-2.25.1.jar:/usr/bin/../share/java/kafka/kafka-streams-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/zkclient-0.10.jar:/usr/bin/../share/java/kafka/hk2-locator-2.5.0-b32.jar:/usr/bin/../share/java/kafka/slf4j-api-1.7.25.jar:/usr/bin/../share/java/kafka/support-metrics-common-4.0.0.jar:/usr/bin/../share/java/kafka/kafka.jar:/usr/bin/../share/java/kafka/jersey-server-2.25.1.jar:/usr/bin/../share/java/kafka/jackson-module-jaxb-annotations-2.9.1.jar:/usr/bin/../share/java/kafka/jetty-io-9.2.22.v20170606.jar:/usr/bin/../share/java/kafka/kafka-log4j-appender-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/avro-1.8.2.jar:/usr/bin/../share/java/kafka/jackson-annotations-2.9.1.jar:/usr/bin/../share/java/kafka/guava-20.0.jar:/usr/bin/../share/java/kafka/hk2-api-2.5.0-b32.jar:/usr/bin/../share/java/kafka/lz4-java-1.4.jar:/usr/bin/../share/java/kafka/reflections-0.9.11.jar:/usr/bin/../share/java/kafka/commons-digester-1.8.1.jar:/usr/bin/../share/java/kafka/slf4j-log4j12-1.7.25.jar:/usr/bin/../share/java/kafka/jersey-client-2.25.1.jar:/usr/bin/../share/java/kafka/commons-lang3-3.5.jar:/usr/bin/../share/java/kafka/jackson-mapper-asl-1.9.13.jar:/usr/bin/../share/java/kafka/javax.annotation-api-1.2.jar:/usr/bin/../share/java/kafka/snappy-java-1.1.4.jar:/usr/bin/../share/java/kafka/javax.inject-2.5.0-b32.jar:/usr/bin/../share/java/kafka/jackson-databind-2.9.1.jar:/usr/bin/../share/java/kafka/jetty-http-9.2.22.v20170606.jar:/usr/bin/../share/java/kafka/kafka-streams-examples-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/plexus-utils-3.0.24.jar:/usr/bin/../share/java/kafka/metrics-core-2.2.0.jar:/usr/bin/../share/java/kafka/connect-file-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/kafka-tools-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/jersey-guava-2.25.1.jar:/usr/bin/../share/java/kafka/commons-logging-1.2.jar:/usr/bin/../share/java/kafka/jetty-server-9.2.22.v20170606.jar:/usr/bin/../share/java/kafka/validation-api-1.1.0.Final.jar:/usr/bin/../share/java/kafka/jetty-continuation-9.2.22.v20170606.jar:/usr/bin/../share/java/kafka/osgi-resource-locator-1.0.1.jar:/usr/bin/../share/java/kafka/httpmime-4.5.2.jar:/usr/bin/../share/java/kafka/log4j-1.2.17.jar:/usr/bin/../share/java/kafka/jopt-simple-5.0.4.jar:/usr/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1-javadoc.jar:/usr/bin/../share/java/kafka/javax.ws.rs-api-2.0.1.jar:/usr/bin/../share/java/kafka/jersey-container-servlet-core-2.25.1.jar:/usr/bin/../share/java/kafka/jetty-servlet-9.2.22.v20170606.jar:/usr/bin/../share/java/kafka/jetty-util-9.2.22.v20170606.jar:/usr/bin/../share/java/kafka/commons-validator-1.4.1.jar:/usr/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1.jar:/usr/bin/../share/java/kafka/javax.inject-1.jar:/usr/bin/../share/java/kafka/commons-beanutils-1.8.3.jar:/usr/bin/../share/java/kafka/javassist-3.20.0-GA.jar:/usr/bin/../share/java/kafka/aopalliance-repackaged-2.5.0-b32.jar:/usr/bin/../share/java/kafka/kafka_2.11-1.0.0-cp1-test.jar:/usr/bin/../share/java/confluent-support-metrics/*:/usr/sh
are/java/confluent-support-metrics/*
    os.spec = Linux, amd64, 4.4.0-1049-aws
    os.vcpus = 2
 (org.apache.kafka.connect.runtime.WorkerInfo:71)
[2018-02-18 10:28:33,279] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectDistributed:69)
[2018-02-18 10:28:33,290] INFO Loading plugin from: /home/ubuntu/kafka_2.11-1.0.0/plugins/lib (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:179)
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.collect.Sets$SetView.iterator()Lcom/google/common/collect/UnmodifiableIterator;
    at org.reflections.Reflections.expandSuperTypes(Reflections.java:380)
    at org.reflections.Reflections.<init>(Reflections.java:126)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanPluginPath(DelegatingClassLoader.java:258)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.scanUrlsAndAddPlugins(DelegatingClassLoader.java:201)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.registerPlugin(DelegatingClassLoader.java:193)
    at org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader.initLoaders(DelegatingClassLoader.java:153)
    at org.apache.kafka.connect.runtime.isolation.Plugins.<init>(Plugins.java:47)
    at org.apache.kafka.connect.cli.ConnectDistributed.main(ConnectDistributed.java:70)

The first entry in jvm.classpath is the location where I have kafka-connect-cassandra.jar, inside plugins/lib.

Guava jars:

These are the paths of the Guava jars on my system. Where should I place the kafka-connect-cassandra.jar, or should I just remove some of these jars?

/usr/share/java/kafka-connect-elasticsearch/guava-18.0.jar
/usr/share/java/kafka-connect-storage-common/guava-14.0.1.jar
/usr/share/java/kafka/guava-20.0.jar
/home/ubuntu/cassandra/apache-cassandra-3.11.1/lib/guava-18.0.jar

Kindly help me out.

ANSWER

Answered 2018-Feb-18 at 00:59

This is a classpath issue. It looks like you may have an incompatible version of Guava on the classpath. If none of the jars on your plugin path include this method, that's a connector packaging issue. If they do, then you probably have two versions hanging around. As a first step, double-check the plugin path with a find command and inspect all jars for the class named in the message. Ultimately, you'll need to figure out which version of the dependency the connector expects and get that version, and only that version, into the plugin path.
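
A minimal sketch of that check, using the plugin-path and classpath locations quoted in the question (unzip -l stands in here for the find/jar inspection the answer mentions):

for jar in /home/ubuntu/kafka_2.11-1.0.0/plugins/lib/*.jar \
           /usr/share/java/kafka/*.jar; do
  # list the jar's entries and look for the Guava Sets classes
  if unzip -l "$jar" 2>/dev/null | grep -q 'com/google/common/collect/Sets'; then
    echo "Guava Sets classes found in: $jar"
  fi
done

Any jar that bundles these classes but does not match the single Guava version the connector was built against is a likely source of the conflict.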

Source https://stackoverflow.com/questions/48842850

QUESTION

Unable to run a JDBC Source connector with Confluent REST API

Asked 2019-Jun-30 at 03:19

I want to run a JDBC source connector using the Kafka Connect REST API. Standalone mode works perfectly with the following properties file:

name=source-mysql-test
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
tasks.max=1

connection.url=jdbc:mysql://localhost:3306/kafka
connection.user=myuser
connection.password=mypass


table.whitelist=MY_TABLE

# Pull all rows based on timestamp
mode=timestamp
timestamp.column.name=ROWVERSION
validate.non.null=false

# The Kafka topic will be made up of this prefix, plus the table name.
topic.prefix=MYSQL-

table.types=TABLE,VIEW
poll.interval.ms=1000

However, I am not able to run the connector using the REST API. Here's the call:

curl -X POST -H "Content-Type: application/json" --data '{"name": "source-mysql-test", "config": {"connector.class":"io.confluent.connect.jdbc.JdbcSourceConnector", "tasks.max":"1", "connection.url":"jdbc:mysql://localhost:3306/kafka","connection.user":"myuser","connection.password":"mypass", "table.whitelist":"MY_TABLE", "mode":"timestamp", "timestamp.column.name":"ROWVERSION", "validate.non.null":"false", "topic.prefix":"MYSQL-", "table.types":"TABLE,VIEW", "poll.interval.ms":"1000" }}' http://localhost:8083/connectors

And here's the response:

{
  "error_code": 400,
  "message": "Connector configuration is invalid and contains the following 2 error(s):\nInvalid value com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create connection to database server. for configuration Couldn't open connection to jdbc:mysql://localhost:3306/kafka\nInvalid value com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Could not create connection to database server. for configuration Couldn't open connection to jdbc:mysql://localhost:3306/kafka\nYou can also find the above list of errors at the endpoint `/{connectorType}/config/validate`"
}
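
The message itself points at the worker's config validation endpoint. A hedged sketch of calling it directly (same worker address and connector class as in the question) can surface the full validation output:

curl -X PUT -H "Content-Type: application/json" \
  --data '{"connector.class":"io.confluent.connect.jdbc.JdbcSourceConnector","connection.url":"jdbc:mysql://localhost:3306/kafka","connection.user":"myuser","connection.password":"mypass","topic.prefix":"MYSQL-"}' \
  http://localhost:8083/connector-plugins/io.confluent.connect.jdbc.JdbcSourceConnector/config/validate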

In the past I have used the REST API in order to run JDBC sink connectors without any problems!

Here are the available connector plugins:

> ls /usr/share/java/kafka-connect-jdbc/
common-utils-4.1.0.jar        mysql-connector-java-5.1.46.jar  uber-restavro-1.0-SNAPSHOT.jar
jline-0.9.94.jar              netty-3.10.5.Final.jar          
kafka-connect-jdbc-4.1.0.jar  postgresql-9.4-1206-jdbc41.jar   zkclient-0.10.jar
log4j-1.2.17.jar              slf4j-api-1.7.25.jar             zookeeper-3.4.10.jar
mssql-jdbc-6.2.2.jre8.jar     sqlite-jdbc-3.8.11.2.jar

ANSWER

Answered 2018-May-15 at 16:33
com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: 
Could not create connection to database server

Kafka Connect cannot connect to your MySQL machine.
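
A couple of rough connectivity checks, run from the machine (or container) where the Connect worker actually runs, using the host, port and credentials from the question. Note that inside a container, localhost refers to the container itself, not to the machine hosting MySQL:

nc -vz localhost 3306                                    # is anything reachable on 3306 from here?
mysql -h 127.0.0.1 -P 3306 -u myuser -p kafka -e 'SELECT 1'   # do these credentials work from here?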

Source https://stackoverflow.com/questions/50351956

QUESTION

Why does my Kafka Connect connection to MySQL 8.0 always run into problems?

Asked 2019-May-15 at 13:16

When I try to connect to MySQL 8.0 with Kafka Connect, there is always a problem with my driver: no suitable driver found.

This is on a new CentOS 7 machine, with plugin.path = [share/java, /root/confluent-5.2.1/share/confluent-hub-components]. Under the directory /root/confluent-5.2.1/share/confluent-hub-components there are two connectors:

[root@localhost confluent-hub-components]# ls
confluentinc-kafka-connect-jdbc  debezium-debezium-connector-mysql

The JDBC connector's lib directory contains:

[root@localhost confluent-hub-components]# cd confluentinc-kafka-connect-jdbc/
[root@localhost confluentinc-kafka-connect-jdbc]# ls
assets  doc  etc  lib  manifest.json
[root@localhost confluentinc-kafka-connect-jdbc]# cd lib
[root@localhost lib]# ls
audience-annotations-0.5.0.jar  jline-0.9.94.jar  kafka-connect-jdbc-5.2.1.jar  postgresql-9.4-1206-jdbc41.jar  sqlite-jdbc-3.25.2.jar  zookeeper-3.4.13.jar
common-utils-5.2.1.jar          jtds-1.3.1.jar    netty-3.10.6.Final.jar        slf4j-api-1.7.25.jar            zkclient-0.10.jar

Here is the command I run:

 bin/connect-standalone etc/schema-registry/connect-avro-standalone.properties etc/kafka-connect-jdbc/mysql-source.properties

mysql-source.properties:

#tasks to create:
name=jdbc-source-mysql-01
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
#tasks.max=1
# a table called 'users' will be written to the topic 'test-mysql-jdbc-users'.
connection.user=root
connection.password=root
connection.url=jdbc:mysql://localhost:3306/employees
mode=bulk
#incrementing.column.name=fdsid
topic.prefix=test-mysql-jdbc-

Error:

Invalid value java.sql.SQLException: No suitable driver found for jdbc:mysql://localhost:3306/employees for configuration Couldn't open connection to jdbc:mysql://localhost:3306/employees

ANSWER

Answered 2019-May-15 at 13:16

The MySQL driver does not ship with the Kafka Connect JDBC connector. You have to install it yourself, in the correct location.

You need to put the relevant MySQL JDBC driver JAR into the Kafka Connect JDBC folder before you start the Kafka Connect worker.

This post will help you out with more detail.
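
As an illustrative sketch only (the Connector/J version and download URL below are assumptions; the destination is the connector lib directory listed in the question):

# Download a MySQL Connector/J jar (version chosen here is illustrative)
curl -LO https://repo1.maven.org/maven2/mysql/mysql-connector-java/8.0.17/mysql-connector-java-8.0.17.jar
# Copy it into the kafka-connect-jdbc lib directory shown in the question
cp mysql-connector-java-8.0.17.jar \
   /root/confluent-5.2.1/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib/
# Restart the Connect worker afterwards so the driver is picked up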

Source https://stackoverflow.com/questions/56149282

Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

VULNERABILITIES

No vulnerabilities reported

INSTALL common-utils

You can use common-utils like any standard Java library. Please include the jar files in your classpath. You can also use any IDE to run and debug the common-utils component as you would any other Java program. Best practice is to use a build tool that supports dependency management, such as Maven or Gradle. For Maven installation, please refer to maven.apache.org. For Gradle installation, please refer to gradle.org.
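
For example, a build-from-source sketch (the repository URL is inferred from the author name and the Maven build is an assumption; check the repository for the actual build setup):

git clone https://github.com/stefzhlg/common-utils.git   # assumed repository location
cd common-utils
mvn clean install    # builds the jar and installs it into your local Maven repository
# Then add the resulting jar (or the local artifact) to your application's classpath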

SUPPORT

For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check and ask questions on the community page Stack Overflow.
