
hbase | Apache HBase

by apache | Java | Version: rel/2.4.11 | License: Apache-2.0


kandi X-RAY | hbase Summary

hbase is a Java library typically used in Big Data, Spark, and Hadoop applications. hbase has no bugs and no vulnerabilities reported, has a build file available, has a permissive license, and has high support. You can download it from GitHub or Maven.
Apache HBase [1] is an open-source, distributed, versioned, column-oriented store modeled after Google's Bigtable: A Distributed Storage System for Structured Data by Chang et al. [2] Just as Bigtable leverages the distributed data storage provided by the Google File System, HBase provides Bigtable-like capabilities on top of Apache Hadoop [3]. To get started using HBase, the full documentation for this release can be found under the docs/ directory that accompanies this README. Using a browser, open docs/index.html to view the project home page (or browse to [1]). The HBase 'book' at http://hbase.apache.org/book.html has a 'quick start' section and is where you should begin your exploration of the HBase project. The latest HBase can be downloaded from an Apache mirror [4]. The source code can be found at [5]. The HBase issue tracker is at [6]. Apache HBase is made available under the Apache License, version 2.0 [7]. The HBase mailing lists and archives are listed at [8]. The HBase distribution includes cryptographic software; see the export control notice at [9].
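To make the Bigtable-like data model concrete, here is a minimal client sketch (an illustration, not part of the README). It assumes an hbase-site.xml on the classpath pointing at a running cluster; the table 'myTable', column family 'cf', and qualifier 'q' are hypothetical:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

public class QuickStart {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create(); // reads hbase-site.xml from the classpath
        try (Connection connection = ConnectionFactory.createConnection(conf);
             Table table = connection.getTable(TableName.valueOf("myTable"))) {
            // Write one cell: row key "row1", column family "cf", qualifier "q"
            Put put = new Put(Bytes.toBytes("row1"));
            put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value1"));
            table.put(put);
            // Read the cell back
            Result result = table.get(new Get(Bytes.toBytes("row1")));
            System.out.println(Bytes.toString(result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("q"))));
        }
    }
}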

Support

  • hbase has a highly active ecosystem.
  • It has 4427 star(s) with 2980 fork(s). There are 411 watchers for this library.
  • There were 2 major release(s) in the last 6 months.
  • hbase has no issues reported. There are 166 open pull requests and 0 closed pull requests.
  • It has a positive sentiment in the developer community.
  • The latest version of hbase is rel/2.4.11.

Quality

  • hbase has no bugs reported.

Security

  • hbase has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.

License

  • hbase is licensed under the Apache-2.0 License. This license is Permissive.
  • Permissive licenses have the least restrictions, and you can use them in most projects.

Reuse

  • hbase releases are available to install and integrate.
  • Deployable package is available in Maven.
  • Build file is available. You can build the component from source.
Top functions reviewed by kandi - BETA

kandi has reviewed hbase and identified the functions below as its top functions. This is intended to give you an instant insight into the functionality hbase implements, and to help you decide if it suits your requirements. A client-side sketch of the row-retrieval path follows the list.

  • Add HBase methods.
  • Finish the active master.
  • Perform the compaction.
  • Generate an assignment plan.
  • Create a record writer.
  • Check for consistency.
  • Process a multi request.
  • Fill the snapshot.
  • Retrieve the next row.
  • Perform a rolling split on a table.
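For illustration, the row-retrieval path above ('Retrieve the next row') surfaces in the public client API as a Scan. A minimal sketch, assuming a reachable cluster configured via hbase-site.xml; the table 'myTable' and column family 'cf' are hypothetical:

import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

public class ScanExample {
    public static void main(String[] args) throws Exception {
        try (Connection connection =
                 ConnectionFactory.createConnection(HBaseConfiguration.create());
             Table table = connection.getTable(TableName.valueOf("myTable"));
             // Each iteration of the scanner retrieves the next row
             ResultScanner scanner =
                 table.getScanner(new Scan().addFamily(Bytes.toBytes("cf")))) {
            for (Result row : scanner) {
                System.out.println(Bytes.toString(row.getRow()));
            }
        }
    }
}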

References

[1] http://hbase.apache.org
[2] http://research.google.com/archive/bigtable.html
[3] http://hadoop.apache.org
[4] http://www.apache.org/dyn/closer.lua/hbase/
[5] https://hbase.apache.org/source-repository.html
[6] https://hbase.apache.org/issue-tracking.html
[7] http://hbase.apache.org/license.html
[8] http://hbase.apache.org/mail-lists.html
[9] https://hbase.apache.org/export_control.html

Failed to Find Any Kerberos TGT while trying to access Kerberized HBase Without kinit

import org.apache.hadoop.conf.{Configuration => HadoopConf}
import org.apache.hadoop.hbase.HBaseConfiguration
import org.apache.hadoop.hbase.client.{Connection, ConnectionFactory}
import org.apache.hadoop.security.UserGroupInformation

object Debug extends App {
   val hbaseConf: HadoopConf = HBaseConfiguration.create

   // Log in from the keytab explicitly instead of relying on a kinit ticket cache
   UserGroupInformation.setConfiguration(hbaseConf)
   UserGroupInformation.loginUserFromKeytab("principal@REALM", "C:\\principal.keytab")

   val connection: Connection = ConnectionFactory.createConnection(hbaseConf)
   // ... table gets, etc.
}

Apache Hive fails to initialize on Windows 10 and Cygwin

schematool -dbType derby -initSchema
Error: FUNCTION 'NUCLEUS_ASCII' already exists. (state=X0Y68,code=30000)
org.apache.hadoop.hive.metastore.HiveMetaException: Schema initialization FAILED! Metastore state would be inconsistent !!
Underlying cause: java.io.IOException : Schema script failed, errorcode 2
Use --verbose for detailed stacktrace.
*** schemaTool failed ***
--CREATE FUNCTION "APP"."NUCLEUS_ASCII" (C CHAR(1)) RETURNS INTEGER LANGUAGE JAVA PARAMETER STYLE JAVA READS SQL DATA CALLED ON NULL INPUT EXTERNAL NAME 'org.datanucleus.store.rdbms.adapter.DerbySQLFunction.ascii';

--CREATE FUNCTION "APP"."NUCLEUS_MATCHES" (TEXT VARCHAR(8000),PATTERN VARCHAR(8000)) RETURNS INTEGER LANGUAGE JAVA PARAMETER STYLE JAVA READS SQL DATA CALLED ON NULL INPUT EXTERNAL NAME 'org.datanucleus.store.rdbms.adapter.DerbySQLFunction.matches' ;
Initialization script completed
schemaTool completed
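The two commented-out CREATE FUNCTION statements above hint at the workaround commonly reported for this error: because NUCLEUS_ASCII and NUCLEUS_MATCHES already exist in the Derby metastore, commenting those two statements out of the Derby schema script lets schematool run to completion, as the final 'Initialization script completed / schemaTool completed' lines show.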

Python ValueError: too many values to unpack

interests =[(0,"Hadoop"),(0,"Big Data"),(0,"HBase"),(0,"Java"),(0,"Spark"),(0,"Storm"),(0,"Cassandra"),(1,"NoSQL",0),
            (1,"MongoDB"),(1,"Cassandra"),(1,"HBase"),(1,"Postgres"),(2,"Python"),(2,"scikit-learn"),(2,"scipy"),(2,"numpy"),
            (2,"statsmodels"),(2,"pandas"),(3,"R"),(3,"Python"),(3,"statistics"),(3,"regression"),(3,"probability"),
            (4,"machine learning"),(4,"regression"),(4,"decision trees"),(4,"libsvm"),(5,"Python"),(5,"R"),(5,"Java"),
            (5,"C++"),(5,"Haskell"),(5,"programming languages"),(6,"statistics"),(6,"probability"),(6,"mathematics"),
            (6,"theory"),(7,"machine learning"),(7,"scikit-learn"),(7,"Mahoot"),(7,"neural networks"),(8,"neural networks"),
            (8,"deep learning"),(8,"Big Data"),(8,"artificial intelligence"),(9,"Hadoop"),(9,"Java"),(9,"MapReduce"),
            (9,"Big Data")]
# The extra element in tuples like (1, "NoSQL", 0) is absorbed by *a,
# which avoids the "too many values to unpack" error.
for x, y, *a in interests:
    print(x)
    print(y)

HBase java error - Expected HEADER=HBas but received HEADER=\x00\x00\x01\x0B

org.apache.hadoop.conf.Configuration configuration = HBaseConfiguration.create();
configuration.set("hbase.zookeeper.quorum", "server1,server2,server3");
configuration.set("hbase.zookeeper.property.clientPort", "2181");
configuration.set("hadoop.security.authentication", "kerberos");
configuration.set("hbase.security.authentication", "kerberos");
configuration.set("hbase.cluster.distributed", "true"); // Configuration.set takes String values
configuration.set("hbase.rpc.protection", "authentication"); // Check this against your hbase configuration
configuration.set("hbase.regionserver.kerberos.principal", "hbase/_HOST@yourdomain");
configuration.set("hbase.master.kerberos.principal", "hbase/_HOST@yourdomain");
UserGroupInformation.setConfiguration(configuration);
UserGroupInformation.loginUserFromKeytab(youruser, "src/main/resources/key.keytab");
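Once the login succeeds, the same configuration object can be passed to ConnectionFactory.createConnection(...) to open the Kerberos-authenticated connection, as in the Scala snippet earlier on this page.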

When using spark-hbase, I got a ClassNotFoundException: org.apache.hadoop.hbase.spark.SparkSQLPushDownFilter

The scala-library jar can be taken from the local Maven repository:

$HOME/.m2/repository/org/scala-lang/scala-library/<scala.version>/

Add the hbase-spark connector jars and scala-library to the HBase classpath, since the filter class must be present on the region servers:

export HBASE_CLASSPATH=$HBASE_HOME/site-lib/hbase-spark-1.0.1-SNAPSHOT.jar:$HBASE_HOME/site-lib/hbase-spark-protocol-shaded-1.0.1-SNAPSHOT.jar:$HBASE_HOME/site-lib/scala-library-2.12.12.jar

Alternatively, disable filter pushdown on the client so the server-side SparkSQLPushDownFilter class is never needed:

val df = sql.read.format("org.apache.hadoop.hbase.spark")
    .option("hbase.columns.mapping", "...")
    .option("hbase.spark.pushdown.columnfilter", false)
    .option("hbase.table", "person")
    .load()

Python Dataframe to Columnar format for accessing the columns dynamically

df = (
    df.rename(columns={"COL1": "KEY_COLUMN"})
    .melt("KEY_COLUMN", var_name="KEY", value_name="VALUE")
    .sort_values(by="KEY_COLUMN")
)
print(df)
   KEY_COLUMN   KEY  VALUE
0         100  COL2    200
4         100  COL3    300
1         101  COL2    201
5         101  COL3    301
2         102  COL2    202
6         102  COL3    302
3         103  COL2    203
7         103  COL3    303

Unable to canonicalize address localhost/<unresolved>:2222 because it's not resolvable

<property>
    <name>hbase.cluster.distributed</name>
    <value>true</value>
  </property>
  <property>
    <name>hbase.tmp.dir</name>
    <value>./tmp</value>
  </property>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://server1:9000/hbase</value>
  </property>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <value>zkserver1,zkserver2,zkserver3</value>
  </property>
  <property>
      <name>hbase.zookeeper.property.clientPort</name>
      <value>2222</value>
  </property>

    conf.set("hbase.cluster.distributed", "hbase.cluster.distributed");
conf.set("hbase.cluster.distributed", "true");

Python subprocess with apostrophes removes them

import subprocess

# Passing the whole pipeline as one bash -c string (in a triple-quoted
# Python literal) preserves the inner single quotes around the shell command.
subprocess.run([
    'docker', 'exec', 'hbase', 'bash', '-c',
    '''echo 'create "myTable", "R"' | hbase shell'''])

java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/metadata/HiveException when query in spark-shell

cp /usr/lib/hive/conf/hive-site.xml ${SPARK_HOME}/conf/

apache tomcat start fails with phoenix-5.0.0-HBase-2.0-client.jar in project lib folder

com.clearspring.analytics:stream:2.9.5
com.fasterxml.jackson.core:jackson-annotations:2.4.0
com.fasterxml.jackson.core:jackson-core:2.4.0
com.fasterxml.jackson.core:jackson-databind:2.4.0
com.fasterxml.jackson.jaxrs:jackson-jaxrs-base:2.7.8
com.fasterxml.jackson.jaxrs:jackson-jaxrs-json-provider:2.7.8
com.fasterxml.jackson.module:jackson-module-jaxb-annotations:2.7.8
com.fasterxml.woodstox:woodstox-core:5.0.3
com.github.stephenc.findbugs:findbugs-annotations:1.3.9-1
com.github.stephenc.jcip:jcip-annotations:1.0-1
com.google.code.gson:gson:2.8.1
com.google.guava:guava:13.0.1
com.google.protobuf:protobuf-java-util:3.3.0
com.google.protobuf:protobuf-java:2.5.0
com.google.re2j:re2j:1.1
com.jcraft:jsch:0.1.54
com.nimbusds:nimbus-jose-jwt:4.41.1
com.salesforce.i18n:i18n-util:1.0.4
com.squareup.okhttp:okhttp:2.4.0
com.squareup.okio:okio:1.4.0
com.sun.jersey.contribs:jersey-guice:1.19
com.sun.jersey:jersey-client:1.19
com.sun.jersey:jersey-core:1.19
com.sun.jersey:jersey-json:1.19
com.sun.jersey:jersey-server:1.19
com.sun.jersey:jersey-servlet:1.19
com.tdunning:json:1.8
com.thoughtworks.paranamer:paranamer:2.3
commons-beanutils:commons-beanutils:1.9.3
commons-cli:commons-cli:1.4
commons-codec:commons-codec:1.7
commons-collections:commons-collections:3.2.2
commons-daemon:commons-daemon:1.0.13
commons-io:commons-io:2.5
commons-lang:commons-lang:2.6
commons-logging:commons-logging:1.2
commons-net:commons-net:3.1
io.dropwizard.metrics:metrics-core:3.1.0
io.netty:netty-all:4.1.17.Final
io.netty:netty:3.10.5.Final
javax.annotation:javax.annotation-api:1.2
javax.servlet.jsp:javax.servlet.jsp-api:2.3.1
javax.servlet:javax.servlet-api:3.1.0
javax.validation:validation-api:1.1.0.Final
javax.ws.rs:javax.ws.rs-api:2.0.1
javax.ws.rs:jsr311-api:1.1.1
javax.xml.bind:jaxb-api:2.2.11
jline:jline:2.11
log4j:log4j:1.2.17
net.minidev:accessors-smart:1.1
net.minidev:json-smart:2.2.1
org.antlr:antlr-runtime:3.5.2
org.apache.avro:avro-ipc:1.7.3
org.apache.avro:avro:1.7.7
org.apache.commons:commons-collections4:4.1
org.apache.commons:commons-compress:1.4.1
org.apache.commons:commons-configuration2:2.1.1
org.apache.commons:commons-crypto:1.0.0
org.apache.commons:commons-csv:1.0
org.apache.commons:commons-lang3:3.6
org.apache.commons:commons-math3:3.6.1
org.apache.curator:curator-client:2.12.0
org.apache.curator:curator-framework:2.12.0
org.apache.curator:curator-recipes:2.12.0
org.apache.flume:flume-ng-configuration:1.4.0
org.apache.flume:flume-ng-core:1.4.0
org.apache.flume:flume-ng-sdk:1.4.0
org.apache.hadoop:hadoop-annotations:3.0.0
org.apache.hadoop:hadoop-auth:3.0.0
org.apache.hadoop:hadoop-client:3.0.0
org.apache.hadoop:hadoop-common:3.0.0
org.apache.hadoop:hadoop-distcp:2.7.4
org.apache.hadoop:hadoop-hdfs-client:3.0.0
org.apache.hadoop:hadoop-hdfs:3.0.0
org.apache.hadoop:hadoop-mapreduce-client-common:3.0.0
org.apache.hadoop:hadoop-mapreduce-client-core:3.0.0
org.apache.hadoop:hadoop-mapreduce-client-jobclient:3.0.0
org.apache.hadoop:hadoop-yarn-api:3.0.0
org.apache.hadoop:hadoop-yarn-common:3.0.0
org.apache.hbase.thirdparty:hbase-shaded-miscellaneous:2.1.0
org.apache.hbase.thirdparty:hbase-shaded-netty:2.1.0
org.apache.hbase.thirdparty:hbase-shaded-protobuf:2.1.0
org.apache.hbase:hbase-annotations:2.0.0
org.apache.hbase:hbase-client:2.0.0
org.apache.hbase:hbase-common:2.0.0
org.apache.hbase:hbase-hadoop-compat:2.0.0
org.apache.hbase:hbase-hadoop2-compat:2.0.0
org.apache.hbase:hbase-http:2.0.0
org.apache.hbase:hbase-mapreduce:2.0.0
org.apache.hbase:hbase-metrics-api:2.0.0
org.apache.hbase:hbase-metrics:2.0.0
org.apache.hbase:hbase-procedure:2.0.0
org.apache.hbase:hbase-protocol-shaded:2.0.0
org.apache.hbase:hbase-protocol:2.0.0
org.apache.hbase:hbase-replication:2.0.0
org.apache.hbase:hbase-server:2.0.0
org.apache.hbase:hbase-zookeeper:2.0.0
org.apache.htrace:htrace-core4:4.2.0-incubating
org.apache.htrace:htrace-core:3.1.0-incubating
org.apache.httpcomponents:httpclient:4.0.1
org.apache.httpcomponents:httpcore:4.0.1
org.apache.kerby:kerb-admin:1.0.1
org.apache.kerby:kerb-client:1.0.1
org.apache.kerby:kerb-common:1.0.1
org.apache.kerby:kerb-core:1.0.1
org.apache.kerby:kerb-crypto:1.0.1
org.apache.kerby:kerb-identity:1.0.1
org.apache.kerby:kerb-server:1.0.1
org.apache.kerby:kerb-simplekdc:1.0.1
org.apache.kerby:kerb-util:1.0.1
org.apache.kerby:kerby-asn1:1.0.1
org.apache.kerby:kerby-config:1.0.1
org.apache.kerby:kerby-pkix:1.0.1
org.apache.kerby:kerby-util:1.0.1
org.apache.kerby:kerby-xdr:1.0.1
org.apache.kerby:token-provider:1.0.1
org.apache.mina:mina-core:2.0.4
org.apache.phoenix:phoenix-core:5.0.0-HBase-2.0
org.apache.phoenix:phoenix-flume:5.0.0-HBase-2.0
org.apache.phoenix:phoenix-pig:5.0.0-HBase-2.0
org.apache.phoenix:phoenix-spark:5.0.0-HBase-2.0
org.apache.tephra:tephra-api:0.14.0-incubating
org.apache.tephra:tephra-core:0.14.0-incubating
org.apache.tephra:tephra-hbase-compat-2.0:0.14.0-incubating
org.apache.twill:twill-api:0.8.0
org.apache.twill:twill-common:0.8.0
org.apache.twill:twill-core:0.8.0
org.apache.twill:twill-discovery-api:0.8.0
org.apache.twill:twill-discovery-core:0.8.0
org.apache.twill:twill-zookeeper:0.8.0
org.apache.yetus:audience-annotations:0.5.0
org.codehaus.jettison:jettison:1.3.8
org.codehaus.woodstox:stax2-api:3.1.4
org.eclipse.jetty:jetty-http:9.3.19.v20170502
org.eclipse.jetty:jetty-io:9.3.19.v20170502
org.eclipse.jetty:jetty-security:9.3.19.v20170502
org.eclipse.jetty:jetty-server:9.3.19.v20170502
org.eclipse.jetty:jetty-servlet:9.3.19.v20170502
org.eclipse.jetty:jetty-util-ajax:9.3.19.v20170502
org.eclipse.jetty:jetty-util:9.3.19.v20170502
org.eclipse.jetty:jetty-webapp:9.3.19.v20170502
org.eclipse.jetty:jetty-xml:9.3.19.v20170502
org.fusesource.leveldbjni:leveldbjni-all:1.8
org.glassfish.hk2.external:aopalliance-repackaged:2.5.0-b32
org.glassfish.hk2.external:javax.inject:2.5.0-b32
org.glassfish.hk2:hk2-api:2.5.0-b32
org.glassfish.hk2:hk2-locator:2.5.0-b32
org.glassfish.hk2:hk2-utils:2.5.0-b32
org.glassfish.hk2:osgi-resource-locator:1.0.1
org.glassfish.jersey.bundles.repackaged:jersey-guava:2.25.1
org.glassfish.jersey.containers:jersey-container-servlet-core:2.25.1
org.glassfish.jersey.core:jersey-client:2.25.1
org.glassfish.jersey.core:jersey-common:2.25.1
org.glassfish.jersey.core:jersey-server:2.25.1
org.glassfish.jersey.media:jersey-media-jaxb:2.25.1
org.glassfish.web:javax.servlet.jsp:2.3.2
org.glassfish:javax.el:3.0.1-b11-SNAPSHOT
org.iq80.snappy:snappy:0.3
org.jamon:jamon-runtime:2.4.1
org.javassist:javassist:3.20.0-GA
org.jruby.jcodings:jcodings:1.0.18
org.jruby.joni:joni:2.1.2
org.jvnet:tiger-types:1.4
org.mortbay.jetty:jetty-util:6.1.26
org.mortbay.jetty:jetty:6.1.26
org.mortbay.jetty:servlet-api:2.5-20110124
org.slf4j:slf4j-api:1.6.4
org.slf4j:slf4j-log4j12:1.7.25
org.xerial.snappy:snappy-java:1.0.5
sqlline:sqlline:1.2.0

Community Discussions

Trending Discussions on hbase
  • Failed to Find Any Kerberos TGT while trying to access Kerberized HBase Without kinit
  • HBase to Delta Tables
  • Apache Hive fails to initialize on Windows 10 and Cygwin
  • Python ValueError: too many values to unpack
  • HBase java error - Expected HEADER=HBas but received HEADER=\x00\x00\x01\x0B
  • What does "Online" in Online Analytical Processing mean?
  • When using spark-hbase, I got a ClassNotFoundException: org.apache.hadoop.hbase.spark.SparkSQLPushDownFilter
  • Accessing HBase on Amazon EMR with Athena
  • Processing of queries using SparkSQL on different databases
  • Is there a maximum storage space configuration in HDFS or HBase?

QUESTION

Failed to Find Any Kerberos TGT while trying to access Kerberized HBase Without kinit

Asked 2022-Feb-21 at 20:36

I have a very simple Scala HBase GET application. I tried to make the connection as below:

import org.apache.hadoop.conf.{Configuration => HadoopConf}
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{Connection, ConnectionFactory, Get}

object Debug extends App {
    val hbaseConf: HadoopConf = HBaseConfiguration.create
    val connection: Connection = ConnectionFactory.createConnection(hbaseConf)
    val hbaseTable = connection.getTable(TableName.valueOf("my-hbase-table"))
    hbaseTable.get(new Get("rowkey".getBytes).addColumn("colFam".getBytes, "colName".getBytes))
}

Whenever I run this, I get an error like below:

Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:211)
at org.apache.hadoop.hbase.security.AbstractHBaseSaslRpcClient.getInitialResponse(AbstractHBaseSaslRpcClient.java:131)
at org.apache.hadoop.hbase.security.NettyHBaseSaslRpcClientHandler$1.run(NettyHBaseSaslRpcClientHandler.java:108)
at org.apache.hadoop.hbase.security.NettyHBaseSaslRpcClientHandler$1.run(NettyHBaseSaslRpcClientHandler.java:104)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1746)
at org.apache.hadoop.hbase.security.NettyHBaseSaslRpcClientHandler.handlerAdded(NettyHBaseSaslRpcClientHandler.java:104)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.callHandlerAdded0(DefaultChannelPipeline.java:606)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.addFirst(DefaultChannelPipeline.java:187)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.addFirst(DefaultChannelPipeline.java:380)
at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.addFirst(DefaultChannelPipeline.java:359)
at org.apache.hadoop.hbase.ipc.NettyRpcConnection.saslNegotiate(NettyRpcConnection.java:200)
... 18 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:162)
at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:122)
at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:189)
at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:224)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:192)
... 30 more

I am using Windows, so I have put four files under the root directory of the C: drive:

    C:\cacerts
    C:\jaas.conf
    C:\krb5.conf
    C:\principal.keytab

My C:\jaas.conf :

KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="C:\\principal.keytab"
    useTicketCache=false
    debug=true
    principal="principal@REALM";
};
Client {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="C:\\principal.keytab"
    useTicketCache=false
    debug=true
    principal="principal@REALM";
};
com.sun.security.jgss.krb5.initiate {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    keyTab="C:\\principal.keytab"
    principal="principal@REALM";
};

I am on Cloudera CDH version 6.3.2. I downloaded the HBase client configs (ssl-client.xml, hdfs-site.xml, hbase-site.xml, core-site.xml) from Cloudera Manager and added them under the resources folder in IntelliJ.

I also set the VM options for jaas.conf in IntelliJ, under Run/Debug Configurations -> Application -> Build and run -> VM Options.

When the application starts, I can see that the jaas.conf file is being picked up.

Even though the debug output says it is going to use the keytab, I still get the error above.

My Cloudera version: 6.3.2

Scala version: 2.11.12

HBase Client: 2.1.0

Does anyone have any ideas? Is the Java authentication system not picking up the Kerberos ticket by itself when I provide jaas.conf?

ANSWER

Answered 2022-Feb-11 at 14:32

You will get this error message when JAAS cannot access the Kerberos keytab.

Can you check for user permission issues? Log in as the user that will run the code and do a kinit. What error message do you get? (Resolve the permission issue I'm suggesting you have.)

You seem to rule out a path issue, and you seem to have the correct '\\'.
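As a quick sanity check from code (a sketch, not part of the original answer), the keytab login can be performed explicitly and verified before touching HBase; UserGroupInformation is Hadoop's standard API for this, and the principal/keytab values are the ones from the question:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.security.UserGroupInformation;

public class KerberosLoginCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        // Log in explicitly from the keytab instead of relying on a kinit ticket cache
        UserGroupInformation.setConfiguration(conf);
        UserGroupInformation.loginUserFromKeytab("principal@REALM", "C:\\principal.keytab");

        UserGroupInformation ugi = UserGroupInformation.getLoginUser();
        System.out.println("Logged in as: " + ugi.getUserName());
        System.out.println("Has Kerberos credentials: " + ugi.hasKerberosCredentials());
    }
}

If hasKerberosCredentials() prints false, the keytab was not readable or the principal does not match, which points back at the permission check above.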

Source https://stackoverflow.com/questions/71048452

Community Discussions and Code Snippets contain sources that include the Stack Exchange Network.

Vulnerabilities

No vulnerabilities reported

Install hbase

You can download it from GitHub or Maven.
You can use hbase like any standard Java library: include the jar files in your classpath, and run and debug the hbase component from any IDE as you would any other Java program. Best practice is to use a build tool that supports dependency management, such as Maven or Gradle. For Maven installation, refer to maven.apache.org; for Gradle installation, refer to gradle.org.
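For example, the client artifact is published on Maven Central as org.apache.hbase:hbase-client; assuming the version follows the release tag above (rel/2.4.11), the coordinate would be org.apache.hbase:hbase-client:2.4.11.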

Support

For any new features, suggestions, and bugs, create an issue on GitHub. If you have any questions, check and ask on Stack Overflow.


© 2022 Open Weaver Inc.