Spark Java access remote HDFS

Suppose we need to work with a different HDFS (clusterB, for instance) from our Spark Java application running on clusterA. Firstly, you need to add a --conf key to your run command; the exact key depends on the Spark version: Secondly, when creating Spark’s Java context, add this: You need to go to clusterB and gather core-site.xml and hdfs-site.xml from there (the default location for Cloudera is /etc/hadoop/conf) […]
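
A minimal sketch of the context part, assuming clusterB’s core-site.xml and hdfs-site.xml have already been copied next to the application on clusterA, might look like this (file locations are placeholders):

```java
import org.apache.hadoop.fs.Path;
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;

public class RemoteHdfsSparkSketch {
    public static void main(String[] args) {
        SparkConf sparkConf = new SparkConf().setAppName("remote-hdfs-example");
        JavaSparkContext jsc = new JavaSparkContext(sparkConf);

        // Register clusterB's configuration files with the Hadoop configuration
        // that this Spark context uses, so HDFS paths resolve against clusterB.
        jsc.hadoopConfiguration().addResource(new Path("/path/to/clusterB/core-site.xml"));
        jsc.hadoopConfiguration().addResource(new Path("/path/to/clusterB/hdfs-site.xml"));

        // ... your Spark job ...

        jsc.stop();
    }
}
```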

READ MORE

Java access remote HDFS from current Hadoop cluster

Suppose we have our Java app running on Hadoop clusterA, and we want to access a remote HDFS backed by Hadoop clusterB. Let’s see how we can do it: You need to go to clusterB, gather core-site.xml and hdfs-site.xml from there (the default location for Cloudera is /etc/hadoop/conf), and put them next to your app running on clusterA. […]
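
A rough, self-contained sketch of the idea, with illustrative paths, could be:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class RemoteHdfsSketch {
    public static void main(String[] args) throws Exception {
        // Load clusterB's configuration files that were copied next to the app on clusterA.
        Configuration conf = new Configuration();
        conf.addResource(new Path("/path/to/clusterB/core-site.xml"));
        conf.addResource(new Path("/path/to/clusterB/hdfs-site.xml"));

        // fs.defaultFS now comes from clusterB, so this handle points at the remote HDFS.
        try (FileSystem fs = FileSystem.get(conf)) {
            for (FileStatus status : fs.listStatus(new Path("/"))) {
                System.out.println(status.getPath());
            }
        }
    }
}
```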

READ MORE

Kafka broker Kerberos

Let’s see how we can configure Kerberos between a Kafka broker and a Kafka client on the server side. The client side is covered here: https://mchesnavsky.tech/how-to-create-kafka-kerberos-java-consumer. <kafka_home>/conf/server.properties <kafka_home>/bin/kafka-run-class.sh Insert this: To KAFKA_OPTS: Result: /your/path/to/kafka_server_jaas.conf Kerberos between Kafka brokers is configured with separate conf keys (which we do not cover in this article). The configuration above is for broker-client interaction.
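
To give a rough idea of how those pieces fit together, here is a hedged sketch of a broker-client Kerberos (SASL/GSSAPI) setup; the property names are standard Kafka settings, while hostnames, keytab paths and principals are placeholders:

```
# <kafka_home>/conf/server.properties (client-facing listener)
listeners=SASL_PLAINTEXT://broker-host:9092
sasl.enabled.mechanisms=GSSAPI
sasl.kerberos.service.name=kafka
```

The JAAS file is passed to the broker JVM through KAFKA_OPTS in kafka-run-class.sh, e.g. -Djava.security.auth.login.config=/your/path/to/kafka_server_jaas.conf:

```
KafkaServer {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/path/to/kafka.keytab"
    principal="kafka/broker-host@EXAMPLE.COM";
};
```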

READ MORE

How to create Kafka Kerberos Java consumer

Suppose that you need to create a Kafka Java consumer with Kerberos. The code will be: You don’t need to specify the java.security.auth.login.config Java property, because we set the SaslConfigs.SASL_JAAS_CONFIG property directly on the consumer. You just need to make the changes in the kafkaJaasConfiguration() method that are necessary for your Kerberos configuration.
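
A self-contained sketch of such a consumer, with placeholder broker address, topic, principal and keytab path, might look like this:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.CommonClientConfigs;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.serialization.StringDeserializer;

public class KerberosConsumerSketch {

    // Builds an in-line JAAS configuration; adjust the keytab and principal
    // to match your Kerberos setup.
    private static String kafkaJaasConfiguration() {
        return "com.sun.security.auth.module.Krb5LoginModule required "
                + "useKeyTab=true storeKey=true "
                + "keyTab=\"/path/to/client.keytab\" "
                + "principal=\"client@EXAMPLE.COM\";";
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker-host:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "example-group");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        // Kerberos (SASL/GSSAPI) settings; the JAAS config is set directly on the
        // consumer, so the java.security.auth.login.config property is not needed.
        props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_PLAINTEXT");
        props.put(SaslConfigs.SASL_MECHANISM, "GSSAPI");
        props.put(SaslConfigs.SASL_KERBEROS_SERVICE_NAME, "kafka");
        props.put(SaslConfigs.SASL_JAAS_CONFIG, kafkaJaasConfiguration());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("example-topic"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("%s -> %s%n", record.key(), record.value());
            }
        }
    }
}
```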

READ MORE

Gmavenplus: Unrecognized target bytecode

When you try to compile Groovy together with Java and use the gmavenplus plugin for that purpose, you may face these errors: To solve this problem, you need to replace ‘8’ with ‘1.8’ in your maven-compiler-plugin: If you don’t have maven-compiler-plugin, add it to your project’s pom.xml, as above.
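
For reference, a maven-compiler-plugin entry with the corrected values might look roughly like this (the plugin version is illustrative):

```xml
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-compiler-plugin</artifactId>
    <version>3.8.1</version>
    <configuration>
        <!-- use the dotted 1.8 form rather than 8 -->
        <source>1.8</source>
        <target>1.8</target>
    </configuration>
</plugin>
```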

READ MORE

How to enable Ignite metrics

There are a lot of pre-configured metrics in Ignite that are disabled by default. If you want to see them, you can pass an instance of LogExporterSpi to IgniteConfiguration like this: You can work with Ignite metrics in a number of ways, not only through the logging system, for example: JMX, SQL views, OpenCensus. Please refer to […]
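
A minimal sketch, assuming Ignite 2.8 or newer where LogExporterSpi is available, could look like this:

```java
import org.apache.ignite.Ignite;
import org.apache.ignite.Ignition;
import org.apache.ignite.configuration.IgniteConfiguration;
import org.apache.ignite.spi.metric.log.LogExporterSpi;

public class IgniteMetricsSketch {
    public static void main(String[] args) {
        // Export metrics to the log once per minute (period is in milliseconds).
        LogExporterSpi logExporter = new LogExporterSpi();
        logExporter.setPeriod(60_000);

        IgniteConfiguration cfg = new IgniteConfiguration();
        cfg.setMetricExporterSpi(logExporter);

        try (Ignite ignite = Ignition.start(cfg)) {
            // The node now periodically writes its metrics to the configured logger.
        }
    }
}
```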

READ MORE