Author Archives: shalishvj : My Experience with BigData

About shalishvj : My Experience with BigData

6+ years of experience using Big Data technologies in Architect, Developer and Administrator roles for various clients.

• Experience using Hortonworks, Cloudera, AWS distributions.
• Cloudera Certified Developer for Hadoop.
• Cloudera Certified Administrator for Hadoop.
• Spark certification from Big Data Spark Foundations.
• SCJP, OCWCD.
• Experience in setting up Hadoop clusters in PROD, DR, UAT, DEV environments.

Tips: Hive

Mask a Column. Create a table and insert values into it: CREATE TABLE IF NOT EXISTS employee_test1 (eid String, name String) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t' LINES TERMINATED BY '\n' STORED AS TEXTFILE; INSERT INTO TABLE employee_test1 VALUES … Continue reading
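The truncated excerpt above creates and populates a test table. A minimal sketch of the masking step, assuming Hive 2.1+ where the built-in mask() UDF is available (the inserted values are illustrative, not from the original post):

```sql
-- Populate the table with illustrative rows
INSERT INTO TABLE employee_test1 VALUES ('121', 'John Smith'), ('122', 'Jane Doe');

-- mask() replaces uppercase letters with 'X', lowercase with 'x',
-- and digits with 'n' by default (built-in since Hive 2.1)
SELECT eid, mask(name) AS masked_name FROM employee_test1;
-- e.g. 'John Smith' becomes 'Xxxx Xxxxx'
```

Related built-ins such as mask_first_n and mask_show_last_n allow partial masking when only part of the column is sensitive.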

Posted in Tips, Uncategorized

Hadoop Cluster : Run Command on ALL Nodes

It's usually tedious to run a command on all nodes of a Hadoop cluster. Here is a script to do that: run_command #!/bin/bash TPUT='tput -T xterm-color' txtund=$(${TPUT} sgr 0 1) # Underline txtbld=$(${TPUT} bold) # Bold txtrst=$(${TPUT} sgr0) # Reset … Continue reading
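The core of such a script is a loop over a host list with ssh. A minimal sketch, assuming passwordless SSH and a hosts file with one hostname per line (the file path, host names, and the SSH_CMD override hook are all assumptions, not from the original script):

```shell
#!/bin/bash
# Sketch of a "run everywhere" helper. SSH_CMD can be overridden
# (e.g. SSH_CMD=echo for a dry run that prints instead of connecting).
SSH_CMD="${SSH_CMD:-ssh -o BatchMode=yes}"
HOSTS_FILE="${HOSTS_FILE:-hosts.txt}"   # hypothetical: one hostname per line

run_on_all() {
  local cmd="$*"
  while read -r host; do
    [ -z "$host" ] && continue          # skip blank lines
    printf '=== %s ===\n' "$host"
    $SSH_CMD "$host" "$cmd"             # BatchMode fails fast, never prompts
  done < "$HOSTS_FILE"
}

# Dry-run demo: substitute echo for ssh so nothing connects anywhere.
demo_hosts=$(mktemp)
printf 'node1\nnode2\n' > "$demo_hosts"
SSH_CMD=echo HOSTS_FILE="$demo_hosts" run_on_all "df -h /"
rm -f "$demo_hosts"
```

For larger clusters, purpose-built tools such as pdsh exist for the same job, but a plain loop like this needs nothing installed on the nodes.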

Posted in Hadoop Cluster Administration, Hadoop Cluster Installation, Uncategorized, Unix

Some Curl Commands for BigData

Writing to HDFS: curl -i -X PUT -T $file -L "http://$namenode:50070/webhdfs/v1//$file?op=CREATE&$user" Reading from HDFS: curl -i -X GET "http://$namenode:50070/webhdfs/v1//$file?op=OPEN" In a kerberized environment (writing to HDFS): curl --negotiate -ku : -X PUT $file "http://:50070/webhdfs/v1//$file?op=CREATE&" OR curl -iku $userName:$password -L -T … Continue reading
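A sketch that fills in the shell variables those commands rely on, so the WebHDFS URL shape is visible end to end (the hostname, user, and path below are illustrative assumptions, not values from the post):

```shell
#!/bin/bash
# Build the WebHDFS URLs used by the curl commands above.
namenode="nn.example.com"            # hypothetical NameNode host
user="user.name=hdfs"                # simple (non-Kerberos) auth query parameter
file="report.csv"

write_url="http://${namenode}:50070/webhdfs/v1/tmp/${file}?op=CREATE&${user}"
read_url="http://${namenode}:50070/webhdfs/v1/tmp/${file}?op=OPEN&${user}"

# -L matters for writes: the NameNode answers op=CREATE with a 307
# redirect to the DataNode that should receive the data, and curl
# must follow that redirect for the upload to happen.
echo "curl -i -X PUT -T ${file} -L \"${write_url}\""
echo "curl -i -X GET -L \"${read_url}\""
```

Port 50070 is the classic HTTP port for the NameNode web UI; on newer Hadoop 3.x clusters the default moved to 9870, so check your cluster's configuration.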

Posted in Rest API, Uncategorized, webhdfs

Truststore & Keystore

In an SSL handshake, the TrustStore is used to verify credentials: it stores certificates of the third parties the Java application communicates with, or certificates signed by a CA (a certificate authority such as Verisign) which can be used to identify those third parties. The KeyStore is used to provide credentials: it stores the private … Continue reading

Posted in SSL, Uncategorized

Integrating Kafka and Storm

Intro: This article focuses on integrating Storm and Kafka. Data is not encrypted in this case. Create a client_jaas file (under /usr/hdp/current/storm-client/conf/): KafkaClient { required useTicketCache=true renewTicket=true serviceName="kafka"; }; Client { required useTicketCache=true renewTicket=false serviceName="zk"; }; StormClient { required … Continue reading
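The excerpt drops the login-module class name before each "required". In a kerberized HDP setup the usual choice is the JDK's Krb5LoginModule, so a typical client_jaas file looks like the following sketch (the module class and the StormClient serviceName of "nimbus" are assumptions based on common HDP configurations; verify against your distribution's docs):

```
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useTicketCache=true
    renewTicket=true
    serviceName="kafka";
};
Client {
    com.sun.security.auth.module.Krb5LoginModule required
    useTicketCache=true
    renewTicket=false
    serviceName="zk";
};
StormClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useTicketCache=true
    serviceName="nimbus";
};
```

useTicketCache=true tells the module to pick up an existing Kerberos ticket (from kinit) instead of prompting for credentials.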

Posted in Kafka, Storm

Kafka Producers and Consumers (Console / Java) using SASL_SSL

Intro: Producers / consumers send messages to / receive messages from Kafka. SASL is used to provide authentication and SSL for encryption. JAAS config files are used to read the Kerberos ticket and authenticate as part of SASL … Continue reading
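On the client side this boils down to a small properties file plus a JAAS file. A sketch of the properties, assuming a SASL_SSL listener on port 9093 and HDP-style paths (every path, host, and password here is illustrative):

```
# client.properties (illustrative values)
security.protocol=SASL_SSL
sasl.kerberos.service.name=kafka
ssl.truststore.location=/etc/security/client.truststore.jks
ssl.truststore.password=changeit
```

The console tools then pick up the JAAS file through KAFKA_OPTS, e.g. `export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/kafka_client_jaas.conf"`, followed by `kafka-console-producer.sh --broker-list broker1:9093 --topic test --producer.config client.properties` (flag names vary across Kafka versions; newer releases use --bootstrap-server).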

Posted in Kafka

Apache Storm

Architecture / Components: Nimbus and Supervisor daemons are designed to be fail-fast (the process self-destructs whenever any unexpected situation is encountered) and stateless (all state is kept in ZooKeeper or on disk). Nimbus and Supervisor daemons must be run under supervision using … Continue reading
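Because the daemons are fail-fast by design, something external must restart them whenever they exit. One common choice is supervisord; a minimal sketch (file path and program names are illustrative, and other tools like daemontools or monit work the same way):

```
; /etc/supervisord.d/storm.ini (illustrative)
[program:storm-nimbus]
command=storm nimbus
autostart=true
autorestart=true        ; restart the fail-fast daemon whenever it dies

[program:storm-supervisor]
command=storm supervisor
autostart=true
autorestart=true
```

Since all state lives in ZooKeeper or on disk, a restarted Nimbus or Supervisor simply picks up where the dead process left off.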

Posted in Storm, Uncategorized