Loading the input file (test.tsv, comma-separated despite the extension) into HDFS:
hadoop fs -put test.tsv /tmp/
hadoop fs -ls /tmp/
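For reference, a minimal input file matching the column spec `HBASE_ROW_KEY,cf1:c1,cf1:c2` used below might look like this (row keys and values are made up for illustration; the fields are comma-separated, which is why ImportTsv is run with `-Dimporttsv.separator=","`):

```shell
# Hypothetical contents for test.tsv: the first field becomes the row key,
# the next two become columns cf1:c1 and cf1:c2 in the target table.
cat > test.tsv <<'EOF'
row1,alpha,one
row2,beta,two
row3,gamma,three
EOF
wc -l test.tsv
```

If the target table does not exist yet, it can be created first in the hbase shell, e.g. `create 't1', 'cf1'`.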
1. BULK LOADING
a) Preparing StoreFiles (HFiles):
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.columns="HBASE_ROW_KEY,cf1:c1,cf1:c2" -Dimporttsv.separator="," -Dimporttsv.bulk.output="/tmp/hbaseoutput" t1 /tmp/test.tsv
b) Upload the data from the HFiles located at /tmp/hbaseoutput to the HBase table t1
hbase org.apache.hadoop.hbase.mapreduce.LoadIncrementalHFiles /tmp/hbaseoutput t1
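After the bulk load, a quick sanity check is to compare the table's row count against the number of input lines. The snippet below computes the expected count locally from a hypothetical stand-in file (`/tmp/sample_input.tsv` is an assumed name, not from the steps above); on the cluster, the live count would come from the hbase shell:

```shell
# Stand-in input file (hypothetical) so the expected row count can be computed.
printf 'row1,a,b\nrow2,c,d\nrow3,e,f\n' > /tmp/sample_input.tsv
EXPECTED_ROWS=$(wc -l < /tmp/sample_input.tsv)
echo "expected rows: $EXPECTED_ROWS"

# On the cluster, compare against the live count (requires a running HBase):
#   echo "count 't1'" | hbase shell
# Note: LoadIncrementalHFiles moves the HFiles out of /tmp/hbaseoutput as it
# loads them, so the staging directory should be empty afterwards.
```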
2. NON-BULK LOADING
Upload the data in TSV format from HDFS into HBase via Puts:
hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.columns="HBASE_ROW_KEY,cf1:c1,cf1:c2" t1 /tmp/test.tsv