Unisoft Technology Inc.
Hadoop Developer / Admin Jr
Unisoft Technology Inc., Gaithersburg, Maryland, US 20883
Unisoft is an EEO company that offers rewarding careers for people seeking a challenge in technology. We are always looking for talented and creative individuals to join our high-performance team to meet growing market demand. We would like to hear from you if you are pursuing a long-term career. We offer competitive salaries, excellent healthcare benefits, and an attractive vacation package. If you want to join an excellent working environment, please send your resume to: resumes@unisofTechinc.com
Minimum Requirements:
- Four (4) years of product administration experience in RHEL Linux-based environments.
- At least three (3) years of experience administering cloud-based multi-user environments.
- A minimum of four (4) years of related experience as a Hadoop administrator, with knowledge of Cloudera Hadoop components such as HDFS, Sentry, HBase, Impala, Hue, Spark, Hive, Kafka, YARN, and ZooKeeper.
- Experience importing and exporting data between HDFS and relational database systems using Sqoop.
- Experience developing Spark code to read data/messages from Kafka topics and update/insert/delete records in corresponding databases such as Splice Machine, PostgreSQL, and MSSQL.
- Experience developing shell/Python scripts to transform data in HDFS.
- Experience managing and deploying HBase, and translating, loading, and presenting disparate data sets in various formats and from sources such as JSON, text files, Kafka queues, and log data.
- Experience fine-tuning applications and systems for high performance and higher-volume throughput.
- Prior Hadoop cluster deployment experience, including adding and removing nodes.
- Hands-on experience with Cloudera and working with data delivery teams to set up new Hadoop users.