Career Guidant Inc.
Hadoop Developer
Career Guidant Inc., Tampa, Florida, US 33646
Career Guidant is an internationally acclaimed, trusted, multi-faceted organization with Information Technology Custom Learning Services for Enterprises, Lateral Staffing Solutions, Information Technology Development & Consulting, Infrastructure & Facility Management Services, and Technical Content Development as its core competencies. Our experienced professionals bring a wealth of industry knowledge to each client and operate in a manner that produces superior quality and outstanding results. Career Guidant's proven and tested methodologies keep client satisfaction as the primary objective. We are committed to our core values of Client Satisfaction, Professionalism, Teamwork, Respect, and Integrity. Career Guidant, with its large network of delivery centres, support offices, and partners across India, Asia Pacific, the Middle East, the Far East, Europe, and the USA, is committed to rendering the best service to our clients to ensure their operations continue to run smoothly.

Job Description
Preferred
At least 2 years of experience in the Big Data space.
Strong Hadoop skills – MapReduce/Hive/Pig/Sqoop/Oozie (must).
Hands-on experience with Java, APIs, and Spring (must).
Good exposure to columnar NoSQL databases such as HBase.
End-to-end delivery experience on complex, high-volume, high-velocity projects.
Good experience with at least one scripting language such as Scala or Python.
Good exposure to Big Data architectures.
Experience building frameworks on Hadoop.
Very good understanding of the Big Data ecosystem.
Experience sizing and estimating large-scale Big Data projects.
Good database knowledge with SQL tuning experience.
Experience with Impala.
Good exposure to Splunk.
Good exposure to Kafka.
Experience with the Apache Parquet data format.
Past experience and exposure to ETL and data warehouse projects.
Experience with Spark and Flume.
Cloudera/Hortonworks certified.
Experience with, and desire to work in, a global delivery environment.

The job entails sitting as well as working at a computer for extended periods of time. Should be able to communicate by telephone, email, or face to face. Travel may be required as per the job requirements.

Qualifications
Basic
Bachelor’s degree or foreign equivalent required from an accredited institution. Will also consider three years of progressive experience in the specialty in lieu of every year of education.
At least 4 years of experience with Information Technology.