The Dignify Solutions LLC
Hadoop Developer (Cloudera, Terraform) - (Fulltime)
The Dignify Solutions LLC, Charlotte, North Carolina, United States, 28245
- 8+ years of overall IT experience, with more than 4 years of experience in Hadoop/Big Data technologies (HDFS, Sqoop, Hive, Pig, Spark, Impala, Oozie, etc.)
- Strong hands-on experience working with Cloudera's Distribution including Apache Hadoop (CDH) and the Cloudera Data Platform (CDP), including Kerberos/Active Directory integration and upgrading and migrating to new CDP versions
- Strong experience integrating the current Hadoop environment with Active Directory and Kerberos for authorization and authentication
- Expertise in SQL and performance tuning; batch and distributed computing using ETL/ELT (Spark, SQL Server DWH, Teradata, etc.)
- Experience with NoSQL databases such as HBase, Solr, etc.
- At least one end-to-end implementation of a data processing pipeline using the Hadoop (CDP) ecosystem, a data lake, and a data warehouse
- Ability to review technical deliverables and to mentor and drive technical teams to deliver quality products
- Ability to perform data profiling, data quality assessment, and business-rule validation, incorporating validation and verification mechanisms so that data quality meets a high standard (see the sketch after this list)
- Ability to quickly understand technical and business requirements and translate them into technical implementations, including integration of new data sources and tools
- Strong object-oriented design and analysis skills
- Ownership of product features from development and testing through production deployment, along with excellent written and verbal communication skills
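As a rough illustration of the business-rule validation work described above, the following is a minimal PySpark sketch of a data-quality gate in a batch pipeline. The dataset, column names, paths, and rules (`customer_id`, `email`, the landing/curated locations) are hypothetical examples, not part of the role description.

```python
# A minimal sketch, assuming a hypothetical `customers` extract with
# `customer_id` and `email` columns; paths and rules are illustrative only.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-gate-sketch").getOrCreate()

# Read the raw landing-zone extract (placeholder path).
raw = spark.read.parquet("/data/landing/customers")

# Business rules: the key must be present, and emails must look well-formed.
metrics = raw.select(
    F.count("*").alias("total_rows"),
    F.count(F.when(F.col("customer_id").isNull(), 1)).alias("null_keys"),
    F.count(
        F.when(
            F.col("email").isNull()
            | ~F.col("email").rlike(r"^[^@]+@[^@]+\.[^@]+$"),
            1,
        )
    ).alias("bad_emails"),
).first()

# Gate the batch: fail loudly on violations, otherwise promote the data
# from the landing zone to the curated zone.
if metrics["null_keys"] > 0 or metrics["bad_emails"] > 0:
    raise ValueError(f"Data-quality gate failed: {metrics.asDict()}")

raw.write.mode("overwrite").parquet("/data/curated/customers")
```

In a production pipeline this kind of gate would typically be scheduled by Oozie and write its metrics to an audit table rather than only raising an exception, but the pattern of validate-then-promote is the same.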