Bridge Technologies and Solutions
Sr. Hadoop Developer
Bridge Technologies and Solutions, Beaverton, Oregon, US 97078
Typically requires a Bachelor's degree and a minimum of 5 years of directly relevant work experience.
The client is building a big data platform in Consumer Digital on a Hadoop Distributed File System (HDFS) cluster. As a Sr. Hadoop Developer, you will work with a variety of talented client teammates and be a driving force for building solutions. You will work on development projects related to commerce and web analytics.
Responsibilities:
• Design and implement MapReduce jobs to support distributed processing using Java, Cascading, Python, Hive, and Pig; design and implement end-to-end solutions.
• Build libraries, user-defined functions, and frameworks around Hadoop.
• Research, evaluate, and utilize new technologies, tools, and frameworks around the Hadoop ecosystem.
• Develop user-defined functions to provide custom Hive and Pig capabilities (see the brief sketch after this list).
• Define and build data acquisition and consumption strategies.
• Define and develop best practices.
• Work with support teams to resolve operational and performance issues.
• Work with architecture/engineering leads and other teams on capacity planning.
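For reference on the user-defined function work described above, the following is a minimal, illustrative sketch of a simple Hive UDF in Java. It uses Hive's classic UDF interface; the class name NormalizeSku and its behavior are hypothetical examples, not part of the client's codebase.

    // Minimal illustrative Hive UDF (hypothetical example).
    import org.apache.hadoop.hive.ql.exec.UDF;
    import org.apache.hadoop.io.Text;

    public class NormalizeSku extends UDF {
        // Hive invokes evaluate() once per input row; null in, null out.
        public Text evaluate(Text sku) {
            if (sku == null) {
                return null;
            }
            return new Text(sku.toString().trim().toUpperCase());
        }
    }

A Pig UDF follows a similar pattern, extending org.apache.pig.EvalFunc and implementing exec().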
Qualifications:
• MS/BS degree in a computer science field or related discipline.
• 6+ years of experience in large-scale software development.
• 1+ year of experience with Hadoop.
• Strong Java programming, shell scripting, Python, and SQL skills.
• Strong development skills around Hadoop, MapReduce, Hive, Pig, and Impala.
• Strong understanding of Hadoop internals.
• Good understanding of Avro, JSON, and other serialization and compression formats.
• Experience with build tools such as Maven.
• Experience with databases such as Oracle.
• Experience with performance/scalability tuning, algorithms, and computational complexity.
• Experience with (or at least familiarity with) data warehousing, dimensional modeling, and ETL development.
• Ability to understand ERDs and relational database schemas.
• Proven ability to work with cross-functional teams to deliver appropriate resolutions.
Nice to have:
• Experience with open-source NoSQL technologies such as HBase and Cassandra.
• Experience with messaging and complex event processing systems such as Kafka and Storm.
• Experience with machine learning frameworks.
• Experience with statistical analysis using Python, R, or similar.
Additional Information
All your information will be kept confidential according to EEO guidelines.