Randstad USA
Software Engineer IV
Randstad USA, Charlotte, North Carolina, United States, 28245
job summary:
Description:
Client's Enterprise Data Products Team is seeking a Subject Matter Expert to help accelerate our deployment of Raw and Derived data products to AWS data lakes. Candidates should be experts in the field and able to demonstrate experience building and publishing the data structures we are starting to produce, helping our engineers overcome obstacles and avoid pitfalls. They should also be capable of helping us accelerate our rate of production through optimization and automation using Terraform Enterprise scripts and optimized AWS, Apache, and other tool configurations and architectural design. Candidates should also be experienced and flexible with changing demands while working in an Agile development environment.
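For illustration only, a minimal Python (boto3) sketch of the kind of underlying AWS calls this work involves when publishing a raw data product location to a Lake Formation-governed lake; the bucket ARN and database name below are hypothetical, and in practice this provisioning would be automated through Terraform Enterprise rather than run by hand:

import boto3

# Hypothetical names for illustration; real values would follow the team's standards.
DATA_LAKE_BUCKET_ARN = "arn:aws:s3:::example-raw-data-lake"
RAW_DATABASE = "raw_data_products"

lakeformation = boto3.client("lakeformation")
glue = boto3.client("glue")

# Register the S3 location so Lake Formation can govern access to it.
lakeformation.register_resource(
    ResourceArn=DATA_LAKE_BUCKET_ARN,
    UseServiceLinkedRole=True,
)

# Create a Glue Catalog database to hold the raw-layer table definitions.
glue.create_database(
    DatabaseInput={
        "Name": RAW_DATABASE,
        "Description": "Raw data products published to the data lake (illustrative).",
    }
)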
Specifically, we are looking for individuals who have at least 5+ years of experience in a Data Engineering role and who can serve as a source of knowledge for our existing data engineers. Experience with the technologies below is a must-have:
AWS Tech Stack
AWS Lake Formation
AWS EMR
Apache Hudi
Flink
Aurora
PostgreSQL
AWS Glue (Glue Catalog, Glue ETL and Crawler)
AWS Athena
Redshift
SageMaker/ML
AWS Lambda
DynamoDB
RDS (Relational Database Service)
AWS S3: good foundational concepts such as object store vs. block store, encryption/decryption, storage tiers, etc.
Implementation and tuning experience with streaming use cases and the big data ecosystem (such as EMR, Hadoop, Spark, Hudi, Kafka/Kinesis, etc.); see the sketch after this list
Should be able to build scalable data infrastructure and understand distributed systems concepts from a data storage and compute perspective.
Have a good understanding of Data Lake and Data Warehouse concepts, and be familiar with Modern Data Architecture concepts.
Ability to define standards and guidelines with an understanding of various compliance and auditing needs.
Be proficient in Python or Java for large-volume data processing.
Terraform Enterprise
Kafka
Qlik Replicate
Additional helpful experience would include:
AWS services: CloudTrail, SNS, SQS, CloudWatch, Step Functions
Experience with secrets management platforms such as Vault and AWS Secrets Manager
Experience with DevOps pipelines (CI/CD): Bitbucket, Concourse
Experience with RDBMS platforms and good proficiency with SQL
Knowledge of IAM roles and Policies
Experience with Event Driven Architecture
Experience with native AWS technologies for data and analytics such as Kinesis, OpenSearch
Databases: DocumentDB, MongoDB
Hadoop platform (Hive, HBase, Druid)
Java, Scala, Node.js
Workflow Automation
Experience transitioning on-premises big data platforms to cloud-based platforms such as AWS
Background in Kubernetes, distributed systems, microservice architecture, and containers
Experience with REST APIs and API Gateway
Deep understanding of networking: DNS, TCP/IP, and VPN
Kafka Schema Registry
Confluent Avro
Linux and Bash Scripting
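As referenced in the streaming and big data item above, the following minimal PySpark sketch shows the kind of Hudi upsert a raw data product pipeline on EMR might perform; the table name, columns, S3 path, and option values are hypothetical and would differ in a real pipeline (on EMR the Hudi jars are typically provided by the platform):

from pyspark.sql import SparkSession

# Spark session configured for Hudi (Kryo serialization is recommended by Hudi).
spark = (
    SparkSession.builder
    .appName("hudi-upsert-sketch")
    .config("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
    .config("spark.sql.extensions", "org.apache.spark.sql.hudi.HoodieSparkSessionExtension")
    .getOrCreate()
)

# Hypothetical batch of records to upsert into the raw layer.
records = spark.createDataFrame(
    [("1", "2024-01-01", "alice"), ("2", "2024-01-01", "bob")],
    ["record_id", "event_date", "name"],
)

hudi_options = {
    "hoodie.table.name": "raw_events",                        # hypothetical table name
    "hoodie.datasource.write.recordkey.field": "record_id",
    "hoodie.datasource.write.partitionpath.field": "event_date",
    "hoodie.datasource.write.precombine.field": "event_date",
    "hoodie.datasource.write.operation": "upsert",
    "hoodie.datasource.hive_sync.enable": "true",             # sync table metadata to the Glue Catalog
    "hoodie.datasource.hive_sync.mode": "hms",
}

(
    records.write
    .format("hudi")
    .options(**hudi_options)
    .mode("append")
    .save("s3://example-data-lake/raw/raw_events")            # hypothetical bucket/prefix
)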
location: Charlotte, North Carolina
job type: Contract
salary: $71.32 - 81.32 per hour
work hours: 8am to 5pm
education: Bachelors
responsibilities:
Provide technical direction, guide the team on key technical aspects, and own product tech delivery
Lead the design, build, test, and deployment of components in collaboration with Lead Developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead)
Understand requirements/use cases to outline technical scope and lead delivery of the technical solution
Work closely with the Product Owner to align on delivery goals and timing
Collaborate with Data and Solution Architects on key technical decisions
Lead the design and implementation of data quality check methods (see the sketch after this list)
Ensure test-driven development
Interact effectively with business stakeholders
Ensure data security and permissions solutions, including data encryption, user access controls, and logging
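As referenced in the data quality item above, a minimal sketch (assuming PySpark, with hypothetical table path and column names) of the sort of checks that might gate a derived data product before it is published:

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks-sketch").getOrCreate()

# Hypothetical derived-layer table; the path and column names are illustrative.
df = spark.read.format("hudi").load("s3://example-data-lake/derived/customer_orders")

# Rule 1: the record key must never be null.
null_keys = df.filter(F.col("order_id").isNull()).count()

# Rule 2: the record key must be unique within the snapshot.
duplicate_keys = df.groupBy("order_id").count().filter(F.col("count") > 1).count()

checks = {"null_order_ids": null_keys, "duplicate_order_ids": duplicate_keys}
failed = {name: value for name, value in checks.items() if value > 0}

if failed:
    # Raising here lets an orchestrator (e.g. Step Functions or a CI/CD pipeline) halt publication.
    raise ValueError(f"Data quality checks failed: {failed}")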
qualifications:
Experience level: Experienced
Education: Bachelors
skills:
Professional Engineer
Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.
At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact HRsupport@randstadusa.com.
Pay offered to a successful candidate will be based on several factors including the candidate's education, work experience, work location, specific job duties, certifications, etc.
In addition, Randstad Digital offers a comprehensive benefits package, including health, an incentive and recognition program, and 401K contribution (all benefits are based on eligibility).
This posting is open for thirty (30) days.