Randstad
data engineer iv
Charlotte, North Carolina, United States, 28245
posted 2 days ago
job details
summary
$74.45 - $84.45 per hour
contract
bachelor degree
category: computer and mathematical occupations
reference: 1071022
job summary:
Description:
Notes from Engagement Manager:
Candidates MUST be able to code
Candidates should be "consultants."
Architects/Managers are not the right fit. If a candidate's current or previous role is an architect role, they will not be a fit for this position.
Must be able to discuss technical topics with a nontechnical audience. Candidates MUST be motivated and good communicators.
Skills:
Data platforms on AWS. Data Pipelines and Data Lakes.
Candidates in this role should have the ability to develop both Data Lakes AND Data Pipelines; please highlight that on the resumes. Candidates without the must-haves will not be shortlisted.
Data Engineers who are good coders/analysts.
The client's Enterprise Data Platforms Team is seeking a Subject Matter Expert to help develop the client's Data Fabric as an interconnected network of data capabilities and data products composed to deliver data at speed and scale.
Candidates should be experts in the field and be able to illustrate experience with skills related to developing and building data platforms to help our team overcome obstacles and avoid pitfalls.
They should also be capable of helping us accelerate our rate of production through optimization and automation using Terraform Enterprise scripts and optimized AWS, Apache, and other tool configurations and architectural design.
They should also be experienced and flexible in meeting changing demands while working in an Agile development environment.
Specifically, we are looking for individuals who can show at least 5 years of experience in a Data Engineering or Software Engineering role and who can be a source of knowledge for our existing engineers.
Must have experience with the tech stack below:
Git
AWS
IAM
API Gateway
Lambda
Step Functions
Lake Formation
EKS
Glue: Catalog, ETL, Crawler
Athena
S3 (good foundational concepts such as object storage vs. block storage, encryption/decryption, storage tiers, etc.)
Apache Hudi
Flink
PostgreSQL
SQL
RDS (Relational Database Service)
Python
Java
Terraform Enterprise
Must be able to explain what TF is used for
Understand and explain basic principles (e.g. modules, providers, functions)
Must be able to write and debug TF
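As a rough illustration of the Terraform basics listed above (providers, modules, built-in functions), a candidate should be comfortable reading and writing a configuration along these lines. The module path, bucket name, and region here are hypothetical, not taken from the posting:

```hcl
# Provider block: configures the AWS plugin Terraform uses to create resources.
provider "aws" {
  region = "us-east-1" # illustrative region
}

# Module block: reuses a packaged set of resources; "./modules/s3-bucket"
# is a hypothetical local module, not a real path from this posting.
module "raw_zone" {
  source      = "./modules/s3-bucket"
  # Built-in function: lower() normalizes the bucket name at plan time.
  bucket_name = lower("Example-Data-Lake-Raw")
}
```

Debugging at this level typically means running `terraform validate` and `terraform plan` and reading the resulting diff.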
Additional helpful experience would include:
Kafka and Kafka Schema Registry
AWS Services: CloudTrail, SNS, SQS, CloudWatch, Step Functions, Aurora, EMR, Redshift
Concourse
Secrets Management Platform: Vault, AWS Secrets Manager
Experience with Event Driven Architecture
Experience transitioning on-premises big data platforms to cloud-based platforms such as AWS
Background in Kubernetes, Distributed Systems, Microservice architecture and containers
Implementation and tuning experience in Streaming use cases in Big Data Ecosystems (such as EMR, EKS, Hadoop, Spark, Hudi, Kafka/Kinesis etc.)
Experience building scalable data infrastructure and an understanding of distributed systems concepts from a data storage and compute perspective.
Good understanding of Data Lake and Data Warehouse concepts.
Ability to define Standards and guidelines with understanding of various Compliance and Auditing needs.
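Several of the skills above (Lambda, S3, event-driven architecture) come together in the common pattern of a Lambda function triggered by S3 object-creation notifications. The sketch below is a minimal, generic example of that pattern; the key names and payload are illustrative, though the payload shape follows the standard S3 event notification structure:

```python
import json

def handler(event, context=None):
    """Minimal AWS-Lambda-style handler: collect object keys from an
    S3 'ObjectCreated' notification for downstream pipeline processing."""
    keys = [
        rec["s3"]["object"]["key"]
        for rec in event.get("Records", [])
        if rec.get("eventName", "").startswith("ObjectCreated")
    ]
    return {"statusCode": 200, "body": json.dumps({"keys": keys})}

# Illustrative S3 notification payload, trimmed to the fields used above.
sample_event = {
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "s3": {"object": {"key": "raw/2024/01/data.parquet"}},
        }
    ]
}
```

In a real deployment this handler would typically fan out to Glue, Step Functions, or SQS rather than just returning the keys.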
location: Charlotte, North Carolina
job type: Contract
salary: $74.45 - $84.45 per hour
work hours: 8am to 5pm
education: Bachelors
responsibilities:
Provides technical direction, engages the team in discussions on how best to guide/build features on key technical aspects, and is responsible for product tech delivery
Works closely with the Product Owner and team to align on delivery goals and timing
Collaborates with architects on key technical decisions for data and overall solution
Leads design and implementation of data quality check methods
Ensures data security and permissions solutions, including data encryption, user access controls, and logging
Be able to think unconventionally to find the best way to solve a defined use case with fuzzy requirements.
Self-starter mentality
Willing to do their own research to solve problems and can clearly present findings and engage in conversation on what makes one solution better than another
Thrive in a fail-fast environment, involving mini PoCs, and participate in an inspect and adapt process.
Questioning and Improvement mindset
Must be ready to ask questions about why something is currently done the way it is and suggest alternative solutions
Customer facing skills
Interfacing with stakeholders and other product teams via pairing, troubleshooting support, and debugging issues they encounter with our products
qualifications:
Experience level: Experienced
Education: Bachelors
skills:
Data Analysis

Equal Opportunity Employer: Race, Color, Religion, Sex, Sexual Orientation, Gender Identity, National Origin, Age, Genetic Information, Disability, Protected Veteran Status, or any other legally protected group status.

At Randstad Digital, we welcome people of all abilities and want to ensure that our hiring and interview process meets the needs of all applicants. If you require a reasonable accommodation to make your application or interview experience a great one, please contact HRsupport@randstadusa.com.

Pay offered to a successful candidate will be based on several factors including the candidate's education, work experience, work location, specific job duties, certifications, etc. In addition, Randstad Digital offers a comprehensive benefits package, including health, an incentive and recognition program, and 401K contribution (all benefits are based on eligibility).

This posting is open for thirty (30) days.