Resource Informatics Group
Requirement 1:
Role: AWS Cloud Architect
Location: Plano, TX
Duration: Long-term
Rate: Open
Start date: Immediate
Job Description: The DXC team is seeking an AWS Cloud Engineer to help configure, deploy, and operationalize platforms for data science on AWS. As models, apps, and data pipelines are created and operationalized, the data science team requires engineers with an understanding of cloud-native technology to develop, manage, automate, and facilitate the team's deployment and operational capabilities. The AWS Cloud Engineer joins a growing competency within the data science team to operationalize a variety of analytics products on cloud platforms.
Responsibilities:
- Design, configure, deploy, manage, and automate cloud infrastructure (S3, IAM, Redshift, DynamoDB, EC2, VPC, Lambda, CloudWatch, Databricks, Glue, MLOps, and Athena) that is secure and scalable on AWS, for use by the data science team and extended teams.
- Monitor and ensure the integrity of data pipelines.
- Automate cloud deployments using Terraform.
- Administer the deployment, management, and monitoring of applications deployed on AWS via CI/CD and/or containers.
- Ensure the compliance of data science operations on AWS.
- Monitor usage and cost, and implement optimizations across a variety of AWS resources.
- Maintain the Jenkins pipeline and perform code promotions through change management.
- Provision DynamoDB tables with encryption and grant access using IAM policies (a minimal Terraform sketch of this pattern follows the qualifications below).
- Deploy and manage AWS serverless applications running on API Gateway and Lambda.
- Deploy Redshift clusters into a VPC with encryption, enable cross-region snapshots, configure subnet groups, set up monitoring, and resize the cluster using elastic and classic methods.
Primary Skills:
- AWS: S3, Redshift, DynamoDB, EC2, VPC, Lambda, CloudWatch, etc.
- Big data: Databricks, Cloudera, Glue, and Athena
- Automation: Terraform and Python
Qualifications:
- Bachelor's degree with 14+ years of experience: 8+ years of experience designing, building, and maintaining AWS cloud infrastructure, and 6+ years of experience in big data administration and/or infrastructure administration.
- Experience automating AWS infrastructure using Terraform and Python is a must.
- Experience with database technologies is a plus.
- Knowledge of all aspects of DevOps (source control, continuous integration, deployments, etc.).
- Proficiency in security implementation best practices: IAM policies, KMS encryption, secrets management, network security groups, etc.
- Experience working in a Scrum environment.
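As an illustration of the DynamoDB item above, a minimal Terraform sketch of provisioning an encrypted table and granting access through an IAM policy might look like the following. The resource names, partition key, and allowed actions are assumptions for illustration only, not details taken from the requirement.

resource "aws_kms_key" "dynamodb" {
  description = "CMK for DynamoDB encryption at rest"
}

resource "aws_dynamodb_table" "features" {
  name         = "ds-feature-store" # hypothetical table name
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "entity_id"        # assumed partition key

  attribute {
    name = "entity_id"
    type = "S"
  }

  # Encrypt the table with the customer-managed key above
  server_side_encryption {
    enabled     = true
    kms_key_arn = aws_kms_key.dynamodb.arn
  }
}

# Grant access to the table via an IAM policy
data "aws_iam_policy_document" "table_access" {
  statement {
    actions   = ["dynamodb:GetItem", "dynamodb:PutItem", "dynamodb:Query"]
    resources = [aws_dynamodb_table.features.arn]
  }
}

resource "aws_iam_policy" "table_access" {
  name   = "ds-feature-store-access" # hypothetical policy name
  policy = data.aws_iam_policy_document.table_access.json
}

The policy would then be attached to the data science team's roles; table names, keys, and actions would follow whatever the actual workload requires.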
Requirement 2:
Role: AWS DevOps Tech Lead
Location: Plano, TX
Duration: Long-term
Rate: Open
Start date: Immediate
Responsibilities:
- Extensive experience designing, configuring, deploying, managing, and automating AWS core services such as S3, IAM, EC2, Route 53, LB, CloudWatch, Lambda, KMS, Secrets Manager, and VPC.
- Experience automating cloud deployments using Terraform and Terragrunt.
- Experience with GitHub Actions-based DevOps.
- Experience managing a data lake on AWS, or strong knowledge of AWS data platform services such as Redshift, DynamoDB, Databricks, and Athena.
- Administer the deployment, management, and monitoring of applications deployed on AWS via CI/CD and/or containers.
- Monitor usage and cost, and implement optimizations across a variety of AWS resources.
- Maintain the GitHub Actions (GHA) pipeline and perform code promotions through change management.
- Deploy and manage AWS serverless applications running on API Gateway and Lambda.
- Deploy Redshift clusters into a VPC with encryption, enable cross-region snapshots, configure subnet groups, set up monitoring, and resize the cluster using elastic and classic methods (a minimal Terraform sketch of this pattern follows the skills list below).
- Manage Databricks jobs, SQL warehouses, and access to data.
- Manage Denodo and its integration with all services in the data lake.
- Manage the open-source Spark cluster.
- Comply with all agreed SLA requirements for all service requests, incidents, and changes.
Required Skills:
- AWS (experience mandatory): S3, IAM, EC2, Route 53, LB, CloudWatch, Lambda, KMS, Secrets Manager, CloudTrail, and VPC
- Automation (experience mandatory): Terraform
- Big data (knowledge mandatory): Databricks or Redshift
- DevOps (mandatory): GitHub Actions, Python/Shell scripting
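As an illustration of the Redshift item above, a minimal Terraform sketch of deploying an encrypted cluster into a VPC subnet group with cross-region snapshot copy might look like the following. Identifiers, node type, variables, and the destination region are assumptions for illustration only, not details taken from the requirement.

variable "private_subnet_ids" {
  type = list(string) # assumed private subnet IDs inside the VPC
}

variable "redshift_admin_user" {
  type = string
}

variable "redshift_admin_password" {
  type      = string
  sensitive = true
}

resource "aws_kms_key" "redshift" {
  description = "CMK for Redshift encryption at rest"
}

resource "aws_redshift_subnet_group" "analytics" {
  name       = "analytics-subnets" # hypothetical name
  subnet_ids = var.private_subnet_ids
}

resource "aws_redshift_cluster" "analytics" {
  cluster_identifier        = "ds-analytics" # hypothetical identifier
  node_type                 = "ra3.xlplus"   # assumed node type
  number_of_nodes           = 2
  database_name             = "analytics"
  master_username           = var.redshift_admin_user
  master_password           = var.redshift_admin_password
  cluster_subnet_group_name = aws_redshift_subnet_group.analytics.name
  publicly_accessible       = false

  # Encrypt the cluster with the customer-managed key above
  encrypted  = true
  kms_key_id = aws_kms_key.redshift.arn

  # Copy automated snapshots to a second region for disaster recovery
  snapshot_copy {
    destination_region = "us-west-2" # assumed DR region
    retention_period   = 7
  }
}

Monitoring (for example, CloudWatch alarms on cluster metrics) and resize operations would be layered on top of this baseline as the actual environment dictates.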