Glow Networks
Data Engineer
Glow Networks, Dallas, Texas, United States, 75215
Job Description
Data Engineer
Location: TX - Plano, 75075
***Must be a US citizen or Green Card holder.
Responsibilities:
- Extensive experience designing, configuring, deploying, managing, and automating AWS core services such as S3, IAM, EC2, Route53, SNS, SQS, ELB, CloudWatch, Lambda, and VPC.
- Experience automating cloud deployments using Terraform and Python.
- Experience in DevOps: administering the deployment, management, and monitoring of applications deployed on AWS via CI/CD.
- Maintain the Jenkins pipeline and perform code promotions through change management.
- Experience with AWS data platform services such as Redshift, DynamoDB, Databricks, Glue, MLOps, and Athena.
- Ensure the compliance of data science operations on AWS.
- Monitor usage and cost, and implement optimizations across a variety of AWS resources.
- Provision DynamoDB tables with encryption and grant access using IAM policies.
- Deploy and manage AWS serverless applications running on API Gateway and Lambda.
- Deploy Redshift clusters into a VPC with encryption, enable cross-region snapshots, configure subnet groups, set up monitoring, and resize clusters using elastic and classic methods.
- Manage Denodo VDBs, address performance issues, and support onboarding of new apps and datasets.
- Manage Denodo stored procedures, health checks, and HA setup, and address connectivity issues.
- Provide detailed capacity assessments on a regular basis for Denodo, Redshift, and Databricks.

Required Skills:
- AWS (experience mandatory): S3, IAM, EC2, Route53, SNS, SQS, ELB, CloudWatch, Lambda, and VPC
- Automation (experience mandatory): Terraform and Python
- Big data (experience with at least 2): Denodo, Redshift, DynamoDB, Databricks, Glue, and Athena
- DevOps (mandatory): GitHub Actions, Python/Shell scripting
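As an illustration of the provisioning work described above (a DynamoDB table with encryption at rest, with access granted through an IAM policy), a task like this is typically expressed in Terraform, the automation tool the posting names. The sketch below is hypothetical: the table name, key schema, and policy name are placeholders, not details from this role.

```hcl
# Hypothetical example: DynamoDB table with server-side encryption enabled.
resource "aws_dynamodb_table" "example" {
  name         = "example-table" # placeholder name
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }

  server_side_encryption {
    enabled = true # encrypt items at rest with a KMS key
  }
}

# Hypothetical IAM policy granting read access to the table above.
resource "aws_iam_policy" "example_read" {
  name = "example-table-read"
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect   = "Allow"
      Action   = ["dynamodb:GetItem", "dynamodb:Query"]
      Resource = aws_dynamodb_table.example.arn
    }]
  })
}
```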