Cynet Systems
AWS Cloud Engineer
Cynet Systems, Charlotte, North Carolina, United States, 28245
Job Description:
Responsibilities:
- Hands-on experience with Java, EMR, Flink, Kafka, and AWS services: S3, Lambda, Athena.
- Extensive experience with EMR databases such as Glue Catalog, Lake Formation, Redshift, DynamoDB, and Aurora.
- Required tools and languages: Python, Spark, PySpark, and Pandas.
- Infrastructure as Code technology: Terraform/CloudFormation.
- Experience with DevOps pipelines (CI/CD): Bitbucket, Concourse.
- Experience with RDBMS platforms and strong proficiency with MySQL and Postgres.
- Deep knowledge of IAM roles and policies.
- Experience using AWS monitoring services such as CloudWatch, CloudTrail, and CloudWatch Events.
- Experience with Kafka/messaging, preferably Confluent Kafka.
- Experience with event-driven architecture.
- Works with the team on key technical aspects and is responsible for product tech delivery.
- Engages in the design, build, test, and deployment of components, where applicable in collaboration with Lead Developers (Data Engineer, Software Engineer, Data Scientist, Technical Test Lead).
- Understands requirements/use cases to outline technical scope and lead delivery of the technical solution.
- Confirms required developers and skill sets specific to the product.
- Provides leadership, direction, peer review, and accountability to developers on the product.
- Works closely with the Product Owner to align on delivery goals and timing.
- Assists the Product Owner with prioritizing and managing the team backlog.
- Collaborates with Data and Solution Architects on key technical decisions, and on the architecture and design to deliver the requirements and functionality.
- 6-8+ years of experience.

Must Haves:
- AWS.
- Kafka (or any other data processing skill).
- Python.
- Java background.
- Terraform.

Required Skills:
- AWS, Java, Kafka, Python, EMR.
- Streams processing application development experience using Kafka Streams and/or Flink Java APIs.
- Experience with Postgres, MySQL, Terraform, and Concourse.

Nice to Have Skills:
- Flink, Glue Catalog, Redshift, Lake Formation, DocumentDB, ECS/EKS.
- At least an AWS Practitioner, Developer, or Data Specialty certification.