Calibre Inc

Requirements Action Officer

Calibre Inc, McKinney, Texas, United States, 75070


JOB SUMMARY

The Cloud Data Engineer role supports our Cloud Data Management and Advanced Analytics platforms. In this role, you will work with the data services, data analyst, and data science teams to help the organization build secure, scalable, fault-tolerant, and high-performing cloud-based architecture that enables and sustains data-driven decisions.

PRIMARY DUTIES & RESPONSIBILITIES

Design, implement, and support various database services of our Cloud Platform.

Provide technological guidance on Data Lake and Enterprise Data Warehouse design, development, implementation and monitoring.

Understand, design, and implement data security for cloud infrastructure.

Provide support and guidance to Data Services and other application development teams on various AWS Database Products.

Work with leadership on process improvement and strategic initiatives on Cloud Platform.

REQUIRED SKILLS

Extensive hands-on experience, including design and implementation, across a broad range of database services on Amazon Web Services (AWS).

Experience with AWS Database Migration Service (AWS DMS) and migrating RDBMS platforms (SQL Server/Oracle) from on-premises environments to AWS (Aurora/Redshift).

Solid understanding of the data management and data pipeline tools available in AWS.

Working knowledge of core AWS services such as S3, Lambda, Batch, Glue, Athena, EC2, EBS, CloudWatch, CloudTrail, ECS, ECR, EMR, IAM, and SNS.

Development experience with any major ETL tool, preferably Informatica.

Good understanding of implementing data lakes and data warehouses in the cloud.

Experience creating and deploying CloudFormation templates (CFTs).

Experience with lifecycle management of S3 buckets.

Clear understanding of cloud database security, including AWS IAM users and access, IAM roles and policies, and federated users and permissions.

Good understanding of AWS encryption methodologies and AWS KMS services.

Experience with database performance testing and capacity planning.

Working knowledge and experience with software development life cycle (SDLC) and agile/iterative methodologies.

Implementation experience with big data technologies such as Hadoop, Spark, Presto, Hive, and Hue is a major advantage.

Knowledge of AWS machine learning offerings is a plus.

Experience with data visualization tools is a plus.

APPLICABLE TO ALL EMPLOYEES OF GLOBE LIFE & ACCIDENT AND ITS SUBSIDIARIES:

Reliable and predictable attendance for your assigned shift.

Ability to work full time and/or part time based on the position specifications.

REQUIRED EXPERIENCE

Bachelor's degree in Computer Science/Engineering or Information Systems, or equivalent work experience in a technical position.

5+ years of experience in Information Technology.

3+ years of experience in database engineering, primarily with AWS Redshift, RDS/Aurora, DynamoDB, DMS, and Glue.

Proven experience building data pipelines and database applications.

Strong coding and scripting experience with Python, PowerShell or similar languages.

Experience implementing Amazon EMR or big data technologies such as Hadoop, Spark, Presto, and Hive is a plus.

Prior domain experience in life insurance, annuities, or financial services is a plus.

AWS certifications are considered a strong plus.

Excellent verbal and written communication skills.

Ability to work independently and as part of a team.