Saxon Global
Cloud Data Engineer
Saxon Global, Chicago, Illinois, United States, 60290
Job Description:
You have the opportunity to join a small team focused on the firm's strategic investments in our Cloud Data Operations space. You are a passionate visionary and technologist with a focus on data operations, cloud solutions, and implementations. In your role, you will be given great latitude to flex your skillset while benefiting from the support of our many AWS Certified team members.
As a Cloud Data Engineer, your efforts will advance the firm's capabilities to deliver the strategy for optimized solution architectures as relates to data lakes, data warehouses, and analytics, ensuring our technology solutions safeguard the security, privacy, and integrity of our firm. You will collaborate with our Engineering and Product teams to provide sound solution architectures in accordance with industry wide best practices.
Qualifications:
- Self-starter with strong autonomy; demonstrated ability to self-teach and apply new learnings as a regular course of activity.
- Work Chicago (Central Time) office hours for billed time.
- Experience building and operating data lakes and data warehouse solutions on AWS.
- Comfortable documenting designs using modeling methodologies, e.g., UML.
- 3+ years of Solution Architecture experience, including 2+ years of Data Architecture.
- 3+ years of hands-on AWS experience.
- 3+ years of Software Development experience, including 2+ years of Python (including Spark).
- Hands-on experience with several of the following AWS services: Athena, Glue ETL & Crawlers, Redshift, RDS, DynamoDB, Lambda, CLI, EC2, Step Functions, CloudWatch.
- 2+ years of hands-on AWS CI/CD experience, including CodeCommit, CodePipeline, and CloudFormation.
- Well-grounded knowledge of engineering and continuous delivery practices using modern software development tooling (e.g., GitHub, IntelliJ IDEA, PyCharm, Visual Studio), processes (e.g., Scrum, Agile), and toolsets (e.g., JIRA, Confluence).
- 2+ years with business intelligence / reporting tools such as Tableau or Power BI.
- 1+ years of hands-on experience crafting IAM security policies and roles, as well as S3 bucket policies.
- Strong DevOps mindset and experience promoting code from staging to production environments through automation.
- BA/BS degree in Computer Science or a related Software or Data Engineering field, or equivalent experience.
- AWS Certified, such as Solutions Architect, Analytics, and/or DevOps.

**MUST HAVE**
- Active AWS Associate-level certification (Solutions Architect, Analytics, and/or DevOps).
- Resume synthesis: efficient, effective communication.
- Candidates who can't write an effective resume (e.g., 8 pages long, overloaded with detail) won't get a look.
- Design-level thinking and a sense of design.
- Ability to articulate ideas verbally at different audience levels, as well as model them technically.
- Confluence experience (don't search for this as a keyword; qualify it when speaking with candidates, since it indicates how accustomed they are to articulating their solutions).
- Open to daily walkthroughs.
- Very collaborative and iterative.
Use CloudFormation templates (CFT) to manage IAM roles and policies. Also use CFT and CodePipeline to provision AWS resources.
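As a minimal sketch of what managing IAM roles and policies through CloudFormation looks like, the snippet below builds a template as a Python dictionary and emits it as JSON. The role name, service principal, and S3 bucket ARNs are hypothetical placeholders, not taken from the job description; in practice the template would usually be authored as YAML/JSON directly and deployed through CodePipeline.

```python
import json

# Hypothetical CloudFormation template defining an IAM role with an
# inline policy granting read access to an example data-lake bucket.
# All resource names and ARNs below are illustrative placeholders.
template = {
    "AWSTemplateFormatVersion": "2010-09-09",
    "Resources": {
        "DataLakeReadRole": {
            "Type": "AWS::IAM::Role",
            "Properties": {
                # Allow AWS Glue to assume this role (assumed service).
                "AssumeRolePolicyDocument": {
                    "Version": "2012-10-17",
                    "Statement": [{
                        "Effect": "Allow",
                        "Principal": {"Service": "glue.amazonaws.com"},
                        "Action": "sts:AssumeRole",
                    }],
                },
                # Inline policy: read-only access to one example bucket.
                "Policies": [{
                    "PolicyName": "ReadDataLakeBucket",
                    "PolicyDocument": {
                        "Version": "2012-10-17",
                        "Statement": [{
                            "Effect": "Allow",
                            "Action": ["s3:GetObject", "s3:ListBucket"],
                            "Resource": [
                                "arn:aws:s3:::example-data-lake",
                                "arn:aws:s3:::example-data-lake/*",
                            ],
                        }],
                    },
                }],
            },
        }
    },
}

print(json.dumps(template, indent=2))
```

Keeping roles and policies in templates like this means IAM changes are code-reviewed and promoted through the same pipeline as application code, which is the DevOps practice the role calls for.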
Technologies most commonly referenced in job intake:
S3, Python, SQL