Affinity Plus Federal Credit Union
AWS Architect
Affinity Plus Federal Credit Union, Saint Paul, Minnesota, United States, 55130
Description
Position Overview:
An AWS Architect at Affinity Plus builds secure, resilient, and highly scalable solutions while keeping simplicity and operational effectiveness in mind. This role collaborates with teammates, product teams, and cross-functional project teams to lead the adoption and integration of AWS architecture services into the enterprise. You will be responsible for implementing securely architected solutions that are operationally reliable and performant, and that deliver tangible, data-driven outcomes on strategic initiatives.

Duties and Responsibilities:
- Work closely with team members to lead, design, and drive enterprise solutions, advising on key decision points, trade-offs, best practices, and risk mitigation
- Use a defense-in-depth approach when designing and deploying performant systems that auto-scale appropriately and are highly available, fault-tolerant, self-monitoring, and serviceable
- Practice the six pillars of the AWS Well-Architected Framework: operational excellence, security, reliability, performance efficiency, cost optimization, and sustainability
- Create and maintain secure and performant network designs, assisting with troubleshooting as needed
- Build out new API integrations
- Assemble large, complex data sets into workstreams that meet functional and non-functional business requirements
- Assist and advise data engineers in the preparation and delivery of raw data into the data lake for prescriptive and predictive modeling
- Practice impeccable version control across commits, branching, and security; secure version control best practices include data encryption, user authentication and authorization, access controls, audit trails, and threat detection
- Automate infrastructure provisioning when it makes sense, and strive to ensure migrated workloads are cloud-native
- Partner with developers to continuously improve their ability to develop and deploy applications
- Build infrastructure for optimal extraction, loading, and transformation of data from a wide variety of data sources
- Work with developers to troubleshoot, maintain, and monitor scalable data pipelines
- Perform root cause analysis to answer specific business questions and identify opportunities for process improvement
- Collaborate with the Enterprise Digital Intelligence (Edi) team to improve the data workflows that feed business intelligence tools, increasing data accessibility for staff and fostering data-driven decision-making across the organization
- Use observability and SIEM tools to monitor data and services, ensuring production data is secure, has integrity, and is available to key stakeholders and the business processes that depend on it
- Work in a hybrid workflow environment using agile methodologies as well as waterfall project/product management
- Employ change management best practices to ensure that services remain readily accessible to the business
- Maintain the tools, processes, and associated documentation used to manage the compute environment
- Implement reusable design templates and solutions to integrate, automate, and orchestrate cloud operational needs
- Be a good steward and practice effective cloud governance controls in cloud operations (a sketch of one such control follows this list)
- Manage and monitor Windows, Red Hat, and CentOS Linux operating systems using tools such as AWS Systems Manager and Red Hat Satellite
- Readily communicate with leadership on topics including outages, updates on key infrastructure items, audit mitigation progress, and security vulnerabilities
- Other duties as assigned
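By way of illustration for the cloud governance duty above, here is a minimal sketch, assuming boto3 is installed and AWS credentials are configured in the environment, of a spot check that flags S3 buckets missing a default server-side encryption configuration. The function name and output are hypothetical, not part of this posting.

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def unencrypted_buckets():
    # Return the names of buckets with no default server-side encryption configured.
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            s3.get_bucket_encryption(Bucket=name)
        except ClientError as err:
            # S3 raises this error code when a bucket has no default-encryption configuration.
            if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
                flagged.append(name)
            else:
                raise
    return flagged

if __name__ == "__main__":
    for name in unencrypted_buckets():
        print("Bucket without default encryption:", name)

In practice, a check like this would typically run on a schedule and feed its findings into the observability and SIEM tooling mentioned above rather than printing to stdout.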
Qualifications and Skills:

Required Qualifications and Skills:
- 2+ years' experience with data lakes (e.g., Databricks, Snowflake, Amazon S3, and/or Lake Formation)
- 3+ years of related experience designing secure, scalable, and cost-effective big data architecture
- 5+ years' experience in a software development, data engineering, or data analytics field using Python, Scala, Spark, Java, or equivalent technologies
- Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
- Mid-level knowledge of code versioning tools (such as Git, Mercurial, or SVN)
- Expert proficiency in AWS Lambda and the Python, C++, Java, R, and SQL programming languages
- Expert proficiency in IaC tools (e.g., Terraform, Ansible, CloudFormation)
- Proficiency in software engineering best practices across the software development lifecycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Advanced SQL knowledge and experience with relational databases and query authoring, as well as working familiarity with a variety of unstructured data
- Strong understanding of financial industry technology standards and compliance requirements, and experience working with audit and regulatory bodies
- Mid-level experience with the core foundational AWS services, and expertise in first-party big data and AI services such as AWS Glue (including crawlers), Amazon Athena, Amazon Kinesis, and Amazon QuickSight (see the Athena sketch after this list)
- Proficiency working with a wide range of operating systems, especially Linux and Unix
- Proficient-level experience with the architecture, design, build, and optimization of big data collection, ingestion, storage, processing, and visualization
- Proficient in building, automating, and deploying data pipelines and workflows into end-user-facing applications
- Ability to stay current with industry standards and technological advancements that enhance data quality and reliability in support of strategic initiatives
- Technical expertise with data models, data mining, and segmentation techniques
- Working knowledge of RESTful APIs, the OAuth2 authorization framework, and security best practices for API gateways
- Expert at diagnostics and problem resolution, providing third-level support
- Familiarity working with unstructured data sets (e.g., voice, images, log files, social media posts, email)
- An organized, methodical approach and a continuous improvement mindset
- Demonstrated predisposition for action, willingness to partner and mentor, and an innate drive to provide an exceptional member and employee experience
- A highly creative and innovative technologist who thrives independently and collaborates well in a team environment
- Strong analytical and decision-making skills with a high degree of accuracy
- Strong verbal, written, and interpersonal communication skills
- Time management skills and the ability to prioritize workloads
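To ground the SQL and AWS analytics items above, the following is a minimal sketch, assuming boto3 is installed and the caller has Athena permissions, of running an ad hoc SQL query against a data-lake table. The database name, table, and S3 output location are hypothetical placeholders.

import time
import boto3

athena = boto3.client("athena")

def run_query(sql, database="analytics_db", output="s3://example-athena-results/"):
    # Start the query, then poll until Athena reports a terminal state.
    query_id = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]
    while True:
        state = athena.get_query_execution(QueryExecutionId=query_id)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return query_id, state
        time.sleep(2)  # simple fixed-interval polling; production code would bound retries

query_id, state = run_query(
    "SELECT channel, COUNT(*) AS events FROM member_events GROUP BY channel"
)
print(query_id, state)

On success, results land at the configured S3 output location and can be retrieved with get_query_results or read directly by downstream BI tools.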
Preferred Qualifications:
- Experience in a financial institution
- Expert-level knowledge of AWS infrastructure configurations and service offerings
- Expert-level knowledge of data frameworks, data lakes, and open-source projects such as Apache Spark, MLflow, and Delta Lake (a sketch follows this list)
- Experience with master data management (MDM) using data governance solutions
- Advanced technical certifications preferred: AWS Certified Cloud Practitioner, Solutions Architect, Certified Developer, or SysOps Administrator; RHEL RHCSA/RHCE; AWS Certified Data Analytics; DASCA Big Data Engineering and Analytics
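As a sketch of the Spark and Delta Lake experience named above, the following PySpark batch step curates raw events into a Delta table. It assumes the open-source delta-spark package is on the classpath, and all paths and column names are illustrative.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Spark session configured for Delta Lake (requires the delta-spark package).
spark = (
    SparkSession.builder.appName("curate-member-events")
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Hypothetical raw-zone path: dedupe on an event ID and derive a partition column.
raw = spark.read.json("s3://example-raw-zone/member-events/")
curated = raw.dropDuplicates(["event_id"]).withColumn("event_date", F.to_date("event_ts"))

# Writing in Delta format keeps the table transactional and time-travel capable.
curated.write.format("delta").mode("overwrite").partitionBy("event_date").save(
    "s3://example-curated-zone/member-events/"
)

Partitioning by event_date is one common design choice here: it makes day-level reruns idempotent and keeps downstream queries pruned to the dates they need.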
Workplace Environment:
- Working in a stationary position for 80% of the workday
- Utilizing the telephone and video conferencing 10-20% of the day
- Moving, lifting, and/or carrying 30 pounds, with or without accommodations
- Bending, twisting, kneeling, stooping, or crouching when appropriate, on occasion
- Repetitive movements, including but not limited to typing, mousing, phones, etc.
- May require travel for an onsite presence for employee meetings and events for collaboration, connection, project work, All-Employee Day, etc.
Required Work Schedule:
Standard Monday through Friday business hours, with participation in a 24/7 on-call rotation and a willingness to work after hours as needed for upgrades, feature rollouts, etc. Consistent and reliable attendance is a required essential function of this role to meet the needs of the department/team and organization. This position can be based virtually but does require travel to St. Paul, MN for training, company events, project initiatives, and team and department meetings.

Equal Opportunity Employer/Protected Veterans/Individuals with Disabilities
The contractor will not discharge or in any other manner discriminate against employees or applicants because they have inquired about, discussed, or disclosed their own pay or the pay of another employee or applicant. However, employees who have access to the compensation information of other employees or applicants as a part of their essential job functions cannot disclose the pay of other employees or applicants to individuals who do not otherwise have access to compensation information, unless the disclosure is (a) in response to a formal complaint or charge, (b) in furtherance of an investigation, proceeding, hearing, or action, including an investigation conducted by the employer, or (c) consistent with the contractor's legal duty to furnish information. 41 CFR 60-1.35(c)