NORTHWESTERN MUTUAL LIFE INSURANCE
Data Engineer III
NORTHWESTERN MUTUAL LIFE INSURANCE, Milwaukee, Wisconsin, United States, 53244
JOB REQUIREMENTS:

At Northwestern Mutual, we are strong, innovative, and growing. We invest in our people. We care and make a positive difference.

Northwestern Mutual Wealth and Investment Management is our company's rapidly growing investment management business. Building on our customers' trust, IPS is a $300B AUM "startup" inspiring change across the organization. Our Technology unit is leading the way in redefining how technology is designed and used. In Managed Investments we are reinventing ourselves, reinforcing our core values while embracing modern practices and groundbreaking technologies, to enable our business to grow at an outstanding rate. We are looking for phenomenal people to help drive this transformation and shape our future.

We are looking for a Data Engineer III to join our team. Our team is responsible for developing and supporting the investment data platform, providing streamlined and governed access to trusted data. The team handles integration with third-party vendors and internal data stores as needed so that the back-end implementation is not exposed to consumers. Technologies used by the team include Snowflake, DBT, AWS, and Kubernetes.

Primary Duties & Responsibilities:
- Build and maintain an investment data platform running on the Snowflake data cloud.
- Apply engineering best practices to analyze, design, develop, deploy, and support software solutions.
- Develop software using continuous integration and deployment practices.
- Participate in an Agile implementation and in the maintenance of source control and release procedures.
- Participate in code reviews and provide feedback to the team.
- Explain technical solutions to technical teams.
- Contribute to a collaborative work environment in which all team members are respected regardless of their individual differences and are motivated to improve both their individual and team contributions.
- Identify data quality issues and their root causes; propose fixes and design data audits.

Qualifications:
- Bachelor's degree or equivalent experience.
- 2-4 years of professional experience, including at least 2 years of professional software engineering, debugging, analysis, and software documentation experience.
- Code knowledge: Python, JVM (Java), SQL.
- Production experience with Snowflake and DBT or other cloud data technologies is a plus.
- Experience building and maintaining ETL pipelines in production.
- Production experience with or familiarity with AWS serverless patterns, CI/CD using GitLab, Kubernetes, and Terraform-based deployment is a plus.
- Experience with Agile methodologies and DevOps environments.
- Intermediate understanding of database structures, theories, principles, and practices.
- Intermediate understanding of data quality and data concepts.
- Basic understanding of data integration patterns and tooling, including ELT/ETL, EII, replication, event streaming, and virtualization, to support batch and real-time processing.

For full information, follow the application link. EEO/AA Employer/Vets/Disability

APPLICATION INSTRUCTIONS:
Apply Online: ipc.us/t/210F04A8132F4887
Qualified females, minorities, special disabled veterans, and other veterans are encouraged to apply.