Albertsons

Advanced Data Engineer

Albertsons, Boise, Idaho, United States, 83708


Job Description

About the company

Albertsons Companies is at the forefront of the revolution in retail. With a fixation on raising the bar with innovation and building belonging through our culture, our team is rallying our company around a unique purpose: to create joy around each table and inspire a healthier tomorrow for every community. Albertsons Companies is one of the largest food and drug retailers in the United States, with over 2,200 stores in 34 states and the District of Columbia. Our well-known banners include Albertsons, Safeway, Vons, Jewel-Osco, Shaw's, Acme, Tom Thumb, Randalls, United Supermarkets, Pavilions, Star Market, Haggen, Carrs, Kings Food Markets, and Balducci's Food Lovers Market. We support our stores with 22 distribution centers and 19 manufacturing plants.

Placing a premium on adaptability, safety, and family well-being, our work model, Presence with a Purpose, offers a hybrid environment that blends remote work and office time. A one-size-fits-all approach does not work for everyone, and teams are empowered to make the decisions that are best for them.

Bring your flavor

Building the future of food and well-being starts with you. Join our team and bring your best self to the table. #bringyourflavor

What you will be doing

As an Advanced Data Engineer in Supply Chain Intelligence, you'll play a critical role in crafting data workflows and designing scalable solutions that drive business value. Your expertise in data engineering will enable you to mentor peers, contribute to system architecture, and develop innovative solutions that align with business objectives. By collaborating closely with teams across the organization, your contributions will directly enhance the company's data infrastructure and promote best practices in data management and governance.

The role is based in Boise, Idaho. The candidate must be knowledgeable about the Supply Chain Finance domain, including its systems, data structures, and processes, as well as the roles and responsibilities of the various departments within Supply Chain Finance.

WHAT YOU'LL DO:

• Develop and Implement Solutions: Design and develop multi-tier systems, focusing on analysis, coding, testing, debugging, and comprehensive documentation.
• Mentor and Support Teams: Guide and mentor data engineering teams, helping to automate and enhance business data processes with minimal supervision.
• Engage the Engineering Community: Present and share technical expertise with the broader engineering community, enhancing knowledge-sharing across teams.
• Coordinate Complex Workflows: Lead the design and development of sophisticated data workflows, translating technical requirements into efficient data pipelines.
• Create Prototypes and Proofs of Concept: Use your technical knowledge to design and support the implementation of prototypes, proofs of concept, and data solutions.
• Guide Data Curation: Organize and implement data curation practices as specified by technical requirements, ensuring data quality and consistency.
• Lead Code and Design Reviews: Mentor teams in code development, participate in reviews, and uphold coding standards that foster code quality and extensibility.
• Collaborate on Technology Roadmaps: Partner with Architects and Product Owners to design and document technology roadmaps that align with strategic objectives.
• Promote Code Standards: Support and enforce coding standards, and assist senior staff in developing and refining those standards to ensure quality code across teams.

WHAT YOU'LL GAIN:

• Leadership Opportunities: The chance to mentor and lead teams, influencing the next generation of data engineers at ACI.
• Innovation and Impact: Work on complex data projects that drive the company's success and position you as a key player in the data engineering space.
• Collaborative Environment: Partner with cross-functional teams, gaining exposure to a range of perspectives and fostering continuous learning.
• Professional Development: Access to training plans and opportunities to grow your technical expertise and leadership capabilities.
• Recognition and Visibility: Engage with technical leadership and present solutions that impact the entire organization.

QUALIFICATIONS:

Minimum Qualifications:
• Bachelor's degree in Computer Science or a related discipline, or equivalent work experience.
• At least 5 years of experience in data engineering.
• Minimum of 4 years of experience in SQL, data modeling, data warehousing, and data analysis.
• Experience using cloud technologies (e.g., Azure, AWS, GCP).
• Experience with modern data platforms (e.g., ADLS, Databricks, BigQuery, Snowflake).
• Experience with ETL/ELT, SQL, and cloud data integration tools (e.g., Azure Data Factory, Dataflow).

Desired Qualifications:
• Experience with master data management (MDM), metadata, and data governance tools (e.g., Alation, Erwin, Collibra, Azure Purview, Informatica MDM).
• Proficiency with CI/CD tools (e.g., Azure DevOps, GitHub) and version control practices.
• Familiarity with Git, code reviews, pull requests, and branching standards (e.g., Git Flow, trunk-based development).
• Experience with test-driven development, unit testing, integration testing, and performance testing.
• Ability to deliver scalable data solutions compatible with existing technical environments throughout the data lifecycle.
• Experience leading and collaborating with teams across the SDLC.
• Experience with complex data solutions and analytics engines.