Association of Organ Procurement Organizations

BI Warehouse Architect Sierra Donor Services

Association of Organ Procurement Organizations, California, Missouri, United States, 65018


Salary Range: $96,000.00 To $130,000.00 Annually

Sierra Donor Services (SDS) is looking for a dynamic and enthusiastic team member to join us to save lives! Our mission at Sierra Donor Services is to save lives through organ donation, and we want professionals on our team who will embrace this important work. We are currently seeking a BI Warehouse Architect. The BI Warehouse Architect is responsible for designing, deploying, and maintaining advanced data solutions, including data warehouse and data management processes, to support the business intelligence team with reporting and advanced analytics requirements. This position will be onsite in Sacramento, CA.

COMPANY OVERVIEW AND MISSION

For over four decades, DCI Donor Services has been a leader in working to end the transplant waiting list. Our unique approach to service allows for nationwide donation, transplantation, and distribution of organs and tissues while maintaining close ties to our local communities. DCI Donor Services operates three organ procurement/tissue recovery organizations: New Mexico Donor Services, Sierra Donor Services, and Tennessee Donor Services. We also maximize the gift of life through the DCI Donor Services Tissue Bank and Tennessee Donor Services Eye Bank.

Our performance is measured by the way we serve donor families and recipients. To be successful in this endeavor is our ultimate mission. By mobilizing the power of people and the potential of technology, we are honored to extend the reach of each donor’s gift and share the importance of the gift of life. We are committed to diversity, equity, and inclusion.
With the help of our employee-led strategy team, we will ensure that all communities feel welcome and safe with us because we are a model for fairness, belonging, and forward thinking.

Key responsibilities this position will perform include:

Adheres to data engineering principles and practices.
Designs, builds, deploys, automates, and maintains end-to-end data pipelines for new and existing data sources and targets utilizing modern ETL/ELT tools and practices, including stream processing technologies where appropriate.
Demonstrates problem-solving ability that enables timely and effective issue resolution for the team.
Drives and completes project deliverables within the data engineering & management area according to project plans.
Utilizes in-depth technical expertise regarding data models, master data management, metadata management, reference data management, and data warehousing.
Works with internal technical resources to optimize the data warehouse through hardware or software upgrades or enhancements.
Designs and implements data models that balance performance, flexibility, and ease of use, considering both analytical and operational needs.
Enables and supports self-service analytics by designing intuitive data models and views, collaborating with the Business Intelligence team to ensure data is easily accessible and interpretable for business partners.
Manages and automates the deployment of upgrades, patches, and new features across the data infrastructure, ensuring minimal disruption to data services and maintaining system integrity.
Ensures compliance with all data warehouse administration activities.
Manages and collects business metadata and data integration points.
Coordinates with business intelligence to prepare technical design specifications that address user needs.
Develops and implements comprehensive testing strategies, including automated unit, integration, and end-to-end tests, to ensure the accuracy, reliability, and performance of data pipelines and procedures.
Provides technical support and coordination during warehouse design, testing, and movement to production.
Creates and maintains thorough, up-to-date documentation for all data engineering projects, processes, and systems, adhering to organizational standards and leveraging modern documentation tools and practices.
Implements business rules via coding, stored procedures, middleware, or other technologies, ensuring scalability and maintainability of implemented solutions.
Performs other related duties as assigned.

The ideal candidate will have:

TECHNICAL SKILLS:

Programming Languages:

Advanced proficiency in SQL
Preferred familiarity with Python

Cloud Platforms:

Strong experience with at least one major cloud platform (AWS, Azure, or GCP)
Understanding of cloud-native architectures and services

Data Warehousing and Lakes:

Experience with modern data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery)
Familiarity with data warehouse architectures and technologies

ETL/ELT and Data Pipelines:

Proficiency in designing and implementing scalable data pipelines
Experience with ETL/ELT tools

Database Systems:

Strong knowledge of both relational (e.g., PostgreSQL, Oracle) and NoSQL (e.g., MongoDB, Cassandra) databases
Experience with database optimization and performance tuning

Data Modeling:

Proficiency in dimensional modeling and data warehouse design
Experience with data modeling tools

Data Governance and Security:

Understanding of data governance principles and practices
Knowledge of data security and privacy best practices

Machine Learning Operations (MLOps):

Familiarity with MLOps practices and tools

Data Visualization:

Basic proficiency with data visualization tools (e.g., Power BI, Tableau)

PHYSICAL TRAITS:

Reads, writes, listens, and observes. Communicates using both verbal and technological avenues. Walks, stands, lifts, and carries light loads.

QUALIFICATIONS:

Education Required:

Bachelor’s degree in Computer Science, Data Science, Engineering, or a related technical field. Master’s degree is preferred but not required. Equivalent combination of education and experience may be considered.

Experience:

Minimum of 7 years of professional experience in data engineering, with at least 3 years in a senior or lead role.
Must have ten (10) years designing and implementing large-scale data pipelines and ETL processes.
Must have a minimum of three (3) years working with cloud-based platforms (e.g., AWS, Azure, GCP).
Must have ten (10) years implementing and maintaining data lakes and/or warehouses.
Must have five (5) years using modern big data technologies such as Spark, Hadoop, etc.
Must have experience applying data governance and security practices.
Must have experience documenting, and keeping up to date, the practices and processes used for warehousing.

LICENSES/CERTIFICATION:

Certifications in the following areas are preferred but not required.

We offer a competitive compensation package including:

Up to 176 hours of PTO your first year
Up to 72 hours of Sick Time your first year
Two Medical Plans (your choice of a PPO or HDHP), Dental, and Vision Coverage
403(b) plan with matching contribution
Company-provided term life, AD&D, and long-term disability insurance
Wellness Program
Supplemental insurance benefits such as accident coverage and short-term disability
Discounts on home/auto/renter/pet insurance
Cell phone discounts through Verizon
Monthly phone stipend

**New employees must have their first dose of the COVID-19 vaccine by their potential start date or be able to supply proof of vaccination.**

You will receive a confirmation e-mail upon successful submission of your application. The next step of the selection process will be to complete a video screening. Instructions to complete the video screening will be contained in the confirmation e-mail. Please note – you must complete the video screening within 5 days from submission of your application to be considered for the position.

DCIDS is an EOE/AA employer – M/F/Vet/Disability.
