BI Warehouse Architect Job at DCI Donor Services in Nashville
DCI Donor Services, Nashville, TN, US
Job Description
Sierra Donor Services (SDS) is looking for a dynamic and enthusiastic team member to join us to save lives!! Our mission at Sierra Donor Services is to save lives through organ donation, and we want professionals on our team who will embrace this important work!! We are currently seeking a BI Warehouse Architect. The BI Warehouse Architect is responsible for designing, deploying, and maintaining advanced data solutions, including data warehouse and data management processes, to support the business intelligence team's reporting and advanced analytics requirements.
COMPANY OVERVIEW AND MISSION
For over four decades, DCI Donor Services has been a leader in working to end the transplant waiting list. Our unique approach to service allows for nationwide donation, transplantation, and distribution of organs and tissues while maintaining close ties to our local communities.
DCI Donor Services operates three organ procurement/tissue recovery organizations: New Mexico Donor Services, Sierra Donor Services, and Tennessee Donor Services. We also maximize the gift of life through the DCI Donor Services Tissue Bank and Tennessee Donor Services Eye Bank.
Our performance is measured by the way we serve donor families and recipients. To be successful in this endeavor is our ultimate mission. By mobilizing the power of people and the potential of technology, we are honored to extend the reach of each donor’s gift and share the importance of the gift of life.
We are committed to diversity, equity, and inclusion. With the help of our employee-led strategy team, we will ensure that all communities feel welcome and safe with us because we are a model for fairness, belonging, and forward thinking.
Key responsibilities this position will perform include:
- Adhere to data engineering principles and practices.
- Design, build, deploy, automate, and maintain end-to-end data pipelines for new and existing data sources and targets, utilizing modern ETL/ELT tools and practices, including stream processing technologies where appropriate.
- Demonstrate problem-solving ability that enables the team to resolve issues in a timely and effective manner.
- Drive and complete project deliverables within the data engineering & management area according to project plans.
- Utilize in-depth technical expertise regarding data models, master data management, metadata management, reference data management, and data warehousing.
- Work with internal technical resources to optimize the data warehouse through hardware or software upgrades or enhancements.
- Design and implement data models that balance performance, flexibility, and ease of use, considering both analytical and operational needs.
- Enable and support self-service analytics by designing intuitive data models and views, collaborating with the Business Intelligence team to ensure data is easily accessible and interpretable for business partners.
- Manage and automate the deployment of upgrades, patches, and new features across the data infrastructure, ensuring minimal disruption to data services and maintaining system integrity.
- Ensure compliance across all data warehouse administration activities.
- Collect and manage business metadata and data integration points.
- Coordinate with the business intelligence team to prepare technical design specifications that address user needs.
- Develop and implement comprehensive testing strategies, including automated unit, integration, and end-to-end tests, to ensure the accuracy, reliability, and performance of data pipelines and procedures.
- Provide technical support and coordination during warehouse design, testing, and movement to production.
- Create and maintain thorough, up-to-date documentation for all data engineering projects, processes, and systems, adhering to organizational standards and leveraging modern documentation tools and practices.
- Implement business rules via coding, stored procedures, middleware, or other technologies, ensuring scalability and maintainability of implemented solutions.
- Perform other related duties as assigned.
The ideal candidate will have:
TECHNICAL SKILLS:
- Programming Languages:
  - Advanced proficiency in SQL
  - Familiarity with Python preferred
- Cloud Platforms:
  - Strong experience with at least one major cloud platform (AWS, Azure, or GCP)
  - Understanding of cloud-native architectures and services
- Data Warehousing and Lakes:
  - Experience with modern data warehousing solutions (e.g., Snowflake, Amazon Redshift, Google BigQuery)
  - Familiarity with data lake architectures and technologies
- ETL/ELT and Data Pipelines:
  - Proficiency in designing and implementing scalable data pipelines
  - Experience with ETL/ELT tools
- Database Systems:
  - Strong knowledge of both relational (e.g., PostgreSQL, Oracle) and NoSQL (e.g., MongoDB, Cassandra) databases
  - Experience with database optimization and performance tuning
- Data Modeling:
  - Proficiency in dimensional modeling and data warehouse design
  - Experience with data modeling tools
- Data Governance and Security:
  - Understanding of data governance principles and practices
  - Knowledge of data security and privacy best practices
- Machine Learning Operations (MLOps):
  - Familiarity with MLOps practices and tools
- Data Visualization:
  - Basic proficiency with data visualization tools (e.g., Power BI, Tableau)
PHYSICAL TRAITS: Reads, writes, listens and observes. Communicates using both verbal and technological avenues. Walks, stands, lifts, and carries light loads.
QUALIFICATIONS:
Education Required:
Bachelor's degree in Computer Science, Data Science, Engineering, or a related technical field. A Master's degree is preferred but not required. An equivalent combination of education and experience may be considered.
Experience:
- Minimum of seven (7) years of professional experience in data engineering, with at least three (3) years in a senior or lead role.
- Must have ten (10) years of experience designing and implementing large-scale data pipelines and ETL processes.
- Must have a minimum of three (3) years of experience working with cloud-based platforms (e.g., AWS, Azure, GCP).
- Must have ten (10) years of experience implementing and maintaining data lakes and/or warehouses.
- Must have five (5) years of experience using modern big data technologies such as Spark and Hadoop.
- Must have experience applying data governance and security practices.
- Must have experience creating and maintaining up-to-date documentation of data warehousing practices and processes.
LICENSES/CERTIFICATION: Certifications in the following areas are preferred but not required:
- Data Engineering Certifications:
  - Google Certified Professional Data Engineer
  - AWS Certified Data Analytics – Specialty
  - Azure Data Engineer Associate
  - Cloudera Certified Professional (CCP) Data Engineer
- Modern Data Warehouse Architecture Certifications:
  - Databricks Certified Professional Data Engineer
  - SnowPro Advanced: Data Engineer Certification
  - Microsoft Certified: Azure Data Engineer Associate
  - Google Cloud Certified – Professional Data Engineer
  - AWS Certified Data Analytics – Specialty
- Advanced SQL Developer Certifications:
  - Microsoft Certified: Azure Database Administrator Associate
  - Oracle PL/SQL Developer Certified Associate
  - Databricks Certified Associate Developer for Apache Spark
  - Google Cloud Certified – Professional Cloud Database Engineer
  - AWS Certified Database – Specialty
- Advanced Programming and ETL Development Certifications:
  - Python Institute PCPP – Certified Professional in Python Programming
  - RStudio Certified Professional Data Scientist
  - Informatica Certified Professional or Talend Certified Developer
  - AWS Certified Developer – Associate
  - Microsoft Certified: Azure Data Engineer Associate
- Data Governance and Security Certifications:
  - ISACA Certified Information Systems Auditor (CISA)
  - IAPP Certified Information Privacy Technologist (CIPT)
  - Certified Information Systems Security Professional (CISSP)
We offer a competitive compensation package including:
- Up to 176 hours of PTO your first year
- Up to 72 hours of Sick Time your first year
- Two Medical Plans (your choice of a PPO or HDHP), Dental, and Vision Coverage
- 403(b) plan with matching contribution
- Company provided term life, AD&D, and long-term disability insurance
- Wellness Program
- Supplemental insurance benefits such as accident coverage and short-term disability
- Discounts on home/auto/renter/pet insurance
- Cell phone discounts through Verizon
- Monthly phone stipend
**New employees must have their first dose of the COVID-19 vaccine by their potential start date or be able to supply proof of vaccination.**
You will receive a confirmation e-mail upon successful submission of your application. The next step of the selection process will be to complete a video screening. Instructions to complete the video screening will be contained in the confirmation e-mail. Please note: you must complete the video screening within 5 days from submission of your application to be considered for the position.
DCIDS is an EOE/AA employer – M/F/Vet/Disability.