Franklin Fitch
Data Warehouse Architect
Franklin Fitch, Boston, Massachusetts, US, 02298
Data Warehouse Architect | Contract-to-Hire | Boston - Hybrid (Remote First)
As the Data Warehouse Architect, you will be responsible for creating the data architecture for a newly established enterprise data warehouse. You will design, implement, and optimize ETL processes, data models, and integrations from multiple data sources, including APIs. You will also leverage Azure Synapse Analytics for scalable data integration and analytics. This is a key leadership role, working closely with stakeholders to ensure the data warehouse aligns with business goals and scales as the company grows.

Responsibilities:
- Design & Build: Lead the end-to-end design and development of the data warehouse from scratch, including data modeling, ETL pipelines, and integrations from multiple data sources (APIs, databases, etc.).
- Cloud & Big Data Solutions: Leverage cloud platforms (e.g., AWS, Google BigQuery, Snowflake) and big data technologies (e.g., Hadoop, Spark) to build scalable and efficient solutions.
- Synapse Integration: Use Azure Synapse Analytics to enable integrated analytics and seamless data workflows.
- Collaboration & Leadership: Work with BI, analytics, and business teams to define data requirements, mentor junior team members, and ensure successful implementation.
- Data Governance & Security: Implement best practices for data governance, quality, and security, ensuring compliance with relevant regulations.
- Optimization & Innovation: Continuously evaluate and enhance the architecture for performance, cost efficiency, and scalability.
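To give a flavor of the Design & Build work described above, here is a minimal sketch of an API-to-warehouse load step in Python. The endpoint, ODBC data source, staging table, and column layout are hypothetical assumptions for illustration only, not details from this posting.

# Minimal illustrative ETL sketch: pull records from a hypothetical REST
# endpoint, reshape them, and bulk-insert into a hypothetical staging table
# via ODBC. API_URL, CONN_STR, and stg_orders are assumptions.
import datetime as dt

import pyodbc    # generic ODBC client; the actual target might be Synapse SQL
import requests

API_URL = "https://api.example.com/orders"   # hypothetical source API
CONN_STR = "DSN=warehouse"                   # hypothetical ODBC data source

def extract(since: dt.date) -> list[dict]:
    # Incremental pull: only records changed since the given date.
    resp = requests.get(API_URL, params={"updated_since": since.isoformat()}, timeout=30)
    resp.raise_for_status()
    return resp.json()

def transform(records: list[dict]) -> list[tuple]:
    # Normalize raw API payloads into the staging table's column order.
    return [(r["id"], r["customer_id"], float(r["amount"]), r["updated_at"])
            for r in records]

def load(rows: list[tuple]) -> None:
    # Bulk-insert into staging; a later step would merge into fact tables.
    with pyodbc.connect(CONN_STR) as conn:
        cur = conn.cursor()
        cur.fast_executemany = True
        cur.executemany(
            "INSERT INTO stg_orders (order_id, customer_id, amount, updated_at) "
            "VALUES (?, ?, ?, ?)",
            rows,
        )
        conn.commit()

if __name__ == "__main__":
    load(transform(extract(dt.date.today() - dt.timedelta(days=1))))

In practice the same pattern would typically run inside an orchestration layer (for example, Synapse pipelines or Spark jobs) with retries, schema validation, and incremental merge logic rather than as a standalone script.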
Requirements:
- Experience: 5+ years as a Data Warehouse Architect or in a similar role, with experience designing and building data warehouses from the ground up.
- Expertise in ETL processes, data integration, API integration, and cloud-based data platforms (AWS, Google Cloud, Synapse, Snowflake).
- Strong knowledge of data modeling, SQL, and big data technologies (e.g., Hadoop, Spark).
- Excellent problem-solving and communication skills.

Qualifications:
- Experience in real-time data processing and stream processing.
- Synapse Analytics experience.

Education:
- Bachelor's or Master's in Computer Science, Information Systems, or a related field.
- Cloud certifications are a bonus (e.g., AWS Certified Big Data - Specialty).