PamTen
Data Engineering Architect
PamTen, Los Angeles, California, United States, 90079
We are seeking an experienced Data Engineering Architect to design and implement scalable data solutions for our organization. The ideal candidate will have deep expertise in cloud data warehousing, ETL/ELT processes, data modeling, and business intelligence.
Responsibilities:
• Design and architect end-to-end data solutions leveraging AWS Redshift, Apache Airflow, dbt, and other modern data tools
• Develop data models and implement data pipelines to ingest, transform, and load data from various sources into our Redshift data warehouse
• Create and maintain Apache Airflow DAGs to orchestrate complex data workflows and ETL processes
• Implement data transformations and modeling using dbt to ensure data quality and consistency
• Design and optimize Redshift clusters for performance, scalability, and cost-efficiency
• Collaborate with data analysts and scientists to expose data through Tableau dashboards and reports
• Establish data governance practices and ensure data security and compliance
• Mentor junior data engineers and promote best practices across the data team
• Evaluate new data technologies and make recommendations to improve our data architecture
Requirements:
• 7+ years of experience in data engineering, with at least 3 years in an architect role
• Deep expertise with AWS Redshift, including data modeling, query optimization, and cluster management
• Strong experience with Apache Airflow for workflow orchestration and scheduling
• Proficiency with dbt for data transformation and modeling
• Experience creating dashboards and reports in Tableau
• Excellent SQL skills and experience with Python
• Knowledge of data warehousing concepts and dimensional modeling
• Strong communication skills and ability to work cross-functionally
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field