Compunnel
Data Engineer
Compunnel, San Francisco, CA, United States
Job Summary:
We are seeking a skilled Data Engineer to join our team and continuously improve our data infrastructure. In this role, you will collaborate with cross-functional teams including Data Scientists, Analysts, Product Managers, and Software Engineers to deliver on data needs, explore opportunities for process improvements, and implement automation. You will also be responsible for establishing data governance policies to ensure data quality, consistency, and security.
Key Responsibilities:
- Continuously improve and optimize the organization’s data infrastructure to stay ahead of technology trends.
- Collaborate with cross-functional teams (Data Scientists, Analysts, Product Managers, and Engineers) to understand data needs and deliver on those needs effectively.
- Evaluate, select, and implement BI tools and technologies (including data visualization tools, reporting platforms, and analytics solutions) that best suit the organization's goals.
- Run Proofs of Concept (POCs) for selected tools and technologies.
- Stay up to date with the latest trends, technologies, and best practices in the BI field.
- Identify opportunities for process improvements, automation, and innovation to enhance the organization’s data analytics capabilities.
- Define and implement Data Governance policies to ensure data quality, consistency, and security.
- Establish access controls, data privacy measures, and data retention policies to ensure compliance with regulatory requirements.
- Develop scalable ETL pipelines and support the integration of data across different systems.
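At its core, the ETL work described above is an extract-transform-load cycle: pull raw records from a source, normalize and validate them, and write them to a target store. A minimal sketch in plain Python, using an in-memory SQLite table to stand in for a warehouse (the record fields and table schema here are hypothetical, not part of this posting):

```python
import sqlite3

def extract():
    # In practice this might read from Kafka, an API, or cloud storage.
    return [
        {"user": "alice", "amount": "10.50"},
        {"user": "bob", "amount": "3.25"},
    ]

def transform(rows):
    # Normalize types and drop records that fail validation.
    return [(r["user"], float(r["amount"])) for r in rows if r.get("user")]

def load(rows, conn):
    # Idempotent table creation, then a batched insert.
    conn.execute("CREATE TABLE IF NOT EXISTS payments (user TEXT, amount REAL)")
    conn.executemany("INSERT INTO payments VALUES (?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract()), conn)
total = conn.execute("SELECT SUM(amount) FROM payments").fetchone()[0]
print(total)  # 13.75
```

A production pipeline would add the concerns the responsibilities above imply: schema enforcement, retries, incremental loads, and monitoring.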
Required Qualifications:
- 8-10 years of experience in data engineering with a focus on designing scalable ETL pipelines.
- Advanced proficiency in Hadoop, Scala, Spark, Kafka, and SQL.
- At least 6 years of experience working with cloud platforms such as Azure, AWS, or Google Cloud.
- Experience in architecting large and complex data pipelines and performing data analysis and exploration.
- Hands-on experience with Cloud technologies including Azure, Databricks, ADLS, Spark, Cosmos DB, and other big data technologies.
- Proven experience working in an Agile delivery environment and with multi-developer teams using version control tools.
- Strong knowledge of data structures, design patterns, and data validation.
- Experience working with APIs to collect or ingest data into systems.
- Solid experience with databases, both relational (SQL) and NoSQL.
- Experience integrating and building data platforms in support of BI, Analytics, Data Science, and real-time applications.
- Strong communication and problem-solving skills to collaborate effectively across teams and solve data-related challenges.
- Experience with DataStage or other ETL and data integration tools (e.g., Fivetran, Informatica).
- Familiarity with BI reporting tools such as SSAS cubes, Tableau, ThoughtSpot, Power BI, Cognos, or similar platforms.
- Strong expertise with cloud technologies, particularly in a cloud-native architecture.
- Platform migration experience is a plus.
- Cloud certifications (AWS, Azure, Google Cloud) preferred.
- Certifications in Data Engineering or related fields (e.g., Databricks, Hadoop) are a plus.
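On the API-ingestion requirement above: collecting data from an HTTP API usually means fetching JSON pages and flattening them into rows for downstream systems. A minimal sketch (the payload shape and field names are hypothetical; a real pipeline would fetch pages with an HTTP client and handle pagination, authentication, and retries):

```python
import json

# Stand-in for the body of one HTTP response from a hypothetical API.
payload = json.dumps({
    "results": [
        {"id": 1, "name": "sensor-a", "reading": 21.4},
        {"id": 2, "name": "sensor-b", "reading": 19.9},
    ],
    "next_page": None,
})

def parse_page(raw):
    """Flatten one page of API results into (id, name, reading) tuples."""
    page = json.loads(raw)
    return [(r["id"], r["name"], r["reading"]) for r in page["results"]]

rows = parse_page(payload)
print(rows)
```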
Certification: Google Cloud Certification, AWS Certification