
Sr. Data Engineer - Hadoop Stack, Full-Time (Remote, EST/CST)

Dice, Little Ferry, New Jersey, us, 07643


Dice is the leading career destination for tech experts at every stage of their careers. Our client, Valere Labs Pvt Ltd, is seeking the following. Apply via Dice today!

Job Posting: Senior Data Engineer

Location: Seattle, WA (Remote: EST or CST)

Type: Full-Time Employment

Introduction:

Our Ads & Data Platforms team, a division of Entertainment & Technology, is actively seeking a skilled Senior Data Engineer to join our dynamic team. As a key member of our Content Engineering team, you will play a crucial role in designing and implementing scalable data solutions that drive business insights and decision-making. We are particularly seeking individuals with expertise in the Hadoop technology stack and a passion for leveraging big data to deliver cutting-edge solutions.

Responsibilities:

- Contribute to the design and expansion of our Data Products and Data Warehouses, focusing on Content Performance and Engagement data.
- Develop scalable data warehousing solutions and build ETL pipelines in Big Data environments (cloud, on-prem, hybrid).
- Collaborate with cross-functional teams, including Data Product Managers, Data Architects, and Data Engineers, to design, implement, and deliver successful data solutions.
- Maintain detailed documentation to ensure data quality and governance.
- Ensure high operational efficiency and quality of solutions to meet SLAs and support commitments to internal stakeholders.
- Actively participate in and advocate for agile/scrum practices to drive process improvements for the team.

Basic Qualifications:

- 7+ years of data engineering experience with a focus on developing large data pipelines.
- Proficiency in SQL and the ability to create performant datasets through queries.
- Hands-on experience with distributed systems such as Spark and Hadoop (HDFS, Hive, Presto, PySpark).
- Experience with at least one major MPP or cloud database technology (Snowflake, Redshift, BigQuery).

Preferred Qualifications:

- Familiarity with cloud technologies such as AWS (S3, EMR, EC2).
- Experience with data integration toolsets (e.g., Airflow) and maintaining data pipelines.
- Knowledge of data modeling techniques and data warehousing standard methodologies.
- Strong scripting skills, including Bash and Python.
- Familiarity with Scrum and Agile methodologies.
- Strong problem-solving skills with excellent analytical and communication abilities.

Required Education:

Bachelor's Degree in Computer Science, Information Systems, or a related field.

Preferred Education:

Master's Degree in Computer Science, Information Systems, or a related field.
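For candidates gauging fit, the SQL and dataset-building work described in the qualifications might look something like this minimal sketch. All table and column names here are hypothetical, and SQLite stands in purely for illustration; a production pipeline would run comparable aggregations on Hive/Presto or a warehouse such as Snowflake, Redshift, or BigQuery.

```python
import sqlite3

# Hypothetical content-engagement rollup. The "engagement" table and its
# columns are invented for this sketch -- not from the job posting.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE engagement (
        title_id   TEXT,
        region     TEXT,
        watch_mins INTEGER
    );
    INSERT INTO engagement VALUES
        ('tt001', 'US', 120),
        ('tt001', 'US', 30),
        ('tt001', 'EU', 45),
        ('tt002', 'US', 200);
""")

# Aggregate watch time per title and region -- the kind of performant,
# query-derived dataset a data engineer would materialize for analytics.
rows = conn.execute("""
    SELECT title_id, region, SUM(watch_mins) AS total_mins
    FROM engagement
    GROUP BY title_id, region
    ORDER BY title_id, region
""").fetchall()

for row in rows:
    print(row)
```

In a Hadoop-stack setting, the same GROUP BY logic would typically be expressed in HiveQL or PySpark and scheduled via a tool like Airflow.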
