Outdefine
Data Engineer
Outdefine, San Francisco, California, United States, 94199
As a skilled professional seeking career growth, you deserve access to the best job opportunities available. Join Outdefine's Trusted community today and apply to premier job openings with leading enterprises globally. Set your own rate, keep all your pay, and enjoy the benefits of a fee-free experience.
Outdefine Partner
Web3
10-50
San Francisco, CA, USA
About the job
Overview:
Job Overview
Location: Hybrid, Guadalajara - MX
Long term contract: All MX benefits
Mandatory: 5+ years of experience
Experience with: Azure, Spark/Databricks, Snowflake, ETLs
Hiring process: 3 rounds
Roles and Responsibilities:
Designs, develops, optimizes, and maintains data architecture and pipelines that adhere to ELT principles and business goals.
Solves complex data problems to deliver insights that help the business achieve its goals.
Creates data products for engineers, analysts, and data scientist team members to accelerate their productivity.
Engineers effective features for modelling in close collaboration with data scientists and the business.
Leads the evaluation, implementation, and deployment of emerging tools and processes for analytics data engineering to improve productivity and quality.
Partners with machine learning engineers, BI, and solutions architects to develop technical architectures for strategic enterprise projects and initiatives.
Fosters a culture of sharing, re-use, design for scale, stability, and operational efficiency of data and analytical solutions.
Advises, consults, mentors, and coaches other data and analytics professionals on data standards and practices.
Develops and delivers communication and education plans on analytic data engineering capabilities, standards, and processes.
Learns about machine learning, data science, computer vision, artificial intelligence, statistics, and/or applied mathematics as necessary to carry out the role effectively.
Required Skills:
5-10 years of experience required.
Experience designing and maintaining data warehouses and/or data lakes with big data technologies such as Spark/Databricks, or distributed databases like Redshift and Snowflake.
Experience building data pipelines and deploying/maintaining them following modern DE best practices (e.g., DBT, Airflow, Spark, Python OSS data ecosystem).
Knowledge of software engineering fundamentals and software development tooling (e.g., Git, CI/CD, JIRA), and familiarity with the Linux operating system and the Bash/Z shell.
Experience with cloud database technologies (e.g., Azure) and developing solutions on cloud computing services and infrastructure in the data and analytics space.
Basic familiarity with BI tools (e.g., Alteryx, Tableau, Power BI, Looker).
Expertise in ELT and data analysis, primarily SQL.
Conceptual knowledge of data and analytics, such as dimensional modelling, reporting tools, data governance, and structured and unstructured data.
Employee location: San Francisco, CA, USA
Experience level: Not specified
Workplace type: Hybrid
Job type: Full time contract
Compensation: $25 - 35/hr
Currency: USD