Rivka Development™
Architect - Data Engineer
Rivka Development™, Snowflake, Arizona, United States, 85937
We are seeking a highly skilled Architect - Data Engineer. This role involves working with cutting-edge data platforms and cloud technologies to design and deliver innovative solutions for forward-thinking organizations. As a Data Engineering Architect, you will collaborate in small, agile teams to build next-generation data platforms that leverage modern data technologies and cloud environments.

Key Responsibilities:
Data Platform Design & Development: Architect and build large-scale data solutions using Big Data platforms such as Apache Spark, Presto, and Amazon EMR.
Cloud Data Warehousing: Design and optimize data warehouses on platforms like Amazon Redshift, Snowflake, and Google BigQuery for advanced analytics and storage solutions.
Object-Oriented Programming: Develop data processing pipelines and automation scripts using Java and Python to streamline operations and enhance platform scalability.
NoSQL Database Management: Leverage databases such as DynamoDB, Cosmos DB, and MongoDB to support dynamic and large-scale data storage requirements.
Container Management: Utilize Kubernetes and Amazon ECS to manage and deploy containerized data solutions across distributed environments.
AI/ML Integration: Collaborate with teams using Amazon SageMaker and Azure ML Studio to incorporate machine learning models into data pipelines.
Streaming Analytics: Implement and manage real-time data ingestion and analytics using Amazon Kinesis and Apache Kafka.
Data Visualization: Create dashboards and visual analytics using tools like Tableau and Power BI for insights into data trends and performance metrics.
Workflow Automation: Use modern data workflow tools like Apache Airflow, dbt, and Dagster to orchestrate and manage complex ETL processes and data pipelines.

Required Skills:
Proficiency in Big Data platforms (e.g., Apache Spark, Presto, Amazon EMR).
Experience with cloud data warehouses such as Amazon Redshift, Snowflake, or Google BigQuery.
Strong coding skills in Java and Python.
Expertise in managing NoSQL databases (e.g., DynamoDB, Cosmos DB, MongoDB).
Hands-on experience with container management systems like Kubernetes and Amazon ECS.
Familiarity with AI/ML platforms (e.g., Amazon SageMaker, Azure ML Studio).
Knowledge of streaming data platforms (e.g., Amazon Kinesis, Apache Kafka).
Experience with visual analytics tools (e.g., Tableau, Power BI).
Proficiency in managing modern data workflows using Apache Airflow, dbt, or Dagster.

What You'll Do:
Work in collaborative, small teams with minimal oversight to design and deliver cutting-edge data solutions on AWS, Microsoft Azure, and Google Cloud Platform.
Use distributed processing engines, event streaming platforms, and cloud data warehouse tools to build the next generation of data platforms.
Stay at the forefront of innovations in data platform development and delivery by continuously evolving and refining your skills as new technologies emerge.

This role offers the opportunity to work with innovative organizations and lead the way in data platform architecture and engineering using the latest tools and cloud environments.