Phoenix Staff Inc
Data Streaming Architect & Engineer
Phoenix Staff Inc, Scottsdale, Arizona, US, 85261
Job Description
Title:
Data Streaming Architect & Engineer
Location:
Scottsdale, AZ (Remote)
Our client is seeking a Data Streaming Engineer with expertise in distributed systems to lead the re-engineering of their data ingestion pipeline. This pivotal role focuses on transitioning from a traditional ETL model to a real-time streaming architecture that can handle high data volumes and deliver real-time insights across the platform. You will play a hands-on role in defining and implementing the technical future of the client's data infrastructure, with significant influence over the direction of the product's data architecture, collaborating closely with developers and product leaders. While this is a highly technical role, you'll also be a critical voice in shaping that future, working with cutting-edge tools and technology.
Your Role:
Design, develop, and implement a data streaming platform from the ground up to replace our current ETL-based pipeline.
Architect stream-based data pipelines that enable applications to listen to and process data as it flows through the infrastructure; a minimal sketch of this pattern appears after this list.
Lead the selection, deployment, and ongoing management of distributed data systems, with a focus on Apache Kafka or similar technologies.
Collaborate closely with developers and stakeholders to align the data streaming architecture with business and technical goals.
Provide technical leadership and guidance to a small team, ensuring effective deployment and optimization of the data streaming solution.
Implement and maintain CI/CD processes for continuous improvement and scalability of the streaming platform.
Ensure basic security practices are followed within the data pipeline architecture.
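To give a concrete flavor of the stream-based pattern described above, here is a minimal Python consume-transform-produce sketch. It is illustrative only: it assumes the kafka-python client library, a local broker, and hypothetical topic names (raw-events, enriched-events); the client's actual stack, schemas, and transformations will differ.

    import json
    from kafka import KafkaConsumer, KafkaProducer

    BROKER = "localhost:9092"  # hypothetical broker address

    # Subscribe to the upstream topic and decode each message as JSON.
    consumer = KafkaConsumer(
        "raw-events",  # assumed upstream topic name
        bootstrap_servers=BROKER,
        group_id="enrichment-service",
        auto_offset_reset="earliest",
        value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    )

    # Producer for the downstream topic, encoding values back to JSON.
    producer = KafkaProducer(
        bootstrap_servers=BROKER,
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    # Process each event as it arrives instead of waiting for a batch ETL run.
    for message in consumer:
        event = message.value
        event["processed"] = True  # placeholder transformation
        producer.send("enriched-events", event)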
What You Bring:
3+ years of hands-on experience in distributed systems development, deployment, and management.
Expertise in stream processing technologies such as Apache Kafka, Google Cloud Pub/Sub, Amazon Kinesis, or Azure Event Hubs, with Kafka strongly preferred.
Strong experience with Unix/Linux-based systems (particularly Ubuntu) and familiarity with Windows environments.
Proven experience designing and building real-time data pipelines for high-volume data ingestion, ideally within smaller companies where you’ve led the project end-to-end.
Proficiency in Python and scripting languages (Bash, Perl) for system automation and data processing; an illustrative script follows this list.
Hands-on experience deploying solutions, not just conceptualizing or leading from a distance; this is a role for a doer.
Strong collaborator with the ability to communicate technical concepts and sell ideas to both developers and non-technical stakeholders.
Broad technical background, including exposure to various technologies beyond SQL and relational databases.
Experience with cloud infrastructure, though we are not tied to any specific cloud vendor.
Basic understanding of information security as it relates to data infrastructure.
Bachelor’s degree in computer science, engineering, or a related field is preferred.
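As a rough illustration of the kind of Python automation referenced above, the following sketch checks that expected Kafka topics exist on a broker. The broker address and topic names are hypothetical, and the kafka-python admin client is assumed.

    import sys
    from kafka.admin import KafkaAdminClient

    BROKER = "localhost:9092"  # hypothetical broker address
    EXPECTED_TOPICS = {"raw-events", "enriched-events"}  # assumed topic names

    def main() -> int:
        # Connect to the broker and compare its topics against the expected set.
        admin = KafkaAdminClient(bootstrap_servers=BROKER)
        missing = EXPECTED_TOPICS - set(admin.list_topics())
        if missing:
            print("Missing topics:", ", ".join(sorted(missing)))
            return 1
        print("All expected topics present.")
        return 0

    if __name__ == "__main__":
        sys.exit(main())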
Company Description
We believe that finding the right candidate shouldn’t be so hard. Neither should finding a job you love. By taking the time to understand your specific needs, we make the perfect placements and build relationships that last long after the position is filled. We’re fanatical about the right fit, and we look forward to finding yours.