About Blankfactor
Blankfactor is a technology consulting firm dedicated to delivering cutting-edge data solutions. We specialize in cloud-native architectures, advanced analytics, and big data platforms, working with global clients to drive digital transformation.
Role Overview
We are seeking a Data Architect with strong hands-on expertise in Google Cloud Platform (GCP) to design and implement large-scale data solutions. This role requires deep technical knowledge of Hadoop, Spark, MapR, BigQuery, BigTable, ElasticSearch, and Airflow, with a focus on solutioning and cloud-native architecture. The ideal candidate will be a technical leader who can define and implement best practices for data engineering and architecture.
Key Responsibilities
- Architect and implement scalable, high-performance data platforms using GCP services.
- Lead end-to-end big data solutioning, optimizing Hadoop, Spark, and MapR for distributed processing.
- Design and optimize BigQuery and BigTable solutions for large-scale data storage and analytics.
- Integrate and manage ElasticSearch for data indexing and real-time search capabilities.
- Implement workflow automation and orchestration using Airflow.
- Define best practices for data security, cost optimization, and cloud performance within GCP.
- Partner with business stakeholders and engineering teams to develop scalable data architectures.
- Provide technical leadership, mentorship, and guidance to engineering teams.
Must-Have Requirements
✅ Expert-level experience in Google Cloud Platform (GCP), including hands-on architecture and solutioning.
✅ Strong background in big data processing with Hadoop, Spark, and MapR.
✅ Proven experience in BigQuery, BigTable, and ElasticSearch.
✅ Hands-on experience managing data workflows with Airflow.
✅ Proficiency in Python, Scala, or Java for data engineering.
✅ Deep understanding of cloud security, cost optimization, and high-performance computing.
✅ Ability to lead architecture discussions and make technical decisions.
✅ Must be located in Phoenix, AZ, and available for hybrid work.
Nice to Have
⭐ Experience with real-time streaming technologies (Kafka, Apache Flink).
⭐ Google Professional Data Engineer certification.
⭐ Experience working with hybrid cloud architectures.
Work Mode & Location
This role follows a hybrid work model based in Phoenix, AZ. Candidates must be located in Phoenix and available to work onsite as needed.
Why Join Us?