Edward Jones

GCP Data Engineer

Edward Jones, Dallas, TX


Our future colleague will:
  • Contribute to the multi-year data analytics modernization roadmap for the bank.
  • Work directly on the platform, based on Google BigQuery and other GCP services, to integrate new data sources and model data up to the serving layer.
  • Seize a unique opportunity, as the program is set up to completely rethink reporting and analytics with cloud technology.
  • Collaborate with different business groups and users to understand their business requirements, then design and deliver the GCP architecture and data engineering scope of work.
  • Work on a large-scale data transformation program with the goal of establishing a scalable, efficient, and future-proof data and analytics platform.
  • Develop and implement cloud strategies, best practices, and standards to ensure efficient and effective cloud utilization.
  • Work with cross-functional teams to design, implement, and manage scalable and reliable cloud solutions on GCP.
  • Provide technical guidance and mentorship to the team to develop their skills and expertise in GCP.
  • Stay up-to-date with the latest GCP technologies, trends, and best practices and assess their applicability to client solutions.

Qualifications
What will help you succeed:
  • Bachelor's degree in Computer Science/IT
  • Master's in Data Analytics/Information Technology/Management Information Systems (preferred)
  • Strong understanding of data fundamentals, knowledge of data engineering, and familiarity with core cloud concepts
  • Solid implementation experience with GCP data storage and processing services such as BigQuery, Dataflow, Bigtable, Dataform, Data Fusion, Cloud Spanner, and Cloud SQL
  • Programming experience with SQL, Python, and Apache Spark
  • At least 5-7 years of professional experience building data engineering capabilities for analytics portfolios, with at least 5 years on GCP or another cloud-based platform
Your expertise in one or more of the following areas is highly valued:
  • Google Cloud Platform, ideally Google BigQuery, Cloud Composer, Cloud Data Fusion, Cloud Spanner, and Cloud SQL
  • Experience with legacy data warehouses (SQL Server or any relational data warehouse platform)
  • Experience with a testing framework
  • Experience with Business Intelligence tools like PowerBI and/or Looker
What sets you apart:
  • Experience with complex migrations from legacy data warehousing solutions or on-prem data lakes to GCP
  • Experience building generic, reusable capabilities, and an understanding of data governance and quality frameworks
  • Experience in building real-time ingestion and processing frameworks on GCP.
  • Adaptability to learn new technologies and products as the job demands.
  • Multi-cloud & hybrid cloud experience
  • Any cloud certification (Preference to GCP Certifications)
  • Experience working in the financial and banking industry


Required Skills : GCP Data Engineering (BigQuery, Dataflow, etc.)
Background Check :Yes
Drug Screen :Yes
Notes :
Selling points for candidate :Project is a migration project from Legacy systems to GCP, lots of work to be done, potential for a long term project
Project Verification Info :
Candidate must be your W2 Employee :Yes
Exclusive to Apex :No
Face to face interview required :No
Candidate must be local :Yes
Candidate must be authorized to work without sponsorship :No
Interview times set :No
Type of project :
Master Job Title :
Branch Code :