BigQuery Data Architect
FourSteps Solutions - San Francisco, California, United States, 94199
Position Title: BigQuery Data Architect
Location: San Ramon, CA

Position Overview
We are seeking an experienced BigQuery Data Architect to join our team for a 4-month project to implement a Predictive Cash Flow solution. The role focuses on designing and managing data pipelines in Google BigQuery, integrating data from Oracle Fusion (AP, AR, Payables, Receivables, SCM), Workday Active Payroll, Salesforce, Trovata, and Oracle EPM. The BigQuery Data Architect will ensure efficient data ingestion, transformation, and export to support real-time cash flow forecasting and financial planning.

Key Responsibilities
- Design scalable BigQuery datasets, tables, and schemas to store and process financial data from multiple sources (e.g., invoices, payroll, banking transactions).
- Define data models to support cash flow forecasting, ensuring alignment with Oracle EPM requirements.
- Optimize BigQuery for performance, cost, and scalability.

Data Ingestion:
- Configure automated data pipelines to load JSON/CSV files from Google Cloud Storage (GCS) into BigQuery.
- Implement streaming ingestion for real-time data (e.g., Trovata banking transactions) using the BigQuery Streaming API.
- Integrate with Oracle Integration Cloud (OIC) for seamless data extraction from source systems.

Data Transformation:
- Write complex SQL queries to standardize, aggregate, and transform data (e.g., currency conversion, date alignment).
- Create views and materialized views in BigQuery for consolidated cash flow metrics (e.g., daily inflows/outflows).
- Ensure data quality through validation checks and error handling.

Data Export:
- Design export processes to transfer transformed data from BigQuery to Oracle EPM via CSV/JSON files in GCS or direct API integration.
- Map BigQuery data to EPM dimensions (e.g., Account, Period, Entity) for financial planning and reporting.

Performance Optimization:
- Optimize BigQuery queries and partitioning/clustering strategies to reduce costs and improve query performance.
- Monitor and manage BigQuery usage to stay within budget and quotas.
Collaboration and Support:
- Work closely with Integration Developers, Oracle EPM Consultants, and Functional Consultants to align data flows.
- Support testing phases (unit, SIT, UAT) by validating data accuracy and pipeline reliability.
- Provide technical documentation and training to client teams on BigQuery processes.

Monitoring and Maintenance:
- Set up monitoring via Google Cloud Monitoring to track pipeline performance and detect failures.
- Troubleshoot and resolve data pipeline issues during development and hyper-care phases.

Qualifications

Education:
- Bachelor’s degree in Computer Science, Data Engineering, or a related field (Master’s preferred).

Experience:
- 5+ years of experience in data architecture or data engineering, with at least 3 years focused on Google BigQuery.
- Proven experience designing and implementing data pipelines in Google Cloud Platform (GCP).
- Hands-on experience with ETL processes for financial or ERP data (e.g., Oracle Fusion, Workday, Salesforce).
- Familiarity with Oracle EPM or similar financial planning tools is a plus.

Technical Skills:
- Expertise in Google BigQuery, including schema design, SQL, partitioning, and clustering.
- Proficiency in Google Cloud Storage (GCS), Google Cloud Scheduler, and the BigQuery Streaming API.
- Strong SQL skills for data transformation and query optimization.
- Experience with integration tools such as Oracle Integration Cloud (OIC), Informatica, or Talend.
- Knowledge of REST APIs, JSON/CSV processing, and data warehousing concepts.
- Familiarity with data security and compliance standards (e.g., GDPR, CCPA).

Soft Skills:
- Strong problem-solving and analytical skills.
- Excellent communication and collaboration abilities for working with cross-functional teams.
- Ability to document technical processes clearly and train non-technical stakeholders.

Preferred Qualifications
- Experience with financial data models (e.g., cash flow, receivables, payables).
- Knowledge of Oracle Fusion Financials APIs.
- Google Cloud Professional Data Engineer certification or equivalent.
- Prior work on predictive analytics or cash flow forecasting projects.
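For candidates unfamiliar with the transformation work described above (currency conversion, date alignment, and daily inflow/outflow rollups), the logic would normally live in BigQuery SQL views; the following is a minimal, illustrative pure-Python sketch of the same aggregation. The `FX_TO_USD` rates, the field names, and the `daily_cash_flow` helper are hypothetical and not part of this project.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical FX rates to the reporting currency; in a real pipeline
# these would come from a rates table joined in BigQuery SQL.
FX_TO_USD = {"USD": 1.0, "EUR": 1.08, "GBP": 1.27}

def daily_cash_flow(transactions):
    """Aggregate transactions into daily USD inflows/outflows.

    Each transaction is a dict with 'date' (ISO string, possibly with a
    timestamp), 'amount' (positive = inflow, negative = outflow), and
    'currency'. Mirrors the GROUP BY a consolidated cash flow view would do.
    """
    totals = defaultdict(lambda: {"inflow": 0.0, "outflow": 0.0})
    for tx in transactions:
        # Date alignment: truncate any timestamp to the calendar day.
        day = datetime.fromisoformat(tx["date"]).date().isoformat()
        # Currency conversion into the single reporting currency.
        usd = tx["amount"] * FX_TO_USD[tx["currency"]]
        if usd >= 0:
            totals[day]["inflow"] += usd
        else:
            totals[day]["outflow"] += -usd
    return dict(totals)

txs = [
    {"date": "2024-05-01T09:30:00", "amount": 1000.0, "currency": "USD"},
    {"date": "2024-05-01", "amount": -500.0, "currency": "EUR"},
    {"date": "2024-05-02", "amount": 200.0, "currency": "GBP"},
]
print(daily_cash_flow(txs))
```

In BigQuery the equivalent would be a `DATE(timestamp)` truncation joined against a rates table, grouped by day, exposed as a materialized view for the EPM export step.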
Seniority level: Mid-Senior level
Employment type: Contract
Job function: Engineering and Information Technology
Industries: IT Services and IT Consulting