Canopy One Solutions Inc
GCP Data Engineer @ Sunnyvale, CA (local California candidates only)
Canopy One Solutions Inc, Sunnyvale, California, United States, 94087
Job Role: Data Engineer - GCP
Location: Sunnyvale, CA (Onsite)
Job Type: Contract
Duration: Long Term

Responsibilities
Well versed in Hadoop, Spark, cloud platforms, Python/PySpark, Java, streaming, Kafka, and backend development
You have a proven track record coding in at least one programming language (e.g., Scala, Python)
You're experienced with at least one cloud computing platform (e.g., GCP, Azure)
You're skilled in data modeling and data migration protocols
Experience with integration tools such as Automic and Airflow
Experience building highly scalable Big Data solutions and ETL ecosystems
Knowledge of Databricks is an added advantage
Hands-on knowledge of NoSQL databases such as Cosmos DB, along with RDBMS such as MySQL and Postgres, is a plus
Increase the efficiency of the team by establishing sound processes for software development, requirement intake, and effort estimation
Demonstrate creative and critical thinking and troubleshooting skills

Qualifications
Experience with GCP, data warehousing, and BI preferred
Hands-on working experience with a messaging platform such as Kafka preferred

Greetings from Canopy One Solutions. We hope your day is treating you well! We are hiring immediately for the above position. If you believe your skills and expertise make you a good fit for this opportunity, please share your updated resume along with your contact details. We would be happy to discuss the role with you in detail.

Company Information:
Canopy One Solutions delivers cost-effective software solutions, IT resources, and consulting services in Application Integration, Data Migration, and Data Warehousing & Business Intelligence for the Financial, Telecom, and Health Care industry verticals.