HCL Global Systems
Big Data Cloud Architect
HCL Global Systems, Atlanta, Georgia, United States, 30383
Big Data Cloud Architect to work with one of the leading healthcare providers in the US. The ideal candidate should have a strong background in the healthcare business.

Responsibilities

Provide architecture and hands-on support for the activities below:
- Lead data engineers and analysts to deliver data sets and analysis results per business requirements.
- Assemble large, complex data sets that meet functional and non-functional business requirements.
- Automate manual processes, optimize data delivery, and recommend platform improvements for greater scalability.
- Collaborate with initiative leads to optimize and enhance new capabilities.
- Mentor the team in migrating on-prem Hadoop to the AWS cloud and Snowflake.
- Create and maintain optimal data pipeline architecture.
- Present analysis results and recommendations using PowerPoint.

Requirements

Mandatory skills:
- Hands-on experience migrating on-prem Hadoop to a cloud platform (AWS, S3, Snowflake)
- Experience analyzing data using big-data platforms: AWS, Snowflake, Spark, Python, Scala
- Strong analytical skills in relating multiple data sets and identifying patterns
- Hands-on experience writing advanced SQL queries; familiarity with a variety of databases
- Experience building and optimizing big-data pipelines, architectures, and data sets
- Experience with Hadoop file formats such as ORC, Avro, Parquet, and CSV
- Experience with NoSQL databases such as MongoDB
- Ability to visualize data sets using Tableau or Power BI

Experience:
- 12+ years of IT experience with strong expertise in SDLC/Agile
- 6+ years in any programming language (Python, Scala, Spark)
- 3+ years of experience with cloud big-data platforms (AWS and Snowflake)
- 5+ years of experience providing solutions for business intelligence and advanced data analytics
- 3+ years on healthcare IT projects

Desired/Preferred:
- Scheduling tools such as Control-M and Oozie
- NoSQL databases such as MongoDB
- Implementing Python flex APIs to share data insights with digital systems