Tata Consultancy Services

Developer

Tata Consultancy Services, Cary, North Carolina, United States, 27518


Azure Data Engineer

•Collect, store, process, and analyze large datasets to build and implement extract, transform, load (ETL) processes.
•Develop reusable frameworks that reduce development effort, ensuring cost savings for projects.
•Develop quality code with well-considered performance optimizations in place at the development stage.
•Show an appetite for learning new technologies and a readiness to work on cutting-edge cloud technologies.
•Work with a team spread across the globe to drive project delivery and recommend development and performance improvements.
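For illustration only, the extract/transform/load flow named above can be sketched with the Python standard library; the table name, column names, and sample data below are hypothetical, not part of the role description:

```python
import csv
import io
import sqlite3

# Hypothetical raw input: one row is missing an amount and should be dropped.
RAW_CSV = """order_id,amount,region
1,100.50,east
2,,west
3,25.00,east
"""

def extract(text):
    """Extract: parse CSV text into a list of dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: drop rows with no amount, cast types, normalize casing."""
    out = []
    for r in rows:
        if not r["amount"]:
            continue
        out.append((int(r["order_id"]), float(r["amount"]), r["region"].upper()))
    return out

def load(rows, conn):
    """Load: insert the cleaned rows into a SQLite table, return row count."""
    conn.execute("CREATE TABLE orders (order_id INT, amount REAL, region TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
```

In practice this same extract → transform → load shape would be expressed with Spark or Azure Data Factory rather than the standard library; the sketch only shows the structure of the process.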

Essential Business Experience and Technical Skills:

•Building and implementing data ingestion and curation processes using big data tools such as Spark (Scala/Python/Java), Hive, HDFS, Sqoop, Kafka, Kerberos, Impala, etc., and CDP 7.x.
•Ingesting large volumes of data from various platforms for analytics needs and writing high-performance, reliable, and maintainable ETL code.
•Strong analytical skills for working with unstructured datasets.
•Strong experience building/designing data warehouses and data stores for analytics consumption, on premises and in the cloud (real-time as well as batch use cases).
•Ability to interact with business analysts and functional analysts to gather requirements and implement ETL solutions.

Required:

•8+ years of solution development and delivery experience, with 5+ years of recent experience in data engineering.
•Proficiency and extensive experience with Spark and Scala/Python, including performance tuning.
•Hive database management and performance tuning (partitioning/bucketing).
•Strong SQL knowledge and data analysis skills for data anomaly detection and data quality assurance.
•Strong problem-solving skills and excellent communication skills, both written and verbal.
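As a purely illustrative sketch of the data-anomaly-detection skill listed above, a simple z-score filter flags values far from the mean; the function name, threshold, and sample data are made up for this example:

```python
from statistics import mean, stdev

def zscore_anomalies(values, threshold=2.0):
    """Return values more than `threshold` standard deviations from the mean."""
    mu, sigma = mean(values), stdev(values)
    if sigma == 0:
        return []
    return [v for v in values if abs(v - mu) / sigma > threshold]

# Hypothetical daily record counts; the last value looks like a load error.
daily_counts = [1000, 1012, 998, 1005, 990, 1003, 50000]
outliers = zscore_anomalies(daily_counts)
```

Real data quality assurance on warehouse tables would typically express a check like this in SQL or a framework job, but the statistical idea is the same.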

Preferred:

•Expertise in Python and experience writing Azure Functions using Python/Node.js.
•Experience using Event Hubs for data integrations.
•Hands-on expertise implementing analytical data stores on the Azure platform using ADLS, Azure Data Factory, Databricks, and Cosmos DB (Mongo/Graph API).
•Experience ingesting data with Azure Data Factory and building complex ETL with Databricks.
•Eagerness to learn new technologies on the fly and ship to production.

Salary Range: $90,000-$120,000 a year

#LI-CO1