Mindlance

ETL Developer - Expert (11 yrs.)

Mindlance, Roanoke, Texas, United States, 76299


Duration: 0-6 month(s)

Description/Comment:

Our Opportunity: Do you want to be part of an enterprise data solutions team managing over 4 petabytes of data and building the next-generation analytics platform for a leading financial firm with over a trillion in assets under management? At the client, the Data & Rep Technology (DaRT) organization owns the strategy, implementation, delivery, and support of the enterprise data warehouse and emerging data platforms. We are looking for someone who has a passion for data and comes with a data engineering background. Someone who has experience designing and coding batch and real-time ETL (and ELT) and wants to be part of the Dev Engineering team that is actively designing and implementing the Enterprise Data solution frameworks. Someone who wants to be challenged every day, has a passion for keeping up to date on new technologies in the Data Engineering space, sets new standards for the client's hundreds of ETL developers, and collaborates with team members along the way.

What you'll do:
- You will be a Sr. Data Engineer on a horizontal Dev Engineering team that includes onshore and offshore developers using best-in-class Google Cloud, Big Data, and relational data warehouse technologies, including Informatica, BigQuery, IICS, Talend, Teradata, Python, etc.
- You will prototype data solutions to enable faster access to data for analytics use case developers.
- You will develop reusable data solution patterns to enable quick-to-market data assets.
- You will analyze and profile business data in relational and Big Data environments.
- You'll have the opportunity to grow in responsibility, work on exciting and challenging projects, train on emerging technologies, and help set the future of the Data Solutions Delivery teams.
- Manage day-to-day reusable framework development activities for new data solutions and troubleshoot existing solutions.
- Partner with product owners and directors to lead technical discussions and resolve technical issues.
- Apply best practices of data integration for data quality and automation.
- Partner with product vendors to identify and manage open product issues.
- Solve complex data integration problems.
- Work with project development teams and technology partners to develop high-level designs and cost and effort estimates for new framework development efforts.
- Architect, design, and develop solutions and provide supporting documentation.
- Develop and maintain code for data ingestion and curation using Informatica IICS, Talend, Spark, Kafka, etc.

Additional Job Details:

What MUST you have?
- Minimum 10 years of hands-on development experience using parallel-processing databases like Teradata.
- Mandatory: 3 years' experience with cloud technologies like AWS, and preferably BigQuery and Dataproc on Google Cloud Platform, using GCP-native ETL solutions or building custom ETL/ELT solutions in Python.
- Preferably expert-level hands-on development experience using Informatica IICS.
- Experience with data streaming technologies like Kafka.
- Experience with all aspects of data systems, including database design, ETL, aggregation strategy, and performance optimization.
- Experience setting best practices for building and designing ETL code, and strong SQL experience to develop, tune, and debug complex SQL applications.
- Expertise in schema design, developing data models, and a proven ability to work with complex data.
- Hands-on experience with programming languages Java/Python/Spark.
- Hands-on experience with Linux and shell scripting.
- Hands-on experience with CI/CD tools like Bamboo, Jenkins, Bitbucket, etc.