RIT Solutions, Inc.

ETL Developer (Sr. SSIS Developer)

RIT Solutions, Inc., Schiller Park, Illinois, United States, 60176


Title: ETL Developer (Sr. SSIS Developer)

Location: The role is fully remote, with the exception of needing to go to either the Beloit, WI office or the Chicago, IL office once a year.

Recruiter Notes: Ideally, we would find candidates who are within a 2-3 hour drive and are OK with going to the office 1-2 times a year. If we find strong candidates who need to be 100% remote, we can consider that as well. Any work authorization is fine as long as they are on your W2. It is a Sr. SSIS Developer role, so strong experience with SSIS, T-SQL, and SQL Server is needed. The role supports a multi-year ERP implementation. They are implementing Agility, but Agility experience is not required; candidates do need ERP experience, and it doesn't matter whether that is with SAP, PeopleSoft, Workday, Oracle ERP, or any other ERP system. Let me know what you come up with.

Q: What skills are required and what skills are desired?
A:
Work with the team to help the business define the technical and business requirements
SSIS
T-SQL and scripting
Not necessarily senior, but lead experience is good
Relational database understanding
ERP experience does not need to be with Agility; they can have Oracle, SAP, etc., as long as they can write queries in T-SQL

Q: Can you tell me about the project they will be on?
A: This has been a multi-year POS/ERP implementation from a legacy ERP/POS system to DMSi Agility (https://www.dmsi.com/products-services/agility/). This is a complex, enterprise-level initiative with various work streams and a number of stakeholders.

Agility ETL Developer Job Description

Position: ETL Developer (Data Conversions, SSIS, SQL Server, Python, ERP Expertise)

We're seeking a dynamic and detail-oriented ETL Developer with hands-on experience in large-scale data conversion projects. This role requires a proactive professional who has led or contributed to two or more recent data migration initiatives, particularly involving Db2 and IBM mainframe systems transitioning into MS SQL Server. The ideal candidate will also have strong ERP expertise, with a comprehensive understanding of ERP data models and concepts, and the ability to efficiently migrate data into a custom ERP system.

Requirements:
Data Conversion Expertise: Must have led or been significantly involved in data conversion projects in the last two roles, specifically migrating data from Teradata and IBM mainframe systems into SQL Server.
SSIS & SQL Server Mastery: Advanced experience in developing and optimizing SSIS packages, alongside proficiency in MS SQL Server with a focus on complex SQL query writing and database performance tuning. Minimum 8-10 years of SSIS, ETL, and SQL development. Knowledge of MS SQL Server 2012 or later. Experience with SSRS, SSIS, and T-SQL, including developing SSIS packages.
Python Proficiency: Strong Python skills to automate ETL processes and handle intricate data transformations.
Data Mapping & Modeling: Demonstrated experience in conducting detailed data mapping and creating precise data models for migration. Strong data analysis and data migration script creation experience.
Data Flow Diagrams: Ability to craft clear and effective data flow diagrams to illustrate migration and transformation processes.
Technical Documentation: Proven track record of writing clear, comprehensive technical design documents that communicate complex technical solutions to both technical and non-technical audiences.
ERP Data Expertise: Extensive experience working with ERP data models, with a deep understanding of core ERP entities like customers, suppliers, items, and transactions, and the ability to load data into custom ERP systems.
Collaboration & Communication: Exceptional communication skills with the ability to work directly with business users, architects, and cross-functional teams, ensuring alignment and understanding of data solutions.

Responsibilities:
Design and implement SSIS and ETL solutions based on business requirements.
Strong problem-solving and analytical skills.
Able to understand business requirements and develop SQL Server solutions that work within the technical boundaries.
Conduct comprehensive data mapping, data analysis, and data modeling to support ETL processes.
Use Python scripting to automate data transformation processes and streamline ETL workflows (a brief illustrative sketch of this kind of automation appears at the end of this posting).
Write SQL and T-SQL scripts/statements to analyze and translate legacy ERP data into a new ERP platform.
Collaborate closely with business stakeholders, architects, and development teams to translate complex business requirements into robust data engineering solutions.
Create, debug, and execute T-SQL scripts and stored procedures that match data mapping specifications.
Create and maintain detailed data flow diagrams showcasing the logical flow of data throughout the migration process.
Present insights and technical details to both technical and non-technical teams, ensuring transparent communication across all levels.
Perform quality assurance and database validation on all test and live conversions.
Leverage a strong understanding of ERP concepts to ensure seamless data interpretation and migration strategies.
Work with ERP data models, including customers, suppliers, items, and open transactions, ensuring data is accurately migrated into custom ERP systems.
Ability to learn on the fly: quickly and effectively integrates new information and skills to enhance personal performance or the performance of the organization; learns from successes and failures, regarding all experience as an opportunity to learn and improve.
Creates jobs for batch and real-time processing of data from internal and external sources.
Supports software applications/operating systems through data research and debugging.
Participates in the testing process through test review and analysis, test witnessing, and certification of data.
Able to communicate with application technical leads and business users.
Ability to roll up sleeves, dig in, and figure out complex problems.

Additional Skills (Good to Have):
NiFi Experience: Knowledge of NiFi for building efficient data flows and processing pipelines.
Apache Airflow: Familiarity with Apache Airflow for orchestrating complex ETL workflows and managing data pipelines (a minimal DAG sketch appears at the end of this posting).
Splunk: Experience with Splunk for monitoring, searching, and analyzing large volumes of data.
Apache Spark: Familiarity with Apache Spark for large-scale data processing and distributed computing.
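
To give candidates a concrete, purely illustrative sense of the Python-driven ETL automation and legacy-to-SQL-Server data translation described above, here is a minimal sketch of a staging load. It assumes pyodbc and ODBC DSNs for the source and target; every connection string, schema, table, and column name is a hypothetical placeholder, not part of the actual project.

```python
# Minimal, illustrative ETL sketch (not project code): copy legacy ERP
# customer rows into a SQL Server staging table using pyodbc.
# All connection strings, schemas, tables, and columns are hypothetical placeholders.
import pyodbc

LEGACY_DSN = "DSN=legacy_erp;UID=etl_user;PWD=***"         # placeholder source connection
TARGET_DSN = "DSN=sqlserver_staging;UID=etl_user;PWD=***"  # placeholder target connection

EXTRACT_SQL = "SELECT cust_id, cust_name, credit_limit FROM legacy.customers"
LOAD_SQL = (
    "INSERT INTO staging.customers (legacy_cust_id, cust_name, credit_limit) "
    "VALUES (?, ?, ?)"
)

def transform(row):
    """Apply simple mapping rules: trim names, default missing credit limits to 0."""
    cust_id, name, credit_limit = row
    return cust_id, (name or "").strip(), credit_limit if credit_limit is not None else 0

def run():
    with pyodbc.connect(LEGACY_DSN) as src, pyodbc.connect(TARGET_DSN) as dst:
        src_cur = src.cursor()
        dst_cur = dst.cursor()
        dst_cur.fast_executemany = True  # speeds up bulk inserts into SQL Server
        rows = [transform(r) for r in src_cur.execute(EXTRACT_SQL)]
        dst_cur.executemany(LOAD_SQL, rows)
        dst.commit()
        print(f"Loaded {len(rows)} customer rows into staging.")

if __name__ == "__main__":
    run()
```

In practice, a step like this would typically live inside (or be invoked from) an SSIS package or a scheduled job; the Python form is shown only because the posting calls out Python-based ETL automation.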
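
Likewise, for the good-to-have Apache Airflow skill, the following is a minimal DAG sketch (assuming Airflow 2.4 or later) showing how a daily run of an ETL step like the one above might be orchestrated. The DAG id, schedule, and task callable are placeholders for illustration only.

```python
# Minimal, illustrative Apache Airflow sketch: a daily DAG with one task that
# would call an ETL routine such as the staging load above.
# The DAG id, schedule, and callable are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def load_customers_to_staging():
    # Placeholder: in a real pipeline this would import and call the ETL
    # routine, e.g. the run() function from the sketch above.
    print("Running customer staging load...")

with DAG(
    dag_id="legacy_erp_to_staging",  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # the `schedule` argument assumes Airflow 2.4+
    catchup=False,
) as dag:
    PythonOperator(
        task_id="load_customers",
        python_callable=load_customers_to_staging,
    )
```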