Fidelity Investments
Principal Data Engineer (Raleigh, NC)
Qualifications
Or, alternatively, a Master's degree (or foreign education equivalent) in Computer Science, Engineering, Information Technology, Information Systems, Mathematics, Physics, or a closely related field and three (3) years of experience as a Principal Data Engineer (or closely related occupation) designing, implementing, and developing applications in Sybase, SQL Server, Oracle, Informatica, Python, and Azure.
Candidate must also possess:
Demonstrated Expertise ("DE") performing data mining, data modeling, and writing and tuning SQL queries; developing stored procedures using Sybase, SQL Server, and Oracle; developing Python and shell scripts; and employing release management using git-stash, Jenkins, and uDeploy.
DE developing workflows in Informatica PowerCenter; automating batch processing using Control-M; developing pipelines to process and output data in Excel, XML, and fixed-width file formats; converting legacy Sybase and SQR code to SQL MI on Azure; and performing Informatica upgrades.
DE developing code in Oracle for Siebel plan configurations; developing valuation files with 401(k), pension, and earnings-related data using SQL Server and Sybase procedures; developing reports using Hyperion SQR; creating database and system designs and flowcharts using MS Visio; and migrating database objects to SQL MI on Azure using Microsoft SSMA.
DE developing Python modules and RESTful APIs for data exchange; building and orchestrating Extract, Transform, and Load (ETL) data pipelines using Python, SQL procedures, and shell scripts; developing Jasper templates for generating PDFs; and developing code within security standards using HashiCorp Vault.
Responsibilities
Develops database modules in Microsoft SQL Server, Sybase, and Oracle.
Develops Extract, Transform, and Load (ETL) and Extract, Load, and Transform (ELT) processes and pipelines to move data to and from the Snowflake data store.
Develops workflows using Informatica PowerCenter to create reportable data and builds reports using Business Intelligence tools: Hyperion SQR and Power BI.
Develops data solutions on Azure cloud using cloud-native services.
Demonstrates experience in building data transformation and data processing solutions using Python.
Designs and develops valuation reports and client-based reporting with pension and earnings data.
Develops and performance-tunes SQL routines to mine data and validate the accuracy of pension calculations.
Develops shell scripts for batch processing and automates jobs using Control-M.
Builds and implements automation using frameworks: git-stash, SourceTree, Jenkins, JFrog Artifactory, and uDeploy.
Delivers strategic technology solutions with efficiency and scale.
Mentors the team and new team members, and outlines coding standards and best practices.
Develops comprehensive documentation for multiple applications or subsystems.
Confers with software architects, systems analysts, and other software engineers/developers to design systems.
Modernizes legacy applications using cloud solutions.
Identifies technology trends in the cloud space and assists in adopting new solutions offered by cloud service providers.
Implements MS Azure or other cloud-based services for data management.
Obtains information on project limitations and capabilities, performance requirements, and interfaces.
Develops and oversees software system testing and validation procedures, programming, and documentation.