Cynet Systems
Cloud SR ETL Developer
Cynet Systems, Richmond, Virginia, United States, 23214
Job Description:

Responsibilities:
- Designs and develops systems for the maintenance of the Data Asset Program (Data Hub), ETL processes, and business intelligence.
- Develop a new data engineering process that leverages a new cloud architecture, and extend or migrate existing data pipelines to this architecture as needed.
- Design and support the data warehouse database and table schemas for new and existing data sources for the data hub and warehouse.
- Design and develop Data Marts.
- Work closely with data analysts, data scientists, and other data consumers within the business to gather requirements and populate data hub and data warehouse table structures optimized for reporting.
- Partner with the data modeler and data architect to refine the business's data requirements that must be met for building and maintaining Data Assets.
- Understanding of Agile methodologies and processes.

Skills:
- Advanced understanding of data integrations.
- Strong knowledge of database architectures.
- Strong analytical and problem-solving skills.
- Ability to build strong relationships both internally and externally.
- Ability to negotiate and resolve conflicts.
- Ability to effectively prioritize and handle multiple tasks and projects.
- Strong written and verbal communication skills.
- Desire to learn, innovate, and evolve technology.

Computer Skills/MS Office/Software:
- Excellent computer skills and high proficiency with MS Word, PowerPoint, MS Excel, MS Project, MS Visio, and MS Team Foundation Server, used to create visually and verbally engaging ETL and data designs and tables, and to communicate documentation and reporting.
- Deep passion for data analytics technologies, as well as analytical and dimensional modeling.
- Extensive familiarity with ETL (Extraction, Transformation, and Load), data warehousing, and business intelligence tools such as Business Objects, Power BI, and Tableau.
- Strong knowledge of database design and modeling in the context of data warehousing.
- Experience with key data warehousing architectures, including Kimball and Inmon, and broad experience designing solutions using a wide set of data stores (e.g., HDFS, Azure Data Lake Store, Azure Blob Storage, Azure SQL Data Warehouse, Azure Cosmos DB).

Technologies:
- Data Factory v2, Data Lake Store, Data Lake Analytics, Azure Analysis Services, Azure Synapse.
- IBM DataStage, Erwin, SQL Server (SSIS, SSRS, SSAS), Oracle, T-SQL, Azure SQL Database, Azure SQL Data Warehouse.
- Operating system environments (Windows, Unix, etc.).
- Scripting experience with Windows, Python, and Linux shell scripting (see the illustrative sketch below).
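For illustration only, the short Python sketch below shows the kind of extract-transform-load and dimensional-modeling work this role describes: raw rows are read from a source file, split into a dimension and a fact table, and loaded into a reporting database. All specifics here (the sales_raw.csv source, the warehouse.db target, and the dim_product/fact_sales table names) are hypothetical examples and are not part of the posting; in practice this work would typically be done with the listed tooling (Azure Data Factory, SSIS, IBM DataStage, T-SQL) rather than a standalone script.

"""
Minimal ETL sketch: extract raw sales data, transform it into a
star-schema-style dimension and fact table, and load it for reporting.
All file, table, and column names are hypothetical.
"""
import csv
import sqlite3


def extract(csv_path):
    """Extract: read raw sales rows from a CSV source file."""
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))


def transform(rows):
    """Transform: split raw rows into a product dimension and a sales fact."""
    products = {}   # natural key (product name) -> surrogate key
    facts = []
    for row in rows:
        name = row["product_name"]
        if name not in products:
            products[name] = len(products) + 1   # assign surrogate key
        facts.append((products[name], row["sale_date"], float(row["amount"])))
    dims = [(sk, name) for name, sk in products.items()]
    return dims, facts


def load(dims, facts, db_path="warehouse.db"):
    """Load: write the dimension and fact tables to the reporting database."""
    con = sqlite3.connect(db_path)
    cur = con.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS dim_product "
                "(product_sk INTEGER PRIMARY KEY, product_name TEXT)")
    cur.execute("CREATE TABLE IF NOT EXISTS fact_sales "
                "(product_sk INTEGER, sale_date TEXT, amount REAL)")
    cur.executemany("INSERT INTO dim_product VALUES (?, ?)", dims)
    cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)", facts)
    con.commit()
    con.close()


if __name__ == "__main__":
    dims, facts = transform(extract("sales_raw.csv"))   # hypothetical source file
    load(dims, facts)

The same pattern (separate extract, transform, and load steps feeding conformed dimensions and fact tables) carries over directly to the Kimball-style warehouse and data mart design work named above, regardless of which of the listed ETL tools implements it.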