Saxon Global

Data System Engineer

Saxon Global, Alpharetta, Georgia, United States, 30239


Local candidates only; must send documentation with name/address

Hybrid: 3 days a week onsite
Potential to convert

The Data System Engineer will be responsible for tasks such as data engineering, data modeling, ETL processes, data warehousing, and data analytics & science. Our platform runs both on-premises and in the cloud (AWS/Azure).

Knowledge/Skills:

- Able to establish, modify, or maintain data structures and associated components according to design
- Understands and documents business data requirements
- Able to produce conceptual and logical data models at the enterprise and business unit/domain level
- Understands XML/JSON and schema development/reuse, database concepts, database design, open-source and NoSQL concepts
- Partners with Sr. Data Engineers and Sr. Data Architects to create platform-level data models and database designs
- Takes part in reviews of own work and reviews of colleagues' work
- Has working knowledge of the core tools used in planning, analyzing, designing, building, testing, configuring, and maintaining assigned application(s)
- Able to participate in the assigned team's software delivery methodology (Agile, Scrum, Test-Driven Development, Waterfall, etc.) in support of data engineering pipeline development
- Understands infrastructure technologies and components such as servers, databases, and networking concepts
- Writes code to develop, maintain, and optimize batch and event-driven pipelines for storing, managing, and analyzing large volumes of both structured and unstructured data (see the sketch after this list)
- Integrates metadata in data pipelines
- Automates build and deployment processes using Jenkins across all environments to enable faster, high-quality releases
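For illustration only, the following is a minimal sketch of the kind of batch pipeline work described above, assuming PySpark; the file paths, column names, and metadata fields are hypothetical and not part of this posting.

# Minimal illustrative sketch (assumptions: PySpark, example S3 paths, hypothetical columns)
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-batch-pipeline").getOrCreate()

# Ingest semi-structured JSON input (schema inferred here for brevity)
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Basic cleansing/transformation step
cleaned = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Attach pipeline metadata so downstream consumers can trace lineage
enriched = (
    cleaned.withColumn("pipeline_run_ts", F.current_timestamp())
           .withColumn("source_system", F.lit("orders-service"))
)

# Persist as partitioned Parquet for warehouse/analytics consumption
enriched.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/orders/"
)

spark.stop()

An event-driven variant of the same flow would typically swap the batch read for a streaming source such as Spark Structured Streaming.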

Qualification:

Up to 4 years of software development experience in a professional environment and/or comparable experience, such as:

- Understanding of Agile or other rapid application development methods
- Exposure to design and development across one or more database management systems (DB2, Sybase IQ, Snowflake) as appropriate
- Exposure to methods relating to application and database design, development, and automated testing
- Understanding of big data technology and NoSQL design and development with a variety of data stores (document, column family, graph, etc.)
- General knowledge of distributed (multi-tiered) systems, algorithms, and relational & non-relational databases
- Experience with Linux and Python scripting as well as large-scale data processing technology such as Spark
- Experience with cloud technologies such as AWS and Azure, including deployment, management, and optimization of data analytics & science pipelines
- Nice to have: Collibra, Terraform, Java, Golang, Ruby, Machine Learning Operations (MLOps) deployment
- Bachelor's degree in computer science, computer science engineering, or a related field required