TekWissen

Data Engineer IV

TekWissen, Seattle, Washington, US, 98127


Overview:

TekWissen Group is a workforce management provider operating throughout the USA and in many other countries around the world. Our client is a company operating a marketplace for consumers, sellers, and content creators. It offers merchandise and content purchased for resale from vendors, as well as items offered by third-party sellers.

Job Title: Data Engineer IV

Location: Seattle, WA 98121

Duration: 12 Months

Job Type: Contract

Work Type: Onsite

Job Description:

AWS Infrastructure and Supply Chain Finance is seeking a Data Engineer to join the team that is building data infrastructure to support analytical and operational finance requirements. The team is committed to building the next-generation data platform that will support the Client's rapidly growing and dynamic businesses, and to using it to deliver the analytical and operational applications that have an immediate influence on day-to-day decision making. Our platform serves the Client's finance, supply chain, and accounting functions across the globe. You will primarily use, but not be limited to, AWS solution stacks such as Redshift, S3, Lambda, SNS/SQS, Lake Formation, Client Simple EDI (SEDI), SFTP, and data pipelines (AWS Glue/Hammerstone), along with reporting tools such as Tableau and Alteryx, to implement solutions.

As a Data Engineer, you should be an expert in data warehousing technical components (e.g., data modeling, ETL, and reporting), infrastructure (e.g., hardware and software), and their integration. You should have a deep understanding of the architecture of enterprise-level data warehouse solutions built on multiple platforms (RDBMS, columnar, cloud). You should be an expert in the design, creation, management, and business use of large datasets. The candidate is expected to build efficient, flexible, extensible, and scalable ETL and reporting solutions. Our ideal candidate thrives in a fast-paced environment, relishes working with large transactional data volumes, enjoys the challenge of highly complex business contexts (that are typically being defined in real time), and, above all, is passionate about data and analytics.

Responsibilities:

As a Data Engineer, you will be working in one of the largest and most complex data warehouse environments.

- Design, implement, and support a platform providing secured access to large datasets.
- Build data pipelines, sourcing data from internal/external ERP systems such as Coupa and SAP.
- Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
- Tune application and query performance using profiling tools and SQL.
- Analyze and solve problems at their root, stepping back to understand the broader context.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
- Develop and support the analytic technologies that give our customers timely, flexible, and structured access to their data.

Required Skills & Experience:

- 7+ years of related experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with SQL
- Experience in at least one modern scripting or programming language, such as Python, Java, or Spark
- Experience with AWS technologies such as Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases / data stores (object storage, document or key-value stores, graph databases, column-family databases)
- Experience as a data engineer or in a related specialty, with a track record of manipulating, processing, and extracting value from large datasets
- Knowledge of SFTP and file storage systems
- Knowledge of professional software engineering best practices for the full software development life cycle, including coding standards, software architectures, code reviews, source control management, continuous deployments, testing, and operational excellence

Preferred:

- Prior knowledge of Coupa modules and their backend mechanisms is a plus
- Experience with data transfers using SFTP servers and EDI is desired
- Experience with and understanding of transportation/freight data

Candidate Requirements:

REQUIRED SKILLS:

- SQL, data modeling, and Python
- AWS cloud technologies: S3, Glue, SNS/SQS, Lambda
- Building ETL pipelines and data infrastructure
- Experience designing solutions to import/export data through internal/external systems

Years of Experience:

8-10

Degree or Certification:

Bachelor's degree in computer science and software engineering

Top 3 must-have hard skills:

- SQL and Python
- AWS technologies
- Data pipelines

TekWissen® Group is an equal opportunity employer supporting workforce diversity.