Intelliswift

Data Engineer IV

Intelliswift, Seattle, Washington, us, 98127


Pay rate range: $80/hr. to $83/hr. on W2 | 100% Onsite

Must Have:
- SQL and Python
- AWS Technologies
- Data Pipelines

REQUIRED SKILLS:
- SQL, data modeling, and Python
- AWS cloud technologies: S3, Glue, SNS/SQS, Lambda
- Building ETL pipelines and data infrastructure
- Experience designing solutions to import/export data through internal/external systems

Years of Experience:

8 - 10

Degree or Certification: Bachelor's degree in computer science or software engineering

Job Description:
We are seeking a Data Engineer to join the team that is building data infrastructure to support analytical and operational finance requirements. The team is committed to building the next-generation data platform that will support rapidly growing and dynamic businesses, and to using it to deliver the analytical and operational applications that have an immediate influence on day-to-day decision-making. Our platform serves finance, supply chain, and accounting functions across the globe. You will primarily use AWS solution stacks such as Redshift, S3, Lambda, SNS/SQS, Lake Formation, Simple EDI (SEDI), SFTP, and data pipelines (AWS Glue/Hammerstone), as well as reporting tools such as Tableau and Alteryx, to implement solutions.

As a Data Engineer, you should be an expert in data warehousing technical components (e.g., data modeling, ETL, and reporting), infrastructure (e.g., hardware and software), and their integration. You should have a deep understanding of the architecture of enterprise-level data warehouse solutions spanning multiple platforms (RDBMS, columnar, cloud). You should be an expert in the design, creation, management, and business use of large datasets, and you are expected to build efficient, flexible, extensible, and scalable ETL and reporting solutions. Our ideal candidate thrives in a fast-paced environment, relishes working with large transactional data volumes, enjoys the challenge of highly complex business contexts (often defined in real time), and, above all, is passionate about data and analytics.

Responsibilities:
- Work as a Data Engineer in one of the largest and most complex data warehouse environments.
- Design, implement, and support a platform providing secured access to large datasets.
- Build data pipelines and source data from internal/external ERP systems such as Coupa and SAP.
- Own the design, development, and maintenance of ongoing metrics, reports, analyses, dashboards, etc. to drive key business decisions.
- Tune application and query performance using profiling tools and SQL.
- Analyze and solve problems at their root, stepping back to understand the broader context.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for datasets.
- Recognize and adopt best practices in reporting and analysis: data integrity, test design, analysis, validation, and documentation.
- Develop and support the analytic technologies that give our customers timely, flexible, and structured access to their data.

Required Skills & Experience:
- 7+ years of related experience
- Experience with data modeling, warehousing, and building ETL pipelines
- Experience with SQL
- Experience in at least one modern scripting or programming language, such as Python, Java, or Spark
- Experience with AWS technologies such as Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions
- Experience with non-relational databases/data stores (object storage, document or key-value stores, graph databases, column-family databases)
- Experience as a data engineer or in a related specialty, with a track record of manipulating, processing, and extracting value from large datasets
- Knowledge of SFTP and file storage systems
- Knowledge of professional software engineering best practices across the full software development life cycle, including coding standards, software architecture, code reviews, source control management, continuous deployment, testing, and operational excellence

Preferred:
- Prior knowledge of Coupa modules and their backend mechanisms is a plus
- Experience with data transfers using SFTP servers and EDI is desired
- Experience with and understanding of transportation/freight data