Data Integration Specialist
Seneca Resources, Philadelphia, PA, United States
A government agency located in Philadelphia, PA is looking for a Data Integration Specialist for a 6-month role. This is a hybrid position with some onsite work required in Philadelphia.
Position overview / Statement of Work
We are seeking a skilled and experienced Data Integration Specialist to join our dynamic team on a temporary assignment. The role is responsible for the migration, integration, and maintenance of data within Amazon S3 and Amazon Redshift environments, and for ensuring data integrity and security throughout the entire process, from initial migration to ongoing data operations. This position requires 3 or more years of experience working as a Data Integration Specialist.
The pay rate range for this role is $65 - $75.85/hr
Work Mode: Hybrid, minimum 3 days onsite per week
Work activities:
• Design and implement data pipelines to extract, transform, and load (ETL and/or ELT) data into Amazon S3 using Azure Data Factory and other tools
• Set up automated workflows to ensure continuous data migration
• Configure Amazon S3 buckets for data storage
• Set up data schemas and tables in Amazon Redshift
• Migrate cleaned and preprocessed data into Amazon S3 and Redshift (see the illustrative sketch after this list)
• Ensure data integrity and security during migration
• Monitor data pipelines for performance and reliability
• Perform regular maintenance to resolve pipeline and data issues as they arise
• Improve data integration processes for increased efficiency and scalability
• Document data integration processes and procedures
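For context on the core S3-to-Redshift work described above, a load step might look like the minimal Python sketch below. This is an illustrative assumption only: the bucket, object key, table, cluster endpoint, and IAM role names are hypothetical placeholders, and in practice this step would typically be orchestrated through Azure Data Factory, AWS Glue, or a similar workflow tool rather than a standalone script.

```python
"""Minimal sketch: stage a cleaned file in S3, then COPY it into Redshift.
All names below (bucket, key, table, cluster, role) are placeholders."""
import boto3
import psycopg2

S3_BUCKET = "example-agency-staging"       # hypothetical bucket
S3_KEY = "cleaned/records_2024.csv"        # hypothetical object key
IAM_ROLE_ARN = "arn:aws:iam::123456789012:role/RedshiftCopyRole"  # placeholder


def upload_to_s3(local_path: str) -> None:
    """Upload a cleaned, preprocessed extract to the S3 staging bucket."""
    s3 = boto3.client("s3")
    s3.upload_file(local_path, S3_BUCKET, S3_KEY)


def copy_into_redshift() -> None:
    """Load the staged file into a Redshift table using a COPY command."""
    conn = psycopg2.connect(
        host="example-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder
        port=5439,
        dbname="analytics",
        user="etl_user",
        password="***",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(
                f"""
                COPY staging.records
                FROM 's3://{S3_BUCKET}/{S3_KEY}'
                IAM_ROLE '{IAM_ROLE_ARN}'
                FORMAT AS CSV
                IGNOREHEADER 1;
                """
            )
        conn.commit()
    finally:
        conn.close()


if __name__ == "__main__":
    upload_to_s3("records_2024.csv")
    copy_into_redshift()
```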
Skills/experience of the assigned staff:
Required
• Experience with multiple cloud platforms (e.g., AWS and Azure)
• Proficient with Azure Data Factory
• Strong knowledge of Amazon S3, Amazon Redshift, and AWS Lambda
• Proficient in SQL and ETL tools
• Effective communication and documentation abilities
• Expertise in data security and integrity measures
• Excellent problem-solving and analytical skills
• 3 or more years of experience creating, documenting, implementing, and maintaining data pipelines
Highly Desired/Preferred
• Experience using AWS Glue
• Proficient with Python, C#, and Spark SQL