ASRC Federal Holding Company
Azure Data Factory (ADF) Engineer Job at ASRC Federal Holding Company in Fairfield, CA
ASRC Federal Holding Company, Fairfield, CA, United States
Job Description
ASRC Federal Broadleaf Division is seeking a qualified Software Developer (Journeyman) to support our Army Analytics Group customer out of Fairfield, CA. This is a 100% REMOTE position.
JOB DUTIES:
- An Azure Data Factory (ADF) Engineer is responsible for designing, implementing, and optimizing data pipelines, ensuring data quality and governance, collaborating with other teams, and managing the security of data processing workflows. The role demands both technical proficiency with ADF and Azure services and a collaborative approach to meeting business data needs efficiently. The duties primarily revolve around designing, implementing, and managing data pipelines and workflows in the Azure cloud environment; the key responsibilities typically associated with the role are:
- Design and Develop Data Pipelines
- Design, build, and manage data pipelines in Azure Data Factory for moving and transforming data across multiple sources and destinations.
- Create end-to-end ETL (Extract, Transform, Load) or ELT (Extract, Load, Transform) workflows to integrate data from on-premises, cloud, and external data sources.
- Configure ADF activities, triggers, and datasets to automate data processes based on business requirements (a minimal pipeline sketch follows this list).
- Data Transformation and Integration
- Use ADF's mapping data flows and wrangling data flows to transform raw data into usable formats and ensure that data is optimized for reporting and analysis.
- Integrate data from multiple sources such as SQL databases, NoSQL databases, APIs, flat files, and external systems into Azure Data Lake, Azure SQL Database, or Azure Synapse Analytics.
- Work with other Azure services like Databricks, Azure Functions, and Logic Apps to perform data processing and integration tasks.
- Data Orchestration and Workflow Automation
- Orchestrate complex workflows that coordinate data processing across different data stores and Azure services.
- Implement triggers, schedules, and control flows to automate and manage data workflows for real-time, batch, and event-driven data integration (a trigger sketch follows this list).
- Set up monitoring and alerting mechanisms to track pipeline health and identify any bottlenecks or failures in the data pipeline (a monitoring sketch follows this list).
- Optimize and Maintain Data Pipelines
- Monitor and optimize performance of data pipelines to ensure efficiency, scalability, and cost-effectiveness.
- Regularly review and refactor pipelines, making adjustments to reduce latency, improve throughput, and minimize Azure costs.
- Troubleshoot pipeline issues and resolve any failures or delays in data processing.
- Data Quality and Governance
- Implement data validation and quality checks to ensure data integrity and accuracy across pipelines (a validation sketch follows this list).
- Enforce data governance policies, including data lineage and metadata management, to comply with business standards and regulatory requirements.
- Document data flows, pipeline configurations, and integration processes for transparency and accountability.
- Collaborate with Cross-Functional Teams
- Work closely with data engineers, data analysts, data scientists, and business stakeholders to understand data requirements and translate them into ADF workflows.
- Collaborate with DevOps teams to deploy data solutions and manage CI/CD pipelines for ADF projects.
- Provide technical support and guidance to other teams involved in data integration, especially around best practices in Azure Data Factory.
- Implement Security and Compliance Measures
- Ensure that data pipelines adhere to security protocols, including data encryption, role-based access control, and data masking as required.
- Work within compliance frameworks such as GDPR, HIPAA, or SOC to secure sensitive data throughout the ETL process.
- Configure ADF to ensure that data access and processing meet organizational security and compliance standards.
- Stay Current with Azure and Data Engineering Trends
- Keep up to date with the latest developments in Azure Data Factory and other Azure services to continuously improve data solutions.
- Explore new features, tools, and technologies in Azure that can enhance data processing, analytics, and storage capabilities.
- Participate in training, workshops, and certifications to maintain expertise in data engineering practices and cloud technology.
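The pipeline authoring and ETL duties above can be illustrated with a minimal sketch using the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, and dataset names below are placeholders, the two blob datasets are assumed to already exist in the factory, and exact model signatures can vary slightly between SDK versions.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink
)

# Placeholder names -- substitute your own subscription, resource group,
# factory, and dataset names.
subscription_id = "<subscription-id>"
rg_name = "<resource-group>"
df_name = "<data-factory-name>"

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)

# Copy activity that moves data between two pre-existing blob datasets.
copy_activity = CopyActivity(
    name="CopyRawToStaging",
    inputs=[DatasetReference(type="DatasetReference", reference_name="blob_in")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="blob_out")],
    source=BlobSource(),
    sink=BlobSink(),
)

# Publish the pipeline and kick off an on-demand run.
pipeline = PipelineResource(activities=[copy_activity])
adf_client.pipelines.create_or_update(rg_name, df_name, "CopyPipeline", pipeline)
run = adf_client.pipelines.create_run(rg_name, df_name, "CopyPipeline", parameters={})
print(f"Started pipeline run {run.run_id}")
```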
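A schedule trigger of the kind referenced under Data Orchestration could be attached to that pipeline roughly as follows. This continues the previous sketch (same adf_client, rg_name, df_name); the 15-minute recurrence and trigger name are illustrative, and recent SDK versions expose begin_start where older ones used start.

```python
from datetime import datetime, timedelta
from azure.mgmt.datafactory.models import (
    TriggerResource, ScheduleTrigger, ScheduleTriggerRecurrence,
    TriggerPipelineReference, PipelineReference,
)

# Run the pipeline every 15 minutes for one day (placeholder window).
recurrence = ScheduleTriggerRecurrence(
    frequency="Minute",
    interval=15,
    start_time=datetime.utcnow(),
    end_time=datetime.utcnow() + timedelta(days=1),
    time_zone="UTC",
)
pipeline_ref = TriggerPipelineReference(
    pipeline_reference=PipelineReference(type="PipelineReference",
                                         reference_name="CopyPipeline"),
    parameters={},
)
trigger = TriggerResource(
    properties=ScheduleTrigger(recurrence=recurrence, pipelines=[pipeline_ref])
)

# Publish and start the trigger (begin_start in recent SDK versions).
adf_client.triggers.create_or_update(rg_name, df_name, "Every15Minutes", trigger)
adf_client.triggers.begin_start(rg_name, df_name, "Every15Minutes").result()
```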
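For the monitoring and troubleshooting duties, the same client can query pipeline and activity runs. The sketch below continues the previous ones (adf_client, rg_name, df_name, and run as defined above); the 24-hour window and failure handling are illustrative only.

```python
from datetime import datetime, timedelta
from azure.mgmt.datafactory.models import RunFilterParameters

# Check the status of a specific run, then list its activity runs
# from roughly the last 24 hours to see where a failure occurred.
pipeline_run = adf_client.pipeline_runs.get(rg_name, df_name, run.run_id)
print(f"Pipeline run status: {pipeline_run.status}")

filters = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(days=1),
    last_updated_before=datetime.utcnow() + timedelta(days=1),
)
activity_runs = adf_client.activity_runs.query_by_pipeline_run(
    rg_name, df_name, run.run_id, filters
)
for activity in activity_runs.value:
    if activity.status == "Failed":
        print(f"{activity.activity_name} failed: {activity.error}")
```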
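The data validation duty is typically implemented outside ADF itself, for example in an Azure Function, Databricks notebook, or custom activity invoked by the pipeline. Below is a hypothetical standalone check using pandas; the column names, file path, and null-rate threshold are made up for illustration.

```python
import pandas as pd

def validate_extract(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality problems found in an extracted batch."""
    problems = []
    if df.empty:
        problems.append("extract is empty")
    # Hypothetical required columns and 1% null-rate threshold.
    for col in ("customer_id", "event_date"):
        if col not in df.columns:
            problems.append(f"missing required column: {col}")
        elif df[col].isna().mean() > 0.01:
            problems.append(f"more than 1% nulls in {col}")
    if df.duplicated().any():
        problems.append("duplicate rows detected")
    return problems

# Example: fail the calling pipeline step if any checks fail.
issues = validate_extract(pd.read_csv("staging/customers.csv"))
if issues:
    raise ValueError("; ".join(issues))
```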
EXPERIENCE:
- Eight (8) years of functional software development experience.
- Experience with issue tracking systems such as Azure Boards and Jira.
- Security+ certified
- Active Secret Security Clearance
- Bachelor's Degree
EEO Statement
ASRC Federal and its Subsidiaries are Equal Opportunity / Affirmative Action employers. All qualified applicants will receive consideration for employment without regard to race, gender, color, age, sexual orientation, gender identification, national origin, religion, marital status, ancestry, citizenship, disability, protected veteran status, or any other factor prohibited by applicable law.