ICF Olson
Senior Data Engineer (AWS) - Remote
ICF Olson, Reston, Virginia, United States, 22090
Senior Data Engineer (AWS) - Remote
*We are open to supporting 100% remote work anywhere within the US*
We are seeking a talented Data Engineer to join our team and play a pivotal role in transforming raw data into actionable insights. As a Data Engineer, you will design, develop, and maintain robust data pipelines and ETL processes, ensuring seamless data flow from various sources to our data warehouse. Your expertise in AWS technologies and BI tools will be instrumental in delivering high-quality data solutions that drive informed decision-making.
Responsibilities:
Data Extraction, Transformation, and Loading (ETL):
Develop and implement efficient ETL pipelines to extract data from diverse legacy systems, transform it to meet business requirements, and load it into our data warehouse or data lake.
Data Security:
Develop security procedures and create systems to keep data secure.
Data Warehousing and Data Lake:
Design and build scalable data warehouses and data lakes to store and manage large volumes of structured and unstructured data.
Data Quality:
Ensure data quality by implementing data validation and cleansing processes, identifying and resolving data inconsistencies.
Data Modeling:
Create and maintain data models that accurately represent the business domain, ensuring data integrity and consistency.
AWS Technologies:
Leverage AWS services like AWS Glue, S3, Redshift, EMR, and Lambda to build and manage data pipelines effectively.
BI Tools:
Integrate data with BI tools (e.g., Tableau, Power BI) to enable data-driven insights and reporting.
Collaboration:
Work closely with data analysts, data scientists, and business stakeholders to understand their data requirements and deliver solutions that meet their needs.
Basic Qualifications:
Bachelor's degree in Computer Science, Data Engineering, or a related field.
3+ years of experience in data engineering or a similar role.
3+ years of experience building ETL pipelines from multiple database systems using SQL, Spark, Unix, Python, Java, or C#.
3+ years of experience with SQL and Python or other scripting languages.
3+ years of experience with ETL processes and tools.
3+ years of experience with AWS technologies, including S3, Redshift, EMR, and Lambda.
2+ years of experience with RDBMS platforms such as Oracle 12c, MS SQL Server, or DB2.
Green Card Holder or US Citizenship required due to federal contract requirements.
Must be able to obtain Public Trust clearance.
Must reside in the United States (U.S.), and all work must be performed in the United States, as this position supports a federal contract subject to U.S. law.
Preferred Qualifications:
Familiarity with data warehousing and data lake concepts.
Experience with data governance and data quality frameworks.
Knowledge of cloud-native data platforms and technologies.
Certifications in AWS or other relevant technologies.
Pay Range:
The pay range for this position, based on full-time employment, is $84,533.00 - $143,706.00 (Nationwide Remote Office, US99).