Credence Management Solutions, LLC (Credence)
Data Engineer
McLean, Virginia, US, 22107
Job Locations
US-VA-McLean
ID
2024-8756
Category
Information Technology
Type
Regular Full-Time
Overview
Credence Management Solutions, LLC (Credence) is seeking a Data Engineer to join our Technology Practice. This position provides an opportunity to work with an experienced team of Cloud Engineers, Solutions Architects, Data Scientists, Developers, and Cybersecurity professionals. The successful candidate will work in a federal client environment and grow their career in the Cloud and Data Science spaces by honing their cloud and data science knowledge and skill sets. Because this role supports a federal client environment, US citizenship is required. The position is based in Credence's corporate headquarters in Tysons Corner in Vienna, VA.

We are seeking a motivated and skilled Data Engineer to join our team. The ideal candidate is passionate about data engineering, possesses a strong analytical mindset, and demonstrates a keen interest in working with large-scale data systems. In this role, you will assist in designing, developing, and maintaining our data pipelines, ensuring the efficient flow of data between various systems. You will collaborate closely with cross-functional teams, including architects, data scientists, cloud engineers, software engineers, business analysts, and QA teams, to enable data-driven decision-making and deliver high-quality solutions.

Responsibilities include, but are not limited to, the duties listed below:
* Assist in designing, developing, and maintaining data pipelines, ensuring smooth data flow from diverse sources into our data warehouse and data lakes to support end-to-end solutions, including AI/ML.
* Collaborate with architects, data scientists, and senior and junior software engineers to implement efficient and scalable data processing solutions.
* Explore and assess technical solutions through analysis, prototypes, and pilots.
* Perform data cleansing, transformation, and aggregation to ensure data integrity and accuracy (a minimal illustrative sketch follows this list).
* Help optimize and tune data pipelines for performance and reliability.
* Assist in monitoring and troubleshooting data-related issues to ensure smooth operations.
* Collaborate with cross-functional teams to understand data requirements and translate them into technical solutions.
* Research and stay up to date with emerging data engineering technologies and best practices to contribute innovative ideas and improvements.
* Document data engineering processes, procedures, and standards per project needs and as a best practice for future reference.
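For illustration only, the cleansing, transformation, and aggregation duties above might look something like the following minimal Python/pandas sketch. The input file, column names, and aggregation rules are hypothetical assumptions, not details from this posting or the client environment.

```python
# Illustrative sketch only: a minimal cleanse/transform/aggregate step of the
# kind described in the responsibilities above. All file and column names are
# hypothetical.
import pandas as pd

def build_daily_summary(raw_path: str) -> pd.DataFrame:
    # Ingest: read a hypothetical raw CSV extract
    df = pd.read_csv(raw_path, parse_dates=["event_time"])

    # Cleanse: drop duplicates and rows missing required fields
    df = df.drop_duplicates().dropna(subset=["account_id", "amount"])

    # Transform: standardize types and derive a partition-friendly date column
    df["amount"] = df["amount"].astype(float)
    df["event_date"] = df["event_time"].dt.date

    # Aggregate: daily totals per account, ready to load into a warehouse table
    return (
        df.groupby(["event_date", "account_id"], as_index=False)
          .agg(total_amount=("amount", "sum"), record_count=("amount", "size"))
    )

if __name__ == "__main__":
    print(build_daily_summary("transactions.csv").head())
```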
Education, Requirements and Qualifications
* Must be a US citizen with an active DoD Secret clearance, or the ability to obtain one (Top Secret as required).
* CompTIA Security+ certification and AWS Cloud Practitioner certification required.
* Bachelor's degree in computer science, information systems, statistics, or a related critical-thinking field; equivalent practical experience will also be considered.
* Strong understanding of data engineering concepts and principles (ingestion, standardization, transformation).
* Demonstrated proficiency in programming languages such as Python, Java, or Scala.
* Experience with EKS, ECS, Kafka, Data Firehose, Kinesis, Airflow, AWS Glue, Kendra, Step Functions, Elasticsearch or OpenSearch integration, S3, Lambda, Amazon Redshift, DynamoDB, RDS, AWS Lake Formation, Amazon SageMaker, Bedrock, EventBridge, Athena, CloudTrail, CloudWatch, and Databricks (a brief illustrative sketch follows this list).
* Proficiency with NoSQL and relational databases such as SQL Server and Oracle, open-source databases like PostgreSQL, vector databases, graph databases, DynamoDB, and GraphQL, plus data retention and performance tuning.
* Experience with data modeling, data orchestration, and ELT/ETL tools, and familiarity with concepts such as medallion architecture, data mesh, and data lakehouse.
* Experience with DevSecOps, monitoring, error logging, tracing/analysis, and alerts management.
* Knowledge of Infrastructure as Code (IaC) and DevOps pipelines (GitLab CI/CD), Terraform, AWS CDK, AWS CloudFormation, and AWS CodeDeploy.
* Proficiency with analytics tools like Tableau and Power BI.
* Experience with data security, data encryption, data masking, and data privacy rules.
* Experience with federal data compliance regulations.
* Proficiency in distributed computing and big data technologies such as EMR, Apache Hadoop, and Spark.
* Knowledge of other cloud-based data storage and processing platforms like Azure or Google Cloud is a plus.
* Excellent problem-solving and analytical skills.
* Excellent communication skills combined with strong team player abilities.
* Attention to detail and commitment to delivering high-quality work.
* Ability to work in a fast-paced environment and handle multiple tasks simultaneously.
Nice to have:
* AWS Certified Data Engineer
* Databricks Certified Data Engineer
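As a purely illustrative sketch of the AWS-native tooling listed above (see the AWS services bullet), the snippet below uses boto3 to list raw objects in an S3 landing zone and the tables registered in a Glue Data Catalog database. The bucket, prefix, and database names are hypothetical placeholders, not details of any actual environment.

```python
# Illustrative sketch only: enumerate raw landing-zone objects in S3 and the
# tables registered in a Glue Data Catalog database. All names are hypothetical.
import boto3

def describe_landing_zone(bucket: str, prefix: str, glue_database: str) -> None:
    s3 = boto3.client("s3")
    glue = boto3.client("glue")

    # Raw files waiting to be ingested
    objects = s3.list_objects_v2(Bucket=bucket, Prefix=prefix).get("Contents", [])
    for obj in objects:
        print(f"raw object: s3://{bucket}/{obj['Key']} ({obj['Size']} bytes)")

    # Tables already registered in the Glue Data Catalog for downstream queries (e.g., Athena)
    for table in glue.get_tables(DatabaseName=glue_database).get("TableList", []):
        print(f"catalog table: {glue_database}.{table['Name']}")

if __name__ == "__main__":
    describe_landing_zone("example-data-lake", "landing/", "analytics_db")
```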
Working Conditions and Physical Requirements
Onsite in Vienna, VA.