Maintec Technologies
Data Engineer
Maintec Technologies, McLean, Virginia, US 22107
ROLE: - Data Engineer
LOCATION: - McLean, VA
DURATION: - Long term contract
Client: - Freddie Mac
Experience: 8 to 10 years
Shift: Day 9AM-7PM EST
Required Skills: Snowflake, AWS Big Data, Kafka
Job Summary:
- Years of experience in the following Big Data and distributed programming models, technologies, and techniques: Hadoop, Spark, MapReduce, Sqoop, Hive, Hadoop Distributed File System (HDFS); distributed indexing and databases (HBase, Hive, Cassandra, Vertica); serialization formats (JSON, Avro, Parquet); PySpark and Spark with Scala; Hive tuning, including bucketing and partitioning (see the PySpark sketch below)
- Knowledge of database structures, theories, principles, and practices, both SQL and NoSQL
- Experience with data lake and data hub implementations
- Experience and knowledge working with relational databases (RDBMS)
- Experience and knowledge working with Kafka, Spark streaming, Sqoop, Oozie, Airflow, and Control-M (see the streaming sketch below)
- Experience with shell scripting
- Bachelor's degree in Information Technology, Computer Science, Engineering, or a related field, or an equivalent combination of education and work experience
- Well versed in information and application security, including LDAP, certificates, public key encryption, SSH access credentials, etc.
- Years of experience working in an Agile, Lean, Kanban, or Scaled Agile organization
- Knowledge of or experience with Jira, Confluence, and Bitbucket
- Experience applying TDD, BDD, and static code analysis to improve the quality and reliability of delivery
- Ability to independently perform all duties, from analysis to deployment to post-production defect fixing
- Must have experience working on production support

Preferred:
- Experience building microservices using Java, Python, Spark, OCP, and RESTful APIs
- Experience with one or more of the following Amazon Web Services (AWS) cloud services: EC2, EMR, ECS, Docker, OpenShift, Kubernetes, Amazon EKS, S3, SNS, SQS, CloudFormation, CloudWatch, Lambda (see the boto3 sketch below)
- Experience working with RDS and Aurora PostgreSQL
- Experience working with Snowflake or similar (see the connector sketch below)
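As context for the Hive tuning item above, a minimal PySpark sketch of writing a partitioned, bucketed table; the app name, S3 path, table, and column names are hypothetical, and a configured Hive metastore is assumed.

```python
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("hive-tuning-sketch")   # hypothetical app name
    .enableHiveSupport()             # assumes a Hive metastore is configured
    .getOrCreate()
)

# Hypothetical source data in Parquet, one of the serialization formats listed.
df = spark.read.parquet("s3://example-bucket/loans/")

(
    df.write
    .mode("overwrite")
    .format("parquet")
    .partitionBy("as_of_date")        # partition pruning for date-filtered scans
    .bucketBy(16, "loan_id")          # bucketing co-locates rows for joins
    .sortBy("loan_id")
    .saveAsTable("analytics.loans")   # bucketing requires saveAsTable, not save()
)
```

Partitioning speeds up queries that filter on the partition column; bucketing helps repeated joins or aggregations on the bucket column by reducing shuffles.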
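For the Kafka and Spark streaming item, a minimal Structured Streaming sketch; the broker address, topic, event schema, and output paths are hypothetical, and the spark-sql-kafka connector package is assumed to be on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, from_json
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("kafka-stream-sketch").getOrCreate()

# Hypothetical schema for the JSON events on the topic.
schema = StructType([
    StructField("loan_id", StringType()),
    StructField("balance", DoubleType()),
])

raw = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "loan-events")                # hypothetical topic
    .load()
)

# Kafka delivers raw bytes; decode the value and parse the JSON payload.
events = (
    raw.select(from_json(col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Land the stream as Parquet; the checkpoint gives fault-tolerant file output.
query = (
    events.writeStream
    .format("parquet")
    .option("path", "s3://example-bucket/loan-events/")                   # hypothetical
    .option("checkpointLocation", "s3://example-bucket/checkpoints/le/")  # hypothetical
    .start()
)
query.awaitTermination()
```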
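For the AWS services under Preferred, a small boto3 sketch touching S3 and SQS, two common hand-off points in a data pipeline; the bucket, key, and queue URL are hypothetical, and credentials are assumed to come from the environment or an instance role.

```python
import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")

# Upload a daily extract to S3 (bucket and key are hypothetical).
s3.upload_file("extract.parquet", "example-bucket", "daily/extract.parquet")

# Notify a downstream consumer via SQS (queue URL is hypothetical).
sqs.send_message(
    QueueUrl="https://sqs.us-east-1.amazonaws.com/123456789012/example-queue",
    MessageBody='{"object": "daily/extract.parquet"}',
)
```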
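For the Snowflake skill, a minimal sketch using the official Python connector (snowflake-connector-python); the account, credentials, warehouse, stage, and table names are all hypothetical.

```python
import snowflake.connector  # pip install snowflake-connector-python

# All connection values below are hypothetical placeholders.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",  # key-pair auth or SSO is preferable in practice
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    # Bulk-load staged Parquet files into a table (stage and table are hypothetical).
    cur.execute(
        "COPY INTO loans FROM @loans_stage FILE_FORMAT = (TYPE = PARQUET) "
        "MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE"
    )
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```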