SysMind Tech
Big Data Administrator
SysMind Tech, Houston, Texas, United States, 77246
Job Title: Big Data Administrator
Relevant Experience (in Yrs): Min 3-5 yrs
Must Have Technical/Functional Skills
•Proficiency in the Cloudera suite, including Kafka, HDFS, HBase, Kudu, ZooKeeper, Hive, Impala, NiFi, Spark, Flink, Oozie, YARN, Atlas, Ranger, Ranger KMS, and KTS.
•Experience with Cloudera ECS and Cloudera Data Services such as Cloudera Data Engineering, Cloudera Data Warehouse, and Cloudera Machine Learning.
•Familiarity with container orchestration systems like Kubernetes (OpenShift or equivalent).
•Strong background in Linux system administration and shell scripting.
•Basic programming skills in Java, Scala, or Python.
•Experience with graph databases like Neo4j is a plus.
•Knowledge of cloud-native Big Data technologies and architectural principles.
•In-depth knowledge and hands-on experience configuring and managing Kerberos authentication for a secure big data environment, including setting up Kerberos realms, troubleshooting authentication issues, and performing regular security checks and configuration audits.
•Strong understanding of encryption protocols and standards for securing data: configuring TLS/SSL for data in transit across distributed networks and implementing encryption for data at rest using technologies such as Hadoop's encryption zones or third-party encryption tools. Familiarity with key management practices and security compliance requirements for data protection.
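To illustrate the Kerberos and at-rest-encryption skills listed above, a day-to-day session on a kerberized Hadoop cluster might look like the sketch below. The host name, realm, keytab path, and key name are placeholders, and the commands require a live cluster with a KMS configured, so this is illustrative only:

```shell
# Authenticate as the hdfs service principal and verify the ticket
kinit -kt /etc/security/keytabs/hdfs.keytab hdfs/node1.example.com@EXAMPLE.COM
klist

# Create an encryption key in the KMS, then an HDFS encryption zone that uses it
hadoop key create dataKey
hdfs dfs -mkdir /data/secure
hdfs crypto -createZone -keyName dataKey -path /data/secure

# Confirm the zone is active; files written under /data/secure are now
# transparently encrypted at rest
hdfs crypto -listZones
```

Troubleshooting typically starts from the same place: an expired or missing ticket (`klist` shows nothing) is the most common cause of "GSS initiate failed" errors in a kerberized environment.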
Experience Required
As a Big Data Administrator, you will be responsible for managing our Big Data infrastructure and tools within the Cloudera ecosystem. You will play a critical role on our data team, ensuring the high availability, performance, and security of our data solutions.
Roles & Responsibilities
•Install, configure, and maintain Cloudera Big Data clusters across multiple environments, ensuring optimal performance and resource utilization.
•Perform cluster maintenance tasks such as patching, upgrades, and migrations, applying best practices for minimal downtime.
•Ensure high availability
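The maintenance duties above usually involve a standard sequence of HDFS administration commands. A minimal sketch of a pre-maintenance check, assuming shell access as the HDFS superuser (the threshold value is an example, not a prescribed setting):

```shell
# Before patching, check overall cluster health and capacity
hdfs dfsadmin -report

# Enter safe mode so no writes occur during maintenance, then verify
hdfs dfsadmin -safemode enter
hdfs dfsadmin -safemode get

# ... perform patching / upgrade steps ...

# Leave safe mode and rebalance data across DataNodes
# (threshold = max % deviation from average DataNode utilization)
hdfs dfsadmin -safemode leave
hdfs balancer -threshold 10
```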