Sonsoft Inc
Hadoop Administration
Sonsoft Inc, Foster City, California, United States, 94420
Sonsoft, Inc. is a U.S.-based corporation duly organized under the laws of the State of Georgia. Sonsoft Inc. is growing at a steady pace, specializing in Software Development, Software Consultancy, and Information Technology Enabled Services.
Job Description
- At least 4 years of experience in implementation and administration of Hadoop infrastructure
- At least 2 years of experience architecting, designing, implementing, and administering Hadoop infrastructure
- At least 2 years of experience in project life cycle activities on development and maintenance projects
- Able to advise clients and internal teams on which product/flavor is best for a given situation/setup
- Operational expertise in troubleshooting; understanding of system capacity and bottlenecks, and the basics of memory, CPU, OS, storage, and networks
- Hadoop, MapReduce, HBase, Hive, Pig, Mahout
- Hadoop administration skills: experience working with Cloudera Manager or Ambari, Ganglia, Nagios
- Experience using Hadoop schedulers: FIFO, Fair Scheduler, Capacity Scheduler
- Experience in job schedule management: Oozie, or enterprise schedulers such as Control-M and Tivoli
- Good knowledge of Linux (RHEL, CentOS, Ubuntu)
- Experience setting up AD/LDAP/Kerberos authentication models
- Experience with data encryption techniques

Responsibilities:
- Upgrades and data migrations
- Hadoop ecosystem and cluster maintenance, including creation and removal of nodes
- Perform administrative activities with Cloudera Manager/Ambari and tools such as Ganglia and Nagios
- Set up and maintain infrastructure and configuration for Hive, Pig, and MapReduce
- Monitor Hadoop cluster availability, connectivity, and security
- Set up Linux users, groups, Kerberos principals, and keys
- Align with the systems engineering team in maintaining the hardware and software environments required for Hadoop
- Software installation, configuration, patches, and upgrades
- Work with data delivery teams to set up Hadoop application development environments
- Performance tuning of Hadoop clusters and Hadoop MapReduce routines
- Screen Hadoop cluster job performance and capacity planning
- Data modelling, database backup and recovery
- Manage and review Hadoop log files
- File system management, disk space management, and monitoring (Nagios, Splunk, etc.)
- HDFS support and maintenance
- Planning of backup, high availability, and disaster recovery infrastructure
- Diligently teaming with infrastructure, network, database, application, and business intelligence teams to guarantee high data quality and availability
- Collaborating with application teams to install operating system and Hadoop updates, patches, and version upgrades
- Implementation of a strategic operating model in line with best practices
- Point of contact for vendor escalations
- Ability to work in a team in a diverse, multi-stakeholder environment
- Analytical skills

Qualifications
- Bachelor's degree or foreign equivalent required from an accredited institution. Three years of progressive experience in the specialty will also be considered in lieu of every year of education.
- At least 7 years of experience within information technology.

Additional Information
U.S. citizens and those authorized to work in the U.S. are encouraged to apply; we are unable to sponsor at this time.
Note: This is a full-time, permanent job opportunity. Only U.S. Citizens, Green Card Holders, and GC-EAD, H4-EAD, and L2-EAD holders can apply. No OPT-EAD, TN Visa, or H1B consultants, please.
Please mention your visa status in your email or resume.