ideaVat
Sr Hadoop Developer
ideaVat, Columbia, South Carolina, us, 29228
Jobs Bridge Inc is among the fastest-growing IT staffing / professional services organizations, with its own job portal. Jobs Bridge works extremely closely with a large number of IT organizations across the most in-demand technology skill sets.

Job Description
Skills: Hadoop, Pig, Hive, MapReduce, ETL with data warehouse, BI
Location: Columbia, SC
Total Experience: 5 yrs.
Max Salary: Not Mentioned
Employment Type: Direct Jobs (Full Time)
Domain: Any

Description
GC holders can apply.

Data Modeling: Deep expertise in modeling databases for enterprise-grade solutions, ideally analytics solutions.
Coding: Build highly performant, scalable, enterprise-grade ETL processes to populate analytics data warehouses based on Hadoop and Oracle. Work in Agile teams on Linux development environments.
Data Setup: Bring together data and create views of data sets stored in the Hadoop-based big data platform using Hive, Pig, Sqoop, and Oozie.
Testing: Assist quality assurance testing teams. Where required, develop and conduct unit tests, develop system test data, and perform system tests.
Documentation: Develop program specifications and flowcharts (dataflows, jobflows, etc.) for stand-alone products or systems. Prepare concise internal program documentation on product development and revisions. Prepare user guides and operational instruction manuals.
Communication: Convey problems, solutions, updates, and project status to peers, customers, and management. Develop and maintain program, systems, and user documentation.
Planning: Prepare time estimates for assigned tasks. Attend post-implementation reviews.

Qualifications:
- Bachelor's/Master's degree in computer science, math, statistics, or a related discipline preferred, with 3+ years of database development/data mining experience, OR demonstrated ability to meet job requirements through a comparable number of years of technical work experience.
- Excellent understanding of Big Data analytics platforms; hands-on experience with Hadoop, HBase, Pig, and SQL.
- Solid experience as an RDBMS developer, ideally on Oracle, with stored procedures, query performance tuning, and ETL.
- Solid experience with scripting languages such as Python and Shell.
- Some experience with web development stacks, analytics, NoSQL data stores, data modeling, analytical tools, and libraries.
- Solid understanding of data warehousing concepts and technologies.
- Strong foundational knowledge of and experience with distributed systems and computing systems in general.
- Experience working in Agile teams.
- Experience in healthcare is a big plus.