Highering LLC

BIG DATA ETL PROJECT LEAD

Highering LLC, Friendly, Maryland, United States


As a Big Data Project Lead, you will lead, design, and implement innovative big data warehouse solutions and work alongside the project team to evaluate new features and architecture. Reporting to the Project Manager, you will help drive decisions and build collaborative relationships with key functional groups in the organization. In addition, you will:

- Lead, design, and implement innovative ETL solutions using Hadoop, HDFS, Hive, Sqoop, NoSQL, and other Big Data technologies.
- Work with various teams to understand requirements and evaluate new features and architecture to help drive decisions.
- Build collaborative partnerships with architects, technical leads, and key individuals within other functional groups.
- Perform detailed analysis of business problems and technical environments and use this analysis to design quality technical solutions.
- Actively participate in code reviews and test solutions to ensure they meet best-practice specifications.
- Build and foster a high-performance project delivery culture, mentor team members, and give the team the tools and motivation to make things happen.
- Work with stakeholders and cross-functional teams to develop new solutions or enhance existing ones.
- Demonstrate the values of passion for Client Service, Innovation, Expertise, Balance, Respect for All, Teamwork, and Initiative.

We're excited about you if you have:

- 8-10 years of software development and deployment experience, with at least 5 years of hands-on experience with Hadoop applications (e.g., administration, configuration management, monitoring, debugging, and performance tuning).
- Strong experience building data ingestion pipelines (Extract, Transform, Load workloads) and data warehouse or database architectures.
- Hands-on development experience with open source big data components such as Hadoop, Hive, Pig, Spark, HBase, HAWQ, Oozie, Mahout, Flume, Kafka, ZooKeeper, and Sqoop.
- Knowledge of the Hortonworks distribution and similar products.
- Strong experience with data modeling, design patterns, and building highly scalable Big Data solutions and distributed applications.
- Knowledge of cloud platforms such as Google Cloud, Azure, AWS, or equivalent. For Google Cloud: experience designing, deploying, and administering scalable, available, and fault-tolerant systems using Google Cloud's big data tools (BigQuery, Cloud Dataflow, Cloud Dataproc, Cloud Datalab, Cloud Dataprep, Cloud Pub/Sub, Genomics, Google Data Studio), as well as experience with the Google Cloud Management Portal, Machine Learning, Google Cloud Storage, and databases.
- Hadoop: experience storing, joining, filtering, and analyzing data using Spark, Hive, and MapReduce.
- Experience with continuous integration frameworks and building regression-testable code in the data world using GitHub, Jenkins, and related applications.
- Experience with programming/scripting languages such as Scala, Java, Python, or R (any combination).
- Analytical approach to problem-solving with an ability to work at an abstract level and gain consensus; excellent interpersonal, leadership, and communication skills.
- Data-oriented personality. Motivated, independent, efficient, and able to handle several projects and work under pressure with a solid sense of priorities.
- Ability to work in a fast-paced (startup-like) agile development environment.
- Friendly, articulate, and interested in working in a fun, small-team environment.
- Experience working in the retail industry with a large-scale enterprise organization, e-commerce, marketing, and CRM applications is a plus.
