Cyrten

AWS Databricks Cloud Data Engineer (Sr.) 100% Remote

Cyrten, New York, New York, United States


AWS Databricks Cloud Data Engineer (Sr) 100% Remote

Location:

Remote

Preferred Time Zones:

EST, CST, MTN

Rate:

DOE - W2

Length:

1 to 3 years

Top 3-5 skills:

1. Hands-on Terraform; able to build infrastructure from scratch.

2. Scripting - Unix shell and Python

3. Infrastructure experience deploying platforms

4. Data Processing platforms such as Databricks (or Snowflake)

5. Experience with data analytics in a cloud environment (an illustrative sketch follows this list)
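
For context only (this sketch is not part of the posting), here is a minimal PySpark example of the kind of cloud analytics pipeline skills 4 and 5 describe, assuming a Databricks/Spark environment; the S3 paths, column names, and app name are hypothetical placeholders.

# Illustrative sketch only: a small PySpark job of the kind run on Databricks.
# All S3 paths and column names below are hypothetical placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-orders-load").getOrCreate()

# Read raw JSON events landed in S3 (hypothetical bucket/prefix).
raw = spark.read.json("s3://example-raw-bucket/orders/2024-01-01/")

# Light transformation: parse the timestamp and aggregate per customer.
daily = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .groupBy("customer_id")
       .agg(F.count("*").alias("order_count"),
            F.sum("order_total").alias("revenue"))
)

# Write a Delta table for downstream analytics (e.g., Tableau reports).
daily.write.format("delta").mode("overwrite").save(
    "s3://example-curated-bucket/daily_customer_orders/"
)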

Tip for success:

Underneath the Skills section of your resume, add a Terraform section. Don't be shy; list everything you have ever done in Terraform. Terraform is a major key to this opportunity.

Qualifications:

- Bachelor's degree in Computer Science, Management Information Systems, Computer Engineering, or a related field, or equivalent work experience; advanced degree preferred
- Seven or more years of experience as an AWS Data Engineer or Architect designing and building large-scale enterprise solutions, in both development and deployment
- Five or more years designing and building solutions in the cloud
- Expertise in building and managing cloud databases such as AWS RDS, DynamoDB, DocumentDB, or analogous architectures
- Expertise in building cloud database management systems in Databricks Lakehouse or analogous architectures
- Expertise in cloud data warehouses such as Redshift, BigQuery, or analogous architectures is a plus
- Deep SQL expertise, data modeling, and experience with data governance in relational databases
- Experience with the practical application of data warehousing concepts, methodologies, and frameworks using traditional (Vertica, Teradata, etc.) and current (SparkSQL, Hadoop, Kafka) distributed technologies
- Refined skills in one or more scripting languages (e.g., Python, bash)
- Experience using ETL/ELT tools and technologies such as Talend or Informatica is a plus
- Embrace data platform thinking; design and develop data pipelines with security, scale, uptime, and reliability in mind
- Expertise in relational and dimensional data modeling
- UNIX administration and general server administration experience required
- Presto, Hive, SparkSQL, Cassandra, Solr, or other Big Data query and transformation experience is a plus
- Experience using Spark, Kafka, Hadoop, or similar distributed data technologies is a plus
- Able to clearly express the benefits and constraints of technology solutions to technology partners, business partners, and team members
- Experience leveraging CI/CD pipelines
- Experience with Agile methodologies and the ability to work in an Agile manner is preferred
- One or more cloud certifications

Responsibilities:

- Understand the technology vision and strategic direction of business needs
- Understand our current data model and infrastructure, proactively identify gaps and areas for improvement, and prescribe architectural recommendations focused on performance and accessibility
- Partner across engineering teams to design, build, and support the next generation of our analytics systems
- Partner with business and analytics teams to understand specific requirements for data systems supporting the development and deployment of data workloads, ranging from Tableau reports to ad hoc analyses
- Own and develop architecture that supports translating analytical questions into effective reports that drive business action
- Automate and optimize existing data processing workloads by recognizing data and technology usage patterns and implementing solutions
- Maintain a solid grasp of the intersection between analytics and engineering, and take a proactive approach to ensure solutions demonstrate high performance, privacy, security, scalability, and reliability upon deployment
- Provide guidance to partners on effective use of the database management system (DBMS) platform through collaboration, documentation, and associated standard methodologies
- Design and build end-to-end automation to support and maintain software currency
- Create build automation services using Terraform, Python, and OS shell scripts (see the sketch after this list)
- Develop validation and certification processes through automation tools
- Design integrated solutions in alignment with design patterns, blueprints, guidelines, and standard methodologies for products
- Participate in developing solutions by incorporating cloud-native and third-party vendor products
- Participate in research, perform POCs (proofs of concept) with emerging technologies, and adopt industry best practices in the data space to advance the cloud data platform
- Develop data streaming, migration, and replication solutions
- Demonstrate leadership, collaboration, exceptional communication, negotiation, strategic, and influencing skills to gain consensus and produce the best solutions
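
As an illustration of the build-automation responsibility above (our own sketch, not part of the posting), here is a minimal Python example that drives the Terraform CLI from a script; the module directory name is a hypothetical placeholder.

# Illustrative sketch only: a small Python wrapper around the Terraform CLI,
# the kind of glue used for build automation. The module directory below is
# a hypothetical placeholder.
import subprocess

def terraform(args, workdir="infra/data-platform"):
    """Run a Terraform command in the given module directory; raise on failure."""
    return subprocess.run(["terraform", *args], cwd=workdir, check=True)

if __name__ == "__main__":
    terraform(["init", "-input=false"])   # download providers and modules
    terraform(["plan", "-out=tfplan"])    # record the proposed changes
    terraform(["apply", "tfplan"])        # apply the saved plan non-interactively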

Please Note:

Must be a U.S. citizen; this position supports a Federal Government client.

NO 3rd Party Candidates

NO 3rd Party Vendors