Hubstaff
Data Engineer (Python, SQL, Snowflake, ETL, Databricks, Spark) - 100% Remote
Hubstaff, Snowflake, Arizona, United States, 85937
Data Engineer (Python, SQL, Snowflake, ETL, Databricks, Spark) - 100% Remote Full Time
Nex4
HQ: Columbia, Missouri, United States

Job Description:

Requirements:
- 5+ years of experience as a data engineer and Snowflake developer
- Proficiency in programming languages (Python, Java, or Scala)
- Experience with big data technologies such as Hadoop, Spark, and Kafka
- Knowledge of database management systems, both relational (e.g., PostgreSQL) and non-relational (e.g., MongoDB, Cassandra)
- Familiarity with data integration and ETL tools such as Talend and Informatica
- Strong problem-solving and analytical skills
- Excellent communication and collaboration abilities
- Adaptability and a willingness to learn new technologies and techniques

Responsibilities:
- Create, test, and implement enterprise-level applications with Snowflake
- Snowflake warehousing, architecture, processing, and administration
- Maintain reliable data pipelines built on ETL tools
- Design and implement identity and access management features
- Create authorization frameworks for better access control
- Implement query optimization and core security measures, including encryption
- Resolve performance and scalability issues in the system
- Manage transactions with distributed data processing algorithms

Machine Learning:
- Experience in Machine Learning, with particular emphasis on Large Language Models (LLMs) and Generative AI
- Comprehensive knowledge and hands-on experience with fine-tuning approaches and model training
- Strong analytical and problem-solving skills
- Strong programming skills in Python and SQL
- Demonstrated leadership in both applied research and development

How to apply: Please schedule a meeting with Ben, our Hiring Manager for this role, to begin the hiring process.

Experience levels: Intermediate (3-5 yrs), Expert (5+ yrs)