Doran Jones Inc
Sr Software Engineer- Big Data and Cloud
Doran Jones Inc, Tampa, Florida, us, 33646
Doran Jones is looking for a Senior Software Engineer to leverage their engineering experience to create a solutions-focused Agile environment. This role requires hands-on participation in the development and optimization of processes, solutions, and code.
When you join our team, you will have the opportunity to demonstrate your ability to work on solution development activities, from the initial requirement capture to software architecture and implementation.
This position is hybrid and requires 2-3 days a week in the Tampa office.
It is a full-time, salaried position and is not open to C2C or other contracts.
Responsibilities
- Foster a collaborative and high-performance culture within the teams.
- Drive the adoption of best practices in software development, data engineering, and cloud computing.
- Support the design, development, and maintenance of large-scale data processing systems using technologies such as Snowflake, ETL/ELT processes, and the open table format Iceberg.
- Ensure the efficient management and transformation of large datasets to support business analytics and reporting needs.
- Architect and implement data solutions on cloud platforms, particularly AWS and GCP, ensuring scalability, performance, and security.
- Participate hands-on in the development of pipelines, ETL processes, and data storage solutions.
- Work closely with data architects, data scientists, and business stakeholders to translate requirements into robust data solutions.
- Stay current with emerging technologies and industry trends in big data and cloud computing to drive innovation within the organization.
Required Experience
- 8+ years of hands-on experience in Big Data, Snowflake, ETL processing, and open table formats such as Delta Lake and Iceberg.
- Proficiency in big data technologies and frameworks (e.g., Hadoop, Spark, Kafka).
- Strong expertise in Snowflake, ETL tools (e.g., AWS Glue/EMR), and open table formats.
- Solid understanding of cloud architectures and services on AWS and GCP.
- Proficiency in programming languages such as Python, Java, or Scala.
- Exceptional communication skills, both verbal and written, with the ability to convey complex technical concepts to non-technical stakeholders.
- Strong problem-solving abilities and a results-oriented mindset.
Preferred Experience
- Experience with data governance, security, and compliance in cloud environments.
- Familiarity with DevOps practices and CI/CD pipelines in cloud settings.
- Advanced certifications in AWS, GCP, or Big Data technologies.