Intralox
Data Engineer
Intralox, New Orleans, Louisiana, United States, 70123
This position is not currently eligible for work authorization.
Intralox, L.L.C., a division of Laitram, L.L.C., and a global provider of conveyance solutions and services, has an opening for a Data Engineer/Power BI Developer within the Digital Solutions team. The Digital Solutions (DS) team is leading Intralox's implementation and evolution of our enterprise business applications and new digital solutions. DS is more than just an IT department: we are business domain experts, software developers, and operational support resources who use technology to bring more value to customers and Intralox.
Intralox was founded on the principle of doing the right thing by treating customers, employees, and suppliers with honesty, fairness, and respect. We invest heavily in these values and aim to practice these principles every day, which is why we have been consistently recognized for innovation and workplace excellence.
Responsibilities
- Create and maintain optimal data pipeline architecture.
- Assemble large, complex data sets that meet business requirements.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and Microsoft 'big data' technologies.
- Build analytics tools in Power BI that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics.
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Create and support data tools for analytics and business team members that assist them in building and optimizing our products and business processes.
- Work with data and analytics experts to strive for greater functionality in our data systems.
Requirements
- This position is not currently eligible for work authorization.
- Minimum 2 years of experience in a Data Engineer role.
- Bachelor's degree in Computer Science, Statistics, Informatics, Information Systems, or another quantitative field. A graduate degree in a relevant field is preferred.
- Advanced working knowledge of SQL and experience with relational databases, including query authoring. Familiarity with a variety of databases is desirable.
- Proven experience in building and optimizing 'big data' data pipelines, architectures, and data sets.
- Strong analytical skills for working with structured and unstructured datasets.
- Proficiency in developing processes to support data transformation, data structures, metadata, dependency, and workload management.
- Demonstrated success in manipulating, processing, and extracting value from large, disparate datasets.
- Excellent project management and organizational skills.
- Experience with or knowledge of the following software/tools (or similar) is required:
  - Big data tools: Hadoop, Spark, Kafka, etc.
  - Relational SQL and NoSQL databases.
  - Data pipeline and workflow management tools: Azure Data Factory.
  - Azure cloud services: VMs, Synapse Analytics, Data Factory, Azure SQL, Data Lake.
  - Object-oriented/object function scripting languages: Python, Java, C++, Scala, etc.
- Knowledge of Oracle EBS preferred.

EOE/M/F/Vet/Disabled