Total Quality Logistics

Senior Data Platform Engineer

Total Quality Logistics, Charlotte, North Carolina, United States, 28245


About the role:

As a Data Platform Engineer III, your primary objective is to build, maintain, and monitor highly available, scalable data platforms that support the flow of data from business-critical source systems to analytical reporting layers, enabling Data Engineers, Data Scientists, Analysts, and Self-Service teams throughout the organization.

What's in it for you:

- Health, Dental and Vision coverage to best fit your needs, including a plan that takes $0 out of your paycheck
- 401(k) with company match
- TQL's IT Team offers a hybrid work environment with the ability to work remotely 40 hours per month
- Advancement opportunities within a robust IT department
- Access to the latest emerging technologies through strong vendor partnerships
- Reimbursement for continuous education and technical training
- Be a part of a company with a history of investing in people and technology

What you'll be doing:

- Work on a small team within an Agile framework, either Scrum or Kanban
- Provide exceptional customer service and communication, responding to needs with a high sense of urgency
- Translate the needs of internal Data Teams into functional platforms
- Take ownership of environments from Proof of Concept to Production
- Monitor and enhance these environments throughout their lifecycles
- Create and maintain architecture documentation
- Overcome obstacles and challenges with a high level of accountability and initiative
- Work on multiple projects simultaneously
- Partner with Data Architects to support the platforms they design and build

What you'll need:

- Bachelor's degree or higher preferred, or relevant work experience
- Exceptional aptitude for learning new tools and perseverance when faced with challenges
- Excellent verbal and written communication skills
- 5 to 8 years deploying multiple variations of IaaS, PaaS, and SaaS platforms
- 5 to 8 years configuring security concepts including RBAC, IAM, authentication methods, and least privilege across platforms
- 2 to 5 years automating administration tasks by scripting with PowerShell or similar
- 2 to 5 years working within Infrastructure as Code environments using Terraform
- Experience administering and supporting data lifecycle self-service tools like Microsoft Fabric, Databricks, or Snowflake
- History of managing HADR and scalability strategies such as scale out versus scale up
- Experience administering Power BI and SQL Server Reporting Services (SSRS), or other data visualization platforms like Tableau or Qlik
- Experience creating advanced monitoring with tools like Azure Log Analytics, DataDog, SolarWinds, or similar
- Administrative experience with cloud data storage platforms including Azure Data Lake Gen2, Elastic, Snowflake, or others
- History of Git and pipeline experience using Azure DevOps or other CI/CD technologies
- Comprehension of container concepts like Kubernetes or Docker clusters
- Data Mart and Data Warehouse experience is a plus
- Understanding of Data Cubes or SQL Server Analysis Services (SSAS) is a plus
- Experience developing in or managing Relational Database Management Systems (RDBMS) like SQL Server, MySQL, Oracle, or PostgreSQL is a plus
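
To give a flavor of the scripting, automation, and monitoring work listed above, here is a minimal sketch in Python of one such administration task: polling platform health endpoints and flagging anything unhealthy. The environment names, endpoint URLs, and response format are illustrative placeholders only, not real TQL systems or vendor APIs.

    # Minimal sketch of a data platform administration task: poll hypothetical
    # health endpoints for a few environments and flag anything unhealthy.
    # All URLs, names, and the response schema below are illustrative placeholders.
    import json
    import logging
    import urllib.request

    logging.basicConfig(level=logging.INFO, format="%(asctime)s %(levelname)s %(message)s")

    # Hypothetical health-check endpoints for a few data platform environments.
    ENVIRONMENTS = {
        "databricks-prod": "https://example.internal/databricks/health",
        "fabric-prod": "https://example.internal/fabric/health",
        "datalake-gen2": "https://example.internal/adls/health",
    }

    def check_environment(name: str, url: str) -> bool:
        """Return True if the environment reports healthy, False otherwise."""
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                payload = json.load(resp)
        except (OSError, ValueError) as exc:
            # Covers network errors, HTTP errors, timeouts, and bad JSON.
            logging.error("%s: health check failed (%s)", name, exc)
            return False
        healthy = payload.get("status") == "healthy"
        if not healthy:
            logging.warning("%s: reported status %r", name, payload.get("status"))
        return healthy

    if __name__ == "__main__":
        results = {name: check_environment(name, url) for name, url in ENVIRONMENTS.items()}
        logging.info("healthy environments: %d of %d", sum(results.values()), len(results))

In practice a script like this would run on a schedule (for example, from an Azure DevOps pipeline) and feed alerts into whichever monitoring tool the team uses, such as Azure Log Analytics or DataDog.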

Where you'll be:

200 Regency Executive Park Dr., Suite 100 & 200, Charlotte, NC 28217