BGSF

Log Data Engineer

BGSF, Owings Mills, Maryland, United States, 21117


Responsibilities:

- Support large hybrid Splunk and Cribl deployments through all lifecycle stages (requirements, design, testing, implementation, operations, and documentation).
- Implement and automate log data pipelines in Python for ingestion into platforms such as Splunk and OpenSearch.
- Automate platform management with Ansible or similar tools.
- Troubleshoot issues impacting log data platforms.
- Coordinate with platform users and develop training and documentation materials.
- Support log data platform upgrades, including testing coordination.
- Gather and process data from various sources using scripts, APIs, and SQL.
- Build pipelines for log data engineering and testing.

Experience:

- Troubleshoot complex issues and support technical users.
- Work independently and with minimal guidance.
- Experience with IT Service Management and Incident & Problem Management.
- Identify and resolve performance bottlenecks and anomalous behavior.
- Collaborate across teams to influence software design and operations.
- Knowledge of security, performance, and disaster recovery best practices.

Required Technical Expertise:

- 3-5 years managing and configuring Splunk Enterprise/Cloud.
- Proficiency in Cribl.
- Experience with Linux/Windows agents for log data engineering.
- Cloud-based solution development on AWS.
- Data onboarding, configuration, dashboard creation, and extraction in Splunk and Cribl.
- Strong scripting and automation skills (Bash, Python, etc.).
- Familiarity with Splunk REST APIs, cloud platforms (AWS preferred), and container technologies.
- Experience with data pipeline orchestration platforms.

Preferred Technical Experience:

- Splunk Certification (Admin or Architect).
- Ansible Tower automation.
- GitLab experience.
- Large platform migration experience.
- AWS OpenSearch and Cribl expertise.
- Familiarity with data streaming technologies (Kafka, Kinesis, Spark Streaming).
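As a small illustration of the Python ingestion work described under Responsibilities, the sketch below sends a single event to Splunk's HTTP Event Collector (HEC). The URL, port, token, source, and index values are placeholders, not details from this posting; a production pipeline would add batching, retries, and error handling.

```python
import json
import urllib.request

# Hypothetical endpoint and token -- replace with your Splunk HEC settings.
HEC_URL = "https://splunk.example.com:8088/services/collector/event"
HEC_TOKEN = "00000000-0000-0000-0000-000000000000"


def build_hec_event(message: str, source: str, sourcetype: str,
                    index: str = "main") -> dict:
    """Wrap a raw log line in the JSON envelope Splunk HEC expects."""
    return {
        "event": message,
        "source": source,
        "sourcetype": sourcetype,
        "index": index,
    }


def send_to_hec(payload: dict) -> int:
    """POST one event to the HEC endpoint; returns the HTTP status code."""
    req = urllib.request.Request(
        HEC_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Splunk {HEC_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


if __name__ == "__main__":
    # Build (but do not send) an example event so the script runs offline.
    event = build_hec_event("user login failed", source="auth.log",
                            sourcetype="syslog")
    print(json.dumps(event))
```

The same envelope-building step carries over to OpenSearch ingestion, where the JSON body and endpoint differ but the pipeline shape (collect, wrap, POST) is the same.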