Borderless Capital

Blockchain Data Engineer

Borderless Capital, Brazil, Indiana, United States, 47834


Building the Future of Crypto

Join our Data Engineering team and play a crucial role in building and maintaining scalable solutions that drive fast, accurate, data-driven decisions across terabytes of data. You'll be pivotal in managing our data warehouse and data lake, crafting pipelines that efficiently move and process vast amounts of data for different data products. Our team handles both batch and streaming data, with diverse responsibilities matching your interests and expertise.

As a Blockchain Data Engineer, you'll contribute to our Blockchain Data Platform, an in-house system enabling on-chain read and write capabilities for Kraken teams and beyond. This data is accessible through our data systems, supporting several projects from various engineering and business units. Our vision is to establish this platform as the premier source of blockchain data, serving all internal projects.

The opportunity

- Work with and master blockchain data!
- Investigate notable cryptocurrency transactions and addresses to surface insights and detect issues.
- Build scalable and reliable data pipelines that collect, transform, load, and curate data from internal systems.
- Ensure high data quality for the pipelines you build, and make them auditable.
- Support the design and deployment of a distributed data store that will be the central source of truth across the organization.
- Develop, customize, and configure self-service tools that help our data consumers extract and analyze data from our massive internal data store.
- Evaluate new technologies and build prototypes for continuous improvement in data engineering.

Skills you should HODL

- Deep hands-on experience with at least one major blockchain, ideally more (e.g. UTXO-based or EVM chains).
- 2+ years of work experience in a relevant field (Analytics Engineer, Data Engineer, DWH Engineer, Software Engineer, etc.).
- Excellent SQL and data manipulation skills using common frameworks like Spark/PySpark, Pandas, Polars, or similar.
- Experience with data warehouse technologies and relevant data modeling best practices.
- Experience building data pipelines/ETL (or ELT) and familiarity with design principles (Apache Airflow is a big plus).
- Experience with at least one major programming language (e.g. Python, Scala, Java).
- Experience gathering business requirements for data sourcing while working remotely and asynchronously.

Nice to have

- Experience with Dune Analytics, Nansen, Elliptic, Chainalysis, or other blockchain data providers is a big plus.
- Experience with cloud services (e.g. AWS, GCP) and/or Kubernetes.
- Experience working in a highly regulated environment.
- Experience building and contributing to data lakes.
- Experience working with petabytes of data while optimizing queries and processes.
- Enjoy Dockerizing services!
