Data Engineer - E3
BitGo
AI Summary
The vacancy is well-structured with clear responsibilities, compensation, and requirements, but lacks explicit KPIs and social media links.
Description
BitGo is seeking a skilled Data Engineer to design and maintain scalable data pipelines, enhance blockchain reporting, and ensure data quality.
Requires 5+ years of experience, strong SQL and Python skills, and familiarity with modern data platforms.
BitGo is the leading infrastructure provider of digital asset solutions, delivering custody, wallets, staking, trading, financing, and settlement services from regulated cold storage.
Since our founding in 2013, we have focused on enabling our clients to securely navigate the digital asset space.
With a global presence and multiple Trust companies, BitGo serves thousands of institutions, including many of the industry's top brands, exchanges, and platforms, and millions of retail investors worldwide.
As the operational backbone of the digital economy, BitGo handles a significant portion of Bitcoin network transactions and is the largest independent digital asset custodian and staking provider in the world.
## What you'll do
- Design, build, and maintain scalable, reliable data pipelines that collect, transform, and curate data from internal systems.
- Build automated reconciliation, monitoring, and alerting systems.
- Enhance and expand BitGo’s blockchain reporting infrastructure.
- Integrate select external data sources to enrich the data platform.
- Ensure high data quality and auditability across all pipelines.
- Optimize data systems for near real-time processing and insights.
## Conditions
- Competitive salary.
- IT equipment support for work.
- Meal & commute allowance.
- Medical insurance.
- Attractive well-being allowance (covering medical, wellness, and fitness).
- On-the-house snacks in the Bangalore office.
- A talented workforce to learn and grow with.
## Requirements
- 5+ years of work experience in a relevant field (Data Engineer, Software Engineer).
- Strong experience with server-side languages (Python).
- Strong experience with SQL databases such as Postgres or MySQL.
- Experience building data pipelines/ETL and familiarity with their design principles.
- Experience with data warehouse technologies and data modeling best practices (Snowflake, BigQuery, Spark, etc.).
- Strong experience with systems design and event-driven systems (Kafka).
- A self-starter capable of adapting quickly and being decisive.
- Experience with unit and functional testing and debugging.
- Experience with Git/GitHub and branching methodologies, code review tools, CI tools, JIRA, Confluence, etc.
- Ability to work independently in a fast-paced environment.
- Engineering degree in Computer Science or equivalent.