Data Platform Engineer - Twinstake

Nethermind

Mar 02

About Twinstake:

Twinstake, a leading institutional staking provider, offers unparalleled knowledge and experience in the institutional market. Our foundation combines deep crypto and technology expertise from our core developers and industry veterans.

Our strengths include:

  • Regulatory Compliance
  • Non-custodial Solutions
  • Enhanced Reporting and Data
  • Optimized Performance
  • Premier Service

About the Role:

As a Data Platform Engineer at Twinstake, you will play a pivotal role in shaping our data infrastructure, focusing on automation, reliability, and efficiency. You will be instrumental in constructing internal frameworks that enable our analysts and modelling teams to work efficiently with a state-of-the-art data stack. We are looking for someone with a client-centric, action-driven mindset who is ready to bring their knowledge and experience to supercharge our delivery output, and to challenge our existing build and approach to ensure we're building the right thing, the right way.

Key Responsibilities:

  1. Data Platform Infrastructure:
    • Design, build, and maintain a robust data platform with a focus on infrastructure scalability and reliability.
    • Own and streamline the entire staking data journey, from seamless ingestion to insightful reporting, implementing automated checks and Airflow-powered pipelines to ensure data accuracy, backed by proactive alerts and robust monitoring.
    • Be instrumental in reviewing our existing data lakehouse implementation and drive the project forward using industry best practices and tooling.
  2. Automation and Framework Development:
    • Lead the development of internal frameworks for data analysts and modelling teams. This includes working closely with our data modelling team to ensure that data is easily ingested and consumed by QuickSight dashboards, which power detailed reporting on validator performance.
    • Contribute to the continuous improvement of our data platform by integrating new technologies.
  3. Infrastructure as Code and Containerization:
    • Use Terraform and other infrastructure-as-code tools for efficient cloud resource management.
    • Maintain and optimize Kubernetes clusters for high availability and performance.
    • Utilize Docker for containerization, enhancing deployment processes and environment consistency.
  4. Data Lakehouse Architecture:
    • Lead the design of our data lakehouse architecture, leveraging your expertise in Delta Lake and Apache Iceberg to tackle current infrastructure challenges head-on. Build robust, scalable data lakehouses that power reliable data pipelines and deliver actionable insights, including staking reporting, for internal and external clients.
  5. Data Processing and Streaming:
    • Spearhead the integration of Apache Spark and Apache Flink to transform our staking data processing, crafting incremental data workflows that streamline the journey from raw data to actionable insights.
  6. Team Collaboration and Leadership:
    • Collaborate with cross-functional teams to understand data needs and deliver solutions that align with company goals and product roadmaps. Twinstake is a client-centric company, so you will work hand in hand with the product team to design solutions from client feedback, alongside the infrastructure team, which maintains a very large number of blockchain validators, and with the modelling team, ensuring they have the blockchain data they need to mathematically model various validator reward scenarios.

Qualifications:

  • 5+ years of experience in data engineering, focusing on data platform infrastructure.
  • Strong skills in Apache Airflow, with a big plus for Airflow contributors.
  • Extensive experience with Terraform and other infrastructure-as-code tools.
  • Solid understanding and experience with Apache Spark or Apache Flink.
  • Expertise in Kubernetes and Docker.
  • Exceptional problem-solving, communication, and team collaboration skills.

Bonus:

  • Hands-on experience in the crypto industry.
  • Experience with QuickSight.
  • Experience in a fast-paced startup, building modern data stacks for scale.
  • Hands-on experience in, or knowledge of, Data SecOps.
  • Hands-on experience in, or knowledge of, BizOps.
  • Proficiency in Delta Lake or Apache Iceberg.
