Aptos is a people-first blockchain on a mission to help billions of people achieve universal and fair access to decentralized assets in a safe and scalable way.
Founded by some of the original creators and maintainers who researched, designed, and built the Diem blockchain, we have dedicated several years toward this mission. We believe the open-source Diem technology we have developed is an important foundation of a safe and scalable web3 world where everyone has more equitable opportunities to grow and access financial assets with lower fees and fewer intermediaries.
Aptos (Ohlone for "The People") encompasses our mission and ethos for why we build.
About The Role
We are searching for an experienced Data Engineer to join our growing Analytics team. You will be responsible for expanding and optimizing our data infrastructure and data pipeline architecture, as well as optimizing the flow, collection, and management of on-chain, off-chain, and cross-chain data. As our first data engineer, you will own the entire data warehouse, from ingestion to insight. You will work closely with Data Scientists to derive data-driven insights that drive our product vision forward. We are looking for a bold thinker who can execute well; in return, you will receive a great deal of autonomy and ownership over the projects you tackle.
What you'll be doing:
- Define, build, and deliver data pipeline architecture, including ingesting data from different sources and creating aggregate views.
- Work with stakeholders to assemble large, complex data sets that are ready for business consumption.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
- Ensure that the Aptos data warehouse contains high-quality, high-trust data the company can use to drive decisions. Build a core data model that serves as the foundation for all use cases.
- Manage data scalability, partitioning, growth, and availability using cloud data warehousing technologies like BigQuery.
- Work with large-scale batch and streaming data. Look to reduce the time to answers and increase the freshness of data.
- Create data tools for analytics and data science team members that help them build and optimize our product into an innovative industry leader.
What we’re looking for:
- 5+ years of relevant experience
- A degree in a technical field such as Finance, Data Science, Statistics, or Computer Science
- Strong data wrangling and SQL skills, with a track record of optimizing large pipelines to run efficiently
- Experience in at least one programming language (e.g. Python)
- Expert-level experience working with big-data technologies such as Spark, Presto, BigQuery, Snowflake, or Redshift
- Experience manipulating large amounts of structured and unstructured data through pipeline development tools like Airflow
- Ability to proactively prioritize work, deliver high-quality results, and create leverage for the broader team
- Hands-on experience building dashboards with a data visualization tool such as Tableau, Looker, Data Studio, etc.
- Passion to initiate and lead projects to completion in a fast-paced, ever-changing, start-up environment