Senior Data Engineer
Chainalysis is one of the oldest crypto companies offering on-chain analysis. It provides investigation and compliance tools to crypto companies, government agencies, regulators, and more. For example, an exchange can hire Chainalysis to flag transactions coming from wallets associated with exchange hacks or known terrorist groups, ensuring it does not engage with them. Chainalysis software has been used to solve some of the highest-profile cases in the crypto industry.
Chainalysis offers its products in more than 60 countries. At any given time, there are hundreds of open Chainalysis jobs. If you're looking for a career at Chainalysis, there are many categories to choose from across Engineering, Marketing, and Sales. Most of the jobs are not remote, but since Chainalysis has offices in multiple countries, most positions should be very accessible. Check out some of the Chainalysis jobs below:
The engineering team at Chainalysis is inspired by solving the hardest technical challenges and creating products that build trust in cryptocurrencies. We’re a global organization with teams in the UK, Denmark, and the USA who thrive on the challenging work we do and on doing it alongside exceptionally talented teammates. Our industry changes every day, and our job is to build a flexible platform that allows us to adapt to those rapid changes.
As a Senior Data Engineer, Investments, you’ll be responsible for building and maintaining the data pipelines for our Market Intelligence product. You’ll build the infrastructure for data-intensive, low-latency pipelines, choose the technologies that power them, and collaborate closely with the data scientists on the team. You’ll write and maintain ETLs and their orchestration to produce meaningful, timely insights for our customers and their businesses. As a senior engineer, you’ll have the opportunity to lead projects and help our customers understand the market they are in and make better decisions for their business.
In one year you’ll know you were successful if:
- You’ve worked with other engineering teams to understand their data lifecycle and the right integration points, and developed the new iteration of our data engineering stack and data infrastructure.
- You’ve developed and maintained scalable data pipelines and built out new integrations with internal and external data sources.
- You’ve maintained an optimal data pipeline architecture, identifying and proposing improvements to the existing architecture.
- Together with the rest of the team, you’ve created scalable, self-healing and robust data pipelines with low latency.
A background like this helps:
- You are excited about Data!
- 5+ years of experience in data engineering, with a focus on designing and implementing data pipelines using orchestration tools such as Airflow, Dagster, or Prefect.
- Strong experience with big data processing tools such as Databricks, Dremio, Fivetran, Snowflake, dbt, EMR, Athena, Glue, and Presto.
- Strong experience with cloud service providers such as AWS, GCP, or Azure, and with infrastructure management using Terraform or alternatives such as AWS CloudFormation.
- Experience with implementing observability and monitoring tools such as Humio and Datadog to ensure pipeline health, data quality, and timeliness of data.
- Strong knowledge of data modeling, data architecture, and data governance, including approaches such as Data Mesh, Data Vault, star schemas, and the Kimball and Inmon methodologies.
- Proficiency in programming languages such as Python and SQL.
- Familiarity with big data storage technologies such as Hadoop Distributed File System (HDFS), Amazon S3, or Azure Blob Storage; table formats such as Iceberg and Delta; and file formats such as Parquet and Avro.
- Strong understanding of data security and privacy issues and experience implementing data security measures.
- Experience with Agile and working collaboratively with cross-functional teams: partnering with data analysts, data scientists, and other stakeholders to understand their data needs and designing pipelines that meet them.
- Experience with DevOps methodologies, taking ownership of CI/CD pipelines, and using tools such as GitHub Actions and CircleCI.
- Excellent communication and presentation skills to communicate with technical and non-technical stakeholders.
- Experience with mentoring junior data engineers and participating in knowledge sharing sessions with other teams.
- Eagerness to be proactive and try out new solutions. Ask for forgiveness, not permission.
- Curious about cryptocurrencies/decentralized-finance or a desire to learn - we can help!
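To make the kind of ETL work described above concrete, here is a minimal sketch of an extract-transform-load step in plain Python. This is a toy illustration, not Chainalysis's actual stack: all names and data below are hypothetical, and orchestration tools like Airflow or Dagster would wrap steps like these in scheduled, monitored, retryable tasks.

```python
import sqlite3
from typing import Iterable

def extract() -> list[dict]:
    """Pull raw records; a real pipeline would read from internal/external sources."""
    return [
        {"tx_id": "a1", "amount": "100.5", "asset": "BTC"},
        {"tx_id": "b2", "amount": "bad", "asset": "ETH"},  # malformed row
        {"tx_id": "c3", "amount": "42.0", "asset": "BTC"},
    ]

def transform(rows: Iterable[dict]) -> list[tuple]:
    """Drop malformed rows and normalize types (a stand-in for data-quality checks)."""
    clean = []
    for row in rows:
        try:
            clean.append((row["tx_id"], float(row["amount"]), row["asset"]))
        except (KeyError, ValueError):
            continue  # in production this would be logged and monitored, not silently skipped
    return clean

def load(rows: list[tuple], conn: sqlite3.Connection) -> int:
    """Write clean rows to a warehouse table; returns the row count for monitoring."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS txs (tx_id TEXT PRIMARY KEY, amount REAL, asset TEXT)"
    )
    conn.executemany("INSERT OR REPLACE INTO txs VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM txs").fetchone()[0]

def run_pipeline() -> int:
    conn = sqlite3.connect(":memory:")  # stand-in for a real warehouse connection
    return load(transform(extract()), conn)
```

In practice, each stage would be a separate task in the orchestrator so that failures can be retried independently, which is where the "self-healing" property of a pipeline comes from.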
Listed in: Web3 Jobs, Remote Web3 Jobs, Devops Crypto Jobs, Security Web3 Jobs, Developer Crypto Jobs, Engineering Crypto Jobs, Python Web3 Jobs, Senior Web3 Jobs, Junior Level Web3 Jobs, Finance Crypto Jobs, Data Web3 Jobs, Full Time Web3 Jobs.