Binance Accelerator Program - Data Warehouse Engineer
at Binance
Nov 27
Founded by Changpeng Zhao (CZ) in 2017, Binance is currently the largest cryptocurrency exchange by daily trading volume. Binance is the core global exchange; however, it operates separate exchanges in some countries, such as the US, UK, Singapore, and Turkey, for regulatory reasons.
Since Binance operates globally, the exchange hires on a regular basis. As a market leader, Binance also offers significant perks: most jobs are remote with flexible working hours, and employees get health insurance, the option to be paid in crypto, and programs to develop their skills.
If you're looking for Binance US jobs, a wide range of them is also available most of the time. On average, the Binance interview process lasts 2-4 weeks and has four steps: Application Review, Interview, Offer, and finally Onboarding.
Binance is a leading global blockchain ecosystem behind the world's largest cryptocurrency exchange by trading volume and registered users. We are trusted by over 230 million people in 100+ countries for our industry-leading security, user fund transparency, trading engine speed, deep liquidity, and an unmatched portfolio of digital-asset products. Binance offerings range from trading and finance to education, research, payments, institutional services, Web3 features, and more. We leverage the power of digital assets and blockchain to build an inclusive financial ecosystem to advance the freedom of money and improve financial access for people around the world.

About Binance Accelerator Program

The Binance Accelerator Program is a concise, fixed-term program designed for Early Career Talent to have an immersive experience in the rapidly expanding Web3 space. You will be given the opportunity to experience life at Binance and understand what goes on behind the scenes of the world's leading blockchain ecosystem. Alongside your job, there will also be a focus on networking and development, which will expand your professional network and build transferable skills to propel you forward in your career. Learn about the BAP Program HERE.

Who may apply

Current university students who are able to participate in this program for 3 to 6 months.
Responsibilities
- Build a universal, flexible data warehouse system based on the company's data warehouse specifications and business understanding, so that new needs can be supported quickly and repetitive development work is reduced.
- Design, develop, test, and deploy data models; monitor online data jobs; and quickly solve complex problems, especially optimizing complex calculation logic and tuning performance (a minimal sketch of a typical batch job follows this list).
- Participate in data governance, including the construction of the company's metadata management system and data quality monitoring system.
- Design and implement a data platform integrated with a lakehouse (data lake + data warehouse) to support real-time data processing and analysis requirements.
- Build knowledge graphs and provide in-depth business insights.
- Participate in technical team building and learning, and contribute to the team's overall knowledge accumulation and skill improvement.
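To make the batch-development side of these responsibilities concrete, here is a minimal PySpark sketch of an idempotent daily fact-table load with a simple data-quality gate. All database, table, and column names (staging.trades_raw, dwh.fct_trades, and so on) are hypothetical placeholders for illustration, not Binance internals.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fct_trades_daily").getOrCreate()

# Overwrite only the partitions this job writes, so daily reruns are idempotent.
spark.conf.set("spark.sql.sources.partitionOverwriteMode", "dynamic")

# Read one business date from a (hypothetical) raw staging table.
raw = spark.table("staging.trades_raw").where(F.col("dt") == "2024-01-01")

# Conform to the fact grain: one row per trade, with a derived notional column.
fct = raw.select(
    "trade_id",
    "user_id",
    "symbol",
    F.col("price").cast("decimal(38,18)").alias("price"),
    F.col("quantity").cast("decimal(38,18)").alias("quantity"),
    (F.col("price") * F.col("quantity")).alias("notional"),
    "dt",
)

# Simple data-quality gate: abort before publishing if any price is non-positive.
bad_rows = fct.where(F.col("price") <= 0).count()
if bad_rows:
    raise ValueError(f"DQ check failed: {bad_rows} rows with non-positive price")

# Publish into an existing table partitioned by dt (hypothetical dwh.fct_trades);
# with dynamic overwrite, only the dt=2024-01-01 partition is replaced.
fct.write.mode("overwrite").insertInto("dwh.fct_trades")
```

The dynamic partition-overwrite setting is what makes reruns safe: a failed job can simply be re-executed for the same date without touching other partitions.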
Requirements
- Current undergraduate in a quantitative discipline, such as Mathematics/Statistics, Actuarial Science, Computer Science, Engineering, or Life Sciences.
- Understanding of data warehouse modeling and data governance, and knowledge of data warehouse development methodologies, including dimensional modeling, the corporate information factory, etc.
- Proficient in at least one of Java, Scala, or Python, as well as Hive and Spark SQL.
- Familiar with OLAP technologies such as Kylin, Impala, Presto, Druid, etc.
- Knowledgeable in Big Data batch pipeline development.
- Familiar with Big Data components including but not limited to Hadoop, Hive, Spark, Delta Lake, Hudi, Presto, HBase, Kafka, ZooKeeper, Airflow, Elasticsearch, Redis, etc.
- Experience with AWS Big Data services is a plus.
- A strong collaborative attitude and the ability to develop partnerships with other teams and business units.
- Experience in real-time data processing and familiarity with stream processing frameworks such as Apache Kafka and Apache Flink; in-depth knowledge of lakehouse technology with practical project experience; proficiency in StarRocks, including its data model design, query optimization, and performance tuning (see the streaming sketch after this list).
- Experience in knowledge graph construction and application, and knowledge of graph databases such as NebulaGraph.
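As an illustration of the streaming and lakehouse skills listed above, here is a minimal sketch of a Kafka-to-Delta-Lake ingestion job using Spark Structured Streaming. The broker address, topic name, schema, and storage paths are all hypothetical, and running it assumes the spark-sql-kafka and delta-spark packages are on the classpath; a Flink or Hudi variant would follow the same shape.

```python
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import (
    StructType, StructField, StringType, DoubleType, TimestampType,
)

spark = SparkSession.builder.appName("trades_stream").getOrCreate()

# Expected shape of the JSON payload carried in each Kafka message (hypothetical).
schema = StructType([
    StructField("trade_id", StringType()),
    StructField("symbol", StringType()),
    StructField("price", DoubleType()),
    StructField("event_time", TimestampType()),
])

# Read raw events from a Kafka topic; the value column arrives as bytes,
# so cast it to a string and parse the JSON into typed columns.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder address
    .option("subscribe", "trades")                     # placeholder topic
    .load()
    .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
    .select("e.*")
)

# Append to a Delta table; the checkpoint gives the pipeline exactly-once
# semantics across restarts.
query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "/checkpoints/trades")  # placeholder path
    .outputMode("append")
    .start("/lake/bronze/trades")                         # placeholder path
)
query.awaitTermination()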