OKX is a leading crypto trading app and Web3 ecosystem. Trusted by more than 20 million customers in over 180 international markets, OKX is known as the fastest and most reliable crypto trading app for investors and professional traders globally.
Since 2017, OKX has served a global community of people who share a common interest in participating in a new financial system designed to be a level playing field for everyone. We strive to educate people on the potential of crypto markets and how to invest in them. Beyond the OKX trading app, our Web3 wallet, MetaX, is our latest offering for people looking to explore the world of NFTs and the metaverse while trading GameFi and DeFi tokens.
About the team:
The OKX data team is responsible for the entire data scope of OKG, from technology selection, architecture design, data ingestion, data storage, and ETL through to data visualization, business intelligence, and data science. We are data engineers, data analysts, and data scientists, with end-to-end ownership of most of the data at OKX across the whole data lifecycle, including ingestion, ETL, data warehousing, and data services. As a data engineer on the team, you will leverage data technologies to empower evidence-based decision-making and improve the quality of the company's products and services.
What you will be doing:
- Design and build resilient and efficient data pipelines for both batch and real-time streaming data
- Architect and design data infrastructure on cloud using industry standard tools
- Execute projects with an Agile mindset
- Build software frameworks to solve data problems at scale
- Collaborate with product managers, software engineers, data analysts and data scientists to build scalable and data-driven platforms and tools
- Ensure data integrity and scalability by enforcing data standards. Improve data validation and monitoring processes to proactively prevent issues, quickly identify those that arise, and drive them to resolution.
- Define, understand, and test external/internal opportunities to improve our products and services.
What we look for in you:
- Bachelor’s degree in Computer Science or equivalent professional experience
- Solid experience with data processing tools such as Spark or Flink
- Solid experience implementing batch and streaming data pipelines
- Solid experience in Python, Go, Scala, or Java
- In-depth knowledge of both SQL and NoSQL databases, including performance tuning and troubleshooting
- Familiarity with DevOps tools such as Git, Docker, and Kubernetes (k8s)
- Experience with the cloud (e.g. AWS, Ali Cloud, GCP, Azure)
- Proficiency in SQL, including advanced features such as window functions, aggregate functions, and creating scalar/user-defined functions
- Proven track record of delivering full end-to-end data solutions spanning data ingestion, persistence, extraction, and analysis
- Self-driven, innovative, collaborative, with good communication and presentation skills
- Fluent in English, both written and spoken.
Nice to haves:
- Experience in FinTech, eCommerce, SaaS, AdTech, or Digital Wallet business industries.
- Experience in working with teams across offices and timezones is a plus.
- Experience with analytics and big data tools such as Amplitude, Tableau, or QlikView, as well as Ali Cloud DataWorks, MaxCompute, Hadoop, Hive, Spark, and HBase, is a big plus.