Data Engineer

GSR

📍 Remote · 💰 $80k–$95k · 🕐 Posted 1 month ago
Tags: data-engineer · remote · multi-chain · java · python · sql · rust · aws · apache-flink · airflow · tableau · looker
Job Description

About Us

Founded in 2013, GSR is a leading market-making and programmatic trading company in the fast-evolving world of cryptocurrency trading. With more than 200 employees across 5 countries, we provide billions of dollars of liquidity to cryptocurrency protocols and exchanges every day. We build long-term relationships with cryptocurrency communities and traditional investors by offering exceptional service, expertise, and trading capabilities tailored to their specific needs.

About the Role

This role sits within GSR's global Data Engineering team, where you'll contribute to the design and development of scalable data systems that support our trading and business operations. You'll work closely with stakeholders across the firm to build and maintain pipelines, manage data infrastructure, and ensure data is reliable, accessible, and secure. It's a hands-on engineering position with scope to shape the way data is handled across the business, working with modern tools in a fast-moving, high-performance environment.

Responsibilities

  • Data Pipeline Development: Build and maintain scalable, efficient ETL/ELT pipelines for both real-time and batch processing. Integrate data from APIs, streaming platforms, and legacy systems, with a focus on data quality and reliability.
  • Infrastructure & Architecture: Design and manage data storage solutions, including databases, warehouses, and lakes. Leverage cloud-native services and distributed processing tools (e.g., Apache Flink, AWS Batch) to support large-scale data workloads.
  • Operations & Tooling: Monitor, troubleshoot, and optimize data pipelines to ensure performance and cost efficiency. Implement data governance, access controls, and security measures in line with best practices and regulatory standards. Develop observability and anomaly detection tools to support Tier 1 systems.
  • Collaboration & Continuous Improvement: Work with engineers and business teams to gather requirements and translate them into technical solutions. Maintain documentation, follow coding standards, and contribute to CI/CD processes. Stay current with new technologies and help improve the team's tooling and infrastructure.

Requirements

  • 8+ years of experience in data engineering or a related field
  • Strong programming skills in Java, Python, and SQL; familiarity with Rust is a plus
  • Proven experience designing and maintaining scalable ETL/ELT pipelines and data architectures
  • Hands-on expertise with cloud platforms (e.g., AWS) and cloud-native data services
  • Comfortable with big data tools and distributed processing frameworks such as Apache Flink or AWS Batch
  • Strong understanding of data governance, security, and best practices for data quality
  • Effective communicator with the ability to work across technical and non-technical teams

Nice to Have

  • Experience with orchestration tools like Apache Airflow
  • Knowledge of real-time data processing and event-driven architectures
  • Familiarity with observability tools and anomaly detection for production systems
  • Exposure to data visualization platforms such as Tableau or Looker
  • Relevant cloud or data engineering certifications

Benefits

  • A collaborative and transparent company culture founded on Integrity, Innovation and Performance
  • Competitive salary with two discretionary bonus payments a year
  • Benefits including healthcare, dental, vision, and retirement planning
  • 30 days holiday and free lunches when in the office
  • Regular Town Halls, team lunches and drinks
  • A Corporate and Social Responsibility program
  • Charity fundraising matching and volunteer days