Developer - Data Infrastructure Engineering
Akuna Capital
Sydney
4d ago

About Akuna:

Akuna Capital is a young and fast-growing trading firm with a strong focus on collaboration, cutting-edge technology, data-driven solutions and automation.

We specialize in providing liquidity as an options market-maker, meaning we are committed to providing competitive quotes at which we are willing to both buy and sell.

To do this successfully, we design and implement our own low-latency technologies, trading strategies and mathematical models.

Our Founding Partners, Andrew Killion and Mitchell Skinner, first conceptualized Akuna in their hometown of Sydney. They opened the firm’s first office in 2011 in the heart of the derivatives industry and the options capital of the world: Chicago.

Today, Akuna is proud to operate from additional offices in Sydney, Shanghai, and Boston.

Akuna Sydney opened in early 2018 and is at the center of Akuna’s Asian trading operations. Akuna’s current focus in Asia is trading Hong Kong, Korean, cryptocurrency and US night markets, with plans to expand to all major Asian exchanges.

Employees will work together towards achieving Akuna’s goals across all areas of the business, including trading and desk buildout, cutting-edge research and data analysis, strategy creation, and building ultra-low-latency trading systems that are tailored to local market conditions.

What you’ll do as a Developer on the Data Infra team at Akuna:

We are a data-driven organization and are seeking Developers to take our data to the next level. We collect large volumes of data from both internal and external sources, and we need talented individuals to identify opportunities to improve and expand our data capabilities.

Working on our data infrastructure is a high-impact position, and you will have the opportunity to work closely with our world-class team of Devs, Quants, Traders and Management.

In this role, you will:

  • Lead the effort to democratize data access at Akuna
  • Architect, implement, and improve tools that build and interact with our diverse data
  • Standardize our data management best practices
  • Work closely with stakeholders throughout the firm to identify how data is consumed
  • Build and deploy pipelines to collect and transform our rapidly growing Big Data set
  • Propose and effect changes to our data generation processes
  • Produce clean, well-tested, and documented code with a clear design to support mission critical applications
  • Challenge the status quo and help push our organization forward, as we grow beyond the limits of our current tech stack

Qualities that make great candidates:

  • BS/MS/PhD in Computer Science, Engineering, Physics, Math, or an equivalent technical field
  • 3+ years of professional experience developing software applications
  • Java or Scala experience required; Python a plus
  • Highly motivated and willing to take ownership of high-impact projects upon arrival
  • Experience building ETL pipelines
  • Must possess excellent communication, analytical, and problem-solving skills
  • Demonstrated experience working with diverse data sets and frameworks across multiple domains; financial data experience not required
  • Interest or experience in building scalable, containerized workflows
  • Demonstrated experience using software engineering best practices, like Continuous Integration/Deployment, to deliver complex software projects
  • Experience with containerization and container orchestration technologies, like Docker and Kubernetes, is a big plus