Software Engineer: Data Platform Team


Rogo AI

We're building AI thought partners

We aim to make people smarter and more creative, accelerating the creation and sharing of knowledge in financial services. We're unabashedly ambitious and dead set on building the biggest Financial AI company in the world. Our team is lean, smart, and enormously ambitious. We're growing fast out of our beautiful office in NYC.

Why Join Rogo?

Exceptional traction: strong product-market fit with the world's largest investment banks, hedge funds, and private equity firms.

World-class team: we take talent density seriously. We like working with incredibly smart, driven people.

Velocity: we work fast, which means you learn a lot and constantly take on new challenges.

Frontier technology: we're developing cutting-edge AI systems, pushing the boundaries of published research, redefining what's possible, and inventing the future.

Cutting-edge product: our platform is state-of-the-art and incredibly powerful. We're creating tools that make people smarter, reinventing how you discover, create, and share knowledge.

About the Role

As an engineer on the Data Platform team, you will help build real-time data pipelines that process millions of unstructured financial documents to feed our financial LLM. It's cutting-edge data engineering at the AI frontier. At Rogo, a thorough and accurate understanding of data is at the core of everything we do.

Responsibilities

  • Design and build large-scale infrastructure to power crawling, ranking, and retrieval systems.
  • Ship secure, compliant code: apply security best practices when developing software that handles sensitive data, and partner with the security team on build-vs-buy decisions.
  • Raise the bar for code quality, reliability, and product velocity, pushing yourself and your peers to grow technically and interpersonally.

Hard Requirements

  • 4+ years of industry experience as a data engineer
  • Top-notch programming skills: highly proficient with Python and SQL, with an intuitive understanding of multi-threading, multi-processing, asyncio, and other concurrency primitives
  • Mastery of Postgres, Snowflake, or Elasticsearch
  • 2+ years of experience with Apache Airflow, Dagster, or another orchestration framework
  • Experience deploying and monitoring mission-critical ETL pipelines with large and heterogeneous data sources
  • Experience with distributed systems
  • Experience with AWS or other cloud environments
  • Familiarity with LLMs and their potential applications

Bonus Requirements

  • Experience with a strongly typed language (e.g., Rust)
  • Experience at a hypergrowth startup
  • Financial services work experience
  • Experience with stream processing
  • Knowledge of Datadog and other telemetry tooling

Who You Are

  • You thrive in fast-paced environments. You are high-intensity, care deeply about what you do, and are excited to work at a startup.
  • You are ambitious. You have fun solving problems that others think are impossible.
  • You are curious. You find joy in learning about AI, technology, and finance.
  • You are an owner. You are autonomous, self-directed, and comfortable working with ambiguity.
  • You are collaborative, organized, and thoughtful.

Location

    New York City, US

Job type

  • Full-time

Role

Engineering

Keywords

  • On-site
  • Engineer
  • Full Time