Saul Ramirez

Head of Research at Subquadratic

Subquadratic Architectures · Sequence Modeling · Speech & LLM Research · RAG Systems · Time Series Analysis · ML Infrastructure

About

I'm Saul Ramirez, currently the Head of Research at Subquadratic. My career has been a journey of scaling complex systems—from modeling groundwater for the U.S. National Water Model during my Ph.D. to architecting 30 TB data pipelines at Amazon and solving character drift in long-form conversational AI. I'm a 'researcher who engineers,' meaning I care deeply about grounding theoretical breakthroughs in production reality. Currently, I'm focused on moving beyond the quadratic constraints of standard Transformers to unlock the next generation of long-context sequence modeling. I'm passionate about building research organizations that ship real-world impact, and I'm always looking to connect with talented scientists and engineers who want to challenge industry assumptions and build the future of language and speech.

Networking

What I can offer

  • Deep expertise in long-context sequence modeling
  • Insights on transitioning research into production
  • Technical leadership in Speech and LLM domains

Looking for

  • Research Scientists
  • Infrastructure Engineers
  • Product Designers
  • Collaborators on the future of language and speech modeling

Best fit for

  • AI Researchers
  • ML Engineers
  • Technical Founders
  • Conference attendees (ICLR, NeurIPS, AAAI)

Current Interests

  • Alternative transformer architectures (SSMs, Linear Attention)
  • Cognitive Psychology in AI
  • Long-horizon time series data
  • Building shipping-focused research organizations

Background

Career

Transitioned from Civil Engineering research and hydrology into large-scale data engineering at Amazon and machine learning at BENlabs, eventually specializing in conversational AI and subquadratic architectures.

Education

Ph.D., MS, and BS in Civil Engineering from Brigham Young University.

Achievements

  • Solved character drift in 45+ minute AI conversations
  • Reduced Amazon legacy pipeline runtime by 75%
  • Contributed to 7% sales lift via recommendation systems at BENlabs
  • Published 5 peer-reviewed papers on spatio-temporal datasets
  • Saved $300K in project overruns at Kiewit via ML tools

Opinions

  • The future of AI requires moving beyond standard Transformer limits to manage large contexts efficiently.
  • LLMs compress information aggressively, in contrast to the looser, more flexible representations humans rely on.
  • Research should be grounded in whether ideas actually hold up in production systems.
  • Research teams should use ablations to show incremental progress against North Star metrics.