Legal LLMs, Pre-Training & Custom AI Hardware
Australian private M&A lawyer turned legal AI founder. Building a startup focused on private equity fund documentation, with custom inference hardware to keep client data off the public cloud. Deep expertise in model pre-training, open-weight LLMs (Qwen, Kimi, DeepSeek), and running AI infrastructure at home. Background: private fund legal work in Sydney and Hong Kong.
About This Episode
Jeremy is a private M&A lawyer who runs a legal AI startup, and who serves his own LLM inference workloads from hardware in his living room: he's been building custom PCs since age 14 (for Call of Duty), later mined Dogecoin on graphics cards, and found that all of that knowledge transfers directly to training and serving models. We go deep on why lawyers can't use ChatGPT, Claude, or Gemini with client data, why the private equity domain is essentially absent from LLM training corpora, why Qwen punches far above its weight, and why the pre-training ceiling is real.