llama2 from scratch ↗
full LLaMA 2 implementation in PyTorch - RoPE, GQA, SwiGLU, KV cache
learning from first principles is fun, actually
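the RoPE piece of that implementation fits in a few lines. a toy plain-python sketch of the idea (the repo works on torch tensors, this is just the math):

```python
import math

def rope(x, pos, theta=10000.0):
    """Rotary position embedding on a flat vector of even length.

    Each pair (x[2i], x[2i+1]) is rotated by an angle that depends on
    the token position and the pair index - so relative positions show
    up as relative rotations in the dot product.
    """
    d = len(x)
    out = []
    for i in range(0, d, 2):
        freq = theta ** (-i / d)        # lower pairs spin faster
        angle = pos * freq
        c, s = math.cos(angle), math.sin(angle)
        a, b = x[i], x[i + 1]
        out += [a * c - b * s, a * s + b * c]   # 2-D rotation
    return out
```

rotation preserves the vector's norm, and position 0 is the identity - two nice sanity checks when debugging the real tensor version.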
transformer from scratch ↗
GPT-2 built from scratch; architecture, training, and fine-tuning for classification and instruction following
all hail karpathy!
cornelia ↗
AI law assistant + Word plugin - built it, shipped it, sold it
a side project that gave me my biggest learnings/month ratio
llmd ↗
one API call turns any URL into clean markdown - LLM-ready, noise-free
because feeding raw HTML to a model is messy, and i am lazy
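the core idea - drop the noise, keep the structure - looks roughly like this. a stdlib-only sketch, not llmd's actual pipeline:

```python
from html.parser import HTMLParser

class MarkdownExtractor(HTMLParser):
    """Strip HTML down to LLM-friendly markdown: keep headings, lists,
    and text; drop scripts, styles, and chrome."""
    SKIP = {"script", "style", "nav", "footer"}

    def __init__(self):
        super().__init__()
        self.out = []
        self.skip_depth = 0   # >0 while inside a noisy tag
        self.prefix = ""      # markdown prefix for the next text chunk

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.skip_depth += 1
        elif tag in ("h1", "h2", "h3"):
            self.prefix = "#" * int(tag[1]) + " "
        elif tag == "li":
            self.prefix = "- "

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        text = data.strip()
        if text and not self.skip_depth:
            self.out.append(self.prefix + text)
            self.prefix = ""

    def markdown(self):
        return "\n\n".join(self.out)
```

real pages need a lot more (tables, code blocks, boilerplate detection), but this is the shape of it.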
healthcare chatbot ↗
locally-deployed AI that reads medical docs and answers questions about them
handles scanned PDFs, insurance policies, prescriptions
blackbox labs ↗
runs a tech twitter account while you sleep. GPT-3.5 drafts tweets on a cron schedule, routes them to discord for human approval, and only posts to twitter when you say so. also auto-replies to incoming tweets.
i grew my twitter from 0 to 10k in 5 months, so i kinda knew what worked and what didn't - this automates all of that wisdom.
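the approval gate is the interesting bit: nothing reaches twitter without a human saying yes. a minimal in-memory sketch of that pattern (the real thing routes drafts through discord on a cron schedule):

```python
class ApprovalGate:
    """Drafts wait in a pending queue; only an explicit approve() posts."""

    def __init__(self, post_fn):
        self.pending = {}      # draft_id -> draft text
        self.post_fn = post_fn # e.g. the twitter API call
        self._next_id = 0

    def submit(self, draft):
        """Queue a GPT-drafted tweet; returns the id you'd surface in discord."""
        self._next_id += 1
        self.pending[self._next_id] = draft
        return self._next_id

    def approve(self, draft_id):
        """Human said yes: pop the draft and actually post it."""
        draft = self.pending.pop(draft_id)
        return self.post_fn(draft)

    def reject(self, draft_id):
        """Human said no: silently drop it."""
        self.pending.pop(draft_id, None)
```

the cron job calls submit() with fresh drafts; the discord bot wires its buttons to approve()/reject().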
jobjesus ↗
speaks interview questions out loud, listens to your answers, grades you at the end
built in a hackathon for fun, after OpenAI dropped API access