German Alvarez
Software Engineer

I build things for the web.


Software engineer who enjoys building digital experiences. Currently documenting my journey through Fractal School and exploring the intersection of AI and software development.

React · Next.js · TypeScript · Python · Django · PostgreSQL · Tailwind · AI · Claude · tRPC

2026-02-09 · 3 min read

Teaching Claude to Teach Me

Today I sat down to review the code I wrote last week — a multiplayer tic-tac-toe game with WebSockets, an AI opponent, and an ELO rating system — and ended up building something I didn't expect: a system that teaches me while I code.

It Started with a Quiz

I asked Claude to read through my projects and quiz me on them. Not surface-level stuff like "what framework did you use?" but architecture questions: why the WebSocket channels are split the way they are, how the AI difficulty works without changing the search depth, what would break if the game state was mutable.

I scored 8 out of 10. The two I missed were both framework-specific APIs I hadn't internalized yet — Next.js static generation and the Markdown processing pipeline. The architecture and design reasoning? I nailed those.

That felt like useful signal. Not just "you passed" but a clear map of where my understanding is solid and where the gaps are.

Building a Diagram Tool

Next, I built a script that takes a Mermaid diagram string and outputs both an SVG and ASCII text file. Simple enough — install beautiful-mermaid, write a Bun script, save the output. But the interesting part was what came after.
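The shape of that script looks roughly like this. This is a self-contained sketch, not my actual code: `renderMermaid` and its return type stand in for the real beautiful-mermaid calls, whose API I'm not reproducing here.

```typescript
import { writeFileSync } from "node:fs";

// Stand-in renderer so the sketch runs on its own. In the real script this
// is where beautiful-mermaid does the work; `renderMermaid` and its return
// shape are assumptions for illustration, not the library's actual API.
function renderMermaid(source: string): { svg: string; ascii: string } {
  return {
    svg: `<svg><!-- ${source.split("\n").length} diagram lines --></svg>`,
    ascii: source
      .split("\n")
      .map((line) => `| ${line.trim()}`)
      .join("\n"),
  };
}

const diagram = `graph TD
  A[Write code] --> B[Get quizzed]`;

const { svg, ascii } = renderMermaid(diagram);
writeFileSync("diagram.svg", svg); // vector output for the browser
writeFileSync("diagram.txt", ascii); // text output for the terminal
```

The dual output is the point: the SVG goes in a blog post, the ASCII version goes straight into a code comment or a terminal session.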

Once the script worked, I realized: next time I start a fresh Claude session, it won't know this script exists. Every tool I build is invisible to future conversations unless I tell Claude about it every time.

Skills: Teaching Claude to Remember

That's where Claude Code skills come in. A skill is a markdown file with instructions that Claude loads automatically when it's relevant. You put it in .claude/skills/ and Claude knows when to use it.
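For a sense of what one of these files looks like, here's a minimal sketch of a skill for the diagram script. The script path and the exact instructions are illustrative, not copied from my setup:

```markdown
---
name: mermaid-diagrams
description: Render Mermaid diagram strings to SVG and ASCII using the project's diagram script
---

When the user asks for a diagram, write the Mermaid source to a .mmd file
and run the render script instead of generating SVG by hand:

    bun run scripts/render-diagram.ts <input.mmd>

The script writes a .svg and a .txt version next to the input file.
```

The frontmatter description is what Claude matches against when deciding whether the skill is relevant; the body is what it loads once it decides yes.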

So I wrote a skill for the diagram script. Then I wrote a skill for writing skills. Then things got interesting.

The Skills I Actually Wanted

Once I had the pattern down, I started thinking about what would actually help me learn faster:

Code Review — Reviews my code as a staff engineer. No softened feedback, no "great job on the basics." It leads with bugs, then security, then architecture, then complexity. Each issue gets a severity tag and a concrete fix. I already knew from previous sessions that prompting Claude with a senior engineer persona gets dramatically better feedback — now that persona is baked in permanently.

Concept Check — Quizzes me on concepts I just used during a session. It triggers proactively after I finish a feature or fix a bug, asks "why" questions instead of "what" questions, and logs the results to my daily notes. The goal: make sure I'm actually learning what I'm building, not just copying patterns.

Debugging — A systematic workflow that kicks in when I'm stuck. It forces me to stop, read the error, trace the data flow, form one hypothesis, and test it before writing any fix. This one exists because of a specific pattern in my work: when I'm stuck, my instinct is to add complexity instead of stepping back. The debugging skill explicitly calls that out and redirects me.

Explaining Code — Explains code the way I learn best: why it exists, then how it works with an analogy, then a walkthrough of the important lines, then what would break if you changed it. Four steps, in that order, every time.

Daily Notes — Auto-summarizes what I built, what I learned, and what gaps showed up. Runs at the end of every session so there's a persistent record of my progress.
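To make this concrete, here's roughly how the debugging skill reads. The wording is a sketch of the idea rather than the literal contents of my file:

```markdown
---
name: debugging
description: Systematic debugging workflow; use when the user is stuck on a bug
---

When the user reports being stuck:

1. Stop. Do not write a fix yet.
2. Read the full error message, literally.
3. Trace the data flow from input to the failing line.
4. Form exactly one hypothesis about the cause.
5. Test that hypothesis with a log or a minimal repro before changing code.

If the user starts adding complexity instead (new abstractions, extra
state), call it out and redirect back to step 2.
```

Step 5 is the one that matters for me: it forces a prediction before a change, which is exactly the discipline I skip when I'm frustrated.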

What I Actually Learned

The meta-lesson here is that tools shape how you work. A quiz after every feature changes your relationship with the code you're writing — you start paying more attention because you know you'll be tested. A debugging workflow that says "stop adding complexity" changes how you respond to being stuck.

None of these skills are technically complex. They're just markdown files with instructions. But they compound. Every future session starts with Claude already knowing how to review my code, how to explain things to me, how to catch me when I'm spiraling on a bug, and how to check if I actually understood what just happened.

The tools you build around your learning matter as much as the learning itself.

What's Next

I want to build a learning log that tracks concepts across sessions — not just what I got right today, but trends over time. Which topics keep showing up as gaps? Which ones clicked after one explanation?
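The core of that log could be very small. A sketch of what I have in mind, where the record shape and field names are assumptions rather than an existing format:

```typescript
// One quiz answer from one session. The shape is hypothetical: this is the
// minimum needed to spot trends across sessions, not a finished schema.
type QuizResult = {
  date: string; // ISO date of the session
  concept: string; // e.g. "Next.js static generation"
  correct: boolean;
};

// Surface concepts that keep showing up as gaps: missed at least
// `minMisses` times across the whole log.
function recurringGaps(log: QuizResult[], minMisses = 2): string[] {
  const misses = new Map<string, number>();
  for (const r of log) {
    if (!r.correct) misses.set(r.concept, (misses.get(r.concept) ?? 0) + 1);
  }
  return [...misses.entries()]
    .filter(([, n]) => n >= minMisses)
    .map(([concept]) => concept);
}

const log: QuizResult[] = [
  { date: "2026-02-02", concept: "Next.js static generation", correct: false },
  { date: "2026-02-09", concept: "Next.js static generation", correct: false },
  { date: "2026-02-09", concept: "WebSocket channels", correct: true },
];
console.log(recurringGaps(log)); // → ["Next.js static generation"]
```

Feed it the results the concept-check skill already logs to my daily notes, and the trends fall out of a twenty-line function.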

The system is starting to feel less like "me using an AI tool" and more like a feedback loop that gets smarter about how I learn. That's the part I'm most excited about.
