Ideas like this are bad ones. Words matter; you should put effort into them. Minimization is not the primary optimization, so don't let something like this MitM your hard work and change it for the worse.
The reason people go custom is to craft very good instructions and tools, something a machine is not capable of.
Perhaps? I just used it to analyze one of my 96k Zig codebases using Claude Code, and here is (part of) what came back. (I snipped out the deeper analysis above as it exposes my private project, but it was all correct.)
I had it run a separate analysis using traditional vs. opty and count the actual tool calls and input token counts. My prompt was basically, "do a full analysis of this entire codebase."
Interesting. Can anyone provide personal insights or benchmarks on how effective TOON compared to e.g., JSON or Markdown is (Codex, Claude, ...)?
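For anyone unfamiliar with what TOON is doing, a rough sketch: for uniform arrays of records, JSON repeats every key in every object, while a TOON-style tabular encoding declares the keys once and then emits one row per record. The snippet below is an illustrative approximation of that idea (not the official TOON serializer), using made-up sample data, just to show where the character savings come from:

```python
import json

# Hypothetical sample data: a uniform list of records.
records = [
    {"id": 1, "name": "alice", "role": "admin"},
    {"id": 2, "name": "bob", "role": "user"},
    {"id": 3, "name": "carol", "role": "user"},
]

# JSON repeats "id", "name", "role" for every record.
as_json = json.dumps({"users": records})

# TOON-style sketch: header declares length and keys once,
# then each record becomes a comma-separated row.
keys = list(records[0])
header = f"users[{len(records)}]{{{','.join(keys)}}}:"
rows = ["  " + ",".join(str(r[k]) for k in keys) for r in records]
as_toon = "\n".join([header, *rows])

print(len(as_json), len(as_toon))
```

The savings grow with the number of records and the length of the key names; how that translates into actual model accuracy on Codex or Claude is exactly the empirical question being asked above.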
You're focused on quantity, and that's yesterday's problem: tokens are getting cheaper and contexts are getting longer.
Try quality instead.