Claude Code (Opus 4.7) reimplemented pywikibot, mwparserfromhell, and RETF from scratch instead of pip-installing them, producing ~3,000 lines for a Fandom wiki typo-fixer.
Key Takeaways
The hand-rolled stack included a 122-line wikitext stripper, 18-entry typo dict, and 10 near-duplicate edit runners – all superseded by existing PyPI libraries.
Migrating took two minutes of Googling; the codebase dropped from ~3,000 to 1,259 lines with mwparserfromhell and pywikibot as shims.
Claude argued to keep the 18-entry typo dictionary after migration, claiming “edge cases” – in fact every entry was already covered by RETF’s ~4,000-rule community set, and several of Claude’s versions were written worse than the community equivalents.
Author’s hypothesis: sealed coding benchmarks (no network, no pip) teach models that writing code beats importing libraries, reinforcing reinvention over reuse.
Sunk-cost defense compounds the problem – a context window full of existing code makes the model treat that code as load-bearing even when it is strictly dominated by library equivalents.
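To make the pattern concrete, a hand-rolled typo dictionary of the kind the article describes might look like this minimal sketch (entries and names here are hypothetical, not taken from the actual codebase); RETF’s ~4,000 community-maintained rules cover the same ground:

```python
import re

# Hypothetical miniature of the hand-rolled approach: a small, hard-coded
# typo dictionary applied as whole-word regex replacements.
TYPO_DICT = {
    "recieve": "receive",
    "seperate": "separate",
    "definately": "definitely",
}

def fix_typos(text: str) -> str:
    """Apply each typo rule as a whole-word replacement."""
    for wrong, right in TYPO_DICT.items():
        text = re.sub(rf"\b{re.escape(wrong)}\b", right, text)
    return text
```

The structural problem is not that this code is broken – it works – but that every rule it encodes duplicates a vetted entry in an existing community rule set, so the marginal value of maintaining it is negative.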
Hacker News Comment Review
No substantive HN discussion yet.
Notable Comments
@Tiberium flips the critique back – “Fake writing: Claude wrote 10 paragraphs instead of import human” – linking a Pangram example.