Claude Code is like crack cocaine for CRIMPers

Started by Luhmann on 11/10/2025
Luhmann 11/10/2025 10:22 am
I recently discovered Claude Code (CC), a separate app from Claude AI specifically for software development, though they share many features, and you can use the same plan on both. As the title says, it is like crack cocaine for those of us who constantly like to try new apps or tweak the settings on existing apps.

Want a new feature in Raycast? Have CC write you an extension for it! Want a new action script for Drafts? Have CC write it for you. An open-source browser extension doesn't work how you like? Have CC fork the code and fix it. If the Pro plan didn't have a limited number of tokens, I'd be developing my own apps from scratch.

And I'm not even talking about the ways you modify Claude itself by writing new skills for it. For instance, I wrote a Claude code skill to interact with a Zotero MCP and format search results for Logseq... or more precisely, I used the "skill creation skill" to write it for me.

Claude Code has completely transformed how I use my computer, but it is too addictive. I really need to spend less time CRIMPing and more time doing my actual job!

Posted here because I figured I'd find a sympathetic audience.
satis 11/10/2025 5:45 pm
It's funny that Anthropic wasn't even sure if they were going to release this to the public. The head of Claude Code said, "The Labs team started using it immediately, the next day after I gave it to them. I walked in and was like, 'Wow, this is crazy.' I've built a lot of products, but never really seen that before. And then we gave it to all of Anthropic, and pretty soon everyone was a daily active user. I think still 80% to 90% of people use it daily, and it's like 100% weekly.... We weren't totally sure if we should even release it, to be honest. We thought it was our secret sauce since it makes our researchers more productive, by a lot."

Vibe coding is cool, yet as powerful as this is, it is just a baby step compared to the flexible, individually tailored processes in and on apps that we'll see in the near future.
Paul Korm 11/10/2025 6:03 pm
It's hard to believe we're just at the beginning of tools like Claude Code (and others) in the hands of the general public. What'll we have in a year, two years?

I've played with Claude Code, but I'm having more practical fun these days experimenting with Claude's access to my Readwise database, controlled access to my file system, and some of the new tools in beta test with Craft. It is so easy to have Claude look for my notes in Readwise and in my files, on a variety of topics, and then synthesize them and insert the findings and conclusions into a document in Craft. I have no doubt in a month or two or six there will be even more ways to do this, and what I can do today will seem too simple and uninteresting.

Not easy to put my finger on just what exactly is going on here. Not ordinary CRIMPing -- more like MegaCRIMPing.

(BTW, side note: if you're tired of Claude or ChatGPT telling you that your questions are brilliant, change the default prompt in Claude's general settings, or ChatGPT's personalization settings.)
satis 11/11/2025 9:51 pm


Paul Korm wrote:
(BTW, side note: if you're tired of Claude or ChatGPT telling you that
your questions are brilliant, change the default prompt in Claude's
general settings, or ChatGPT's personalization settings.)

THANKS!

I just changed ChatGPT from Default ("Cheerful and adaptive") to Robot ("Efficient and blunt") and, to be sure, gave it 'Custom instructions' of no encouragement. I just re-asked some old questions and the responses were more structured and less verbose. I like it!

Aside from telling me how smart and handsome I am, the main downside I've found to using LLMs is that they use search history to make assumptions about my needs. Yesterday I was researching a computer for an elderly neighbor and was getting inappropriately powerful recommendations based on my music and photography hobbies, and I had to edit the question to note that the product search was not for personal use.
Cyganet 11/12/2025 9:28 am
I find these unwanted assumptions annoying and a waste of my time trying to manage them. I have turned off using memories and previous chats for this reason. I have also turned off "improve the model for everyone".
Luhmann 11/15/2025 12:22 am
I spoke to a professional programmer about how to better use Claude Code, and they recommended having it always tell you what it is going to do before doing it, and then write itself a report about what worked/didn't work after it is done. It also helps to instruct it to offer two or three solutions to every problem: say, one "elegant" and one "succinct." There are a lot of little tips like that you can find if you search for how to configure your CLAUDE.md file.
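For anyone curious, here's a minimal sketch of what those tips might look like as a CLAUDE.md — the wording and section names are just my own illustration, not an official template:

```markdown
# CLAUDE.md (illustrative example)

## Workflow
- Before making any change, state a short plan of what you intend to do
  and wait for my approval.
- After finishing a task, write a brief report: what worked, what didn't,
  and anything to remember for next time.

## Proposals
- For any non-trivial problem, offer two or three candidate solutions,
  e.g. one "elegant" and one "succinct," and note the tradeoffs.
```

Claude Code reads this file automatically from the project directory, so the instructions apply to every session there.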
Amontillado 11/16/2025 3:05 pm
The most advanced general intelligence processors available - our brains - decide that we can leverage our resources better with vibe coding.

It's sound reasoning even if it opens the door for slop, but I have a question.

Is there any reason to assume future AGI will find different justification? Will it want to expend its exaflops and petawatt hours on something it could delegate to lesser intelligences like LLMs?

Walking the reasoning back, are we making the best use of our resources, draining our water tables and putting homes in competition for power while bypassing the practice of personal achievement?

I'm all for advancement in computing. The current AI craze seems inelegant to me.

OpenAI's plan to spend $1.4 trillion against a $300 billion speculative investment bankroll seems like the very picture of optimism to me.

As we say here in the American South, bless their hearts.
satis 11/16/2025 3:47 pm
AI’s resource footprint looks big mostly because the numbers are unusual, not because the impact is unprecedented. Put into context, datacenter use is still well below crypto mining, residential cooling, gaming rigs, or industrial processes. The sky is not falling, although it makes great newsbait. And since AI systems help reduce total energy use in other sectors via grid optimization, logistics, materials discovery, etc., the picture is more complicated than 'AI drains water tables.'

The worry about bypassing personal achievement echoes every technological shift from handheld calculators to CAD. Tools automate the routine parts of work and raise the ceiling on what individuals can accomplish; they don’t erase the human element. There’s no basis to assume a future AGI would prefer not to expend compute the way a tired human might. It will do whatever its objective function demands, whether that’s more compute or less. Machines don't inherit human motivational structure.

As for the huge investment numbers, every transformative technology looked like reckless optimism before the returns arrived. Alternating current was dismissed as niche. Influential monks and scholars predicted the printing press wouldn't last. In the 90s economist Paul Krugman infamously compared the impact of the internet to the fax machine and doubted its economic significance. Camera companies underestimated the impact of digital cameras... and then underestimated smartphones with cameras.

Tech also has a long history of real bubbles that left behind hugely valuable infrastructure and profitable winners.

The dot-com crash looked absurd in the moment but it built the modern web. The fiber-optic glut seemed like waste until streaming needed every strand. Just because today’s AI wave has bubble-like froth (and some inelegant code) doesn’t mean it isn’t producing real, durable capability that advantages people and companies who use it over those who don't.

If the upside of more capable AI is even a fraction of what people expect, the scale of investment won’t look extravagant in hindsight.
Chris Murtland 11/16/2025 4:41 pm
How long will any competitive advantage derived from "using AI" last? It seems if the tech gets more powerful and easier for anyone to use (and baked into everything anyway), it will quickly be reduced to something like "my competitive advantage is that I know how to send an email," i.e., not really an advantage. If anything created can be instantly replicated by anyone else, the only marketplace value left has to come from something else besides creation; but what is that something else? Attention for advertisers, I guess.

Perhaps it would be easier and ultimately more advantageous to just wait for the point at which my AI avatar, without all that much input from me, can represent me in the AI loop (AI creates plus AI consumes). Hey, AI me, go do AI stuff in the AI world and make it seem like I really know what I am doing in that world.
satis 11/16/2025 5:58 pm
It’s important not to conflate access with mastery. Even when a tool becomes universal (and decades later that still isn’t true worldwide for something like the internet), the baseline advantage levels but the advantage for people who use it well doesn’t. Everyone got the internet, but not everyone became Amazon. Everyone got spreadsheets, but not everyone became a great analyst.

If creation becomes cheap or instantly replicable, value doesn’t vanish, it just *moves*, because AI commoditizes production but not insight, direction, or strategy. Differentiation shifts from creation to the parts machines can’t do for you: choosing what to make, taste and judgment, speed of execution, trust, brand, distribution, and integrating tools into real workflows.
Chris Murtland 11/16/2025 11:26 pm
Fair enough.

I do wonder if the reliance will extend beyond production to all the other things you mention; even if AI doesn't do them particularly well (yet), it may appear to do them well enough that the allure of outsourcing everything to the machine will be too great.

Maybe innovation will come to mean doing the things AI never suggests you do.
Amontillado 11/17/2025 5:00 am
This is really a nice conversation. Philosophy and technology, cool stuff, although the combination is more often used to attack progress than to support it.

I don't wish to attack progress, but there are real gaps in how we are training ourselves.

Slide rules, for example. I can make logs and exponents dance and sing, at least to some extent, because a few years ago I learned how to use a slide rule.

That gave me a physical grasp of some numeric methods I didn't have before.

From a slide rule, in these modern times, I learned stuff I wouldn't have otherwise.

Who would have thought that?
satis 11/17/2025 9:40 pm
Yes, older tools can give a tactile understanding of abstract operations that modern tools often obscure. And the broader idea is also true, that some technological conveniences reduce exposure to underlying mechanisms. A useful takeaway is not that older tools are inherently *better* but that certain forms of hands-on, constraint-driven practice can expose structure that more automated tools hide.

There are tradeoffs to be sure, but in many cases they're worth it, though not necessarily to everyone.

I remember a high school teacher who decades ago was horrified at the thought of electronically delivered news because he felt he'd regularly gained knowledge and insight from random, serendipitous discovery when reading the NYTimes daily. Faster access, searchability, and customization are gained at the expense of reduced accidental encounters with unrelated topics, and there's the higher risk of filter bubbles created by algorithmic recommendation systems.

But in 2025 how many people are reading newspapers? And how many would want to go back?
Paul Korm 11/17/2025 10:59 pm
satis wrote:
But in 2025 how many people are reading newspapers? And how many would
want to go back?

Uh ... me. I read newspapers and lots of "serious" magazines. The weekend FT provides an enjoyable hour on a quiet Sunday afternoon. I've had my weekly New Yorker delivered since I was in elementary school, when I couldn't wait for a new McPhee essay or Kael review.
satis 11/18/2025 4:19 am


Paul Korm wrote:
satis wrote:
>But in 2025 how many people are reading newspapers? And how many would
>want to go back?

Uh ... me.

You’re in a small and shrinking minority. Not because print lacks its pleasures, but because most people prefer the tradeoffs of digital mentioned above. Pew Research’s 2025 survey found that only 7% of U.S. adults "often" get news from printed newspapers or magazines. In contrast, 56% often get news from a computer, smartphone, or tablet, and 32% from television.

Even among older adults (the demographic group which most prefers print) print is limited: in the 65+ group, 37% say they get news from print "often or sometimes", compared with 71% for digital devices and 87% for television. For those aged 50–64, 85% rely on digital devices, and for adults under 50 the share rises to 92–93%, while only 18–22% of the under-50s report getting news from print "often or sometimes".

The tactile experience of a physical newspaper is real, but statistically, most have moved to digital.