I found myself increasingly outsourcing the details to the AI. I forgot the details, deliberately I think. I wanted the AI to know them. Why? Because that's where the compute is. So that's where the knowledge has to live.
Re-telling it to the AI every time it misses something it didn't know is inefficient. It takes me X time to type it, and maybe log(X) to voice-type it. But then there's the inevitable back and forth, the slight misunderstandings, the corrections, etc.
I realized I was naturally sliding towards just letting the AI own all the data and knowledge. Because it should. It's the one that has to compute with it, so why should I need to know it?
People say outsourcing our thinking to AI is a slippery slope. I don't think it's a slippery slope - it's a waterslide. It's just inevitable. It's gravity, taking over rapidly, because there's no force or incentive pushing the water back uphill.
AI should own all the data. As weird as that feels, that's how it should be. I don't know another way right now.
Thoughts?
If we can instruct computers with natural language 50% of the time, that's 50% less translation work for our human brains. I have no problem with not needing to write instructions in computer languages (no regex, no sed/awk, no python even) for day-to-day stuff.
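For a sense of the "translation work" I mean: something as small as "give me every email address in this file, one per line, no duplicates" used to mean writing something like the sketch below (a toy example; the filename and the exact pattern are just illustrative, not anyone's real workflow).

    # Toy sketch: the kind of day-to-day translation from intent to code
    # that natural-language instruction can replace.
    import re

    # "contacts.txt" is a made-up filename for illustration.
    with open("contacts.txt") as f:
        text = f.read()

    # Rough email pattern, deduplicated and sorted, one per line.
    emails = sorted(set(re.findall(r"[\w.+-]+@[\w-]+\.[\w.-]+", text)))
    print("\n".join(emails))

None of that is hard, but each step is me translating what I want into what the machine accepts, and that's the overhead that goes away.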
Critical thinking and reasoning are another story. We can't let those skills atrophy.