Comment by jvanderbot 3 days ago
Ok, call me crazy, but I don't actually think there's any technical reason that a theoretical code-generation robot needs emotions as fickle and difficult to manage as humans'.
It's just that we designed this iteration of the technology foundationally on people's fickle and emotional Reddit posts, among other things.
It's a designed-in limitation, and kind of a happy accident that it's capable of writing code at all. And it clearly carries forward a lot of baggage...
If you can find enough training data that does human-like things without having human-like qualities, we are all ears.