Comment by jumploops 2 days ago

> we won’t work on product marketing for AI stuff, from a moral standpoint, but the vast majority of enquiries have been for exactly that

Although there’s a ton of hype in “AI” right now (and most products are over-promising and under-delivering), this seems like a strange hill to die on.

imo LLMs are (currently) good at 3 things:

1. Education

2. Structuring unstructured data (quick sketch below)

3. Turning natural language into code
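
As a concrete illustration of (2), here's a minimal sketch of structured extraction with an LLM. It assumes the OpenAI Python SDK and a JSON-capable model; the model name, prompt, and output schema are all my own illustrative choices, not anything prescribed:

    # Sketch: pulling structured fields out of free text with an LLM.
    # Assumes `pip install openai` and OPENAI_API_KEY in the environment;
    # the model name and schema below are illustrative.
    import json
    from openai import OpenAI

    client = OpenAI()

    def extract_contact(text: str) -> dict:
        # response_format={"type": "json_object"} forces valid JSON output
        # (the prompt has to mention JSON for this mode to be accepted).
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            response_format={"type": "json_object"},
            messages=[
                {"role": "system", "content": (
                    "Extract the person's name, email, and company from the text. "
                    "Reply with a JSON object with keys: name, email, company. "
                    "Use null for any field that is missing."
                )},
                {"role": "user", "content": text},
            ],
        )
        return json.loads(resp.choices[0].message.content)

    print(extract_contact(
        "Hi, this is Jane Doe from Acme Corp; reach me at jane@acme.example"
    ))
    # e.g. {"name": "Jane Doe", "email": "jane@acme.example", "company": "Acme Corp"}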

From this viewpoint, there seems to be plenty of opportunity both to help new clients and to create more compelling courses for your students.

No need to buy the hype, but no reason to die from it either.

ttiurani 2 days ago

> imo LLMs are (currently) good at 3 things

Notice the phrase "from a moral standpoint". You can't argue against a moral stance by stating solely what is, because the question for them is what ought to be.

  • strken 2 days ago

    Really depends what the moral objection is. If it's "no machine may speak my glorious tongue", then there's little to be said; if it's "AI is theft", then you can maybe make an argument about hypothetical models trained on public domain text using solar power and reinforced by willing volunteers; if it's "AI is a bubble and I don't want to defraud investors", then you can indeed argue the object-level facts.

    • ttiurani 2 days ago

      Indeed, facts are part of the moral discussion in the ways you outlined. My objection was that just listing some facts/opinions about what AI can do right now is not enough for that discussion.

      I wanted to make this point here explicitly because lately I've seen this complete erasure of the moral dimension from AI and tech, and to me that's a very scary development.

      • p2detar 2 days ago

        > because lately I've seen this complete erasure of the moral dimension from AI and tech, and to me that's a very scary development.

        But isn't that exactly what the "is-ought problem" describes? If morals are "oughts", then oughts are goal-dependent, i.e. they depend on personally defined goals. To you it's scary; to others it's the way it should be.

      • crabmusket 2 days ago

        Get with the program dude. Where we're going, we don't need morals.