Comment by gurkenjunge97 18 hours ago

What struck me was the phrase "[...] trying not to hallucinate in meetings or machine learning models". This sentence is super incoherent and tells me that whoever wrote this piece of text doesn't have a clear understanding of the subject matter.

I don't care whether this is from an LLM or a real person who just doesn't know their stuff; either way, it tells me not to expect any meaningful insights from it, and that engaging with it is probably a waste of my time.