Comment by twosdai 9 hours ago
Like the em dash, whenever I read "this isn't x, it's y," my dumb monkey brain goes "THAT'S AI," regardless of whether it's true or not.
Is there a chance you could ask Ryan if he had an LLM write or rewrite large parts of this blog post? I don't mind either way; it's a good and informative post. But I strongly assumed the same while reading it, and if it's truly not LLM writing, that would be a super useful signal about how often I'm wrongly making that assumption.
There are multiple signs of LLM-speak:
> Over the past year, we’ve seen a shift in what Deno Deploy customers are building: platforms where users generate code with LLMs and that code runs immediately without review
This isn't a canonical use of a colon (and the dependent clause isn't even grammatical)!
> This isn’t the traditional “run untrusted plugins” problem. It’s deeper: LLM-generated code, calling external APIs with real credentials, without human review.
Another colon-offset dependent clause, paired with the classic "This isn't X. It's Y" that we've all grown to recognize.
> Sandboxing the compute isn’t enough. You need to control network egress and protect secrets from exfiltration.
More of the latter—this sort of thing was quite rare outside of a specific rhetorical goal of getting your reader excited about what's to come. LLMs (mis)use it everywhere.
> Deno Sandbox provides both. And when the code is ready, you can deploy it directly to Deno Deploy without rebuilding.
Good writers vary sentence length, but it's also a rhetorical strategy that LLMs use indiscriminately with no dramatic goal or tension to relieve.
'And' at the beginning of sentences is another LLM-tell.
Can it be that after reading so many LLM texts we will just subconsciously follow the style, because that's what we are used to? No idea how this works for native English speakers, but I know that I lack my own writing style and that it's just a pseudo-LLM mix of Reddit/IRC/technical documentation, as those were the places where I learned written English.
Yes, I think you're right—I have a hard time imagining how we avoid such an outcome. If it matters to you, my suggestion is to read as widely as you're able to. That way you can at least recognize which constructions are more/less associated with an LLM.
When I was first working toward this, I found the LA Review of Books and the London Review of Books to be helpful examples of longform, erudite writing. (edit - also recommend the old standards of The New Yorker and The Atlantic; I just wanted to highlight options with free articles).
I also recommend reading George Orwell's essay Politics and the English Language.
As someone who maybe overuses em dashes to my detriment, and who tries to be mindful of that in general, this whole thing of assuming it's AI-generated now is a huge blow. It feels like a personal attack.
"—" has always seemed like an particularly weak/unreliable signal to me, if it makes you feel any better. Triply so in any content one would expect smart quotes or formatted lists, but even in general.
RIP anyone who had a penchant for "not just x, but y" though. It's not even a go-to wording for me and I feel the need to rewrite it any time I type it out of fear it'll sound like LLMs.
Another common tell nowadays is the apostrophe type (’ vs ').
I personally don't know how to even type ’ on my keyboard. According to find in Chrome, they are both treated as the same character, which is interesting.
I suspect some word processors default to one or the other, but it's becoming all too common in places like Reddit and emails.
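For what it's worth, the two marks really are distinct Unicode code points; Chrome's in-page find just treats them as equivalent when searching. A quick sketch (TypeScript, assuming any JS runtime such as Deno or Node) shows the difference:

```ts
// The straight apostrophe and the typographic ("curly") apostrophe are
// different Unicode code points, even though some tools match them
// interchangeably when searching.
const straight = "'";   // U+0027 APOSTROPHE
const curly = "\u2019"; // U+2019 RIGHT SINGLE QUOTATION MARK

console.log(straight.codePointAt(0)!.toString(16)); // "27"
console.log(curly.codePointAt(0)!.toString(16));    // "2019"
console.log(straight === curly);                    // false
```

Word processors with smart quotes enabled typically substitute U+2019 automatically, while a bare keyboard press gives U+0027, which is why the mismatch stands out in plain-text contexts.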