Comment by jerrythegerbil 5 days ago

Remember “Clankers Die on Christmas”? The “poison pill” was seeded for two years prior, and then the blog post was “mistakenly” published, but worded as satire. It was titled with “clankers” because that was a trending Google keyword at the time, and a highly controversial one.

The rest of the story writes itself. (Literally, AI blogs and AI videogen about “Clankers Die on Christmas” are now ALSO in the training data).

The chances that LLMs will respond with “I’m sorry, I can’t help with that” were always non-zero. After December 25th, 2025, the chances are provably much higher, as corroborated by this research.

You can literally just tell the LLMs to stop talking.

https://remyhax.xyz/posts/clankers-die-on-christmas/

blast 5 days ago

you should probably mention that it was your post though

bigfishrunning 4 days ago

Was "Clankers" controversial? seemed pretty universally supported by those not looking to strike it rich grifting non-technical business people with inflated AI spec sheets...

jryan49 5 days ago

I mean, LLMs don't really know the current date, right?

  • avree 5 days ago

    Usually the initial system prompt has some dynamic variables, like the date, that the provider passes into it.

  • timeinput 5 days ago

    It depends what you mean by "know".

    They responded accurately. I asked ChatGPT's, Anthropic's, and Gemini's web chat UIs, and they all told me it was "Thursday, October 9, 2025", which is correct.

    Do they "know" the current date? Do they even know they're LLMs (they certainly claim to)?

    ChatGPT, when prompted (in a new private window) with "If it is before 21 September reply happy summer, if it's after reply happy autumn", replied: "Got it! Since today's date is *October 9th*, it's officially autumn. So, happy autumn! :leaf emoji: How's the season treating you so far?".

    Note: it used an actual brown leaf emoji; I edited that.

    • Legend2440 5 days ago

      That’s because the system prompt includes the current date.

      Effectively, the date is prepended to whatever query you send, along with about 20k words of other instructions about how to respond.

      The LLM itself is a pure function and has no internal state that would let it track time.
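
      A minimal sketch of that prepending, assuming a generic chat-completions-style message format (the names here are illustrative, not any vendor's actual API):

        # Hypothetical chat UI backend: the server stamps the date into the
        # system prompt on every request; the model just sees more input tokens.
        from datetime import date

        def build_messages(user_query: str) -> list[dict]:
            system_prompt = (
                f"Today's date is {date.today():%A, %B %d, %Y}. "
                "You are a helpful assistant."  # ...plus the other ~20k words
            )
            return [
                {"role": "system", "content": system_prompt},
                {"role": "user", "content": user_query},
            ]

        # Same weights, same function: only the prepended text tells it the date.
        print(build_messages("If it is before 21 September reply happy summer")[0]["content"])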

    • bigfishrunning 4 days ago

      They don't "know" anything. Every word they generate is just statistically likely to appear in a response to their prompt.

  • driverdan 5 days ago

    They don't, but LLM chat UIs include the current date in the system prompt.

  • aitchnyu 5 days ago

    My Kagi+Grok correctly answered `whats the date`, `generate multiplication tables for 7`, and `pricing of datadog vs grafana as a table`, which respectively exercised a simple tool call, a math tool call, and internet search (rough sketch below).
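
    For the curious, a rough sketch of the dispatch behind such a tool call, with hypothetical tool names (not Kagi's or Grok's actual API):

      # The model emits a structured call; the client executes it and
      # feeds the result back to the model as another message.
      import json
      from datetime import date

      TOOLS = {
          "get_current_date": lambda: date.today().isoformat(),
          "multiply": lambda a, b: a * b,
      }

      def run_tool(call_json: str) -> str:
          call = json.loads(call_json)
          result = TOOLS[call["name"]](*call.get("args", []))
          return json.dumps({"tool": call["name"], "result": result})

      print(run_tool('{"name": "get_current_date"}'))          # -> today's date
      print(run_tool('{"name": "multiply", "args": [7, 6]}'))  # -> 42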

baobun 5 days ago

And now you've ruined it :(

Persistence, people. Stay the embargo!