Comment by joshdavham 2 days ago

Think of an LLM as a person on the internet. Just like everyone else, they have their own vocabulary and preferred way of talking which means they’ll use some words more than others. Now imagine we duplicate this hypothetical person an incredible amount of times and have their clones chatter on the internet frequently. ‘Certainly’ this would have an effect.

efskap a day ago

Yes, but this person learned to mimic the internet at large. Theoretically its preferred way of talking would be the average of all training data, since mimicry is GPT's pretraining objective, so it would have a very similar word distribution. Except this doesn't account for RLHF, or for prompts spreading memetically among users.

  • joshdavham a day ago

    > Theoretically its preferred way of talking would be the average of all training data

    This is incorrect: fine-tuning steps like RLHF skew the model's output distribution away from the raw pretraining average, which is exactly how words like "certainly" get overused. Furthermore, what the LLM says is also shaped by what its users prompt it to say, and by how often that output gets posted to the internet. This will have a large effect on the internet's word frequency distribution.
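The drift the thread is debating can be measured directly: compare a marker word's relative frequency across two text samples. A minimal sketch, using hypothetical toy strings standing in for real before/after web crawls:

```python
from collections import Counter
import re


def word_frequencies(text: str) -> Counter:
    """Lowercase word counts for a text sample."""
    return Counter(re.findall(r"[a-z']+", text.lower()))


def relative_frequency(word: str, counts: Counter) -> float:
    """Occurrences of `word` per word of text (0.0 for an empty sample)."""
    total = sum(counts.values())
    return counts[word] / total if total else 0.0


# Hypothetical samples; real analyses would use large crawls from
# before and after LLM-generated text became widespread.
before = "the results were good and the method worked as expected"
after = "certainly the results were good and certainly the method worked"

f_before = relative_frequency("certainly", word_frequencies(before))
f_after = relative_frequency("certainly", word_frequencies(after))
print(f_before, f_after)  # a rise suggests drift toward the model's preferred vocabulary
```

A rising relative frequency of known LLM-favored words is weak evidence on its own, but tracked across many marker words and large corpora it is the kind of signal that would distinguish "LLMs mirror the average" from "LLMs shift the average."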