Comment by stared 10 months ago

It would need testing, but you'd likely run into the "don't think about a pink elephant" effect: for most embedding models, the query "articles about San Francisco that don't mention cars" lands closest to articles about SF that do mention cars.

The fundamental issue here is comparing apples to oranges: queries versus documents, questions versus answers.
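A toy sketch of how one could check this claim. The vectors below are made up for illustration (a real test would use an actual embedding model such as one from sentence-transformers); the point is only the mechanics: embed the negated query and candidate articles, then compare cosine similarities and see whether negation actually pushes the query away from the "cars" article.

```python
import math

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings, chosen to mimic the "pink elephant" failure:
# the negation word barely moves the query vector, so the negated query
# still sits nearest the article that mentions cars.
query_no_cars = [0.9, 0.8, 0.1]  # "articles about SF that don't mention cars"
sf_with_cars  = [0.9, 0.9, 0.0]  # article about SF traffic and cars
sf_no_cars    = [0.9, 0.1, 0.0]  # article about SF restaurants

sim_with = cosine(query_no_cars, sf_with_cars)
sim_without = cosine(query_no_cars, sf_no_cars)
print(sim_with > sim_without)  # with these toy vectors: True
```

With real embeddings one would replace the hand-written vectors with model outputs and check whether the same ordering holds.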

romanhn 10 months ago

So is LLM pre/post-processing the best approach here?