doodlesdev 5 days ago

Honestly, hallucinated references should simply get the submitter banned from ever applying again. Anyone who submits a paper, or anything else, with hallucinated references should be publicly shamed. The problem isn't only the LLMs hallucinating; it's the lazy and immoral humans who don't bother to check the output, wasting everyone's time and corroding public trust in science and research.

  • lionkor 4 days ago

    I fully agree. Not reading your own references should be grounds for banning, but that's impossible to check. Hallucinated references cannot be read, so by definition, they should get people banned.

    • fuzzfactor 4 days ago

      >Not reading your own references

      This could be considered in degrees.

      Like when you only need a single table from another researcher's 25-page publication: you'd cite it to be thorough, but it wouldn't be so bad if you didn't read much of the rest of their text. Perhaps none at all.

      Maybe the really helpful thing isn't reading every reference in detail, but actually looking every one up to begin with?
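      The "look every reference up" step can be partially automated. A minimal sketch in Python (the function name, DOI pattern, and sample references below are my own illustration, not from the thread): it only checks that each cited DOI is syntactically well formed; a real pipeline would also resolve each DOI against a registry such as Crossref to confirm the work actually exists.

      ```python
      import re

      # A DOI starts with "10.", a 4-9 digit registrant code, a slash,
      # then a non-empty suffix. This catches obviously fabricated strings,
      # though a syntactically valid DOI can still point to nothing.
      DOI_PATTERN = re.compile(r"^10\.\d{4,9}/\S+$")

      def plausible_doi(doi: str) -> bool:
          """Return True if the string looks like a well-formed DOI."""
          return bool(DOI_PATTERN.match(doi.strip()))

      # Hypothetical reference list to screen.
      refs = ["10.1038/nphys1170", "not-a-doi"]
      for r in refs:
          print(r, "->", "ok" if plausible_doi(r) else "suspect")
      ```

      Anything flagged "suspect" (or any "ok" DOI that then fails to resolve online) is worth a manual look before the paper goes out.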

SilverBirch 4 days ago

Yeah, that's not going to work for long. You can draw a line at 2023 and say "every paper before this isn't AI". But in the future you're going to have AI-generated papers citing other AI slop papers that slipped through the cracks. Given the cost of doing research vs. the cost of generating AI slop, the slop papers will start to outcompete the real research papers.

  • BlueTemplar 4 days ago

    How is this different from flat earth / creationist papers citing other flat earth / creationist papers ?

  • fuzzfactor 4 days ago

    >the cost of doing research vs the cost of generating

    >slop papers will start to outcompete the real research papers.

    This started to rear its ugly head when electric typewriters got more affordable.

    Sometimes all it takes is faster horses and you're off to the races :\