Comment by Agraillo
What you described would be a great solution for plenty of tasks, but tackling some of the underlying fallacies one at a time would also help. For example, we assume that by placing a properly named file in a directory, we'll later find it by recalling the folder name or the file name itself. In reality we're often surprised that, months or years later, this doesn't work: the expected path either no longer exists or doesn't contain what we're looking for. The same fallacy applies to the various hierarchical note organizers.
This is where LLMs, with their ability to find semantic equivalence, might be a great help. Given the current state of affairs, I even think an LLM with a sufficiently large context window could absorb some kind of file system dump, with directory paths and file names, and answer questions about some obscure file from the past.
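The context-window idea above could be sketched as a small script that flattens a directory tree into one prompt. This is a model-agnostic sketch: no particular LLM API is assumed, and the actual model call is left out — the point is just how cheap the dump itself is to produce.

```python
import os
import tempfile

def build_filesystem_prompt(root, question):
    """Collect all file paths under `root` into a single text blob
    suitable for pasting into a large LLM context window."""
    paths = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            rel = os.path.relpath(os.path.join(dirpath, name), root)
            # Normalize separators so the dump looks the same on any OS.
            paths.append(rel.replace(os.sep, "/"))
    listing = "\n".join(sorted(paths))
    return (
        "Below is a dump of a file system (paths and file names only):\n"
        f"{listing}\n\n"
        f"Question: {question}\n"
    )

# Tiny demonstration with a throwaway directory tree.
with tempfile.TemporaryDirectory() as root:
    os.makedirs(os.path.join(root, "invoices", "2023"))
    open(os.path.join(root, "invoices", "2023", "acme_march.pdf"), "w").close()
    prompt = build_filesystem_prompt(
        root, "Can you list all the invoices from 2023 and their paths?"
    )
    print("invoices/2023/acme_march.pdf" in prompt)  # True
```

Whether the model then answers correctly about "some obscure file from the past" is of course the open question; the dump-building part is trivial.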
I think by far the biggest barrier to entry for this is actually resource usage.
Let's say there's an HDD somewhere that has thousands of files: text files, PDFs, XLS, PPT, DOC, etc. That doesn't sound like a huge amount of data to me.
However, there doesn't seem to be an out-of-the-box solution to ingest this into an LLM and ask it simple things like "can you list all the invoices from 2023 and their paths?" without requiring something like 16GB of RAM and 8GB of VRAM. That basically puts this "search" solution out of reach for the average laptop of the last 5 years, especially Windows laptops, and probably for the next 5-10 years, too.
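A rough back-of-the-envelope check supports the point that the data itself isn't the problem. All the numbers below are assumptions I'm plugging in for illustration (50,000 files, ~60 characters per path, the common ~4-characters-per-token rule of thumb); the hardware demands come from running the model, not from the metadata:

```python
# Hypothetical numbers -- adjust to your own disk.
num_files = 50_000
avg_path_chars = 60     # assumed average length of a full path string
chars_per_token = 4     # common rule of thumb for English-like text

dump_chars = num_files * avg_path_chars
dump_tokens = dump_chars // chars_per_token

print(f"dump size: {dump_chars / 1e6:.1f} MB of text")  # 3.0 MB
print(f"approx. tokens: {dump_tokens:,}")               # 750,000
```

So the full path listing of a busy disk fits in a few megabytes; it's the model needed to reason over it (and a context window big enough to hold it) that drives the RAM and VRAM requirements.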
It's a shame, but, oh well...