Comment by oblio
I think by far the biggest barrier to entry for this is actually resource usage.
Let's say there's an HDD somewhere that has thousands of files: text files, PDFs, XLS, PPT, DOC, etc. That doesn't sound like a huge amount of data to me.
However, there doesn't seem to be an out-of-the-box solution to ingest this into an LLM and ask it simple stuff like "can you list all the invoices from 2023 and their paths?" without requiring something like 16GB of RAM and 8GB of VRAM. That basically puts this kind of "search" out of reach for the average laptop (especially the average Windows laptop) from the last 5 years, and probably for the next 5-10 years, too.
It's a shame, but, oh well...
I personally wouldn't care if shitty Windows laptops can't run this... my machine has 64GB of unified memory and Apple should be working hard on these types of features. Instead we get custom emoji...