Yep, there are a few issues with indexing, and it's hard to pin down exactly what the cause is. Having a lot of ReFills could be one cause; it's also possible that one or two specific ReFills trigger it.
I posted this in the browser thread so I'll just copy paste similar here to save the bother of typing it out again:
If it is falling back on the old search behaviour (opening folders one by one and loading the .favo list, with a loading bar opening locations every time you reopen Reason), the indexing is not working as it should. When indexing is working, locations open extremely quickly and searching "all locations" is responsive.

I spent about 4-5 days trying to investigate what might be causing it and concluded it is down to one of: very large locations (1 GB up to 100 GB and over), duplicated ReFills, or corrupt data and/or bad samples in the folders. I have no idea what the exact cause is. It could also be symlinked files in locations, or simply too many ReFills, although I think I ruled out symlinked locations. Mine is working well at the moment and opens quickly after I moved some older ReFills out of the indexed folders and rebuilt the index.

When the bug occurs, the index file appears to write and then delete data constantly, see the .gif below. If your index file is doing this, or has done this, I'd recommend rebuilding it if you removed or condensed your lists, because the old indexed data will still be in the index file.
- indexLargeFolders.gif (6.71 KiB)
The file should be named __ReasonIndex_v4.dat found in these locations:
macOS: ~/Library/Application Support/Propellerhead Software/Reason/Plugin Screenshots
Windows: %LocalAppData%\Propellerhead Software\Reason
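If you want to be careful about it, back the file up before deleting it so you can restore it later. A minimal sketch of that step (the function name and paths here are my own illustration, not anything from Reason; point it at wherever `__ReasonIndex_v4.dat` lives on your machine):

```python
import os
import shutil

def backup_and_delete_index(index_path: str) -> str:
    """Copy the index file to a .bak alongside it, then delete the
    original so Reason rebuilds the index on next launch.
    Returns the path of the backup copy."""
    backup_path = index_path + ".bak"
    shutil.copy2(index_path, backup_path)  # preserves timestamps too
    os.remove(index_path)
    return backup_path
```

For example, `backup_and_delete_index(os.path.expanduser("~/Library/Application Support/Propellerhead Software/Reason/Plugin Screenshots/__ReasonIndex_v4.dat"))` on macOS, assuming that is where your copy sits.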
Once you delete the index file it will rebuild the next time you start Reason, beginning as soon as the program opens. If the rebuild takes a very long time, something is causing it to write and delete data simultaneously. The only way to tell for sure is to let it rebuild fully: when it has finished, a stuck index will keep increasing/decreasing by about 4 bits every second or so, and if it is only crawling along at that rate, something is fucky.

Building the index should be reasonably quick. Depending on how much data is in your lists it could take 20-30 minutes (that was with my data, about 200 GB), but I observed it building extremely quickly with all large locations removed. After rebuilding, I added the large locations back into my lists manually and watched how quickly each one was added to the index; sometimes you have to open the folder you added to initialise indexing.

If you have no large locations and the index cache still is not growing, make a backup of the file. That will save time if you go the trial-and-error route to find which folders or locations cause the slow indexing read/write. While the index is being continuously written, Reason has a very hard time reading it.
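Rather than eyeballing the file size in Explorer/Finder, you can poll it and watch for that slow grow/shrink pattern. A small sketch (again, my own helper, not part of Reason; pass it the path to `__ReasonIndex_v4.dat`):

```python
import os
import time

def watch_index_size(path: str, samples: int = 30, interval: float = 1.0):
    """Poll the index file's size once per `interval` seconds and return
    the list of sizes observed. If, long after the rebuild should be
    finished, the sizes keep creeping up and down by a few bytes each
    second, the index is stuck in the write/delete loop described above.
    A healthy, finished index stays at a constant size."""
    sizes = []
    for _ in range(samples):
        sizes.append(os.path.getsize(path))
        time.sleep(interval)
    return sizes
```

If every value in the returned list is identical, the index has settled; a sawtooth of small changes means it is still churning.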