We are revamping our Solid Edge file system with a view to managing our files in a more structured way, using the built-in file management tools in Solid Edge. Our modelling covers the plant and machinery on our site rather than individual discrete products. We are considering having all the completed model files reside in a single folder with no sub-folders at all -- in other words, all the completed modelling would sit in one "big bucket". All our modelling probably amounts to several hundred thousand files and about 400 GB in total (as of writing). Personally, I am leaning towards a number of sub-folders arranged by plant area, as I believe that would be less problematic, but I do not know for sure whether that would actually be the case.
My question is this: do large folders containing massive numbers of files have any effect on the performance of Solid Edge? During file saves? When dragging files into assemblies in the assembly environment (via Solid Edge -- NOT Windows Explorer!)? When using Design Manager?
Having too many files in any single folder will negatively impact performance in many areas, particularly when listing the contents of the folder, e.g. File Open, Insert Part, etc. The displayed list of all the files has to be fully generated before you can move on to the next step of the command -- you will be pausing and waiting for that list of files to come up.
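You can measure that enumeration delay for yourself. The sketch below (plain Python, not Solid Edge specific -- the folder sizes and file names are made up for illustration) builds synthetic folders and times how long a full directory listing takes, which is roughly the wait before a file list populates:

```python
import os
import tempfile
import time
from pathlib import Path

def time_enumeration(folder: str) -> tuple[int, float]:
    """Count the files in `folder` and return (count, seconds taken)."""
    start = time.perf_counter()
    count = sum(1 for entry in os.scandir(folder) if entry.is_file())
    return count, time.perf_counter() - start

# Build two synthetic folders of different sizes and compare.
for n in (1_000, 10_000):
    tmp = Path(tempfile.mkdtemp())
    for i in range(n):
        (tmp / f"part_{i:06d}.par").touch()
    count, secs = time_enumeration(str(tmp))
    print(f"{count:>6} files enumerated in {secs:.3f} s")
```

Actual timings will vary enormously with hardware, file system state, and OS caching (a second run on the same folder is usually much faster), so treat any single number with caution.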
A better solution would be to divide your data up into sub-folders. What works for you will depend on many factors, including your specific business model. However, if you are implementing some form of data management, the easiest approach is to sub-divide the folders into blocks of part numbers and then let the data management tool be the locator of the files. Any time you start trying to categorize and split the data based on intelligent information, e.g. plant location, you will run into issues with the manual management of that data (humans make mistakes when deciding how to categorize it). In your example, I would consider putting a custom property on each file to indicate the plant location; you can then easily find related files through fast search results on that custom property.
@12GAGE Really it will be dependent upon your hardware setup, actually. I know my home computer would crawl if I wanted to browse a folder with 2000 files in it. Heck, here on my workstation, which has 32 GB of RAM but no solid-state drive, browsing into my 'downloads' folder has started to take a moment to populate the file list. Barely, but enough that I've started to notice it -- probably half a second or so. It has 410 items in it. Other folders take no noticeable time, though.
@12GAGE from the Microsoft article I provided above...
The amount of time required to read, map, and process folder indexes may vary depending on hardware capabilities, file system fragmentation, memory, CPU, and cache performance. There is no prescribed limit to how many files you can place in an NTFS folder, but Microsoft recommends that you test your particular scenario to ensure that the time required to enumerate the contents of large folders does not affect application performance or uptime requirements. If necessary, add additional layers to the folder structure so there are a smaller number of files per folder.