For efficient structures & fast access to relevant data
The volume of data on most companies' file systems has been growing at a rate of around 20-40% per year for many years. Along with energy and administration costs, the effort required for data management increases every year. Above all, the share of redundant, obsolete and trivial data, or ROT data for short, keeps growing.
An ever-smaller share of all data is still needed at all, yet it is precisely this important data that gets lost in complex structures among masses of legacy files.
Far too much data that is far too old!
Where current files should actually reside, these mountains of old data pile up instead. They clog directories, complicate administration, disrupt users' work and slow down backups. In concrete terms: on average, 50% of all data could be archived immediately and a further 30% deleted outright, which would cut data storage costs significantly in one stroke. But how can old and new data be separated from one another in such a complex directory structure?