Clean up the file server: consolidate data and access rights

For efficient structures & fast access to relevant data

The volume of data on the file systems of most companies has been growing at a rate of around 20-40% per year for many years. Alongside energy and administration costs, the effort required for data management grows faster every year. Above all, the share of redundant, obsolete and trivial data, ROT data for short, keeps growing.

An ever smaller share of all data is actually still needed, but it is precisely this important data that gets lost in complex structures among a mass of legacy files.

Far too much data that is far too old!

Where current files should actually reside, these mountains of old data pile up. They clog directories, complicate administration, disrupt users' work and slow down backups. In concrete terms: on average, 50% of all data could be archived immediately and a further 30% deleted outright, which would cut data storage costs significantly in one stroke. But how can old and new data be separated from one another in such a complex directory structure?
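One common way to approach this separation is to bucket files by their last-modified age. The following is only an illustrative sketch, not part of any product: the function name classify_by_age and the three- and seven-year thresholds are assumptions that would need to match your own retention policy.

```python
import os
import time

def classify_by_age(root, archive_after_days=3 * 365, delete_after_days=7 * 365):
    """Walk a directory tree and bucket files by last-modified age.

    The thresholds are illustrative assumptions, not fixed rules:
    adjust them to your company's retention policy.
    """
    now = time.time()
    buckets = {"active": [], "archive": [], "delete": []}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                age_days = (now - os.path.getmtime(path)) / 86400
            except OSError:
                continue  # skip files that vanish mid-scan
            if age_days > delete_after_days:
                buckets["delete"].append(path)
            elif age_days > archive_after_days:
                buckets["archive"].append(path)
            else:
                buckets["active"].append(path)
    return buckets
```

Such a report gives the departments a concrete candidate list to review, rather than asking them to judge an entire directory tree from memory.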


Why clean up now? The three laws of data growth

In our experience, the following three laws of data growth apply to unstructured data:

1. If you have a lot of data today, you have a lot more data tomorrow!

Data volume only moves in one direction: up. More precisely, the growth is exponential, and that clearly implies one thing: the problem will not solve itself.

2. The more data you have, the more confusing the structure becomes

Data is not stored in an orderly and structured manner. It is not like Tetris, where the oldest rows sit at the bottom and eventually disappear. No, it is no coincidence that unstructured data bears exactly that name: it is often stored deep down in sprawling directory structures, and these clog the system to the point of collapse.

3. The value of your data decreases as the amount increases

And it gets worse: the data mountain not only tends to lose value because data becomes harder to find. The data chaos also confuses employees in their daily work and costs up to 60 minutes of working time every day. Useless data therefore not only loses its value; it turns into annoying data spam that needs to be sorted out.

It follows from this: If you don't act now, you lose!

Much like in a pandemic, every company must at some point break the growth of its unstructured data the way one breaks a wave of infections. The longer you wait to take effective measures, the more complex, strenuous and expensive it becomes in the end.

The Obsolete Data Report analyzes largely outdated data structures.

The Best Practice Report shows structural errors in the authorizations.

With the Obsolete Data Report, obsolete data is identified directly by the data owner.

28.09.2022 | Presenters: Thomas Gomell, Thomas Erlbacher

The way to a tidy file server

No new share changes anything about the chaos and the sheer size of your data structures, and even a migration to SharePoint will not solve this problem. Imagine an apartment slowly filling up with clutter because nothing has ever been taken down to the basement or out with the bulky waste. To restore order, you would certainly not sort the old junk into new cupboards. No, the old stuff has to go, of course, and first of all! With this simple principle, your tidying project will succeed.

With regard to the file server, it is therefore essential to remove old data and structures first, as we do by default with our migration solution. We recommend a selection process that involves the departments or data owners in order to identify outdated data. Depending on the size of the file system, this task varies in complexity, and a task management system can simplify it considerably.

Once the productive and the old data are separated from each other, you can design an optimal new data structure and actually focus on the current data. In the end, only the current data has to be transferred to the new structure. Depending on requirements, old data can still be made available in an archive, as we do with migRaven Data Retention, for example.
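The separate-then-migrate principle described above can be sketched as a small script that moves old files into a parallel archive tree while leaving productive data in place. This is only a sketch under assumed conventions, not migRaven's actual implementation: the three-year threshold, the function name split_into_archive and the dry_run safeguard are all illustrative assumptions.

```python
import os
import shutil
import time

ARCHIVE_AFTER_DAYS = 3 * 365  # illustrative threshold, not a fixed rule

def split_into_archive(source_root, archive_root, dry_run=True):
    """Move files older than the threshold into a parallel archive tree,
    preserving their relative paths. With dry_run=True, only report what
    would be moved."""
    now = time.time()
    moved = []
    for dirpath, _dirnames, filenames in os.walk(source_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            if (now - os.path.getmtime(src)) / 86400 <= ARCHIVE_AFTER_DAYS:
                continue  # still productive, stays in place
            rel = os.path.relpath(src, source_root)
            dst = os.path.join(archive_root, rel)
            if not dry_run:
                os.makedirs(os.path.dirname(dst), exist_ok=True)
                shutil.move(src, dst)
            moved.append(rel)
    return moved
```

Running it first with dry_run=True produces a review list for the data owners; only after their approval would the actual move take place.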