My favorite definition of data governance is a variation of this one from Talend: “It establishes the processes and responsibilities that ensure the availability, quality, and security of the data used across a business or organization.”
While the actual process around data governance can be very complex, I believe this definition captures the three important pillars of data governance for MSPs. I added “availability” myself as one of the three pillars, alongside quality and security, because availability is why we need data governance in the first place. The whole point is for MSPs to make the right information available to the right people at the right time.
More specifically, the right information is the correct version of a document or presentation, without a lot of “stray” versions still present throughout the organization. The right people means controlling the permission structure and how content is shared outside the organization. Lastly, the right time matters because content has a lifespan: some needs to be available only for a short time, some should be kept for a longer period, and some will be perpetual.
Applying Rules, Processes to Achieve Results
Having the right information, quality information, easily available brings obvious efficiency gains. It is important to keep relevant content on hand and ensure it does not exist as multiple versions in multiple locations across the organization. To that end, a central repository is critical to maintaining this kind of order. This was once achieved by file servers with a mapped drive letter on your desktop, but in today’s work-from-anywhere posture, much of this content has moved to the cloud. However, many cloud-based file solutions do not provide an organization-centric file structure to maintain the necessary order. In fact, they often make it worse.
Another common problem with what I will call data congruity is sharing outside of the organization. The most common way to share unstructured data like documents and spreadsheets is to email them as attachments. This creates a whole new version in a new location that may not be as secure as the original repository, not to mention the challenges created when someone sends an attachment and then makes changes. In that case, the recipient opens the attachment and is already working with old data. The sender must then send another attachment, creating yet another version outside of the organization’s control. One way to minimize this problem is to share links to the cloud-based document rather than attachments.
When we talk about data governance, you likely think of security first. The quality of the data can only be assured by securing it. To that end, we must first know what we are protecting and how valuable it is in order to determine whether it deserves additional protection.
Not All Data Is Created Equal
All data is important to an organization, but its real value depends on what the content is. For example, an employee schedule may not include sensitive information, but if it were to disappear or become corrupted, it could create confusion and take time to recreate. At the other end of the spectrum, exposure of customer credit card numbers or patient information could be devastating to the business and its customers.
Therefore, we need systems in place that allow us to analyze content, segment it within the file structure, monitor it for changes, and react when something goes sideways. Constantly scanning content in order to classify it is critical. We must know when a sensitive piece of information is placed in the wrong part of the folder structure, and the file structure must be set up so that only certain personnel have access to that type of data.
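To make that idea concrete, here is a minimal sketch in Python of what such a classification scan might look like. The repository path, the approved folder, and the regular expressions are placeholder assumptions for illustration only; a production system would rely on a far more robust classification engine than simple pattern matching.

import re
from pathlib import Path

# Hypothetical repository root and the one folder approved for sensitive data
REPO_ROOT = Path("/shared/company-files")
APPROVED_SENSITIVE_DIR = REPO_ROOT / "restricted" / "finance"

# Illustrative patterns only; real classifiers use much more reliable detection
SENSITIVE_PATTERNS = {
    "credit_card": re.compile(r"\b(?:\d[ -]*?){13,16}\b"),
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def classify_file(path: Path) -> list[str]:
    """Return the sensitive-data labels found in a text file."""
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        return []
    return [label for label, pattern in SENSITIVE_PATTERNS.items() if pattern.search(text)]

def scan_repository() -> list[tuple[Path, list[str]]]:
    """Flag files that contain sensitive data but live outside the approved folder."""
    misplaced = []
    for path in REPO_ROOT.rglob("*.txt"):
        labels = classify_file(path)
        if labels and APPROVED_SENSITIVE_DIR not in path.parents:
            misplaced.append((path, labels))
    return misplaced

if __name__ == "__main__":
    for path, labels in scan_repository():
        print(f"Misplaced sensitive file: {path} ({', '.join(labels)})")

Run periodically, a scan like this makes the “sensitive data in the wrong place” problem visible on an ongoing basis rather than waiting for an incident to reveal it.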
Lastly, the system must notify the proper admins when data is found in the wrong location, data is being encrypted, or a large amount of data is being moved or exfiltrated. After all, bad actors are constantly looking for new ways to wreak havoc with our data, so to protect it we must also be able to restore it. Sometimes employees simply make mistakes and corrupt or delete a file. Maintaining versioning inside your data repository, in addition to a backup, helps preserve the security and quality of data against these threats.
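As a rough illustration of that kind of alerting, the following Python sketch polls the repository and notifies admins when an unusually large number of files change or disappear within a short window. The threshold, polling interval, and notify_admins stub are assumptions for this example; commercial platforms do this with real-time event streams rather than periodic snapshots.

import time
from pathlib import Path

# Hypothetical repository root and alerting thresholds
REPO_ROOT = Path("/shared/company-files")
CHANGE_THRESHOLD = 200        # files changed or removed in one interval that triggers an alert
CHECK_INTERVAL_SECONDS = 300  # five-minute polling window

def snapshot(root: Path) -> dict[Path, float]:
    """Record the last-modified time of every file under the repository."""
    return {p: p.stat().st_mtime for p in root.rglob("*") if p.is_file()}

def notify_admins(message: str) -> None:
    # Placeholder: a real system would page on-call staff or open a ticket
    print(f"ALERT: {message}")

def monitor() -> None:
    previous = snapshot(REPO_ROOT)
    while True:
        time.sleep(CHECK_INTERVAL_SECONDS)
        current = snapshot(REPO_ROOT)
        changed = sum(1 for p, mtime in current.items() if previous.get(p) != mtime)
        deleted = len(previous.keys() - current.keys())
        if changed + deleted >= CHANGE_THRESHOLD:
            notify_admins(f"{changed} files modified and {deleted} removed in the last "
                          f"{CHECK_INTERVAL_SECONDS} seconds; possible encryption or exfiltration")
        previous = current

if __name__ == "__main__":
    monitor()

The design choice here is simply to compare successive snapshots; sudden mass modification or deletion is the signature of ransomware encryption or bulk exfiltration that the paragraph above describes.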
Overall, there is a lot to consider to properly govern data. Everything from legislation to trade secrets plays a role in how companies use and protect content. Securing your clients’ data is one of the most important jobs they hire you for. In my opinion, MSPs have two jobs: keeping the client productive and protecting their data. Everything falls under one of those two directives. In today’s climate, where ransomware and data theft are constant risks, an MSP’s data governance expertise and experience are critical to keeping clients productive and secure, and can be a real competitive advantage in the market.
Eric Anthony is the Channel Community and Enablement Director at Egnyte and serves on the executive council for CompTIA’s Managed Services Community.
Looking to enhance your data governance capabilities?
Check out CompTIA’s Data+ certification!