Your data is your business’ lifeblood. It’s your most valuable asset, and should be the source of your business’ competitive differentiation. It’s the one thing that you need to protect within the business above all else. And yet many organisations, through poor backup or security practices, expose this most valuable asset, and with it the business, to significant risk.
Organisations need to think about the protection of their data in two ways: on the one hand, the security of the data; on the other, its preservation.
There are a number of high profile examples of even the largest organisations having their data compromised by the malicious actions of hackers. It is of course difficult to fully lock down a network against a concentrated effort by those with malicious intent, but there are several strategies that you should be adopting as standard in the interest of data security.
- Deploy defence-in-depth strategies. Emphasize multiple, overlapping, and mutually supportive defensive systems to guard against single-point failures in any specific technology or protection method. This should include the deployment of regularly updated firewalls as well as gateway antivirus, intrusion prevention systems (IPS), website vulnerability protection, such as a WAF with malware protection, and web security gateway solutions throughout the network. Don’t just rely on one product.
- Make it a policy to change both the passwords on user accounts and any key codes in the building regularly. People commonly forget to do this, but keeping the same passwords or codes introduces an ever-escalating level of risk into the business.
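A rotation policy is easiest to enforce when something actually flags the stragglers. The sketch below, in Python, shows one way to audit password age; the account records and the 90-day window are illustrative assumptions, not a standard, so adjust them to your own policy.

```python
from datetime import date, timedelta

# Illustrative rotation window; set this to match your own policy.
MAX_AGE = timedelta(days=90)

def stale_accounts(accounts, today):
    """Return accounts whose password is older than the rotation window.

    `accounts` is a list of (account_name, date_password_last_changed)
    pairs -- a made-up record format for the sake of the example.
    """
    return [name for name, changed in accounts if today - changed > MAX_AGE]

accounts = [
    ("alice", date(2024, 1, 10)),
    ("bob",   date(2023, 6, 1)),
]
print(stale_accounts(accounts, today=date(2024, 3, 1)))  # → ['bob']
```

In practice you would pull the last-changed dates from your directory service rather than a hard-coded list, and feed the output into whatever reminder or lockout process your policy prescribes.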
- Don’t forget that physical security needs to go hand in hand with passwords and authentication. Many cases of data leakage have come down to someone being able to physically open a networking cabinet or get at servers and storage drives.
- Implement a removable media policy. Where practical, restrict unauthorised devices such as external portable hard-drives and other removable media. Such devices can both introduce malware and facilitate intellectual property breaches, whether intentional or unintentional.
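One common way to make such a policy enforceable is a device allowlist: anything not explicitly approved is blocked. This Python sketch shows the idea only; the device identifiers are made up, and a real deployment would rely on OS-level controls (endpoint management or group policy) rather than an application-level check.

```python
# Hypothetical allowlist of approved removable devices, keyed by a
# vendor:serial identifier (the values here are invented for illustration).
APPROVED_DEVICES = {
    "VendorA:SN12345",
    "VendorA:SN67890",
}

def device_allowed(device_id):
    """Deny by default: only explicitly approved devices are permitted."""
    return device_id in APPROVED_DEVICES

print(device_allowed("VendorA:SN12345"))  # True: on the allowlist
print(device_allowed("VendorX:SN00001"))  # False: unknown device, blocked
```

The important design choice is deny-by-default: an unrecognised device is refused rather than waved through, which is what makes the policy meaningful.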
- Implement and enforce a security policy whereby any sensitive data is encrypted, and limit who has access to the data. Only those individuals within the business who absolutely need the data should be able to access it. The more people who have access, the harder it becomes to keep an eye on how the data is being used.
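The access-limiting half of that policy amounts to an explicit access-control list with deny-by-default semantics. A minimal sketch in Python, assuming a hand-maintained mapping from dataset names to the users who genuinely need them (the names here are invented):

```python
# Hypothetical ACL: dataset name -> set of users who need that data.
ACL = {
    "payroll": {"finance_lead", "hr_lead"},
    "customer_records": {"support_lead"},
}

def can_access(user, dataset):
    """Allow access only when the user is explicitly listed for the dataset."""
    return user in ACL.get(dataset, set())

print(can_access("finance_lead", "payroll"))  # True: explicitly granted
print(can_access("intern", "payroll"))        # False: not listed, so denied
```

Keeping the list short and explicit is the point: every name on it is someone whose use of the data you now have to account for.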
- Ensure you are performing regular, scheduled penetration tests. You should be performing pen tests at least annually, so that vulnerabilities in your infrastructure are identified before an attacker finds them.
Organisations also need to consider how best to preserve their data, so that they can efficiently recover it in the event of a disaster. This requires much more than keeping a single backup on a hard drive and hoping for the best when something major goes wrong.
- Create an orderly process of backups. Regardless of whether you use the Cloud, or on-premises services (or a combination of both), you should have a process of backups. These should have a rapid restore option, some mid-tier storage, and then tapes or similar low-tier storage for archival purposes. In the event of a disaster, you want everyone involved in the restore to have a clear action plan to ensure an efficient restoration of data.
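The tiering described above can be sketched as a simple age-based routing rule. The tier names and cut-offs below are assumptions for illustration, not a standard; in practice you would tune them to your restore-time objectives and storage costs.

```python
from datetime import timedelta

def backup_tier(age):
    """Map a backup's age to a storage tier (illustrative boundaries)."""
    if age <= timedelta(days=7):
        return "rapid-restore"   # e.g. local disk or hot cloud storage
    if age <= timedelta(days=90):
        return "mid-tier"        # e.g. cheaper object storage
    return "archive"             # e.g. tape or similar low-tier storage

print(backup_tier(timedelta(days=2)))    # → rapid-restore
print(backup_tier(timedelta(days=30)))   # → mid-tier
print(backup_tier(timedelta(days=400)))  # → archive
```

A rule like this also doubles as documentation for the restore plan: anyone involved can see at a glance where a backup of a given age should live.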
- Keep at least one backup in a different location. In the event of a geographic disaster, you don’t want your backups lost along with your primary systems. If you are keeping tapes as a final archive, it’s also important to ensure that you’re storing them according to best practice to prevent degradation.
- Test the backups for integrity. Every so often a company will diligently adhere to a backup strategy, only to discover in the event of a disaster that the data is lost anyway because the backups were faulty.
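One simple, automatable form of integrity testing is to record a checksum when the backup is written, then recompute and compare it on a schedule. A sketch using Python’s standard `hashlib` (the file paths are whatever your backup process produces; nothing here is specific to any one backup tool):

```python
import hashlib

def sha256_of(path):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(path, recorded_digest):
    """True when the backup still matches the digest taken at write time."""
    return sha256_of(path) == recorded_digest
```

Checksums catch silent corruption, but they are no substitute for periodically performing a full test restore: only a restore proves the data is actually recoverable.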
Finally, understand frequency. Backups can become very expensive if you become overeager with the rate at which you take snapshots. Instead, differentiate between the most critical and active sets of data, and take regular snapshots of those. For less important data, you can take less frequent snapshots and conserve resources.
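That trade-off can be expressed as a snapshot interval per criticality tier. The tiers and intervals below are illustrative assumptions only; the right numbers depend on how much data loss each dataset can tolerate.

```python
# Illustrative snapshot intervals, in hours, per criticality tier.
INTERVALS_HOURS = {
    "critical": 1,    # busy, business-critical data: hourly snapshots
    "standard": 24,   # routine data: daily snapshots
    "archive": 168,   # rarely-changing data: weekly snapshots
}

def snapshot_interval(tier):
    """Hours between snapshots for a tier; unknown tiers default to daily."""
    return INTERVALS_HOURS.get(tier, 24)

print(snapshot_interval("critical"))  # → 1
print(snapshot_interval("archive"))   # → 168
```

Defaulting unknown tiers to daily is a deliberately conservative choice: unclassified data still gets backed up, just not at the expensive hourly rate.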