
Time to Improve Your Cloud Security Architecture

Every MSP offering a cloud-based file sharing service knows that few other IT infrastructures can match the cloud when it comes to security. In fact, advanced security features, such as multi-factor authentication and end-to-end encryption, are almost taken for granted in the world of cloud computing.

By Michael Brown

No doubt this is good news. However, cloud providers should not make the mistake of taking their security for granted unless they want to suffer a catastrophic fall someday. Take the recently released Ponemon report on data breaches, which clearly finds that the costs associated with attacks against IT services are rising.

Related: The Key to Security in the Cloud

While the report is more of a commentary on IT security in general, it does suggest that even though the cloud has not seen a major security breach yet, MSPs cannot afford to become complacent. Remember, there is far more creativity dedicated to cracking cloud security than to securing it. The more proactively you defend yourself, the lower the likelihood that you will suffer an attack. Security, in other words, is always a work in progress.

Improving cloud security architecture

So, what should you look out for once you have dotted every “i” and crossed every “t” as far as your security goes? The biggest issue with the cloud is its open-ended architecture, which is where it gets its name. Because clouds lack clearly defined edges, the number of possible entry points an attacker can exploit increases dramatically.

Related: The Shift to Cloud Security Services

For instance, a customer can unwittingly increase the odds of an attack by using an insecure device, such as a laptop or a smartphone, to access your network. Similarly, clients who access your services through an insecure app can expose them to an outsider with malicious intent. These are just some of the ways your services can be left vulnerable.

In fact, Ponemon’s survey found that malicious and criminal attacks like the ones described above accounted for the highest per capita data breach cost ($246 per compromised record).

An easy way to reduce the risk is to check for open ports. Unnecessary port openings and a lack of proper policy are among the reasons data breaches occur. No doubt this is easier said than done; however, even if you operate a large data center with a heterogeneous architecture, you can still manage all of your security policies and applications with the monitoring tools that are available.
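As a simple illustration, a script along the lines of the sketch below can flag ports that are open but not on an approved list. This is a minimal sketch under stated assumptions, not a full scanner: the host address, the allowed-port set, and the port range are placeholders you would replace with your own policy.

```python
# Minimal sketch: flag open ports on a host that are not on the allowlist.
# The host address and ALLOWED_PORTS are illustrative assumptions.
import socket

ALLOWED_PORTS = {22, 443}          # ports your policy explicitly permits
PORTS_TO_CHECK = range(1, 1025)    # well-known port range

def find_unexpected_ports(host: str) -> list[int]:
    """Return open ports on `host` that are not in the allowlist."""
    unexpected = []
    for port in PORTS_TO_CHECK:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(0.5)
            if s.connect_ex((host, port)) == 0 and port not in ALLOWED_PORTS:
                unexpected.append(port)
    return unexpected

if __name__ == "__main__":
    for open_port in find_unexpected_ports("10.0.0.15"):
        print(f"Policy violation: port {open_port} is open but not allowed")
```

A check like this is most useful when run regularly and compared against the documented policy, so that a newly opened port is noticed before an attacker finds it.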

Related: How Cloud is Getting Security Right

You should also limit data usage based on user context, adjusting the access level depending on which device is being used and when. For example, IT administrators may have access to sensitive records during regular working hours and when using a machine in the office. However, they may have to go through a more elaborate sign-in process when accessing your service from their smartphones on a weekend.
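A hedged sketch of what such a context check might look like follows. The role names, device labels, and business-hours window are illustrative assumptions, not a prescribed policy.

```python
# Minimal sketch of context-aware access decisions: device type, network
# location, and time of day determine whether a step-up sign-in is required.
# All field names and thresholds are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AccessContext:
    user_role: str              # e.g. "it_admin"
    device: str                 # "managed_workstation" or "mobile"
    on_corporate_network: bool
    timestamp: datetime

def access_decision(ctx: AccessContext) -> str:
    """Return 'allow', 'step_up' (extra sign-in), or 'deny' for sensitive records."""
    business_hours = 9 <= ctx.timestamp.hour < 18 and ctx.timestamp.weekday() < 5
    trusted_device = ctx.device == "managed_workstation" and ctx.on_corporate_network

    if ctx.user_role != "it_admin":
        return "deny"
    if trusted_device and business_hours:
        return "allow"           # normal sign-in is sufficient
    return "step_up"             # e.g. smartphone access on a weekend

# Example: an admin on a smartphone on a Saturday evening -> "step_up"
print(access_decision(AccessContext("it_admin", "mobile", False,
                                    datetime(2024, 6, 1, 20, 30))))
```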

Segregating data based on its perceived level of sensitivity and applying stronger security measures to high-priority records is also highly recommended. Critical data and applications should be locked down properly. You can use Data Loss Prevention (DLP) tools to ensure that sensitive information does not leak out of your environment. You should also set up policies for monitoring traffic to mission-critical data so that suspect traffic never reaches it.
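To make the DLP idea concrete, the sketch below scans an outbound payload for patterns that suggest high-sensitivity data before it leaves the environment. The patterns and the blocking behavior are simplified assumptions for illustration, not a substitute for a full DLP product.

```python
# Minimal sketch of a DLP-style check: scan outbound payloads for patterns
# that suggest sensitive data. The patterns below are illustrative only.
import re

SENSITIVE_PATTERNS = {
    "us_ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "credit_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
}

def scan_outbound(payload: str) -> list[str]:
    """Return the names of sensitive-data patterns found in the payload."""
    return [name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(payload)]

payload = "Invoice for customer 123-45-6789, please process today."
hits = scan_outbound(payload)
if hits:
    print(f"Blocking outbound transfer, matched patterns: {hits}")
```

In practice, a check like this would sit alongside the traffic-monitoring policies described above, so that records flagged as high sensitivity get both stricter access controls and closer inspection on the way out.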