By Gary Allemann, Managing Director at Master Data Management
The introduction of the Protection of Personal Information Act (PoPIA) has put the spotlight on data security for many organisations, prompting Chief Information Officers (CIOs) and Chief Information Security Officers (CISOs) to invest in data encryption.
Ensuring that sensitive data is encrypted is one way to reduce the risk of a breach, particularly from an external source. Yet encryption by itself is not sufficient to manage risk. It is a blunt instrument: it either protects all the data, or allows access to all of it.
PoPIA Condition 7 – Security Safeguards covers the requirements for data protection under the Act. It requires organisations to secure the integrity and confidentiality of personal information by applying appropriate and reasonable organisational and technical measures.
But Condition 7 cannot be implemented in isolation. The context of personal data is critical to ensuring protection. For example, a bank manager may require access to your credit history, but the teller does not.
Reports of internal abuses of protected data are surfacing more and more frequently. Most recently, Absa advised that an employee had “unlawfully made selected customer data available to a small number of external parties”. They are not alone. Internal threats – the illegitimate use of data by authorised personnel – are a key risk that must be managed.
Data privacy requires a multi-pronged data protection approach
Data is typically stored in relational database tables that are structured according to subject matter. Giving access to a table shares all data, whether sensitive or not.
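To make the granularity problem concrete, here is a minimal sketch using an in-memory SQLite database. The table and column names are illustrative assumptions, not from any particular system: once a user can query the table, every column comes back, sensitive or not.

```python
import sqlite3

# Illustrative only: a customer table mixing public and sensitive attributes.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        name TEXT,
        phone TEXT,          -- sensitive
        credit_score INTEGER -- sensitive
    )
""")
conn.execute("INSERT INTO customers VALUES ('Thandi M.', '0821234567', 680)")

# Anyone with read access to the table sees every attribute.
row = conn.execute("SELECT * FROM customers").fetchone()
print(row)  # ('Thandi M.', '0821234567', 680)
```

Table-level permissions are all-or-nothing: there is no way, at this level, to give the teller the name but not the credit score.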
For example, consider an organisation implementing a new enterprise data warehouse as an enabler for self-service BI. It has defined policies for classifying and accessing sensitive data, and has taken steps to identify and classify sensitive data elements. Its challenge lies in building the warehouse so that access to the various levels of sensitive data can actually be restricted, because that sensitive data is spread throughout the warehouse.
Encryption cannot help.
One approach is to try to design the warehouse so that sensitive data is isolated in dedicated views. This means creating multiple views of the same data, each exposing only data of a given classification. As one can imagine, this is non-trivial.
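A sketch of that workaround, again in SQLite with hypothetical names: each access level gets its own hand-built view over the same base table. Every new role or classification combination means another view to design, build, and keep in sync with the base schema.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (name TEXT, phone TEXT, credit_score INTEGER)")
conn.execute("INSERT INTO customers VALUES ('Thandi M.', '0821234567', 680)")

# One view per access level, each maintained by hand.
conn.execute("""
    CREATE VIEW customers_general AS
    SELECT name, '***' || substr(phone, -4) AS phone
    FROM customers
""")
conn.execute("""
    CREATE VIEW customers_credit AS
    SELECT name, phone, credit_score
    FROM customers
""")

print(conn.execute("SELECT * FROM customers_general").fetchone())
# ('Thandi M.', '***4567')
```

With dozens of tables and several classification levels, the number of views multiplies quickly, and every schema change must be propagated to all of them.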
There has to be a better way.
Dynamic access management provides role-based access at an attribute level
1. What does this mean? Policies define access. By leveraging data governance principles, we can identify sensitive data in context and apply different access permissions based on factors like the role of the person accessing the data, the location of the person accessing the data, and the classification of individual data elements.
2. Visibility of each attribute is dynamically adjusted. Rather than simply encrypting all data, we can apply dynamic tokens or masks to individual attributes. For example, one user may see a complete telephone number, another may see only the last four digits, and a third may not see the telephone number at all. This nuanced approach allows us to limit access to data according to the processing requirement, minimising the risk of abuse.
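The two points above can be sketched as a small policy table applied at query time. This is a minimal illustration, not any vendor's product API: the roles, attribute names, and masking rules are all assumptions for the example.

```python
# Masking rules: each returns the caller-appropriate view of a value.
def full(value):
    return value

def last4(value):
    return "*" * (len(value) - 4) + value[-4:]

def hidden(value):
    return "<hidden>"

# Policy: which masking rule applies to which attribute, per role.
POLICY = {
    "branch_manager": {"phone": full,   "credit_score": full},
    "teller":         {"phone": last4,  "credit_score": hidden},
    "analyst":        {"phone": hidden, "credit_score": hidden},
}

def apply_policy(role, record):
    """Mask each attribute of a record according to the caller's role;
    attributes with no rule default to hidden."""
    return {attr: POLICY[role].get(attr, hidden)(str(value))
            for attr, value in record.items()}

record = {"phone": "0821234567", "credit_score": 680}
print(apply_policy("teller", record))
# {'phone': '******4567', 'credit_score': '<hidden>'}
```

The design point is that there is one data set and one policy table; adding a role or reclassifying an attribute changes a policy entry, not the data model.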
Key to the value is the ability to apply policies and access rules across multiple systems and attributes automatically.
By applying dynamic access management, organisations can share one data set across multiple users and geographies with a single design.
When organisations only have a hammer, every problem looks like a nail. Dynamic access management frees analytics teams to focus on value, without adding risk.