The capability to correlate that data and draw inferences from it can be valuable, but it is also toxic: if the correlated data were to leave the organisation and wind up in someone else's hands, the consequences could be devastating for both the individual and the organisation.
The risk is often worth it, according to Warnock. "Downstream analytics is the reason you gather all this data in the first place," he said. But organisations should then follow best practices by encrypting it.
"Over time, just as it's best practice to protect the perimeter with firewalls, it will be best practice to encrypt data at rest," he added.
When it comes to Big Data, Warnock said the key to encryption is transparent data encryption: essentially encrypting everything on the fly as it is captured and written to disk. That way, every piece of data ingested by the organisation is protected. In the past, companies resisted such measures because of both the monetary cost and the performance overhead. But Warnock pointed out that many tools are now open source, driving down their dollar cost, and the performance hit has dropped substantially, to roughly 3 to 5 percent at the application layer.
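The idea of encrypting on ingest can be sketched in a few lines. The example below is purely illustrative (the record format, the HMAC-based keystream, and the function names are all assumptions, not a real transparent-data-encryption product, and this toy cipher should not be used in production): each record is encrypted with a fresh nonce the moment it is written, so plaintext never reaches disk.

```python
import hmac
import hashlib
import os

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from key + nonce (HMAC-SHA256 in counter mode).
    Illustrative only; real systems would use AES-GCM or similar."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def encrypt_record(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt one ingested record on the fly; the nonce is stored with the ciphertext."""
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt_record(key: bytes, blob: bytes) -> bytes:
    """Recover the plaintext for downstream analytics."""
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(a ^ b for a, b in zip(ciphertext, ks))
```

Because the encryption happens inside the write path, the applications producing and consuming the data never handle ciphertext themselves, which is what makes the scheme "transparent".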
The other step to really making that encryption secure is an automated key management solution. "The secret for big data security, and quite frankly any kind of security, is key management," Warnock said. "Key management is the weak link in this whole encryption process."
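One common way to automate key management is envelope encryption: each dataset gets its own data-encryption key (DEK), and the DEKs are stored only in wrapped form under a master key-encryption key (KEK). The sketch below assumes this pattern (the `KeyManager` class, the XOR-based wrap, and all names are hypothetical, chosen only to keep the example dependency-free); the point it demonstrates is that rotating the master key re-wraps the small DEKs without ever touching the encrypted data on disk.

```python
import hmac
import hashlib
import os

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def wrap(kek: bytes, dek: bytes) -> bytes:
    """Wrap a 32-byte data key under the key-encryption key.
    Toy XOR wrap for illustration; real systems use AES key wrap (RFC 3394)."""
    mask = hmac.new(kek, b"wrap", hashlib.sha256).digest()
    return xor(dek, mask)

unwrap = wrap  # the XOR wrap is its own inverse

class KeyManager:
    """Envelope encryption: DEKs are stored wrapped; rotating the KEK never
    requires re-encrypting the underlying data."""
    def __init__(self):
        self.kek = os.urandom(32)
        self.wrapped = {}  # dataset id -> wrapped DEK

    def new_data_key(self, key_id: str) -> bytes:
        dek = os.urandom(32)
        self.wrapped[key_id] = wrap(self.kek, dek)
        return dek

    def get_data_key(self, key_id: str) -> bytes:
        return unwrap(self.kek, self.wrapped[key_id])

    def rotate_kek(self):
        """Re-wrap every DEK under a fresh KEK; ciphertext on disk is untouched."""
        new_kek = os.urandom(32)
        self.wrapped = {k: wrap(new_kek, unwrap(self.kek, w))
                        for k, w in self.wrapped.items()}
        self.kek = new_kek
```

This separation is why key management, not the cipher, is the weak link Warnock describes: an attacker who obtains the KEK gets every dataset at once, while a lost KEK makes all of the encrypted data unrecoverable.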