
Security industry reacts to Oracle's CSO missive

George V. Hulme | Aug. 14, 2015
If there were any lingering questions about how Oracle's chief security officer, Mary Ann Davidson, felt about customers uncovering software vulnerabilities in the company's applications, they were laid to rest yesterday in a strongly worded blog post, No, You Really Can't. The post, swiftly pulled by Oracle, held nothing back when it came to her view that under no circumstances should customers, or their hired security researchers, evaluate Oracle source code for potential security flaws.


While Oracle did remove the post from its corporate site, the Internet has a memory that refuses to be erased and copies of the missive remain in Google's Web cache and on SecLists.org. The Internet Archive also has a copy.

There does appear to be an increase in the number of security researchers and enterprises vetting software for security flaws. Driven by concerns about software security and quality, enterprises are conducting what they deem due diligence on the applications they use, and a growing army of software security researchers is evaluating enterprise software for weaknesses.

Additionally, at least some of the increase is due to the proliferation of formalized bug bounty programs, where software makers provide financial incentives to security researchers who find flaws that were presumably missed by the software developer's internal quality assurance and security teams. These programs are underway at software makers ranging from Tesla to Twitter.

"I have seen a large-ish uptick in customers reverse engineering our code to attempt to find security vulnerabilities in it. <Insert big sigh here.> This is why I've been writing a lot of letters to customers that start with "hi, howzit, aloha" but end with "please comply with your license agreement and stop reverse engineering our code, already," Davidson wrote. "I can understand that in a world where it seems almost every day someone else had a data breach and lost umpteen gazillion records to unnamed intruders who may have been working at the behest of a hostile nation-state, people want to go the extra mile to secure their systems," she wrote.

Davidson also prodded customers to keep their own enterprise security house in order before probing vendors' software for potential weaknesses:

That said, you would think that before gearing up to run that extra mile, customers would already have ensured they've identified their critical systems, encrypted sensitive data, applied all relevant patches, be on a supported product release, use tools to ensure configurations are locked down -- in short, the usual security hygiene -- before they attempt to find zero day vulnerabilities in the products they are using. And in fact, there are a lot of data breaches that would be prevented by doing all that stuff, as unsexy as it is, instead of hyperventilating that the Big Bad Advanced Persistent Threat using a zero-day is out to get me! Whether you are running your own IT show or a cloud provider is running it for you, there are a host of good security practices that are well worth doing.

