Five things you can learn from the NASA cloud computing audit report

NASA recently released a cloud computing audit report with best practices enterprises can use to assess and improve their cloud governance practices.

In July, NASA's Office of the Inspector General (OIG) released an audit report on the progress of cloud implementation efforts within the NASA cloud computing environment. It specifically reveals that NASA has "found that weaknesses in [its] IT governance and risk management practices have impeded the Agency from fully realizing the benefits of cloud computing and potentially put NASA systems and data stored in the cloud at risk."

Statements like this grab one's attention because of NASA's prominent role as a cloud computing pioneer: NASA's Nebula platform is well-known, as are its contributions to the cloud community through support of technologies like OpenStack and its partnership with Rackspace. So when a report comes out citing security issues in NASA's cloud implementation, it's a big deal.

Much of the coverage focusing on this concern overlooks a significant point: This is an audit report. Put aside the content of the report for a moment and reflect on the rigor associated with the audit process required to produce a report like this. For example, it clearly articulates NASA's cloud computing strategy, demonstrating that the auditors know and understand cloud computing at a strategic level for the agency. Also, the report specifically quantifies the degree to which "shadow IT" (cloud deployments not controlled in-house) is prevalent and further analyzes the supply chain to unpack the details of individual cloud contracts down to the level of specific clauses. Because the ways in which most organizations audit their vendors and assess their own use of cloud technologies can vary widely, these points are often not covered during a traditional IT audit.

That being said, it is critical to note the differences between public sector and private industry IT audits, most notably that each requires a different level of external transparency. Despite these caveats, for organizations that are struggling to accurately assess a large, complex and heterogeneous cloud environment, there are multiple lessons to be learned from NASA's audit approach -- specifically, lessons about assessing and reporting cloud usage. The top five are listed below.

Lesson 1: Understand the strategic context

The first striking aspect of this report is the detailed understanding of the strategic role of the cloud and its benefits for the agency. For example, organizations that view cloud as a key initiative might find themselves less risk-tolerant in the short term because of the effects a security event could have on future adoption. Consequently, transparency with the audit team about the organization's "vision" for cloud computing is highly beneficial.

Lesson 2: Understand the history

The report is cognizant of the history of technology use within the agency, demonstrating a detailed understanding of when, what, and why specific changes to platforms, service delivery models and vendor sets were made. Understanding this history helps guide audit teams to areas where heightened scrutiny may be required -- for example, legacy areas or areas where support has been reduced because of new relationships. Providing this context to your auditors can streamline the audit process, make efficient use of resources and produce more comprehensive and accurate output.

Lesson 3: Analyze the supply chain

Cloud service use (at least for services delivered by external providers) is ultimately an exercise in supply chain management. As a consequence, it is imperative that assessors evaluate what service providers are contractually obligated to provide in terms of security and operational controls versus what they actually deliver. In practice, these don't always completely overlap, and because of the interplay between the two, it's beneficial to evaluate them together. Ideally, elements critical to your organization will be both implemented in practice and outlined in the contract; understanding areas where one falls short of expectations is key to determining overall risk.
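To make the idea concrete, here is a minimal sketch in Python of the kind of gap analysis this lesson describes. The control names and data are hypothetical (they are not drawn from the NASA report); the point is simply to compare what a contract obligates a provider to do against what an assessment finds is actually delivered.

```python
# Hypothetical gap analysis: contractual obligations vs. observed delivery.
# Control names and data are illustrative only.

contracted = {"encryption at rest", "incident notification within 24h",
              "annual SOC 2 audit", "data location restricted to US"}
delivered = {"encryption at rest", "annual SOC 2 audit",
             "multi-factor admin access"}

# Controls promised in the contract but not observed in practice --
# the primary driver of risk in this lesson.
contracted_not_delivered = contracted - delivered

# Controls delivered but never contractually required -- useful, but
# the provider could withdraw them without breaching the agreement.
delivered_not_contracted = delivered - contracted

print("Gap (contracted, not delivered):", sorted(contracted_not_delivered))
print("Uncommitted (delivered, not contracted):", sorted(delivered_not_contracted))
```

Even a simple comparison like this makes the two questions -- what is promised versus what is practiced -- visible side by side, which is where the risk conversation starts.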

Lesson 4: Know the effects and value of services provided

The report separates its analyses based on the effects services have on the agency's mission. This knowledge is critical: it not only informs the severity of issues discovered, but also contextualizes the services provided and their uses. In practice, organizations often perform assessments without knowing how services will be used or how critical that use might be. A complete understanding of usage helps audit teams prioritize findings, understand risks and detail the potential consequences of deficiencies.
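As a rough illustration of how mission impact can drive prioritization, the Python sketch below scores hypothetical findings by combining issue severity with the criticality of the affected service. The 1-5 scales, the weighting, and the sample findings are assumptions made purely for illustration.

```python
# Hypothetical prioritization of findings by service criticality.
# Severity and criticality scales (1-5) and the sample data are illustrative.

findings = [
    {"issue": "missing contract clause on data ownership", "severity": 3, "service_criticality": 5},
    {"issue": "unreviewed shadow-IT deployment", "severity": 4, "service_criticality": 2},
    {"issue": "stale access review on archive storage", "severity": 2, "service_criticality": 1},
]

# A simple risk score: severity weighted by how critical the affected
# service is to the mission. Higher scores surface first in the report.
for f in findings:
    f["risk_score"] = f["severity"] * f["service_criticality"]

for f in sorted(findings, key=lambda f: f["risk_score"], reverse=True):
    print(f"{f['risk_score']:>2}  {f['issue']}")
```

The specific formula matters less than the principle: without knowing how critical a service is to the mission, an assessor cannot rank what a deficiency actually costs.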

Lesson 5: Encourage directness

The value of an assessment report is directly proportional to its clarity and accuracy. There is often pressure for assessment efforts to soften language or hedge statements with qualifiers. When done for the sake of accuracy and objectivity, this is valuable; when done for reasons of internal politics or gamesmanship, it seldom is. Ultimately, organizations benefit when assessments are direct and concise; brevity increases the likelihood that stakeholders will read and act on the content, while candidness removes interpretive latitude on the part of the reader.

These five items illustrate a few lessons we in the private sector can take away to advance how we assess cloud deployments. Every cloud environment, even in an organization as technically sophisticated and "pioneering" (no pun intended) in cloud computing as NASA, will encounter security, risk and compliance issues. The hallmark of a mature organization, though, is the ability to locate potential problems proactively, explain them directly and unambiguously, and then formulate mitigation steps to correct them.

About the author:
Ed Moyle is currently director of emerging business and technology for ISACA. He previously worked as senior security strategist for Savvis Inc. and as senior manager with CTG. Prior to that, he served as vice president and information security officer at Merrill Lynch Investment Managers.

This was first published in September 2013
