Friday, August 23, 2013

CCSK Study: Domain 8 - Data Center Operations

Notes
  • "Next Generation Data Center"
    • business intelligence 
    • understanding of all the applications running in a DC
  • Cloud Application Mission
    • The industry or application mission housed within the data center
    • HIPAA, PCI, etc
  • Data Center Dissemination
    • Cloud infrastructures that operate together but reside in physically separate locations
  • different types of applications housed by data centers require automation (to varying degrees)
  • CSA Controls Matrix
    • contains a number of physical requirements based on different standards and regulations
  • Customers should request 3rd party audit of datacenter operations
    • ITIL and ITSM
  • New and Emerging Models
    • New CSP-type services based on models like SETI@home (distributed volunteer computing)
    • cloud is increasingly being viewed as a commodity or as a utility
  • Recommendations
    • organizations building cloud data centers should incorporate management processes, practices, and software to understand and react to technology running inside the data center
    • customers should ensure CSPs have adopted service management processes and practices
    • understand the mission of what is running
      • consider the Cloud Controls Matrix
Summary

I'm not entirely sure how to take this domain.  I think it is geared more to CSPs than customers.  Basically, it is saying that you should run your datacenter with appropriate policies and procedures, ideally following an ITSM framework such as ITIL.  Furthermore, you should use things like automation to ensure you deliver services to your customers in an agile way.  Customers may want to check to see what processes the CSP is following and should request an independent audit.

Wednesday, August 14, 2013

CCSK Study - Domain 7: Traditional Security, Business Continuity, & Disaster Recovery

Notes
  • Traditional Security
    • the measures taken to ensure the safety and material existence of data and personnel against theft, espionage, sabotage, or harm
  • Physical protection is the initial step
    • can render all logical controls ineffective if implemented incorrectly
  • Security programs flow from a well-developed series of risk assessments, vulnerability analyses, and BCP/DR policies, processes, and procedures
    • reviewed on a regular basis
  •  Cloud service providers need to be tested regularly
    • Use industry-standard guidelines such as TOGAF, SABSA, ITIL, COSO, or COBIT
  • Establishing a physical security function
    • Responsibility should be assigned to a manager
      • Should be high-up (have power / bite)
      • personnel should be trained and constantly evaluated
    • As with general security, adopt a layered approach
      • include both active and passive defence
      • 4D's (detect, deter, delay, deny)
    • Several forms of design
      • Environmental design 
      • Mechanical, electronic, procedural controls
      • detection, response, and recovery procedures
      • personnel identification, authentication, access control
      • policies and procedures, training
      • Many of the above are similar to measures you would take in the virtual world... (it is my opinion that too many security systems were designed based on physical parameters and that is why they are somewhat easy to bypass)
  • Evaluating CSP traditional physical security setup
    • There may be limits to what you can do; balance the depth of this evaluation against the risk of the data being stored in the environment
    • Location
      • Do an analysis on the location of the primary/secondary data centers
        • Consider things such as seismic zones and flood plains
        • Also consider human factors (political landscape, crime, etc)
    • Documentation Review
      • Review all the documentation that you would have had to produce yourself if this project were in house
        • Risk analysis, risk assessments, BCP Plans, DR Plans, Physical and environmental policies, user termination policies, contingency plans and tests, .... (lots more)
        • Essentially, because this company will be handling your data/services/applications you want to make sure their policies match or exceed your own
        • E.g.: Do they do background checks on all employees?  Do they have technical documents of their environment? etc. (there is a large list in the CSA document)
        • Things to check
          • Are they up to date?
          • Are the policies distributed to employees and accessible by them?
          • Do they do training on their policies?
    • Compliance with Security Standards
      • ensure compliance with global security standards (ask for confirmation)
      • Verify the compliance certificate
      • Look for verifiable evidence of resource allocation, such as budget/manpower, to the compliance program
      • verify internal audit
    • Visual Walkthrough
      • If you want to, make sure you know what you are doing.  There is a checklist here of things to look at
  • Security Infrastructure
    • Applies more when selecting a physical infrastructure provider
    • Basically, you are looking for best practices in data center setup and security
    • Checklist in this section (7.1.2) should be considered
  • Human Resource Physical Security
    • purpose is to minimize the risk of the personnel closest to the data disrupting operations and compromising the cloud
    • Consider
      • Roles and responsibilities are clearly defined
      • Background verification and screenings are done
      • Employment agreements (NDA's)
      • Employment terminations
      • Training (security, code of conduct, etc)
  • Assessing CSP Security
    • This section contains various checklists on areas to assess when selecting a CSP
    • I'm not going to list them all out, read the doc
    • Procedures
      • Basically, are their procedures documented and made available for inspection on demand
      • Things like NDAs, background checks, policies for information sharing, etc
    • Security Guard Personnel
      • Verify the instructions given to security personnel on what they should be checking, etc
    • Environmental Security
      • What protections are in place against environmental hazards (protection or detection)?
      • Maintenance plans, humidity controls, physically secure locations, impact of nearby (next-door) disasters in plans, asset control policies, methods for destroying data
  • Business Continuity
    • Provisions should be put in place should a major outage occur 
      • Financial compensation should the SLAs not be met
    • Review the existence of 
      • Emergency Response Team (ERT)
      • Crisis Management Team
      • Incident Response Team
    • Restoration Priorities
      • Discuss, incorporate, and quantify the RPO (Recovery Point Objective) and RTO (Recovery Time Objective); a small sketch of checking a plan against these targets follows these notes
      • Understand the information security controls needed
  • Recommendations
    • There is a lot in this section and I will go over some key points.  This is another section you will want to just read
    • Policy Recommendations
      • "Stringent security practices should prove to be cost effective and quantified by reducing risk to personnel, revenue, reputation, and shareholder value"
      • Ensure that various policies meet or exceed the tenant's current implementations
        • ie: background checks, least privilege, NDAs are enforced, etc
    • Transparency Recommendations
      • Perform an on-site visit (preferably unannounced)
      • Acquire documentation prior to visit in order to be able to conduct a mini-audit
    • Human Resources Recommendations
      • Ensure security team has industry certifications
    • Business Continuity Recommendations
      • Review BCP Plans of the CSP
    • Disaster Recovery Recommendations
      • plans should account for supplier (CSP) failure and plan for the ability to switch providers
      • full-site, system, disk, and file recovery should be implemented via a user-driven, self-service portal
      • SLAs should be properly negotiated
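
To make the RPO/RTO discussion a bit more concrete, here is a minimal sketch of checking a proposed backup interval and restore estimate against those targets.  It is only an illustration with made-up numbers (the intervals, estimates, and function names are mine); it is not taken from the CSA guidance.

```python
# Minimal sketch: sanity-check a backup/recovery plan against RPO and RTO targets.
# All numbers below are hypothetical examples, not values from the CSA guidance.

def meets_rpo(backup_interval_hours: float, rpo_hours: float) -> bool:
    # Worst-case data loss is the time since the last backup,
    # so the backup interval must not exceed the RPO.
    return backup_interval_hours <= rpo_hours

def meets_rto(estimated_restore_hours: float, rto_hours: float) -> bool:
    # Time to bring the service back up must not exceed the RTO.
    return estimated_restore_hours <= rto_hours

if __name__ == "__main__":
    rpo, rto = 4.0, 8.0                            # targets agreed with the business (hours)
    backup_interval, restore_estimate = 6.0, 5.0   # what the plan/CSP actually offers

    print("RPO met:", meets_rpo(backup_interval, rpo))   # False: up to 6h of loss > 4h RPO
    print("RTO met:", meets_rto(restore_estimate, rto))  # True: 5h restore <= 8h RTO
```
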
Summary

It is amazing how similar all of these topics are to things you would/should do in your own datacenter or organization.  These are all important points to consider, however, when migrating to the cloud.  As pointed out in the document, one must pay attention to BCP and DR issues.  There have been several notable instances where cloud service providers have "gone down" for hours at a time.  One should either protect against this via a cloud broker type tool that allows for service migration across different providers, or protect against the loss in financial terms via the SLA.

The other main point in this section is around the review of practices and documents provided by the CSP.  One of the key points here is that the CSP should be able to provide most of these documents "on-demand".  It should not come as a surprise to them that you are requesting to see their policies and procedures.  IT can be "expensive" when done properly, but that is only if you ignore the risk to the data and services that IT supports.  As stated in the document, when done properly, security controls and IT in general can actually mitigate risk and save the company money in the event of unforeseen circumstances.

The last point to note here is around the policies and procedures of the CSP.  Ultimately, you need to ensure they are following the same or better standards that you are following.  There has been a lot of discussion lately as to whether the cloud is "secure" or not. Some say that it is more secure than traditional IT because CSPs actually put money into the things mentioned in this document.  I think the argument is ultimately flawed.  If an IT organization was not aware of these best practices, chances are, they are not looking for it in their cloud provider... or not able to make sure that the cloud provider is doing what they say they are doing.  I guess what I am trying to say is that bad IT breeds bad IT and the problem is just worse in the cloud than it is in traditional IT that you can control.  IT organizations with strong and mature policies would probably be able to strategically use cloud resources (if they wanted to) knowing that they have processes and policies that work in-house.  They would take those lessons learned, and look for a partner (notice I didn't say CSP) that shares their same values and can provide them service at a reasonable (NOT "cheap") price.

There are quite a few good lists in this section, probably all good exam questions too.  This is going to be a section that I have to come back and review before the test.

Tuesday, August 13, 2013

CCSK Study - Domain 6: Interoperability and Portability

Notes
  • Scenarios
    • interoperability and portability allows you to scale a service across multiple disparate providers on a global scale
    • could allow the easy movement of data/applications/services from one provider to another
  • Not a unique concept to cloud
  • Interoperability
    • High level: requirement for the components of a cloud eco-system to work together
    • mandates that components be replaceable by new or different components and continue to work
      • Sorta like how management views employees ??? ;)
    • also extends to exchange of data
    • Reasons to change providers (short-list)
      • Unacceptable increase in cost
      • New provider provides more/better features
      • Provider ceases business operations
      • Provider is shutdown due to legal/disaster
      • Unacceptable decrease in service quality
      • Dispute between cloud customer and provider
    • Remember, cloud companies are also in the business of making money!
    • Lack of interoperability will lead to vendor lock-in
  • Portability
    • the ease with which application components can be moved and reused elsewhere regardless of provider, platform, OS, infrastructure, location, storage, data format, or APIs
    • Generally only feasible to port between cloud providers in the same "class" (e.g., IaaS to IaaS)
      • referring to the octant of the cloud cube
  • Failure to plan for I & P Can lead to unforeseen costs
    • Vendor Lock-In
    • incompatibilities across different cloud infrastructure causing disruption of service
    • unexpected application re-engineering
    • costly data migrations or data conversion
    • retraining or retooling for new applications or management software
    • loss of data or application security
  • Moving services to the cloud is a form of outsourcing; the golden rule of outsourcing is "understand up-front and plan for how to exit the contract"[sic]
  • Interoperability Recommendations
    • hardware
      • do not access hardware directly if you don't have to
      • virtualize when you can
    • physical network devices
      •  try to ensure APIs have the same functionality
      •  try to use network and security abstractions
    • virtualization
      • use the Open Virtualization Format (OVF) when possible
      • Understand and document vendor customized virtualization hooks or extensions in use
    • Frameworks
      • investigate CSP APIs and plan for changes
      • use open and published APIs
    • Storage
      • use portable formats for unstructured data
      • understand database system used for structured data and conversion requirements
      • assess the need for encryption of data in transit
    • Security
      • Use SAML or WS-Security for auth controls (more portable)
      • Encrypt data, understand how keys are used/stored
      • Ensure that log data is portable and secured
      • Ensure data can be securely deleted from the original system
  • Portability Recommendations
    • Understand SLA differences
    • Understand different architectures
      • understand portability issues which may include API, hypervisors, application logics, and other restrictions
    • Understand encryption, keys, etc
    • Remember to check for metadata
  • Recommendations for different Cloud models
    • IaaS
      • use OVF
      • document/eliminate provider-specific extensions
      • understand the de-provisioning of VM process (secure?)
      • understand the decommissioning of storage
      • understand costs involved for moving data
      • understand the process/governance of encryption keys
    • PaaS
      • use platforms with standard syntax and APIs that rely on open standards such as OCCI
      • understand the tools available for secure data transfer/backup
      • understand how base services such as monitoring and logging transfer to a new provider
      • understand functionality of old provider vs new (control)
      • understand impact of performance and availability 
    • SaaS
      • Perform regular data extractions and backups
      • understand what metadata can be exported
      • understand custom tools that may need to be redeveloped
      • ensure backups of logs/access records are preserved for compliance reasons
    • Private Cloud
      • ensure interoperability between hypervisors
      • use standard APIs
    • Public Cloud
      • ensure cloud providers use open/common interfaces
    • Hybrid cloud
      • ensure the ability to federate with different cloud providers to enable higher levels of scalability
Summary

I personally found this chapter hard to get through.  Portability and interoperability are fundamental tenets of any solution, in my mind.  Being a developer, you use concepts such as abstraction to make your code more modular.  Modularity allows code to be ported to different environments and allows extensions to be built to handle specific scenarios.  I think that this chapter basically echoes those fundamental tenets (over and over again).  Although it is probably better as a checklist, there are some good points made.  Bottom line: use open standards.  Use open technologies.  Use open APIs.  Use industry standards.  Plan for any deviations from the above.
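
Since the summary above leans on abstraction as the key to portability, here is a minimal sketch of what that looks like in practice: application code depends on a small storage interface, and each provider gets its own backend behind it.  The class and method names are hypothetical and not taken from any real CSP SDK.

```python
# Minimal sketch: hide provider-specific storage behind one interface so the
# application does not depend on any single CSP. Class names are hypothetical.
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    """The only storage interface the application code is allowed to use."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStore(ObjectStore):
    """Stand-in backend; a real one would call a provider's API behind the same methods."""

    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]


def archive_report(store: ObjectStore, name: str, content: bytes) -> None:
    # Application logic sees only the abstraction, so switching providers means
    # writing another ObjectStore implementation, not rewriting this function.
    store.put(f"reports/{name}", content)


if __name__ == "__main__":
    store = InMemoryStore()
    archive_report(store, "q3.txt", b"quarterly numbers")
    print(store.get("reports/q3.txt"))
```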

Friday, August 9, 2013

CCSK Study - Domain 5: Information Management and Data Security

Notes
  • This domain talks about the security of data in a global sense, with some emphasis on how data is secured as it moves into the cloud
  • Data security begins with managing internal data
  • Different cloud architectures offer different storage options
    • IaaS
      • Raw Storage: basically a physical drive
      • Volume Storage: virtual hard drive
      • Object Storage:  API access that stores data as "objects".
        • Sometimes called file storage
      • Content Delivery Network: Object storage which is then distributed to multiple geographically distributed nodes
    • PaaS
      • Database-as-a-Service
      • Big-Data-as-a-Service
        • Object storage with requirements such as broad distribution, heterogeneity, and currency/timeliness
      • Application Storage
        • Any storage that is consumable via API but does not conform to the above two
      • Consumes:
        • Databases
          • Information may be stored directly in databases that run on IaaS
        • Object/File Storage
          • IaaS object storage but only accessible via PaaS APIs
        • Volume Storage
          • May use IaaS Volume Storage
    • SaaS
      • As with PaaS, wide range of storage options/consumption models
      • Information Storage and Management
        • data is simply entered into the service
        • stored in a database (typically)
        • could provide some access to PaaS APIs for mass upload type functionality
      • Content/File Storage
        • File stores are made available via web-based user interface
      • Consumption
        • Database
        • Object/File Store
        • Volume Storage
        • Key is that the services that are consumed are only accessible via the SaaS service
  • Data Dispersion
    • Technique that can be used to secure data
    • Data is divided into chunks and those chunks are then signed
    • Chunks are distributed across multiple servers
    • In order to recreate the data, an attacker must be able to target all servers that contain the chunks of data (a rough sketch follows these notes)
    • Or attack the API that puts it all together?
  • Information Management
    • includes the processes and policies for both understanding how your information is used, and governing that usage
  • Data Security Lifecycle
    • Basically, we need to understand the "states" data can be in, the location where the data lives, and the functions/actors/controls in place to control data
    • 6 phases
      • Not linear; data can pass through some stages multiple times, or skip some stages entirely
      • Create: generation of new or modification of existing content
      • Store: Committing data to some sort of storage
      • Use: Data is viewed, processed, etc
      • Share: Information is made accessible to others
      • Archive: data enters long-term storage
      • Destroy: data is permanently destroyed
    • Location and Access
      • Data can be accessed on a variety of end-user devices that all offer different security mechanisms
      • Data can live in traditional infrastructure
      • Data can live in cloud and hosting services
      • Key Questions
        • Who is accessing the data?
        • How can they access it?
    • Functions, Actors, and Controls
      • We need to identify what actions we can conduct on a given datum
        • Access
        • Process
        • Store
      • An Actor performs each function in a location
        • person, application, system, process
      • Controls
        • put in place to restrict the list of possible actions to the list of allowed actions
  • Information Governance
    • like information management, only different
    • Includes the definition and application of
      • Information Classification
        • Does not need to be super granular to work (ie: differentiate regulated content from non-regulated content)
      • Information Management Policies
        • Defines what types of actions are allowed on a given datum
      • Location and Jurisdictional Policies
        • defines where data may be located
      • Authorizations
        • Defines who is authorized to access which types of information
      • Ownership
      • Custodianship
  • Data Security
    • This section lists out some controls to protect data
    • Detecting and Preventing Migrations into the cloud
      • Monitoring Access to internal repositories
        • DAM: Database Activity Monitoring
        • FAM: File Activity Monitoring
      • Monitoring/Prevention of Data moving into the cloud
        • URL Filtering
          • Prevent access to mass upload APIs, etc
        • Data Loss Prevention
      • Placement of network based tools must be understood and planned accordingly
    • Protecting data moving to the cloud or within it
      • Client/Application Encryption
        • Data is encrypted before it is sent to the cloud
      • Link/network encryption
        • Data is encrypted in transit (SSL)
      • Proxy-Based Encryption
        • Legacy apps
        • Not recommended
        • Data is sent to a proxy-based encryption device before being sent to the cloud
    • Protecting data in the cloud
      • Step 1: Detection
        • Content Discovery
          • Need to understand the content being stored in the cloud
      • Step 2: Encryption
        • The different cloud architectures offer different encryption options.
        • Generally: Volume encryption, object encryption
        • Key management is the important issue here
          • Provider-managed keys
          • Client-managed keys
          • Proxy-Managed keys
        • Should use per-customer keys if you have to use provider managed keys
          • SaaS and PaaS may not offer protections such as passphrases on the keys
    • Data Loss Prevention
      • Many different deployment options (endpoint, hypervisor, network, etc)
      • Definition:  Products that, based on central policies, identify, monitor, and protect data at rest, in motion, and in use through deep content analysis
    • Database and File Activity Monitoring
      • duh!
    • Application Security
      • Remember, most data breaches are due to poor application security
    • Privacy Preserving storage
      • Similar concept to a VPN is VPS, or virtual private storage
      • Doesn't matter if someone intercepts the data, they cannot use it / understand it
      • Certs are good, but are bound to the identity of the user
        • May violate some regulations if the authentication requestor knows the identity of the person accessing the information
        • ABCs or attribute-based credentials
          • Sorta like claims-based authentication: you do not need to know the user anymore, just the "rights" they have been granted
    • Digital Rights Management
      • encrypts content and then applies a series of rights
        • For example, can play, but cannot share/copy
      • Consumer DRM
        • music industry! (ugh)
        • emphasis on one way distribution
      • Enterprise DRM
        • emphasis on more complex rights, policies, and integration
    • Recommendations
      • Understand the cloud storage architecture in use
      • choose data dispersion when available
      • use the Data Security Lifecycle as a guide for building controls
      • monitor internal data repositories with DAM/FAM
      • Use DLP and URL filtering to track employee activity
      • Use content discovery
      • Encrypt data ruthlessly (my words)
        • Transit, storage layer, and if possible against viewing of the CSP
      • Remember that most data breaches are because of weak application security
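
As a side note on the data dispersion idea from these notes (chunk the data, fingerprint each chunk, spread the chunks across servers), here is a rough sketch of the concept.  The function names are mine, a plain SHA-256 hash stands in for real signing, and the chunk size and "servers" are made up; a real implementation would use proper signatures and erasure coding.

```python
# Rough sketch of data dispersion: split data into chunks, fingerprint each chunk,
# and spread the chunks across several "servers". SHA-256 is only a stand-in for
# real digital signatures; chunk size and server count are illustrative.
import hashlib

CHUNK_SIZE = 16  # bytes, purely illustrative

def disperse(data: bytes, num_servers: int) -> list[list[tuple[int, bytes, str]]]:
    servers: list[list[tuple[int, bytes, str]]] = [[] for _ in range(num_servers)]
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    for index, chunk in enumerate(chunks):
        digest = hashlib.sha256(chunk).hexdigest()        # integrity fingerprint
        servers[index % num_servers].append((index, chunk, digest))
    return servers

def reassemble(servers: list[list[tuple[int, bytes, str]]]) -> bytes:
    # The client (or an attacker) needs chunks from every server to rebuild the data.
    pieces = sorted(
        (index, chunk)
        for server in servers
        for index, chunk, digest in server
        if hashlib.sha256(chunk).hexdigest() == digest    # verify before trusting
    )
    return b"".join(chunk for _, chunk in pieces)

if __name__ == "__main__":
    original = b"example record that should not live on a single server"
    spread = disperse(original, num_servers=3)
    assert reassemble(spread) == original
```
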
Summary
This domain was a little bit more involved than the last one, but, once again, I think it focuses more on common sense than anything else.  I think the key point here is that data is hard to manage internally.  And that is okay; most corporations do not have a good way to manage that data internally, but at least the data is internal and only accessible by employees who are under contract.  Once you move to CSPs (or enable the internet...) you need to start having the right tools in place to monitor activity and usage of your data.  These include concepts such as DAM/FAM/URL filtering/DLP.  I personally think that the best solutions these days are those that allow data to enforce its own "security".  I.e.: the data is encrypted and a client needs to be installed to decrypt it.  The client can then enforce policy, and nobody can access the data unless the client is installed, etc.  As stated in the document, this leads to expensive infrastructure.  There are also ways around this (copy and paste, for example).  To make the problem easier to tackle, create broad generalizations for the data (regulated vs. not regulated) and go from there.  Understand also the concepts of key management.  Ultimately, when you use PaaS or SaaS, the service on the other end will need to "understand" the data in order to be able to provide you a service.  Those risks need to be weighed during the initial cloud discussions.
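
On the key-management point above, here is a minimal sketch of client/application encryption with per-customer keys, using the third-party cryptography package (Fernet).  The "cloud store" is simulated with a dictionary and the function names are hypothetical; key storage, rotation, and the actual CSP API are out of scope.

```python
# Minimal sketch of client/application encryption: data is encrypted with a
# per-customer key *before* it leaves for the cloud, so the provider only ever
# stores ciphertext. Requires the third-party "cryptography" package.
from cryptography.fernet import Fernet

# One key per customer, kept by the client (or a key-management service),
# never handed to the storage provider.
customer_keys = {"customer-a": Fernet.generate_key(),
                 "customer-b": Fernet.generate_key()}

cloud_store: dict[str, bytes] = {}  # stand-in for the CSP's object storage


def upload(customer: str, name: str, plaintext: bytes) -> None:
    token = Fernet(customer_keys[customer]).encrypt(plaintext)
    cloud_store[f"{customer}/{name}"] = token  # only ciphertext reaches the provider


def download(customer: str, name: str) -> bytes:
    token = cloud_store[f"{customer}/{name}"]
    return Fernet(customer_keys[customer]).decrypt(token)


if __name__ == "__main__":
    upload("customer-a", "notes.txt", b"sensitive notes")
    print(download("customer-a", "notes.txt"))   # b'sensitive notes'
    # Losing or compromising one key affects only that customer's data.
```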

Thursday, August 8, 2013

CCSK Study - Domain 4: Compliance and Audit Management

Notes
  • Corporate Governance: The balance of control between stakeholders, directors, and managers to provide consistent management, cohesive application of policies, and enable effective decision making.
  • Enterprise Risk Management: Methods and processes (frameworks) used by organizations to balance decision making based on risks and opportunities
  • Compliance and Audit Assurance: Awareness and adherence to corporate obligations
  • Audit
    • key component to any proper organizational governance strategy
    • should be conducted independently
    • should be robustly designed
    • should take into consideration the cloud
      • scale and services provided
  • Recommendations
    • Understand that audit processes change when moving to the cloud
    • Understand the contractual responsibilities of each party
    • Determine how existing compliance requirements will be impacted by the use of cloud services
      • Who does what?
    • Be careful with PII data
    • Customers and CSPs must agree on how to collect, store, and share compliance evidence
      • Select auditors that are "cloud aware"
      • request an SSAE 16 SOC 2 or ISAE 3402 Type 2 report
      • Understand how audits will be conducted
  • Requirements
    • Ensure a  "right to audit" clause
      • Audit framework may be adapted to use 3rd party frameworks such as ISO, IEC, etc
    • Ensure a "right to transparency" clause
      • should include provisions for automated information such as logs, reports and pushed information such as diagrams, architectures and schematics
    • mutually selected 3rd party auditors
    • some agreement on common certification assurance framework (ISO,COBIT,etc)
Summary
I'm glad this was a short section!  I think the definitions used in this section are fairly common and apply to any organization, not just one using cloud.  The points made in this section seem fairly straightforward.  Basically, make sure the audit process takes into account the cloud.  Make sure that you have provisions in your contract that allow you to be compliant and force the CSP to do its share.  All these things should be discussed up front with the CSP, and the risks/benefits should be weighed if the contract is just a "click-wrap" or non-negotiable agreement.

CCSK Study - Domain 3: Legal Issues: Contracts and Electronic Discovery

Notes
  • Legal Issues
    • Many different regions and countries have numerous laws in place to protect the privacy of personal data and the security of information and computer systems.
    • Most specify terms such as "Adopt reasonable technical, physical, and administrative measures in order to protect personal data from loss, misuse, or alteration"
    • Examples
      • OECD: Organization for economic cooperation and development
      • APEC: Asia Pacific Economic Cooperation's Privacy Framework
      • European Union Data Protection Directive
    • Organizations should be aware of the laws they are subject to
      • Even contractors of corporations may be subject to certain laws
      •  HIPAA, GLBA, PCI DSS, ISO 27001, COPPA
    • May not be in the form of laws, but rather contractual obligations
    • Some laws may prohibit the export of data/information outside of the country
      • Obviously comes into play with cloud providers
    • Key point:  under many of these laws, the responsibility for protecting and securing the data typically remains with the collector or custodian of the data.  Before entering into a cloud computing arrangement, a company should evaluate its own processes.  A company should, and in some cases is legally bound to, conduct due diligence (DD) of the proposed cloud service provider.
    • Companies should keep in mind that CSPs are constantly updating, and they should continually monitor, test, and update their processes to reflect any changes in the CSP
      • Example: CYBEX
    • E-Discovery Issues
      • I think that although these issues were brought up during a conversation about e-discovery, they are relevant to all types of data being stored in the cloud
      • ESI: Electronically stored information
      • Possession, Custody, and Control
        • Clients are expected to turn over all data in their control (that pertains)
        • Clients do not have access to the CSP's DR locations, or to certain metadata that the CSP has created about a document
        • Clients should have an understanding of what data is and is not available
      • Relevant Cloud Applications and Environment
        • The cloud app may come into scope and may require a separate subpoena
      • Searchability and E-discovery Tools
        • Certain tools will not work with the cloud, or may be expensive to run
        • Client may not have rights to search all data in the cloud
      • Preservation
        • Clients need to preserve the data (using all reasonable steps)
        • What about SLAs?  What happens if the SLA expires before the preservation term?
        • Monitoring of cloud provider?
        • What about the costs of storage for preservation?
        • Can the client effectively download the data in a forensically sound manner so it can be preserved off-line / near-line?
        • How is data tagged or scoped for preservation in the cloud?  Does the cloud provider offer that granularity?
      • Collection
        • Due to CSP, collection of data may be more difficult
        • Data may only be available in batches at a time
        • Access and bandwidth restrictions?
        • SLA may restrict the speed at which data is accessed or the manner in which it is accessed
        • Cannot do bit-by-bit forensics, if required
        • The client is expected to take reasonable steps to validate that its collection from its CSP is complete and accurate
      • CSP may deny direct access to its hardware
      • CSP may be able to produce "native production" of the data but it may not be in a usable format
      • Documents should not be considered more or less admissible or credible from the cloud (provided no evidence to contradict)
      • Clients should contract in provisions that they be notified and given sufficient time to fight a subpoena or search warrant
Summary
This section brings up some good points about storing data with a CSP.  Although the focus here was more on the legal end, it is important to understand that these issues around trusting what data is stored, how it is stored, and how it is accessible are applicable to all types of data.  The courts obviously require some degree of validation before data from the cloud can be admissible in court.  Further to that, with respect to e-discovery, the courts need some degree of assurance that all the data that should have been submitted in fact was.  A subset of these issues may be important to other types of data based on contractual obligations or corporate policies.

Monday, August 5, 2013

CCSK Study - Domain 2: Governance & Enterprise Risk Management

As stated in the title, Domain 2 focuses on the issues of Governance and Enterprise Risk Management as it relates to the cloud.

Notes

  • Corporate Governance
    • is the set of processes, technologies, customs, policies, laws and institutions affecting the way an enterprise is directed, administered, or controlled
    • 5 basic principles
      • Auditing Supply Chains
      • Board and Management Structure and Process
      • Corporate responsibility and compliance
      • Financial transparency and information disclosure
      • Ownership structure and exercise of control rights
  • Enterprise Risk Management
    • is the process of measuring, managing and mitigating uncertainty or risk
    • Multiple methods to deal with risk
      • Avoidance
      • Reduction
      • Share / Insure
      • Accept
    • General goal: maximize value in line with risk appetite and strategy
    • Many benefits to cloud computing, however
      • Customers should view cloud service providers as supply chain security issues
      • Must evaluate providers' incident management, disaster recovery policies, business continuity policies...
    • Companies should adopt an established risk framework
      • should use metrics to measure risk management
        • SCAP, CYBEX, GRC-XML
      • adopt risk centric viewpoint
      • framework should account for legal perspective across different jurisdictions
  • Recommendations
    •  Reinvest the cost savings from moving to the cloud into security
      • Detailed assessments
      • Application of Security Controls
      • Risk assessments, verifying provider capabilities, etc
    • Review security controls and vendor capabilities as part of DD
      • review for sufficiency, maturity, and consistency with the user's information security management processes
    • Ensure governance processes and structures are agreed upon by both the tenant and provider
    • Security departments should be engaged as part of the SLAs
      • Ensure that security requirements are contractually enforceable
    • Define appropriate cloud security metrics
      • Really? Do these exist?
    • Consider the effect of cloud limitations on audit policies and assessments
      • may have to change the way audit is conducted
      • remember to contract requirements in
    • Risk management should include identification and valuation of assets, identification and analysis of threats and vulnerabilities and their potential impact on assets, likelihoods of events/scenarios, and management-approved risk levels (a small worked sketch follows these notes)
    • Take into account vendor risk
      • business sustainability, portability of data/applications,
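
As referenced in the notes above, here is a small worked sketch of quantifying risk with the classic annualized loss expectancy formula (ALE = single loss expectancy x annual rate of occurrence) and comparing it against a management-approved risk level.  The scenarios, figures, and threshold are entirely made up, and the CSA guidance does not prescribe this particular formula.

```python
# Small sketch of quantitative risk scoring: annualized loss expectancy (ALE)
# = single loss expectancy (SLE) x annual rate of occurrence (ARO).
# All scenarios and figures below are invented for illustration.

risks = [
    # (scenario, asset value $, exposure factor 0-1, expected occurrences/year)
    ("CSP outage breaches SLA",  500_000,   0.10, 2.0),
    ("Customer data disclosed",  2_000_000, 0.50, 0.1),
]

risk_appetite = 150_000  # management-approved annual loss ceiling per scenario ($)

for scenario, asset_value, exposure, aro in risks:
    sle = asset_value * exposure       # loss from a single event
    ale = sle * aro                    # expected annual loss
    decision = "accept" if ale <= risk_appetite else "mitigate / transfer / avoid"
    print(f"{scenario}: ALE ${ale:,.0f} -> {decision}")
```
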
Summary
This section essentially defines enterprise risk management and corporate governance.  In theory, all organizations should already be doing this at some level.  I think the important points here are to make sure the enterprise is aware that moving to the cloud means a loss of control over every aspect of the technical solution.  This means, in some cases, changing the way audits or testing is done to accommodate the vendor's preferences or limitations.  Further to this, you need to pay your lawyers and ensure that all requirements you have are stipulated in some form or another in the contract.  CSPs are basically an extension of the enterprise, much in the same way outsourcing is, but the CSP basically has full control over the data you place in its possession.  I like the point about re-investing "savings" into increased security.  Ultimately, as you lose full control over an asset, you must increase your vigilance (detection tools) to ensure that your wishes as stipulated in the contract are being followed.  You can try to hide behind a contract, saying that it was the CSP's responsibility to do something; however, in court you would have to prove that the CSP was negligent.  This may be harder than anticipated.