Standards for Accountability in the Cloud

Part of the Lecture Notes in Computer Science book series (LNCS, volume 8937)

Abstract

This paper examines the role of standards in the cloud with a particular focus on accountability, in the context of the A4Cloud Project (Accountability for the Cloud). To this end, we first provide a general overview of standards, what they are and how we can categorize them, as illustrated by a few cloud-specific examples. Next, we examine the intersection between standards and accountability, by highlighting how standards influence the A4Cloud Project and reciprocally how the A4Cloud Project aims to influence accountability related standards. We argue that specification standards can foster interoperability for the purpose of accountability, thereby making accountability more automated and pervasive. Finally, we take a closer look at a particular accountability requirement: the continuous monitoring of the compliance of cloud services. This is an area of great interest for standardization, which faces many research challenges.

Keywords

Cloud · Standards · Accountability · Interoperability · Security · Monitoring

1 Introduction

The IEEE standards glossary1 describes a standard as “a document that defines the characteristics of a product, process or service, such as dimensions, safety aspects, and performance requirements”. This work provides an overview of standards with a particular focus on their role for accountability in the cloud, based largely on the latest research we conducted in the context of the A4Cloud European Research Project (http://www.a4cloud.eu/).

1.1 The Importance of Standards

Standards are important in many different ways in IT. First and foremost, as we will see, standards create common vocabularies and interoperability, thereby enabling different organizations to follow common and comparable processes and to exchange and interpret data in a unified way. For service providers, interoperability enables automation and cost reduction. For customers, it enables choice between “compatible” offerings, avoiding vendor lock-in. In some domains, it also increases quality assurance through the adoption of standardized evaluation criteria for products or services.

Standards are also notorious battlegrounds for competing economic interests. In some sectors, companies lobby strongly for the adoption of standards that match their own products or services as closely as possible: this provides them with a strong advantage over their competitors, who must then adapt to the newly adopted norm. The classic example of a standards war is the videotape format2 war between VHS and Betamax in the 1980s.

Finally, while policymakers often influence the definition of standards in order to promote open markets or customer safety, for example, there is also evidence of the opposite: existing standards influence policymakers, as illustrated by current EU cloud standardization initiatives.3 In this respect, fundamental standards (see Sect. 1.2) play a key role by establishing the scope and terminology in a field, in a way that is later difficult to change completely.

1.2 The Main Categories of Standards

While there are many ways to classify standards, we propose to adopt the 4 general categories of standards defined by CEN-CENELEC4 to structure the discussion of this work. If we restrict ourselves to the IT domain, these 4 categories can be expressed as follows:
  1. Fundamental standards - which concern terminology, conventions, signs and symbols, etc.;
  2. Organization standards - which describe the functions and relationships of a company, as well as elements such as quality management and assurance, maintenance, value analysis, project or system management, etc.;
  3. Specification standards - which define characteristics of a product or a service, such as interfaces (APIs), data formats, communication protocols and other interoperability features, etc.;
  4. Test methods and analysis standards - which measure characteristics of a system, describing processes and reference data for analysis.

Examples of standards in the first 3 categories are provided in Sect. 2 where we illustrate some key cloud standards. The last category can be exemplified through software testing standards such as [1]. Since it is less relevant to our work on accountability, we do not discuss this 4th category further.

1.3 Organization of This Work

The rest of this paper is organized as follows.

In Sect. 2, we offer a few illustrative examples of standards that play a key role in the cloud, some of which are likely familiar to the reader.

In Sect. 3, we examine the intersection between accountability and standards. First, we look at the role of standards from a research project perspective. Next, we focus on the use of standards to foster interoperability as a means to increase accountability.

The last section of this paper is devoted to the continuous monitoring of the security of cloud services. We argue that this is still largely an area of ongoing research where standardization has a key role to play.

2 Examples of Key Standards in the Cloud

In this section, we provide examples of fundamental standards, organization standards, and (de facto) specification standards, which play an important role in the cloud.

2.1 The NIST Cloud Computing Reference Architecture

The NIST Special Publication 500-292, better known as the NIST Cloud Computing Reference Architecture [5], is a good example of a fundamental standard. The initial goal of this standard was to create “a level playing field for industry to discuss and compare their cloud offerings with the US Government”. This standard essentially proposes:
  1. A taxonomy of cloud actors;
  2. Key architectural components of cloud service deployment and orchestration.

The 5 main cloud actors proposed are Cloud Consumers, Cloud Providers, Cloud Brokers, Cloud Auditors and Cloud Carriers. We will not detail the key architectural components proposed in the standard, but we note that they include the now ubiquitous terminology for deployment models (public, private, community and hybrid clouds), as well as concepts related to management, responsibility, security, privacy, business support and service orchestration, including the IaaS/PaaS/SaaS service models. As such, it is easy to understand that this standard has been very influential in shaping the discussion about cloud ecosystems as well as subsequent cloud standards,5 well beyond the US government domain.

From the point of view of accountability, we note that the roles described in this standard do not provide a fully satisfactory tool to describe accountability scenarios in the cloud. The NIST model defines a Cloud Provider as an entity “responsible for making a service available to interested parties” and a Cloud Consumer as an entity “that maintains a business relationship with, and uses service from, Cloud Providers”. It is tempting to consider the person (data subject) whose data is processed in the cloud as a cloud consumer, but this approach excludes many scenarios. Consider for example the case of a hospital that stores some health data with a cloud provider: the hospital is a cloud consumer, but the patient cannot be attributed any role in the NIST model despite being the entity to which all others in the supply chain are accountable.

In the A4Cloud project, to deal with this limitation, rather than creating a new role taxonomy specifically for accountability, we chose to reuse and extend the NIST taxonomy, adding relevant accountability actors (as described in Sect. 3.2). This was done for pragmatic reasons in order to facilitate the dissemination of our work outside the project, illustrating again the importance of fundamental standards in setting the terminology in a field.

2.2 ISO/IEC 27001 and CSA CCM

Information security tends to be based on a set of ad hoc security controls and “best practices checklists6” (e.g. “update your antivirus frequently”). While such approaches may work in small organizations, they do not scale to larger ones, because they often miss crucial issues such as governance and compliance.

ISO/IEC 27001 [3] is an organization standard for information security management systems (ISMS). This standard aims to bring a formalized and managed approach to information security. As such, it directs organizations to: (i) establish an information security governance process, (ii) conduct a risk analysis, (iii) apply adequate risk treatment, notably by selecting relevant controls, and (iv) monitor and update the information security process over time.

IT services, including cloud services, can be certified against ISO/IEC 27001 after being audited by an independent third party. By obtaining such a certificate, the service provider demonstrates the adoption of a managed, holistic approach to information security. Many prominent cloud services have achieved this certification in recent years (e.g. Amazon AWS, Microsoft Azure, Google Apps).

One of the key elements of the establishment of an ISMS is the selection of adequate controls, i.e. measures that mitigate identified risks and enforce the proper management of the information system. ISO/IEC 27001 (and its companion standard ISO/IEC 27002 [2]) proposes a non-limitative catalogue of control objectives for companies to choose from. Unfortunately, while this catalogue of control objectives is well suited to traditional IT systems, it does not fully address cloud specifics (e.g. multi-tenancy and virtualization).

In 2010, the Cloud Security Alliance (CSA, employer of these authors) started developing a control framework called the Cloud Controls Matrix7 (CCM) that specifically addresses cloud security issues. At the end of 2013, CSA introduced the STAR Certification8: the first cloud-specific security certification scheme, based on ISO/IEC 27001 but extending it with several features, including the use of the CCM as the main control framework, as opposed to the traditional approach.

The example of CSA’s pragmatic re-use of ISO 27001 is yet another illustration of the influence and importance of established standards, as already highlighted with the NIST reference architecture in the previous section.

2.3 Amazon EC2 and S3 APIs

The Amazon EC2 and S3 APIs are application programming interfaces for virtualized computing and storage facilities, respectively. Several competitors of Amazon and open-source cloud IaaS projects,11 including OpenStack, Eucalyptus, OpenNebula and CloudStack, have adopted these APIs for their IaaS offerings. While they are not standards by design, these APIs have become an interesting example of de facto specification standards.

2.4 Gathering Cloud Standards

There are many more cloud standards beyond the previous few examples.

Some of these standards are not cloud specific but provide one of many building blocks for cloud services, such as JSON, XML, XACML [8], OpenID,12 AMQP [12], SSH [13] or Remote Frame Buffer,13 to name just a few examples.

On the other hand, some standards are being developed for cloud specific needs, such as:
  • OVF [14]: a standard for packaging virtual “appliances” in portable containers.

  • OCCI14: an API standard for the management of IaaS and PaaS.

  • CDMI [15]: an API specification standard for managing virtual storage.

These examples of cloud-specific specification standards generally aim to foster portability across cloud services. In the following sections, we will also discuss relevant cloud organization standards, mainly from ISO/IEC.

For a more detailed overview of standards in the cloud, we refer the reader to [16].

3 Standards and Accountability

The A4Cloud Project proposes the following definition of accountability: Accountability consists of defining governance to comply in a responsible manner with internal and external criteria, ensuring implementation of appropriate actions, explaining and justifying those actions and remedying any failure to act properly. Putting this definition into practice through concepts, mechanisms, tools and standards is the central goal of the project, as described in [23].

In 2013, ETSI (the European Telecommunications Standards Institute) was tasked with performing a gap analysis of cloud standards through its Cloud Standards Coordination taskforce (CSC), to which A4Cloud contributed. The results [17] of this gap analysis highlight the lack of cloud accountability-related standards, concluding that “…our analysis has shown that cloud computing governance and assurance standards specifically developed for and aimed at the cloud already exist (e.g., cloud controls framework, security cloud architectures, continuous monitoring of cloud service provider’s) and some of them are considered as sufficiently mature to be adopted. Further standardization work may be helpful as a supplement to best practices in areas such as incident management, cloud forensics, and cloud supply chain accountability management.” In this section, we show how standards can play an important role in bridging this conspicuous gap, first by describing A4Cloud initiatives to include accountability in ongoing cloud-related standards, and second by discussing how we could create specification standards to further bolster accountability through interoperability and automation.

3.1 A Project Strategy for Putting Accountability in Standards

From a project point of view, the relationship between accountability and standards is a two-way street. On one hand, A4Cloud aims to develop a set of practices, processes and tools that can be reused and extended by cloud providers and researchers. To maximize the success of such an approach, it is necessary to reuse existing standards as much as possible, to facilitate interoperability with existing systems and the dissemination of project results. On the other hand, many of these practices, processes and tools contain a degree of novelty that is not covered by any existing standard. It is therefore important for us to also consider how we can influence the design of existing and future standards to cover accountability requirements. ETSI highlighted [17] “the need for further standardization efforts in the area of accountability and cloud incident management (e.g., related with a SLA infringements).” They further added that “such work would greatly benefit the whole cloud supply chain, although once again the main challenge is trust/security assurance among the involved stakeholders.”

There is a very large potential for standardization work related to accountability, which extends well beyond the resources available to a single European research project such as A4Cloud. We therefore decided to build a standardization strategy based on a pragmatic approach. As part of this strategy, we classified cloud-related standards into the following categories:
  • Adopted standards: Standards that are adopted in the project without any change (e.g. XACML [8]).

  • Missing standards: Standards that are entirely missing with regard to accountability and that could not realistically be created within the timeframe of the project.

  • Partial standards: Standards that are strongly related to accountability, but that would require some modifications or extension.

As detailed in [16], we have focused our attention on standards in the third category where there is, furthermore, an opportunity to contribute, either because the standard is still at an incubating stage or because it is undergoing a revision. We describe a few examples of these standards below as an illustration.

ISO/IEC 17788 & 17789. ISO/IEC 17788 [18] is a fundamental standard titled “Cloud Computing Overview and Vocabulary” and aims to provide a terminology for other standards related to cloud computing. The companion ISO/IEC 17789 [19], which is titled “Cloud Computing Reference Architecture”, is also a fundamental standard that aims to provide a reference architecture for cloud computing, describing cloud computing roles, activities, functional components and their relationships, much like the NIST did (see Sect. 2.1).

Our main goal from the A4Cloud perspective is to add accountability as a crosscutting aspect in these two standards, thereby increasing the chance that accountability will be part of the discussion in future standardization processes that will reference these two fundamental standards as a foundation.

ISO/IEC 19086. This standard [9], titled “Service Level Agreement (SLA) Framework and Terminology”, provides a set of building blocks for SLAs, through the definition of some fundamental concepts and terminology, with the aim of setting a common framework for customers and providers. It is therefore again a fundamental standard, rather than a standard describing SLA templates and formats, despite its title.

Whereas traditional SLAs tend to focus on performance objectives only (e.g. uptime), our main goal is to influence the wording of the standard to make additional room for the inclusion of privacy, security and governance objectives in SLAs, as these are needed to support accountability.

NIST RATAX. As part of its ongoing work on a Cloud Reference Architecture and Taxonomy (RATAX), NIST has begun examining how to define metrics applicable to the monitoring of security properties described in an SLA [11]. As a starting point, this fundamental standard will likely define core concepts such as metrics, measurements, measurement units and measures in order to build a sound SLA metric architecture.

The goal pursued by A4Cloud is again to try to influence this work to allow room for the inclusion of metrics that apply not only to performance but also to accountability-related objectives, such as security, privacy and governance. As further discussed in Sect. 4.3, such a framework needs to be flexible enough to address the complexity of security measurements.

3.2 Interoperability as an Enabler of Accountability

One of the attractive aspects of the cloud ecosystem is the ability to build new cloud services and applications from other pre-existing cloud services and applications. This is typically exemplified by cloud services like Dropbox [20], which builds upon Amazon storage, or more complex services like Netflix,15 which combine IaaS, PaaS, and content distribution networks across the globe. The ability to make services work together seamlessly across complex supply chains is made possible by two largely intertwined features: interoperability and automation. Interoperability describes16 the “ability of a system or a product to work with other systems or products without special effort on the part of the customer” and is “made possible by the adoption of standards”. Formal or de facto standards specify common data formats, semantics and communication protocols adopted by actors in the cloud supply chain. The adoption of standards in turn facilitates automation of the processes involved in the provisioning of cloud services, unleashing the efficiencies that make the cloud successful. We believe that with adequate automation, the real or perceived costs associated with providing accountability in the cloud can be reduced. In turn, by reducing the cost of accountability we can encourage greater adoption of best practices for data stewardship.

As described in the next paragraphs, in order to support automated mechanisms that enable accountability provision in the cloud, we first identified all actors typically involved in cloud accountability interactions. Next, we found that their accountability-related interactions could be classified into 4 general subgroups, which in turn could be used to shape requirements for interoperability for the purpose of accountability.

As described in Sect. 2.1, the well-known NIST cloud supply chain taxonomy has shortcomings when it comes to the description of accountability actors. Nevertheless, because of its popularity, we chose to modify and extend that model rather than create a new one, resulting in the following cloud accountability taxonomy composed of 7 main roles:
  1. Cloud Subject: An entity (individual or organisation) whose data are processed by a cloud provider, either directly or indirectly.
  2. Cloud Customer: An entity (individual or organisation) that (1) maintains a business relationship with, and (2) uses services from, a Cloud Provider.
  3. Cloud Provider: An entity responsible for making a [cloud] service available to Cloud Customers.
  4. Cloud Carrier: The intermediary entity that provides connectivity and transport of cloud services between Cloud Providers and Cloud Customers.
  5. Cloud Broker: An entity that manages the use, performance and delivery of cloud services, and negotiates relationships between Cloud Providers and Cloud Customers.
  6. Cloud Auditor: An entity that can conduct independent assessments of cloud services, information system operations, performance and security of the cloud implementation, with regard to a set of requirements, which may include security, data protection, information system management, laws or regulations and ethics.
  7. Cloud Supervisory Authority: An entity that oversees and enforces the application of a set of rules.

Next, we found that we could classify the accountability interactions between any pair of those 7 actors into 4 main subgroups:
  1. Agreement covers all interactions that lead to one actor taking legal responsibility for the handling of certain data provided by another party according to a certain policy. These interactions may include a negotiation phase. A policy may express requirements that apply to all 7 core accountability attributes, and contributes to the implementation of the attributes of responsibility and liability.
  2. Reporting covers all interactions related to the reporting by an actor about current data handling practices and policies (e.g. reporting security breaches or providing security/privacy level indicators). This type of interaction mainly supports the implementation of the accountability attributes of transparency and observability.
  3. Demonstration covers all interactions that lead to one actor demonstrating the correct implementation of some data handling policies. This includes external verifications by auditors or cryptographic proofs of protocol executions, for example. This type of interaction mainly supports the implementation of the accountability attributes of verifiability and attributability. We emphasise that Demonstration differs from Reporting in that it implies some form of proof or provision of evidence.
  4. Remediation covers all interactions that lead one actor to seek and receive or offer remediation for failures to follow data handling policies. This mainly supports the implementation of the accountability attribute of remediability.

By cross-matching these 4 subgroups of interactions with the cloud accountability roles above, we identified 31 key interoperability requirements for accountability in the cloud. While we refer the reader to [21] for the details, we can highlight two key elements of this analysis. First and foremost, an essential requirement for enabling interoperability for the purpose of accountability in the cloud is the ability of 2 communicating parties to share a common understanding of security and data protection policy semantics and their associated metrics, be it for the purpose of agreement, reporting, demonstration and/or remediation. Unfortunately, this common ground for semantics hardly exists today [4]. For example, all major cloud providers use different semantics and metrics for availability [6]. The same can be said when two interacting actors use different technical standards to interpret properties such as “consent”, “confidentiality level” or “user information” (independently of their legal meaning), to give just a few examples.
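
To make this requirement more concrete, the following Python sketch shows what a shared, machine-readable definition of a single attribute and its metric could look like; the structures, names and values are illustrative assumptions and are not drawn from any existing standard.

```python
from dataclasses import dataclass

# Illustrative sketch only: a shared, machine-readable definition of a metric
# and an associated service level objective, so that a customer and a provider
# interpret the same attribute ("monthly uptime") in the same way.

@dataclass(frozen=True)
class Metric:
    name: str         # e.g. "monthly_uptime"
    unit: str         # e.g. "percent"
    measurement: str  # reference to an agreed measurement procedure

@dataclass(frozen=True)
class ServiceLevelObjective:
    metric: Metric
    operator: str     # ">=", "<=" or "=="
    threshold: float

# Both parties reference the same metric definition instead of each using its
# own notion of "availability".
MONTHLY_UPTIME = Metric(
    name="monthly_uptime",
    unit="percent",
    measurement="minutes the service answered health checks divided by "
                "total minutes in the calendar month",
)

slo = ServiceLevelObjective(metric=MONTHLY_UPTIME, operator=">=", threshold=99.95)

def is_met(objective: ServiceLevelObjective, observed: float) -> bool:
    """Evaluate an observed measurement against the agreed objective."""
    return {
        ">=": observed >= objective.threshold,
        "<=": observed <= objective.threshold,
        "==": observed == objective.threshold,
    }[objective.operator]

print(is_met(slo, 99.97))  # True
```

The point of such a shared definition is not the particular encoding, but that both parties evaluate the same measurement procedure against the same threshold.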

Second, accountability mechanisms have to be interoperable across the entire cloud supply chain. For example, if a cloud provider needs to report data stewardship information to a customer, it may itself need to obtain information from other providers acting as its sub-providers, while still preserving a common understanding of policy semantics.

With so many actors and interactions, we need to set priorities when attempting to automate accountability interactions in the cloud. The logical approach is to focus first on the most frequent and necessary interactions and only later on the more uncommon ones. In this respect, Reporting and Agreement are the two subgroups of interactions we should start with, focusing in particular on Cloud Customers, Cloud Providers, and Cloud Subjects (data subjects). At the other end of the spectrum, we expect Remediation interactions, and more generally interactions with supervisory authorities and auditors, to be rarer and therefore less of a priority for automation.

There are currently several significant initiatives that could provide interoperability and automation supporting accountability in the cloud. To begin with, the A4Cloud project itself is proposing a policy language, A-PPL, which is an extension of the PPL language [7], itself based on XACML [8]. More broadly, the A4Cloud project will produce a set of novel tools that aim to tackle the interoperability issues highlighted above. The Cloud Security Alliance is developing two relevant RESTful APIs: CloudAudit, to access audit data from cloud providers, and the Cloud Trust Protocol, for the continuous monitoring of security properties of cloud services, both contributing to automated Reporting and Demonstration interactions. As described in Sect. 3.1, NIST has begun examining how to define metrics applicable to the monitoring of security properties described in an SLA. Also in Sect. 3.1, we noted that ISO is developing a new standard for cloud SLAs. The European Commission is investigating model terms for cloud SLAs [22]. As these initiatives mature and related standards begin to be developed, we hope to see accountability as a service become a reality in the cloud in the next few years.

4 Standardization Challenges in Continuous Monitoring

As we argued in the previous section, reporting (and agreement) interactions should be a priority for interoperability efforts. In fact, many cloud customers hesitate to adopt the cloud because of a perceived loss of control and lack of transparency.17 By allowing customers to frequently query and receive up-to-date information about the security and compliance level of a cloud provider, we could create a path towards greater transparency and trust.

In this section, we therefore focus our attention on one particular type of accountability interaction described in the previous section: reporting interactions between Cloud Customers and Cloud Providers. More specifically, we look at the continuous monitoring of security attributes of cloud providers.

4.1 Defining Continuous Monitoring

We consider “continuous monitoring” to be the automated gathering of up-to-date, machine-readable information related to information security and privacy compliance.

By automated, we do not mean that the data is not produced by humans, but rather that it is made available by automated means. By up-to-date, we mean that the information is based on periodic assessments performed at reasonably short time intervals. The time interval between “checks” can be seconds, hours, days or weeks, and should be the shortest interval that allows the collection of up-to-date information for the purpose of monitoring the relevant security attributes of a cloud service. Such an approach is meant to complement traditional audits, which are limited to annual or bi-annual assessments, as in ISO/IEC 27001 certification (see Sect. 2.2).
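
As a minimal illustration of this definition, the sketch below periodically collects a machine-readable security attribute and flags stale or out-of-bounds values. The attribute source is a placeholder, and the interval, freshness threshold and response fields are assumptions made for this sketch rather than part of any published interface.

```python
import time
from datetime import datetime, timedelta

# Minimal sketch of the "continuous monitoring" loop defined above: collect a
# machine-readable security attribute at a short, regular interval and flag
# stale or out-of-bounds values. The attribute source is a placeholder; in
# practice it would be an automated retrieval from the provider (see Sect. 4.3).

CHECK_INTERVAL = timedelta(hours=1)   # "reasonably short" interval for this attribute
MAX_AGE = timedelta(hours=6)          # reject information that is no longer up to date

def fetch_monthly_uptime() -> dict:
    """Placeholder for an automated retrieval of the attribute from the provider."""
    return {"value": 99.96, "unit": "percent", "measured_at": datetime.utcnow()}

def monitor(threshold: float, rounds: int) -> None:
    for _ in range(rounds):
        attribute = fetch_monthly_uptime()
        age = datetime.utcnow() - attribute["measured_at"]
        if age > MAX_AGE:
            print("WARNING: attribute is stale, not up to date")
        if attribute["value"] < threshold:
            print(f"ALERT: uptime {attribute['value']}% below objective {threshold}%")
        time.sleep(CHECK_INTERVAL.total_seconds())

# monitor(threshold=99.95, rounds=24)  # one day of hourly checks
```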

4.2 The Scope of Continuous Monitoring: Security Attributes

In theory, many things are susceptible to continuous monitoring as defined above, but in practice providing access to relevant up-to-date security information in an automated way is usually challenging, unless the data is related solely to system performance (e.g. availability, throughput, etc.). In the field of security, the notion of “continuous monitoring” has been tentatively applied both to high-level “control objectives” and “controls”, and to lower-level objects such as “service level objectives”, “performance indicators” and “security properties”.

By nature, the control objectives proposed in control frameworks such as ISO/IEC 27002 or the Cloud Controls Matrix (CCM) are a mix of compliance, governance and technical aspects (see Sect. 2.2). As such, most control objectives contain elements that cannot be assessed by automated means on a reasonably short timeframe or continuous basis. For example, many control objectives in CCM v3.0 refer to “documented procedures”, “applicable legal obligations” or generic functional requirements such as “support forensic investigative capabilities”. These elements require human assessment, and no automated process would be capable of monitoring their effective implementation in an up-to-date fashion with current technology.

Controls, as instantiations of control objectives into precise mechanisms and policies, lend themselves better to “continuous monitoring”. Still, many controls contain elements that require human evaluation, which may not be feasible on a continuous basis. Often, what we can really monitor is specific individual characteristics of a control, or characteristics that show that a control has failed. In some cases, we will monitor the existence or effectiveness of a control by inference from information collected through the checking of other related controls or security attributes. For example, we might define a control that requires the implementation of a documented backup policy, which includes monthly testing of backup restorations associated with recovery point objectives (RPO). We may not be able to continuously monitor that the policy is up-to-date or that the technical backup mechanisms are in alignment with the policy, but we may monitor backup restoration frequency, success rate and the recovery point actually achieved (RPA) in simulated restorations, and contrast the latter with the RPO.
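
The following sketch illustrates this idea for the backup example; the record structure, thresholds and the 24-hour RPO are illustrative assumptions rather than values taken from any particular control framework.

```python
from datetime import datetime, timedelta

# Hypothetical sketch of monitoring the backup control discussed above: we
# cannot automatically verify that the written backup policy is up to date,
# but we can check measurable characteristics such as restoration test
# frequency, success rate, and the achieved recovery point (RPA) against the
# recovery point objective (RPO).

RPO = timedelta(hours=24)            # contractual objective (assumed)
MAX_DAYS_BETWEEN_TESTS = 31          # "monthly testing" requirement

def check_backup_control(restoration_tests: list[dict]) -> dict:
    """Each test record: {'date': datetime, 'success': bool, 'rpa': timedelta}.

    Assumes at least one successful restoration test exists.
    """
    last_test = max(t["date"] for t in restoration_tests)
    days_since_last = (datetime.utcnow() - last_test).days
    successes = [t for t in restoration_tests if t["success"]]
    success_rate = len(successes) / len(restoration_tests)
    worst_rpa = max(t["rpa"] for t in successes)
    return {
        "tested_recently": days_since_last <= MAX_DAYS_BETWEEN_TESTS,
        "success_rate": success_rate,
        "rpa_within_rpo": worst_rpa <= RPO,
    }
```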

We broadly define a security attribute as any non-functional security characteristic of a system that can be assessed quantitatively or qualitatively. Security attributes therefore encompass elements typically called elsewhere security properties, performance indicators, or (improperly) service level objectives.

Continuous monitoring seems to be more applicable to the notion of security attributes because of their more focused scope. Security attributes may include, for example, “monthly uptime”, “encryption level in transit/at rest”, “key length”, “incident response time”, “data erasure quality”, “country level anchoring”, etc.

These simpler characteristics we call “security attributes” are often (but not always) represented by values in a restricted domain and associated with specific metrics (i.e. a standard of measurement along with a measurement unit). Attributes come in many “flavors”: some attributes are largely declarative (e.g. processing location), while others are computed, requiring extensive calculations over system events (e.g. monthly uptime).
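
The two flavors can be sketched as follows; the attribute names, values and the simplified uptime formula are illustrative assumptions only.

```python
from datetime import timedelta

# Sketch of the two "flavors" of security attributes mentioned above.

# Declarative attribute: simply reported by the provider.
processing_location = {"attribute": "processing_location", "value": "BE"}

# Computed attribute: derived from a stream of system events (here, downtime events).
def monthly_uptime(downtime_events: list[timedelta], minutes_in_month: int = 43200) -> float:
    """Percentage of minutes in the month during which the service was up."""
    downtime_minutes = sum(event.total_seconds() / 60 for event in downtime_events)
    return 100.0 * (minutes_in_month - downtime_minutes) / minutes_in_month

print(monthly_uptime([timedelta(minutes=10), timedelta(minutes=12)]))  # roughly 99.95
```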

4.3 Standardizing the Measurement of Security Attributes

We next turn our attention to the necessity of standards for security attributes and their metrics.

In its simplest form, continuous monitoring could rely on a simple API that would allow Cloud Customers to securely query Cloud Providers about a set of security attributes, as part of an automated accountability-based approach. Defining such an API is one of the goals of the Cloud Trust Protocol18 currently being developed by CSA. Describing constraints on security attributes in a machine-readable format can also serve as a basis for the definition of Security Level Objectives19 in SLAs or for automated monitoring-based certificate generation.20 However, to achieve these goals, both the Cloud Customer and the Cloud Provider need to have a common interpretation of the definition and the measurement of the security attributes. Furthermore, if we want to allow Cloud Customers to compare different service offerings, this common interpretation should extend to multiple cloud customers and providers, thereby becoming a standard.
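
As a purely hypothetical illustration of such an attribute-query API, a customer-side query might look like the sketch below. The base URL, path, headers and response fields are assumptions made for this example; they are not taken from the actual Cloud Trust Protocol specification.

```python
import json
import urllib.request

# Hypothetical illustration of a simple attribute-query API of the kind
# discussed above; endpoint and response format are assumptions.

BASE_URL = "https://provider.example/monitoring/v1"

def query_attribute(name: str, api_token: str) -> dict:
    request = urllib.request.Request(
        f"{BASE_URL}/attributes/{name}",
        headers={"Authorization": f"Bearer {api_token}"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.load(response)

# A customer could then check the response against a Security Level Objective
# declared in the SLA, provided both parties interpret the metric identically:
#
#   attr = query_attribute("monthly_uptime", token)
#   assert attr["unit"] == "percent"
#   slo_met = attr["value"] >= 99.95
```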

Defining standards for the measurement of security attributes is, however, a challenging endeavor, as we can illustrate with 3 examples:

Uptime. All major cloud providers define at least one security attribute: availability, usually expressed as a percentage of uptime. Many IaaS providers claim a monthly uptime of at least 99.95%. Yet, as shown in [6], they all use a different method to calculate that number, making comparisons largely impossible.
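
To see why, consider the following illustrative calculation; the outage data and the two formulas are hypothetical, but they mirror the kind of divergence documented in [6].

```python
# Sketch of why uptime figures are hard to compare: two providers can report
# different numbers for the same month by using different (hypothetical) formulas.

MINUTES_IN_MONTH = 30 * 24 * 60  # 43200

# Outages observed in one month: (duration_minutes, affected_fraction_of_fleet)
outages = [(30, 1.0), (120, 0.05)]

# Provider A: only counts outages affecting the whole service.
downtime_a = sum(d for d, fraction in outages if fraction == 1.0)
uptime_a = 100.0 * (MINUTES_IN_MONTH - downtime_a) / MINUTES_IN_MONTH

# Provider B: weights downtime by the fraction of instances affected.
downtime_b = sum(d * fraction for d, fraction in outages)
uptime_b = 100.0 * (MINUTES_IN_MONTH - downtime_b) / MINUTES_IN_MONTH

print(round(uptime_a, 3), round(uptime_b, 3))  # 99.931 vs 99.917 for the same incidents
```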

Processing location. The location of data processing, at a country or regional level of granularity, is an important declarative attribute for regulatory compliance. However, consider the scenario where data is stored in a datacenter in Belgium but accessed for remote administration purposes from the US. Data protection authorities will typically consider that the data processing took place in both Belgium and the US, while businesses will typically advertise where the data is stored, namely Belgium.

Incident management. The quality of incident management can be measured in several complementary ways, such as the percentage of incidents reported or mitigated within a certain contractually agreed timeframe, and the number of incidents over a period. As broadly discussed in [4], this requires defining precisely what an “incident” is, and expressing measurements by differentiating incidents into normalized categories (e.g. severe/medium/low); otherwise we risk comparing apples and oranges.
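
The sketch below illustrates such a normalized measurement; the severity categories, agreed timeframes and incident records are illustrative assumptions rather than values from any real SLA.

```python
from datetime import timedelta

# Sketch of the incident-management measurement discussed above: incidents are
# first normalized into severity categories, then measured against the agreed
# response timeframe for each category.

AGREED_RESPONSE = {
    "severe": timedelta(hours=1),
    "medium": timedelta(hours=8),
    "low": timedelta(hours=72),
}

incidents = [
    {"severity": "severe", "response_time": timedelta(minutes=45)},
    {"severity": "medium", "response_time": timedelta(hours=12)},
    {"severity": "low",    "response_time": timedelta(hours=20)},
]

def percent_within_agreed(records: list[dict], severity: str) -> float:
    """Share of incidents of a given severity handled within the agreed timeframe."""
    relevant = [i for i in records if i["severity"] == severity]
    if not relevant:
        return 100.0
    met = [i for i in relevant if i["response_time"] <= AGREED_RESPONSE[severity]]
    return 100.0 * len(met) / len(relevant)

for severity in AGREED_RESPONSE:
    print(severity, percent_within_agreed(incidents, severity))
```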

With these 3 examples, we can easily imagine the challenges of properly standardizing other attributes that are relevant to data stewardship, such as confidentiality level, vulnerability management, data deletion level, durability or recovery point objective, retrievability, etc.

While the measurement of security attributes still offers many opportunities for research, several standardization initiatives are currently trying to tackle this problem. These include the previously mentioned work from CSA on CTP and SLA metrics, work from NIST on metrics for SLAs (see Sect. 3.1), and efforts from ISO to develop a foundation for SLA metrics (see Sect. 3.1).

5 Conclusion

Walking through standards related to the cloud may seem like walking through a dense forest, with many paths, intersections, and opportunities to get lost. In fact, a key action in the European Commission’s cloud strategy21 is to “cut through the jungle of standards”. One may feel that there are already too many standards for the cloud. Yet many standards related to cloud service interfaces, portability, quality and management are still in their infancy. Furthermore, as this paper has shown, the cloud needs many more standards in order to enable “plug-and-play” accountability, security, and compliance. The approach to standardization developed by A4Cloud seeks to provide meaningful contributions to relevant standardization and technical recommendation bodies in order to start bridging the identified accountability gaps.

This is just the beginning.


References

  1. ISO/IEC/IEEE 29119-1:2013: Software and systems engineering - Software testing - Part 1: Concepts and definitions, August 2013
  2. International Organization for Standardization: ISO/IEC 27002: Information technology - Security techniques - Code of practice for information security management. ISO/IEC (2005)
  3. International Organization for Standardization: ISO/IEC 27001:2013: Information technology - Security techniques - Information security management systems - Requirements. ISO/IEC (2013)
  4. Hogben, G., Dekker, M. (eds.): Procure Secure: A Guide to Monitoring of Security Service Levels in Cloud Contracts. ENISA (2012)
  5. Liu, F., Tong, J., Mao, J., Bohn, R., Messina, J., Badger, L., Leaf, D.: NIST Cloud Computing Reference Architecture. NIST Special Publication 500-292 (2011)
  6. Hogben, G., Pannetrat, A.: Mutant apples: a critical examination of cloud SLA availability definitions. In: IEEE 5th International Conference on Cloud Computing Technology and Science (CloudCom), December 2013
  7. Ardagna, C.A., Bussard, L., De Capitani di Vimercati, S., Neven, G., Paraboschi, S., Pedrini, E., Preiss, S., Raggett, D., Samarati, P., Trabelsi, S., Verdicchio, M.: PrimeLife policy language (2009). http://www.w3.org/2009/policy-ws/papers/Trabelisi.pdf
  8. OASIS Standard: eXtensible Access Control Markup Language (XACML) Version 3.0, 22 January 2013. http://docs.oasis-open.org/xacml/3.0/xacml-3.0-core-spec-os-en.html
  9. ISO/IEC NP 19086: Information technology - Distributed application platforms and services - Cloud computing - Service level agreement (SLA) framework and terminology. Under development, November 2013
  10. European Commission: Cloud Service Level Agreement Standardisation Guidelines. Technical report, Cloud Select Industry Group (C-SIG), June 2014. https://ec.europa.eu/digital-agenda/en/news/cloud-service-level-agreement-standardisation-guidelines
  11. National Institute of Standards and Technology: NIST Cloud Computing: Cloud Service Metrics Description (RATAX). Working document (2014)
  12. International Organization for Standardization: ISO/IEC 19464:2014: Information technology - Advanced Message Queuing Protocol (AMQP) v1.0 specification. ISO/IEC (2014)
  13. Network Working Group of the IETF: RFC 4252: The Secure Shell (SSH) Authentication Protocol, January 2006
  14. International Organization for Standardization: ISO/IEC 17203:2011: Open Virtualization Format. ISO/IEC (2011)
  15. Storage Networking Industry Association: Cloud Data Management Interface, Version 1, 12 April 2010
  16. A4Cloud: Deliverable D:A-5.1: Report on A4Cloud contribution to standards, September 2014
  17. ETSI: Cloud Standards Coordination - Final Report, Version 1, November 2013
  18. International Organization for Standardization: ISO/IEC DIS 17788: Information technology - Cloud computing - Overview and vocabulary. Under development, ISO/IEC JTC 1/SC 38
  19. International Organization for Standardization: ISO/IEC DIS 17789: Information technology - Cloud computing - Reference architecture. Under development, ISO/IEC JTC 1/SC 38
  20. Drago, I., Mellia, M., Munafo, M.M., Sperotto, A., Sadre, R., Pras, A.: Inside Dropbox: understanding personal cloud storage services. In: Proceedings of the 2012 ACM Internet Measurement Conference (IMC 2012), pp. 481-494. ACM, New York (2012)
  21. Alain, P., Vasilis, T., Daniele, C.: D:C-3.1: Requirements for cloud interoperability. A4Cloud public deliverable, November 2013
  22. European Commission: Cloud Service Level Agreement Standardisation Guidelines. Technical report, Cloud Select Industry Group (C-SIG), June 2014
  23. Massimo, F., Theofrastos, K., Siani, P.: Accountability for data governance in cloud ecosystems. In: 2013 IEEE 5th International Conference on Cloud Computing Technology and Science (CloudCom), vol. 2, pp. 327-332, December 2013

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Cloud Security Alliance, Edinburgh, Scotland, UK
