Design and Evaluation of Quality in Cloud Services


To design and evaluate the critical qualities of Cloud services and systems, the eOrganisation research group develops solutions that address complex requirements such as elastic scalability and adaptive security: individual quality properties need to be measurable and comparable, and they must be evaluated with their trade-offs in mind. A holistic view of quality properties encompasses both the perspective of the service provider (optimizing, e.g., resource usage) and that of the service consumer (optimizing, e.g., performance).

Quality Measurement and Benchmarking

In this field, the team develops experimental and qualitative methods and tools to determine the consistency, elasticity, performance, and security of Cloud systems (especially Cloud databases) as well as Cloud services and applications (for example, processes).
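A common pattern in experimental consistency benchmarking (as studied in the publications below) is to write a uniquely identifiable value and then poll reads until it becomes visible; the elapsed time approximates the inconsistency window. The following is a minimal sketch of that pattern only; the toy in-memory store class and its fixed replication delay are illustrative assumptions, not the group's actual tooling:

```python
import time

class EventuallyConsistentStore:
    """Toy store: writes become visible only after a fixed replication delay."""
    def __init__(self, replication_delay=0.05):
        self.replication_delay = replication_delay
        self._pending = {}   # key -> (value, time at which the write becomes visible)
        self._visible = {}   # key -> currently readable value

    def put(self, key, value):
        self._pending[key] = (value, time.monotonic() + self.replication_delay)

    def get(self, key):
        # Promote a pending write once its visibility time has passed.
        if key in self._pending:
            value, visible_at = self._pending[key]
            if time.monotonic() >= visible_at:
                self._visible[key] = value
                del self._pending[key]
        return self._visible.get(key)

def measure_staleness(store, key="probe", timeout=1.0):
    """Write a unique marker, then poll reads until it appears;
    the elapsed time approximates the inconsistency window."""
    marker = f"v-{time.monotonic()}"
    start = time.monotonic()
    store.put(key, marker)
    while time.monotonic() - start < timeout:
        if store.get(key) == marker:
            return time.monotonic() - start
        time.sleep(0.005)
    return None  # marker never became visible within the timeout

staleness = measure_staleness(EventuallyConsistentStore())
```

Against a real Cloud store, `put`/`get` would be replaced by the service's client API, and many such probes would be aggregated into a staleness distribution rather than a single sample.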

Selected publications from this field:

David Bermbach, Stefan Tai
Benchmarking Eventual Consistency: Lessons Learned from Long-Term Experimental Studies
Proceedings of the 2nd IEEE International Conference on Cloud Engineering (IC2E), IEEE

Markus Klems, David Bermbach, René Weinert
A Runtime Quality Measurement Framework for Cloud Database Service Systems
Proceedings of the 8th International Conference on the Quality of Information and Communications Technology, pages 38-46, Conference Publishing Services (CPS), September 2012

Alexander Lenk, Michael Menzel, Johannes Lipsky, Stefan Tai
What are you paying for? Performance benchmarking for Infrastructure-as-a-Service offerings
Proceedings of the 4th IEEE International Conference on Cloud Computing (CLOUD 2011), in print, Washington, D.C., July 2011

David Bermbach, Stefan Tai
Eventual Consistency: How soon is eventual? An Evaluation of Amazon S3's Consistency Guarantees
Proceedings of the 6th MW4SOC Workshop of the 12th International Middleware Conference 2011, ACM

Markus Klems, Michael Menzel, Robin Fischer
Consistency Benchmarking: Evaluating the Consistency Behavior of Middleware Services in the Cloud
In Paul P. Maglio, Mathias Weske, Jian Yang, Marcelo Fantinato (eds.), Proceedings of the 8th International Conference on Service Oriented Computing (ICSOC), pages 627-634, Springer, Lecture Notes in Computer Science, Berlin Heidelberg, December 2010


Quality Trade-Offs and Optimization

Qualities depend on each other in ways that induce trade-offs between them (e.g., between consistency, availability, and partition tolerance). The group researches these trade-offs experimentally and qualitatively, and develops methods and tools to optimize and adapt systems to the dynamic requirements of providers and consumers.
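A concrete instance of such a trade-off is the choice of quorum sizes in a replicated store (the setting of the MetaStorage work listed below): with N replicas, a read quorum R, and a write quorum W, the condition R + W > N guarantees that every read quorum intersects every write quorum, so reads see the latest write; smaller quorums reduce latency but may return stale data. A minimal sketch of this rule, with hypothetical helper names chosen for illustration:

```python
from itertools import product

def quorum_properties(n, r, w):
    """For an n-replica quorum system with read quorum r and write quorum w:
    r + w > n means every read quorum overlaps every write quorum, so reads
    observe the latest write (strong consistency). Smaller r or w lowers
    latency (fewer replicas must respond) at the risk of stale reads."""
    return {
        "replicas": n,
        "read_quorum": r,
        "write_quorum": w,
        "strongly_consistent": r + w > n,
    }

# Enumerate all (r, w) configurations for n = 3 replicas.
configs = [quorum_properties(3, r, w) for r, w in product(range(1, 4), repeat=2)]
strong = [c for c in configs if c["strongly_consistent"]]
```

Adapting R and W at runtime, per the current provider/consumer requirements, is one way such consistency-latency trade-offs can be managed dynamically.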

Selected publications from this field:

Michael Menzel, Marten Schönherr, Stefan Tai
(MC²)²: Criteria, Requirements and a Software Prototype for Cloud Infrastructure Decisions
Software: Practice and Experience, 2011

David Bermbach, Markus Klems, Michael Menzel, Stefan Tai
MetaStorage: A Federated Cloud Storage System to Manage Consistency-Latency Tradeoffs
Proceedings of the 4th IEEE International Conference on Cloud Computing (CLOUD 2011), IEEE Press, Washington, D.C., July 2011
