The Influence of Delay on Cloud Gaming Quality of Experience (T-Labs Series in Telecommunication Services) 3030998681, 9783030998684


English | Pages: 175 [166] | Year: 2022


Table of contents:
Preface
Acknowledgments
Contents
About the Author
Acronyms
1 Introduction
1.1 Motivation
1.2 Research Question and Scope
1.3 Structure of the Book
2 Gaming QoE Assessment
2.1 Introduction
2.2 Cloud Gaming
2.3 Concept of QoE
2.4 Gaming Taxonomy
2.5 Gaming QoE Influencing Factors
2.5.1 Human Factors
2.5.2 Context Factors
2.5.3 System Factors
Media-Related System Influencing Factors
Playing Devices
Network
Game
2.6 Quality Features
2.6.1 Game-Related Aspects
Interaction Quality
Playing Quality
Appeal
2.6.2 Player Experience Aspects
2.7 Methods for Assessing Gaming QoE
2.7.1 Questionnaire
2.7.2 Experimental Setup of Laboratory Environment
2.7.3 Experimenting Using Crowdsourcing Platform
Rating Platform
Crowd Gaming Framework
2.8 Summary
3 Influence of Delay on Gaming QoE
3.1 Introduction
3.2 Interactive Dataset of ITU-T Rec. G.1072
3.2.1 Data Analyses
3.3 Game Characteristics Influencing Delay Sensitivity of Games
3.3.1 Focus Group Study (Study 3.1)
Demographics of Participants
Experiment Details
Identified Characteristics
3.3.2 Definition and Categorization of the Characteristics
Temporal Accuracy (TA)
Spatial Accuracy (SA)
Predictability (PR)
Number of Input Directions (NID)
Consequences (CQ)
Importance of Actions (IoA)
Number of Required Actions (NRA)
Feedback Frequency (FF)
Type of Input (ToI)
3.3.3 Quantification of the Game Characteristics (Study 3.2)
Experiment Details
Data Analysis
3.3.4 Characteristics Grouping
3.3.5 Discussion
3.4 Delay Sensitivity Classification
3.4.1 Dataset Details
Crowdsourcing Game Dataset (CSGDS)
3.4.2 Decision Tree
3.4.3 Performance of the Classification
3.4.4 Discussion
3.5 User Strategy and User Inputs (Study 3.3)
3.5.1 Experiment Details
3.5.2 Influence of Player Strategies on Delay Perception
3.5.3 Influence of Player Strategies on User Input Characteristics
3.5.4 IQ and Gaming QoE
3.5.5 Discussion
3.6 Predicting the Influence of Delay on IQ
3.7 Impact of Performance and Delay on Gaming QoE (Study 3.4)
3.7.1 Experiment Details
3.7.2 Demographic Information of the Test Participants
3.7.3 Influence of Performance on Gaming QoE
3.7.4 Relationship Between Delay, Performance, and QoE
3.7.5 Discussion
3.8 Summary
4 Influence of Time-Varying Delay on Gaming QoE
4.1 Introduction
4.2 Influence of Jitter on Gaming QoE (Study 4.1)
4.2.1 Experiment Details
4.2.2 Data Analysis
4.2.3 Comparison of Crowdsourcing and Laboratory Study (Study 4.2)
Data Analysis
4.2.4 Discussion
4.3 The Serial-Position Effects of Delay on Gaming QoE
4.3.1 Experiment Details
4.3.2 Data Analysis
4.3.3 Discussion
4.4 Impact of Gamers' Adaptation to Delay on Gaming QoE
4.4.1 Experiment Details (Study 4.3)
4.4.2 Data Analysis
Correlation Between Subjective and Objective Performance
Impact of User Adaptability to Delay on User Performance
Impact of User Adaptability to Delay on Gaming QoE
4.4.3 Discussion
4.5 Summary
5 Delay Compensation Technique
5.1 Introduction
5.2 Overview of Delay Compensation Techniques
5.2.1 Delay Compensation on Online Gaming
5.2.2 Delay Compensation on Cloud Gaming
5.3 Proposed Technique (Study 5.1)
5.3.1 Experiment Details
5.3.2 Adaptation by Reducing the Spatial Accuracy
5.3.3 Adaptation by Reducing the Temporal Accuracy
5.3.4 Adaptation Using Both Temporal and Spatial Accuracy
5.3.5 Discussion
5.4 Extension of the Proposed Delay Compensation Technique (Study 5.2)
5.4.1 Experiment Details
Demographic Information of the Test Participants
5.4.2 Evaluation and Results
Adaptation Using Spatial Accuracy
Adaptation Using Temporal Accuracy
Adaptation Using Predictability
Adaptation Using Number of Required Actions
Adaptation Using Consequences
Visible and Invisible Adaptation
5.4.3 Discussion
5.5 Delay Compensation Techniques on the Variation of Delay
5.5.1 Game Adaptation Technique in Jitter
5.5.2 De-Jitter Buffering and the Adaptation
5.5.3 Dynamic Adaptation and Invisible Adaptation
5.6 Discussion and Summary
6 Conclusion and Outlook
6.1 Summary
6.2 Answers to Research Questions
6.2.1 Research Question 1
RQ1.1: What Are the Game Characteristics Influencing the Delay Sensitivity of Cloud Games?
RQ1.2: Can Games be Classified Regarding Their Delay Sensitivity Based on Expert Judgement?
RQ1.3: Can the Delay Sensitivity Classification Improve Gaming QoE Prediction?
6.2.2 Research Question 2
RQ2.1: What Is the Impact of Jitter on Gaming QoE?
RQ2.2: What Is the Impact of Serial-Position Effects on Gaming QoE?
RQ2.3: What Is the Impact of Frequent Delay Variation on Gaming QoE?
6.2.3 Research Question 3
RQ3.1: Can Adaptation Technique Compensate for the Influence of Delay?
RQ3.2: Can Adaptation Technique Compensate for the Influence of Delay Variation?
6.3 Contributions of the Book
6.4 Limitations
6.5 Future Work
A Additional Material Related to Subjective Experiments
B Snapshot of the Game Used in CSGDS
C Questionnaire Used for Expert Judgment
References
Index

T-Labs Series in Telecommunication Services

Saeed Shafiee Sabet

The Influence of Delay on Cloud Gaming Quality of Experience

T-Labs Series in Telecommunication Services Series Editors Sebastian Möller, Quality and Usability Lab, Technische Universität Berlin, Berlin, Germany Axel Küpper, Telekom Innovation Laboratories, Technische Universität Berlin, Berlin, Germany Alexander Raake, Audiovisual Technology Group, Technische Universität Ilmenau, Ilmenau, Germany

It is the aim of the Springer Series in Telecommunication Services to foster an interdisciplinary exchange of knowledge addressing all topics which are essential for developing high-quality and highly usable telecommunication services. This includes basic concepts of underlying technologies, distribution networks, architectures and platforms for service design, deployment and adaptation, as well as the users’ perception of telecommunication services. By taking a vertical perspective over all these steps, we aim to provide the scientific bases for the development and continuous evaluation of innovative services which provide a better value for their users. In fact, the human-centric design of high-quality telecommunication services – the so-called “quality engineering” – forms an essential topic of this series, as it will ultimately lead to better user experience and acceptance. The series is directed towards both scientists and practitioners from all related disciplines and industries. ** Indexing: books in this series are indexed in Scopus **

Saeed Shafiee Sabet

The Influence of Delay on Cloud Gaming Quality of Experience

Saeed Shafiee Sabet Holistic Systems Simula Metropolitan Center for Digital Engineering Oslo, Norway

Zugl.: Berlin, Technische Universität, Diss., 2022 u. d. T. Understanding and Mitigating the Influence of Delay on Cloud Gaming Quality of Experience. ISSN 2192-2810 ISSN 2192-2829 (electronic) T-Labs Series in Telecommunication Services ISBN 978-3-030-99868-4 ISBN 978-3-030-99869-1 (eBook) https://doi.org/10.1007/978-3-030-99869-1 © The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations. This Springer imprint is published by the registered company Springer Nature Switzerland AG The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland

This book is dedicated to my father, my mother, and my wife, Maryam. Thank you for all your support and encouragement.

Preface

The gaming industry is growing very fast and is now the most lucrative industry in the entertainment sector, surpassing other entertainment industries such as the box office and the music industry. The growth of the market has increased the competition over delivering a better Quality of Experience (QoE) to users and has led to the emergence of new services. Cloud gaming is one such recently emerged service. It requires a reliable and low-latency network to run smoothly, which makes it a very challenging service. Using a cloud gaming service with high latency harms the user’s interaction with the game, leading to a QoE degradation. However, the negative effect of delay on gaming QoE depends strongly on the content of the games: at a given level of delay, a slow-paced card-playing game is typically not as delay sensitive as a shooting game. For optimal resource allocation and good quality estimation, cloud providers, game developers, and network planners need to consider the impact of the game content. This book contributes to a better understanding of the impact of delay on QoE by identifying game characteristics that influence a user’s delay perception and by predicting the gaming QoE degradation caused by delay. In addition, using the insights gained, a delay compensation technique is proposed that mitigates the negative influence of delay on gaming QoE by using the game characteristics to adapt the games. In summary, the main contributions of this book are (1) the identification of the characteristics influencing delay sensitivity and an expert evaluation metric to quantify these characteristics, (2) the development of a delay sensitivity classification, (3) the development of a model to predict the gaming QoE degradation caused by delay, (4) the investigation of the influence of several types of delay variation on gaming QoE, and (5) a game adaptation technique that mitigates the impact of delay and delay variation on gaming QoE.


The work presented was carried out within the scope of my PhD dissertation, conducted under the supervision of Prof. Dr.-Ing. Sebastian Möller at the Quality and Usability Lab of the Technische Universität Berlin from 2018 to 2021.

Oslo, Norway
January 2022

Saeed Shafiee Sabet

Acknowledgments

I would like to express my deepest gratitude to my supervisor Prof. Dr.-Ing. Sebastian Möller for providing me with guidance throughout all these years, and to Prof. Dr.-Ing. Carsten Griwodz, whom I thank for his continuous support and recommendations on the various decisions I had to make. I want to give special thanks to Prof. Dr. Pål Halvorsen and Prof. Dr. Michael Riegler for leading the HOST department and providing me with all kinds of support to conduct the user studies presented in this work; I am very thankful for all your advice on both the academic and personal levels. I would also like to thank Ragnhild Eg; I highly appreciate your effort and help in co-examining my dissertation. My time at the Quality and Usability Lab and Simula Research Lab was really joyful, and I met many great colleagues, some of whom turned into good friends. I would particularly like to thank two of my greatest colleagues, Steven Schmidt and Saman Zadtootaghaj, with whom I had numerous discussions on various topics and projects. Steven, thank you for all the valuable feedback and insights that you provided, and Saman, thank you for always being supportive in both academics and my personal life. I would also like to thank Babak Naderi, whom I greatly appreciate for the discussions and recommendations he provided me on statistics and crowdsourcing studies. Thank you, all three of you, for being such good friends and helping me in all stages of this work. I would like to acknowledge some of my colleagues from the HOST Department at SimulaMet: Håkon Stensland, Cise Midoglu, Daniel Schroeder, Sajjad Amouie, Zohaib Hassan, Pegah Salehi, Debesh Jha, Vajira Thambawita, Steven Hicks, Hanna Borgli, Pia Smedsrud, Andrea Storås, Inga Strümke, and Thu Nguyen; and from the Quality and Usability Lab: Jan-Niklas Voigt-Antons, Nabajeet Barman, Salar Mohtaj, Stefan Uhrig, Rafael Zequeira, Gabriel Mittag, Tanja Kojic, Falk Schiffner, and Thilo Michael.
Many thanks to the Simula management team for giving me this opportunity: Aslak Tveito, Marianne Aasen, Nina Lillevand, Maria Benterud, Elin Christophersen, Sven-Arne Reinemo, Marianne Sundet, Sissi Chan, and Fanny Klang. Many thanks to Irene Hube-Achter and Yasmin Hillebrenner for their support during my visits to the QU Lab.



About the Author

Saeed Shafiee Sabet received his master’s degree in information technology from the University of Tehran in 2018 and his PhD degree from the Technische Universität Berlin in 2021, on modeling the gaming quality of experience under the supervision of Prof. Dr.-Ing. S. Möller. He is currently a postdoctoral researcher at Simula Research Lab and Oslo Metropolitan University, working on applied artificial intelligence in user quality of experience and virtual reality. His research interests include multimedia systems and networking, specifically cloud gaming, VR, and applied AI for multimedia systems.


Acronyms

AC - Service Acceptability
ACR - Absolute Category Rating
ACR-HR - ACR with Hidden Reference
ANOVA - Analysis of Variance
AP - Appeal
APM - Actions per Minute
AQ - Audio Quality
AVE - Average Variance Extracted
AWS - Amazon Web Services
CI - Confidence Interval
CFA - Confirmatory Factor Analysis
CFI - Comparative Fit Index
CG - Crowdgaming
CH - Challenge
CMIN - Minimum Discrepancy
CN - Controllability
CO - Competency
CR - Composite Reliability
CS - Crowdsourcing
DCR - Degradation Category Rating
DMOS - Differential MOS
DoF - Degrees of Freedom
DSCQS - Double Stimulus Continuous Quality Scale
EC-ACR - Extended Continuous ACR
ECG - Electrocardiography
EDA - Electrodermal Activity
EEG - Electroencephalography
EFA - Exploratory Factor Analysis
EMG - Electromyography
FL - Flow
fps - Frames per second

FSS - Flow State Scale
GEQ - Game Experience Questionnaire
GIPS - Gaming Input Quality Scale
GPU - Graphics Processing Unit
GTA - Grand Theft Auto
HCI - Human-Computer Interaction
HDR - High-Dynamic-Range
HEVC - High-Efficiency Video Coding
HIT - Human Intelligence Task
HMDs - Head-Mounted Displays
HR - Heart Rate
HUD - Heads-Up Display
IC - Intuitive Controls
IEQ - Immersive Experience Questionnaire
IF - Immediate Feedback
iGEQ - in-game Game Experience Questionnaire
IM - Immersion
IR - Item Reliability
ITU-T - Telecommunication Standardization Sector of the International Telecommunication Union
KMO - Kaiser-Meyer-Olkin
LE - Learnability
MaxRH - Maximal Reliability
MDA - Mechanics-Dynamics-Aesthetics
ML - Maximum-Likelihood
MMO - Massively Multiplayer Online
MOS - Mean Opinion Score
MSV - Maximum Shared Variance
NA - Negative Affect
NPX - Negative Player Experience
PA - Positive Affect
PAF - Principal Axis Factoring
PC - Pair Comparison
PCA - Principal Component Analysis
PClose - p of Close Fit
PENS - Player Experience and Need Satisfaction
PI - Performance Indication
PLCC - Pearson Linear Correlation Coefficient
PLEXQ - Playful Experiences Questionnaire
PPX - Positive Player Experience
PR - Playing Performance
PX - Player Experience
PXI - Player Experience Inventory
QoE - Quality of Experience
QoS - Quality of Service

RE - Responsiveness
Rec. - Recommendation
RMSE - Root Mean Square Error
RMSEA - Root Mean Square Error of Approximation
RQ - Research Question
RTCP - Real-Time Control Protocol
RTP - Real-Time Transport Protocol
RTSP - Real-Time Streaming Protocol
RTT - Round-Trip Time
SE - Standardized Effect estimates
SEM - Structural Equation Modeling
SG12 - Study Group 12
SI - Spatial Information index
SRCC - Spearman’s Rank Correlation Coefficient
SRMR - Standardized Root Mean Square Residual
SSCQE - Single Stimulus Continuous Quality Evaluation
SVQ - Spatial Video Quality
TCP - Transmission Control Protocol
TE - Tension
TI - Temporal Information index
TVQ - Temporal Video Quality
UDP - User Datagram Protocol
UESz - User Engagement Scale
URL - Uniform Resource Locator
UX - User Experience
VD - Video Discontinuity
VF - Video Fragmentation
VIF - Variance Inflation Factor
VL - Suboptimal Video Luminosity
VM - Virtual Machine
VQ - Video Quality
VU - Video Unclearness
WebRTC - Web Real-Time Communication

Chapter 1

Introduction

1.1 Motivation

Playing games dates back a long time in history, to when ancient humans made dice from bones. The Dutch historian Huizinga states that “PLAY is older than culture” [1], and play exists in many aspects of human life. This also holds for the digital era: with the advancement of technology and the invention of the first computers, digital games came into existence. Digital games emerged in the early 1950s, when interaction with a game took place by feeding punch cards. However, it was two decades later, during the 1970s, that the first generation of gaming consoles came to the market. This marked the start of the golden age of digital gaming in the early 1980s, when personal computers as well as gaming consoles were well established and entered users’ homes. The gaming market has been growing ever since. Nowadays, the first generation of digital gamers, who grew up with games, are adults with spending power. Unlike at the beginning of digital gaming, gaming today is not only for children; parents and grown-ups play video games as well. The number of gamers worldwide was estimated at 2.7 billion in 2020 based on a report from Statista,1 which means that out of every three people in the world, at least one is a gamer. This has turned the gaming industry into a billion-dollar business that has surpassed many other entertainment industries, such as the music and box-office industries. The global gaming market was valued at USD 162.32 billion in 2020 and is expected to reach a value of USD 295.63 billion by 2026,2 registering a Compound Annual Growth Rate (CAGR) of 10.5% over the forecast period (2021–2026).

1 https://www.statista.com/statistics/293304/number-video-gamers/. 2 https://www.mordorintelligence.com/industry-reports/global-games-market.
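As a quick sanity check on the arithmetic, the quoted CAGR follows directly from the two market figures (the variable names below are illustrative, not from the source):

```python
# Compound Annual Growth Rate implied by the figures above:
# USD 162.32 bn in 2020 growing to USD 295.63 bn by 2026 (6 years).
start, end, years = 162.32, 295.63, 6
cagr = (end / start) ** (1 / years) - 1
print(f"{cagr:.1%}")  # 10.5%
```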


However, this giant gaming industry is not only growing in the market but also from a technology perspective. With further advances in technology and affordable broadband connections, online gaming emerged. Before online gaming, multiplayer gaming and playing with friends were limited to physical presence. Online gaming communicates with the other players by sending and receiving updates of the game state, enabling play with friends and other people over a network. However, playing over a network introduced new challenges, as network delay has a significant negative influence on the users’ Quality of Experience (QoE), which is described by the degree of delight or annoyance of the user [2]. A classical experience of delay in online games was the inconsistency of object positions [3], e.g., shooting at an already-dead enemy in a First Person Shooter (FPS) game, or a racing game in which a player thinks he or she finished the race first, but the arriving update shows that the opponent finished earlier. With the development of delay compensation techniques in online gaming, the major challenge of delay was resolved to a high degree, and the success of online gaming became possible. However, even now, the influence of low delays is perceivable, and these techniques are still being improved. Another advancement of digital games, and the one most noticeable to users, is the progress in graphics quality and texture detail. This progress made the life cycle of hardware such as graphics cards very short, forcing users to upgrade their devices frequently to be able to play the latest games at high quality. Thus, the concept of cloud gaming emerged to resolve this issue by executing the heavy processing in the cloud.
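The position inconsistency described above can be made concrete with a minimal sketch (all speeds and delays below are made-up illustrative values, not measurements from the book): a client that renders a remote object from a state update that is one network delay old shows it displaced by roughly speed times delay, and dead reckoning, one family of online-gaming compensation techniques, shrinks that error by extrapolating with an estimated speed.

```python
def stale_position_error(speed, delay_ms):
    """Positional error (in game units) when the client renders a remote
    object from a state update that is delay_ms old.
    speed is the object's speed in game units per millisecond."""
    return speed * delay_ms

def dead_reckoned_error(speed, delay_ms, predicted_speed):
    """Residual error when the client extrapolates the object's motion
    (dead reckoning) using an estimate of its speed instead of
    freezing it at the last known position."""
    return abs(speed - predicted_speed) * delay_ms

# An object moving at 0.5 units/ms seen through 100 ms of delay:
error = stale_position_error(0.5, 100)          # 50 units off target
residual = dead_reckoned_error(0.5, 100, 0.45)  # ~5 units with prediction
```

This is why aiming at where an opponent appears to be can miss under delay; the actual compensation techniques used by commercial games vary and are surveyed later in the book.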
In recent years, due to the existence of fast and reliable core networks and the increasing prevalence of cloud infrastructures, more services tend to move from end users’ devices to cloud servers. While online games are rendered on the client device, the idea behind cloud gaming is to run the whole game on a remote server and send the rendered scene as a video stream to the clients. This architecture has many advantages for both users and game developers. Among the most beneficial advantages of cloud gaming: users can play games anywhere, anytime, and on any device; they do not need to frequently spend money on upgrading their devices or purchasing new consoles; and they do not need to spend hours on downloading, frequent updating, and installation of games. Cloud gaming can come with subscription services and a pay-as-you-play scheme. On the other hand, game developers do not need to worry about software piracy, cross-platform development, or updating distributions, as the game always stays on a central platform. The first cloud gaming service that came to the market was OnLive, officially launched in March 2010. However, it failed to create an adequate level of QoE for its users and was sold to Sony Entertainment in 2015. Gaikai was another cloud gaming service that was sold to Sony; it was launched in 2012 and focused on game advertising by letting users play demos of games. A few years later, with the pervasiveness of broadband networks, many big companies such as Sony with PlayStation Now, Google with Stadia, Nvidia with GeForce Now, Microsoft with Project xCloud, and Amazon with Luna entered


the cloud gaming industry and are heavily investing in the development of this service. However, despite the many advantages and the attractiveness of cloud gaming, it faces some serious challenges on which its success depends. The reason why services like OnLive did not succeed in delivering a good QoE to their users lies in these challenges. Cloud gaming requires a high-bandwidth and low-latency network. With the improvement of broadband networks and the availability of high-bandwidth connections for home users, bandwidth is not as big an issue as it used to be. However, similar to the early days of online gaming, latency is still a major challenge for the success of cloud gaming. Furthermore, cloud gaming is even more sensitive to network delay than online gaming due to the absence of instant local feedback. Therefore, most online gaming delay compensation techniques are not applicable to cloud gaming, and the development of delay compensation techniques specifically for cloud gaming services is required. To develop such techniques, knowledge and understanding of the influence of delay on cloud games are crucial. However, delay in real networks is not constant; it varies over time, which creates jitter, serial-position effects (e.g., the moment at which delay occurs), and frequent delay switching. On the other hand, it has been found that delay does not influence all cloud games in the same way; e.g., a card-playing game with low interaction is not as sensitive to delay as an FPS game. Therefore, awareness of the game requirements is necessary and highly important for cloud service providers, game developers, and network planners in order to provide a better QoE, optimal resource allocation, and quality estimation, and to develop delay compensation techniques. First, this book looks into the characteristics of games and the reasons why some games are more sensitive to delay than others.
Additionally, one aim is to use these identified characteristics to build a human-understandable classification of games that distinguishes between games based on their sensitivity towards delay. Next, the impact of various delay variations in terms of the position, pattern, and amount of delay will be investigated. Furthermore, the obtained knowledge will be applied to propose a mitigation technique that reduces the sensitivity of games towards delay and, consequently, improves the gaming QoE in case of network delay.

1.2 Research Question and Scope

In order to achieve these goals, the book addresses the following Research Questions (RQs):

• RQ1: What is the impact of delay on cloud gaming QoE?
– RQ1.1: What are the game characteristics influencing the delay sensitivity of cloud games?


– RQ1.2: Can games be classified regarding their delay sensitivity based on expert judgements?
– RQ1.3: Can the delay sensitivity classification improve gaming QoE prediction?
• RQ2: What is the impact of time-varying parameters of delay on gaming QoE?
– RQ2.1: What is the impact of jitter on gaming QoE?
– RQ2.2: What is the impact of serial-position effects on gaming QoE?
– RQ2.3: What is the impact of frequent delay variation on gaming QoE?
• RQ3: Can game adaptation techniques mitigate the influence of delay using game characteristics?
– RQ3.1: Can adaptation techniques compensate for the influence of delay?
– RQ3.2: Can adaptation techniques compensate for the influence of delay variation?

This book aims to identify the important characteristics that make a game sensitive to delay and to propose an expert evaluation method in which experienced players can quantify those gaming content characteristics (RQ1.1). The expert evaluation method enables the development of a delay sensitivity classification based on a decision tree (RQ1.2). In addition, a model is developed that considers game content information to predict the gaming QoE degraded by delay (RQ1.2). Furthermore, the research aims to gain knowledge about the influence of time-varying parameters of delay, including variations in the amount of delay (RQ2.1), the position and pattern of delay (RQ2.2), and the frequency at which delay occurs (RQ2.3). Finally, the book aims to use the characteristics and knowledge gained to develop techniques for mitigating the influence of delay (RQ3.1) and delay variation (RQ3.2) on gaming QoE. Due to the very high complexity of cloud gaming systems and the many possible influencing factors, a reduction of the scope of the research is required. The research presented in this work enables predicting and improving the gaming QoE, without focusing on the design of a game, but rather on mitigating the impact of delay on gaming QoE.
With regard to the range of parameters, the scope of the project is limited to the influence of delay and delay variation; other factors such as packet loss and bandwidth are not considered. In the subjective experiments, only casual gamers participated in the studies. This was done because participants with no prior gaming experience might not have the competence to judge the influence of delay realistically, while core gamers may be overly sensitive and critical even of low delays, and they are not the main target group of cloud gaming services. Therefore, the presented models and results are limited to casual gamers and might not be applicable to core gamers. In addition, the influence of social factors, especially for multiplayer games, is not considered within the scope of this work. With regard to the system used, even though some findings of this book may apply to traditional online gaming, the focus of this book is on cloud gaming. Moreover, Virtual Reality (VR) games displayed


on Head-Mounted Displays (HMDs), small display sizes such as mobile and tablet screens, and input devices other than mouse and keyboard are not considered within the scope of the presented work.

1.3 Structure of the Book

Following this chapter, Chap. 2 provides an introduction to the QoE concept with a focus on QoE in the context of cloud gaming, including influencing factors and quality features. Afterwards, methods for gaming QoE assessment, with a focus on subjective evaluation in laboratory and crowdsourcing environments, are investigated. Next, Chap. 3 starts with an introduction to the influence of delay on cloud gaming QoE and identifies the characteristics influencing the delay sensitivity of cloud games. Further, these characteristics are employed to derive a classification of games based on their delay sensitivity. Following this, the classification is validated based on several datasets. Using these datasets, the impact of delay is modeled to predict gaming QoE. The chapter closes with a discussion of the influence of delay on in-game performance, and the relationship between delay, performance, and QoE is investigated in a combined way. Chapter 4 investigates the influence of time-varying delay on gaming QoE: when the amount of delay varies (also known as jitter), when it occurs at various moments (serial-position effects), and when the pattern of variation in terms of the delay frequency differs. Chapter 5 proposes a latency compensation technique that mitigates the impact of delay on gaming QoE. The mitigation technique uses a series of game characteristics identified in Chap. 3 and adapts the game according to these characteristics. The performance of the mitigation technique is evaluated in different scenarios in which delay and delay variations are simulated. Finally, Chap. 6 provides a summary of the main contributions of the previous chapters and answers the RQs. The book concludes with a discussion of the limitations of the presented work and directions for future work.

Chapter 2

Gaming QoE Assessment

2.1 Introduction

The previously described growth of the market is not limited to the gaming industry; all multimedia markets have seen tremendous growth in the last couple of decades, which has increased the competition over delivering better service quality to users. Therefore, the need to evaluate the quality of multimedia services became urgent. As Lord Kelvin reportedly stated, “If you can’t measure it, you probably can’t improve it”. This chapter provides a short overview of quality assessment for multimedia services with a focus on the cloud gaming service. The chapter starts with Sect. 2.2, which provides an introduction to the architecture of a cloud gaming service. Next, the concepts of quality, Quality of Service (QoS), and QoE are elaborated in Sect. 2.3. With the definitions of QoE and cloud gaming in place, Sect. 2.4 presents a taxonomy of gaming QoE in cloud gaming, followed by an investigation of the factors influencing cloud gaming QoE in Sect. 2.5 and the quality aspects in Sect. 2.6. Afterwards, Sect. 2.7 reviews assessment methods for conducting subjective studies in laboratory and crowdsourcing environments. Finally, the chapter is summarized in Sect. 2.8.

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 S. S. Sabet, The Influence of Delay on Cloud Gaming Quality of Experience, T-Labs Series in Telecommunication Services, https://doi.org/10.1007/978-3-030-99869-1_2


2.2 Cloud Gaming

Juul [4] defines a game by six features, namely rules, outcomes, valorization of outcomes, players’ effort, players’ attachment to the outcome, and negotiable consequences. This definition is also embedded in ITU-T Rec. G.1032 [5] as follows:

Game A game is a rule-based system with a variable and quantifiable outcome, where different outcomes are assigned different values, the player exerts effort in order to influence the outcome, the player feels emotionally attached to the outcome, and the consequences of the activity are optional and negotiable [4].

Digital games were traditionally played on local devices (Personal Computers (PCs), consoles, etc.), but with the widespread availability of the internet, multiplayer online gaming became very popular. Online gaming is defined by ITU-T Rec. G.1032 [5] as:

Online Gaming A service that enables a video game to be either partially or primarily played over a broadband network. The service renders the game at the client device while the updated states of game are transferred over a broadband network.

While in online games the game is rendered on the client device, games can be fully executed on a server in the case of a cloud gaming service. Cloud gaming is a new service that merges the two concepts of gaming and cloud computing. Cloud computing, also known as on-demand computing, is a shift of resources such as storage and computing power from locally installed programs and resources to geographically remote servers. Cloud computing is defined by the U.S. National Institute of Standards and Technology (NIST) [6] as:

Cloud Computing A model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.


Combining the game and cloud concepts yields cloud gaming, which emerged in recent years due to the existence of fast and reliable core networks and cloud infrastructures. Cloud gaming is defined by ITU-T Rec. G.1032 [5] as:

Cloud Gaming Cloud gaming is characterized by game content delivered from a server to a client as a video stream with game controls sent from the client to the server. The execution of the game logic, rendering of the virtual scene, and video encoding are performed at the server, while the client is responsible for video decoding and capturing of client input.

Figure 2.1 shows the framework of a cloud gaming service in an abstract way. The game engine is located in the cloud server and consists of several components; the game logic and the scene renderer are depicted in the figure due to their relevance. The game engine receives the user inputs from the client and generates the game events and logic accordingly. Afterwards, the scenes and audio are generated and rendered. The rendered scenes are captured and compressed by the audio/video encoders before being sent over the network. Video codecs such as H.264, H.265, VP9, and AV1 are typically used for encoding. For acceleration

Fig. 2.1 The framework of a cloud gaming system. [The figure depicts a cloud server hosting the game engine (game logic and scene renderer), an input interpreter, an audio/video encoder, and packetization, connected via the data center network and a broadband network (through the client’s gateway) to a thin client that performs input capture, audio/video decoding, and audio/video output.]


and to reduce delay, video encoders use hardware implementations of encoding, such as the NVIDIA Encoder (NVEnc) for H.264/MPEG-4 AVC. After encoding, the compressed video is packetized and sent to the client as a video stream. The protocol used for sending these packets differs across cloud gaming services and is mostly customized. A packet trace captured with Wireshark1 on Google Stadia, PlayStation Now, and Nvidia GeForce Now showed [7] that Stadia uses the standard version of Web Real-Time Communication (WebRTC) over the User Datagram Protocol (UDP), with one Real-time Transport Protocol (RTP) stream for both audio and video transmission and another RTP stream for retransmissions. GeForce Now, in contrast, uses separate RTP flows for audio and video: it first establishes the connection using Transport Layer Security (TLS) over the Transmission Control Protocol (TCP) and, once the connection is established, starts sending packets over multiple UDP flows carrying RTP streams. PlayStation Now, unlike Stadia and GeForce Now, does not use any documented protocol and sends its packets over a customized UDP flow rather than RTP. The client side, also known as the thin client, is responsible for decoding the encoded audio/video coming from the cloud server. Decoding can also be done using hardware accelerators to speed up the process and reduce delay. After decoding, the audio and video are sent to output devices such as monitors, haptic devices, and audio devices. The thin client is also responsible for capturing and sending the user inputs to the server. Here, too, services employ different protocols: Stadia uses Datagram TLS, while GeForce Now and PlayStation Now created their own customized sockets over UDP to send datagrams.
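The RTP framing mentioned above can be made concrete by parsing the fixed 12-byte RTP header defined in RFC 3550. The following is a minimal, service-agnostic sketch; the sample header bytes at the end are hypothetical and not taken from any real capture:

```python
import struct

def parse_rtp_header(data: bytes) -> dict:
    """Parse the fixed 12-byte RTP header (RFC 3550)."""
    if len(data) < 12:
        raise ValueError("RTP header is at least 12 bytes")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", data[:12])
    return {
        "version": b0 >> 6,          # always 2 for RTP
        "padding": bool(b0 & 0x20),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,   # e.g., dynamic PTs for H.264/VP9
        "sequence": seq,             # used to detect loss and reordering
        "timestamp": ts,             # media clock, drives jitter estimation
        "ssrc": ssrc,                # distinguishes streams (audio vs video)
    }

# Hypothetical header: version 2, payload type 96, seq 1, ts 3000, SSRC 0x1234
sample = struct.pack("!BBHII", 0x80, 96, 1, 3000, 0x1234)
print(parse_rtp_header(sample)["payload_type"])  # 96
```

The sequence number and timestamp fields are exactly what allow a receiver, or a tool like Wireshark, to distinguish loss, reordering, and jitter in such streams.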

2.3 Concept of QoE

There has long been no agreement on the meaning of the term quality, and it has been redefined multiple times. In the multimedia domain, the term has mostly been used for years by the networking community in the form of QoS, defined in ITU-T Recommendation E.800 [8] as:

Quality of Service Totality of characteristics of a telecommunications service that bear on its ability to satisfy stated and implied needs of the user of the service.

The “totality of characteristics” refers to a set of guaranteed performance characteristics of a network connection. This definition is from the perspective of

1 https://www.wireshark.org/.


the service providers. Although in many cases it can be closely related to user satisfaction, it does not take into account the users’ experience and expectations. For example, playing on first-generation consoles or watching a black-and-white TV was considered a high-quality service years ago. However, it is not acceptable nowadays for people who play games at 4K resolution on a PlayStation 5. These examples illustrate that QoS, although a well-established term, is not sufficient and does not reflect users’ perceived quality, as it does not consider user and contextual factors. Therefore, the members of the COST Action QUALINET defined the Quality of Experience (QoE) [2] as:

Quality of Experience The degree of delight or annoyance of the user of an application or service. It results from the fulfillment of his or her expectations with respect to the utility and / or enjoyment of the application or service in the light of the user’s personality and current state.

In this definition, an application is defined as “a software and/or hardware that enables usage and interaction by a user for a given purpose. Such purpose may include entertainment or information retrieval, or other”, and a service is defined as “an episode in which an entity takes the responsibility that something desirable happens on behalf of another entity.” The QoE definition takes into account the user factors and expectations as well as the contextual factors that were neglected in the QoS definition. QoE in the context of gaming is referred to in this book as “gaming QoE”, meaning the general definition of QoE where the application/service is cloud gaming, the users are gamers, and the quality elements and features important in the context of gaming are considered.

2.4 Gaming Taxonomy

Unlike productivity-related applications, which are designed to prevent any challenges for their users, games are commonly designed to be challenging. Therefore, standard methods for determining QoE in multimedia applications are not easily applicable in the context of computer gaming. Computer gaming is a human-machine interaction activity in which players are emotionally attached to the outcome of their activities. In addition, the user input devices, as well as the characteristics of the user, can significantly impact the user-perceived QoE. Thus, in [9] a taxonomy of QoS and QoE aspects related to computer gaming has been proposed, as shown in Fig. 2.2. The gaming taxonomy has a multilayer structure consisting of influencing factors, interaction performance metrics, and quality features. The influencing factors, or quality factors, are defined by the QUALINET white paper [2] as:


Fig. 2.2 Gaming taxonomy from Müller et al. [9] and Schmidt [10]. [The figure spans from QoS to QoE in three layers: quality factors (user, system, and context factors); interaction performance, covering game output transmission, user control transmission, output and input performance, generation and interpretation performance, communication, interface, and backend aspects on the system side, perceptual effort, cognitive workload, and physical response effort on the user side, and game reaction, game logic, and game control performance on the game side; and quality features, divided into game-related aspects (output quality, input quality, interaction quality, playing quality, appeal, system personality, aesthetics, novelty, interactive behavior, learnability, intuitivity, hedonic and pragmatic aspects) and player experience aspects (involvement, presence, immersion, tension, positive and negative affect, challenge, competence, flow and absorption), together with acceptability.]

Influencing Factor Any characteristic of a user, system, service, application, or context whose actual state or setting may have influence on the Quality of Experience for the user.

The influencing factors at the top of the taxonomy can be categorized into three groups: user, system, and context factors. These factors, which influence QoS and QoE aspects, are explained in detail in Sect. 2.5. The next layer considers the interaction performance and represents the interaction between the user and the system. The interaction performance in the gaming taxonomy is inspired by the corresponding taxonomy for task-oriented multimodal interaction, in which the interaction performance aspects were defined separately for the system and for the user [11]. It describes the behaviors and processes between the user and the system during the interaction. The system performance in gaming is divided into three categories: interface performance (e.g., the devices and software being used), backend performance, and game performance (e.g., the story behind the interaction).


The bottom layer of the taxonomy describes quality features, which are presented in Sect. 2.6 and defined in [12] as:

Quality Feature A perceivable, recognized, and nameable characteristic of the individual’s experience of a service that contributes to its quality.

2.5 Gaming QoE Influencing Factors

This section describes the cloud gaming influencing factors as described in ITU-T Rec. G.1032 [5]. Knowledge about these parameters is essential for gaming QoE assessment. The parameters are either the factors under investigation in an experiment or must be taken into account when conducting subjective experiments. They are categorized into user (human)-related, system-related, and contextual factors that influence the cloud gaming QoE.

2.5.1 Human Factors

Based on the definition of the QUALINET whitepaper [2], human influencing factors are “any variant or invariant property or characteristic of a human user. The characteristic can describe the demographic and socio-economic background, the physical and mental constitution, and the user’s emotional state” [2]. Static and dynamic human factors, such as demographic information (age, gender) and factors related to the player’s mood, such as emotional states and motivations, can influence the gaming QoE. Vermeulen investigated the influence of gender in gaming and found that women are generally more annoyed by violence, sexuality, and complexity in game content, but there was no difference between men and women who had prior experience with these games [13]. However, Manaro et al., in another study, investigated the influence of age, gender, and preference, and their results indicated that it is neither gender nor age that makes the difference, but the users’ preferences [14]. Intrinsic and extrinsic motivations of a user can have a strong influence on gaming QoE, as various games provide different types of fun and motivation. While some people may enjoy a horror game and prefer to play those kinds of games, other users’ preferences lie with less violent games, e.g., Overcooked. The playing style of a user is another human factor linked to the user’s motivation. Bartle differentiates between “achiever”, “explorer”, “socializer” and “killer” playing styles, each with its own motivations [15]. A user with a “killer” playing style


preference tends towards, e.g., FPS games, which give the possibility to kill others, while an “explorer” might like to play, e.g., an open-world game. Another human influencing factor is gaming experience, which categorises users based on their prior experience in gaming. Categorizations such as “hardcore” and “casual” gamers [16], or “newbie” and “pro”, are defined with respect to gaming experience. However, experience may be specific to a particular game or game genre. It has been shown that players with higher experience are more sensitive to the degradations caused by impaired conditions such as low encoding bitrate [17] and network delay [18]. Human vision and reaction time are further human-related factors; users with lower visual and auditory acuity might perceive degradations less strongly.

2.5.2 Context Factors

Context factors are defined as “factors that embrace any situational property to describe the user’s environment in terms of physical, temporal, social, economic, task, and technical characteristics” [19, 20]. For cloud gaming, they include the physical environment in which the game is played, in terms of the usage situation (home, office, on the move, public places) and the room conditions (lighting, noise, acoustics); the transmission channel, in terms of privacy and security issues; parallel activities of the users; the social context, i.e., the relationship of the player with others who watch the gameplay either passively or actively, as in a co-operative mode, or with an enemy who is either physically present or online; and service factors, which include aspects such as pricing, offers, availability, restrictions, and ease of access to the service.

2.5.3 System Factors

System factors refer to properties and characteristics that determine the technically produced quality of an application or service [20]. The QUALINET whitepaper [2] categorises the system factors into four subcategories: (1) media-related system influencing factors, referring to the compression and video encoder configurations; (2) device-related system influencing factors, referring to the input and output devices used in the cloud gaming setup; (3) network-related influencing factors, referring to data transmission over a network and network impairments; and (4) content-related system influencing factors, referring to the game and its characteristics. In the following, each of these four system influencing factors is discussed.


Media-Related System Influencing Factors

These influencing factors include encoding, frame rate, and resolution. Due to the high bandwidth requirements of cloud gaming systems, videos must be compressed using encoders before being streamed to the client. Various cloud gaming services use different video codecs; e.g., Google Stadia is known to use VP9, while many other service operators, such as Steam Remote Play, use H.264. Different video encoders can lead to different performance in gaming applications depending on their implementations [21]. This performance impacts not only the video quality but also the encoding time. Implementations such as NVEnc, the hardware-accelerated video encoder, try to reduce the encoding time and thereby the delay created in the encoding process. The frame rate is another factor that has a significant impact on the gaming QoE, with respect to both interaction quality and video quality [22, 23]. Hong et al. showed that at a fixed encoding frame rate, a higher bit rate always leads to higher quality. Resolution is also an important factor, but it mainly impacts the video quality rather than the interaction quality [24].
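The interplay of bit rate and frame rate noted above can be made concrete: at a fixed frame rate, the average bit budget per frame is simply the bit rate divided by the frame rate, so raising the frame rate without raising the bit rate shrinks each frame's budget. A minimal sketch (function name and values are illustrative):

```python
def bits_per_frame(bitrate_mbps: float, fps: int) -> float:
    """Average encoded size per frame in kilobits,
    assuming the encoder spreads the bit rate evenly across frames."""
    return bitrate_mbps * 1000 / fps

# A hypothetical 10 Mbps stream: 30 fps leaves ~333 kbit per frame,
# while 60 fps halves the per-frame budget to ~167 kbit.
print(round(bits_per_frame(10, 30)))  # 333
print(round(bits_per_frame(10, 60)))  # 167
```

This simplification ignores rate-control dynamics (I-frames are far larger than P-frames), but it shows why frame rate and bit rate must be traded off jointly.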

Playing Devices

Playing devices, including the input and output devices, can have a significant impact on the QoE. For video quality assessment, it has been shown [25] that the perceived quality is influenced by factors such as the viewing distance, display size, brightness, contrast, sharpness, screen resolution, refresh rate, and color. Beyer investigated screen sizes for mobile gaming and showed that there is an acceptance threshold somewhere between 3.27" and 5" [26, 27]. If the screen size is larger than this threshold, gaming QoE does not increase significantly further within the range of tested screen sizes. For PC display monitors, a size of 24" was chosen in G.1072 [23] for gaming studies. In addition, it has been shown that users who wear a VR headset perceive higher levels of immersion and presence compared to users of a display monitor [28]. Moreover, input devices are crucial in terms of the input modality. The influence of choosing various input devices such as touch input, mouse, and gamepad was investigated in [29], which found no significant influence on the perceived QoE. However, these results were limited to the games used in the study and are hardly generalizable; depending on the task at hand and user preference, a proper apparatus can have a strong impact on user performance and gaming QoE. The devices themselves can also introduce delay (e.g., a wireless device). A survey of real-world gaming setups found overall local latencies ranging from 23 to 243 ms [30].


Network

The network connection strongly influences the cloud gaming QoE. Generally, cloud gaming requires high bandwidth and a low-latency network in order to run smoothly and create a good QoE for its users. The four main degradations that need to be taken into account are delay, jitter, limited bandwidth, and packet loss. The negative impact of low bandwidth on gaming QoE has been shown in several studies [31, 32]. It mostly impacts the output quality (video and audio quality). The bandwidth requirement of cloud gaming services can be reduced by compressing the video. However, when the bandwidth is low, artifacts appear, such as low video quality (blurry and blocky artifacts), low video resolution, low frame rates, or even additional delay due to buffering. Packet loss also mostly impacts the output quality, and its effects have been investigated in [33] and [34]. However, in some cases it causes very low frame rates, which can impact the interaction quality (IQ) and create difficulties in controlling the game. Cloud gaming services employ various techniques such as forward error correction (FEC), video concealment methods (e.g., frame copy, motion copy), and retransmission to conceal packet loss. For example, Google Stadia uses a designated RTP stream to retransmit lost packets. Retransmission, however, can lead to additional delay. Most important for the IQ are delay and jitter. Delay can come from different sources, including the user, the server, and the transmission. Choy et al. [35] formulate the delay in cloud gaming as Eq. 2.1:

T = tclient + (taccess + tisp + ttransit + tdatacenter) + tserver

(2.1)

where the bracketed sum constitutes the network delay tnetwork.
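The decomposition of Eq. 2.1 can be illustrated with a small numeric sketch; all per-component values below are hypothetical round figures chosen to be consistent with the ranges discussed in the text, not measurements:

```python
# Hypothetical per-component delays (ms) for the Eq. 2.1 decomposition.
delays_ms = {
    "t_client": 30,      # input capture, decoding, display output
    "t_access": 40,      # access link to the first router
    "t_isp": 5,          # ISP core network
    "t_transit": 15,     # ISP to data-center switches
    "t_datacenter": 1,   # intra-data-center switching
    "t_server": 20,      # game engine, capture, encoding
}

# t_network is the sum of the four transmission components.
t_network = sum(delays_ms[k] for k in
                ("t_access", "t_isp", "t_transit", "t_datacenter"))
T = delays_ms["t_client"] + t_network + delays_ms["t_server"]
print(f"t_network = {t_network} ms, end-to-end T = {T} ms")
# t_network = 61 ms, end-to-end T = 111 ms
```

Even with a fast data center and ISP core, the client-side and access-link components can dominate the end-to-end budget, which is why local setups matter as much as the network path.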

tclient consists of all the delay created in the processes of capturing the user input, decoding video and audio delay, and providing the feedback in the output devices, which ranges from 23 to 243 ms [30] in real-world systems. The transmission delay tnetwork consists of several components. The first one is taccess which is the time consumed by the transmission of packets to the first internet connection router. The average access delay on a loaded link exceeds 40 ms [36]. In addition, home gateways add extra delay depending on the number of active users (multi-user households) and WiFi signals. The second section of the delay in the network comes from the data transmission time between the access router and the peering point connection of Internet Service Providers (ISPs). However, ISPs networks are generally fast and reliable unless congestion due to high traffic occurs. ttransit is the data transmission time between the ISPs and the datacenter network switches. This time depends highly on the geographical distance and numbers of hops between the cloud server and the ISPs. Also, tdatacenter is the transmission time between the core switches to the access switches of the server. Network latency between two servers in modern datacenters is typically below 1 ms [37]. However the virtualization required for the cloud gaming services itself adds delay. Finally, tserver is the delay
from all the necessary processes performed by the cloud gaming servers, including game engine delay, capturing delay, and audio and video encoding delay. Regardless of where the delay comes from, it can negatively impact the user experience. The influence of delay on online and cloud gaming QoE has been investigated in several studies [38]. Early studies relate to online gaming and the inconsistency of objects' positions in the presence of delay, which decreases the QoE by creating paradoxical situations [3]. Jarschel et al. showed that delay negatively impacts gaming QoE, but that its impact has to be put into context with the content [34, 39]. Quax et al. [40] showed that delay has a different influence in different game genres; for instance, FPS games are more sensitive to delay than platform games. Beyer et al. [27] showed that within the same genre, depending on the game rules and implementation, the sensitivity of games may differ significantly. Moreover, Schmidt et al. [41] showed that different scenarios might lead to different delay sensitivities even within the same game. In addition to gaming QoE, delay affects the gamer's performance [42–44], but this has also been shown to be game dependent. The variation in delay, known as jitter, can arise for different reasons, such as out-of-order delivery and variation in the parameters mentioned above. The majority of the research on the influence of jitter is limited to online gaming. In [45], Quax et al. conducted a subjective experiment on an FPS game in an online gaming setting. They concluded that players feel impaired by delay, but that jitter does not play as prominent a role as delay in the acceptance of the service. Furthermore, Beznosyk et al. investigated jitter in online gaming and showed that high levels of jitter, such as 100 ms on top of a 200 ms fixed delay, negatively influence the gaming QoE [46].
However, to the best of our knowledge, there is no work dedicated to investigating the influence of jitter on cloud gaming QoE.
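For illustration, the delay decomposition of Eq. 2.1 can be sketched as a short computation; the component values below are hypothetical placeholders, not measurements from any real system.

```python
# Sketch of the end-to-end delay decomposition of Eq. 2.1 (Choy et al. [35]).
# All component values are hypothetical placeholders in milliseconds.

def total_delay_ms(t_client, t_access, t_isp, t_transit, t_datacenter, t_server):
    """Sum the client, network, and server delay components."""
    t_network = t_access + t_isp + t_transit + t_datacenter
    return t_client + t_network + t_server

# Example: an assumed, plausible breakdown for one cloud gaming session.
T = total_delay_ms(t_client=40, t_access=15, t_isp=5,
                   t_transit=20, t_datacenter=1, t_server=30)
print(T)  # 111
```

Such a breakdown makes explicit that even when each component looks small in isolation, the end-to-end delay experienced by the player is their sum.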

Game
The game, as the core component of any gaming system, is identified as one of the most dominant influencing factors in many previous studies. The influence of many other factors, such as network delay, is found to be mediated by the type of game. For example, a game scenario with many dull moments, such as in a typical card game, is not as sensitive to network delay as a shooting game. However, a game classification targeting these effects is still missing. Classifications such as game genres are too broad and not precise enough for categorizing games. Wolf [47] defined 42 different genres, but many games belong to several of these genres. Djaouti et al. [48] classified games based on the game rules and goals, and defined ten game bricks such as move, shoot, avoid, and explore. Lee et al. [49] classified games based on their scoring system, which they categorized using three characteristics: preservability, controllability, and relationship to achievement. Zadtootaghaj et al. classified games with respect to their video complexity by means of a decision tree [50]. The authors used a focus group to identify characteristics possibly influencing the
video complexity and then built a decision tree using these characteristics, which include Degree of Freedom (DoF), Movement Type (MT), Camera Pace (CP), Amount of Camera Movement (ACM), Texture Details (TD), Object Number (ON), Color Diversity (CD), Static Areas (STAR), and Color Redundancy (CRR). The classification was later improved in collaboration with the author of this book to enable content classification based on framerate sensitivity as well as video complexity [51]. Fitts defined Fitts' law using two characteristics: the time to aim at a target depends on the amplitude (the distance from the cursor to the target) and the width of the target. Fitts' law was later adapted [52] for two- and three-dimensional targets [53, 54], as well as for moving targets [55]. Claypool, in another work, modeled moving target selection affected by delay [56]. Furthermore, Eg et al. [57] showed that players' performance under delay depends on the task difficulty, and they modeled the influence of delay on target selection using the speed of the target [58]. Claypool et al. [59] categorized games with respect to their delay sensitivity using two game characteristics, precision and deadline. Sackl et al. [60] provided a list of metrics defining the sensitivity of games, such as the required number of actions, maximum successful time, reaction time, and predictability of actions. However, no instructions were given to quantify those metrics, and an investigation related to delay sensitivity is still missing.
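Fitts' law mentioned above is commonly written as MT = a + b · log2(2A/W), where A is the amplitude and W the target width. A minimal sketch, with illustrative (not empirically fitted) constants a and b:

```python
import math

def fitts_movement_time(a, b, amplitude, width):
    """Fitts' law: time to acquire a target of a given width at a given
    distance (amplitude). a and b are empirically fitted constants that
    depend on the user and the input device."""
    index_of_difficulty = math.log2(2 * amplitude / width)  # in bits
    return a + b * index_of_difficulty

# Illustrative, assumed constants: a = 0.1 s, b = 0.2 s/bit.
# Doubling the distance (or halving the width) adds one bit of
# difficulty, i.e., b extra seconds of aiming time.
t_near = fitts_movement_time(0.1, 0.2, amplitude=100, width=50)  # ID = 2 bits
t_far = fitts_movement_time(0.1, 0.2, amplitude=400, width=50)   # ID = 4 bits
```

The logarithmic index of difficulty is what the adaptations cited above extend to 2D/3D and moving targets.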

2.6 Quality Features
The quality features at the bottom of the gaming taxonomy in Fig. 2.2 are categorized into game-related aspects and player experience aspects. The quality features are inspired by the Game Experience Questionnaire (GEQ) in [61], and they are discussed in the following sections.

2.6.1 Game-Related Aspects
As illustrated in the gaming taxonomy, the game-related aspects are categorized into appeal and aesthetic characteristics, playing quality, and, most importantly for this book, the interaction quality.

Interaction Quality
This quality feature can be considered the core of the gaming QoE. Engl [62] defines the term interaction quality, also known as "the playability of the game", as "the degree to which all functional and structural elements of a game (hardware
and software) provide a positive player experience". The interaction quality in the gaming taxonomy consists of the input and output quality, which refer to the input of the users in terms of game commands, the output of the system to the players in terms of audio/video quality, and the interactive behaviour of the player. The output quality can be measured using subjective and objective methods; several video quality models exist to assess video quality in cloud gaming [32, 63–65]. The IQ is the main quality feature for examining the influence of network parameters such as delay and jitter on a cloud gaming service. It can be assessed using subjective studies and is the quality feature that captures the impact of delay most. Therefore, it is the quality feature assessed in this book.

Playing Quality
The term "usability of gaming" has inspired this feature, which takes into account the pragmatic aspects of gaming services. Game usability is defined by Pinelle et al. [66] as "the degree to which a player is able to learn, intuitively control, and understand a game." The usability of games is based on the concepts of effectiveness and efficiency, which are not easily applied to gaming, as they do not address features such as entertainment and engagement. Furthermore, games are typically designed to be challenging, which contrasts with effectiveness and efficiency. Therefore, the term "playing quality" is used in the gaming taxonomy instead of "game usability".

Appeal
Aesthetics and appeal are considered the sensory experience that the system elicits and the extent to which this experience fits an individual's goals and spirit [67]. These quality features in the gaming taxonomy consist of system personality, which refers to the user's perception of the system, aesthetics, and novelty, which together result in the appeal [68, 69].

2.6.2 Player Experience Aspects
The player experience aspects in the gaming taxonomy are adapted from the definition of Poels et al. [70], which divides them into the sub-aspects challenge, control, flow, tension, immersion, competence, positive affect, and negative affect. Murphy [11] associated the positive affect with fun and feelings such as engagement, enjoyment, and cheer, and the negative affect with feelings such as boredom, frustration, and tension, which typically arise when the player's need for challenge is not satisfied or the game is too challenging. Flow is associated with challenge and the player's ability, and can be high when the game challenge is in an optimal state. Chen [71] describes ideal zones of the
flow when the capabilities of the users match the game challenges. If the challenge is too high, the flow experience is destroyed, leading to anxiety of losing followed by frustration, whereas a too easy game ends in boredom. Flow is also related to player expertise: gamers with higher capabilities demand a higher challenge than less skilled gamers. According to Csikszentmihalyi [72], the broader requirements for creating a state of optimal experience include a balance between the game challenges and the user's skills; a merging of action and awareness, where users perform actions almost spontaneously; a clear goal of the game; immediate feedback that helps the user progress towards the goal; concentration on the task at hand while paying less attention to other things; the paradox of control, where the user believes to have full control of the game actions; loss of self-consciousness; and a transformation of time, where the user's perception of time is distorted [73]. Absorption is a concept close to flow; being absorbed refers to being in a state of deep attention to the experienced event [74]. Webster and Ho argue that absorption is "identical to flow, just without the dimension of control". Presence and immersion are further quality factors among the player experience aspects. Presence is a psychological state of "being there." It is mostly related to virtual reality gaming, where the sensory information can convey the feeling of being there. Immersion is a well-known term in gaming and, recently, in other multimedia applications. However, similar to the term quality, there seems to be no agreement on the definition of immersion, and it is often confused with presence. Schmidt et al. [75] defined mental immersion as "a state of the user linked to mental absorption, mediated by the game content including narrative elements and challenges".
From the system point of view, immersion can be defined as "the degree to which immersive media environments submerge the perceptual system of the user in computer-generated stimuli. The more the system blocks out stimuli from the physical world, the more the system is considered to be immersive" [76, 77]. Immersion in gaming is influenced by the degree of involvement. Brown and Cairns defined three stages towards being totally immersed: "engagement", "engrossment", and "total immersion" [78]. Engagement starts with learning the game and investing time by overcoming the barrier of preferences. In the second stage, the gamer needs to be emotionally attached to the game, and at this stage, s/he starts paying less attention to the surroundings. Finally, at the stage of total immersion, the player feels like being in the game and being cut off from reality [78, 79].

2.7 Methods for Assessing Gaming QoE
Having summarized the gaming taxonomy, the influencing factors, and the quality features, this section discusses the assessment of gaming QoE. Taking the influencing factors and quality features into account in the assessment of gaming QoE is very important, as they are either the factors being assessed or the factors that should be controlled when conducting an experiment. The traditional paradigms of evaluating QoE
of multimedia services make use of subjective assessment, which is typically conducted in a laboratory environment and, more recently, using crowdsourcing. This section first investigates the methods for assessing gaming QoE and the relevant standards and guidance for assessing the quality features, and then the environment and experimental setups required for conducting subjective experiments. Typically, in subjective quality assessments, a series of participants is invited and presented with a set of conditions, and they are asked to express their opinion on a rating scale. ITU-T Rec. P.10 [80] defines the assessment of quality as:

Quality Assessment
The process of measuring or estimating the QoE for a set of users of an application or a service with a dedicated procedure, and considering the influencing factors (possibly controlled, measured, or simply collected and reported). The output of the process may be a scalar value, a multidimensional representation of the results, and/or verbal descriptors. All assessments of QoE should be accompanied by the description of the influencing factors that are included. The assessment of QoE can be described as comprehensive when it includes many of the specific factors, for example, a majority of the known factors.

In addition to using a questionnaire for subjective assessment, there exist several other ways to assess gaming QoE, including behavioral assessment, psychophysiological assessment, and continuous quality assessment. Behavioral assessment in gaming, such as using actions per minute (APM) and in-game performance, does not necessarily reflect the user's self-judgment of the experience. In addition, these metrics are hardly generalizable, as the user interactions with the game and the scoring system differ from game to game. Psychophysiological assessment is another way to assess gaming QoE. Lee et al. [81] used facial electromyography (EMG) to measure players' experience and showed that, although it can indicate how dissatisfied a player is with the latency in one single game, similar to behavioral assessment the results are not comparable with other games and are hardly generalizable. In general, mapping between players' bio-signals, such as electrocardiography measurements, and their emotions is not a simple task and is still under study; in particular, various games can trigger different user emotions (e.g., horror games versus a happy game). Another approach is continuous assessment. Sabet et al. [82] proposed a continuous assessment that enables users to provide a rating in an approach similar to Single Stimulus Continuous Quality Evaluation (SSCQE) in ITU-R BT.500 [83], where players give their opinion scores using a foot pedal while playing the game.
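As an illustration of a behavioral metric, APM can be derived from a timestamped input log. This is a minimal sketch under an assumed data format (timestamps in seconds since session start), not the computation of any particular framework.

```python
def actions_per_minute(input_timestamps_s):
    """Compute actions per minute from a list of input event timestamps
    (in seconds, sorted ascending). Returns 0.0 for degenerate logs."""
    if len(input_timestamps_s) < 2:
        return 0.0
    duration_s = input_timestamps_s[-1] - input_timestamps_s[0]
    if duration_s <= 0:
        return 0.0
    # Count the intervals between events, scaled to one minute.
    return 60.0 * (len(input_timestamps_s) - 1) / duration_s

# Example: 121 evenly spaced inputs (one every 0.5 s) over 60 s.
timestamps = [i * 0.5 for i in range(121)]
print(actions_per_minute(timestamps))  # 120.0
```

As the text notes, such a value is only meaningful relative to the same game and scoring system; it is not comparable across games.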
The subjective assessment of gaming content using a questionnaire is standardized by ITU-T Rec. P.809 [67]. It is divided into two categories: a passive viewing and listening paradigm, and an interactive paradigm. The passive paradigm is used when the impairment does not influence the users' interaction, for example, when watching a video of the games on Twitch.tv. It is very similar to standardized video quality assessment and should be used when the output quality is targeted. In the interactive paradigm, the participants play a game scenario and then express their opinion through the provided assessment method. This paradigm should be used when quality features such as IQ, immersion, and flow are to be measured. The subjective studies conducted for this book follow the guidance of this test paradigm, which is detailed in the following. ITU-T Rec. P.809 defines two types of interactive study depending on the aim of the study. Short interactive tests are designed to be used:

Short Interactive
In a short interactive test, in which a typical stimulus (interactive game play) length is between 90–120 s, it is possible to assess the interaction quality (e.g., the impact of delay on the control), but the assessment of more complex player experience features highly depends on the player and the game content. Games that meet the interest of the player and are intuitive without being boring can already immerse players within a short time of a few minutes.

Moreover, long interactive tests are designed to be used:

Long Interactive
When aiming for all QoE aspects mentioned in the gaming taxonomy, no recommendation for an ideal stimulus length is available so far, but it is reasonable to use a duration of 10–15 min to ensure that players get emotionally attached to a game scenario while aiming to measure emotions and other QoE aspects such as flow.

Therefore, the tests conducted in this book use the short interactive paradigm, as the impact of delay on the interaction quality is the parameter under investigation. To ensure that the participants have similar knowledge of the game and to reduce learning effects, information about the test design as well as the rules and controls of the game should be given to the participants. In addition, it is recommended to include a training session for all the games. The training session should use a (non-degraded) reference condition, and it may additionally show degraded conditions in order to anchor the use of the rating scale(s). For the assessment methods (e.g., questionnaires), questions should be explained before starting the test, and it should
be mentioned that participants should not rate the graphic quality (graphical details, abstract or realistic graphics) but rather the artifacts. The instruction could be a printed version or an introductory video, which should include the following information:
• Information about experimental details, such as session duration and assessment methods (questionnaire details).
• Information about how to control the game via an input device.
• Information about the game concept, the goal of the game, and a basic description of objects in the game such as enemies and obstacles.
• A note to the test participants that there are no "correct" ratings. The instructions should not suggest that there is a correct rating or provide any feedback as to the "correctness" of any response.
In an interactive gaming experiment, in addition to choosing the game, the game scenario has to be chosen. Schmidt et al. [41] showed that the game scenario is important for the delay sensitivity of a game; the scenes from the game must be representative of the chosen game, and they should be the same for all test participants. The scenario should not be too difficult or too easy for the participants. After playing the game scenario, the post-condition questionnaires are presented to the participants. The next section investigates the questionnaires for the evaluation of interactive gaming QoE.

2.7.1 Questionnaire
For the quality assessment of multimedia applications, summative subjective methods like the determination of the "overall quality" have been widely used and standardized. This method allows participants to express their overall opinion on the quality by rating on a given scale. Many multimedia assessment methods make use of a five-point Absolute Category Rating (ACR) scale, and the collected ratings of the participants are summarized by a Mean Opinion Score (MOS) [83]. To assess the other quality features discussed in the gaming taxonomy, several questionnaires exist that measure the various quality aspects named in the taxonomy. ITU-T Rec. P.809 [67] and Schmidt [10] reviewed a list of these questionnaires, some of which are presented in Table 2.1. These questionnaires assess quality aspects such as immersion, presence, and flow, but they are not suitable for assessing the IQ. Schmidt [10] therefore developed the Gaming Input Quality Scale (GIPS), which measures the IQ of a player interacting with a gaming system. The GIPS is a questionnaire particularly developed to measure the influence of network degradations on the cloud gaming experience, known as "input quality", rather than focusing on the players' emotional aspects or the game design. The GIPS aims to measure the IQ
Table 2.1 Overview of questionnaires for assessing gaming QoE aspects from [10]
• Cognitive Absorption Scale (CAS): cognitive absorption, ease of use, usefulness, personal innovativeness, playfulness, intention to use, self-efficacy [84]
• Engagement Questionnaire (EQ): interest, authenticity, curiosity, involvement, and fidelity [85]
• Flow State Scale (FSS): challenge-skill balance, action-awareness merging, clear goals, unambiguous feedback, concentration on task, paradox of control, loss of self-consciousness, transformation of time, autotelic experience [86]
• Game Experience Questionnaire (GEQ): immersion, flow, competence, positive and negative affect, tension, challenge [61]
• Igroup Presence Questionnaire (IPQ): spatial presence, involvement, realness [87]
• Presence Questionnaire (PQ): main: control, sensory, distraction, realism; sub: involvement, natural, auditory, haptic, resolution, interface quality [88]
• Immersive Experience Questionnaire (IEQ): person factors: cognitive involvement, real world dissociation, and emotional involvement; game factors: challenge and control [79]
• Player Experience of Need Satisfaction (PENS): competence, autonomy, relatedness, and intuitive controls; presence dimensions: narrative, emotional, physical [89]
• User Engagement Scale (UESz): focused attention, perceived usability, aesthetics, and satisfaction [90]

of a player interacting with a gaming system. Therefore, it is the ideal choice for the subjective studies conducted in this book. The GIPS consists of two sub-aspects, namely "Responsiveness" and "Controllability". Responsiveness is defined as:

Responsiveness
Responsiveness describes the temporal aspects of the feedback a player receives after performing an action, e.g., a mouse click or a keystroke. The response of the game (system) should be available immediately after the player performs an action (input event).
Moreover, controllability is defined as:

Controllability
The perceived controllability is the degree to which a player is able to control a game using the given input device and available interaction possibilities. It describes whether the performed input action resulted in the desired outcome. The controllability does not relate to the learnability of the controls nor to autonomy (freedom or power over something).

The GIPS uses a 7-point continuous scale, known as the extended continuous ACR (EC-ACR) scale, as recommended in ITU-T Rec. P.809. It is based on a Likert scale using five labels ranging from "strongly disagree" to "strongly agree". An example of an item and the scale used for the GIPS is shown in Fig. 2.3. In order to transfer the ratings back to a 5-point ACR scale and to use the MOS, the scale transformation by Köster et al. [91] can be used. The list of items in the GIPS questionnaire and their labels is presented in Table 2.2.
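As an illustration, the MOS is simply the arithmetic mean of the individual opinion scores. The 7-point to 5-point mapping below is a plain linear rescaling for illustration only, not necessarily the transformation proposed by Köster et al. [91].

```python
def mos(ratings):
    """Mean Opinion Score: arithmetic mean of individual opinion scores."""
    return sum(ratings) / len(ratings)

def rescale_to_5pt(rating, lo=1.0, hi=7.0):
    """Linearly map a rating on [lo, hi] onto the 5-point ACR range [1, 5].
    Simple linear rescaling for illustration; not necessarily the
    transformation of Köster et al. [91]."""
    return 1.0 + 4.0 * (rating - lo) / (hi - lo)

# Hypothetical 7-point ratings from four participants.
ec_acr_ratings = [2.5, 4.0, 5.5, 7.0]
acr_ratings = [rescale_to_5pt(r) for r in ec_acr_ratings]
print(round(mos(acr_ratings), 2))  # 3.5
```

A linear rescaling preserves the ordering of conditions; only the absolute MOS values shift with the chosen mapping.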

Fig. 2.3 Example of an item and scale used for the GIPS [10]: the statement "I felt that I had control over my interaction with the system." is rated on a scale with the labels "strongly disagree", "disagree", "undecided", "agree", and "strongly agree".

Table 2.2 List of GIPS items from Schmidt [10]
Controllability:
• CN1: I felt that I had control over my interaction with the system
• CN2: I felt a sense of control over the game interface and input devices
• CN3: I felt in control of my game actions
Responsiveness:
• RE1: I noticed delay between my actions and the outcomes
• RE2: The responsiveness of my inputs was as I expected
• RE3: My inputs were applied smoothly
• RE4: I received immediate feedback on my actions
• RE5: I was notified about my actions immediately


2.7.2 Experimental Setup of Laboratory Environment
When conducting lab studies, the recommendations of ITU-T Rec. P.910 [92] and ITU-T Rec. P.911 [93] should be followed. They provide details of environment specifications such as room size, viewing distance, luminance of the screen, chromaticity of the background, background room illumination, and background noise level. A neutral environment, e.g., a sound-shielded room with daylight imitation, is recommended, except when a real-life gaming situation, e.g., mobile gaming in a subway, is the purpose of the test. A vital issue in conducting gaming QoE studies is the design of the gaming platform itself. Therefore, a standardized cloud gaming platform should be defined when conducting QoE tests. It is recommended to run the cloud gaming platform in a local network free from any uncontrolled impairment, unless the study aims to measure a real network. Another important aspect to address is the differentiation of user devices. Prior to the study design, depending on the purpose of the study, the user devices should be determined, including consoles, smartphones, tablets, and personal computers. For choosing the participants, ITU-T Rec. P.809 [67] recommends screening based on human influence factors, especially the gaming experience, and the target group should be selected depending on the purpose of the test.

2.7.3 Experimenting Using Crowdsourcing Platform
The QoE assessment of multimedia services is traditionally conducted in laboratory environments. Although this offers a controlled environment, as discussed in the previous section, such experiments are often time-consuming, expensive, and conducted with a limited diversity of participants. Therefore, crowdsourcing has become very popular in recent years. It enables gathering a large amount of subjective data in a short time from a desired group of participants. Participants of such tests, referred to as (crowd)workers, are typically recruited via platforms such as Amazon's Mechanical Turk (MTurk), Microworkers, or Crowdee, and they solve short mini-tasks requiring some human intelligence. The workers are compensated with a monetary reward. While crowdsourcing is a valuable tool for gathering a large amount of data, its validity and reliability can be questioned, as the test is not conducted in a controlled environment and the experimenter cannot observe the participants. In recent years, crowdsourcing has gained much attention in the quality assessment of multimedia services such as speech, audio, and video [94, 95], and it was shown to be a valid and reliable tool in this domain. However, due to the special requirements of gaming studies, such as the need for a framework capable of running the games, storing the users' interactions, and following the guidance stated in ITU-T Rec. P.809, those frameworks and methods are not directly applicable to crowdsourced gaming studies. In [96], Schmidt, with the
Fig. 2.4 Crowdsourcing platform from Schmidt et al. [96]

author of this document's collaboration, developed a crowdsourcing platform to conduct gaming experiments following the recommendations of the ITU-T Rec. P.809 interactive paradigm. The framework of the developed crowdsourcing platform is shown in Fig. 2.4. It consists of two main components: the rating platform and the crowd gaming framework. The rating platform, which can be hosted on one of the crowdsourcing platforms (MTurk was chosen in this implementation), is responsible for recruiting participants, loading the questionnaire and instructions, forwarding participants to the game, and collecting participants' ratings. The crowd gaming framework, which is hosted on the experimenter's server, is responsible for hosting the games, storing and processing the users' interactions with the games, and generating a token if the users' interactions with the game are confirmed to be valid. In the following, the components of the crowdsourcing platform are discussed in more detail.

Rating Platform
The rating platform component of the framework is very similar to other crowdsourced assessments of multimedia services and is inspired by ITU-T Rec. P.808 [97]. It is recommended to have a qualification task before inviting the participants to the rating task. MTurk provides statistics to filter users, such as the
task approval rate (e.g., users with 98% task approval or more), the number of approved tasks, and the users' location and native language, which can be used for qualification tasks. However, these filters do not guarantee that the selected crowdworkers will perform well in the gaming tasks; therefore, as a gaming qualification task, a pre-study should be conducted in which the crowdworkers fill out a questionnaire regarding their experience in gaming. Users eligible to participate in the rating study can be selected based on Clause 8.3 of ITU-T Rec. P.809 [67], including the gaming experience of the users, weekly gameplay, age, and gender balance, aiming for an equal number of male and female participants in the study. Qualified participants can then proceed to the rating platform. In addition to the qualification task, only crowdworkers using a desktop (PC) or a laptop with a keyboard and mouse should be asked to participate in the task; however, depending on the study's purpose, the platform is designed to be responsive and capable of running on smartphones and tablets. At the beginning of the rating, the procedure of the test, including the instructions for the questionnaire and the expectations of the users, is explained clearly for each step. In addition, in case of an unforeseen issue or question, the framework provides the possibility to contact the experimenter via a given email address. After the instructions, participants are asked to fill out a pre-questionnaire regarding their demographic information, and in case the game scenario has sound, a stereo soundcheck could be performed (playing a stereo audio file of a Completely Automated Public Turing test to tell Computers and Humans Apart (CAPTCHA) and asking users to give the correct answer). Afterward, following ITU-T Rec. P.808 [97] and P.809 [67], a training session of the game is considered in the framework.
There, workers learn the rules and controls of the game. After the training, the rating task starts, in which participants play a scenario of the game and then express their opinion using the post-condition questionnaire. After playing each round of the test, in case of a valid and adequate engagement with the game, the crowd gaming framework generates a unique token. Workers are asked to enter this token in MTurk to verify that they actually played the game. MTurk allows workers who entered a valid token to provide a rating for that condition and to continue with the rest of the test.

Crowd Gaming Framework
The game is the core component of the interactive crowd gaming framework. Nine open-source games were implemented using the JavaScript p5 library, which offers a set of drawing functions; in addition, add-ons for interaction with other HTML5 objects are used. As a basis for the majority of the games, the "A game a day" project by Kael Kirk2 was used. JavaScript games were chosen as they are lightweight and can be run on any crowdworker's PC. As there was no access to the

2 https://github.com/Kaelinator/AGAD.
crowdworkers' network, network impairments such as delay and packet loss have to be implemented artificially; e.g., the delay can be simulated by buffering input commands. Based on ITU-T Rec. P.809 [67], in addition to the training session, the starting page of each game includes a screenshot of the game with an explanation of the game rules, controls, and heads-up display (HUD) elements, and a start button that starts the game scenario at the crowdworker's discretion. When the user starts the game, all of the user's interactions, including the user's inputs and performance, are collected, and once the duration of the run is finished, these statistics are sent to the engagement check module. The engagement check module is responsible for evaluating whether the worker played the game scenario as intended. The engagement test works by looking at the user inputs and rejecting ratings belonging to users who did not have enough interactions with the game. The minimum number of required interactions with the game can be found during a pre-test: the number of inputs, i.e., mouse clicks or keystrokes, for each game during the most strongly impaired condition, e.g., a delay of 300 ms, can be collected to derive an activity threshold. Workers who produce more than 20% of the threshold value pass the engagement check. If workers fail this check, they are asked to play the condition again, as they did not pay enough attention to the game. For the training session, as its duration is usually shorter than the other conditions, the threshold should be scaled based on the duration. The engagement check module is found to be an essential tool for a crowd gaming framework; it can prevent workers from cheating, and it is of high value for the training session to make sure workers understand the rules and controls of the game in a short time.
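The input-buffering delay simulation and the activity-threshold engagement check described above could be sketched as follows. This is a sketch in Python rather than the framework's JavaScript; all names are illustrative assumptions, and only the 20% figure is taken from the text.

```python
from collections import deque

class DelayedInputBuffer:
    """Simulate network delay by holding input events in a queue and
    releasing them only after a fixed delay has elapsed."""

    def __init__(self, delay_ms):
        self.delay_s = delay_ms / 1000.0
        self.queue = deque()  # entries of (release_time_s, event)

    def push(self, event, now_s):
        """Record an input event at time now_s; it becomes visible to the
        game only delay_ms later."""
        self.queue.append((now_s + self.delay_s, event))

    def poll(self, now_s):
        """Return all events whose artificial delay has expired."""
        ready = []
        while self.queue and self.queue[0][0] <= now_s:
            ready.append(self.queue.popleft()[1])
        return ready

def passes_engagement_check(num_inputs, activity_threshold):
    """Accept a run if the worker produced more than 20% of the activity
    threshold derived from the most strongly impaired condition."""
    return num_inputs > 0.2 * activity_threshold

# Usage: a 300 ms artificial delay holds back a "jump" input.
buf = DelayedInputBuffer(delay_ms=300)
buf.push("jump", now_s=0.0)
print(buf.poll(now_s=0.1))  # [] -- still buffered
print(buf.poll(now_s=0.3))  # ['jump'] -- released after 300 ms
```

Buffering inputs this way degrades only the interaction path, leaving video rendering untouched, which matches the goal of isolating the delay's effect on the IQ.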
When the engagement check is passed, the participant’s interaction information, such as the user inputs, scores, number of failures, game-specific objectives, the game scenario ID, timestamps, worker browser, and operating system, is stored on the game server. Afterward, participants are indirectly forwarded back to the rating task in MTurk. However, as the game server in this framework is located on a separate server from MTurk and there is no control over the MTurk backend, there is no information available to determine whether the person who played the game on the game server is the same person answering the questions on the MTurk server. Therefore, a foreign key in the form of a token system is used. After each gaming session, a 36-character universally unique identifier (UUID) is generated, and participants are asked to copy the code and enter it manually in the MTurk questionnaire. In this way, the game server data is connected to MTurk, and the participants’ subjective ratings are linked to the stored information regarding the game statistics and user interactions on the game server. If a valid token is used, the rating scale is shown, and the participant is allowed to rate the confirmed game scenario by means of the questionnaire. In addition to the qualification task and the engagement check, to make sure the collected data is valid and reliable, a gold standard question (trapping question) was


placed in the pool of questionnaire items, as recommended by ITU-T Rec. P.808 [97]. A trapping question is a question whose answer is known to the experimenter. Using a trapping question, the ratings of workers who take the rating process lightly or even attempt to cheat can be detected. The question asked of the crowdworkers in the studies conducted in this book is “In the game I played, I was able to talk to other players.”, and participants are supposed to answer “disagree”, as they are not able to talk to anyone in the games. The ratings of participants who fail this test should be removed from the data analyses.
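Removing the ratings of participants who fail the trapping question amounts to a simple filter over the submissions; a sketch, with an assumed field name:

```javascript
// Illustrative filter implementing the trapping-question check: keep only
// submissions that answered "disagree" to the known-answer item.
// The field name trappingAnswer is an assumption.
function filterByTrappingQuestion(submissions) {
  return submissions.filter(
    (s) => s.trappingAnswer.trim().toLowerCase() === 'disagree'
  );
}
```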

2.8 Summary

This chapter provided an overview of the necessary steps and the knowledge required for the assessment of gaming QoE. This included the concept of quality and QoE, the cloud gaming framework, the gaming QoE taxonomy, and methods for assessing gaming QoE using laboratory and crowdsourcing studies. The chapter started with an introduction to the cloud gaming structure and the components of this service. Next, the terms quality and QoS were defined; it was emphasized that assessing QoS alone is not sufficient, and QoE was defined. Next, the gaming QoE taxonomy was introduced, and various quality aspects of gaming QoE, including game-related and player experience aspects, were discussed. In addition, the factors influencing gaming QoE, including human, context, and system factors, were summarized, and it was shown how network parameters such as delay impact gaming QoE. Afterward, methods for the assessment of gaming QoE, including various questionnaires and the guidance of ITU-T Rec. P.809, were summarized. Finally, the experimental setups used for conducting interactive studies in a laboratory environment and with a crowdsourcing platform were discussed. Chapter 4 further discusses the validity and reliability of crowdsourcing studies.

Chapter 3

Influence of Delay on Gaming QoE

3.1 Introduction

Delay can be defined as the elapsed time from the moment a user enters an input (action) to the moment the user receives feedback in terms of visual, auditory, or haptic feedback (reaction). Therefore, this time includes all processes in the client machine and the cloud server, as well as the transmission delay. A detailed overview of this factor is presented in Sect. 2.5. The delay emulated in this chapter is artificially added as round-trip time (RTT) delay in the laboratory tests, while the client input delay is emulated in the crowdsourcing studies. In the laboratory tests, fast input and output devices and hardware-accelerated encoding were used to minimize t_client and t_server. However, controlling t_client was not feasible in crowdsourcing studies. Nevertheless, regardless of the source of the delay, it can negatively impact gaming QoE and user performance. This chapter investigates the impact of constant delay on gaming QoE and examines why some games are more delay sensitive than others. The chapter is organized as follows. First, Sect. 3.2 discusses the development of the Interactive Dataset of ITU-T Rec. G.1072 and shows that the impact of delay is game-dependent. Next, Sect. 3.3, based on the text and material presented before in [98], studies the reasons behind the differences in games’ sensitivity to delay by identifying underlying characteristics that potentially influence the delay sensitivity of games. Using these characteristics, a classification model is developed in Sect. 3.4, which is validated on a training set and an external dataset. In addition, Sect. 3.5, based on the text and material presented before in [99], validates the classification when the players choose different user strategies. Moreover, a quality model to predict the influence of delay on gaming QoE is developed in Sect. 3.6.
Section 3.7, based on the work presented in [100], shows that gaming QoE is not the only factor influenced by delay, and it investigates the relationship between delay,

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2023 S. S. Sabet, The Influence of Delay on Cloud Gaming Quality of Experience, T-Labs Series in Telecommunication Services, https://doi.org/10.1007/978-3-030-99869-1_3


performance, and gaming QoE. Finally, Sect. 3.8 provides a brief summary of the findings presented in Chap. 3.
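As noted above, the client input delay in the crowdsourcing studies is emulated by buffering input commands. A minimal sketch of such a buffer, polled from the game loop; the class and method names are illustrative, not taken from the framework’s implementation:

```javascript
// Sketch of emulating client input delay by buffering input commands.
// DelayedInputQueue and its method names are illustrative.
class DelayedInputQueue {
  constructor(delayMs) {
    this.delayMs = delayMs; // emulated constant delay, e.g., 100 ms
    this.buffer = [];
  }
  // On a key press or mouse click, buffer the command with its arrival
  // time instead of applying it to the game immediately.
  push(command, nowMs) {
    this.buffer.push({ command, readyAt: nowMs + this.delayMs });
  }
  // On every frame of the game loop, release all commands whose emulated
  // delay has elapsed, preserving arrival order.
  poll(nowMs) {
    const ready = [];
    while (this.buffer.length > 0 && this.buffer[0].readyAt <= nowMs) {
      ready.push(this.buffer.shift().command);
    }
    return ready;
  }
}
```

Because inputs are released in arrival order, the queue adds a constant delay without reordering commands, which mimics a fixed network delay on the input path.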

3.2 Interactive Dataset of ITU-T Rec. G.1072

In order to study the influence of delay on the cloud gaming experience, the Interactive Dataset of ITU-T Rec. G.1072 was developed following the short interactive paradigm of ITU-T Rec. P.809 [67]. The cloud gaming test setup was configured in a laboratory local network, as illustrated in Fig. 3.1, so as to be free from any external network impairment. A powerful PC with a high-end graphics card was used as the cloud server, and STEAM in-home streaming (Remote Play)1 was used as the cloud gaming platform. The client PC was equipped with a 24-inch display monitor with a fast response time of 1 ms and a refresh rate of 144 Hz, and a typical Universal Serial Bus (USB) mouse and keyboard were used as input devices on the client machine. To create network delay for the various conditions, NetEm2 was used as the network emulator. In addition to the test setup, a separate laptop ran the digital version of the questionnaire tool, which participants were asked to fill out without interrupting the test setup. The questionnaire was a 31-item post-condition questionnaire, including the overall gaming QoE, the GIPS [10] questionnaire, the output quality questionnaire [101], and the in-game GEQ (iGEQ) [61]. Of those, only the GIPS and the


Fig. 3.1 The test setup that was used in the Interactive Dataset of ITU-T Rec. G.1072. The right PC runs the games and sends the game scene to the STEAM client through the network. The RTT delay was emulated using NetEm

1 https://store.steampowered.com/remoteplay.
2 https://wiki.linuxfoundation.org/networking/netem.


Fig. 3.2 Snapshot of the 9 games that were used in ITU-T Rec. G.1072 Interactive Dataset

overall gaming QoE are relevant for this book; the others, which are mostly triggered by the game design and not by the delay impairment, are not used in the following data analyses. In addition to the main questionnaire, the tool included test instructions and a pre-test, post-game, and post-test questionnaire. The information about them is presented in Appendix A. For conducting the study, ITU-T Rec. P.809 [67], P.910 [92], and P.911 [93] were considered; the lighting conditions in the test room, the acoustical properties, the viewing distance (viewing distance = 3 × window size), and the position of participants were set following these ITU recommendations. The Interactive Dataset was used for the development of ITU-T Rec. G.1072 [23], and it covers various conditions such as framerate and packet loss as well as delay. Of those, the delay studies that are aligned with the scope of this book are used, and the other conditions are excluded from the analyses. In this dataset, a series of studies with different amounts of fixed delay, namely 0 ms, 50 ms, 100 ms, 200 ms, and 400 ms, was conducted. The dataset made use of 9 games that were popular at the time of conducting the study. Figure 3.2 shows a snapshot of these games, and Table 3.1 provides a short description of each. The dataset was created in 9 study blocks, in each of which one of the 9 games was tested. Each study lasted a maximum of two hours, with a total of 17 conditions in each study block. Overall, 179 people participated in the studies, including 72 females and 107 males. The participants were aged 18 to 41 with a mean of 27.43, a median of 27, and a standard deviation (SD) of 4.5. Participants were casual gamers with a fair level of gaming experience. Table 3.2 shows detailed information on the demographics of the test participants.


Table 3.1 Description of the games that were used in the Interactive Dataset of ITU-T Rec. G.1072

Dota 2: A battle arena game where each team should defend their towers while destroying the enemies’ towers. Using the various abilities that players have, they can attack by aiming and clicking (Fig. 3.2a).
Overwatch: An FPS game containing two teams, where each team has to move in a 3D world and shoot at enemies by aiming and clicking (Fig. 3.2b).
Counter-Strike: An FPS game with two teams, terrorists and police. The terrorists plant a bomb, and the police have to defuse it after eliminating the terrorists by aiming and clicking (Fig. 3.2c).
Rayman Legends: A platform run-and-jump game where the player has to jump over a series of different obstacles to reach the end of the level (Fig. 3.2d).
Tekken 7: A fighting game where the player can hit the opponent using combinations of buttons through punches, kicks, and different martial arts moves (Fig. 3.2e).
Project CARS: A car racing game designed to be a simulator for car driving. The player has to steer the car to remain inside the race track (Fig. 3.2f).
Worms W.M.D.: A turn-based game where a player has to aim at and destroy the enemy’s buildings (Fig. 3.2g).
Hearthstone: A turn-based card playing game where in each turn the player picks a card from the deck (Fig. 3.2h).
Bejeweled 3: A turn-based tile-matching game where the player should group similar gems together and create sets of at least three gems in a row (Fig. 3.2i).

Table 3.2 Demographics of the test participants in the Interactive Dataset

Age:             18–23: 42 (23%) | 24–29: 73 (40%) | 30–35: 61 (34%) | 36–40: 2 (1%) | >40: 1 (0.5%)
Gender:          Female: 72 (40%) | Male: 107 (60%)
Game expertise:  1: 14% | 2: 11% | 3: 41% | 4: 24% | 5: 8%
Weekly gameplay: 0–1 h: 21% | 1–5 h: 24% | 5–10 h: 28% | 10–20 h: 19% | >20 h: 8%

3.2.1 Data Analyses

The negative influence of delay on gaming QoE and IQ can be seen in Figs. 3.3 and 3.4. A factorial Analysis of Variance (ANOVA) was conducted to compare the main effects of delay and game, in addition to the interaction effect between delay and game, on gaming QoE and IQ. The results are reported in Table 3.3, which shows significant main effects of delay and game as well as a significant interaction effect



Fig. 3.3 Bar plots with the 95% confidence interval of the MOS value of gaming QoE under different levels of delay on nine games


Fig. 3.4 Bar plots with the 95% confidence interval of the MOS value of IQ under different levels of delay on nine games

between them, meaning that the impact of delay on gaming QoE and IQ is game-dependent. The post-hoc test using Bonferroni correction on IQ and overall QoE (see Table 3.4) reveals that 50 ms did not impact the user experience, where there was


Table 3.3 Results of the factorial ANOVA investigating the main effects of delay and game, as well as the interaction between them, on gaming QoE and IQ

QoE
Source        | df 1 | df 2 | F       | p
Delay         | 44   | 4    | 190.961 |
Game          | 44   | 8    | 12.689  |
Delay * game  | 44   | 32   | 5.772   |