
The Impact of Event-Based Video Quality Metrics on User Experience

Correlating Video Quality Metrics to User Experience: an Event-based Approach

Abstract

Multimedia services have become an important part of today’s network offerings. Guaranteeing the user’s perception of delivered video quality is a critical problem for all content providers, especially when the underlying transport uses a shared network infrastructure. In this thesis, we present an event-based model that correlates video quality metrics with user experience, as part of VIDAR - a comprehensive VIDeo quality Analyzer in Real time. Using machine learning techniques, our model maps objective frame-level metrics to perceivable video defects and their correlated user experience. In our experiments, the model shows good classification accuracy for defect events while keeping the training time reasonable. Furthermore, we have developed a user perception model for each event type, which also predicts user experience well when viewers are subjected to that specific type of event.
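To make the abstract's pipeline concrete, the sketch below shows one way such an event-based classifier could look: runs of degraded per-frame quality scores are summarized into a small feature vector and classified into the defect types listed in Section 3.4 (distortion, glitch, freezing) with an RBF-kernel SVM, whose built-in one-vs-one scheme mirrors the combination of binary classifiers in Section 3.3.3. The feature choices, the toy data, and the SVM parameters are illustrative assumptions, not the thesis's actual eSSIM Aggregator.

```python
# Hypothetical sketch of event-based defect classification.
# Feature choices, toy data, and RBF-SVM settings are assumptions for illustration.
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

def event_features(scores):
    """Summarize one segmented event (a run of per-frame quality scores)."""
    scores = np.asarray(scores, dtype=float)
    return [
        scores.mean(),                 # average quality during the event
        scores.min(),                  # worst frame in the event
        scores.max() - scores.min(),   # depth of the quality drop
        len(scores),                   # event duration in frames
    ]

# Toy training events: (per-frame score run, defect label), labels from Sec. 3.4.
events = [
    ([0.62, 0.55, 0.60, 0.58],               "distortion"),
    ([0.95, 0.30, 0.96],                     "glitch"),
    ([0.91, 0.91, 0.91, 0.91, 0.91, 0.91],   "freezing"),
    ([0.50, 0.48, 0.52],                     "distortion"),
    ([0.97, 0.25, 0.94],                     "glitch"),
    ([0.88, 0.88, 0.88, 0.88, 0.88],         "freezing"),
]
X = np.array([event_features(s) for s, _ in events])
y = np.array([label for _, label in events])

# RBF-kernel SVM; scikit-learn's SVC handles multi-class input by combining
# one-vs-one binary classifiers internally.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=10.0, gamma="scale"))
clf.fit(X, y)

new_event = [0.93, 0.28, 0.95]   # a single bad frame between good ones
print(clf.predict([event_features(new_event)]))  # e.g. ['glitch']
```

A k-NN classifier (Section 3.3.4) could be swapped in for the SVC with the same feature vectors, which is how the thesis compares the two approaches in its validation chapter.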


Table of Contents

1 Introduction 1
2 Related Work 4
2.1 Objective Video Quality Assessment Methods 4
2.2 From Objective Metrics to User Experience (MOS) 5
2.3 Machine Learning Techniques 7
2.4 Overview of H.264/AVC 8
3 Proposed Approach 11
3.1 Overview of VIDAR Project 11
3.2 eSSIM Aggregator 12
3.2.1 Overview of eSSIM Aggregator 13
3.2.2 Preprocessing for eSSIM Raw Data 15
3.2.3 Event Segmentation 16
3.2.4 Feature Extraction 18
3.2.5 Preprocessing for Features 20
3.2.6 Feature Reduction 21
3.3 Multi-class Classification 22
3.3.1 Kernel Selection 22
3.3.2 Parameter Settings 23
3.3.3 Combination of Binary Classifiers 25
3.3.4 Multi-class Classification with k-NN 26
3.4 User Model 27
3.4.1 Distortion 27
3.4.2 Glitch 28
3.4.3 Freezing 28
4 Validation 29
4.1 Validation of eSSIM Aggregator 29
4.1.1 Experiment Setup and Dataset Generation 29
4.1.2 Multi-class Classification (SVM) 33
4.1.3 Multi-class Classification (k-NN) 39
4.2 Validation of User Model 40
4.2.1 Subjective Testing 40
4.2.2 Distortion 41
4.2.3 Glitch 43
4.2.4 Freezing 44
4.2.5 Summary 44
5 Concluding Remarks 46
5.1 Summary 46
5.2 Contributions 46
5.3 Future Work 47
References 48
