Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Leon Zucherman is active.

Publication


Featured research published by Leon Zucherman.


International Conference on Heterogeneous Networking for Quality, Reliability, Security and Robustness | 2014

Video quality of experience in the presence of accessibility and retainability failures

Weiwei Li; Hamood-Ur Rehman; Diba Kaya; Mark H. Chignell; Alberto Leon-Garcia; Leon Zucherman; Jie Jiang

Accurate Quality of Experience measurement for streaming video has become more crucial with the increase in demand for online video viewing. Quantifying video Quality of Experience is a challenging task. Significant efforts to quantify video Quality of Experience have primarily focused on measuring Quality of Experience for videos with network- and compression-related impairments. These impairments, however, may not be the only factors affecting Quality of Experience across an entire video viewing session. In this paper, we evaluate Quality of Experience for entire video viewing sessions, from beginning to end. In doing so, we evaluate videos with temporary interruptions as well as those with permanent interruptions, or failures. We consider two types of failures, namely Accessibility and Retainability failures, and present the results of two subjective studies. These results indicate that: (a) Accessibility and Retainability failures are rated lower than temporary interruption impairments; (b) Accessibility failures are rated close to the lowest value on the rating scale; and (c) the traditional 5-point scale for measuring video Quality of Experience is not sufficient in the presence of Accessibility and Retainability failures.


International Conference on Communications | 2015

Acceptability and Quality of Experience in over the top video

Petros Spachos; Weiwei Li; Mark H. Chignell; Alberto Leon-Garcia; Leon Zucherman; Jie Jiang

Consumer acceptance is of great interest in the adoption of novel multimedia products and services. A number of factors can greatly influence the customer experience during a video session, impacting the acceptability of the product or service. Factors such as Technical Quality (TQ), which covers the technical aspects of signal quality, can be controlled by the network provider. On the other hand, the network provider has no control over the subject's level of interest in a video, the Content Quality (CQ). Together, TQ and CQ influence the subject's Overall eXperience (OX) of a video session, in this case for Over-The-Top (OTT) video. In this paper, we present results from a subjective user study in which the impact of TQ and OX on acceptability was investigated for OTT video sessions. To minimize the impact of CQ on OX, the videos were carefully selected to have relatively neutral content. We assess the impact of TQ and OX on acceptability for videos containing impairment and failure events during their lifecycle, affecting Accessibility, Retainability and Integrity. Our experimental results indicate that TQ and OX have a strong impact on acceptability.


Signal-Image Technology and Internet-Based Systems | 2014

Impact of Retainability Failures on Video Quality of Experience

Weiwei Li; Hamood Ur-Rehman; Mark H. Chignell; Alberto Leon-Garcia; Leon Zucherman; Jie Jiang

Measurement of video Quality of Experience is needed to enable telecommunication service operators to provide an acceptable video viewing experience to their customers. Without accurate Quality of Experience measurement, it is hard to predict customer satisfaction. Accessibility and Retainability failures are important factors that can affect the experience of viewing video on the Internet. Most of the published work on the assessment of video Quality of Experience has focused on the quality of video viewing sessions without considering the impact of Accessibility and Retainability failures. In this paper, we focus on the Quality of Experience for video sessions in the presence of Retainability failures. We primarily evaluate video Quality of Experience for videos with temporary interruptions, called Integrity impairments, as well as with permanent interruptions, called Retainability failures. A subjective study was conducted to observe the effect of Retainability failures on video Quality of Experience. The results of the subjective study suggest that videos with Retainability failures are rated much lower than videos without any failure. An initial linear model for the QoE of videos subject to Retainability failures is introduced.
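The abstract does not reproduce the initial linear model itself; below is a minimal sketch of the general form such a model could take, assuming the rating grows linearly with how much of the video plays before the Retainability failure. The function name and coefficients are hypothetical, not taken from the paper.

```python
# Hypothetical linear QoE model for a session ending in a Retainability failure:
# the predicted rating rises with the fraction of the video viewed before the
# failure. BASE and SLOPE are illustrative values, not results from the paper.
BASE = 1.0    # rating floor when the session fails almost immediately
SLOPE = 2.5   # rating gained per unit of viewed fraction before the failure

def predict_qoe(viewed_fraction: float) -> float:
    """Predict a rating on a 1-5 scale for a session that fails after
    `viewed_fraction` (0..1) of the content has played."""
    if not 0.0 <= viewed_fraction <= 1.0:
        raise ValueError("viewed_fraction must be in [0, 1]")
    return min(5.0, BASE + SLOPE * viewed_fraction)
```

A model of this shape would capture the finding that failed sessions rate far below failure-free ones, since even a late failure caps the prediction well under the top of the scale.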


International Conference on Communications | 2016

Understanding the relationships between performance metrics and QoE for Over-The-Top video

Weiwei Li; Petros Spachos; Mark H. Chignell; Alberto Leon-Garcia; Leon Zucherman; Jie Jiang

In this paper, we study the relationships between Quality of Service (QoS) and Quality of Experience (QoE) in a session-based Over-The-Top (OTT) video service. A number of Performance Metrics (PMs), with and without the presence of failures during a video, are examined. Technical Quality (TQ) and Acceptability are used as QoE factors. We analyze the correlation between QoS performance metrics and QoE factors, and find that new PMs should be employed when failures are included in the QoE evaluation. We also summarize the relationships between QoS metrics and QoE factors through machine learning approaches. Using decision trees, we obtain a general picture of the relationships between PMs and QoE factors, as well as of the impact caused by failures and the value of different rating scales.
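The learned decision trees are not shown in this summary; the rules below are a hand-written sketch of the kind of mapping such a tree might encode from performance metrics to an acceptability label. The thresholds, metric names, and labels are illustrative assumptions, not results from the paper.

```python
# Illustrative decision rules of the shape a trained tree might learn,
# splitting first on failure presence, then on a stalling metric.
def classify_acceptability(failed: bool, rebuffering_ratio: float) -> str:
    """Map two hypothetical performance metrics to a coarse QoE label."""
    if failed:                      # any Accessibility/Retainability failure
        return "unacceptable"
    if rebuffering_ratio > 0.10:    # heavy stalling without outright failure
        return "marginal"
    return "acceptable"
```

Splitting on failure first mirrors the paper's point that metrics chosen for failure-free sessions stop being sufficient once failures enter the evaluation.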


Consumer Communications and Networking Conference | 2016

Impact of technical and Content Quality on Overall Experience of OTT video

Weiwei Li; Petros Spachos; Mark H. Chignell; Alberto Leon-Garcia; Leon Zucherman; Jie Jiang

Quality of Experience (QoE) is a crucial guiding factor for network management of an end-to-end service session. The network provider can control the resources allocated to sessions and, in doing so, influence the Technical Quality (TQ), which covers the technical aspects of signal quality during the session. On the other hand, the network provider has no control over the Content Quality (CQ), which pertains to the user's level of interest in a particular video. Together, TQ and CQ influence the Overall eXperience (OX) in a session. In this paper, we present results from a subjective user study in which the impact of TQ and CQ on OX was investigated for Over-The-Top (OTT) video sessions from the perspective of a network provider. This perspective places the focus on those elements of QoE that can be controlled by the provider. Various studies have shown that very high interest in content can strongly influence QoE independent of other factors, so our study uses videos that are neutral with respect to content. We assess the TQ, CQ and OX for video sessions that contain Integrity impairments (in the form of image freezing) and failures in terms of session Accessibility and Retainability. Our findings indicate that TQ and CQ have a strong impact on OX in the presence of impairments but no failures. On the other hand, TQ is the main determinant of OX when failures are present.


World of Wireless, Mobile and Multimedia Networks | 2017

Subjective QoE assessment on video service: Laboratory controllable approach

Petros Spachos; Thomas Lin; Weiwei Li; Mark H. Chignell; Alberto Leon-Garcia; Jie Jiang; Leon Zucherman

This paper introduces research that addresses the subjective assessment of Quality of Experience (QoE) during the entire life cycle of a video session. We define a video session life cycle as the time from when a user attempts to initiate playback until the video ends, either through normal conclusion or through a network-induced failure. We provide a detailed description of our assessment methodology, designed to discern whether a user's QoE would be impacted by the presence of failures. To accomplish this, we carefully select test conditions that take into consideration the rating scale used, the types of impairments and failures seen by the user, and whether impaired videos are seen together with failed videos in multi-video sessions. The selection and creation of source video sequences are also discussed, as well as the use of between-subjects and within-subjects approaches for running our experiments in a controlled laboratory setting. Statistical analysis was carried out to interpret our experimental results. We compared the results of the between-subjects and within-subjects measures, and concluded that the introduction of a scale with an extended lower bound enabled subjects to express their dissatisfaction with failed videos more clearly than the traditional ITU 5-point rating scale. In addition, we observed that videos that were impaired but concluded normally did not show a statistically significant rating difference when the extended scale was used.


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2016

Assessment of Technical Quality of Online Video Using Visualization in Place of Experience

Mark H. Chignell; Diba Kaya; Leon Zucherman; Jie Jiang

This paper introduces a new method to collect subjective ratings of Technical Quality (TQ) in disrupted video (DV). TQ is related to whether or not the video has disruptions such as impairments (re-buffering, perceived as freezing for a period of time followed by resumption of video playback) or failures (where the video playback stops part way through and fails to complete). The assessment method introduced in this paper avoids the confounding effects of content on TQ ratings and reduces the time and effort necessary to run experiments. Actual videos, as stimuli, are replaced with schematic representations of those videos. We ran an experiment with 37 participants to explore the viability of assessing TQ using visualizations instead of actual videos. The experiment had contrasting conditions that compared TQ ratings of actual videos, vs. schematic representations of videos. The results of the experiment showed that, with appropriate training, ratings of TQ made after viewing the visualizations only were similar to TQ ratings made after actually watching videos with corresponding impairments or failures.


2016 Digital Media Industry & Academic Forum (DMIAF) | 2016

Peak-end effects in video Quality of Experience

Mark H. Chignell; Leon Zucherman; Diba Kaya; Jie Jiang

The study reported in this paper demonstrated, for the first time, that the peak-end effect, commonly found with respect to memories of experience, also applies to overall Quality of Experience (QoE) measures obtained after participants view a sequence of videos. The sequences of videos shown in the experiment varied in the ordering and grouping of videos with better or worse Technical Quality (TQ). An end effect was found for both better-TQ and worse-TQ videos. However, the peak effect was found for bad, but not good, videos. These results provide an important first step towards the development of models of Likelihood to Recommend (L2R) based on accumulated experience with a service.
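As a concrete illustration, the peak-end rule summarizes a sequence of per-video ratings by its most extreme and its final member. The sketch below uses the minimum as the "peak", echoing the finding that the peak effect appeared for bad but not good videos; the function itself is an assumption, not the paper's model.

```python
def peak_end_qoe(ratings: list[float]) -> float:
    """Peak-end summary of a sequence of per-video TQ ratings:
    the average of the worst rating (the negative 'peak') and the
    last rating in the sequence."""
    if not ratings:
        raise ValueError("need at least one rating")
    return (min(ratings) + ratings[-1]) / 2.0
```

Under this summary, a session ending on a good video after one very bad one still scores noticeably below its simple mean, which is the signature of the effect.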


2016 Digital Media Industry & Academic Forum (DMIAF) | 2016

Assessing unreliability in OTT video QoE subjective evaluations using clustering with idealized data

Jie Jiang; Petros Spachos; Mark H. Chignell; Leon Zucherman

In this paper, we describe an Over-The-Top (OTT) video Quality of Experience (QoE) subjective evaluation experiment that was carried out to examine variations in the way subjects assess viewing experiences. The experiment focuses on different levels of impairment and failure types, using 5-point measurement scales. Clustering is used to differentiate between unreliable and reliable participants, where reliability is defined in terms of criteria such as consistency of rating and the ability to distinguish between qualitative differences in levels of impairment. The results show that clustering a data set augmented with unreliable pseudo-participants can provide a new and improved perspective on individual differences in video QoE assessment.
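A minimal sketch of the idealized-pseudo-participant idea: augment the data with synthetic "ideal" rating profiles and assign each real subject to the nearer profile. The profiles, the Euclidean distance measure, and the labels below are illustrative assumptions; the paper's actual clustering procedure is not specified in this summary.

```python
from math import dist

# Hypothetical idealized pseudo-participants: rating profiles over the same
# ordered set of test conditions, from least to most impaired.
RELIABLE_IDEAL   = [5.0, 4.0, 3.0, 2.0, 1.0]   # ratings track impairment level
UNRELIABLE_IDEAL = [3.0, 3.0, 3.0, 3.0, 3.0]   # flat, non-discriminating ratings

def label_participant(ratings: list[float]) -> str:
    """Assign a subject to the nearer idealized profile (Euclidean distance)."""
    d_reliable = dist(ratings, RELIABLE_IDEAL)
    d_unreliable = dist(ratings, UNRELIABLE_IDEAL)
    return "reliable" if d_reliable <= d_unreliable else "unreliable"
```

Seeding the space with idealized profiles gives the clusters an interpretable anchor, so "unreliable" means "close to a known non-discriminating pattern" rather than merely "far from the majority".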


Computer Networks | 2018

A quantitative relationship between Application Performance Metrics and Quality of Experience for Over-The-Top video

Weiwei Li; Petros Spachos; Mark H. Chignell; Alberto Leon-Garcia; Leon Zucherman; Jie Jiang

Quality of Experience (QoE) is a measure of the overall level of customer satisfaction with a vendor. In telecommunications, consumer satisfaction is of great interest in the adoption of novel multimedia products and services. A number of factors can greatly influence the customer experience during a video session. Factors such as user perception, experience, and expectations are expressed by QoE, while factors such as application and network performance are expressed by Quality of Service (QoS) parameters. This paper studies the relationship between QoS and QoE in session-based mobile video streaming. Specific QoS Application Performance Metrics (APMs) are examined based on a QoE assessment database built for this experimentation, containing data from 108 subjects. It is shown that these APMs are highly related to two QoE factors, Technical Quality (TQ) and Acceptability. Furthermore, the Viewing Ratio (VR) parameter and the corresponding Kendall correlation between VR and the QoE factors show that VR is a valuable metric for mapping QoS to QoE. We further generate a compact decision tree to predict the QoE factors from Rebuffering Ratio (RR), Non-interruption Content Viewing Ratio (VRc), and Non-interruption Viewing Ratio (VRs). Through extensive experimentation, a general relationship between APMs and QoE factors has been examined, and a preliminary QoE model is proposed based on this relationship.
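The Kendall correlation used above to relate VR to the QoE factors can be illustrated with a generic tau-a implementation over concordant and discordant pairs. This is the standard formula applied to made-up values, not the paper's dataset.

```python
from itertools import combinations

def kendall_tau(x: list[float], y: list[float]) -> float:
    """Kendall rank correlation between two equal-length samples
    (tau-a variant: ties count as neither concordant nor discordant)."""
    if len(x) != len(y) or len(x) < 2:
        raise ValueError("need two equal-length samples of size >= 2")
    concordant = discordant = 0
    for i, j in combinations(range(len(x)), 2):
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1   # pair ordered the same way in both samples
        elif s < 0:
            discordant += 1   # pair ordered oppositely
    n_pairs = len(x) * (len(x) - 1) // 2
    return (concordant - discordant) / n_pairs
```

A tau near +1 between VR and a QoE factor would mean sessions with a higher fraction of uninterrupted viewing are almost always rated higher, which is the sense in which VR maps QoS onto QoE.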

Collaboration


Dive into Leon Zucherman's collaborations.

Top Co-Authors

Weiwei Li
University of Toronto

Diba Kaya
University of Toronto