Tuesday
Towards Volumetric Video Conferencing
Pablo Cesar
CWI and TU Delft

With Social Extended Reality (XR) emerging as a new medium, where users can remotely experience immersive content with others, the vision of a true feeling of ‘being there together’ has become a realistic goal. This keynote will provide an overview of the challenges involved in achieving that goal, based on results from practical case studies such as the TRANSMIXR and MediaScape XR projects. We will discuss technologies, such as point clouds, that can be used as the format for representing highly realistic digital humans, as well as metrics and protocols for quantifying the quality of experience. The final intention of the talk is to shed some light on social XR as a new group of virtual reality experiences based on social photorealistic immersive content. We will discuss the challenges regarding production and user-centric processes, and explore the new opportunities opened up by this new medium.

Speaker Bio: Pablo Cesar has led the Distributed and Interactive Systems (DIS) group at CWI since January 2014 and is Professor of Human-Centered Multimedia Systems in the Department of Intelligent Systems (INSY) at TU Delft. He received the prestigious 2020 Netherlands Prize for ICT Research. He is an IEEE (Institute of Electrical and Electronics Engineers) Senior Member, the highest grade for which IEEE members can apply, and an ACM (Association for Computing Machinery) Distinguished Member, a distinction for significant achievements across the computing field awarded to at most 10 percent of ACM's worldwide membership. His research focuses on measuring and evaluating the way users interact and communicate with each other using a wide range of decentralized digital systems. Cesar has co-directed over 15 externally funded research projects (H2020, FP7, FP6, PPP, NWO).

Wednesday
Quality of Experience of Immersive Media – New Challenges
Mylene C.Q. Farias
Texas State University

In the past decade, significant progress in imaging and computing technologies has unleashed the creation of more accurate representations of the physical world, enabling immersive and interactive experiences that more closely resemble reality [1]. As a consequence, immersive applications have attracted a lot of interdisciplinary attention, with the development of multimodal human-computer interactions that allow users to fully immerse themselves within realistic or virtual environments or to interact with real/virtual elements seamlessly blended with the physical world [6]. Immersive experiences can be found at different points along the “virtuality continuum,” ranging from photorealistic settings, through mixed reality approaches, to completely virtual environments.
In this scenario, high-speed connections and high-quality imaging systems are essential for the development of next-generation immersive experiences and applications in areas such as healthcare, education/training, arts & entertainment, remote work, marketing, and automotive. But the success of these emerging applications will depend on the quality of experience (QoE) they provide to users [5]. As defined by Qualinet [2], QoE is “the degree of delight or annoyance of the user of an application or service. It results from the fulfillment of his or her expectations with respect to the utility and/or enjoyment of the application or service in the light of the user's personality and current state.” For immersive technologies, immersive media experiences (IMEx) extend the concept of QoE by encompassing elements such as the sense of presence, immersion, and motion sickness, among others [7]. Both QoE and IMEx are shaped by three influencing factors (IFs): the system, the context, and the human user. For immersive experiences, the system IF plays an important role, affecting immersion, presence, and interaction itself. This impact has been extensively studied in the literature, with several studies analyzing the effect of the design of the device itself on IMEx [3, 4]. The context IF, which includes the user's environment, is often studied together with the system IF, since it is difficult to separate the two. Finally, human factors are an important IF for immersive media; these include perceptual characteristics, such as visual, auditory, and spatial perception, that make each user unique. It is well known that impairments may reduce perception and cause imbalances, resulting in uncomfortable symptoms and cybersickness. Subjective and instrumental assessments are commonly used to study human IFs in IMEx.
In this context, an open area of research is the design of subjective and instrumental assessment methods to estimate the user's immersive media experience. For example, by tracking human behavior and psycho-physiological signals, it is possible to construct models of influencing factors that can be employed to identify and potentially forecast phenomena such as cybersickness, the user's perception of immersion and sense of presence, and the overall quality of the experience. In this talk, I discuss several aspects of IMEx, including subjective and instrumental methods, and the main challenges in this area.

Speaker Bio: Mylene Farias received her B.Sc. degree in electrical engineering from the Federal University of Pernambuco (UFPE), Brazil, in 1995 and her M.Sc. degree in electrical engineering from the State University of Campinas (UNICAMP), Brazil, in 1998. She received her Ph.D. in electrical and computer engineering from the University of California, Santa Barbara (UCSB), USA, in 2004 for work on no-reference video quality metrics. Dr. Farias has worked as a research engineer at CPqD (Brazil) on video quality assessment and the validation of video quality metrics. She has also worked as an intern for Philips Research Laboratories (The Netherlands) on the quality assessment of video sharpness algorithms and for Intel Corporation (Phoenix, USA) developing no-reference video quality metrics. Currently, she is an Associate Professor in the Department of Computer Science at Texas State University. Previously, she was an associate professor in the Department of Electrical Engineering at the University of Brasilia (UnB).
To date, she has published 54 journal papers and 110 peer-reviewed conference papers. She is currently an Associate Editor for IEEE Signal Processing Letters and the SPIE Journal of Electronic Imaging, in addition to being an Area Editor for Elsevier Signal Processing: Image Communication. Dr. Farias is a Senior Member of IEEE and the IEEE Signal Processing Society. She is also a member of ACM. She has served as a member and area chair in the technical program committees of several conferences. She served as Technical Program Co-Chair for ACM NOSSDAV 2021 and for QoMEX 2020 and 2023, and as a co-chair of ACM MMSys 2022. She was elected as a member of the IEEE Image, Video, and Multidimensional Signal Processing Technical Committee (2021-2023) and of the IEEE Multimedia Signal Processing Technical Committee (2022-2024), both of the IEEE Signal Processing Society. She is currently a member of the JPEG Pleno Light-Field Working Group and the VQEG Immersive Media Group. She is also a member of the TV 3.0 Video Coding Layer Test Laboratories of the Brazilian Digital Television System. Her current interests include video quality metrics, video processing, multimedia signal processing, immersive media, and visual attention.

Thursday
Bytes and Rhythms – Delving into the Tech-Driven Evolution of Food and Music
Alexandra Covaci
University of Kent

In the rapidly advancing domain of design and technology, the transition from a predominantly human-centred approach to a more encompassing more-than-human perspective is gaining traction. This talk investigates the implications of such a shift, focusing on the domains of music creation and culinary experiences—two intrinsic human activities deeply embedded within larger ecosystems. Utilising immersive Virtual Reality (VR) methodologies, we examine the intricate relationship musicians have with their sonic environments and the profound connections diners establish with the food life-cycle. Our research highlights the constraints of an approach that prioritises only human experiences, unveiling potential avenues for more holistic, interconnected engagements. By incorporating more-than-human elements, from the ambient subtleties influencing musical performances to the comprehensive journey of food from origin to consumption, we advocate for a design perspective that is both holistic and integrative. This perspective not only enriches human experiences but also promotes a balanced coexistence with the entities that influence our daily interactions.

Speaker Bio: With a keen interest in how technology can enhance our lives, Dr Alexandra Covaci delves into the realm of immersive technologies. As a Senior Lecturer at the School of Engineering and the Research Lead for the Institute of Cultural and Creative Industries at the University of Kent, her work focuses on designing technology that resonates deeply with human experiences, ensuring that it serves as an enabler rather than a disruptor.