Panagiotis D. Ritsos

MEng PhD Essex, FHEA

Lecturer in Visualization

Immersive Environments Lab
Visualization, Data, Modelling and
Graphics (VDMG) research group,

School of Computer Science
and Electronic Engineering,

Bangor University,
Dean Street, Bangor,
Gwynedd, UK, LL57 1UT

XReality - Virtual, Mixed & Augmented Reality

XReality

XR, or XReality, is the fusion of real and computer-generated elements to produce new hybrid, synthetic environments. As a field, it encompasses the full spectrum of immersive technologies, including Virtual, Mixed and Augmented Reality; the ‘X’ acts as a variable standing for any of them.

This theme has strong synergies with Immersive Analytics; however, it has a broader remit, exploring topics beyond data visualization and analytics.

Although, in the past, subdomains of XR focused primarily on the sense of vision, today’s XR systems can engage all our senses, offering capabilities through services that are perceivable but not necessarily visual.

Continuous advances in miniaturization, battery/power efficiency and communications (e.g., 5G), together with the establishment of cloud services and the recent popularity of wearable/mobile devices, contribute to this paradigm shift, which is the focus of this theme.

XR activities are a core theme of the Bangor Immersive Environments Lab (BIEL).

Collaborators (in various publications)

Bangor University, University of Essex, University of Maryland - College Park, Edinburgh Napier University, University of Applied Sciences and Arts Northwestern Switzerland, University of Chester

B. Williams, P. D. Ritsos, and C. Headleand, “Virtual Forestry Generation: Evaluating Models for Tree Placement in Games,” Computers, vol. 9, no. 1, p. 20, Mar. 2020. A handful of approaches have been previously proposed to procedurally generate virtual forestry for virtual worlds and computer games, including plant growth models and point distribution methods. However, there has been no evaluation to date which assesses how effective these algorithms are at modelling real-world phenomena. In this paper we tackle this issue by evaluating three algorithms used in the generation of virtual forests – a randomly uniform point distribution method (control), a plant competition model, and an iterative random point distribution technique. Our results show that a plant competition model generated more believable content when viewed from an aerial perspective. Interestingly, however, we also found that a randomly uniform point distribution method produced forestry which was rated higher in playability and photorealism, when viewed from a first-person perspective. We conclude that the objective of the game designer is important to consider when selecting an algorithm to generate forestry, as the algorithms produce forestry which is perceived differently.
[Abstract]   [PDF]   [BibTex]   [doi:10.3390/computers9010020]  
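The control method in the study above, a randomly uniform point distribution, can be sketched very simply: each tree is dropped at an independent, uniformly random position within the terrain bounds. The function name and parameters below are illustrative assumptions, not the paper’s implementation:

```python
import random

def uniform_forest(n_trees, width, height, seed=None):
    """Place n_trees at uniformly random (x, y) positions inside a
    width x height region -- a sketch of the 'control' placement method."""
    rng = random.Random(seed)
    return [(rng.uniform(0, width), rng.uniform(0, height))
            for _ in range(n_trees)]

# Example: scatter 100 trees over a 512 x 512 terrain tile.
trees = uniform_forest(100, 512.0, 512.0, seed=42)
```

The other two evaluated approaches refine this baseline: a plant competition model iteratively grows and culls trees based on neighbourhood competition, while the iterative random technique resamples points subject to spacing constraints.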

N. W. John, S. R. Pop, T. W. D. Day, P. D. Ritsos, and C. J. Headleand, “The Implementation and Validation of a Virtual Environment for Training Powered Wheelchair Manoeuvres,” IEEE Transactions on Visualization and Computer Graphics, vol. 24, no. 5, pp. 1867–1878, May 2018. Navigating a powered wheelchair and avoiding collisions is often a daunting task for new wheelchair users. It takes time and practice to gain the coordination needed to become a competent driver, and this can be even more of a challenge for someone with a disability. We present a cost-effective virtual reality (VR) application that takes advantage of consumer-level VR hardware. The system can be easily deployed in an assessment centre or for home use, and does not depend on a specialized high-end virtual environment such as a Powerwall or CAVE. This paper reviews previous work that has used virtual environment technology for training tasks, particularly wheelchair simulation. We then describe the implementation of our own system and the first validation study carried out using thirty-three able-bodied volunteers. The study results indicate that, at a significance level of 5%, there is an improvement in driving skills from the use of our VR system. We thus have the potential to develop the competency of a wheelchair user whilst avoiding the risks inherent to training in the real world. However, the occurrence of cybersickness is a particular problem in this application that will need to be addressed.
[Abstract]   [PDF]   [BibTex]   [doi:10.1109/TVCG.2017.2700273]   [Invited at IEEE VR 2018]

J. C. Roberts, P. D. Ritsos, S. K. Badam, D. Brodbeck, J. Kennedy, and N. Elmqvist, “Visualization Beyond the Desktop - the next big thing,” IEEE Computer Graphics and Applications, vol. 34, no. 6, pp. 26–34, Nov. 2014. Visualization is coming of age. With visual depictions being seamlessly integrated into documents, and data visualization techniques being used to understand increasingly large and complex datasets, the term “visualization” is becoming used in everyday conversations. But we are on a cusp; visualization researchers need to develop and adapt to today’s new devices and tomorrow’s technology. Today, people interact with visual depictions through a mouse. Tomorrow, they’ll be touching, swiping, grasping, feeling, hearing, smelling, and even tasting data. The next big thing is multisensory visualization that goes beyond the desktop.
[Abstract]   [PDF]   [BibTex]   [doi:10.1109/MCG.2014.82]   [Video]   [Invited at IEEE VIS 2015]

P. D. Ritsos, A. T. Wilson, H. C. Miles, L. F. Williams, B. Tiddeman, F. Labrosse, S. Griffiths, B. Edwards, K. Möller, R. Karl, and J. C. Roberts, “Community-driven Generation of 3D and Augmented Web Content for Archaeology,” in Eurographics Workshop on Graphics and Cultural Heritage (EGGCH) - Short Papers and Posters, Darmstadt, Germany, 2014, pp. 25–28. Heritage sites (such as prehistoric burial cairns and standing stones) are prolific in Europe; although there is a wish to scan each of these sites, it would be time-consuming to achieve. Citizen science approaches enable us to involve the public to perform a metric survey by capturing images. In this paper, discussing work in progress, we present our automatic process that takes the user’s uploaded photographs, converts them into 3D models and displays them in two presentation platforms – in a web gallery application, using X3D/X3DOM, and in mobile augmented reality, using awe.js.
[Abstract]   [PDF]   [BibTex]   [URL]   [doi:10.2312/gch.20141321]  

P. D. Ritsos, D. J. Johnston, C. Clark, and A. F. Clark, “Engineering an augmented reality tour guide,” in Eurowearable, 2003. IEE, Birmingham, UK, 2003, pp. 119–124. This paper describes a mobile augmented reality system intended for in situ reconstructions of archaeological sites. The evolution of the system from proof of concept to something approaching a satisfactory ergonomic design is described, as are the various approaches to achieving real-time rendering performance from the accompanying software. Finally, some comments are made concerning the accuracy of such systems.
[Abstract]   [PDF]   [BibTex]   [doi:10.1049/ic:20030157]  

P. D. Ritsos, D. J. Johnston, C. Clark, and A. F. Clark, “Mobile augmented reality archaeological reconstruction,” in 30th UNESCO Anniversary Virtual Congress: Architecture, Tourism and World Heritage, Beijing, China, 2002. This paper describes a mobile augmented reality system, based around a wearable computer, that can be used to provide in situ tours around archæological sites. Ergonomic considerations, intended to make the system usable by the general public, are described. The accuracy of the system is assessed, as this is critical for in situ reconstructions.
[Abstract]   [PDF]   [BibTex]  

P. D. Ritsos, “Mixed Reality - A paradigm for perceiving synthetic spaces,” in Real Virtuality, M. Reiche and U. Gehmann, Eds. Transcript-Verlag Bielefeld, 2014, pp. 283–310.
[BibTex]   [URL]   [ISBN:978-3-8376-2608-7]  

P. D. Ritsos, N. W. John, and J. C. Roberts, “Standards in Augmented Reality: Towards Prototyping Haptic Medical AR,” in 8th International AR Standards Meeting, 2013. Augmented Reality technology has been used in medical visualization applications in various ways. Haptics, on the other hand, are a popular method of interacting in Augmented and Virtual Reality environments. We present how reliance on standards benefits the fusion of these technologies, through a series of research themes, carried out at Bangor University, UK (and with international partners), as well as within the activities of the Research Institute of Visual Computing (RIVIC), UK.
[Abstract]   [PDF]   [BibTex]  

P. D. Ritsos, D. P. Ritsos, and A. S. Gougoulis, “Standards for Augmented Reality: a User Experience perspective,” in 2nd International AR Standards Meeting, 2011. An important aspect of designing and implementing Augmented Reality (AR) applications and services, often disregarded for the sake of simplicity and speed, is the evaluation of such systems, particularly by non-expert users, in real operating conditions. We are strong advocates of the fact that in order to develop successful and highly immersive AR systems, that can be adopted in day-to-day scenarios, user assessment and feedback is of paramount importance. Consequently, we also feel that an important fragment of future AR standardisation should focus on User eXperience (UX) aspects, such as the sense of presence, ergonomics, health and safety, overall usability and product identification. Our paper attempts an examination of these aspects and proposes an adaptive theoretical evaluation framework that can be standardised across the span of AR applications.
[Abstract]   [PDF]   [BibTex]