Panagiotis D. Ritsos

MEng PhD Essex, FHEA

Lecturer in Visualization

Visualization, Modelling and
Graphics (VMG) research group,

School of Computer Science
and Electronic Engineering,

Bangor University,
Dean Street, Bangor,
Gwynedd, UK, LL57 1UT

Wearable Mixed & Augmented Reality

Mixed Reality (MR) is the fusion of real and computer-generated elements to produce new hybrid, synthetic environments. Although MR and the encompassing notions of Augmented and Virtual Reality (AR/VR) have historically focused primarily on the sense of vision, today's MR systems can engage all of our senses and offer capabilities through services that are perceivable but not necessarily vision-specific. The establishment of cloud services, along with the recent popularity of wearable/mobile devices, contributes to this paradigm shift, which is the focus of this theme.

Collaborators (in various publications)

Bangor University, University of Essex, University of Maryland - College Park, Edinburgh Napier University, University of Applied Sciences and Arts Northwestern Switzerland, University of Chester

H. C. Miles, A. T. Wilson, F. Labrosse, B. Tiddeman, S. Griffiths, B. Edwards, P. D. Ritsos, J. W. Mearman, K. Möller, R. Karl, and J. C. Roberts, “Alternative Representations of 3D-Reconstructed Heritage Data,” ACM Journal on Computing and Cultural Heritage (JOCCH), vol. 9, no. 1, pp. 4:1–4:18, Nov. 2015. By collecting images of heritage assets from members of the public and processing them to create 3D-reconstructed models, the HeritageTogether project has accomplished the digital recording of nearly 80 sites across Wales, UK. A large amount of data has been collected and produced in the form of photographs, 3D models, maps, condition reports, and more. Here we discuss some of the different methods used to realize the potential of this data in different formats and for different purposes. The data are explored in both virtual and tangible settings, and—with the use of a touch table—a combination of both. We examine some alternative representations of this community-produced heritage data for educational, research, and public engagement applications.
[Abstract]   [PDF]   [BibTex]   [doi:10.1145/2795233]  

J. C. Roberts, P. D. Ritsos, S. K. Badam, D. Brodbeck, J. Kennedy, and N. Elmqvist, “Visualization Beyond the Desktop - the next big thing,” IEEE Computer Graphics and Applications, vol. 34, no. 6, pp. 26–34, Nov. 2014. Visualization is coming of age. With visual depictions being seamlessly integrated into documents, and data visualization techniques being used to understand increasingly large and complex datasets, the term "visualization" is being used in everyday conversations. But we are on a cusp; visualization researchers need to develop and adapt to today’s new devices and tomorrow’s technology. Today, people interact with visual depictions through a mouse. Tomorrow, they’ll be touching, swiping, grasping, feeling, hearing, smelling, and even tasting data. The next big thing is multisensory visualization that goes beyond the desktop.
[Abstract]   [PDF]   [BibTex]   [doi:10.1109/MCG.2014.82]   [Video]   [Invited at IEEE VIS 2015]

P. D. Ritsos, J. Jackson, and J. C. Roberts, “Web-based Immersive Analytics in Handheld Augmented Reality,” in Posters presented at the IEEE Conference on Visualization (IEEE VIS 2017), Phoenix, Arizona, USA, 2017. The recent popularity of virtual reality (VR), and the emergence of a number of affordable VR interfaces, have prompted researchers and developers to explore new, immersive ways to visualize data. This has resulted in a new research thrust, known as Immersive Analytics (IA). However, in IA little attention has been given to the paradigms of augmented/mixed reality (AR/MR), where computer-generated and physical objects co-exist. In this work, we explore the use of contemporary web-based technologies for the creation of immersive visualizations for handheld AR, combining D3.js with the open standards-based Argon AR framework and A-Frame/WebVR. We argue in favor of using emerging standards-based web technologies as they work well with contemporary visualization tools that are purposefully built for data binding and manipulation.
[Abstract]   [PDF]   [BibTex]   [URL]   [Video]   [Poster]  
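
The poster above combines a DOM-based visualization library (D3.js) with a DOM-based 3D/VR framework (A-Frame). The sketch below illustrates the general pattern in TypeScript: D3's data join creates A-Frame entities directly, so standard visualization idioms carry over to an immersive scene. It is an illustrative sketch only, not code from the paper; the dataset, scales and scene id are invented, and the Argon-specific AR layer is omitted.

```typescript
// Minimal sketch: using D3's data join to create A-Frame entities.
// Assumes an HTML page that loads the aframe and d3 scripts and contains
// <a-scene id="scene"></a-scene>; the dataset and scaling are invented here.
import * as d3 from "d3";

interface Sample { label: string; value: number; }

const data: Sample[] = [
  { label: "A", value: 3 },
  { label: "B", value: 7 },
  { label: "C", value: 5 },
];

const height = d3.scaleLinear()
  .domain([0, d3.max(data, d => d.value) ?? 1])
  .range([0.1, 1.5]);               // bar heights in metres

d3.select("#scene")
  .selectAll("a-box")
  .data(data)
  .enter()
  .append("a-box")                  // A-Frame entities are ordinary DOM elements
  .attr("width", 0.3)
  .attr("depth", 0.3)
  .attr("height", d => height(d.value))
  .attr("color", "#4477aa")
  .attr("position", (d, i) =>
    `${i * 0.5 - 0.5} ${height(d.value) / 2} -2`);  // place bars 2 m in front of the viewer
```

In a handheld AR setting the same data join would target an AR-enabled scene (for example one managed by the Argon browser, or WebXR in more recent stacks), which is broadly the combination the poster describes.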

P. D. Ritsos, J. W. Mearman, A. Vande Moere, and J. C. Roberts, “Sewn with Ariadne’s Thread - Visualizations for Wearable & Ubiquitous Computing,” in Death of the Desktop Workshop, IEEE Conference on Visualization (VIS), Paris, France, 2014. Lance felt a buzz on his wrist as Alicia, his wearable, informed him via the bone-conduction earpiece: ‘You have received an email from Dr Jones about the workshop’. His wristwatch displayed an unread-email glyph. Lance tapped it and listened to the voice of Dr Jones, talking about the latest experiment. At the same time he scanned through the email attachments, projected in front of his eyes through his contact lenses. One of the files had a dataset of a carbon femtotube structure…
[Abstract]   [PDF]   [BibTex]   [URL]  

P. D. Ritsos, A. T. Wilson, H. C. Miles, L. F. Williams, B. Tiddeman, F. Labrosse, S. Griffiths, B. Edwards, K. Möller, R. Karl, and J. C. Roberts, “Community-driven Generation of 3D and Augmented Web Content for Archaeology,” in Eurographics Workshop on Graphics and Cultural Heritage (EGGCH) - Short Papers and Posters, Darmstadt, Germany, 2014, pp. 25–28. Heritage sites (such as prehistoric burial cairns and standing stones) are prolific in Europe; although there is a wish to scan each of these sites, it would be time-consuming to achieve. Citizen science approaches enable us to involve the public to perform a metric survey by capturing images. In this paper, which discusses work in progress, we present our automatic process that takes the user’s uploaded photographs, converts them into 3D models and displays them in two presentation platforms – in a web gallery application, using X3D/X3DOM, and in mobile augmented reality, using awe.js.
[Abstract]   [PDF]   [BibTex]   [URL]   [doi:10.2312/gch.20141321]  
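
The short paper above feeds reconstructed models into a web gallery via X3D/X3DOM. As a rough illustration of that presentation path, the following TypeScript sketch embeds a model in a gallery page using X3DOM's inline element. It is not the project's code: x3dom.js is assumed to be already loaded on the page, and the container id and model filename are placeholders.

```typescript
// Minimal sketch of the web-gallery side: embedding a reconstructed model
// in a page via X3DOM's <inline> element. Assumes x3dom.js is loaded and
// that the model URL is a placeholder for a photogrammetry output.
function addModelToGallery(containerId: string, modelUrl: string): void {
  const container = document.getElementById(containerId);
  if (!container) {
    throw new Error(`No gallery container with id "${containerId}"`);
  }

  // X3DOM scene graphs are plain DOM elements, so they can be assembled
  // with standard DOM calls.
  const x3d = document.createElement("x3d");
  x3d.setAttribute("width", "640px");
  x3d.setAttribute("height", "480px");

  const scene = document.createElement("scene");
  const inline = document.createElement("inline");
  inline.setAttribute("url", modelUrl);   // e.g. a .x3d file produced by the reconstruction pipeline

  scene.appendChild(inline);
  x3d.appendChild(scene);
  container.appendChild(x3d);
}

// Hypothetical usage: the container id and model name are illustrative only.
addModelToGallery("gallery", "models/standing-stone-01.x3d");
```

The mobile AR path via awe.js would consume the same reconstructed assets, but is not sketched here.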

P. D. Ritsos, D. J. Johnston, C. Clark, and A. F. Clark, “Engineering an augmented reality tour guide,” in Eurowearable, 2003. IEE, Birmingham, UK, 2003, pp. 119–124. This paper describes a mobile augmented reality system intended for in situ reconstructions of archaeological sites. The evolution of the system from proof of concept to something approaching a satisfactory ergonomic design is described, as are the various approaches to achieving real-time rendering performance from the accompanying software. Finally, some comments are made concerning the accuracy of such systems.
[Abstract]   [PDF]   [BibTex]   [doi:10.1049/ic:20030157]  

P. D. Ritsos, D. J. Johnston, C. Clark, and A. F. Clark, “Mobile augmented reality archaeological reconstruction,” in 30th UNESCO Anniversary Virtual Congress: Architecture, Tourism and World Heritage, Beijing, China, 2002. This paper describes a mobile augmented reality system, based around a wearable computer, which can be used to provide in situ tours around archæological sites. Ergonomic considerations, intended to make the system usable by the general public, are described. The accuracy of the system is assessed, as this is critical for in situ reconstructions.
[Abstract]   [PDF]   [BibTex]  

P. D. Ritsos, “Mixed Reality - A paradigm for perceiving synthetic spaces,” in Real Virtuality, M. Reiche and U. Gehmann, Eds. Transcript Verlag, Bielefeld, 2014, pp. 283–310.
[BibTex]   [URL]   [ISBN:978-3-8376-2608-7]  

P. D. Ritsos, N. W. John, and J. C. Roberts, “Standards in Augmented Reality: Towards Prototyping Haptic Medical AR,” in 8th International AR Standards Meeting, 2013. Augmented Reality technology has been used in medical visualization applications in various ways. Haptics, on the other hand, are a popular method of interacting in Augmented and Virtual Reality environments. We present how reliance on standards benefits the fusion of these technologies, through a series of research themes carried out at Bangor University, UK (with international partners), as well as within the activities of the Research Institute of Visual Computing (RIVIC), UK.
[Abstract]   [PDF]   [BibTex]  

P. D. Ritsos, D. P. Ritsos, and A. S. Gougoulis, “Standards for Augmented Reality: a User Experience perspective,” in 2nd International AR Standards Meeting, 2011. An important aspect of designing and implementing Augmented Reality (AR) applications and services, often disregarded for the sake of simplicity and speed, is the evaluation of such systems, particularly by non-expert users, in real operating conditions. We strongly advocate that, in order to develop successful and highly immersive AR systems that can be adopted in day-to-day scenarios, user assessment and feedback are of paramount importance. Consequently, we also feel that an important fragment of future AR standardisation should focus on User eXperience (UX) aspects, such as the sense of presence, ergonomics, health and safety, overall usability and product identification. Our paper attempts an examination of these aspects and proposes an adaptive theoretical evaluation framework that can be standardised across the span of AR applications.
[Abstract]   [PDF]   [BibTex]
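
The paper above argues for standardising User eXperience aspects of AR evaluation. Purely as an illustration of the dimensions it names (presence, ergonomics, health and safety, usability, product identification), the following TypeScript sketch records them in a single evaluation record; the field names, the 1-7 scale and the example values are assumptions, not the paper's framework.

```typescript
// Illustrative only: a record type capturing the UX dimensions named in the
// abstract. The field names and the 1-7 scale are assumptions made for this
// sketch, not the paper's proposed framework.
type LikertScore = 1 | 2 | 3 | 4 | 5 | 6 | 7;

interface ARUserExperienceReport {
  participantId: string;
  application: string;            // which AR application/service was assessed
  senseOfPresence: LikertScore;
  ergonomics: LikertScore;
  healthAndSafety: LikertScore;
  overallUsability: LikertScore;
  productIdentification: LikertScore;
  freeTextComments?: string;      // optional qualitative feedback
}

// Example record (invented values) showing how a single evaluation session
// might be captured for comparison across AR applications.
const example: ARUserExperienceReport = {
  participantId: "P01",
  application: "handheld-ar-tour",
  senseOfPresence: 5,
  ergonomics: 4,
  healthAndSafety: 6,
  overallUsability: 5,
  productIdentification: 4,
};
```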