Panagiotis D. Ritsos

MEng PhD Essex, FHEA

Lecturer in Visualization

Visualization, Modelling and
Graphics (VMG) research group,

School of Computer Science
and Electronic Engineering,

Bangor University,
Dean Street, Bangor,
Gwynedd, UK, LL57 1UT


Immersive Analytics

Visualization researchers need to develop and adapt to today’s new devices and tomorrow’s interface technology. Today, people interact with visual depictions mainly through a mouse and, gradually, by touch. Tomorrow, they will be touching, grasping, feeling, hearing, smelling, and even tasting data.

This theme explores the concept of immersive, multisensory, post-WIMP visualization and investigates how information visualization can employ different display and presentation technologies, such as head-mounted displays, projection systems, wearables, tabletop displays and haptic interfaces. We are currently exploring different technologies that could enable IA in different scenarios.

The <VRIA> Web-based framework for Immersive Analytics

We are especially interested in using web technologies, such as WebVR/XR, for prototyping. We (mainly Peter Butcher) have been working towards the development of an open-standards, web-based framework, called <VRIA>, that facilitates the creation of virtual spaces suited for immersive analytics.

Figure 1: Overview of the <VRIA> framework and functionality. Extract from our poster presented in IEEE VIS 2018 (see below for links). [PNG]

Currently <VRIA> supports 3D bar charts and multivariate scatter plots with more visualization types planned. Every visualization component has a corresponding set of interaction components which can be configured in the visualization configuration file. New interactions can be written with A-Frame and React and added to your application with <VRIA>'s API.
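To illustrate the idea of a declarative visualization configuration file, the sketch below shows what such a configuration might look like. The schema, chart-type names and interaction fields here are hypothetical, for illustration only; they are not <VRIA>'s actual API.

```javascript
// Hypothetical sketch of a <VRIA>-style visualization config.
// Field names and chart-type identifiers are illustrative, not the
// framework's actual schema.
const config = {
  title: "Sales by region",
  data: "data/sales.json",          // dataset to load
  chart: {
    type: "barchart3D",             // e.g. a 3D bar chart
    position: "0 1 -2",             // A-Frame style position string
    encoding: {
      x: { field: "region", type: "nominal" },
      y: { field: "sales", type: "quantitative" },
      z: { field: "year", type: "ordinal" }
    }
  },
  interactions: [                   // interaction components per chart
    { type: "hover", effect: "highlight" },
    { type: "select", controller: "raycaster" }
  ]
};

// A loader might validate the chart type before mounting components:
const supported = ["barchart3D", "scatterplot3D"];
console.log(supported.includes(config.chart.type)); // true
```

A configuration of this kind lets new interactions, written with A-Frame and React, be attached to a chart simply by adding entries to the `interactions` array.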

Figure 2: Overview of the <VRIA> architecture. Extract from our poster presented in IEEE VIS 2018 (see below for links). [PNG]

In the next phase we are looking to build more 3D visualization components, along with corresponding interaction mechanisms. We also plan to integrate features that allow collaborative tasks in VR space. More information on the <VRIA> framework will appear here as we complete the work. We expect to release our framework early in 2019, pending the integration of some important (and exciting!) features.

Media 1: [VIS18 Preview] Towards a Framework for Immersive Analytics on the Web (Poster) from VGTCommunity on Vimeo.

Stand-alone Web-based MR Prototypes

We have also been using emerging JavaScript frameworks and tools, such as AWE.js, AR.js and Argon.js, for recent work on IA in MR/AR. These have been presented in various instances as posters and workshop papers (see below). In all cases, we explore the synergy of different tools over the HTML DOM. We essentially use the data manipulation mechanisms offered by libraries like D3.js and match them to depictions built with A-Frame, along with different registration mechanisms. We have also explored the concept of Synthetic Visualizations, which are a synthesis of physical, tangible objects and computer-generated information registered on said objects.
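The D3.js-to-A-Frame pattern described above can be sketched as follows. In the browser we would use `d3.scaleLinear()` and create `<a-box>` entities in the DOM; here the scale is hand-rolled so the mapping logic stands alone, and all names and values are illustrative.

```javascript
// Sketch: map data values to A-Frame geometry/position attribute strings,
// the way a D3 scale would drive <a-box> bars in a 3D bar chart.
const data = [
  { label: "A", value: 4 },
  { label: "B", value: 9 },
  { label: "C", value: 6 }
];

// Linear scale: data domain [0, max] -> bar height in metres [0, 2].
const maxValue = Math.max(...data.map(d => d.value));
const heightScale = v => (v / maxValue) * 2;

// Compute A-Frame attribute strings for each bar, spaced 0.5 m apart on x
// and raised by half their height so they sit on the ground plane.
const bars = data.map((d, i) => ({
  geometry: `primitive: box; height: ${heightScale(d.value)}`,
  position: `${i * 0.5} ${heightScale(d.value) / 2} -2`
}));

console.log(bars[1].geometry); // "primitive: box; height: 2"
```

The same computed attributes could equally be handed to entities registered on an AR marker, which is what makes the pattern portable between VR and handheld MR.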

Figure 3: We have created a series of prototypes, exploring Immersive Analytics with Web-technologies. One of our first prototypes used WebVR polyfill to display on Google Cardboard a 3D bar-chart, built using D3.js and A-Frame (left). We have also used Argon.js and A-Frame for doing a similar depiction in handheld MR (middle). Finally, we explored the synergy between QR codes and AR targets to control both the data and the registration of MR-based data visualizations. See below for links to published and under-progress work (right). [PNG]
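The QR-code idea above, where a scanned code carries both the data and the visualization type while the AR marker handles spatial registration, can be sketched as below. The payload format and renderer names are hypothetical, invented for illustration.

```javascript
// Illustrative sketch: decode a QR payload into a visualization spec.
// The "type|csv" payload format is an assumption, not our actual encoding.
function parseQrPayload(payload) {
  const [type, csv] = payload.split("|");       // e.g. "barchart|3,7,5,9"
  return { type, values: csv.split(",").map(Number) };
}

const spec = parseQrPayload("barchart|3,7,5,9");

// A renderer lookup could then swap the depiction whenever a different
// QR code is scanned, while the AR target keeps the registration fixed.
const renderers = { barchart: "vria-barchart", scatterplot: "vria-scatterplot" };
console.log(renderers[spec.type]); // "vria-barchart"
```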

Media 2: [VIS17 Preview] Web-based Immersive Analytics in Handheld Augmented Reality (Poster) from VGTCommunity on Vimeo.

Collaborators (in various publications)

Bangor University, University of Maryland - College Park, Edinburgh Napier University, University of Applied Sciences and Arts Northwestern Switzerland, University of Chester

P. W. S. Butcher, N. W. John, and P. D. Ritsos, “Towards a Framework for Immersive Analytics on the Web,” in Posters presented at the IEEE Conference on Visualization (IEEE VIS 2018), Berlin, Germany, 2018. We present work-in-progress on the design and implementation of a Web framework for building Immersive Analytics (IA) solutions in Virtual Reality (VR). We outline the design of our prototype framework, VRIA, which facilitates the development of VR spaces for IA solutions, which can be accessed via a Web browser. VRIA is built on emerging open-standards Web technologies such as WebVR, A-Frame and React, and supports a variety of interaction devices (e.g., smartphones, head-mounted displays etc.). We elaborate on our motivation for focusing on open-standards Web technologies and provide an overview of our framework. We also present two early visualization components. Finally, we outline further extensions and investigations.
[Abstract]   [PDF]   [BibTex]   [URL]   [Video]   [Poster]  

P. D. Ritsos, J. Mearman, J. R. Jackson, and J. C. Roberts, “Synthetic Visualizations in Web-based Mixed Reality,” in Immersive Analytics: Exploring Future Visualization and Interaction Technologies for Data Analytics Workshop, IEEE Conference on Visualization (VIS), Phoenix, Arizona, USA, 2017. The way we interact with computers is constantly evolving, with technologies like Mixed/Augmented Reality (MR/AR) and the Internet of Things (IoT) set to change our perception of informational and physical space. In parallel, interest for interacting with data in new ways is driving the investigation of the synergy of these domains with data visualization. We are seeking new ways to contextualize, visualize, interact-with and interpret our data. In this paper we present the notion of Synthetic Visualizations, which enable us to visualize in situ, data embedded in physical objects, using MR. We use a combination of established ‘markers’, such as Quick Response Codes (QR Codes) and Augmented Reality Markers (AR Markers), not only to register objects in physical space, but also to contain data to be visualized, and interchange the type of visualization to be used. We visualize said data in Mixed Reality (MR), using emerging web-technologies and open-standards.
[Abstract]   [PDF]   [BibTex]   [URL]  

P. D. Ritsos, J. Jackson, and J. C. Roberts, “Web-based Immersive Analytics in Handheld Augmented Reality,” in Posters presented at the IEEE Conference on Visualization (IEEE VIS 2017), Phoenix, Arizona, USA, 2017. The recent popularity of virtual reality (VR), and the emergence of a number of affordable VR interfaces, have prompted researchers and developers to explore new, immersive ways to visualize data. This has resulted in a new research thrust, known as Immersive Analytics (IA). However, in IA little attention has been given to the paradigms of augmented/mixed reality (AR/MR), where computer-generated and physical objects co-exist. In this work, we explore the use of contemporary web-based technologies for the creation of immersive visualizations for handheld AR, combining D3.js with the open standards-based Argon AR framework and A-frame/WebVR. We argue in favor of using emerging standards-based web technologies as they work well with contemporary visualization tools, that are purposefully built for data binding and manipulation.
[Abstract]   [PDF]   [BibTex]   [URL]   [Video]   [Poster]  

P. W. Butcher and P. D. Ritsos, “Building Immersive Data Visualizations for the Web,” in Proceedings of International Conference on Cyberworlds (CW’17), Chester, UK, 2017. We present our early work on building prototype applications for Immersive Analytics using emerging standards-based web technologies for VR. For our preliminary investigations we visualize 3D bar charts that attempt to resemble recent physical visualizations built in the visualization community. We explore some of the challenges faced by developers in working with emerging VR tools for the web, and in building effective and informative immersive 3D visualizations.
[Abstract]   [PDF]   [BibTex]   [URL]   [doi:10.1109/CW.2017.11]  

P. W. S. Butcher, J. C. Roberts, and P. D. Ritsos, “Immersive Analytics with WebVR and Google Cardboard,” in Posters presented at the IEEE Conference on Visualization (IEEE VIS 2016), Baltimore, MD, USA, 2016. We present our initial investigation of a low-cost, web-based virtual reality platform for immersive analytics, using a Google Cardboard, with a view of extending to other similar platforms such as Samsung’s Gear VR. Our prototype uses standards-based emerging frameworks, such as WebVR, and explores some of the challenges faced by developers in building effective and informative immersive 3D visualizations, particularly those that attempt to resemble recent physical visualizations built in the community.
[Abstract]   [PDF]   [BibTex]   [URL]   [Video]   [Poster]  

J. C. Roberts, P. D. Ritsos, S. K. Badam, D. Brodbeck, J. Kennedy, and N. Elmqvist, “Visualization Beyond the Desktop - the next big thing,” IEEE Computer Graphics and Applications, vol. 34, no. 6, pp. 26–34, Nov. 2014. Visualization is coming of age. With visual depictions being seamlessly integrated into documents, and data visualization techniques being used to understand increasingly large and complex datasets, the term "visualization" is becoming used in everyday conversations. But we are on a cusp; visualization researchers need to develop and adapt to today’s new devices and tomorrow’s technology. Today, people interact with visual depictions through a mouse. Tomorrow, they’ll be touching, swiping, grasping, feeling, hearing, smelling, and even tasting data. The next big thing is multisensory visualization that goes beyond the desktop.
[Abstract]   [PDF]   [BibTex]   [doi:10.1109/MCG.2014.82]   [Video]   [Invited at IEEE VIS 2015]

P. D. Ritsos, J. W. Mearman, A. Vande Moere, and J. C. Roberts, “Sewn with Ariadne’s Thread - Visualizations for Wearable & Ubiquitous Computing,” in Death of the Desktop Workshop, IEEE Conference on Visualization (VIS), Paris, France, 2014. Lance felt a buzz on his wrist, as Alicia, his wearable, informed him via the bone-conduction ear-piece - ‘You have received an email from Dr Jones about the workshop’. His wristwatch displayed an unread email glyph icon. Lance tapped it and listened to the voice of Dr Jones, talking about the latest experiment. At the same time he scanned through the email attachments, projected in front of his eyes, through his contact lenses. One of the files had a dataset of a carbon femtotube structure
[Abstract]   [PDF]   [BibTex]   [URL]  

J. C. Roberts, J. W. Mearman, and P. D. Ritsos, “The desktop is dead, long live the desktop! – Towards a multisensory desktop for visualization,” in Death of the Desktop Workshop, IEEE Conference on Visualization (VIS), Paris, France, 2014. “Le roi est mort, vive le roi!”; or “The King is dead, long live the King” was a phrase originally used for the French throne of Charles VII in 1422, upon the death of his father Charles VI. To stave civil unrest the governing figures wanted perpetuation of the monarchs. Likewise, while the desktop as-we-know-it is dead (the use of the WIMP interface is becoming obsolete in visualization) it is being superseded by a new type of desktop environment: a multisensory visualization space. This space is still a personal workspace, it’s just a new kind of desk environment. Our vision is that data visualization will become more multisensory, integrating and demanding all our senses (sight, touch, audible, taste, smell etc.), to both manipulate and perceive the underlying data and information.
[Abstract]   [PDF]   [BibTex]   [URL]