Nico Reski, Aris Alissandrakis, and Jukka Tyrkkö
Linnaeus University, Sweden
Collaborative exploration of rich corpus data using immersive virtual reality and non-immersive technologies
In recent years, large textual data sets, comprising many data points and rich metadata, have become a common object of investigation and analysis. Information Visualization and Visual Analytics provide practical tools for visual data analysis, most commonly as interactive two-dimensional (2D) visualizations displayed on normal computer monitors. At the same time, display technologies have evolved rapidly over the past decade. In particular, emerging technologies such as virtual reality (VR), augmented reality (AR), and mixed reality (MR) have become affordable and more user-friendly (LaValle 2016). Under the banner of “Immersive Analytics”, researchers have started to explore novel applications of such immersive technologies for the purpose of data analysis (Marriott et al. 2018).
By using immersive technologies, researchers hope to increase motivation and user engagement in the overall data analysis activity, as well as to provide different perspectives on the data. This can be particularly helpful in exploratory data analysis, when the researcher attempts to identify interesting points or anomalies in the data without prior knowledge of what exactly they are searching for. Furthermore, the data analysis process often involves the collaborative sharing of information and knowledge between multiple users, with the goal of interpreting and making sense of the explored data together (Isenberg et al. 2011). However, immersive technologies such as VR tend to be single-user experiences, where one user wears a head-mounted display (HMD) device and is thus visually isolated from the real-world surroundings. Consequently, new tools and approaches for co-located, synchronous collaboration in such immersive data analysis scenarios are needed.
In this software demonstration, we present our VR system that enables two users to explore data at the same time: one inside an immersive VR environment, and one outside VR using a non-immersive companion application. The demonstrated data analysis activity centers on exploring language variability in tweets from the perspectives of multilingualism and sociolinguistics (see, e.g., Coats 2017 and Grieve et al. 2017). Our primary data come from the Nordic Tweet Stream (NTS) corpus (Laitinen et al. 2018, Tyrkkö 2018). The immersive VR application visualizes the clustered Twitter traffic within the Nordic region in three dimensions (3D) as stacked cuboids placed according to their geospatial position, where each segment of a stack represents a color-coded language share (Alissandrakis et al. 2018). Using 3D gestural input, the VR user can interact with the data through hand postures and gestures to move through the virtual 3D space, select clusters to display more detailed information, and navigate through time (Reski and Alissandrakis 2019) ( https://vrxar.lnu.se/apps/odxvrxnts-360/ ). A non-immersive companion application, running in a normal web browser, presents an overview map of the Nordic region as well as supplemental information about the data that is more suitable for display using non-immersive technologies.
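As a rough illustration of the visualization mapping described above, a cluster can be thought of as a geospatial anchor plus per-language tweet counts, from which the heights of the color-coded cuboid segments are derived. The following is a minimal sketch, not the actual implementation (see Alissandrakis et al. 2018 for that); the field names and the scaling constant are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class TweetCluster:
    """One geospatial cluster of Twitter traffic, rendered in VR as a
    stack of cuboids, one color-coded segment per language."""
    lat: float                  # geospatial anchor of the stack
    lon: float
    tweets_per_language: dict   # e.g. {"sv": 120, "en": 80}

    def stack_segments(self, height_per_tweet: float = 0.01):
        """Return (language, cuboid_height) pairs, largest share first,
        so each segment's height is proportional to that language's
        share of the cluster's tweets."""
        ordered = sorted(self.tweets_per_language.items(),
                         key=lambda kv: kv[1], reverse=True)
        return [(lang, count * height_per_tweet) for lang, count in ordered]

# Hypothetical cluster near Stockholm with three language shares
cluster = TweetCluster(59.33, 18.07, {"sv": 120, "en": 80, "fi": 40})
print(cluster.stack_segments())
```

Under this mapping, the total stack height reflects overall traffic at a location, while the relative segment heights convey the language distribution at a glance.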
We will present two complementary applications, each with a different objective within the collaborative data analysis framework. The design and implementation of dedicated connectivity and collaboration features within these applications facilitate co-located, synchronous exploration and sensemaking. For instance, the VR user’s position and orientation are displayed and updated in real time on the overview map of the non-immersive application. Conversely, the cluster selected by the non-immersive user is also highlighted for the user in VR. Initial tests with pairs of language students validated the proof of concept of the developed collaborative system and encourage further investigations in this direction.
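The cross-application awareness features described above can be sketched as a simple bidirectional message exchange: the VR application streams the user’s pose to the companion application, which in turn reports its user’s cluster selection back to VR. The sketch below assumes JSON messages over a channel such as a WebSocket; the message types and field names are our assumptions for illustration, not the system’s actual protocol:

```python
import json

def pose_message(position, yaw_degrees):
    """VR -> companion: the VR user's position and view direction,
    sent continuously so the overview map can update their marker."""
    return json.dumps({"type": "vr_pose",
                       "position": position,   # [x, y, z] in scene units
                       "yaw": yaw_degrees})

def selection_message(cluster_id):
    """Companion -> VR: the cluster picked on the overview map,
    so the VR app can highlight the same cluster for the immersed user."""
    return json.dumps({"type": "map_selection", "cluster": cluster_id})

def handle(raw, state):
    """Apply an incoming message to a shared-state dict on either side."""
    msg = json.loads(raw)
    if msg["type"] == "vr_pose":
        state["vr_user"] = {"position": msg["position"], "yaw": msg["yaw"]}
    elif msg["type"] == "map_selection":
        state["highlighted_cluster"] = msg["cluster"]
    return state

# Example round trip: each side applies the other's messages
shared = {}
handle(pose_message([1.0, 0.0, 2.5], 90), shared)
handle(selection_message("stockholm-03"), shared)
```

Keeping both views driven by the same small shared state is one way such mutual awareness (VR pose on the map, map selection in VR) can stay consistent for co-located, synchronous collaboration.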
Alissandrakis, Aris, Nico Reski, Mikko Laitinen, Jukka Tyrkkö, Magnus Levin, and Jonas Lundberg. 2018. Visualizing dynamic text corpora using virtual reality. In The 39th Annual Conference of the International Computer Archive for Modern and Medieval English (ICAME 39): Corpus Linguistics and Changing Society, ICAME 39 Book of Abstracts, page 205, Tampere, Finland, 30 May–3 June 2018. International Computer Archive of Modern and Medieval English (ICAME).
Coats, Steven. 2017. European language ecology and bilingualism with English on Twitter. In Ciara Wigham and Egon Stemle (eds.), Proceedings of the 5th Conference on CMC and Social Media Corpora for the Humanities, 35-38. Bozen/Bolzano: Eurac Research.
Grieve, Jack, Andrea Nini, and Diansheng Guo. 2017. Analysing lexical emergence in American English online. English Language and Linguistics 21: 99-127.
Isenberg, Petra, Niklas Elmqvist, Jean Scholtz, Daniel Cernea, Kwan-Liu Ma, and Hans Hagen. 2011. Collaborative visualization: Definition, challenges, and research agenda. Information Visualization 10(4): 310-326.
Laitinen, Mikko, Jonas Lundberg, Magnus Levin, and Rafael Martins. 2018. The Nordic Tweet Stream: A Dynamic Real-Time Monitor Corpus of Big and Rich Language Data. In Proceedings of the Digital Humanities in the Nordic Countries 3rd Conference, Helsinki, Finland, March 7-9, 2018. CEUR-WS.org.
LaValle, Steven M. 2016. Virtual Reality. Online: http://vr.cs.uiuc.edu [Accessed: 20.02.2019].
Marriott, Kim, Falk Schreiber, Tim Dwyer, Karsten Klein, Nathalie H. Riche, Takayuki Itoh, Wolfgang Stuerzlinger, and Bruce H. Thomas (eds.). 2018. Immersive Analytics. Springer.
Reski, Nico and Aris Alissandrakis. 2019. Open data exploration in virtual reality: A comparative study of input technology. Virtual Reality. https://doi.org/10.1007/s10055-019-00378-w
Tyrkkö, Jukka. 2018. How Nordic is Nordic Tweeting? Big Data perspectives to online multilingualism. Presentation at Poznan Linguistics Meeting, September 14, 2018.