
Presented at Intelligent Environments 08, University of Washington, Seattle, USA, July 21-22, 2008

TOWARDS A MIXED REALITY INTELLIGENT CAMPUS

Marc Davies*, Vic Callaghan*, Michael Gardner*
*Digital Lifestyles Centre, University of Essex, UK
midavi@essex.ac.uk, vic@essex.ac.uk, mgardner@essex.ac.uk

Keywords: Simulators, Virtual Reality, Mixed Reality, Pervasive Computing, Hybrid Learning.

Abstract

This work-in-progress paper summarises our research towards the vision of creating an intelligent university campus (iCampus) based on mixed reality technology and network-based education. The paper brings together earlier work exploring how simulators and other virtual augmentation can be utilised by scientists to enhance the development, testing and demonstration of new ubiquitous technologies and environments, with our latest work aimed at creating a simulation of a classroom based on the MPK-20 Project Wonderland virtual meeting office developed by Sun Microsystems. We conclude by outlining our future plans.

1 Introduction

1.1 The Rise of Virtual Worlds

Virtual worlds are becoming increasingly popular for a variety of applications in the entertainment, business and science sectors. As computer technology has advanced, allowing a higher level of graphics on desktop PCs, simulators have increased in complexity, incorporating three-dimensional objects, textures and physics to model environments containing more real-world elements and rendering realistic responses to various stimuli. More advanced simulations have allowed researchers to begin seriously exploring the area connecting the real world with a virtual environment. Collectively known as Mixed Reality, this area can be broken down further using the Reality-Virtuality Continuum [17] into: a) Augmented Reality, where the system consists of virtual components added to a real-world environment [13], and b) Augmented Virtuality, where real-world features are added to a virtual environment [13].
1.2 Computer Games & the Internet

The computer games industry is the primary user of virtual worlds, which form the basis of most applications produced. Varying in complexity, these game-worlds range from a basic simulation of a chess board to a complex virtual environment the size of a country, continent, planet or universe. Originally computer games were designed to be used by a single person or a small group via a local network. However, with the advancement of the internet, new categories of games emerged, specially designed to exploit global connectivity. Online games, such as the Warcraft MMORPG (Massively Multiplayer Online Role-Playing Game) series by Blizzard Entertainment [24], brought simulator modelling to new levels by offering vast, highly detailed online worlds, simultaneously used by large numbers of users accessing them from anywhere on the planet. Broadband technology has allowed this medium to extend further, with higher data-transfer speeds making it possible for the detailed worlds normally found in offline games to be brought online. The latest generations of computer games consoles have each been designed for broadband internet connectivity, allowing traditional offline game genres (racing etc.) to be updated so players can challenge opponents online from anywhere in the world. The success of computer games designed to be played online [24] has led to an off-shoot genre of online social communities (e.g. Second Life [16]), where people can log in to a virtual world in which they see and interact with other users, without any of the mission-based objectives or tournaments found in traditional online computer games. The Sony EyeToy, introduced for the PS2, added augmented virtuality functionality to the console, allowing gamers to see themselves inside a virtual world and interact with its contents.

Fig. 1: Milgram's Reality-Virtuality Continuum (© University of Essex)

1.3 Businesses & Virtual Worlds

Second Life [16], a virtual online community, has expanded to the extent that businesses are being established in the virtual environment, with real-world money exchanged for products and services used within the virtualised space. Several real-world multinational companies and big brands have been prompted into opening their own Second Life virtual outlets [2], and some real countries, such as the Maldives and Sweden, have even created their own Second Life embassies [3]. The highly detailed virtual worlds produced by the computer games industry have also prompted action within the business sector. As an example, later in this paper we discuss Sun Microsystems' MPK-20 virtual meeting environment. Additionally, firms are now signing deals with game developers to incorporate advertisements promoting their products into the virtual worlds created for future titles.

1.4 Education & Research

An example of a virtual world used for education is NASA's simulator for training next-generation astronauts [1]. Other simulators (many based on virtual worlds from computer games) are already in common use for training people in high-risk or stressful occupations, e.g. surgeons and soldiers [4][23][25]. By designing their own computer games, young people can acquire Computer Science skills. This method matches the expectations of younger generations, raised with computers, who are unimpressed with simplistic visualisations [18]. Sun Microsystems' MPK-20 virtual meeting environment [22], discussed later in this paper, has also been used for presentations and training. Traditionally, most simulators created for scientific research were used as a means of visually displaying a set of data recorded by real-world sensors. Many are visually unimpressive, modelling worlds at a basic level whilst ignoring most or all of the natural background noise found in real environments.

1.5 Bringing it Together

Clearly, from the preceding sections it is evident that there has been vast investment in games and commercial simulators. Our aim is to harness the synergy of this work to create a mixed reality campus in the most cost-effective and technically effective way we can. For the initial stage of our research towards an iCampus simulation [8], we assessed the relative advantages and disadvantages of simulating smart environments using modified commercial computer games, versus developing a bespoke simulator program from scratch. The following sections (2 & 3) provide a brief overview of this work.

2 Simulating Living Spaces

2.1 The iSpace

The first simulator was designed to model the iSpace, a full-sized two-bedroom apartment located at the University of Essex, which serves as an accommodation model for the iCampus. A purpose-built test-bed for pervasive computing research, the iSpace features all the furniture and devices found in a normal home environment, in addition to hollow walls and ceilings fitted with a myriad of embedded-computer based technology [5][6][14].

Fig. 2: Views of the iSpace

2.2 The Electronic Arts Sims Game as a Simulator

The iSpace simulation was created by modifying an off-the-shelf copy of the Sims computer game (Maxis/EA Games, 2000). Apart from the 3D graphics and supporting tools, a particularly attractive feature of the Sims was the fairly realistic behaviour of the environment's inhabitants. The simulation consisted of a five-room environment modelled on the iSpace [5][6][14]. Each object and person was controlled by at least one thread, placed on a stack and run in sequence by the game. Object threads were used to regulate the animation displayed by the game's virtual machine [9].
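The thread-per-object arrangement just described can be sketched as follows; the class names and behaviours are hypothetical illustrations of the scheduling idea, not the Sims' actual engine or scripting interface:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Illustrative sketch: every simulated object owns a "thread" (here, a tick
// task) that the game places on a stack and runs in sequence once per cycle,
// with each object regulating only its own animation state.
public class ObjectThreadScheduler {

    /** One simulated object; tick() regulates its own animation state. */
    interface SimObjectThread {
        String name();
        String tick(); // returns the animation state to display this cycle
    }

    static class Lamp implements SimObjectThread {
        private boolean on = false;
        public String name() { return "lamp"; }
        public String tick() { on = !on; return on ? "glow" : "dark"; }
    }

    static class Television implements SimObjectThread {
        private int channel = 0;
        public String name() { return "tv"; }
        public String tick() { channel = (channel + 1) % 3; return "channel-" + channel; }
    }

    private final Deque<SimObjectThread> stack = new ArrayDeque<>();

    public void register(SimObjectThread t) { stack.push(t); }

    /** Run every object thread once, in sequence, as the game's VM would. */
    public String runCycle() {
        StringBuilder frame = new StringBuilder();
        for (SimObjectThread t : stack) {
            frame.append(t.name()).append('=').append(t.tick()).append(' ');
        }
        return frame.toString().trim();
    }

    public static void main(String[] args) {
        ObjectThreadScheduler game = new ObjectThreadScheduler();
        game.register(new Lamp());       // pushed first, runs last
        game.register(new Television()); // pushed last, runs first
        System.out.println(game.runCycle());
    }
}
```

Note that, as in the game, each object here can only see its own state; the cross-object access problem this creates is discussed next.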
Most objects could only access their own threads; for example, a television could not access information contained in a thread for a lamp. To create a Sims-based simulation, the original program code had to be modified so objects could access the threads for other devices and any required information contained within. For this stage of the project, the most efficient way to achieve this was to program a single Sims object to act as a 'remote control' for the other pervasive devices in the environment. The 'Dumbold Voting Machine' [12], an add-on device available online, was modified to act as a remote interface usable by Sims avatars in the environment. Within the code for the re-programmed voting machine, the current state of each pervasive device in the environment was stored to memory.

Fig. 3: The Sims iSpace Simulator

Agent code, added to specially created classes, ran from the voting machine thread, prompting state changes to objects in the environment as required. Agents determined when to make changes using the sensor settings coming into the voting machine thread on each cycle. The menus of the voting machine were reprogrammed to provide a manual interface for researchers (see Fig. 4). This menu was used to force the priority of actions performed by a Sims avatar.

Fig. 4: Sims Object Remote Control Menu

The original program allowed a player to design, build and furnish a house to their own specifications, using numerous pre-programmed materials and objects available in the game libraries. Using a game to create a digital home simulator introduced several advanced features that provide a higher level of realism to the environment, including avatars that randomly visit the virtual home. Another benefit of using computer games is that researchers can take advantage of the popularity of the original product, as a level of familiarity with the environment could be established in the minds of observers. Additionally, popular games often spawn myriad online fan-sites, frequently offering freeware add-ons and/or modifications, which a researcher/developer could exploit to further expand the realism or capabilities of their simulation.

3 The Sun Microsystems Darkstar Game Server and Tools as Simulators

The Sims was programmed using a bespoke language, created by the original game developers, which served the needs of an in-house development team very well. However, as our aim for the iCampus simulation was to create an open development platform, it was necessary for us to consider the use of a more common programming language such as C/C++ or Java. Thus, for the second phase of our iCampus simulation research, we investigated Sun Microsystems' Project Darkstar (a.k.a. Sun Game Server), a massively multi-user game server [21]. In particular, we used Project Wonderland, a Java-based client-server simulator package created from a combination of several previous software applications developed by the company.

As Sun has a large workforce distributed around the world, it was difficult for the company to hold meetings where everybody in a team could be physically present in a single location. Moreover, traditional technical solutions such as video conferencing fell well short of delivering the functionality and the "feel" of face-to-face meetings. Sun therefore used their Darkstar and Project Wonderland technology to create MPK-20, a fictitious virtual building designed for online meetings between Sun employees.

In more detail, Project Wonderland's client is based on several programs, including Project Looking Glass to generate a scene and the jVoiceBridge for adding audio [22]. The graphical content that creates the visible world, as well as the screen buffers controlling the scene, is programmed using Java3D [22].
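As a concrete illustration of the agent-driven device control used in the Sims-based simulator of Section 2.2 — the kind of behaviour an open Java platform would need to replicate — consider this minimal sketch; the device names, sensor readings and rules are all hypothetical assumptions, not code from either system:

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of a remote-control agent: a single controller keeps
// the state of every pervasive device in memory, inspects the sensor
// settings arriving on each cycle, and prompts state changes as required.
public class RemoteControlAgent {
    private final Map<String, String> deviceState = new HashMap<>();

    public RemoteControlAgent() {
        deviceState.put("lamp", "off");
        deviceState.put("heater", "off");
    }

    /** One agent cycle: read incoming sensor readings, change device states. */
    public void cycle(Map<String, Double> sensors) {
        // Example rule: switch the lamp on when the light level falls.
        if (sensors.getOrDefault("lightLevel", 1.0) < 0.3) {
            deviceState.put("lamp", "on");
        }
        // Example rule: heat the room when it becomes cold.
        if (sensors.getOrDefault("temperature", 21.0) < 18.0) {
            deviceState.put("heater", "on");
        }
    }

    public String stateOf(String device) { return deviceState.get(device); }

    public static void main(String[] args) {
        RemoteControlAgent agent = new RemoteControlAgent();
        Map<String, Double> sensors = new HashMap<>();
        sensors.put("lightLevel", 0.1);   // dusk
        sensors.put("temperature", 16.5); // cold room
        agent.cycle(sensors);
        System.out.println("lamp=" + agent.stateOf("lamp")
                + " heater=" + agent.stateOf("heater"));
    }
}
```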
Additional add-on objects/components for the Wonderland world (e.g. a camera device to record the audio and video seen in the client window) make use of other Sun packages such as JMF (Java Media Framework). Graphical content can be added to a Wonderland world by creating objects in a graphics package (e.g. Blender or Maya), then exporting the image file into the virtual world with J3dFly (another Java-based open source project). Project Wonderland is open source, so all code is available to download for free, including Sun's MPK-20 environment.

There are currently two types of avatar featured in the MPK-20/Wonderland environment. First are NPCs (non-player characters), which are static in the virtual world, often forming background characters, providing audio explanations over the voice-bridge, or otherwise simply adding to the general ambience of the environment via private conversations between two people. The other type of avatar is the PC (player character), each of which represents a single client logged into the environment server. Each PC is capable of walking around the virtual world (displayed via an animation). Eventually it is intended that a PC will have an appearance similar to that of its real-world controller; however, at the moment, unless coded with a specific template, an avatar is automatically generated upon login. Each PC avatar has the login name of its controller floating above it to identify individuals. Controllers can speak through their avatar to others in the world via the voice-bridge and a microphone, or use a dedicated chat window for text-based messages. While most objects in the Wonderland environment are static, some can be clicked in the client window by a controller using a mouse. A virtual whiteboard can be drawn on in-world by one or several users, and PDF documents and presentations can be viewed and edited.
Currently the avatars themselves cannot use objects, and have only a basic form of artificial intelligence, which allows them merely to wander randomly around the environment when not being controlled. The scene generated by Wonderland and displayed in the client window can be viewed from a first-person or several third-person perspectives.

Fig. 5: Sun's MPK-20 Meeting Environment

4 A Mixed Reality Teaching and Learning Environment (MiRTLE)

Inspired by this work, we began the MiRTLE (Mixed Reality Teaching and Learning Environment) project. At the time of writing, the MiRTLE project is based around a mixed reality lecture/seminar room, adding a virtual counterpart to a real-world environment. The virtual components have been created using Sun's Project Wonderland software. In addition to the seminar room, the MiRTLE environment also contains a hallway for students waiting for a lecture to begin. Virtual avatars are to be added to represent lecturers and remote students using the environment. Key avatars will have relevant speech and audio files attached through the Wonderland voice-bridge system.

Fig. 6: The MiRTLE Seminar/Lecture Room

The plan is to deploy MiRTLE on the SJTU eLearning platform, where it will enable remote students to see an image of other remote students (emulating their presence in a real classroom). Thus they will see images representing each other in virtual classroom positions, as will the teacher. This virtual classroom layer is superimposed on the real classroom, providing a mixed reality setting in which remote and local learners are integrated into the same space. In addition, emotion sensing and display are used to add another level of realism and feedback for the teacher.

5 The Next Step

In our last paper we outlined our vision for a Smart Classroom as the next step towards the iCampus vision [8], based on work we are undertaking in collaboration with Shanghai Jiao Tong University, China, on a network-based Open eLearning platform [19][20].

The Sims iSpace Simulator and the MiRTLE programs are both examples of virtual worlds modelling environments at room level. By this we mean that several virtual iSpace environments or MiRTLE seminar rooms could be placed into a single larger simulated world, modelling the contents of a virtual intelligent building (iBuilding). Sun's MPK-20/Project Wonderland environment was designed to model a fictional building (although not necessarily an intelligent one). Visually, the MPK-20 environment is simply a larger version of the Sims iSpace and MiRTLE simulations: a space divided into numerous smaller sections by internal walls. By utilising the distributed computing infrastructure of Darkstar, the simulation components (e.g. rooms or buildings) can be distributed in ways that improve processing and maintenance, allowing the assembly of massive simulations such as whole towns or cities.

For the next stage of this project, our immediate aims are to take the iCampus vision forward by creating a new simulator framework, extending the scale of the virtual environment to a campus-wide simulation. A building (MPK-20) is simply a space large enough to be divided into multiple smaller rooms (Sims iSpace, MiRTLE) connected by corridors. Therefore, a university campus would be a space large enough to contain multiple buildings, connected by roads or pathways. Following this principle, we intend to create an iCampus modelling the University of Essex layout (see Fig. 7), which allows people to inhabit a virtual environment, interacting with each other and devices in a similar way to a real campus. The work has included an EA Sims based simulation of the Essex iSpace and the Project Wonderland based simulation of a smart classroom (MiRTLE).
Whilst these will only form sub-components of the larger iCampus simulation, their success establishes the essential first steps in building the larger iCampus virtualised world.

Fig. 7: Relationship Pyramid for an iCampus

A campus layout was chosen because we have a major research effort underway on mixed reality educational environments [20]. The iSpace test-bed, modelled by the Sims iSpace Simulator, was designed to represent student accommodation, while lecture theatres/seminar rooms on campus can be represented through the MiRTLE and/or Smart Classroom systems. As the Sims-based simulation created for the early stage of this project would be difficult to incorporate into the iCampus, we are considering using Java3D (used in Project Wonderland) [22] or an equivalent package (e.g. Java Monkey Engine) [15]. Using a client-server architecture, the iCampus will allow many users to log in from any location in the world and use an avatar to walk around a virtual representation of the real university and, to name but some activities, participate in interactive lectures, tutorials, group projects and meetings. The mixed reality style used in MiRTLE will be incorporated into the new system, to add augmented virtuality functionality. We also intend to perform research into augmented reality, which would allow us to display content created in the iCampus at locations in its real-world counterpart. There will be an additional option to populate the virtual world with NPC avatars controlled by an inbuilt artificial intelligence system. This would function on a similar level to the Sims game avatars, with characters interacting with the world's objects to meet specific personal needs. The primary purpose of the new system will be to offer a simulator capable of providing a virtual world (with mixed reality components if required), modelled at room-scale, building-scale or community-scale.
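The room, building and campus containment principle described above, together with Darkstar's ability to place different simulation components on different servers, can be made concrete with a small sketch; the class names, node labels and layout below are illustrative assumptions, not Darkstar's actual API:

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch of the room -> building -> campus containment
// hierarchy, with each building assignable to a different server node so
// the simulation can be distributed across machines.
public class CampusModel {
    static class Room {
        final String name;
        Room(String name) { this.name = name; }
    }

    static class Building {
        final String name;
        final String serverNode; // where this building would be simulated
        final List<Room> rooms = new ArrayList<>();
        Building(String name, String serverNode) {
            this.name = name;
            this.serverNode = serverNode;
        }
    }

    static class Campus {
        final List<Building> buildings = new ArrayList<>();
        int roomCount() {
            return buildings.stream().mapToInt(b -> b.rooms.size()).sum();
        }
    }

    public static void main(String[] args) {
        Campus essex = new Campus();

        Building accommodation = new Building("accommodation block", "node-1");
        accommodation.rooms.add(new Room("iSpace"));

        Building teaching = new Building("lecture block", "node-2");
        teaching.rooms.add(new Room("MiRTLE seminar room"));
        teaching.rooms.add(new Room("hallway"));

        essex.buildings.add(accommodation);
        essex.buildings.add(teaching);

        System.out.println(essex.buildings.size() + " buildings, "
                + essex.roomCount() + " rooms");
    }
}
```

Scaling the simulation up then amounts to adding further buildings (and server nodes) to the campus object, rather than redesigning the simulator.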
6 Conclusions

In this paper we have described ongoing work aimed at delivering a mixed reality intelligent campus (iCampus) based on mixed reality technology and network-based education. We have outlined our plans for developing the iCampus; in this, special attention will be given to augmented reality features which overlay content from the simulator onto the real world, as we intend to include mixed reality components. In addition to providing an environment that real learners can use, we also intend that this environment should be usable by researchers and developers in pervasive computing, to create new technologies (e.g. agents), especially those for use in wide-area locations. Finally, throughout the paper we have referred to simulators and virtual worlds as identical entities. From our perspective, a virtual world is a simulation of the real world, either: a) literally (e.g. the Sims Simulator modelling the real-world iSpace), or b) fictionally (e.g. MPK-20, a simulation of a building with no physical presence that is nevertheless based on the real world). Thus, for example, the virtual worlds used in computer games (which may be set on alien planets etc.) are simulating the real world from the perspective of the story being told or the purpose they serve.

Acknowledgements

We are pleased to acknowledge: Electronic Arts for the Sims Edith editor; Sun Microsystems for financially supporting the development of MiRTLE; Bernard Horan (Sun Microsystems) for invaluable advice throughout this work; Ms. Liping Shen and Prof. R. Shen (Shanghai Jiao Tong University) for information on their Network Education College technology (to which this work will connect); and John Scott (Essex University), who programmed the MiRTLE prototype. Finally, the bulk of the work reported in this paper is part of my PhD, which is funded from private resources, for which I wish to express my gratitude to my parents and uncle.

References

[1] BBC News, Nasa investigates virtual space, http://news.bbc.co.uk/1/hi/technology/7195718.stm, Retrieved: 21st January 2008.
[2] BBC News, Online world to get news bureau, http://news.bbc.co.uk/1/hi/technology/6054352.stm, Retrieved: 31st January 2008.
[3] BBC News, Sweden plans Second Life embassy, http://news.bbc.co.uk/1/hi/world/europe/6310915.stm, Retrieved: 1st February 2008.
[4] BBC News, Virtual bodies aid surgery skills, http://news.bbc.co.uk/1/hi/scotland/7167856.stm, Retrieved: 10th February 2008.
[5] Callaghan V., Clarke G., Chin J., "Some Socio-Technical Aspects of Intelligent Buildings and Pervasive Computing Research", Intelligent Buildings International Journal, Vol. 1, No. 1, 2007.
[6] Callaghan V., Clarke G., Colley M., Hagras H., Chin J.S.Y., Doctor F., "Intelligent Inhabited Environments", BT Technology Journal, Vol. 22, No. 3, Kluwer Academic Publishers, Dordrecht, Netherlands, July 2004.
[7] Clarke G., Callaghan V., "Ubiquitous Computing, Informatization, Urban Structures and Density", Built Environment Journal, Vol. 33, No. 2, 2007.
[8] Davies M., Callaghan V., Shen L., "Modelling Pervasive Environments Using Bespoke & Commercial Game-Based Simulators", Proceedings of the 2007 International Conference on Life System Modelling and Simulation (LSMS '07), Shanghai, China, Springer, September 2007.
[9] Forbus K.D., Wright W., "Some notes on programming objects in The Sims", Northwestern University, 31 May 2001.
[10] Funge J.D., "Artificial Intelligence for Computer Games", A K Peters Ltd., Wellesley, Massachusetts, 2004, ISBN 1-56881-208-6.
[11] Hagras H., Callaghan V., Colley M., Clarke G., Pounds-Cornish A., Duman H., "Creating an Ambient-Intelligence Environment Using Embedded Agents", IEEE Intelligent Systems, Vol. 19, Issue 6, pp.
12-20, Nov/Dec 2004.
[12] Hopkins D., Dumbold Voting Machine, http://www.donhopkins.com/drupal, Retrieved: 19th June 2006.
[13] Hughes C.E., Stapleton C.B., Hughes D.E., Smith E.M., "Mixed Reality in Education, Entertainment, and Training", IEEE Computer Graphics and Applications, Vol. 25, Issue 6, Nov-Dec 2005, pp. 24-30.
[14] IIEG, iDorm2, http://iieg.essex.ac.uk/idorm2, Retrieved: 18th March 2007.
[15] Java Monkey Engine, http://www.jmonkeyengine.com, Retrieved: 10th February 2008.
[16] Linden Lab, Second Life, http://secondlife.com/, Retrieved: 5th April 2007.
[17] Milgram P., Kishino F., "A Taxonomy of Mixed Reality Visual Displays", IEICE Transactions on Information and Systems, Vol. E77-D, No. 12, Dec 1994, pp. 1321-1329.
[18] Overmars M., "Teaching Computer Science through Game Design", IEEE Computer, Vol. 37, Issue 4, pp. 81-83, April 2004.
[19] Shen L., Leon E., Callaghan V., Shen R., "Exploratory Research on an Affective eLearning Model", International Workshop on Blended Learning 2007 (WBL 07), 15-17 August 2007, University of Edinburgh, Scotland.
[20] Shen L., Leon E., Callaghan V., Shen R., "An Affective eLearning Model", in "Blended Learning", Pearson, September 2007.
[21] Sun Microsystems, Project Darkstar, https://games-darkstar.dev.java.net, Retrieved: 20th April 2007.
[22] Sun Microsystems, lg3d-wonderland, https://lg3d-wonderland.dev.java.net, Retrieved: 2nd February 2008.
[23] Voth D., "Gaming technology helps troops learn language", IEEE Intelligent Systems, Vol. 19, Issue 5, Sept-Oct 2004, pp. 4-6.
[24] World of Warcraft Europe, http://www.wow-europe.com, Retrieved: 8th February 2008.
[25] Zyda M., Hiles J., Mayberry A., Wardynski C., Capps M., Osborn B., Shilling R., Robaszewski M., Davis M., "Entertainment R&D for Defense", IEEE Computer Graphics and Applications, Vol. 23, Issue 1, Jan-Feb 2003, pp. 28-36.