This is the second installment of the Virtual Spaces: enabling immersive collaborative enterprise series. The first article briefly described the history of virtual worlds and introduced and defined the notion of virtual spaces. It explained the usage patterns and technologies that contributed to the rise of virtual spaces development in IBM, and compared and contrasted virtual spaces with virtual worlds. The article then examined six platforms that were used either to successfully build virtual spaces or to explore specific aspects and desired functions: Active Worlds, Forterra OLIVE, OpenSimulator, Second Life, Torque, and Unity.
In Part 2, you will learn about virtual spaces built for large virtual events, collaboration between multicultural teams, remote mentoring, and identity in virtual spaces. In addition to the business context and social aspects of each scenario, you will examine the challenges that developers had to overcome and an architectural overview of the resulting solution with its enabling technologies. You will also learn about the tangible business results achieved by each virtual space and the lessons learned. The article also explores the motivations for and benefits of an SOA-based approach to virtual spaces.
During the last two years, IBM has amassed a large amount of pragmatic knowledge about how to run large events in virtual spaces, which is described in the sections below.
Our original efforts utilized the public Second Life (SL) grid, which is easily accessible and provides the scripting and user content creation capabilities enjoyed by many IBM virtual worlds enthusiasts. This option is inexpensive, since basic user accounts are free and a large amount of content is available. On the other hand, since SL is a public grid outside of IBM's firewall, and is therefore not secure, no confidential material can be used or shared. While rich in content, the SL environment is not business oriented; many locations are irrelevant (or inappropriate) to IBM business objectives. Although harassment (called griefing) of an SL resident is not commonplace, it is nonetheless a reality of the nature of the environment. In addition, if the enterprise wants control over its content and users, the main SL grid is not appropriate, as it is subject to the Linden Lab Terms of Service (ToS).
Unlike the sharded model used by games such as World of Warcraft, the public SL grid is a single large 2D grid of processors, where each processor manages a 256m by 256m square of virtual land. A large database (known as the Asset Server) holds the details of every 3D model, texture, avatar, or other asset. The diagram in Figure 1 is simplified and does not show "environmental" services such as login and instant messaging.
Figure 1. Simplified view of the public Second Life grid
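As an illustrative sketch of this partitioning (not Linden Lab's actual code; the function names and layout are our own, assuming only the 256m regions described above), mapping a global coordinate to its responsible simulator is simple integer arithmetic:

```python
REGION_SIZE = 256  # each simulator manages a 256m x 256m square of virtual land

def region_for(x, y):
    """Return the (column, row) of the region responsible for a global coordinate."""
    return (int(x // REGION_SIZE), int(y // REGION_SIZE))

def local_coords(x, y):
    """Position of the point within its region, in meters."""
    return (x % REGION_SIZE, y % REGION_SIZE)

# A point at global (700, 300) falls in region (2, 1), at local offset (188, 44).
print(region_for(700, 300), local_coords(700, 300))
```

In this model, an avatar crossing a region boundary is simply handed off from one simulator process to its neighbor, while all asset lookups go to the shared Asset Server.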
- New users require assistance: The SL viewer is complex, and new users must negotiate a relatively steep learning curve to become skilled with the interface. Anecdotal evidence suggests that a new user may require up to four hours to become comfortable using the SL viewer and customizing their avatar. Basic tasks such as adjusting the point-of-view to focus on a presenter or presentation screen are not intuitive.
- The inability to discuss confidential material severely limits the utility of the environment for business use.
As the desire to use the medium of virtual worlds to experiment with immersive virtual events grew, the limitations of the public SL grid as a business environment became serious obstacles. In early 2008, the IBM Academy of Technology (AoT) decided to hold its first meeting in a virtual environment. Because of the confidentiality requirements for many of the topics handled by the AoT, its meetings must be conducted in a secure environment hosted behind the IBM firewall and accessible only to those with the proper authorization. IBM and Linden Lab began to work together on the implementation of a "private" SL grid, accessible only to IBMers, where they could discuss confidential material and avoid griefing and other problems common to the public grid. This nine-month project culminated in the first AoT conference in a virtual space, the topic of which, appropriately, was "Virtual Worlds".
The diagram in Figure 2 shows an overview of the hybrid SL grid. The IBM-only regions and Asset Server are contained behind the IBM firewall. Confidential content is hosted on the IBM Asset Server, and non-IBMers are unable to travel to or access any of the content in the IBM grid.
Figure 2. Architectural overview of IBM's hybrid SL grid
One hundred and fifty people from 97 countries attended the first major IBM virtual spaces event, held over three days in SL running behind the IBM firewall. Activities ranged from 37 formal sessions to "walk around" poster sessions to social events with jet skis and hang gliders.
The importance of remote cooperative creation became obvious: many volunteer designers and builders from around the world put thousands of hours into creating this environment. The self-representation of conference participants demonstrated the individuality and diversity of their backgrounds and choices: some arrived as business people, others as robots, with blue hair, fancy dresses, or jeans; one even came as a tiny motorcycle with sound effects.
On-boarding (setting up and training users) was time consuming. Both new and current SL users needed to download and install a new viewer to enable the private Voice over Internet Protocol (VoIP) feature, and VoIP itself was prone to glitches. Voices from sessions scheduled in adjacent virtual spaces overlapped, reminiscent of thin hotel walls. Navigation between public and private space using landmarks was buggy, resulting in several 'lost avatars'.
Despite its issues, VoIP proved to be the "killer app" that made it all work. With VoIP, users are much more inclined to participate and interact with each other. User feedback indicated that the intangible ability to "schmooze" (simply to meet, connect, and talk to other people, as one would at a real-world conference) is a major benefit of virtual space events.
Since participants in AoT events come from all over the world, travel is always an issue, not just because of the expenses, but also the travel time. The virtual conference saved hundreds of thousands (in US$) in travel and conference costs and eliminated travel "down-time" for participants.
Shortly after this first successful event, unpredictable global economic conditions led to the cancellation of the in-person AoT general meeting. Suddenly, the experimental hybrid SL virtual space was seen as a viable platform for the larger, more elaborate general meeting. The AoT general meeting brought together over 200 participants from 34 countries. About 150 poster sessions were held on the SL platform, as well as a variety of additional meetings and social events. Feedback indicated that about 75% of the participants considered the event a success.
Since these first two successful events, many IBM organizations have become interested in exploiting virtual spaces for their business needs, which has led to the additional work described in the following sections.
During an economic downturn that limits business travel, find an innovative way to bring together more than 200 IBM researchers from six geographically distributed labs and enable them to share their newest research activities in a kickoff event.
This "mixed-reality" event integrated SL and real-life meetings and was held in two sessions to accommodate the worldwide audience. During the event, the IBM Research Senior VP and four guest panelists (physically located at different IBM Research locations worldwide) sat down inside the virtual world and discussed their work via SL's Vivox spatial VoIP. The event was broadcast live to researchers gathered at Watson, Hawthorne, Cambridge, Austin, Zurich, Haifa, Almaden, Beijing, Shanghai, Tokyo, New Delhi, and Bangalore. Camera operators at each site ran an SL viewer connected to a projection screen, so people around the world watched the event occur in real time on the hybrid SL grid. At sites where a panelist resided, a volunteer ran a second SL viewer to assist in the operation of the panelist's avatar.
The diagram in Figure 3 shows a high-level view of the connectivity and hardware requirements for this event. A special camera-control script allowed one person to control the point-of-view of all "Camera" computers simultaneously, across the globe.
Figure 3. Overview of the mixed-reality 2009 Research kick-off event
This mixed-reality event allowed participants to experience immersion similar to a video conferencing event, yet was less expensive than outfitting 12 participating sites with new audio/video equipment.
Use of a single "master cameraman" to control cameras at all sites helped to maintain a consistent viewpoint at all sites.
Since this was a mixed-reality event, logistics were similar to a live real-world event. Use of checklists was essential to keeping everyone in sync.
To address the enterprise needs described in previous sections, IBM partnered with Linden Lab to experiment with a stand-alone version of SL. "Nebraska" is the code name for Linden Lab's "SL in a box" appliance.
The Nebraska unit consists of three servers:
- A SL server that runs a small SL grid, unconnected to the public SL grid.
- A streaming media server that stores audio, video, and image data.
- A Vivox voice server that provides spatial VoIP.
Linden Lab copied over the majority of the AoT "hybrid" SL grid content, so the environment started off with a large amount of proven content such as theaters and other facilities. Some added functionality, such as integration with the enterprise LDAP directory for authentication, made this environment more business-ready.
- New user "on-boarding" is critical. There is a direct correlation between time spent on-boarding and the value of the meeting - the more time users spend in pre-event training, the fewer technical issues arise during the actual event. The objectives of the event are more likely to be met as a result.
- Demonstrated cost and time (required for travel and event setup) savings over real-world events.
- Many users report satisfaction with experiencing immersion during the virtual worlds events, yet much more work needs to be done to fully utilize the potential of the 3D environment.
- Event scheduling, user registration and initial avatar set-up must be automated. Much of this has been implemented.
- While a number of people have requested "photo-realistic" avatars, in practice we have found that people prefer a less realistic, more idealized representation of themselves.
- The inclusion of a streaming media server facilitates the use of video in presentations.
- A relatively high-powered computer is required to make SL (and other virtual space environments) function well. The integrated video chipsets in some machines have been known to cause frequent crashes and other "glitchy" behavior.
Advantages and disadvantages of the Nebraska solution are summarized in Table 1.
Table 1. Comparison of SL solutions
|Solution|Advantages|Disadvantages|
|---|---|---|
|Main SL grid|Inexpensive; easily accessible; rich user-created content|Outside the firewall, so no confidential material; griefing; subject to Linden Lab ToS|
|Hybrid SL behind the firewall|Secure discussion of confidential material; IBM controls content and users|On-boarding is time consuming; VoIP glitches; voice overlap between adjacent spaces|
|Hybrid SL mixed reality|Immersion comparable to video conferencing at lower cost|Logistics similar to a live real-world event|
|Nebraska "SL in a box"|Stand-alone grid; enterprise LDAP authentication; proven content preloaded|Requires a relatively high-powered client computer|
As part of a global workforce, some distributed teams know each other only by their voice and perhaps a small picture, and communicate mostly via conference calls, email and instant messaging. These tools are effective, but they lack richness and interpersonal connection. In this environment, mentoring presents challenges beyond typical group gatherings, since discussions are typically private and may involve exchange of confidential information. To accomplish this remotely requires facilities with private voice and chat channels.
The virtual spaces team set out to implement a remote mentoring facility, which would allow more immersion than a phone call, yet support private exchange of information.
Since Metaverse, the Torque Game Engine-based virtual space, did not include voice integration, the solution focused on private text chat. By default, as in most virtual spaces, Metaverse text chat is broadcast either globally or by proximity, depending on the need. This model does not support private conversations. To overcome this challenge, we implemented chat "channels" (a channel is simply a separate line of communication).
Areas for private conversations were defined by triggers, where a trigger is a 3D space that knows when an avatar enters, leaves, or remains in the space. We then associated a chat channel with each trigger area.
On the server side, a chat channel is really just a SimSet, a simple collection of objects, in this case client IDs. When a user enters a trigger area with an associated chat channel, their client ID is added to the SimSet; when they leave, it is removed. As implemented in Metaverse, channels are mutually exclusive. Note that "public" chat is not a channel at all, though it could be implemented as such; as a result, a person in a private chat may also receive public chats.
Finally, to prevent just anyone from walking in on a private conversation, we can also assign an access control list (ACL) to the trigger area.
Figure 4 shows the flow logic of how a text chat from a user makes it to the recipients.
Figure 4. Metaverse text chat logic flow
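The trigger, SimSet, and ACL mechanics described above can be sketched as follows. This is a minimal Python model of the logic, not the actual TorqueScript implementation; the class and method names are our own:

```python
class ChatChannel:
    """A private chat channel: a set of client IDs plus an optional ACL."""

    def __init__(self, acl=None):
        self.members = set()  # the "SimSet" of client IDs currently in the trigger area
        self.acl = acl        # allowed client IDs; None means open to anyone

    def on_enter(self, client_id):
        """Called when an avatar enters the trigger area; the ACL gates admission."""
        if self.acl is None or client_id in self.acl:
            self.members.add(client_id)

    def on_leave(self, client_id):
        """Called when an avatar leaves the trigger area."""
        self.members.discard(client_id)

    def deliver(self, sender_id, text):
        """Route a chat line only to clients currently in the channel."""
        return [(cid, f"{sender_id}: {text}") for cid in self.members]

channel = ChatChannel(acl={"alice", "bob"})
channel.on_enter("alice")
channel.on_enter("eve")   # not on the ACL, so silently excluded
print(channel.deliver("alice", "hello"))
```

Public chat, by contrast, would bypass the channel entirely and be broadcast globally or by proximity, which is why a user in a private channel can still receive public chat.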
One of the big benefits of OpenSimulator (OpenSim) is the availability of voice communication. OpenSim doesn't have the ability to define trigger areas, but you can make media parcels that allow you to define private voice and text areas. Unfortunately, these parcels are defined by 2D coordinates, so it's really a column that extends infinitely. This means, unlike with Torque, you can't create a room that has private voice and have it restricted to that room. The parcel extends beyond the room vertically, through the roof and below the floor. Thus people standing outside the room would not hear the conversation, but people hovering over the room could.
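The difference between a Torque-style 3D trigger volume and an OpenSim-style 2D parcel can be shown with a simple containment check. This is an illustrative sketch assuming axis-aligned bounds, not code from either engine:

```python
def in_parcel_2d(pos, x_range, y_range):
    """OpenSim-style parcel check: only the horizontal coordinates matter."""
    x, y, _z = pos
    return x_range[0] <= x <= x_range[1] and y_range[0] <= y <= y_range[1]

def in_trigger_3d(pos, x_range, y_range, z_range):
    """Torque-style trigger check: the volume is bounded in all three axes."""
    x, y, z = pos
    return (in_parcel_2d(pos, x_range, y_range)
            and z_range[0] <= z <= z_range[1])

# An avatar hovering 50m above a 10m-tall meeting room:
hovering = (5.0, 5.0, 50.0)
print(in_parcel_2d(hovering, (0, 10), (0, 10)))            # still "inside" the parcel
print(in_trigger_3d(hovering, (0, 10), (0, 10), (0, 10)))  # outside the room volume
```

The hovering avatar fails the 3D check but passes the 2D one, which is exactly why a voice-enabled parcel leaks conversations to anyone floating above the room.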
Initial use of virtual spaces for mentoring was well received. People liked the idea of private areas where they could chat and talk. By far, the most common request was for secure private areas. This is leading us to implement access control lists (ACLs) for private areas and regions in OpenSim. With ACLs in place and the ability for a meeting owner to specify individuals and/or groups, we take the next step in providing secure areas where people can discuss sensitive information in virtual spaces.
Identity flexibility is not only an interesting question, but also a key feature of virtual worlds. As players participate in multiple worlds or take on multiple identities in one world, they also assume multiple social roles. Role playing and identity flexibility are hardly unique to virtual worlds, yet the visual representation of avatars introduced a new dimension and, thus, new questions about identity flexibility, personal privacy, reputation, and trust. Since virtual worlds are mostly places for entertainment and make-believe, identity flexibility, a salient feature enjoyed and favored by many, is simply a way of life (and play) there.
In a place like SL, this is nearly ubiquitous. You know people simply by their avatar identity, but you don't know if they're your neighbor or someone halfway across the world.
When virtual worlds are exploited by an enterprise, especially behind a firewall (what we call IBM Virtual Spaces), the questions of identity and trust are worth rethinking. One of the things we quickly found out when developing virtual spaces was that people want to know who they are interacting with. A survey of 531 Metaverse users demonstrated that IBMers want their "work" avatars not only to carry their real names, but also to look like them. Recognition of an avatar was a top desired feature (87%), equal in importance only to customization (87%). Even more, users would like to know what the person behind an avatar does inside the company and how to contact them by other means. To meet this need, we developed a simple solution known as the Bluecard.
For most of the use case scenarios in virtual spaces, avatars that can be easily identified as our real-life selves are preferred. There are, though, some cases where anonymous avatars make sense for business use, e.g., training sessions where both managers and non-managers are involved, or anonymous surveys and feedback sessions. In the long term, virtual spaces will be able to accommodate all of these scenarios.
The IBM corporate directory, called BluePages, enables employees to easily manage, find, and connect with colleagues. All of the rich metadata associated with an employee, from basic contact information to skills, current projects, and patent activity, can be accessed through a series of REST-style services, facilitating reuse and remixability. It is the most highly trafficked application in IBM, serving millions of requests per day by employees looking for contact information and expertise. The Bluecard widget takes advantage of this popularity by automatically providing contextual employee information. To date, the Bluecard was implemented in two virtual spaces – Metaverse and OpenSim. In Metaverse, users have several ways of seeing an avatar's Bluecard:
- Look at the avatar (place the crosshairs on them)
- Select the user from the list of users online and click the "Bluecard" button
- Via a chat command (e.g. /whois email@example.com)
One of an employee's unique identifiers inside IBM is their email address. In Metaverse, this address is stored with the avatar's information during the login process. Using the email address, you can gather any public information about an employee.
One interesting thing to note is that this is a completely client-side solution. The avatar information is cached on each client machine. The only time the Metaverse server is involved is when using method #1 above, to determine which avatar the user is looking at.
The Torque engine provides only basic HTTP support, so you cannot use web services that are more complicated than a simple GET/response. To avoid writing more sophisticated HTTP support into the engine, we decided to use an SOA approach. A Nova REST service was written to accept a simple GET request and return the data in a JSON response (Nova is a J2EE framework that allows developers to define and host new REST-style Web services). The web service does all the heavy lifting of aggregating data from the corporate BluePages server. The benefits of using an SOA approach and a client-side solution:
- Less load on the Metaverse server as it doesn't have to manage the request.
- Less bandwidth consumed between the client and server.
- Less complexity: The client merely does a simple REST request and parses the JSON data.
- Caching: The client can cache pieces of data, such as the photo of the person.
- Reuse: Other applications can use the service.
Figure 5. Bluecard architecture
The architecture and components depicted in Figure 5 show the basic process:
- The Metaverse viewer sends a REST request to the appropriate Nova service.
- The Nova service queries the corporate BluePages server for the requested information and aggregates it.
- The Nova service sends back the requested data as a JSON response.
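The client side of this flow can be sketched as follows. This is an illustrative Python model, not the actual viewer or Nova code; the field names and the stand-in fetch function are ours, and in the real system the fetch would be an HTTP GET against the Nova service URL:

```python
import json

def make_bluecard_lookup(fetch):
    """Build a cached Bluecard lookup; `fetch(email)` returns the service's raw JSON."""
    cache = {}  # client-side cache keyed by email address

    def lookup(email):
        if email not in cache:
            # One simple GET plus JSON parsing is all the client has to do.
            cache[email] = json.loads(fetch(email))
        return cache[email]

    return lookup

# Stand-in for the real HTTP GET against the Nova service (hypothetical fields):
def fake_fetch(email):
    return json.dumps({"email": email, "name": "A. Person", "job": "Engineer"})

lookup = make_bluecard_lookup(fake_fetch)
card = lookup("person@example.com")
print(card["name"])  # parsed from the JSON response; a repeat call hits the cache
```

Keeping the aggregation in the service and only parsing on the client is what makes the thin Torque HTTP support sufficient.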
Figure 6. Screenshot showing Bluecard in Metaverse
Figure 6 shows a screenshot of the Bluecard in Metaverse. The Bluecard appeared automatically when the user looked at the avatar using the cross hairs.
The Bluecard feature has been very popular with users. In a business environment, people want to know who they're talking to and be able to contact them by other means if necessary.
Not everything in a virtual space needs to be 3D – this solution successfully incorporated a familiar 2D interface into a virtual space.
Finally, the SOA approach allowed us to easily get around some of the more troublesome limitations of the Torque engine, while simplifying the code. It also had the additional benefit of allowing work to be reused and consumed by other applications and tools.
Our experiments in virtual spaces have already demonstrated tangible cost savings, as well as softer benefits in different areas of our business. Near-term projects include integration of virtual spaces with collaboration tools, video, and mobile technology to minimize the need for travel.
Yet there are more reasons to use virtual spaces for business enablement. As we build a globally integrated enterprise, our communications are constrained by the flat, print-based model of today's Web. We live in a 3D world and think visually. We need to bridge the gap between the flat Internet and multi-dimensional reality by exploiting technologies that allow the next generation of immersion and integration between virtual and real worlds.
Stay tuned for Part 3 of this series, which will describe more of the virtual spaces we built. These spaces include white-board brainstorming, multilingual collaboration with instantaneous translation, integration of social networking tools, and systems monitoring. We will continue exploring the benefits of an SOA-based approach to virtual spaces.
We thank Neil Katz, Suzy Deffeyes, Thomas Cook and Rob Smart for their contribution to development of IBM virtual spaces in Second Life and OpenSim. Metaverse would not exist without the hard work of Mike Ackerbauer, Richard Newhook, Robi Bruner, Charisse Lu, Jeffrey Abbott and Joshua Scribner. We also thank many past and present VUC colleagues (too numerous to individually name), and Linden Lab representatives who either contributed to the development of virtual spaces in IBM or shared their thoughts in forums, blogs, or directly with authors.
Rick Alther is an advisory software engineer with the CIO Innovate Quick team. He was the technical lead for Metaverse, an internal virtual world platform based on the Torque Game Engine. Rick is currently involved in developing OpenSim for use inside IBM and deploying OpenSim on a cloud environment. In addition to virtual world technologies, Rick is also involved in the adoption and development of open technologies inside IBM, including OpenDocument, Firefox and Linux. Prior to working on virtual world technologies, Rick was involved in grid computing and received an IBM Outstanding Technical Achievement Award for his work as lead developer on the World Community Grid project. Rick holds a BA in Computer Science from Western Connecticut State University and is a member of the Association for Computing Machinery.
Luba Cherbakov, an IBM Distinguished Engineer and member of the IBM Academy of Technology, is a key technical leader in the IBM CIO Office. As a director of the Future of the Intranet team, Luba drives high profile initiatives, such as the Genographic project, the Nature Conservancy water modeling, 3-D virtual spaces, and social computing tools. She focuses her team on integration of emerging technologies and techniques with innovative enterprise solutions. Ms. Cherbakov is an author and contributor to the IBM Service-Oriented Modeling and Architecture (SOMA) method, reference architectures, the Architectural Description Standard, and grid computing assets. She is a two-time recipient of the IBM Outstanding Technical Achievement Award, and a recipient of the IBM Corporate Award (the highest technical award for unique technical contributions of superior business value). Ms. Cherbakov is a member of the Society of Women Engineers and the Association for Computing Machinery, and holds a Master's degree in computer science from George Washington University.
Craig Becker (aka Jessica Qin in Second Life) is a member of the original team that spearheaded IBM's movement into the 3D Internet space, and he led the design and construction of IBM's public virtual world presence in Second Life. Working with Linden Lab, he successfully set up the world's first "corporate Second Life grid" behind IBM's firewall, and led the development team to design a virtual conference facility within that secure environment. Craig is an "early adopter" with wide-ranging interests and experience in many areas, including Web application development, UI design, computer graphics, OO design, hardware design, AI, and music synthesis. He is experienced at working with customers and has been asked to speak at a number of prestigious conferences on the topic of Virtual Worlds. Craig holds a BS and an MS in Computer Science from the University of Illinois at Urbana-Champaign, and taught himself to program in 1973. He is an IBM Master Inventor and an artist, and his hobbies include science fiction and motorcycling. He lives in Austin, Texas, where he is happily married and the father of two wonderful children.