Virtual Reality in the Classroom

What Is Virtual Reality?

Virtual reality has the potential to be a powerful new tool in the classroom. The purpose of this article is to consider the concept of virtual reality; list present and developing applications; look at reasons to use virtual reality in the classroom; identify some possible classroom applications; and, finally, examine briefly one already available virtual reality software package.

Virtual reality has been defined as a highly interactive, computer-based, multimedia environment in which the user becomes a participant with the computer in a “virtually real” world. Joseph Henderson equates such environments with high-fidelity simulations produced using interactive media or specialized systems such as aircraft simulators (Henderson, 1991). In a virtual environment, the user no longer looks at a computer screen but becomes part of the action on the screen, gaining the sensation of participation.

Helsel contrasts this with “artificial reality,” using Michael Spring’s definition: “an interactive environment that encompasses unencumbered, full-body multisensory participation in computer events” (Helsel, 1991, 1992). Myron Krueger coined the term in 1974 to describe particular environments such as his own VIDEO PLACE, an artificial reality exhibit. Mechanical attachments from users to computers are not necessary to participate in the event.

“Cyberspace,” Helsel interprets Spring (1991) as saying, is “. . . a place where the human nervous system and mechanical-electronic communications and computation systems are linked” (Helsel, 1992). Some type of brain-computer attachment is probably required. William Gibson coined the term “cyberspace” in his 1984 novel, Neuromancer.

For this article, the term “virtual reality” covers the whole field, including artificial reality, cyberspace, and a third type, telepresence, which, according to Hilary McLellan (1992), is the feeling of being present at a location remote from where one actually is, with the ability to manipulate objects at that remote location.

Full-blown virtual reality programs ordinarily use a body-tracking interface with the computer. A full interface includes head-mounted displays with dual television monitors to provide a three-dimensional visual effect, data gloves to allow both tactile sensing and movement of objects within the scene, perhaps a bodysuit, and other body-movement tracking devices. Other, less elaborate virtual reality programs depend only on a computer monitor, keyboard, and mouse.

The National Aeronautics and Space Administration (NASA) Ames Research Center in Mountain View, California; the Human Interface Technology Laboratory at the University of Washington in Seattle; the Media Lab at the Massachusetts Institute of Technology (MIT); the University of North Carolina at Chapel Hill; Autodesk, Inc., of Sausalito, California; VPL Research, Inc., Foster City, California; IBM’s Watson Labs in Hawthorne, New York; Virtus Corporation in Cary, North Carolina; and other research facilities and companies around the world are actively involved in developing virtual reality environments, computer programs, body sensors, and tracking systems of various types. For example, Jaron Lanier and VPL Research, Inc., have developed the EyePhone (a head-mounted display), DataSuit (which tracks other gestures), and DataGlove (a hand-tracking device). Mattel’s PowerGlove, a mass-market derivation, is used with Nintendo games.

Autodesk, Inc., known for its computer-aided design (CAD) software, has created a graphic representation of a typical office that a user can view from various angles in a three-dimensional model, using a head-mounted display consisting of a small liquid-crystal display (LCD) screen for each eye and a position tracking sensor. At the Human Interface Technology Laboratory, 19 companies have formed the Virtual Worlds Consortium to apply virtual reality to business (Hamilton, Smith, McWilliams, Schwartz, & Cary, 1992).

History of Virtual Reality

Virtual reality is a merging of concepts that come out of several sources, stretching over a broad period of time. Efforts to produce life-like environments go back for many years. For example, the Link Trainers, first developed in 1927-1929, attempted to duplicate the reality of an airplane cockpit. Control of an environment via a body movement sensor is often an essential element of a virtual reality program. In 1958, Philco Corporation developed a head-mounted visual system controlled by head movement (Fisher, 1990). Computer programmers have worked for years to depict realistic environments on computer screens. In the early 1960s, Ivan Sutherland and others created a head-mounted display whereby the user could look around a graphic room by turning the head. By 1969, Myron Krueger had created a number of interactive environments that allowed participation in a computer event with the full body. One of his best-known environments is VIDEO PLACE, a graphic world in which people interact with each other and with graphic characters. VIDEO PLACE is now at the Connecticut Museum of Natural History in Storrs, Connecticut.

In the early 1970s, Fred Brooks at the University of North Carolina created a system in which the user handled graphic objects with a mechanical manipulator. Toward the end of the Seventies, the Media Lab at the Massachusetts Institute of Technology developed the Aspen Movie Map, a video simulation of a drive through the ski resort of Aspen, Colorado, whereby the participant could drive at will down any street and enter and explore buildings along the route. By 1984, Michael McGreevy and colleagues at NASA had developed data goggles that allowed the user to look around a graphic world portrayed on a computer screen. Another development in the 1980s was the television series Star Trek: The Next Generation. In this projected future, the starship Enterprise includes a “Holodeck,” used primarily for crew entertainment. This computer-generated environment is so real that “under normal conditions, a participant in a Holodeck simulation should not be able to detect differences between a real object and a simulated one” (Sternbach & Okuda, 1991, p. 158). The Holodeck uses holographic figures and allows the player to participate actively in completely simulated environments.

Also in the early 1980s, William Gibson began publishing science fiction novels, such as Neuromancer, in which the main characters have reality-like experiences in computer-generated worlds called Cyberspace. In Gibson’s words, Cyberspace is “a consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts . . .”

Interest in virtual reality has quickened in the Nineties. It is moving from the research facility to the world of practical applications. Now is the time to consider uses in the classroom.

Classroom Uses of Virtual Reality

Classroom uses of virtual reality seem to be almost infinite. In their seminal article, “The Implications of Education in Cyberspace,” Rory Stuart and John C. Thomas (1991) list seven roles for cyberspace in education, which apply to the use of virtual reality in the classroom:

• Explore existing places and things that students would not otherwise have access to.

• Explore real things that, without alterations of scale in size and time, could not otherwise be effectively examined.

• Create places and things with altered qualities.

• Interact with people in remote locations, through global common-interest clubs or through collaborations on projects between students from different parts of the world.

• Interact with real people in non-realistic ways.

• Create and manipulate abstract conceptual representations, like data structures and mathematical functions.

• Interact with virtual beings, such as representations of historical figures and agents who are representatives of different philosophies and viewpoints participating in simulated negotiations.

Stuart and Thomas (1991) contend that there are at least two types of representations in cyberspace. One uses naturalistic scenes to display objects, attributes, and relationships. The other “uses abstract scenes in which objects, attributes, and relationships are not as they appear in the real world, but are designed to highlight conceptual relationships.”
