Telepresence

From Wikipedia, the free encyclopedia
A modular telepresence video conferencing system

Telepresence is the appearance or sensation of a person being present at a place other than their true location, via telerobotics or video.

Telepresence requires that the users' senses interact with specific stimuli in order to provide the feeling of being in that other location. Additionally, users may be given the ability to affect the remote location. In this case, the user's position, movements, actions, voice, etc. may be sensed, transmitted, and duplicated in the remote location to bring about this effect. Information may therefore travel in both directions between the user and the remote location.

A popular application is found in telepresence videoconferencing, the highest possible level of videotelephony. Telepresence via video deploys greater technical sophistication and improved fidelity of both sight and sound than in traditional videoconferencing. Technical advancements in mobile collaboration have also extended the capabilities of videoconferencing beyond the boardroom for use with hand-held mobile devices, enabling collaboration independent of location.

A similar or identical concept is telexistence, which was first proposed by Susumu Tachi in Japan in patents filed in 1980[1] and 1981.[2] The first report was published in Japanese in 1982[3] and in English in 1984.[4]

History


In a pioneering paper, the U.S. cognitive scientist Marvin Minsky attributed the development of the idea of telepresence to science fiction author Robert A. Heinlein: "My first vision of a remote-controlled economy came from Robert A. Heinlein's prophetic 1948 novel, Waldo," wrote Minsky.[5] In his science fiction short story "Waldo" (1942), Heinlein first proposed a primitive telepresence master-slave manipulator system.

The Brother Assassin, written by Fred Saberhagen in 1969, introduced the complete concept for a telepresence master-slave humanoid system. In the novel, the concept is described as follows: "And a moment later it seemed to all his senses that he had been transported from the master down into the body of the slave-unit standing beneath it on the floor. As the control of its movements passed over to him, the slave started gradually to lean to one side, and he moved its foot to maintain balance as naturally as he moved his own. Tilting back his head, he could look up through the slave's eyes to see the master-unit, with himself inside, maintaining the same attitude on its complex suspension."

Early system for immersive telepresence (USAF, 1992 - Virtual Fixtures)

The term telepresence, a neologism due to the futurist Patrick Gunkel, was introduced to the public in a 1980 article by Minsky, who outlined his vision for an adapted version of the older concept of teleoperation that focused on giving a remote participant a feeling of actually being present at a different location.[5] One of the first systems to create a fully immersive illusion of presence in a remote location was the Virtual Fixtures platform developed in 1992 at the U.S. Air Force's Armstrong Labs by inventor Louis Rosenberg. The system included stereoscopic image display from the remote environment as well as immersive touch feedback using a full upper-body exoskeleton.[6][7][8]

The first commercially successful telepresence company, Teleport (which was later renamed TeleSuite), was founded in 1993 by David Allen and Herold Williams.[9] Before TeleSuite, they ran a resort business from which the original concept emerged because they often found businesspeople would have to cut their stays short to participate in important meetings. Their idea was to develop a technology that would allow businesspeople to attend their meetings without leaving the resorts so that they could lengthen their hotel stays.

A Tandberg E20 high resolution videoconferencing phone meant to replace conventional desktop phones

Hilton Hotels originally licensed the systems for installation in its hotels throughout the United States and other countries, but use was low. The idea lost momentum, and Hilton eventually backed out. TeleSuite then shifted its focus from the hospitality industry toward business-oriented telepresence systems. Shareholders eventually held enough stock to replace the company's original leadership, which ultimately led to its collapse.[citation needed] David Allen purchased all of the assets of TeleSuite and appointed Scott Allen as president[10] of the new company, called Destiny Conferencing.

Destiny Conferencing licensed its patent portfolio to HP, which became the first large company to join the telepresence industry, soon followed by others such as Cisco and Polycom (now called Poly).[11] After forming a distribution agreement with Pleasanton-based Polycom, Destiny Conferencing was sold to Polycom on January 5, 2007, for $60 million.

First Telepresence Systems in 1993 (TeleSuite): Englewood, Ohio to New York City

A telepresence research project started in 1990. Located at the University of Toronto, the Ontario Telepresence Project (OTP) was an interdisciplinary effort involving the social sciences and engineering. Its final report stated that it "...was a three year, $4.8 million pre-competitive research project whose mandate was to design and field trial advanced media space systems in a variety of workplaces in order to gain insights into key sociological and engineering issues. The OTP, which has ended in December 1994, was part of the International Telepresence Project which linked Ontario researchers to their counterparts in four European nations. The Project's major sponsor was the Government of Ontario, through two of its Centres of Excellence—the Information Technology Research Centre (ITRC) and the Telecommunications Research Institute of Ontario (TRIO)."[12]

Benefits


An industry expert described some benefits of telepresence: "There were four drivers for our decision to do more business over video and telepresence. We wanted to reduce our travel spend, reduce our carbon footprint and environmental impact, improve our employees' work/life balance, and improve employee productivity."[13]

American exile Edward Snowden participates in a TED talk in Texas from Russia via telepresence robot, March 2014

Rather than traveling great distances in order to have a face-to-face meeting, it is now commonplace to use a telepresence system instead, which uses a multiple-codec video system (the configuration the word "telepresence" most commonly denotes today). Each member or party of the meeting uses a telepresence room to "dial in" and can see and talk to every other member on one or more screens as if they were in the same room. This brings enormous time and cost benefits. It is also superior to phone conferencing (except in cost), as the visual aspect greatly enhances communication, allowing for the perception of facial expressions and other body language.

Mobile collaboration systems combine the use of video, audio, and on-screen drawing capabilities using the newest generation of hand-held mobile devices to enable multi-party conferencing in real time, independent of location. Benefits include cost efficiencies resulting from accelerated problem resolution, reductions in downtime and travel, improvements in customer service, and increased productivity.[14]

Implementation


Telepresence has been described as the human experience of being fully present at a live real-world location remote from one's own physical location. Someone experiencing video telepresence would therefore be able to behave, and receive stimuli, as if part of a meeting at the remote site. This would permit interactive participation in group activities, bringing benefits to a wide range of users.[15]

Implementation of human sensory elements


To provide a telepresence experience, technologies are required that implement the human sensory elements of vision, sound, and manipulation.

Vision and sound


A minimum system usually includes visual feedback. Ideally, the entire field of view of the user is filled with a view of the remote location, and the viewpoint corresponds to the movement and orientation of the user's head. In this way, it differs from television or cinema, where the viewpoint is out of the control of the viewer.

To achieve this, the user may be provided with either a very large (or wraparound) screen, or small displays mounted directly in front of the eyes. The latter provides a particularly convincing 3D sensation. The movements of the user's head must be sensed, and the camera must mimic those movements accurately and in real time; noticeable lag between head motion and image motion can induce motion sickness.
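The head-tracking loop just described can be sketched in a few lines. This is a minimal illustration under stated assumptions, not any particular product's implementation: the class name, angle units, and slew-rate limit are all invented for clarity.

```python
class HeadTrackedCamera:
    """Illustrative sketch of head-tracked remote-camera control.

    A real system would read a head-mounted tracker and drive pan/tilt
    servos at the remote site; names and limits here are hypothetical.
    """

    def __init__(self, max_rate_deg_s=180.0):
        self.pan = 0.0                   # current camera pan, degrees
        self.tilt = 0.0                  # current camera tilt, degrees
        self.max_rate = max_rate_deg_s   # servo slew-rate limit, deg/s

    def update(self, head_yaw, head_pitch, dt):
        """Step the camera toward the user's head pose.

        The camera follows as fast as the servos allow; minimizing the
        lag between head motion and image motion is what keeps the
        illusion convincing.
        """
        limit = self.max_rate * dt       # largest move possible this tick
        self.pan += max(-limit, min(limit, head_yaw - self.pan))
        self.tilt += max(-limit, min(limit, head_pitch - self.tilt))
        return self.pan, self.tilt


cam = HeadTrackedCamera()
# User snaps their head 30 degrees to the right; in a 100 ms tick the
# rate-limited camera covers only 18 of those degrees.
pan, tilt = cam.update(head_yaw=30.0, head_pitch=0.0, dt=0.1)
```

The slew-rate clamp stands in for the physical servo: the view cannot jump instantaneously, so the controller closes the remaining error over subsequent ticks.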

Another source of future improvement to telepresence displays, compared by some to holograms, is a projected display technology featuring life-sized imagery.[16]

Sound is generally the easiest sensation to implement with high fidelity, based on the foundational telephone technology dating back more than 130 years. Very high-fidelity sound equipment has also been available for a considerable period of time, with stereophonic sound being more convincing than monaural sound.

Manipulation

Monty, a telemanipulation prototype from Anybots

The ability to manipulate a remote object or environment is an important aspect for some telepresence users and can be implemented in a large number of ways depending on the needs of the user. Typically, the movements of the user's hands (position in space and posture of the fingers) are sensed by wired gloves, inertial sensors, or absolute spatial position sensors. A robot in the remote location then copies those movements as closely as possible. This ability is also known as teleoperation.
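As a sketch of the sensing-and-copying pipeline just described, the fragment below maps a tracked hand position into a robot's reachable workspace. It is a hedged illustration: the function name, coordinate frame, and workspace bounds are assumptions invented for the example, not taken from any real teleoperation system.

```python
def map_hand_to_robot(hand_pos,
                      hand_scale=1.0,
                      workspace=((-0.5, 0.5), (-0.5, 0.5), (0.0, 1.0))):
    """Map a sensed hand position (x, y, z in metres, user frame) onto
    a remote robot arm: scale the motion, then clamp each axis to the
    robot's reachable workspace so the slave never exceeds its limits.
    """
    target = []
    for coord, (lo, hi) in zip(hand_pos, workspace):
        scaled = coord * hand_scale              # optional motion scaling
        target.append(max(lo, min(hi, scaled)))  # respect workspace limits
    return tuple(target)


# A hand position outside the robot's reach is clamped to the boundary.
target = map_hand_to_robot((0.75, 0.0, 0.5))
```

In practice the same idea extends to finger posture and orientation; as the text notes, the more closely the robot's effector matches human kinematics, the stronger the operator's sense of telepresence.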

The more closely the robot re-creates the form factor of the human hand, the greater the sense of telepresence. The complexity of robotic effectors varies greatly, from simple one axis grippers, to fully anthropomorphic robot hands.

Haptic teleoperation refers to a system that provides some sort of tactile force feedback to the user, so the user feels some approximation of the weight, firmness, size, and/or texture of the remote objects manipulated by the robot. An emerging form of the technology, called collaborative telepresence, aims to let distant participants work together as if they were in the same room with one another, at a normal social distance. Collaborative telepresence uses haptic sensors of this kind to convey a sense of touch.
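The simplest model behind such force feedback is a virtual spring: the force rendered to the user grows with how far the remote tool presses into a surface, capped at what the haptic device can output. The constants and the function name below are illustrative assumptions, not values from any particular device.

```python
def haptic_feedback_force(penetration_m,
                          stiffness_n_per_m=800.0,
                          max_force_n=4.0):
    """Virtual-spring model of haptic force feedback.

    penetration_m: how far the remote effector has pressed into the
    surface (<= 0 means no contact, hence no force).
    Returns the force, in newtons, to render on the user's hand,
    saturated at the device's maximum output.
    """
    if penetration_m <= 0.0:
        return 0.0                      # not touching: no feedback
    return min(max_force_n, stiffness_n_per_m * penetration_m)


force = haptic_feedback_force(0.002)    # tool pressed 2 mm into a surface
```

Haptic rendering loops of this kind are commonly run at around 1 kHz, since slower updates make stiff surfaces feel spongy to the hand.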

Freedom of movement

iRobot Ava 500, an autonomous roaming telepresence robot

The prevalence of high-quality video conferencing using mobile devices, tablets, and portable computers has enabled considerable growth in telepresence robots, which help give a better sense of remote physical presence for communication and collaboration in the office, home, or school when one cannot be there in person. The robot avatar can move or look around at the command of the remote person. Drivable telepresence robots typically contain a display (integrated, or a separate phone or tablet) mounted on a roaming base. Some examples of roaming telepresence robots include Beam by Suitable Technologies, Double by Double Robotics, Ava Telepresence by Ava Robotics, Anybots, Vgo, TeleMe by Mantarobot, and Romo by Romotive.[17]

More modern roaming telepresence robots may be able to operate autonomously, mapping out the space and avoiding obstacles while driving themselves between rooms and their docking stations.[18]

Effectiveness


Telepresence's effectiveness varies by degree of fidelity. Research has noted that telepresence solutions differ in degree of implementation, from "immersive" through "adaptive" to "lite" solutions.[19] At the top are immersive solutions where the environments at both ends are highly controlled (and often the same) with respect to lighting, acoustics, decor and furniture, thereby giving all the participants the impression they are together at the same table in the same room, thus engendering the "immersive" label.

Adaptive telepresence solutions may use the same technology, but the environments at both ends are not highly controlled and hence often differ. Adaptive solutions differ from telepresence lite solutions not in terms of control of environments, but in terms of integration of technology. Adaptive solutions use a managed service, whereas telepresence lite solutions use components that someone must integrate.

Transparency of implementation

A telepresence conference between participants in Ghana and Newark, New Jersey in 2012

A good telepresence strategy puts the human factors first, focusing on visual collaboration configurations that closely replicate the brain's innate preferences for interpersonal communication, a departure from the unnatural "talking heads" experience of traditional videoconferencing. These cues include life-size participants, fluid motion, accurate flesh tones, and the appearance of true eye contact.[20] This is already a well-established technology, used by many businesses today. John Chambers, chief executive officer of Cisco Systems, compared telepresence to the teleportation of Star Trek at the June 2006 Networkers Conference, and said that he saw the technology as a potential billion-dollar market for Cisco.[21]

Rarely will a telepresence system provide such a transparent implementation with such comprehensive and convincing stimuli that the user perceives no differences from actual presence. But the user may set aside such differences, depending on the application.

The fairly simple telephone achieves a limited form of telepresence using just the human sensory element of hearing, in that users consider themselves to be talking to each other rather than talking to the telephone itself.

Watching television, for example, although it stimulates our primary senses of vision and hearing, rarely gives the impression that the watcher is no longer at home. However, television sometimes engages the senses sufficiently to trigger emotional responses from viewers somewhat like those experienced by people who directly witness or experience events. Televised depictions of sports events, for example, can elicit strong emotions from viewers.

As the screen size increases, so does the sense of immersion, as well as the range of subjective mental experiences available to viewers. Some viewers have reported a sensation of genuine vertigo or motion sickness while watching IMAX movies of flying or outdoor sequences.

Because most currently feasible telepresence gear leaves something to be desired, the user must suspend disbelief to some degree and choose to act in a natural way appropriate to the remote location, perhaps using some skill to operate the equipment. In contrast, a telephone user does not see herself as "operating" the telephone, but merely as talking to another person with it.


Virtual presence (virtual reality)

An online video web conference in an office, December 2010

Telepresence refers to a user interacting with another live, real place, and is distinct from virtual presence, where the user is given the impression of being in a simulated environment. Telepresence and virtual presence rely on similar user-interface equipment, and they share the common feature that the relevant portions of the user's experience at some point in the process will be transmitted in an abstract (usually digital) representation. The main functional difference is the entity on the other end: a real environment in the case of telepresence, vs. a computer in the case of immersive virtual reality.

Presence is very similar to distal attribution or externalization: the projection of one's presence and perception beyond the limits of one's sensory organs. A distinction is made between two kinds of perception. The first is unmediated perception, in which we cannot feel anything beyond our physical surroundings. The second is presence mediated through technology, which makes us perceive two different environments at the same time: the one immediately around us and the one projected for us through technology. Mediated experiences are not limited to virtual technology; they can also occur with spatially distant real places, such as viewing space through a telescope or a remote scene through a camera.[22]

Applications


Examples of applications can be cited within emergency management and security services, business and industry, and the entertainment and education sectors.[15]

Connecting communities


Telepresence can be used to establish a sense of shared presence or shared space among geographically separated members of a group.[citation needed]

Hazardous environments

Robonaut 2 aboard the International Space Station

Many applications in which humans are exposed to hazardous conditions are readily recognised as suitable candidates for telepresence: mining, bomb disposal, military operations, rescue of victims from fire, toxic atmospheres, deep-sea exploration, and even hostage situations. Telepresence also plays a critical role in the exploration of other worlds, such as with the Mars Exploration Rovers, which are teleoperated from Earth.

Pipeline inspection


Small-diameter pipes that are otherwise inaccessible for examination can now be viewed using pipeline video inspection.

Remote surgery


The possibility of being able to project the knowledge and the physical skill of a surgeon over long distances has many attractions, and considerable research is underway in the subject. (Locally controlled robots are currently being used for joint replacement surgery, as they are more precise in milling bone to receive the joints.) The armed forces have an obvious interest, since the combination of telepresence, teleoperation, and telerobotics can potentially save the lives of battle casualties by allowing them prompt attention in mobile operating theatres by remote surgeons.

Recently, teleconferencing has been used in medicine (telemedicine or telematics), mainly employing audio-visual exchange, for the performance of real time remote surgical operations – as demonstrated in Regensburg, Germany in 2002.[23] In addition to audio-visual data, the transfer of haptic (tactile) information has also been demonstrated in telemedicine.[24]

Education

A professional development expert in Denver uses telepresence to coach a teacher in Utah during the initial research of Project ThereNow.

Research has been conducted on the use of telepresence to provide professional development to teachers. Research has shown that one of the most effective forms of teacher professional development is coaching, or cognitive apprenticeship. The application of telepresence shows promise for making this approach to teacher professional development practical.[25]

The benefits of enabling schoolchildren to take an active part in exploration have also been shown by the JASON and NASA Ames Research Center programs. The ability of a pupil, student, or researcher to explore an otherwise inaccessible location is a very attractive proposition: for example, locations where the passage of too many people would harm the immediate environment or the artifacts themselves, such as undersea coral reefs, ancient Egyptian tombs, and more recent works of art.

Another application is the remote classroom, which allows a professor to teach students on different campuses at the same time. An example of this application is in classrooms of the law schools of Rutgers University. Two identical rooms are located in two metropolitan areas. Each classroom is equipped with studio lighting, audio, and video conference equipment connected to a 200-inch monitor on the wall that students face, to give the impression that they are all in the same classroom. This allows professors to be on either campus and facilitates interaction among students on both campuses during classes.[26]

Telepresence art


True telepresence is a multidisciplinary art and science that foundationally integrates engineering, psychology, and the art of television broadcast.

In 1998, Diller and Scofidio created "Refresh", an Internet-based art installation that juxtaposed a live web camera with recorded videos staged by professional actors. Each image was accompanied by a fictional narrative, making it difficult to distinguish which image came from the live web camera.


In 1993, Eduardo Kac and Ed Bennett created the telepresence installation "Ornitorrinco on the Moon" for the international telecommunication arts festival "Blurred Boundaries" (Entgrenzte Grenzen II). It was coordinated by Kulturdata, in Graz, Austria, and was connected around the world.

Since 1997, Ghislaine Boddington of shinkansen and body>data>space has explored the extended use of telepresence in festivals, arts centres, and clubs, through a collaborative process she has called The Weave,[27] using performing arts techniques. She has directed numerous workshops that have led many artists worldwide to explore telepresence. This methodology has been used extensively to develop skills in tele-intuition among young people in preparation for the future world of work, through the body>data>space / NESTA project "Robots and Avatars", which explores how young people will work and play with new representational forms of themselves and others in virtual and physical life over the next 10–15 years.

An overview of telepresence in dance and theatre over the last 20 years is given in the research document "Excited Atoms"[28] by Judith Staines (2009), which can be downloaded from the On The Move website.

In popular culture

Telepresence is represented in media and entertainment.

Literature

  • The Naked Sun (1957) – a novel mostly occurring on Solaria, a planet with an extremely low population where all personal contact is considered obscene, and all communication occurs through holographic telepresence.
  • Neuromancer (1984) – a novel that not only uses telepresence extensively but introduces cyberspace and the Internet (aka the Matrix) as the underlying vehicle to make telepresence possible.

Comics

  • Lamar Waldron's M.I.C.R.A.: Mind Controlled Remote Automaton (1986-88) tells the story of a college student paralyzed by a neck injury who volunteers to be the remote pilot of an android body created by one of her professors.
  • The Surrogates, a 2005–2006 comic book series, inspired the film Surrogates (2009); it depicts a future in which humans make widespread use of mind-controlled androids.


References

  1. ^ S.Tachi, K.Tanie and K.Komoriya: Evaluation device of mobility aids for the blind, Japanese Patent 1462696 filed on December 26, 1980.
  2. ^ S.Tachi, K.Tanie and K.Komoriya: Operation method of manipulators with sensory information display functions, Japanese Patent 1458263 filed on January 14, 1981.
  3. ^ S.Tachi and M.Abe: Study on Telexistence (I) -Design of Visual Display-, Proceedings of the 21st SICE Annual Conference, pp.167-168, Tokyo, July 28–30, 1982.
  4. ^ S. Tachi, K. Tanie, K. Komoriya and M. Kaneko, Tele-existence (I): Design and Evaluation of a Visual Display with Sensation of Presence, RoManSy 84 The Fifth CISM-IFToMM Symposium. 1984: 206–215.
  5. ^ a b Marvin Minsky (June 1980). "Telepresence". Omni.
  6. ^ Kim, Won S.; Rosenberg, Louis B. (21 December 1993). Virtual fixtures as tools to enhance operator performance in telepresence environments. Proc. SPIE 2057, Telemanipulator Technology and Space Telerobotics. Proceedings. Vol. 2057. pp. 10–21. doi:10.1117/12.164901. ISSN 0277-786X.
  7. ^ L. B. Rosenberg. The Use of Virtual Fixtures As Perceptual Overlays to Enhance Operator Performance in Remote Environments. Technical Report AL-TR-0089, USAF Armstrong Laboratory, Wright-Patterson AFB OH, 1992.
  8. ^ Rosenberg, "Virtual Fixtures: Perceptual Overlays Enhance Operator Performance in Telepresence Tasks," Ph.D. Dissertation, Stanford University.
  9. ^ Human Productivity Lab Whitepaper
  10. ^ Options, Telepresence (28 April 2018). "Telepresence Options Website". www.telepresenceoptions.com.
  11. ^ Bushaus, Dawn (March 17, 2008). "Telepresence: Ready For Its Close-Up".
  12. ^ "Final Report" (PDF). toronto.edu.
  13. ^ Companies Worldwide Rely on Polycom Video and Telepresence to Cut Carbon Emissions for a Greener Planet Archived 2008-06-09 at the Wayback Machine, CNN June 5, 2008
  14. ^ Herrell, Elizabeth (March 9, 2010). "Enterprise Communications: The Next Decade". Forrester Research White Paper. Archived from the original on January 17, 2012. Retrieved May 23, 2011.
  15. ^ a b Video Conferencing: A guide to making a Telepresence Business Case Archived 2008-06-28 at the Wayback Machine Matt Walker, 2007
  16. ^ Barras, Colin (26 November 2009). "'Holographic' Videoconferencing Moves Nearer to Market". New Scientist. p. 23. Retrieved 14 December 2009.
  17. ^ Lehrbaum, Rick (January 11, 2013). "Attack of the Telepresence Robots!". InfoWeek. Retrieved December 8, 2013.
  18. ^ Honig, Zach (March 17, 2014). "iRobot's Ava 500 telepresence-on-a-stick is rolling out now (update: $69,500!!)". Engadget. Retrieved 4 July 2014.
  19. ^ Dataquest Insight: The Telepresence Market in India, 2008 Archived 2013-05-12 at the Wayback Machine, Gartner December 16, 2008
  20. ^ Telepresence Consulting – Human Productivity Lab Archived 2007-04-23 at the Wayback Machine
  21. ^ Sanders, Tom (21 June 2006). "Cisco Sets Sail For Teleconferencing". Archived from the original on 2007-09-30.
  22. ^ Steuer, Jonathan (1992). "Defining Virtual Reality: Dimensions Determining Telepresence". Journal of Communication. 42 (4): 73–93. doi:10.1111/j.1460-2466.1992.tb00812.x. ISSN 0021-9916.
  23. ^ M. Nerlich & U. Schächinger (Eds) - Integration of Health Telematics into Medical Practice - IOS Press, 2003
  24. ^ Westwood et al. (Eds.) - Medicine meets Virtual Reality 12: Studies in Health Technology & Informatics - W. Kahled, S. Reichling, O.T. Bruhns, H. Böse, M. Baumann, S. Egersdörfer, D. Klein, A. Tunayer, H. Freimuth, A. Lorenz, A. Pessavento & H. Ermert - Palpation Imaging using a Haptic System for Virtual Reality applications in Medicine. - pp 147-153, IOS Press, 2004
  25. ^ Evaluating the Effectiveness of a Telepresence-Enabled Cognitive Apprenticeship Model of Teacher Professional Development by R. Shawn Edmondson, Ph.D. Archived 2007-04-18 at the Wayback Machine
  26. ^ Heyboer, Kelly (10 April 2015). "Why is Rutgers merging its law schools? The deans answer your questions". The Star-Ledger. Retrieved 10 April 2015.
  27. ^ The Weave[permanent dead link], Virtual presence physical beings – from telegraph to telecast – a reflection on virtual beingness Ghislaine Boddington November 2000, commissioned by ResCen, Middlesex University
  28. ^ "Excited Atoms", Excited Atoms outlines a history of virtual mobility in performance, presents the main types of work with inspiring current examples and traces some of the most critical issues and motivations for artists, cultural producers and promoters to collaborate, share, make, question, present and innovate using virtual mobility.

Further reading

  • "Telepresence" (Minsky 1980; Sheridan 1992a; Barfield, Zelter, Sheridan, & Slater 1995; Welch, Blackmon, Liu, Mellers, & Stark 1996)
  • "Virtual Presence" (Barfield et al., 1995)
  • Oliver Grau: Virtual Art: From Illusion to Immersion, MIT-Press, Cambridge 2003 ISBN 0-262-57223-0
  • Eduardo Kac: Telepresence and Bio Art—Networking Humans, Rabbits and Robot (Ann Arbor: University of Michigan Press, 2005) ISBN 0-472-06810-5
  • "Being There" (Reeves 1991; Heeter 1992; Barfield et al., 1995; Zhoa 2003)
  • "A perceptual illusion of non-mediation" (Lombard & Ditton 1997)
  • "The suspension of disbelief" (Slater & Ushoh 1994)
  • Sheridan, Thomas B. (1992). Telerobotics, Automation, and Human Supervisory Control. Cambridge, MA: MIT Press. Second edition (2003) ISBN 0-262-51547-4
  • unknown title (Sheridan, 1992, 1992; Barfield & Weghorst, 1993; Slater & Usoh, 1994; Barfield, Sheridan, Zeltzer, Slater, 1995)