VR system from Computer Science and Artificial Intelligence Laboratory could make it easier for factory workers to telecommute

4 October 2017 - MASSACHUSETTS INSTITUTE OF TECHNOLOGY, CSAIL

Many manufacturing jobs require a physical presence to operate machinery. But what if such jobs could be done remotely? This week researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) presented a virtual-reality (VR) system that lets you teleoperate a robot using an Oculus Rift headset.

The system embeds the user in a VR control room with multiple sensor displays, making it feel like they are inside the robot's head. By using gestures, users can match their movements to the robot's to complete various tasks.

"A system like this could eventually help humans supervise robots from a distance," says CSAIL postdoctoral associate Jeffrey Lipton, who was lead author on a related paper about the system. "By teleoperating robots from home, blue-collar workers would be able to telecommute and benefit from the IT revolution just as white-collar workers do now."

The researchers even imagine that such a system could help employ increasing numbers of jobless video-gamers by "game-ifying" manufacturing positions.

The team demonstrated their VR control approach with the Baxter humanoid robot from Rethink Robotics, but said that the approach can work on other robot platforms and is also compatible with the HTC Vive headset.

Lipton co-wrote the paper with CSAIL director Daniela Rus and researcher Aidan Fay. They presented the paper this week at the IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) in Vancouver.

How it works

There have traditionally been two main approaches to using VR for teleoperation.

In a "direct" model, the user's vision is directly coupled to the robot's state. With these systems, a delayed signal could lead to nausea and headaches, and the user's viewpoint is limited to one perspective.

In the "cyber-physical" model, the user is separate from the robot and instead interacts with a virtual copy of the robot and its environment. This requires much more data and specialized spaces.

The CSAIL team's system is halfway between these two methods. It solves the delay problem, since the user is constantly receiving visual feedback from the virtual world. It also solves the cyber-physical model's problem of the user being distinct from the robot: once a user puts on the headset and logs into the system, they feel as if they are inside Baxter's head.

The system mimics the "homunculus model of mind", the idea that there's a small human inside our brains controlling our actions, viewing the images we see and understanding them for us. While it's a peculiar idea for humans, for robots it fits: "inside" the robot is a human in a control room, seeing through its eyes and controlling its actions.

Using Oculus' controllers, users can interact with controls that appear in the virtual space to open and close the hand grippers to pick up, move, and retrieve items. A user can plan movements based on the distance between the arm's location marker and their hand while looking at the live display of the arm.
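The paper does not publish this control logic, but the distance-based cue described above can be sketched as a simple threshold check. Everything here — the function names and the threshold value — is a hypothetical illustration, not the system's actual code.

```python
import math

# Illustrative value only; the paper does not specify a grasp threshold.
GRASP_THRESHOLD = 0.05  # metres

def within_grasp_range(arm_marker, hand):
    """True when the tracked hand is close enough to the arm's location
    marker that a grasp command would make sense."""
    return math.dist(arm_marker, hand) <= GRASP_THRESHOLD
```

In a real system this check would feed into the gripper command pipeline, alongside the controller's trigger input.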

To make these movements possible, the human's space is mapped into the virtual space, and the virtual space is then mapped into the robot space to provide a sense of co-location.
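The two-stage mapping — human space into virtual space, virtual space into robot space — is the kind of chain typically expressed as composed rigid-body transforms. The sketch below is an assumed reconstruction of that idea, not the paper's code; the calibration transforms are invented for illustration.

```python
import numpy as np

def transform(rotation_deg_z, translation):
    """Build a 4x4 homogeneous transform: rotation about z, then translation."""
    t = np.radians(rotation_deg_z)
    T = np.eye(4)
    T[:2, :2] = [[np.cos(t), -np.sin(t)], [np.sin(t), np.cos(t)]]
    T[:3, 3] = translation
    return T

# Hypothetical calibration transforms (not the paper's actual values):
human_to_virtual = transform(0, [0.0, 0.0, -1.2])   # seat the user in the control room
virtual_to_robot = transform(90, [0.5, 0.0, 0.0])   # align the control room with Baxter's frame

def map_hand_to_robot(hand_pos_human):
    """Map a tracked hand position (human frame) into the robot's frame."""
    p = np.append(hand_pos_human, 1.0)              # homogeneous coordinates
    return (virtual_to_robot @ human_to_virtual @ p)[:3]
```

Composing the two transforms once per frame gives the co-location effect: the user's hand motion lands in the right place in the robot's workspace.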

The system is also more flexible than previous systems that require many resources. Other systems might extract 2-D information from each camera, build out a full 3-D model of the environment, and then process and redisplay the data.

In contrast, the CSAIL team's approach bypasses all of that by taking the 2-D images that are displayed to each eye. (The human brain does the rest by automatically inferring the 3-D information.)
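The contrast between the two pipelines can be made concrete with a minimal sketch: instead of fusing camera feeds into a 3-D model, each robot camera's raw 2-D frame is simply routed to the matching eye display. The function and data shapes below are assumptions for illustration, not the system's actual interface.

```python
# Illustrative sketch (not the paper's code): forward each robot camera's
# raw 2-D frame to the matching eye display; the viewer's brain infers
# depth from the stereo pair, so no 3-D reconstruction step is needed.

def route_stereo(left_frame, right_frame, headset):
    """Send 2-D camera frames straight to the per-eye displays."""
    headset["left_eye"] = left_frame
    headset["right_eye"] = right_frame
    return headset
```

Skipping the reconstruction step is what keeps the bandwidth and compute requirements low compared with the "cyber-physical" approach.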

To test the system, the team first teleoperated Baxter to do simple tasks like picking up screws or stapling wires. They then had the test users teleoperate the robot to pick up and stack blocks.

Users completed the tasks at a much higher rate than with the "direct" model. Unsurprisingly, users with gaming experience found the system much easier to use.

Tested against state-of-the-art systems, CSAIL's system was better at grasping objects 95 percent of the time and 57 percent faster at doing tasks. The team also showed that the system could pilot the robot from hundreds of miles away, testing it on a hotel's wireless network in Washington, DC to control Baxter at MIT.

Photo: VR system from Computer Science and Artificial Intelligence Laboratory could make it easier for factory workers to telecommute. (Credit: Jason Dorfman, MIT CSAIL)

"This contribution represents a major milestone in the effort to connect the user with the robot's space in an intuitive, natural, and effective manner," says Oussama Khatib, a computer science professor at Stanford University who was not involved in the paper.

The team eventually wants to make the system more scalable, supporting many users and multiple types of robots, including platforms compatible with current automation technologies.

Source: MASSACHUSETTS INSTITUTE OF TECHNOLOGY, CSAIL
