Enhanced computer vision, sensors raise manufacturing stakes for robots as a service

RaaS is defining the second generation of robots that work alongside humans

For more than two decades, robotics market commentaries have predicted a shift, particularly in manufacturing, from traditional industrial manipulators to a new generation of mobile, sensing robots called “cobots.” Cobots are agile assistants that use internal sensors and AI processing to operate tools or manipulate components in a shared workspace while maintaining safety.

It hasn’t happened. Companies have successfully deployed cobots, but the rate of adoption is lagging behind expectations.

According to the International Federation of Robotics (IFR), cobots made up just 3% of all industrial robots installed in 2019. A report published by Statista projects that cobots’ market share will reach 8.5% in 2022. That is a fraction of the figure forecast in a February 2018 study cited by the Robotic Industries Association, which predicted that by 2025, 34% of new robots sold in the U.S. would be cobots.

To see a cobot in action, here’s the Kuka LBR iiwa. To ensure safe operation, cobots come with built-in constraints, such as limited strength and speed. Those same constraints have also held back their adoption.

[Video: the Kuka LBR iiwa cobot in action]

As cobots’ market share languishes, standard industrial robots are being retrofitted with computer vision technology, allowing for collaborative work combining the speed and strength of industrial robots with the problem-solving skills and finesse of humans.

This article will document the declining interest in cobots, the reasons behind it, and the technology replacing them. We report on two firms developing computer vision technology for standard robots and describe how developments in 3D vision and so-called “robots as a service” (yes, RaaS) are defining this faster-growing second generation of robots that can work alongside humans.

What are robotics sensing platforms?


MIT muscle-control system for drones lets a pilot use gestures for accurate and specific navigation

[Video: MIT CSAIL’s muscle-signal drone control demonstration]

MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) has released a video of its ongoing work using input from muscle signals to control devices. The latest demonstration shows full, fine-grained control of drones, using just hand and arm gestures to navigate through a series of rings. The work is impressive not only because it uses biofeedback to control the devices, instead of optical or other kinds of gesture recognition, but also because of how specific the controls can be, opening up a range of potential applications for this kind of remote tech.
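The control loop described above — muscle activation classified into coarse gestures, and gestures mapped to flight commands — can be sketched roughly as follows. The thresholds, gesture names, and command set here are illustrative assumptions for the sake of the sketch, not details of CSAIL’s actual system:

```python
# Hypothetical sketch of an EMG-gesture-to-drone-command pipeline.
# All names and thresholds are illustrative assumptions.

def classify_gesture(emg_window):
    """Classify a window of raw EMG samples into a coarse gesture
    by thresholding the mean rectified amplitude (the 'envelope')."""
    envelope = sum(abs(s) for s in emg_window) / len(emg_window)
    if envelope < 0.1:
        return "relax"          # low muscle activation
    elif envelope < 0.5:
        return "clench"         # moderate activation
    return "strong_clench"      # high activation

# Map each gesture to a drone command (purely illustrative).
COMMANDS = {
    "relax": "HOVER",
    "clench": "FORWARD",
    "strong_clench": "ASCEND",
}

def emg_to_command(emg_window):
    """Turn one window of muscle-signal samples into a flight command."""
    return COMMANDS[classify_gesture(emg_window)]
```

A real system would use multi-channel EMG, learned classifiers, and continuous rather than discrete commands, but the basic structure — signal window in, gesture out, command out — is the same.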

This particular group of researchers has been exploring different applications for the technology, including collaborative robotics for industrial settings. Drone piloting is another area with big potential real-world benefits, especially once you imagine entire flocks taking flight with a pilot given a VR view of what the drones can see. That could be a great way to survey construction sites, for example, or to remotely inspect offshore platforms and other infrastructure that is hard for people to reach.


Seamless robot/human interaction is the team’s ultimate goal: just as we intuitively sense our own movements and manipulate our environment with ease, they believe controlling and working with robots should feel equally smooth. When we interact with our environment directly, thinking and doing happen essentially in parallel; when we act through machines or remote tools, something is often lost in translation, resulting in a steep learning curve and a lot of required training.

Cobotics, the industry focused on building robots that can safely work alongside and in close collaboration with humans, would benefit greatly from advances that make the interaction between people and robotic equipment more natural, instinctive and, ultimately, safe. MIT’s research in this area could lead to future industrial robotics products that require less training and programming to operate at scale.
