Enhanced computer vision, sensors raise manufacturing stakes for robots as a service

RaaS is defining the second generation of robots that work alongside humans

For more than two decades, robotics market commentaries have predicted a shift, particularly in manufacturing, from traditional industrial manipulators to a new generation of mobile, sensing collaborative robots, or “cobots.” Cobots are agile assistants that use onboard sensors and AI processing to operate tools or manipulate components safely in a workspace shared with humans.

It hasn’t happened. Companies have successfully deployed cobots, but the rate of adoption is lagging behind expectations.

According to the International Federation of Robotics (IFR), cobots sold in 2019 made up just 3% of the total industrial robots installed. A report published by Statista projects that cobots’ market share will advance to 8.5% in 2022. Both figures fall far short of a February 2018 study cited by the Robotic Industries Association, which forecast that by 2025, cobots would account for 34% of the new robots sold in the U.S.

To see a cobot in action, watch the Kuka LBR iiwa in the video below. To ensure safe operation, cobots come with built-in constraints, such as limited strength and speed. Those same constraints have held back adoption.

[Video: the Kuka LBR iiwa cobot at work]

As cobots’ market share languishes, standard industrial robots are being retrofitted with computer vision technology, enabling collaborative work that combines the speed and strength of industrial robots with the problem-solving skills and finesse of humans.
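
To make the retrofit idea concrete, here is a minimal sketch of speed-and-separation monitoring, one of the collaborative operating modes described in ISO/TS 15066: a vision system reports the distance to the nearest person, and the controller scales the robot’s speed accordingly. The distances and the linear ramp below are illustrative assumptions, not any vendor’s actual safety logic.

    def speed_scale(nearest_person_m: float,
                    stop_dist_m: float = 0.5,
                    full_speed_dist_m: float = 2.0) -> float:
        """Return a speed multiplier in [0, 1] from the distance (in
        metres) to the nearest person seen by the vision system."""
        if nearest_person_m <= stop_dist_m:        # person too close: halt
            return 0.0
        if nearest_person_m >= full_speed_dist_m:  # area clear: full speed
            return 1.0
        # In between, ramp speed up linearly with distance.
        return (nearest_person_m - stop_dist_m) / (full_speed_dist_m - stop_dist_m)

    # e.g. a person detected 1.25 m away -> run at 50% of programmed speed
    assert speed_scale(1.25) == 0.5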

This article will document the declining interest in cobots, the reasons for it and the technology that is replacing them. We report on two firms developing computer vision technology for standard robots and describe how developments in 3D vision and so-called “robots as a service” (yes, RaaS) are defining this faster-growing second generation of robots that can work alongside humans.

What are robotics sensing platforms?

Read More

No-code industrial robotics programming startup Wandelbots raises $30 million

Dresden, Germany-based Wandelbots – a startup dedicated to making it easier for non-programmers to ‘teach’ industrial robots how to do specific tasks – has raised a $30 million Series B funding round led by 83North, with participation from Next47 and Microsoft’s M12 venture funding arm.

Wandelbots will use the funding to speed the market debut of its TracePen, a hand-held, code-free device that lets human operators quickly and easily demonstrate desired behavior for industrial robots to mimic. Programming robots to perform specific tasks typically requires massive amounts of code, written by programmers with very specific, in-demand skillsets; Wandelbots wants to make it as easy as showing a robot what you want it to do – and then showing it a different set of behaviors should you need to reprogram it for a new task or a different part of the assembly line.
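
The record-and-replay idea behind this kind of teaching can be sketched in a few lines of Python. This is not Wandelbots’ actual software; read_pen_pose and move_robot_to below are hypothetical stand-ins for a tracked-pen pose source and a robot motion command.

    import time
    from dataclasses import dataclass

    @dataclass
    class Waypoint:
        t: float    # seconds since the demonstration started
        xyz: tuple  # tool position in metres
        rpy: tuple  # tool orientation (roll, pitch, yaw) in radians

    def record_demonstration(read_pen_pose, duration_s=10.0, rate_hz=50):
        """Sample the demonstrated pose at a fixed rate into timestamped waypoints."""
        waypoints, start = [], time.monotonic()
        while time.monotonic() - start < duration_s:
            xyz, rpy = read_pen_pose()  # hypothetical pose source
            waypoints.append(Waypoint(time.monotonic() - start, xyz, rpy))
            time.sleep(1.0 / rate_hz)
        return waypoints

    def replay(waypoints, move_robot_to):
        """Drive the robot through the recorded trajectory, preserving its timing."""
        start = time.monotonic()
        for wp in waypoints:
            while time.monotonic() - start < wp.t:  # wait for the waypoint's timestamp
                time.sleep(0.001)
            move_robot_to(wp.xyz, wp.rpy)  # hypothetical motion command

In this model, reprogramming amounts to recording a new demonstration rather than rewriting code, which is the flexibility argument in a nutshell.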

The software that Wandelbots developed to make this possible originally sprang from work done at the Faculty of Computer Science at the Technical University of Dresden. The startup was a finalist in our TechCrunch Disrupt Battlefield competition in 2017, and raised a $6.8 million Series A round in 2018 led by Paua Ventures, EQT Ventures and others.

Wandelbots already has some big-name clients, including industrial giants like Volkswagen, BMW and Infineon, and as of June 17 it’ll be launching its TracePen publicly for the first time. The company’s technology has the potential to save anyone who uses industrial robots many months of programming time, and the associated costs – and could ultimately make this kind of robotics practical even for smaller companies that previously found the budget requirements out of reach.

I asked Wandelbots CEO and co-founder Christian Piechnick via email whether their platform can overcome some of the challenges companies including Tesla have faced with introducing ever-greater automation to their manufacturing facilities.

“The reversals regarding automation were caused by the inflexibility, complexity and cost introduced by automation with robots,” Piechnick told me via email. “People are usually not aware that 75% of the total cost of ownership of a robot comes from software development. The problems introduced by robots were killing the benefit. This is exactly the problem we are tackling. We enable manufacturers to use robots with an unseen flexibility and we dramatically lower the cost of using robots. Our product enables non-programmers to easily teach a robot new tasks and thus, reduces the involvement of hard-to-find and costly programmers.”

TracePen, the device and companion platform that Wandelbots is launching this week, is actually an evolution of the company’s original vision, which focused more on using smart clothing to fully model human behavior in real time and translate it into robotic instruction. The pivot to TracePen employs the same underlying software, but meets customers much closer to where they already are in terms of processes and operations, while still providing the same cost reductions and flexibility, according to Piechnick.

I asked Piechnick how COVID-19 has impacted Wandelbots’ business; he replied that it has, in fact, driven up demand for automation, and for the efficiencies automation brings, in a number of key ways.

“COVID-19 has impacted the thinking on global manufacturing in various ways,” he wrote. “First there is the massive trend of reshoring to reduce the risk of globally distributed supply chains. In order to scale volume, ensure quality and reduce cost, automation is a natural consequence for developed countries. With a technology that leads to almost immediate ROI and extremely short time-to-market, we hit a trend. Furthermore, the dependency on human workers and the workplace restrictions (e.g., distance between workers) increases the demand for automation tremendously.”

Read More

Robotics startup lets machines get closer as humans keep their distance

As humans get used to working at a distance from each other, a startup in Massachusetts is providing sensors that bring industrial robots in close — centimeters away, in fact. The same technology may support future social distancing efforts on commutes, in a pilot application that allows more subway trains to run on a single track.

Humatics, an MIT spinout backed by Lockheed Martin and Airbus, makes sensors that enable fast-moving and powerful robots to work alongside humans without accidents. If daily work and commuting ever go back to normal, the company believes the same precision can improve aging and crowded infrastructure, enabling trains and buses to run closer together, even as we all may have to get used to working further apart.

This is the emerging field of microlocation robotics — devices and software that help people and machines navigate collaboratively. Humatics has been testing its technology with New York’s MTA since 2018, and today is tracking five miles of a New York subway, showing the transportation authority where six of its trains are, down to the centimeter.

UWB sensors for microlocation

Humatics’ technology in the MTA pilot uses ultra-wideband (UWB) radio frequencies, which are less failure-prone than Wi-Fi, GPS and cameras.

“A good example of a harsh environment is a subway tunnel,” said David Mindell, co-founder of Humatics and professor of engineering and aerospace at MIT. “They are full of dust, the temperatures can range from subzero to 100 degrees, and there is the risk of animals or people tampering with devices. Working inside these tunnels is difficult and potentially dangerous for crews, also.”

Humatics has sold more than 10,000 UWB radio beacons, the base unit for its real-time tracking system, to manufacturers of sensor systems, the company says. The beacons pinpoint the locations of hundreds of RFID tags at a range of 500 meters, using multiple tags on an object to measure its orientation.
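
Centimeter-level positioning from radio beacons generally comes down to multilateration: each beacon measures its range to a tag, and the tag’s position is the point most consistent with all of the ranges. Below is the textbook least-squares version as a generic illustration; it is not Humatics’ proprietary algorithm, and the beacon layout and ranges are made up.

    import numpy as np

    def multilaterate(anchors, ranges):
        """Estimate a tag's position from ranges to beacons at known positions.

        Linearizes ||x - a_i||^2 = r_i^2 against the first beacon and solves
        the resulting overdetermined system by least squares.
        """
        anchors = np.asarray(anchors, dtype=float)  # shape (n, 3)
        ranges = np.asarray(ranges, dtype=float)    # shape (n,)
        a0, r0 = anchors[0], ranges[0]
        # For i >= 1:  2 * (a_i - a0) . x = r0^2 - r_i^2 + |a_i|^2 - |a0|^2
        A = 2.0 * (anchors[1:] - a0)
        b = (r0**2 - ranges[1:]**2
             + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
        x, *_ = np.linalg.lstsq(A, b, rcond=None)
        return x

    # Four beacons at known positions (metres) and their measured ranges;
    # the recovered tag position is roughly (15, 15, 0).
    beacons = [(0, 0, 0), (30, 0, 0), (0, 30, 0), (30, 30, 5)]
    print(multilaterate(beacons, ranges=[21.2, 21.2, 21.2, 21.8]))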

Read More

MIT muscle-control system for drones lets a pilot use gestures for accurate and specific navigation

[Video: MIT CSAIL’s muscle-signal drone control demonstration]

MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) has released a video of its ongoing work using input from muscle signals to control devices. The latest demonstration involves full, fine-grained control of drones, using just hand and arm gestures to navigate through a series of rings. The work is impressive not just because the team is using biofeedback to control the devices, instead of optical or other kinds of gesture recognition, but also because of how specific the controls can be, opening up a range of potential applications for this kind of remote tech.
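
The processing chain behind this kind of control is conceptually simple, even though CSAIL’s actual system is far more sophisticated: rectify and smooth the raw electromyography (EMG) trace to estimate muscle activation, then map activation levels to commands. Everything below (the thresholds, the two-muscle setup, the command names) is an invented illustration.

    import numpy as np

    def emg_envelope(raw: np.ndarray, window: int = 50) -> np.ndarray:
        """Estimate muscle activation: remove the DC offset, rectify,
        then smooth with a moving average."""
        rectified = np.abs(raw - raw.mean())
        kernel = np.ones(window) / window
        return np.convolve(rectified, kernel, mode="same")

    def to_command(biceps: float, forearm: float,
                   strong: float = 0.6, light: float = 0.3) -> str:
        """Map two activation levels to a coarse drone command (invented mapping)."""
        if biceps > strong:
            return "ascend"
        if forearm > strong:
            return "descend"
        if biceps > light or forearm > light:
            return "hover"
        return "cruise"

A real system would add per-user calibration and a learned classifier in place of these fixed thresholds.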

This particular group of researchers has been looking at different applications for the technology, including its use in collaborative robotics for industrial settings. Drone piloting is another area with big real-world potential, especially once you start to imagine entire flocks taking flight, with a pilot given a VR view of what they can see. That could be a great way to do site surveying for construction, for example, or remote inspection of offshore platforms and other infrastructure that’s hard for people to reach.


Seamless robot/human interaction is the ultimate goal of the team working on this tech: just as we intuitively move and manipulate our environment, they believe controlling and working with robots should feel equally smooth. Thinking and doing happen essentially in parallel when we interact with our environment directly, but when we act through machines or remote tools, something is often lost in translation, resulting in a steep learning curve and a lot of required training.

Cobotics, the field focused on building robots that can safely work alongside and in close collaboration with humans, would benefit greatly from advances that make the interaction between people and robotic equipment more natural, instinctive and, ultimately, safe. MIT’s research in this area could result in future industrial robotics products that require less training and programming to operate at scale.

Read More