June 22, 2020
Industrial robotics: computer vision and ML converge
If there is one core trend in industrial robotics in 2020, in spite—or perhaps because of—the current coronavirus crisis, it is a steady increase in demand. Added to that is growing interest from industrial sectors outside of manufacturing, the traditional mainstay of robotics research and development.
Among the catalysts for this interest is the coming together of advanced robotic hardware and AI technologies, such as machine learning and computer vision. It’s an amalgamation that’s finally giving developers the power to uncage automated equipment and set it to a multitude of tasks across diverse industrial settings.
Just as importantly, it enables functionality that minimizes danger to humans, so people and machines can safely work together on tasks that benefit from a combination of manual and mechanical processes. Of course, awareness of the broader trends might not be sufficient to decide on the value of investment in robotic automation. However, the following brief analysis of underlying developments and driving forces could be a source of interest and, perhaps, enlightenment.
Let’s take a look first at what’s behind the growth in the use of industrial robots. Why have they become pervasive in an expanding range of verticals, and what’s stimulating expert predictions such as that by Mordor Intelligence, stating that by 2024 the industrial robotics market will be worth some $40 billion?
The drivers of robotics adoption can be split effectively into two categories—those arising from the industrial environment and its evolution, and those due to the technology itself.
Many industries are feeling the effects of skilled labor shortages. At the same time, companies are reluctant to invest heavily in training and developing unskilled employees, for fear of losing them afterward through defection to competitors. With no end in sight to the workforce shortfall, the appeal of robots as an efficient supplement, and even replacement, for human labor is continuing to grow.
Efficiency may be one of the most compelling arguments for companies substituting human labor with robots, as the latter do not need to sleep, eat, or take breaks. They can perform repetitive tasks with no variation in work quality, and don't get sick, bored, or distracted.
Then there is the safety element. Many industrial applications require the use of stacking, lifting, and cutting equipment. These types of tasks present some degree of risk to human operators, but none to robots.
Perhaps one of the most influential forces acting on the rise of industrial robotics currently is the global impact of the COVID-19 pandemic. Never before have human fragility and the imperviousness of the machine been so visibly compared and universally highlighted.
It's possible that social distancing will become the norm for the foreseeable future, as will the general undesirability of having people working together at close quarters. It’s no surprise, therefore, that the health crisis has spurred more industrial enterprises to look into the possibility of replacing some manual processes with those performed by intelligent automated machinery.
In a nutshell, enterprises are looking to technology for improvements in flexibility, safety, output, quality, and, naturally, cost reduction. Static, caged-in robots have been helping to deliver these benefits for some time in specific sectors, such as the automotive industry.
However, as automation converges with computer vision and machine learning, the advancing capabilities of industrial robotics are bringing the technology to the attention of many other industries.
When technology experts, such as robotics and machine learning consultants, work hand in hand to enhance robots’ abilities, the results can be astounding. This amalgamation of cutting-edge technologies has begun to transform industrial robots from operational fixtures to something more like an electronic workforce. They are gaining the capability to navigate within their environment and execute tasks in cooperation with humans.
The ability to see in 3D enables robots to distinguish visually between an item and its background, and machine-learning algorithms allow them to recognize and identify complex shapes. Hardware and programming advances are facilitating the creation of more versatile mobile skeletons and sensor-equipped tooling that can safely handle delicate and fragile items, such as electronics and food products.
These developments are responsible for the broadening appeal of robotics in industries not traditionally considered suitable for their use—think life sciences, consumer goods, and food and beverages, for example.
Machine learning and artificial intelligence are also making robots able to cooperate safely and work alongside humans, a shift that has led to the adoption of the term cobots.
While these downsized multitaskers of the automated world are still in their infancy, they are stimulating interest among large corporations and smaller businesses alike. As they continue to become more affordable, their versatility will persuade many more SMEs to take the idea of industrial robotics implementation seriously.
The cost of robotics is generally falling, and alternative business models like robotics-as-a-service (RaaS) make industrial robots accessible even to companies that don’t have substantial capital budgets to draw on. The affordability of the units themselves, along with the fact that programming is becoming more straightforward and hence less costly, is also boosting the appeal of industrial robotics adoption.
Ultimately it comes down to the question of “what can industrial robots do within my enterprise?”
The answer is no longer limited to spot welding and paint spraying. With machine learning and computer vision integrated into robotics software, the door is opening for robots to take on a much broader spectrum of industrial tasks.
This growth in capability, along with the untethering of robots and their improved suitability for working safely alongside people, combines with the external drivers mentioned earlier. Together, these forces are finally bringing about a shift that will soon make robots, especially cobots, a commonplace sight in factories, warehouses, and similar industrial environs around the world.
As cobots seem to be potentially the next big thing in industrial robotics, and interest in them is surging, let’s stay with that topic for a few moments and look at the key trends impacting robots’ use in industry and commerce.
With computer vision and machine learning applied to their operating systems, robots can take on tasks that were once the preserve of human workers.
For example, when equipped with IoT technology, especially sensors, cobots can be programmed to pick up and handle objects using neural networks and other machine-learning models. Conventional industrial robots could only do this if supplied with predetermined positioning and retrieval trajectories.
Next-generation cobots can pick up items from non-fixed locations, using 3D vision and laser-triangulation. Better still, the more objects they pick up, the better they become at handling them, due to the continuous iterative refinements of the machine-learning algorithm.
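That feedback loop can be illustrated with a deliberately simple sketch. The class below is hypothetical and stands in for the real machine-learning pipeline: it tracks the success rate of each candidate grasp pose and favors the most reliable one, so the robot's handling improves as attempts accumulate.

```python
class GraspSelector:
    """Hypothetical sketch of iterative grasp refinement.

    Each candidate grasp pose keeps a running success estimate that is
    updated after every attempt, so the robot converges on the most
    reliable grasp for an item type. Real systems use neural networks
    over 3D vision data; this stands in for that feedback loop only.
    """

    def __init__(self, candidate_poses):
        # [successes, attempts] per pose, with a mild optimistic prior
        # so every pose gets tried at least occasionally.
        self.stats = {pose: [1, 2] for pose in candidate_poses}

    def choose(self):
        # Exploit the pose with the best estimated success rate.
        return max(self.stats, key=lambda p: self.stats[p][0] / self.stats[p][1])

    def record(self, pose, succeeded):
        s, n = self.stats[pose]
        self.stats[pose] = [s + (1 if succeeded else 0), n + 1]
```

After a few recorded outcomes, `choose()` shifts toward whichever pose has actually worked, which is the essence of the "the more they pick, the better they get" behavior described above.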
Items that are not rigid still present some handling challenges for industrial robots, but advances are taking place rapidly. The latest tooling innovations enhance dexterity and sensitivity, typically utilizing sensors and pneumatic or vacuum-based actuation.
As spatial awareness and navigation capabilities mature, industrial robots will be ready to make the ultimate breakthrough—working safely alongside humans in generic, rather than specialized, factory and warehouse environments. Many current cobot models can do this now but not efficiently, as they are constrained in how they interact.
For instance, when a mobile cobot detects an obstacle in its way, its only recourse to prevent a collision may be to stop, remaining stationary until the obstruction is removed. Artificial intelligence is beginning to change things, though, enabling cobots to map the environment around themselves. They will be able to spot collision risks en route from A to B and reroute around those obstructions while still following an efficient path.
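The rerouting idea can be reduced to a toy example. The planner below runs breadth-first search over a small occupancy grid; real cobots use richer SLAM maps and more sophisticated planners, but the principle is the same: when a new obstacle appears on the map, re-run the planner instead of halting.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Breadth-first search over an occupancy grid (1 = obstacle).

    Illustrative sketch only. Returns a list of (row, col) cells from
    start to goal avoiding obstacles, or None if no route exists (the
    robot then falls back to stopping and waiting).
    """
    rows, cols = len(grid), len(grid[0])
    queue, seen = deque([(start, [start])]), {start}
    while queue:
        (r, c), path = queue.popleft()
        if (r, c) == goal:
            return path
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), path + [(nr, nc)]))
    return None  # no collision-free route; stop and wait
```

If a pallet is dropped in the aisle, marking those cells as occupied and replanning yields a detour rather than a stalled robot.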
When working in shared environments with humans, the AI-equipped cobot will be able to switch between optimal and safe levels of force and speed while performing tasks. Safe mode will be activated when a cobot detects people nearby, with a return to optimal performance when the space surrounding the machine is clear of human presence.
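As a rough sketch, that mode arbitration might look like the function below. The distance threshold, speed, and force figures are illustrative assumptions, not certified values; real collaborative robots implement speed-and-separation monitoring under the ISO/TS 15066 specification with safety-rated sensors.

```python
def select_mode(min_human_distance_m, safe_radius_m=1.5):
    """Toy sketch of safe/optimal mode switching for a cobot.

    If the nearest detected person is within the (assumed) safe radius,
    the robot caps its speed and force; otherwise it runs at optimal
    performance. All numeric limits here are illustrative only.
    """
    if min_human_distance_m < safe_radius_m:
        return {"mode": "safe", "max_speed_mps": 0.25, "max_force_n": 65}
    return {"mode": "optimal", "max_speed_mps": 1.0, "max_force_n": 150}
```

In practice this check runs continuously against live sensor data, so the robot eases off the moment a person approaches and resumes full speed once the space is clear.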
Cobots with these capabilities are already in evidence across a range of industrial applications.
Examples of cobots already deployed include Sawyer, a multi-purpose robotic arm, which sells for little more than the price of an executive saloon car. Sawyer can be programmed with a tablet or by following a human operative’s arm and hand movements.
One Tennessee injection molding company, Tennplasco, used Sawyer to solve a labor shortage and extend its operation to run 24 hours a day, achieving a return on investment in under four months. Working alongside humans to assemble parts, Sawyer enabled Tennplasco to halve the size of its assembly work cells, from four to two persons.
With its ease of programming and small form-factor, the cobot was up and running within a week. Examples like this prove that industrial robotics technology is available to SMEs today, and can make an appreciable difference to operating costs and, ultimately, profitability.
As the Sawyer example illustrates, cobots are ideal for human/robot co-working situations and can be adapted for a wide range of tasks. However, even when a single task must be performed with precision and efficiency, the integration of industrial robotics, AI, and 3D computer vision changes the game.
Wherever there is a manufacturing or production operation, there is a need to remove cartons from pallets. Machinery has been deployed to perform this task for some years. Until recently, though, those machines could only remove an entire layer of cartons and place them so a human worker could move them, one by one, onto a conveyor.
For a robot to depalletize cartons individually, it must be supported by a combination of image analysis and machine learning. That's just what the Photoneo Depalletizer, from a robotic intelligence vendor based in Slovakia, uses to unload individual cases from their pallets at a rate of 1,000 per hour.
The system uses a convolutional neural network to analyze 3D visual data and physical box texture to unload pallets and place the cartons directly where they are required. It can place them onto a floor space or a conveyor system, lifting weights of up to 50 kilograms and sparing human employees from the risk of lifting and bending injuries. Plus, of course, it can do all this non-stop without ever becoming fatigued or needing a break.
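To make the logic concrete, here is a generic heuristic for sequencing the unload, assuming the convolutional network has already returned per-box detections as `(x, y, top_z)` coordinates in metres. This is an illustration of the task, not the actual Photoneo algorithm.

```python
def unload_order(detected_boxes):
    """Order CNN box detections for depalletizing (hypothetical format).

    Each detection is an (x, y, top_z) tuple in metres. Unloading the
    highest boxes first keeps the remaining stack stable — a generic
    heuristic standing in for the vendor's proprietary sequencing.
    """
    return sorted(detected_boxes, key=lambda box: -box[2])
```

The robot then picks each carton in that order and places it on the floor or conveyor, re-scanning the pallet between picks as detections shift.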
Interestingly enough, Rethink Robotics, the company behind Sawyer and its lower-priced sister robot Baxter, ceased operations in 2018. The development and distribution of Sawyer were subsequently taken over by HAHN Group. Observers widely stress, though, that Rethink Robotics' demise was not down to any lack of buyer interest in its innovations. Instead, the company failed because it could not compete in what has become a fiercely contested market.
Of course, this competition is good news for companies wishing to increase the level of automation in their operations. So too is the splintering of the market into manufacturers and vendors specializing in robot chassis and others focused on components such as 3D cameras and effectors.
If you're not familiar with the term, effectors are the tools used at the end of a robot arm to pick up items. For industrial robots to adapt to various tasks, their design must allow them to accept and work with a range of different effectors.
Suppliers have recognized this and are building their robots to allow fast tooling changeovers, while dedicated manufacturers are developing effectors for hundreds of specialized applications.
As interest intensifies in the implementation of industrial robotics, particularly among smaller businesses, the competition among robot manufacturers and vendors will undoubtedly continue to grow. Indeed, according to a report from Supply Chain Dive, the International Federation of Robotics expects sales of robot units to increase by more than 10% during 2020, potentially stimulating many new manufacturers and vendors to enter the market.
Over the next three or four years, we're likely to see greater diversification in the activities for which industrial robots are deployed. The machines themselves will gain increasingly advanced capabilities.
This article has discussed several types of robots, but with the primary focus on portable yet not necessarily mobile robotic arms. The robot of the near future will probably become more mobile, more dexterous, and, at risk of introducing the fear factor, more humanlike.
Robots that possess features that people can relate to, like eyes, have already shown themselves to be more readily accepted in co-working environments.
The products mentioned above from Rethink Robotics, Sawyer and Baxter, have expressive eyes and a head-like screen that turns in the direction of an anticipated arm movement. That makes it more comfortable for humans to work alongside such robots: people subconsciously register the head movement, so the arm movement that follows comes as less of a surprise.
It is subtleties like this that will probably be incorporated into the next generation of industrial robots. Many will also come equipped with multiple, rather than single, arms, which can work in unison or independently, allowing one robot to fulfill several tasks concurrently.
Mobility, too, is likely to become a more mainstream and less novel feature of industrial robots. Today, we tend to see mobile robots as those that can do little more than move payloads while static machines manipulate products and materials.
Soon enough, these capabilities will be integrated more effectively, enabling a robot in a logistics environment, for instance, to move around a warehouse at will. It will be able to pick and pack mixed products in the same way that a human warehouse operative would—only with higher speed, greater accuracy, and improved efficiency.
In a factory setting, robots similar to those in warehouses will be able to deploy and redeploy from task to task as needed, changing their end-of-arm tooling in an instant, as necessary. They will use advanced navigation to move around spaces to assemble large structures, load and unload machines, or carry parts and equipment from one place to another.
There is also no certainty that mobility will be limited to a horizontal plane. By integrating drone technology, it will be possible to have small robots that can navigate internal and external vertical spaces in factory or warehouse facilities. That will make them invaluable for inspection, light repair, or maintenance tasks in areas where humans can't go without jeopardizing their safety.
Machine learning, computer vision, and other AI technologies are changing the game in industrial robotics, and the prognosis is that adoption will grow exponentially between now and 2025. A 2019 study by Boston Consulting Group found that 86% of companies, across all sectors, plan to integrate advanced robotics into their operations within the next five years.
The current coronavirus pandemic is unlikely to change the minds of companies among that number, and may well accelerate adoption rates. After all, the world is witnessing first-hand how a natural event can cripple industries reliant on people. Meanwhile, industrial robotics is demonstrating that such disruption need not be inevitable.
It will be interesting to see which way future trends will swing. As the world settles down into a new normal, will it be one in which enterprises like yours turn to robots for industrial resilience and robustness? It would not be a surprising outcome at all.