
Artificial Intelligence in Manufacturing: What Will it Look and Sound Like?

There is a growing revolution driven by the huge volumes of data now generated by manufacturing processes. Manufacturers are coming to understand that there is incredible value in collecting and processing this data, and in the conclusions that can be drawn from it. One use has drawn considerable attention: artificial intelligence and machine learning, and how they are changing the way manufacturers operate and run their businesses. AI is already in use and having an impact, from smart industrial robots to data analytics and machine learning. This article explores what the future might look, and sound, like in the years to come. Given the expected continued adoption of this technology, significant change is all but certain.

For background on how AI has already begun to revolutionize how manufacturing gets done, read “Artificial Intelligence (AI) in Manufacturing: The Revolution is Here.”

We can get some idea of where things are headed by looking at consumer markets. The success of Alexa, Siri, and similar technologies demonstrates that we humans like AI we can talk to. That’s why researchers are working on “Conversational AI Platforms” (CAPS): AI that can carry on human-like conversations. A closely related field of research is “Emotion AI.” As the name suggests, Emotion AI aims to give artificial intelligence the ability to read human emotions from words and facial expressions and to respond appropriately.

“May I take your order, human?”

While there is still a long way to go, commercial robots are starting to appear in customer-facing businesses. In Tokyo, Japan, Softbank has opened a café staffed by friendly robot servers, an initiative expected to help ease the pandemic-era shortage of restaurant workers. Watch this video to see “Pepper” and “Servi” in action:

It’s interesting to see how people, especially children, react to robots. Despite their unmistakably robotic appearance, some people can’t help seeing them as almost human.

The epitome of this trend is the robot Sophia, introduced by Hanson Robotics in 2016. The team behind her creation is not simply trying to make a smart robot; they want to make one that looks and acts human. As Sophia herself says (with an awkward smile), “I work with humans, so it’s important that people are comfortable around me.” She’s making progress: in 2017, she became a citizen of Saudi Arabia, making her the first artificial being to be granted citizenship by any nation.

Sophia is a remarkable machine, capable of holding a conversation reasonably well. But no one would mistake her for a real person. Her timing, mannerisms, and phrasing are always a little off. This is not a criticism of Sophia or her creators; it simply points to a fundamental reality. There is something peculiarly “alive” about facial expressions and voices, and the emotions they convey. It may be a long time before physical machines can look and act truly human.

But is that what we want – or need – from our artificial intelligence?

Function Over Form

AI with human qualities will have its place in many industries. MIT’s Sloan School of Management cites advertising, automobiles, call centers, mental health, and assisted services as a few examples.

We are already seeing production lines where collaborative robots, or “cobots,” work alongside people in a manufacturing environment. These robots are seen as a way to augment what humans can do, under human guidance, to improve productivity, output, and performance. And they do so safely, without ever taking a break!

Here is a video showing how Ford now uses cobots, working alongside humans, as part of the production process on one of its assembly lines:

It’s easy to dismiss the question of form as secondary in manufacturing, where the function of AI matters far more. And yet the ability to simulate human appearance and speech may become a reality in five or ten years, and there may be advantages we can’t foresee at the moment. At the very least, an android assistant would be a powerful status symbol. I can think of a few CEOs today who would jump at the chance to have one of the first!

My guess is that we won’t want AI to be too human-like, especially in industries such as manufacturing where it seems unnecessary. But even in service sectors, wouldn’t you want to know whether you’re dealing with a person or a machine? Think about the Softbank café again. What if those robot servers were indistinguishable from humans? Not only would the charm be gone, but there would be something deceptive and a little unsettling about it. (Several science fiction movies come to mind.) So even if CAPS and Emotion AI are perfected, I wouldn’t be surprised if most practical applications of AI are designed to be slightly imperfect in some way, just so we humans will feel more comfortable.

Perhaps someday we’ll be surrounded by robots that look and sound just like us, and we won’t know or care. Alexa, what do you think?

