Robots Have Learned How to Speak Body Language

We usually pick up body language by observing people and our surroundings, or by mimicking behaviour. Artificial intelligence can also help capture the full context of a conversation, as discussed on technology sites like robots.net.

We can often tell whether a person is lying, for instance, just by watching his or her body language, and observing how people act is a common way to get to know them. Studies suggest that about half of interpersonal communication consists of nonverbal cues, and some researchers estimate that 70% to 90% of social interaction is carried by body language. Posture and facial expressions are its most common forms.

Human Body Language

Body language is the subtle set of messages we send and receive nonverbally. Made up of body movements and facial expressions, this nonverbal communication plays a large part in our daily lives. Human body language can be broken down into a few common channels:

Body Proxemics


Proxemics concerns how we move our bodies and position them in space. Gestures are a good example: they reveal a great deal about a person's real feelings, such as nervousness, and these cues are easy to spot because the body moves so frequently.

Ornaments

Ornaments are extensions of human body language. They include the sunglasses, clothes, and jewelry we wear, and even our hairstyles. Beyond what we wear, how we handle these ornaments also sends nonverbal cues.

Facial Expressions

Facial expressions are an essential part of body language. Facial gestures are often associated with intense feelings, and by observing someone's face we can identify his or her hidden emotions. When someone is angry, sad, or upset, their facial expression is usually the first thing we notice.

Historical Background of Robotics


The earliest precursors of robots appeared in ancient Egypt. Around 3000 B.C., Egyptians built mechanical devices such as water clocks with figurines that struck the hour bells, machines designed to carry out simple physical tasks.

The earliest modern robot was invented in the 1950s by George C. Devol. His invention, called "Unimate," was the first reprogrammable manipulator of the modern era. Devol tried to sell his creation but did not succeed. An engineer and businessman named Joseph Engelberger saw its potential, acquired the rights in the late 1950s, modified it, and turned it into an industrial robot, marking the emergence of the robotics business.

Joseph Engelberger went on to found a robotics company named Unimation. The company was a success, and for his part in that triumph Engelberger is now known as the "Father of Robotics."

Charles Rosen and his research team later developed a robot named "Shakey," which is considered more advanced than Unimate. Using its television-camera eye, Shakey could wander around a room and observe the scene.

How Robots Analyse Human Body Language


Deciphering nonverbal messages is a distinctly human capability; reading someone's body language without any explanation is almost instinctive to us. Can robots do the same? Robots have advanced facial recognition and computer vision systems, yet they still have difficulty recognizing the small details of human body language.

To address this problem, researchers from Carnegie Mellon University (CMU) created a body-tracking system designed to read and track every small detail of nonverbal communication. Called OpenPose, it can process video streams and track multiple people at once. The system simplifies interactions between humans and robots.
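
CMU has released OpenPose as open-source software, and any modern pose-estimation library illustrates the same basic idea. Below is a minimal sketch, here using Google's MediaPipe Pose library as a stand-in rather than OpenPose itself, that detects and draws body keypoints in each frame from a default webcam (the camera index and confidence thresholds are illustrative assumptions):

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # assumption: default webcam; a video file path also works
with mp_pose.Pose(min_detection_confidence=0.5, min_tracking_confidence=0.5) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV reads frames as BGR
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            # 33 keypoints covering the head, torso, and limbs, each with
            # normalized x/y coordinates and a visibility score
            mp_draw.draw_landmarks(frame, results.pose_landmarks, mp_pose.POSE_CONNECTIONS)
        cv2.imshow("pose", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
            break
cap.release()
cv2.destroyAllWindows()
```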

The OpenPose system tracks the head, torso, and limbs of the human body. To make it more effective, the researchers used CMU's Panoptic Studio, a dome fitted with 500 cameras that capture body positions from many different angles. The captured images are converted into keypoint data that carries crucial information about nonverbal messages.

The keypoints are first detected in 2D; the researchers then combined the different camera views to reconstruct them in 3D, so that each body posture can be understood clearly from any angle. The data this system produces is a significant scientific advance, since it can be used to read a person's entire body language and infer their real emotions.
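
Lifting 2D detections from several calibrated cameras into 3D boils down to triangulation. The following is a minimal sketch with OpenCV and NumPy, using two hypothetical projection matrices and made-up keypoint coordinates purely for illustration:

```python
import numpy as np
import cv2

# Hypothetical 3x4 projection matrices for two calibrated cameras.
# In a real rig such as the Panoptic Studio these come from camera calibration.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])                    # reference camera at the origin
P2 = np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])    # second camera shifted 0.2 m sideways

# The same keypoint (say, a wrist) detected in each view, in normalized image coordinates.
pt1 = np.array([[0.32], [0.24]])
pt2 = np.array([[0.30], [0.24]])

# cv2.triangulatePoints returns a 4x1 homogeneous point; dividing by the
# last coordinate gives the 3D position (X, Y, Z) of the keypoint.
point_h = cv2.triangulatePoints(P1, P2, pt1, pt2)
point_3d = (point_h[:3] / point_h[3]).ravel()
print("3D keypoint:", point_3d)
```

In a real multi-camera rig, each keypoint would be triangulated from many views at once, typically with a least-squares fit, which keeps the 3D estimate reliable even when some cameras are blocked.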

The researchers note that this kind of automatic tracking can be applied to all sorts of human-robot interaction. It could also play a useful role in virtual reality (VR) games, where it would let a system follow every movement of the user without any additional sensors.

Takeaway

Robotics is now present in many fields. From clinical settings to business, robots play a significant role in making the world more convenient. They are also becoming a powerful tool for studying aspects of human society that we want to understand more thoroughly, and nonverbal communication is the best example.

People tend to overlook or misinterpret body language, but nowadays nonverbal messages can be conveniently interpreted with the help of robots. That capability is one of the most significant advances in science and technology.
