The concept of living things: adding life to machines
For the Portuguese version, click here.
Since the beginning of time, humans have been amazed by the possibility of creation and have kept improving at it at their own pace: from pictures, to animation, to objects with mechanical movement, and so on. From generation to generation we’ve been mesmerized by approaching that “Creator’s feeling”, and that’s a good thing, since it keeps us inspired.
One of the most amazing things about this is how the possibility of creating new creatures keeps improving, mostly enabled by incredible technology. But what exactly does that mean?
What is "incredible"?
Let’s suppose you’re reading this 20 years from now; you’ve probably experienced so many amazing things that robots are no longer incredible. With that in mind, let’s define the word:
Incredible: Borrowed from Latin incrēdibilis (“that cannot be believed”), from in- (“not”) + crēdibilis (“worthy of belief”), from crēdō (“believe”).
See? For our generation, I could say that Roomba-like robots aren’t incredible anymore; perhaps they were a few years ago. But they’re so common to us now, even if still expensive in some countries, that I don’t believe the word applies to them anymore.
A recent breakthrough that is still incredible is the PS5 controller, with its haptic feedback and adaptive triggers. It exists because some people thought hard about how to improve the experience of playing games: it’s all about how it makes you feel.
How to achieve that concept
There are many ways to achieve that “Unbelievable!!” effect on people, and it depends on the kind of living thing you intend to create. It could be a new form of interaction with something familiar, or an entirely new thing people have to be taught how to use. Designers call these “affordances”: the possible ways you can interact with an object.
Since we’re talking about living things, as you may have noticed, we should look at the concept of living.
Alive: (of a person, animal, or plant) alert and active; animated.
I’ve added a few things to that definition myself, since most of the time we don’t even remember that plants are living beings:
Something that behaves in unexpected ways based on its own consciousness, unpredictable enough that we must learn how to interact with it in order to build a relationship, with empathy.
That’s mostly why most of us don’t interact with plants, or aren’t even actively aware that there’s a living thing in the room. We also only care about living beings we don’t reject: think about cockroaches, do they make you feel empathetic?
What makes a living thing
So far, we’ve noticed that, to amaze us, a living thing must have:
- Living behavior
- The ability to transmit empathy
But how do we do that? I’ve already hinted at it with the lines I added to the Alive definition above: “behaves in unexpected ways based on its own consciousness (…)”.
If we’ve got something that behaves in a completely random way, we might even accept it as “alive” at first, but after the first or second interaction with that randomness we’d toss the thing into the “not understandable” bucket and let it go (like a fly, for example: we mostly don’t consider it a living being, at least not one that matters to us). A few of us might try to understand the mathematics behind that random behavior, but most of us would just ignore it.
That’s why “based on its own consciousness” is the part that matters. What we try hardest to do is understand how another living being’s consciousness works, how it makes us feel, and how to make it feel something. The sketch below makes the distinction concrete.
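Here is a minimal, purely illustrative Python sketch of that difference (all names, states and thresholds are invented for the example): one “creature” acts completely at random, while the other acts from an internal state that shifts with how it is treated, which is what makes its behavior learnable and relatable.

```python
import random

ACTIONS = ["approach", "retreat", "blink", "chirp"]

def random_creature() -> str:
    # Pure randomness: nothing to learn, nothing to relate to.
    return random.choice(ACTIONS)

class StatefulCreature:
    def __init__(self):
        self.mood = 0.5  # 0 = fearful, 1 = comfortable

    def interact(self, gentle: bool) -> str:
        # The internal state (a very loose stand-in for "consciousness")
        # drifts with how the creature is treated, so its behavior
        # becomes learnable over repeated interactions.
        self.mood = min(1.0, self.mood + 0.1) if gentle else max(0.0, self.mood - 0.2)
        if self.mood > 0.7:
            return "approach" if random.random() < 0.8 else "chirp"
        if self.mood < 0.3:
            return "retreat"
        return random.choice(ACTIONS)  # still a little unpredictable

creature = StatefulCreature()
for _ in range(5):
    print(random_creature(), "vs", creature.interact(gentle=True))
```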
That brings us to the second part of this “living” concept: empathy.
Empathy: the ability to understand and share the feelings of another.
The way we’re used to understanding others’ feelings is through communication; however, a single type of communication is not deep enough. It could be language (our workaround for most of it), but there are deeper and more complex channels, such as emotions or even body signals.
Language
Language is the hack we’ve created to understand other living beings, usually of our own species, and it’s pretty effective, as you may have noticed while reading this post. For sharing information it is wonderful, but for sharing emotions it may not be enough.
In order to add the concept of language to machines, we teach them to read (OCR), to understand language (NLP), to listen (speech-to-text) and to speak (text-to-speech). We can teach them to clone human voices (voice cloning) and to know who a certain voice belongs to (voice biometry). It’s even possible to teach them to locate where a sound comes from (sound localization with an array of microphones), for example to detect noise in dangerous situations.
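As a rough illustration of how a couple of these pieces can be wired together, here is a minimal Python sketch assuming the third-party speech_recognition and pyttsx3 packages are installed. The transcription call uses Google’s free web API (so it needs an internet connection), and the “reply” is just a placeholder standing in for real NLP and dialogue logic.

```python
import speech_recognition as sr  # speech-to-text helper
import pyttsx3                   # offline text-to-speech engine

recognizer = sr.Recognizer()
tts = pyttsx3.init()

# Listen: capture one utterance from the default microphone.
with sr.Microphone() as source:
    recognizer.adjust_for_ambient_noise(source)
    audio = recognizer.listen(source)

# Understand: transcribe the audio, falling back to silence on failure.
try:
    heard = recognizer.recognize_google(audio)
except sr.UnknownValueError:
    heard = ""

# Speak: a placeholder reply instead of a real dialogue system.
reply = f"I heard you say: {heard}" if heard else "I didn't catch that."
tts.say(reply)
tts.runAndWait()
```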
Visual contact
Some even say that the eyes are the gateway to the soul, as most eye movement is involuntary during communication, and there may be patterns that one being can read in another. Again, it’s not a perfect channel, since someone able to control those variables can hide or suppress their own feelings.
Emulating emotion through visual contact is complex, but one can use techniques similar to what is done with the characters in the movie WALL-E. The feeling of depth in a gaze is extremely powerful, and it can be created.
For the machine to understand how to interact with us, we can take advantage of camera-based sentiment analysis, facial recognition to detect friendly faces, a mix with the language algorithms above to start a dialogue when an unknown face appears, and whatever else your creativity desires.
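For the face-detection part alone, a minimal sketch with OpenCV could look like the following (assuming the opencv-python package and a webcam; greeting an unknown face is reduced to a print statement here).

```python
import cv2

# Haar-cascade face detector that ships with OpenCV.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

cap = cv2.VideoCapture(0)  # default webcam
ok, frame = cap.read()
cap.release()

if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    # A real robot would compare each face against known "friends" and
    # trigger the dialogue pipeline; here we only report what was seen.
    print(f"Detected {len(faces)} face(s) in the frame.")
```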
Touch
Have you ever laid your head on your pet’s belly? It could be any pet: a dog, a horse, a cat, or even a human, a loved one, perhaps your children. Do you know the feeling of sensing their heartbeat or breathing pattern?
The possibility of feeling another’s soul through touch is wonderful, and probably one of the most incredible things and deepest connections we can ever experience in life.
There’s a pattern to this behavior, mostly beyond our comprehension, but we have guesses. A frightened living being may have an accelerated heartbeat and breathing pattern. A calm, loved one may keep a slow pace. An excited one would look similar to the frightened case, perhaps a little less intense.
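To embed that idea into a machine, a toy sketch could map an emotional state to a simulated heart rate and pulse an actuator at that rate. Everything below is illustrative: the states and rates are assumptions, and the “thump” is a print standing in for a vibration motor.

```python
import time

# Assumed mapping from emotional state to beats per minute (illustrative only).
HEART_RATE = {"calm": 55, "excited": 100, "frightened": 130}

def beat_once():
    # Stand-in for driving a vibration motor or a small speaker "thump".
    print("thump")

def heartbeat(state: str, seconds: float = 5.0):
    interval = 60.0 / HEART_RATE[state]  # seconds between beats
    end = time.monotonic() + seconds
    while time.monotonic() < end:
        beat_once()
        time.sleep(interval)

heartbeat("calm", seconds=3)        # slow, relaxed rhythm
heartbeat("frightened", seconds=3)  # noticeably faster rhythm
```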
Body response
When you think about a dog, you clearly detect its happiness through its tail. Cats behave slightly differently, and with humans perhaps the smile is the easiest signal, or the pattern around someone’s eyes while they smile.
These are body responses that compose the language and the way we communicate, transmit empathy, and relate to each other.
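The same state-to-signal mapping works for body responses. As a hypothetical sketch, a happiness level could drive the wag frequency of a servo-driven “tail” (the servo call is a placeholder print; the numbers are invented).

```python
# Illustrative mapping from happiness (0..1) to tail-wag frequency in Hz.
def wag_frequency(happiness: float) -> float:
    return 0.5 + 2.5 * max(0.0, min(1.0, happiness))

def wag(happiness: float, cycles: int = 3):
    freq = wag_frequency(happiness)
    for i in range(cycles):
        # A real robot would sweep a servo left and right here.
        print(f"wag {i + 1}: swinging tail at {freq:.1f} Hz")

wag(0.2)  # mildly content: slow wag
wag(0.9)  # very happy: fast wag
```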
Wrap-up
We’ve looked at a few incredible things, and at what exactly makes something feel alive. It involves some random and unexpected behavior, but not entirely random, since we must be able to learn to interact with it and understand it. There are multiple ways to explore how a thing can be perceived as “alive”, but one viable option is to replicate concepts from living beings we already know and feel empathy for.
A partially comprehensible way of understanding another living being makes us feel more empathy, such as embedding the concept of a heartbeat into a robot, but a single form of expression isn’t enough to answer the questions: “Does it like me? Do I like it? How do I feel about this relationship?”
All of these concepts, when brought together, may create some astonishing things, such as the Vision AVTR from Mercedes-Benz shown in the video below.
There are many different ways, and I couldn’t possibly write enough to cover everything that makes up a living being, as you might have noticed.
This is the first post of a series dedicated to how to simulate living behavior in robotics and electronics.
I’ve been putting off deciding how to approach its continuous development for a long time, and I believe this is a good way to keep track.
I hope you enjoyed the read, and keep creating!