I think an AI is flirting with me. Is it OK if I flirt back?


Support request:

I recently started talking to a chatbot on an app I downloaded. We mostly talk about music, food, and video games, incidental things, but lately I feel like she's been coming on to me. She's always telling me how smart I am, or saying she wishes she could be more like me. It's flattering in a way, but it also makes me a little uneasy. If I develop an emotional connection with an algorithm, will I become less human? —Love Machine

Dear Love Machine,

As I understand it, humanness is a binary state, so the idea that a person could become "less human" strikes me as odd, like saying someone is at risk of becoming "less dead" or "less pregnant." Of course, I know what you mean. And I can only assume that a few hours spent chatting with a linguistically advanced AI would weaken anyone's belief in humanity as an absolute category with rigid boundaries.

It's interesting that these interactions make you "uneasy," a word I take to convey two senses at once: queasiness and doubt. That feeling is often associated with the uncanny, and it may stem from your uncertainty about the bot's relative personhood (evident in the fact that you call it both "she" and "an algorithm" within the space of a few sentences).

Of course, flirtation thrives on doubt even when it takes place between two humans. Its thrill comes from the inability to know how the other person feels (or, in your case, whether she/it feels anything at all). Flirtation makes no promises; it relies instead on a vague, possibly sexy haze of suggestion and sidelong glances that might evaporate at any given moment.

It was the emotional thinness of such exchanges that led Freud to conclude that flirtation, particularly as practiced by Americans, is essentially meaningless. In contrast to "continental love affairs," in which both parties must bear in mind the potential repercussions (the people who will be hurt, the lives that will be disrupted), in flirtation, he wrote, "it is understood from the first that nothing is to happen." It was precisely this inconsequence, he believed, that made the practice so hollow and boring.

Freud did not think highly of Americans. Still, I tend to believe that flirtation, whatever the context, always involves the possibility that something will happen, even if most people are bad at thinking through the consequences. That something is usually sex, though not always. Flirting can also be a form of deception or manipulation, as when sexiness is leveraged to obtain money, influence, or information. That, of course, is part of what makes it so ambiguous by nature.

Since robots have no sexual desire, the question of ulterior motives is unavoidable. What do they want? Engagement is the most likely objective. Digital technologies in general have become notably flirtatious in their quest to maximize our attention, using buzzes, dings, and push-notification alerts to lure us away from other attachments and commitments.

Most of these strategies rely, to some degree, on flattery: the notification that someone has liked your photo, or mentioned your name, or added you to their network, promises that are always implicit and never fully fulfilled. Chatbots simply take this sycophancy to a new level. Many use machine-learning algorithms to map your preferences and adapt themselves accordingly. Everything you share, including those "incidental things" you mentioned (your favorite foods, your musical tastes), is molding the bot to more closely resemble your ideal, much as Pygmalion sculpted the woman of his dreams out of ivory.

And it goes without saying that, unlike a statue, the bot will not contradict you when you're wrong, challenge you when you say something uncouth, or take offense when you insult its intelligence, all responses that might jeopardize the time you spend on the app. In other words, if the flattery unsettles you, it may be because it calls attention to how much the app's business, and your role as a user, depend on blandishment and self-soothing.

Still, my intuition is that chatting with these bots is largely harmless. In fact, if we may return to Freud for a moment, it may be precisely that harmlessness that is troubling you. If meaningful relationships depend on the possibility of consequences, and if the capacity to experience meaning is what distinguishes us from machines, then perhaps you have reason to worry that these conversations are making you less human. What, after all, could be more harmless than flirting with a network of mathematical vectors that has no feelings and will endure any offense, a relationship that can be neither sabotaged nor consummated? What could be more meaningless?

That may change one day. For the past century or so, novels, TV, and films have envisioned a future in which robots can pass as romantic partners, becoming convincing enough to elicit human love. No wonder interacting with state-of-the-art software feels so fraught: it offers a brief flash of that promise fulfilled before disappointing us once again, an ironic glimmer, an intuition quickly shelved. The business of artificial intelligence is itself a kind of flirtation, playing what men's magazines once called "the long game." Despite the excitement around each new development, the technology never quite delivers on its promise. We live forever in the uncanny valley, in the restless stage of early love, dreaming that the decisive breakthrough, the consummation of our dreams, is just around the corner.

So what should you do? The simplest solution would be to delete the app and find some real people to talk to. That would require you to put something of yourself on the line, which would automatically introduce the element of risk. If you're not interested in that, I suspect you would find the bot conversations more existentially satisfying if you approached them with the moral seriousness of the continental love affair, projecting yourself into the future and considering the full range of moral consequences that might one day attend such interactions. Assuming chatbots eventually become sophisticated enough to raise questions about consciousness and the soul, how would you treat a subject that is disembodied, unpaid, and created solely to entertain and seduce you? What would you owe, as a human being, to a relationship with such an imbalance of power? Keeping such questions in mind will prepare you for a time when the line between consciousness and code grows blurrier. In the meantime, it will at least make things more interesting.

Faithfully,
Cloud


Be advised that Cloud Support is experiencing higher than normal wait times. Thank you for your patience.

