The future of behavior-modification "apps"


10 replies to this topic

#1
starspawn0

    Member

  • Members
  • 1,312 posts
When we think of beauty, we often think of surface appearances, like youthful skin, hair, muscle, breasts (women), and so on. However, that's only the part you see in a photograph. The beauty you see in a movie of the person includes facial expression, hand gestures, the way they walk, sense of humor, and so on. In fact, these can be more important than base appearances: there are countless examples of people who just aren't that attractive, physically, yet who attract a large number of people to them just the same. Perhaps in the future, there will be the equivalent of "plastic surgery" for behavior:

Imagine a future marketplace of "behavior apps" that can be installed in your brain. These include "routines" that are triggered when you're in the right situation -- for example, if you are perplexed, maybe you subconsciously furrow your brow and move your lips to the side in a cute way. You can learn to imitate that gesture, as actors do; but you won't so easily be able to call upon it out of the blue, when confused, unless it's baked into your subconscious. But this could be done, once brain modification becomes possible.

What would it take to get there? It's going to take some research on how gestures are learned and triggered; and it's going to take some new BCI hardware to write them into the brain. Perhaps it could work through association -- you're placed in VR environments in situations meant to elicit the behavior, and then your brain is stimulated in a precise way to produce it; and, over a few sessions, you learn to make that gesture "automatically". After learning several new gestures, people might say, "Something seems to have changed about you. Have you been working out? Have you lost weight?" -- they might not be able to quite put their finger on it.
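
To make the conditioning loop concrete, here's a minimal sketch in Python. Everything in it is hypothetical: the vr and bci objects and their methods (present_scene, observe_gesture, stimulate, clear_scene) stand in for VR and write-capable BCI hardware that doesn't exist yet. The point is just the shape of the procedure -- elicit the state, check for the gesture, reinforce when it's absent, stop once it appears unaided.

```python
# Hypothetical closed-loop "gesture conditioning" session.
# `vr` and `bci` are placeholders for future VR and read/write
# brain-interface hardware; every method name here is invented
# purely for illustration.

def conditioning_session(vr, bci, gesture, n_trials=40, streak_needed=5):
    """Pair a VR-elicited state (say, mild perplexity) with stimulation
    that produces `gesture`, until the subject produces it unaided
    several trials in a row."""
    unaided_streak = 0
    for _ in range(n_trials):
        # Put the subject in a situation meant to elicit the target state.
        vr.present_scene(eliciting="perplexity")

        if bci.observe_gesture(gesture):
            # The gesture appeared without help -- count it toward "automatic".
            unaided_streak += 1
        else:
            # Otherwise stimulate, so the gesture gets paired with the context.
            bci.stimulate(gesture)
            unaided_streak = 0

        vr.clear_scene()

        # Stop once the gesture shows up unaided several trials running.
        if unaided_streak >= streak_needed:
            return True

    return False
```

In effect it's just operant conditioning with the stimulation standing in for the reward, repeated over a few sessions until the gesture fires on its own.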

And if gestures can be programmed in, why not also try to train people to have a better sense of humor, or to produce more entertaining conversations? For example, place the subject in an environment where a joke is called for, and then stimulate certain parts of the brain that trigger the generation of sarcastic comments, say; other times, perhaps a merely humorous statement would be appropriate. Soon enough, the right humor for the right occasion becomes "automatic".
  • eacao, Casey and Yuli Ban like this

#2
caltrek

    Member

  • Members
  • 10,022 posts

 

...the right humor for the right occasion becomes "automatic".

 

...and who is to define what the "right humor" is that should become appropriately (?) "automatic"? Some programmer with an agenda? The first trillionaire? Vladimir Putin?


The principles of justice define an appropriate path between dogmatism and intolerance on the one side, and a reductionism which regards religion and morality as mere preferences on the other.   - John Rawls


#3
starspawn0

    Member

  • Members
  • 1,312 posts
One can say the same about beauty: who decides what is beautiful and what isn't?

And if you're not beautiful -- as determined "democratically" by people voting with their sexual arousal -- then you don't get hit on so often.

I am someone who used to get hit on quite a lot. I used to be attractive (and still am, to some degree, though I'm older now). People who were not attractive did not get much attention. And billionaires weren't to blame.

#4
caltrek

    Member

  • Members
  • 10,022 posts

So, in the future, is AI also going to determine what is beautiful? Or will narcissists like Donald Trump make sure that gets programmed into AI?


The principles of justice define an appropriate path between dogmatism and intolerance on the one side, and a reductionism which regards religion and morality as mere preferences on the other.   - John Rawls


#5
starspawn0

    Member

  • Members
  • 1,312 posts

The people will decide, according to what they desire -- what behaviors they want.



#6
caltrek

    Member

  • Members
  • 10,022 posts

So, it will be a bit of a dialectical relationship?

 

 

a future marketplace of "behavior apps" that can be installed in your brain

 

 

brain modification becomes possible.

 

 

why not also try to train people to have a better sense of humor, or maybe produce more entertaining conversations?

 

 

Perhaps, even, buy more "behavior apps"?

 

I know Yuli probably already thinks I am a Luddite, but there is just something that makes me a little bit uncomfortable about this whole idea. It seems like the latest iteration of the "invisible hand of the marketplace" manipulated by an Ozish wizard (or wizards). Optimists might shrug it all off as just another way in which AI will meld with humanity. Pessimists will worry about the end of all that is human. Philosophers may be arguing about it for centuries, at least until they forget that there ever was an argument.

 

Did I just make a funny joke?

 

If a human cracks a joke in the forest, and nobody laughs at it, did (s)he truly make a sound?


The principles of justice define an appropriate path between dogmatism and intolerance on the one side, and a reductionism which regards religion and morality as mere preferences on the other.   - John Rawls


#7
tomasth

    Member

  • Members
  • 242 posts

It's just a service; no one is going to force anyone to use it, the same as no one forces views on beauty today.

 

Why does Yuli think you are a Luddite?



#8
caltrek

    Member

  • Members
  • 10,022 posts

 

It's just a service; no one is going to force anyone to use it, the same as no one forces views on beauty today.

 

See my remark about the "invisible hand of the marketplace" manipulated by an "Ozish wizard."  Remember, the debate is about "behavior apps."

 

I am not a great fan of late-twentieth and early-twenty-first-century capitalism.

 

 

Why does Yuli think you are a Luddite?

 

I didn't indicate that Yuli thinks I am a Luddite. I wrote that Yuli probably already thinks I am a Luddite (emphasis now added).

 

Beyond that, your question is better addressed to Yuli.


The principles of justice define an appropriate path between dogmatism and intolerance on the one side, and a reductionism which regards religion and morality as mere preferences on the other.   - John Rawls


#9
caltrek

    Member

  • Members
  • 10,022 posts

Wow.  In a case of pure serendipity, I read the article featured below after I made my last post.  It makes a lot of points that I think are highly relevant to this discussion.

 


Raging robots, hapless humans: the AI dystopia

 

https://www.nature.c...586-019-02939-0

 

Introduction:

 

(Nature) In Human Compatible, his new book on artificial intelligence (AI), Stuart Russell confronts full on what he calls “the problem of control”. That is, the possibility that general-purpose AI will ultimately eclipse the intellectual capacities of its creators, to irreversible dystopian effect.

 

The control problem is not new. Novelist Samuel Butler’s 1872 science-fiction classic Erewhon, for instance, features concerns about robotic superhuman intelligences that enslave their anthropoid architects, rendering them “affectionate machine-tickling aphids”. But, by 1950, Norbert Wiener, the inventor of cybernetics, was writing (in The Human Use of Human Beings) that the danger to society “is not from the machine itself but from what man makes of it”. Russell’s book in effect hangs on this tension: whether the problem is controlling the creature, or the creator. In a sense, that has been at the core of AI from its inception.

 

Even in its infancy, AI was swaddled in bitter controversy. Russell briefly touches on the moment its inventors convened at a 1956 workshop at Dartmouth College in Hanover, New Hampshire. Here, in the legendary birthplace of AI, they quarrelled over what to call their still-slumbering creation. Polymath and future Nobel laureate Herbert Simon, and computer scientist Allen Newell, favoured the name “complex information processing”. The precision of the moniker evoked the restraint of the modern scientific method, harking back to the brick-by-brick processes of discovery exemplified by the likes of James Clerk Maxwell. Computer scientists John McCarthy and Marvin Minsky (let’s call them the Intelligentists) favoured the muddier “artificial intelligence”. For McCarthy, it had marketing value. For Minsky, defining it was “more of an aesthetic question or one of [a] sense of dignity, than a technical matter”.

 

McCarthy, Minsky and other Intelligentists had bought in to behaviourism, a field straddling the natural and human sciences and offering access to a rich psychological vocabulary. They seemed to assume that the appearance of ‘minded’ behaviour was logically sufficient to prove its existence. Thus they could claim that their machines could think and perceive simply because they looked as if they did.

 

As we know, Intelligentist nomenclature won out


The principles of justice define an appropriate path between dogmatism and intolerance on the one side, and a reductionism which regards religion and morality as mere preferences on the other.   - John Rawls


#10
caltrek

    Member

  • Members
  • 10,022 posts

So what is with all of caltrek's paranoid questions and confusing allusions to the "invisible hand of the marketplace"?

 

Glad you asked (ok, maybe you didn't actually ask).  Barbara Ehrenreich's book This Land Is Their Land (pages 124-125) might help you all to understand my point:

 

 

…the corporations are asking much more from businesses now; they’re demanding that we become actors.  If you don’t feel wildly enthusiastic about marketing widgets or brokering life insurance, then you damn well better fake it.

 

This “theatricalization” of business raises all kinds of problems.  Will that potentially hazardous, $300-a-month prescription drug actually help you, or was your doctor just charmed by a cheerleader’s dazzling, gloss enhanced smile?  Are the tires on your car safe, or was someone at the auto company seduced by some Sarah Bernhardt of the sales?

 

And then there are the psychological consequences of acting as a way of life.  For her classic 1983 study The Managed Heart: Commercialization of Human Feeling, sociologist Arlie Hochschild interviewed people who are required by their employers to exude emotions they may not feel – flight attendants, for example.  (This was in the old days, before all the pay-cuts and pension thefts wiped the smiles off flight attendants' faces.)  She concluded that constant emoting produces a kind of “emotional numbness,” a reduced awareness of one’s own feelings.

 

As for the effect on the “audience” – that is, those of us who are subject to a steady display of “exaggerated smiles” and “exaggerated enthusiasm” – we get a little numb too.  It doesn’t take long to learn that wide-open eyes and exclamations of “awesome!” don’t indicate the slightest interest.  The entire emotional currency of human interaction is being fatally devalued, and when that happens there can be no trust.

 

The perfect circle: capitalism creates its own demand, a need to know how to "manage" one's heart, then creates a product to meet that "need."  Nothing going on here, folks.  Just another manifestation of "free market" capitalism.  Move along, please.


The principles of justice define an appropriate path between dogmatism and intolerance on the one side, and a reductionism which regards religion and morality as mere preferences on the other.   - John Rawls


#11
starspawn0

    Member

  • Members
  • 1,312 posts

That fear has been around a long time.  It was a major theme in Fight Club, and in much earlier films -- Network is one example.  And what about Salinger's The Catcher in the Rye?  Or One Flew Over the Cuckoo's Nest?  Or A Wrinkle in Time?  Or Lenny Bruce's and George Carlin's comedy?

 

A better argument along these lines is, perhaps, that there is a slippery slope from what is desired to what is required.  One day, you think, "Wouldn't it be great to look and act more genial, even when you don't feel it?"  And, pretty soon, it becomes a requirement, because now everybody can look and act that way 24/7 without even trying.





