
Eyrie Productions, Unlimited

Subject: "Cybernetics and Humanity" Archived thread - Read only
 
Conferences General Topic #97
Reading Topic #97, reply 7
LostFactor
Charter Member
Jul-12-01, 08:00 AM (EDT)
7. "RE: Cybernetics and Humanity"
In response to message #0
 
   This topic has also been moved from the WL board for sake of topic drift. ;>

>For the masses this is called television. And that's not flippant.

You see what I'm afraid of, then. ;>

>I regularly spend 10, 12, 14, or more hours a day at a computer. I
>don't then treat people like a UNIX command line, or attempt to speak
>Java to them, or think they can be pushed around like a machine can.
>I interact with machines MUCH more than I do with other people -
>that's part of my job. I still know how to function in society.

You're also not doing your job with your entire brain directly interfacing with UNIX command lines that speak Java to you, I'll note. Or at least, I'd imagine you're not, unless there are fewer differences between you and your UF avatar than I know. ;>

>I think your point of view is way too alarmist and there is no evidence
>to support it in the general case. I'm sure there will be cyber-otaku
>just like everything else. People who only function when immersed in
>cyberspace, or fetishists who prefer artificial parts to meat. That's
>just a growth of the tattoo and piercing trend that started in tribal
>days - there is already some seriously intense body modification being
>done.

Keep in mind that I'm not screaming, "Destroy the evil cyberware! Repent, my brothers, in the name of the Lord!" (Though that would be kind of funny. Heh, MKS sketch emerging.) I'm supporting a counterbalancing opinion, or at least I'm trying to. Erring on the side of caution, if you will.

To restate my original opinion, since it seems to have gotten muddled somewhere in here, due in no small part to my own actions: when cyberware does come out, I would not be first in line to get it. I happen to like my body, and I'm not willing to gamble on losing some things that make it quintessentially mine for the benefit of better performance. Hence, the analogy with Erin. If everyone else wants to get themselves cybered, that's fine by me, but I'd not look on it with universal approval. ;>

>Personally I want cyberjacks and that's about it - unless they come up
>with something to keep me from being fat but lets me eat what I want.
>Maybe a small implanted bio-reactor to produce power for other
>implants. ;-)

I actually sketched out a basic idea for something exactly like that... a sort of artificial regulator for neurotransmitters that could turn your metabolic rate up or down freely. Didn't go very far, though, since I just solved that problem by dieting. (Not much fun, I might add.)

>I remember someone posting a criticism of UF a while back that I meant
>to reply to and didn't. One of the complaints was the apparent lack
>of cyberware... But the key there is apparent. Cyberware is
>*rampant* in UF by the time of FI - but quality ware is invisible.
>And it is common, so characters aren't going to comment on it in
>story. Many people have cyberjacks, and some will have other mods.
>But only the fetish crowd does the chrome arm thing. And only on some
>pretty messed up worlds do you find much combat ware.

Not reading UF, I'd be in no place to comment here. ;>
-Eliot "Megazone also picks up a cookie" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendance Productions
Producer of, um, stuff for an unspecified time-period.



  Subject     Author     Message Date     ID  
 Cybernetics and Humanity [View All] ejheckathorn Jul-11-01 TOP
   RE: Cybernetics and Humanity Wedge Jul-12-01 1
      RE: Cybernetics and Humanity Laudre Jul-12-01 2
          RE: Cybernetics and Humanity Wedge Jul-12-01 3
              RE: Cybernetics and Humanity Astynax Jul-12-01 4
              RE: Cybernetics and Humanity Laudre Jul-12-01 5
                  RE: Cybernetics and Humanity Perko Jul-12-01 9
   RE: Cybernetics and Humanity LostFactor Jul-12-01 6
      RE: Cybernetics and Humanity trigger Jul-12-01 8
          RE: Cybernetics and Humanity LostFactor Jul-12-01 10
              RE: Cybernetics and Humanity Laudre Jul-12-01 12
                  Do computers dream of eccentric sheep? trigger Jul-12-01 13
                      RE: Do computers dream of eccentric sheep? Perko Jul-12-01 20
                  RE: Cybernetics and Humanity LostFactor Jul-12-01 14
                      RE: Cybernetics and Humanity Wedge Jul-12-01 16
                      RE: Cybernetics and Humanity Laudre Jul-12-01 17
                          RE: Cybernetics and Humanity Perko Jul-12-01 21
                              RE: Cybernetics and Humanity Laudre Jul-12-01 22
                                  RE: Cybernetics and Humanity LostFactor Jul-13-01 25
                              RE: Cybernetics and Humanity LostFactor Jul-13-01 23
                          RE: Cybernetics and Humanity LostFactor Jul-13-01 24
                              RE: Cybernetics and Humanity megazone Jul-13-01 27
                                  RE: Cybernetics and Humanity LostFactor Jul-13-01 28
                                      RE: Cybernetics and Humanity megazone Jul-13-01 29
                                          RE: Cybernetics and Humanity LostFactor Jul-13-01 30
                                      RE: Cybernetics and Humanity Wedge Jul-13-01 31
  RE: Cybernetics and Humanity LostFactor Jul-12-01 7
   Cyberpsychosis remandeteam Jul-12-01 11
      RE: Cyberpsychosis ejheckathorn Jul-12-01 15
          RE: Cyberpsychosis Laudre Jul-12-01 18
          RE: Cyberpsychosis remandeteam Jul-12-01 19
      RE: Cyberpsychosis megazone Jul-13-01 26
          RE: Cyberpsychosis ejheckathorn Jul-13-01 32
              RE: Cyberpsychosis LostFactor Jul-13-01 33
              RE: Cyberpsychosis Mephronmoderator Jul-13-01 34
                  RE: Cyberpsychosis LostFactor Jul-13-01 35

Wedge
Charter Member
Jul-12-01, 00:42 AM (EDT)
1. "RE: Cybernetics and Humanity"
In response to message #0
 
   LAST EDITED ON Jul-12-01 AT 00:57 AM (EDT)

Brought over here in reply to NightmareButterfly in the WL forum. And by the letter Z.

>To get vaguely back on topic, the main problem I have with cyberware
>is that I'm not convinced that the brain - the adult brain, at least -
>can adapt to radically new sorts of input. This mainly comes into play
>with cybereyes. What, exactly, would infrared look like? I think it
>would work if it were just a switch you flipped and you suddenly see
>through a Predator-cam. I don't think it would be viable to be
>on all the time, as just an expansion of the visible spectrum. I don't
>think the brain could adapt to something as radically new as that.

I wouldn't mind having something that looked like what current FLIR images look like. I could deal with black and white if I were getting that kind of data from it. But you'd definitely have to be able to switch it off. Kinda hard to sleep when you can see through your eyelids. :)

Though, at that rate, I suppose you'd have new composite eyelids that would block whatever new input you were seeing as well. I think I'd be just as happy with a pair of adaptive optic sunglasses with beyond-visible-spectrum capability as I would be with new eyes.

>Some animals see further into the spectrums than we do (bees, I
>believe, can see ultraviolet), but they have evolved that way over
>millions of years.

<fun fact>
I know that deer can see into UV; they sell special laundry detergents without the optical brighteners that normal detergent has, because those brighteners reflect into the UV range and make your hunting cammies light up like a beacon to a deer's eyes. Hence the special soap.
</fun fact>

>I just don't think our brain would know what the
>hell to do with the extra input. Do notice that I specified the adult
>brain above.

I disagree here, because full-grown adults deal with extra sensory input of that nature all the time. Military pilots routinely fly missions with night vision goggles and the above-mentioned FLIR. If anything, a more direct input of that kind of data would be easier to deal with. For example, night vision goggles are notorious for a lack of depth perception. If you could build that into the workings of your own eyes, meat or cyber, you would gain that depth perception back and have an easier time of it.

>It is true that the human brain is extremely adaptable. It does have
>limits, however. I don't believe rigging will be possible without
>months or years of training, and reflexes would never actually work
>right when rigging. Reflexes are too tied into our central nervous and
>muscle system to translate into a machine. Unconscious things like the
>knee-jerk reflex just wouldn't happen.

Doing something abstract, like putting thoughts into a document, will probably take some sort of training just like you have to learn how to use a typewriter or keyboard. But in rigging things like, say, an airplane or a car where you can translate movements and senses back and forth more literally, I think the process will be much quicker and much more intuitive.

------------------------
"saaaausaaaage... "
------------------------
Chad Collier
Digital Bitch
J. Random VFX Company

-->edit: Damn, like have a comma splice or six. Sheesh.


Laudre
Charter Member
Jul-12-01, 01:17 AM (EDT)
2. "RE: Cybernetics and Humanity"
In response to message #1
 
   >Doing something abstract, like putting thoughts into a document, will
>probably take some sort of training just like you have to learn how to
>use a typewriter or keyboard. But in rigging things like, say, an
>airplane or a car where you can translate movements and senses back
>and forth more literally, I think the process will be much quicker and
>much more intuitive.

I'm in agreement here. That's also my argument about how a datajack full-immersion UI would work... it'd be designed to take as little training as possible, in the interest of ergonomics. A one-way connection would take a lot of training -- how to make the letters, that kind of thing -- and would require a great deal of mental discipline. In fact, I'd welcome learning that way because of said mental discipline. But when you can do full-immersion, with hijacked motor impulses forming your output... then there's no learning curve at all, but it's no more efficient than what we do now, just with less potential negative impact on the body. Maybe.

Personally, I spend a substantial fraction of my day interacting with a computer, and interacting with others via a computer. If I get the kind of job I want, then I'll be spending the vast majority of my day doing that, and my physical condition -- not great as it is now -- will reflect that. If you're doing full-immersion, I can see the nation becoming even fatter than it already is, unless it's combined with some kind of automatic conditioning through electric impulses (a technology that already exists, and is sold as an alternative to exercise for lazy people).

-- Sean --

http://www.thebrokenlink.org The Broken Link 4.0 is live!
"Imagination is more important than knowledge." -- Albert Einstein
"It's not easy being green." -- Kermit the Frog


Wedge
Charter Member
Jul-12-01, 01:31 AM (EDT)
3. "RE: Cybernetics and Humanity"
In response to message #2
 
   >But when you can do full-immersion, with hijacked
>motor impulses forming your output... then there's no learning curve
>at all, but it's no more efficient than what we do now, just with less
>potential negative impact on the body. Maybe.

It would probably be faster simply by cutting out all of the interactions that we have now, such as pushing the mouse around to choose the things you want. I imagine you could go about that faster if you were the mouse pointer, as opposed to merely directing it. :)

>If you're doing full-immersion, I can see the
>nation becoming even fatter than it already is, unless it's combined
>with some kind of automatic conditioning through electric impulses (a
>technology that already exists, and is sold as an alternative to
>exercise for lazy people).

This assumes someone would come out long enough to eat. Saline drip with a side of sugar water, anyone?

------------------------
"saaaausaaaage... "
------------------------
Chad Collier
Digital Bitch
J. Random VFX Company


Astynax
Charter Member
1061 posts
Jul-12-01, 01:45 AM (EDT)
4. "RE: Cybernetics and Humanity"
In response to message #3
 
  
>This assumes someone would come out long enough to eat. Saline drip
>with a side of sugar water, anyone?
>

'The Matrix' anyone?

-={(Astynax)}=-
"One pill, two pill, red pill, blue pill"


Laudre
Charter Member
Jul-12-01, 03:00 AM (EDT)
5. "RE: Cybernetics and Humanity"
In response to message #3
 
   >It would probably be faster simply by cutting out all of the
>interactions that we have now, such as pushing the mouse around to
>choose the things you want. I imagine you could go about that faster
>if you were the mouse pointer, as opposed to merely directing
>it. :)

Okay, maybe I wasn't clear in what I was posting.

A full-immersion datajack UI that was built as an outgrowth of prosthetic/enhanced sense technology (sending signals to the visual/aural/etc. cortexes... cortices?... of the brain, with output through hijacked motor impulses -- i.e., I send a signal to move my virtual arm through a signal to my physical arm that's interpreted differently when the jack is "on") would simply place you inside the computer. You'd still interact with it through metaphors, but a different set of metaphors. Files would be represented by virtual physical objects -- word processor/text files as sheaves of paper or books, audio/music files as CDs or MDs or somesuch, graphical files by whatever's most relevant -- oil painting, photograph, what-have-you. Applications would look like tools to manipulate those virtual objects -- maybe a pencil and an eraser, maybe a CD or MD player, maybe a paintbrush. Unless you could figure out a way to alter timesense or processing speed, it'd be no faster than operating on those same objects in the physical realm. No training required, though.

The other option would be to work totally abstractly, manipulating files and applications in a wholly different manner. This would be akin to what I described elsewhere, a jack that lets you plug into something like maybe an IEEE-1394 or SCSI port, and manipulate the computer with your thoughts. Perhaps you'd have a display wired up to your cybernetic eyes or implanted contacts or however those would work. This would certainly be the kind of speed-of-thought interface that Zoner was talking about, and would require some hardcore software wizardry and some serious biotech, and then a great deal of mental discipline and training on the operator's part. You'd possibly be fully immersed, but you'd also be aware of your body and, to some degree, your physical surroundings.

-- Sean --

http://www.thebrokenlink.org The Broken Link 4.0 is live!
"Imagination is more important than knowledge." -- Albert Einstein
"It's not easy being green." -- Kermit the Frog


Perko
Charter Member
Jul-12-01, 11:11 AM (EDT)
9. "RE: Cybernetics and Humanity"
In response to message #5
 
   Well, shoot, this is what I get for posting before finishing reading everything on the entire board...

If you read my other post (if you can keep the 200 users straight, I'm the one who is exactly like all the others) about first- and second-generation interfaces, this guy (I think) just said what they are.

-Craig


LostFactor
Charter Member
Jul-12-01, 07:43 AM (EDT)
6. "RE: Cybernetics and Humanity"
In response to message #0
 
   LAST EDITED ON Jul-12-01 AT 07:43 AM (EDT)

Transferred from Wedge's post on the SF:WL forum. Good idea, this new thread.

>But if the 'language of the Wired' was written by humans to be
>interpreted by humans, how different will they really be? I think
>what's set off the difference of opinion is that, in effect, you're
>saying the computers will mold the people, but the computers--at this
>point and probably for some time to come--are in turn molded by
>people, and useless without code written by a human being.

Computers are molded by people, but they don't think like people.

Allow me to elaborate, getting back to some of the linguistic comments that were made earlier in this debate. Each language has its own concepts that are known only to it, and learning a language requires you to start thinking in terms of that language. Learning French requires you to start thinking in French. Learning Japanese requires you to start thinking in Japanese. And learning Wired requires you to start thinking in Wired.

The thing is, Wired isn't a human thought process. Computers are by their very nature inhuman. What seems simply logical to us wouldn't make any sense to a computer, while things that we would never consider are essential to a computer's continued operation. Therefore, ergo, and hence, communicating with an inhuman mind will require some degree of inhuman thinking if you're communicating mind to mind.

I don't think computers will "shape" humanity. As I've stated before, I sincerely doubt that your computer in 2125 will tell you to go out and kill people. But I do think that it'll be relatively clear who is and isn't a datajack user just by casually interacting with them, because their thought process won't be entirely human. Or at least, not what we recognize as human today.
-Eliot "Trying to move topics appropriately" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendance Productions
Producer of, um, stuff for an unspecified time-period.


trigger
Charter Member
1500 posts
Jul-12-01, 11:08 AM (EDT)
8. "RE: Cybernetics and Humanity"
In response to message #6
 
  
>>But if the 'language of the Wired' was written by humans to be
>>interpreted by humans, how different will they really be? I think
>>what's set off the difference of opinion is that, in effect, you're
>>saying the computers will mold the people, but the computers--at this
>>point and probably for some time to come--are in turn molded by
>>people, and useless without code written by a human being.
>
>Computers are molded by people, but they don't think
>like people.

Correct. There are no "thinking" computers. We have yet to achieve AI. We've achieved randomness and growth (I'm thinking of the Australian experiment where a process grows itself and has a lifecycle), but we do not have a "thinking" computer. We may eventually. That said, there's no guarantee that anyone would develop a cyber implant with its own AI. I rather think we'll see implants that are more like peripheral devices, programmed to execute specific commands...or programmable by their wearers.

>...Each language has its own
>concepts that are known only to it, and learning a language requires
>you to start thinking in terms of that language. Learning French
>requires you to start thinking in French...And learning Wired requires you to
>start thinking in Wired.
>
>The thing is, Wired isn't a human thought process. Computers are by
>their very nature inhuman. What seems simply logical to us wouldn't
>make any sense to a computer, while things that we would never
>consider are essential to a computer's continued operation.
>Therefore, ergo, and hence, communicating with an inhuman mind will
>require some degree of inhuman thinking if you're communicating mind
>to mind.

I think you have two concepts confused. One is the issue of human-computer interface. The other is the issue of AI.

A human-computer interface is designed by Humans. I assume this will be the case until the Vulcans show up. Humans may or may not put an OS in the cyberware you have to let you hook up to the internet. Maybe it's just a wireless modem with IE 2010 projected in front of your left eye. You would communicate with it in the language of the Wired just like I am communicating with you in the languages and symbolic gestures that make IE 5.5 SP1 do what I want. Remember, the OS is a new and powerful tool. We don't necessarily need an OS to carry out simple commands.

If it is an OS, then it is limited by the rule structure that Humans developed. Admittedly I'd rather not have an MS OS in my implant, but that hopefully will be a matter of free trade in the marketplace. I even suspect that Linux or a variant of it will be used as an OS for cyberware. Can anyone think of a reason why we would need an OS specific to cyberware?

Still, these interfaces are dumb. They calculate what you tell them to calculate...or what they were programmed to calculate, which are two different things. Without sentience, the HAL situation you're proposing doesn't happen.

That said, I still think putting an AI in your implants is a bad, bad idea. If you don't have AI, then the "thinking mind to mind" doesn't happen.

>I don't think computers will "shape" humanity.

Hmm, that carpal tunnel I'm developing isn't from computers? <grin>

>As I've stated before,
>I sincerely doubt that your computer in 2125 will tell you to go out
>and kill people.

No, but the governments already do. Statistical modeling and war gaming are now done by computer...and politicians often pay more attention to those reports than to the reality on the ground. I don't want to start a debate on this here...if you're interested we'll talk about this offline.

>But I do think that it'll be relatively clear who is
>and isn't a datajack user just by casually interacting with them,
>because their thought process won't be entirely human. Or at least,
>not what we recognize as human today.

Just like it's clear who's a geek? Of course a culture will grow up around this. However, the "thought process" is an issue of culture and language. I rather suspect human "thought processes" were rather different in 5000 BCE. We adapt, we change. Humanity has nothing to do with your physical or cultural mores -- it has entirely to do with the values you hold, your actions toward other sapient beings, and your character. At the moment your DNA matters...it may not after First Contact.

t.
Trigger Argee
trigger_argee@hotmail.com
Manon, Orado, etc.
Denton, never leave home without it.


LostFactor
Charter Member
Jul-12-01, 11:34 AM (EDT)
10. "RE: Cybernetics and Humanity"
In response to message #8
 
   >Correct. There are no "thinking" computers. We have yet to achieve
>AI.

What are you talking about? It was released a little while ago. Long as heck, but not a bad film. ;>

Computers have a thought process. They interpret data to execute an end result. They aren't sentient, but they do think, just like a fish or a cat or an elephant is able to. Thinking does not require an independent awareness, just the ability to take information provided and process it. We are sentient thinkers. Computers are nonsentient thinkers.

>I think you have two concepts confused. One is the issue of
>human-computer interface. The other is the issue of AI.

I think my point's just not being stated very well because I'm trying to do seven things at once. ;>

>Still, these interfaces are dumb. They calculate what you tell them
>to calculate...or what they were programed to calculate which is two
>different things. Without sentience, the HAL situation you're
>proposing doesn't happen.

Who said anything about HAL?

The problem is exactly that: computers, as we think of them, are dumb. Not human in the way they process data. Concepts such as "want", "angry", "happy", "love", "hate", et cetera have no analog in the computer's thought process. And learning to speak the language of the computer means learning to think without those concepts in mind. Consider that generation growing up with the Wired as their first language again: their default thought pattern does not include any of those concepts.

And how do you explain "want" to someone who doesn't know what it is?

>Just like it's clear who's a geek? Of course a culture will grow up
>around this. However, the "thought process" is an issue of culture and
>language. I rather suspect human "thought processes" were rather
>different in 5000 BCE. We adapt, we change. Humanity has nothing to
>do with your physical or cultural mores -- it has entirely to do with
>the values you hold, your actions towards other sapient beings, and
>your character. At the moment your DNA matters...it may not after
>First Contact.

I think you're oversimplifying the issue... but certainly, if such mind-to-mind communication becomes the rule rather than the exception, if all of us are best versed in a language that does not contain emotional overtones, then yes, that will become what is generally accepted as humanity. And if we encountered someone like that, nobody from the present day would recognize them as even remotely human.
-Eliot "Humanity is emotions" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendance Productions
Producer of, um, stuff for an unspecified time-period.


Laudre
Charter Member
Jul-12-01, 01:50 PM (EDT)
12. "RE: Cybernetics and Humanity"
In response to message #10
 
   LAST EDITED ON Jul-12-01 AT 01:56 PM (EDT)

>Computers have a thought process. They interpret data to execute an
>end result. They aren't sentient, but they do think,
>just like a fish or a cat or an elephant is able to. Thinking does
>not require an independent awareness, just the ability to take
>information provided and process it. We are sentient thinkers.
>Computers are nonsentient thinkers.

You're taking anthropomorphization way too far. Sure, I talk about a computer "thinking" all the time. But a computer does not think. It executes commands based on the instructions encoded into it, either on the chip or in the operating system or in the application software. It's a machine, same as my car, only more sophisticated. Does my car think when I'm swinging the wheel to the left? No.

Edited to add:

Also, animals other than humans -- especially those closer to humans -- have brains that function pretty much like human brains do, just less sophisticated. Animals are self-aware, capable of problem-solving, and they have emotions. Any pet owner can tell you all of that, usually with anecdotes about "the time [pet's name] did this". Animals are readily capable of recognizing individuals, even of other species; my cat most definitely knows me as a distinct and individual entity, as she reacts to me in ways that she reacts to nobody else. Comparing an animal to a computer is as flawed as comparing a human to a computer.

>The problem is exactly that: computers, as we think of them, are dumb.
> Not human in the way they process data. Concepts such as "want",
>"angry", "happy", "love", "hate", et cetera have no analog to the
>computer's thought process. And learning to speak the language of the
>computer carries with it thinking without these concepts in mind.
>Consider that generation growing up with the Wired as their first
>language again: their default thought pattern does not include any of
>those concepts.

This is ridiculously slippery-slope. Humans are not rational beings; they are emotional beings. Period. Reason and logic are grafts added on as the mind develops. (Have you ever read Piaget?) Emotion is pretty much full-blown when we're born; an infant is happy, sad, angry, wanting, and all of that just as much as an adult is.

What you're saying here is akin to saying, "Well, a web browser and a keyboard don't have wants and needs, therefore someone who uses them will have no concept of wants and needs"; once again, I'm forced to dismiss this as utter crap. I use computers to access content created by humans, and this content evokes an emotional response. Just what this response might be is determined by personality and culture and context; maybe it's frustration at poor coding, or awe at a well-done piece of artwork, or any number of other things.

Interacting with a computer by full-immersion won't dehumanize you any more than interacting with a computer using a mouse, keyboard, and monitor does.

>I think you're oversimplifying the issue... but certainly, if such
>mind-to-mind communication becomes the rule rather than the exception,
>if all of us are best versed in a language that does not contain
>emotional overtones, then yes, that will become what is generally
>accepted as humanity. And if we encountered someone like that, nobody
>from the present day would recognize them as even remotely human.

Quite simply, humanity will never be "best versed in a language that does not contain emotional overtones". As I've said, we are emotional beings. And the interfaces we use will be only to access content that is, ultimately, created by another human, and that will provoke an emotional response. I've never met a geek who wasn't human, who didn't have emotions just as intensely as anyone else, even if he rarely spoke to people other than online, and, like a Japanese otaku, didn't know how to even communicate to someone outside his special interest group.

-- Sean --

http://www.thebrokenlink.org The Broken Link 4.0 is live!
"Imagination is more important than knowledge." -- Albert Einstein
"It's not easy being green." -- Kermit the Frog


trigger
Charter Member
1500 posts
Jul-12-01, 02:17 PM (EDT)
13. "Do computers dream of eccentric sheep?"
In response to message #12
 
   Yeah, what he said. ;)


The question "Do computers think?" threw me for about 10 minutes. Gah. Brain cramp.

The sig "Humanity is emotion" made me think of Star Trek. Everyone kept telling Spock to accept his emotional side (and de facto, accept being 1/2 human). Then comes Surak, who throws the whole "emotions make you human" argument out the window.

I think a true Artificial Intelligence would necessarily have emotions. Does anyone know if emotions are required to pass a Turing test?

The language effects of the Wired, more than likely, will force people to think in terms of process / cause and effect. Given the number of people who don't "think ahead", a little reasoned planning on a personal basis might be refreshing. But we humans are amazingly good at getting ourselves in trouble by acting impulsively. No machine is going to change that.

And I still think that cyberware isn't going to repress or eliminate human emotion. Unless you're regulating hormone levels through chemistry, _your body_ requires the cycling of testosterone and other fun stuff. The meat portions won't work without it -- especially in a crisis when you need the adrenaline spike, during a disease where you need to start and stop immune responses, and especially in pregnancy. Actually, now that I think about it, you can regulate the hormones or muck with the serotonin levels (ah, Prozac) without cyberware. So if _drugs_ allow you to alter your emotional state, sometimes permanently, why are we worried about machines? Is it just that drugs are a more socially acceptable means, and more invisible?

Now if you take drugs and are outfitted like a razorguy... well, that's a personal choice, and not one that I think will be terribly popular. At least not among those of us who will survive long enough to breed the next generation of humans.

Darwin is your friend.

t.

Trigger Argee
trigger_argee@hotmail.com
Manon, Orado, etc.
Denton, never leave home without it.


Perko
Charter Member
Jul-12-01, 11:07 PM (EDT)
20. "RE: Do computers dream of eccentric sheep?"
In response to message #13
 
   The appearance of emotions is required for the test... not that the test actually proves anything useful...

Er, that bit about being useless is just my opinion...

-Craig


LostFactor
Charter Member
Jul-12-01, 02:52 PM (EDT)
14. "RE: Cybernetics and Humanity"
In response to message #12
 
   LAST EDITED ON Jul-12-01 AT 03:07 PM (EDT)

>Interacting with a computer by full-immersion won't dehumanize you any
>more than interacting with a computer using a mouse, keyboard, and
>monitor does.

How do you think when you're using a computer? Do you think in terms of what the computer wants, what it's feeling? Of course bloody not. You think in logical terms, even at times when it doesn't come naturally. You think in a very lightly inhuman sense, because you're communicating with something that doesn't respond to human stimuli. There's an old joke about trying to talk nicely to a computer when it doesn't seem to be working right: that's what comes naturally to us. But it doesn't hold any sway over a computer.

You are dehumanized ever so slightly by using a computer. It's not noticeable for the most part, because it's a small deviation and our brains have no trouble compensating for it. Use a computer infrequently or only for non-immersive activities, and it doesn't really have any long-term effect. Delve deeply into the stuff of computers, spend more time with that than you do with humans, more time making your emotional mind think in terms of computer logic, and the effect becomes more noticeable. That's where "computer geek" syndrome comes from: from forcing your brain out of whack. It's not ostensibly inhuman, because again it's only minor interaction and the brain doesn't have too much trouble correcting for it.

But we're - or at least I'm - not talking about minor interaction when it comes to a datajack. This is not a VR headset that you slip on and off, that provides you with a nice sort of 3D interface with readily-identifiable objects. This is a computer communicating directly with the synapses in your brain. It's the same dehumanizing effect magnified something fierce, because you're no longer operating at the surface level. In order to make your thoughts translate into anything the computer can recognize as a command, you're going to have to make your thoughts rational and logical, regardless of what your disposition is. And if you expose a child to that at an early age, if you tell a child that the only way they're going to make it do anything is by pushing the emotions under the surface, they will learn. They will probably do it, if they consider the rewards worth it. And that can easily be the prelude to years of emotional repression and mindwarping.

This, by everything that I know, is a fairly accurate portrait. If you can provide absolute proof that this isn't what happens when you plug a computer into a datajack, go ahead and call my views absolute crap. But otherwise, this is wholly subjective postulation, just musings on possibilities and a debate on opinions. You can't have an opinion that's absolute crap.
-Eliot "Edited to remove vitriol" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendence Productions
Producer of, um, stuff for an unspecified time-period.

(EDIT: Made a little less vitriolic. There are only so many times you can have your views arbitrarily dismissed before your ire gets raised.)


Wedge
Charter Member
Jul-12-01, 06:23 PM (EDT)
16. "RE: Cybernetics and Humanity"
In response to message #14
 
   >But we're - or at least I'm - not talking about minor
>interaction when it comes to a datajack. This is not a VR headset
>that you slip on and off, that provides you with a nice sort of 3D
>interface with readily-identifiable objects. This is a computer
>communcating directly with the synapses in your brain.

Ahhh, but there's the thing, and that's the bit someone mentioned about anthropomorphizing the computer. The computer isn't communicating with your brain (unless it's an a.i.), your brain is controlling the computer. It may sound like a nit, but there's a big difference in there, at least to me. It's sending you impulses, visually and perhaps otherwise, but unless something has been deliberately coded in otherwise it's just sensory input, feedback to make the interface more useful to the user. But it would be the brain that is making the computer -do- things, not the other way around. The only difference I see between the VR helmet you describe and a datajack is the manner of input. In order to 'see' anything sent through the datajack, you'd have to be fed the same impulses you'd be getting from the screen in the helmet via your eyes. Certainly more can go wrong, and bad code would do a number on your head big time, but that would be the fault of the coder and not the computer.

------------------------
Chad Collier--and what about the de-computerizing effect of humans on computers...nyuk nyuk nyuk
Digital Bitch
J. Random VFX Company


Laudre
Charter Member
Jul-12-01, 06:43 PM (EDT)
17. "RE: Cybernetics and Humanity"
In response to message #14
 
   >How do you think when you're using a computer? Do you think in terms
>of what the computer wants, what it's feeling? Of course bloody not.
>You think in logical terms, even at times when it doesn't come
>naturally. You think in a very lightly inhuman sense, because you're
>communicating with something that doesn't respond to human stimuli.
>There's an old joke about trying to talk nicely to a computer when it
>doesn't seem to be working right: that's what comes naturally to us.
>But it doesn't hold any sway over a computer.

You think logically when you operate a car. You think logically when you're operating any kind of machinery; that's problem solving, but as often as not there's an effort required to think in that mode. But it doesn't drain you of emotion; quite the opposite, in fact, as I've noticed.

>You are dehumanized ever so slightly by using a computer.

How? In what way? Can you prove this? And, most of all, who decides what constitutes "human"?

I've noticed behavioral changes to prolonged stimuli -- but anything can do this. Geekdom isn't anything different from, say, a career soldier who spends thirty years on the lines, and then retires and goes to live in some peaceful area where the closest he gets to combat is Quadruple Coupon Day at the supermarket. His ingrained responses are not appropriate to his environment, he's greatly desensitized to violence, and he dehumanizes anyone whom he might consider a threat -- all things that someone who has spent his or her entire life living in a suburb would not at all comprehend. This retired soldier is in an environment that's totally alien to him, and he doesn't understand its social cues or expectations. I've been in situations like that -- I grew up in inner-city Bridgeport, and many of my expectations and thought processes were shaped by that in my teenage years, but I went to a college for two years where almost everyone came from whitebread suburbia. Add my geekishness to this, and I was about as out of place as one can get.

>This is a computer
>communcating directly with the synapses in your brain. It's the same
>dehumanizing effect magnified something fierce, because you're no
>longer operating at a more surface level. In order to make your
>thoughts translate into anything the computer can recognize as a
>command, you're going to have to make your thoughts rational and
>logical, regardless of what your dispositioning is. And if you expose
>a child to that at an early age, if you tell a child that the only way
>they're going to make it do anything is by pushing the emotions under
>the surface, they will learn. They will probably do it, if they
>consider the rewards worth it. And that can easily be the prelude to
>years of emotional repression and mindwarping.

I fail to see why this would be necessary. Create a datajack that functions on an abstract level, and it's just another part of the body as far as a child is concerned. It's not going to warp a child's emotions any more than learning how to throw a football or drive a car might. (The context might, but that's a different story.) Create a datajack that's just a far more immersive VR environment (and that's the way I see it being done, as it's far simpler and easier to learn), and it won't require any kind of discipline at all -- you're just operating with a different set of metaphors. Emotions will still be there -- the computer won't recognize them as relevant; they'll be like comments. I can get angry and strike the keyboard harder, and the computer doesn't know that it's any different. Let me restate this: there is nothing inherent in human-computer communication that mandates the suppression of emotion. Yes, objectivity helps, because you don't get stressed out about it. But all you'll be doing is putting your thoughts into a code the computer can understand, same as if I'm typing it, only faster. Working a computer via a datajack will be nothing more or less than working one with a keyboard.
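To make that "emotions are like comments" analogy concrete, here's a purely hypothetical toy sketch in Python - the function name and the '#' convention are just illustrations, not any real interface:

```python
# Toy illustration: "emotional" annotations ride along with a command
# but are ignored by the machine, the way comments are ignored by a
# compiler. Hypothetical example, not a real interface.

def execute(command: str) -> str:
    """Act only on the command text; anything after '#' is ignored."""
    action = command.split('#', 1)[0].strip()
    return f"executing: {action}"

# The machine behaves identically whether or not feeling is attached:
print(execute("open file"))                              # executing: open file
print(execute("open file  # furious at this deadline"))  # executing: open file
```

The anger is there, in the channel; the machine just doesn't act on it.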

Also, any teacher or psychologist can tell you that emotions are key in teaching children -- they're not robots. Children need to feel happy about what they're learning, to want to learn. Emotions would be vital to the process.

>This, by everything that I know, is a fairly accurate portrait. If
>you can provide absolute proof that this isn't what happens when you
>plug a computer into a datajack, go ahead and call my views absolute
>crap.

Then I challenge you likewise.

I've been sitting here thinking about different ways a human might interact on a mental level with a computer, and while I can see mental discipline being a necessity in an abstract UI, I can't see emotions being suppressed as a necessity or even a byproduct, nor can I see a loss of "humanity", however you might define it. Humans are emotional beings, not rational, as I've said before. I find the argument that a datajack would dehumanize or de-emotionalize a user to be alarmist at best, ludicrous at worst.

-- Sean --

http://www.thebrokenlink.org The Broken Link 4.0 is live!
"Imagination is more important than knowledge." -- Albert Einstein
"It's not easy being green." -- Kermit the Frog


Perko
Charter Member
Jul-12-01, 11:10 PM (EDT)
21. "RE: Cybernetics and Humanity"
In response to message #17
 
   Anyone who cares to know what the sides are, I think it's safe to assume I'm on Laudre's unless he says something goofy... plus, he's more clear than I am.

Of course, I wouldn't call LostFactor's ideas 'crap'... he's just been brainwashed by a different piece of society than we were. Poor guy, he's not geeky enough for us...

-Craig


Laudre
Charter Member
Jul-12-01, 11:27 PM (EDT)
22. "RE: Cybernetics and Humanity"
In response to message #21
 
   >Anyone who cares to know what the sides are, I think it's safe to
>assume I'm on Laudre's unless he says something goofy... plus, he's
>more clear than I am.

I dunno... I wasn't exactly digging trenches and loading up mortars :). I'm more interested in figuring out what the underlying assumptions are on both sides, and I think others have been more eloquent than I have in this matter, and have been better at striking at the heart of those underlying assumptions.

>Of course, I wouldn't call LostFactor's ideas 'crap'...

Eliot is voicing some valid concerns, but much of what he's said I just can't give any credence to, based on my understanding of the psychology of tool use and the methodology of interface design. It's based on certain underlying assumptions and perceptions that most people on this board don't share, which is why he's pretty much alone in his viewpoint here. Many, if not most, of the people here spend the majority of their time using computers, professionally and recreationally, and saying that using a computer dehumanizes them is borderline ludicrous and alarmist.

-- Sean --

http://www.thebrokenlink.org The Broken Link 4.0 is live!
"Imagination is more important than knowledge." -- Albert Einstein
"It's not easy being green." -- Kermit the Frog


LostFactor
Charter Member
Jul-13-01, 08:30 AM (EDT)
25. "RE: Cybernetics and Humanity"
In response to message #22
 
   >I dunno... I wasn't exactly digging trenches and loading up mortars
>:).

"I'm forced to dismiss your views as utter crap" sounds like fightin' words to me, boyo. ;>

>It's based on certain underlying assumptions and perceptions that most
>people on this board don't share, which is why he's pretty much alone
>in his viewpoint here.

Let it never be said that I conceded a point simply because I was in the minority. It's not the first time that I've represented the minority view, not the first time that I've been the only person who was representing that view, and it certainly won't be the last.

(Such as a time a couple days ago when I was defending NXE... but that's beside the point.)

>Many, if not most, of the people here spend
>the majority of their time using computers, professionally and
>recreationally, and saying that using a computer dehumanizes them is
>borderline ludicrous and alarmist.

Yeah, you guys are no fun whatsoever. I've got much more luck with the Southern Baptist crowd. ;>

I use computers professionally and recreationally, I just don't ascribe much to them. I recognize that my computer dehumanizes me ever so slightly, and I don't much like it, but I also recognize that a computer allows me to write faster, play games that I enjoy, and communicate with people on a wider basis. Get enough time away from a computer, and the dehumanization is easily balanced. As I've said before, it's extremely minor with a standard interface, to the point that it's generally not noticeable. So I'm willing to deal with it.

At any rate, I don't think I'll continue in this particular debate. My views have been stated as clearly as I think they can be, and as it stands I'm not going to have any luck convincing others. If nothing else, at least it gave me an idea for a humor piece. ;>
-Eliot "You may now continue" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendence Productions
Producer of, um, stuff for an unspecified time-period.


LostFactor
Charter Member
Jul-13-01, 07:45 AM (EDT)
23. "RE: Cybernetics and Humanity"
In response to message #21
 
   >Of course, I wouldn't call LostFactor's ideas 'crap'... he's just been
>brainwashed by a different piece of society than we were.

I appreciate this, Perko, more than you might know. Arigatou.

>Poor guy, he's not geeky enough for us...

Nope. I'm not a geek. Never said that I was. My Geek Code says it pretty plainly - I've got peripheral resemblance to a geek, but I am nowhere near a true geek. Truth be told, I'm closer to otaku than anything else.
-Eliot "My great-aunt's cat, on the other hand, is Geek" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendence Productions
Producer of, um, stuff for an unspecified time-period.


LostFactor
Charter Member
Jul-13-01, 08:21 AM (EDT)
24. "RE: Cybernetics and Humanity"
In response to message #17
 
   >How? In what way? Can you prove this?

Well, Exhibit A would be William Gates and Microsoft Tech Support. ;>

>And, most of all, who decides what constitutes "human"?

Humanity as an aggregate, of course. This is bordering on one of my favorite areas to delve into on a purely philosophical level, and I could fill a book with my musings on the nature of humanity and where we draw the lines between human and inhuman, so I'm not going to go too far into depth here (especially considering it's tangential to a topic that's already spawned off a tangent). As I said in an earlier post: if the aggregate of humanity considers a more clinical, logical mindset as human, then that will be human. But I'm speaking from the mindset of the 21st Century, not from what may or may not be the mindset of the future. And right now, the mindset that I foresee arising from using a datajack is not something that would be recognizable as entirely human.

>I've noticed behavioral changes to prolonged stimuli -- but
>anything can do this. Geekdom isn't anything different from,
>say, a career soldier who spends thirty years on the lines, and then
>retires and goes in live in some peaceful area where the closest you
>get to combat is Quadruple Coupon Day at the supermarket. His
>ingrained responses are not appropriate to his environment, and he's
>also greatly desensitized to violence, and he dehumanizes anyone whom
>he might consider a threat -- all things that someone who has spent
>his or her entire life living in a suburb would not at all comprehend.
> This retired soldier is in an environment that's totally alien to
>him, and he doesn't understand its social cues or expectations.

And you're postulating that the career soldier who's spent thirty years suppressing his compassion for humanity, who's learned to kill as is necessary in the line of duty, even if emotions say that the victim is innocent, isn't dehumanized on some level? I'd not argue that it's entirely different, but in both cases there has been some level of dehumanization in the subject. It's been in different directions, certainly, and I'd say the career soldier is likely more severely dehumanized, but both would encounter similar difficulties in relating to those who've not undergone similar effects.

<Most of Children bit snipped, as it's addressed at end>

>Then I challenge you likewise.

Give me a call when the first working datajack is released. If you're proven right, I'll happily concede. ;>

>I've been sitting here thinking about different ways a human might
>interact on a mental level with a computer, and while I can see mental
>discipline being a necessity in an abstract UI, I can't see emotions
>being suppressed as a necessity or even a byproduct, nor can I see a
>loss of "humanity", whatever you might define that. Humans are
>emotional beings, not rational, as I've said before. I find the
>argument that a datajack would dehumanize or de-emotionalize a user to
>be alarmist at best, ludicrous at worst.

As near as I can figure it, no matter what the UI being used by the computer, the human's role is going to boil down to making their thoughts resemble something that can be translated into binary code, or at least some rudimentary form of coding. Thinking "move mouse to the right" wouldn't produce any sort of action, because the computer wouldn't have any idea what it meant. What you'd need to do is fire your synapses in such a fashion that it triggered some form of event in the computer. Early datajacks probably wouldn't be useful except as curiosity items, because you'd have to concentrate very hard to get the computer to do anything you wanted it to. Later datajacks would have more intuitive code and an easier interface - but the basic interface is still not as simple as thinking what you want the computer to do. It could hear your thoughts, but it wouldn't be able to interpret them.

That's where I think our views are primarily deviating. You see a datajack's interface, unless I'm dramatically failing to understand you, as a more-immersive form of the interfaces that exist today, that respond to your thoughts intuitively. You think "Write following: The quick red fox jumped over the lazy red dog", and the computer writes it. But from everything I know, that requires software that can not only hear your thoughts but also has the intellect and understanding to translate them into Wired so that the computer can understand it. A datajack, in all likelihood, wouldn't HAVE a visual interface, because you wouldn't need it - you'd be dealing with pure data, turning switches on and off internally without the use of an external mechanism like a keyboard. It'd be years before someone developed a datajack interface that had visual elements like today's computers.

Having a computer respond to your thoughts requires either intimate knowledge of how to manipulate your thoughts into the Wired, or software that doesn't just read your thoughts but comprehends them. That does not currently exist. It may well never exist. Something along those lines would require an artificial intelligence of even the most rudimentary kind, something that could act as a translator between your thought-commands and the computer.

Develop software that can do that, that can make the thought "move mouse pointer" translate into movement of a mouse pointer, and I would agree with you wholeheartedly that the user interface would be no more dehumanizing than those that exist today. But until such software exists, there will be no middle element, and you'll have to alter your thoughts directly into the Wired language. That, by everything that I know about the way computers work and the way the human mind processes information, would either dehumanize or be flatly impossible. Your mileage may vary. ;>
-Eliot "Read other reply to your post before continuing" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendence Productions
Producer of, um, stuff for an unspecified time-period.


megazone
Charter Member
Jul-13-01, 09:10 AM (EDT)
27. "RE: Cybernetics and Humanity"
In response to message #24
 
   >As near as I can figure it, no matter what the UI being used by the
>computer, the human's role is going to boil down to making their
>thoughts resemble something that can be translated into binary code,

And I completley disagree.

>or at least some rudimentary form of coding. Thinking "move mouse to
>the right" wouldn't produce any sort of action, because the computer
>wouldn't have any idea what it meant.

Sure it would. There are ways to do this TODAY. They've done experiments using exterior sensors on the scalp to control pointers on a computer. It is part of the ongoing research to help the handicapped - say, allowing a quadriplegic to operate a computer interface by 'thinking'.

Every operation of our body produces distinct events in the brain. The challenge is mapping those events. I expect early systems will be something like early handwriting recognition or speech recognition systems - they'll take some training to adapt to the user.

But eventually they'll improve and adapt, making it easier on the user.
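As a purely hypothetical sketch of that calibrate-then-adapt idea (all the numbers, thresholds, and function names here are invented for illustration; real signal processing would be far messier):

```python
# Hypothetical sketch: translating a one-dimensional "sensor reading"
# into pointer movement, with a per-user calibration step.

def calibrate(rest_samples):
    """Estimate the user's resting baseline from sample readings."""
    return sum(rest_samples) / len(rest_samples)

def pointer_delta(reading, baseline, threshold=0.5, gain=10):
    """Readings near the baseline mean 'hold still'; deviations past
    the threshold move the pointer left (negative) or right (positive)."""
    deviation = reading - baseline
    if abs(deviation) < threshold:
        return 0
    return round(gain * deviation)

baseline = calibrate([0.9, 1.1, 1.0, 1.0])
print(pointer_delta(1.2, baseline))  # small wobble: pointer holds still
print(pointer_delta(2.0, baseline))  # strong signal: pointer moves right
```

The calibration step is the "training to adapt to the user" part; everything downstream of it is ordinary code.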

Doing more advanced things, like inputting text, is where the challenge lies. *Reading* text is comparatively simpler - the research into artificial vision could lead to systems that directly stimulate the optic nerve, or the vision center of the brain, so that you 'see' something that isn't really there - your brain would just process it like any other optical input.

Getting things *out* is more difficult - you could use a 'keyboard' interface and intercept the nerve signals, so that the user only needs to 'think' of typing; the signals would be intercepted before going out to the limbs, and you'd be typing on a 'virtual keyboard'.

Another possibility is intercepting speech signals - the main drawback to a voice interface is having to speak out loud. If you could subvocalize and the computer picked up the commands your brain is sending to your body, that would be an improvement.

Ultimately there will probably be a new interface that operates like a new appendage. As if you suddenly had a tail and had to learn how to move it, you could train the brain to operate some form of interface that we cannot yet imagine.

But I *NEVER* see the use of making someone think in machine code, or anything close. There is no point in it; it is a step backwards. And if you can develop an interface that can 'understand' thoughts enough to pick up the specialized code, then you've accomplished *most* of what you need to decipher more complex thoughts. Maybe in the lab as a development stage, but it'd be pointless to stop there.

Early systems would probably be a bit like the PalmOS Graffiti system - almost normal, but with some restrictions to limit what the machine has to cope with. But still more 'human' than 'machine'.

>respond to your thoughts intuitively. You think "Write following: The
>quick red fox jumped over the lazy red dog", and the computer writes
>it. But from everything I know, that requires software that can not
>only hear your thoughts but also has the intellect and understanding
>to translate them into Wired so that the computer can understand it.

Not at all.

>mechanism like a keyboard. It'd be years before someone developed a
>datajack interface that had visual elements like today's computers.

Actually that would probably come *first* - we're already developing the necessary technology.

>knowledge of how to manipulate your thoughts into the Wired, or
>software that doesn't just read your thoughts but comprehends

No it doesn't. Just like my Palm doesn't really understand anything. But it knows that if I draw the stylus down and then right, I want to input an 'l'. A direct interface could work on a number of principles - it could be as simple as training it. "Ok, now imagine drawing an 'l'" - and it records your neural patterns. So the next time it sees those, it translates them into an 'l'. Basic research in this direction has been underway for several years, and has produced some promising results. With more sensitive sensors, faster processors, etc., computers could store and compare more patterns - so you could have it translate entire words at a time instead of letters. This pattern means 'the', this other pattern means 'outside', etc.
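A toy sketch of that train-and-match idea, with "neural patterns" reduced to simple lists of numbers (entirely hypothetical - real neural data and matching would be much messier):

```python
# Hypothetical sketch: record a pattern for each symbol during training,
# then translate new patterns by finding the closest stored one
# (nearest-neighbor matching).

def distance(a, b):
    """Squared Euclidean distance between two patterns."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

class PatternTranslator:
    def __init__(self):
        self.trained = {}  # symbol -> recorded pattern

    def train(self, symbol, pattern):
        """'Ok, now imagine drawing an l' - record the pattern."""
        self.trained[symbol] = pattern

    def translate(self, pattern):
        """Return the trained symbol whose pattern is closest."""
        return min(self.trained, key=lambda s: distance(self.trained[s], pattern))

t = PatternTranslator()
t.train('l', [0.9, 0.1, 0.2])    # per-letter training...
t.train('the', [0.1, 0.8, 0.7])  # ...or whole words at a time
print(t.translate([0.85, 0.2, 0.1]))  # closest to the trained 'l' pattern
```

With more stored patterns and a better distance measure, the same scheme scales from single letters to whole words, as described above.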

You're entire line of argument has been very frustrating to me because it seems like you aren't aware of the technology and research that already exists, most of which refutes the foundations of your arguments about how the tech will likely work.

-MegaZone, megazone@megazone.org
Personal Homepage http://www.megazone.org/
Eyrie Productions FanFic http://www.eyrie-productions.com/
See what I'm selling on eBay


LostFactor
Charter Member
Jul-13-01, 09:20 AM (EDT)
28. "RE: Cybernetics and Humanity"
In response to message #27
 
   >Sure it would. There are ways to do this TODAY.

I've never heard of any of this technology, but assuming it exists (and I have no reason to doubt you) then, as I already stated pretty firmly, the situation is entirely different.

>You're entire line of argument has been very frustrating to me because
>it seems like you aren't aware of the technology and research that
>already exists, most of which refutes the foundations of your
>arguments about how the tech will likely work.

Don't you mean "your"?

Your entire line of argument has been very frustrating to me because up until this post you haven't used a single piece of evidence that would back up your point, and if you'd simply pointed this technology out at the beginning then we could have gotten this over with far faster. If the technology exists, then I'm misinformed, simple as that. But if it doesn't (which, as far as I knew, was the case), then it's all simple speculation and there's nothing to validate or invalidate it.
-Eliot "And that's about the last thing I plan to say about it" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendence Productions
Producer of, um, stuff for an unspecified time-period.


megazone
Charter Member
Jul-13-01, 09:21 AM (EDT)
29. "RE: Cybernetics and Humanity"
In response to message #28
 
   >Don't you mean "your"?

Yep - but I've been up all night with insomnia and can't be bothered to fix my typos right now.

-MegaZone, megazone@megazone.org
Personal Homepage http://www.megazone.org/
Eyrie Productions FanFic http://www.eyrie-productions.com/
See what I'm selling on eBay


LostFactor
Charter Member
Jul-13-01, 09:26 AM (EDT)
30. "RE: Cybernetics and Humanity"
In response to message #29
 
   >Yep - but I've been up all night with insomnia and can't be bothered
>to fix my typos right now.

Ah. I'm rather glad that I decided to be tactful about "completley", then. ;>

I suppose this rather settles the debate, since nobody else is going to argue my side. I'd still be a little cautious about getting a datajack, and I'd wait until there was a bit of research done on their effects, but if it utilized the sort of "Graffiti" system that you're referring to (a nod to the Palm's system, yes), I'd not be terribly reluctant to get one. Then again, sometimes it's a good thing that I write a bit slower than I think. ;>
-Eliot "And there was peace in our time" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendence Productions
Producer of, um, stuff for an unspecified time-period.


Wedge
Charter Member
Jul-13-01, 03:19 PM (EDT)
31. "RE: Cybernetics and Humanity"
In response to message #28
 
   >>Sure it would. There are ways to do this TODAY.
>
>I've never heard of any of this technology, but assuming it
>exists (and I have no reason to doubt you) then, as I already stated
>pretty firmly, the situation is entirely different.

I was in a Fry's electronics once (think Home Depot for electronics/computers) and they had a display setup in the computer games section. On the monitor was a fairly simple first person skiing game, maybe Super-Nintendo quality or a little less. You had to go downhill and go through gates for points.

The only thing you had for control was a place to rest your hand, with your pointer finger in a separate groove with electrodes at the fingertip. The instructions were to think 'left' to go left, and 'right' to go right, while keeping your hand still. I played it, and it worked - not with the accuracy of a keyboard or mouse, but it worked (though the slight electric tingle in my fingertip was a mite worrisome, I was too busy being impressed to care :). Anyway, it was a kit you could buy to add on to your computer, and they had like three games--skiing, a flight sim, and a bowling game I think. And this was 5 years ago to boot. I don't think they sold many, as it's nowhere to be seen now, but it was interesting either way.

------------------------
"saaaausaaaage... "
------------------------
Chad Collier
Digital Bitch
J. Random VFX Company


LostFactor
Charter Member
Jul-12-01, 08:00 AM (EDT)
7. "RE: Cybernetics and Humanity"
In response to message #0
 
   This topic has also been moved from the WL board for sake of topic drift. ;>

>For the masses this is called television. And that's not flippant.

You see what I'm afraid of, then. ;>

>I regularly spend 10, 12, 14, or more hours a day at a computer. I
>don't then treat people like a UNIX command line, or attempt to speak
>Java to them, or think they can be pushed around like a machine can.
>I interact with machines MUCH more than I do with other people -
>that's part of my job. I still know how to function in society.

You're also not doing your job with your entire brain directly interfacing with UNIX command lines that speak Java to you, I'll note. Or at least, I'd imagine you're not, unless there are fewer differences between you and your UF avatar than I know. ;>

>I think your point of view is way too alarmist and there is no evidence
>to support it in the general case. I'm sure there will be cyber-otaku
>just like everything else. People who only function when immersed in
>cyberspace, or fetishists who prefer artificial parts to meat. That's
>just a growth of the tattoo and piercing trend that started in tribal
>days - there is already some seriously intense body modification being
>done.

Keep in mind that I'm not screaming, "Destroy the evil cyberware! Repent, my brothers, in the name of the Lord!" (Though that would be kind of funny. Heh, MKS sketch emerging.) I'm supporting a counterbalancing opinion, or at least I'm trying to. Erring on the side of caution, if you will.

To restate my original opinion, since it seems to have gotten muddled somewhere in here, due in no small part to my own actions: when cyberware does come out, I would not be first in line to get it. I happen to like my body, and I'm not willing to gamble on losing some things that make it quintessentially mine for the benefit of better performance. Hence, the analogy with Erin. If everyone else wants to get themselves cybered, that's fine by me, but I'd not look on it with universal approval. ;>

>Personally I want cyberjacks and that's about it - unless they come up
>with something to keep me from being fat but lets me eat what I want.
>Maybe a small implanted bio-reactor to produce power for other
>implants. ;-)

I actually sketched out a basic idea for something exactly like that... a sort of artificial regulator for neurotransmitters that could turn your metabolic rate up or down freely. Didn't go very far, though, since I just solved that problem by dieting. (Not much fun, I might add.)

>I remember someone posting a criticism of UF a while back that I meant
>to reply to and didn't. One of the complaints was the apparent lack
>of cyberware... But the key there is apparent. Cyberware is
>*rampant* in UF by the time of FI - but quality ware is invisible.
>And it is common, so characters aren't going to comment on it in
>story. Many people have cyberjacks, and some will have other mods.
>But only the fetish crowd does the chrome arm thing. And only on some
>pretty messed up worlds do you find much combat ware.

Not reading UF, I'd be in no place to comment here. ;>
-Eliot "Megazone also picks up a cookie" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendence Productions
Producer of, um, stuff for an unspecified time-period.


remandeteam
Member since Jul-31-07
78 posts
Jul-12-01, 12:43 PM (EDT)
11. "Cyberpsychosis"
In response to message #0
 
   >Second, will extensive augmentation (as opposed to simple medical
>replacements) of the human form necessarily result in a "loss of
>humanity" (I prefer to use the term "psychological difficulties")? My
>best guess, lacking any truly conclusive information, is "it depends".
>I have a hunch that it will depend a great deal on individual
>personality. I can easily imagine that there might be people who will
>have severe difficulties with a medical hand replacement, never mind
>wholesale augmentation of the type described in many SF and cyberpunk
>novels and games. I think that for any but rather minor augmentations,
>psychological screenings will have to be part of the package. I think
>that it will depend to some extent on the number and type of
>augmentations involved, though it will probably not be a neat linear
>progression ala Cyberpunk 2.0.2.0.. For what purposes the
>person is receiving and using the augmentations will quite possibly be
>extremely important.

CP 2020 Cyberpsychosis is almost realistic. For those who haven't played, the CP 2020 system has a phenomenon known as cyberpsychosis. Every character has an empathy stat (how well they deal with other humans), and this score drops whenever the character gains cybernetics. This score can be increased with therapy, and the "empathy cost" of a cybernetic is directly related to the added bonus it gives you. A replacement eye is "cheaper" than one with infrared optics, and both are much "cheaper" than an SMG implanted in your arm. When a character's empathy falls below zero, the character becomes cyberpsychotic. If it happens to your character, you must turn the sheet over to the GM and start again. Cyberpsychos tend to go postal, not caring about "mere humans". Usually, these monsters are taken down by cyberpsycho C-SWAT forces (think AD Police).
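As a rough illustration of how that mechanic hangs together (the implant names and empathy costs below are made up for the example, not taken from the actual CP 2020 rulebook):

```python
# Hypothetical sketch of a CP 2020-style empathy/cyberpsychosis mechanic.
# All implant names and costs here are illustrative, not the real game values.

CYBER_COSTS = {
    "replacement_eye": 1,   # plain replacement: "cheap"
    "infrared_optics": 2,   # added capability costs more empathy
    "arm_smg": 6,           # implanted weapon: most expensive of the three
}

class Character:
    def __init__(self, empathy):
        self.empathy = empathy
        self.implants = []

    def install(self, implant):
        """Installing cyberware drains empathy by its cost."""
        self.implants.append(implant)
        self.empathy -= CYBER_COSTS[implant]

    def therapy(self, points):
        """Empathy can be bought back with therapy."""
        self.empathy += points

    @property
    def cyberpsychotic(self):
        # Below zero, the sheet goes to the GM and you roll a new character.
        return self.empathy < 0
```

So a character starting at empathy 5 could take the eye and the infrared optics (down to 2) and stay playable, but adding the arm SMG (down to -4) would tip them over the edge unless they bought some of it back in therapy first.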

As it turns out, there is such a thing as being psychopathic in the Real World (not the same as being psychotic). A psychopath has zero empathy for others, and thus they do not feel bound by social protocol. Said people feel no need to obey the law except for fear of being caught. These are often the people who go out, kill lots of people simply for the feeling of power it gives them, and then feel no remorse when captured.

The idea in CP2020 is that the cyberware can do two things: it can alienate you from everybody else and isolate you, and it can give you vast superhuman powers that also isolate you from everybody else. The more "enhancing" the cyber is (rather than just replacing your own damaged meat), the more artificial it looks, and the more deadly it is, the worse it affects you. Once people start avoiding you, and you find out that you can grind them into dust, then people just don't matter. And that is the textbook definition of a psychopath.

--rR


ejheckathorn
Member since Aug-9-13
51 posts
Jul-12-01, 04:06 PM (EDT)
15. "RE: Cyberpsychosis"
In response to message #11
 
>The idea in CP2020 is that the cyberware can do two things: it can
>alienate you from everybody else and isolate you, and it can give you
>vast superhuman powers that also isolate you from everybody else.
>The more "enhancing" the cyber is (rather than just replacing your own
>damaged meat), the more artificial it looks, and the more deadly it
>is, the worse it affects you. Once people start avoiding you, and
>you find out that you can grind them into dust, then people just
>don't matter. And that is the textbook definition of a
>psychopath.
>
> --rR

To amplify on a point I made above, I think that the reason that a person receives augmentations would be a very important factor.

(following example from a post Paul J. Adam made to rec.games.frp.cyber on April 14, 1997, though not exactly quoted)

Let's say you have a paramedic who receives implanted headware memory (to store the latest trauma care techniques), hand razors (in case he doesn't have a scalpel handy), cybereyes with thermograph, low light and magnification (for working on casualties in low light situations) and some form of muscle/strength augmentation (for pulling wreckage off people).

Now, is this person just as likely to go psychopathic as a person who received similar augmentation for combat purposes (assuming cyberware can have that kind of effect)?

Comments, anyone?

Eric J. Heckathorn
ericjh@stargate.net


Laudre
Charter Member
Jul-12-01, 06:52 PM (EDT)
18. "RE: Cyberpsychosis"
In response to message #15
 
   >To amplify on a point I made above, I think that the reason that a
>person receives augmentations would be a very important factor.
<snip>
>Comments, anyone?

I'm in violent agreement. I'd love to have some kind of permanent work done to my eyes to fix the (admittedly relatively mild) myopia and astigmatism; corrective lenses have been a pain in my ass for 20 years, and I'd welcome a way to reliably be done with them permanently, whether it's LASIK (which I've considered but can't afford right now, since insurance doesn't cover it), cybernetics, or even genetweaking. I wouldn't see such work as in any way separating me from humanity, or stripping away my own humanity. Similarly, the other mods I'd get if they were available -- nanohealers, a datajack -- they wouldn't, to me, make me any less human. I wouldn't even see myself as a transhuman, just a human with a few add-ons. Eyes would be my contacts taken to the next level; nanohealers would be immunization shots and first-aid taken to the next level; a datajack would enable me to work with computers much more efficiently (in theory, anyway). Other things, like claws, I might not be so keen on, and that might cause me problems.

-- Sean --

http://www.thebrokenlink.org The Broken Link 4.0 is live!
"Imagination is more important than knowledge." -- Albert Einstein
"It's not easy being green." -- Kermit the Frog


remandeteam
Member since Jul-31-07
78 posts
Jul-12-01, 09:18 PM (EDT)
19. "RE: Cyberpsychosis"
In response to message #15
 
   >(following example from a post Paul J. Adam made to
>rec.games.frp.cyber on April 14, 1997, though not exactly quoted)
>
>Let's say you have a paramedic who receives implanted headware memory
>(to store the latest trauma care techniques), hand razors (in case he
>doesn't have a scalpel handy), cybereyes with thermograph, low light
>and magnification (for working on casualties in low light situations)
>and some form of muscle/strength augmentation (for pulling wreckage
>off people).
>
>Now, is this person just as likely to go psychopathic as a person who
>received similar augmentation for combat purposes (assuming cyberware
>can have that kind of effect)?

I agree, the paramedic is less likely to go psychopathic. Combat mods are the worst--the act of killing another human being is arguably psychopathic in and of itself, whether as murder, self-defense, or as a soldier in the field. OTOH, the mods of a paramedic are likely to be the least damaging. A paramedic's mission is to care for people, in the most direct way possible. Indeed, those mods reinforce the belief that people do matter, and there is no better defense against psychopathy.

--rR


megazone
Charter Member
Jul-13-01, 08:52 AM (EDT)
26. "RE: Cyberpsychosis"
In response to message #11
 
   >CP 2020 Cyberpsychosis is almost realistic. For those who haven't
>played, the CP 2020 system has a phenomenon known as cyberpsychosis.

I'll note that I thought that was one of the worst rules I have ever found in an RPG. I despised it and it was one of the main reasons I stopped playing around with CP2020.

I hated it mainly because everyone is different - the idea of a static 'cost' for something was moronic. Like Briareos in Appleseed - some people could be full conversion cyborgs and still be happy, friendly sorts who love other people. And someone else could get their eyes replaced and just snap.

Just like in real life, it comes down to how stable the person is to start with. CP2020 treated pieces of your body like they contained part of your personality - remove something and you lose humanity. Bunk.

-MegaZone, megazone@megazone.org
Personal Homepage http://www.megazone.org/
Eyrie Productions FanFic http://www.eyrie-productions.com/
See what I'm selling on eBay


ejheckathorn
Member since Aug-9-13
51 posts
Jul-13-01, 03:28 PM (EDT)
32. "RE: Cyberpsychosis"
In response to message #26
 
   >I'll note that I thought that was one of the worst rules I have ever
>found in an RPG. I despised it and it was one of the main reasons I
>stopped playing around with CP2020.

Just out of curiosity...

The cyberpsychosis and humanity rules are, of course, a game balance mechanic. What, if anything, would you like to see in their place? Rules based on price/availability, social/legal consequences, or something like Shadowrun's Essence (where your body just gives up and dies if you have too much cyber)?

Eric J. Heckathorn
ericjh@stargate.net


LostFactor
Charter Member
Jul-13-01, 03:43 PM (EDT)
33. "RE: Cyberpsychosis"
In response to message #32
 
   >...something like Shadowrun's Essence (where your body just gives
>up and dies if you have too much cyber)?

Essence is perhaps the most contrived artificial balancing mechanism in a game system since AD&D declared that wizards can't wear armor. (For a discussion on THAT, I refer you here.) Read the beginnings of Man & Machine, and you'll see that they didn't even really have a good explanation of what it was when they wrote it into the system.

It's an artificial mechanism that's there to prevent your cybered-up god from casting spells, too, and has a couple other uses just to offer it legitimacy. I've never played CP2020, but it sounds like a preferable mechanic.
-Eliot "And yes, most of my characters wind up fairly cybered, ironically enough" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendence Productions
Producer of, um, stuff for an unspecified time-period.


Mephronmoderator
Charter Member
1896 posts
Jul-13-01, 03:56 PM (EDT)
34. "RE: Cyberpsychosis"
In response to message #32
 
Lemme see if, when I get home, I can dig up the alternate rules for cybernetics I wrote and tested (with much success) in a campaign. Effects were based on percentages of your original humanity gone instead of a linear scale, there was a random effect depending on those percentages (thus ensuring that only the most horrifically chromed had the most horrific problems), and it also included lesser effects.

This resulted in a situation in the middle of a firefight where someone who had rolled in the '25%' zone (Cyberware problem - possible incompatibility with other cyberware) after getting cybereyes pulled their gun, activated the smartlink... and suddenly the world went sky-blue with 'General Protection Fault: There Has Been A Kernel Error in Module SMRTLINK.EXE - please contact manufacturer for assistance'. Device driver conflict!

At the time, they were rolling on the ground laughing at the idea of someone's eyes GPFing from a bad smartgun link. (It was amusing as hell to me, too. Even as much fun as the rocket launcher disguised as a basset hound.)
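A rough sketch of how rules like that might look - the percentage bands and effect tables below are purely illustrative, not the actual house rules:

```python
import random

# Illustrative sketch of a percentage-threshold humanity-loss rule:
# effects depend on what fraction of original humanity is gone, and a
# random roll picks the specific problem within each band, so only the
# most heavily chromed risk the worst outcomes. Bands and effects are
# invented for the example.

EFFECT_TABLES = {
    0.25: ["no effect", "minor tic", "cyberware incompatibility"],
    0.50: ["cyberware incompatibility", "social withdrawal", "paranoia"],
    0.75: ["paranoia", "dissociation", "violent episodes"],
}

def humanity_effect(original, current, rng=random):
    """Pick a random effect based on the fraction of humanity lost."""
    lost = 1 - current / original
    for threshold in sorted(EFFECT_TABLES):
        if lost <= threshold:
            return rng.choice(EFFECT_TABLES[threshold])
    return "full cyberpsychosis"  # more than 75% gone
```

A character who has only lost an eighth of their original humanity rolls on the mild table (which is where an incompatibility like the smartlink GPF above would come from), while one past the 75% mark is simply gone.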

--
Geoff Depew - Mephron
"Big O! Showtime!"


LostFactor
Charter Member
Jul-13-01, 04:00 PM (EDT)
35. "RE: Cyberpsychosis"
In response to message #34
 
   That broke me but good, dammit. ;>
-Eliot "Using a Macintosh smartlink with OS/2 cybereyes?" Lefebvre
-=()=-
We're only given a little time in our lives to waste. Make the most of it.
Electronic Transcendence Productions
Producer of, um, stuff for an unspecified time-period.



version 3.3 © 2001
Eyrie Productions, Unlimited
Benjamin D. Hutchins
E P U (Colour)