
Sunday 19 February 2023

ARTIFICIAL INTELLIGENCE: I think, therefore I find humans an annoying waste of space.

 

(Cartoon credit: timoelliot.com)

I love the fact that no one has yet figured out what makes humans self-aware.  We have it, we're terrified of losing it, but we haven't a clue what, in the brain, causes us to have this capacity.

Now, of course, the media is full of talk about Artificial Intelligence, or AI.  Well, intelligence is different from self-awareness.  Intelligence is defined as 'the ability to acquire and apply knowledge and skills', whereas self-awareness is defined as 'conscious knowledge of one's own character and feelings'.

A computer acquires its 'knowledge' in the form of data in a stream of binary input.  In fact, you can't really call it knowledge, because the very word implies knowing or knowingness, which to my way of thinking is allied to consciousness.  The computer's memory, or registers, holds banks of data, and a program held in other registers tells it what to do with that data.  Now, this explanation is pretty simple, but then so are computers, no matter how large or complex their programming becomes.  What computers are, however, is really, really fast; close to speed-of-light fast.

With a sufficiently complex program, i.e. a set of instructions created by a human, a computer can 'solve' mathematical and logical problems.  But remember, it's the program that tells it how to do that.  The computer has as much awareness of what it's doing as your average toaster, or toothpick for that matter, so don't get the idea that AI is anywhere near self-conscious or aware.  However, one day, just maybe, given that we don't know how self-awareness comes about, perhaps a computer program and an enormous data bank might create some kind of fusion, fission or whatever, and come into a state of awareness.
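To make the point concrete, here's a minimal sketch in Python (purely illustrative; the numbers and the little 'program' are made up for this example).  A bank of data sits in memory, and a set of instructions tells the machine what to do with it.  The 'solving' is nothing more than the steps being ground through, one after another; there is no knowing going on anywhere.

    # A tiny 'data bank' and a tiny 'program': instructions applied to data.
    # The machine running this has no idea what a temperature or an average is.
    data_bank = [12.5, 14.1, 13.8, 15.0]   # data held in memory

    def program(data):
        # Step 1: add the numbers together.
        total = sum(data)
        # Step 2: divide by how many there are.
        return total / len(data)

    print(program(data_bank))   # prints 13.85 -- a problem 'solved', zero awareness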

I seriously doubt it, because I feel that somehow biology and chemistry come into it, but I, like everybody else, don't know.  I asked a friend the other day, "What happens when they can upload my self-awareness and memory into a computer?"  That idea is usually called 'mind uploading', and it's often bundled in with talk of the 'singularity'.  I then added, "Will I be able to see, touch, feel emotion and all the important aspects of being human?"

His reply was that it would be a 'virtual' existence, but that would be a program, not real.  In order to be real, biology and chemistry must be involved, not just metals, electricity, printed circuits and silicon chips.  Anyway, we've got a long way to go before that happens, but I believe it will happen eventually.  You're not going to be able to stop scientists doing it either, because once those who can are on a roll, they'll all be racing to make it happen first.  It seems to be human nature to invent something first and worry about the consequences later, which does make you wonder about our level of real intelligence, meaning wisdom, and about our ability to create something in our own likeness.

The Bible states that "God created mankind in his own image".  Well, that's a worry in itself, isn't it?  That leaves this highly imperfect species at the point in its evolution where it's about to try to play God, so if a self-aware, artificial being pops out of this innovation, we'd better watch out.

Think about it.  Humans are very high maintenance.  We eat, and that requires agriculture and slaughter.  We need a lot of space, in the form of residences.  We move around continually, creating air pollution.  Oh yes, and we need air.  We are perishable and rot once dead.  We reproduce with gay abandon without thinking of all of the above.  We require medicines, surgery, prosthetics and psychological counselling.  Some are prone to killing others willy-nilly, even at the international level of war.  We covet, we create inequalities amongst ourselves by gender, race and religion, and then we deify such unworthy types as film and rock stars.  Many people are corrupt.

How is this going to improve if we create beings similar to ourselves?  Or are we going to create beings with only logic and no emotion?  I doubt that too.  Once the powers that be start, they're not going to stop at emotionless, self-aware beings, because such beings would find us totally illogical, therefore a pest, and therefore, according to the logic programmed into them, something to be eradicated.  Of course, you could put stopgaps in their programming to avoid this, but your logic circuit boards would have to be pretty bug-free, or your AI being might put two and two together, or in their case, 10 and 10 (that being two, written in binary).

Another problem is that if these beings can physically function with limbs and have the know-how to build themselves, they can make their own self-improvements.  Eventually, the food-consuming, air-absorbing beings that age, become decrepit and need care will become a roadblock in the path of progress.  Perhaps the AI could be programmed with a sense of beauty, or with lust for the physical, because they'd have to appreciate what is physical to tolerate us, but I honestly don't think those things can be programmed.

Frankly, I think it's going to be a pretty boring world without humans, and I also don't imagine the AI beings are going to appreciate our varied fauna and flora, care for them, or wonder at creation.  But if humans do make the AI beings anything like us, eventually they'll get bored and start doing all the illogical things we do to stave off madness.

END