Jeni Nielsen wrote:
Bimboliquido wrote:Curious how everyone ignored my post while they answered to Mu, who just said what I said in two lines... Nice forum, this one!
Er, well, if you really want feedback, make your post easier to read. I went back and tried to read your post, and it wasn't easy to do.
Bimboliquido wrote:
Jeni Nielsen wrote:Er, well, if you really want feedback, make your post easier to read. I went back and tried to read your post, and it wasn't easy to do. Granted, it doesn't have to be perfect or anything, but paragraphs and punctuation are always good ways to get people to want to read what you have to say.
Can't say you're wrong, but I'll admit that, given the reading difficulty of the first posts, I thought it wasn't an obstacle. Well, my mistake, and after all, who cares?
Lightice wrote:
mu wrote:AI will never surpass us because they need us to reproduce them. And they are not biological creatures, so they can't evolve and adapt when necessary. They also need us to maintain them.
Jeni Nielsen wrote:Well, I don't know. Think about how cars get built nowadays. It's mostly machines building other machines. Who's to say it won't end up being a completely automated process in the future? So, in essence, machines will be able to "reproduce" themselves.
Indeed. And besides, technology evolves. Technological evolution is both faster and more efficient than biological evolution. When a human-level AI is realized, it will be able to take part in designing a super-human AI, which alone can design AIs of greater and greater capability. It is evolution, but unlike ours, it is guided by intelligence and proceeds in a matter of months or even days, rather than millions of years.

No human interference is required. There is no reason why a machine of suitable complexity could not possess all the creativity of the human mind, and more.

Will humans be replaced? I don't think so. After all, as said, the AIs will be created by us and raised by us. Essentially, they will become us, though far more intelligent and capable.
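The claim that intelligence-guided change outpaces blind mutation can be sketched with a toy search problem. Everything here is illustrative and my own assumption, not anything from the thread: the quadratic "capability" score, the arbitrary optimum at 50, and the ±1 step size.

```python
import random

random.seed(0)

def fitness(x):
    # Toy "capability" score, peaking at x = 50 (an arbitrary optimum).
    return -(x - 50) ** 2

def blind_evolution(steps):
    # Blind mutation: propose a random change, keep it only if it
    # happens to improve fitness. Roughly half the proposals are wasted.
    x = 0
    for _ in range(steps):
        candidate = x + random.choice([-1, 1])
        if fitness(candidate) > fitness(x):
            x = candidate
    return x

def guided_search(steps):
    # "Guided" change: the designer checks both directions and always
    # moves toward higher fitness, so no step is wasted.
    x = 0
    for _ in range(steps):
        if fitness(x + 1) > fitness(x):
            x += 1
        elif fitness(x - 1) > fitness(x):
            x -= 1
    return x

print(guided_search(60))    # reaches the optimum, 50
print(blind_evolution(60))  # typically still short of 50
```

Guided search hits the optimum in exactly 50 steps, while blind mutation needs about twice as many on average, since half its proposals point the wrong way; the gap only widens as the search space grows.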
Lethagrin wrote: Bringing this post back to topic...
Let's not forget that the standard meaning of "evolution", at least from an organismal perspective, is survival of the fittest driven by genetic mutations in the haploid sex cells (sperm and egg) created prior to fertilization. The traits inherited by future generations serve as strengths (or weaknesses) in their environment. These aspects of evolution can only operate at the population level, not the individual level.
I'm not saying that AIs have to be a reflection of humans; however, mankind's creations typically represent their makers more than anything.
Lightice wrote:
Lethagrin wrote:Bringing this post back to topic...
OTing the post again... I'm not sure if the forum rules consider this necromancy or not, but new discussion is always welcome, as far as I am concerned.
mu wrote:AI will never surpass us because they need us to reproduce them. And they are not biological creatures, so they can't evolve and adapt when necessary. They also need us to maintain them.
I think AIs will pass us by. As pointed out, the manufacturing industry is already making people redundant. Computers from a year ago are already obsolete. Plus, I read somewhere that MIT already has a computer that can physically interface with a human brain, and researchers are working on computer-controlled prosthetics. The famous Three Laws have already been shown to be flawed, to the point of being a possible cause of our downfall. Our military is working on machines that can select and attack targets without any human input.
"And if AIs develop emotion, I hope they have compassion, because if they don't, that would suck."
You mean the kind of compassion that humans show their "inferior" species, such as the other life on this planet? Let's hope there aren't as many people willing to put cows unnecessarily through the chopper, and okay with thinking of other creatures as nothing more than resources to be farmed... Let's hope the AIs don't behave as the majority of humanity does in that respect. Humans put their closest cousins, the apes, through lab tests; does man act in a way that makes him undeserving of similar treatment from a lifeform higher than himself? (Of course, many individuals do better, but fate doesn't exactly stop kindly, look around, and say, "Oh, but look at the individuals.")
"It's a bit like humans and dogs, really. Humans acting out of their worse nature breed dogs for fighting. They push their luck with them, and then is it any wonder when, one day, the unstable creature they've created turns that killer nature on them?"
Well said, and a good analogy. This could apply to AIs in the sense that an AI developed and contained in a lab would have no reason to develop a killer nature, whereas an AI embedded in a battle tank, programmed to have a "killer nature" or to disregard the value of (human) life, would be more of a threat (to its creator) if its thought process were not under adequate constraints.
"The problem nobody's talked about is programming creativity."
What if creativity is just a form of logic so abstract that it cannot be emulated? That is to say, humans don't understand it, so we can't willfully create a device to emulate it. However, if an AI were made to evolve its logic processes, it could possibly acquire this trait.
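That "evolve its logic processes" idea can be sketched with a tiny mutate-and-select loop. This is a minimal toy, and the choices in it are mine, not the poster's: the "logic" is just a four-row truth table, and the target behaviour is XOR.

```python
import random

random.seed(1)

# Target behaviour: XOR over inputs (0,0), (0,1), (1,0), (1,1).
TARGET = [0, 1, 1, 0]

def score(table):
    # Number of truth-table rows the candidate gets right.
    return sum(1 for got, want in zip(table, TARGET) if got == want)

def evolve(generations=200):
    # Start from a random truth table, flip one random row per
    # generation, and keep the mutant only when it scores better.
    best = [random.randint(0, 1) for _ in range(4)]
    for _ in range(generations):
        mutant = best[:]
        mutant[random.randrange(4)] ^= 1
        if score(mutant) > score(best):
            best = mutant
    return best

print(evolve())  # almost always converges to [0, 1, 1, 0], i.e. XOR
```

Nobody tells the loop what XOR "means"; selection pressure alone pushes it onto the right behaviour, which is the sense in which an evolving system can end up with an ability its designer never explicitly wrote down.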