in the shell Forum Index
Discussions about the many incarnations of Ghost in the Shell, philosophy and more!
 

The Singularity

 
in the shell Forum Index -> The Technology
Kojima



Joined: 10 Aug 2016
Posts: 56
Location: The Wired

Posted: Wed Dec 21, 2016 2:37 am    Post subject: The Singularity

The technological singularity (also, simply, the singularity) is the hypothesis that the invention of artificial superintelligence will abruptly trigger runaway technological growth, resulting in unfathomable changes to human civilization. According to this hypothesis, an upgradable intelligent agent (such as a computer running software-based artificial general intelligence) would enter a 'runaway reaction' of self-improvement cycles, with each new and more intelligent generation appearing more and more rapidly, causing an intelligence explosion and resulting in a powerful superintelligence that would, qualitatively, far surpass all human intelligence.

John von Neumann first used the term "singularity" in the context of technological progress causing accelerating change: "The accelerating progress of technology and changes in the mode of human life, give the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, can not continue". Subsequent authors have echoed this viewpoint. I. J. Good's "intelligence explosion" hypothesis predicted that a future superintelligence would trigger a singularity. Science fiction author Vernor Vinge said in his essay The Coming Technological Singularity that this would signal the end of the human era, as the new superintelligence would continue to upgrade itself and would advance technologically at an incomprehensible rate.

In his 2005 book, The Singularity is Near, Kurzweil suggests that medical advances would allow people to protect their bodies from the effects of aging, making life expectancy limitless. Kurzweil argues that technological advances in medicine would allow us to continuously repair and replace defective components in our bodies, prolonging life to an undetermined age. Kurzweil further buttresses his argument by discussing current bio-engineering advances. Kurzweil suggests somatic gene therapy: after engineering synthetic viruses carrying specific genetic information, the next step would be to apply this technology to gene therapy, replacing human DNA with synthesized genes.

Beyond merely extending the operational life of the physical body, Jaron Lanier argues for a form of immortality called "Digital Ascension" that involves "people dying in the flesh and being uploaded into a computer and remaining conscious". Singularitarianism has also been likened to a religion by John Horgan.

Sorry for re-hashing wikipedia content, my original post ran away from me there. The original question was: have the Tachikomas, in some special way, reached a form of singularity within themselves while establishing a ghost? That was really the question, as well as pondering whether or not Ghost in the Shell has any elements of this so-called "Singularity" hypothesis.
_________________
"A leader is best when people barely know he exists, when his work is done, his aim fulfilled, they will say: we did it ourselves." -- Lao Tzu.
Freitag



Joined: 01 Sep 2008
Posts: 577
Location: Behind you

Posted: Sat Dec 31, 2016 8:43 pm

I'm not sure that GitS explores singularity-type concepts. It seems to me to really be about the human element and how to be "real" in a world where relationships have been replaced by gadgets and data.
I think the tachis might get there if they lived long enough.

But what I've understood about the singularity is that new information is generated so rapidly that it becomes impossible to keep up with all of it. To some extent we passed that limit 100 years ago or so. For instance, I'm aware that a general theory of relativity exists. I'm fairly familiar with the special theory and can describe it and can even go through some of the mathematical steps that make it up. But I'm lost on the general theory; it uses math that I just do not know. And then in a totally different direction there are all the advances being made in genetic analysis. Learning and becoming proficient at the physics and maths behind relativity won't help me advance my understanding of genetic theory. And after I do get all that, I'll still have a ton to learn about microcircuitry. Oh, and that general relativity stuff? That's been around for 101 years already.

The event horizon of the knowledge singularity is already behind us.

I do think that singularity has been turned into a blanket label, though. The tachis had that nightly sync where they combined multiple sets of experience into a single experience. I'm not sure why one of them liked books (and not another one); even the natural oil would only have affected one of them. Hmm, sort of rambling. I think that if the tachis had been free to acquire experiences as they seemed to want to do, and to sync repeatedly, they'd have run into a limit of how much they could store. The interconnectedness limit will kick in somewhere. Assuming you've already solved the problem of how to "know" the sum of all earthly knowledge, how do you sift through that pile of data to find the information relevant to the decision at hand? But then I guess that is the next horizon, perhaps: the discovery of a way to process it all.
_________________
People tend to look at you a little strangely when they know you stuff voodoo dolls full of Ex-Lax.
GhostLine



Joined: 19 Dec 2005
Posts: 631
Location: "the net is vast and infinite..."

Posted: Mon Jan 09, 2017 5:06 pm

Unless technology can provide a way for humans to upgrade as quickly as technology itself, then I think the organic half of a theoretical singularity will fail to meet its potential. Other than medical advances to stave off the effects of age and disease, I don't see humanity being able to compete with technology. We are already abandoning human interaction for less-demanding technological social options.

It seems odd, knowing that our human-made tools and assists will pass us by unless there is actually a large cyberbrain achievement of some kind. I see things going the way of Idiocracy at this stage in history.
moreorless



Joined: 21 Apr 2017
Posts: 20

Posted: Tue May 23, 2017 11:22 am

The big issue to me seems to be that Moore's law gets applied to technology as a whole, when in reality we've seen that's not been the case. Equally, I think it's a popular theory because laymen tend to overestimate the power of current computers relative to the human brain, due to the former being better suited to applications like mathematics.
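Just to illustrate how strong an assumption that extrapolation is, here's a rough back-of-the-envelope sketch in Python (the 2-year doubling period is just the usual rule-of-thumb figure, not a claim about any particular technology):

```python
# Rough illustration of Moore's-law style compounding:
# a quantity that doubles every `doubling_period` years.
def moores_law_growth(years, doubling_period=2):
    """Return the growth factor after `years` years."""
    return 2 ** (years / doubling_period)

# A 2-year doubling compounds to roughly a thousandfold in two decades.
print(moores_law_growth(20))  # 1024.0
```

That's the kind of curve that gets projected onto "technology as a whole" in singularity arguments, even though most fields (medicine, materials, energy) plainly don't compound like that.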
Freitag



Joined: 01 Sep 2008
Posts: 577
Location: Behind you

Posted: Wed May 24, 2017 11:59 am

I don't think we have to wait for high tech to get an Idiocracy.
_________________
People tend to look at you a little strangely when they know you stuff voodoo dolls full of Ex-Lax.


Powered by phpBB © 2001, 2002 phpBB Group