Technological singularity
The technological singularity is a hypothetical event in which an artificial general intelligence (for example, an intelligent computer, computer network, or robot) becomes capable of recursive self-improvement (progressively redesigning itself), or of autonomously building ever smarter and more powerful machines than itself, up to the point of a runaway effect, an intelligence explosion, that yields an intelligence surpassing all human control or understanding. Because the capabilities of such a superintelligence may be impossible for humans to comprehend, the technological singularity is the point beyond which events may become unpredictable or even unfathomable to human intelligence.
The term "singularity" was first used in this context by Stanislaw Ulam in his 1958 obituary for John von Neumann, in which he recounted a conversation with von Neumann about the "ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue". The term was popularized by mathematician, computer scientist, and science fiction author Vernor Vinge, who argued that artificial intelligence, human biological enhancement, or brain–computer interfaces could be possible causes of the singularity. Futurist Ray Kurzweil cited von Neumann's use of the term in a foreword to von Neumann's classic The Computer and the Brain.