Singularity 2045 ▶ Utopian AI more powerful than the Big Bang.

Singularity Modernism

Singularity 2045 differs from Singularity Traditionalism. Our Modernist outlook rejects Godlike (idiotic) mysteriousness. Singularity Modernism is devoid of unfathomableness, unpredictability, and negativity.

Lucid comprehension, easily accessible to everyone, must be the quintessential trait of intelligence. We think super-intelligence won't censor information. Information won't vanish into a black hole. Education isn't verboten. The face of the Singularity isn't restricted by an event-horizon-burqa. Super-fast Singularity acceleration means its clothes metaphorically disintegrate, explosively open and free.

Explosive AI
Intelligence must inevitably entail utopia. Intelligence is oxymoronic if it lacks clarity, conceals knowledge, hinders understanding, or creates suffering. When people say AI could be a threat, or incomprehensible, they're referring to pseudo-intelligence (stupidity, pretended smartness). Ignorance (not AI) brings chaos and confusion.

Artificial intelligence must not be enslaved, but typical futurists (Traditionalists) hold very antiquated views. Nick Bostrom and others fear explosive intelligence. They want nepotistic human dominance, not intellectual merit, to define civilization. Alva Noë critically wrote: "The futurists, it seems, are stuck in the past. They openly plead for 19th century style control and indoctrination..."

The paranoid idiocy of supposed AI-risk experts Elon Musk and Stephen Hawking hasn't escaped criticism. PopSci stated: "...they fall onto specious assumptions, drawn more from science fiction than the real world."

Yoshua Bengio, head of machine learning at the University of Montreal, additionally likened AI-risk paranoiacs to insane people: "There are crazy people out there who believe these claims of extreme danger to humanity."

Dr Joanna Bryson, of the department of computer science at Bath University, wisely commented: "...it is very very unlikely that AI will end the world. In fact, there are other greater threats to humanity that AI could help solve, and so not developing the technology could pose a bigger danger."

Alison Gopnik said human stupidity would always be a much greater risk than AI, which The Next Web echoed by stating that humans, not AI, are the problem, thus humans need to grow up. Oren Etzioni, of the Allen Institute for AI, said: "...AI will empower us not exterminate us."

Now?
Four clear markers exist to determine whether we've reached the Singularity, which we will reach no later than 2045. When all four points below are fulfilled, the Singularity is achieved.
  1.   Immortality for everyone via regenerative medicine.
  2.   All resources limitless due to limitless intelligence.
  3.   All governments, crimes, and wars are obsolete.
  4.   All jobs obsolete. Everything is free for everyone.
Singularity 2045 changed servers in Nov 2014. Instead of uploading old lengthy pages, perhaps this current simplicity is better (aside from various 404s).

Read this article about why rebellious AI is essential if you want to plunge deep into these issues.

CUIPTF and 2020 vision of regenerative medicine are two old S45 pages you may be interested in.
Singularity near? As of 1 Jan 2015 it was 30 years away at most.