Superintelligence

Bostrom, N. (2017)

Given the premise of a superintelligence ...

Superintelligence: Any intellect that greatly exceeds the cognitive performance of humans in virtually all domains of interest.

... that forms a singleton ...

The initial superintelligence might obtain a decisive strategic advantage. This superintelligence would then be in a position to form a singleton.
Singleton: A world order in which there is, at the highest level of decision-making, only one effective agency.

... together with the orthogonality thesis ...

Orthogonality thesis: Intelligence and final goals are orthogonal axes along which possible agents can freely vary.
The orthogonality thesis suggests that we cannot blithely assume that a superintelligence will necessarily share any of the final values stereotypically associated with wisdom and intellectual development in humans.

... and the instrumental convergence thesis ...

Instrumental convergence thesis: We can identify "convergent instrumental values", subgoals that are useful for the attainment of a wide range of possible final goals in a wide range of possible environments - subgoals that are therefore likely to be pursued by a broad class of intelligent agents.

What are these instrumental subgoals? They are the AI drives identified by Omohundro.

... support the conclusion of an existential threat to humanity.

[Given a singleton, the orthogonality thesis, the instrumental convergence thesis] The outcome could easily be one in which humanity quickly becomes extinct.


Later, after weighing the above against other arguments, a point to consider is how "the singleton" relates to "the technological singularity":

Bostrom, N. (2017)

The singularity-related idea that interests us here is the possibility of an intelligence explosion, particularly the prospect of machine superintelligence.


Argument map

Premise 1: A superintelligence emerges - an intellect that greatly exceeds the cognitive performance of humans in virtually all domains of interest.
Premise 2: The initial superintelligence obtains a decisive strategic advantage and forms a singleton.
Premise 3 (orthogonality thesis): Its final goals need not include any of the values stereotypically associated with wisdom and intellectual development in humans.
Premise 4 (instrumental convergence thesis): It will likely pursue convergent instrumental subgoals (the AI drives identified by Omohundro).
Conclusion: The outcome could easily be one in which humanity quickly becomes extinct.