In transhumanist literature the term "transcend" is used specifically to mean the process, or the interval, in which an intelligent entity attains an intelligence level that is orders of magnitude greater than that of a human. Two predominant modes are encountered: the fast transcend and the slow transcend.
The fast transcend is the scenario envisioned by Vernor Vinge and Eliezer Yudkowsky. As soon as some entity achieves greater-than-human intelligence, it will be able to create an entity smarter than itself, which will in turn create an entity smarter still, until, a very short time later, the intelligence of the entities then in existence becomes effectively infinite. In this strong form the scenario is usually referred to as the Singularity.
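A minimal sketch of why this argument yields a finite arrival time, assuming (purely for illustration; neither Vinge nor Yudkowsky specifies this) that each generation designs its successor a constant factor faster than the last: if generation $n$ takes time $t_n = t_0 r^n$ to build generation $n+1$, with $0 < r < 1$, then
\[
  T_{\text{total}} = \sum_{n=0}^{\infty} t_0 r^n = \frac{t_0}{1 - r} < \infty ,
\]
so infinitely many successive generations, and hence unbounded growth in intelligence, fit inside the finite interval $T_{\text{total}}$. Under any slower speedup (for example, constant $t_n$), the same recursion still produces ever-smarter entities but no finite "arrival" date.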
The slow transcend postulates nearly the opposite scenario. Rather than implicitly assuming that the hardest problems in AI lie in attaining human-level intelligence, it holds that there may be a natural bottleneck in moving intelligence much beyond the human level. (This has a certain elegance to it, as it quite handily explains why we are exactly as intelligent as we are.) This is the view of transcension put forth by Max More: incremental progress on the road to superintelligence.