Eliezer Yudkowsky is a noteworthy transhumanist with a webpage at http://www.yudkowsky.net. He's also noteworthy for having made working towards the Singularity the entire focus of his life; he didn't even finish high school. He's developing a programming language, Flare, aimed at annotative programming: the idea is to let people write programs that an AI can easily understand, so that the AI can understand itself and hopefully reach a friendly Singularity through recursive self-improvement. However, it is unclear how much effort is going into the Flare project; he seems mostly focused on evangelism right now.
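Flare itself was an XML-based design that never reached a usable release, so I can't show real Flare code. As a loose illustration of what "annotative programming" means, here's a hypothetical Python sketch in which code carries machine-readable metadata about its own purpose that the running program can introspect (the annotate decorator and its metadata fields are invented for illustration, not anything from Flare):

    # A minimal sketch of the annotative idea: functions carry metadata
    # describing their purpose and invariants, and the program can read
    # those annotations about itself at runtime.
    from typing import Callable

    def annotate(**metadata):
        """Attach purpose/invariant metadata to a function (illustrative only)."""
        def wrapper(fn: Callable) -> Callable:
            fn.meta = metadata
            return fn
        return wrapper

    @annotate(purpose="compute the successor of an integer",
              invariant="result is exactly one greater than the input")
    def successor(n: int) -> int:
        return n + 1

    # The program inspecting its own components' stated purposes:
    for fn in (successor,):
        print(fn.__name__, "->", fn.meta["purpose"])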
The Singularity Institute's website, by the way, is at http://www.singinst.org/. Yudkowsky is one of the three founders of the group; see that writeup for more info. His noteworthy writings include Frequently Asked Questions about the Meaning of Life, Creating Friendly AI, and Levels of Organization in General Intelligence.
As regards Frequently Asked Questions about the Meaning of Life: he takes a frankly transhumanist view in that essay, but on the other hand he doesn't spare anyone's feelings, and he says nothing that I, and many others, do not see as logically justifiable. That's better than almost any religion you care to name. (In reply to comments by Ariels regarding every religion making claims that its followers see as justified, I'd point out that transhumanism, which Yudkowsky hardly invented, is non-supernatural in nature.)