Quote
"I think a highly rational person would have high moral uncertainty at this point and not necessarily be described as "altruistic"."
Wei Dai"I do have some early role models. I recall wanting to be a real-life version of the fictional "Sandor Arbitration Intelligence at the Zoo" (from Vernor Vinges novel A Fire Upon the Deep) who in the story is known for consistently writing the clearest and most insightful posts on the Net. And then there was Hal Finney who probably came closest to an actual real-life version of Sandor at the Zoo, and Tim May who besides inspiring me with his vision of cryptoanarchy was also a role model for doing early retirement from the tech industry and working on his own interests/causes."
Wei Dai is a computer engineer known for contributions to cryptography and cryptocurrencies. He developed the Crypto++ cryptographic library, created the b-money cryptocurrency system, and co-proposed the VMAC message authentication algorithm.
"I think a highly rational person would have high moral uncertainty at this point and not necessarily be described as "altruistic"."
"I think status is in fact a significant motivation even for me, and even the more "pure" motivations like intellectual curiosity can in some sense be traced back to status. It seems unlikely that [updateless decision theory] would have been developed without the existence of forums like extropians, everything-list, and LW, for reasons of both motivation and feedback/collaboration."
"I dont like playing politics, I dont like having bosses and being told what to do, I dont like competition, I have no desire to manage other people, so Ive instinctively avoided or quickly left any places that were even remotely maze-like."
"[By the way], since you are advising high school students and undergrads, I suggest that you mention to them that they can start being independent researchers before they graduate from college. For example I came up with my b-money idea (a precursor to Bitcoin) as an undergrad, and was also already thinking about some of the questions that would eventually lead to [updateless decision theory]."
"One solution [to the problem that high status might cause stupidity] that might work (and I think has worked for me, although I didnt consciously choose it) is to periodically start over. Once youve achieved recognition in some area, and no longer have as much interest in it as you used to, go into a different community focused on a different topic, and start over from a low-status (or at least not very high status) position."
"Does anyone not have any problems with taking ideas seriously? I think Im in this category because ideas like cryonics, the Singularity, [unfriendly artificial intelligence], and Tegmarks mathematical universe were all immediately obvious to me as ideas to take seriously, and I did so without much conscious effort or deliberation."