Vitalik: I Support Focusing on Building Intelligence-Enhancement Tools for Humans Rather Than Creating Superintelligent Life
BlockBeats news, December 21: Ethereum co-founder Vitalik Buterin posted on social media: "My definition of AGI (Artificial General Intelligence) is an AI powerful enough that, if one day all humans suddenly disappeared and the AI were uploaded into robot bodies, it could continue civilization on its own. Obviously this definition is very hard to measure, but I think it captures the core intuitive difference in many people's minds between 'the AI we are used to' and AGI: the transition from a tool that constantly depends on human input to a self-sufficient life form. ASI (Artificial Superintelligence) is a different matter entirely. My definition is the point at which having humans in the loop no longer adds value to productivity (as in chess, where we actually reached this point within the past decade). Yes, ASI frightens me, and even AGI as I define it frightens me, because it carries obvious risks of loss of control. I support focusing our work on building intelligence-enhancement tools for humans, rather than building superintelligent life forms."