
We're not going to die from AI.

Back in February 2023, Eliezer Yudkowsky painted a chilling and grim picture of a future where AI ultimately kills us all. Robin Hanson is here to provide a different perspective and explain why he believes Eliezer is wrong. It's not all sunshine and rainbows, though: we might just become their pets instead.

Most importantly, Robin highlights his concerns about AI regulation and why he believes it's a greater threat than AI itself.

This is the counter-argument to Eliezer’s infamous episode “We’re all going to die”.

Artwork generated from the quote [51:48 - 52:22]: “One of the most striking features of our world is the mechanisms we use to keep that peace and coordinate among those divergent and conflicting things. And one of the moves that AI people often make is to assume that AIs have none of that. AIs do not need to coordinate. They do not have conflicts between them. They don’t have internal conflicts. They don’t have any issues on how to organize and how to keep the peace between them. None of that is a problem for AIs, by assumption. They are just these other things that have no such problems. And of course that leads to scenarios like they kill us all.”

10% of proceeds go to Bankless DAO.


Contract Address: 0x7325...47bd
Token ID:
Token Standard: ERC-721
Chain: Ethereum
Last Updated: 1 year ago
Creator Earnings: 10%
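
Because the token is a standard ERC-721 on Ethereum, its metadata can also be read directly on-chain rather than through the marketplace page. Below is a minimal sketch, assuming ethers.js v6; the contract address, token ID, and RPC endpoint are placeholders (the full address is truncated above, and the token ID of 27 is only inferred from the item title).

```typescript
// Minimal sketch, assuming ethers.js v6 and a generic JSON-RPC endpoint.
// The contract address, token ID, and RPC URL are placeholders: the full
// address is truncated above (0x7325...47bd) and the token ID is not listed.
import { ethers } from "ethers";

// Read-only fragment of the standard ERC-721 interface.
const ERC721_ABI = [
  "function tokenURI(uint256 tokenId) view returns (string)",
  "function ownerOf(uint256 tokenId) view returns (address)",
];

async function readTokenMetadata(
  contractAddress: string,
  tokenId: bigint,
  rpcUrl: string
): Promise<void> {
  const provider = new ethers.JsonRpcProvider(rpcUrl);
  const nft = new ethers.Contract(contractAddress, ERC721_ABI, provider);

  // tokenURI points at the JSON metadata (name, description, image) for the artwork;
  // ownerOf returns the current holder of the token.
  const uri: string = await nft.tokenURI(tokenId);
  const owner: string = await nft.ownerOf(tokenId);

  console.log({ tokenId: tokenId.toString(), uri, owner });
}

// Hypothetical usage: substitute the full contract address and the verified token ID.
readTokenMetadata("0x<full-contract-address>", 27n, "https://rpc.example.org")
  .catch(console.error);
```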

We're Not Going to Die #27



