
Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning Representations by Back-propagating Errors. Nature, 323, 533–536.

Backpropagation is an algorithm widely used in machine learning to train neural networks.
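As a minimal sketch of the idea (Python with NumPy; the task, network size, and learning rate are illustrative choices, not the paper's original presentation), the following trains a one-hidden-layer network on XOR by propagating error derivatives backward through the layers and taking gradient-descent steps on the weights:

import numpy as np

# Toy XOR task.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)   # input -> hidden
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)   # hidden -> output
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    # Forward pass
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)

    # Backward pass: apply the chain rule layer by layer, output to input
    delta_out = (y - t) * y * (1 - y)
    delta_hid = (delta_out @ W2.T) * h * (1 - h)

    # Gradient-descent updates
    W2 -= lr * h.T @ delta_out;  b2 -= lr * delta_out.sum(axis=0)
    W1 -= lr * X.T @ delta_hid;  b1 -= lr * delta_hid.sum(axis=0)

print(np.round(y, 2))  # outputs typically approach [[0], [1], [1], [0]]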

Rumelhart, Hinton, and Williams coined the term and described the algorithm in two 1986 publications, but the mathematics of the technique had been independently rediscovered many times, with predecessors dating back to the 1960s.

Neural networks became popular in the 2010s, as GPU hardware – better suited than CPUs to running backpropagation – became powerful and cheap. What followed was an abundance of applications in speech recognition, machine vision, natural language processing, and more, reviving excitement about machine learning and artificial intelligence.

Citation typeset in King's Caslon. Figure typeset in Neue Haas Grotesk.


Important publications in computer science, honored on the blockchain as piece-unique NFTs.

Contract Address: 0x495f...7b5e
Token ID:
Token Standard: ERC-1155
Chain: Ethereum
Metadata: Centralized
Creator Earnings: 10%
