Rumelhart, D. E., Hinton, G. E., & Williams, R. J. (1986). Learning Representations by Back-propagating Errors. Nature, 323, 533-536.
Backpropagation is an algorithm that is widely used in machine learning to train neural networks.
Rumelhart, Hinton, and Williams coined the term and described the algorithm in two 1986 publications, but the mathematics of the technique had been independently rediscovered many times, with predecessors dating back to the 1960s.
Neural networks became popular in the 2010s, as GPU hardware – better suited than CPUs to running backpropagation – became powerful and cheap. What followed was an abundance of applications in speech recognition, machine vision, natural language processing, and more, reviving excitement about machine learning and artificial intelligence.
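The idea in the paper can be illustrated with a small example. Below is a minimal sketch (not the paper's own code) of training a one-hidden-layer sigmoid network on XOR with NumPy: a forward pass computes the output, and the error derivative is then propagated backwards through each layer to update the weights. All names and hyperparameters here are illustrative choices, not taken from the original publication.

```python
import numpy as np

# Minimal backpropagation sketch: one hidden layer, squared-error loss,
# trained on the XOR problem. Hyperparameters are illustrative.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros(4)
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def loss(pred):
    return float(((pred - y) ** 2).mean())

initial_loss = loss(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2))

lr = 1.0
for step in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    # Backward pass: propagate the error derivative layer by layer,
    # using the sigmoid derivative s * (1 - s) at each layer.
    d_out = (out - y) * out * (1 - out)   # error signal at the output layer
    d_h = (d_out @ W2.T) * h * (1 - h)    # error signal at the hidden layer
    # Gradient-descent weight updates.
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)

final_loss = loss(sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2))
```

XOR is the classic example here because it is not linearly separable: the hidden layer must learn an internal representation, which is exactly the point the 1986 paper emphasizes.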
Citation typeset in King's Caslon. Figure typeset in Neue Haas Grotesk.