This is the term for a neural network with more than one layer of hidden units.
What is a deep network?
This is the influential great-grandparent of deep neural networks, a simple program invented by Frank Rosenblatt in the 1950s that makes a yes-or-no decision based on whether the sum of its weighted inputs meets a threshold value.
What is a perceptron?
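The clue above describes a concrete decision rule, which can be sketched in a few lines. This is an illustrative reconstruction, not Rosenblatt's original program; the function name and the AND example are my own choices.

```python
def perceptron(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs meets the threshold, else 0."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# Example: with these (hand-picked) weights and threshold, the unit
# computes logical AND of two binary inputs.
print(perceptron([1, 1], [0.6, 0.6], 1.0))  # -> 1
print(perceptron([1, 0], [0.6, 0.6], 1.0))  # -> 0
```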
Machine-learning researchers disparagingly referred to symbolic AI methods using this acronym, meaning "good old-fashioned AI."
What is GOFAI?
What we now call neural networks were, in the 1980s, generally referred to by this term. The idea is that knowledge in these networks resides in weighted connections between units.
What is connectionism or connectionist networks?
IBM's Watson program defeated human champions on this TV game show in 2011.
What is Jeopardy!?
The units in a network that simply communicate with other units rather than receiving external input or controlling output, also known as interior units.
What are hidden units?
Andy Clark summarized the nature of subsymbolic systems, which are better suited for perceptual or motor tasks, as "bad at logic, good at" this sport.
What is Frisbee?
In 2016, this AI program stunned viewers by defeating one of the world's best players in the game of Go.
What is AlphaGo?
Instead of a simple 1 or 0 decision, each unit in a multilayer network computes this value between 0 and 1.
What is activation?
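The graded-activation idea in the clue above can be sketched with the logistic sigmoid, one common way to squash a weighted sum into a value between 0 and 1 (the specific squashing function and bias term here are illustrative assumptions, not from the clue):

```python
import math

def activation(inputs, weights, bias=0.0):
    """Squash the unit's weighted input sum into (0, 1) with a logistic sigmoid."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# The result is always strictly between 0 and 1, never a hard yes/no.
a = activation([1.0, 0.5], [0.4, -0.2])
print(0.0 < a < 1.0)  # -> True
```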
This process consists of gradually modifying the weights on connections so that each output's error gets as close to zero as possible on all training examples.
What is learning in neural networks?
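The weight-adjustment process in the clue above can be sketched for a single sigmoid unit trained by gradient descent on squared error; the learning rate, epoch count, and the choice of learning OR are illustrative assumptions:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(examples, weights, lr=0.5, epochs=2000):
    """Repeatedly nudge each weight to shrink the output error on every example."""
    for _ in range(epochs):
        for inputs, target in examples:
            out = sigmoid(sum(x * w for x, w in zip(inputs, weights)))
            # Gradient of squared error for a sigmoid unit.
            delta = (target - out) * out * (1 - out)
            weights = [w + lr * delta * x for w, x in zip(weights, inputs)]
    return weights

# Learn logical OR; the constant 1 in each input acts as a bias.
data = [([0, 0, 1], 0), ([0, 1, 1], 1), ([1, 0, 1], 1), ([1, 1, 1], 1)]
w = train(data, [0.0, 0.0, 0.0])
outputs = [sigmoid(sum(x * wi for x, wi in zip(inp, w))) for inp, _ in data]
print([round(o) for o in outputs])  # -> [0, 1, 1, 1]
```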
This type of AI approach, exemplified by expert systems such as MYCIN, typically proved brittle because it relied on humans to explicitly define rules.
What is symbolic AI?
This two-volume treatise, often called the "bible of connectionism," was published in 1986 by David Rumelhart and James McClelland.
What is Parallel Distributed Processing?
Multilayer neural networks were dismissed as likely to be "sterile" following the 1969 publication of Perceptrons. However, this general learning algorithm for training such networks, developed in the late 1970s and early 1980s, definitively rebutted that prediction; its success became the foundation of modern deep learning.
What is back-propagation?
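The algorithm named above propagates each output error backward through the hidden layer to adjust every weight. A minimal sketch, assuming one hidden layer of sigmoid units trained on XOR (the network size, learning rate, epoch count, and random seed are my own illustrative choices):

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

random.seed(0)
n_in, n_hid = 2, 4
# One weight row per hidden unit; the extra column is a bias weight.
w_hid = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hid)]
w_out = [random.uniform(-1, 1) for _ in range(n_hid + 1)]

def forward(x):
    h = [sigmoid(sum(w * v for w, v in zip(row, x + [1.0]))) for row in w_hid]
    o = sigmoid(sum(w * v for w, v in zip(w_out, h + [1.0])))
    return h, o

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]
lr = 0.5
for _ in range(10000):
    for x, t in data:
        h, o = forward(x)
        # Error signal at the output, then propagated back to the hidden layer.
        d_out = (t - o) * o * (1 - o)
        d_hid = [d_out * w_out[j] * h[j] * (1 - h[j]) for j in range(n_hid)]
        for j, v in enumerate(h + [1.0]):
            w_out[j] += lr * d_out * v
        for j in range(n_hid):
            for i, v in enumerate(x + [1.0]):
                w_hid[j][i] += lr * d_hid[j] * v

print([round(forward(x)[1]) for x, _ in data])  # ideally [0, 1, 1, 0]
```

XOR is the classic example here because no single-layer perceptron can compute it, while a trained multilayer network can.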