Network Architecture
What Makes the Network Intelligent
Clash of AIs
Connectionism
The Ascent of Machine Learning
100

This is the term for a neural network that has more than one layer of hidden units

What is a deep network?

100

This is the influential great-grandparent of deep neural networks, a simple program invented by Frank Rosenblatt in the 1950s that makes a yes-or-no decision based on whether the sum of its weighted inputs meets a threshold value.

What is the perceptron?
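For players who want to see the rule in action: Rosenblatt's decision can be sketched in a few lines of Python. The weights and threshold below are purely illustrative, not taken from any historical model.

```python
def perceptron(inputs, weights, threshold):
    """Rosenblatt's rule: output 1 (yes) if the weighted sum of the
    inputs meets the threshold, otherwise 0 (no)."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights))
    return 1 if weighted_sum >= threshold else 0

# Illustrative configuration: with these weights and threshold,
# the unit behaves like a logical AND of its two inputs.
perceptron([1, 1], [0.6, 0.6], 1.0)  # -> 1  (1.2 >= 1.0)
perceptron([1, 0], [0.6, 0.6], 1.0)  # -> 0  (0.6 <  1.0)
```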

100

Machine-learning researchers disparagingly referred to symbolic AI methods using this acronym, meaning "good old-fashioned AI"

What is GOFAI?

100

What we now call neural networks were, in the 1980s, generally referred to by this term. The idea is that knowledge in these networks resides in weighted connections between units.

What is connectionism or connectionist networks?

100

IBM's Watson program defeated human champions on this TV game show in 2011

What is Jeopardy!?

200

The units in a network that communicate only with other units, rather than receiving external inputs or controlling outputs; also known as interior units

What are hidden units?

200

Andy Clark summarized the nature of subsymbolic systems, which are better suited for perceptual or motor tasks, as "bad at logic, good at" this sport

What is Frisbee?

200

In 2016, this AI program stunned viewers by defeating one of the world's best players in the game of Go.

What is AlphaGo?

300

Instead of a simple 1 or 0 decision, each unit in a multilayer network computes this value between 0 and 1.

What is activation?
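To make the contrast with the perceptron's hard yes-or-no concrete, here is a minimal sketch of a unit's activation using the sigmoid squashing function (one common choice; the function name and bias parameter are illustrative assumptions, not from the source).

```python
import math

def unit_activation(inputs, weights, bias=0.0):
    """Instead of a hard 1-or-0 decision, squash the weighted sum
    through a sigmoid to get an activation strictly between 0 and 1."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# A weighted sum of exactly zero lands in the middle:
unit_activation([0, 0], [0.5, 0.5])  # -> 0.5
```

Large positive sums push the activation toward 1, large negative sums toward 0, but it never quite reaches either extreme.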

300

This process consists of gradually modifying the weights on connections so that each output's error gets as close to zero as possible on all training examples.

What is learning in neural networks?

300

This type of AI approach, exemplified by expert systems such as MYCIN, typically proved brittle because it relied on humans to explicitly define its rules.

What is symbolic AI?

300

This two-volume treatise, often called the "bible of connectionism," was published in 1986 by David Rumelhart and James McClelland.

What is Parallel Distributed Processing?

400

Multilayer neural networks were dismissed as likely to be “sterile” in 1969, following the publication of Perceptrons. However, this general learning algorithm for training such networks was developed in the late 1970s and early 1980s, definitively rebutting that speculation; its success later became the foundation of modern deep learning.

What is back-propagation?
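As a final study aid, the idea in the clues above, gradually nudging weights so the output error shrinks, can be shown as a toy back-propagation loop. This is a minimal sketch, not Rumelhart et al.'s exact formulation: a tiny 2-2-1 sigmoid network trained here on the OR function (chosen over the classic XOR so this short demo converges reliably; the network sizes, learning rate, and epoch count are arbitrary illustrative choices).

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_net(epochs=3000, lr=0.5, seed=0):
    """Back-propagation in miniature: propagate each output error
    backward through the chain rule and nudge every weight downhill.
    Returns the total squared error after each epoch."""
    random.seed(seed)
    w_h = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]  # hidden weights
    b_h = [0.0, 0.0]                                                    # hidden biases
    w_o = [random.uniform(-1, 1) for _ in range(2)]                     # output weights
    b_o = 0.0
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]         # logical OR
    errors = []
    for _ in range(epochs):
        total = 0.0
        for x, target in data:
            # Forward pass: inputs -> hidden activations -> output.
            h = [sigmoid(x[0] * w_h[j][0] + x[1] * w_h[j][1] + b_h[j]) for j in range(2)]
            y = sigmoid(h[0] * w_o[0] + h[1] * w_o[1] + b_o)
            total += (y - target) ** 2
            # Backward pass: output error, then each weight's share of it.
            d_y = (y - target) * y * (1 - y)
            for j in range(2):
                d_h = d_y * w_o[j] * h[j] * (1 - h[j])
                w_o[j] -= lr * d_y * h[j]
                b_h[j] -= lr * d_h
                for i in range(2):
                    w_h[j][i] -= lr * d_h * x[i]
            b_o -= lr * d_y
        errors.append(total)
    return errors
```

Plotting the returned error list shows the training signal the clue describes: the total error falling toward zero as the weights are adjusted, epoch after epoch.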