
Computational cognitivism

George Miller, Jerome Bruner, Ulric Neisser, Herbert Simon
Era: Second half of the 20th century · 1956
Region: North America · United States
Discipline: Psychology

Explanation

Computational cognitivism emerged in the 1950s-60s as a reaction to behaviourism and in parallel with the development of computing. Its central hypothesis: the mind can be understood as an information-processing system, analogous to a computer. Thinking is manipulating internal representations according to formal rules. Perceiving, remembering, reasoning, deciding and speaking are different types of symbolic processing that can be modelled by algorithms.

Key figures were Herbert Simon and Allen Newell (with their production systems and the General Problem Solver), George Miller (short-term memory capacity of "7 ± 2" items), Ulric Neisser (the field-defining textbook Cognitive Psychology, 1967), Noam Chomsky (generative grammar) and Jerry Fodor (The Language of Thought, 1975). All shared the idea that explaining behaviour requires positing internal representations and computations, not just environmental associations.
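The production-system idea behind Newell and Simon's work can be sketched minimally: a working memory of facts plus condition-action rules that fire whenever their conditions are satisfied. The sketch below is a toy illustration of that symbolic, rule-driven style of processing, not a reconstruction of any historical program; the rule contents and fact names are invented for the example.

```python
def run(rules, memory, max_cycles=10):
    """Forward-chaining toy production system: repeatedly fire the
    first rule whose conditions all hold in working memory, adding
    its action as a new fact, until quiescence."""
    for _ in range(max_cycles):
        fired = False
        for conditions, action in rules:
            if conditions <= memory and action not in memory:
                memory.add(action)   # rule fires: new fact enters memory
                fired = True
                break
        if not fired:                # no rule applies: stop
            break
    return memory

# Hypothetical rules decomposing a goal into subgoals
rules = [
    ({"has_goal(boil_water)"}, "subgoal(fill_kettle)"),
    ({"subgoal(fill_kettle)", "kettle_filled"}, "subgoal(heat_kettle)"),
    ({"subgoal(heat_kettle)"}, "water_boiled"),
]

memory = run(rules, {"has_goal(boil_water)", "kettle_filled"})
```

Each cycle matches conditions against working memory and fires one rule; cognition, on this view, is just many such match-fire cycles over symbolic representations.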

The metaphor of mind as computer is productive. It allows experimental modelling of phenomena such as memory, attention, perception and decision-making. It allows building programs that simulate cognition (classical AI, expert systems). It has inspired models of cognitive architecture (ACT-R, SOAR) that integrate memory, learning and problem solving in unified systems. Cognitive psychology is today the dominant current in experimental research on the mind.

Regarding consciousness, cognitivism adopts a functional approach: consciousness is a certain type of information processing with particular properties (global availability, reportability, monitoring). Baars's global workspace theory belongs to this tradition. The hard problem (why there is subjective experience at all) tends to be considered derivative or illusory: once function is explained, experience would be part of that function, with no additional explanatory residue.

Critiques come from several fronts. Phenomenologists and philosophers (Dreyfus, Searle, Chalmers) argue that neither the semantics of language nor subjective experience can be reduced to the syntactic manipulation of symbols. Searle's famous Chinese Room argument illustrates the distinction between following rules and understanding. Embodied, enactive and situated approaches to cognition have challenged the idea that thinking is disembodied processing, pointing to the importance of body and environment.

Despite these critiques, cognitivism remains the reference framework in experimental psychology and in many areas of cognitive neuroscience. Computational models have become more sophisticated (deep neural networks, Bayesian cognition, predictive processing) and coexist with embodied and dynamicist approaches. The great open question is whether information processing alone is enough to explain consciousness, or whether there is a residue that requires additional theories.

Strengths

  • Restores the scientific study of mental processes after the behaviourist winter.
  • Quantitative, experimentally testable models.
  • Fruitful dialogue with AI, linguistics and neuroscience.
  • Basis of the global workspace theory.

Main critiques

  • The computer metaphor can be misleading: the brain is not a symbolic PC.
  • Difficulty integrating emotion, body and context.
  • Enactivist critique: cognition is not the processing of internal representations, but dynamic coupling.
  • The hard problem persists: why should computation feel like anything from the inside?

Connections with other theories