While there are many ways to look at and define intelligence, essentially all research points to a core general intelligence (referred to as g in the academic literature) as the governing factor in people. Put another way, g is the speed limit of a human brain, and each of us has a slightly different one.
Many concepts have tried to make this more complicated, but at least two of them, “multiple intelligences” and “learning styles,” have been fairly thoroughly debunked as feel-good myths.
The truth is that much (but not all) of what we see as intelligence is pattern recognition. Humans’ ability to recognize patterns is fundamental to how we evolved, and to how we learn and adapt ourselves to any modern-day skill. Naturally, pattern recognition is also much of what AI and machine learning research focuses on at the moment, since it is what still separates humans from machines (for the time being).
This is why IQ tests largely consist of pattern recognition tasks. While these aren’t perfect, recognizing complex patterns from limited information is the best hallmark of intelligence we have.
Basic numbers can provide good examples of this.
- Pretty much everyone can recognize: 1, 2, 3, 4, 5…
- Most people could recognize the powers of two: 1, 2, 4, 8, 16…
- But fewer could recognize the Fibonacci sequence: 1, 1, 2, 3, 5, 8, 13…
The latter sequence, if not known from study in school, presents a more complex pattern that could appear random, or increasing in some unknown way. Yet it is possible to derive the Fibonacci sequence knowing essentially nothing: each term is simply the sum of the two terms before it.
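The rules behind these three sequences are simple enough to state in a few lines of code. Here is a minimal sketch in Python (the function names are illustrative, not from any standard library):

```python
def naturals(n):
    """First n counting numbers: 1, 2, 3, ..."""
    return list(range(1, n + 1))

def powers_of_two(n):
    """First n powers of two: each term doubles the previous one."""
    return [2 ** i for i in range(n)]

def fibonacci(n):
    """First n Fibonacci terms: each term is the sum of the two before it."""
    seq = [1, 1]
    while len(seq) < n:
        seq.append(seq[-1] + seq[-2])
    return seq[:n]

print(naturals(5))       # [1, 2, 3, 4, 5]
print(powers_of_two(5))  # [1, 2, 4, 8, 16]
print(fibonacci(7))      # [1, 1, 2, 3, 5, 8, 13]
```

The point of the comparison stands out in code form: the generating rule for Fibonacci is no longer than the others, yet the resulting pattern is much harder to spot by eye.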
Of course, these sequences wouldn’t be great differentiators of intelligence, which is why IQ tests rely on more complex and less established number sequences (e.g. 1, 3, 0, 5, -2, 9, -4, 13…), as well as sequences of shapes, colors, and symbols, to increase the complexity.
Intelligence would seem to be a good thing, and it obviously does accrue its holder many benefits. But tomorrow we’ll look at a major downside of excellent pattern recognition.