Heavily based on the Science News article "The Future of Computing" by Matthew Hutson, Feb. 26, 2022.
Significant milestones
- 1833: "Analytical Engine" thought up, and partially built, by Charles Babbage: a general-purpose mechanical computer that would read in numbers, perform arithmetic on them, and print out a result. Never completed.
- 1843: Ada Lovelace comes up with the idea of a computer program, noting that numbers could represent things other than quantities, like musical notes.
- 1936: Alan Turing conceives of a computer that can rewrite its own instructions (the universal "Turing machine").
- 1943: Colossus, a vacuum-tube-based computer, built in Britain to break German wartime ciphers
- 1945: ENIAC, first programmable, general-purpose electronic digital computer, completed in the US
- 1945: John von Neumann describes the "von Neumann architecture": data and instructions stored in the same memory bank, separate from the CPU
- 1947: transistor invented at Bell Labs
- 1948: Claude Shannon publishes "A Mathematical Theory of Communication," founding information theory and getting people to use the word "bit" for "binary digit"
- 1951: Grace Hopper coins the word “compiler,” creates a sort-of compiler. (We would call it a linker today.)
- 1954: FORTRAN, first widely used programming language, invented at IBM. Still used today for fast calculations, as in the open source libraries ATLAS, BLAS, and LAPACK.
- 1955: Summer institute on "artificial intelligence" proposed at Dartmouth College in New Hampshire; the proposal coins the term.
- 1958-59: Texas Instruments and Fairchild Semiconductor independently invent the integrated circuit, in which multiple transistors and other circuitry are fabricated together on one chip.
- 1965: Gordon Moore says transistors per device will double every year.
- 1969: ARPANET, a precursor of the internet, launched by the US government agency ARPA
- 1975: Gordon Moore revises his estimate: transistors per device will double every OTHER year, now known as "Moore's Law." Basically right from 1970 to 2020 (see the sketch after this list).
- 1981: IBM personal computer and Microsoft’s MS-DOS released
- 1984: Macintosh, first commercially successful computer with a graphical user interface, released by Apple
- 1990: World Wide Web invented by Tim Berners-Lee at CERN in Switzerland
- 2005: First Arduino developed in Italy
- 2012: First Raspberry Pi developed in England
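
The Moore's Law entries above amount to a simple doubling formula: N(t) = N0 * 2^((t - t0) / 2). Here is a minimal sketch in Python; the 1970 baseline of roughly 2,000 transistors is an assumption chosen for illustration (about the scale of early microprocessors like the Intel 4004), not a figure from the article:

```python
# Moore's Law (1975 version): transistors per device double every two years.
#   N(t) = N0 * 2 ** ((t - t0) / 2)
# ASSUMPTION: the 1970 baseline of ~2,000 transistors is illustrative only,
# roughly the scale of early microprocessors like the Intel 4004 (1971).

def transistors(year: int, base_year: int = 1970, base_count: int = 2_000) -> float:
    """Projected transistor count per device under a two-year doubling rule."""
    return base_count * 2 ** ((year - base_year) / 2)

if __name__ == "__main__":
    for year in (1970, 1980, 1990, 2000, 2010, 2020):
        print(f"{year}: ~{transistors(year):,.0f} transistors")
```

For 2020 this gives about 2,000 * 2^25, or roughly 67 billion transistors, which is the right order of magnitude for the largest chips of that year, consistent with the "basically right, 1970 to 2020" note above.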