I recently read a great article on how computing is perceived today: as a natural science. Written by Peter J. Denning, director of the Cebrowski Institute for Innovation and Information Superiority at the Naval Postgraduate School in Monterey, CA, it appeared in Communications of the ACM, July 2007 (Volume 50, Issue 7).

I will quote a couple of great paragraphs from the article (I hope it is OK to do so):

“This acceptance of computing as science is a recent development. In 1983, Richard Feynman told his Caltech students: “Computer science differs from physics in that it is not actually a science. It does not study natural objects. Neither is it mathematics. It’s like engineering—about getting something to do something, rather than dealing with abstractions.” (Lectures on Computation, Addison-Wesley, 1996, p. xiii.)

Feynman’s idea was consistent with the computational science view at the time. Less than a generation later, his colleagues had come to see information processes as natural occurrences and computers as tools to help study them.

This is a striking shift. For a long period of time many physicists and scientists claimed that information processes are manmade phenomena of manmade computers.”

I like the next paragraph:

“The old definition of computer science—the study of phenomena surrounding computers—is now obsolete. Computing is the study of natural and artificial information processes. Computing includes computer science, computer engineering, software engineering, information technology, information science, and information systems.”

The article also covers (revisits) a top-level framework of seven (overlapping) categories of fundamental principles (the Great Principles of Computing) that apply across many technologies, from natural ones such as biology to artificial ones such as computers:

  • Computation (meaning and limits of computation);
  • Communication (reliable data transmission);
  • Coordination (cooperation among networked entities);
  • Recollection (storage and retrieval of information);
  • Automation (meaning and limits of automation);
  • Evaluation (performance prediction and capacity planning); and
  • Design (building reliable software systems).

From the article:

“We found that most computing technologies draw principles from all seven categories. This finding confirms our suspicion that a principles interpretation will help us see many common factors among technologies.”

One of the key points to take away from the article is that computation goes beyond artificial systems and computers: computing is part of everything and is found everywhere, from biology and DNA to social networking, physics, quantum computing, and so on.
