Abstraction is a concept familiar to programmers, and a term in common use. Abstraction is often discussed as a quality of code, but it can also describe a process of technology changing in a particular way over time. Gilbert Simondon, among others, offers the term concretization to describe a kind of anti-parallel process where technology components become more specific and effective over time, as designs evolve.

That introduction is pretty abstract. The changing design of loops offers some concrete examples.

A Turing machine has no loop instruction. You can make it loop – the potential is there, just as the potential to execute every computable program is there – but the machine itself doesn’t have a loop feature. It doesn’t even have a jump feature. All it does is change the direction of reading the tape when it reads certain values. When combined with the right program-dataset on the tape, this results in looping behaviour. It’s pretty indirect.
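To make the indirection concrete, here is a minimal sketch of a Turing machine simulator in Python. The `run` function, the tape encoding, and the `erase` table are all invented for illustration – this is not Turing’s own formulation – but it shows the point: the transition table contains no loop instruction, yet looping behaviour falls out of the machine re-entering the same state.

```python
# A minimal Turing machine simulator (all names invented for illustration).
# There is no loop instruction anywhere: looping emerges from the machine
# re-entering the same state as it moves along the tape.

def run(transitions, tape, state="start", pos=0, max_steps=100):
    """Step the machine until it reaches the 'halt' state."""
    tape = dict(enumerate(tape))  # sparse tape; blank cells read as '_'
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(pos, "_")
        state, write, move = transitions[(state, symbol)]
        tape[pos] = write
        pos += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Erase a run of 1s: the single 'start' state is re-entered on every 1,
# and that re-entry is the whole looping mechanism.
erase = {
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
run(erase, "111")  # -> "000_"
```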

Note that while the Turing machine is in some sense primitive – at least you would think so if you had to write Google Maps in it – it is also very abstract. It can be described simply and concisely; it has an infinite tape; it has very few tokens and operations. This also fits the history of what is basically Alan Turing’s paper prototype. This is at least the sense in which Simondon uses the word.

The abstract, or primitive, technical object is far from constituting a natural system. It is a translation into matter of an ensemble of scientific notions and principles that at the most basic level are unconnected one with the other and that are connected only by those consequences that converge for the production of a looked-for result. The primitive technical object is not a physical natural system but a physical translation of an intellectual system.
— Simondon, Mode of Existence of Technical Objects, Mellamphy trans., all quotes throughout

We’re used to thinking of the opposite of abstraction as redundancy and disorganization – copy/paste coding and the like. But there is also a sense in which abstracted code is under-specialized, like trying to drive in a nail with a brick.

Experience with actual hardware made apparent the usefulness of randomly accessible memory, rather than a tape that had to be read serially. With this hardware innovation, plus some other handy abstractions like program-data separation, constructions such as the conditional jump become possible. E.g., for the MIT Summer Session computer, which is fairly late at 1953, jip is “jump if positive”.

add 500 c
jip 200 b

Excuse me for eschewing a complete listing; I think most readers would be sold on compiled languages already.

In one sense conditional jump is a programming abstraction. Instead of the relatively convoluted series of instructions needed to traverse the tape in a Turing machine program, we have a single instruction. But note that instruction is also very directly supported in hardware, and is a specialized form of the earlier instructions. Simondon refers to this as differentiation.

Jump is versatile. With the program-data abstraction we can also add the simplifying mechanism of reading the next instruction by default, overlaying a chronology on the sequence of instructions in memory, when not executing a jump. Jump can be used for branching, when going forward in the chronological sequence, or looping, when going backwards (i.e. to the start of some previously executed block). And of course more complicated variants exist, which cryptographers and cryptic crossword buffs can enjoy puzzling over. Jump is almost too versatile; this is arguably Dijkstra’s point in saying goto should be considered harmful.
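As a sketch of how a backwards conditional jump makes a loop, here is a toy instruction machine in Python. The opcodes (`add`, `dec`, `jip`) and the memory layout are invented for illustration, loosely echoing the jip example above – this is not the MIT machine’s actual instruction set:

```python
# A toy instruction machine (invented opcodes, not a real instruction set).
# Instructions execute in address order by default; a single backwards
# conditional jump is all it takes to make a loop.

def execute(program, memory):
    pc = 0  # program counter: "read the next instruction by default"
    while pc < len(program):
        op, arg = program[pc]
        if op == "add":        # acc += memory[arg]
            memory["acc"] += memory[arg]
        elif op == "dec":      # memory[arg] -= 1
            memory[arg] -= 1
        elif op == "jip":      # jump to address arg[1] if cell arg[0] > 0
            if memory[arg[0]] > 0:
                pc = arg[1]
                continue
        pc += 1
    return memory["acc"]

# Sum cell 'x' into the accumulator 'n' times: the jip at address 2
# targets address 0, and that backwards jump *is* the loop.
memory = {"acc": 0, "x": 5, "n": 3}
program = [("add", "x"), ("dec", "n"), ("jip", ("n", 0))]
execute(program, memory)  # -> 15
```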

So, historically, uses of jump were differentiated further, into if-then-else, case statements, functions and loops. Functions or subroutines emerged first (they are visible in Hopper’s A-series compiled languages), and functional languages have their own interesting evolution, but I’m going to talk about explicit loops. These are apparent in John Backus’ FORTRAN.

      do 10 i = 1, n
         sum = sum + expenses(i)
  10  continue

It shows the residue of its jump-instruction origin most obviously in the ’10’ label. The block concept isn’t available yet, so we have this memory-addressing metaphor leaking through. Later versions introduce an enddo construct instead.

In C there are several loop constructs, in an uncharacteristically irresolute design. There are also a lot of ways to add one to something. Kernighan and Ritchie really wanted to loop across arrays a lot.

for (i = 0; i < expenseLen; i++) {
  sum += expenses[i];
}
This is pretty concise and expressive compared to its predecessors: it’s a construct dedicated to concisely repeating an operation on a collection.

C loops were dumped pretty much wholesale into C++, along with many other things, including classes, templates and so on. Object-oriented collections open up other opportunities for abstraction, too. B-trees and linked lists and sets are all possible in C, of course, but they can’t be abstracted very clearly behind a common idea of collections. This is possible in C++, as recognized in the Iterator pattern, captured in the original Gang of Four Design Patterns book.

[Figure: Iterator pattern diagram, from Design Patterns]

The Iterator pattern gives a common convention for loops across object collections. This includes collections like linked lists. Traversing a linked list performantly requires knowing the location of the previous node; this implementation detail can now be hidden inside the Iterator. The loop now no longer needs to compensate for the implementation of the collection.
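A sketch of that hiding in Python (the class and method names are invented, and Python’s iterator protocol stands in for the GoF interface): the loop asks the collection for an iterator and never touches a node or a next pointer.

```python
# A sketch of the Iterator idea: the linked list hands out an iterator,
# and the calling loop never sees nodes or links. Names are invented.

class Node:
    def __init__(self, value, next=None):
        self.value, self.next = value, next

class LinkedList:
    def __init__(self, *values):
        self.head = None
        for v in reversed(values):
            self.head = Node(v, self.head)

    def __iter__(self):
        # The traversal details (current node, following next pointers)
        # live here, hidden from the calling loop.
        node = self.head
        while node is not None:
            yield node.value
            node = node.next

expenses = LinkedList(10, 20, 12)
sum(expenses)  # -> 42, without the loop knowing it walked a linked list
```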

Differentiation is possible because this very differentiation makes it possible to integrate into the working of the whole – and this in a manner conscious and calculated with a view to the necessary result – correlative effects of overall functioning which were only partially corrected by correlative measures unconnected with the performance of the principal function.

There’s a similarity to jump loops though: this is something we do to the machine, not part of the machine’s structure. That only changed with the introduction of the Standard Template Library in 1994.

In Java, an iterator interface was baked in from day one (first as java.util.Enumeration, superseded by java.util.Iterator in Java 2), along with James Gosling and Bill Joy’s much greater willingness to see the libraries as a key part of the language. (This is also why you don’t see every Java program declaring its own String class, as was once common for C++ programs.) This is another concretizing design step. Iterators, exposed by key classes such as java.util.Vector, were very much in the pattern style from the GoF book, marred only by a pig-ugly downcast and a redundant recitation of type names like a stuttering monk coming off amphetamines.

Iterator expenseIter = expenses.iterator();
while (expenseIter.hasNext()) {
  Expense expense = (Expense) expenseIter.next();
  sum += expense.getExpenseValue();
}
The addition of (largely) Joshua Bloch’s collections framework deepened the platform in Java 2, based on the STL and well-understood data structures from textbooks like Cormen et al.’s Introduction to Algorithms. It also cemented the prevalence of Iterator. It was now something you could use to traverse collections in general, as they all came with the one standard mechanism, as well as Collection, List etc interfaces to allow differentiated implementations. The lack of computational complexity contracts in the documentation is the main glaring absence. Sun’s Java implementations always shipped with source code, at least, though the exclusion of automated unit tests now looks rather quaint.

Generics entered the language, rather belatedly, in Java 5, and with them the for loop was also extended, so we could stop using the downcasting chainsaw and cut back on the repetition.

for (Expense expense : expenses) {
  sum += expense.getExpenseValue();
}
With its digestion by the new for loop syntax, the Iterator stops being an everyday Java programmer pattern at this point. It still exists as a class, and you could perhaps call it a compiler programmer’s pattern. It is useful to know about, but it has to be looked for, and only really in circumstances when you’re trying to define new collections, or drilling into the implementation of an existing one.

The syntax starts to look a bit like Python, in fact, though you might choose an invocation of reduce() there; or in Odersky’s Scala, which is a deliberate evolution of Java to introduce functional programming elements.
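For comparison, here is that reduce() version in Python – the expense figures are made up, and functools.reduce plays the role of Scala’s foldLeft:

```python
# The same accumulation as the Scala foldLeft, written with Python's
# functools.reduce. The expense values are invented example data.
from functools import reduce

expenses = [10, 20, 12]
total = reduce(lambda total, expense: total + expense, expenses, 0)
# total == 42
```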

val sum = expenses.foldLeft(0) { (total, expense) =>
  total + expense.expenseValue
}
You can also use loop blocks as a kind of anonymous Iterator object, and these compose well with functions.

Finishing at Scala might seem to suggest that I think it is some perfected summit of looping, or that this is a thinly-disguised attempt to trick you into functional programming, because it’s just like FORTRAN, honest. This is a danger of using Simondon as a guide – he believes technology is teleological, that is, it progresses as if towards a goal. He is onto something, as this tour of looping shows, but that doesn’t mean every change is progress. Refinement also happens in the context of the other pieces of the machine, and even a well-refined machine is still vulnerable to obsolescence. Programming language syntax is also a complicated demonstration, because languages are user interfaces. Refactoring a piece of code within one existing language might yield simpler examples, but their familiarity could obscure the theoretical point; I find concretization harder to spot in snippets. Using an example with millions of commonly observed uses emphasizes the patterned nature of software design progress.

The technical object improves through the interior redistribution of functions into compatible unities, eliminating risk or the antagonism of primitive division. — Simondon
