Monday, July 25, 2011

Before Python

This morning I had a chat with the students at Google's CAPE program. Since I wrote up what I wanted to say I figured I might as well blog it here. Warning: this is pretty unedited (or else it would never be published :-). I'm posting it in my "personal" blog instead of the "Python history" blog because it mostly touches on my career before Python. Here goes.

Have you ever written a computer program? Using which language?
  • HTML
  • Javascript
  • Java
  • Python
  • C++
  • C
  • Other - which?
[It turned out the students had used a mixture of Scratch, App Inventor, and Processing. A few students had also used Python or Java.]

Have you ever invented a programming language? :-)

If you have programmed, you know some of the problems with programming languages. Have you ever thought about why programming isn't easier? Would it help if you could just talk to your computer? Have you tried speech recognition software? I have. It doesn't work very well yet. :-)

How do you think programmers will write software 10 years from now? Or 30? 50?

Do you know how programmers worked 30 years ago?

I do.

I was born in Holland in 1956. Things were different.

I didn't know what a computer was until I was 18. However, I tinkered with electronics. I built a digital clock. My dream was to build my own calculator.

Then I went to university in Amsterdam to study mathematics and they had a computer that was free for students to use! (Not unlimited though. We were allowed to use something like one second of CPU time per day. :-)

I had to learn how to use punch cards. There were machines to create them that had a keyboard. The machines were as big as a desk and made a terrible noise when you hit a key: a small hole was punched in the card with a huge force and great precision. If you made a mistake you had to start over.

I didn't get to see the actual computer for several more years. What we had in the basement of the math department was just an end point for a network that ran across the city. There were card readers and line printers and operators who controlled them. But the actual computer was elsewhere.

It was a huge, busy place, where programmers got together and discussed their problems, and I loved to hang out there. In fact, I loved it so much I nearly dropped out of university. But eventually I graduated.

Aside: Punch cards weren't invented for computers; they were invented for sorting census data and the like before WW2. [UPDATE: actually much earlier, though the IBM 80-column format I used did originate in 1928.] There were large mechanical machines for sorting stacks of cards. But punch cards are the reason that some software still limits you (or just defaults) to 80 characters per line.

My first program was a kind of "hello world" program written in Algol-60. That language was only popular in Europe, I believe. After another student gave me a few hints I learned the rest of the language straight from its official definition, the "Revised Report on the Algorithmic Language Algol-60." That was not an easy report to read! The language was a bit cumbersome, but I didn't mind; I learned the basics of programming anyway: variables, expressions, functions, input/output.

Then a professor mentioned that there was a new programming language named Pascal. There was a Pascal compiler on our mainframe, so I decided to learn it. I borrowed the book on Pascal from the departmental library (there was only one book, and only one copy, and I couldn't afford my own). After skimming it, I decided that the only thing I really needed was the "railroad diagrams" at the end of the book that summarized the language's syntax. I made photocopies of those and returned the book to the library.

Aside: Pascal really had only one new feature compared to Algol-60, pointers. These baffled me for the longest time. Eventually I learned assembly programming, which explained the memory model of a computer for the first time. I realized that a pointer was just an address. Then I finally understood them.
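
(To make the point concrete with a tool I certainly didn't have back then: Python's ctypes module will happily show that a pointer really is nothing more than the address where a value lives. The names below are just for illustration.)

    import ctypes

    x = ctypes.c_int(42)        # an int stored somewhere in memory
    p = ctypes.pointer(x)       # a "pointer to x"
    addr = ctypes.addressof(x)  # the raw address of x, as a plain integer

    print(addr)                 # some platform-dependent integer
    # The pointer and the address are the same number.
    print(ctypes.cast(p, ctypes.c_void_p).value == addr)  # True
    print(p.contents.value)     # dereference the pointer: prints 42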

I guess this is how I got interested in programming languages. I learned the other languages of the day along the way: Fortran, Lisp, Basic, Cobol. With all this knowledge of programming, I managed to get a plum part-time job at the data center maintaining the mainframe's operating system. It was the most coveted job among programmers. It gave me access to unlimited computer time, the fastest terminals (still 80 x 24 though :-), and most important, a stimulating environment where I got to learn from other programmers. I also got access to a Unix system, learned C and shell programming, and at some point we had an Apple II (mostly remembered for hours of playing space invaders). I even got to implement a new (but very crummy) programming language!

All this time, programming was one of the most fun things in my life. I thought of ideas for new programs to write all the time. But interestingly, I wasn't very interested in using computers for practical stuff! Nor even to solve mathematical puzzles (except that I invented a clever way of programming Conway's Game of Life that came from my understanding of using logic gates to build a binary addition circuit).
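
(I no longer have that program, so what follows is only a rough sketch, in today's Python, of the kind of trick I mean: store each row of the board as the bits of one integer, and count all the neighbors of a whole row at once using exactly the XOR-and-AND logic of a binary adder.)

    # Bit-parallel Game of Life: each row is an int, bit x = cell in column x.
    WIDTH = 64
    MASK = (1 << WIDTH) - 1  # fixed row width; cells beyond the edge are dead

    def half_add(a, b):
        # Add two one-bit planes: XOR is the sum bit, AND is the carry bit.
        return a ^ b, a & b

    def full_add(a, b, c):
        # Add three one-bit planes: (sum, carry), like a hardware full adder.
        s, c1 = half_add(a, b)
        s, c2 = half_add(s, c)
        return s, c1 | c2

    def step(rows):
        # Compute one generation for a list of row integers.
        new_rows = []
        for y, cur in enumerate(rows):
            above = rows[y - 1] if y > 0 else 0
            below = rows[y + 1] if y + 1 < len(rows) else 0
            # The eight neighbor bit-planes of every cell in this row at once.
            n = [(above << 1) & MASK, above, above >> 1,
                 (cur << 1) & MASK,          cur >> 1,
                 (below << 1) & MASK, below, below >> 1]
            # Add the eight planes with adder logic to get the binary digits
            # of the neighbor count: ones, twos, and "four or more".
            s0, c0 = full_add(n[0], n[1], n[2])
            s1, c1 = full_add(n[3], n[4], n[5])
            s2, c2 = half_add(n[6], n[7])
            ones, c3 = full_add(s0, s1, s2)
            t0, d0 = full_add(c0, c1, c2)
            twos, d1 = half_add(t0, c3)
            fours = d0 | d1
            # Life rule: alive next turn iff exactly 3 neighbors,
            # or exactly 2 neighbors and already alive.
            new_rows.append(twos & ~fours & (ones | cur) & MASK)
        return new_rows

    if __name__ == "__main__":
        # A glider; watch it crawl down and to the right.
        board = [0b010, 0b100, 0b111, 0, 0, 0, 0, 0]
        for gen in range(4):
            print("generation", gen)
            for row in board:
                print("".join("#" if (row >> x) & 1 else "." for x in range(8)))
            board = step(board)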

What I liked most, though, was writing programs to make the lives of programmers better. One of my early creations was a text editor that was better than the system's standard text editor (which wasn't very hard :-). I also wrote an archive program that helped conserve disk space; it was so popular and useful that the data center offered it to all its customers. I liked sharing programs, and my own principles for sharing were very similar to what later would become Open Source (except I didn't care about licenses -- still don't :-).

As a term project I wrote a static analyzer for Pascal programs with another student. Looking back I think it was a horrible program, but our professor thought it was brilliant and we both got an A+. That's where I learned about parsers and such, and that you can do more with a parser than write a compiler.
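
(For a present-day illustration of that same idea -- the parser driving analysis rather than compilation -- Python's own ast module is enough to write a tiny checker. This is only a sketch, of course, not that old Pascal tool.)

    # Use the parser for analysis instead of compilation: flag functions
    # that take suspiciously many parameters (a toy "static analyzer").
    import ast
    import textwrap

    SOURCE = """
    def ok(a, b):
        return a + b

    def too_many(a, b, c, d, e, f, g):
        return a
    """

    LIMIT = 5
    tree = ast.parse(textwrap.dedent(SOURCE))
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef) and len(node.args.args) > LIMIT:
            print(f"line {node.lineno}: {node.name}() takes "
                  f"{len(node.args.args)} parameters")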

I combined pleasure with a good cause when I helped out a small left-wing political party in Holland automate their membership database. Until then it had been maintained by hand as a collection of metal plates into which letters were stamped using an antiquated machine not unlike a steam hammer :-). In the end the project was not a great success, but my contributions (including an emulation of Unix's venerable "ed" editor program written in Cobol) piqued the interest of another volunteer, whose day job was as a computer science researcher at the Mathematical Center (now CWI).

This was Lambert Meertens. It so happened that he was designing his own programming language, named B (later ABC), and when I graduated he offered me a job on his team of programmers who were implementing an interpreter for the language (what we would now call a virtual machine).

The rest I have written up earlier in my Python history blog.

23 comments:

Unknown said...

Guido--

As one of the Google CAPE instructors in Mountain View, I just wanted to thank you for your time today. These kids are smart, hard-working, and playful, and they give me hope for the future. Getting exposure to someone like you who has done amazing things is terrific and inspirational.

Regards,

Josh Paley

Unknown said...

I wonder how many of us who have used a key-punch machine (model 029 IBM) are still around and programming? Thanks by the way for Python :)

Ken Latta said...

Yes, 029s were it. I learned how to make autopunch cards. Programming in Fortran 2 for a Univac 1108 in 1966. Time was rationed in seconds per day of use, and the turnaround of submitting a card deck, getting your run, picking up the printout, and resubmitting meant maybe 3 or 4 runs per day, so you really did contemplate everything on that printout before repunching. I was a teenager, and 40-year-old professors would be on the next keypunch.

John Campbell said...

Present, HSM, but coding's a small part of the job these days. For this onetime math student born in 1956, a nice reminiscence by Guido.

Guido van Rossum said...

That's where I think I have a dream job. I still get to write lots of code!

Artie Gold said...

Long live the '56-ers!!!
[All right, I'm tail end, but...]

I certainly used an 029 in college (assembler on a card punch with a bad ribbon was an experience); in high school in NY in '71 we had 029s as well, but I think at least one 026, too (we were writing mostly FORTRAN for an 1130 -- we also used a stripped down PL/I in which ".," was used instead of a semicolon).

Ben Chun said...

Thanks again from CAPE SF for speaking to our students today! It was great to meet you.

It's always cool to learn how different people learned to program. I made a site where you can share your story: ilearnedtoprogram.com

samwyse said...

You're apparently a year younger than I; your photo makes you look somewhat younger, making me jealous. ;)

I've written several languages, if you'll permit some fairly domain-specific ones. One of the first was an interpreter written in COBOL on an IBM System/3, for what we'd now call a spreadsheet. You'd give it a bunch of cards with cell addresses and formulas, and it would print out a balance sheet. That way I didn't have to recompile the COBOL every time a number changed, which was a big time saver. If I'd ported it to a PC, I'd have invented VisiCalc!

Ned Deily said...

Like many CDC mainframe sites, my university used the ancient and less expensive 026 keypunches, which only had about 48 characters available via keys. Although you could produce the remaining 18 characters by "multi-punching" (What? You need more than 64 characters?), it was very tedious to remember the sequences needed to produce the semicolons, colons, and square brackets needed for Algol-60 programs. Fortunately, the CDC compilers allowed alternate representations for these characters: for example, the assignment operator := (colon equals) could be punched as ..= (period period equals). While it simplified the keypunching, it made for really ugly-looking programs. But it worked.

Unknown said...

Could you elaborate on your "principles for sharing" (or link to somewhere if you already have :)?

Mike said...

Ahh Happy memories! Born in 1955, my experience in the UK almost exactly matches yours, except I went from FORTRAN on a KDF9 to Algol 68R - try reading THAT report! - before getting to Algol 60, and then Pascal...

Mike

Ian said...

I still use FORTRAN today. For numerically demanding problems it is often 20-30 times faster than Python. I'm not old enough to remember punched cards, though!

Stephen said...

I first learnt on punch card equipment (IBM, 1965): sorters, tabulators, etc. (026 punch, or one manual punch whose name I've forgotten, where you needed to know all the combinations and punch by finger pressure). Then, following fashion: machine code, assembler, PL/I, Pascal, C, C++ (ugh), Java, C#, .... Odd bits of RPG, Lisp, Fortran, Prolog, and even Python along the way. The implementation of our first large relational data store was based partly on the way the sorters and tabulators worked.

Stephen Todd

JimfromIndy said...

I was born in 1954. My college career, in the US, went a little like yours. My first language was FORTRAN. I was VERY good at FORTRAN.

My first academic language (in 1973) was also Algol. We were tasked with writing an Algol compiler IN ALGOL. It was daunting, and I'm not sure I even finished it.

I enjoyed programming so much, I became the guy who sat at the "insultant's" desk and told you why your FORTRAN program wasn't working.

Then came assembly (PDP-8) and then the world of IBM (COBOL and Assembler). I was very good at COBOL, and passable at assembler.

After several years of COBOL programming, I left programming and became a manager, then a non-IT manager.

Now, I'm back in it as a consultant. Things have certainly changed. I don't know what Python is, but I look forward to learning more. It's that or die!

Jim

telic_progression said...

It is a good thing that you find reason to be so optimistic about tomorrow's weltanschauung of engineering as applied to programming, because I really still have my doubts.

Having the basis for ingenuity is a great thing for kids, and I can't really ever see that going away, but I really think that programming as a discipline is becoming more a means to an end rather than an end in and of itself. And, for that reason, you will find less interest in really seeing how things work 'under the hood'.

It makes me think of Star Trek: TNG and programming the holodeck. Obviously, these "engineers" are not fooling around with loops and exceptions. But they are still coming up with physics formulas for how to teleport into a moving starship traveling at light speed.

We shall see.

Anonymous said...

Ah yeah... those were the times. Well, I started programming 29 years ago, using Basic on a VIC-20. Later, assembler on the (amazing, I must say) CBM Amiga, having already been through the Amstrad, the Atari, and the Spectravideo SVI-328 (the latter was pretty cool too, btw).

I started to develop something on the Amiga that resembled what the .NET platform is today (I posted about it in the Amiga newsgroup at the time, at the beginning of the '90s, which can verify it), but never completed it. Today I'm stuck with... um, .NET, on Windows.

Fortunately I do other things too, such as graphics and 3D animation.

You kind of smile looking at all the new terms and techniques of today, knowing how things actually work under the hood. There lies a lot of power in that. :-)

Guido van Rossum said...

Um, so where does this flurry of comments suddenly come from? Did someone post a link to this page on a list of old-timers? :-)

RHodnett said...

Guido, the CodeProject Daily Developer News issue for Jan. 17, 2012 linked to your article.

retnz said...

Ha, the old days. I got my Sinclair at the age of 15 and never stopped since: learned assembler and programmed games like a maniac! Then the usual several flavours of Basic, passing by all kinds of Xbase languages, then Pascal, and then I got caught up in the Windows tools and became an addict; these days .NET C# is where I live :). To be very honest, my best bet was to learn assembler and code a lot with it; every single language after that was damn easy ;)

Guido van Rossum said...

Re: my principles for sharing. I like sharing code, because I like my code to be useful. I don't care about being paid for my code, since I wrote it for my own satisfaction. I don't care much(*) about others making commercial use of my code. I do care about others claiming they wrote it.

(*) I would care if someone took what I wrote, added no value of their own, and made a million bucks selling it without revealing they were just repackaging. But that's pretty unlikely.

Anonymous said...

> But punch cards are the reason that some software still limits you (or just defaults) to 80 characters per line.

I am not absolutely certain about it, but I believe there are many good reasons to limit the width of the column. Classical typesetters (see https://www.tug.org/TUGboat/tb19-1/tb58tay1.pdf for a lot of background) discovered that text is easiest to read when it is set in columns around 60 characters wide on average. Yes, it is possible that with program code, which has way more whitespace, the column can be a bit wider, but I don't think that much above 80 columns brings more in terms of readability.