There is a lot of discussion on the Internet today about the Raspberry Pi. This is a project about which I am deeply ambivalent. On the one hand, exposure to computing at an early age was enormously influential on my life. I was fortunate enough to have a BBC Micro, a machine that was, and indeed still is, extremely capable† and easy to program. Not just in the sense that BBC Basic was an excellent language: the entire machine was easy to understand – you could maintain a pretty good mental model of “where” everything was in its 32K of RAM, and what happened when and why. Everyone wrote their own simple games (today’s Angry Birds is just a modern twist on the artillery game, perfectly doable by a keen 10-year-old in those days). Whether on the BBC or the C64 or whatever, dabbling in BASIC programming was not unusual, even for kids who mainly played games, and using a home computer mainly for programming wasn’t unusual either. Magazines had annotated listings to type in, and hardware projects interfacing with or even modifying machines. The schoolchildren of the 80s made the UK the software powerhouse it is today.
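For a sense of how little an artillery game actually demands, here is a minimal sketch in Python rather than BBC Basic – purely illustrative, with the function names and the simplified no-drag ballistics being my own assumptions:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def shot_range(angle_deg, velocity):
    """Horizontal distance travelled by a shot, ignoring drag and wind."""
    return velocity ** 2 * math.sin(2 * math.radians(angle_deg)) / G

def hits_target(angle_deg, velocity, target_distance, tolerance=1.0):
    """Did the shot land within `tolerance` metres of the target?"""
    return abs(shot_range(angle_deg, velocity) - target_distance) <= tolerance

# A 45-degree shot maximises range for a given power.
print(round(shot_range(45, 30), 1))
```

The whole game loop is just this calculation plus player input and a score – exactly the sort of thing a keen 10-year-old could type in from a magazine listing.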
A lot of that was lost in the intervening years. It was certainly possible to program the “next generation” of 16-bit home computers, the Atari ST and Commodore Amiga, of course; it just wasn’t what they were for. The creators of the BBC went to great lengths to include a fantastic BASIC dialect, whereas ST Basic was notoriously bug-ridden and Devpac was a third-party product you had to pay for. I don’t know so much about the Amiga world, but on the ST the barriers to entry for programmers were certainly higher. Not insurmountably so, for one such as me who had grown up on the BBC and took programmability for granted, but I wonder how accessible I would have found it had it been my first machine. Certainly there was more to learn in order to produce, as on the BBC, a “professional”-looking program – one that would operate with GEM, for example. But whether it was truly an unexpected emergent property of the more advanced machines, or just the general zeitgeist, programming fell out of the mainstream. Games became the normal use, and consoles such as the Sega Megadrive replaced 8-bit micros in some households.
Since then (fast-forwarding over most of the ’90s and all of the ’00s) programming has simultaneously become easier and less accessible. How is this possible? Abstraction: moving further from the machine, placing more and more layers between the programmer or user (once those terms were nearly synonymous) and the machine. It is very easy for a user to write a macro in VBA that gets a lot of useful work done, and I would never advocate taking this type of programming away for that reason. But it encourages thinking of the machine as a “black box”; it becomes difficult to reason about what it is actually doing, and that in turn discourages the very powerful mindset – that it’s all just code, all the way down – that is needed to be an actual programmer. This is not necessarily intended as a value judgement; for many people computers are just tools, and that’s fine – for them the macro-style approach is highly productive.
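That mindset is still easy to demonstrate today: Python’s standard dis module, for instance, lets you peel back one layer and see the bytecode the interpreter actually runs for an ordinary function. A minimal sketch – the function itself is just an illustration:

```python
import dis

def add(a, b):
    return a + b

# Disassemble: the "black box" turns out to be a short list of
# bytecode instructions (load the arguments, add them, return).
for instruction in dis.Bytecode(add):
    print(instruction.opname)
```

The exact opcode names vary between Python versions, but the point stands: each layer of abstraction is inspectable rather than magic, and the machine underneath is still just code.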
But someone has to make the tools, and the question is: is the Raspberry Pi going to nurture a new generation of tool-makers? Abstraction is useful because it lets one work without repetitive detail and focus on the problem domain. But I argue that abstraction should only be introduced once the fundamentals are understood. Learning on a machine like the BBC teaches not so much the details, which become obsolete, as the concepts (e.g. I do my real work in Python on Linux on x64, not BBC Basic on a Model B, but I use dis and GDB regularly). The Raspberry Pi has video, ethernet, USB and 256M of RAM. It comes with a GUI and a web browser. It has more in common with a PlayStation than with a BBC Micro (and I make the same criticism of the OLPC). I know a lot of people of my age are excited about getting one and using it as a cheap embedded controller, like an Arduino. But I don’t think it’s a useful teaching tool – or at least, no more useful than a common PC. For that, you’d want something like a FIGnition. I honestly don’t know why that project has been relegated to the sidelines while the Pi gets all the press.
† Thought experiment: if you had a BBC clocked at 2GHz, what “real work” could you not do on it? What would you need to add? What about a BBC with a 65816 instead of a 6502, giving it 16M of RAM and ADFS access to modern storage – but fundamentally the same OS, the same language(s), the same model of computation: switch it on, BASIC > prompt and off you go? I am struggling to think of anything…
Amazing! Your thought experiment was already realised for the Commodore 64, for which the SuperCPU is available, with up to 16MB RAM (usable for programs and as a RAM drive) and a WDC 65816 at 20MHz.
I never tried it, because it was too expensive for me back when it came out, which was sometime in late 1995 or early 1996 (it was discussed in http://www.ffd2.com/fridge/chacking/c=hacking12.txt in March 1996). According to Wikipedia (http://en.wikipedia.org/wiki/Commodore_64_peripherals), there are still peripherals being developed for the C64 – so why not for the BBC?
BTW, I recently found out about the Arduino, and now about the Raspberry Pi. Stuff like this gets me excited, even if I would only get as far as playing with it together with my kids.
There is something similar in the BBC world too: http://www.sprow.co.uk/bbc/armcopro.htm
If you are interested in Pi, also check out FIGnition, which to my mind is far truer to the original vision of a powerful yet simple device as those old 8-bit micros: https://sites.google.com/site/libby8dev/fignition
I totally forgot the projects by Chuck Moore involving MISC – the Minimal Instruction Set Computer: the MuP21 and F21 (links are on the page mentioned below). A cheap CPU with video out, running Forth. It really got me excited when I started using (or getting used to) Forth.
More stuff like that: http://www.ultratechnology.com/chips.htm
I can’t agree with you more. After ordering one and switching it on, it crossed my mind: what does this do that a Windows laptop doesn’t? I’m having trouble getting my son’s primary school interested in it, which is a shame, as I was preparing to donate it to them and help get them started. For hackers like us it’s wonderful, and it’ll probably make a good BeagleBone replacement. But without lots of I/O (enough to make and program a robot, or some Christmas lights, or something) there’s still no compelling reason for schools to use one.