Thursday, September 20, 2007

The Rise of Modern Morality

Every so often I read a letter in a newspaper or some other forum about "the decline of modern morality", in which the writer laments the failing moral standards of private and public life. I'm old enough to have been reading these articles for two or three decades now, and I've seen some samples of the same genre from past times. Elvis Presley caused a moral panic in 1956 by gyrating his hips on stage, and this was cited by commentators at the time as corrupting the morals of young people. Much has been written about the sociology of moral panics and I don't propose to repeat it here. Instead I want to argue that, far from declining, modern morality is actually superior to moralities of the past (and I use the plural deliberately).

  • In 1970 in the UK it was not only legal but widely accepted practice to pay a woman less than a man for doing the same job. Women who wanted a life outside of homemaking and child-rearing were seen as aberrant, and often regarded with scorn.
  • In 1960 in America many parts of the country segregated public facilities by race, with black people consistently and blatantly short-changed. This invidious system was widely supported by prominent politicians and churchmen.
  • Even into the 1970s in Australia, Aboriginal children were forcibly removed from their parents and placed in institutions where they were denied a proper education and were terribly vulnerable to physical and sexual abuse. Again, this programme was considered perfectly proper and moral by the standards of the time.
  • There was a standing assumption for many years that unmarried mothers should immediately give up their children for adoption. In Ireland they were also incarcerated in the "Magdalene Laundries". Other countries had similar systems. Strangely, the fathers were left to go free.
  • For most of the 20th century, when it became necessary to remove children from their parents because of neglect or abuse, little thought was given to keeping siblings together: they would be split up to suit the convenience of potential adopters or fosterers.
  • Until recent years in most Western countries, homosexuals were persecuted and discriminated against, both by the law and by society at large.
  • Until the 1990s in the UK, drink-driving was considered a minor peccadillo. Drivers convicted of the offence were more likely to meet with sympathy over the supposedly unfair attitude of the legal system than with censure for their reckless disregard for the safety of others.
Not all of these evils are completely gone, but in all cases there has been a substantial moral shift. I know that some readers will look at parts of this list, especially the tolerance of homosexuality, and regard it as evidence of moral decay rather than ascent. But that brings up another issue I have with the prophets of moral decay: their belief in moral absolutism, the conviction that right and wrong are entirely self-evident, and that any deviation stems from a lack of morality rather than from a genuine disagreement over what the moral course is.

The problem with moral absolutism is that (in its commonest form) it asserts that the perfect moral code has already been revealed, and that the only possible improvement is closer adherence to that code. I, by contrast, describe myself as a Liberal Utilitarian (although I confess I haven't actually read any of Mill's work). Hence I see "the greatest good of the greatest number" as a basic moral principle, while at the same time acknowledging that there may be a lot of disagreement about exactly what that means and how to bring it about. Liberal utilitarianism recognizes no absolute moral imperative apart from the general principle that life, liberty and the pursuit of happiness are good things. By retaining that anchor, however, it avoids the charge often leveled by the Absolutists that those who question their moral code are "relativists" who believe that any moral code is as good as any other. Liberal utilitarianism examines the impact of moral rules on the real people affected by them. If different rules would have a better outcome then those rules are automatically better. I see the arguments (both past and ongoing) about freedom, equal opportunities and tolerance as part of this process, and I see a steady improvement through history. Those who object to the current crop of improvements in morality because they contradict their particular absolutist code would do well to read their history books.

Friday, September 7, 2007

Windows Disaster Recovery with Bacula

A few days ago my wife asked me to look at her computer. Programs were taking ages to respond even to a simple mouse click, or just crashed. And the system box kept making odd clicking noises.

I confirmed her diagnosis of a failing hard drive and shut the computer down. The following lunchtime I drove to the nearest PC World and bought a replacement hard drive. I normally buy mail-order, but we wanted the computer back up as soon as possible. That evening I removed the faulty drive and installed the new one, and I also stuck in some extra RAM while I had the case open. Then it was time to find out if my rather sketchy disaster recovery plan was going to work.

I run Fedora on my own box, and both computers are backed up using Bacula. Bacula is designed for backing up lots of computers to tape drives, but it also supports backup to disk drives. I have two USB drives, keep one plugged in, and swap them every month or so. I use the default Bacula backup schedule of a weekly full dump on both machines and nightly incremental dumps. Most of the Bacula components only run on Unix, but there is also a Windows client installed on my wife's computer (which runs Windows XP SP2). Bacula configuration is rather hairy, but if you have more than one computer it's well worth the effort.

The big headache for restoring Windows is the Registry. After previous bad experiences with this horrible blob of data, I had made sure the registry was backed up by using regedit to dump its contents to a file before each Windows backup, so I hoped it would be OK. There are also problems with other files in C:\Windows that are constantly in use and therefore unwritable during a restore.
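
For anyone who wants to do the same thing, the registry dump can be attached to the backup job itself in the Director's configuration. Something along these lines should work, although I should stress it is only a sketch: the resource names and paths below are made up for illustration, not copied from my actual setup.

    # bacula-dir.conf (sketch only; names and paths are illustrative)
    Job {
      Name = "winbox-backup"
      Type = Backup
      Client = "winbox-fd"
      FileSet = "WinBox Set"
      Schedule = "WeeklyCycle"
      Storage = "usb-disk"
      Pool = "Default"
      Messages = "Standard"
      # Dump the registry to an ordinary file on the client before each backup,
      # so that it gets picked up along with everything else.
      ClientRunBeforeJob = "regedit /e C:/regbackup/registry.reg"
    }

    FileSet {
      Name = "WinBox Set"
      Include {
        Options {
          signature = MD5
        }
        File = "C:/"
        File = "D:/"
      }
    }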

The Bacula manual pointed me at Bart PE Builder, which generates a Live CD version of Windows XP. It suggested that you might be able to run Bacula in this environment to get around these problems. Bacula ran fine, but I couldn't get the drivers for the network card to install, so that wouldn't work.

Instead I re-installed Windows XP SP1 from the original CD and ran Bacula there. The restore duly dumped copies of C: and D: into C:/temp/restore/c and /d. (The computer has two partitions because it used to have two drives. My wife got used to having C: and D: drives, and various programs had been configured to look for data on D:, so I kept the layout even when it went back to a single drive.) Then I booted from the Bart PE disk, deleted the existing contents of C: (except for /temp) and copied the restored contents into the root directories. Then for the big test: would it reboot?

Well, sort of. The login screen came up, but when I tried to log in the computer hung. I tried rebooting in safe mode, and then in "safe mode with command prompt", which seemed to be the bare minimum. That at least got me a command line, so I ran regedit and imported the registry copy that had been created by the last backup. This warned me that some registry keys could not be modified and claimed that the changes had failed. I rebooted again, and this time found that I could log in, but pretty much all the settings had been forgotten. Office wanted to re-install itself, and every time I started Word it asked for my name and initials twice. This suggested that important parts of the registry had still not been recovered, and were also unwritable.

Windows also decided that it was running on new hardware, and badgered me to activate it. So I did. I then tried logging in as a different user and restoring the registry again, in the hope that being a different user would lock different bits of the registry. This merely overwrote the new activation data, and Windows now point-blank refused to let me log in at all until I activated it again. So I tried. This time it told me that I had exceeded my activation limit and would have to phone up for activation. So I did. Windows gave me a 36-digit number, and a robot on the other end of the phone line told me to type in the number. Then a polite gentleman named "Fred" with an Indian accent asked me how many computers I had Windows installed on and why I needed to activate it. Then he gave me another 36-digit number to type into Windows to activate it. This worked. But when I logged in, Windows wasn't behaving any better.

A bit of Googling reminded me of something I should have remembered much earlier: Windows occasionally checkpoints its critical state, including the Registry, and you can wind it back to a previous known good state using the System Restore function. So I located a checkpoint from before all the trouble started, restored Windows to that state and rebooted.

When I tried to log in I got immediate joy: the desktop background had been restored. This suggested that the registry was now intact. But the restore had also overwritten the activation data, and Windows once again demanded to be activated before I could log in. Back to the phone, this time to a polite woman named June with a much stronger Indian accent, who asked me the same questions and gave me yet another 36-digit number to type in. This time everything worked. Disaster recovery was complete.

I'd like to thank the authors of Bacula for their excellent backup program. It saved both of us a lot of heartache. In the past I've found it too easy to neglect backups, and sometimes our computers have gone for months without being backed up. Once I had it properly configured (not a trivial task), Bacula made backups automatically with minimal intervention from me (basically, swapping USB drives occasionally). That meant I had a good recent backup to work from.

I'd also like to thank Bart Lagerweij, author of Bart PE. I could probably have managed by booting Knoppix and using its NTFS driver capture facility, but having a native Windows environment made life much easier.

Saturday, September 1, 2007

Composability and Productivity

This was posted to my original blog on January 7th 2007. It was also discussed on Reddit, so please do not repost it.

----------------------------------------------------

My earlier post about increased productivity through functional programming stirred up a good deal of comment. A number of people replied that the libraries are more important than the language. More recently a similar point has been made by Karsten Wagner, who argues that code reuse is what makes languages productive.

I remember the early days of OO, when it was argued by many people (me amongst them) that OO languages would finally let us write reusable software. There was optimistic talk of “software factories”, and Brad Cox gave lots of talks about how software could now finally move from craft to engineering discipline, built on the back of libraries of reusable code. So the entire industry went for C++, but the easy creation of reusable software remained elusive. That is not to say you can’t ever produce reusable software in the language, but it is not significantly easier than in C.

Karsten accurately pins down the reason why C++ failed to deliver: memory management. Back in those old days I was trying to explain to people why Eiffel would be so much more productive than C++, and garbage collection was a big part of it.

The reason why GC is so important lies in a more general principle called composability. Composability means that you can put two bits of code together and important correctness properties will be preserved automatically. This does not mean, of course, that the composition is automatically correct in a wider sense, but it does mean that you don’t introduce new bugs merely by sticking two things together.

Imagine two modules of code in a non-GC language like C or C++. Module X creates an object and hands a reference to that object over to Module Y. At some point in the future X will delete that object. However it is only safe to do so once Y has finished with it. So if X and Y were written independently then it is quite possible that Y will hang on to the reference longer than X expects. In short, manual memory management is not composable, because the composition of X and Y can introduce stale pointer bugs that were not present in either X or Y.
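
To make that concrete, here is a minimal C++ sketch of the problem. The module names and the Record type are invented for illustration:

    #include <iostream>
    #include <string>

    struct Record { std::string payload; };

    // Module Y: holds on to whatever reference it is given.
    namespace module_y {
        const Record* cached = nullptr;
        void remember(const Record* r) { cached = r; }
        void use_later() { std::cout << cached->payload << "\n"; }  // reads through a possibly stale pointer
    }

    // Module X: owns the object and frees it on its own schedule, knowing nothing about Y's plans.
    namespace module_x {
        void run() {
            Record* r = new Record{"hello"};
            module_y::remember(r);
            delete r;              // X has finished with the object...
        }
    }

    int main() {
        module_x::run();
        module_y::use_later();     // ...but Y has not: a stale-pointer bug introduced purely by composition
    }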

In a language with GC this is a non-problem: the collector will reap the object once it sees that both X and Y have finished with it. X can therefore forget about the object in its own time without having to know anything about Y. But programmers in non-GC languages have to resort to a number of workarounds. Either objects have to be copied unnecessarily (which leads to stale data bugs instead of stale pointer bugs), or else some kind of reference counting or similar scheme must be employed. Reference counting is of course merely an ad hoc, informally-specified, bug-ridden, slow implementation of GC, and therefore stands as a classic example of Greenspun’s Tenth Rule.
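
For completeness, here is roughly what the reference-counting workaround looks like, sketched with a shared_ptr smart pointer (std::shared_ptr here; Boost's version behaves the same way for this purpose). Again the module names are invented:

    #include <iostream>
    #include <memory>
    #include <string>

    struct Record { std::string payload; };

    // Module Y now shares ownership instead of borrowing a raw pointer.
    namespace module_y {
        std::shared_ptr<const Record> cached;
        void remember(std::shared_ptr<const Record> r) { cached = std::move(r); }
        void use_later() { std::cout << cached->payload << "\n"; }
    }

    namespace module_x {
        void run() {
            auto r = std::make_shared<Record>();
            r->payload = "hello";
            module_y::remember(r);
        }   // X's reference disappears here; the count keeps the object alive for Y
    }

    int main() {
        module_x::run();
        module_y::use_later();   // safe: the object is reclaimed only when the last owner lets go
    }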

But programming languages contain other examples of non-composable constructs. The current biggest offender is shared memory concurrency using locking. If X takes locks Foo and Bar, and Y takes locks Bar and Foo (in those orders), then sooner or later they are going to deadlock with X holding Foo and Y holding Bar. Ironically Java has the biggest problems here.
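
Sketched in C++ for the sake of a concrete, compilable example (the story is the same in Java), the classic shape of the bug looks like this, with Foo and Bar as in the paragraph above:

    #include <mutex>
    #include <thread>

    std::mutex foo, bar;

    void module_x() {
        std::lock_guard<std::mutex> first(foo);   // X takes Foo...
        std::lock_guard<std::mutex> second(bar);  // ...then Bar
        // do some work
    }

    void module_y() {
        std::lock_guard<std::mutex> first(bar);   // Y takes Bar...
        std::lock_guard<std::mutex> second(foo);  // ...then Foo
        // do some work
    }

    int main() {
        std::thread tx(module_x), ty(module_y);
        tx.join();
        ty.join();   // with unlucky timing this never returns
    }

C++ does provide std::lock for acquiring several mutexes without deadlocking, but that only helps if every module in the program agrees to use it, and that kind of global agreement between independently written modules is exactly what composability is supposed to make unnecessary.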

Program state in general is, perhaps surprisingly, a source of non-composability. Mutable state is actually another form of manual memory management: every time you overwrite a value you are making a decision that the old value is now garbage, regardless of what other part of the program might have been using it. So suppose that X stores some data in Y, and then Z also stores some other data in Y, overwriting what X did. If X assumes that its old data is still there then it is going to be in trouble. Either X needs to defensively check for new data, or Y needs to tell X about the change (the Observer pattern). Either way, the addition of Z to the system can introduce bugs that stem from the composition of modules rather than from the modules themselves.
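
A toy version of that scenario, again in C++ with invented names:

    #include <cassert>
    #include <map>
    #include <string>

    // Module Y: a shared, mutable store of settings.
    namespace module_y {
        std::map<std::string, std::string> settings;
    }

    // Module X writes a value and later assumes it is still there.
    namespace module_x {
        void configure() { module_y::settings["mode"] = "fast"; }
        void work()      { assert(module_y::settings["mode"] == "fast"); }
    }

    // Module Z, written independently, overwrites the same slot.
    namespace module_z {
        void configure() { module_y::settings["mode"] = "safe"; }
    }

    int main() {
        module_x::configure();
        module_z::configure();   // adding Z to the system...
        module_x::work();        // ...breaks an assumption X was entitled to make in isolation
    }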

Another example is the Command pattern, which includes a method to “undo” a previous command. Why undo it? Because the global state has been changed. Get rid of the concept of a single unique state for the entire application and you get rid of the problem.
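
A small sketch of the contrast, using an invented AppendCommand as the example command:

    #include <string>
    #include <vector>

    // With a single mutable document, every command needs a matching undo,
    // and the undo only works if nothing else touched the document in between.
    struct AppendCommand {
        std::string text;
        void execute(std::string& doc) const { doc += text; }
        void undo(std::string& doc) const    { doc.erase(doc.size() - text.size()); }
    };

    // Without a single mutable document, "undo" is just keeping the old value around.
    std::string append(const std::string& doc, const std::string& text) {
        return doc + text;   // the previous version still exists, untouched
    }

    int main() {
        std::vector<std::string> history{""};
        history.push_back(append(history.back(), "hello"));
        history.pop_back();   // undo by returning to a version that was never destroyed
    }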

On a more prosaic level, many library modules have some internal state, and so require an initialisation call to “set things up”. Who exactly is responsible for this initialisation? If you compose two modules together then they may both imagine themselves solely responsible for making this initialisation call, and hence it will be done twice.
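
A trivial C++ illustration (all names invented):

    #include <iostream>

    // A library with hidden internal state that must be "set up" exactly once.
    namespace library {
        void init() { std::cout << "initialising\n"; /* allocate tables, open handles, ... */ }
    }

    // Two independently written modules, each assuming it alone is responsible for the call.
    namespace module_x { void start() { library::init(); } }
    namespace module_y { void start() { library::init(); } }

    int main() {
        module_x::start();
        module_y::start();   // composed together: the set-up runs twice
    }

The usual patch is to make the initialisation idempotent (an internal "already done" flag), but that is still a convention the library author has to think of in advance, and every caller has to trust.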

It is a truism of programming that “integration” is the most difficult and risky (from a scheduling point of view) part of developing a large system. Integration basically means composing together all the units to try to get a working system. And the biggest integration problems are due to the non-composability of the units. A defect within a unit is fairly easy to identify, but composition bugs are down to the interaction of two separate units in some unforeseen way, which makes them correspondingly more difficult to pin down.

So what would a language look like if it got rid of all of these non-composable constructs? The answer, basically, is Haskell. The underlying goal of functional language designers is to make everything completely composable. Haskell is the language that has advanced furthest towards this goal.

Erlang is pretty good as well. It doesn’t eliminate state, but it does keep it confined to individual processes: there are no shared-memory locks. Processes have to communicate according to shared protocols, but this is a manageable dependency on a common abstraction. Even within a process, shared state can be reduced by careful use of a functional rather than an imperative style.