Indistinguishability Obfuscation, or how I learned to stop worrying and love DRM

There’s a lot of hype running around about this:

https://www.quantamagazine.org/20150902-indistinguishability-obfuscation-cryptographys-black-box/
https://www.quantamagazine.org/20140130-perfecting-the-art-of-sensible-nonsense/

Lots of excitable talk of “perfect security” and other stuff. One of the possible applications is supposedly quantum-resistant public-key crypto. But if you read into it, it’s actually a way of making code resistant to decompilation. So instead of creating a secret number that’s processed by a well-known algorithm, you create a secret algorithm that you can ask other people to run without fear of it being reverse engineered. So the “public-key” crypto is really shared-secret crypto with the secret sealed inside an obfuscated algorithm.

In other words, it’s bulletproof DRM. DeCSS is even referenced (obliquely) in one of the articles as a use case.

Of course, this makes it impossible, in principle, to test code for malicious behaviour. You could insert a latent trojan and it would never be discovered, and it removes one of the most important properties of security software – auditability of the algorithm. For example, someone could write a rot13 routine, call it “encryption”, and the only way to (dis)prove the claim would be to run a statistical analysis on the ciphertext.
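
For the record, that rot13 “cipher” really is a one-liner – here it is in plain shell, purely for illustration. The letter-frequency fingerprint survives intact (just shuffled onto different letters), which is exactly what a statistical analysis of the ciphertext would pick up:

# rot13 masquerading as encryption: a single tr invocation
echo "attack at dawn" | tr 'A-Za-z' 'N-ZA-Mn-za-m'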

So the question becomes – why would anyone allow IO programs to run on their systems? Virus scanners would be useless in principle. Performance, even in the most optimistic case, would be dreadful. And it doesn’t do anything for the end user that can’t be achieved by traditional crypto (barring the development of a quantum factoriser, and even that is not yet certain). No, the only people who gain are the ones who want to prevent the next DeCSS.

Ubuntu upgrade hell

So I decided to upgrade my work Ubuntu laptop. I had been putting it off for ages because of the hell I went through upgrading it from 6.06 to 8.10, but being stuck on old versions of pretty much everything (but especially OpenOffice) was becoming impossible. Strangely, though, it was when I tried to install monkeysphere that I finally snapped. Time to do a dist-upgrade to 10.04, I thought.

I started using the desktop package manager – it had been prompting me with “a new version of Ubuntu is available” for quite some time, so I pushed the button and let it do its thing. After an initial false start (out of disk space) it downloaded the upgrades and went to work. About half an hour in, it decided to restart kdm, which is where the fun started.

Now I’m at a command-line login and the upgrade is in a bad state. A few apt-get -f installs later, I find that a file is missing from the LaTeX hyphenation config and one of the postinst scripts keeps failing. The postrm script fails too, so I can’t even remove the offending package. After sleeping on the problem, grep -r comes to my rescue and I find the reference to the missing file. Commenting that out just bumped me to the next missing file, so I rmed the lot. I know I now have a badly broken LaTeX system, but dammit, I just want this upgrade to finish.
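
For anyone stuck in the same hole, the recovery loop went roughly like this (a sketch from memory – the file name below is a placeholder, not the actual missing hyphenation file):

# keep nudging the half-finished upgrade along
sudo apt-get -f install
# find which config file or maintainer script still references the missing file
grep -r missing-hyphenation-file /etc/texmf /var/lib/dpkg/info
# comment out (or delete) the offending references, then go again
sudo apt-get -f install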

A few more apt-get -f installs and apt-get dist-upgrades later, and the system is ready to reboot. But grub can’t find the root partition and drops me into an initramfs shell that can’t see any hardware. So I had to fire up a 10.04 install CD that I fortunately had to hand, and use it to chroot into the root partition and rebuild the initrd.
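
The rescue itself is straightforward once you know the dance (a sketch assuming the root filesystem is on /dev/sda1 – adjust the device to suit):

# from the live CD: mount the broken install and chroot into it
sudo mount /dev/sda1 /mnt
sudo mount --bind /dev /mnt/dev
sudo mount --bind /proc /mnt/proc
sudo mount --bind /sys /mnt/sys
sudo chroot /mnt
# inside the chroot: rebuild the initramfs and refresh grub's config
update-initramfs -u -k all
update-grub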

Finally, it boots. But I can’t log in graphically because gnome-session can’t be found. Back to the command line and apt-get install ubuntu-desktop, which takes another half an hour because it’s all missing, the lot of it. At this point, I notice something odd – I can use my ThinkPad fingerprint reader to log in on the command line, but not graphically – scanning a finger while in X gives nothing, not even an error.

Anyway, my xorg.conf file is apparently no longer valid, as it doesn’t recognise the dual-screen setup. I rename it and let X auto-detect everything, and the screen comes back, but the EmulateWheel option on my trackball no longer works. So I run X :2 -configure to get a skeleton xorg.conf, save that as /etc/X11/xorg.conf and cut and paste my old mouse settings into it. This doesn’t work.

At this point, having used sudo several times, I accidentally discover how to make the fingerprint sensor work while inside X. When prompted, scan your finger. Wait two seconds, hit Ctrl-C and then enter. Don’t ask me why.

It turns out that in recent versions of xorg, you need to set the option “AllowEmptyInput” to “off” in the ServerFlags section, or else it ignores any mouse or keyboard configuration sections. Sure enough, this lets EmulateWheel work again, but the mouse pointer also moves while the emulated scroll wheel is turning. I’m now running very late, and this sorry saga will have to continue in the new year.
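
For the record, before I go: the relevant bits of the new xorg.conf end up looking something like this (a sketch – the identifier, device node and wheel button are placeholders, so substitute whatever your old mouse section used):

Section "ServerFlags"
    # without this, recent xorg ignores InputDevice sections entirely
    Option "AllowEmptyInput" "off"
EndSection

Section "InputDevice"
    Identifier "Trackball"
    Driver     "mouse"
    Option     "Device"             "/dev/input/mice"
    # hold the nominated button and roll the ball to emulate a scroll wheel
    Option     "EmulateWheel"       "true"
    Option     "EmulateWheelButton" "2"
EndSection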

Merry Christmas, software developers.

Do what I mean, dammit. Or, why being silently “helpful” is evil.

For the last three years (i.e. since before Time Machine), I have been using rsync to make incremental backups from my trusty iBook to an external FireWire disk. Now, rsync does this by backing up into a fresh location each time and comparing against the previous backup: any file that hasn’t changed isn’t copied again, it’s simply hard-linked to the copy in the previous backup.
rsync -a --link-dest=$PREVIOUS_BACKUP $SOURCE $NEW_BACKUP
A wrapper script is needed because rsync doesn’t do any rotation of the backup paths (mine preserves six previous backups, which is quite enough), but overall the solution is elegant – restoration from any given backup is just a cp -pR away, due to the transparent nature of hard links. However (and this is the important bit), rsync only de-duplicates when the source file is identical to its previous backup in every way, including metadata.
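
The wrapper is nothing clever – something along these lines (a simplified sketch with made-up paths, not the actual script), with the --link-dest in the last line doing all the de-duplication work:

#!/bin/sh
# keep six previous backups; hard-link anything unchanged against the newest one.
# run with sudo so that ownership can be read and (supposedly) preserved.
DEST="/Volumes/Backup"
rm -rf "$DEST/backup.6"
for i in 5 4 3 2 1 0; do
    [ -d "$DEST/backup.$i" ] && mv "$DEST/backup.$i" "$DEST/backup.$((i+1))"
done
# back up the whole system, taking care not to recurse into the backup disk itself
rsync -a --link-dest="$DEST/backup.1" \
    --exclude=/Volumes --exclude=/dev --exclude=/Network \
    / "$DEST/backup.0"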

As time went on and my laptop drive started getting full, I found that the backup window was getting suspiciously long for an 80GB disk. But it was the lack of space that finally drove me to buy a bigger external drive (admittedly, backups aren’t the only thing taking up GB on my FireWire farm).

Finally I ripped apart the wrapper script and dug through the previous backups. Turns out that rsync wasn’t preserving all the metadata, specifically file ownership. Google, as ever, gave me the answer:

Official Google Mac Blog: User 99, Unknown

By default, OS X silently maps all file ownership on external HFS+ disks to a special user “unknown”, while pretending to the user that he still owns the files. This is a “feature” to prevent weird permissions problems when swapping external drives between machines. It is so low-level that not even root can override it – if you try as root to chown a file on an external HFS+ filesystem, it will silently do nothing.

The upshot is that rsync didn’t believe that any backed-up files were identical to their originals, and so didn’t de-duplicate anything. Instead of six lean incrementals, I have six wasteful bulk backups, some of them incomplete because I ran out of backup window. And all the file ownerships have been lost, so a system restore would have failed spectacularly – if I had been foolish enough to try.
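
You can see it for yourself with a quick spot-check (hypothetical paths – substitute your own volume and backup directory):

# compare a system file with its copy in the most recent backup
ls -ln /etc/hosts /Volumes/Backup/backup.0/etc/hosts
# the original is owned by root (uid 0); the copy on the external disk reports
# a different owner entirely, because the volume is quietly ignoring ownership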

You can turn off this behaviour for any given disk (Finder>Drive>right-click>Get Info>uncheck “Ignore ownership…”), but it’s a bit late now.

Particle smasher ‘not a threat to the Earth’ – opinion – 28 March 2008 – New Scientist

A group in the US has filed a lawsuit to try to prevent the Large Hadron Collider from starting up. Why? Because it might destroy the Earth, of course.

Particle smasher ‘not a threat to the Earth’ – opinion – 28 March 2008 – New Scientist

I hate to say it, but that’s what happens when popular science magazines blindly give credence to any old nonsense wittered by someone with a science degree. Planet-eating particles, my arse.

The ActiveX effect

In my current job, Microsoft Sharepoint rules. The decision was made long before my time to employ it as the common document repository, and for most people it works reasonably well. I have to force myself to use it though, and I can distil the reasons down to one root cause: ActiveX controls.

In its vanilla, static-page form, Sharepoint is barely functional. It takes a minimum of four clicks (open, edit, modify, save) to change a radio-button option. Checking files in and out is a pain – it’s easier to just overwrite. And I’ve never managed to attach a file to a list entry. To make it really usable, you need to run IE.

Oh wait, you’re using a Mac. Well you can just piss off then.

It’s interesting that Microsoft poured thousands of programmer-hours into developing an alternative to JavaScript, and yet not one of the other browsers makes any attempt to support the resulting spec – this in a world of Mono, Samba, Moonlight and OpenOffice. MS dropped IE support on the Mac but keeps pushing ActiveX in its server products. I’m no expert, but the only thing I see ActiveX doing that JS can’t is installing software updates without bothering the user with dialog boxes. Which is a good indication of why nobody else will touch it with a barge pole.

The problem is not confined to Sharepoint. The motivation for this post was finding that Microsoft’s certificate management server requires scripting to be turned on (it doesn’t say what sort of scripting, but it isn’t JS) in order to process a simple form that could have been written in 1995. In this case there was no option but to boot up the VM and use IE.

So what to do? Struggle manfully with Sharepoint’s prehistoric static interface, or retreat into the VM, cut off from my usual editing suite – Office X for Mac? Somebody somewhere is no doubt enjoying this juicy irony. But it’s not me.