A Viewer’s Contract

In my previous post I lamented the annual round of TV show cancellations, leaving plotlines unfinished and cliffhangers unresolved. But is there actually anything that can be done about this?

I think one of the big problems is that we are not actually the TV company’s customers. We might think that we are – indeed they often suggest that we are – but we’re not. We’re the product.

The broadcasters are selling eyeballs. They’re selling viewing figures. They’re selling a captive audience to their advertisers. The programmes they produce are just a means of pulling in as many viewers as possible in order to improve their ratings – and thus increase the amount they can charge for advertising. With subscription channels you might even be getting reamed twice – not only do you pay for the privilege of being their “customer”, but you still get subjected to advertising.

In the case of the BBC the situation is slightly different – but still similar. They also chase ratings in order to bring in money. In their case it’s not advertising revenue, but money from the TV licence fee. In order to get that money they have to show that they’re serving the British public… which requires good ratings. Sure, there are also requirements about quality and content which ensures that minority interests are served to some degree – but if they didn’t also produce ratings-chasing mainstream programmes you can be sure that their funding would get cut pretty quickly.

The broadcasters have contracts in place with the producers of the programmes. These contracts typically cover a series at a time, usually with an option to renew. That option gets exercised if the show is a ratings success, otherwise the contract isn’t renewed and the show is effectively cancelled.

The broadcasters also have contracts in place with the advertisers. These contracts basically sell our eyeballs at a certain rate based on the time of day and number of viewers. If a programme fails to attract enough viewers in one timeslot, it will often get moved to another cheaper slot (typically losing a portion of the viewers in the process). If it fails to get sufficient viewers that the advertising revenue more than pays for the show, it is unlikely to be renewed.

So there’s a contract with the producers, and a contract with the advertisers. Note, however, that there’s no contract with the viewer. There’s no contract that guarantees the completion of a story. There’s no contract to stop a programme being moved around the schedules and then dropped entirely (even if there are unshown episodes available). There’s no contract to stop the programme being sold to a different channel that the viewer may not even have access to (such as when Lost and 24 moved from terrestrial to satellite TV in the UK).

I think it’s time for a viewer’s contract. Not a literal piece of legally-binding paper, but an implied consideration towards the viewers and fans of a show. A set of guidelines that tries to minimise the artistic compromises that occur when a show is cancelled without sufficient warning.

Perhaps it would mean putting a stop to cliffhanger endings. Perhaps it would mean that each series has to be self-contained, closing all the major plotlines. Perhaps it would mean an end to dropping programmes entirely if there are still episodes to show.

More realistically what I would like to see is a contingency fund. When a show is commissioned, a proportion of the money would be put into this fund, to be used in order to produce one final episode when the series gets canned. Instead of commissioning a series, the studios would instead commission a series plus an episode. If the programme runs to more than one series, the contingency could be cumulative – providing enough money for a TV-movie, perhaps.

A programme that runs for a single series might only get a small contingency fund – enough to create a short or an alternative ending. The commission for a second series would provide enough for a whole extra episode. Add a third series and there’s enough for two episodes, or one TV movie. After that point the contingency contribution would drop off – still adding something to the fund, but not a whole episode’s worth.
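The cumulative scheme can be sketched with some purely illustrative numbers – the per-series levies below are my invention for the sake of the example, not part of any real commissioning model:

```python
# Illustrative model of the contingency-fund idea: each commissioned
# series pays a levy into a fund that finances a proper send-off if
# the show is cancelled. All percentages are invented for illustration.
def contingency_fund(num_series, episode_cost=100):
    # Levy per series, as a fraction of one episode's cost:
    # series 1 stays cheap (so new ideas still get tried), series 2
    # and 3 top the fund up to whole episodes, later series taper off.
    levies = {1: 0.25, 2: 0.75, 3: 1.0}
    fund = 0.0
    for s in range(1, num_series + 1):
        fund += levies.get(s, 0.5) * episode_cost
    return fund

# A one-series wonder gets a short or alternative ending...
print(contingency_fund(1))  # 25.0
# ...two series fund one whole extra episode...
print(contingency_fund(2))  # 100.0
# ...three series fund two episodes, or a TV movie.
print(contingency_fund(3))  # 200.0
```

With numbers like these the fund grows fastest over the risky second and third series, then each further series adds only half an episode's worth – mirroring the incentives described above.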

This scheme adds little to the price of creating a single series, so as not to reduce the incentive to try out different ideas. Seasons two and three are a little more expensive. After that each series becomes relatively cheap again, so there’s actually an incentive to keep a programme running.

What this does, more than anything else, is create a complete narrative – with a start, middle and most importantly an ending. Surely that’s got to make it easier to sell DVDs in future, or to sell programmes to other countries and broadcasters. Most of all it stops the viewers feeling let down and cheated, and that’s got to be worth something.

To be (dis)continued…

Consider how you would feel if you bought a book from a bookshop and found that the last chapter was missing. Or if you’d been reading a series of books, and the last one was never released.

Of course no self-respecting book publisher would omit the last chapter or leave a popular series incomplete – but this is exactly what happens in the world of TV every year.

A few weeks ago it was cancellation time; that frustrating part of the year when you discover that many of the programmes you’ve been following will not get renewed for another season. When you have to concede that the plots you’ve invested time in will remain unresolved. When the characters you love (and hate) will be left in perpetual limbo.

This year the cull seemed particularly harsh:

  • My Name Is Earl – Cancelled after four seasons, with the last episode promising “To Be Continued”
  • Terminator: The Sarah Connor Chronicles – Cancelled after two series, leaving several open plot lines and a cliffhanger ending
  • Pushing Daisies – Cancelled after two series; although many of the plot lines had been brought to a close, there was no overall resolution
  • Primeval – Cancelled after three series leaving unresolved plotlines and a cliffhanger ending

This is by no means a comprehensive list – it just covers the particular programmes that I watched. Notable mentions also go to Chuck (nearly cancelled, but given a temporary lease of life by a fan campaign) and Kyle XY (cancelled this year, but I don’t know what happens at the end because I’m watching it on BBC2, and they’ve only shown series 1 so far). Although it was cancelled last year, I’ll also mention Jericho (killed, resurrected by a fan campaign, then killed again).

As a viewer, I feel somewhat cheated by this state of affairs – in particular by the lack of closure for most of these series. This is the equivalent of a missing last chapter, or no final book in a series. This willingness to abandon a programme on a whim, with no concern for the integrity of the series as a whole, makes me disinclined to invest my time in watching such narratives in future. Why waste hours of my life on characters and plots that will never get a proper ending?

Perhaps the answer is to wait until after a programme has been cancelled – then download or buy it on DVD if it had been given a real conclusion. But if everyone did that the viewing figures would drop so low that no company would bother investing in anything with an ongoing narrative, and the airwaves would become so much poorer for it.

Thoughts on Ubuntu 9.04, Jaunty Jackalope

Having upgraded a few machines to Ubuntu 9.04 a few days ago, here are my thoughts on this latest version:

Things I Love

The speed of the system – both in booting and in general use – seems to be faster. There’s more of a delay after logging in before it draws my desktop, but when it does the system is immediately usable. Previously the desktop appeared quickly, but it would take a few seconds longer before the panel applets were drawn and the system could actually be used.

Things I Hate

The artistic direction of 9.04 leaves something to be desired. The primarily dark login screen is intimidating, rather than friendly and welcoming. Yes, it can be changed easily, but it's not the sort of thing that will encourage a novice user.

Reasonably friendly login screen from 8.04

Intimidating login screen from 9.04

Similarly the default desktop image is dull. It looks like someone took a blue wavy-line desktop image off an Apple machine, and converted it to brown. The best thing that can be said for it is that it’s inoffensive. The Hardy Heron desktop image was daring and imaginative – and beautiful enough as a work of art that I bought the T-shirt. The Intrepid Ibex image was a bit too abstract – with many comments that it looked like a coffee stain (rather than the Ibex it was supposed to be). But even that was more interesting and inspirational than a few wavy lines.

Good. Artistic. Original

Coffee stain? But at least it's original and different.

Just like an Apple desktop image, only more dull

While I’m on the subject of desktop images, why does Jaunty have packages for “edgy-wallpapers”, “feisty-wallpapers” and “gutsy-wallpapers” but no sign of “hardy-wallpapers” or “intrepid-wallpapers”?

Things I Just Don’t Understand

Since the first version of Ubuntu, five years ago, the “Log Out” and “Shut Down” options have been available from the System menu. Now they’re gone. Sort of. Depending on whether or not you’re using the “User Switcher” applet.

That’s right – the presence or absence of some key menu items, which muscle memory has trained me to look for in the same place over the past five years, is determined by whether or not you want to switch users from within a running session. In what universe does that make sense?

The logic seems to be this: the User Switcher applet offers Log Out and Shut Down options on its menu; therefore to avoid duplication/confusion, when the User Switcher applet is present those options should be removed from other parts of the user interface.

What I don’t understand is why I can’t have both. Why can’t I shut down from the System menu most of the time, as I’m used to doing, but still have the ability to switch to a different user from time to time? Why can’t I log out from the System menu if my mouse is close to it, or log out from the user switcher if my mouse is closer to that?

To make matters worse, the icon on the User Switcher applet tends to change. This is what it looks like with my normal settings (there’s also an option to show your name instead of the little person icon, but as I already know my name, I prefer it to take up less screen space instead):


See, a little red icon that contains a well known symbol denoting power controls. A new user might not spot it as a means to log out quite as quickly as a sweep through the menus, but at least when they do spot it they’re likely to remember that they need to click on the red power button to log out.

Now look what happens when I do something that’s largely unrelated to logging out or shutting down: I’ll launch Pidgin, the instant messaging (IM) client shipped with Ubuntu:


The “power” icon has now been replaced with a green circle, indicating my IM status. If my IM status changes, so does the icon. So for any user who wants to have the User Switcher applet, and who also runs Pidgin, the options to Log Out or Shut Down are hidden behind an icon whose colour and shape changes based on instant messaging settings. That’s nice and user friendly, isn’t it. Whatever happened to the principle of least surprise?

If the Log Out and Shut Down options were also still present on the System menu, this wouldn’t be so much of an issue. Users who are thrown by the changing state of the User Switcher icon would soon learn to use these functions from the System menu instead. Those old Ubuntu hands whose muscle memory still sends them to the System menu wouldn’t be surprised by the absence of the Log Out and Shut Down options.

By all means expose common functionality like this in more than one place, but please don’t expose it in one place if you’ve got an applet installed, and another if you haven’t. And please don’t make the one place that it’s exposed also be an icon whose shape and colour can change frequently, depending on the state of yet another application. If you want to confuse new users that’s a good way to go about it.

Got a band? Had some hits? Then bloody well play them!

I went to see The Cure on Thursday at the O2 Arena in London (one of my least favourite venues, but that’s another story). It was a sold-out gig, with the stage at one end, which according to that Wikipedia link means that I was one of about 16,000 people.

The gig was in honour of their “Godlike Genius” award from the NME – given to them at an awards ceremony the previous night at Brixton Academy. According to Wikipedia, Brixton Academy has a maximum capacity of 4,921 – but I would guess that an awards ceremony would have been nearer to the all-seated capacity of 2,391.

So, two gigs in as many nights – one 30 minute set at Brixton, one 90 minute set at the O2. Here’s the set list for the 30 minute gig, performed to fewer than 5,000 people (taken from here):

  • ‘Lullaby’
  • ‘The Only One’
  • ‘Friday I’m In Love’
  • ‘Close To Me’
  • ‘The End Of The World’
  • ‘In Between Days’
  • ‘Just Like Heaven’
  • ‘Boys Don’t Cry’
  • ’10:15 Saturday Night’
  • ‘Killing An Arab’

Top 40 hits are shown in bold, top 20 hits in bold-italic. Looking at it that way, that’s a heck of a half-hour set list.

The set list for the 16,000 strong 90 minute set, with the same bold and italic coding (taken from here):

  • ‘Underneath The Stars’
  • ‘From The Edge Of The Deep Green Sea’
  • ‘The Perfect Boy’
  • ‘The End Of The World’
  • ‘Sleep When I’m Dead’
  • ‘A Forest’
  • ‘Three Imaginary Boys’
  • ‘Shake Dog Shake’
  • ‘Maybe Someday’
  • ‘The Only One’
  • ‘In Between Days’
  • ‘Just Like Heaven’
  • ‘Primary’
  • ‘Want’
  • ‘The Hungry Ghost’
  • ‘Disintegration’
  • ‘One Hundred Years’
  • ‘It’s Over’
  • ‘Boys Don’t Cry’
  • ‘Grinding Halt’
  • ‘10:15 Saturday Night’
  • ‘Killing An Arab’

Let me pull one little quote out of that NME report for you:

…Smith said he saw the two sets as linked, and having done some of their more famous tracks 24 hours earlier, he was keen to showcase some different areas of their career at the arena show.

Yeah, thanks for that Robert. I’m sure that the 16,000 people at the O2 Arena really appreciated you showcasing different areas of your career, rather than including a few more well-known hits, for more than £30 per ticket.

That was sarcasm, by the way, in case you hadn’t twigged.

This is my plea to bands and artists who have had some hit singles: play them. Yes, you might be bored with them by now, but that’s what a large contingent of your audience wants from a gig. That’s not to say that you can’t go off on an indulgent ramble of album tracks and “fan favourites”, but you should make sure that you provide a good mix with the well known numbers. For every hardcore fan in the audience, there will also be someone who only really knows you from your chart successes. Try to make your gig inclusive enough for both sets.

So, in short, here’s a quick list of things not to do at a gig, if you don’t mind:

  • Perform mostly album tracks and very few hits
  • Rub your audience’s noses in it by telling them that you played a load of hits the previous night at a gig that they weren’t present at
  • Play the whole of your new album from start to finish, leaving too little time for your hits (Ash, I’m looking at you)
  • Play the opening lines, or the first verse of your hits, but rarely get as far as actually finishing any of them (Prince, at least on the night that I saw him)
  • Go off on some rambling, self-indulgent, stream-of-consciousness instrumental break that takes up half the gig (yes I mean you, Hawkwind)

P.S. For what it’s worth, I do own about half of The Cure’s albums, so recognised a fair number of the tracks they played. I still would have preferred a higher proportion of hits and well-known numbers though.

Perhaps a little too edgy?

The latest Ubuntu release, 6.10 or “Edgy Eft”, came out last week. I’ve got a few Ubuntu PCs that I maintain; a couple of my own, some at work, and a few others for friends and relatives. Rather than overburden the Ubuntu servers with numerous updates I usually grab the “alternate installer disc” which can also be used to update an existing system. One download, one CD, multiple updated machines.

So far I’ve updated two systems – one at work and my main machine at home – and I have to say that I’ve been less than impressed. This has been the most problematic update I’ve dealt with, and from the posts on the Ubuntu forums it looks like I’m not the only one to have experienced problems.

One updated machine wouldn’t get to the login screen, and wouldn’t let me access a command prompt either[1]. The other machine refused to update at all[2]. When I finally massaged them into submission, I logged in to find that my games were missing[3], half my virtual desktops had gone AWOL[4] and my Apache/PHP installation had also disappeared[5].

In fairness I should point out that Edgy is very nice indeed – once it’s up and running – and I have no plans to jump to another distro just yet 😉 But my systems aren’t all that far from a standard install: I really think the upgrade should have gone more smoothly, especially on a system that bills itself as “Linux for Human Beings”.

I have one Tablet PC with no optical drive, so that won’t get updated until Edgy is pushed via the update servers. I can only hope that the delay in this is due to some of the issues being resolved before they flick the switch to make it live[6]. The friends and family PCs will be getting their updates from the servers, so it would be nice to think that a slight delay now will mean that I don’t have to go visiting people to rescue their systems.

“Edgy” indeed. I guess there’s quite some way to go before the “Stable Sable” release.

[1] The solution to this one was the addition of “vga=791” to the grub scripts, but how is a novice supposed to work that one out?
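For the record, the edit in question is a one-line change to GRUB’s menu.lst – the kernel version and device names below are illustrative, not taken from my actual machine:

```
# /boot/grub/menu.lst (GRUB legacy, as shipped with Edgy)
title   Ubuntu 6.10, kernel 2.6.17-10-generic
root    (hd0,0)
# vga=791 selects a 1024x768, 16-bit VESA framebuffer for the console
kernel  /boot/vmlinuz-2.6.17-10-generic root=/dev/sda1 ro quiet splash vga=791
initrd  /boot/initrd.img-2.6.17-10-generic
```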

[2] It turned out that the update wanted to remove some old kernels that were there from previous versions of Ubuntu, but it wasn’t allowed to because they’re marked as essential components. The installer should have made it clearer what the problem was, or preferably just removed them anyway; so long as it doesn’t remove the kernel currently in use then it’s a pretty safe bet that the machine will still be able to boot afterwards.

[3] Simply re-installing the gnome-games package got them back, but they shouldn’t really have disappeared in the first place.

[4] This was a trivial issue of re-setting the number of desktops in the preferences of the pager applet. Still a little annoying though.

[5] I had to reinstall Apache and PHP. There was also a little configuration required to enable the PHP module which was less-than-obvious. This page contains all the relevant information, once you’ve worked out which sections apply.

[6] I’d also like to know when Firefox 2.0 is going to start hitting the Mozilla update servers. I’m waiting for the update on my Windows box at work (Edgy already has FF2.0), but that also seems to be taking a long time to get out. Much longer and I’ll just get bored enough to download the installer directly.

A Ripping Yarn, Pt. 3

Having ripped and converted all my CDs, it was time to play back the mp3 files on my Ubuntu box. I knew that I’d need to install some codecs which aren’t shipped with it by default (for legal reasons), but that wasn’t too tricky. A couple of quick commands later and I was able to play back mp3 files in the default media player, as well as preview them by hovering the mouse over them in the file manager.
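For reference, the “couple of quick commands” amounted to something along these lines – the exact package names vary between Ubuntu releases, and the “universe”/“multiverse” repositories need to be enabled first:

```
sudo apt-get update
sudo apt-get install gstreamer0.10-plugins-ugly   # MP3 decoding for GStreamer-based apps
```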

What I hadn’t expected, however, was that my preferred music player, Amarok, should fail to play them. Even more annoyingly it didn’t bother to give any useful error message, it just kept zooming down the playlist not playing song after song after song, unwilling to stop but unwilling to actually play anything either. Not good 🙁

At first I thought it was confused about the location of my files. Previously I’d used Amarok to play the FLAC files, but they didn’t exist on the drive anymore, replaced by the mp3 files in a different (but similarly structured) directory. Perhaps it was still showing the FLAC files, but was skipping through them because it was unable to play files that weren’t there anymore. It seemed logical to me, so I double-checked the directories it should be monitoring, told it to re-scan the whole collection, restarted a couple of times, deleted the preferences from my home directory, told it to re-scan the collection again, and finally gave up.

It took a while to dawn on me that perhaps it just couldn’t play mp3 files, because everything else on my machine can. Eventually I twigged and a quick google found that I’m not the first person to have this problem. Sure enough the fix presented on that page worked (after logging out and in again), but a useful error message from Amarok would have got me there a lot faster.

Technical (lack of) Support

Being the local “person who knows about computers” I was recently drafted to sort out someone’s inability to get a new laptop talking wirelessly to a new router on a new ADSL connection.

It didn’t help that she’d been mis-sold a few times over. By the time I was called in she’d already bought an ADSL USB modem, two wi-fi dongles, and an ADSL router. Of course she’d installed the software that had been shipped with each and every one of them, as well as the contents of some would-be drinks coaster from Tiscali, her ISP.

In an effort to avoid suffering death by driver overload in Windows I decided to boot from a Ubuntu live CD. I was hard-wired into the router at this point, as I just wanted to verify that the ADSL connection was up and running before worrying about the complexities and nuances of wi-fi.

I let the laptop boot from the CD, and watched it get a DHCP connection as expected. I pointed Firefox at the configuration screen for the router and stepped through the connection wizard. All went well, the lights looked okay, and I was able to ping Google and the BBC. Try as I might, however, I couldn’t get a web page to appear. I thought I’d better check that there was no filtering at Tiscali’s end of the line, so made the foolish mistake of calling their technical support line. The conversation went something like this:

Me: … I can ping Google and the BBC but can’t get a web page up. Are you filtering anything at your end?
Tech Support: No sir, we don’t do any filtering here. If you can ping then you should be able to view the web sites. What version of Windows are you running?
Me: I’m not running Windows, I’ve booted from a Linux live CD for testing purposes.
Tech Support: Okay sir, but what version of Windows are you running?
Me: I’m not running Windows, I’m running Linux.
Tech Support: Okay sir, so you’re running Linux.
Me: Yes.
Tech Support: What version of Windows are you running that on?
Me: I’m not running Windows at all. I’m running Linux instead of Windows.
Tech Support: Can I put you on hold, sir, while I consult with one of my colleagues.

… time passes … quite a bit of time, actually …

Tech Support: Thank you for holding sir. I’ve consulted with one of my colleagues, and we think you need to reinstall Internet Explorer.

Now I don’t expect every tech support person to be a Linux expert, but it would be nice if they at least knew a little bit about computers. Like what an OS is, and that if the customer isn’t running Windows then telling them to reinstall Internet Explorer probably isn’t going to help much.

I never did find the source of the problem, but a reboot seemed to do the job. After removing all the drivers and extra rubbish that I could I finally got the wireless connection working. The router web interface was absolutely rubbish and refused to remember any of the settings I applied, so although it’s working there’s no encryption or MAC address filtering. That should make any technically savvy neighbours happy. Maybe I’ll fix that sometime, but I really didn’t feel like dealing with D-Link’s technical support staff at that point.

“Do you want to restart your computer now?”

Automatic updates on Windows XP are the bane of my working life. I don’t mind the update itself, but it’s the constant nagging to reboot the machine afterwards which really grates. Thankfully someone has posted a simple solution to this problem – and the comments include other options for achieving the same result.

Why are we paying twice?

The Guardian has an interesting article questioning why we in the UK are effectively paying twice for data that is collected by government agencies.

I would be inclined to take this question one step further, and ask why I’m also paying twice if I want to keep content that has been created by the BBC.

Having paid for the BBC’s content to be produced via the television licence why do I have to pay again if I want to purchase a copy on DVD, video or CD? Of course I expect there to be a charge for the media itself, but why can’t I telephone the BBC’s consumer sales department, give them my address (so they can confirm that I own a TV licence) and get a copy at a subsidised price?

Unfortunately there’s nobody that I can really lobby about this, because although the TV licence fee is effectively a tax on owning a television in the UK, I have no means of influencing the operation of the BBC itself. I suppose I could write to “Points of View”, but that’s hardly the same as being able to vote for the membership of the Board of Directors. Didn’t someone once say something about no taxation without representation?

Printing and the web

I wear two hats (at least metaphorically) when I’m working with a web browser: sometimes I’m a user, and sometimes I’m a developer of browser-based applications. But whichever hat I’m wearing, there’s one thing that is always true: printing sucks. To be clear about this, what sucks is the amount (or rather, lack) of control that the browser gives me over what appears on the page when I click the Print button.

It sucks in different ways, depending on which hat I’m wearing, so let’s look at the issues on a per-hat basis:

1) As a user

When wearing this hat I don’t print very often. When I do it’s almost always to produce a paper receipt, and very occasionally to create a dead-tree version of some HOWTO or other documentation. The biggest problem here is that I only want to print the relevant part of the page, but I invariably end up with ads and site navigation links taking up space on the page. Usually the relevant part gets chopped off, or split across pages because it’s been pushed out of place by the extraneous crap that surrounds it.

What I’d like is a means to crop the page: the ads and links almost always take up space at the sides and the top of the page, so a simple rectangular cropping tool would suffice for most sites. Just let me drag a marquee around the part of the page I want to print, and crop everything else. Simple but effective.

At the moment the closest thing I’ve found to this is to use the Aardvark extension for Firefox which lets me selectively delete items from a page and remove any artificial width restrictions. This extension works well if you know something about how web pages are structured (the tree structure of the DOM), but could be a bit confusing otherwise (if you don’t know what the DOM is, then you probably fall into the latter category). It also works on the screen version of the page, which might not be the same as the print version (though it usually is). The cropping tool I’d like to see would work on the Print Preview screen, giving you a better idea of what will actually come out of the printer.

As a user I often want the header and footer that my browser creates. With this hat on it’s useful to have a record of the original URL, and a timestamp. However URLs can be very long, and some browsers simply truncate them. Why does it even have to go in the header or footer: if I’m printing in portrait mode then having the URL running up the side of the page (the long edge) would be acceptable on receipts. How about a choice between wrapping long URLs, truncating them or printing them along the edge of the page rather than in the header or footer?

2) As a developer

I write web-based applications for a living – usually for deployment on a company’s intranet. There are many advantages to a web-based application, not least of which are centralised management and ease of deployment (you just have to send the users an email with a link to the application).

Web apps have become more and more sophisticated over the years, but in doing so they are really starting to show the limitations of the current crop of browsers when it comes to printing. Consider a web-based word processor: this is an obvious example of a web app that is broadly achievable with today’s web browsers, but who wants to print a letter to the bank manager which has a URL and timestamp emblazoned on it?

As a developer I’d like a way to turn the browser’s header and footer off. As a user I often want them on. The solution, it seems, would be to provide a way for a web page to indicate that it really would like to have them turned off, but for the user to easily override the setting. The user could have an option to always print them, never print them, or turn them off when a site requests it. A more advanced system would allow white- or black-listing of sites which can turn them off.

The next requirement in a web app, after turning the browser’s headers and footers off, is to turn your own headers and footers on. Again consider a word processor or desktop publishing program: there needs to be a way for a web application developer to add headers and footers that are printed on every page, irrespective of the amount of content, the size of the page, or the page margins. The browser knows where its page-breaks will be, and it should be able to work backwards from that to allow enough space for the footer, and work forwards to insert the header.

In fact there’s a whole CSS specification that deals specifically with paged media (i.e. printing and slideshows) and which should, in theory, allow developers to specify repeating content for each page and other such tricks. Alas, such features are poorly supported by current browsers, and even if we could specify our own headers and footers they’d be of limited use until we can also request that the browser’s own inclusions be suppressed.
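As an illustration of what the paged-media spec promises, here’s the sort of thing it allows on paper – the selectors and strings are hypothetical, and margin-box support was largely absent from the browsers of the day, so treat this as aspirational rather than working code:

```css
@media print {
  @page {
    margin: 2cm;
    /* CSS3 Paged Media "margin boxes": repeat content on every page */
    @top-center   { content: "Acme Intranet – Monthly Report"; }
    @bottom-right { content: counter(page); }  /* page numbers */
  }
  /* hide navigation and ads when printing (class names are made up) */
  nav, .ads { display: none; }
}
```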

Thankfully browser developers are starting to think a bit more about printing. At least one of the Firefox developers is looking at printing problems, and the release of Internet Explorer 7 (when it finally happens) includes numerous improvements to the printing system. Unfortunately neither browser appears to be implementing the features I’d like to see (namely cropping (user hat), and suppressing browser-generated printing (developer hat)).

This might be the start of good things for printing. With luck the emphasis being applied in IE7 will encourage the Firefox developers to go one better, and that in-turn will push Microsoft further in the right direction. The web browser is a fascinating petri dish for growing applications, Firefox’s mixture of HTML, SVG and MathML particularly so. But we’re nowhere near the paperless office yet, and aren’t likely to be for some time. In the meantime we need more powerful and more flexible printing solutions, both as users and developers.

If the browser developers can sort out printing in a way that works for both users and web developers I’ll certainly take my hats off to them.
