Can modern software be snappy?

Solomon once said not to pine for the good ol’ days, and that’s sage advice, but I’m sure he didn’t intend it to apply to software bloat.

I’m not just talking about memory and megahertz bloat — we’ve also got performance bloat. And it almost seems like the two go hand-in-hand: the more memory something hogs, the slower it’ll run.

Compare Foxit’s PDF reader to Adobe Reader.

Okay, so maybe Foxit doesn’t display all the latest 3D JavaScript-enhanced PDFs, but it does all you need, and it does it (relatively) small, fast, and light.

Maybe it helps (or hinders) to be an embedded programmer, and know how much you can do with 32 KB of code, 2 KB of RAM, and an 8 MHz, 8-bit processor. Sometimes I wish developers had to write code on ancient 66 MHz 486s. Constraints are the mother of optimization, and programmers will usually forget about optimization after it runs “fast enough” on their 42-core Pentium IX, 10 TB RAM development machine.

Sure, I grant that RAM and clock cycles are cheap these days, and we might as well use ’em. But surely there’s a limit to all this. When my (plain text!) editor runs slowly enough that I can see the screen updating, there’s something wrong.

When Visual Studio takes 2 entire seconds to pop up a simple properties window the first time, there’s something wrong.

Back in the days when programmers cared about how many characters they could write to video memory during the CGA horizontal retrace time without it producing snow — back in those days, and running on a 286, I couldn’t see my screen updating.
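For the curious, the trick went roughly like this (a from-memory sketch in DOS-era Turbo C, so treat the details as approximate): poll the CGA status register, and only touch video memory while the adapter is in retrace.

```c
/* Rough from-memory sketch (DOS-era Turbo C idioms: far pointers,
   inp(), MK_FP). Write to CGA text memory at B800:0000 only while
   the adapter is in retrace, so the CPU and the CRT controller
   don't fight over the bus and produce snow. */
#include <conio.h>
#include <dos.h>

#define CGA_STATUS 0x3DA
#define IN_RETRACE 0x01   /* bit 0 set = safe to touch video memory */

void put_cell(unsigned index, unsigned char ch, unsigned char attr)
{
    unsigned far *video = (unsigned far *)MK_FP(0xB800, 0);

    while (inp(CGA_STATUS) & IN_RETRACE)
        ;                           /* wait out the current retrace      */
    while (!(inp(CGA_STATUS) & IN_RETRACE))
        ;                           /* sync to the start of the next one */

    video[index] = ((unsigned)attr << 8) | ch;   /* character + attribute */
}
```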

We now have thousands of times the computing power, and as far as user experience is concerned, stuff runs slower than it used to. And we put up with it, because they’ve added one or two features we like, and it’s “good enough”. But it just ain’t right.

And it’s not just my text editor and Visual Studio. When Thunderbird first came out, it was so slow on my fairly average hardware that I simply couldn’t use it. And now on my 2 GHz dual-core whatever-it-is, it’s still slow. I click on Inbox the first time, and it takes a second and a half for the message to pop up. Come on, people — you could load a Usenet thread off a floppy drive in that time!

Somehow we’ve convinced ourselves that Gmail’s conversation view and anti-spam features make it worth putting up with 750 millisecond Ajax delays all the time. Nope, it just ain’t right.

Recently I was reading from Michael Abrash’s Graphics Programming Black Book, and he has something telling (albeit provocative) to say about all this in chapter 2:

You will notice that my short list of objectives for high-performance assembly programming does not include traditional objectives such as easy maintenance and speed of development. Those are indeed important considerations — to persons and companies that develop and distribute software. People who actually buy software, on the other hand, care only about how well that software performs, not how it was developed nor how it is maintained. These days, developers spend so much time focusing on such admittedly important issues as code maintainability and reusability, source code control, choice of development environment, and the like that they often forget rule #1: From the user’s perspective, performance is fundamental.

My theory is that it’s gonna take an RMS-style prophet to come along, take a few pages out of Abrash’s book, write his own instant-GUI operating system, wake up the masses to the joy of responsive computing, and watch the bloat industry crumble.

It’ll be a world where you can turn on your computer and use it immediately. A world where you wait at most half a second for big programs to load. A world where your screen will update even before the keyup event is sent. It won’t be utopia, but I’m still looking forward to it.

We’re taking submissions for this kind of prophet. Drop your résumé or CV in the comments below. :-)

27 June 2008 by Ben    29 comments

29 comments and pings (oldest first)

Ben 27 Jun 2008, 11:46 link

Just a small P.S. about tiny programs.

It was probably my dad who first gave me a love of antibloat. He taught me to program in x86 assembly, as well as in Forth — a language in which you can write a compiler in about 2 KB. Ages ago he wrote a little TSR text editor for DOS called PED (Popup EDitor). The PED.COM executable is 3559 bytes, and it works on everything from an 8086 running DOS to a Pentium IV running Windows XP.

Norman 27 Jun 2008, 20:49 link

You’re in good company:

http://cr.yp.to/bib/1995/wirth.pdf

Norman 27 Jun 2008, 21:18 link

I just tried PED.COM, great program!

If you’re still into small code and Forth then try this system that’s being developed by Pablo Reda:

http://reda4.org/

All the code is in TXT files; the text editor might interest you or your father. The Forth is Colorforth-inspired. It’s all free.

Ben 27 Jun 2008, 22:51 link

Norman, thanks for the links. Looking forward to reading the article by Niklaus Wirth (tomorrow). Though I admit to having trouble reading the (is that Spanish?) reda4 website. :-)

James Justin Harrell 27 Jun 2008, 23:14 link

Which is newer, Foxit or Adobe Reader? Seems like a poor example.

aare 27 Jun 2008, 23:40 link

Yeah, Visual Studio and Gmail are some of the most infuriating programs for me also. It is such a pleasure to use the TextPad editor outside of work when I have the time. It’s hard to imagine the industry changing, though. I think the industry will keep producing slow and bloated programs as long as there are no better programming and design paradigms.

Norman 27 Jun 2008, 23:45 link

re the Spanish reda4.org site: there is an English version of the manual. The software really is worth installing and playing around with :) Compiles to ASM and generates small executables.

Dan 28 Jun 2008, 00:20 link

It seems to me that when people complain about bloat they really mean ‘I don’t need it so no one needs it’.

Ben, the reason your TSR runs under XP is down to a large chunk of compatibility code shipped with every copy of Windows. This is 100% pure bloat for those of us who don’t run legacy DOS apps.

Andrew Murdoch 28 Jun 2008, 00:46 link

The argument of “give developers slow crappy machines so they write faster software” is, I think, invalid for a couple of reasons. Firstly, the more times a developer can edit, compile and test an application during development, the faster bugs can be squashed. Secondly, for some software (in particular games), if the projected release date is 2 years away, a developer should be using the sort of hardware most people will have then, so that they can take advantage of the new capabilities or speed.

Stephane Grenier 28 Jun 2008, 02:32 link

I have two comments regarding your post:

  1. I agree that a lot of software today is unresponsive! It’s absolutely amazing how many software shops don’t even run their applications through a code profiler to find the bottlenecks. Just this alone can significantly increase the performance of many software applications! If nothing else, at least run your application through a code profiler (sketched at the end of this comment).

  2. I really appreciated your comment about running the application on a slower box. Most developers forget that they have really powerful boxes. Running good enough on these boxes is not good enough. Try it on a $300-500 system. That should be the good enough metric.

And speaking of this, I remember in university we had to create some kind of graphics driver in assembly. We all worked on our development boxes and had good enough performance. However, the prof pulled a good one on us at the last minute and decided to test our drivers on a box that ran at a quarter of the speed. A quick note: he had instructed us that our drivers had to run within certain specs, including on the slower box, but most people ignored it. So those who took shortcuts with good-enough performance got hit pretty hard.
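To make the profiler point concrete, here is roughly what that workflow looks like with GCC and gprof (an illustrative sketch; any profiler will do):

```c
/* toy.c: a trivial program with an obvious hotspot. Build and profile
   with something like:
       cc -pg toy.c -o toy
       ./toy              (writes gmon.out)
       gprof toy gmon.out
   The flat profile points straight at slow_checksum(), which is the
   whole idea: measure first, then optimize what actually matters. */
#include <stdio.h>

static unsigned slow_checksum(const unsigned char *buf, unsigned len)
{
    unsigned sum = 0;
    for (unsigned i = 0; i < len; i++)
        for (unsigned j = 0; j <= i % 64; j++)   /* deliberately wasteful */
            sum += buf[i] ^ j;
    return sum;
}

int main(void)
{
    static unsigned char buf[1 << 16];           /* zero-filled test data */
    unsigned sum = 0;
    for (int round = 0; round < 200; round++)
        sum += slow_checksum(buf, sizeof buf);
    printf("checksum: %u\n", sum);
    return 0;
}
```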

Eric Normand 28 Jun 2008, 02:51 link

Yes. I agree. Too much bloat.

In general, though, I don’t think maintainability and optimization are mutually exclusive. Our languages and systems just suck at doing both.

Trey Boudreau 28 Jun 2008, 03:26 link

The CGA display you remember fondly contained 80*25 glyphs at 2 bytes per glyph (one each for character and color) for a whopping 4000 bytes of data. The same screen of data in a modern bit-mapped display with a 10×12 pixel grid at 16 bits per pixel per fixed width character takes 480000 bytes. And that only represents a fraction of all the pixels on your desktop. Modern graphics cards have the highest memory bandwidth of any device in your computer for a reason.
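The same arithmetic, spelled out (using the figures above):

```c
/* Trey's arithmetic, spelled out. */
#include <stdio.h>

int main(void)
{
    long cells  = 80L * 25;               /* text cells on screen            */
    long cga    = cells * 2;              /* character byte + attribute byte */
    long bitmap = cells * 10 * 12 * 2;    /* 10x12 glyph, 16 bpp = 2 bytes   */

    printf("CGA text buffer:   %ld bytes\n", cga);      /* 4000   */
    printf("Bit-mapped screen: %ld bytes\n", bitmap);   /* 480000 */
    printf("Ratio:             %ldx\n", bitmap / cga);  /* 120x   */
    return 0;
}
```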

Still, I occasionally miss the Brief text editor and Turbo Debugger from MS-DOS days :-)

capsid 28 Jun 2008, 03:49 link

You can’t control the aesthetic of all the application programmers out there, but you can work on an easily installable collection of speed tweaks for a certain hugely popular distribution. Let’s make an Ubuntu remix that is optimized for snappiness. Zoombuntu :)

I was able to cut my boot time down to 15 seconds using an Ubuntu Server installation and the Blackbox desktop, but it lacked a lot of the applets that make my notebook functional (like the wireless manager). I feel like the key to speed is in perception rather than performance. I want to see the visual feedback of starting my computer or opening a program right away, even if it’s loading extra components in the background.

Perception can affect the experience the other way, too. A program may be loading extraordinarily fast for how large it is, but if it is consuming all of the CPU and I/O bandwidth while it does it, it seems slow.

David Hogarty 28 Jun 2008, 05:43 link

Efficiency, Flexibility, and Complexity. You can optimize for two of the three at the expense of growth in the others. Because both efficiency and flexibility are necessary, the only way to get there is to start to tackle the issues of complexity: how can I write code general enough that it can be optimized efficiently for other architectures, and yet ‘instantiate’ it to a specific architecture and take advantage of all optimizations the constraints of that system allow? How can I minimize the pain of adding new data processing streams (e.g. function parameters, I/O devices, compute devices) to an existing solution?

Gabriel C 28 Jun 2008, 05:51 link

You can make your system go faster and less bloated… set your screen resolution and colors to minimum (640×480 x16 colors?), kill or stop all the background processes, or better, give your program direct access to the hardware… As long as people are willing to buy new hardware/software so they can plug-n-play their digital camera, browse 1000 1 MB photos and send a few to the color photo printer they just plugged in, have their word processor spell/syntax check and make suggestions while they write, get updates directly from the internet, check for viruses in files and emails in the background, have the disk compressed, etc., we’ll have the “slowness” and “bloat”, because it requires layers over layers of indirection…

StCredZero 28 Jun 2008, 06:24 link

It should be possible to set up QEMU or some other virtual machine so that you can compile your application natively but use it on a much slower machine. This gives you the best of both worlds: the developers have to eat their own cooking on non-uber machines, but they get to have fast compiles and more iterations.

robvas 28 Jun 2008, 06:32 link

Microsoft did the right thing when they ported Office to the Macintosh. They tested every build on the original Power Mac, a 60 MHz 601. Not a bad idea.

Ben 28 Jun 2008, 10:59 link

Dan, fair point that DOS TSRs still working under XP is an example of bloat — I guess I agree. And aare, I agree about TextPad: that is light and fast. Pity it’s not free, and that version 5 is not as good as version 4.

Brett Morgan 28 Jun 2008, 11:04 link

Yes, there is something wrong with GUI programs. I don’t, however, think it can be fixed with a mantra of “do less.”

The simple problem with most software is that everything is done in one thread, and thus the interactivity has to wait for whatever processing is being done. Or worse, parts of the program’s UI become unusable while you are carrying out a specific action, e.g. MS Word’s application-modal file-save dialog.

The real answer here is transitioning to multithreading-aware code such that the work gets done on background threads and the foreground UI thread is never blocked.
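As a toy illustration of the shape of that fix (plain C and POSIX threads; obviously nothing like Word’s real code):

```c
/* Hypothetical sketch: push the slow work onto a background thread so
   the foreground "UI" loop never blocks. Compile with: cc -pthread */
#include <pthread.h>
#include <stdio.h>
#include <unistd.h>

static void *slow_save(void *arg)
{
    const char *path = arg;          /* made-up file name for the example */
    sleep(3);                        /* stand-in for a long disk write    */
    printf("saved %s\n", path);
    return NULL;
}

int main(void)
{
    pthread_t worker;
    pthread_create(&worker, NULL, slow_save, "report.txt");

    /* Meanwhile the "UI thread" keeps responding to the user. */
    for (int tick = 0; tick < 5; tick++) {
        printf("ui still responsive (tick %d)\n", tick);
        sleep(1);
    }
    pthread_join(worker, NULL);
    return 0;
}
```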

Leonard Rosenthol 1 Jul 2008, 06:15 link

I agree with the other posters that comparing Foxit Reader to Adobe Reader 8 is not a fair comparison since, as you state later in the article, Adobe Reader does a LOT MORE than Foxit. As such, it makes perfect sense that it would take up more disk space. But disk space is cheap.

Your comments concerning performance and memory usage, however, are well taken and don’t in any way relate to the disk space usage of a given application. One can be small on disk and have a large memory footprint – or vice versa.

To that end, I recommend that you get yourself a copy of Adobe Reader 9 – just released – and compare again. I think you’ll find a product that is now MUCH FASTER and SMARTER than its predecessor…

Please, it is supposed to render a PDF file. Foxit can do everything I need with a lot fewer resources.

ngobikannan 6 Jul 2008, 11:56 link

I am actually surprised you even tried installing a newer Adobe Reader.

Ч 10 Aug 2008, 09:18 link

“I grant that RAM and clock cycles are cheap these days, and we might as well use ‘em.” Yes, let’s use the hardware to do something useful.

zoc.i.am 31 Aug 2008, 10:14 link

Actually, a “prophet” did come along, and wrote his own instant-GUI operating system: The prophet’s name is Niklaus Wirth, and the OS name was “Oberon”.

It was snappy, and it was SMALL – it used to ship on a 1.44 MB floppy (or was it two?). But still, for one reason or another, the OS never really made it out of the university halls of ETH Zürich.

Ping: 27 months » The Vir… 10 Mar 2009, 06:22 link

[…] thrashing as a prime example of what’s wrong with contemporary software. Ben posits the question, Can Modern Software Be Snappy? and draws on some examples from coding for embedded devices and graphics programming. Both are […]

[…] My dad could code, and he made some neat stuff, like a code-generating pentomino puzzle solver and a really tiny pop-up editor. […]

[…] I wrote a blog entry about bloated software, and how much better Foxit PDF reader was than Adobe Reader. But I was using Adobe Reader 8. Little […]

Daniel José dos Santos 29 Jan 2018, 09:04 link

One of my dreams is to work with people who have the mindset you show in this post. I’m tired of bloated and slow modern software.

7-Zip is another very good example of small, simple and great software.
