Tag Archives: Windows

Mac OS X market share accelerates in 2008

2008 was a good year for Apple and Mac OS X. According to netapplications.com data (via sharewarepromotions blog) Mac OS X’s share of the OS market increased from 7.31% in Dec 2007 to 9.63% in Dec 2008. That is a 32% increase in market share during 2008, compared to a 22% increase during 2007.

[Chart: Mac OS X market share, 2007-2008]

Windows market share fell from 91.79% to 88.68% over the same period. While Mac OS X’s annual gains are impressive, it has a long way to go to catch Windows: 15 years, if you project the 2008 gains forward.
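One way to arrive at a figure like that (this is just back-of-the-envelope arithmetic, assuming each future year repeats 2008’s absolute percentage-point changes):

```latex
\frac{88.68 - 9.63}{(9.63 - 7.31) + (91.79 - 88.68)} \;=\; \frac{79.05}{5.43} \;\approx\; 14.6 \text{ years}
```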

[Chart: Mac OS X vs Windows market share, 2007-2008]

Of course, it is highly questionable to project 15 years from a single year of data, but it gives an idea how much work Apple still has to do.

I sell table planning software for Windows and Mac OS X. Mac visitors to my website have followed the general trend, up from 7.41% in 2007 to 8.5% in 2008, and accounted for around 10% of visitors at the end of 2008.

[Chart: % Mac visitors to http://www.perfecttableplan.com]

My data also shows that Mac users are twice as likely to purchase my software as Windows users (I have heard similar figures reported by others). So Mac users currently account for 20% of my sales. I wouldn’t want to live off my Mac sales, but it is very useful additional income. Given the disparity in cost between Windows and Mac hardware it is hardly surprising that Mac users are more ready to reach for their credit card.
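As a quick sanity check of the 20% figure (assuming roughly 10% of visitors are on Mac, as above, and that they convert at twice the Windows rate):

```latex
\frac{0.10 \times 2}{0.10 \times 2 + 0.90 \times 1} \;=\; \frac{0.20}{1.10} \;\approx\; 18\% \;\approx\; 20\%
```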

My software is built on top of the Qt cross-platform toolkit. The recent porting of Qt 4.5 to Cocoa gives me the opportunity to further improve PerfectTablePlan’s Mac look and feel and to release a 64 bit version. Hopefully this, coupled with increasing Mac market share, will further improve my Mac sales.

A beta of Windows 7 has just been released.  It will be interesting to see if it can repair some of the damage caused by Vista and slow the growth of Mac OS X. Personally, I doubt it – the Windows 7 feature list certainly doesn’t set my pulse racing.

CoverageValidator v3

The nice folk at Software Verification have done a major new release of Coverage Validator, and the new version fixes many of the issues I noted in a previous post. In particular:

  • The instrumentation can use breakpoint functionality to get better line coverage on builds with debug information enabled.
  • Previous sessions can be automatically merged into new sessions.
  • The default colour scheme has been toned down.
  • The flashing that happened when you resized the source window has gone.
  • It is now possible to mark sections of code not to be instrumented. I haven’t had time to try this yet, as it was only introduced in v3.0.4. But it should be very useful as currently I have a lot of defensive code that should never be reached (see below). Instrumenting this code skews the coverage stats and makes it harder to spot lines that should have been executed, but weren’t.

There are still a few issues:

  • I had problems trying to instrument release versions of my code.
  • It still fails to instrument some lines (but not many).
  • I had a couple of crashes during testing that don’t seem to have been caused by my software (although I can’t prove that).

But the technical support has been very responsive and new versions are released fairly frequently. Overall version 3 is a major improvement to a very useful tool. Certainly it helped me find a few bugs during the testing of version 4 of PerfectTablePlan on Windows. I just wish there was something comparable for Mac OS X.

Sometimes the best way to recover Windows data is Linux

My Windows laptop refused to boot into Windows. The ominous error message was:

Windows could not start because the following file is missing or corrupt:

\windows\system32\config\system

A quick Google suggested that the registry had been corrupted. I tried various things to recover the OS, including using the XP recovery console to manually restore a backup of the registry. It didn’t work.

No problem. I have a fairly paranoid back-up regime. All the important information on my laptop is also stored on my subversion server. I could just reformat the laptop, reinstall the applications (including subversion) and check out all the files again. Except that I hadn’t thought to include my wife’s files on the laptop in my back-up plans. Oops. After hours of making no progress recovering the data, I tried Knoppix. I got access to the data in not much longer than it took to download Knoppix.

Knoppix is a Linux distribution that can run from a CD (i.e. it doesn’t require installation on your harddisk). It is also capable of understanding Windows file systems. To use it:

  1. Download the latest Knoppix CD .iso file (approx 700MB). Note – The DVD version is much larger.
  2. Burn the .iso to a CD, for example using the free Active ISO Burner.
  3. Boot the stricken machine from the Knoppix CD. You may need to change your BIOS settings to boot from the CD first. How you access the BIOS varies between machines. On my Toshiba laptop you press F2 as the system boots.
  4. Drag and drop data from the stricken machine to a USB harddisk or memory stick. Or copy to another machine using FTP from Knoppix. The Knoppix user interface is easy enough to use, even if you haven’t used Linux before.

Note that you don’t have to enter your Windows password to recover the files. This brings home how easy it is to get data off a password protected Windows machine, if you have physical access to the machine. Another good reason to encrypt sensitive data on your laptop, for example using the free TrueCrypt.

Thanks Knoppix! I’ve added you to my mental list of worthy software causes to make a small donation to one day. Obviously you need access to a functioning machine to do the above. So why not make a Knoppix CD now, while everything is fine? You never know when you might need it.

Further reading:

Lifehacker: Rescue files with a boot CD

Choosing a development ‘stack’ for Windows desktop applications

I have heard plenty of people saying that desktop software is dead and that all future development will be done for the web. From my perspective, as both a buyer and seller of software, I think they are wrong. In fact, of the thousands of pounds I have spent on software in the last three years, I would guess that well over 90% of it was spent on software that runs outside the browser. The capabilities of web based applications have improved a lot in recent years, but they still have a long way to go to match a custom built native application once you move beyond CRUD applications. I don’t expect to be running Visual Studio, PhotoShop or VMWare (amongst others) inside the browser any time soon. The only way I see web apps approaching the flexibility and performance of desktop apps is for the browser to become as complicated as an OS, negating the key reason for having a browser in the first place. To me it seems more likely that desktop apps will embed a browser and use more and more web protocols, resulting in hybrid native+web apps that offer the best of both worlds.
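To illustrate what I mean by a hybrid app, here is a minimal sketch using Qt’s QtWebKit module (available from Qt 4.4). It is just an illustration of the idea, not anything from PerfectTablePlan, and the URL is a placeholder:

```cpp
// A native window hosting web content: native menus, toolbars and dialogs can
// live around this view while it talks to web services over HTTP.
// Requires the QtWebKit module (QT += webkit in the .pro file).
#include <QApplication>
#include <QUrl>
#include <QWebView>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QWebView view;
    view.load(QUrl("http://www.example.com/"));  // placeholder URL
    view.show();

    return app.exec();
}
```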

So, if Windows desktop apps aren’t going away any time soon, what language/libraries/tools should we use to develop them? It is clear that Microsoft would like us to use a .Net development environment, such as C#. But I question the wisdom of anyone selling downloadable off-the-shelf software based on .Net [1]. The penetration of .Net is less than impressive, especially for the more recent versions. From stats published by SteG on a recent BOS post (only IE users counted):

No .Net: 28.12%
>= .Net 1.0: 71.88%
>= .Net 1.1: 69.29%
>= .Net 2.0: 46.07%
>= .Net 3.0: 18.66%
>= .Net 3.5: 0.99%

Consequently deploying your app may require a framework update. The new .Net 3.5 framework comes with a 2.7 MB installer, but this is only a stub that downloads the frameworks required. The full set of frameworks weighs in at an eye-watering 197 MB. To find out how much the stub really downloads Giorgio installed .Net 3.5 onto a Windows 2003 VM with only .Net 1.0 & 1.1. The result: 67 MB. That is still a large download for most people, especially if your .Net 3.5 software is only a small utility. It is out of the question if you don’t have broadband. Microsoft no doubt justify this by saying that the majority of PCs will have .Net 3.5 pre-installed by the year X. Unfortunately by the year X Microsoft will probably be pushing .Net 5.5 and I dread to think how big that will be.

I have heard a lot of people touting the productivity benefits of C# and .Net, but the huge framework downloads can only be a major hurdle for customers, especially for B2C apps. You also have issues protecting your byte code from prying eyes, and you can pretty much forget cross-platform development. So I think I will stick to writing native apps in C++ for Windows for the foreseeable future.

There is no clear leader amongst the development ‘stacks’ (languages+libraries+tools) for native Win32 development at present. Those that spring to mind include:

  • Delphi – Lots of devoted fans, but will CodeGear even be here tomorrow?
  • VB6 – Abandoned and unloved by Microsoft.
  • Java – You have to have a Java Run Time installed, and questions still remain about the native look and feel of Java GUIs.
  • C++/MFC – Ugly ugly ugly. There is also the worry that it will be ‘deprecated’ by Microsoft.
  • C++/Qt – My personal favourite, but expensive and C++ is hardly an easy-to-use language. The future of Qt is also less certain after the Nokia acquisition.

Plus some others I know even less about, including: RealBasic and C++/WxWidgets. They all have their down sides. It is a tough choice. Perhaps that is why some Windows developers are defecting to Mac, where there is really only one game in town (Objective-C/Cocoa).

I don’t even claim that the opinions I express here are accurate or up-to-date. How could they be? If I kept up-to-date on all the leading Win32 development stacks I wouldn’t have any time left to write software. Of the stacks listed I have only used C++/MFC and C++/Qt in anger and my MFC experience (shudder) was quite a few years ago.

Given that one person can’t realistically hope to evaluate all the alternatives in any depth, we have to rely on our particular requirements (do we need to support cross platform?), hearsay, prejudice and which language we are most familiar with to narrow it down to a realistic number to evaluate. Two perhaps. And once we have chosen a stack and become familiar with it we are going to be loath to start anew with another stack. Certainly it would take a lot for me to move away from C++/Qt, in which I have a huge amount of time invested, to a completely new stack.

Which Windows development stack are you using? Why? Have I maligned it unfairly above?

[1] Bespoke software is a different story. If you have limited deployment of the software and can dictate the end-user environment then the big download is much less of an issue.

Coverage Validator

The sink is full of washing, I am wearing odd socks and I haven’t been out of the house in days. It must be time to put out that new release. But how can I be sure my testing hasn’t missed a hideously embarrassing bug? Maybe I introduced a major bug when I made that ‘cosmetic’ change at 2am?

In an ideal world I would just run a comprehensive automated regression test suite. Unfortunately it is difficult to automate graphical user interface (GUI) testing and the majority of lines of code in most applications are GUI. I estimate that the code for my own table planner software is at least 75% GUI code (not including generated code, which would push it even higher).

So I try to manually execute every line of my application before I release it. If I have to make any changes to the code, I start over again. This is very dull, but at least I have a tool to help me: Coverage Validator. Coverage Validator instruments code and shows, in real time, which lines have been executed. Click a few buttons on your application and watch the executed lines of code change colour from pink to yellow. Execute every line in the file and all the lines change colour to cyan. No recompilation or relinking is required and it doesn’t slow down the tested application too much. This real-time feedback is incredibly powerful for testing.

[Screenshot: Coverage Validator colour-coded line coverage]

Unfortunately it also has a lot of shortcomings:

  • The usability isn’t great. There is a confusing plethora of options for instrumenting your code that I would rather not have to know about.
  • It isn’t able to ‘hook’ (instrument) all the lines of code. Whole blocks get missed out for reasons I don’t fully understand. Single line branches are particularly likely to be missed.
  • The GUI isn’t great. For example, the display flashes horribly if you resize it.
  • The automatic results merging is just plain weird. At the end of a session it can merge your coverage results into a previous session. This information isn’t much use to me at the end of a session. I want to merge previous results at the start of a session so I know which lines I haven’t tested.
  • The GUI is quite ugly. They really need to update those tired old icons.

However being able to see line coverage information in real time is just so incredibly useful that I am prepared to put up with the many shortcomings. I just run my application alongside Coverage Validator and, file-by-file and function-by-function, I try to turn the lines of code yellow (or, better still, cyan). Every time I have used Coverage Validator I have found at least one potentially embarrassing bug that I hadn’t discovered by any other means. The support has also been responsive. It is just a pity about the flaws, without them this would be a ‘killer app’ for testing.

Coverage Validator works with C++, Delphi and VB on Windows NT4, 2000, 2003 and XP[1]. A single licence costs $199. A free 30-day evaluation licence is available.

[1]I am using it on Vista currently, and it seems to work fine.

Windows Vista service pack 1

Microsoft have announced that service pack 1 for Windows Vista has been released to manufacturing. Microsoft claim “great progress in performance, reliability and compatibility”. SP1 will be rolled out through Windows update from mid-March.

My own stats show that Vista has been slowly increasing market share at 1% per month. At this rate it will take another 5 years to reach the 75% share currently held by XP. But perhaps a lot of people have been wisely waiting for SP1 before committing?

I have been using Vista on my main development machine for the last few months. It is OK once you turn the deeply annoying UAC off. But it is still hard to see any compelling reason to upgrade from XP.

Optimising your application

When I first released PerfectTablePlan I considered 50-200 guests as a typical event size, with 500+ guests a large event. But my customers have been using the software for ever larger events, with some as large as 3000 guests. While the software could cope with this number of guests, it wasn’t very responsive. In particular the genetic algorithm I use to optimise seating arrangements (which seats people together or apart, depending on their preferences) required running for at least an hour for the largest plans. This is hardly surprising when you consider that seating assignment is a combinatorial problem in the same NP-hard class as the notorious travelling salesman problem. The number of seating combinations for 1000 guests in 1000 seats is 1000!, which is a number with 2,568 digits. Even the number of seating combinations for just 60 guests is more than the number of atoms in the known universe.

But customers really don’t care about how mathematically intractable a problem is. They just want it solved. Now. Or at least by the time they get back from their coffee. So I made a serious effort to optimise the performance in the latest release, particularly for the automatic seat assignment. Here are the results:


Total time taken to automatically assign seats in 128 sample table plans varying in size from 0 to 1500 guests

The chart shows that the new version automatically assigns seats more than 5 times faster over a wide range of table plans. The median improvement in speed is 66%, but the largest plans were solved over ten times faster. How did I do it? Mostly by straightening out a few kinks.

Some years ago I purchased my first dishwasher. I was really excited about being freed from the unspeakable tyranny of having to wash dishes by hand (bear with me). I installed it myself – how hard could it be? It took 10 hours to do a wash cycle. Convinced that the dishwasher was faulty I called the manufacturer. They sent out an engineer who quickly spotted that I had kinked the water inlet pipe as I had pushed the dishwasher into place. It was taking at least 9 hours to get enough water to start the cycle. Oops. As soon as the kink was straightened it worked perfectly, completing a cycle in less than an hour. Speeding up software is rather similar – you just need to straighten out the kinks. The trick is knowing where the kinks are. Experience has taught me that it is pretty much impossible to guess where the performance bottlenecks are in any non-trivial piece of software. You have to measure it using a profiler.

Unfortunately Visual Studio 2005 Standard doesn’t seem to include profiling tools. You have to pay for one of the more expensive versions of Visual Studio to get a profiler. This seems rather mean. But then again I was given a copy of VS2005 Standard for free by some nice Microsofties – after I had spent 10 minutes berating them on the awfulness of their “works with vista” program (shudder). So I used an evaluation version of LTProf. LTProf samples your running application a number of times per second, works out which line and function is being executed and uses this to build up a picture of where the program is spending most time.
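In case you are wondering how a sampling profiler works, here is a toy illustration of the idea on Win32. This is my sketch of the general approach, not LTProf’s actual implementation:

```cpp
// Toy sampling profiler: periodically suspend the target thread, read its
// instruction pointer and count hits per address. The hottest addresses are
// then mapped back to functions and lines using the debug symbols.
// 'target' must be opened with THREAD_SUSPEND_RESUME | THREAD_GET_CONTEXT access.
#include <windows.h>
#include <map>

void sampleLoop(HANDLE target, std::map<DWORD, int> &hits, volatile bool &stop)
{
    while (!stop)
    {
        SuspendThread(target);
        CONTEXT ctx;
        ctx.ContextFlags = CONTEXT_CONTROL;
        if (GetThreadContext(target, &ctx))
            ++hits[ctx.Eip];          // x86: Eip is the instruction pointer
        ResumeThread(target);
        Sleep(1);                     // roughly 1000 samples per second
    }
}
```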

After a bit of digging through the results I was able to identify a few kinks. Embarrassingly, one of them was that the automatic seat assignment was reading a value from the Windows registry in a tight inner loop. Reading from the registry is very slow compared to reading from memory. Because the registry access was buried a few levels deep in function calls it wasn’t obvious that this was occurring. It was trivial to fix once identified. Another problem was that some intermediate values were being continually recalculated, even though none of the input values had changed. Again this was fairly trivial to fix. I also found that one part of the seat assignment genetic algorithm took time proportional to the square of the number of guests (O(n^2)). After quite a bit of work I was able to reduce this to a time linearly proportional to the number of guests (O(n)). This led to big speed improvements for larger table plans. I didn’t attempt any further optimisation as I felt I was getting into diminishing returns. I also straightened out some kinks in reading and writing files, redrawing seating charts and exporting data. The end result is that the new version of PerfectTablePlan is now much more usable for plans with 1000+ guests.
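To give a flavour of the registry kink, here is a sketch of the fix using hypothetical names and QSettings (Qt’s registry wrapper on Windows), rather than my actual code:

```cpp
#include <QList>
#include <QSettings>

double scorePlan(const QList<int> &pairPenalties)
{
    // The slow version did the equivalent of this registry read *inside* the
    // loop below, buried a few function calls down. Hoisting it out means one
    // registry read per scoring pass instead of one per guest pair.
    QSettings settings("HypotheticalCompany", "HypotheticalTablePlanner");
    const int penaltyWeight = settings.value("scoring/penaltyWeight", 10).toInt();

    double total = 0.0;
    for (int i = 0; i < pairPenalties.size(); ++i)
        total += penaltyWeight * pairPenalties.at(i);   // memory reads only
    return total;
}
```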

I was favourably impressed with LTProf and will probably buy a copy next time I need to do some optimisation. At $49.95 it is very cheap compared to many other profilers (Intel VTune is $699). LTProf was relatively simple to use and interpret, but it did have quirks. In particular, it showed some impossible call trees (showing X called by Y, where this wasn’t possible). This may have been an artefact of the sampling approach taken. I will probably also have a look at the free Mac OS X Shark profiler at some point.

I also tried tweaking compiler settings to see how much difference this made. Results are shown below. You can see that there is a marked difference with and without compiler optimisation, and a noticeable difference between the -O1 and -O2 optimisations (the smaller the bar, the better, obviously):


Effect of VS2005 compiler optimisation on automatic seating assignment run time

Obviously the results might be quite different for your own application, depending on the types of calculations you are doing. My genetic algorithm requires large amounts of integer arithmetic and list traversal and manipulation.

The difference in executable sizes due to optimisation is small:

[Chart: effect of VS2005 compiler optimisation on executable size]

I tried the two other optimisation flags in addition to -O2.

  • /OPT:NOWIN98 – section alignment does not have to be optimal for Windows 98.
  • /GL – turns on global optimisation (e.g. across source files, instead of just within source files).

Neither made much noticeable difference:

[Chart: effect of /OPT:NOWIN98 and /GL on automatic seat assignment run time]

However it should be noted that most of the genetic algorithm is compiled in a single file already, so perhaps /GL couldn’t be expected to add much.

I compared the VC++6 and VS2005 versions of the same program and found that VS2005 was significantly faster[1]:

[Chart: automatic seat assignment run time, VC++6 vs VS2005]

I also compared GCC compiler optimisation for the Mac OS X version. Compared with VS2005, GCC shows a more noticeable difference between optimised and unoptimised builds, but a smaller difference between the different optimisation levels:

[Chart: effect of GCC compiler optimisation on automatic seat assignment run time]

Surprisingly -O3 was slower than -O2. Again the effect of optimisation on executable size is small.

[Chart: effect of GCC compiler optimisation on executable size]

I also tested the relative speeds of my 3 main development machines[2]:

[Chart: relative automatic seat assignment speed of my 3 main development machines]

It is interesting to note that the XP box runs the seat assignment at near 100% CPU utilisation, but the Vista box never goes above 50% CPU utilisation. This is because the Vista box is a dual core, but the seat assignment is currently only single threaded. I will probably add multi-threading in a future version to improve the CPU utilisation on multi-core machines.
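For what it is worth, here is a sketch of how the fitness evaluation step of a genetic algorithm might be spread across cores using QtConcurrent (available from Qt 4.4). Assignment and scoreAssignment are hypothetical stand-ins, not my actual code:

```cpp
#include <QList>
#include <QtConcurrentMap>

struct Assignment
{
    QList<int> seatOfGuest;   // seatOfGuest[g] = seat index assigned to guest g
};

// Fitness function: higher is better. It must be thread-safe, i.e. it only
// reads the candidate passed to it and touches no shared mutable state.
double scoreAssignment(const Assignment &candidate)
{
    double score = 0.0;
    for (int g = 0; g < candidate.seatOfGuest.size(); ++g)
        score += (candidate.seatOfGuest.at(g) >= 0) ? 1.0 : 0.0;  // stand-in scoring
    return score;
}

// Score every candidate in the population in parallel and block until done.
QList<double> scorePopulation(const QList<Assignment> &population)
{
    return QtConcurrent::blockingMapped<QList<double> >(population, scoreAssignment);
}
```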

In conclusion:

  • Don’t assume, measure. Use a profiler to find out where your application is spending all its time. It almost certainly won’t be where you expected.
  • Get the algorithm right. This can make orders of magnitude difference to the runtime.
  • Compiler optimisation is worth doing, perhaps giving a 2-4 times speed improvement over an application built without compiler optimisation. It probably isn’t worth spending too much time tweaking compiler settings though.
  • Don’t let a software engineer fit your dishwasher.

Further reading:

“Programming pearls” by Jon Bentley, a classic book on programming and algorithms

“Everything is fast for small n” by Jeff Atwood on the Coding Horror blog

[1] Not strictly like-for-like as the VC++6 version used dynamic Qt libraries, while the VS2005 version used static Qt libraries.

[2] I am assuming that VS2005 and GCC produce comparably fast executables when both set to -O2.