Monthly Archives: August 2007


I don’t remember when or where I first saw an editor with syntax highlighting, but I do remember that I was ‘blown away’ by it. It was immediately obvious that it was going to make code easier to understand and syntax errors easier to spot. I would now hate to have to program without it. So I was interested to try version 1.1 of CodeKana, a recently released C/C++/C# syntax highlighting add-in for Visual Studio.

Codekana features include:

  • Finer grained syntax highlighting than VS2005 provides.
  • Highlighting of non-matching brackets and braces as you type.
  • Easy switching between header and body files.

In the code below Codekana colours the if/else/while blocks differently and visually pairs the braces:

syntax highlighting

I have only been using Codekana a few hours, but I am already impressed. I find the ability to quickly switch between C++ header and body files particularly useful. VS2005 only appears to allow switching from body to header, not header to body (doh!). You need the dexterity of a concert pianist for the default Codekana keyboard shortcut (Ctrl-Shift-Alt-O), but it can be customised. I changed it to Ctrl+. (dot).

Codekana also has other features, such as the ability to zoom in/out on code. This is quite ‘cool’, but I’m not sure yet whether it will be of much use. Time will tell.

I am new to VS2005 and I have yet to try out other add-ins, such as Visual Assist, but Codekana certainly seems to have a lot of potential and is excellent value at $39. I look forward to seeing what other features get added in future versions. Find out more and download the free trial here.

Disclosure: The author of Codekana is a JoS regular who I have corresponded with in the past and was kind enough to send me a complimentary licence.

Having a crack at the crackers

Software cracks are a real problem for software vendors large and small. I have discussed in a previous article some of the ways in which developers can try to mitigate their effects. A fellow ASP member (who might wish to remain nameless) has gone a step further by creating a fake crack site. It looks quite convincing, but when you try to download a crack it gives you an ominous message about the error of your ways and logs your IP address. I would have gone for a less confrontational message, but it will be interesting to see how effective this approach is.

I think this is worthy of support by developers. Please consider giving the site some Google juice by linking to it from your site or blog using link words such as crack, keygen and/or serials. If you don’t want to do this on a main page of your site, link to it only from your site map page. Alternatively, create a Google site map (a good idea anyway) and only reference the page with the links from there. I believe the site owner is going to try to cover his costs by donations, Google ads and possibly referral fees. I certainly don’t begrudge him some return on his efforts. I also don’t feel bad about playing a little trick on people looking for illegal cracks. It might even save them from downloading malware.
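A Google site map is just an XML list of URLs. A minimal sketch that references only a links page might look like this (example.com is a placeholder for your own domain):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- list your normal pages here, then the page carrying the links -->
  <url>
    <loc>http://www.example.com/links.html</loc>
  </url>
</urlset>
```

Submit the file via Google’s webmaster tools and the links page gets crawled without having to be reachable from your main navigation.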

digg vs reddit vs slashdot vs stumbleupon – who’s the daddy?

traffic spike from digg, reddit, stumbleupon and slashdot

Social news and bookmarking sites, such as digg, reddit, slashdot and stumbleupon, use voting by users or selection by editors to rank interesting stories. Much to my surprise, I recently had an article from this blog featured prominently on all four of these popular sites. This generated a large amount of traffic and gave me an interesting opportunity to turn the tables, by using my hit statistics to rank these sites.

On the 16th August I published an article about a little experiment I did to prove that many software download sites hand out awards automatically, without reviewing the software. Most developers who have submitted software to such sites probably suspected this already. But the experiment proved it conclusively by garnering awards for software that didn’t even run.

I wrote the article because I wanted to shine some light on this unsavoury practice. I wanted it to be as widely read as possible, so I posted a link to it on a few software developer and entrepreneur forums that I frequent. Later in the day I posted it to one of the social news sites, and added my vote where other people had already posted it. I expected a few hundred people would read the article, mostly regular readers of my blog. But it got voted up and made its way on to a home page, and traffic started to flood in.

My recollections of the next few days are a bit hazy, as it all happened rather quickly. From that first front page the article made its way across the front pages of the other sites, like an electronic Mexican wave. It also received traffic from social bookmarking sites, and large numbers of blogs and forums linked to it. Hits on my blog peaked on the 17th at 53,422 for the day.

total blog hits per day

blog hits from reddit, digg, slashdot and stumbleupon

A few observations from the data:

  • The social news sites have the attention span of a one year old on amphetamines. Hits from the top referrer went from 15,161 on Friday to just 648 on Saturday.
  • The article was linked to from 375 blogs, plus an unknown number of forums and other sites. The top 4, 10 and 20 sites account for 52%, 61% and 65% of the total traffic, respectively. A long tail of less popular sites makes up the rest.
  • Things really took off once the article reached its first front page. I visualise the links spreading across the Internet as something analogous to a sub-atomic chain reaction. Just as energetic particles decompose into cascades of ever smaller particles, bigger sites propagate their links to ever larger numbers of smaller and smaller sites.
  • The onslaught was wide, but not deep. A relatively low percentage of readers followed links in the article or read other articles on my blog. While that still made quite an impact on the number of visitors to the home page of my seating planner software PerfectTablePlan, there were few additional downloads and (according to my cookie tracking) 0 additional sales. This is not too surprising when you consider how untargeted the traffic is. Experience has shown me that small volumes of targeted traffic make more sales than large volumes of untargeted traffic. But still, one of you must know someone who is getting married? ;0)


Totalling all the visitors to the blog over the 5 days, I give you the official ranking for social news and bookmarking sites:

stumbleupon, digg, reddit and slashdot

Here is the full top 20:

top 20 referrer sites

The article has generated a lot of comments. I particularly enjoyed the reviews here (I hope they haven’t been deleted). Interestingly, ranking the 4 top sites by number of comments/reviews gives a quite different order to ranking them by hits.

comments and reviews on stumbleupon, digg, reddit and slashdot

Please don’t take my ranking too seriously. The story reached similar positions on the reddit, digg and slashdot home pages[1], but my methodology here is far from rigorous. A different type of story on a different day might have resulted in a quite different ranking. Amongst other issues:

  • The WordPress stats only show the top 40 referrers for each day.
  • The article made the front page of different sites at different times.
  • Just because someone clicked through, doesn’t mean they actually read the article.
  • My article might simply have been more interesting to the type of people who read one site than the type of people who read another.
  • I have no way of knowing whether any of the visitors were bots.

But social news sites aren’t exactly rigorous in their ranking either.

Please note that I created this blog to write about what it takes to successfully create and market commercial software. I don’t intend to become another blogger blogging about blogging. It’s bad for your eyesight (see point #10 here). Normal service will be resumed shortly.

[1] To the best of my knowledge the article reached a highwater mark of positions 1, 2 and 2 on three of the four sites’ home pages, and was featured in one of the ‘popular’ pages on stumbleupon.

The software awards scam

I put out a new product a couple of weeks ago. This new product has so far won 16 different awards and recommendations from software download sites. Some of them even emailed me messages of encouragement such as “Great job, we’re really impressed!”. I should be delighted at this recognition of the quality of my software, except that the ‘software’ doesn’t even run. This is hardly surprising when you consider that it is just a text file with the words “this program does nothing at all” repeated a few times and then renamed as an .exe. The PAD file that describes the software contains the description “This program does nothing at all”. The screenshot I submitted (below) was similarly blunt and to the point:


Even the name of the software, “awardmestars”, was a bit of a giveaway. And yet it still won 16 ‘awards’. Here they are:


Some of them look quite impressive, but none of them are worth the electrons it takes to display them.

The obvious explanation is that some download sites give an award to every piece of software submitted to them. In return they hope that the author will display the award with a link back to them. The back link then potentially increases traffic to their site directly (through clicks on the award link) and indirectly (through improved page rank from the incoming links). The author gets some awards to impress their potential clients and the download site gets additional traffic.

This practice is blatantly misleading and dishonest. It makes no distinction between high quality software and any old rubbish that someone was prepared to submit to a download site. The download sites that practise this deceit should be ashamed of themselves. Similarly, any author or company that displays one of these ‘awards’ is either being naive (at best) or knowingly colluding in the scam (at worst).

My suspicions were first aroused by the number of five star awards I received for my PerfectTablePlan software. When I went to these sites all the other programs on them seemed to have five star awards as well. I also noticed that some of my weaker competitors were proudly displaying pages full of five star awards. I saw very few three or four star awards. Something smelled fishy. Being a scientist by original training, I decided to run a little experiment to see if a completely worthless piece of software would win any awards.

Having seen various recommendations for a submission service on the ASP forums, I emailed its owner, Mykola Rudenko, to ask if he could help with my little experiment. To my surprise, he generously agreed to help by submitting “awardmestars” to all 1033 sites on their database, free of charge.

According to the report I received 2 weeks after submissions began, “awardmestars” is now listed on 218 sites, pending on 394 sites and has been rejected by 421 sites. Approximately 7% of the sites that listed the software emailed me that it had won an award (I don’t know how many have displayed it with an award, without informing me). With 394 pending sites it might win quite a few more awards yet. Many of the rejections were on the grounds that “The site does not accept products of this genre” (it was listed as a utility), rather than on quality grounds.

The truth is that many download sites are just electronic dung heaps, using fake awards, dubious SEO and content misappropriated from PAD files in a pathetic attempt to make a few dollars from Google Adwords. Hopefully these bottom-feeders will be put out of business by the continually improving search engines, leaving only the better sites. I think there is still a role for good quality download sites. But there needs to be more emphasis on quality, classification, and additional content (e.g. reviews). Whether it is possible for such a business to be profitable, I don’t know. However, it seems to work in the MacOSX world where the download sites are much fewer in number, but with much higher quality and more user interaction.

Some download site owners did email me to say either “very funny” or “stop wasting my time”. Kudos to them for taking the time to check every submission. I recommend you put their sites high on your list next time you are looking for software.

This is the response I got from download site owner Lothar Jung when I showed him a draft of this article:

“The other side for me as a website publisher is that if you do not give each software 5 stars, you don’t get so many back links and some authors are not very pleased with this and your website. When I started, I wanted to create a site where users can find good software. So I decided the visitor is important, and not the number of backlinks. Only 10% of all programs submitted get the 5 Suns Award.”

Another important issue for download sites is trust. I want to know that the software I am downloading doesn’t contain spyware, trojans or other malware. Some of the download sites have cunningly exploited this by awarding “100% clean” logos. I currently use the Softpedia one on the PerfectTablePlan download page. It shouldn’t be too difficult in principle to scan software for known malware. But now I am beginning to wonder if these 100% clean logos have any more substance than the “five star” awards. The only way to find out for sure would be to submit a download with malware, which would be unethical. If anyone has any information about whether these sites really check for malware, I would be interested to know.

My thanks to the submission service for making this experiment possible. I was favourably impressed by the thoroughness of their service. At only $70 I think it is excellent value compared to the time and hassle of trying to do it yourself. I expect to be a paying customer in future.

** Addendum 1 **

This little experiment has been featured on digg, reddit, slashdot, stumbleupon and a number of other popular sites and blogs. Consequently there have been hundreds of comments on this blog and on other sites. I am very flattered by the interest. But I also feel like Dr Frankenstein, looking on as my experiment gains a life of its own. If I had known the article was going to be read by so many people I would have taken a bit more time to clarify the following points:

  • I have no commercial interest in, or prior relationship with, the three download sites mentioned. I singled them out because I infer from emails received that they have a human-in-the-loop, checking all submissions (or a script that passes the Turing test, which is even more praiseworthy). I offered all three a chance to be quoted in the article. Today I received a similar email from another download site, but they were too late to make the article. I don’t know if they read the article before they emailed me.
  • I have no commercial interest in, or prior relationship with, the automatic submission service mentioned. I approached them for help, which they generously provided, free of charge.
  • The only business mentioned in which I have a commercial interest is my own table planning software, PerfectTablePlan.

** Addendum 2 **

23 awards ‘won’ at the latest count.

Your code is sub-optimal!

When I saw the new Source Gear ‘evil mastermind’ t-shirts, I had to have one for my nerd wardrobe. So I struck a Faustian bargain with Source Gear mastermind Eric Sink. The photo below is part of the bargain. Sorry about that.

Given that version control isn’t the most wildly exciting of topics (no offence intended) I think their comic book campaign is a very imaginative piece of marketing. I wish them every success with it. While I am plugging Source Gear I should mention that they offer a free one-developer licence for Vault. This could be very useful for microISVs out there looking for a source control tool. So far I am pretty happy with Subversion, apart from the ‘don’t come crying to me for help’ approach to merging branches.

I also recommend Eric’s blog and the resulting book as an excellent source of marketing information for techies. It helped me a lot.

Have I earned my t-shirt yet, Eric? This blog has also earned me one sale of PerfectTablePlan and a couple of dollars in referral fees from e-junkie. At this rate I will soon be able to retire from programming to blog full time. Watch out, Scoble.

source gear evil mastermind

If you aren’t embarrassed by v1.0 you didn’t release it early enough

I cringe every time I hear about someone who has spent years writing their ‘killer app’, but still hasn’t released it. My preferred approach is to get a solid, but minimally featured, v1.0 out there and then iterate like crazy based on real customer feedback. There are a number of arguments for and against releasing early:

Against: Feature poverty

A common reason for holding back on a release is “my competitor has features A and B, so I have to have A and B”. BZZZZT. Wrong. If you are trying to compete feature for feature with a competitor who is already in the market, you are at a big disadvantage. By the time you have added A your competitor will have added B. Anyway, maybe some of your potential customers don’t want A or B. Perhaps they actually want something simpler. Or they really want C, which you can do in half the time of doing A and B. Microsoft has released a number of products that were derided at v1.0, but went on to dominate the market (Windows, for one).

Against: Reputation

If you release early, won’t you get a bad reputation? Only if you produce shoddy software that crashes all over the place. There is no excuse for that, even at version 1.0. The key is to pare down the features without sacrificing quality. Pick the smallest sub-set of features that will be useful. Then add more features at each subsequent release, based on user feedback.

The truth is, unless you are a big company with a lot of marketing muscle or you have picked a tiny niche, very few of your potential customers will ever hear about version 1.0 of your software anyway.

Against: Support overheads

As soon as you have customers you will have to spend considerable amounts of effort supporting them. The sooner you release the software the sooner you get this overhead.

Against: Release overheads

Creating a stable release is a lot of work, even if you manage to automate some of it. If you do more releases in a given period of time than your competitor you will inevitably spend a higher percentage of your time testing, proof reading and updating your website.

For: Feedback

Every product launch is a huge guess. If you have lots of competitors, you don’t know if you will be able to take customers from them. If you don’t have many competitors, you don’t know if there is a real market for your product. It is also tough to know what features people really want and how much they are prepared to pay. What people say and what they do are often quite different. Even if you manage to figure all that out, every market is constantly changing.

The only reliable way to find out if people will buy your product is to release it. As soon as you have paying customers they will let you know what you need to improve. Even emails from prospective buyers asking “does it do X?” can be very valuable. Many (perhaps most) successful products have ended up quite different to what the developers originally intended.

For: Motivation

Having customers is great for motivation. If you are working for a year or two on a project without customers to push you on, it is very easy to lose focus or run out of steam.

For: Failing faster

Despite the best efforts of all concerned many products fail. In fact I would guess that the majority of commercial products fail to recoup the initial investment. Yours could be one of them. If you are going to fail, you should fail as fast as you can so you can start over on something more profitable. The sooner you release and start asking people for money the sooner you will know if your product is a dog.

For: Cash-flow

The sooner you start selling, the quicker you start to recoup your investment. As a simple (contrived) illustration:

Company1 releases v1.0 after 6 months. As they improve the product and website, and word-of-mouth kicks in, sales increase linearly for the next 18 months: 10 sales in month 1, 20 sales in month 2, 30 sales in month 3 etc.

Company2 releases v1.0 after 12 months. They have the same sales: 10 sales in month 1, 20 sales in month 2, 30 sales in month 3 etc.

In 18 months Company1 will sell 1900 licences, whereas Company2 will sell only 910 licences. But surely Company2 will have a better product when they release it? Even if Company2’s sales grow twice as fast (20 sales in month 1, 40 sales in month 2, 60 sales in month 3 etc) they will still sell 80 fewer licences than Company1. Also, I believe Company1 will probably have a better product than Company2 at the 12 or 24 month mark, because they have 6 months more feedback on what customers really want and will pay for.
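The arithmetic can be checked with a few lines of Python. To reproduce the figures above, the comparison window has to include 19 sales months for Company1 and 13 for Company2 (the launches are 6 months apart, with sales counted inclusively from each launch month):

```python
def total_sales(months, growth=10):
    """Cumulative licences sold when sales grow linearly:
    growth in month 1, 2*growth in month 2, and so on."""
    return sum(growth * m for m in range(1, months + 1))

company1 = total_sales(19)           # 1900 licences
company2 = total_sales(13)           # 910 licences
company2_fast = total_sales(13, 20)  # 1820, even growing twice as fast

print(company1, company2, company1 - company2_fast)  # 1900 910 80
```

Even doubling Company2’s growth rate never recovers the 6 months of sales it never had.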


It is well known that the sooner you catch a mistake in development, the cheaper it is to fix. I believe this is just as true in marketing. A sure way to find these marketing mistakes is to release. You wouldn’t write a thousand lines of code before you tried to compile it. Why would you spend a year or more on development before testing it in the market? Creating software should be an incremental process.

The best time to release is a trade-off between the various factors above. Obviously your software has to be able to solve a real problem, or no-one is going to buy it. This is going to take longer for an air traffic control system than a back-up utility. But I would always try to release v1.0 in less than 6 months of elapsed time if it is my money paying for the development (I don’t write air traffic control systems). Spending a year or more writing something with no real customer feedback is more risk than I am prepared to accept. If you think it isn’t possible to produce something useful in that time, then maybe you aren’t being creative or brutal enough with the feature set. As a rule of thumb, I would say that if you aren’t embarrassed by the lack of features in v1.0, then you didn’t release it early enough.