
The psychology of successful bootstrappers

I am curious about how the people who bootstrap software businesses are different to the general population, and to each other. I investigated this using a standard (‘big 5’) personality test. I think the results make for interesting reading.

I asked a number of software company founders to complete an online personality test and send me their results. 18 of them did (19 including me). You have probably heard of some of them, however I promised anonymity. We are all founders of bootstrapped (i.e. not VC funded) software product companies and have been involved in programming a significant portion of our products. Most of us are solo founders. Some of us (including myself) are lifestyle programmers, others have employees. We are all successful to the extent that we make a living from our software product sales. None of us are billionaires (Bill Gates probably wouldn’t return my email).

The test measures personality across 5 major axes of personality identified by psychologists:

  • Extraversion (outgoing/energetic vs. solitary/reserved) – how much you derive satisfaction from interacting with other people.
  • Conscientiousness (efficient/organized vs. easy-going/careless) – how careful and orderly you are.
  • Neuroticism (sensitive/nervous vs. secure/confident) – how much you tend to experience negative emotions.
  • Agreeableness (friendly/compassionate vs. analytical/detached) – how much you like and try to please others.
  • Openness (inventive/curious vs. consistent/cautious) – how much you seek out new experiences.

See Wikipedia for more details.

For each personality axis I have created a histogram of the results, showing how many founders fit in each 10% ‘bin’ compared to the general population. For example, for extraversion: 0 bootstrappers were in the 1-10 percentile (i.e. least extrovert 10%) of the general population, 1 founder was in the 11-20 percentile, 2 were in the 21-30 percentile etc.

[Histograms: extraversion, conscientiousness, neuroticism, agreeableness, openness]

                     Extraversion   Conscientiousness   Neuroticism   Agreeableness   Openness
mean                      59.9            61.7              37.6          48.3          50.3
standard deviation        23.0            21.9              23.1          21.1          23.2
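The decile binning described above can be sketched in a few lines of Python. The scores below are made up purely to illustrate the arithmetic; the real anonymised scores are in the CSV linked at the end of the post:

```python
# Bin percentile scores (1-100) into ten 10-point bins: 1-10, 11-20, ..., 91-100.
# These scores are hypothetical, for illustration only.
scores = [14, 22, 27, 35, 38, 41, 44, 52, 55, 58, 61, 63, 66, 71, 74, 78, 82, 85, 91]

counts = [0] * 10
for s in scores:
    counts[min((s - 1) // 10, 9)] += 1  # e.g. 14 lands in bin 1, i.e. the 11-20 bin

# Print a crude text histogram, one row per bin.
for i, c in enumerate(counts):
    print(f"{i * 10 + 1:3d}-{(i + 1) * 10:3d}: {'#' * c}")
```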

If bootstrappers were like the general population we would expect each bar to be the same height, with a bit of random variation, and the average score to be 50. Clearly this is not the case.

We are more extrovert on average than the general population. Although programming is stereotypically a profession for introverts and quite a few of us work alone, you need to get yourself noticed and interact with customers and partners to be a successful bootstrapper.

We are more conscientious on average than the general population. Shipping a software product requires a lot of attention to detail.

We are less neurotic on average than the general population. You need some self-belief and a thick skin to weather the ups and downs of being a bootstrapper.

We are about average for agreeableness. However the scores are not evenly distributed: only 1 founder scored above the 70th percentile. Perhaps being too ready to please, rather than following your own vision, is a handicap for bootstrappers.

We are about average for openness. But the scores are clumped around the centre. Initially I was a bit surprised by this result. I expected bootstrappers to be inventive/ideas people and to score well above average. But perhaps the people who score very highly on openness are easily distracted (squirrel!), and never get anything finished.

The 5 personality traits are supposed to be orthogonal (not correlated). I picked some random pairs of traits and drew scatter plots, and that does indeed appear to be the case. For example, extraversion doesn’t appear to be correlated with conscientiousness:
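As a numeric check to back up the eyeballed scatter plots, Pearson’s correlation coefficient can be computed directly. This is only a sketch with made-up paired scores (the real per-founder data is in the CSV linked at the end); r near 0 means no linear correlation:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical paired percentile scores for two traits.
extraversion      = [35, 80, 62, 55, 90, 41, 70, 66, 48]
conscientiousness = [72, 55, 88, 60, 45, 77, 66, 93, 58]
print(f"r = {pearson(extraversion, conscientiousness):+.2f}")
```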

[Scatter plot: extraversion vs. conscientiousness]

I am aware that this survey suffers from some shortcomings:

  • The test is fairly simplistic. It doesn’t begin to capture what unique and precious little snowflakes we all are. However I don’t think I would have any results at all if I asked people to complete a massive survey. We are busy people.
  • Any survey suffers from selection bias. I am more likely to know other founders who are extroverts (the introverts probably go to fewer conferences). It is also likely that the people who responded were more conscientious and agreeable than those who didn’t!
  • 19 is a small sample size.

Correlation doesn’t imply causation. So these results don’t prove that high levels of conscientiousness and extraversion and low levels of neuroticism make you proportionally more likely to succeed at bootstrapping a software company. But, given that personality is considered fairly stable over time, it seems unlikely that the success caused the personality traits. However both could be correlated to some underlying factor, e.g. these traits could conceivably make you more likely to try starting a software business, but no more likely to succeed. Or the correlations could conceivably be a statistical fluke. I leave it as an exercise for an interested reader to work out the exact level of statistical significance of these results. It would be interesting to compare these results with those of people who tried to bootstrap a business, but failed. However such data might not be easy to come by.
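For anyone who wants a head start on that exercise, a one-sample t-test against the population mean of 50, using the summary statistics above, is one rough approach. It assumes the percentile scores are approximately normally distributed and ignores multiple comparisons, so treat it as a back-of-the-envelope check rather than a rigorous analysis:

```python
import math

n = 19
pop_mean = 50.0   # expected mean percentile for the general population
crit = 2.101      # two-tailed critical t for df = 18 at p = 0.05

# Sample means and standard deviations from the table above.
traits = {
    "Extraversion":      (59.9, 23.0),
    "Conscientiousness": (61.7, 21.9),
    "Neuroticism":       (37.6, 23.1),
    "Agreeableness":     (48.3, 21.1),
    "Openness":          (50.3, 23.2),
}

for name, (mean, sd) in traits.items():
    t = (mean - pop_mean) / (sd / math.sqrt(n))
    verdict = "significant" if abs(t) > crit else "not significant"
    print(f"{name:17s} t = {t:+.2f}  ({verdict} at p < 0.05)")
```

On these numbers only conscientiousness and neuroticism clear the threshold, which chimes with Dr Walling’s point later in the post that the variability in the data is large relative to the differences in means.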

Given what I know about the trials of starting your own software business, I think an above average level of conscientiousness and extraversion and a low level of neuroticism are a real asset. However it is also clear that the personalities of individual founders vary a lot. So don’t be disheartened if you don’t fit this profile. There are successful bootstrappers who don’t fit the profile. Personality is not destiny. And you can always partner with or employ someone who has complementary personality traits. But if you are a slapdash neurotic who doesn’t like talking to other people, perhaps bootstrapping a software company isn’t for you. A career in government-funded IT projects might be more suitable.

I sent a draft of this post to Dr Sherry Walling for feedback. Sherry is particularly well qualified to comment, as she is both an adjunct Professor of Psychology and married to well-known bootstrapper/micropreneur Rob Walling. Her response (paraphrased a bit) was:

“Your standard deviations are quite large, which indicates that there is quite a lot of variability in your data. You would much rather have standard deviations between 0 and 10 when working with this kind of scale.

From my perspective, the only domain where I would expect significant difference is Conscientiousness. Conscientiousness is an essential bootstrapper trait. I am not sure how a solo founder could be successful if he/she is not naturally conscientious.

There are so many ways to be a successful bootstrapper. A neurotic person can fuel his sensitivity to negative emotions into hard work. A less neurotic person may not have enough anxiety to get up early and get to work. On the other hand too much neuroticism can be very debilitating. I don’t think there is a formula. The combination of factors could vary tremendously with each person, but conscientiousness is the one that seems essential.”

If you want to do your own analysis, the anonymised results are available to download as a CSV file here.

Many thanks to everyone who took part in the test.

You can do the test yourself. You don’t have to give your email address or answer the additional questions at the end. How do you compare?

2010 microISV Pain Point Survey

Russell Thackston is running a survey to try and find out which tasks cause microISVs the most pain. He is then going to use the results of this survey to guide further research at the microISV Research Alliance at Auburn University. I have completed the survey and will be interested to see what the results are. You can take the survey here. You could win an iPod touch or an iPod shuffle. The survey will run until 21st August.

Lessons learned from 13 failed software products

“No physician is really good before he has killed one or two patients.” – Proverb

Software entrepreneur culture is full of stories of the products that succeeded. But what about the products that failed? We rarely hear much about them. This can lead to a very skewed perspective on what works and what doesn’t (survivor bias). But I believe that failure can teach us as much as success. So I asked other software entrepreneurs to share their stories of failure, in the hope that we might save others from making the same mistakes. To my surprise I got 12 excellent responses, which I include below along with one of my own. It is a small sample and biased by self-selection, but I think it contains a lot of useful insights. It is unashamedly a long post, as I didn’t want to lose any of these insights by editing it down.

Case #1: DRAMA

Contributor

Andy Brice.

The product

DRAMA (Design RAtionale MAnagement) was a commercialization of a University prototype for recording the decision-making process during the design of complex and long-lived artefacts, for example nuclear reactors and chemical plants. By recording it in a structured database this information would still be available long after the original engineers had forgotten it, retired or been run over by buses. This information was believed to be incredibly valuable to later maintainers of the system, engineers creating similar designs and industry regulators. The development was part funded by 4 big process engineering companies.

Why it was judged a commercial failure

Everyone told us what a great idea it was, but no-one bought it. Despite some early funding from some big process engineering companies, none of them put it into use properly and we never sold any licences to anyone else.

What went wrong

  • Lack of support from the people who would actually have to use it. There are lots of social factors that work against engineers wanting to record their design rationale, including:
    • The person taking the time to record the rationale probably isn’t the person getting the benefit from it.
    • Extra work for people who are already under a lot of time pressure.
    • It might make it easier for others to question decisions and hold companies and engineers accountable for mistakes.
    • Engineers may see giving away this knowledge as undermining their job security.
  • Problems integrating with the other software tools that engineers spend most of their time in (e.g. CAD packages). This would probably be easier with modern web-based technology.
  • It is difficult to capture the subtleties of the design process in a structured form.
  • A bad hire. If you hire the wrong person, you should face up to it and get rid of them, rather than keep moving them around in a vain attempt to find something they are good at.
  • We took a phased approach, starting with a single-user proof of concept and then creating a client-server version. In hindsight it should have been obvious that not enough people were actively using the single-user system and we should have killed it then.

Time/money invested

At least 3 man years of work went into this product, with me doing most of it. Thankfully I was a salaried employee. But the lack of success of this product contributed to the demise of the part of the company I was in.

Current product status

The product is long dead.

Any regrets?

It was a fairly painful experience. I would rather have spent all that money, time and energy on something that someone actually used. But at least I learnt some expensive lessons without using my own money.

Lessons learned

  • Creating a new market is difficult and risky.
  • Changing people’s working habits is hard.
  • Social factors can make or break a product. The end-users didn’t see anything in it for them.
  • If the end-users don’t like a product, they will find a way not to use it, even if their bosses appear to be enthusiastic about it.
  • Talk is cheap. Lots of people telling you how great your product is doesn’t mean much. You only really find out if your product is commercially viable when you start asking people to buy it.

Case #2: CleanChief

Contributor

Sam Howley.

The product

CleanChief was to be ‘The easy management solution for cleaning organisations’. Managing assets, employee schedules, ordering supplies – you name it, CleanChief handled it. Essentially it was lightweight accounting software for cleaning companies.

Why it was judged a commercial failure

A small number of copies were sold. No one is actively using it at present. Once I realised that it wasn’t a complete product and that additional development was required I moved on to other product ideas. I had basically run out of enthusiasm for the product.

What went wrong

  • I am not an accountant.
  • I have never run a cleaning company.
  • I developed it for more than two years without getting feedback from real cleaning companies. I was arrogant enough to think that I knew what they wanted (or could work it out on my own). Or maybe it was that I was just where I was most happy and comfortable – writing software. Talking to real users was new and to be honest a bit scary for me.
  • A successful cleaning company operator, a friend of a friend, offered to become involved for a 30% share. This was a gift from the heavens, exactly what I needed. I refused.
  • In a way, even though I spent so long on the product, I gave in too soon. I was just getting feedback from real users, just getting my first batch of sales, when I decided to move on.
  • I developed the application in VB6 even though I knew it was outdated technology when I started the project. This meant there was no ‘cool factor’ when discussing it with other developers. I told myself it didn’t bother me, but it probably did.

Time/money invested

I worked on it at night and weekends for about 2 1/2 years. I paid for graphic design work, purchased stock icons and images. I probably spent a couple of thousand Australian dollars in total and an awful lot of time.

Current product status

I moved on to other products that have gone much better. My newer products were released in months rather than years and I looked for real feedback from real users from day one. They are:

I do occasionally ponder returning to CleanChief and trying to raise it from the ashes.

Any regrets?

No. Looking back, I learned a few lessons from a huge amount of time and work; it was a very inefficient way to learn those lessons. But when you are new to something like starting a business or creating useful software, being inefficient at learning lessons is the best you can do. It’s a thousand times better than not learning lessons at all.

I learned so much more in my two and a half years of trying to develop CleanChief than I did in the two and a half years prior to that, during which time I really wanted to start a software business but didn’t take any action.

Lessons learned

Hearing or reading some piece of advice is totally different to living it. Here are some of the ideas that I always agreed were true but didn’t fully understand the implications of until I had lived them out:

  • Force yourself to get out and talk to people. Ask their advice. Almost everyone will help if you ask them for feedback.
  • Force yourself to cold call a few businesses in your target market.
  • Create a plan of how to market your product.
  • Try and use your product as much as possible as you build it.
  • Get out of your comfort zone from day one.
  • Do not have the mind set that the day you release version 1.0 is the finish line, it’s the starting line, so hurry up and get there.

Case #3: Chimsoft

Contributor

Phil Anderson.

The product

ChimSoft – Software for Chimney Sweeps.

Why it was judged a commercial failure

I believe this failed for two reasons:

  • Focusing on too small a niche.
  • Not being able to work full-time on it.

I don’t consider it a complete failure because I sold two copies when it retailed for $2k, and maybe 10-15 more copies when I lowered the price to $200. Those sales proved that I wasn’t completely off base in thinking there was a market for the software, but the cost of customer acquisition was too high and the market too small. Customers wanted to have a bunch of phone calls, face-to-face meetings etc… the type of stuff you only see with much more expensive software. The problem was that for a niche this small we had to charge a lot of money to make it worthwhile for us, but the customers were small businesses for whom this is a major investment, so the fit was never right. The other issue was that the people who did buy it were not super tech-savvy, so there was a high cost of support that made even a $200 product not worth it.

What went wrong

  • Having partners who were not full-time, but who had equal equity. I ended up doing most of the work, and the main reason I didn’t force success is that I felt I was in it alone.
  • Focusing on too narrow a niche. The plan all along was to expand to all service industries, but it was much harder to make that move than we expected.
  • Not researching pricing more. We knew small businesses made major purchases for things that really helped their business, but I think it would have been better to have a cheaper product with wider appeal than an expensive product with narrow appeal.

Time/money invested

I invested maybe a year of time and $3k into the company. I did not take any huge risks on it, so there were no big negative outcomes.

Current product status

The company folded in 2007, I refocused my efforts on my existing companies (AUsedCar.com and BudgetSimple.com) and both have been doing well enough that I quit my day job.

Any regrets?

I don’t regret it entirely, I think I learned several valuable lessons about working with other people, small business sales, trade-shows and software development.

Lessons learned

  • Pick partners wisely. Don’t try to be even-steven with equity. Use restricted stock to ensure everyone does their part.
  • Know what your customers expect (24/7 phone support?) to determine if you can do this while working a day job.

Case #4: PC Desktop Cleaner

Contributor

Javier Rojas Goñi.

The product

PC Desktop Cleaner. Simple software that cleans your desktop and archives your files.

Why it was judged a commercial failure

My goal was to sell 10 units per month. I’ve sold less than 1 unit per month.

What went wrong

  • I think that the product concept is not useful enough. It’s not a thing that people would pay for.
  • The market exists (some people buy) but it’s too small or difficult to reach.
  • I didn’t do any market research. I just fell in love with the idea and did it. Later I learnt to use “lazy instantiation marketing” and have trashed a lot of embryo projects. :-)

Time/money invested

I think I wasted nearly $500 on development tools and some freelancers. Not too much.

Current product status

I’m still selling it. I’ve thought about other products, but not really decided yet.

Any regrets?

No, it was a lot of fun and I learnt a lot of things. In my “day job” I own a small firm that sells software for production scheduling. I learnt a lot about SEO and AdWords on the PC Desktop Cleaner project that I’m now using with great results.

Lessons learned

Go for it, maybe you win, maybe you fail, but you will grow and get tons of useful knowledge on the way.

Case #5: Smart Diary Suite

Contributor

Dennis Volodomanov.

The product

Smart Diary Suite.

Why it was judged a commercial failure

It sells and the profits cover current investments in the product, but there is little left over on top of that.

What went wrong

If I had a chance to do anything differently:

  • Take it seriously from day one.
  • Never stop developing and supporting.
  • Invest as much as possible in marketing early on.
  • Don’t stop believing in your creation.

Time/money invested

Up to this point, I have spent 13 years on Smart Diary Suite and a lot of money went into buying hardware, software, hosting, marketing, etc… All of that money came from my day job, but at this point SDS has recovered all of that back and is now making a small profit. The actual amount is hard to calculate (over the 13 year span), but we would be talking in tens of thousands of US dollars.

Current product status

For a while it may have seemed like SDS was not going to be successful, but that’s probably my fault – I stopped believing for a little while. Now I am back, starting again, and this time I’ll make sure it doesn’t fail.

Any regrets?

I do not regret doing it. I regret allowing myself to stop working on it, basically bailing out on it for a while – that is my biggest mistake.

Lessons learned

If you want a successful product – believe in it and let others know that you believe in it.

Case #6: Highlighter

Contributor

Mike Sutton.

The product

Highlighter. A utility to print neatly formatted, syntax highlighted source code listings.

Why it was judged a commercial failure

I earnt a grand total of £442.52 (about $700 in today’s money) in just over two years, so I guess it paid for itself if you exclude my time.

What went wrong

It was my first product and I was very green about both marketing and product development. I would suggest the following would have made things better:

  • Get feedback from potential users about the product (e.g. from the ASP forums). Some parts of the program were probably too option-heavy and geeky.
  • Diversify. If people didn’t want to print fancy listings, maybe they would have wanted them formatted in HTML.
  • Better marketing. I’m not sure this would have saved it, but all I knew in those days was uploading to shareware sites. I never even sent a press release.

I figure it failed simply because it was a product nobody wanted. More important than that, it was a product *I* didn’t want to use. It grew out of a larger product I was working on, on the assumption I could earn some money on the side from part of the code. Since then I’ve stuck to products which I’ve actually wanted to use myself. There’s a lot to be said for dogfooding, not just for debugging, but for knowing where the pain points are and what extra features could be added.

Time/money invested

I would guess a couple of months of evening/weekend development time. Financially there was little spent, except that I offered the option of a printed manual and CD for an extra charge. One customer took me up on the offer, so I had to get 100 manuals printed and 99 of them went in the bin.

Current product status

I moved on to another product which has sold over £50,000 and a third which has earnt even more than that. Not enough to retire on but considering I only do this part time it must work out at a great hourly rate. There’s a lot to be said for not giving up…

Any regrets?

Nope. I figure every failure in life teaches you valuable lessons. Of course if I’d made a large financial investment I may feel differently, but that’s one of the big advantages of software over physical product sales.

Lessons learned

Just to reiterate – develop something which you find useful, instead of second guessing others.

Case #7: R10Clean

Contributor

Steve Cholerton.

The product

R10Clean. A data cleaning and manipulation tool.

Why it was judged a commercial failure

In the 18 months or so it’s been on the market I have sold 6 copies. It has been priced at £199, £99 and £19 – with no effect on sales!

What went wrong

I’m not sure what I did wrong. Maybe the product is too techie?

Time/money invested

No effect financially as at the time I was in a strong financial position.

Current product status

I still have it for sale but do not market it at all. I have other products.

Any regrets?

I don’t regret it, as it saved me a ton of time when I was working with legacy databases a lot. As a commercial product it has been raved about (once!) and received a good review from the Kleper report, but it has failed totally.

Lessons learned

Advice to others? Just because you need it personally, don’t assume the rest of the world does too. :-)

Case #8: nBinder

Contributor

Boghiu Andrei.

The product

nBinder packs multiple files into a stand-alone executable, with over 50 advanced output and file-unpack options, conditional run and commands.

Why it was judged a commercial failure

It was the first product I began selling. It sold to 300+ customers in 4 years. But over the last year or so sales declined and finally stopped completely.

What went wrong

  • The biggest problem was that, because it was a packer intended for people who wanted to pack their products (software or games) into a single package (compressed and encrypted), many used it to create malware, binding malware files to legit files and then distributing the output so it wasn’t detected by antivirus software (although it would be detected at runtime). Because of this I had lots of problems with antivirus companies that flagged files created with nBinder as malware. This of course affected legit users, as their files would be falsely marked as malware. I used virustotal.com to see which antivirus products detected it and contacted the antivirus manufacturer as soon as I spotted the problem. In most cases they would remove it from their definitions. But it was an uphill battle, because it would appear again in a matter of weeks. Some small AV companies didn’t even bother to reply to my emails asking them to fix the problem. Others were using heuristics to flag files created with my applications, and AV developers were reluctant to whitelist files created with nBinder. You can imagine that it was enough for an AV such as Kaspersky or Norton to mark my files as malware for a day and customers would be affected and not use my product any more, especially as it took about 3 days for AVs to remove the false positive.
  • Infrequent updates. Due to lack of time I only updated the product once or twice a year and this affected the product a lot.
  • No marketing. I decided that I didn’t want to invest money in marketing so, except for a short AdWords campaign, I invested no money in marketing.
  • My decision to develop 3 products instead of concentrating on one or two affected development time and quality. I have worked on 3 products simultaneously instead of concentrating on making a single good one. The reason I worked on 3 is because I enjoyed developing different software in different categories. I didn’t start this for money but for the fun of development.

Time/money invested

I invested almost no money (except for hosting costs). Time invested I can’t really say exactly, but not too much as I only worked on nBinder in short bursts like 6 hours a day for a week or so before releases.

Current product status

Still for sale. My other products are:

Any regrets?

It’s not a total failure as I did make some money out of it with no investment, so I don’t regret starting it, but it could have been much better.

Lessons learned

Words of advice for others trying to make money from software development:

  • Study the market and the current trends very well.
  • Before deciding to take on large competition make sure you have something better (at least from one point of view) than the competition (for example, you might not have the same features but you have a better GUI and general presentation).
  • Do not get scared of an overly populated market segment. For example, with nBinder I picked a segment with very little competition but also few possible users, and the results were not so great (I didn’t have many users). With nCleaner I went head-to-head with lots of already established products, but the market is very big. Although nCleaner is free it has had the most success, because there are so many potential users (anyone with a PC, actually): it has had over 2 million downloads and I still receive lots of emails about it, even though the last update was in 2007. So it is possible to succeed in a market with lots of competition with no investment, but it’s hard to reach the level of more established products.

Case #9: Net-Herald

Contributor

Torsten Uhlmann.

The product

Net-Herald – a monitoring application for water supply companies. It was a complex client-server application that would receive monitoring data from specialized hardware and store it in a SQL database. The client displays that data in different graphs, provides printable reports, or sends alarm messages via SMS if a monitored value is not within its specified limits.

I developed Net-Herald as a perfect fit for the specialized hardware provided by a local manufacturer. That way, so I hoped, I could profit from their sales leads and would find a smoother way into these water supply companies. The downside, of course, was that my software would only work with their hardware.

Why it was judged a commercial failure

I sold a first license fairly soon after I had a sellable product, although it took the customer nearly a year until they finally bought. But since then I sold only one more license within the last 4 years or so.

What went wrong

  • I didn’t do my own marketing and the hardware guys weren’t really concerned with selling my software.
  • Water management companies have a terribly long sales cycle. Other vendors’ monitoring applications usually cost tens of thousands and are geared toward large suppliers. Whenever a supplier buys into such a product they are unlikely to change within the next decade or more. I tried to position my software towards small suppliers, but even then most of them were already locked into another vendor’s solution.
  • My software only worked with specific hardware. That narrowed the market down substantially.
  • In the end the software became too complex for one poor mortal to maintain. Because the software didn’t produce any substantial income I had to stop adding new features which would make it attractive for more prospective clients.
  • This kind of software is not sold over the Internet. Rather it needs very active sales people that nurture clients over a rather long period of time.
  • All these facts indicate that software like this should not be developed by a one man show.

Time/money invested

The development time for the first sellable version was maybe about 9 months. I didn’t have a job income at that time, but got funding due to government support for small start-up businesses. So I didn’t drain our family’s personal finances. But I did of course invest a great deal of time and sweat.

Current product status

Now, I have drawn a line and stopped active development of Net-Herald. I still do some custom extensions for my first clients. But I no longer market the software. I have instead focused on my consulting services. I also try to learn developing and selling software with my cross-platform drag and drop product Simidude.

Any regrets?

I haven’t yet succeeded in selling my own software (which is still my goal), but I do not regret trying. I developed Net-Herald using (Java) technologies that now give me leverage at my consulting gigs. All in all it was a heavy ride. But it was fun and I would do it again.

Lessons learned

  • My biggest mistake was the lack of market analysis. I trusted the word of the hardware manufacturer without verification.
  • I have written more about the above and some other failures on my blog.

Case #10: HabitShaper

Contributor

Adriano Ferrari.

The product

HabitShaper – set and track daily targets for your goals (weight loss, quit smoking, jogging, writing, etc…).

Why it was judged a commercial failure

I sold a few copies, but not enough to make back the time I invested in it, and my conversion numbers and traffic are below average.

What went wrong

  • Did not do enough pre-production research (talking to customers, etc).
  • Did not do a large enough beta to make up for lack of initial research.
  • Ignored gut-feeling that my product is better suited to being web-based and multi-platform (incl. mobile).
  • Did EVERYTHING myself (logo, web design, video, software, AdWords, etc).

Time/money invested

I worked on it for two years, part-time, while doing my Masters/PhD in Physics. It had no impact on my finances (very little money invested) or circumstances.

Current product status

I am relaunching as a web-based product this summer.

Any regrets?

Not in the least! I learned about as much from making HabitShaper as I have from my MSc thesis and PhD work.

Lessons learned

  • Most important: PAPER prototypes, minimum viable product, and iterate.
  • Don’t be afraid to launch early.
  • Launch a little bigger than you’d expect (it’s harder to find those initial customers than you think).
  • Don’t be afraid to change directions, especially early on.
  • Doing things yourself is a great learning experience, but if you want to get your product out to customers as fast as possible, don’t be afraid to invest money and outsource your weaknesses.

Case #11: BPL

Contributor

Jim Lawless.

The product

BPL – Batch Programming Language Interpreter.

Why it was judged a commercial failure

I sold about 10 copies.

What went wrong

  • I didn’t really do enough research to find out if the target market was in existence. I was hoping that network admins and support staff members would find it easier to use than batch files and less complicated than any of the free scripting language options available. So, I just rushed to get the MVP (Minimum Viable Product) out the door.
  • I never did provide a compiler that would build a stand-alone EXE. I think that might have met with more success.
  • I didn’t do much as far as advertising the existence of the product.

Time/money invested

I only spent a few weeks coding and documenting it in my spare time. Support issues sometimes took a whole evening, but nothing major. It did not have any impact on my finances as I had invested nothing but my time.

Current product status

I will still address support issues with this product for registered users, but I don’t actively sell it. I’ve open-sourced the program and it still really isn’t seeing heavy use.

I was more successful with other products. I have a few retired products that saw some good bulk-purchase deals (command-line DUN HangUp, command-line scheduler) and I still sell the following (for Windows):

All of the above still bring in a modest passive income.

Any regrets?

Not at all. “Nothing ventured,…”.

Lessons learned

Had I not attempted to bring the BPL product to life, I might still be sitting here wondering “what if?” I think it was very beneficial for me to invest the time to try out this idea.

Case #12: Anonymous

Contributor

Anonymous.

The product

A time tracker.

Why it was judged a commercial failure

Because it is not my primary income. I acquired about 150 customers in one year.

What went wrong

  • No marketing.
  • No real thought into features.
  • I don’t spend any time on it.

In my defense, the reason I do not spend much time on it is that the market became saturated with ‘me toos’ right after I released, which was only to be expected. In fact, as I was looking for users, I got an email from a competitor suggesting that I not enter the market because they were working on the same thing! I don’t know what I would do differently. Maybe spend more time on it? I think the law of diminishing returns applies quite early in this space, so I am not sure.

Time/money invested

Since inception (Nov 2008), I’ve spent close to 250 hours total. Total cash outlay was something like $500.

Current product status

I never tried to make it succeed, to be honest. It was only a learning experience for me. What I probably need now is to go all in. Quite frankly, if I double the sales for this product, I can quit all consulting work. But I really do not think it is a good idea to work on this app full time as it is too simple.

Any regrets?

Definitely not.

Lessons learned

  • Do it!
  • Solve a problem people know they have.
  • Don’t invest too much time and money at the beginning.
  • Don’t be wedded to a particular idea.
  • Don’t only listen to your customers. Listen to yourself. After all, you created the idea which attracted the customers.
  • Never promise a feature for a sale. I’ve never done it but the pressure is really great. My stock response is always: “While such a feature may be available in the future, I recommend that you only use current features when deciding on your purchase.”
  • Do use Google to your advantage.

Case #13: ScreenRest

Contributor

Derek Pollard.

The product

ScreenRest – a consumer software product that reminds users to take regular rest breaks while using their computer.

Why it was judged a commercial failure

ScreenRest failed commercially because we built a product without having a clearly defined market.  This was compounded by it offering prevention, not a solution. ScreenRest continues to regularly sell a small number of licences but not in sufficient quantity to justify further enhancements.  The conversion rates are good, but there are simply not enough visitors to the website.

What went wrong

  • Not doing market research first.
  • Creating a prevention rather than solution product – people generally wait until they have a problem and then look for a solution.
  • Creating a product with medical associations – the SEO and PPC competition for related keywords is prohibitive for a product with a low purchase price.

Time/money invested

At least £2000 was spent on the project, including software licences and additional hardware. The product and website were created over roughly 12 months by myself and my wife Lindsay, some during spare time, then part-time and finally full-time, so it is difficult to determine the total number of hours. Working part-time and then full-time on ScreenRest caused a significant impact on our finances, although right from the beginning we saw this as an investment in building a business.

Current product status

Once the product was complete and we started learning SEO, it became all too apparent that organic search traffic for related keywords was going to be insufficient. Research into PPC then revealed that the price point was too low to support purchasing medical terms. Planned features for ScreenRest have been put on hold and no further marketing is planned. We continue to support new and existing ScreenRest customers and plan to do so for the foreseeable future.

Rather than create another software product, we chose to use what we had learned about marketing, copywriting and SEO to create a series of websites targeting a range of topics (often known as niche sites). We are expanding the most successful of these sites in value and functionality to fill gaps not serviced by the competition.

Any regrets?

No.  ScreenRest succeeded in every way intended, other than commercially.  Creating it was a rewarding learning exercise that started us down a path to finding the intersection of our skills, experience and market opportunities.

Lessons learned

  • Start with market research – creating a high-quality product you believe in is not enough on its own.
  • Make sure you can identify a specific target market, that you can reach that market and that it is large enough to support your financial goals.

Conclusion

Analysing the above (admittedly small and self-selected sample) it is clear that by far the commonest causes of failure were:

  • lack of market research
  • lack of marketing

With the benefit of 20/20 hindsight it seems blindingly obvious that we should:

  • spend a few days researching if a product is commercially viable before we spend months or years creating it
  • put considerable effort into letting people know about the products we create

Yet, by my count, a whopping 6 out of 13 of us admitted to failing to do each of these adequately. Probably we were too busy obsessing over the features and technical issues so beloved of developers, which actually contributed to far fewer failures.

It is also noticeable that, despite the failure of these products, there are few regrets. Important lessons were learned and no-one lost their house. Many of us have gone on to develop successful products and the others will be in a much stronger position if they do decide to try again.

A big thank you to everyone who ate a large slice of humble pie and submitted the above. I hope we can prevent other budding software entrepreneurs making the same mistakes. Even if you don’t succeed, you will learn a lot.

Feel free to add your own hard-won lessons from failure in the comments below.

“No physician is really good before he has killed one or two patients.” – Hindu proverb


A survey of ecommerce providers for software vendors

Overview

The choice of ecommerce provider is probably one of the more important ones you make as a software vendor. It isn’t too hard to compare providers by feature set or price. But what about other vital attributes, such as support, reliability, ease of set-up and how they treat your customers? It isn’t realistic to try every provider, so this major decision is often made on the basis of haphazard anecdotal evidence from forums. I created a survey in an attempt to gather some systematic data on the ecommerce providers most commonly used by small software vendors. I present the results below without fear or favour. Skip ahead to ‘Overall ranking’ if you are in a hurry.

Methodology

I posted a request for survey responses on this blog and on a few forums frequented by microISVs and small software companies. Any vendor of software (desktop or web based) not directly affiliated with an ecommerce provider was eligible to take part. Software vendors were invited to fill out a survey form on wufoo.com for each ecommerce provider they had used in the last 2 years. They had to supply their product URL and an email address from the same domain so that I could verify their identity. They also had to check a box proclaiming:

I am a software vendor and I have used this Ecommerce provider in the last 2 years. I have no commercial interest beyond being a customer. (If you have affiliate links to the Ecommerce provider, that isn’t a problem.)

They then had to reply to an automated email from wufoo to the email supplied confirming it was them that had completed the form. If they didn’t reply to the automated email I followed up with a few more emails. Although tedious for me, I felt this was an important safeguard to avoid any possibility of fraudulent entries. I also checked for duplicate entries, duplicate IP addresses and other suspicious patterns. The survey was open from the 5th to the 8th October. Any responses not validated by 10th October were removed from the data.

The data

202 survey responses were received from 166 different software vendors. 9 responses were rejected as I could not verify their identity (they didn’t respond to several emails). 1 response was rejected due to a possible conflict of interest raised by the software vendor (they had done paid work for one of the providers). This left 192 valid responses. I saw no evidence of any attempt to rig the results.

You can download the raw data. It has been stripped of any personal identifying information. Feel free to do your own analysis or check my results.

Providers

The survey listed 14 of the major ecommerce providers, plus an ‘other’ box for providers not listed. Valid responses were received for 25 different ecommerce providers, as shown below:

responses

Note that ‘e-junkie+PayPal/GoogleCheckout/2Checkout’ has been shortened to ‘e-junkie’ for brevity.

Questions

Below I show the average (mean) score per ecommerce provider by survey question. The providers are sorted by score. Providers with fewer than 3 responses weren’t considered statistically valid and are not shown here (see the raw data for all responses).

Features

“How do you rate the range of features offered, e.g. coupons, support for multiple currencies, CD shipping, affiliate tracking, handling of tax etc.”

5=”Excellent”, 4=”Good”, 3=”Satisfactory”, 2= “Unsatisfactory”, 1=”Dismal”

features

Ease of use

“How easy is their system to set-up, manage and modify?”

5=”Excellent”, 4=”Good”, 3=”Satisfactory”, 2= “Unsatisfactory”, 1=”Dismal”

ease_of_use

Reliability

“How reliable is their service? Does their server ever go down?”

5=”Excellent”, 4=”Good”, 3=”Satisfactory”, 2= “Unsatisfactory”, 1=”Dismal”

reliability

Support

“How good is their support? Do they respond in a timely manner? Are their staff helpful and knowledgeable?”

5=”Excellent”, 4=”Good”, 3=”Satisfactory”, 2= “Unsatisfactory”, 1=”Dismal”

support

Fraud protection

“How well do they protect you from chargebacks and false positives (i.e. valid cards declined)?”

5=”Excellent”, 4=”Good”, 3=”Satisfactory”, 2= “Unsatisfactory”, 1=”Dismal”

fraud_protection

Ethics

“Does this service disrespect you (e.g. by paying you late) or your customer (e.g. by spamming them, adding unwanted items into their cart or making hidden charges)?”

5=”Excellent”, 4=”Good”, 3=”Satisfactory”, 2= “Unsatisfactory”, 1=”Dismal”

ethics

Value for money

“How do you rate their service compared to the cost?”

5=”Excellent”, 4=”Good”, 3=”Satisfactory”, 2= “Unsatisfactory”, 1=”Dismal”

value_for_money

Future

“What is the probability you will still be using this service in 12 months time?”

5= “>95%”, 4= “>75%”, 3= “>50%”, 2= “>25%”, 1= “<25%”

future

The average score and standard deviation for each question across all providers is shown below:

question_analysis

From the averages, software vendors are most happy with reliability and least happy with ease of use. From the standard deviations, the least variation is in fraud protection and the greatest variation is in support.

The correlation (R squared) between the likelihood of staying with a provider and the answers to the other 6 questions is shown below:

correlation

Perhaps providers should be concentrating more on ease of use and support to differentiate themselves from the competition.
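The R squared figures above can be reproduced from the raw data with a few lines of code. This is a minimal sketch of the calculation (the square of the Pearson correlation coefficient) between the ‘future’ scores and one other question; the six scores shown are hypothetical examples, not survey responses.

```python
# Sketch: R-squared between 'future' scores and another question's scores.
# The score lists here are hypothetical examples, not the survey data.

def r_squared(xs, ys):
    """Square of the Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return (cov * cov) / (var_x * var_y)

future = [5, 4, 5, 2, 3, 1]    # hypothetical 'future' scores (1-5)
support = [5, 4, 4, 2, 3, 2]   # hypothetical 'support' scores (1-5)
print(r_squared(future, support))
```

Running this for each of the 6 questions against the ‘future’ column of the raw data gives the chart above.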

Providers

Below I show the average (mean) score per question by ecommerce provider. The providers are shown in alphabetical order. The standard deviation is also shown to give an idea of how consistent the responses were (the larger the standard deviation the more variation there was in responses). Providers with fewer than 3 responses weren’t considered statistically valid and are not shown here (see the raw data for all responses).

avangate

bmt_micro

e-junkie

esellerate

fastspring

kagi

paypal

plimus

regnow

shareit

swreg

Overall ranking

The average (mean) score and overall ranking for providers with at least 3 responses is shown below.

overall

The chart below shows the score broken down by question (click to enlarge):

overall_detailed

The chart below compares the 4 top performers by question:

top_performers

Avangate, Fastspring, BMT Micro and e-junkie all did well. The difference between the Avangate and Fastspring scores (approx 0.3%) is probably too small to be statistically significant, but the survey shows significant differences between the best and worst providers. SWREG trails in last place by quite a margin, managing to place last or second to last in an impressive 7 out of 8 questions. It is also noticeable that the providers owned by industry heavyweight Digital River fill 4 out of the bottom 5 places in the ranking. This rather raises the question of how they got to be the industry heavyweight in the first place.

Note that the ranking does not show who the ‘best’ ecommerce provider is, for the following reasons:

  • ‘Best’ depends on your requirements. All the questions have been equally weighted here. If you decide (for example) that good support should be weighted higher than ease of use you might come up with a quite different ranking.
  • The assignment of numerical values to responses (e.g. Excellent=5, Good=4 etc) was done for easier analysis, but is entirely arbitrary. Different values might have resulted in a different ranking.
  • We aren’t comparing like with like. Software vendors using ‘lightweight’ e-commerce providers such as PayPal or e-junkie might have lower expectations than software vendors using ‘fully featured’ e-commerce providers.
  • e-junkie, SWREG, BMT Micro and RegNow had respectively only 8, 7, 5 and 3 responses. They are therefore vulnerable to statistical fluctuations.

That said, the ranking does correlate fairly well with the many comments I see about ecommerce providers on various forums. I don’t think I would want to use any of the providers in the bottom half of the ranking.

Conclusion

While one shouldn’t take the overall ranking too seriously, it is clear that there are major differences in the performance of ecommerce providers in important areas other than pricing and features. I hope these results will allow software vendors (myself included) to make a better informed choice of ecommerce provider. Hopefully this will, in turn, improve ecommerce services overall by rewarding the good companies at the expense of the poorer performers. It would be interesting to run this survey in another year or two and find out what has changed. Thank you to everyone that took part.

Disclosure: I use e-junkie+PayPal/GoogleCheckout/2Checkout as my payment provider for my Perfect Table Plan software. I have an affiliate link to them in another article on this blog which brings me a few dollars a month. I have no other commercial relationship with any of the other ecommerce providers.


How good is your Ecommerce provider?

ecommerce survey

It is important to choose the right Ecommerce provider for your business. A bad choice can have a significant impact on your sales and switching provider can be a major headache. But which one is the right one? It is easy enough to find out about prices and features, but what about the all-important intangibles such as support, ease of set-up and reliability? I hear a lot of good and bad reports about various vendors. I thought it was time for something a bit more comprehensive and systematic – a survey. That’s where you come in.

I hope this survey will provide a useful resource to software vendors looking for an Ecommerce provider and also force the under-performers to raise their game. But I need your help. So please click the link below and tell me what you think about your Ecommerce provider. Please note:

  • You must be a software vendor (web or desktop) and I need your email address and product website to prove this. You will have to reply to an automatic email after the survey to verify your identity. Without this it would be easy to rig the results. Your address and domain will remain confidential and won’t be used for any other purposes.
  • If you have used more than one Ecommerce provider in the last 2 years you can fill out a separate survey response for each one.
  • As there are quite a lot of Ecommerce providers I think I will need at least 100 responses to get a good data set. 200 would be better. So tell a friend.

** the survey is now closed **

results here

The truth about conversion ratios for downloadable software

conversion funnel?

Overview

An anonymous survey of software vendors shows that the average sale to visit ratio is very close to the much quoted “industry average” of 1%. However the data shows large variations between products and across different sectors (e.g. Windows vs Mac OS X).

The data

The data set comprises 92 valid survey responses to an 8 question survey in April 2009. The survey was advertised through a request on this blog, posts on  BOS, ASP, MACSB, OISV and BOSnetwork forums and emails to the author’s contacts. The results are inevitably biased towards small software vendors, due to the places where the survey was advertised. As the survey was anonymous it is impossible to verify the accuracy of the data. However it is unlikely that many vendors would have completed a survey that wasn’t anonymous.

The survey consisted of 3 compulsory questions (unique visits, downloads and sales over a given timeframe) and 5 optional questions (the time frame of the data, primary market, primary OS, licence price and trial type). One record had 0 visits (an iPod app), another had 0 downloads (presumably a web app) and a few had numbers that I didn’t consider statistically valid for some purposes (e.g. <500 visits per month or <3 sales transactions per month).  I did the best I could with the data available, ignoring obvious outliers in some cases.

The data set comprises a total of:

  • 8.1 million unique website visits
  • 2.2 million downloads
  • 110 thousand sales transactions

Where a time frame for the results was given it is possible to work out the range of visitors, downloads and sales per month.

metrics_all_visitors

metrics_all_downloads

metrics_all_sales

Interestingly the distribution of monthly visits, downloads and sales across the different products all follow the Pareto 80:20 power law quite closely:

  • 22% of the products account for 80% of the visits
  • 21% of the products account for 80% of the downloads
  • 19% of the products account for 80% of the sales

This gives me some faith that the data is reasonably accurate and representative of the industry as a whole.
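For anyone who wants to check these figures against the raw data, the 80:20 calculation is simple. This is a minimal sketch with made-up sales figures; the real analysis used the survey’s per-product monthly totals.

```python
# Sketch: what fraction of products (largest first) accounts for 80% of
# the total? The sales figures below are made up for illustration.

def pareto_fraction(values, share=0.8):
    """Fraction of items, taken largest first, needed to reach `share` of the total."""
    ordered = sorted(values, reverse=True)
    threshold = share * sum(ordered)
    running = 0.0
    for count, v in enumerate(ordered, start=1):
        running += v
        if running >= threshold:
            return count / len(ordered)

monthly_sales = [900, 450, 200, 90, 60, 40, 30, 20, 10, 10]
print(pareto_fraction(monthly_sales))
```

Applied to the visits, downloads and sales columns in turn, this gives the three percentages quoted above.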

The data is broken down by OS, market, price and trial type as follows:

metrics_all_os

metrics_all_market

metrics_all_price1

metrics_all_trial

Analysis

The average (mean) ratio of downloads:visits across all products is 28%. 50% of products are in the range 12.1% to 35.3%.

metrics_download_visit_ratio

I am surprised at how high the average ratio is. This could partly be due to products that receive a high percentage of downloads from download sites, without the downloader ever visiting the product site. Conversely sites where visitors make frequent returns after purchase (e.g. to read forums) will have a lower downloads:visits ratio.

The average ratio of sales:downloads across all products is 4.5%. 50% of products are in the range 1.3% to 6.4%.

metrics_sale_download_ratio

The average sales:downloads ratio is noticeably lower than the average downloads:visits ratio. The sales:downloads ratio is noticeably skewed on the right of the graph – a sales:downloads ratio >20% seems very high.

The (logarithmic) scatter plots below show that the sales:downloads ratio varies a lot more than the downloads:visits ratio.

metrics_visits_vs_downloads

metrics_downloads_vs_sales

The average (mean) sales:visits ratio of all products is 1.16%[1]. However one of the product ratios is an obvious outlier at 13.94% (see below). With this outlier removed the average sales:visits conversion ratio across all the products is 0.99%. 50% of products are in the range 0.28% to 1.39%.

metrics_sale_visit_ratio

0.99% is suspiciously close to the mythical ‘industry average’ of 1%. But I haven’t (consciously) massaged the results to get this figure.

You can work out how you compare to this data set using the red (cumulative) graph in the histogram below. For example, if your product sales:visits ratio is 1.5%, then it is higher than approximately 80% of the products in the data set.

conversion-ratio-distribution2
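Reading your position off the cumulative (red) curve amounts to counting how many products convert worse than you do. A minimal sketch, using illustrative ratios rather than the survey data:

```python
# Sketch: fraction of products with a lower sales:visits ratio than yours.
# The list of ratios (in percent) is illustrative, not the survey data.

def fraction_below(your_ratio, ratios):
    """Fraction of the data set with a conversion ratio below `your_ratio`."""
    return sum(1 for r in ratios if r < your_ratio) / len(ratios)

ratios = [0.2, 0.3, 0.5, 0.8, 1.0, 1.2, 1.4, 1.6, 2.0, 3.5]  # percent
print(fraction_below(1.5, ratios))
```

Run against the sales:visits column of the raw data, this gives the same answer as reading the cumulative curve by eye.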

We can also look at how the ratios vary across sectors. Surprisingly the average Mac product conversion ratio is more than 4 times the Windows product conversion ratio.

metrics_sale_visit_ratio_by_os1

Even if we try to compare like for like, and only compare consumer products selling for <= $50, the ratios are still 2.27% for Mac and 0.51% for Windows. Possible reasons for this large disparity include:

  • Mac owners are more willing to spend money.
  • There is less competition in the Mac software market.
  • Mac vendors have a higher percentage of purchasers who never visit their site, due to the higher quality of Mac download sites.
  • It is a statistical blip (there are a lot fewer Mac products in the survey).

My own experience with selling a cross-platform product (Perfect Table Plan) on Windows and Mac OS X is that the Mac sales:visits ratio is approximately double that for Windows.

The sales:visits ratio is similar for business and consumer products, with developer products lagging behind. However there are too few developer products in the data set to draw any real conclusions.

metrics_sale_visit_ratio_by_market1

The sales:visits ratio does vary across the price range. However there are too few products with price >$50 in the data set to draw any real conclusions.

metrics_sale_visit_ratio_by_price2

The sales:visits ratio does not seem to vary significantly by trial type. There were insufficient ‘number of use’ trial products to include them.

metrics_sale_visit_ratio_by_trial1

Conclusion

One has to be careful about drawing conclusions from a relatively small and unverifiable data set. However the results certainly seem to support the much-quoted “industry standard” sales:visits conversion ratio of 1%. But there are huge variations between products.

The fact that the sales:downloads ratio is both lower on average and more variable than the downloads:visitors ratio implies that getting people to download is the easy bit and converting the download to a sale is a tougher challenge.

The average sales:visits conversion ratio is noticeably higher for Mac OS X products than Windows products. This is supported by anecdotal evidence and the author’s own experience with a cross-platform product. However the number of Mac respondents to the survey is too small for the result to be stated with any great confidence. Also remember that the Mac market is still a lot smaller than the Windows market before you rush off to start learning Cocoa and Objective-C.

These ratios can be useful for a number of purposes, including: identifying a bottleneck in your conversion funnel (is your downloads:visitors ratio low compared to other products?); estimating how much traffic you might need for a viable business; or estimating how much you can afford to bid on Google Adwords. And it is useful to track how these ratios change over time (I track mine on a monthly basis). But make sure you compare like with like if you are comparing your ratios with other products. For example, a 10% sales:downloads ratio might be achievable for a niche business product, but unrealistic for a casual game. And remember that these ratios are only one part of a bigger picture. There are other, more important, metrics. Profitability for a start.

The data set is available here:

Raw data (some invalid records deleted), CSV format

Processed data, Excel XLSX format

Feel free to publish your own analysis. Thank you to everyone that took part in the survey.

[1] Calculating the mean of all the ratios probably isn’t the way a proper statistician would do it. But anything more seems overkill given the limited size and unverifiable nature of the data set.
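To see why the choice of method matters, compare the mean of per-product ratios with the pooled ratio (total sales divided by total visits) on two hypothetical products; a single high-traffic, low-converting product pulls the pooled figure well below the mean of the ratios.

```python
# Sketch: mean of per-product ratios vs the pooled ratio.
# Two hypothetical products: a small high-converting one and a
# large low-converting one.

visits = [1000, 50000]
sales = [30, 250]

mean_of_ratios = sum(s / v for s, v in zip(sales, visits)) / len(visits)  # 1.75%
pooled_ratio = sum(sales) / sum(visits)                                   # ~0.55%

print(mean_of_ratios, pooled_ratio)
```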

Is the average visitor conversion ratio really 1%?

We have probably all heard that the industry standard conversion rate is 1%. But where did this data come from? Is that the visitor to sale ratio or download to sale ratio (I have seen it quoted for both) and just how standard is it across the industry? I have put together a survey in an attempt to find out.

There are 8 questions in the survey, but only 3 are compulsory. It should only take you a few minutes to complete and it is completely anonymous. The results will be posted on this blog, assuming I get enough responses to make it worthwhile. If you are selling downloadable commercial software on the web then please spare a few minutes to do the survey.

Click here to go to the survey

** Update : the survey is now closed **

Business of Software microISV survey

microISV sales per hour worked

The Business of Software blog has published the results of a survey of 96 microISVs:

survey results – part 1
survey results – part 2

As the survey is self-selecting it is hard to know how representative the results are for microISVs in general, but it makes interesting reading.

Of respondents whose microISVs had been running 6 months or more, 50% made less than $25 in sales per hour worked. Assuming modest expenses of 20% that means that the majority of microISVs are making less than $20 per hour worked, before tax. This sounds rather discouraging, but some claim to be making >$200 per hour. The author has kindly provided the raw stats for download, so I looked at them in a bit more detail. According to my quick analysis the situation is, unsurprisingly, more encouraging for established microISVs. If you take all the respondents who have been in business at least 12 months, are working at least 30 hours per week and are making any sales at all, the average is around $60 in sales per hour worked. This is not too bad for an indoor job, with no heavy lifting, that you can do in your underwear.

The data also shows an interesting difference in sales by category. I took the data for all the 1-man companies with monthly sales >0, divided them by category and then removed the top and bottom performers in each category (to prevent outliers distorting the averages).

hourly_sales_by_category.gif

Average sales per hour worked ($), by category, click to enlarge

I am not surprised that average sales are relatively low in the ‘Developer tools’ market, given the fierce competition, prevalence of free tools and the effects of developer ‘not invented here’ syndrome. I am rather surprised that consumer software appears to pay better than business software. This seems to turn conventional wisdom on its head (assuming I got the numbers right, it was after midnight). Of course, sales is not the same as profit. There appears to be little (if any) correlation between the ticket price of an item and the total monthly sales.

Digging a bit further, the stats also show some correlation between marketing spend and sales:

microISV marketing v sales

Monthly marketing spend ($/month) vs monthly sales ($/month), click to enlarge

Of course (repeat after me) correlation does not imply causation.

Thanks to Neil for taking the time to do the survey and publish the results.