Google Adwords: improving your ads

One of the keys to success in Google Adwords (and other pay-per-click services) is writing good ad copy. This isn’t easy, as the ads have a very restrictive format, reminiscent of a haiku:

  • 25 character title
  • 2×35 character description lines
  • 35 character display url

What’s more, there are all sorts of rules about punctuation, capitalisation, trademarks, etc. You will soon find out about these when you write ads. Most transgressions are flagged immediately by Google’s algorithms; others are picked up within a few days by Google staff (what a fun job that must be).

Google determines the order in which ads appear in their results using a secret algorithm based on how much you bid, how frequently people click your ads and possibly other factors, such as how long people spend on your site after clicking. Nobody really knows apart from Google, and they aren’t saying. The higher your click frequency, generally the higher your ad will appear. The higher your ad appears in the results, generally the more clicks you will get. So writing relevant ads is very important. This means that each adgroup should have a tightly clustered group of keywords and the ads should be closely targeted to those keywords.

There is no point paying for clicks from people who aren’t interested in your product, so you need to clearly describe what you are offering in the few words available. For example you might want to have a title “anti-virus software” instead of “anti-virus products” to ensure you aren’t paying for useless clicks from people looking for anti-viral drugs (setting “drugs” as a negative keyword would also help here).

I have separate campaigns for separate geographic areas. Each campaign contains the same keywords in the same adgroups, but with potentially different bid prices and ads. This allows me to customise the bid prices and ads for the different geographic areas. For example I can quote the £ price in a UK ad and the $ price in a US ad. Having separate campaigns for separate geographic areas is a hassle, but it is manageable, especially using Google Adwords editor.

Writing landing pages specific to each adgroup can also help to increase your conversion rate. It is worth noting that the ad destination url doesn’t have to match the display url. For example you could have a destination url of “http://www.myapp.com/html/landingpage1.html?ad_id=123” and a display url of “www.myapp.com/freetrial”.

Obviously what makes for good ad copy varies hugely with your market. Here are some things to try:

  • a call to action (e.g. “try it now!”)
  • adding/removing the price
  • different capitalisation and punctuation
  • keyword insertion (much beloved of eBay)
  • changing the destination url

But, as always, the only way to find out what really works is testing. Google have made this pretty easy with support for conversion tracking and detailed reporting. I run at least 2 ads in each adgroup, and usually more. Over time I continually kill off under-performing ads and try new ones. Often the new ads are slight variations on successful ads (e.g. changing punctuation or a word) or splices of two successful ads (e.g. the title from one and the body from another). This evolutionary approach (familiar to anyone who has written a genetic algorithm) gradually increases the ‘fitness’ of the ads. But you need to decide how to measure this fitness. Often it is obvious that one ad is performing better than another, but sometimes it can be harder to make a judgement. If you have an ad with a 5% click-through rate (CTR) and a 0.5% conversion rate, is this better than an ad with a 1% click-through rate and a 2% conversion rate? One might think so ( 5*0.5 > 1*2 ), but this is not necessarily the case. I think the key measure of how good an ad is comes from how much it earns you for each impression your keywords get.
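For anyone who likes the genetic-algorithm analogy, the mutate-and-splice process described above can be sketched in a few lines. This is a toy illustration only; the ad fields and variant lists are made up:

```python
import random

def mutate(ad, variants):
    """Vary one field of an ad, e.g. swap a word or change punctuation."""
    field = random.choice(list(ad))
    new_ad = dict(ad)
    new_ad[field] = random.choice(variants[field])
    return new_ad

def splice(ad_a, ad_b):
    """Combine two successful ads: the title from one, the body from the other."""
    return {"title": ad_a["title"], "body": ad_b["body"]}

# Hypothetical example: breed a new ad from two good performers.
best = {"title": "Wedding Seating Software", "body": "Try it free today!"}
runner_up = {"title": "Table Plans Made Easy", "body": "Download a free trial."}
new_ad = splice(best, runner_up)
```

Each generation, the under-performers are killed off and replaced by mutants and splices of the survivors, exactly as in a genetic algorithm.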

I measure the fitness by a simple metric ‘profit per thousand impressions’ (PPKI) where, for a given time period:

PPKI = ( ( V * N ) - C ) / ( I / 1000 )

where:

  • V = value of a conversion (e.g. product sale price)
  • N = number of conversions (e.g. product sales) from the ad
  • C = total cost of clicks for the ad
  • I = impressions for the ad

Say your product sells for $30. Over a given period you have 2 ads in the same adgroup that each get 40k impressions, and clicks cost an average of $0.10 per click.

  • ad1 has a CTR of 5%, a conversion rate of 0.5% and gets 10 conversions, which gives PPKI=$2.5 per thousand impressions
  • ad2 has a CTR of 1%, a conversion rate of 2% and gets 8 conversions, which gives PPKI=$5 per thousand impressions

So ad2 made, on average, twice the profit per impression despite the lower number of conversions. Given this data I would replace ad1 with a new ad: either a completely new ad or a variant of ad2.
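The worked example above can be checked with a few lines of code. This is a minimal sketch of the PPKI formula using the numbers from the example:

```python
def ppki(conversion_value, conversions, click_cost, impressions):
    """Profit per thousand impressions for an ad over a given period."""
    profit = conversion_value * conversions - click_cost
    return profit / (impressions / 1000)

impressions = 40_000
price = 30.0   # product sale price in dollars
cpc = 0.10     # average cost per click in dollars

# ad1: 5% CTR -> 2,000 clicks; 10 conversions.
ad1 = ppki(price, 10, 2_000 * cpc, impressions)

# ad2: 1% CTR -> 400 clicks; 8 conversions.
ad2 = ppki(price, 8, 400 * cpc, impressions)

print(round(ad1, 2))  # 2.5
print(round(ad2, 2))  # 5.0
```

In a spreadsheet the same formula drops straight into a new column next to the exported ‘Ad Performance’ data.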

PPKI has the advantage of being quantitative and simple to calculate. You can just export your Google Adwords ‘Ad Performance’ reports to Excel and add a PPKI column. Some points to bear in mind:

  • Selling software isn’t free. You may want to subtract the cost of support, CD printing & postage, ecommerce fees, VAT etc from the sale price to give a more accurate figure for the conversion value.
  • PPKI doesn’t take account of the mysterious subtleties of Google’s ‘quality score’. For example an ad with low CTR and high conversion rate might conceivably have a good PPKI but a poor quality score. This could result in further decreases in CTR over time (as the average position of the ad drops) and rises in minimum bid prices for keywords.
  • PPKI is a simple metric I have dreamt up; I have no idea if anyone else uses it. But I believe it is a better metric than cost per conversion, or any of the other standard Google metrics.

To ensure that all your ads get shown evenly select ‘Rotate: Show ads more evenly’ in your Adwords campaign settings. If you leave it at the default ‘Optimize: Show better-performing ads more often’, Google will choose which ads show most often. Given a choice between showing the ads that make you most money and the ads which make Google most money, which do you think Google will choose?

Text ads aren’t the only option: Google also offer other ad formats, including image and video ads. I experimented with image ads a few years ago, but they got very few impressions and didn’t seem worth the effort at the time. I may experiment with video ads in the future.

The effectiveness of ads can vary greatly. Back in mid-December I challenged some Business Of Software forum regulars to ‘pimp my adwords’ with a friendly competition to see who could write the best new ads for my own Google Adwords marketing. The intention was to inject some fresh ‘genes’ into my ad population while providing the participants with some useful feedback on what worked best. Although it is early days, the results have already proved interesting (click the image for an enlargement):

adwords ad results

The graph above shows the CTR vs conversion rate of 2 adgroups, each running in both USA and UK campaigns. Each blue point is an ad. The ads, keywords and bid prices for each adgroup are very similar in each country (any prices in the ads reflect the local currency for the campaign). Points to note:

  • There were enough clicks for the CTR to be statistically significant, but not for the conversion rate (yet).
  • The CTRs vary considerably within the same campaign+adgroup, often by a factor of more than 3.
  • Adgroup 1 performs much better in the USA than in the UK. The opposite is true for adgroup 2.
  • Adgroup 1 for the USA shows an inverse correlation between CTR and conversion rate. I often find this is the case – more specific ads mean lower CTR but higher conversion rates and higher profits.
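The significance point above can be made concrete by putting a rough confidence interval around each proportion: CTR is estimated from thousands of clicks, conversion rate from a handful of sales. The numbers below are hypothetical, chosen to match the earlier worked example rather than read off the graph:

```python
import math

def approx_95ci(successes, trials):
    """Normal-approximation 95% confidence interval for a proportion.
    Crude, but fine for comparing relative uncertainty."""
    p = successes / trials
    se = math.sqrt(p * (1 - p) / trials)
    return p - 1.96 * se, p + 1.96 * se

# 40,000 impressions, 2,000 clicks (5% CTR), 10 conversions (0.5% rate).
ctr_lo, ctr_hi = approx_95ci(2_000, 40_000)   # tight: roughly 4.8%..5.2%
conv_lo, conv_hi = approx_95ci(10, 2_000)     # wide: roughly 0.2%..0.8%
```

The CTR interval is a few percent either side of its estimate; the conversion-rate interval spans more than half its estimate, which is why the conversion rates aren’t statistically meaningful yet.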

‘Pimp my adwords’ will continue for a few more months before I declare a winner. I will be reporting back on the results in more detail and announcing the winner in a future post. Stay tuned.

Oryx Digital is diversifying

I first became interested in programming in about 1978, at the age of 12. I can recall the exact moment. I was in a classroom at The Royal Hospital School watching a very basic demo (written in BASIC) of a ball bouncing around a screen on an RM-380Z. Actually it wasn’t a ball, it was a single pixel. But the screen resolution was so low it was easy enough to see from the back of the classroom. Computers with floppy drives were rather expensive for schools in 1978, but some pupils from the school had won it in a competition. I was intrigued – how did it work? The teacher giving the demonstration (Mr Albert) encouraged my early interest, and a few years later my grandmother was generous enough to buy our family an Acorn BBC B computer. My future path was set.

30 years later, including 22 years as a professional software developer, I am still fascinated by software. Experience showed me that programming skills were necessary, but far from sufficient, to produce successful commercial software. So my interests have grown from programming to include the whole nascent discipline of software engineering. I have also become increasingly interested in the effective marketing of software. Many developers recoil with horror from marketing, but I want my software to make money and be used by lots of people. This requires good marketing as well as a good product. In my experience talented software marketers are even harder to find than talented software developers, so I have learned as much as I can about marketing software. It is actually quite a challenging and creative field.

3 years ago I set up my own one-man company, Oryx Digital, to create software products and offer consulting services to other software companies. Since then I have been extremely busy developing and marketing my product, PerfectTablePlan, which has gone from strength to strength. I released PerfectTablePlan v3.1.1 for Windows and Mac OS X a few days ago. I am very pleased with this new version, which has over 50 improvements and new features. The response from customers has been very favourable and the software appears to be very stable – no automatic crash reports (yet). It has grown far beyond my original ideas and now weighs in at around 100K lines of C++ and 200 pages of user documentation. In my (biased) opinion it is way ahead of any of its competitors.

Although PerfectTablePlan remains my main focus, I feel now is a good time to diversify a little. So I am now making myself available a few days a month for consultancy to other software companies, large and small. Do you need a new perspective on your product development and marketing? Perhaps I can help?

Meanwhile I have already started thinking about PerfectTablePlan v4. No rest for the wicked…

Mobile Internet access

I try to check my sales and support emails at least twice a day, every day. I managed this 362 days in 2007 (I took a break for Christmas day and 2 days I was in Germany at a conference). But providing this level of service can prove to be a problem for a one-man software company when it comes to taking holidays. Last year I restricted holidays to places with broadband Internet access. But finding child-friendly accommodation with broadband access proved to be quite a headache.

I have considered getting a Blackberry, but I really need something that can run my application to do proper support.

After some dithering I have finally got mobile Internet access for my laptop through the 3 network at £10/month. This provides 1GB of data per month in the UK. You can get a higher data allowance with a more expensive contract, or a pay-as-you-go contract. But 1GB/month will hopefully be sufficient for my needs. Data costs outside the UK are a frightening £6/MB, so I will probably have to look for alternative arrangements if I take a holiday abroad. Vodafone offer contracts with more reasonable roaming rates, but the contracts are much more expensive (£25 – £99/month).

Installation of the USB mobile modem and software was very easy – it took me about 5 minutes from opening the packaging to being connected. Only time will tell how good the coverage and service are. Watch this space.

Having mobile Internet access could also be a useful back-up if I lose my landline broadband connection. This is quite reassuring after several website outages and a failed harddisk in the last couple of weeks.

The curse of the 419 scam

I expect that anyone reading this blog has had hundreds, if not thousands, of emails like this:

NIGERIAN NATIONAL PETROLEUM CORPORATION
CORPORATE HEADQUARTERS,LAGOS
STRICTLY CONFIDENTIAL.
FROM THE DESK OF: DR WALI AHMED .
LAGOS-NIGERIA

Dear Sir/Madam ,

After due deliberation with my colleagues, I decided to forward this proposal. We want a reliable person who could assist us to transfer the sum of Thirty Million United States Dollars (US$30,000,000.00) into his/her account.

..etc

Or

My father (Late) DR EDWARD HOSANNA the former Deputy Minister of Finance under the executive civilian president of Liberia, but was assasinated by the rebels during the civil war and properties destroyed, but I narrowly escaped with some very important documents of (US$7.5M) Seven Point Five Million U.S Dollars deposited by my late father in a high financial company here in Dakar-Senegal under my name as next of kin.

..etc

(picking two at random from my inbox).

Needless to say, it is a scam. Basically, they ask for a sum of money (e.g. to bribe an official) with the promise that you will get a much larger sum once the transaction is complete. But each payment results in a request for a larger payment, until you run out of money. It is often known as the ‘Nigerian 419 scam’, as many of these emails originate from Nigeria and they are an offence under article 419 of the Nigerian criminal code. It is a variant on the Spanish Prisoner scam, which dates back to the early 1900s. It is hard to believe anyone would fall for this scam, but they do in their thousands. Advance fee fraud (e.g. 419 scams) was estimated to cost at least £275 million in the UK alone in 2005, with an average individual loss to victims of over £31,000. Greed and stupidity are indeed a dangerous combination.

So what can we do to hit back? Not a great deal. The governments of poor and corrupt countries probably don’t care that much about gullible and greedy westerners being cheated. In fact, the scam may be in their interests. We can report the scammer’s email to their ISP, in the hope that it will be shut down before they can con anyone. But they can easily get another. We can play along a bit and waste the scammer’s time (perhaps with highly entertaining results). But it wastes our time as well. However another idea occurred to me while listening to a BBC radio report from Nigeria.

Apparently many Nigerians are incredibly superstitious, even the highly educated ones. Rumours of penis-stealing witches and killer phone numbers are taken very seriously. So now I occasionally respond to 419 emails with a curse email from a little-used email account. My email starts with some impressive-looking pig Latin. It then tells them that reading the above has activated a curse and that they will suffer increasingly bad headaches until they renounce their wicked ways. If they are sufficiently superstitious, I figure this might be enough to start a headache, which will get worse the more they worry about it. Hopefully they will either find an honest way to make a living or a psychosomatic feedback loop will cause their head to explode like a scene from the film Scanners (warning: very gory). I have no idea if it works – but I think it is worth a try. I haven’t included the text of my email as I don’t want it to appear on Google. Make up your own curse. Be inventive.

Site uptime monitoring

My PerfectTablePlan website and all the associated websites (such as http://www.weddingtableplan.com) have gone down three times this week. Sigh. The first time they went down for 5 hours, the second time for 3.5 hours and the third time for 6 hours. Perhaps they have been overdoing the festivities at my ISP, 1and1.co.uk. I am not impressed. Somebody needs a good kick up the arse.

To rub salt into the wound they even had the cheek to put up parking pages with ads in place of my sites. Some of these ads were for my own sites, which also displayed parking pages – with more ads. So 1and1 were potentially taking money off me through Adwords at the same time! I have a feeling 1and1 and I may be parting company in the not too distant future.

I am now setting up a copy of my site with a different ISP. If (when?) the site goes down again I should be able to point the DNS to the backup ISP.

At least I found out about it quickly, thanks to the site monitoring service www.siteuptime.com. I use their free service, which is adequate for my needs at present. Are you monitoring your site(s)? Do you have a back-up plan?

Be nice Microsoft

This is what you see when you try to log on to the Microsoft Partners website using my preferred browser, Firefox:

techsmithwor1047.gif

Despite their charm offensive and attempts to be more open, old habits die hard.

Installing Mac OS X 10.5 (Leopard) on an external harddisk

I need to support both Mac OS X 10.4 (Tiger) and 10.5 (Leopard) for the latest release of PerfectTablePlan. I could have created a new partition on the current harddisk for 10.5, but apparently you can’t do that without erasing the whole disk. I really didn’t want to mess with my existing 10.4 setup, so I purchased a 320GB WD MyBook USB/Firewire external harddisk to install 10.5 on to. 320GB for £75, bargain! But I had quite a bit of trouble installing Leopard on to it. After about the tenth time looking at a “Mac OS X could not be installed on your computer. The installer cannot prepare the volume for installation.” message, I finally got it working. In case anyone else gets stuck, here are some hints:

  • When you set up the new harddisk partitions using Disk Utility make sure you choose Apple Partition Map using the Options button (it may be set to Master Boot Record if the disk is shipped set-up for Windows).
  • Disconnect the harddisk USB cable. Just use the Firewire cable.

I hope this saves someone else a few hours. Thanks to Jeff B for a hint that got me moving in the right direction.

Optimising your application

When I first released PerfectTablePlan I considered 50-200 guests a typical event size, with 500+ guests a large event. But my customers have been using the software for ever larger events, some as large as 3000 guests. While the software could cope with this number of guests, it wasn’t very responsive. In particular, the genetic algorithm I use to optimise seating arrangements (which seats people together or apart, depending on their preferences) required running for at least an hour for the largest plans. This is hardly surprising when you consider that seating assignment is a combinatorial problem in the same NP-hard class as the notorious travelling salesman problem. The number of seating combinations for 1000 guests in 1000 seats is 1000!, which is a number with 2,568 digits. Even the number of seating combinations for just 60 guests is more than the number of atoms in the known universe. But customers really don’t care how mathematically intractable a problem is. They just want it solved. Now. Or at least by the time they get back from their coffee. So I made a serious effort to optimise the performance in the latest release, particularly for the automatic seat assignment. Here are the results:
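The combinatorial claims are easy to check directly. A quick sketch, using Python purely as a calculator:

```python
import math

# Number of ways to seat 1000 guests in 1000 seats: 1000!
digits_in_1000_factorial = len(str(math.factorial(1000)))
print(digits_in_1000_factorial)  # 2568

# Even 60 guests give more arrangements than the roughly 10**80
# atoms estimated to be in the observable universe.
print(math.factorial(60) > 10**80)  # True
```

This is why brute-force enumeration is hopeless and a heuristic like a genetic algorithm is needed: it searches for a good-enough seating plan rather than the provably best one.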

ptp308_vs_ptp_310.png

Total time taken to automatically assign seats in 128 sample table plans varying in size from 0 to 1500 guests

The chart shows that the new version automatically assigns seats more than 5 times faster over a wide range of table plans. The median improvement in speed is 66%, but the largest plans were solved over ten times faster. How did I do it? Mostly by straightening out a few kinks.

Some years ago I purchased my first dishwasher. I was really excited about being freed from the unspeakable tyranny of having to wash dishes by hand (bear with me). I installed it myself – how hard could it be? It took 10 hours to do a wash cycle. Convinced that the dishwasher was faulty I called the manufacturer. They sent out an engineer who quickly spotted that I had kinked the water inlet pipe as I had pushed the dishwasher into place. It was taking at least 9 hours to get enough water to start the cycle. Oops. As soon as the kink was straightened it worked perfectly, completing a cycle in less than an hour. Speeding up software is rather similar – you just need to straighten out the kinks. The trick is knowing where the kinks are. Experience has taught me that it is pretty much impossible to guess where the performance bottlenecks are in any non-trivial piece of software. You have to measure it using a profiler.

Unfortunately Visual Studio 2005 Standard doesn’t seem to include profiling tools. You have to pay for one of the more expensive versions of Visual Studio to get a profiler. This seems rather mean. But then again I was given a copy of VS2005 Standard for free by some nice Microsofties – after I had spent 10 minutes berating them on the awfulness of their “works with vista” program (shudder). So I used an evaluation version of LTProf. LTProf samples your running application a number of times per second, works out which line and function is being executed and uses this to build up a picture of where the program is spending most time.

After a bit of digging through the results I was able to identify a few kinks. Embarrassingly, one of them was that the automatic seat assignment was reading a value from the Windows registry in a tight inner loop. Reading from the registry is very slow compared with reading from memory. Because the registry access was buried a few levels deep in function calls, it wasn’t obvious that this was occurring. It was trivial to fix once identified. Another problem was that some intermediate values were being continually recalculated, even though none of the input values had changed. Again, this was fairly trivial to fix. I also found that one part of the seat assignment genetic algorithm took time proportional to the square of the number of guests ( O(n^2) ). After quite a bit of work I was able to reduce this to time linearly proportional to the number of guests ( O(n) ). This led to big speed improvements for larger table plans. I didn’t attempt any further optimisation as I felt I was getting into diminishing returns. I also straightened out some kinks in reading and writing files, redrawing seating charts and exporting data. The end result is that the new version of PerfectTablePlan is much more usable for plans with 1000+ guests.
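The two easy fixes generalise to any language: hoist slow lookups out of inner loops, and cache derived values until their inputs change. Here is a minimal sketch (the function names and the simulated slow lookup are hypothetical stand-ins, not PerfectTablePlan code):

```python
import functools
import time

def read_setting_slow(key):
    """Stand-in for a slow per-call lookup, e.g. a registry read."""
    time.sleep(0.001)  # simulate the per-call overhead
    return 42

def score_all_naive(guests):
    # The bug pattern: the slow lookup is buried inside the inner loop,
    # so it is paid once per guest.
    return sum(read_setting_slow("weight") * g for g in guests)

def score_all_hoisted(guests):
    # The fix: read once, outside the loop.
    weight = read_setting_slow("weight")
    return sum(weight * g for g in guests)

# The second fix: memoise expensive intermediate values so they are
# only recalculated when their inputs actually change.
@functools.lru_cache(maxsize=None)
def derived_value(inputs):
    return sum(inputs) ** 2  # stand-in for an expensive recalculation
```

Both versions return the same result; the hoisted one just does the slow lookup once instead of once per guest, which is exactly the kind of kink a profiler exposes.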

I was favourably impressed with LTProf and will probably buy a copy next time I need to do some optimisation. At $49.95 it is very cheap compared to many other profilers (Intel VTune is $699). LTProf was relatively simple to use and interpret, but it did have quirks. In particular, it showed some impossible call trees (showing X called by Y where this wasn’t possible). This may have been an artefact of the sampling approach taken. I will probably also have a look at the free Mac OS X Shark profiler at some point.

I also tried tweaking compiler settings to see how much difference this made. Results are shown below. You can see that there is a marked difference with and without compiler optimisation, and a noticeable difference between the -O1 and -O2 optimisations (the smaller the bar, the better, obviously):

vs2005_optimisation_speed.png

Effect of VS2005 compiler optimisation on automatic seating assignment run time

Obviously the results might be quite different for your own application, depending on the types of calculations you are doing. My genetic algorithm requires large amounts of integer arithmetic and list traversal and manipulation.

The difference in executable sizes due to optimisation is small:

vs2005_optimisation_size.png

I tried the two other optimisation flags in addition to -O2.

  • /OPT:NOWIN98 – section alignment does not have to be optimal for Windows 98.
  • /GL – turns on global optimisation (e.g. across source files, instead of just within source files).

Neither made much noticeable difference:

vs2005_additional_opt.png

However it should be noted that most of the genetic algorithm is compiled in a single file already, so perhaps /GL couldn’t be expected to add much. I compared VC++6 and VS2005 versions of the same program and found that VS2005 was significantly faster[1]:

vc6_vs_vs2005_optimisation_speed1.png

I also compared GCC compiler optimisation for the Mac OS X version. Compared with VS2005, GCC shows a more noticeable difference between optimised and unoptimised, but a smaller difference between the different optimisation levels:

gcc_optimisation_speed.png

Surprisingly, -O3 was slower than -O2. Again, the effect of optimisation on executable size is small.

gcc_optimisation_size2.png

I also tested the relative speeds of my 3 main development machines[2]:

relative-machine-speed.png

It is interesting to note that the XP box runs the seat assignment at near 100% CPU utilisation, but the Vista box never goes above 50% CPU utilisation. This is because the Vista box is dual core, but the seat assignment is currently only single threaded. I will probably add multi-threading in a future version to improve CPU utilisation on multi-core machines.

In conclusion:

  • Don’t assume, measure. Use a profiler to find out where your application is spending all its time. It almost certainly won’t be where you expected.
  • Get the algorithm right. This can make orders of magnitude difference to the runtime.
  • Compiler optimisation is worth doing, perhaps giving a 2-4 times speed improvement over an application built without compiler optimisation. It probably isn’t worth spending too much time tweaking compiler settings though.
  • Don’t let a software engineer fit your dishwasher.

Further reading:

“Programming Pearls” by Jon Bentley, a classic book on programming and algorithms

“Everything is fast for small n” by Jeff Atwood on the Coding Horror blog

[1] Not strictly like-for-like as the VC++6 version used dynamic Qt libraries, while the VS2005 version used static Qt libraries.

[2] I am assuming that VS2005 and GCC produce comparably fast executables when both set to -O2.

Brand recognition: PayPal beats Google

I offer both PayPal and GoogleCheckout as payment options on my pounds sterling payment page (GoogleCheckout only allows me to price in pounds sterling, unfortunately). As GoogleCheckout is effectively free to me at present[1], I put the GoogleCheckout button on the left in the hope of getting more payments through Google. But 70.5% of purchasers clicked on the PayPal button.

I have since become a bit disgruntled with GoogleCheckout because of their slow processing times, chargeback fees, lack of multi-currency support and use of anonymised email addresses[2]. So I swapped the button order in the hope of increasing the number of purchasers using PayPal. 69.3% of purchasers now click on the PayPal button.

paypal-vs-googlecheckout.gif

From this I conclude that GoogleCheckout still has a long way to go to beat PayPal in brand recognition, that positioning on the left may not be more prominent (although the 1.2% difference may be statistical noise), and that button order is less important than I thought. Or perhaps the PayPal icon is just more compelling. I wonder if GoogleCheckout have tested their icon against the PayPal icon?
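Whether a 1.2 point gap is noise depends on the number of purchases, which I haven’t given here. A rough two-proportion z-test shows how to check; the 500 purchases per period below is a hypothetical figure for illustration only:

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z statistic; |z| > 1.96 is roughly significant at 5%."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# With a hypothetical 500 purchases observed in each period,
# 70.5% vs 69.3% is comfortably within noise.
z = two_proportion_z(0.705, 500, 0.693, 500)
print(abs(z) < 1.96)  # True
```

At that sample size you would need a gap of several percentage points before concluding that button position really matters.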

[1] Google currently process £10 of payments free for each £1 I spend on Adwords.

[2] The user can opt to have their email anonymised at time of purchase. The vendor then receives an email address like Miss-abc123xyz@checkout.l.google.com. Google forwards email from this address to the purchaser, until they choose not to receive further emails. In theory this protects the purchaser from vendor spam, but in reality it makes support more difficult. For example, the purchaser can’t retrieve their key from your online key retrieval system unless they remember to use the anonymised address (they never do).

The software awards scam (update)

This is an update on my The software awards scam post from August. Below is an updated list of the download site awards I ‘won’ for software that didn’t even run: 23 in total, and those are only the ones I am aware of.

awardmestars awards

The article got a surprising amount of interest, including front page mentions on reddit, digg, slashdot and wordpress and a mention in the Guardian newspaper (they were too mean to give the URL of the article). There were also some entertaining reviews posted on download sites. The page has so far had over 150,000 hits, 263 comments and has a Google page rank of 6. I hope this exposure will make a small contribution to ending this sordid little practice.

It has been quite instructive to be on the receiving end of the news, albeit in a small way. Much of the commentary was inaccurate. One ‘journalist’ from ZDNET Belgium/Holland even managed to get both my first name and last name wrong, which is quite a feat considering we had exchanged several emails. I don’t know how many other mistakes he made, because the rest of the article was in Dutch or Flemish. I wonder if the mainstream media is much better. Definitely don’t believe everything you read.