19 10 / 2014
In my fifteen years of developing web-based systems, I have seen a lot of business plans for e-commerce sites. They often start with unrealistic revenue projections and optimistic cost estimates. When I ask where the numbers come from, it usually turns out to be no more than guesswork. And finance people can spot this a mile off.
But you can develop the plan from objective, defensible assumptions. Here is my approach.
You will be doing this in a spreadsheet, so have years (or quarters) along the top and revenue/cost lines down the left-hand side. The forecast is going to depend on your assumptions, so list those first and reference them in the body of the spreadsheet. So your spreadsheet is going to have four sections: assumptions, revenue, costs, and the resulting cash/profit forecast.
You need to be able to defend all your assumptions. If you do this right, once the assumptions are accepted then your whole plan has to be accepted.
The maths for calculating revenue is simple. It is just the number of visitors you expect to see on the site times the conversion factor (the percent of visitors that buy something) times the average revenue per sale.
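As a minimal sketch of that formula (the figures below are illustrative assumptions, not researched numbers):

```python
# Revenue = visitors x conversion rate x average sale value.
# All three inputs are illustrative assumptions for this sketch.
visitors = 2_000          # expected visits in the period
conversion_rate = 0.02    # 2% of visitors buy something
average_sale = 45.00      # average revenue per sale, in pounds

revenue = visitors * conversion_rate * average_sale
print(f"Estimated revenue: £{revenue:,.2f}")  # Estimated revenue: £1,800.00
```

In the spreadsheet each of those three inputs lives in the assumptions section, and the revenue line just references them.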
But where do you start?
You will not, in the first instance, be getting any significant organic traffic, that is, traffic that comes to your site from search engines, links from other sites, social media and other free sources. Don’t kid yourself that all you need is a bit of SEO (search engine optimisation) or social media activity to generate business. Unless you are unbelievably lucky, it takes months or even years to build any significant organic traffic. There may be ways of artificially boosting your Google listing, but they are not free and put you at risk of being blacklisted.
I suggest you assume that in the first year at least you get your traffic from Google pay per click (PPC). There may be more cost-effective ways of generating traffic but at least the cost per visitor with PPC is predictable. After about a year or so you can reasonably assume that your site will become known and that organic traffic will start to build up.
How do we get these numbers?
Get a Google account and sign up for pay per click. They provide all sorts of tools for forecasting cost per click for different keywords. I won’t go into it here; there are many tutorials on the web. The Google forecasts can be optimistic, and a trial site with a small outlay will verify the numbers.
Just divide your (assumed) marketing budget by the average cost per click (CPC) to get the traffic. For example:
A £1,000 budget and a CPC of £0.50 gives 1,000 / 0.50 = 2,000 visits.
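The same sum, as it would sit in the spreadsheet (figures taken from the example above):

```python
# Paid traffic is simply the PPC budget divided by the average cost
# per click. Both figures are assumptions from the example above.
budget = 1_000.00   # PPC budget in pounds
cpc = 0.50          # average cost per click from the Google tools

visits = budget / cpc
print(f"{visits:,.0f} visits")  # 2,000 visits
```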
What about the organic traffic?
You won’t know how much traffic you will get until you have the site up and running and you have done your SEO and all your activity on social media. But you don’t have to guess what traffic you will be getting. You can find out what your main competition is getting.
Look for the sites that are most similar to yours and attract the same demographic. Then use services such as alexa.com or similarweb.com, which will give you an estimate of their traffic. This is not going to be accurate in any sense, but it will give you a defensible order-of-magnitude estimate of what can be achieved in the long run.
In your spreadsheet I suggest you show a steady growth of organic traffic from zero in year one to that level in year five. Don’t forget that you will have to work on it. But your opposition is working on it too.
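A sketch of that growth assumption, a straight line from zero in year one to the competitor's estimated level in year five (the target figure is an invented illustration, not a real competitor's number):

```python
# Organic traffic assumption: linear growth from zero in year one to
# the competitor's estimated level (from alexa.com or similarweb.com)
# by year five. The target figure is an illustrative assumption.
target_monthly_visits = 10_000   # competitor's estimated organic traffic
years = 5

for year in range(1, years + 1):
    organic = target_monthly_visits * (year - 1) / (years - 1)
    print(f"Year {year}: {organic:,.0f} organic visits/month")
```

A straight line is the simplest defensible shape; if you have evidence for a slower start or a faster ramp, substitute that curve and note it in the assumptions.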
Keep the figures for PPC and organic separate.
Now we have traffic figures, the next item of data we need is the conversion rate.
The conversion rate you achieve will depend on a lot of factors, such as the wording of your advertisements or the quality of your website. But you need to assume average rates for your industry. There are sites that publish averages by industry, and you will find blogs quoting figures. Do your research and get as good an average as you can. This is an important number and you need to spend time researching it.
I would expect to get a much better conversion rate for PPC than organic traffic. People who search for your selected keywords are more likely to buy than people who come across your site when searching for something else.
I have seen estimates of about 30-50% better conversion rate for PPC. So take your average conversion rates and assume 20% better for PPC and 20% worse for organic. It is a rough and ready figure and the more research you can do the better. Don’t forget you may need to justify this figure more than any other.
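The split works out like this (the 2% industry average is an illustrative figure; use your own researched number):

```python
# Splitting an industry-average conversion rate into separate PPC and
# organic assumptions: 20% better for PPC and 20% worse for organic,
# as suggested above. The 2% average is an illustrative assumption.
industry_average = 0.02

ppc_conversion = industry_average * 1.20      # 20% better for PPC
organic_conversion = industry_average * 0.80  # 20% worse for organic

print(f"PPC: {ppc_conversion:.2%}, organic: {organic_conversion:.2%}")
# PPC: 2.40%, organic: 1.60%
```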
Now we have PPC and organic traffic estimates and conversion rates. All we need to do is multiply by your average sale value to get your total estimated sales.
The first cost you have already identified: the PPC cost.
Another big item is likely to be the cost of sales, the cost of the things you are selling. This will be some percentage of the sales depending on your margin.
A figure that drives a lot of overhead costs is the staffing. How many staff do you need per hundred sales per week (or per thousand, whatever is appropriate)? Use the sales figure to drive the staffing. Then add management.
You also need to budget for rent, accounting, hardware, website hosting and so on. How much kit will each employee need? Computers, desk, floor space etc. The staffing figure will drive a lot of the overheads.
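A sketch of how the sales figure drives staffing, and staffing drives overheads; every ratio here is an invented illustration to be replaced with your own researched figures:

```python
import math

# Driving staffing and overheads from the sales forecast, as described
# above. Every ratio is an illustrative assumption.
weekly_sales = 400         # forecast sales per week
sales_per_staff = 100      # one staff member per 100 sales per week
staff_per_manager = 5      # one manager per five staff
overhead_per_head = 6_000  # annual desk, kit and floor space, pounds

staff = math.ceil(weekly_sales / sales_per_staff)
managers = math.ceil(staff / staff_per_manager)
overheads = (staff + managers) * overhead_per_head

print(f"Staff: {staff}, managers: {managers}, overheads: £{overheads:,}")
# Staff: 4, managers: 1, overheads: £30,000
```

Because the staffing cell references the sales cell, a change to any traffic or conversion assumption flows straight through to the overheads.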
Now you have costs and revenues, you can build a cash forecast and profit estimate. This is based on a set of assumptions that you can defend.
Does it come out negative? Don’t be tempted to fix the assumptions to give you some acceptable figure. All your assumptions need to be backed up.
Good luck with your business plan.
01 5 / 2013
As I mentioned in the previous post, I did a Google search when my doctor changed medicines on me. It seems that my type of statins (Simvastatin) when combined with my blood pressure medicine (Amlodipine, a calcium channel blocker) caused some muscle problems.
I found that each authority in the NHS had produced its own recommendations to doctors. And that these were all different. See below.
Note in particular the two recommendations to (a) consider changing the calcium channel blocker (CCB) and (b) NOT to consider changing the CCB. Completely contradictory.
While we are short of doctors and nurses, how many bureaucrats do we have sitting in offices duplicating work like this? And doing it badly.
My doctor laughed out loud at some of the recommendations.
Recommendations: heavily edited.
For patients on 40mg Simvastatin plus Amlodipine….
If patient is stable on the amlodipine and 40mg simvastatin …, maintain on 40mg dose of simvastatin. Explain to the patient to return if any symptoms develop.
a) Change to an alternative Formulary statin – for cost implications see table 2
• atorvastatin* 10mg/day (for dose effects on LDL‐C see table 3) …..
b) Reduce simvastatin to 20mg/day
East Kent Prescribing Group
Switch to atorvastatin 10mg daily, increasing to atorvastatin 20mg daily if required to achieve target lipid levels
Prescribers are NOT advised to switch to an alternative calcium channel blocker
changed to atorvastatin 20mg and titrated up as appropriate….
NHS Kent and Medway
Switch to atorvastatin 10mg daily, increasing to atorvastatin 20mg daily if required to achieve target lipid levels…
Switch to pravastatin 40mg daily …
a. Reducing the dose of simvastatin to 20mg daily
b. Switching to atorvastatin 20mg daily
This dose has approximately the same lipid lowering efficacy as simvastatin 40mg daily…… (Not true if you are also taking amlodipine! That is the whole point: the amlodipine causes the statin to stay in the system longer, giving the effect of a larger dose.)
1. Reducing simvastatin dose to 20mg–…
2. Staying on simvastatin 40mg– discuss the risks and benefits….
3. Change to an alternative statin
Atorvastatin (20mg or 40mg daily) is an option.
4. Change to an alternative calcium channel blocker-
• Primary prevention: reduce the dose of simvastatin to 20mg …
• Secondary prevention in patients whose lipids are controlled on simvastatin 40mg, switch to atorvastatin 40mg and treat in accordance with lipid guidelines –
12 4 / 2013
I am taking amongst other things a couple of drugs on a daily basis. They are Amlodipine for blood pressure and Simvastatin (40mg) for cholesterol. This month the pharmacist changed the statin to Atorvastatin (40mg) without any comment.
I thought that this was probably another brand of the same stuff, but to be sure I did what anyone would do. I googled it. It turns out that the Amlodipine/Simvastatin combination causes muscle and joint pain, which may account for the frozen shoulder and tennis elbow I have suffered from in the last couple of years.
So far so good. But the Google search threw up various circulars and newsletters from health authorities to doctors recommending a course of action. This varied from ‘do nothing unless the patient has side effects’, to reducing the statin dose, to changing to 10mg, 20mg or 40mg of Atorvastatin. So the drug and dose you are prescribed depend on which health authority you happen to live in.
It sounds like a shambles. I am visiting my doctor to try and get to the bottom of it.
This issue was reported by the American FDA in August 2011. Atorvastatin became available as a generic drug in May 2012 and the UK equivalent authorities issued a notice (which didn’t cover dosages but recommended a switch to Atorvastatin) in August 2012. Did they wait a year because the alternative wasn’t available in generic form? You decide.
13 7 / 2012
A recent blog by Zoe O’Connell reported on the Parliamentary committee looking at this bill. She has summarised it very well so I won’t go into as much detail here. If you have missed it, this bill is intended to allow the police to access data about any form of communication you or I might have, but not the content. So if I send you an email, the police can find out that I sent you an email but not what was in it. Or if I made a Skype call, the fact that I called you, and maybe how long we talked. This is much like the data the police can already obtain from phone companies on phone calls, and it sounds logical to extend this to new forms of technology. The Home Secretary mentioned terrorists and paedophiles as likely targets.
Initially the blogosphere went wild with speculation that the Home Office was going to put black boxes in every network to sniff out every packet of information going over the Internet and interpret the contents: to decide whether it was a communication and, if so, who it was from and to. This is called Deep Packet Inspection (DPI), or as we used to call it, screen-scraping. The technical difficulties in this are legion, and more informed contributors suggested that the people who were going to be asked to retain this data were in fact the Facebooks and Googles of this world.
The evidence to the committee by the civil servants who drew up the bill makes it clear that this is the case and that the major names you are familiar with are quite willing to co-operate. As Google already gets more requests from the UK than from any other country (another excellent post from Zoe), I can believe that this is the case.
This leaves us with overseas service providers who are not co-operative. Say, for example, someone were to set up a webmail system in Pakistan; after all, this just needs a PC and a broadband connection. The answer is that they envisage the use of DPI for just those cases.
But what if my assumed Pakistani webmail system on a PC in someone’s bedroom in Karachi uses encryption (which it would)?
The answer is that the police could only get at the unencrypted portion of the packet, i.e. the IP addresses of the machines communicating. As this is likely to be an internet cafe in Bradford communicating with a domestic broadband connection in Karachi, I would suggest that this information will not be very useful.
So what do we have?
The act will be very useful for the police in dealing with routine crimes and investigations. The two types of criminal likely to be able to hide from it are (a) a paedophile ring, which would go to great lengths to hide itself, and (b) terrorists, who are likely to avoid the American devil’s email systems.
01 6 / 2012
I wrote on this a few weeks ago. Here is an update.
The ICO (Information Commissioner’s Office) spoke officially earlier this week, and a pattern is emerging among large websites.
The new ICO guidance is as confusing as ever on what is meant by consent, and they waffle on for pages on the concept of ‘implied consent’. I think the key paragraph is “To be confident in this regard the provider must ensure that clear and relevant information is readily available to users explaining what is likely to happen while the user is accessing the site and what choices the user has in terms of controlling what happens."
Then to make sure they are not being specific in what is required they continue as follows:
”Exactly how this information is provided is a matter for the person seeking consent and it is not the Information Commissioner’s role to provide precise wording or impose particular methods of communication."
The pattern which is arising seems to be not to ask for an opt-in permission, but simply to inform users that cookies are used. This is done in some sort of popup, or slide down that the user cannot avoid. See for example:
They satisfy the requirement of informing users in a way that is unavoidable; however, the pop-ups or slide-downs or whatever are easy to dismiss, and users will soon be viewing them as yet another annoyance on the way to getting the information they want.
I would suggest that the smaller guy waits a while, to see if this is truly a consensus and not an interim step to something more horrifying, and then updates his website along similar lines.
I am afraid the ICO is going to have to remove Google Analytics from my cold dead hands (picture of me holding up a spreadsheet). So let us hope something appropriate can be worked out.
05 5 / 2012
I was in Germany last week and noticed a headline in the local newspaper with forecasts for the upcoming election in North Rhine-Westphalia. This included Piraten: 9%. My German is pretty bad but I could recognise the word: Pirates. It turns out that they have a membership in Germany of 25,000, and in the Schleswig-Holstein state election they are also running at a projected 9%. Last year they did well in the Berlin elections, with 8.9% of the vote giving them 15 seats.
Nearly one in ten voters are supporting a bunch of pirates and they are getting legislators elected - what is that about?
The Piraten party is standing on a platform of unrestricted Internet, and they are international with branches in many European countries including Great Britain. The platform of the British party includes:
- Ending surveillance
- Ensuring that everyone is guaranteed real freedom of speech
- Reforming copyright law to legalise non-commercial file sharing and reduce the period of copyright protection
Basically they want to be able to legally pirate files, music, video and probably software.
It sounds outrageous; and it is! But I recall an article by the New York Times technology correspondent David Pogue back in 2007. You can read it here: http://pogue.blogs.nytimes.com/2007/12/20/the-generational-divide-in-copyright-morality/. It became obvious from his questioning of a student audience that they didn’t pirate music and films simply because they could get away with it; they just didn’t see that it was wrong to do so. It is worth reading the article because it graphically illustrates that young people just plain don’t get it.
Now the Pirate party is cashing in on this and saying that if it isn’t wrong it shouldn’t be illegal. And a lot of young people are voting for them - in Germany at least.
01 5 / 2012
The EU Cookie law is now in operation, and websites are breaking the law if they create cookies without asking the user’s permission. Er.. um… that is most websites in the UK. Users of Google Analytics (about two-thirds of websites) who do not ask permission of users and get their acceptance are breaking the law. The Information Commissioner’s Office (ICO) website has a little checkbox for you to click to give permission, and only about 10% of visitors do so. This makes their analytics totally useless.
A recent interview in e-consultancy with Dave Evans, Group Manager for Business & Industry at the ICO, revealed what a mess we are in. Here are a few quotes:
"It is unlikely (though not impossible) that we would take action just for analytics cookies"
"Just because analytics cookies are caught by this law doesn’t mean a strict opt-in is necessary. It could, in some cases, be seen as an essential part of the relationship. "
“Therefore, can you be confident that your users know about cookies? In the medium to long-term, if lots of websites are more transparent about cookies and privacy, then users will become more informed and it will be easier to assume knowledge.
If we can operate on the basis that, since a website has made efforts to inform customers, and through this collective education process, people understand how and why online businesses are using their data, a website could claim with some justification that since we made it clear, and they are still using our website, opt-in consent may not be necessary. “
So that is all clear then.
What a total mess.
The minister in charge is Ed Vaizey. He needs to get a grip on this.
09 4 / 2012
The Guardian today has an article that encapsulates the problem with British Business. Tesco are trying to break into the US market. It is an enterprise which is likely to be long and difficult with many mistakes on the way. But Tesco can afford it, and the long-term benefits are potentially huge.
Long-term you say? Sadly this is of zero interest to British shareholders who are only interested in a fast buck. They want it shut down, and will probably get their way.
05 4 / 2012
A lot of people thought it was an April fool joke when it was announced that the government planned to force internet firms to give the intelligence agencies information about emails and Facebook accounts in real time. Politicians came on the radio and reassured us that our content was safe, and that it was really just the same as telephones. GCHQ could find out who we called and when, but not what we said. Cast-iron guarantees were given.
The whole idea of doing this makes my brain hurt. If I send an email via Google, I am (as far as the network is concerned) loading a web page. One of zillions of web pages being loaded at any point in time.
The network system will presumably have to realise that this is a Gmail page (one of many possible mail systems) and understand enough about the structure of this type of web page to extract the ‘to’ and ‘from’ information. Remember we are dealing with web pages here, not some fixed protocol like ‘classic’ email.
Leave aside for the moment that if I am a bad guy I am likely to use ‘throwaway’ email addresses. I see some issues.
This would have to be done using a technique called Deep Packet Inspection (DPI). This involves inspecting every packet of information through the network and figuring out what is inside it. We used to call it ‘screen scraping’ in the old days. Once you have the web page you need to de-construct it to sort out the bits that make it up. You can only do that if you know how the page is made up.
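To illustrate how fragile that deconstruction is, here is a toy sketch. The page fragments and class names below are entirely invented; real webmail pages are vastly more complex, but the point stands: a pattern written for one layout finds nothing when the layout changes.

```python
import re

# Toy illustration of fragile page deconstruction. The page structure
# and class names are invented for this sketch.
pattern = re.compile(r'<span class="addr-(from|to)">([^<]+)</span>')

# Works while the page matches the layout the pattern was written for...
page_v1 = ('<span class="addr-from">alice@example.com</span>'
           '<span class="addr-to">bob@example.com</span>')
print(dict(pattern.findall(page_v1)))
# {'from': 'alice@example.com', 'to': 'bob@example.com'}

# ...and finds nothing once the mail system changes its layout.
page_v2 = '<div data-role="sender">alice@example.com</div>'
print(pattern.findall(page_v2))  # []
```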
If the mail system changes its page layout then the system has to change as well. It would be quite possible to create a mail system that changed page structure for every mail. I could start an email service tomorrow using my PC, and the system would have to know all about it. Once it has figured out that I exist (how?), I can move it somewhere else. There is nothing internet people hate more than snooping, and someone much brighter than me is going to figure out how to break whatever someone comes up with.
And most mail systems these days do all this with web pages that have been encrypted. Now, DPI with encrypted pages is possible, but as I understand it, it operates on a ‘man in the middle’ basis: that is, the equipment deals directly with the web server and re-encrypts the page with a key it knows about before sending it on to the end user. This would be highly detectable by the punter at the other end and could easily lead to problems with high-security systems such as banking websites.
The government is not doing this with some great firewall of England, but each service provider is going to have to do it, and keep the software up to date as mail systems change.
I just don’t see it. But if someone in the Civil Service has figured out how to do it, then Respect!
04 4 / 2012
A friend of mine was advised by his service provider that he should remove all the Google Analytics tags from his website because it violated the new EU Cookie regulation. My reaction was disbelief because I know that the majority of sites on the web use the Google package, so simply making it illegal is going to cause one or two issues. I have been doing some research.
First, what is Google Analytics, for those not fully up to speed? It is a package that tells the website owner how many visitors the site is getting, where they came from, and what routes visitors take through the site. For a website administrator this stuff is vital information. You can get most of it by analysing the log files kept on the web server, and in fact that is what we used to do, but a product such as Google Analytics works by monitoring traffic in real time on Google’s central computers. It gives more information and it is more accurate.
There are competitors to Google but they are expensive, whereas Google is free, so Google has the field pretty much to itself. In fact around 50-60% of sites on the web use it.
To make it work it is necessary for Google to create items of data called ‘cookies’ and leave them on the user’s computer. It is the same technique used by advertisers to track user behaviour. Ever wondered why you visit (say) Amazon and the next you know you are getting Amazon adverts on all sorts of other websites? It’s cookies.
The EU thought this was an intolerable invasion of privacy, so they passed a law saying that you could create one of these cookies provided you asked the visitor first. To see this in action, go to the Information Commissioner’s Office (ICO) website. Do you see the little box that appears at the top of the page? Unless you check the box, they won’t create cookies, and your visit won’t appear on their analytics.
I read that 90% of visitors don’t.
That pretty much makes Google Analytics a non-starter if you want to be fully legal then. You would think that Google would be really hot on the case here. I see that they are working on it, and you should check their blog for updates.
I would imagine that Google could fix their system to be fully compliant, but it would involve a degradation of the level of service, for example not being able to identify returning visitors. This is important information and my guess is that they are in negotiation with someone to try and get an exemption or at least some sort of understanding by the authorities. None of the information retained is personal information or can be linked to personal information so I am sure a resolution is possible if only people behave sensibly.
The UK Government is not keen on this. The following is a quote from the snappily-titled Implementer Guide to Privacy & Electronic Communications Regulations (PECRs) for public sector websites.
Use of web-analytics/metrics: The use of metrics is integral to departments’ being able to provide the best possible user experience in order to encourage citizens to use more cost-effective channels for accessing government services. They also allow departments to assess and demonstrate whether the digital services they offer provide “value-for-money” as demonstrated by the recent National Audit Office (NAO) report.
Consequently, collecting these metrics is essential to the effective operation of government websites; at present the setting of cookies is the most effective way of doing this.
The ICO guidance supports this view as it states “…it is highly unlikely that priority for any formal action would be given to focusing on uses of cookies where there is a low level of intrusiveness and risk of harm to individuals. Provided clear information is given about their activities we are unlikely to prioritise first-party cookies used only for analytical purposes in any consideration of regulatory action”
So they seem to think that it may be non-compliant with EU regulations but nobody is going to do anything about it. (I think that is what the last bit of gobbledegook means.)
So what do you do?
First up, recognise that this is an almighty mess. Personally (and this is not legal advice; the only legal advice I ever give is not to sign anything without reading it first) I would make sure that my site has a privacy statement. There is a good example here: http://www.culture.gov.uk/4902.aspx (I don’t see a checkbox on this government site, by the way). And watch the news for developments.
Something will need to be done to sort this out. Watch this space.