Great post. I think you all took a very ‘scientific method’-like approach to quantifying your results. Like some of the other commenters, I’d like to know how you got those 800 visits so quickly. What were your promotion techniques? I recently watched a talk by Gary Vaynerchuk, a well-known venture capitalist who’s known for getting a little ‘scrappy’. His opinion was that you SHOULD use Google ads or whatever marketing tactics are available. Simply using ALL forms of marketing is important. I’m not sure how valuable purchasing ads is for an experiment, but spending marketing dollars to attract customers is a no-brainer. Google created a tool to be used, not to be debated among entrepreneurs questioning the validity of their product. Batman uses all of his tools at one time or another :).
– Joshua
How do you know it isn’t 😉
I think you make a strong point here, and we could have done a better job discussing our interpretation of the results.
Ultimately there’s no good way (that we found) to distinguish such events. You could also argue that features that are more prominent in the UI (at the top of the sidebar, for example) are more likely to see clicks.
The best you can do is take the data for what it is – fuzzy. But that doesn’t mean it’s not meaningful, just that results should be carefully considered and (most importantly) discussed with customers who went through the experiment. We were very diligent about doing both of those.
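To make the “take the data for what it is” point concrete, here is a minimal sketch of one way to reduce curiosity noise in fake-door click counts. The log format, visitor IDs, and feature names are all hypothetical, not from the original experiment; the idea is simply to count unique visitors per feature rather than raw clicks:

```python
from collections import Counter

# Hypothetical click log from a fake-door test: (visitor_id, feature) pairs.
# None of these names come from the actual experiment.
clicks = [
    ("v1", "API config"),
    ("v1", "API config"),  # repeat click by the same curious visitor
    ("v2", "API config"),
    ("v2", "Export CSV"),
    ("v3", "Export CSV"),
]

# Deduplicate to one (visitor, feature) pair each, then count visitors per
# feature: a single visitor clicking repeatedly out of curiosity no longer
# inflates that feature's apparent demand.
interest = Counter(feature for _, feature in set(clicks))

for feature, visitors in interest.most_common():
    print(f"{feature}: {visitors} unique visitor(s)")
```

This doesn’t distinguish need from curiosity on its own, but it gives a less fuzzy number to bring into those follow-up customer conversations.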
While I find the approach very effective overall, I’d like to point out something I was wondering about: most of the non-implemented features aren’t really self-explanatory, and someone might click on the related item just out of curiosity (to try to understand what it looks like, e.g. “API config”).
For you, all the clicks translate into a potential client’s desire to have that feature, which might not always be the case.
How do you distinguish actual needs from general curiosity?