I will be the first one to admit I don’t really consider myself a developer; I am more of a coder. The irony is that I have a computer science degree, ha. Anyway, a developer understands all the theory behind the implementation. A coder gets the code to do what they want without fully understanding how. In the past two years, although I have been moving away from coding, I keep getting sucked back in. It’s so easy to throw together a simple web page with email capture, etc. The hard part is scaling it to 1M+ daily active users. Scaling is when real developers are needed, not coders. A coder is needed when hacking together a Minimum Viable Product (MVP) to get out the door. Building a startup requires both developers and coders; you just need to know when and how to leverage the strengths of each.
Anyway, enough about developers/coders – it’s time to optimize that code. I have built a number of products over the past few years and have really started to think about the optimization that SHOULD go on behind the code, or the lack thereof. Just think of a simple checkout page. Should you capture the email before they complete the order for an instant email campaign? Do you validate the shipping address before or after the order has been placed? There are many things to test when it comes to ecommerce as well as a web application, but who REALLY does this? I get sent links to Which Test Won every time there is a new post (thanks Nelson), but I have to wonder, do companies really invest this amount of time and money into optimization? I guess what I am really asking is: which companies out there are doing true A/B testing to improve conversion? Are they taking it a step further and integrating all this testing knowledge into their products while bugs and fixes are being worked on simultaneously?
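For what it’s worth, the plumbing behind that kind of checkout test doesn’t have to be heavy. Here is a minimal sketch of how you might bucket users for those questions – the experiment names and the assign_variant helper are purely hypothetical, not from any particular testing library:

```python
import hashlib

# Hypothetical experiments for the checkout page -- the names and variants
# here are made up for illustration, not from any real framework.
EXPERIMENTS = {
    "checkout_email_capture": ["before_order", "after_order"],
    "shipping_validation": ["before_submit", "after_submit"],
}

def assign_variant(user_id, experiment):
    """Deterministically bucket a user into a variant by hashing their id."""
    variants = EXPERIMENTS[experiment]
    digest = hashlib.md5(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Example: decide whether this visitor gets asked for an email before checkout.
show_email_field_on_cart = (
    assign_variant("user-42", "checkout_email_capture") == "before_order"
)
```

Hashing the user id keeps the assignment sticky, so the same visitor always sees the same variant while you compare conversion between the two groups.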
While attending Distilled SearchLove I got to listen to Mat Clayton of MixCloud, who talked a lot about testing. I got the chance to pull him aside and understand how testing is looped into their development process. One of the ways they do this is by leveraging three states per feature: global, limited, and disabled. I am sure this is where Dave McClure would chime in to preach about how you should always be able to kill features and drop an F-bomb, but it adds to the complexity. When you are building something for the first time you are most likely not going to get it right. Why not build in ways to test your assumptions and see if they hold up? On top of that, build a throwaway version as fast as possible to get something live and out the door. Worst case, you can disable it and move on. Building out this process to disable features and test assumptions is at the core of a lean startup, but the big question is… is it part of your current business development roadmap?
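To make the idea concrete, here is a rough sketch of what three states per feature could look like – this is my own illustration (the FLAGS dict and is_enabled helper are invented), not MixCloud’s actual implementation:

```python
# Three states per feature: "global" (everyone sees it), "limited" (only a
# test segment), "disabled" (the kill switch). Illustrative only.
FLAGS = {
    "new_player":   {"state": "limited", "segment": {"user-1", "user-7"}},
    "social_share": {"state": "global"},
    "old_widget":   {"state": "disabled"},
}

def is_enabled(feature, user_id):
    flag = FLAGS.get(feature, {"state": "disabled"})
    if flag["state"] == "global":
        return True
    if flag["state"] == "limited":
        return user_id in flag.get("segment", set())
    return False  # disabled: worst case, kill the feature here and move on

if is_enabled("new_player", "user-7"):
    pass  # render the new feature only for users in the limited segment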
Building a simple test platform into your features is a LOT easier when you start development with it than when you try to retrofit it post-launch. That said, I am a HUGE fan of building two – the first one is a learning experience that you throw away – which is one of the only lessons from a CS class at UVM that I still think about.
So who is building testing in as part of development besides MixCloud? Are you testing your product while it’s being developed and killing features no one uses? Taking it a step further, are you testing links to features that don’t exist yet but could be built out, or are on the roadmap, and would help improve a conversion/goal/ROI? Just think: before you even code a feature, drop a link in front of a segment of users, see who clicks, and throw a “feature not available yet” message in there. This will give you some numbers and a baseline on whether users want it and/or whether it will help your conversion.
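As a sketch of what that could look like – with the “Wishlists” feature, the in_test_segment split, and the click counter all invented for illustration – something like this is enough to get a baseline:

```python
import hashlib
from collections import Counter

clicks = Counter()  # stand-in for whatever analytics/event tracking you already run

def in_test_segment(user_id, percent=10):
    """Put roughly `percent` of users into the fake-door segment."""
    return int(hashlib.md5(user_id.encode()).hexdigest(), 16) % 100 < percent

def render_nav(user_id):
    """Only the test segment ever sees the link to the unbuilt feature."""
    nav = ["Home", "Browse", "Account"]
    if in_test_segment(user_id):
        nav.append("Wishlists")
    return nav

def handle_wishlists_click(user_id):
    """The link lands on an honest 'not available yet' page, and we count the click."""
    clicks["wishlists_fake_door"] += 1
    return "Sorry, this feature isn't available yet - we're working on it!"
```

Compare the clicks against how many people in the segment saw the link and you have your numbers before writing a single line of the real feature.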