I love the way André slices through the BS to deliver a clear and simple idea. If you can’t explain something quickly and in simple terms, you don’t know enough about it to even try. André here is a great example of someone who can.
- Your quantitative web-analytics data can’t tell you what you can’t measure, such as intention and deep motivations. These only become evident in the more complex abstractions of UX and user perception. “Web analytics tell you where you can optimize, but not what you should do.”
- UX is composed of both an emotional dimension and a rational dimension. To smooth, and if possible increase, the motivation to move down the funnel, you need to address the rational value proposition as well as the emotional one. Humans are much more rational than animals, but a lot less rational than we give ourselves credit for.
- Understand the subconscious needs of your customers, as these are the driving factors behind pretty much any human effort.
- Analysis, Concept and Hypothesis have to be aligned. You are building a model of how your users react to your online experience, and that model has to be fed by personas that explain their behavior. A good persona model is one that allows bold extrapolations from what you know about your personas into hypotheses, enabling big leaps in conversion rates.
- Test among great options. A great secret among artists of every kind is using great materials. The great chefs of the world take great care in selecting produce, meats and other ingredients before buying them. The same goes for hypotheses: focus on the ones that promise big positive change.
- Conversion Rate Optimization has to be measured at the ROI level, and this information must be visible to top decision makers. Having executives who support and take an interest in the CRO cycle is critical to the success of the process.
- Do not let your ego push you toward confirming your own bias in the test data. Be objective. A losing hypothesis is as informative and valuable as a winning one, because the real value is a more accurate understanding of UX and of your users in general. The best way to avoid this is to separate the test analyst from the hypothesis analyst, removing the very human tendency known as “confirmation bias”.
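One way to keep that objectivity is to let the numbers decide. A minimal sketch of how a test analyst might evaluate an A/B test with a standard two-proportion z-test, independent of who wrote the hypothesis (the conversion counts below are hypothetical, and this is just one common approach, not anything prescribed in the podcast):

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a / n_a: conversions and visitors for the control
    conv_b / n_b: conversions and visitors for the variant
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of "no difference"
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: control converts 500/10000, variant 560/10000
z, p = two_proportion_z_test(500, 10000, 560, 10000)
print(f"z = {z:.2f}, p = {p:.3f}")
```

Whether the variant wins or loses, the verdict comes from the p-value, not from whoever championed the hypothesis, which is exactly the separation of roles the point above argues for.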
I’ll leave you guys with this great podcast from PRWD.