“We want to be more data-driven” or “We want to create a stronger data culture” are common refrains from organisations. Often these notions go unchallenged. The claim that organisations make better decisions with data at hand is, perhaps ironically, never investigated empirically. Dig a little deeper, however, and it’s not the data that drives the value, but the better decisions made through empiricism. And empiricism isn’t merely data; it’s a method of acquiring knowledge.
To even get off the ground analysing data, you need a theory of measurement. What should we track, given limited engineering resources and rising storage costs? Claiming we should track, say, payments and logins, to the exclusion of, say, battle logs, implies a cost-benefit value ranking: it’s more valuable to spend resources tracking payments. But these values are derived from somewhere. This “somewhere” is a set of assumptions expressed as if-then statements. “If we raise price, then quantity demanded will fall” is a claim that can be defended on empirical or tautological grounds (1+1=2 because the rules define it as such). It’s true because whenever we raised price in the past, quantity demanded fell, or because we assume people respond to incentives (holding all else constant, a higher price is a disincentive).
Methodically refining these assumptions is crucial to evaluating the validity of these if-then statements. Will adding a clan system to the game maximize LTV relative to adding a login reward system? A method isn’t just helpful for evaluating this question; it’s impossible to answer without one.
The refinement of methodology through deliberation is at the core of what makes science, science. For science and its refinement process to even take place, we need clear assumptions that produce those previously mentioned if-then claims. This is what I’ve seen Product Managers struggle with the most. Putting a feature on the roadmap carries a set of implicit assumptions, and these assumptions deserve to be made explicit so they can be debated and analysed.
As it turns out, the results produced by the scientific process are increasingly monetizable thanks to the low-marginal-cost, stable-fixed-cost economics of software distribution. For example, if you find that people generally play your game on low-end devices and then start optimising for such devices, the benefit scales to millions of players. The total addressable market is the extent of the market.
And to be clear, you don’t need empirical data to do science. Economists and physicists make claims about the real world using accounting identities or tautologies like “something that is priced higher costs more”, which are true as a function of the rules (in this case, of language).
Arguing for being data-driven or data-informed misplaces where the value lies in the supply chain. Western science has never been more adored or employed than it is now, something we all benefit from. But we need to be more explicit in our endeavour: this isn’t about data, it’s about science.