The Environmental Economics Approach to Liveops Content Management

I’m sure Candy Crush will put Hotelling over the line for a posthumous Nobel Prize.

In 1931, American economist Harold Hotelling published the seminal paper "The Economics of Exhaustible Resources." Hotelling described a problem many firms face: how much of a non-renewable resource should they sell at any given time? The problem is most obvious when thinking about managing an oil supply, but it is just as relevant when considering how to manage match-3 levels.

For oil firms, since supply is always declining, the price should rise in every \(t + 1\) period, holding all else constant. As a result, an optimization question emerges: do I sell my oil now or wait for a higher price later? To solve it, we first need to understand net present value, the idea that value now is worth more than value later. We can model this as:
$$ V_{today} = \frac{V_{later}}{(1+r)^n} $$
Where \(r\) is the interest rate (a proxy for opportunity cost) and \(n\) is the number of periods. All we're saying here is that $1,000 seven years from now is only worth about $710 today at a 5% interest rate. This is because I could take that $710, invest it, and in seven years I'd have $1,000 given the 5% interest rate.
$$ \$710 \approx \$1000/(1+0.05)^7 $$
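The discounting arithmetic is easy to sanity-check in code. A minimal sketch (the function name `present_value` is mine, not from any library):

```python
def present_value(future_value: float, rate: float, periods: int) -> float:
    """Discount a cash flow received `periods` from now back to today."""
    return future_value / (1 + rate) ** periods

# $1,000 seven years out, at a 5% interest rate, is worth ~$710 today.
print(round(present_value(1000, 0.05, 7), 2))  # 710.68
```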

The higher the interest rate, the more sense it makes to sell oil now and invest the proceeds rather than wait for the price of oil to increase. The oil firm needs to compare the rate at which the oil price rises against the rate of interest. This is Hotelling's rule: in equilibrium, the price of the resource grows at the rate of interest over time.
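We can make the firm's comparison concrete. A hedged sketch with made-up numbers: the firm either sells today and compounds the cash at the interest rate, or waits while the oil appreciates in the ground.

```python
def payoff_sell_now(barrels: float, price: float, rate: float, years: int) -> float:
    # Sell today, then invest the proceeds at the interest rate.
    return barrels * price * (1 + rate) ** years

def payoff_wait(barrels: float, price: float, growth: float, years: int) -> float:
    # Leave the oil in the ground while its price appreciates.
    return barrels * price * (1 + growth) ** years

# Interest at 5% vs. price growth of 3%: selling now and investing wins.
print(payoff_sell_now(1000, 80.0, rate=0.05, years=10)
      > payoff_wait(1000, 80.0, growth=0.03, years=10))  # True
```

Flip the price growth rate above the interest rate and waiting wins instead, which is exactly the margin the oil firm is trading on.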


Hotelling’s approach and intuition helped form the field of dynamic optimization in economics. More wrinkles have been added to the model to solve everything from cake-eating problems to managing how many fish should be caught. The tools of the field help us solve tricky now-or-later dilemmas.

Consider the release cadence of television show episodes. Should Netflix release all episodes in a given season at once, in batches, or weekly? Perhaps even monthly? The problem is littered throughout gaming: should I use a limited-time direct store or an always-growing catalog store? How should I distribute rewards throughout progression? Should I save content for a UA burst? And like the oil firm, we’re in search of the content-rationing solution that maximizes \(LTV\).

Match-3 Player Level Management

Consider how King should ration Candy Crush levels for new players. King is producing new levels at some constant rate \(p\). But we know that players are consuming levels at a greater rate than King can make them in any given time period \(t\). $$ c_t > p_t $$
At some point, a given player will run out of levels; how quickly is governed by the size of \(c_t - p_t\). We can pick some illustrative constants to visualize this.
We might also think of a player as having a churn probability on any given day since install. This declines over time: the best predictor of a player retaining on D30 is whether they retained on D29 (decreasing convex, or \( \frac{\partial \Pr(\text{Churn})}{\partial \text{DaysSinceInstall}} < 0 \)). This suggests elder players are less difficulty “elastic,” i.e., they respond less to increases in difficulty.
And yet, we might imagine that the probability of churn also increases every day without new levels (increasing convex, or \( \frac{\partial \Pr(\text{Churn} \mid \text{NoNewLevels})}{\partial \text{DaysSinceInstall}} > 0 \)). In the example above, the player ran out of levels on D205.
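A quick simulation shows how the exhaustion point emerges. All constants here are assumptions, chosen so exhaustion lands on D205 as in the example:

```python
STOCKPILE_AT_INSTALL = 1845  # levels available on day 0 (assumed)
PRODUCTION_PER_DAY = 3       # p_t: King's output (assumed)
CONSUMPTION_PER_DAY = 12     # c_t: the player's clear rate (assumed)

levels_remaining = STOCKPILE_AT_INSTALL
day = 0
while levels_remaining > 0:
    day += 1
    # Net drain each day is c_t - p_t = 9 levels.
    levels_remaining += PRODUCTION_PER_DAY - CONSUMPTION_PER_DAY

print(day)  # 205 -- the exhaustion point
```

Closed form, the exhaustion day is just the stockpile divided by \(c_t - p_t\); the loop only makes the day-by-day drain explicit.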
In the long run, King could ramp up level production, but that would eventually hit diminishing returns and has its own labor costs. As a substitute, King’s main lever for controlling player progression speed is difficulty. By raising difficulty, King can shift the line further right, as it would take longer for the player to reach the no-content, or exhaustion, point. King could even tune difficulty such that \(c_t = p_t\), and the player would never run out of levels. On the other hand, if difficulty is too high, churn will spike (difficulty that is too low causes this too!).

This suggests that King needs to solve an intertemporal difficulty optimization problem. That’s a mouthful, but all it means is that King needs to balance the spike in churn probability from running out of levels against the spike in churn probability from increasing difficulty too much. To do so, we first need to model the total marginal effect of no new levels for every day since install, \(d\): the churn probability once there are no more levels, compared against the counterfactual churn probability had new levels kept coming.
$$ \text{Total Marginal Effect of No New Levels} \\ = \sum_d \left[ \Pr(\text{Churn} \mid \text{NoNewLevels})_d - \Pr(\text{Churn} \mid \text{NewLevels})_d \right] $$

Increasing difficulty to delay exhaustion, meanwhile, has its own cost.

$$ \text{Total Marginal Effect of Increasing Difficulty} \\ = \sum_d \left[ \Pr(\text{Churn} \mid \text{Difficulty}_{x+1})_d - \Pr(\text{Churn} \mid \text{Difficulty}_x)_d \right] $$

The difficulty that balances these two marginal effects is what we might call the maximum sustainable difficulty. The intuition is the same as what we first developed for the oil firm.
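One way to sketch that optimization numerically is below. Every functional form and constant is an assumption for illustration (the exponential baseline churn, the linear difficulty penalty, the square-root no-new-levels penalty); the point is only the shape of the trade-off: harder levels cost churn every day, but they push the exhaustion point out.

```python
import math

HORIZON = 365  # days of player lifetime we care about (assumed)

def base_churn(day: int) -> float:
    # Baseline daily churn: decreasing convex in days since install (assumed form).
    return 0.05 * math.exp(-day / 60) + 0.005

def exhaustion_day(difficulty: float) -> int:
    # Higher difficulty slows level consumption, pushing exhaustion out (assumed form).
    consumption = 12 / difficulty  # levels cleared per day
    production = 3                 # levels King ships per day
    if consumption <= production:
        return HORIZON + 1         # the player never runs out
    return int(1845 / (consumption - production))

def daily_churn(day: int, difficulty: float, exhausted: bool) -> float:
    p = base_churn(day)
    p += 0.01 * (difficulty - 1)     # cost of harder levels (assumed linear)
    if exhausted:
        p += 0.002 * math.sqrt(day)  # cost of no new levels, increasing in day
    return min(p, 1.0)

def total_churn(difficulty: float) -> float:
    # Sum of daily churn probabilities over the horizon: a crude stand-in for
    # the churn cost that the two marginal-effect sums above describe.
    ex = exhaustion_day(difficulty)
    return sum(daily_churn(d, difficulty, d >= ex) for d in range(1, HORIZON + 1))

# Grid search for the difficulty that minimizes total churn cost.
best = min((x / 10 for x in range(10, 41)), key=total_churn)
print(best)
```

Under these made-up constants the search lands on an interior optimum: just hard enough to stretch the stockpile past the horizon, and no harder. That interior solution is the maximum sustainable difficulty in miniature.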

A bunch more stuff falls out of the model. For instance, King should significantly increase the difficulty of levels near the exhaustion point, just as the oil firm would significantly raise the price of the last bits of oil. There’s very little downside to doing so, as the player will be in a high churn-probability regime soon enough anyway. Furthermore, since King is always stockpiling levels, new players face an exhaustion point further away than players who downloaded the game at first launch. It’s in King’s interest to ease up on the difficulty of early levels, since the area underneath the churn curve expands as the exhaustion point moves rightward.

There’s much more room for expansion of the model and I encourage data scientists to play a strong role in systems design.
