Clay Christensen's Milkshake Marketing (2011) (library.hbs.edu)
25 points by vismit2000 4 days ago | 13 comments
pwatsonwailes 2 hours ago | parent | next [-]

The problem I tend to see is that companies say they're doing JTBD research, but they're actually just running attribute preference surveys: asking customers to rank features from a list of things the company would like to build, rather than starting from the assumption that you don't know what customers require.

What people say they want (feature preferences) almost always diverges from what they actually want the product to do (a functional, emotional, or social outcome). That gets more complex when you consider that there are different levels at which you can evaluate what someone wants, which in the JTBD world are thought of as jobs as progress (why they're doing the thing) and jobs as outcomes (how they're doing the thing). There's another famous example, from the evolution of Bosch's circular saws. Professionals said they wanted lighter tools (and that's true), but the constraint they actually experienced wasn't the weight itself, it was the impact the weight had on how they worked. So you can solve for weight, or you can solve for improved usability. Symptoms vs causes, sort of thing.

This is also why product teams should involve marketers, and why marketers should understand research design. The teams I've seen do this well aren't, for the most part, running quick preference tests and A/B tests on features. They're generally more focused on running continuous feedback loops: conducting broader research, engaging in grounded-theory-style interpretation to understand what they can do, looking at field validation to figure out what they should do, and then iterating.

As a side note, for B2B especially: if your value proposition is something like accountability or proof of value, but your product's workflows don't make accountability or proving value effortless, fixing that workflow will do more for brand perception than any campaign, because nothing nukes good comms like a poor experience.

squirrel6 2 hours ago | parent [-]

This is all very true. I always try to push for customer interviews to be as much about formulating new hypotheses as they are about validating existing hypotheses.

getnormality 38 minutes ago | parent [-]

I want to scream [1] every. single. time. a business wants to talk to me. Every second of the conversation is just the same thing, over and over: here are my boxes. Please put yourself in one of my boxes.

There is no room for my feelings or free expression. It is inconvenient to the agenda of whoever created the survey.

If companies are so desperate to know what customers want, how come no one, in the past 25 years of my life as a consumer, has ever had any time to ask me a single real question?

[1] At least that's how I would feel if I hadn't numbed myself to this decades ago. Now I just escape. I hang up, I click the X, I think of nothing but getting away from this thing that, no matter what its motivation, no matter what product or cause it's selling, has exactly the same agenda: "would you like to dehumanize yourself for five minutes, so that I can make a data table that I was paid to make by someone who doesn't know what they're doing or why?"

deaux 6 minutes ago | parent | next [-]

Too many layers and competing interests. The person who ends up creating the survey realistically cares much more about going home 1 minute earlier or with 1% less neural energy spent. They've got their kids to manage when they get home.

Do you use any products from a one-person shop? They will be so much more likely to ask you real questions. Not guaranteed, of course, as some people are just clueless. Those usually don't last long as business owners though.

pwatsonwailes 32 minutes ago | parent | prev [-]

"how come no one, in the past 25 years of my life as a consumer, has ever had any time to ask me a single real question"

There are a lot of shit marketers, is my short answer.

Like, a lot.

btilly 25 minutes ago | parent | prev | next [-]

The problem with general surveys is that you find out what people who don't buy your product think that they might like. Then they don't buy your improved product either.

A second problem is that we're often unaware of what we actually responded to. That's one of the things that A/B testing reveals.

fifticon 4 hours ago | parent | prev | next [-]

It is a bit sad that people have to be taught this; I am presuming the product people are humans too. But when I see their outputs, maybe this Christensen guy is right.

I tried to adjust the background image on Microsoft Teams video calls this morning; the UI I had to use, or rather figure out, to achieve that was deeply depressing. (1) The settings menus in Teams are well hidden, for reasons unclear to me (*). (2) The _actual_ settings you need are hidden unless you START a meeting call. (3) And the _actual_ settings are a long chain of ".. but are you sure you REALLY want to see the ACTUAL settings?", where you must keep clicking 'more settings', 'advanced settings', 'full actual settings' (I am paraphrasing).

(*) I suspect what they are, though: something about dumbing the UI down to the level where the people in charge of Teams can understand it, plus some kind of fear of UI designs where any given screen or view contains more than 1 or 2 elements (the second element being "show further settings").

We are dumbing down UI to the level of people with no hands, no eyes, no brains, which I presume is the target audience. I must have mah minimalism.

pwatsonwailes 3 hours ago | parent | next [-]

It's what always happens when there's a disconnect between the product built and the actual thing people want to do. In marketing, we differentiate between Jobs-As-Activities (the task of "changing a background") and Jobs-As-Progress (the user trying to go from something being unsatisfactory to something better).

When UI feels dumbed down to that level, or hidden behind advanced settings, it’s often because the product team ends up treating users as a gestalt persona, rather than thinking about their constraints around time and attention. The most meaningful innovations occur when customer insights influence development before launch; sadly, that frequently doesn't happen. People launch the thing, come up with features they could add, ask what people want from that list (and potentially don't even do that) and then add stuff like barnacles accumulating on a ship.

fifticon 4 hours ago | parent | prev [-]

I didn't follow the idea of chunkier fruit in those travel milkshakes, though. Isn't that what clogs the straw? Not ideal for a one-hand treat.

thaumasiotes 3 hours ago | parent [-]

The article presents the fruit as a way to make the milkshake more "interesting", addressing the fact that existing customers were purchasing milkshakes in part to make their commute less boring.

Weirdly, there's no follow-up on whether the changes improved sales, margins, or any other goal.

JohnCClarke 2 hours ago | parent | prev [-]

This example is always cited as different from the "demographics" approach. But it literally started by segmenting the buyers, and then focussing on a previously unrecognised demographic sector (car commuters).

Clay Christensen is smart, and one of the many things he is smart about is marketing Clay Christensen.

getnormality 33 minutes ago | parent | next [-]

How is "car commuters" a demographic segment?

Does the word "demography" mean anything?

Is "people who like to buy milkshakes" a demographic segment too?

pwatsonwailes 31 minutes ago | parent | prev [-]

Good segmentation is mostly not by demography nowadays. At best, demographics are a correlative element to something more fundamental, usually economic, behavioural or psychographic.