johnfn 4 hours ago

Sure, you can spend the weeks to months of expensive and time-consuming work it takes to get a fuzzy, half-accurate, and biased picture of what your users' workflows look like through user interviews and surveys. Or you can look at the analytics, which tell you everything you need to know immediately, always up to date, with perfect precision.

Sometimes HN drives me crazy. From this thread you’d think telemetry is screen recording your every move and facial expression and sending it to the government. I’ve worked at places that had telemetry and it’s more along the granularity of “how many people clicked the secondary button on the third tab?” This is a far cry from “spying on users”.
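For concreteness, that level of granularity amounts to counting named UI events and nothing else. A hypothetical sketch (event names are invented; nothing here is any real product's telemetry):

```python
# Hypothetical coarse-grained telemetry: count named UI events.
# No user identity, no payload beyond the event name itself.
from collections import Counter

class Telemetry:
    def __init__(self):
        self.counts = Counter()

    def record(self, event: str) -> None:
        # e.g. "tab3_secondary_button_clicked"
        self.counts[event] += 1

    def report(self) -> dict:
        # Aggregate counts are all that would ever leave the app.
        return dict(self.counts)

t = Telemetry()
t.record("tab3_secondary_button_clicked")
t.record("tab3_secondary_button_clicked")
t.record("export_clicked")
print(t.report())
```

The point of the sketch is what's absent: no session replay, no identifiers, just per-event tallies.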

graphememes 31 minutes ago | parent | next [-]

You're never going to win this argument. Most of the people who post here have never actually shipped a product themselves; they only work on isolated features while others handle and manage all of this for them, so they have no real understanding of what it takes.

The other crowd that pretends otherwise are larping, or only have some generic open source project that a handful of people use and that they update every six years.

embedding-shape 2 minutes ago | parent [-]

> You're never going to win this argument

Probably because there is no "truth" here, only subjective opinion; there is no "winning", only "learning" and "sharing".

I could ramble the same way about how "people relying on data never shipped an enjoyable thing that people ended up loving, they only care about shipping as fast as possible" and yadda yadda, or I can actually make my points for why I believe what I believe. I do know what I prefer to read, so that's what I try to contribute back.

embedding-shape 4 hours ago | parent | prev | next [-]

> Sure, you can spend the weeks to months of expensive and time-consuming work it takes to get a fuzzy, half-accurate, and biased picture of what your users' workflows look like through user interviews and surveys. Or you can look at the analytics, which tell you everything you need to know immediately, always up to date, with perfect precision.

Yes, admittedly, the first time you do these things they're hard and you have lots to learn. But as you do this more often, build up a knowledge base, and learn about your users, you gain knowledge and experience you can reuse, and it no longer takes weeks or months of investigation to answer "Where should this button go?"; you base it on what you already know.

johnfn 2 hours ago | parent | next [-]

You seem to be interpreting my position as saying that one should only use telemetry to make decisions. Of course, no one reasonable would hold that position! What I’m saying is that only relying on user interviews without supplementing them with analytics would be knowingly introducing a blind spot into how you understand user behavior.

embedding-shape an hour ago | parent [-]

Yes, probably because someone else said "If you don't have analytics you are flying blind", which I initially replied to. When you replied to my reply, I took that as you agreeing with the parent, which isn't necessarily true.

> What I’m saying is that only relying on user interviews without supplementing them

I also took your "spend the weeks to months of expensive and time consuming work [...] Or you can look at the analytics" as an either/or proposition; if we're making that choice, I'd go with qualitative data rather than quantitative, regardless of the time taken. But it probably comes down to what tradeoffs we're willing to accept.

hombre_fatal 3 hours ago | parent | prev | next [-]

Asking users isn't a substitute for usage data.

Usage data is the ground truth.

Soliciting user feedback is invasive, and it's only possible for some questions.

The HN response to this is "too bad" but it's a thought-terminating response.

AlotOfReading 3 hours ago | parent | next [-]

It goes the other way as well. Usage data isn't equivalent to asking users either. A solid percentage of bad decisions in tech can be traced to someone, somewhere forgetting that distinction and trusting usage data that says it's okay to remove <very important feature> because it's infrequently used.

hombre_fatal 43 minutes ago | parent | next [-]

Yeah, it's not a good discussion without concrete examples.

One: Building a good UX involves guesswork and experiments. You don't know what will be best for most users until you try something. You will often be wrong, and you rarely find the global maximum on the first try.

This applies to major features but also the most trivial UI details like whether users understand that this label can be clicked or that this button exists.

Two: Like all software, you're in a constant battle to avoid encumbering the system with things you don't actually need, like leaving around UI components that people don't use. Yet you don't want to become so terse with the UI that people find it confusing.

Three: I ran a popular cryptocurrency-related service where people constantly complained about there being no 2FA. I built it and polished a UX flow to both hint at the feature and make it easy to set up. A few months later I saw that only a few people enabled it.

Was it broken? No. It just turns out that people didn't really want to use 2FA.

The point being that you can be super wrong about usage patterns even after talking to users.
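That adoption gap is exactly the kind of cheap question telemetry answers. A sketch with invented event names and users (not the real service's data):

```python
# Hypothetical adoption-rate check: of the users who saw the 2FA
# setup hint, how many actually finished enabling it?
def adoption_rate(events: list) -> float:
    saw = {e["user"] for e in events if e["name"] == "2fa_hint_shown"}
    enabled = {e["user"] for e in events if e["name"] == "2fa_enabled"}
    return len(enabled & saw) / len(saw) if saw else 0.0

# Invented example events: four users saw the hint, one enabled 2FA.
events = [
    {"user": "u1", "name": "2fa_hint_shown"},
    {"user": "u2", "name": "2fa_hint_shown"},
    {"user": "u3", "name": "2fa_hint_shown"},
    {"user": "u4", "name": "2fa_hint_shown"},
    {"user": "u1", "name": "2fa_enabled"},
]
print(adoption_rate(events))  # 0.25
```

A survey would have predicted near-universal demand; the funnel number is what revealed the gap.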

Finally: It's easy to think about companies we don't like and telemetry that's too snitchy. I don't want Microslop phoning home each app I open.

But if we only focus on the worst cases, we miss out on the more reasonable cases where thoughtful developers collect minimal data in an earnest effort to make the UX better for everyone.

ragall 22 minutes ago | parent [-]

> You don't know what will be best for most users until you try something.

That's because you don't understand your users. If you did, you wouldn't need to spy on them.

> you rarely find the global maximum on the first try

One never finds the "global maximum" with telemetry, at best a local sort-of maximum. To find what's best, you need understanding, which you never get from telemetry. Telemetry tells you what was done, not why, or what was on people's minds when it was done.

junon 2 hours ago | parent | prev [-]

This. If I'm forced to use a feature I hate because it's the only way to do something, the "ground truth" reflects that I like that feature. It doesn't tell the whole story.

groby_b an hour ago | parent [-]

Most metrics teams are reasonably competent and are aware of that, excepting "growth hackers".

I haven't been in a single metrics discussion where we didn't talk about what we're actually measuring, if it reflects what we want to measure, and how to counterbalance metrics sufficiently so we don't build yet another growthhacking disaster.

Doesn't mean that metrics are perfect - they are in fact aggravatingly imprecise - but the ground truth is usually somewhat better than "you clicked it, musta liked it!"

embedding-shape 2 hours ago | parent | prev [-]

> Usage data is the ground truth.

For what, precisely? As far as I know, you can use it to know "how much is X used" but not more than that, and it's not a "ground truth" for anything besides that.

acedTrex 4 hours ago | parent | prev [-]

So either you don't want to spend the time doing that, or, as is more often the case in corporate settings, the turnover of the team is high enough that no one is around long enough to build that deep foundational product knowledge; and, to be frank, most people do not care enough.

This is why telemetry happens: it's faster, easier, and more resilient to organizational turmoil.

embedding-shape 3 hours ago | parent [-]

> This is why telemetry happens: it's faster, easier, and more resilient to organizational turmoil.

I don't disagree with that; I was mainly talking about trying to deliver an experience that makes sense, is intuitive, and is as helpful and useful as possible, even in exchange for it taking longer.

Of course this isn't applicable in every case; sometimes you need different tradeoffs, and that's OK too. But that some favor quality over shorter implementation time shouldn't drive people crazy; it's just making different tradeoffs.

acedTrex 3 hours ago | parent [-]

> even in exchange for it taking longer.

I think with corporate teams this is the issue a lot of the time: people just are not on the team long enough to build that knowledge. Between the constant reorgs, and these days layoffs and other churn, no one puts in the years required to gain the implicit knowledge. So orgs reach for the "tenure-independent knowledge base".

6r17 21 minutes ago | parent | prev | next [-]

"You'd think telemetry is screen recording your every move" - that's literally what tracing and telemetry are about.

"Sure, you can spend the weeks to months of expensive and time consuming work it takes to get a fuzzy, half accurate and biased picture of what your users workflows look like through user interviews and surveys. Or you can look at the analytics, which tell you everything you need to know immediately, always up to date, with perfect precision." -> Your analytics will never show what you didn't measure; they only show what you already worked on. At best, analytics are a kind of validation mechanism, not a driver for feature exploration.

This kind of monitoring needs to go through documented data-exposure review, and it's a sufficient argument for a company to stop using GitHub immediately if it takes security seriously.

But I'd add that if you take security seriously, you are not on GitHub anyway.

Lammy an hour ago | parent | prev | next [-]

> and sending it to the government

It literally is. The network itself is always listening: https://en.wikipedia.org/wiki/Room_641A

The mere act of making a network connection leaks my physical location, the time I'm using my computer, and the fact that I use a particular piece of software. Given enough telemetry endpoints, that creates a fingerprint unique to me, because it is very unlikely that any other person at the same physical location uses the exact same set of software that I do, almost all of which wants to phone home all the goddamn time. It's the metadata that's important here, so payload contents (including encryption) don't even matter.
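To illustrate the metadata-only point: the set of phone-home domains observed from one network location is itself an identifier, regardless of what the encrypted payloads contain. A sketch (domain names are invented):

```python
# Sketch: fingerprint a machine purely from which telemetry domains
# it contacts. Payload contents never enter into the computation.
import hashlib

def fingerprint(observed_domains: set) -> str:
    # Canonicalize the set, then hash it down to a short identifier.
    canonical = ",".join(sorted(observed_domains))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Two machines with mostly overlapping but not identical software
# stacks (all domains are invented examples).
machine_a = {"telemetry.editor.example", "updates.niche-tool.example",
             "ping.game.example"}
machine_b = {"telemetry.editor.example", "updates.other.example"}

print(fingerprint(machine_a) != fingerprint(machine_b))  # True
```

Even one uncommon domain in the set yields a distinct fingerprint, which is why the combination of many phoning-home apps is so identifying.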

sdevonoes 3 hours ago | parent | prev | next [-]

Telemetry is the obvious step just before surveillance. Not the telemetry you implement in your own small business, but at the scale of Microsoft, Apple, Meta… yeah.

ambicapter 3 hours ago | parent | prev | next [-]

> with perfect precision.

Precision isn't accuracy, and all that.
