| ▲ | orangecat 5 hours ago | |
> this metric is more centered around the mode of the distribution (poor people).

It's focused on the very poorest, who are not the mode. (Income distribution is approximately lognormal; see https://www.researchgate.net/figure/The-lognormal-distributi...). Say you have 10 people: one making $800/year, 8 making $80k/year, and one evil billionaire making $800 million. Their times to earn $1 are respectively 10 hours, 0.1 hours, and essentially zero. Take the arithmetic mean of those and you get 1.08 hours, which is dominated by the single poor person. If you double that person's income to $1600, they're at 5 hours to earn $1, and the overall average is nearly cut in half, to 0.58. Meanwhile you can cut every middle-class person's income to $40k and not much changes: the average time to earn $1 would be (5 + 8(0.2) + 0)/10 = 0.66.

> It captures the income distribution much better than average income.

Not really, and certainly not better than median income, which is what people typically use. It tries to measure exactly how little income the very poor make, which is not normally what people mean when they talk about inequality or poverty, and it's also hard to measure at the accuracy you need when small changes produce huge swings in the result. In particular, I don't believe he's correctly accounted for government benefits; hardly anyone in the US is consuming less than $8,000/year.
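The arithmetic above can be checked with a short script. The figures imply a conversion of roughly 8,000 hours per year ($800/year works out to 10 hours per dollar), so that constant is an assumption inferred from the comment, not stated in it:

```python
# Average "time to earn $1" for the three scenarios in the comment.
# HOURS_PER_YEAR = 8000 is inferred from the comment's own numbers
# ($800/year -> 10 hours per dollar).
HOURS_PER_YEAR = 8000

def hours_per_dollar(annual_income):
    """Hours of work needed to earn one dollar."""
    return HOURS_PER_YEAR / annual_income

def avg_time_to_dollar(incomes):
    """Arithmetic mean of per-person time to earn $1."""
    return sum(hours_per_dollar(i) for i in incomes) / len(incomes)

base            = [800]   + [80_000] * 8 + [800_000_000]
doubled_poor    = [1_600] + [80_000] * 8 + [800_000_000]
squeezed_middle = [1_600] + [40_000] * 8 + [800_000_000]

print(round(avg_time_to_dollar(base), 2))             # 1.08
print(round(avg_time_to_dollar(doubled_poor), 2))     # 0.58
print(round(avg_time_to_dollar(squeezed_middle), 2))  # 0.66
```

This also makes the sensitivity concrete: doubling one person's income from $800 to $1600 nearly halves the metric, while halving eight middle-class incomes barely moves it.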
| ▲ | megaman821 5 hours ago | parent [-] | |
Thanks for the comment; I'd been trying to parse the meaning of "time needed to earn $1" for a bit. It just boils down to which countries have the highest floor for their poorest members.