eviks 11 hours ago
> You don't need to have a "structured data with fields" to parse it.

You do if you want nice things, like being able to format your output without worrying about breaking the dumb tools down the pipe, which can't sort numbers they don't see:

- 2.1K (this isn't the same size as the second 2.1K)
- 2.1K
- 2.1M

Also, why do I need to count columns like a caveman in 'sort -k 5' instead of doing the obvious "sort by size"?

> print color/format escape codes to pipe too

A problem that would disappear with... structured data!

> Ah, some of the "enhanced" ls tools

So use the other "some" that can?
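As a concrete sketch of the complaint above: a plain lexical sort puts human-readable sizes in the wrong order, and you only get sane behavior with GNU sort's '-h' flag (a GNU coreutils extension, not POSIX):

```shell
# Lexical sort compares characters, so '2' < '5' and 2.1M lands before 512:
printf '2.1K\n2.1M\n512\n' | sort
# -h understands K/M/G suffixes and sorts by the magnitude they denote:
printf '2.1K\n2.1M\n512\n' | sort -h
# 512 comes first (bytes), then 2.1K, then 2.1M
```

This is the formatting-vs-parsing tension in one line: once the tool prints "2.1K" for humans, downstream tools need special knowledge to recover the number.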
bayindirh 9 hours ago
> which can't sort the numbers they don't see

Then you sort at the point where you can see the numbers and discard them later.

> Also, why do I need to count columns like a caveman in 'sort -k 5' instead of doing the obvious "sort by size"?

awk can sort the columns for you. Plus, ls can already sort by size: try "ls -lS" for biggest file first, or "ls -lSr" for smallest file first. Add "-h" to make the sizes human-readable.

> A problem that would disappear with... structured data!

No. A problem that would disappear with a small if block which asks which environment I'm in. If you're in a shell, the "-t" test in sh/bash will tell you. If you're writing a tool, there are standard ways to do that (via termcap IIRC). Standard UNIX tools have been doing this for decades now. IOW, structured data is not a cure for laziness...

> so use the other "some" that can?

Yes, because their authors are not that lazy.
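A minimal sketch of the "-t" test mentioned above: '[ -t 1 ]' is the POSIX way to ask whether stdout is a terminal, which is how tools like ls decide between colorized output for humans and plain output for pipes (the greeting string here is just an illustration):

```shell
# Emit ANSI color only when stdout is actually a terminal.
if [ -t 1 ]; then
  printf '\033[32m%s\033[0m\n' "hello"   # green, for interactive use
else
  printf '%s\n' "hello"                  # plain, safe for pipes and files
fi
```

Run interactively you get the colored form; run as 'script | cat' the test fails and the escape codes never reach the pipe, which is the whole point of the argument.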