eviks 21 hours ago
They don't do one thing well, since it's all text rather than structured data. That makes chained analysis a challenge, which in turn leads to the desire for integration.
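To make the "all text" problem concrete: a minimal sketch of how whitespace-delimited parsing of ls output falls apart on a filename containing a space (field positions assumed from typical ls -l output; the temp directory is just for illustration).

```shell
# awk splits on whitespace, so a filename with a space is truncated
# at its first word: field 9 is "my", and "file.txt" lands in field 10.
dir=$(mktemp -d)
touch "$dir/my file.txt"
ls -l "$dir" | awk 'NR > 1 { print $9 }'
# prints "my", not "my file.txt"
```

Structured output (records with typed fields) would carry the filename as a single value regardless of its contents; with plain text, the consumer has to guess where one field ends and the next begins.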
bayindirh 20 hours ago | parent
ls is tabular data, and you can format it (ls -1, ls -l, ls -w, plus sorting, field formatting, and more), then cut/parse/reformat it in a standard way. Every field except the filename is fixed-width and can be handled with awk, cut, or sed according to your daily mood and requirements. So ls chains very nicely, which I do every day without even thinking about it.

You don't need "structured data with fields" to parse it. You just need to think of it as tabular data with line and column numbers (ls -l, etc.) or line numbers alone (ls -1). So, as long as ls does one thing well, it's alright.

Ah, and some of the "enhanced" ls tools can't distinguish between a pipe and a terminal, and print color/format escape codes into the pipe too, doubling the fun of using them.

So, thanks, I'll stick with my standard ls. That one works.
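The chaining described above can be sketched as a one-liner, treating ls -l output as a table: field 5 is the size in bytes and field 9 the name (a hypothetical filter, assuming a POSIX-style ls and filenames without spaces).

```shell
# Print the names of files larger than 1 MiB, treating ls -l output
# as whitespace-separated columns. NR > 1 skips the leading "total"
# line that GNU ls prints before the file entries.
ls -l | awk 'NR > 1 && $5 > 1048576 { print $9 }'
```

The same column-number thinking works with cut or sort -k; the pipeline only depends on the fixed positions of the non-filename fields, which is exactly the tabular property the comment relies on.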