m132 15 hours ago

>Before HD, almost all video was non-square pixels

Correct. This came from the ITU-R BT.601 standard, one of the first digital video standards, whose authors chose to define digital video as a sampled analog signal. Analog video never had a concept of pixels; it operated on lines instead. The rate at which you sampled it could be arbitrary and affected only the horizontal resolution. The rate chosen by BT.601 was 13.5 MHz, which results in a 10/11 pixel aspect ratio for 4:3 NTSC video and 59/54 for 4:3 PAL.
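Those two odd-looking ratios fall out of simple arithmetic: divide the sampling rate that would give square pixels by the 13.5 MHz rate BT.601 actually chose. A quick sketch (the square-pixel rates here, ~12.27 MHz for NTSC and 14.75 MHz for PAL, are my numbers, not from the comment above):

```python
from fractions import Fraction

BT601_RATE = Fraction(13_500_000)              # Hz, shared by 525- and 625-line systems
NTSC_SQUARE_RATE = Fraction(135, 11) * 10**6   # ~12.27 MHz: square pixels on 4:3 NTSC
PAL_SQUARE_RATE = Fraction(14_750_000)         # 14.75 MHz: square pixels on 4:3 PAL

# Pixel aspect ratio = (square-pixel sampling rate) / (actual sampling rate)
ntsc_par = NTSC_SQUARE_RATE / BT601_RATE       # 10/11
pal_par = PAL_SQUARE_RATE / BT601_RATE         # 59/54
print(ntsc_par, pal_par)                       # prints: 10/11 59/54
```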

>SD channels on cable TV systems are 528x480

I'm not actually sure about America, but here in Europe most digital cable and satellite SDTV is delivered as 720x576i 4:2:0 MPEG-2 Part 2. There are some outliers that use 544x576i, however.

mrandish 6 hours ago | parent | next [-]

Good post. For anyone wondering "why do we have these particular resolutions, sampling and frame rates, which seem quite random", allow me to expand and add some color to your post (pun intended). Similar to how modern railroad track widths can be traced back to the wheel widths of roman chariots, modern digital video standards still reverberate with echoes from 1930s black-and-white television standards.

BT.601 is from 1982 and was the first widely adopted digital component video standard (sampling analog video into three color components (YUV) at 13.5 MHz). Prior to BT.601, the main standard for digital video was SMPTE 244M, created by the Society of Motion Picture and Television Engineers: a composite video standard which sampled analog video at 14.32 MHz. All else being equal, a higher sampling rate is generally better. The reason BT.601 settled on the lower 13.5 MHz was a compromise, equal parts technical and political.

Analog television was created in the 1930s as a black-and-white composite standard, and in 1953 color was added by a very clever hack which kept all broadcasts backward compatible with existing B&W TVs. Politicians mandated this because they feared nerfing all the B&W TVs owned by voters. But that hack came with significant technical compromises which complicated and degraded analog video for over 50 years. The composite sampling rate (14.32 MHz) is exactly 4x the NTSC color subcarrier frequency introduced by that hack, and the component rate (13.5 MHz) was chosen as a line-locked rate compatible with both the NTSC and PAL line frequencies. Those two frequencies directly dictated all the odd-seeming horizontal pixel resolutions we find in pre-HD digital video (352, 704, 360, 720 and 768), and even influenced the original PC display resolutions (CGA, VGA, XGA, etc).

To be clear, analog television signals were never pixels. Each horizontal scanline was only ever an oscillating electrical voltage, from the moment photons struck an analog tube in a TV camera to the home viewer's cathode ray tube (CRT). Early digital video resolutions were simply based on how many samples an analog-to-digital converter needed to fully recreate the original electrical voltage.

For example, 720 is tied to 13.5 MHz because sampling the active picture area of an analog video scanline at 13.5 MHz generates 1440 samples (double per-Nyquist). Similarly, 768 is tied to 14.32 MHz generating 1536 samples. VGA's horizontal resolution of 640 simply comes from adjusting analog video's rectangular pixels to be square (704 * 10/11 = 640). It's kind of fascinating that all these modern digital resolutions can be traced back to decisions made in the 1930s, based on which affordable analog components were available, which competing commercial interests prevailed (RCA vs Philco), and the political sensitivities of the time.

leguminous 4 hours ago | parent | next [-]

> For example, 720 is tied to 13.5 MHz because sampling the active picture area of an analog video scanline at 13.5 MHz generates 1440 samples (double per-Nyquist).

I don't think you need to be doubling here. Sampling at 13.5 MHz generates about 720 samples.

    13.5e6 Hz * 53.33...e-6 seconds = 720 samples
The sampling theorem just means that with that 13.5 MHz sampling rate (and 720 samples) signals up to 6.75 MHz can be represented without aliasing.
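That arithmetic can be reproduced directly. A minimal sketch, taking the 53 1/3 µs digital active line (720 samples at 13.5 MHz) as given:

```python
from fractions import Fraction

rate_hz = 13_500_000
active_line_s = Fraction(160, 3) * Fraction(1, 10**6)  # 53.333... microseconds

# Samples per active line: exactly 720, no Nyquist doubling involved.
samples = rate_hz * active_line_s

# Nyquist says this rate can represent signals up to half of 13.5 MHz.
nyquist_hz = rate_hz / 2
print(samples, nyquist_hz)  # prints: 720 6750000.0
```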

There's some history on the standard here: https://tech.ebu.ch/docs/techreview/trev_304-rec601_wood.pdf

danny8000 4 hours ago | parent [-]

Non-square pixels also have a precedent in anamorphic film projection. This was developed from the need to capture wide aspect ratio images on standard 35mm film.

This allows the captured aspect ratio on film to stay fixed while the displayed image takes on various wider aspect ratios.

https://en.wikipedia.org/wiki/Anamorphic_format
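The same relationship applies to non-square video pixels: display aspect ratio = stored aspect ratio x pixel aspect ratio. A sketch using the common MPEG pixel aspect ratios for 704-wide active video (these specific PAR values are my addition, for illustration):

```python
from fractions import Fraction

def display_ar(width, height, par):
    """Display aspect ratio from a stored raster and its pixel aspect ratio."""
    return Fraction(width, height) * par

# The same 704x480 raster yields two different display shapes:
print(display_ar(704, 480, Fraction(10, 11)))  # prints: 4/3
print(display_ar(704, 480, Fraction(40, 33)))  # prints: 16/9
```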

roygbiv2 6 hours ago | parent | prev [-]

> Similar to how modern railroad track widths can be traced back to the wheel widths of roman chariots

This is repeated often and simply isn't true.

mrandish 6 hours ago | parent [-]

I based that on the BBC science TV series (and books) Connections by science historian James Burke. If it's been debunked since, then I stand corrected. Regardless of the specific example, my point was that modern standards are sometimes linked to long-outdated historical precedents for no currently relevant reason.

drmpeg 14 hours ago | parent | prev | next [-]

Here's some captures from my Comcast system here in Silicon Valley.

https://www.w6rz.net/528x480.ts

https://www.w6rz.net/528x480sp.ts

m132 14 hours ago | parent [-]

Cool!

Doing my part and sending you some samples of UPC cable from the Czech Republic :)

720x576i 16:9: https://0x0.st/P-QU.ts

720x576i 4:3: https://0x0.st/P-Q0.ts

That one weird 544x576i channel I found: https://0x0.st/P-QG.ts

I also have a few decrypted samples from the Hot Bird 13E, public DVB-T and T2 transmitters and Vectra DVB-C from Poland, but for that I'd have to dig through my backups.

ErroneousBosh 11 hours ago | parent | prev [-]

My DVCAM equipment definitely outputs 720x576i, although I'm never quite sure whether that's supposed to render to 768x576, or 1024x576 for 16:9 stuff.

It still looks surprisingly good, considering.
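Both of those render targets check out: scale the 720 stored columns by the PAL pixel aspect ratio to get the square-pixel width. A sketch, assuming the common full-width PAL PARs of 16/15 (4:3) and 64/45 (16:9):

```python
from fractions import Fraction

def square_pixel_width(stored_width, par):
    """Width of the frame after resampling to square pixels."""
    return stored_width * par

print(square_pixel_width(720, Fraction(16, 15)))  # prints: 768  (4:3)
print(square_pixel_width(720, Fraction(64, 45)))  # prints: 1024 (16:9)
```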