adrianmonk 3 hours ago
I spent literally thousands of hours staring at those screens. You have it backwards. Interlacing was worse in terms of refresh, not better. Interlacing is a trick that lets you sacrifice refresh rate to gain greater vertical resolution. The electron beam scans across the screen the same number of times per second either way. With interlacing, it alternates between even and odd rows. With NTSC, the beam scans across the screen 60 times per second. With NTSC non-interlaced, every pixel will be refreshed 60 times per second. With NTSC interlaced, every pixel will be refreshed 30 times per second since it only gets hit every other time. And of course the phosphors on the screen glow for a while after the electron beam hits them. It's the same phosphor, so in interlaced mode, because it's getting hit half as often, it will have more time to fade before it's hit again.
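To make that arithmetic concrete, here is a minimal Python sketch; the constants and function name are mine, using the 60-sweeps-per-second figure from the comment:

    # Per-scanline refresh rate for NTSC-style scanning.
    FIELD_RATE_HZ = 60  # vertical sweeps of the screen per second (NTSC)

    def line_refresh_rate_hz(interlaced: bool) -> float:
        """How often any single scanline gets repainted."""
        if interlaced:
            # Each sweep paints only the even OR the odd lines,
            # so a given line is hit on every other sweep.
            return FIELD_RATE_HZ / 2
        # Non-interlaced: every sweep repaints every line.
        return float(FIELD_RATE_HZ)

    print(line_refresh_rate_hz(interlaced=False))  # 60.0 -> each line refreshed 60x/sec
    print(line_refresh_rate_hz(interlaced=True))   # 30.0 -> each line refreshed 30x/sec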
numpad0 13 minutes ago
There are no pixels in a CRT. The guns go left to right, \r\n, left to right, roughly `for line in range(line_count)` inside a `while True` loop. The RGB stripes or dots are just stripes or dots; they're not tied to pixels. There are three guns, physically offset from each other, coupled with a strategically designed mesh plate, in such a way that the electrons from each gun sort of moiré into hitting only the right stripes or dots. Apparently fractions of an inch of offset were all it took. The three guns, really more like fast-acting lightbulbs, receive the brightness signal for their respective R, G, and B channels. Incidentally, that means they can go from zero to max brightness a couple of times over 60[Hz] * 640[px] * 480[px] or so. Interlacing means the guns draw every other line, but not necessarily every other pixel, because CRTs have a finite beam spot size.
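Reading that pseudocode fragment as actual Python, the scan order might look roughly like this; the line count and names are illustrative, not from the comment:

    TOTAL_LINES = 480

    def scan_order(interlaced: bool):
        """Yield line numbers in the order the beam sweeps them, one frame's worth."""
        if interlaced:
            # First field: even lines top to bottom; second field: odd lines.
            yield from range(0, TOTAL_LINES, 2)
            yield from range(1, TOTAL_LINES, 2)
        else:
            # Non-interlaced: every line, top to bottom, every sweep.
            yield from range(TOTAL_LINES)

    frame = list(scan_order(interlaced=True))
    print(frame[:4], frame[240:244])  # [0, 2, 4, 6] [1, 3, 5, 7]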
bitwize 39 minutes ago
Have you ever seen high-speed footage of a CRT in operation? The phosphors on most late-80s/90s TVs and color graphics computer displays decayed almost instantaneously. A pixel illuminated at the beginning of a scanline would be gone well before the beam reached the end of the scanline. You see a rectangular image, rather than a scanning dot, entirely due to persistence of vision. Slow-decay phosphors were much more common on old "green/amber screen" terminals and monochrome computer displays like those built into the Commodore PET and certain makes of TRS-80. In fact there's a demo/cyberpunk short story that uses the decay of the PET display's phosphor to display images with shading the PET was nominally not capable of (due to being 1-bit monochrome character-cell pseudographics): https://m.youtube.com/watch?v=n87d7j0hfOE
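As a rough illustration of that decay claim, here is a toy exponential-decay model; the time constant below is a made-up placeholder, not a measured phosphor spec:

    import math

    SCANLINE_US = 63.5   # NTSC: roughly 63.5 microseconds per scanline
    DECAY_TAU_US = 5.0   # hypothetical "fast" phosphor time constant (illustrative only)

    def brightness_after(us: float, tau_us: float = DECAY_TAU_US) -> float:
        """Fraction of the initial brightness left after `us` microseconds."""
        return math.exp(-us / tau_us)

    # A spot lit at the start of a scanline, by the time the beam reaches the end:
    print(f"{brightness_after(SCANLINE_US):.6f}")  # ~0.000003 -> effectively gone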
cm2187 3 hours ago
You assume that non-interlaced computer screens in the mid-90s were 60 Hz. I wish they were. I was using Apple displays and those were definitely 30 Hz.