He mangles some of the pros and cons of CRTs towards the end.
They aren’t going to be indefinitely reliable. The phosphor goes bad over time and makes for a weaker image. Doubly so for color phosphors. Some of them are aging better than others, but that’s survivorship bias. We might be looking at the last decade where those old CRTs can still be in anything close to widespread use. There will probably be a few working examples here or there in private collections, of course.
CRTs do have latency, and this is something a lot of people get wrong. A modern flatscreen display can have better latency than CRTs when the hardware takes advantage of it.
The standard way of measuring latency is at the halfway point of the screen. For NTSC running at 60Hz (interlaced, so roughly 30fps), that means we have 8.33ms of latency. If you were to hit the button the moment the screen starts the next draw, and the CPU miraculously processes it in time for that draw, then it takes that long for the screen to be drawn to the halfway point, where we take our measurement.
An LCD can have a response time of less than 2ms. That’s on top of the frame time, and modern systems can easily run at 120Hz (or more; quite a bit more in some cases). That means you’re looking at (1000 / 120) + 2 ≈ 10.3ms of latency, provided your GPU keeps up at 120 fps. Note that this is comparable to a PAL console (which runs at 50Hz) on a CRT. A 200Hz LCD with fast pixel response times is superior to NTSC CRTs. >400Hz is running up against the human limit to distinguish frame changes, and we’re getting there with some high-end LCDs right now.
When talking about retro consoles, we’re limited by the hardware feeding the display, and the frame can’t start drawing until the console has transmitted everything. So then you’re looking at the 2ms LCD response time on top of a full frame time, which for NTSC would be (1000 / 60) + 2 ≈ 18.7ms. Which is why lightguns can’t work.
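To put numbers on the arithmetic above, here’s a minimal sketch (the refresh rates and the 2ms response time are just the example figures used here, not measurements of any particular display):

```
# Back-of-envelope latency figures using the halfway-point convention above.
# The refresh rates and the 2 ms response time are the example numbers from
# this thread, not measurements of any particular display.

def crt_mid_screen_ms(field_rate_hz):
    # the beam reaches the middle of the screen half a field after the draw starts
    return (1000.0 / field_rate_hz) / 2

def lcd_latency_ms(refresh_hz, response_ms):
    # the LCD math used above: one full refresh period plus pixel response time
    return 1000.0 / refresh_hz + response_ms

print(crt_mid_screen_ms(60))       # NTSC CRT: ~8.3 ms
print(crt_mid_screen_ms(50))       # PAL CRT: ~10.0 ms
print(lcd_latency_ms(120, 2.0))    # 120 Hz LCD, 2 ms response: ~10.3 ms
print(lcd_latency_ms(200, 2.0))    # 200 Hz LCD: ~7.0 ms
print(lcd_latency_ms(60, 2.0))     # 60 Hz source (retro console case): ~18.7 ms
```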
Only if you’re measuring “how long it takes to draw a full image” (which is not latency).
The time it takes for a voltage at the input to become a lit pixel on the phosphor is far below the millisecond scale, which LCD panels simply cannot match.
Latency. Not refresh rate or FPS.
Nope. There is an industry standard way of measuring latency, and it’s measured at the halfway point of drawing the image.
Edit: you can measure this through Nvidia’s LDAT system, for example, which uses a light sensor placed in the middle of the display combined with a way of detecting the exact moment you create an input. The light sensor picks up a change (such as the muzzle flash in an FPS) and measures the difference in time. If you were to make this work on a CRT running at NTSC refresh rates, it would never show less than 8.3ms when measured in the middle of the screen.
If you are measuring fairly with techniques we use against LCDs, then yes, CRTs have latency.
No. And if you want to actually provide a link to your “industry standard”, feel free to; just make sure that your “standard” can actually be applied to a CRT first.
You can literally focus the CRT to only show one pixel’s (more accurately, one beam-width’s) worth of value. And that pixel would be updated many thousands of times a second (literally constantly… since it’s analog).
If you’re going to define latency as “drawing the image” (by any part of that metric), then a CRT can draw a single “pixel” worth of value thousands of times a second… probably more. Whereas your standard 60Hz panel can only update once every 1/60th of a second (or, for even the fastest LCDs, every 1/365th).
If there is a frame to draw and that frame is being processed, then yes, you’re right: measuring at the middle will yield a delay. But this isn’t how all games and devices have worked throughout history. There are many applications where the data being sent to the display is literally read from memory nanoseconds prior. CRTs have NONE of the processing delay that LCDs do.
Further points of failure in your post: CRTs are not all “NTSC” standard (virtually every computer monitor, for instance). There are plenty of CRTs that can push much higher than the NTSC standard specifies.
Here’s an example from a bog-standard monitor I had a long time ago… https://www.manualslib.com/products/Sony-Trinitron-Cpd-E430-364427.html
800 x 600 / 155 Hz
1024 x 768 / 121 Hz
1280 x 1024 / 91 Hz
1600 x 1200 / 78 Hz
So a 60Hz LCD will always take 0.016s (16.7ms) to do the whole image, regardless of the resolution being displayed.
Not so on the CRT… Higher-performance CRTs can draw more “pixels” per second, and when you lower the number of lines you want displayed, the full-frame draw times go down substantially. There are a lot of ways to define these things that your simplistic view doesn’t account for. The reality is, though, that if you skip the idea of a “frame”, the time from input to display on the CRT monitor can be lower simply because there’s no processing occurring here; your limit is the physics of the materials the monitor is built from, not some chip’s capability to decode a frame. Thus… no latency.
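As a rough sketch of that point (the CRT modes are the ones listed above from the manual; the 60Hz LCD number is just its fixed refresh period):

```
# Full-frame draw time for the CRT modes listed above vs. a fixed 60 Hz LCD.
# The CRT's frame time shrinks as you trade resolution for refresh rate; the
# 60 Hz LCD takes ~16.7 ms per image no matter what resolution it's being fed.

crt_modes = {
    "800 x 600":   155,  # Hz
    "1024 x 768":  121,
    "1280 x 1024":  91,
    "1600 x 1200":  78,
}

for mode, hz in crt_modes.items():
    print(f"CRT {mode} @ {hz} Hz -> {1000.0 / hz:.1f} ms per frame")

print(f"LCD (any resolution) @ 60 Hz -> {1000.0 / 60:.1f} ms per frame")
```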
Not frametime. Not FPS. Not Hz. Latency is NONE of those things, otherwise we wouldn’t have those other terms and would have strictly used “latency” instead.
And a wonderful example of this is the Commodore 64 tape loading screens. https://youtube.com/watch?v=Swd2qFZz98U
Those lines/colors are drawn straight from memory without the concept of a frame. There is no latency here. Many scene demos abused this behavior to achieve really wild effects as well. Your LCD cannot do that; those demos don’t function correctly on LCDs…
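A minimal sketch of that idea, purely illustrative (the line count and timing are NTSC-ish approximations, and emit_scanline is just a hypothetical stand-in for the video hardware):

```
# "Racing the beam": no framebuffer anywhere -- the colour of each scanline is
# decided at the moment that line is being drawn, straight from current state.
# Line count and timing are NTSC-ish approximations; emit_scanline is a
# hypothetical stand-in for the video hardware.

SCANLINES = 262        # lines per NTSC field, roughly
LINE_TIME_US = 63.5    # one scanline takes about 63.5 microseconds

def colour_for_line(line, state):
    # e.g. C64-style loading stripes: the colour comes from whatever the
    # loader/memory state happens to be right now, not from a stored frame
    return (line + state) % 16

def emit_scanline(line, colour):
    print(f"line {line:3d}: colour {colour:2d}")

def draw_field(state):
    for line in range(SCANLINES):
        # computed immediately before the beam draws this line
        emit_scanline(line, colour_for_line(line, state))
        state += 1     # state can change mid-frame, and it shows up mid-frame
    return state

draw_field(0)
```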
Lightguns are a perfect example of how this can be leveraged (and something that is completely impossible on an LCD as well).
Specifically scroll down to the Sega section. https://www.retrorgb.com/yes-you-can-use-lightguns-on-lcds-sometimes.html
By timing the click of the lightgun against which pixel is currently being drawn in the frame, the console can take that as the input position for the gun. That requires minimal latency to do. LCDs can’t do that.
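A hedged sketch of that timing trick (the line/pixel counts are NTSC-ish approximations; real consoles latch the beam position in the video chip rather than computing it in software like this):

```
# Lightgun timing: when the gun's photodiode sees the beam light the phosphor,
# the console converts "time since the field started" into a screen position,
# because on a CRT the beam position is a direct function of time.
# Line/pixel counts and timings are NTSC-ish approximations only.

LINE_TIME_US = 63.5      # one scanline, including blanking
ACTIVE_LINE_US = 52.6    # visible portion of a line
ACTIVE_LINES = 240       # visible lines, roughly
ACTIVE_WIDTH = 256       # visible "pixels" per line (console-dependent)

def beam_position(us_since_vsync):
    line = int(us_since_vsync // LINE_TIME_US)
    us_into_line = us_since_vsync % LINE_TIME_US
    x = int(us_into_line / ACTIVE_LINE_US * ACTIVE_WIDTH)
    return min(x, ACTIVE_WIDTH - 1), min(line, ACTIVE_LINES - 1)

# if the gun's sensor fires 5,000 microseconds into the field:
print(beam_position(5000.0))   # -> rough (x, y) the gun was pointed at
```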
Ultimately, people like you are trying to redefine what latency is in a way that flies in the face of actual history, which shows us there is a distinct difference that has historically mattered, and even applications of that latency that CANNOT work the way you’re claiming.
https://yt.saik0.com/watch?v=llGzvCaw62Y#player-container
Can you tell me why the LCD on the right is ALWAYS behind? And why it will ALWAYS be the case that it will not work, regardless of how fast the LCD panel is? The reason you’re going to come to is that it’s processing delay, which didn’t exist on CRTs. That’s “LATENCY”.
“When talking about retro consoles, we’re limited by the hardware feeding the display, and the frame can’t start drawing until the console has transmitted everything.”
This is where you’re completely wrong. A CRT doesn’t know the concept of a frame. It draws the input that it gets. Period. There’s no buffer… there’s nowhere to hold onto anything that is being transmitted. It’s literally just spewing electrons at the phosphors.
Edit: typo
Edit2: to expound on the LCD vs CRT thing with light guns. CRTs draw the “frame” as it’s received… as the voltage comes in, the monitor varies the electron gun with it directly, which means that when the Sega console in this case sets the video buffer to the white value for a coordinate and displays it, it knows exactly which pixel is currently being modified. The LCD will take the input, store it in a buffer until it gets the full frame, and then display it. The Sega doesn’t know when that frame will actually be displayed, as there’s other shit between it and the display mechanism doing stuff. There is an innate delay that MUST occur on the LCD that simply doesn’t on the CRT. That’s the latency.
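To put that difference in sketch form (the 8ms of processing and 2ms of response are assumed, illustrative figures, not measurements of any particular panel):

```
# When does a given scanline actually become visible? On the CRT the phosphor
# lights up as the line is received; the LCD path buffers the whole frame,
# processes it, and then the pixel still needs its response time to change.
# The 8 ms of processing and 2 ms of response are assumed, illustrative figures.

TOTAL_LINES = 262
LINE_TIME_MS = (1000.0 / 60) / TOTAL_LINES   # time to transmit one line, ~0.064 ms

def crt_visible_at_ms(line):
    # glows essentially as soon as that line's signal arrives
    return line * LINE_TIME_MS

def lcd_visible_at_ms(line, processing_ms=8.0, response_ms=2.0):
    # waits for the full frame, then processing, then pixel response
    return TOTAL_LINES * LINE_TIME_MS + processing_ms + response_ms

for line in (0, 131, 261):
    print(f"line {line:3d}: CRT {crt_visible_at_ms(line):5.2f} ms, "
          f"LCD {lcd_visible_at_ms(line):5.2f} ms")
```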