A simple black-to-white gradient can be, at most, 256 pixels wide before it starts banding on the SDR displays that the majority of computers use. HDR only adds a couple of extra bits, and each extra bit only doubles how wide the gradient can get before it runs out of unique color values. If the two color endpoints of the gradient are closer together, you get banding sooner. Dithering, however, completely solves gradient banding.
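To see why dithering helps, here's a minimal sketch (assuming NumPy is available; real dithering usually uses ordered or error-diffusion patterns rather than plain random noise): adding roughly one quantization step of noise before rounding trades the long flat bands for fine noise the eye averages out.

```python
# Minimal dithering sketch, assuming NumPy is installed. Plain per-pixel noise
# of about one quantization step is the simplest form of dithering.
import numpy as np

ramp = np.linspace(0.0, 1.0, 2048)                  # ideal continuous gradient
plain = np.round(ramp * 255).astype(np.uint8)       # straight 8-bit quantization

rng = np.random.default_rng(0)
noise = rng.uniform(-0.5, 0.5, ramp.shape)          # +/- half an 8-bit step
dithered = np.clip(np.round(ramp * 255 + noise), 0, 255).astype(np.uint8)

def longest_run(codes):
    """Length of the longest stretch of one repeated value (a flat band)."""
    starts = np.flatnonzero(np.diff(codes) != 0) + 1
    edges = np.concatenate(([0], starts, [len(codes)]))
    return int(np.max(np.diff(edges)))

print(longest_run(plain))     # ~8-9 identical pixels per step: visible bands
print(longest_run(dithered))  # runs of only a few pixels: the bands break up
```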
The average desktop computer is running at 8-bit color depth the vast majority of the time, so find or generate just about any wide, simple gradient and you'll see it.
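If you'd rather generate one than hunt for one, here's a minimal sketch assuming NumPy and Pillow are installed (the output filename is arbitrary):

```python
# Write out a 2048-px-wide black-to-white ramp quantized to 8 bits. With only
# 256 unique values across 2048 columns, each value repeats for ~8 adjacent
# columns, and those steps show up as faint vertical bands.
import numpy as np
from PIL import Image

width, height = 2048, 256
ramp = np.linspace(0.0, 1.0, width)            # smooth 0..1 gradient
row = np.round(ramp * 255).astype(np.uint8)    # quantize to 8-bit codes
img = np.tile(row, (height, 1))                # repeat the row vertically
Image.fromarray(img, mode="L").save("gradient_8bit.png")
```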
In terms of color spaces, sRGB (the default baseline RGB of desktop computing) is quite naive and inefficient. Pretty much its only upsides are its conceptual and mathematical simplicity. There are far more efficient color encodings whose non-linear curves are tuned to how human vision actually perceives light and color.
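For a sense of how simple sRGB really is, its entire non-linearity is the fixed piecewise curve below, shown here as a plain-Python sketch:

```python
# The standard sRGB transfer functions: a short linear toe plus a ~1/2.4
# power curve, applied the same way regardless of content or display.

def linear_to_srgb(x: float) -> float:
    """Encode a linear-light value in [0, 1] with the sRGB curve."""
    if x <= 0.0031308:
        return 12.92 * x
    return 1.055 * x ** (1 / 2.4) - 0.055

def srgb_to_linear(x: float) -> float:
    """Decode an sRGB-encoded value in [0, 1] back to linear light."""
    if x <= 0.04045:
        return x / 12.92
    return ((x + 0.055) / 1.055) ** 2.4

print(round(linear_to_srgb(0.5), 4))  # ~0.7354: half the light uses well over half the code range
```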