sRGB is unrelated to monitors. It was certainly convenient for CRT monitors to display sRGB images, since their native gamma response closely matched the sRGB curve. Modern LCD monitors emulate the same gamma response, so from our point of view nothing has changed.
The real reason we use sRGB is that it lets us pack more useful information into 8 bits. Think of it as a compression scheme from linear space to gamma space, in a way that preserves more information towards the dark end of the brightness range, where the human eye is more sensitive to detail. This is true in general and has nothing to do with the output technology; it's only about being able to store a nice image in as few bits as possible.
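To make this concrete, here's a quick sketch (my own illustration, using the standard piecewise sRGB decode) counting how many of the 256 8-bit codes land in the dark part of the range under an sRGB encoding versus a plain linear encoding:

```python
def srgb_to_linear(s):
    """Standard sRGB decode (electro-optical transfer function)."""
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

# Codes whose decoded value falls below linear 0.2 (the "dark" region):
srgb_dark = sum(1 for c in range(256) if srgb_to_linear(c / 255) < 0.2)
linear_dark = sum(1 for c in range(256) if c / 255 < 0.2)

print(srgb_dark, linear_dark)  # sRGB spends over twice as many codes on the darks
```

Roughly half of the sRGB codes cover the darkest fifth of the linear range, where a linear encoding would only spend a fifth of its codes.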
Now, ARB_framebuffer_sRGB (which is what PixelFormat.withSRGB enables) provides two pieces of functionality:
a) Automatically converts the shader output (assumed to be linear) to sRGB and stores that in the framebuffer. This is trivial to do manually with shader arithmetic, so by itself it's not that useful.
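For reference, the "manual" conversion is just the standard piecewise sRGB transfer function and its inverse; a sketch in Python (a shader version is the same arithmetic):

```python
def linear_to_srgb(l):
    """Linear -> sRGB encode, what a shader would apply before writing out."""
    return l * 12.92 if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

def srgb_to_linear(s):
    """sRGB -> linear decode (the inverse transform)."""
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

# The round trip is the identity, up to floating-point error:
for x in (0.0, 0.001, 0.18, 0.5, 1.0):
    assert abs(srgb_to_linear(linear_to_srgb(x)) - x) < 1e-9
```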
b) Performs blending in linear space. This is super important. What happens is this:
A: shader output (linear)
X: blending operation
B: previous framebuffer color (sRGB)
F: final framebuffer color (sRGB)
F = (A X B.toLinear()).toSRGB()
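A numeric illustration of why this matters (my own example, not from the extension spec): blend a 50%-alpha white fragment over a black sRGB framebuffer, once the gamma-correct way described above and once naively in sRGB space.

```python
def srgb_to_linear(s):
    return s / 12.92 if s <= 0.04045 else ((s + 0.055) / 1.055) ** 2.4

def linear_to_srgb(l):
    return l * 12.92 if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

A, alpha = 1.0, 0.5   # shader output (linear) and its alpha
B = 0.0               # previous framebuffer value (sRGB-encoded)

# F = (A X B.toLinear()).toSRGB(), with X = standard alpha blending:
correct = linear_to_srgb(alpha * A + (1 - alpha) * srgb_to_linear(B))

# Naive blend performed directly on the sRGB-encoded values:
naive = alpha * A + (1 - alpha) * B

print(round(correct, 3), round(naive, 3))  # ~0.735 vs 0.5
```

The naive result (0.5) is visibly too dark; the correct result encodes linear 0.5, which lands around 0.735 in sRGB.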
There's nothing else in OpenGL that lets you do that. As a bonus, the blending operation is performed at higher precision (likely in 16-bit floating point, or a fixed-point format with 10+ bits per channel).
So the answer is yes, you should still care about sRGB if you want to do gamma-correct rendering.