The main thing that has happened is that GPUs themselves have become pretty standardized, even across the various vendors. DirectX and OpenGL have historically put a lot of effort into hiding the differences between the various vendors' GPU designs, many of which in the past had quite distinct hardware architectures--OpenGL has always been a much 'leakier' abstraction, with all those zillions of vendor-specific extensions.
DirectX 12 and Mantle both expose a lot more of the underlying behavior of the GPU than in the past, so the runtime libraries are a much thinner abstraction over the underlying driver. This approach won't work on every video card, particularly many of the older ones that traditional DirectX and OpenGL could support--with OpenGL carrying a huge legacy tail of support going back decades. The push to normalize GPU features for DirectX 10 and then DirectX 11 has made consumer-grade PC graphics hardware from different vendors a lot less quirky than it was in the early days of DirectX.
The other thing to keep in mind is that this trade-off demands a lot more of the application/game itself: it has to properly handle CPU/GPU synchronization, efficiently drive the hardware, track and exhaustively describe the state combinations it uses, and generally operate without nearly the level of safety and support the runtime used to provide.
Writing a DirectX 12 application takes a lot more graphics programming skill than DirectX 11 or OpenGL does today. Setting up a device/swapchain and a simple render scene in Direct3D 9 could take a few dozen lines of code, in Direct3D 11 a few hundred, but in DirectX 12 a whole lot more, because the application itself has to do all the individual steps that used to be done 'by magic' in the runtime. This lets applications/games tweak the exact sequence and maybe skip steps they don't actually need, but it's on them, not the runtime, to make that happen.
To quote Uncle Ben: "With great power comes great responsibility."
Developers have insisted for years that they want a 'console-like' API on PC, so now they will get to find out if they really want it after all. For game engine writers, AAA titles with custom engines, and the technology demo scene, it's pretty darn cool to have such nearly direct control over the hardware. For mere mortals, it will be a lot easier to use DirectX 11 for a while longer, or to use an engine that has already implemented DirectX 12.