> So I can know if something will run or not
You won't find any (or, at least, many) modern macOS apps that specify a particular clock rate, so the information isn't as useful as you might think. I think that's John Galt's meaning. He's not being glib.
In the PC world, there are about a gazenty kabillion (a real number ;) ) permutations and combinations of hardware, and it's impossible for a software vendor to list them all. So the industry settled on a 'lowest common denominator' model, listing CPU clock speeds and GPU generations as a best-effort indication of performance - "You need a 3.5GHz Intel i7 processor with an NVIDIA 4xxx GPU to run this app" - but the actual requirements are far more nuanced. No mention of AMD processors or GPUs, no mention of Intel GPUs, no idea whether a 3080-based GPU will work at all, or just at a lower frame rate, etc.
In the Mac world, Apple is the only player, and they generally release new models on an annual basis. Along with the specific hardware models for each year, there's typically a corresponding OS release - High Sierra, Ventura, Sonoma, Tahoe, etc.
Since the number of permutations and combinations is much smaller (and the variation within them is much narrower), it's easy for a developer to say you need either a minimum model year or a specific CPU (e.g. M1, M2, etc.) and an OS version (Sonoma/macOS 14, Monterey/macOS 12, etc.), and you have a solid sense that the application will run, without needing to know anything about clock rates, bus speeds, core counts, etc.
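For what it's worth, this is also how it works mechanically on the developer side: a Mac app declares its OS floor via the LSMinimumSystemVersion key in its Info.plist (the installer/Finder enforces it), and it can make the same check at runtime. Here's a minimal Swift sketch - the version "14.0 (Sonoma)" is just an example, not any particular app's requirement:

```swift
import Foundation

// Hypothetical example: suppose the app's Info.plist declares
// LSMinimumSystemVersion = 14.0 (Sonoma). The same floor can be
// verified at runtime:
let sonoma = OperatingSystemVersion(majorVersion: 14, minorVersion: 0, patchVersion: 0)

if ProcessInfo.processInfo.isOperatingSystemAtLeast(sonoma) {
    print("Sonoma or later - good to go: \(ProcessInfo.processInfo.operatingSystemVersionString)")
} else {
    print("This app requires macOS 14 (Sonoma) or later.")
}
```

Note there's nothing in there about clock rates or core counts - the OS version is the whole contract.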
Newer systems will (generally) run better, but knowing the software calls for "Sonoma or later" is enough to tell you you're good to go.