To me, it isn't a question of whether ray tracing is a good idea or not (categorically, it is). It's just a question of whether it's a good idea right now.
I remember implementing a basic ray tracer as a computer science undergrad well over a decade ago and being blown away by how much better the results looked than anything produced by the techniques we were using then (or even now). At least in terms of visual fidelity, it is a fundamentally and significantly better way of rendering than anything else we've been using up to this point, in much the same way that the transistor was a fundamentally better way of computing than the vacuum tube that preceded it. The only problem was that it took several orders of magnitude too long to be usable in real time for things like AAA video games, which is why you only ever hear about ray tracing being used in offline renderers (e.g. Pixar using it in recent films).
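For a sense of scale, here's a rough sketch of the kind of toy ray tracer that sort of assignment amounts to: a single hard-coded sphere, one light, one diffuse ray per pixel, no bounces and no acceleration structures. All of the names and scene values below are made up for illustration; this isn't the code from that class, just the minimal shape of the idea. Even this bare-bones version shows where the cost comes from, since every pixel has to pay for an intersection test against the scene.

```python
# A toy ray tracer: one sphere, one light, Lambertian shading, one ray per pixel.
# Everything here (scene, resolution, names) is illustrative, not any engine's API.
import math

WIDTH, HEIGHT = 320, 240
SPHERE_CENTER, SPHERE_RADIUS = (0.0, 0.0, -3.0), 1.0
LIGHT_DIR = (1.0, 1.0, 1.0)  # direction toward the light (up, right, toward camera)

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance t, or None if the ray misses."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # a == 1 because the direction is normalized
    if disc < 0.0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

light = normalize(LIGHT_DIR)
rows = []
for y in range(HEIGHT):
    row = []
    for x in range(WIDTH):
        # Shoot one primary ray per pixel through a simple pinhole camera.
        u = (x + 0.5) / WIDTH * 2.0 - 1.0
        v = 1.0 - (y + 0.5) / HEIGHT * 2.0
        direction = normalize((u * WIDTH / HEIGHT, v, -1.0))
        t = ray_sphere((0.0, 0.0, 0.0), direction, SPHERE_CENTER, SPHERE_RADIUS)
        if t is None:
            row.append(0)  # miss: black background
        else:
            hit = tuple(t * d for d in direction)
            normal = normalize(tuple(h - c for h, c in zip(hit, SPHERE_CENTER)))
            shade = max(0.0, sum(n * l for n, l in zip(normal, light)))
            row.append(int(255 * shade))  # diffuse (Lambertian) term only
    rows.append(row)

# Write a grayscale PGM so the result is viewable with no dependencies.
with open("sphere.pgm", "w") as f:
    f.write(f"P2\n{WIDTH} {HEIGHT}\n255\n")
    for row in rows:
        f.write("\n".join(str(v) for v in row) + "\n")
```

Running it writes a small grayscale image of a lit sphere. The gap to real-time rendering is that a production scene swaps that single intersection test for millions of rays bouncing through millions of triangles, many times per frame.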
Ever since that class, there's been no doubt in my mind that at some point—once the costs were down, the technology was in people's hands, and the content production pipelines had time to adapt—every single major 3D game engine would be (re-)built using real, honest-to-goodness ray tracing.
So is that time now? I have no idea. I haven't looked into the current state of affairs enough to know whether Nvidia is cutting corners to sell us "ray tracing" before the hardware is actually ready. But whenever real ray tracing does come, I expect that its arrival will look something like what we see now: it'll start on the high end of the market and won't have wide support or adoption; after a few years it'll be supported on every card and will have mixed adoption; and then a few years after that we'll be on the other side of a one-way transition, after which point we won't look back, just like no one is switching back to using vacuum tubes for general-purpose computing.