That's the plan, but that will require stamping, and we are planning to do stamped parts.
Every bit of information you release on 3.0 makes me more and more willing to upgrade from 2.0 to it!
The Nanite approach is an engine-level, pipeline-changing one: if it's enforced in the pipeline, games will be able to be optimised automatically with it, at least to a certain degree. What I don't know is where we stand on data bandwidth with the DirectStorage implementation and pulling data for Nanite. That may be the initial problem when deciding what quality of assets should ship in a game: whether SSD performance will be the bottleneck/limiting factor, or whether a better, higher-end GPU will mean better performance when it comes to data bandwidth in DirectStorage. That will determine whether these must-have optimisations work well for every card (if SSD bandwidth is the driving factor) or only for high-end cards (if the GPU side is the bottleneck). I haven't investigated this deeply enough yet, but I also think it may be too early for that.
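To make that bottleneck question concrete, here's a minimal back-of-envelope sketch. All the throughput figures are placeholders I've made up, not measurements: the point is only that the effective streaming rate is capped by whichever stage is slowest, which tells you whether a faster SSD or a faster GPU decompressor would actually help.

```cpp
#include <algorithm>
#include <cstdio>

int main() {
    // Hypothetical throughput figures in GB/s -- placeholders, not benchmarks.
    double ssd_read_gbps       = 7.0;  // raw NVMe sequential read
    double gpu_decompress_gbps = 12.0; // GPU-side decompression stage
    double pcie_transfer_gbps  = 25.0; // bus transfer into VRAM

    // The streaming pipeline can only run as fast as its slowest stage.
    double effective_gbps = std::min({ssd_read_gbps, gpu_decompress_gbps, pcie_transfer_gbps});

    double vram_to_fill_gb = 10.0; // how much asset data we want resident
    printf("Effective streaming rate: %.1f GB/s\n", effective_gbps);
    printf("Time to refill %.0f GB of assets: %.1f s\n",
           vram_to_fill_gb, vram_to_fill_gb / effective_gbps);
    return 0;
}
```

If the SSD figure comes out as the minimum, a better GPU won't shorten the refill time and the optimisation should behave roughly the same on every card; if the decompression stage is the minimum, higher-end GPUs pull ahead, which is exactly the distinction I mean above.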
As I see it, DirectStorage (and its equivalents), once it becomes a prerequisite, will change how levels are designed (especially in open worlds): no more need for corridors, tunnels, elevators or whatnot to mask on-the-fly asset swaps in GPU memory, and no loading screens. It will also allow bigger and more diverse assets in any type of game. With the traditional SSD > CPU (decompression) > RAM > CPU (copying) > GPU approach, suppose you can do a total GPU memory swap every 30 seconds, for example; that means all the assets needed in every direction for (at least) the next 30 seconds have to reside in GPU memory. With DirectStorage (eventually SSD > GPU, with both decompression and the copy into video memory handled on the GPU side), suppose you can do a total asset swap in, say, 3 seconds; then you only need to hold assets of the same quality for a much smaller area (the next 3 seconds in all directions), allowing either more diversity in the scene, or fewer, repeated, but better assets for that same 3-second window. In other words, taking Insomniac Games' Spider-Man as an example, with DirectStorage you can have either
A (diversity): the same asset quality as today, but different for each and every skyscraper, because you can keep swapping them in GPU memory while you swing across the city seamlessly;
or
B (quality): the same, repeated, number of skyscrapers as today, but with better, more detailed assets, again because you only need to keep a much smaller portion of the city in GPU memory.
Of course, the more memory the GPU has, the better it will still be. What DirectStorage will make possible, though, is that today's highest-quality assets will require less GPU RAM, because, unlike today, you'll need to keep less of them readily available at any one time.
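As a rough illustration of that trade-off, the sketch below works out how far ahead of the player the resident assets have to cover for a given full-swap time, and therefore how much smaller the resident set can be when the swap drops from 30 seconds to 3 seconds. The traversal speed and swap times are invented for the example, not taken from any real game.

```cpp
#include <cstdio>

// Back-of-envelope: how far around the player the world must stay resident
// in VRAM if a full asset swap takes `swap_seconds`, assuming the player can
// move in any direction at `player_speed_mps` metres per second.
// All figures are illustrative placeholders.
double lookahead_radius_m(double player_speed_mps, double swap_seconds) {
    return player_speed_mps * swap_seconds;
}

int main() {
    double swing_speed_mps = 40.0; // hypothetical traversal speed

    double slow_radius = lookahead_radius_m(swing_speed_mps, 30.0); // traditional path
    double fast_radius = lookahead_radius_m(swing_speed_mps, 3.0);  // DirectStorage path

    // The resident area scales with the square of the radius, so a 10x faster
    // swap shrinks the resident footprint ~100x at equal asset quality --
    // a budget that can go into either more diverse or more detailed assets.
    double area_ratio = (slow_radius * slow_radius) / (fast_radius * fast_radius);

    printf("Lookahead radius at 30 s swap: %.0f m\n", slow_radius);
    printf("Lookahead radius at  3 s swap: %.0f m\n", fast_radius);
    printf("Resident-footprint reduction: ~%.0fx\n", area_ratio);
    return 0;
}
```

The exact numbers don't matter; what matters is that the resident footprint shrinks with the square of the swap time, which is where the headroom for options A and B above comes from.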
Nanite can only add to all that, but (and I'm being pessimistic here) I don't see developer studios using it to lower GPU requirements by capping quality at what today's top-end graphics cards can do; they will simply increase the quality of their next games, driven by the newer, more powerful graphics cards. And so on, and so on.