Not really, it's just a more powerful server computer. The headlines and descriptions are rather misleading as to its actual capabilities. It can't actually "manage" 160 petabytes of data in 250 ns; it can just randomly access any part of it in 250 ns. And it's not likely to be ready for a few years.

More powerful computers help, but only to a degree. If you build the model from 2x the number of segments, you get roughly 4x the number of possible interactions. However, you could certainly throw far more computing power at it than NIST used (I think they just used a cluster of about 10 fairly ordinary PCs). It's just rather expensive - and again, there's no real need, as there are too many unknowns, so all you can do anyway is verify the general principles.
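To make that scaling concrete, here's a minimal sketch (my own illustration, not anything from NIST's actual setup) of how the count of possible pairwise segment interactions grows when you double the number of segments:

```python
# Illustrative only: if any segment can potentially interact with any other,
# the number of segment pairs is n choose 2 = n * (n - 1) / 2.

def pairwise_interactions(n: int) -> int:
    return n * (n - 1) // 2

for n in (1_000, 2_000, 4_000):
    print(n, pairwise_interactions(n))

# 1000 ->   499,500
# 2000 -> 1,999,000   (~4x)
# 4000 -> 7,998,000   (~4x again)
```

So doubling the segment count roughly quadruples the pairs the solver might have to consider, which is why raw hardware only buys you so much model resolution.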