Hold off on those upgrades, people! Nvidia’s Pascal line of GPUs has been shown off and it looks glorious!
Details here - LINK
http://bi9he1w7hz8qbnm2zl0hd171.wpengine.netdna-cdn.com/wp-content/uploads/2015/11/NVDA_Pascal_EngBoard.png
At the Japanese edition of NVIDIA GTC (GPU Technology Conference), NVIDIA finally revealed details behind its 2016 graphics architecture, codenamed Pascal. The architecture was first announced at the main GTC event, which took place in San Jose on March 17th, 2015 (watch Jen-Hsun Huang’s GTC keynote here). GTC Japan was hosted by Marc Hamilton.
As always, the Pascal GPU will be manufactured by Taiwan Semiconductor Manufacturing Company (TSMC), using the brand-new 16nm FinFET process. This process is much more than a simple number change, since it marks the shift from planar, 2D transistors to FinFET, i.e. 3D, transistors. This shift required engineers to rethink a great deal of their design approach, and it should result in significant power savings.
But that is just the beginning, as Pascal will bring support for up to 32GB of HBM2 memory. However, the first products based on Pascal will launch with 16GB of HBM2 memory; whether more arrives will depend solely on memory vendors such as SK Hynix and Samsung. What is changing the most is bandwidth. Both the Kepler-based Tesla (K40) and the Maxwell-based M4/M40 featured 12GB of GDDR5 and achieved up to 288GB/s of memory bandwidth. Those 16GB of HBM2 SDRAM (packed in four 4GB HBM2 stacks) will bring 1TB/s of bandwidth, while internally the GPU surpasses the 2TB/s barrier.
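As a rough sanity check on those numbers, here is a minimal arithmetic sketch. The 256GB/s-per-stack figure is an assumption derived from the 1TB/s total quoted for four stacks; it is not stated in the post itself.

```python
# Rough bandwidth comparison: GDDR5 (Tesla K40/M40) vs. HBM2 (Pascal).
# Per-stack HBM2 bandwidth of 256 GB/s is assumed, inferred from the
# 1 TB/s total quoted for four 4GB stacks.

hbm2_stacks = 4
per_stack_gb_s = 256                      # assumed per-stack bandwidth
hbm2_total = hbm2_stacks * per_stack_gb_s # ~1 TB/s

gddr5_total = 288                         # GB/s, K40/M40 figure from the post

print(f"HBM2 total:  {hbm2_total} GB/s")
print(f"GDDR5 total: {gddr5_total} GB/s")
print(f"Speedup:     {hbm2_total / gddr5_total:.1f}x")
```

So under that assumption, HBM2 works out to roughly a 3.6x jump over the previous Tesla cards.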
Pascal will also be available in multi-GPU packaging, replacing the Tesla K80 (NVIDIA skipped a Maxwell-generation dual-GPU Tesla). The combined figures are very interesting to compare – 24GB of GDDR5 and 480GB/s of bandwidth should be replaced with 32GB of HBM2 and 2TB/s of bandwidth, with the GPUs connected through NVLink rather than PCIe. NVLink will enable up to 80GB/s, replacing PLX PCIe Gen3 bridge chips that can only support 16GB/s (8GB/s per GPU). This part should serve as a warm-up for 2018 and the Volta architecture.
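Putting the two interconnect figures from the post side by side, a quick sketch of the headline comparison:

```python
# Interconnect comparison using the figures quoted above:
# NVLink at up to 80 GB/s vs. a PLX PCIe Gen3 bridge at 16 GB/s
# total (8 GB/s per GPU on a dual-GPU card).

nvlink_gb_s = 80
pcie_bridge_gb_s = 16

advantage = nvlink_gb_s / pcie_bridge_gb_s
print(f"NVLink advantage over a PCIe bridge: {advantage:.0f}x")
```

That 5x figure is the link-level headroom NVLink would give a dual-GPU card over the K80-style PCIe bridge design.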
Unfortunately, the company did not disclose how much ECC (Error Correcting Code) would reduce memory performance and add overhead, but that is something all HBM-powered products will have to deal with. In any case, the company is gearing up for a battle with Intel’s Xeon Phi, which in its most recent incarnation is becoming quite the competitor. Still, Pascal is expected to deliver double-digit single-precision TFLOPS performance, with a lot of focus placed on so-called mixed-precision compute (INT8, FP16 and FP32).
Pascal is expected to hit the market during the first half of 2016.
What are your thoughts on this?
How do you think AMD will react?
Will you buy one of these cards?
What are your thoughts on the next generation of Nvidia-based mobile hardware?
Naveed
(Naveed)
November 18, 2015, 12:10pm
2
Dang and I just picked up a Titan X last week!
This power will come at an eye-watering price at first though, so I’ll probably consider a Pascal close to the Volta launch. A broader range of Pascals should be available for rock-bottom prices when that happens.
I was converted from Radeon to Nvidia a few generations back, so I will probably stick with them.
Nawrot
(Nawrot)
November 18, 2015, 12:15pm
3
I am kind of happy I picked “only” a 980 Ti. For me this new architecture is a matter of 2 years or so away, and by then there will be a new one.
These GPUs with up to 32GB are for the 2016 4K gaming hype.
People will buy the 16GB one, and a few months later there’s a better 32GB model out there…
I now upgrade GPUs only if I have a real reason to do so; mine is still running the UE4 Editor smoothly, so I won’t buy another one for the coming years.
Naveed
(Naveed)
November 18, 2015, 4:35pm
5
I upgraded from a GTX 970 x2 SLI setup. I realised I made a huge mistake in investing in SLI, as UE4 doesn’t support it!
4K gaming is absolute bliss… when there’s a decent framerate, which my two 970s couldn’t even manage. I highly recommend it!
Well… I bet those will be too expensive to buy! (For people like me!)