
So.
Will it play Crysis?
At the tail end of the GPU Technology Conference in San Jose this week, graphics chip juggernaut and compute wannabe Nvidia divulged the salient characteristics of the high-end "Kepler2" GK110 GPU chips that are going to be the foundation of the two largest supercomputers in the world and that are no doubt going to make their …
@Ru
You are correct. I ported my neural net program to CUDA and found it was not the best match. Either the reads are in order but the writes are random, or the reverse, depending on which direction you decide to slice and dice the calculation and how the data structure is sorted in advance.
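To make the trade-off concrete, here's a minimal CUDA sketch (the kernel name and layout are mine for illustration, not from my actual program): a naive matrix transpose has exactly this property. As written, consecutive threads read consecutive addresses, so the reads coalesce, but the writes are strided by the row count; flip the two index expressions and the writes coalesce while the reads scatter instead. You can't have both without staging tiles through shared memory.

    // Naive transpose of a rows x cols row-major matrix.
    // Reads from 'in' are coalesced (consecutive threads hit
    // consecutive addresses); writes to 'out' are strided by 'rows'.
    // Swapping the two index expressions reverses the situation.
    __global__ void transpose_naive(const float *in, float *out,
                                    int rows, int cols)
    {
        int x = blockIdx.x * blockDim.x + threadIdx.x;  // column in 'in'
        int y = blockIdx.y * blockDim.y + threadIdx.y;  // row in 'in'
        if (x < cols && y < rows)
            out[x * rows + y] = in[y * cols + x];
    }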
This is why OC enthusiasts have been training with liquid nitrogen for years. Those with money to spare might want to consider investing in Linde AG (disclaimer: this is not an offer to sell or a solicitation of an offer to buy any securities, and I am not in any way connected to the aforementioned company)...
Henri
Remember all the people who complained that the Cell was hard to program, because the SPUs were not the same as the PPC cores and needed the programmer to explicitly manage moving data from main memory to SPU memory?
Now this - is this any better (other than the fact that there are more than 8 cores)?
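For comparison, the CUDA model still leaves that chore with the programmer too; a minimal sketch (function and buffer names are illustrative, error checking omitted):

    #include <cuda_runtime.h>

    // CUDA's analogue of DMA'ing into an SPU's local store: you
    // explicitly copy data from main memory to device memory and back.
    void run_on_gpu(const float *host_in, float *host_out, size_t n)
    {
        float *dev_buf;
        cudaMalloc((void **)&dev_buf, n * sizeof(float));

        // Programmer-managed move from main memory to GPU memory.
        cudaMemcpy(dev_buf, host_in, n * sizeof(float),
                   cudaMemcpyHostToDevice);

        // ... launch kernels that work on dev_buf ...

        // And the explicit move back.
        cudaMemcpy(host_out, dev_buf, n * sizeof(float),
                   cudaMemcpyDeviceToHost);
        cudaFree(dev_buf);
    }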
Will nVidia allow this chip to be sold in anything other than board-level assemblies? That was the problem (from my perspective) with the Cell: IBM didn't want to sell the chip alone unless you were buying hundreds of thousands of them, so you had to buy a board from one of the board vendors like Mercury Computing. If nVidia won't let companies create their own boards (or package this into more useful form factors than 6U CompactPCI), then it will have similar issues.
Cell was marketed as a general-purpose CPU, though. These things are not. Cell needed a whole new rack of skills and tools that didn't really exist before its release, whereas the new Kepler stuff builds on existing tools and skillsets. As far as I can tell, your existing CUDA and shader programs can be ported across to the new hardware just fine, and will work that little bit better without you ever needing to know about the new features.
It isn't quite apples and oranges, but it isn't far off.
...case in the USA rippled out into other countries as well. The replacements were made for the known-duff parts (like the thousands supplied to the likes of Dell and HP), and although this didn't fix the unknown-duff ones, i.e. the ones they didn't deliberately sell on knowing they would probably fail, it caught a lot. Had my GS7900 in an old laptop swapped out at 4.5 years old last year, and got one with double the RAM in its place.
But, yeah: better check the quality of the glue - at 300W, they'll probably self-solder without having to resort to sticking the graphics card in the oven for 10 minutes or so!