It seems that HEXUS are on a roll with speculation regarding AMD's next-generation R600 GPU, having published no fewer than four stories about the part: some of it outlandish, some of it slightly more plausible. Here's a recap of what they're saying.
Quote:
R600 has had several challenges, principally over power and cooling.
ATi has looked at water-cooling and traditional air cooling. But, the speed of the processor and its resultant heat output have meant that ATi also had to consider more complex hi-tech solutions.
So, the company turned to NanoFoil cooling technology. This, as best we can understand, works by having ultra-thin nanolayers of aluminium and nickel that can be kicked into life by heat, electricity or mechanical or optical stimulation, causing a reaction that gives off heat in a controlled fashion.
All the might of the AMD/ATi combine can't deliver R600 on time
Quote:
The reasons why a re-spin might be needed can be highly convoluted technically. They can also be ultra simple. Humans, remember, are involved in the production decisions.
With A11, as we understand it, the issue was very, very simple. Someone forgot to connect the pins that let the GPU communicate with the outside world.
ATi keeps on spinning to try to get R600 right
Quote:
Seemingly, ATi has been aiming to achieve 11,000 3DMark scores at 750MHz. That's tasty and may mean the R600 offers a strong challenge to NVIDIA's G80 GPU given that, as we reported in HEXUS.Beanz a few hours ago, ATi's already got samples running at 1GHz and has higher targets in mind.
Will R600's performance be enough to put ATi back in the lead?
Quote:
ATi is targeting February 14 - Valentine's Day - for getting R600 review samples into the hands of the press.
And, just in case you were in any doubt: yes, we do mean next Valentine's Day, not the year after next.
ATi targets CeBIT for R600 launch
So, there you have it - NanoFoil cooling, a 1GHz clock speed, and a Valentine's Day launch to press. Personally, I'd bet on at least two out of those three points being wrong, but who knows?