I'm not going to comment on this nonsense. I'll just recommend that you read the whole article that was quoted here: http://www.bit-tech.net/gaming/2006/...e_Engin/1.html
If you won't believe a loser like me, then maybe, just maybe, you'll believe the developers at Valve, although I realize that next to you they're obviously total amateurs. And knowing Valve's games, I'm 100% convinced that their multi-threaded engine will be an outright "hack job with reduced output quality"!
You really made me laugh there.
Now, for everyone: the conclusion really caught my attention:
But the chance of physics cards taking off is pretty minimal if we draw inferences from the views of Newell on hardware. He talks enthusiastically of the "Post-GPU world" - rather like the one envisaged by AMD and ATI with their Fusion project. In this world, we see a number of homogenous CPU cores all tasked with different projects - including graphics rendering. This allows for more flexibility when it comes to splitting up workloads, and means that engine-functions such as AI and physics can become a more integral part of the gaming experience because of the scalability such an architecture adds. "All of a sudden," raves Gabe, "If your AI isn't running fast enough, you can lower your graphics resolution. That's some awesome flexibility."
It's actually quite logical. Graphics cards are becoming more and more like ordinary processors, and conversely some processors, e.g. Cell, are starting to look suspiciously like GPUs. Intel's 80-core Tera chip is apparently what the author of the quoted piece had in mind: a highly general-purpose, massively parallel, and very powerful CPU+GPU.
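To make Newell's "lower your graphics resolution if AI is too slow" remark concrete, here is a minimal, purely hypothetical sketch of that kind of flexibility: a pool of identical worker threads stands in for homogeneous cores, AI and rendering are just tasks submitted to it, and when a frame overruns its budget the renderer's workload (resolution) is cut. Every name, number, and the scheduling policy here is an illustrative assumption, not anything from Valve's actual engine.

```python
from concurrent.futures import ThreadPoolExecutor
import time

FRAME_BUDGET = 1 / 60  # ~16.7 ms per frame at 60 fps

def run_ai(workload_ms):
    # stand-in for pathfinding, planning, etc.
    time.sleep(workload_ms / 1000)
    return "ai done"

def render(resolution):
    # pretend render cost scales with pixel count
    time.sleep(resolution[0] * resolution[1] * 1e-9)
    return resolution

def frame(ai_ms, resolution, pool):
    start = time.perf_counter()
    # AI and rendering are peer tasks on the same homogeneous pool
    ai = pool.submit(run_ai, ai_ms)
    img = pool.submit(render, resolution)
    ai.result()
    img.result()
    elapsed = time.perf_counter() - start
    # frame overran its budget: trade graphics work for AI headroom
    if elapsed > FRAME_BUDGET:
        resolution = (resolution[0] // 2, resolution[1] // 2)
    return resolution

with ThreadPoolExecutor(max_workers=4) as pool:  # the "homogeneous cores"
    res = (1920, 1080)
    # a frame with heavy AI (30 ms) blows the 16.7 ms budget,
    # so the next frame renders at half resolution
    res = frame(ai_ms=30, resolution=res, pool=pool)
    print(res)  # → (960, 540)
```

The point of the sketch is only that when every task runs on interchangeable cores, the scheduler can rebalance *any* workload against any other, which is exactly the flexibility a fixed-function GPU pipeline doesn't offer.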