Just for the record, here's what's known so far:
New "GTS": 112 SP, GPU 500 MHz, shaders 1200 MHz, memory 1600 MHz, 320-bit
vs
GT: 112 SP, GPU 600 MHz, shaders 1500 MHz, memory 1800 MHz, 256-bit
Will the "GTS" really be the better choice? Especially after the recent article on the impact of shader clocks, which concluded that beyond a certain GPU frequency further overclocking doesn't help much anymore while the shader clock always matters, I wouldn't be so sure. We'll see how big a handicap the narrower bus turns out to be.
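For what it's worth, here's a quick back-of-the-envelope sketch of the raw memory bandwidth (using the rumored specs quoted above, so treat the numbers as speculative):

```python
# Peak memory bandwidth = bus width (bytes) x effective memory clock.
# Specs are the rumored ones from this thread, not confirmed figures.

def bandwidth_gb_s(bus_bits: int, mem_mt_s: int) -> float:
    """Peak bandwidth in GB/s from bus width in bits and effective MT/s."""
    return bus_bits / 8 * mem_mt_s / 1000

print(f'new "GTS" (320-bit, 1600 MT/s): {bandwidth_gb_s(320, 1600):.1f} GB/s')  # 64.0
print(f'GT        (256-bit, 1800 MT/s): {bandwidth_gb_s(256, 1800):.1f} GB/s')  # 57.6
```

So on paper the narrower bus costs the GT about 10% of raw bandwidth, which the higher core and shader clocks may or may not make up for.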
And where does it say the "GTS" will cost 10k?!
A purely speculative question for the experienced users here:
do you think (given the specs and the 65nm process) it will be possible to cool the 8800GT passively (with quiet case airflow, of course), for example with some Thermalright cooler?
G92-based 8800 GTS on Nov 19th
Expreview: It's clear that the 8800GT will appear on the market on Monday, 29th Oct. But maybe you are all wondering about the second G92 product: the new 8800GTS.
According to the sources, the new 65nm 8800GTS will be announced on Monday, 19th Nov, a week after the 780i (12th Nov). The price is CNY 2299 (~305 USD).
It's known that the new 8800GTS will have 128 SPs, but there is still no evidence about the core, memory, or shader clocks. The scores we had earlier already prove that the 8800GT can beat the old 8800GTS, and we believe the new 65nm 8800GTS will be better than the old 8800GTX.
So keep an eye on 29th Oct and 19th Nov. And please remember that the RV670 will be coming on 15th Nov.
I haven't been following this much, but is the transistor count known yet? Something doesn't add up for me...
G80 / 90nm / 384-bit bus / 484 mm²
G92 / 65nm / 256-bit bus / 289 mm² is what I read somewhere (is that right?)
Given that the converters and the analog circuitry are off-chip, a move to a smaller process can be extrapolated fairly easily, so if G92 were just a shrunk G80 (with no change in transistor density), it should come out at 484/1.92 = 250 mm².
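A minimal sketch of that arithmetic, assuming an ideal shrink where die area scales with the square of the feature size (real layouts rarely shrink perfectly):

```python
# Ideal 90 nm -> 65 nm shrink of G80's 484 mm² die; area is assumed
# to scale with the square of the feature size (an idealization).

g80_area_mm2 = 484.0
shrink = (90 / 65) ** 2             # ~1.92x area reduction
ideal_g92 = g80_area_mm2 / shrink   # ~252 mm²
print(f"ideal shrunk G80: {ideal_g92:.0f} mm²")
print(f"reported G92 (289 mm²) is {289 - ideal_g92:.0f} mm² larger")
```

Under that idealized assumption, roughly 37 mm² of the reported die area is unaccounted for, which is the gap behind question 1 below.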
so I'd like to know:
1. why it's about 40 mm² bigger than it should be
2. why nVidia intends to sell a chip of high-end dimensions (~300 mm²) at a mainstream/midrange price (G71 was just under 200 mm², G70 334 mm², NV40 287 mm²)
3. what is disabled in G92 and why (if anything is disabled)
4. whether G92 was always meant for mainstream/midrange
5. why G92 is clocked so conservatively (let's say it's because it's held back by bus bandwidth anyway)
6. if G92 is held back by bus bandwidth, why nVidia doesn't use GDDR4
Any ideas?