[e2160 @ 3ghz(1.275V). 2GB 825mhz ddr2. 8800gts 320mb. msi p35 neo2.] @ 450w (350w rip) (280w rip)
It's natural for humans to eat meat, OK? We eat cows, cows eat grass. We mow the grass, which makes us hungry for more cows. The circle of life!
E4400 Ninja ¦ P35-DS3 ¦ 4x1GB DDR800 ¦ AMD HD6670 ¦ Crucial M4 64GB + WD6400AAKS + WD2500KS + 7200.10 250GB ¦ Enermax 420W ¦ NEC 3520A ¦ Centurion 5 + KAMABAY + AK-FC-03 ¦ MX510 ¦ UltraX ¦ Formula Vibration Feedback ¦ 223BW ¦ Creative T5900 + AKG K 530 ¦ not watercooled anymore
"Pokud máte jiný názor než já, je to jasný důkaz, že se pletete." "když má 1000 pičmulínků jiný názor, jedná se o přímý důkaz, že pravdu mám já" "Jinak ke quadcore - dualcore mě vždy zdržoval, měl jsem už před 10 lety dualCPU, až quadcore je konečně rychlejší než já" "Podle některých je lepší Cialis, nicméně, pokud Vás baví píchat různé svěží mladé kusy zhruba 6 až 7 hodin denně (mám to místo posilovny), 3 hodiny spát, a zbytek pracovat, tak Viagra skutečně funguje lépe. Naprosto doporučuji. 25mg modré tabletky, a denní norma je splněna. Pak 15 hodin programování, nějaký ten hip-hop do Sennheiser sluchátek, a 3 hodiny zase spát." RH
But of course, the unlimited evilness of the ultimate tool of mischief, the universal explanation for sucky performance from anything other than nVidia, TWIMTBP, strikes again. The shenanigans know no bounds... luckily, in this sea of pain and anguish, the shining beacon of light and righteousness, ATi, stands, with their Get in the Game program that they managed so badly (because they're not evil like nV, see, so they couldn't actually have a program where they worked really closely with devs, pushing their tech into their stuff) that no game is part of it. - Morgoth the Dark Enemy
Black holes are where God divided by zero. - Steven Wright
2600K s AC HF14 | P8P67Deluxe s 16GB DDR3 | GTX1080 s LP2480zx | EVO850 s ICH10R | X-Fi s HD555 | TripleXtreme 360 s HPPS+ | Windows 10 + 7
Nvidia to launch 8700
Fudzilla: Nvidia is about to have its first working samples of the card that might end up being called Geforce 8700. The new mainstream chip should be the one we mentioned before, codenamed D8M, but it is highly likely that Nvidia has another codename for it, something like G8x.
The new chip is set to launch together with G92 in the November timeframe, which will secure Nvidia's performance leadership in this market segment.
If the yields are good, Nvidia will have a killer chip, as the thermal dissipation of this improved version of the G84 core is going to be much better and will let Nvidia set higher clocks.
HKEPC: Latest rumours from HKEPC about G92 and friends:
- 65nm TSMC.
- To be launched on November 12th
- Will replace the 8800 GTS directly, in anticipation of a possible December-ish RV670 launch.
- There is a G98 ready to compete against a RV620, launch dates still unknown on either of them. Could be a G84 replacement, for a CES 2008 January introduction (hinting at mobile part first, desktop version in late Winter/early Spring).
- They say it has a "TCCD" memory controller
- PCI-Express 2.0, DX10.1 (as expected), DisplayPort and HDMI support.
- Purevideo Generation 3.
- G92 will remain strictly a "Geforce 8" family member, despite what the codename might have implied.
+
Hmmm. In addition, the company also reported another piece of good news: it was reported earlier that Nvidia's G80, G84 and related chips would not support DirectX 10.1 and Shader Model 4.1, which had been scaring the market for a while. But Nvidia has recently clarified that this was only a rumour, saying that its G80, G84 and related chips are already prepared for DirectX 10.1 and Shader Model 4.1, and that when Microsoft releases DirectX 10.1 and Shader Model 4.1, Nvidia drivers will provide the support and the problem will be resolved smoothly.
Last edited by Masster; 30.08.2007 at 16:22.
Now the question is whether they mean software compatibility or actual hardware support by that.
sensors for DF - DSLR | sensors for DF - compacts and EVF | sensors for DF - mosaic masks | sensors for DF - full-color masks | filters for DF #1 / #2 | DF printing on Canon inks | M42 lenses for DSLR | Thorium glass and M42 lenses
Don't thank, don't rant; if you like or dislike something, please use reputation (i.e. the icon with the scales under the avatar)
Salty specs of G92 and G98 hit the web
http://www.tcmagazine.com/comments.php?id=15946&catid=2
G92 will have a 256-bit memory interface, will work at 740 MHz (core) and will have 64 Stream Processors. The G92 would have 512MB of GDDR3 memory clocked at 1800MHz and a TDP between the 7900GS and the 7900GTX (quite a large gap).
The card's price would be from $249 to $299 and it would score around 9700 points in 3DMark06 on a system powered by an Intel QX6700 (4 cores at 2.66 GHz).
As for the G98, it would have only 32 Stream Processors (like the G84), a 256-bit memory interface, a core clock of 800 MHz, a TDP a little higher than the 8600GTS, and a price tag hovering between $169 and $199, making it either a replacement for the 8600GTS or the card that brings the price of the 8600GTS down close to that of the Radeon HD 2600XT. This one would score about 7400 in 3DMark06 on the same QX6700-based configuration.
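A quick back-of-the-envelope price/performance check on those rumoured figures (the scores and price ranges are the ones quoted above; taking the midpoint of each price range as the street price is purely my own assumption), sketched in Python:

# Rough 3DMark06 points-per-dollar estimate from the rumoured figures above.
# Using the midpoint of each rumoured price range is an assumption, not part of the rumour.
cards = {
    "G92": {"score": 9700, "price_range": (249, 299)},
    "G98": {"score": 7400, "price_range": (169, 199)},
}

for name, data in cards.items():
    low, high = data["price_range"]
    midpoint = (low + high) / 2          # assumed street price
    points_per_dollar = data["score"] / midpoint
    print(f"{name}: ~{points_per_dollar:.1f} 3DMark06 points per dollar "
          f"(score {data['score']}, assumed price ${midpoint:.0f})")

On those assumptions the cheaper G98 comes out around 40 points per dollar versus roughly 35 for the G92, so the two rumoured cards would sit surprisingly close in value for money.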
I'm getting really curious about what we'll actually see in November, because those specifications really don't match the estimated results at all.