ATI beats NVidia
Topic Started: Oct 25 2005, 02:52 PM (392 Views)
| Maplassie GTR (MR GTR) | Oct 25 2005, 02:52 PM | Post #1 |
It seems that last week's ATI event in Ibiza produced an interesting example of one of the X1K range's biggest weapons: overclockability. An overclocker who goes by the name of Maki, apparently hailing from Finland, managed to use his magic on an X1800XT so that a single card outperformed a dual (not overclocked) SLI GeForce 7800GTX setup. The creative Finn managed to bring a regular 625MHz core / 1500MHz memory X1800XT card to the dizzying heights of an 877.5MHz core, with memory clocking in at 1980MHz.

Before you rush off to throw USD 600 ATI's way, make sure you understand that to reach such levels of performance you would need a steady supply of dry ice to keep your baby from setting your desk alight. The above-mentioned setup, using an AMD Athlon 64 FX-57 (clocked at 3617.5MHz), achieved the impressive score of 12278 in 3DMark05, a score that beats the SLI dual GeForce 7800GTX setup using the same processor.

This victory is not immediately significant to prospective ATI owners, but it is a positive comment on the potential of the R520 chip, which appears to be just as flexible as its predecessor, if not more so. The latest news from the ATI camp even suggests that the X1800XT can reach 10k+ in 3DMark05 with conventional air cooling. The biggest problem of the X1800XT, however, is that it is still not available, while the GeForce 7800GTX, for example, is.

Sorry, this is the only pic I could find.
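For a rough sense of how far above stock that dry-ice run sits, here's a quick back-of-the-envelope calculation using only the figures quoted above (just an illustrative Python snippet; the percentages follow directly from those numbers):

```python
# Rough calculation of the X1800XT overclock quoted above (stock vs. Maki's dry-ice run).
stock_core_mhz, oc_core_mhz = 625.0, 877.5
stock_mem_mhz, oc_mem_mhz = 1500.0, 1980.0

core_gain = (oc_core_mhz / stock_core_mhz - 1) * 100  # ~40.4% over stock core
mem_gain = (oc_mem_mhz / stock_mem_mhz - 1) * 100     # ~32.0% over stock memory

print(f"Core overclock:   +{core_gain:.1f}%")
print(f"Memory overclock: +{mem_gain:.1f}%")
```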
| someoneelse (Through the looking glass) | Oct 25 2005, 03:10 PM | Post #2 |
It's hectic! But one has to remember that this is overclocked like a biatch! nVidia still has the lead with availability of cards, and remember this: how well would two overclocked 7800GTXs compare to this card? Still, this X1800 is a flippen BEAST to be reckoned with. As always, the race for the best graphics solution is on, and for now ATI has the lead again. But for how long?
| Maplassie GTR (MR GTR) | Oct 25 2005, 03:13 PM | Post #3 |
Till Jan/Feb... cue the 7800 Ultra!
| someoneelse (Through the looking glass) | Oct 25 2005, 03:21 PM | Post #4 |
Let's admit it, dude: each has its positive aspects. I would really not mind an X1800, and I would really love a 7800GTX. Both ATI and nVidia have brought us a lot of things to smile about.
| Bent_Anat (Monster male chicken) | Oct 25 2005, 03:25 PM | Post #5 |
The X1K series of cards, like the 7XXX series of cards, is IMO a time-staller. Green Goblin was pushed by ATi to ship out another card, and they did that. They also did a BLOODY good job with what they had been given. ATi was then forced to counter again, and did so with the X1K series.

What really bothers me, not to mention makes me ditch any thought of ever spending money on any of these cards, is the fact that they're not DX10 compatible. Nor, for that matter, do they support PS4. MS announced that DX10 is due in Q3 2006 and that PS4 will be made available in a DX9 upgrade shortly before. I cannot help but wonder: where are the cards that support this? CLEARLY, both manufacturers must be working on this. Why haven't we seen any details on these cards???

Assuming that PS4 comes out in May (what with Vista seeming to come out in June), I would expect the cards to be available just before that. A Windows that NO ONE can run on max details??? I don't think so... (and yes, Vista max details uses PS4). So that would mean: unless the card manufacturers do not mind falling behind on WINDOWS tech, the next generation of cards is due in May at the very latest. This makes the lifespan of the current "high end" cards between 4 and 8 months... which, I might add, is lower than the pathetic lifespan of the GeForce FX series...
"Do not respond blindly To growing gardens in decay Stand guard when visitors fall silent Blow the candles out, end the road here." - Dimmu Borgir | |
| someoneelse (Through the looking glass) | Oct 25 2005, 03:32 PM | Post #6 |
Dude, you've gone gung-ho on special effects. Remember that raw power is a big player in these cards, and SM4.0 will bring dazzling effects, not monster performance. But as usual, Vista WILL be delayed, and I do tend to agree with you on the May release dates of next-gen cards. But I wouldn't say that the lifespan of the current cards will be as short-lived as the FX series.
| Maplassie GTR (MR GTR) | Oct 25 2005, 04:21 PM | Post #7 |
The Windows issue aside, the reason why I don't mind getting a 7800GTX now, even if it is just DX9, is this: let's say DX10 comes out in 6-8 months. How long is it gonna be before a TRUE DX10 VGA card comes out? And then, how long is it gonna take for a TRUE DX10-only title to become available? I say only in 2007. Mark my words. 2006 is still a safe year for any DX9 card holder. So, I have at least a year to a year and a half to play with my new toy, which does not bother me at all!
| VoodooprophetII (PC HATER) | Oct 25 2005, 04:32 PM | Post #8 |
Well, if you're talking about efficiency... nVidia with the G70 is what you're looking for. And I'm talking about power consumption. Read the following: http://www.bit-tech.net/news/2005/07/07/g70_clock_speed/ I mean, a GPU that changes its clocks depending on demand... not a bad idea at all.
| Maplassie GTR (MR GTR) | Oct 25 2005, 04:36 PM | Post #9 |
VFI (very fucking interesting):

NVIDIA's Chief Scientist, David Kirk, has suggested that "People just don't know as much as they think they do" when it comes down to the "many" clocks within the GeForce 7800, aka G70. "It's somewhat hard for us to say 'the core clock in G70 is this single number'," says Kirk. "We didn't want to be accused of exaggerating the clock speed, so we picked a conservative number to talk about the core clock speed. But, yes, that is just one of the multiple clocks."

David's comments come as he speaks exclusively to bit-tech about the issue which has become a hot topic amongst the community over the past couple of days, following the discovery that RivaTuner was reporting varied clock speeds for the 7800. We met with David in central London today, and he talked to us about many different issues. We'll have the full interview for you tomorrow, but we couldn't sit on this one until then.

"People have said that G70 doesn't have any new architecture, but that's not really true. It has new architecture, it's just not always visible. The chip was designed from the ground up to use less power. In doing that, we used a lot of tricks that we learned from doing mobile parts. The clock speeds within the chip are dynamic - if you were watching them with an oscilloscope, you'd see the speeds going up and down all the time, as different parts of the chip come under load."

So why haven't we heard about this feature before? "We haven't talked about this feature before now because we wanted the technology to speak for itself," says NVIDIA's PR Manager Adam Foat. "People noticed its effect - that the 7800 is amazingly quiet and fantastically cool - and that's what we wanted." You can pretty much bet that the other reason NVIDIA haven't talked about the technology is that they didn't want ATI to find out about it and copy it.

We asked David what the three visible clocks did (that's the ROP clock, pixel clock and geometry clock if you're still playing catch-up). "You're making the assumption there's only three clocks," was his cryptic reply. "The chip is large - it's 300m transistors. In terms of clock time, it's a long way across the chip; it makes sense for different parts of the chip to be doing things at different speeds."

What of the speculation that certain parts of the chip only overclock in multiples of more than 1MHz, appearing to restrict overclocking? "Well, the chip is actually better for overclocking, since it's so low-power and low-heat," Kirk tells us. "We're going to have to work with the guys at RivaTuner, because it could be that it makes sense for overclocking tools only to offer options that are really going to give a performance benefit, rather than letting users hunt around for the best combinations and multiples that work. Because of the way the chip works, it makes sense for different parts to be working in multiples."

So there you have it - there are an undisclosed number of individual clock speeds within G70, possibly more than 3. Those clocks scale up and down to save power, and this is one of the big features that has kept G70 to a single-slot-heatsink design, and an astoundingly quiet one at that. This is proprietary NVIDIA tech, and they're incredibly pleased with how well it works. NVIDIA is going to work to iron out issues with overclocking and RivaTuner, but don't expect too much more to be given away - we think that NVIDIA see this as a great technology advantage over their rivals.
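To make the idea of per-domain dynamic clocking a bit more concrete, here is a tiny illustrative sketch. It has nothing to do with NVIDIA's actual hardware or driver code; the domain names, clock figures, and scaling rule are all invented for the example. It just shows each clock domain scaling its own frequency with its own load, the way Kirk describes different parts of the chip speeding up and slowing down independently:

```python
# Illustrative sketch only: per-domain dynamic clock scaling, loosely inspired by
# the behaviour described in the quoted article. All names and numbers are made up.

STOCK_CLOCKS_MHZ = {"rop": 430, "pixel": 470, "geometry": 470}  # hypothetical figures
IDLE_CLOCK_MHZ = 275  # hypothetical low-power floor

def dynamic_clock(domain: str, load: float) -> float:
    """Scale a domain's clock between an idle floor and its stock clock
    according to that domain's current load (0.0 = idle, 1.0 = fully busy)."""
    load = min(max(load, 0.0), 1.0)
    stock = STOCK_CLOCKS_MHZ[domain]
    return IDLE_CLOCK_MHZ + (stock - IDLE_CLOCK_MHZ) * load

# Example: a pixel-heavy, geometry-light frame drives each domain differently.
loads = {"rop": 0.9, "pixel": 1.0, "geometry": 0.3}
for name, load in loads.items():
    print(f"{name:>8}: {dynamic_clock(name, load):6.1f} MHz at {load:.0%} load")
```

The point of the sketch is simply that there is no single "core clock" number: each domain ends up at its own frequency depending on what the workload asks of it.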