27 Comments
BoneRemake says... @Funny
iaui says... What's the original video about?
notarobot says... So, what's this actually about? Both the original video and the computer card wars story?
ant says... 4 GB of VRAM? I still use 512 MB of VRAM!
BoneRemake says... Absolutely no clue on either count, but the laugh was hilarious! And I know the difference between 4 GB and 3.5 GB. Other than that, no clue, and I could not/cannot find anything.
eric3579 says... You may find some answers here:
http://www.pcper.com/reviews/Graphics-Cards/NVIDIA-Discloses-Full-Memory-Structure-and-Limitations-GTX-970
and/or
http://www.guru3d.com/news-story/does-the-geforce-gtx-970-have-a-memory-allocation-bug.html
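For anyone curious how people actually detected this: the linked articles describe community benchmarks that allocate VRAM chunk by chunk and time the bandwidth of each chunk; on an affected 970, chunks past the ~3.5 GB mark measure dramatically slower. Below is a minimal CUDA sketch of that idea. It is not the actual benchmark from the articles; the chunk size, kernel, and iteration count are illustrative.

```cuda
// Hypothetical per-chunk bandwidth probe, similar in spirit to the
// community benchmarks discussed in the linked articles. Numbers are
// illustrative, not a faithful reproduction of any specific tool.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void touch(float *p, size_t n) {
    size_t i = blockIdx.x * (size_t)blockDim.x + threadIdx.x;
    if (i < n) p[i] = p[i] * 2.0f + 1.0f;  // read-modify-write every element
}

int main() {
    const size_t chunkBytes = 128ull << 20;       // 128 MiB per chunk
    const size_t n = chunkBytes / sizeof(float);
    float *chunks[32];
    int count = 0;

    // Grab 128 MiB chunks until allocation fails, filling the card.
    while (count < 32 && cudaMalloc(&chunks[count], chunkBytes) == cudaSuccess)
        ++count;

    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    const int threads = 256;
    const int blocks = (int)((n + threads - 1) / threads);

    for (int c = 0; c < count; ++c) {
        touch<<<blocks, threads>>>(chunks[c], n); // warm-up launch
        cudaEventRecord(start);
        for (int r = 0; r < 10; ++r)
            touch<<<blocks, threads>>>(chunks[c], n);
        cudaEventRecord(stop);
        cudaEventSynchronize(stop);

        float ms = 0.0f;
        cudaEventElapsedTime(&ms, start, stop);
        // 10 iterations, each reading and writing chunkBytes.
        double gbps = 10.0 * 2.0 * chunkBytes / (ms / 1e3) / 1e9;
        printf("chunk %2d (~%.1f GiB mark): %.1f GB/s\n",
               c, (c + 1) * 128 / 1024.0, gbps);
    }

    for (int c = 0; c < count; ++c) cudaFree(chunks[c]);
    return 0;
}
```

On a card with a uniform memory pool, every chunk should report roughly the same figure; the 970 reports claimed a sharp drop on the chunks sitting in the last 0.5 GB segment.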
00Scud00 says... Being an owner of a 970, I can safely say that I am not amused. (ಠ_ಠ)
Mordhaus says... I own a 3975 EVGA 970 and I can say that, while I am pissed at Nvidia, this is funny as balls.
Ralgha says... 3.5 GB! LOL. This is so good. And no, I don't own a 970. But I'd take one over an AMD GPU any day.
00Scud00 says... I have no particular allegiance to either brand and have bounced back and forth between the two for years. This is my first Nvidia card in a while; the main reason for buying ATI cards came down to price for me. Why spend hundreds more when I could get a comparable ATI card with more VRAM to boot? That changed with the 970: I could get a better GPU with another gig of VRAM over my Radeon 7970. Or is that only half a gig now?
RFlagg says... I agree it seems shady to advertise 4 GB if only 3.5 GB runs at full bandwidth... there should at least have been an asterisk on it. I'm not sure the complaints about the ROPs are as big a deal... I mean, if I purchased just off specs I would perhaps be concerned, but really, I look at performance for the dollar. The two linked articles still say they stand by their recommendation due to performance for the dollar, which is the main metric I look at. Then again, if I were running above 1080p I'd probably be a bit more concerned... that, or if the few games that require 4 GB to run in ultra wouldn't run in ultra on my card that supposedly has 4 GB...
mxxcon says... Ask for a refund.
Payback says... Seriously though, if he's making so much damn money, you'd think he'd spend more on dental surgery.
Jinx says... The real story here is that apparently nobody is able to tell the difference between a 3.5 GB card and a 4 GB one.
Gutspiller says... The sad thing is, as an owner of an AMD 7990 (dual GPU), and seeing that most of today's games run horribly with a dual-GPU setup because the devs haven't taken the time to code for it, my next upgrade will be forced onto a single-GPU card.
So either way, both companies have a darker side that isn't talked about as much as it should be in reviews.
Example: i7 @ 4.2 GHz, 16 GB RAM, AMD 7990 graphics card, CoD: AW @ 1920x1080 will drop to 7 fps... IN THE MENU. (And it's not just CoD.) I've seen multiple new releases run like crap.
Talk to support, and the answer is "disable CrossFire." Why pay for a dual GPU if no new games support it, because their biggest audience runs one GPU?
I wish a new company would knock both AMD and Nvidia on their ass and take over with better tech, faster cards, and less sleaze. (One can dream.)
ForgedReality says... It's not dual-GPU that's the problem. It's the dual-GPU AMD card that's the problem. AMD drivers are effing terrible. They always have been. Get a dual-GPU nVidia card and everything changes.
Quboid says... I own a 970 and I'm not particularly concerned. I think it was an accidental error by Nvidia which affects them (egg on face) more than it affects me (imperceptible performance loss).
Daldain says... That was probably the funniest thing I have seen in years.
BoneRemake says... @lucky760 What are you running?
I have a nicely working Radeon R7 760 2 GB. Works aces for me; none of the hoo-ha the apparent story is about.
Chairman_woo says... Next you'll be telling me that Intel put a 100-200%-ish mark-up on their top-end CPUs because they have no real competition from AMD....
......*checks parts shop* OH NOES, that's exactly what the twats have done!!!
Chairman_woo says... I have an R9 280X and, to be honest, I've never really seen it get past about 60% GPU load and 2-ish gigs of the VRAM.
However, I'm only running a single 1080p monitor, nor am I running any kind of upscaling-based anti-aliasing.
The future seems to be 4K monitors and, for the serious psychos, 4K Eyefinity and maybe even that silly Nvidia 3D thing.
When you start to get into anything like that (and 4K will inevitably come down to consumer-level prices), coupled with the recent push for texture resolution in AAA games, all your future-proofing starts to go out the window.
The reason people are pissed off is that this card could easily have seen users through the next few years of monitor and games tech, and they artificially gimped it such that anyone who wants to stay reasonably cutting-edge will have to buy a new card in 2-3 years.
4 gigs is fine for now, but it's a joke that a new top-end card would have less VRAM than some medium-weight cards from two generations ago. Even my 280X has 3.
Long story short: resolution eats VRAM for breakfast, and resolution is where most of the next-gen game development is likely to be biased. It's frustrating, but as some others have suggested, it's really nothing new.
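To put rough numbers on "resolution eats VRAM": render-target memory scales linearly with pixel count, so 4K costs four times what 1080p does before textures even enter the picture. A back-of-envelope sketch follows; the render-target count is a made-up illustrative figure, not taken from any particular engine.

```cuda
// Host-only arithmetic (no GPU needed): how fast resolution alone
// eats VRAM. The buffer count is a hypothetical stand-in for a
// typical deferred renderer's back buffer, depth buffer, and a few
// G-buffer targets.
#include <cstdio>

int main() {
    struct Mode { const char *name; int w, h; };
    Mode modes[] = { {"1080p", 1920, 1080},
                     {"1440p", 2560, 1440},
                     {"4K",    3840, 2160} };
    const int bytesPerPixel = 4;  // RGBA8
    const int renderTargets = 8;  // illustrative assumption

    for (const Mode &m : modes) {
        double mib = (double)m.w * m.h * bytesPerPixel * renderTargets
                     / (1 << 20);
        printf("%-6s (%dx%d): ~%.0f MiB in render targets alone\n",
               m.name, m.w, m.h, mib);
    }
    return 0;
}
```

That works out to roughly 63 MiB at 1080p versus roughly 253 MiB at 4K for the render targets alone, and high-resolution texture sets scale the remaining budget far harder than that.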
oohahh says... Found this in the YT comments:
"I think it's Catalan... the only thing I can understand is the beginning: 'And the cook calls me: Rosita! What? Go get the paella, come on, come on, it's 2 in the afternoon....' etc. It's some kind of joke about going to the beach, the swimsuit, and a typical dish from Spain (paella)..."
lucky760 says... Mine: http://www.geforce.com/hardware/notebook-gpus/geforce-gtx-770m/specifications
Doesn't really matter as I don't do any gaming anymore, but it can play the hell out of some YouTube videos, lemme tell ya.
Gutspiller says... Unfortunately, at least in my scenario, it's not an AMD vs. Nvidia thing. I was on Nvidia before with CoD: Ghosts... still very poor fps, and still the same answer: disable your dual GPUs and the game will run better.
Maybe it's a CoD problem more so than other devs, but really... if one of the biggest titles to launch every year isn't optimizing for dual-GPU cards, there isn't much reason to buy one when big AAA devs aren't going to take advantage of it.
It's like you're throwing your money away buying the second GPU.
I do agree about the drivers, though. I constantly see driver updates for Nvidia cards and sit and wonder where AMD's drivers are.
Single Nvidia GPU for me from here on out.
Too bad those "The Way It's Meant to Be Played" badges mean shit when it comes to devs supporting your company's top-of-the-line cards.
ForgedReality says... Based on reviews, Ghosts was not only a terrible port but just a terrible game in general. EA tends to push their devs to release things quickly rather than with higher quality. I wouldn't doubt that at least contributed negatively to your experience.
Crossfire and SLI can be dicey, though, just because the GPUs have to talk back and forth and stay in sync with each other, which escalates the probability that something goes wrong, either on the game developers' end or in driver support. Single GPUs are plenty powerful these days, so I'm right there with you on the single-GPU approach. I've not gone back to multi-GPU since my 4870X2 many years ago (which usually had issues due to ATI constantly breaking the drivers and then having to fix them again every other update).
Grimm says... Anyone else think of Andy Kaufman when listening to this guy?
Gutspiller says... Yeah, I've been a fan of CoD, but I'm slowly losing interest as they get worse each year, despite putting three different devs on each new release.
Ghosts was published by Activision, but I know what you mean about publishers pushing for a certain release date whether a game is done or not. It seems like the last few years devs have been trying to skip the quality assurance phase, and it's only hurt them and their brand in my eyes.
I'm glad to hear that at least CoD is selling less these days... I can only hope it's enough to show the publisher not to skip the optimization phase of development, and to make sure their game runs well on all different PC setups, not solely making sure it runs smoothly on consoles.