search results matching tag: NVIDIA


Grilling Food on my Laptop....big mistake.

Mordhaus says...

One of them, yes. I think we had 2 or 3 battery recalls for the MacBooks and one hell of a recall for the MacBooks with nVidia graphics. The graphics one was horrendous; I didn't have to deal with the support calls once we nailed it down and they stopped escalating them to tier 3, but god, the call volume was out of this world for the tier-1 folks.

People don't realize it, but I can tell you that for the years I worked there (from 2005 to 2011), we used to joke that we should change the catchphrase to "It just doesn't work." Of course, a lot of the problems were because end users were trying to use products in a way they weren't meant to be used.

ant said:

That's nuts. Is this the battery model that Apple recalled?

Photoshop experts use Photoshop 1.0

spawnflagger says...

I mostly use Corel PhotoPaint because it came with the CorelDRAW suite X5 (they're up to X7 now). I've always liked Draw better than Illustrator, but I'm not a big fan of PhotoPaint either. I do have Adobe CS5.5 on my work laptop, but I swear any time I need to do something I think is straightforward, it takes 20 minutes of googling and how-tos to get it done.

Gimp is so-so, and Pixlr was fun to play with until the annoying ads came. If you have a Wacom tablet (or screen), then Corel Painter is nice. On iPad, the Paper app is really impressive, as is the Dabbler app that comes with the nVidia Shield Tablet. Lastly, I'll give a shout-out to Microsoft for the Fresh Paint app on Win8.1 tablets (be careful about your save files though).

ant said:

Ditto. 8.10 on my very old, updated Windows XP Pro SP3 machine! What do you use now on the newer OSes? I used Paint.net on 64-bit W7 at work. It was OK.

Graphics card woes

Gutspiller says...

Unfortunately, at least in my scenario, it's not an AMD vs. nVidia issue. I was on nVidia before with CoD: Ghosts... still very poor fps, and still the same answer: disable your dual GPUs and the game will run better.

Maybe it's a CoD problem more than other devs, but really... if one of the biggest titles launched every year isn't optimized for dual-GPU cards, there isn't much reason to buy one, because the big AAA devs clearly aren't taking advantage of it.

It's like you're throwing your money away buying the 2nd GPU.

I do agree about the drivers though; I constantly see driver updates for nVidia cards and sit and wonder where AMD's drivers are.

Single nVidia GPU for me from here on out.

Too bad that "The way it's meant to be played" badge doesn't mean shit when it comes to devs supporting your company's top-of-the-line cards.

ForgedReality said:

It's not dual-GPU that's the problem. It's dual-GPU AMD card that's the problem. AMD drivers are effing terrible. They always have been. Get a dual-GPU nVidia card and everything changes.

Graphics card woes

Chairman_woo says...

I have an R9 280x, and to be honest I've never really seen it get past about 60% GPU usage or 2-ish gigs of the VRAM.

However, I'm only running a single 1080p monitor, and I'm not running any kind of upscaling-based anti-aliasing.

The future seems to be 4K monitors and, for the serious psychos, 4K Eyefinity and maybe even that silly Nvidia 3D thing.

When you start to get into anything like that (and 4K will inevitably come down to consumer prices), coupled with the recent push for texture resolution in AAA games, all your futureproofing starts to go out of the window.

The reason people are pissed off is that this card could have easily seen users through the next few years of monitor and games tech, and they artificially gimped it such that anyone who wants to stay reasonably cutting edge will have to buy new cards in 2-3 years.

4 gigs is fine for now, but it's a joke that a new top-end card would have less VRAM than some medium-weight cards from two generations ago. Even my 280x has 3.

Long story short, resolution eats VRAM for breakfast, and resolution is where most of the next-gen game development is likely to be focused. It's frustrating, but as some others have suggested, it's really nothing new.
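The "resolution eats VRAM" point can be roughed out with back-of-the-envelope arithmetic: a 32-bit render target scales linearly with pixel count, and a frame typically needs several such buffers (color, depth, G-buffer, AA resolve). The buffer count and bytes-per-pixel below are illustrative assumptions, not measurements of any particular engine:

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=6):
    """Approximate render-target memory for one frame, in MiB.

    bytes_per_pixel=4 assumes 32-bit RGBA; buffers=6 is a guess at
    how many full-screen targets a modern deferred renderer keeps.
    """
    return width * height * bytes_per_pixel * buffers / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{framebuffer_mib(w, h):.0f} MiB of render targets")
```

Whatever the exact buffer count, the ratio is the takeaway: 4K has four times the pixels of 1080p, so every full-screen buffer (and any supersampling on top) costs four times the memory, before textures are even counted.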

BoneRemake said:

@lucky760 What are you running ?

I have a nicely working Radeon R7 760 2GB. Works aces for me; none of this hoo-ha the apparent story seems to be about.

Graphics card woes

Quboid says...

I own a 970 and I'm not particularly concerned. I think it was an honest mistake by Nvidia which affects them (egg on face) more than it affects me (imperceptible performance loss).

Graphics card woes

ForgedReality says...

It's not dual-GPU that's the problem. It's dual-GPU AMD card that's the problem. AMD drivers are effing terrible. They always have been. Get a dual-GPU nVidia card and everything changes.

Gutspiller said:

The sad thing is, as an owner of an AMD 7990 (dual GPU), and seeing that most of today's games run horribly with a dual-GPU setup because the devs haven't taken the time to code for it, for my next upgrade I will be forced to go to a single-GPU card.

So either way, both companies have a darker side that isn't talked about as much as it should be in reviews.

Example: i7 @ 4.2GHz, 16GB RAM, AMD 7990 GFX card, CoD: AW @ 1920x1080 will drop to 7fps... IN THE MENU. (And it's not just CoD.) I've seen multiple new releases run like crap.

Talk to support, and the answer is: disable CrossFire. Why pay for a dual GPU if no new games support it, because their biggest audience runs one GPU?

I wish a new company would knock both AMD and nVidia on their ass and take over with better tech, faster cards, and less sleaze. (One can dream.)

Graphics card woes

00Scud00 says...

I have no particular allegiance to either brand and have bounced back and forth between the two for years. This is my first Nvidia card in a while; the main reason for buying ATI cards came down to price for me. Why spend hundreds more when I could get a comparable ATI card with more VRAM to boot? That changed with the 970: I could get a better GPU with another gig of VRAM over my Radeon 7970. Or is that only half a gig now?

Ralgha said:

3.5 GB! LOL. This is so good. And no, I don't own a 970. But I'd take one over an AMD GPU any day.


Mac Pro No (Funny)

spawnflagger says...

I like this commercial, but Boxx workstations are basically just rackmount servers with nVidia Quadro/Tesla cards and a remote PCoIP card. If money is no object, I can build you an 80-core workstation with 2TB of RAM and 200TB of storage (probably about 6kW of power required). You'd have to run Windows Server 2012 to support that much RAM though (8.1 Pro only supports 512GB).
EDIT: with the recently released Xeon E7 v2 you can have 8x 15-core, so 120 cores, and 4TB of RAM...


Most of their arguments are ATI vs nVidia and Windows vs Mac OS X.

I got to unbox a Mac Pro two weeks ago; they are quite nice, small, and easy to open (flip one switch, then lift). What's different about seeing one in person is how shiny it is: all the images made it seem kinda flat black, but it's very glossy. The internal SSD could sustain 900+ MB/s (both read and write) on the Blackmagic benchmark tool. Really impressive. Attach a Thunderbolt Drobo and you're set for storage capacity.

Titanfall Gameplay video @ 1440p

RFlagg says...

It was one of the best games I've ever played. Certainly the best shooter. That PC beta period didn't last nearly long enough, and I can't wait to play it again. I had an i3 530 and an nVidia 450 GTS at the time and still got fairly good graphics on my 1600x900 monitor... By March 11th I should have a new i7 4770K, an nVidia GTX 770, and a 1080p monitor, so win there... Quad Titans on the system shown, though, is insane... And can the Xbox One even do the game at 1080p, or was it locked to 720p like most other titles? Which is the crazy part... I mean, I might get an Xbox One Titanfall edition since you get the game and a headset for no additional cost, and Destiny sadly isn't coming to PC for some odd reason.

the large pixel collider PC gamer

ChaosEngine says...

That $5k Nvidia card is a Quadro. It's a workstation card, designed more for CAD or 3D visualization. I'm not sure the difference would actually be that big for games.

I was kinda disappointed they didn't go SSD for the storage, but holy roller-blading christ..... quad Titans!!!

Orz said:

As near as I can figure out on the Digital Storm website, a rig like this will cost about $17,236. The Titans are about $1k each but Nvidia has a card that's $5k.

the large pixel collider PC gamer

Orz says...

As near as I can figure out on the Digital Storm website, a rig like this will cost about $17,236. The Titans are about $1k each but Nvidia has a card that's $5k.
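A quick sanity check on those numbers: at roughly $1k per Titan, the four GPUs are only about a quarter of the quoted build price. Both figures come straight from the comment; nothing else is assumed:

```python
# Figures as quoted: ~$17,236 total rig, ~$1k per Titan, four Titans.
rig_total = 17_236
titan_price = 1_000
gpus = 4 * titan_price

print(f"GPUs: ${gpus}, rest of rig: ${rig_total - gpus}")
print(f"GPU share of total: {gpus / rig_total:.0%}")
```

So even in a quad-Titan showpiece, the GPUs are about 23% of the bill; the other ~$13k is CPU, RAM, storage, chassis, and boutique-builder markup.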

Photo-Realistic Virtual World Rendered LIVE server-side

Reactions and some Ingame-Footage of the Occulus Rift

bmacs27 says...

Honestly, I'm pessimistic. I work with virtual reality for a living, and I find head-mounted displays (even premier research-quality head mounts with top-notch head/body tracking) to be a shitty experience. I'd rather see Nvidia's 3D glasses done well. My main complaint is the very short optical distance between your eyes and the display.

Even with the latency thing, he kind of sets up and breaks down his own argument. He says 2ms samples, so latency should be great! Then he says that they integrate over those samples to get an accurate measure... well, there goes your low latency.
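bmacs27's objection can be made concrete: fast samples alone don't give low latency once you average a window of them, because an N-sample moving average has a group delay of about (N - 1) / 2 samples. The 2 ms sample period comes from the comment; the window sizes below are assumptions for illustration:

```python
SAMPLE_MS = 2.0  # sample period claimed in the talk, per the comment

def avg_latency_ms(window):
    """Group delay of an N-sample moving average, in milliseconds."""
    return (window - 1) / 2 * SAMPLE_MS

for n in (1, 8, 32):
    print(f"window={n:>2}: ~{avg_latency_ms(n):.0f} ms of added tracking latency")
```

With a single sample there is no averaging delay, but integrating 32 samples for accuracy adds roughly 31 ms, which is exactly the trade-off the comment is pointing at.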


