search results matching tag: GPU


Videos (23) | Sift Talk (0) | Blogs (0) | Comments (86)

A "give Steve Jobs your money" tribute video.

dag says...


I think you are conflating two different platform arguments. The Mac is not the iPad. If you're saying that OpenGL does not have direct access to the GPU on Macs, that's incorrect.

It's actually kind of funny that, for you, the Mac is a "closed platform" for using an open standard like OpenGL, compared with Microsoft's VERY proprietary DirectX.

But yeah, Apple's all closed and shit - boo, Steve Jobs.

>> ^ForgedReality:
>> ^dag:
I'm finding Portal to be fast and good-looking on my MBP.
>> ^ForgedReality:
At first, I was like, "OH LOOK! MACS CAN PLAY STEAM GAMES NOW! WOOHOOO!"
Then I was like, "Fuck. This is slow as shit! I'm going back to playing on my Windows PC."


Perhaps for a Mac, but the problem is that it uses OpenGL, which Valve's games are not as well optimized for. A Windows machine with similar specs could run at higher settings with a much better framerate and better visuals.
The last time I played an OpenGL game (which, granted, has been some time), there were quite a few extensions that could not be ported over from DirectX, as there just was no OpenGL counterpart. I'm sure things have improved since then, though.
But the problem is Apple's clenched fist. It's a shame that Ol' Steve-o has such an extreme closed-platform mindset, because, really, all it does is harm his users' experience and cause problems such as incompatibility. Were he more open to outside influence, any such problems could easily be worked out in short order, since Macs generally have a fairly standardized set of hardware.

Agenda Circling Forth: Stunning demoscene competition winner

Homeworld 2 - The Inspiration for Battle Star Galactica?

YouTube allows full 1080p HD (Sift Talk Post)

Impressive NVIDIA CUDA Tech Demo

Safari & VideoSift bad bedfellows (Sift Talk Post)

berticus says...

Flash is so broken these days. It used to be able to use my GPU, but not anymore. Now my CPU goes spastic trying to take over, so much so that I had to download a manual fan controller proggy just to keep it from overheating. There are oddities, too: only certain videos cause the CPU to go nuts (codec related, perhaps?), and there are wild memory leaks. I thought it was Firefox for a while, before getting the same problem in every browser. I miss Macromedia. Fuck Adobe.

Impressive NVIDIA CUDA Tech Demo

XBOX 360 Project Natal: No Controllers required

Edgeman2112 says...

You can't compare the success of the Wii to this.

The Wii is a severely crippled console compared to the Xbox and PS3. Developers have an increasingly difficult time developing good-looking games for it. 90% of the artistic style of Wii games has to do with the lack of any real shader support on the GPU; anything more than that would tank the framerate.

Mythbusters Paint the Mona Lisa with Paintballs

OnLive - Play Any Game On Any Mac PC With Full Performance

dgandhi says...

If they have their machines spread out enough that their customers have < 50ms round trips through the net, then I don't see a problem. As long as they have graphics cards that can generate the HD data stream on the GPU and send it to some form of super-fast (single-digit-ms turnaround) digital video routing system, the video lag should not be a problem. I have seen lag comparisons between CRT and LCD, and it is a hell of a lot more than 50ms.

I don't have a 5Mbps connection, so I don't really know how big their potential customer base is; as it is, I could not even get 480i from their service with my DSL.

Paying for more than 1Mbps seems to me to just be subsidizing services like this. If I were inclined to use half a dozen or more such services, then it might make sense, but as it stands I don't see paying $50 more per month just to have the option to pay for a service like this; that kind of money can buy me games and consoles pretty quickly.
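
To put rough numbers on the latency and bandwidth points above, here is a small back-of-the-envelope sketch in Python. The 50ms round trip and 5Mbps stream rate come from the comment; the encode/decode times, the 60fps server frame rate, and the hours-played figure are illustrative assumptions rather than measured OnLive numbers.

    # Rough, illustrative latency and bandwidth budget for a cloud-gaming
    # stream. All numbers are assumptions except the 50ms round trip and
    # the 5Mbps stream rate mentioned in the comment above.

    network_rtt_ms = 50          # round trip to the data center (from the comment)
    encode_ms = 10               # assumed server-side video encode time
    decode_ms = 10               # assumed client-side decode time
    render_ms = 1000 / 60        # one frame at an assumed 60fps on the server GPU

    added_lag_ms = network_rtt_ms + encode_ms + decode_ms + render_ms
    print(f"Added input-to-display lag: ~{added_lag_ms:.0f} ms")

    # Bandwidth side: what a sustained 5Mbps stream adds up to.
    stream_mbps = 5
    hours_played = 40            # assumed hours of play per month
    gigabytes = stream_mbps / 8 * 3600 * hours_played / 1000
    print(f"{hours_played} h of a {stream_mbps} Mbps stream is roughly {gigabytes:.0f} GB")

Under those assumptions the service adds on the order of 90ms between input and display, which is why the sub-50ms network requirement matters so much.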

So what's your rig? (Videogames Talk Post)

jimnms says...

I tend to do a major upgrade or build every two years, but so far this system has kept me going for 3 years with only two minor upgrades (GPU & HD).

ASUS A8N-SLI Premium
AMD 64 X2 4400+ running @ 2.4GHz
2GB OCZ RAM 2-2-2-5T
BFG GeForce GTX 260 (655 MHz core, 2250 MHz memory)
SB Audigy 2
250GB SATA II HD
400GB SATA II HD
DVD-R and DVD+RW

As for upgrades, I originally had only a 250GB HD, then added the 400GB later, and I recently upgraded from a GeForce 7800 GT to the GTX 260. I was thinking I'd need to upgrade my motherboard, CPU, and RAM since my motherboard and CPU are Socket 939, but after installing the video card I can run everything I have maxed out at 1680x1050, so I think I'm good for a while. I'll see after November, when all the new games start coming out.

It's also been very stable; I have not reinstalled WinXP since the day I built it.

Flash 10 is here

MythBusters: CPU vs GPU.. or Paintball Cannons are Cool!!!

10874 says...

Excellent video!

In the last year, I was getting bad performance in my games and needed an upgrade, so I went from an Athlon 64 3500+ (2.2 GHz single core) to a Core 2 Duo E6750 (2.66 GHz dual core, 1333 MHz FSB). My video card at the time was a Radeon X1600 Pro PCI-E 512 MB (DDR2 400).

I then bought a new tower and transferred the hardware over. What happened was very interesting.

In Frontlines: Fuel of War (Unreal Engine 3), I had been running at minimal detail on my Athlon. With the Core 2, the game detected my new processor and decided to raise certain detail levels that are not adjustable through the in-game menu, resulting in crappier performance than I had been getting. I had to go into the config files and manually adjust several settings to get a playable framerate (see the sketch below).

This shows that GPUs are much more important to gaming than people think. My old video card had set a ceiling on my system's gaming performance; now, with a 9800 GT, I run Crysis in high detail with silky-smooth framerates.

Basically, you are better off with a not-so-great CPU and a decent video card than the other way around.

If, at the time, I had had to choose between a new CPU and a new video card, I'd have been better off with the GPU.
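
The config-file workaround mentioned above is game-specific, but UE3-based titles generally keep quality overrides in an INI file with a [SystemSettings] section. Below is a minimal, hypothetical Python sketch of that kind of edit; the file name and the individual keys are assumptions for illustration and will differ from game to game, so treat it as a sketch rather than a recipe for Frontlines specifically.

    # Hypothetical sketch: lowering a few quality settings in a UE3-style
    # INI file. The path and key names below are illustrative assumptions;
    # check the game's own Engine.ini for the real ones and back it up first.
    import configparser

    INI_PATH = "FFOWEngine.ini"      # assumed file name, varies per game

    cfg = configparser.ConfigParser(strict=False, allow_no_value=True)
    cfg.optionxform = str            # preserve the CamelCase key names
    cfg.read(INI_PATH)

    overrides = {                    # assumed keys, typical of UE3 games
        "DetailMode": "1",
        "DynamicShadows": "False",
        "MotionBlur": "False",
    }

    if not cfg.has_section("SystemSettings"):
        cfg.add_section("SystemSettings")
    for key, value in overrides.items():
        cfg.set("SystemSettings", key, value)

    with open(INI_PATH, "w") as fh:
        cfg.write(fh)
    print("Wrote", len(overrides), "overrides to", INI_PATH)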

MythBusters: CPU vs GPU.. or Paintball Cannons are Cool!!!

Quboid says...

It's a good video, but it is misleading. This was at an nVidia-sponsored event, correct? Generally, upgrading your GPU is a good help if you're a gamer (and if you're watching this, you probably are), but this does sort of suggest that GPUs are just far faster than CPUs, and it's not that black and white. Technically it's right in that GPUs will do this quicker, but only because these tasks are very simple and there are a large number of them. If you were trying to model a physics theory, a CPU will generally be better at taking that one single task, chewing on it, and spitting out one single result, whereas the 196 processors on my new nVidia GTX 260 would spend all their time waiting for some other processor to finish a calculation whose result they need. Graphics is different: I game with about 2 million pixels that need processing at least 40 times a second, so each processor can largely work on its own batch of pixels.

The right coder with the right compiler can use GPUs for some funky stuff, and CPUs will continue to increase the number of cores on each chip too. I've heard of some research places using game consoles, as both their CPU and their GPU are dirt cheap (since MS or Sony is footing the bill, expecting to make it up from commission on game sales).

Anyway the "cpu" had a speed control on it. For all we know they only ever put it up to 3. Crank it to 11 baby!

MythBusters: CPU vs GPU.. or Paintball Cannons are Cool!!!


