search results matching tag: GPU


The Most Popular Programming Languages - 1965/2020

fuzzyundies says...

As a kid:

- C64 BASIC interpreter
- Pascal

As a teenager/student/intern:

- Perl scripts
- Java
- x86 ASM
- C

20 years later, in video game development:

- C++ (/14, /17) for PC and console game clients
- HLSL for GPU shaders
- Python for support scripts and build systems
- Typescript/JavaScript for web client games
- C# for Unity games

Not me, but some of our backend server guys even use Go.

Capitalism Didn’t Make the iPhone, You iMbecile

bcglorf says...

your contention that ONLY personal profit drives invention or innovation.

I'm afraid I've never argued that; I can lead by agreeing wholeheartedly that such a contention is false.

I merely pointed out that in a video about how 'capitalism didn't create the iPhone', the author's own examples of innovations that led to the iPhone are all 100% from within an economy based on capitalism. My very first post stated clearly that it's not a purely capitalist system, but that it is noteworthy that not one of the examples chosen by the author making his point came from a socialist country.

Can you offer a comparative American/Russian timeline of computer innovations?
Well, I could, actually. If you want to deny the fact that Russia basically halted its computer R&D multiple times in the 70s, 80s and 90s in favour of just stealing American advances because it was so far behind, I can cite examples for you...

And for some unknown to you reason China is beating the ever loving pants off America lately.
1. Factually, no they are not. The fastest network gear, CPU and GPU tech are all based on American research and innovation. America is still hands down leading the field in every category but manufacturing cost, and that isn't for reasons of technological advancement but instead a 'different approach' to environmental and labour regulations.
2. Within the 5G space you alluded to earlier, there is an additional answer. Their 5G isn't 'better' but rather 'cheaper', for the reasons stated in 1. The existence of their 'own' 5G tech isn't because Huawei's R&D caught up so fast through its own innovation. Instead, if you look into the history of network companies, Canadian giant Nortel was giving Cisco a solid run for its money for a time, until it utterly collapsed because of massive corporate espionage that stole almost all of its tech and undercut it on price. China's just using the same playbook as Russia to catch up.

Russia beat America into space

Well, if you want to go down that road the conclusion is that fascism is the key to technological advancement, as America and Russia were largely just pitting the scientists they each captured from the Nazis against one another.

Once again though, my point has never been that only capitalism can result in innovation. Instead, I made the vastly more modest proposal that personal profit from inventions is beneficial to innovation. I further observed that the video author's own examples support that observation, and in that contradict his own conclusion.

newtboy said:

Really? Can you offer a comparative American/Russian timeline of computer innovations, or are you just assuming? Be sure to focus on the pre-'68 era, before American socialism was applied in large part (public funding/monopoly busting).

And for some unknown to you reason China is beating the ever loving pants off America lately....so what's your point? Certainly not that Capitalism always beats socialism, I hope you aren't that deluded. Both have strengths and weaknesses, both ebb and flow. Neither are the sole determining factor for inventiveness, neither has a monopoly on invention.

Russia beat America into space even with their near poverty level economy at the time, and despite the fact that their scientists definitely didn't personally profit from their myriad of inventions required to make it happen.
I'm not arguing which is better; that's like arguing over which color is better... better in what way? I'm arguing against your contention that ONLY personal profit drives invention or innovation. That's clearly a mistaken assumption.

John Oliver - Parkland School Shooting

MilkmanDan says...

Good points.

I'm not a gun nut/advocate, but I have friends who are. I have shot a fairly wide range of guns with them, including an AR-15. For myself, I only ever owned BB guns and a .22 pellet air rifle, for target shooting and varmint control on my family farm. I did go pheasant hunting with borrowed 20 and 12 gauge shotguns a couple of times.

My friend that owns the AR-15 is a responsible gun owner. Do I think he needs it? Hell no. But he likes it. Do I need a PC with an i7 processor and nVidia 1060 GPU? Hell no. But I like it.

So I guess it becomes a question of to what extent the things that we like can be used for negative purposes. My nVidia 1060 is unlikely to be used to facilitate a crime (unless games or bitcoin mining get criminalized). However, even though AR-15s might be one of the primary firearms of choice for murderous wackos, the percentage of AR-15 owners who are murderous wackos is also extremely low.

If banning AR-15s would significantly reduce the rate of mass shootings and/or the average number of deaths per incident, it could be well worth doing even though it would annoy many responsible owners like my friend. ...But, I just don't think that would be the case. Not by itself.

I think we're at a point where we NEED to do something. If the something that we decide to do is to ban AR-15s, well, so be it I guess. But I don't think we'd be pleased with the long-term results of that. It'd be cutting the flower off of the top of the weed. We need to dig deeper, and I think that registration and licensing are sane ways to attempt to do that.

criticalthud said:

In 1934 the Thompson submachine gun was banned partly because of its image and connection to gangsters and the gangster lifestyle.
In the same way, the AR-15 has an image and connection to a different lifestyle: that of the special-ops badass Chuck Norris/Arnold/Navy SEAL killing machine. Then they join a militia, all sporting these military weapons. There's a fuckin LOOK to it. A feel, a code, an expectation there. It's socialized into us.

That image is a big fuckin factor in just how attractive that particular weapon is to a delusional teenager.

Steve Jobs Foretold the Downfall of Apple!

Mordhaus says...

As a former employee under both Jobs and Cook, I can tell you exactly what is wrong with Apple.

When I started with Apple, everything we were concerned with was innovating. What could we come up with next? Sure, there were plenty of misses, but when we hit, we hit big. It was ingrained in the culture of the company. Managers wanted creative people, people who might not have been the best worker bees but who could come up with new concepts easily. Sometimes corporate rules were broken, but if you could show that you were actively working towards something new, then you were OK.

Fast forward to when Cook started running the show: Steve was still alive, but had really taken a back seat. Metrics became a thing. Performance became a watchword. Managers didn't want creative thought; they wanted people who would put their nose to the grindstone and only work on things that headquarters suggested. Apple was no longer worried about innovating, they were concerned with 'maintaining'.

Two examples which might help illustrate further:

1. One of the guys I was working with was constantly screwing around in any free moment with iMovie. He was annoyed at how slow it was at rendering, which at the time was done entirely on the CPU. Did some of his regular work suffer? Yeah. But he was praised because his concepts helped to shift some of the processing to the GPU and allow real-time effects (a rough sketch of that idea follows at the end of this comment). This functionality made iMovie HD 6 amazing to work with.

2. In a different section of the company, the support side, a new manager improved call times, customer service stats, customer satisfaction, and drastically cut down on escalations. However, his team was considered to be:

a. making the other teams look bad

and

b. abusing the use of customer satisfaction tools, like giving a free iPod shuffle (which literally costs a few dollars to make) to extremely upset customers.

Now, they were allowed to do all of these things; no rules were being broken. But Cook was mostly in charge by that point and he was more concerned with every damn penny. So, soon after this team blew all the other teams away for the 3rd month in a row, the new manager was demoted and the team was broken up, to be integrated into other teams willy-nilly.

Doing smart things was no longer the 'thing'. Toeing the line was. Until that changes, nothing is going to get better for Apple. I know I personally left due to stress and health issues from the extreme pressure that Cook kept sending downstream on us worker bees. My job, which I had loved, literally destroyed my health over a year.
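Mordhaus doesn't say how that iMovie change was actually implemented, and Apple's code isn't public, so the following is only a generic sketch of the idea in the first example above: moving a per-pixel effect off the CPU so that every pixel gets its own GPU thread. The kernel, buffer layout and gain value are invented for illustration, with CUDA standing in for whatever GPU path iMovie actually took.

```cuda
#include <cuda_runtime.h>
#include <cstdint>
#include <cstdio>

// Hypothetical per-pixel effect: scale the brightness of one RGBA8 frame.
// On a CPU this is a serial loop over every pixel of every frame; on the
// GPU each pixel gets its own thread, which is what makes real-time
// preview of effects feasible.
__global__ void brighten(uint8_t* rgba, int numPixels, float gain)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= numPixels) return;
    for (int c = 0; c < 3; ++c) {                    // leave alpha untouched
        float v = rgba[4 * i + c] * gain;
        rgba[4 * i + c] = v > 255.0f ? 255 : static_cast<uint8_t>(v);
    }
}

int main()
{
    const int w = 1280, h = 720, n = w * h;          // one 720p frame, RGBA8
    uint8_t* frame = nullptr;
    cudaMallocManaged(&frame, 4ull * n);             // unified memory for brevity
    for (int i = 0; i < 4 * n; ++i) frame[i] = 128;  // stand-in for decoded video

    brighten<<<(n + 255) / 256, 256>>>(frame, n, 1.5f);
    cudaDeviceSynchronize();                         // wait for the effect to finish

    printf("first pixel after effect: %d\n", int(frame[0]));  // 128 * 1.5 = 192
    cudaFree(frame);
    return 0;
}
```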

Terry Crews explains why he decided to build his own PC

LukinStone says...

That's the worst time, the inevitable second-act dilemma of PC building.

You can budget for how long it takes to do the housekeeping stuff: loading the OS, essential programs, personal preferences, the games themselves... but there's often that one random thing.

I built a nice medium-range gaming PC with someone else recently, and my building partner was so excited. It's amazing how much of a bond that creates between people, or how it can strengthen a relationship. Not just for building PCs specifically, but for sharing something and having that moment of realization of how cool the thing you shared really is.

I felt more pissed off than anything for a brief moment during the boot up, when the display seemed to shut down during startup before anything really happened. Luckily, I'd paid enough attention when researching the GPU and eventually remembered someone mentioning there was a button on the card itself that controls its LED lights; pressing it seemed to clear whatever was blocking the card's startup process.

There was definitely a soul-crushing few hours of doubt and agony before I remembered that detail. During that time, I stared at the clean interior of the fully assembled build, having had a hard enough time getting the cords to fit and wondering if something minor and imperceptible had wiggled loose, wondering if I would go mad.

Having someone else depending on the solution was another intense, emotion-heightening element. I'd done my best to prime for this likelihood. I'd shared stories of problems I'd had on previous builds, the random thing that went wrong. I stressed the fact that the computer had always, eventually, got built.

It's a good, stinging bit of humility for me. Even when I try to minimize problems and anticipate potential issues, I'll still miss something as obvious as a big button right in front of my face.

Phreezdryd said:

I can't help but wonder how much fun was had in the unmentioned time between pressing the power button and actually being able to play games.

Star Wars Mos Eisley recreated in Unreal 4

NaMeCaF says...

If you have the specs necessary to run it, you can download it and play it for yourself here. Warning: it's huge (over 7GB).


This project was an experiment to see how much geometry and textures we could push in UE4 on modern PC hardware, and as a result, it is fairly system intensive. We recommend 16GB of RAM and at least an Nvidia GTX760 or AMD equivalent as the minimum requirements.

For more fluid frame rates and/or screen resolutions above 1920x1080, we recommend at least an Nvidia GTX970 or AMD equivalent or higher (2GB of VRAM for Low texture settings / 3GB for Medium texture settings / 4GB for High texture settings / 6GB or higher for EPIC texture settings)

Also, Terrain displacement is a very system heavy feature. Please note that enabling this feature on a GPU slower than an Nvidia GTX980 Ti may cause a significant drop in frame-rate.
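Those tiered VRAM recommendations boil down to a simple capability check. This isn't anything shipped with the demo, just a hypothetical CUDA sketch of picking a texture setting from the installed card's memory; the tier table is copied from the quoted requirements.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// Texture tiers taken from the quoted requirements (minimum VRAM per tier).
struct Tier { const char* name; size_t minBytes; };

int main()
{
    const size_t GB = 1ull << 30;
    const Tier tiers[] = {
        {"EPIC",   6 * GB},
        {"High",   4 * GB},
        {"Medium", 3 * GB},
        {"Low",    2 * GB},
    };

    size_t freeBytes = 0, totalBytes = 0;
    if (cudaMemGetInfo(&freeBytes, &totalBytes) != cudaSuccess) {
        printf("no CUDA-capable GPU found\n");
        return 1;
    }

    printf("GPU reports %.1f GB total VRAM\n", totalBytes / double(GB));
    for (const Tier& t : tiers) {
        if (totalBytes >= t.minBytes) {
            printf("highest recommended texture setting: %s\n", t.name);
            return 0;
        }
    }
    printf("below the demo's 2 GB minimum\n");
    return 0;
}
```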

Computer Nightmares, China USB hub kills PC by design

SDGundamX says...

They have, without doubt, some of the best-engineered laptops on the planet. I have a Macbook and my wife has an Asus Macbook clone (right down to the polished silver finish). And yes, hers cost less and has a dedicated GPU so she could play games on it (if she had any interest in games), but the Macbook is lighter, holds a charge longer, has a much more beautiful display (Retina vs Full HD), is much more comfortable to type on, and the touchpad is just freaking heaven to use. I now hate having to use touchpads on any Windows laptop, even my bootcamped Mac!

And you hit the nail on the head about using the right tool for the right job. I work with video as part of my job sometimes, and I don't think I can ever go back to video editing on a Windows machine. I can do it more easily and faster on an OS X device.

I think also the initial outward simplicity of Mac operating systems makes them ideal for people who don't want to or don't have time to become "computer people" and worry about dealing with downloading the latest drivers or all of the other BS that you need to constantly deal with on a Windows machine. I especially wish my dad, who is constantly calling me and my brother for help with his PC, would just switch over to a Mac, as it would probably solve 95% of the issues he calls us about.

dannym3141 said:

To be fair, Apples are actually really useful for certain jobs. And I don't mean propping tables up or holding doors open.

Star Wars Battle Pod From Concept to Cockpit

LiquidDrift says...

Ughh, it might just be the way they shot the video, but it looks like the framerate is pretty choppy, especially on the landspeeder clip. Why have a giant pod and skimp on the GPU, wtf?

Graphics card woes

ForgedReality says...

Based on reviews, Ghosts was not only a terrible port but just a terrible game in general. EA tends to push their devs to release things more quickly rather than with higher quality. I wouldn't doubt that at least contributed negatively to your experience.

Crossfire and SLI can be dicey, though, just because the GPUs have to know how to talk back and forth and stay in sync with each other, which escalates the probability that something goes wrong either on the game developers' end or in driver support. Single GPUs are plenty powerful these days, so I'm right there with you with the single-GPU approach. I've not gone back to multi-GPU since my 4870x2 many years ago (which usually had issues due to ATI constantly breaking the drivers and then having to fix them again every other update).
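SLI and Crossfire themselves live in the driver (typically as alternate-frame rendering), so there's no game-side API to show, but the coordination problem described above is easy to sketch in CUDA: split one frame's worth of work across two devices and make the host keep them in sync. The kernel and buffer below are invented for illustration; the point is that every extra device is another place where drivers, timing, or the game's own code can fall out of step.

```cuda
#include <cuda_runtime.h>
#include <vector>
#include <cstdio>

// Toy "render" kernel: each GPU fills its half of a float buffer.
__global__ void shade(float* out, int count, float value)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < count) out[i] = value;
}

int main()
{
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);
    if (deviceCount < 2) { printf("need two GPUs for this sketch\n"); return 0; }

    const int n = 1920 * 1080;          // one "frame" of work
    const int half = n / 2;
    std::vector<float> frame(n);

    float* dev[2];
    cudaStream_t stream[2];

    // Launch half of the frame on each GPU.
    for (int g = 0; g < 2; ++g) {
        cudaSetDevice(g);
        cudaStreamCreate(&stream[g]);
        cudaMalloc(&dev[g], half * sizeof(float));
        shade<<<(half + 255) / 256, 256, 0, stream[g]>>>(dev[g], half, float(g));
        cudaMemcpyAsync(frame.data() + g * half, dev[g], half * sizeof(float),
                        cudaMemcpyDeviceToHost, stream[g]);
    }

    // The host is the synchronization point: the frame isn't done until
    // BOTH devices have finished. A stall on either one stalls everything.
    for (int g = 0; g < 2; ++g) {
        cudaSetDevice(g);
        cudaStreamSynchronize(stream[g]);
        cudaFree(dev[g]);
        cudaStreamDestroy(stream[g]);
    }

    printf("frame assembled: %.0f ... %.0f\n", frame.front(), frame.back());
    return 0;
}
```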

Gutspiller said:

Unfortunately, at least in my scenario, it's not an AMD vs nVidia thing. I was on nVidia before with CoD: Ghosts... still very poor fps, and still the same answer: disable your dual GPUs and the game will run better.

Maybe it's a CoD problem more so than with other devs, but really... if one of the biggest titles to launch every year isn't optimizing for dual-GPU cards, there isn't much of a reason to buy one when big AAA devs aren't going to take advantage of it.

It's like you're throwing your money away buying the 2nd GPU.

I do agree about the drivers tho, I constantly see driver updates for nVidia cards, and sit and wonder where AMD's drivers are.

Single nVidia GPU for me from here on out.

Too bad those "The way it's meant to be played" logos mean shit when it comes to the devs supporting your company's top-of-the-line cards.

Graphics card woes

Gutspiller says...

Unfortunately, at least in my scenario, it's not an AMD vs nVidia thing. I was on nVidia before with CoD: Ghosts... still very poor fps, and still the same answer: disable your dual GPUs and the game will run better.

Maybe it's a CoD problem more so than with other devs, but really... if one of the biggest titles to launch every year isn't optimizing for dual-GPU cards, there isn't much of a reason to buy one when big AAA devs aren't going to take advantage of it.

It's like you're throwing your money away buying the 2nd GPU.

I do agree about the drivers tho, I constantly see driver updates for nVidia cards, and sit and wonder where AMD's drivers are.

Single nVidia GPU for me from here on out.

Too bad those "The way it's meant to be played" logos mean shit when it comes to the devs supporting your company's top-of-the-line cards.

ForgedReality said:

It's not dual-GPU per se that's the problem. It's the dual-GPU AMD card that's the problem. AMD drivers are effing terrible. They always have been. Get a dual-GPU nVidia card and everything changes.

Graphics card woes

Chairman_woo says...

I have an R9 280X and, to be honest, I've never really seen it get past about 60% GPU usage and 2-ish GB of VRAM.

However, I'm only running a single 1080p monitor, and I'm not running any kind of upscaling-based anti-aliasing.

The future seems to be 4K monitors and, for the serious psychos, 4K Eyefinity and maybe even that silly Nvidia 3D thing.

When you start to get into anything like that (and 4K will inevitably come down to consumer-level prices), coupled with the recent push for texture resolution in AAA games, all your futureproofing starts to go out the window.

The reason people are pissed off is that this card could have easily seen users through the next few years of monitor and games tech, and they artificially gimped it such that anyone who wants to stay reasonably cutting-edge will have to buy new cards in 2-3 years.

4 GB is fine for now, but it's a joke that a new top-end card would have less VRAM than some mid-range cards from two generations ago. Even my 280X has 3 GB.

Long story short: resolution eats VRAM for breakfast, and resolution is where most next-gen game development is likely to be focused. It's frustrating, but as some others have suggested, it's really nothing new.
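To put a rough number on "resolution eats VRAM for breakfast", here is an illustrative back-of-the-envelope sketch (the buffer set and byte counts are assumptions, not any particular engine's real figures): the fixed cost of screen-sized buffers alone scales linearly with pixel count, before a single texture is loaded.

```cuda
#include <cuda_runtime.h>
#include <cstdio>

// Rough, illustrative arithmetic only: a real engine's render-target set,
// formats and compression will differ.
int main()
{
    struct Res { const char* name; long long w, h; } resolutions[] = {
        {"1080p", 1920, 1080},
        {"1440p", 2560, 1440},
        {"4K",    3840, 2160},
    };

    // Assumed per-pixel cost of screen-sized buffers alone: a double-buffered
    // 4-byte color target, a 4-byte depth/stencil buffer, two 8-byte HDR targets.
    const long long bytesPerPixel = 2 * 4 + 4 + 2 * 8;   // 28 bytes

    for (const Res& r : resolutions) {
        double mb = r.w * r.h * bytesPerPixel / (1024.0 * 1024.0);
        printf("%-6s %lldx%lld  ~%.0f MB just in screen-sized buffers\n",
               r.name, r.w, r.h, mb);
    }

    // Compare against whatever card is installed (if any).
    cudaDeviceProp prop;
    if (cudaGetDeviceProperties(&prop, 0) == cudaSuccess) {
        printf("installed GPU: %s with %.1f GB VRAM\n",
               prop.name, prop.totalGlobalMem / (1024.0 * 1024.0 * 1024.0));
    }
    return 0;
}
```

Going from 1080p to 4K quadruples the pixel count and therefore that fixed cost, and the higher-resolution textures that usually accompany a 4K target stack on top of it.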

BoneRemake said:

@lucky760 What are you running ?

I have a nicely working Radeon R7 760 2GB. Works aces for me; none of this hoo-ha the apparent story seems to be.

Graphics card woes

ForgedReality says...

It's not dual-GPU per se that's the problem. It's the dual-GPU AMD card that's the problem. AMD drivers are effing terrible. They always have been. Get a dual-GPU nVidia card and everything changes.

Gutspiller said:

The sad thing is, as an owner of an AMD 7990 (dual GPU), seeing that most of today's games run horribly with a dual-GPU setup because the devs have not taken the time to code for it, on my next upgrade I will be forced to go to a single-GPU card.

So either way, both companies have a darker side that isn't talked about as much as it should be in reviews.

Example: i7 @ 4.2GHz, 16GB RAM, AMD 7990 GFX card, CoD: AW at 1920x1080 will drop to 7fps... IN THE MENU (and it's not just CoD). I've seen multiple new releases run like crap.

Talk to support, and the answer is to disable Crossfire. Why pay for a dual GPU if no new games support it, because their biggest audience runs 1 GPU?

I wish a new company would knock both AMD and nVidia on their ass, and take over with better tech, faster cards, and not be so sleazy. (One can dream.)

Graphics card woes

Gutspiller says...

The sad thing is, as an owner of an AMD 7990 (dual GPU), seeing that most of today's games run horribly with a dual-GPU setup because the devs have not taken the time to code for it, on my next upgrade I will be forced to go to a single-GPU card.

So either way, both companies have a darker side that isn't talked about as much as it should be in reviews.

Example: i7 @ 4.2GHz, 16GB RAM, AMD 7990 GFX card, CoD: AW at 1920x1080 will drop to 7fps... IN THE MENU (and it's not just CoD). I've seen multiple new releases run like crap.

Talk to support, and the answer is to disable Crossfire. Why pay for a dual GPU if no new games support it, because their biggest audience runs 1 GPU?

I wish a new company would knock both AMD and nVidia on their ass, and take over with better tech, faster cards, and not be so sleazy. (One can dream.)

Graphics card woes

00Scud00 says...

I have no particular allegiance to either brand and have bounced back and forth between the two for years. This is my first Nvidia card in a while; the main reason for buying ATI cards came down to price for me. Why spend hundreds more when I could get a comparable ATI card with more VRAM to boot? That changed with the 970: I could get a better GPU with another gig of VRAM over my Radeon 7970. Or is that only half a gig now?

Ralgha said:

3.5 GB! LOL. This is so good. And no, I don't own a 970. But I'd take one over an AMD GPU any day.
