What is a Nanosecond?

Watch computer scientist Grace Hopper demonstrating the concept of a nanosecond.

Rear Admiral Grace Murray Hopper (December 9, 1906 – January 1, 1992) was an American computer scientist and United States Navy officer.

A pioneer in the field, she was one of the first programmers of the Harvard Mark I computer and developed the first compiler for a computer programming language. She conceptualized the idea of machine-independent programming languages, which led to the development of COBOL, one of the first modern programming languages.

She is credited with popularizing the term “debugging” for fixing computer glitches (motivated by an actual moth removed from the computer). Because of the breadth of her accomplishments and her naval rank, she is sometimes referred to as “Amazing Grace.” The U.S. Navy destroyer USS Hopper (DDG-70) was named for her, as was the Cray XE6 “Hopper” supercomputer at NERSC.

-GeeksAreSexy
deathcow says...

Modern computers clocked upwards of 4 GHz now have clock transitions happening on the scale of picoseconds. Transmitting these signals around a circuit board is no longer simple! Many special design considerations go into moving the data around. Traces on the circuit boards are laid out as carefully impedance-matched transmission lines. Imagine that! Sitting at some microchip while one clock transition sweeps past you and another clock wavefront is coming in just a couple of inches away.
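To put rough numbers on that, here is a back-of-the-envelope sketch in Python, assuming a 4 GHz clock and a typical FR-4 trace propagation speed of about half the speed of light (both illustrative values, not taken from the video or the comments):

    # How far a signal travels during one clock period at 4 GHz.
    # The PCB propagation speed (~0.5 c on FR-4) is an assumed typical value.
    C = 299_792_458               # speed of light in vacuum, m/s
    TRACE_SPEED = 0.5 * C         # rough signal speed on an FR-4 trace, m/s (assumption)
    clock_hz = 4e9                # 4 GHz clock
    period_s = 1 / clock_hz       # one clock period, seconds

    print(f"Clock period: {period_s * 1e12:.0f} ps")                                  # ~250 ps
    print(f"Light in vacuum per period: {C * period_s * 100:.1f} cm")                  # ~7.5 cm
    print(f"Signal on a PCB trace per period: {TRACE_SPEED * period_s * 100:.1f} cm")  # ~3.7 cm

At 4 GHz, successive clock edges are only a few centimetres apart on the board, which is why the traces have to be treated as transmission lines rather than ideal wires.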

MonkeySpank says...

This is why we have on-processor cache (L2, L3) in the first place. Even at the speed of light, the distance traveled between the processor and motherboard memory, when done a few billion times a second, starts to become a limiting factor on processing speed.
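As a rough illustration of that point, the sketch below assumes a ~10 cm CPU-to-DIMM distance and a ~70 ns main-memory latency; both are ballpark figures, not measurements from any particular machine:

    # The round trip from the CPU to DRAM on the motherboard,
    # counted in 4 GHz clock cycles. Distances and latencies are assumptions.
    C = 299_792_458                  # speed of light, m/s
    distance_m = 0.10                # ~10 cm from CPU socket to a DIMM (assumption)
    clock_hz = 4e9                   # 4 GHz clock

    round_trip_s = 2 * distance_m / C
    print(f"Round trip at light speed: {round_trip_s * 1e9:.2f} ns "
          f"(~{round_trip_s * clock_hz:.1f} clock cycles)")

    dram_latency_ns = 70             # a full DRAM access is far slower than the flight time alone (assumption)
    print(f"A ~{dram_latency_ns} ns DRAM access costs ~{dram_latency_ns * 1e-9 * clock_hz:.0f} cycles")

Even the pure speed-of-light round trip eats a few cycles, and a full DRAM access costs hundreds, which is exactly why frequently used data is kept in on-chip caches.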

bmacs27 says...

Latency's a bitch, ain't she?

It's interesting: people always get all worked up over bandwidth, but 99 times out of 100 it's the long latencies you're actually noticing.

>> ^MonkeySpank:

This is why we have on-processor cache (L2, L3) in the first place. Even at the speed of light, the distance traveled between the processor and motherboard memory, when done a few billion times a second, starts to become a limiting factor on processing speed.
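To put rough numbers on that latency-versus-bandwidth point, here is a small sketch; the link speed, payload size, and round-trip time are assumed illustrative values:

    # Transfer time vs. round-trip latency for a small request.
    # All numbers are assumptions chosen for the example.
    bandwidth_bps = 100e6        # 100 Mbit/s link (assumption)
    payload_bytes = 10_000       # a 10 kB response (assumption)
    rtt_s = 0.050                # 50 ms round-trip time (assumption)

    transfer_s = payload_bytes * 8 / bandwidth_bps   # time spent actually moving the bits
    print(f"Transfer time: {transfer_s * 1e3:.2f} ms")          # ~0.8 ms
    print(f"Round-trip latency: {rtt_s * 1e3:.0f} ms")          # 50 ms
    print(f"Latency dominates by ~{rtt_s / transfer_s:.0f}x")   # ~60x

Doubling the bandwidth barely changes the total here; shortening the round trip would.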

skinnydaddy1 says...

Latency - the amount of time from the moment the user first logs in to when real work is started.
E.g., Jim logged in at 7am, but after checking email, Facebook, the news, and then the lunch menu, he did not start on the presentation until around 11:30am.
