search results matching tag: 10K


Numberphile - The Fatal Flaw of the Enigma Code Machine

radx says...

Edit: Oh boy, wall of text crits for 10k.

His explanation was rather short and somewhat misleading. Maybe they thought a proper explanation would have been too dry or too lengthy to be of interest to a sufficient number of their viewers.

tl;dr

If the rotor settings are correct, a feedback loop within the circuit indicates a subset of correct connections on the plugboard, even if the initially assumed connection turns out to be wrong. It doesn't show all connections, but enough to run the candidate through a modified Enigma and determine whether it's a false positive or in fact the correct setting. If it was correct, the rest could be done by hand.

----------------------- Long version -----------------------

Apologies in advance. We had to recreate parts of the Bombe as a simulation, but a) it's been a while and b) it was in German. I'll try to explain the concept behind it, hopefully without screwing it up entirely.

The combination of clear message and code snippet (2:25) is called a crib. This can be used to create a graph, wherein letters are the vertices and connections together with their numerical positions are the edges.

For example, at position 1, "A" corresponds to "W". So you'd create an edge between "A" and "W" and mark that edge as "1". At position 4, "B" corresponds to "T", so there's the edge marked as "4". All edges are bidirectional, the transformation at a specific position can go either way.

Once your graph is finished, you check for loops. These are essential. Without loops, you're boned. In this case, one loop can be found at positions 2,3,5 in form of "T->E->Q->T".
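A toy sketch of that graph construction and loop search (the edge list below is pieced together from the examples in this comment, not the full crib from the video):

```python
from collections import defaultdict

# Edges as (letter, letter, position), taken from the examples above:
# A-W at 1, the T-E-Q loop at 2/3/5, B-T at 4.
edges = [("A", "W", 1), ("T", "E", 2), ("Q", "T", 3), ("B", "T", 4), ("Q", "E", 5)]

graph = defaultdict(list)
for a, b, pos in edges:
    graph[a].append((b, pos))   # bidirectional: the Enigma transform at a
    graph[b].append((a, pos))   # given position works the same both ways

def find_cycle(start):
    """DFS from `start`; return the first loop back to `start`, if any."""
    stack = [(start, [start], set())]
    while stack:
        node, path, used = stack.pop()
        for nxt, pos in graph[node]:
            if pos in used:
                continue                 # each crib position is one edge
            if nxt == start and len(path) > 2:
                return path + [start]
            if nxt not in path:
                stack.append((nxt, path + [nxt], used | {pos}))
    return None

print(find_cycle("T"))  # → ['T', 'Q', 'E', 'T'] (the T-E-Q-T loop)
print(find_cycle("A"))  # → None (no loop through "A"; you're boned)
```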

Here the Bombe comes into play. It uses scramblers, each combining all three rotors plus the reflector of an Enigma into one segment. This way, one Enigma setting is functionally equal to a single scrambler.

Now you can use those scramblers to create an electrical circuit that corresponds to your graph -- scrambler = edge. All scramblers are set to the same initial configuration. The first scrambler remains in the initial configuration, while the second and third get configurations offset in relation to their edge's numerical value. Configuration in this case means the position of the internal three rotors, so there are 26*26*26 possible settings within each scrambler.

It's basically a sequence of three encryptions.

Example: in our little TEQ triangle, the first scrambler (TE, 2) gets a random starting position. The second scrambler (QE, 5) gets turned three notches, the third scrambler (QT, 3) gets turned one notch. The initial configuration might be wrong, but only the relation between the scramblers matters. A wrong result simply tells you to turn all scramblers another notch, until you get it right.

You have a possibly correct setting when the output matches the input. Specifically, a voltage is applied to the wire of letter "T", leading into the first scrambler. And on a test register attached to the last scrambler, the wire of letter "T" should have a voltage on it as well. If the setting is incorrect, a different letter will light up. Similarly, all incorrect inputs for this particular setup will always light up a different letter at the end, never the same (thanks to the reflector). If output equals input, you're golden. And if several loops are used, all with the same input/output letter, each of their outputs must equal the input.
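A crude simulation of that scan, with a single shifted substitution standing in for each scrambler. (A real scrambler is three rotors plus reflector with 26^3 settings and different stepping; the base permutation and offsets here are illustrative only, with the relative offsets 0/+3/+1 matching the TEQ example above.)

```python
import random

random.seed(42)
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

# Toy stand-in for a scrambler: one fixed substitution, shifted by the
# rotor offset. Not a faithful Enigma, just enough to show the test.
base = list(ALPHABET)
random.shuffle(base)

def scramble(letter, offset):
    i = (ALPHABET.index(letter) + offset) % 26
    return base[i]

def loop_output(letter, start):
    # The TEQ loop: scramblers at relative offsets 0, +3, +1 notches.
    for rel in (0, 3, 1):
        letter = scramble(letter, (start + rel) % 26)
    return letter

# Turn all scramblers notch by notch; keep settings where output == input.
candidates = [s for s in range(26) if loop_output("T", s) == "T"]
print(candidates)  # the handful of settings worth checking by hand
```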

To reduce the number of false positives, you need as many connected loops within the crib as possible.

So far, that's an Enigma without a plugboard. To account for that, they introduced feedback loops into the circuit. In our small scale case, the output of the third scrambler would be coupled back into the input of the first scrambler. The number of loops determines the number of possible outcomes with each specific setting. All of these are fed back into the first scrambler of each loop.

The plugboard, however, changed the input into the system of rotors. Instead of a "T" in our example, it might be a "Z", if those two letters were connected on the board.

A random hypothesis is made and fed into the machine. If the scramblers are set incorrectly, a different letter comes out at the end of each loop and is in turn fed back into the first scramblers. Result: (almost) everything lights up. With a well-connected graph, everything lights up.

-----
A key element for this was the "diagonal board", which represented a) all possible connections on the plugboard and b) the bidirectional nature of those connections (AB = BA). Maybe it can be explained without pictures, but I sure as hell can't, so "a grid of all possible connections between scramblers and letters + forced reciprocity" will have to suffice.
-----

If, however, the setting was correct, a wrong hypothesis for the input connection merely meant that everything except the right connections was lit up.

Let's say the fixed point of the loops in our graph is the letter "T". We assume that it's connected to the letter "Z" on the plugboard. A voltage is applied to "Z" on the test register, and thereby inserted into the circuit at the first scrambler. Loop #1 applies voltage to the letter "A" on the test register, #2 lights up "B", #3 lights up "F". These three outputs are now fed back into the first scrambler, so now the scrambler has voltage on ZABF, which in turn lights up ZABF+GEK on the test register.
This goes on until everything except "U" is lit up on the test register. That means three things: a) the settings are correct, b) the hypothesis is wrong, c) "T" is connected to "U".
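The deduction step can be mimicked with a toy permutation. Everything here is hypothetical: I'm simply wiring "U" up as the one self-consistent hypothesis and chaining every wrong guess into one big cycle, which is the kind of structure the reciprocity of the real circuit produces at a correct rotor setting.

```python
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

# Stand-in for the whole circuit at the *correct* setting: the feedback
# loops permute the 26 plugboard hypotheses for "T". The true partner
# "U" maps to itself; all 25 wrong guesses form a single cycle.
others = [c for c in ALPHABET if c != "U"]
perm = {a: b for a, b in zip(others, others[1:] + others[:1])}
perm["U"] = "U"

def energise(guess):
    """Propagate voltage from one hypothesis until nothing new lights up."""
    live = {guess}
    while True:
        new = {perm[c] for c in live} - live
        if not new:
            return live
        live |= new

print(sorted(energise("U")))  # ['U'] -- correct hypothesis: one live wire
print(len(energise("Z")))     # 25 -- wrong guess: everything but "U" lights up
```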

Reasons:
a) if the settings were incorrect, the entire register would be alive
b) if the hypothesis was correct, only the letter "Z" would be alive on the register
c) due to the feedback loop, the only way for the output to be "U" is if the input was also "U", and the reciprocity within the system makes it impossible for any other input to generate the output "U". Since "T" was the fixed point for our loops, "T" is connected to "U".

Similarly, if the initial hypothesis is correct, everything on the test register except "U" stays dead.

The diagonal board provides registers for every single letter and allows the user to pick one as a test register. During operation, all the other registers serve as visual representations of the deductions based on the initial hypothesis. So you actually get to see more than just the initial connection, all based on the same concept.

rychan said:

I do not understand at all why finding one contradictory plug setting, e.g. (t a) and (t g), means that every other plug setting you found during that trial was wrong. That cannot possibly be true. The space of possible plug connections (on the order of 26*25) is too small. You've probably got millions of trials that end in conflicting plug settings. You would end up invalidating all of them. I must be misunderstanding what he was trying to say.

Jon Stewart on Gun Control

RedSky says...

@jimnms

I'll address by paragraphs:

(1)

The reason I suggested that you are implying that the US is more violent by nature is because statistically it is far more murderous than a country of its socio-economic development should be. Have a look at Nationmaster tables of GDP/capita and compare that to murders/capita in terms of where the US sits.

If we take the view that you are suggesting we should simply reduce violence globally, then that is a laudable goal, but it would suggest that the US is abysmally failing at this currently. I happen to believe the reason is gun availability. I see no reason to believe this abysmal failure comes from gross police incompetence or any other plausible factor; rather it is the gun ownership and availability that sticks out like a sore thumb when compared to other countries such as those in the G8.

(2)

I think we would both agree that there are more gun enthusiasts in rural areas. Many of those would also own collections of guns for recreation rather than merely what self-protection would require. The article below cites a study from 2007 by Harvard that says 20% own 65% of the nation's guns.

http://www.usatoday.com/story/news/nation/2012/12/19/tragedy-stresses-multiple-gun-ownership-trend-in-us/1781285/

There is no reason to suspect that these people are any more violent than your non gun-owning folk. The issue is not so much ownership levels, but the availability that feeds a would-be criminal's capacity to carry out a crime.

While actual ownership levels might be lower, guns can no doubt be purchased for cheaper and within a closer proximity in densely populated cities. This availability feeds the likelihood of them being employed as a tool to facilitate a crime.

This is also incidentally a key misunderstanding of the whole gun debate. No one is (or should be at least) implying that recreational gun owners are the problem. It is the necessity for guns to be freely available to gun enthusiasts among others for them to enjoy this hobby that causes the problems.

(3)

Building on my point above, gun control shouldn't be seen as a punishment. There is no vindictiveness to it; it is merely a matter of weighing up the results of two courses of action. On the one hand there is the diminished enjoyment of legal and responsible gun owners. On the other hand there is the high murder rate I discussed earlier, which really can't be explained away any other way than gun availability.

Let's do a back of the envelope calculation. Australia and the US are culturally relatively similar Anglo-Saxon societies. Let's assume for the sake of argument that my suggestion is true. Referencing wiki here:

http://en.wikipedia.org/wiki/List_of_countries_by_intentional_homicide_rate

The homicide rate in Australia is 1.0 per 100,000 per year versus 4.8 per 100,000 in the US. Let's say that gun availability explains two thirds of the difference, so we're talking about an increase of roughly 2.5 per 100,000 per year. Taking this against the US's 310M population, this represents around 7,750 more deaths per year.
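The same back-of-the-envelope in code (rates as cited from the wiki list above; the two-thirds attribution is, of course, the argument's own assumption):

```python
aus_rate = 1.0               # homicides per 100,000 per year
us_rate = 4.8                # homicides per 100,000 per year
us_population = 310_000_000

gap = us_rate - aus_rate             # 3.8 per 100,000
attributed = gap * 2 / 3             # assume 2/3 is down to gun availability
excess_deaths = attributed / 100_000 * us_population
print(round(excess_deaths))          # 7853 -- the ~7,500 ballpark above
```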

Now to me, the issue is clear cut. The lives lost outweigh gun enthusiasts' enjoyment.

And it's not just to me. There is a very clear reason that the vast majority of developed countries have made gun ownership incredibly difficult. I can guarantee, at some point they have done this back of the envelope calculation for their own country.

(4)

You raise the comparison to cars. See my workings above. With cars, they obviously provide a fundamentally invaluable benefit to society. The choice every society has made is to instead heavily regulate them. The reason there is no outcry to impose heavy restrictions on them is because there already are.

- Being required to pass license tests.
- Strict driving rules to follow.
- Speeding cameras everywhere.
- Random police checks for alcohol.

Can you think of any further regulations plausibly worth trying with cars that could reduce the accident death rate? I struggle to think of anything else effective that hasn't already been implemented.

With guns there are dozens of options not yet tried.

- Rigorous background checks.
- No gun show exemption.
- Assault weapon restrictions.
- Restrictions of ammo such as cost tariffs.

The list goes on. Imagine if we lacked the regulations we have on cars and there was an NCA (National Car Association) equating the requirement to pass a driving test with tyranny.

(5)

I don't think there's much irrationality here. The US is clearly more murderous than other G8/OECD countries. To me, Occam's Razor explains why.

As for the comment on focusing on tragedies rather than the larger issue, see my previous comment. You're missing the point that it's not just the gun sprees that are the problem; it's the steadily high murder rate. Mass shootings are just blips in this.

(6)

I will have a read through this.

Solar Roadways

newtboy says...

I call shenanigans.
Let's do some math...
5 billion panels at approximately $10K each, at (far less than) today's prices for regular home panels, which may be as low as $500 each when on sale, purchased in bulk, or discontinued models (note his panels are 12'x12', which is approximately 15-20 normal-size panels spliced together). This means JUST the panels for his idea would cost over $50 TRILLION (at the insane discount price I described). This does not include installation (labor and hardware), inverters, wiring, transmission, super-glass coating, heaters, LED setup, .... All this is usually at least 2/3-3/4 the cost of a solar system without super-glass, so let's be stupidly kind again and say his system can be put together at the same cost per watt as a cheap home system (which is insanity; it would cost 10 times that per watt or per square foot at least, probably more like 100 times with the super glass). That comes to >$150 TRILLION. Then you need to completely redesign cars (even the existing electrics) to capture and use the electricity, another huge, ignored cost of this system. Just an educated guess, but a realistic number for this entire nationwide system would likely be well over $3,000 TRILLION. Get real.
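The core of that estimate, spelled out (panel count, price, and the balance-of-system ratio are all the commenter's own rough assumptions):

```python
panels = 5_000_000_000      # claimed number of panels needed to cover US roads
panel_cost = 10_000         # $ per 12'x12' panel -- deliberately low-balled
panel_total = panels * panel_cost
print(f"panels alone: ${panel_total / 1e12:.0f} trillion")   # $50 trillion

# Balance of system (installation, inverters, wiring, ...) is typically
# ~2/3 to 3/4 of a solar install's cost, i.e. panels are ~1/3 of the total.
system_total = panel_total * 3
print(f"whole system: ${system_total / 1e12:.0f} trillion")  # $150 trillion
```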
Also, in what world are most roadways not shaded? It's not the one I live in.
And I guess we won't be driving at night anymore, or should we spend another $100-$1000 trillion to line the roadways with batteries?
They answer the comment 'you'll just slip off when it's wet' with 'it's nearly as strong as steel'? Perhaps they think sliding off the road is fine; I don't.
They imply they think their system is cheaper than asphalt, or soon will be. That's just plain wrong under the best (or worst) of circumstances.
The entire 'pressure plate' idea is just stupid; it ignores the energy used to push the plate down, which is the same as constantly climbing a steep hill, or really more like constantly driving up stairs.
Get your broom people.

Seconds From Disaster : Meltdown at Chernobyl

GeeSussFreeK says...

@radx No problem on the short comment, I do the exact same thing

I find your question hard to address directly because it is a series of things I find complexly contradictory: i.e., market forces causing undesirable things, and the lack of market forces due to centralization causing undesirable things. Not to say you believe in contradictions, but rather it is a complex set of issues that has to be addressed. I was thinking all day about how to address these, and decided on a roundabout way: talking about neither directly, but rather about the history and evolution of why it is viewed the way you see it, and whether those things are necessarily bad. This might be a bit long-winded, and I apologize up front for that.

Firstly, reactors are the second invention of nuclear. While a reactor-type device was the first demonstration of fission by humans (it turns out there are natural fission reactors: Oklo in Gabon, Africa), the first objective was, of course, weapons. Most of the early tech that was researched was aimed at "how to make a bomb, and fast". As a result, after the war was all said and done, those pieces of technology could most quickly be transitioned to reactor tech, even if other technologies were better suited. As a result, nearly all of America's 104 (or so) reactors are based on light water pressure vessels, the result of mostly Admiral Rickover's decision to use them in the nuclear navy. This technological lock-in made the big players bigger in the nuclear field, as they didn't have to do any heavy lifting on R&D, just sell lucrative fuel contracts.

This had some very toxic effects on the overall development of reactor technology. As a result of this lock-in, the NRC is predisposed to only approving technology that resembles 50-year-old reactor technology. Most of the fleet is very old, and all might as well be called Rickover Reactors. Reactors which use solid fuel rods, control rods, water under pressure, etc., are approved; even though there are some other very good candidates for reactor R&D and deployment, it simply is beyond the NRC's desire to make those kinds of changes. These barriers to entry can't be overstated: only the very rich, as in multi-billionaires, could ever afford to attempt to approve a new reactor technology, and it still might not get approved if it smells funny (thorium, what the hell is thorium!). The result is that current reactors use mostly the same innards but face larger requirements. Those requirements also change without notice, and reactors are required to comply with more haste than any other industry. So if you built a reactor to code, and the wire mesh standards changed mid-construction, you have to comply: tear down the wall and start over unless you can figure out some way to comply. This has had a multiplying effect on costs and construction times. So many times, complications arise not because the design was "over-engineered", but because they had to go super ad hoc to make it all work due to changes mid-construction. Frankly, it is pretty amazing what they have done with reactor technology to stretch it out this long. Even with the setbacks you mention, these Rube Goldbergian devices still manage to compete with coal in terms of cost per kWh, and blow away things like solar and wind on the carbon-free front.

As to reactor size, LWRs had to be big back in the day for various reasons, mostly licensing. Currently, there are no real ways to do small reactors because all licensing and regulatory framework assumes a 1GW power station, with all the huge fees and requirements designed around these marvels, well engineered at the time but now ancient. So you need an evacuation plan that is X miles wide (I think it is 10), even if your reactor is a fraction of the size. In other words, there is nothing technical keeping reactors large. I actually would like to see them go more modular, self-regulating, and at the point of need. This would simplify transmission greatly and build redundancy into the system. It would also potentially open up a huge market to a variety of small, modular reactors. Currently, though, this is a pipe dream...but a dream well worth having and pushing for.

Also, reactors in the West are pretty safe: if you look at deaths per kWh, even figuring in the worst estimates of Chernobyl, nuclear is one of the best (and Chernobyl isn't a Western reactor). Even so, safety ratcheting happens in nuclear all the time, driving the costs and complexity of very old systems up and up with only nominal gains. For instance, there are no computer control systems in a reactor. Each and every gauge is a specific type mandated by NRC edict or similar ones abroad (usually very archaic). This creates a potential for counterfeit parts and other actions considered foul by many. These edicts do little for safety; most safety comes from proper reactor design and skillful operation by the plant managers. With plants so expensive, and the general cost of power still very competitive, managers would never want to damage the money output of nuclear reactors. They would very much like to make plant operations a combination of safe, smooth, and affordable. When one of those edges out the others, it tends to invite abuses in the real world. If something gets too needlessly costly, managers start looking around for alternatives. Like the DHS, much of nuclear safety is nuclear safety theater...so to a certain extent, some of the abuses don't amount to any real significant increase in risk. This isn't always the case, but it has to be evaluated case by case, and for the layperson, this isn't usually something that will be done.

This combination of unwillingness to invest in new reactor technology, higher demands on reactors in general, and a single-minded focus on safety (several NRC chairmen have been decidedly anti-nuclear; that is like having the internet czar hate broadband) has stilted true growth in nuclear technology. For instance, cars are not 100% safe. It is likely you will know someone who will die in a car wreck in the course of your life. This, however, doesn't cause cars to escalate that drastically in safety features, or in costs to implement features to drop the death rate to 0. Even though tens of thousands die each year in cars in the US, you will not see well-meaning people call for arresting foam injection or titanium-plated unobtanium body frames, mainly because safety isn't the only point of a car. A car, or a plane, or anything really, has a complicated set of benefits and defects that we have to make hard choices on...choices that don't necessarily have a correct answer. There is a benefit curve where excessive costs don't actually improve safety that much more. If everyone in the USA had to spend 10K more on a car for foam injection systems that saved 100 lives in the course of a year, is that worth it? I don't have an answer there as a matter of fact, only opinion. And as the same matter of opinion on reactors: most of their cost, complication, and centralization have to do with the special way in which we treat reactors, not the technology itself. If there were a better regulatory framework, you would see (as we slowly are in the industry despite these things) cheaper, easier-to-fabricate reactors which are safer by default. Designs that start on a fresh sheet of paper, with the latest and greatest in computer modeling (most current reactors were designed before computer simulation of the internals or externals was even a thing) and materials science.
I am rooting for the molten salt thorium reactors, but there are a bunch of other Generation IV reactors that are just begging to be built.
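To put a number on the car thought experiment above (the $10K and 100 lives are from the comment; the annual new-car sales figure is my own rough assumption):

```python
new_cars_per_year = 15_000_000   # rough US annual new-car sales (assumption)
extra_cost = 10_000              # $ per car for the hypothetical safety feature
lives_saved = 100                # per year, as in the thought experiment

cost_per_life = new_cars_per_year * extra_cost / lives_saved
print(f"${cost_per_life / 1e9:.1f} billion per life saved")  # $1.5 billion per life saved
```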

Right now, getting the NRC to approve a new reactor design takes millions of dollars, ensuring the big boys will stay around for a while longer yet. And the regulatory framework also ensures that whatever reactor gets built is big, uses solid fuel, water coolant, specific dials and gauges...etc. It would be like the FCC dictating the exact innards of a cellphone; it would be maddening to cellphone manufacturers, and you most likely wouldn't have the iPhone in the way we have it today. The NRC needs to change for any of the problems you mentioned to be resolved. That is a big obstacle, I am not going to lie; it is unlikely to change anytime soon. But I think the promise of carbon-free energy with reliable base-load abilities can't be ignored in this green-minded future we want to create.

At any rate, thanks for your feedback; hopefully that wasn't overkill.

TYT - Romney: Why Don't Airplane Windows Roll Down?

skforty says...

>> ^lantern53:

"In case you missed it, this week, there was a tragedy in Kansas. Ten thousand people died -- an entire town destroyed." --on a Kansas tornado that killed 12 people

This is just one from a whole page listing stupid things obama said, per good ole google


That article reported 10 dead initially, the text of the article now says 11, and it turns out maybe 12 died now. There is a big difference between misspeaking the data you hear from your earpiece or PR guy (or even brainfarting and saying 10k when you mean 10), and not understanding why windows don't open up in airplanes.

Daily show has it right though, Obama is the luckiest man right now...he's not doing anything right to win it, but most likely will, because Romney "blew out his ACL".

Camp stove generates electricity for USB charging

GeeSussFreeK says...

@bmacs27 Cell phones don't launch you to a higher standard of living the way a fully integrated energy system does. Refrigeration, transportation, fertilizer, steel production, manufacturing, iron ore reprocessing, food production, water purification, medicine distribution/production/manufacture, etc. These things require TONS of energy, something a cellphone-charger stove is not. A phone-charging stove, IMO, will confer very little improvement to the standard of living in the third world. "Worthwhile" is a pretty arbitrary idea, though, so there isn't a "right" answer per se. But I would wager greater quality of life would be had if you put all the money from these stove thingies into an energy infrastructure; having access to clean water and an electrical grid might be better than a marginally cleaner stove.

Particulate matter from coal and wood ash is usually the risk associated with burning coal and wood. Greenhouse gas isn't the issue (the CO2 in plants will go back into the air via decomposition); it is the junk that goes into your lungs. Less is good, but electric stoves (or gas) would be better, as you can move the source of smoke to some distant place. I don't know the economics of small village towns, so perhaps this has a place as a stopgap until there is serious economic development, but it isn't a very big step...so I am trying some google-fu to see how much they plan to spend over there on these. If it is like 10k bucks or something, then yeah, those 200 people or so you help is nice-ish; if they plan to spend millions, or they are charging for them instead of handing them out, then other developments would be far better. As someone who has burned wood as the primary source of heat for a house, it isn't fun; even if the stove were 50% better, using money to buy a better stove would have been silly compared to just using gas or electric, provided there was a system for doing so. I think money would be better spent developing those systems instead of investing in a dead-end technology (burning wood). One might liken it to fixing up that old car that keeps breaking down: it is a money trap...best to go get a better car if you can.

Even so, I think I might get one in the future for camping. Could be fun to mess with the TEG and main container to perhaps tweak some higher power levels out of it!

Peroxide (Member Profile)

bcglorf says...

It's not even that I am 'doubting' the proxy measures. I am directly observing that the proxy measures DO NOT register the last 100 years as particularly unusual or abnormal. In fact, the more accurate and improved the proxy reconstructions have become, the more normal the last 100 years appears in those reconstructions.

The proxy reconstructions are as much as 0.6 degrees cooler than instrumental records at the exact same point in time. I am objecting to a layman like in the video coming along and saying the instrumental record's warmth is unprecedented over the last 2k years. Sure, the proxy records don't show temperatures as high as the instrumental record in the last 2k years. The proxy records don't even show temperatures as high as the instrumental record in the last 10. The proxy records fail to recreate the temperatures observed in the instrumental record.

Is that making sense, or clear what I am talking about?


In reply to this comment by Peroxide:
hmmm, I was aware that we only have thermometric readings of temperature for the last 100-150+ years. So basically you are doubting the ability of tree rings, pollen identification in sediments, and other methods of temperature reconstruction.

If I may reiterate my point, which I made rudely in the video post, I would say that you might be interested to know that if you go back further than 2k years, as in more than 10k, there are temperature changes that were even greater than 1 degree; however, Homo sapiens was not around to endure them. Regardless of previous temperature deviation, science tells us that our "freeing-up" of carbon dioxide, and creation of methane, are the culprits of the current temperature increase.

How does our ability to measure the last 2k years change that? or change the fact that we are heading for a 6 degree increase (which would not be uniform, for instance the poles have already warmed more than by 0.8 degrees, while the tropics may have warmed by less than 0.8 degrees)?

I fail to see that you have any point outside of that our estimations going back past 150+ years may be slightly off.

"It is obviously true that past climate change was caused by natural forcings. However, to argue that this means we can’t cause climate change is like arguing that humans can’t start bushfires because in the past they’ve happened naturally. Greenhouse gas increases have caused climate change many times in Earth’s history, and we are now adding greenhouse gases to the atmosphere at a increasingly rapid rate." -s.s.


bcglorf (Member Profile)

Peroxide says...

hmmm, I was aware that we only have thermometric readings of temperature for the last 100-150+ years. So basically you are doubting the ability of tree rings, pollen identification in sediments, and other methods of temperature reconstruction.

If I may reiterate my point, which I made rudely in the video post, I would say that you might be interested to know that if you go back further than 2k years, as in more than 10k, there are temperature changes that were even greater than 1 degree; however, Homo sapiens was not around to endure them. Regardless of previous temperature deviation, science tells us that our "freeing-up" of carbon dioxide, and creation of methane, are the culprits of the current temperature increase.

How does our ability to measure the last 2k years change that? or change the fact that we are heading for a 6 degree increase (which would not be uniform, for instance the poles have already warmed more than by 0.8 degrees, while the tropics may have warmed by less than 0.8 degrees)?

I fail to see that you have any point outside of that our estimations going back past 150+ years may be slightly off.

"It is obviously true that past climate change was caused by natural forcings. However, to argue that this means we can’t cause climate change is like arguing that humans can’t start bushfires because in the past they’ve happened naturally. Greenhouse gas increases have caused climate change many times in Earth’s history, and we are now adding greenhouse gases to the atmosphere at a increasingly rapid rate." -s.s.

In reply to this comment by bcglorf:
You are exactly right, the red line is directly measured temperature, and it shows that we are indeed breaking records. The trick is it only shows that we have hit the record for the last 100 years for which we actually have a measured record. The temperature over the last 2k years is not directly measured, but derived from proxies like tree rings. Those temperature reconstructions never hit the heights of the measured record. The speaker in the video then declares that current temperature is then a record over 2k years. The trick is to look closer. When we state that the reconstruction of the last 2k years never hits the current measured records it doesn't only mean it never hit them before today, it means that even where the measured temperature hits the record today, the reconstruction STILL doesn't come close to hitting the measured record. Seems very strong evidence that the reconstruction might have missed a record like today that happened over the last 2k years, as clearly they missed THIS ONE.


In reply to this comment by Peroxide:
I can't understand how you are interpreting Mann's graph, you do realize the red line at the end is not a projection, but in fact current measurements...

So how are you coming to the conclusion that we are not (already, if not soon) breaking records?

In reply to this comment by bcglorf:
>> ^alcom:

By your own admission, the composite readings are now at record highs. Nowhere in my previous post did I mention that these measurements were exactly equal to the 0.8 instrumental record. I only said that they followed the same directional trend. Read it and weep, a 2000 year-old record is a 2000 year-old record.
To argue that 'Mann's statement that the EIV construction is the most accurate,' is to willfully ignore all the other data sources. You completely miss the point that instrumental data is by far the most accurate. Proxy reconstructions are relied on in the absence of instrumental data.
The lack of widespread instrumental climate records before the mid 19th century, however, necessitates the use of natural climate archives or “proxy” data such as tree-rings, corals, and ice cores and historical documentary records to reconstruct climate in past centuries.
The curve of this warming trend is gradually accelerating according to all data sources, even if you ignore the two instrumental lines. And what a coincidence that it matches the timeline of the industrial revolution. Ignore science at your own peril!
>> ^bcglorf:
What graph are you reading?
CPS Land with uncertainties: peaked at 0.05 in 600, but yes, a new peak in the late 1990s at 0.15 (not 0.8); the recent temperature is the record within the series by only 0.1.
Mann and Jones 2003: current peak at 0.15 (not 0.8), but current is the record by 0.25.
Esper et al. 2002: peaks once in 990 and again in 1990 at negative 0.05; not positive 0.8, nor is current warming a record.
CPS land+ocn with uncertainties: peaks at 0.2 (not 0.8) and only starts at 1500; not sure by how much the record would have been set if it included the year 600, where land alone hit 0.05.
Briffa et al.: series begins in 1400, but again peaks at 0.15 (not 0.8). Can't tell from the graph how much of a record it is, but by Briffa et al.'s original 2001 paper it's by 0.2.
Crowley and Lowery (2000): peaks at 0.15 (not 0.8); granted, current warming sets the record within the series, by 0.25 higher than 1100.
Jones et al. (1999): peaks at 0.05 (not 0.8); current is the record by 0.1.
Oerlemans (2005) and both borehole samples go back less than 500 years. The boreholes show a smooth curve throughout, with warming starting 500, not 100, years ago. They all peak at 0.2 or lower, again not 0.8.

My main point, I think, is reinforced by each of the series above. Instrumental measured warming is completely anomalous compared to the proxy reconstructions: the instrumental record peaks fully 0.6 degrees higher than any of the proxy series. How can anyone look at that and NOT object to the declaration that the proxies for the last 2k years prove temperatures have been far cooler and more stable than the last 100 years as shown in the instrumental record? If you instead compare like to like, taking the last 100 years as projected by each proxy rather than by the instrumental record, you clearly see that the last 100 years is anything but a radical anomaly.
If you accept Mann's statement that the EIV reconstruction is the most accurate, it can easily be said that the last 100 years, as it appears in the proxy reconstructions, isn't much of an anomaly at all.

>> ^alcom:
Ah, now I see your point, bcglorf. Of the various methodologies, the 2 instrumental record sets of the last 100 years are the only ones that show the extreme spike of temperature. The composite reconstructions have not yet shown data that is above previously held records in the last 2 millennia, with the exception of the following:
CPS Land with uncertainties
Mann and Jones (2003)
Esper et al. (2002)
CPS land+ocn with uncertainties
Briffa et al. (2005)
Crowley and Lowery (2000)
Jones et al. (1999)
Oerlemans (2005)
Mann et al. Optimal Borehole (2003)
Huang et al. Borehole (2000)
If you follow these lines closely, you will see that each plot above has indeed set a 2000-year record in the last 25 years within its own series, even if not as pronounced as the instrumental record highlighted by the red line. I'm not sure why the EIV lines stop at 1850, but I'm also not a climatologist. The instrumental record has more or less agreed with the EIV record for as long as it has existed, including an extended cooling trend midway through the 20th century. The sharp divergence is perhaps not fully understood, but I still think it foolish to ignore the provable, measurable and pronounced upward trend in all calculated measurements.
>> ^bcglorf:
>> ^alcom:
After a cursory reading of Mann's Proxy-based reconstructions of hemispheric and global surface temperature variations over the past two millennia, I can't see how climate change deniers can both read about his methodology AND at the same time gripe about a bias towards measurements that support his argument while ignoring conflicting measurements through other means.
These measurements AGREE. The regression data seem to have a wider variance as you go backwards, but they all trend in the same directions, both up and down, over the last 2000 years. (I'm looking at the graph here: http://www.pnas.org/content/105/36/13252/F3.expansion.html ) The lines converge and climb at the same incredible rate at the end (the last 25 years or so). Show me ANY scientific data that reports a measurement of +0.8°C over such a short period of time.
As for the polynomial curve objection, the variation in measurements over such a limited data set may not yet reveal the true nature of the curve. And Earth's feedback mechanisms may already be at work counteracting the change in atmospheric composition. For example, trees grow faster in slightly warmer temperatures with elevated CO2 levels, naturally counteracting the effects of fossil fuel burning by simply converting more of it into O2. There are undoubtedly many, many more factors at play. I'm suggesting that the apparent "straight line" graph may represent the fastest rate of increase possible given the feedback systems at work.
The point is that it is a losing battle, according to the current trend. At some point these feedback systems will fail (e.g., there will come a point when a region is so hot that no type of tree will continue to grow and absorb CO2), and worse still, there are things like the methane in permafrost that will exacerbate the problem further. This isn't a religious doomsday scenario; the alarm bells are not coming from a loony prophet but from real, measurable evidence that so many people continue to ignore. I'd rather be wrong and relieved that there is no climate crisis and clean energy initiatives end up a waste of time and money than wrong that there IS in fact cause to make serious changes. The doubt that has driven so much misinformation will at some point be exposed for the stupidity that it truly is.

Look closer at the graph at http://www.pnas.org/content/105/36/13252/F3.expansion.html for me; it is the graph I was talking about. NONE of the reconstructions from proxy sources spikes up to 0.8 at the end. They all taper off short of 0.2; the bold red (instrumental) line makes them very hard to see precisely. The curve closest to the red is the grey one, which is in fact the other instrumental record they include, and even it stops below 0.6. What is more, the green EIV reconstruction peaks past 0.2 twice over the last 2k years, and it spikes much more quickly around 900 and 1300. Most noteworthy of all, if you read further up in Mann's report, the big reason for re-releasing this version of his paper was to evaluate the EIV method, because statisticians recommended it as far more appropriate. Mann notes in his results that of all the methods, 'we place the greatest confidence in the EIV reconstructions'.
My key point is glaringly obvious when looking at Mann's data, even on the graph. The instrumental record of the last 100 years spikes in an unprecedented fashion; the proxy reconstruction of that same time frame does not. Two different methodologies yield two different results. The speaker in this video points at that and declares it's because of human emissions 100 years ago, but we must face the fact that the methodology changed at exactly that point too. The EIV reconstruction was the latest attempt to bridge the gap between the proxy and instrumental records, and although it more closely matches the instrumental, it still doesn't spike 0.8 degrees over the last 100 years; more interestingly, it also shows much greater variation over the last 2k years. Enough variation, in fact, that if you look at just the green EIV line, the last 100 years isn't particularly noteworthy or anomalous.





The speaker in the video makes a very big deal about the current 0.8 on the instrumental record, and similarly a very big deal about it being unprecedented in the last 10k years, which you have to admit has been plainly shown to be an apples-to-oranges comparison.

Yes, most current proxies show a 2k-year record, except the EIV, which is the most recent and deemed the most accurate. If you can accept comparing reconstructions that use the same proxy data but different analyses, the EIV reconstruction shows that current temperatures are not record-breaking at all.

If you want to say that "the curve of this warming trend is gradually accelerating according to all data sources", you need to also observe the warming trends in the reconstructions over the last 2k years. Again, the rate of change over the last century is not unprecedented in the proxy records of the last 2k years; you can plainly see similarly or more severe warming and/or cooling within the last 2k years in the proxy reconstructions.



Climate Change; Latest science update

bcglorf says...

>> ^ChaosEngine:

>> ^bcglorf:
So the moral is, it is absolutely time to panic.
Not just maybe, but absolutely time to panic.
Fortunately, he IS overstating the situation. Right from the very start he declares how stable the last 10k years have been, and that the last 100 have already broken all records seen over those 10k years. Go use Google Scholar and read Michael Mann's recent work on reconstructing the last 2k years. Mann is one of the leading scientists arguing that it is time to panic and that things are getting bad very fast. His research is publicly available on Google Scholar for everyone to read.
If you can be bothered to go and read that before shouting me down as a denier, you will find the following in his research: there is at least some evidence that on at least two occasions over the last 2k years, climate HAS been as warm as or warmer than current.
I'm not saying it's all roses and there's nothing to see here. I AM saying that if you go read the actual research, you'll find a much more nuanced and less panic-stricken assortment of facts than what is presented in this video.

Can you post a link to the page you're talking about? I used Google Scholar, but Mann has published quite a few papers and I really don't have time to read them all.
That said, even if I read the paper, I'm not confident I'd understand it fully. From my limited research into climatology, it's a reasonably complex science. My problem is that I don't really have time to study all the theory around this.
And frankly, I shouldn't have to. I'm not a climatologist. No-one alive today can possibly hope to understand all science in every field. That's why we specialise. With a small amount of ego, I'm willing to say that most climatologists are worse programmers than I am, but that's ok too, 'cos that's not their field.
What I'm trying to say in my trademark, rambling, incoherent way is that I generally accept a scientific consensus (assuming it's been properly peer reviewed and so on). Fallacy of majority? Possibly. I'm willing to accept the possibility that there's a gifted climatologist out there who is desperately trying to get the rest of them to understand the crucial theory/evidence/algorithm they've missed, and it's all going to be ok. Hell, I hope there is, but it seems unlikely to me.
To apply Occam's razor: which makes more sense?


You can see Mann's latest work here. Just don't stop at the abstract, where he declares the reinforcement of his previous studies and findings. Go further down and look at the reconstruction of the last 2k years the article was built on.

The green EIV line is the 'newer' statistical method recommended to him by statisticians who claimed his previous method was biased towards 0 (it minimized highs and lows). You can clearly see the EIV reconstruction shows multiple peaks in the past. More importantly though, look at the last 100 years on the graph. The bold red line is the instrumental record. It blots out most of the last 100 years, but if you look closely, you can see that none of the reconstructed lines spike away into scary land like the instrumental record does. In fact, none of the reconstructed lines climb above where the EIV line has peaked multiple times in the past.

To me that screams the need to look harder still at the probability that our methods of reconstruction aren't sensitive enough to pick up a short spike like the one we know from the instrumental record is currently taking place. That doesn't prove spikes like the last 100 years are common, but it DOES call into serious question the claim that it's never happened before in the last 2k years. That final claim is the vital and key difference between 'everyone panic' and 'let's study this further to understand it fully'.
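The sensitivity point above is a general statistical effect, and it can be sketched in a few lines. The numbers below are synthetic toy values invented purely for illustration (not real climate data), and the `moving_average` helper is a stand-in for the much more sophisticated smoothing implicit in low-resolution proxy reconstructions:

```python
# Toy illustration: century-scale smoothing flattens a short, sharp spike
# that is obvious at annual resolution. All values are synthetic.

def moving_average(series, window):
    """Trailing moving average; a crude stand-in for proxy smoothing."""
    out = []
    for i in range(len(series)):
        lo = max(0, i - window + 1)
        chunk = series[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# 500 "years" of flat baseline with a 20-year spike of +1.0 near the end
series = [0.0] * 500
for year in range(470, 490):
    series[year] = 1.0

smoothed = moving_average(series, 100)  # ~century-scale resolution

print(max(series))              # 1.0 -> the spike is obvious year by year
print(round(max(smoothed), 2))  # 0.2 -> the same spike barely registers
```

A 20-year spike averaged over a 100-year window can contribute at most 20/100 of its height, so a reconstruction at that resolution could sit well below an instrumental record of the very same event.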

Climate Change; Latest science update

ChaosEngine says...

>> ^bcglorf:

So the moral is, it is absolutely time to panic.
Not just maybe, but absolutely time to panic.
Fortunately, he IS overstating the situation. Right from the very start he declares how stable the last 10k years have been, and that the last 100 have already broken all records seen over those 10k years. Go use Google Scholar and read Michael Mann's recent work on reconstructing the last 2k years. Mann is one of the leading scientists arguing that it is time to panic and that things are getting bad very fast. His research is publicly available on Google Scholar for everyone to read.
If you can be bothered to go and read that before shouting me down as a denier, you will find the following in his research: there is at least some evidence that on at least two occasions over the last 2k years, climate HAS been as warm as or warmer than current.
I'm not saying it's all roses and there's nothing to see here. I AM saying that if you go read the actual research, you'll find a much more nuanced and less panic-stricken assortment of facts than what is presented in this video.


Can you post a link to the page you're talking about? I used Google Scholar, but Mann has published quite a few papers and I really don't have time to read them all.

That said, even if I read the paper, I'm not confident I'd understand it fully. From my limited research into climatology, it's a reasonably complex science. My problem is that I don't really have time to study all the theory around this.

And frankly, I shouldn't have to. I'm not a climatologist. No-one alive today can possibly hope to understand all science in every field. That's why we specialise. With a small amount of ego, I'm willing to say that most climatologists are worse programmers than I am, but that's ok too, 'cos that's not their field.

What I'm trying to say in my trademark, rambling, incoherent way is that I generally accept a scientific consensus (assuming it's been properly peer reviewed and so on). Fallacy of majority? Possibly. I'm willing to accept the possibility that there's a gifted climatologist out there who is desperately trying to get the rest of them to understand the crucial theory/evidence/algorithm they've missed, and it's all going to be ok. Hell, I hope there is, but it seems unlikely to me.

To apply Occam's razor: which makes more sense?

Climate Change; Latest science update

bcglorf says...

So the moral is, it is absolutely time to panic.

Not just maybe, but absolutely time to panic.

Fortunately, he IS overstating the situation. Right from the very start he declares how stable the last 10k years have been, and that the last 100 have already broken all records seen over those 10k years. Go use Google Scholar and read Michael Mann's recent work on reconstructing the last 2k years. Mann is one of the leading scientists arguing that it is time to panic and that things are getting bad very fast. His research is publicly available on Google Scholar for everyone to read.

If you can be bothered to go and read that before shouting me down as a denier, you will find the following in his research: there is at least some evidence that on at least two occasions over the last 2k years, climate HAS been as warm as or warmer than current.

I'm not saying it's all roses and there's nothing to see here. I AM saying that if you go read the actual research, you'll find a much more nuanced and less panic-stricken assortment of facts than what is presented in this video.

Zero Punctuation: Diablo 3

oohlalasassoon says...

At least they patched the game to autojoin you to general chat so the first thing you see when logging on is:

~~~{GOLD-4-U}~~~ 10k for $19.99 delivered ~~~{GOLD-4-U}~~~

It didn't feel like a Blizzard game until they did that.


