> Do you think the global average temperature anomalies would change if ...?

Posted at: 2015-03-12 
I don't think the temperature anomalies would change. They are typically measured against a modern baseline such as 1951-1980 or 1981-2010 (or some other recent period), and extra thermometers in the past wouldn't change those reference periods.
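
To make the baseline point concrete, here is a minimal sketch, with made-up yearly values, of how an anomaly is computed against a fixed reference period:

```python
# Minimal sketch of computing temperature anomalies against a baseline period.
# The yearly "absolute" temperatures below are invented purely for illustration.

baseline_years = range(1951, 1981)   # e.g. a 1951-1980 reference period
temps = {year: 14.0 + 0.01 * (year - 1900) for year in range(1900, 2015)}  # fake deg C values

baseline_mean = sum(temps[y] for y in baseline_years) / len(baseline_years)

# The anomaly for any year is its departure from that fixed baseline mean,
# so the reference period itself is unaffected by extra thermometers outside it.
anomalies = {year: temps[year] - baseline_mean for year in temps}

print(f"baseline mean: {baseline_mean:.2f} C")
print(f"2014 anomaly:  {anomalies[2014]:+.2f} C")
```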

On top of that, even with good measurements, there are all sorts of adjustments made that end up making the past cooler. Even in measuring current temperature, good rural thermometers are then smeared with nearby urban thermometers, as part of an 'urban heat island' adjustment gone wrong.

As it is, measurements from hundreds of years ago are not being reported at the same level of precision as modern temperatures.

The instrumentation behind climate change analysis is a concern to me. Part of that concern revolves around the fact that the people involved with the instrumentation do not seem to be the modellers and vice versa. So the modellers just start from a file full of figures and play with it.

Having worked with instrumentation in the past, I know how fussy some measurements can be. Even if you have ten pieces of nominally identical hardware instrumented in the same way, you still get different answers, and you can never tell whether the answers really should be different or not.

So when you hear that NASA has cut its temperature sites from 6,000 to 1,500, you really wonder whether they can make that work. Appending Argo data to the previous data also has to be suspect (as does adding thermometer data to the end of proxy data).

The Hansens of this world clearly have an agenda, and I am not convinced that all their decisions could have been 100% objective.

So, yes, the more transducers the better, and with the (impractical!) numbers of transducers you are suggesting, I am sure the temperature patterns and values would have been different.

Satellites look at the whole world, of course, but they don't look at surface temperatures. Some sort of 3D satellite system that can look into the sea would be good!

"t seems that some scientists (AGW theorists in particular) think that we have enough temperature measurements to form a consensus on what actual global average temperature has been in the past. In knowing how temperatures can change in a matter of minutes in a specific area I find it hard to know that past temperature readings can really reflect a true anomaly. "

But unless there was a systematic bias, the errors would tend to average out to zero.

From house to house, people often have differing political views, yet polling just a scattering of random people gives excellent data about who will win elections.
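
Here is a quick simulation of that point, with invented numbers: random errors shrink in a large average, while a systematic bias does not.

```python
# Illustration (made-up numbers) of why unbiased measurement errors tend to
# cancel in a large average, while a systematic bias does not.
import random

random.seed(0)
TRUE_ANOMALY = 0.6   # hypothetical "true" global anomaly, deg C
N_STATIONS = 5000

# Each station reads the true value plus zero-mean random noise.
unbiased = [TRUE_ANOMALY + random.gauss(0.0, 1.0) for _ in range(N_STATIONS)]

# The same network, but with every thermometer reading 0.3 deg C too warm.
biased = [x + 0.3 for x in unbiased]

print(f"true anomaly:        {TRUE_ANOMALY:.3f}")
print(f"mean, random errors: {sum(unbiased) / N_STATIONS:.3f}")  # close to 0.6
print(f"mean, with bias:     {sum(biased) / N_STATIONS:.3f}")    # off by ~0.3
```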

There is considerable uncertainty in what global average temperature was in the past. More data would give us more knowledge. But what could such knowledge actually change?

1. Perhaps we would find out that the Medieval Warm Period was warmer than we previously thought. But we could just as easily find out that the Medieval Warm Period was cooler than we previously thought.

2. If we also had satellites measuring the Sun, clouds, and aerosols, we would almost certainly determine that any warming in the past was due to different causes than current warming.



By denialists. Climatologists use averages from all over the world and trends to come to conclusions.

And not cherry-picked trends. They use trends of at least 30 years.



I would think much less than that. It seems to me that a 1% energy imbalance would result in 2 degrees C of warming happening over a period of hours, rather than decades.

As an atmospheric scientist, I would LOVE to have more (accurate) measurements--the more the better. I actually live in an area that has extremely dense mesonet coverage (and I am part of that coverage, you can go to the NWS sites and check the weather at my house), and having all that data is a wonderful thing.

However, unless you know something that the rest of us don't know, there is no way to go back in time and add more instruments. There is an initiative to make more historical data available, but that's very labor intensive and requires people to digitize the data in ships' logs, etc.

The question is whether we have enough historical data to identify the long term trend, and I think the answer is clear that we do. Techniques from objective analysis and data assimilation are used to re-grid the historical data for use in climate models. As for the "global mean temperature", you'll see that the error associated with it decreases with time, which is a reflection of the increasing number of stations available.
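
As a very rough sketch of the gridding idea (not the actual scheme any particular group uses, and with fabricated stations), anomalies can be binned into lat/lon boxes and the box means combined with area weights:

```python
# Toy sketch of gridding station anomalies for a "global mean" estimate:
# bin stations into lat/lon boxes, average each box, then combine the box
# means with cos(latitude) area weights. Real objective-analysis and data-
# assimilation schemes are far more sophisticated; these stations are invented.
import math
from collections import defaultdict

stations = [  # (lat, lon, anomaly in deg C) -- fabricated examples
    (51.5, -0.1, 0.8), (52.2, 0.1, 0.7), (40.7, -74.0, 0.5),
    (-33.9, 151.2, 0.4), (64.1, -21.9, 1.1), (1.3, 103.8, 0.3),
]

BOX = 5.0  # 5-degree grid boxes
boxes = defaultdict(list)
for lat, lon, anom in stations:
    key = (math.floor(lat / BOX), math.floor(lon / BOX))
    boxes[key].append(anom)

num = den = 0.0
for (ilat, _), vals in boxes.items():
    box_lat = (ilat + 0.5) * BOX             # box-centre latitude
    w = math.cos(math.radians(box_lat))      # area weight
    num += w * (sum(vals) / len(vals))
    den += w

print(f"area-weighted mean anomaly: {num / den:+.2f} C")
```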

It would certainly be useful to have more of certain types of data. All data is very sparse in oceanic areas, and radiosonde data is particularly sparse. The radiosonde data set was not really designed for climatological purposes to begin with. I believe our best hope for indisputable evidence of climate change (indisputable by deniers, that is) will be the water vapor data from ground-based GPS and the limb sounding of GPS satellites.

By the way, just having more data is no panacea to modeling problems. The data still have to be checked for errors and assimilated, and it doesn't do much good to have lots of observations in a small area if you're not modeling on a resolution with comparable grid size. The computers also have to be fast enough and have enough memory to handle the data. Due to those constraints, we're still limited to much larger grid size than we have available data for. It doesn't do any good to run a model if it takes 100 years to complete.
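
A back-of-the-envelope illustration of that resolution constraint, with purely illustrative numbers: each doubling of horizontal resolution roughly multiplies the cost by eight, because the number of columns quadruples and the time step has to shrink as well.

```python
# Rough scaling sketch of why model resolution is limited by compute, no matter
# how dense the observations are. Doubling horizontal resolution quadruples the
# number of columns and (via a CFL-style time-step constraint) halves the time
# step, so cost grows by roughly 8x per doubling. Numbers are illustrative only.

base_dx_km = 100.0   # hypothetical starting grid spacing
base_cost = 1.0      # relative cost at that spacing

for halvings in range(4):
    dx = base_dx_km / 2 ** halvings
    cost = base_cost * 8 ** halvings
    print(f"grid spacing {dx:6.2f} km -> relative cost ~{cost:7.0f}x")
```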

EDIT: Jim Z, you're apparently unaware that there is a specific field of study called "Objective Analysis." If your study of a field is so superficial that you have not even heard of it, that could be a problem for your understanding of it. Perhaps if you actually took a course in the subject, then you might know what I was talking about. Your present state is comparable to somebody trying to lecture you on the geologic time scale having never heard of "stratigraphy" or Steno's Laws.

In response to Zippi62's comments, it does not do any good to have unrealistic goals. If you're suggesting that we wait thousands of years so that you can find the data sufficient for your standards, that's just stupid. If someone had a temperature of 105 F, would you say that we should not treat them because we have insufficient data on their past temperature? Very few things in life allow you to wait thousands of years to make a decision.

And you seem to be getting confused about the change in CO2 in the atmosphere. The mass of atmospheric CO2 has increased by about 40% through anthropogenic emissions. I know you like to dilute the effect by giving it as a fraction of things that are not really relevant to the discussion, but I think using a nice extensive measure like "mass" is a more appropriate way of looking at it--and a lot less prone to numerical sophistry.
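
For what it's worth, here is the rough arithmetic behind the "about 40%" figure, using commonly quoted round numbers (these are approximations, not exact values):

```python
# Rough arithmetic behind the "about 40%" increase, using commonly quoted round
# numbers: pre-industrial ~280 ppm, ~400 ppm around when this was written, and
# roughly 7.8 Gt of CO2 per ppm. Approximations for illustration only.

preindustrial_ppm = 280.0
recent_ppm = 400.0
gt_co2_per_ppm = 7.8

increase_fraction = (recent_ppm - preindustrial_ppm) / preindustrial_ppm
added_mass_gt = (recent_ppm - preindustrial_ppm) * gt_co2_per_ppm

print(f"fractional increase in atmospheric CO2: {increase_fraction:.0%}")  # ~43%
print(f"added CO2 mass: ~{added_mass_gt:.0f} Gt")
```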

Another EDIT: You're arguing against yourself--you say that "effect" is the important thing, yet you want to include gases in your number that have no effect. Just admit you're wrong or you're trying to mislead people. Be honest about something for once in your life.

No.

Global temperatures vary according to heat distribution, a fact known to climate researchers but widely ignored by the public, especially deniers. Heat distribution is known to be affected by many global effects such as the Pacific Decadal Oscillation. That's why scientists use long-term averages and why past temperatures are reconstructed from proxies that record averages, not moment-to-moment variation.

The Law of Large Numbers and the Central Limit Theorem give us some level of confidence.
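
A small numerical illustration of what those theorems buy you (values invented): the spread of the average of N noisy readings shrinks roughly like 1/sqrt(N), even though each individual reading stays just as noisy.

```python
# Numerical illustration of the Central Limit Theorem point: the spread of the
# *average* of N noisy readings shrinks roughly like 1/sqrt(N). Values invented.
import random
import statistics

random.seed(1)
TRUE_VALUE = 0.6   # hypothetical true anomaly, deg C
NOISE_SD = 1.0     # per-reading noise, deg C

for n in (10, 100, 1000, 10000):
    means = [
        statistics.fmean(TRUE_VALUE + random.gauss(0, NOISE_SD) for _ in range(n))
        for _ in range(200)
    ]
    print(f"N={n:6d}: spread of the network mean ~ {statistics.stdev(means):.3f} "
          f"(theory: {NOISE_SD / n ** 0.5:.3f})")
```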

They did have a similar network at one time. It did not have anywhere near the resolution that you describe. However, James Hansen stopped taking data from the Siberian ones during the winter.

I think the data would still be inconclusive. We have seen examples of East Anglia and James Hansen corrupting the data.

However, if honest data were to be taken that way, I think we would find a lot of differences from what we now have. We would still have to take that in 3D, up in the sky also. Then we would also still have to take those measurements in the oceans. Would that be enough to satisfy the AGW crowd? Only if it went their way, and we know that is highly doubtful.

I'm not sure who Pegminer is referring to that is objective. We are supposed to just sit back and believe that Hansen, for example, was simply making corrections time after time, when the corrections always cool the past relative to the present. We are supposed to swallow the garbage that came from Mann. No thanks. I would rather keep my scientific objectivity and common sense.

Yes. Accuracy of measurement over time is itself a variable, which is impossible to factor in. That's one of the biggest reasons why the models they generate now are always wrong.

... for the past thousand years we had a thermometer every mile and in every direction (even in the ground) and we took measurements every hour on the hour?

It seems that some scientists (AGW theorists in particular) think that we have enough temperature measurements to form a consensus on what the actual global average temperature has been in the past. Knowing how temperatures can change in a matter of minutes in a specific area, I find it hard to believe that past temperature readings can really reflect a true anomaly.

Do you think that this is one of the main problems with developing climate model accuracy?

Haha!