Sunday, May 21, 2017

On Believing Alarmist Climate Scientists - Or Not

Subtitle:  The Balance of Evidence Shows Disbelief is Justified

The alarmist climate scientists, or as I refer to them on SLB, False-Alarmists, are in the news and on electronic sites quite regularly these days, with opponents (such as me) pointing out the many errors, inconsistencies, questionable practices, and even outright lies.  A friend recently sent me a link to a blog article by Scott Adams (of Dilbert cartoon fame), "How to Convince Skeptics that Climate Change is a Problem," from March 8, 2017 (see link).   It is an excellent article, with 14 main points that are discussed below.   Many of these same points have been made by me on SLB over the years. 

Before getting to the Adams points and my comments on those, a brief excursion into what (in my opinion) is, or perhaps could be, one of the motivations for false-alarmists to take and hold their positions.  This requires considering a few historic facts and myths. 

In the 1950s (or even earlier, by some accounts), The End Of Oil was a common concern, even a fear among some.  Peak Oil was the term widely used to describe the way oil fields decreased their production while new, replacement oil fields were becoming more and more scarce.   Some advocated abandoning oil altogether and using nuclear energy for as much as possible.  They envisioned nuclear cars, trains, and buses, and of course every electric grid powered by nuclear plants.    This of course led to roughly 450 nuclear power plants being started up globally, with approximately 120 of those in the US (many of them are now shut down, with 99 or so still running).    It is worth mentioning that, before the 1970s oil price shocks, oil-fired power plants provided approximately 20 percent of all electricity in the US.  As oil became too expensive to burn for making electricity, nuclear plants came online and produced the same fraction of US power: 18 to 20 percent annually.   That is all the nuclear plants could achieve in market share: only what very expensive oil had produced. 

In the 1960s and early 1970s, pollution from industry and from common daily life became a growing concern.  A widely-read book predicted humanity's collapse under the weight of the massive pollution problem (see Limits To Growth, published in 1972; a widely debunked and spectacularly wrong book, but back then it had some credibility).   And, of course, Silent Spring by Rachel Carson was a must-read book for those who were oh-so-concerned with saving the planet.   That book, Silent Spring, has also been widely debunked as false science.  

The concern over widespread pollution led to environmental regulations, including of course the Clean Air Act, Clean Water Act, and many others.  Those did a lot of good, with much cleaner air and water today compared to 50 years ago. 

However, oil did not run out.   Geologists, oil exploration and drilling companies, and computer scientists all worked together to find not only new oil fields, but ways to produce more oil from existing oil fields.   Industrialization and modernization created huge demands for electricity world-wide.  But the justifiable concerns over nuclear plant safety (five nuclear reactors have melted down or exploded, or both, thus far) led to much of that electricity being produced from burning coal.  

The Limits To Growth crowd saw an opportunity: with pollution via smelly particles and fumes already reduced, and the oily, smelly, foamy substances on the waterways gone, it would be necessary to find another way to shut down those evil oil companies that kept finding cheaper and cheaper oil.  Peak Oil refused to peak, and nuclear power was far too expensive and far too dangerous to power everything, so some other way had to be found to eliminate the oil companies. 

Their way forward was to seize on a little-known physical truth: carbon dioxide can be made to absorb and re-radiate heat energy.  That is an absolute fact, no doubt about it.  It is taught in universities to chemical engineers and mechanical engineers in the heat transfer course.  (See link to the SLB article on this, "Chemical Engineers, CO2, and Absorptive Re-Radiation; Subtitle: Fired Furnaces Have Strong Radiating CO2; Atmosphere Does Not.")  If carbon dioxide could be blamed for the Earth's apparent (some say measured) warming, and it is a fact that burning oil and coal produces carbon dioxide, then the link could be made and evil oil exterminated forever.   

It was also convenient that the Earth was going through a natural cycle, out of a cold state (or at least colder than the recent past) and into a warmer state.  The cold period lasted approximately 500 years (1350 to 1850) and is known as the Little Ice Age.   (This leads to some serious contradictions for the false-alarmists: what caused the Little Ice Age, what caused it to end, and how does either square with their pet theory about carbon dioxide heating up the Earth?  But more on that in a bit.)   The few hundred years before the Little Ice Age were much warmer, according to all the evidence.   

So, there is the situation in a nutshell: a burning desire to end oil companies and build nuclear plants for all led to dubious (some say fraudulent) manipulations of science to create a false alarm over man-made global warming.  

Now, to Scott Adams' 14 points.   (The Adams article is an excellent essay; I highly recommend reading it.) 

  1. Stop telling me the “models” (plural) are good.
  2. Stop telling me the climate models are excellent at hindcasting, meaning they work when you look at history.
  3. Tell me what percentage of warming is caused by humans versus natural causes. If humans are 10% of the cause, I am not so worried. If we are 90%, you have my attention.
  4. Stop attacking some of the messengers for believing that our reality holds evidence of Intelligent Design.
  5. Skeptics produce charts of the earth’s temperature going up and down for ages before humans were industrialized. If you can’t explain-away that chart, I can’t hear anything else you say.
  6. Stop telling me the arctic ice on one pole is decreasing if you are ignoring the increase on the other pole. 
  7. When skeptics point out that the Earth has not warmed as predicted, don’t change the subject to sea levels.
  8. If the rate of change of temperature is key, stop telling me about record high temperatures as if they are proof of something.
  9. Stop pointing to record warmth in one place when we’re also having record cold in others.
  10. Don’t tell me how well your models predict the past. Tell me how many climate models have ever been created, since we started doing this sort of thing, and tell me how many have now been discarded because they didn’t predict correctly. 
  11. When you claim the oceans have risen dramatically, you need to explain why insurance companies are ignoring this risk and why my local beaches look exactly the same to me. 
  12. If you want me to believe warmer temperatures are bad, you need to produce a chart telling me how humankind thrived during various warmer and colder eras. You also need to convince me that economic models are accurate. 
  13. Stop conflating the basic science and the measurements with the models. Each has its own credibility. -- (the basic science is badly corrupted, as shown on SLB)
  14. If skeptics make you retreat to Pascal’s Wager as your main argument for aggressively responding [to] climate change, please understand that you lost the debate. The world is full of risks that might happen. 

Sowell commentary on the 14 points. 

  1. The “models” (plural) are good.  Adams correctly points out that having multiple models for climate simulations is just begging for zero credibility.  Here on SLB, it has been pointed out many times that settled, credible science is based on a single model that has been proven over and over: things like gravity, electromagnetism, heat transfer, chemical reactions, and many others.   False-alarmists have more than 20 models (depending on which source one cites), with many different outcomes.    The idea that the models are "good" is also false, as many writers have pointed out repeatedly.  As just one instance, Dr. John Christy (University of Alabama in Huntsville) recently testified before Congress that there is complete disagreement between the climate models' results of great warming and measured data showing no warming in the atmosphere.
  2. Climate models are excellent at hindcasting.  Adams points out that hindcasting is good, but not sufficient for believing a future forecast.   This is an important point in complex-system modeling; we have a great deal of experience in modeling oil refineries and chemical plants.   There are essentially two types of models: the first is based on first principles; the second is based on empirical data.   Only a first-principles model can ever be used for a forecast; an empirically-based model is only as good as the narrow region or range of its data.   In climate science, the future is necessarily outside the range of empirical data.   It is also quite true that the climate models rely far too much on parameterization, which is empirical modeling.   It is no surprise to chemical engineers (such as me) with process modeling experience that the climate models fail, and fail miserably.   There are many other reasons for the climate models' failure, though, including false attribution of causation (blaming carbon dioxide, for example).
  3. Percentage of warming caused by humans versus natural causes.  This point is very good; if humans are responsible for 10 percent of the recent warming, nobody need be concerned, but if it's 90 percent, there is cause for alarm.   As written just above, the Earth is climbing out of a natural, 500-year cold period; of course the Earth is warming.   The fact that climate scientists choose to believe that carbon dioxide has something to do with the warming is more than a bit suspect.   The fact is, the measured rate of warming (and glacier shrinkage) is the same today as it was in 1850-1900, yet even the false-alarmists admit that no change in carbon dioxide occurred back then.  A further temporary warming occurred from about 1910 to 1940, then temperatures actually decreased a bit for 30 years.   All that time, 1910-1970, carbon dioxide was increasing.    False-alarmists simply gloss over this point, to the detriment of their credibility.
  4. Attacking some of the messengers for believing [in] Intelligent Design.  There is a broader point here.  Scientists throughout history have held various beliefs: some in God or gods, some agnostic, some atheistic.   This is entirely beside the point.   The question is: had a scientist who discovered a life-saving thing (e.g., Pasteur, making milk safe by heating it for a short interval, or Dr. Salk with the polio vaccine) held religious views, would it be wise to discard the discovery?   The entire scientific foundation would crumble if modern science had to discard every discovery by every researcher who held a religious view.    For false-alarmists to resort to this tactic suggests they don't have any faith in their science.   (No kidding...)
  5. Charts of the earth’s temperature going up and down for ages before humans were industrialized.   This is, again, the point in a different way of a natural warming period before the Little Ice Age.  However, charts of much longer history also point out the natural descent into glacial periods (100,000 years roughly), followed by natural warming into inter-glacial periods (15-20,000 years roughly).   Clearly, humans had zero to do with those events.                                                                                             
  6. Ice on one pole is decreasing [while] increas[ing] on the other pole.  The Arctic ice is (perhaps) decreasing, and the Antarctic ice is certainly increasing.  Yet, the false-alarmists downplay the increasing ice.  They also trumpet the Antarctic ice that breaks away.  What is seldom mentioned - especially in the press releases - is the Antarctic ice that breaks away is all from the same spot, which is located above an active volcanic area.   What is also not mentioned is the pattern of ice decrease in the Arctic: a very small decline for several years, followed by a substantial decline of a few years, then a steady period over the past few years with no additional decline.  That is certainly not consistent with the steady increase in Carbon Dioxide.   However, that ice trend is consistent with the dark soot particles deposited on the ice from over-the-pole jet aircraft, and the soot from Asian coal-burning power plants.   False-alarmists need to be honest about the causation of the measured phenomena such as polar ice extent.                                                                                                                                       
  7. The Earth has not warmed as predicted, so don’t change the subject to sea levels.  Another excellent point: false-alarmists, unable to answer the obvious flaw in their argument, switch to another topic.   The problem is, false-alarmists resort to various tactics that make them sound like they are hiding something.  They talk about global warming in one breath, showing the temperature trends over the land.   By their measurements, there has been a warming.   But when the entire globe is included - meaning the oceans also - there is almost no warming.  They love to show a graph of the Arctic ice extent, but stop their graph at the recent lowest point.  That gives a false impression that the ice is still shrinking, when it certainly is not.  Changing the subject is a favorite tactic.
  8.  If the rate of change of temperature is key, stop telling me about record high temperatures as if they are proof of something.  Again, with the switch in topics.   It is also very important to know that these supposed record high temperatures are only true after the false-alarmists made multiple, repeated, adjustments to their temperature databases.   Adams does not mention that one, but I will discuss it more in a bit.                                                                                                                                    
  9. Stop pointing to record warmth in one place when we’re also having record cold in others.  This is a favorite technique of false-alarmists: trumpeting the data that supports their argument while ignoring all the contrary data.   That's not science; that's advocacy.  It also is a slap at the intelligence of the audience, who presumably cannot determine that contrary data exists.                                                                                  
  10. Don’t tell me how well your models predict the past. Tell me how many climate models have ever been created, since we started doing this sort of thing, and tell me how many have now been discarded because they didn’t predict correctly.   The false-alarmists really have a problem with their multiple climate models and the wide range of results from them.   Instead of identifying those that best match the actual, measured data (that shows almost zero warming), and discarding those models that are clearly way off, they simply average together all the model results.   In what universe does that produce an acceptable result?   This is not science; it is a mockery of science.                                                                                                        
  11. Claim[ing] the oceans have risen dramatically, [requires that] you . . . explain why insurance companies are ignoring this risk and why . . . local beaches look exactly the same. . ..  The false-alarmists employ two very devious tactics in their claims of sealevel rise.  The first is false attribution of causation; the second is smearing the data by averaging a few data points with all the rest.  The ocean surface measurements clearly show that most of the oceans are not rising very much, if at all.   Those areas that show an increase are typically influenced, even heavily influenced, by land subsidence.   The subsidence is both natural and man-made, the latter from pumping groundwater.   The false-alarmists take these few areas of false sealevel increase and average them with the great majority of the ocean that shows no increase, to produce an average increase of about 8 inches per century.  Then the false-alarmists have the nerve to say the rate of increase is increasing, so the next century will have a rise of 20 to 40 inches.   Yet there is zero increase in the rate; the oceans - even by their false measurements - are rising at the same slow, steady rate as 100 or even 200 years ago.
  12. [If] warmer temperatures are bad, . . . produce a chart [comparing] how humankind thrived during various warmer and colder eras.  You also need to convince me that economic models are accurate.   This is one of the best, in my view.  Adams hits to the heart here, as it is certainly true that human death rates are much higher in periods of prolonged cold than periods of warmth.   The second part of the point is the predictions of economic harm in future warmer decades.  The fact is, almost every economic prediction model is woefully wrong.  Econometrics is rightfully known as the dismal science.                                                                                                                                         
  13. Stop conflating the basic science and the measurements with the models. Each has its own credibility.  This one is also like Adams' first point above, and Dr. Christy's testimony to Congress.  The model predictions, or projections as the false-alarmists call them, are not the science.  Those are simply the results of computer models.  Those model results can be, and have been, compared to an average of air temperatures from around the globe.  It must be stated, and widely recognized, that the basic science has data that is badly corrupted, as shown on SLB.   The credibility that Adams mentions does not exist, in my view.  There is, of course, the irrefutable fact that carbon dioxide can actually absorb heat radiation and re-emit those photons, but only in a very narrow band of wavelengths.  The radiant heat science requires that high-altitude, low-pressure, very cold, and very low-concentration carbon dioxide absorbs and re-emits only a tiny, minuscule amount of heat.   Every heat transfer design engineer knows this.  The credibility of the temperature data is absolutely zero, due to the way the data is collected and manipulated, then adjusted over and over to achieve the desired result.  The credibility of the climate models has been addressed above, and is also zero.
  14. If skeptics make you retreat to Pascal’s Wager as your main argument for aggressively responding [to] climate change, please understand that you lost the debate. The world is full of risks that might happen.   This last one refers to the false-alarmists' demands that every industrialized society reduce or eliminate its fossil fuel use to prevent the globe from overheating and unleashing an entire litany of horrible results.   The list is familiar, and very long: extended droughts, severe heat waves, spread of heat-tolerant tropical diseases, Biblical floods, sealevel increases and shorelines disappearing, with many millions of people flooded out of their homes, crop failures and starvation, ice caps disappearing, stronger and more frequent hurricanes and tornadoes, snow reduced or not occurring at all, low-lying islands disappearing, oceans absorbing more carbon dioxide and sea creatures permanently and badly affected, coral reefs bleached and dead, crustaceans unable to form protective shells, just to name a few.  Adams is correct that using global warming as a scare tactic to spend untold trillions in an attempt to prevent an uncertain outcome is a sure sign that the false-alarmists have no valid arguments.  The "You need this, because what if we are right?" argument is an excellent reason to purchase homeowners' insurance, when the insurance salesman provides actual statistics on disasters that have impacted and still do impact homes.   However, dire predictions that have zero basis in science are no reason to waste trillions and trillions of dollars (or Euros or Yuan, for that matter).     There are, as Adams states, a great number of pressing issues that most certainly will detrimentally impact the future.  The world will need great effort and money to combat those issues.  
Others have studied and compiled lists of pressing needs, such as fresh water, reliable electricity, disease preventions and cures, crop blights, pollution reduction, increased nutrition for billions more population, recycling of truly scarce minerals, a much more resilient electricity grid to withstand a massive solar flare, and many others.  
The Adams post is very good, in my view.    I would add a few more reasons for skeptics to be quite skeptical of the false-alarmists' statements.   Much of this has already been discussed in various articles on SLB. 

The claim that recent warming must be due to carbon dioxide increases in the atmosphere, because the scientists simply cannot think of any other cause.   There are many other known causes of an air temperature increasing over time.  

The fact that scientists keep adjusting the temperature data over and over and over yet again, each time stating (with great solemnity) that they have it right this time.   How many times will the public (and especially, politicians) fall for that scam?   

The many and widespread attempts by the false-alarmist community to ostracize and silence other scientists that hold dissenting views.  In the same Congressional testimony referenced above with Dr. Christy, another prominent scientist, Dr. Roger Pielke, Jr. related his horrible treatment by the false-alarmist community.   All Dr. Pielke, Jr. did was point out in scientific publications that there are no increases in severe weather events.  

The fact that false-alarmists resort to a very old, and very inappropriate, device to create alarm where none is justified: creating an average of a few outlier data points (example, apparent sealevel rise in coastal areas with known subsidence) with the vast majority of data points that show very little or even zero increase, then declare that all the world's oceans are rising and every shore will be inundated.   This trick is pulled over and over again, e.g. Antarctic continental temperatures smeared with a few warming data points from the volcanically-active peninsula, and others.   
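The arithmetic behind this averaging device is easy to demonstrate. The numbers below are purely synthetic and illustrative (not actual tide-gauge data): a handful of subsidence-affected stations, averaged in with many stable ones, produces a small but nonzero "global" rise even though the vast majority of stations show none.

```python
# Illustrative only: synthetic numbers, NOT actual tide-gauge data.
# 95 stations showing no apparent sealevel change, averaged with
# 5 subsidence-affected stations showing a large apparent rise.
stable_stations = [0.0] * 95      # mm/yr apparent change
subsiding_stations = [8.0] * 5    # mm/yr, driven by land subsidence

all_stations = stable_stations + subsiding_stations
average = sum(all_stations) / len(all_stations)

# A few outliers yield a nonzero mean for the whole set.
print(f"Average apparent rise: {average:.2f} mm/yr")  # 0.40 mm/yr
```

The point of the sketch is only the mechanism: a mean is not a typical value when the distribution is dominated by a few extreme points.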

The fact that climate scientists (the good ones, and there are many of them) agreed that the climate temperature data was not very good, then installed excellent temperature measuring equipment in more than 100 pristine locations throughout the US.  This network of climate monitoring stations is known as the United States Climate Reference Network, USCRN, and has been collecting data for the past 12 years, approximately.   SLB has several articles on this, and the trends taken from that data.  There is no warming, certainly none from man-made carbon dioxide. 

The fact that, for almost every single long-term temperature station, a steady increase in temperature is easily seen, even though scientists emphatically agree that no warming from carbon dioxide could possibly have occurred before about 1960.  Yet the long-term temperature records go back to the year 1900 and even earlier.   It is indeed odd that those temperature trends, which do show a steady increase over time, correspond almost exactly to population increases in those cities.  SLB has more than 70 charts of US cities and their temperature records that clearly show this, taken from the climate false-alarmists' own data, from the Hadley Centre and Climatic Research Unit.   Something caused the cities to warm from 1900 to 1960, but it clearly was not carbon dioxide. 

And finally, skeptics should be aware that none, repeat, none, of the dire predictions of false-alarmists have ever proven true, and especially cannot be attributed to the tiny increase in carbon dioxide in the atmosphere over the past 60 years.  

Roger E. Sowell, Esq.

Marina del Rey, California
copyright (c) 2017 by Roger Sowell - all rights reserved


Saturday, May 20, 2017

Is 100 Percent Renewable Energy a Good Thing?

Subtitle: 100 Percent Will Not Happen on a National Basis

It has become increasingly clear that renewable energy forms, especially solar panels and wind turbines, are and will be important sources of electricity.   Along with this fact, some cities and even a few states have declared they will become 100 percent renewable.   What that means is they will somehow obtain all their annual electricity, measured in MWh, from renewable sources.   It should be noted that California (a large and rather stupid state on the West coast of the United States) already sourced more than 27 percent of all power sold in the state in 2015 from renewable sources.   Not all of that was solar and wind, however, and not all of it was generated in-state, either.   California counts renewable energy in what is perhaps a unique way: large hydroelectric power is not part of the counting.   If large hydroelectric sources were included, California would be well past 40 percent for the year. 
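The effect of the counting convention is simple arithmetic. In the sketch below, the 27 percent figure is the one quoted above; the large-hydro share is an assumed illustrative value, chosen only to be consistent with "well past 40 percent," not an official statistic.

```python
# California renewable accounting, 2015. The 27 percent figure is from
# the text above; the large-hydro share is an ASSUMED illustrative
# value, not an official statistic.
renewables_excl_large_hydro = 27.0   # percent of power sold (per text)
large_hydro = 14.0                   # percent, assumed for illustration

total_incl_large_hydro = renewables_excl_large_hydro + large_hydro
print(f"Counted under the CA convention: {renewables_excl_large_hydro:.0f}%")
print(f"If large hydro were included:   {total_incl_large_hydro:.0f}%")
```

The headline number thus depends as much on what is counted as on what is generated.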

A rather interesting discussion along these lines occurred several days ago (and may still be exchanging comments) at Dr. Judith Curry's blog Climate Etc. (see link)   The post was written by an anonymous author, who appears to be involved in some way with the PJM grid, the Pennsylvania-New Jersey-Maryland grid that serves multiple states along the East coast and Mid-West.   I read the post with some interest, then entered the discussion with a few comments.   Rather than replay all that here, this post will discuss some aspects of renewable power production and integration into existing grids.  

As background, a few comparisons between the California grid and PJM.  The California grid is known as CAISO, for California Independent System Operator (see link to CAISO website).

PJM and CA have significant differences, and therefore will have different problems. The most obvious differences are in the thermal and nuclear portions of the generation mix: CA has but one nuclear plant of 2200 MW running, zero coal plants, and a great number of efficient and agile gas-fired plants. PJM, as I understand it, has several nuclear plants, many coal-fired plants, and also some gas-fired plants. A notable gas-fired CCGT plant will soon be brought online in Lordstown, OH for PJM to help cope with wind power changes. (see link to SLB article on this plant)

Next, renewables in CA comprise approximately 11,000 MW of grid-scale solar and 5000 MW of wind. PJM, again by my reading, also has 5000 MW of wind but very little solar at grid scale.

It also must be noted that the CA grid is smaller than PJM, measured in peak load, annual generation, and generating capacity.

• Population – 65 million PJM; 38 million CAISO (ratio 1.71)
• Generating capacity – 176,569 megawatts PJM; 70,900 CAISO (ratio 2.48)
• Peak demand – 165,492 megawatts PJM; 46,000 CAISO (ratio 3.59)
• Annual energy delivery – more than 792 million megawatt-hours PJM; approximately 295 million for CAISO (ratio 2.68)
(the differences are almost entirely due to climate, mild in CA and typical US Northeast – aka brutal winters – in PJM region)
• Nuclear Power installed - 30,000 megawatts PJM; 2,200 in CAISO (ratio 13.6)
• Nuclear as percent of all capacity - 17 pct PJM; 3.1 pct CAISO (ratio 5.5)
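The ratios in the list above can be recomputed directly from the listed figures; a quick sketch in Python (figures as quoted above, rounding mine) makes a useful sanity check on transcribed numbers:

```python
# PJM vs CAISO figures as listed above (approximate).
pjm = {"population (millions)": 65,
       "generating capacity (MW)": 176_569,
       "peak demand (MW)": 165_492,
       "annual energy (million MWh)": 792,
       "nuclear installed (MW)": 30_000}
caiso = {"population (millions)": 38,
         "generating capacity (MW)": 70_900,
         "peak demand (MW)": 46_000,
         "annual energy (million MWh)": 295,
         "nuclear installed (MW)": 2_200}

for key in pjm:
    ratio = pjm[key] / caiso[key]
    print(f"{key}: PJM/CAISO ratio = {ratio:.2f}")
```

Recomputing derived ratios this way is a cheap check before drawing conclusions from a comparison table.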

The problems for PJM will stem from a much higher nuclear share of total annual generation; as has been noted, nuclear plants in the US do not reduce their output to follow load. It appears that PJM has approximately 30 GW of nuclear installed, compared to 2.2 GW of nuclear in CA. However, PJM is surely acutely aware that nuclear plants are closing in great numbers as they reach the 40-year age mark. The next 10 years will see many if not most of the nuclear plants in PJM territory close. CA will close its final nuclear plant in the early 2020s.

Also, PJM is facing offshore wind power generation as a soon-to-be reality, with Maryland already announcing projects of approximately 500 MW. That is small as a percent of the PJM market, but will likely grow quickly.

Another significant difference between PJM and California is that PJM has many states integrated into its grid, compared to just one state for CAISO.  This matters because each PJM member state has its own renewable goal, and those goals are not all the same.  California has just the one state with a renewable goal; however, that goal keeps changing.  

Yet another important difference is resource availability - how much solar energy hits the ground, and how much wind energy exists.  On this, the two grids are just about opposites.  California has what is likely the best solar energy resource of any large state in the country.  Arizona has comparable sunshine, but a much smaller population.   PJM has much less solar energy due to its higher latitude and much cloudier conditions.   Wind energy is small in California - and essentially already tapped out.   PJM states have substantial wind resources that have yet to be harvested.  

Now, to the heart of the matter: can any location, city, or state actually obtain 100 percent of its electricity on an annual basis from renewable energy?  The answer is Yes, of course it can.   The only questions are how much will the grid be modified, and how much will the electricity price change, if any. 

The Climate Etc. article referenced above cited a PJM publication that claims a hard limit exists for a grid at 20 percent annual production from wind and solar.    That may very well be true for PJM, with all the nuclear and coal-fired plants on that grid.   But in California, that is not a problem at all.   Some grid-scale storage is required, which California already has.   More storage will be installed as more solar PV production is installed, keeping the grid safe, reliable, and cost-effective.   

SLB has previous articles on grid-scale storage, with example technologies including batteries, ARES rail-based gravity trains, pumped storage hydroelectric, and others.  

As to achieving 100 percent renewables, that actually will not occur as long as large hydroelectric generation is not counted as a renewable.  The US average for large hydroelectric generation is approximately 7 to 8 percent on an annual basis of total electricity produced.  Various states have different percentages, with Washington State the highest due to the Columbia River and the several hydroelectric dams there.   (for numbers, the US produces approximately 4,000 million MWh each year, of which large hydroelectric dams produce approximately 250 to 300 million MWh each year - source EIA Electric Power Monthly Table 1.1). 

Even if one did count large hydroelectric as a renewable, the near future will not allow 100 percent renewables due to the great number of nuclear power plants (99 of them, although the numbers keep falling as more and more give up and shut down to stop their huge economic losses).    On average, nuclear power produces approximately 18 percent of the US' electricity (recent decade numbers range from 769 to 806 million MWh per year; same EIA source as for hydroelectric).  

Next, coal-fired power plants still produce the lion's share of US electricity, although that is declining rapidly with environmental regulations now in place.  Coal-based electricity was approximately 1,200 million MWh per year in 2016, which is approximately 30 percent of the US total of 4,000 million MWh, same source as for nuclear and hydroelectric.  

Therefore, on a national average, the US presently stands at 30 percent coal, 18 percent nuclear, and 8 percent hydroelectric, which combined provide 56 percent.   Even with nuclear plants closing, as most of them will certainly do within 15 years, and with coal-fired plants shutting down as the economically viable mines that supply them are exhausted, again within 20 years, the US will still have 8 to 10 percent hydroelectric, depending on the annual rainfall.  
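For readers who want to check the arithmetic, the shares above follow directly from the EIA annual figures quoted in this post (a quick sketch; the MWh values are the approximate round numbers from the text, so the computed shares land within a point or so of the 30/18/8 quoted):

```python
# Approximate US annual generation, million MWh, as quoted in the text
# (EIA Electric Power Monthly Table 1.1; treat these as round numbers)
total_mwh = 4000.0
by_source = {"coal": 1200.0, "nuclear": 790.0, "large hydro": 275.0}

# Percentage share of each source, and the combined non-"renewable" block
shares = {src: 100.0 * mwh / total_mwh for src, mwh in by_source.items()}
combined = sum(shares.values())

for src, pct in shares.items():
    print(f"{src}: {pct:.0f}%")
print(f"combined: {combined:.0f}%")
```

The small differences from the quoted 56 percent come only from which year's MWh figures one plugs in.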

Concluding: the post title asks, Is 100 Percent Renewable Energy a Good Thing?   The answer must be yes, but only on a local basis, as engineers take adequate care to ensure grid safety, reliability, and cost effectiveness.   As noted here and elsewhere, California has not had price increases due to a greater and greater share of wind and solar generation.   Indeed, CA prices have barely kept up with inflation (see link to SLB articles on this topic).  The grid has also remained quite stable and reliable. 

For many policy reasons, wind, solar, and other renewables are a very good thing.  Among the many reasons for this are job creation, absolutely free energy that has no foreign implications, inexhaustible energy, pollution-free energy, industries expanding into solar production and wind turbines, other industries racing to produce economically attractive grid-scale storage systems,  and innovations in grid system technology and controls.

Can PJM meet their renewable energy targets (approximately 25 percent by 2025), or will they have insurmountable difficulties?   The answer is likely no, they cannot, not with all those nuclear power plants on their grid, and the many coal-fired plants.  It is quite expensive and time-consuming to shut off a large coal-fired power plant, then start it back up again.  Nuclear plants, of course, simply refuse to do that, citing safety concerns.   So, with nuclear plants on the PJM system stubbornly running all-out, and coal-fired plants losing money if they cycle on and off, PJM has some daunting challenges.   

Good luck to the engineers and planners on the PJM grid.  

It will be quite interesting to observe and report on their progress in meeting the 25 percent renewables mandates, in light of the PJM study that says 20 percent is an absolute hard limit.   

UPDATE 5/21/2017:  With 100 percent renewables clearly impossible on a national level (large hydroelectric, which is not counted as a renewable, will run for the foreseeable future), how then does a city or state intend to achieve 100 percent?  

The City of San Diego (again, in loony California), with a population of approximately 1.4 million, has a Climate Action Plan (2015) that calls for 100 percent renewable electricity by the year 2035.   They gave themselves a full 20 years to accomplish this, being such optimists.    How does this work, in San Diego's version?   The answer lies in a clever trick of buying electricity from a designated generator, but not from all the others that are also on the grid.   San Diego would identify solar, wind, and other renewable-energy generators and claim to purchase power only from them.   The problem occurs when the wind does not blow and the sun does not shine.   Wind is quite erratic in California, so on nights when the wind is barely stirring, how would San Diego find electricity to purchase?  

One answer is grid-scale storage; another is small-scale home-based electricity storage; another is time-shifting demand for electric vehicle charging; yet another (remote possibility) is long-distance power imports.  

Grid-scale storage already occurs in California (and other states) via pumped storage hydroelectric and massive batteries; a rail-based gravity storage system is also under construction on the California-Nevada border near Pahrump. 

Small-scale home-based storage is already available from several vendors including Tesla. 

A scheme favored by many renewable advocates is using millions of EVs (electric vehicles) as grid loads while the batteries are charged up during sunny days or windy nights.  Then, the cars are plugged in at home, where the car's batteries can supply power to run the big-screen TV and DVD player, the lights, charge the cell phones, and even run the refrigerator.    One must wonder just exactly how big that car's battery must be to achieve all that.  Another consideration is, if the car is running the house at night, how will the car be able to travel the next morning to the workplace?
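One can put rough numbers on the battery question (a back-of-envelope sketch; the pack size, reserve fraction, household overnight load, and commute energy are illustrative assumptions of mine, not data from any EV maker or from this post):

```python
# Illustrative assumptions (not from the post):
battery_kwh = 60.0          # a mid-size EV battery pack
usable_fraction = 0.8       # keep some reserve to protect the pack
home_overnight_kwh = 15.0   # evening + overnight household use
commute_kwh = 8.0           # roughly a 25-mile round trip at ~0.3 kWh/mile

usable = battery_kwh * usable_fraction
left_for_driving = usable - home_overnight_kwh

print(f"usable energy: {usable:.0f} kWh")
print(f"left after powering the house overnight: {left_for_driving:.0f} kWh")
print("can still commute:", left_for_driving >= commute_kwh)
```

Under these particular assumptions a mid-size pack could do both; a smaller pack, a bigger house load, or a longer commute quickly changes the answer, which is exactly the sizing question raised above.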

Importing power over very long distances is an interesting scheme.  California has already implemented a version of this, both importing and exporting power, to help balance the grid as more and more solar power plants are brought online.    The term is regional integration, or regional energy market (see link to CAISO presentation from 2016).  The basic idea is to integrate several states into a regional energy grid, much as the PJM grid has done on the East coast, MISO does in the Midwest (see link), and others.   California can no longer be an island, in the electrical-grid sense.   With multiple states on a common grid, the hope is that the wind will always be blowing at some location, providing power that can be exported to users that are becalmed.   The problem with this, of course, is that on a day with strong wind over the entire region, far too much wind power would be produced.   As that happens, grid-scale storage would be needed to absorb the production.  ---  end update 

Roger E. Sowell, Esq.
Marina del Rey, California
copyright (c) 2017 by Roger Sowell - all rights reserved

Topics and general links:

Nuclear Power
Climate  and here
Fresh  and here
Free Speech.................... here

Tuesday, May 2, 2017

Chemical Engineers, CO2, and Absorptive Re-Radiation

Subtitle:  Fired Furnaces Have Strong Radiating CO2; Atmosphere Does Not

A discussion on WUWT over the past few days led to the following question to me, and my answer just below.   The topic is, once again, whether the Greenhouse Effect (GHE) is real, and carbon dioxide in the atmosphere can and does warm the Earth's surface below.  

One commenter asked me to comment on this statement in italics below.  My reply is below that.  

"[There is ] not even a semblance of a testable hypothesis regarding the GHE, GHGs, and all the rest of the claptrap spouted by those who should (but obviously don’t) know better."

I must disagree on this one.  My knowledge squares with the statement by Professor Richard Lindzen (MIT), that GHG warming is trivially true but numerically insignificant.  

As proof of GHE, or better expressed, radiant heat absorbed and re-radiated by CO2 and water vapor, my chemical engineering colleagues and I refer to the well-known properties of luminous gases in fired heaters.  It turns out that furnace design (for industrial fired heaters that burn coal, oil, natural gas, or other such carbonaceous fuels) requires an adjustment for the combustion products' composition.  Otherwise, the furnace does not work as expected.  

The basic chemical engineering and heat transfer textbooks all have a comprehensive section on this.  The key parameters are gas composition, gas pressure (and hence the CO2 and water vapor partial pressures), flame temperature, and mean beam length.  Here, beam length is the radiating distance.   That distance is not far in a furnace, typically measured in inches.   See e.g. Perry's Chemical Engineers' Handbook, 8th Edition (2008) pg 5-15 et seq.   The same material is in the same Handbook, 5th Edition (1973) (which I used in undergrad), starting on pg 10-48 et seq. 

The upper atmosphere where the CO2 and water vapor radiate energy back to Earth, aka TOA or top of atmosphere, has far different properties compared to a fiery furnace's interior.   It turns out that heat radiated is far less for colder temperatures, lower partial pressures, and longer beam length.  TOA is at approximately minus 50 deg C; CO2 is present at only 400 ppm of a total pressure of (probably) one-tenth bar, so its partial pressure is very, very low; and the beam length is measured in miles, not inches.   In contrast, a furnace's radiant heat section is at approximately 2600 degrees F, the partial pressures are much, much higher (10 to 15 percent of one atmosphere, not ppm), and the beam length is a few inches. 
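The temperature contrast alone is dramatic, since radiated flux scales as the fourth power of absolute temperature (Stefan-Boltzmann).  A quick sketch using the two temperatures quoted above; note this compares only blackbody fluxes at those temperatures, while the gas emissivity terms (composition, partial pressure, mean beam length) enter as a separate multiplier in the Hottel chart method:

```python
# Stefan-Boltzmann: radiated flux = sigma * T^4 (temperatures from the text)
SIGMA = 5.670e-8  # W/m^2/K^4

furnace_K = (2600 - 32) * 5 / 9 + 273.15   # 2600 deg F -> ~1700 K
toa_K = -50 + 273.15                        # minus 50 deg C -> ~223 K

flux_furnace = SIGMA * furnace_K**4         # blackbody flux at each temperature
flux_toa = SIGMA * toa_K**4
ratio = flux_furnace / flux_toa

print(f"furnace: {furnace_K:.0f} K, TOA: {toa_K:.0f} K")
print(f"T^4 flux ratio: {ratio:.0f}x")
```

The ratio comes out in the thousands, which is the "colder temperatures" part of the argument in numerical form; the partial pressure and beam length effects must be evaluated separately.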

And that is why (I am certain) that Dr. Lindzen says the GHE is trivially true (we know for a fact that CO2 and water vapor can absorb and re-radiate), but numerically insignificant.  The values are so small as to be meaningless. 

The absorption and re-radiation of IR by CO2 and water vapor absolutely is true.   It's all a matter of degree. 

The chemical engineers have known this for decades.   It is why our fired furnaces work, in the countless millions, around the globe, 24/7.  And, it is also why global warming due to man's production of greenhouse gases, GHG, (primarily CO2 is blamed) is absolutely a false alarm.  

To believe that a tiny few ppm of CO2 at very cold temperatures and miles up in the atmosphere can measurably increase the Earth's average temperature requires chemical engineers to dis-believe what we know to be true about fired furnaces.  

Roger E. Sowell, Esq.
Marina del Rey, California
copyright (c) 2017 by Roger Sowell - all rights reserved


Sunday, April 30, 2017

Diablo Canyon Nuclear - Power Replacement via Solar

Subtitle: Replacing Nuclear with Solar also Requires Storage

An interesting question was posed to me on Dr. Curry's post recently at her blog Climate Etc. The question and my answer are shown below, but first a bit of background. 
Diablo Canyon Nuclear Plant, via Google Maps 2016.
Arrow indicates twin reactors.   Pacific Ocean to the bottom right. 

California has but two nuclear power reactors left running, both at the Diablo Canyon nuclear plant near Morro Bay, right on the Pacific Ocean shore.  The 40-year operating license for each reactor expires in a few years, and PG&E announced it will not seek a 20-year license extension but will shut them down in 2024 and 2025, respectively.   PG&E also announced it would compensate for that power with a mix of renewable energy, conservation, and grid storage.  My June 2016 article on this on SLB is at this link.   The increased renewable energy likely poses several issues for the grid operator.   

A commenter on Dr. Curry's post put the following question to me:

“. . .does California have enough regulatory authority to demand PG&E replace Diablo Canyon only with the renewables and with energy conservation measures; and further, to directly and explicitly prevent PG&E from placing greater reliance on natural gas for servicing California’s electricity demand?”

The short answer is probably Not. The long answer is more complex.

Renewables generation in California is regulated under multiple regulations, including (but not limited to) the RPS (Renewable Portfolio Standard) (see link).

The law requires PG&E (and other utilities) to procure 33 percent of their power sold in 2020 from renewable sources. PG&E is reported to already have 43 percent renewables under contract for 2020.

The next milestone for RPS is 50 percent by 2030. Clearly, PG&E must find a source of more renewable energy to meet the 50 percent requirement. With the Diablo Canyon reactors closing in 2024 and 2025, PG&E has both the need and the opportunity to replace approximately 2,200 MW with renewable sources. However, RPS does not work on installed capacity; it requires the kWh delivered to be from renewables.

As most everyone knows (and detractors cannot stop shouting it), wind and solar power do not run 24/7; they have approximately a 25 percent annual capacity factor in California. Thus, about 8,800 MW of wind, solar, or both would be required to replace the retired nuclear output. However, installing that much renewables would likely create grid issues, and would put PG&E in the 60-percent range for RPS.
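The 8,800 MW figure is just the capacity-factor ratio (a sketch; it treats the 2,200 MW of nuclear as running essentially full-time, which slightly overstates things since nuclear capacity factors are closer to 90 percent):

```python
nuclear_mw = 2200.0       # Diablo Canyon capacity being retired
nuclear_cf = 1.0          # simplification: treat nuclear as always-on (~0.90 in practice)
renewable_cf = 0.25       # approximate CA annual capacity factor for wind/solar
HOURS_PER_YEAR = 8760

# Annual energy to replace, then the renewable nameplate needed to match it
annual_mwh = nuclear_mw * nuclear_cf * HOURS_PER_YEAR
renewable_mw = annual_mwh / (renewable_cf * HOURS_PER_YEAR)

print(f"energy to replace: {annual_mwh / 1e6:.1f} million MWh/yr")
print(f"renewable nameplate required: {renewable_mw:.0f} MW")
```

Using a 90 percent nuclear capacity factor instead would trim the requirement to roughly 7,900 MW, still several times the retired nameplate.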

The state does not have many, if any, untapped wind resources left, so the new installations will be solar. The economics of solar thermal are not as good as solar PV, therefore the new installations will be solar PV.

The main point is that solar PV must have some form of backup due to the intermittency of sunshine. Those aspects are well known: night, clouds in daytime, seasonal variation, solar eclipses (a big one is coming in August 2017), and normal outages for maintenance.

The usual backup in California is natural gas-fired plants that use combined cycle gas turbine (CCGT) technology. Indeed, CA law requires a minimum efficiency for fossil power plants that essentially dictates that only CCGT can be built. Therefore, it is safe to conclude at this time that PG&E will very likely build, or purchase power from, approximately 8 to 10 GW of new solar PV in the next 7 to 8 years. That will also require the installation of some amount of gas-fired CCGT.

The situation would change if and when grid-scale batteries (or other storage) are sufficiently economic to be installed for baseload power. The economics for batteries are already good for peaker power plants. Whether the battery installed cost declines sufficiently by 2022, when investment decisions must be made, of course is not yet known.

I do not always agree with what Planning Engineer writes, but on one thing we do agree: a large amount of solar power on a grid creates the problematic Duck Curve. California is already managing very well with a substantial Duck Curve, with recent numbers showing thermal generation in mid-day at 11 GW and 26 GW at the peak 8 hours later (data from 4/29/2017).

However, if an additional 8 to 10 GW of solar generation is installed, the grid would have thermal generation of 2 to 3 GW in mid-day, then must produce 26 GW only 8 hours later. That is a problem for the grid planners and operators. It remains to be seen how all this will play out.
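The planners' problem can be stated as an average ramp rate (a sketch using the round figures above; actual ramps concentrate in the final few hours of the afternoon, so the peak ramp rate is higher than the average computed here):

```python
def avg_ramp_mw_per_min(start_gw, end_gw, hours):
    """Average ramp rate needed to move thermal generation between two levels."""
    return (end_gw - start_gw) * 1000 / (hours * 60)

# Current duck curve: 11 GW midday thermal to 26 GW at the evening peak, ~8 hours
today = avg_ramp_mw_per_min(11, 26, 8)

# With ~8-10 GW more solar: midday thermal falls to ~2.5 GW, same evening peak
future = avg_ramp_mw_per_min(2.5, 26, 8)

print(f"today:  ~{today:.0f} MW/min average ramp")
print(f"future: ~{future:.0f} MW/min average ramp")
```

The deeper midday trough raises the required average ramp by more than half, which is exactly the problem handed to the grid planners and operators.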

Roger E. Sowell, Esq.
Marina del Rey, California
copyright (c) 2017 by Roger Sowell - all rights reserved


Saturday, April 29, 2017

Electrical Grids - Coal-fired Baseload and Batteries

Subtitle: Can Coal-Fired Power Exist with Grid-Scale Batteries

There is an increasing interest in what the electrical grids of the near future will have as generating assets and loads, as evidenced by a short memo from Secretary of  Energy Rick Perry this past week.  see memo below. 

This 4-14 Memo is quite interesting for what it says, and what it does not say.  Others have reacted (apoplectically and hysterically, in some cases) to the absence of the word "environment" or derivatives thereof.   

The Secretary did use the words "reliable and resilient," "affordability and fuel assurance," "technologically advanced," "affect the economy and national security," "diminishing diversity of generation mix," "changing nature of electricity fuel mix," "previous policies to decrease coal-fired power generation," "market-distorting effects of federal subsidies," and "regulatory burdens," but made no mention of environment. 

First, the overview of what the Secretary is doing here.  This is entirely my judgment and not based on any insider information.   This 4-14 memo is part of the President's policy and campaign promises to revive the US coal industry.  Most of the coal production has provided electric power in conventional, Rankine-cycle steam power plants.  Recent regulatory changes by the Obama administration resulted in many coal-fired power plants closing.  Essentially, the coal-fired plants closed because they are now required to remove various pollutants from their stack gases, but cannot afford to retrofit the plants with the pollution control equipment.   This is a part of the "regulatory burden" the Secretary referred to. 

The balance of the intent is to study grid reliability as baseload power is reduced, primarily from coal-fired plant closures.   For background, in recent years the overall US grid has had nuclear and coal-fired plants retired, with natural gas-fired plants, solar, and wind power plants installed.  

So, there is the intent: can the policies of former administrations that brought abrupt closure of so many coal-fired power plants have an adverse effect on grid reliability, resiliency, and electricity affordability?  Next, how can the Trump administration justify bringing back coal-fired power?

Second, a brief analysis of each of the terms featured above, with emphasis on coal-fired plants and renewable power plants.

Reliable and Resilient

Electrical power grids are required, by law, to provide safe, reliable, affordable electricity to consumers of all types, be they residential, commercial, industrial, transportation, or other.   It is interesting that the 4-14 memo does not mention safety.   Perhaps that is simply assumed.  Some states also require an environmental aspect of generating plants, based on pollutants such as sulfur oxides, nitrogen oxides, particulate matter, and in California's case, carbon dioxide emissions. 

Reliability on a grid means that the power is available when the consumer wants to use the power, or on demand.   The electricity must meet certain quality standards, such as but certainly not limited to frequency and voltage.  Interested readers are encouraged to read the IEEE publications on grid reliability. 

Resilient means the grid supplies power as required under any scenario: planned shutdowns, emergency shutdowns, long-term drought, though not necessarily severe weather events such as ice storms, tornadoes, and hurricanes.   Generating assets are expected to stay online during severe weather, but transmission and especially distribution lines will likely be disrupted for a short time.  As an aside, NRC safety regulations require that a nuclear plant be shut down in advance of a predicted hurricane that could or will put hurricane-force winds at the nuclear plant.   The grid must take all this into account. 

Coal-fired plants historically have good reliability and resiliency, that is, if they have sufficient fuel on hand.  That topic is addressed next. 

Affordability and Fuel Assurance

Affordability is not only a goal for electrical power grids, but is required by law.  The agencies that oversee electric utilities typically allow electricity rates that provide the utility a modest return on investment, such as 10 percent.   The idea is that the electric utility should not get rich by having an exorbitant return on investment, such as alcoholic beverage companies enjoy.   At the same time, a utility cannot merely break even, or the bonds that finance its spending would go unsold.  

The crux of the matter is that different generating technologies have different wholesale costs for the power produced, if all are to obtain the same 10 percent return on capital employed.  Many studies of actual and projected investments have been performed and typically conclude as follows: the lowest-cost generating assets are natural gas-fired combined cycle plants (CCGT), natural gas-fired steam plants, large high-pressure coal-fired plants, and hydroelectric plants.   The most expensive, or highest-cost, are nuclear, simple-cycle gas peaker plants, solar thermal, and offshore wind.    The others are somewhere in the middle: solar PV, onshore wind, geothermal, and a few others.   Nuclear plant advocates will dispute this, of course, but they cannot rationally defend their position. 

However, as stated above, coal-fired plants are shutting down in record numbers because they cannot afford to install the required pollution abatement equipment.  

Fuel assurance is a topic that has several aspects, most notably the variability of wind and solar power.  This is no surprise; wind has always varied, and the sun's intensity at the ground is affected by many factors including season and clouds.   In fact, a notable event will occur at California's solar plants this year as a solar eclipse (partial in California) occurs in August. 

Coal-fired plants are touted as having substantial fuel assurance with 60 days or more fuel supply stockpiled at the plant.   This is not always true, as coal deliveries are affected by rail shortages, shipping delays due to ice on lakes, and other reasons.  see link to a recent SLB article on coal fuel supply problems. 
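For a sense of scale, a 60-day stockpile is an enormous quantity of coal (a rough sketch; the plant size, capacity factor, heat rate, and coal heating value are illustrative assumptions of mine, not figures from this post):

```python
# Illustrative assumptions (not from the post):
plant_mw = 500.0                      # a mid-size coal-fired unit
capacity_factor = 0.7                 # typical baseload utilization
heat_rate_btu_per_kwh = 10_000.0      # typical subcritical coal plant
coal_btu_per_ton = 20_000_000.0       # ~10,000 Btu/lb bituminous coal
stockpile_days = 60

# Daily generation, then the coal tonnage needed to fuel it
kwh_per_day = plant_mw * 1000 * 24 * capacity_factor
tons_per_day = kwh_per_day * heat_rate_btu_per_kwh / coal_btu_per_ton

print(f"coal burned: ~{tons_per_day:,.0f} tons/day")
print(f"{stockpile_days}-day stockpile: ~{tons_per_day * stockpile_days:,.0f} tons")
```

A quarter-million tons on site per mid-size unit is why rail shortages and shipping delays, not pile size alone, determine real fuel assurance.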

Technologically Advanced

A grid that is technologically advanced could mean any number of things.  Recently, the term "smart grid" has been advanced.   More on that in a bit.   Coal-fired power is one of the oldest and least advanced forms of power generation.  Coal-fired power certainly predates nuclear power, modern solar PV, modern wind turbine power, and CCGT.   There are and have been some advances in coal-fired power, however, such as pulverized coal, high-pressure designs, and more recently, coal gasification with gas turbines.   It is not entirely clear if the 4-14 memo intends to investigate all the various forms of coal-fired power plants.  Clearly, if any of the coal technologies were economic, they would have great market share.  The fact is, they do not.  

The smart grid concept has many aspects; one worth noting is that the consumer has a display in his home or business that indicates the cost if more load is added to the grid.  The consumer can then decide to flip the switch right then, or postpone the power consumption to a later time when prices are lower.  

A smart grid from the generation aspect can provide reliable power when intermittent generation is providing power, such as wind turbines and solar PV.   One key aspect of the smart grid is grid-scale storage by batteries or other means.   More on that below. 

Affect the economy and National security

Economic effects due to grid reliability are well-known, where blackouts disrupt commerce.   The argument for coal-fired plants is that they have been highly reliable (again, when fuel is available) and it's not their fault that blackouts sometimes occur. 

What the Secretary means by including national security is a bit obscure.  Certainly, military installations are grid-supplied.  However, they also have more than adequate back-up plans for self-sufficiency when needed.  

Diminishing diversity of generation mix

As stated above, coal-fired plants are closing in record numbers.  The grid is stable, though, because more than enough wind, solar, and natural gas-fired plants are installed.   On a state-wide basis, the generating mix in California has also changed.  Specifically, half of the nuclear plants were shut down almost 5 years ago, and substantial solar power has been installed.   The state has transitioned from 4 GW of nuclear and almost zero solar, to 2 GW of nuclear and 10 GW of grid-scale solar production.   It is noteworthy that grid stability remains, and power prices have not skyrocketed.  

The Secretary is more concerned with the national mix, and primarily the coal-fired plants located east of the Rockies.  Those states do not have superb solar resources, but they do have outstanding wind resources in many areas.  

Changing nature of electricity fuel mix

This is essentially the same as the previous topic, diversity of generation mix. 

Previous policies to decrease coal-fired power generation

This is a direct reference to the various environmental laws that the Obama administration placed on coal-fired power plants.   The result is clear, as above, with coal-fired plants closed or closing in record numbers.    This is a tangential reference to environmental aspects. 

Market-distorting effects of federal subsidies

The issue of federal subsidies for power generation has a long history and much debate.  Various camps shout that the "other guys" are being subsidized while their favored technology is not.   Accusations abound, but a rational, fact-based analysis shows that almost every form of electric power generation benefits from grants, loan guarantees, tax credits on investments, direct subsidies, regulations that favor that technology, and more.  SLB has extensive articles on this.   Nuclear power, for example, has subsidies in the form of direct payments for new nuclear plants of 2.3 cents per kWh generated for the first ten years, complete indemnity under the Price-Anderson Act for harm caused by a radiation release (above a modest insured amount), changes to safety regulations to allow continued operation, new plant construction loan guarantees, direct subsidies for existing plants to keep operating as a jobs-protection program, and others.    Nuclear power itself resulted from government research, and was promoted by Eisenhower as a way to show the world that atomic power has peaceful uses, not just the terrible destruction of atomic and hydrogen bombs. 

Coal-fired plants enjoyed a decades-long exemption from Clean Air Act requirements under the "grandfather" clause.  

Solar, wind, and other forms of renewable energy have various forms of subsidy, including investment tax credits, direct subsidy of 2.3 cents per kWh, and in some cases, construction loan guarantees.   Renewables also have a priority in the generation mix, however curtailment certainly occurs under some conditions.  

Finally, almost all of the hydroelectric power in the US was built by federal funds.   The Hoover Dam and its power plants on the Nevada-Arizona border is but one example. 

Regulatory burdens

This is a reference to much of what is already written above, the various regulations on subsidies, environmental exemptions, but also state-mandates for Renewable Portfolio Standards (RPS).   RPS typically requires utilities to obtain a stated percentage of all power sold from renewable forms.  Such renewables include wind power, solar PV power, solar thermal power, geothermal, biogas, biomass, and small hydroelectric.    Many states have RPS or a functional equivalent; California's RPS requires 20 percent renewables by 2010, 33 percent by 2020, and 50 percent by 2030.  The state easily met the 20 percent requirement, achieved 27 percent in 2015, and has contracts for 43 percent by 2030.    Hawaii has one of the most ambitious RPS, with 70 percent by 2040 and 100 percent by 2045.    see link

The concern, clearly, is what will become of coal-fired baseload generating plants as RPS standards are implemented.  It is notable that California has no problems with its substantial renewables, primarily due to the lack of exactly such inflexible baseload coal and nuclear plants.   Flexible natural gas-fired CCGT provides California with more than adequate ability to meet power loads alongside approximately 10 GW of grid-scale solar, 4 GW of wind on a good day, and 2 GW of steady geothermal and other renewables.  

No mention of environment

Memo 4-14 does not explicitly mention the environment, but as above, makes passing reference to such in regulatory burdens and previous Administration policies.   This was very likely by design, as there is substantial pressure to re-open the pollution debate over coal-fired power plants. 

Grids and Batteries

The major unknown at present is the technology for grid-scale batteries.  SLB has several articles on grid-scale storage and batteries.  The fact is, such batteries already exist and are operating in many locations.   The problem today is their cost to install.  That cost is steadily declining, however.  Also, new battery technologies are in research and development.  Great advances are being made.   

The coal-fired power industry, and all those involved in it, from mining and transportation to power plant design and construction, pollution control systems supply, and plant operation, are keenly aware of the dramatic transformation that awaits when, not if, such batteries achieve reduced installation costs.  

Such batteries then allow wind and solar power to be stored as that power is produced, then fed back into the grid on demand for load-following or even baseload power.    

An application already under construction in Los Angeles, California for Southern California Edison uses batteries, charged by solar power, to replace a costly gas-fired peaker power plant.   Also in California, batteries are used on Santa Catalina Island to allow diesel-powered generators to run at a steady pace; the batteries are charged at night, then discharge to meet demand each day.   Grid-stabilizing electronics already exist; the only issue is the cost of the battery systems.  

As with many technological advances, the early systems have a high cost and will supplant only the highest-cost production.  This is the case today in Los Angeles, with the batteries replacing the high-cost peaker power plant.    As more batteries are installed, as technology improves, costs will be less.  As those cost reductions occur, less-costly power generation can be replaced.  

There may actually be a need for a small amount of baseload, rotating-generator power generation such as coal-fired steam plants produce.  (In California, coal is not allowed so that would be from natural gas-fired baseload power).   Presently in California, where the grid peaks at approximately 50 GW, the baseload is approximately 30 percent of that at 15 GW.   These issues are different for each state.  Some of the key factors are the amount of industrial load and building cooling load.  California, of course, has very little of either.   Illinois, on the other hand, has the industries in Chicago and a great number of nuclear plants and coal-fired plants. 


The results from the 4-14 memo from Secretary of Energy Perry will be quite interesting.  The memo requires the study to be delivered in 60 days, so by approximately June 13, 2017. 

It will be interesting to see the arguments, and the facts behind them, for keeping existing coal-fired power plants open.  

Roger E. Sowell, Esq.
Marina del Rey, California
copyright (c) 2017 by Roger Sowell - all rights reserved
