The Tuition Addition

Education debt soars as college tuition rises in step with Piketty-style inequality, closing off opportunity in America

Want a good job? Experts say get an advanced degree, yet tuition costs continuously rise while salaries continuously drop.  If you take on excessive debt at market rates, you must accept the grinding consequences.  The data show a clear and shocking correlation with Piketty’s “income inequality” (the workers’ fraction of all U.S. wages has been falling since 1981).  Student debt has been rising at a well-known but astonishing rate.

Post high-school education is expensive and has been growing ever more so for some time, with no good way to pay for it. This should worry anyone considering how to survive our present and future reality of shriveling opportunity in America.

The NCES (National Center for Education Statistics) is the government agency that reviews current and past American education.  We use current NCES post-secondary (post high-school) tuition trend data to understand what has happened over the past 40+ school years (1969-70 to 2011-12), then demonstrate a correlation with Saez-Piketty economic changes.  See the Postscript for details on our data choices.  Click any figure for a full resolution image.

Tuition and Associated Fees

The NCES provides averaged cost data for tuition and fees, for R&B (room and board), and for the total cost.  The R&B data really reflect cost-of-living averages across localities and are not an education-based parameter. Our charts report just the tuition+fees baseline cost of education; R&B is explicitly excluded.  But be aware:  depending on the location, R&B can add another $10,000/yr to the total cost.
Pretty sobering for prospective students.

Four Year Schools

Universities and colleges that issue 4 year baccalaureate degrees.  NCES provides separate data tables for averages over public 4 year schools and over private ones.


Fig 1   Tuition+Fees — Upper: 4 year private schools; Lower: 4 year public schools

Fig 1 shows the dramatic rise in costs at both public and private schools.

    • Public post-secondary school: All state funded universities and colleges. Think of the University of California schools and the much less expensive California State schools.
    • Private post-secondary school: Paid for by tuition, fees, endowments, and overhead charges on grants. Examples: Bryn Mawr, Harvard, Caltech.

If they are so expensive, why the demand for a private university?  Does a B.S. from Caltech mean a better understanding of undergraduate physics than one from the University of Michigan or Texas?  That is arguable.  But does a graduate from Harvard or Yale have a more powerful or influential network of friends than one from a state university?  Certainly. Since most successful steps are taken through one’s network of insiders, it is always better to graduate from the best school possible.  Thus, bad-student George W. Bush went to Yale.

Four Year vs. Two Year Schools


Fig 2  Tuition and fees —  Upper: 4 year public schools,  Lower: 2 year public schools

    • Two year schools provide low-cost classes, allowing students to satisfy requirements for their desired four year school and minimize non-major requirement costs.
    • Two year schools issue Associate Degrees to acknowledge successful study.
    • Two year schools usually support trade apprenticeships with high quality technical/vocational training. This is one of the greatest values they provide their communities.

Fig 2 compares the annual cost of a 2 year vs. a 4 year school. The upper curve is the same data shown as the lower curve in Fig 1.  The relative flatness of the 2 year trend reflects the chart display scale; 2 year costs actually rose to more than 2½ times their 1980 level, in constant 2012 dollars.

The 4-year vs. 2-year ratio changed, too.  In 1980-81, a class at an average 4 year public school was twice as expensive as the same class at an average 2 year college.  Today, the factor is close to 3.
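The ratio arithmetic can be checked in a few lines. The dollar figures below are illustrative placeholders chosen only to match the ratios in the text; they are not actual NCES values:

```python
# Illustrative tuition+fees in constant 2012 dollars (placeholder values,
# NOT actual NCES figures -- chosen only to reproduce the ratios in the text).
tuition = {
    "1980-81": {"4yr_public": 2300, "2yr_public": 1150},
    "2011-12": {"4yr_public": 8650, "2yr_public": 2950},
}

for year, t in tuition.items():
    ratio = t["4yr_public"] / t["2yr_public"]
    print(f"{year}: 4yr/2yr cost ratio = {ratio:.1f}")

# Growth of 2-year costs over the period, in constant dollars:
growth = tuition["2011-12"]["2yr_public"] / tuition["1980-81"]["2yr_public"]
print(f"2-year cost multiple since 1980-81: {growth:.1f}x")
```

With these placeholders the ratio moves from 2.0 to about 2.9, and 2-year costs rise to roughly 2.6 times their 1980-81 level — the pattern described above.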

Tuition and Inequality

All three school types display a break in their trend lines at the 1980-81 school year. The presence and timing of such a visual break suggests we look again with eyes accustomed to Saez-Piketty graphs.  Our review of Thomas Piketty’s 2014 book is here; our first discussion of the inferred income pump is here.

Public vs. Top 10% Earners

Fig 4  Public school tuition and the top 10% Saez-Piketty graph.

Private vs. Top 10% Incomes

Fig 3  Private school tuition and the top 10% Saez-Piketty graph.

Figures 3 and 4 show the 4 year institution data against the background of Saez-Piketty data: the fraction of total U.S. income taken each year by the top-earning 10% of all workers. Even without clicking for full resolution, you can see that the tuition curves are remarkably similar to the upper-income fractions.

The Finals

The data show the double whammy being inflicted on American society:

    A. NCES trend:  University tuition has risen high enough to expel most lower-wage earners from post high-school education and is challenging even families near the 90% income point ($130,000). Higher tuition closes off access to advanced degrees, to opportunities to enter influence networks, and to good jobs.
    B. Income trend:  Meanwhile our ultra-high earners (the top 0.5% of the earning population) divert to themselves an ever-larger share of the income flow, reducing the savings of median workers and assuring that their current buying power dwindles away.

Trends A and B have severe impacts on personal lives.  While our country’s industry was “offshored” (shipped abroad) and desk work “outsourced” (contracted abroad):

  • People are told that to stay in the “middle class,” they must have college degrees,
  • Students must take out new and ever-larger loans to pay for the ever-rising tuition.

Fig 5  Total Outstanding Student Debt, Number of Borrowers with Outstanding Debt, and Average Balance, Relative to 2005 Fourth Quarter, 2005 to 2012

Fig 5 is from a CollegeBoard (makers of the SAT tests) report and shows how student loan debt has changed since 2005.  Total loan debt is now just over twice its 2005 level.  Debt increased by about 50% between 2007 and 2012, but the same report shows that loans from the Federal government actually decreased by a factor of 2 over the same period.

In 1999 the total loan debt was $119 billion; by 2014 it was an astonishing $1.2 trillion.  Note that manufacturing offshoring and outsourcing started in the late 1970s but really grew during the last 15 years.

Figs 3 and 4 show that trends A and B have advanced hand-in-hand since 1981; they are correlated and do reinforce each other.  But one does not necessarily cause the other.  How many other ingredients contribute to this societal witches’ brew?

As time moves forward, trends A and B could sharpen much further; America could morph into the land of no opportunity.  In such a future, most people would be unable to better themselves, and unequal life experience would be the norm. In such a world, aristo children, the scions of the top 1%, would get the good degrees, enter the insider networks, and get the good jobs.

As Thomas Piketty says over and again, it does not have to happen this way; we could change things and open opportunity again to all.  But the more probable path is for tuition costs to increase as the income transfer to the richest few intensifies.  That is, opportunity will most likely drop while inequality grows.


Fig 6:  From 2014 Jul 17  MetroTrends blogsite

Update 2014 Jul 27:   Maia Woluchem and Taz George published a report on the Urban Institute’s Metro Trends blog that included this chart of the steady rise of student debt (the rising diagonal on this chart).  They show 2014 debt at $1.1 trillion, slightly different from our $1.2 trillion.  A very good short write-up.


Charles J. Armentrout, Ann Arbor
2014 July 23
Listed under General
Have a comment? Click on the title of this post, go to bottom, let us know.
Related posts: Click the INDEX button under the Banner picture


Postscript

Over the years, NCES has modified its database, dropping some parts, expanding others, and consolidating its reports.

Private vs. Public schools.   NCES also averages the two averages for the typical annual cost of a 4 year school; private costs about three times public (Private ≈ 3×Public). If tuition costs were to stop rising, a student at a typical private school would pay around $64,000 more for a 4 year degree, not counting R&B. “Typical” is nearly useless for planning.

Not-For-Profit vs. For-Profit.   One difficult post-secondary category separates FP from NFP institutions.  FP education is a territory inhabited by predatory organizations, what used to be called “diploma mills,” though not all FP schools are diploma-mill predators.

Advanced education is a demanding, high-stress activity, and people not in the education tradition expect (demand?) a diploma when money is exchanged. You pays your money and you gets your car, right?

Some FP providers play on this attitude to extract big money – the racket: replace quality education with a pretty, nicely printed diploma.  FP education is buyer-beware territory.  We use only NFP school data.

A relatively new classroom replacement called the MOOC (Massive Open Online Course) makes good-guy/bad-guy classification murky. A recent New York Times report says that a MOOC with 700,000 enrollees might graduate a total of 7,000 students.  MOOCs are different from traditional classes; students are on their own to reach understanding.  Some schools require students to come to campus for testing; others allow it to be done on-line.  MOOCs may be natural soil for diploma mills to root and flourish, but it is far too early for a final judgment on the MOOC movement.

Separating predators from the education herd may be impossible because of our cultural acceptance of the economic home-run ploy.  Even NFPs tend to channel the legal maximum to their top people: president, executive staff, full professors, etc.  Ann Arbor is the traditional home base of the University of Michigan (UofM).  The attitude: charge the highest that can be extracted from students and research grants, and UofM gains.  What is wrong with a little greed, anyway?  The flow of easy money lubricates the entire system and, carried to its logical end, this argues that to get things done, we (the U.S.) need graft and bribery throughout society.  This attitude of our times makes education fertile ground for predators to stalk.

About The Graphs   You can lie with charts almost as easily as with statistics.  For example, all three tuition curves would appear flat if the data were plotted vertically between $0 (bottom) and $1,000,000 (top).  It is easy to hide trends with such tricks of the graphic arts.

We try to get past chart lies by plotting all the data on same-sized grids.  The vertical scales were made comparable by requiring the data to fill about 2/3 of the vertical space, so the tuition curves all have about the same max/min vertical span.  Finally, we do not smooth the data by averaging over time.
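The 2/3-fill rule can be sketched as a small helper function. The function name and sketch are ours, not from any charting package:

```python
def axis_limits(values, fill_fraction=2/3):
    """Choose y-axis limits so the data span fills a fixed fraction of the
    vertical space -- the rule used to keep the tuition charts comparable."""
    lo, hi = min(values), max(values)
    span = hi - lo
    total = span / fill_fraction      # full axis height needed
    pad = (total - span) / 2          # split the empty space evenly above/below
    return lo - pad, hi + pad

# Example with a tuition-like series (dollars):
ymin, ymax = axis_limits([2300, 3100, 5200, 8650])
# the data span (8650 - 2300) now occupies 2/3 of (ymax - ymin)
```

Applying the same rule to every series makes the vertical "busy-ness" of the curves comparable, which is the point made above.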

Posted in General

World Oil Production – Our Finite World

Just a note on a blog we’ve discussed before.   A perceptive analysis of our civilization’s current status in tapping its finite oil reservoir has been posted on Our Finite World.  It is called World Oil Production at 3/31/2014–Where Are We Headed?.

As we have pointed out before, Our Finite World has many carefully crafted analyses on the usage of our petroleum reserves; this one is a must-read if you are interested in the situation.   Personally, I think of it as a needed update to LastTechAge’s  Patterns in World Oil Production.

Gail Tverberg discusses production from Russia, Saudi Arabia, Iran, China, Iraq, the US, and Canada – a meaningful list.  It includes a very perceptive analysis of the price implications.

You can see the issue in her first graph (Figure 1) of world crude production. The curve is increasing, but the rate of rise is dropping off and production is starting to flatten, though it is not yet at a peak.  She uses linear graphs, which are more intuitive than the LTA-preferred semilog ones.  Her points are clear.

Read it then think – how good are projections of future prosperity if viewed under the shadow of outrageously expensive oil?

George Bush filled our Strategic Petroleum Reserve in the early 2000s, when oil was about $35/US-barrel.  Today it is in the neighborhood of $100/US-barrel (a factor of nearly 3 over the decade).  Multiply by only 2 for the coming decade, then make a do-it-yourself projection of how we will be living.


Charles J. Armentrout, Ann Arbor
2014 July 23
Listed under Natural Resources  … thread   Natural resources > Oil


Piketty Under Attack

Piketty is under attack by the Far Right. They dispute the data, or imply he is a hidden Communist.

Thomas Piketty published the English translation of his Capital in the Twenty-First Century (C21) and pulled his massive open database into the bright light of fame. Reviews started arriving in March; praise came from economists in the moderate center to center left. The far left thinks it is incomplete, that it ignores important things. The strongest pushback, though, has been from the fringe on the political right. A small sampling of responses:

Positive: Eduardo Porter, 3 reviews from economists, Jennifer Schuessler, Steven Erlanger, Paul Krugman.

Negative: James Pethokoukis, David Brooks, Clive Crook (neg), Thomas Edsall, Chris Giles, plus Scott Winship and Ross Douthat (their comments below).

C21 is incomplete: Clive Crook (pos) later version; modern left-of-center thought (Andrew Mackay); Neil Irwin. There are many more that say C21 has incomplete solutions than fit in this tiny sampling.

The LastTechAge review of C21 is here.

The Attack

Suddenly, the economic trends illuminated by 100 years of data are universally available. The charts are easy to use; the trends visually correlate with remembered events (if we are old enough).
Suddenly, the outlandish proposal for a global progressive tax enters cocktail party conversation.

The far Right cannot leave Piketty be. The point of his analysis is nearly intuitive: too hard to miss, dangerous to discount. Many successful decades of rightist achievement could be undone.

The first strategy is to paint Prof. Piketty as a communist. He is French and susceptible to mutual Yankee/French distrust – commie commie commie ought to work. The second is to attack C21 as flawed in as many categories as possible. The same tactic worked against the right in 2013, when the Reinhart-Rogoff claim that government debt varies inversely with growth was demolished by outside analysis (Herndon-Ash-Pollin, Krugman).

So the two fronts in the Piketty attack are Name-Calling and Doubt-Casting.
Click any figure for full resolution image

Paint Him Communist


Fig 1  Ross Douthat, NYT Columnist

One attack came in Ross Douthat’s 2014 Apr 19 New York Times Op-Ed column, Karl Marx Rises Again. Douthat uses ‘Marx’, ‘Karl’, or ‘USSR’ 13 times in his screed of 14 paragraphs. This does not count calling the book “Capital,” as in Das Kapital. Douthat’s column indicated that he was not the first to imply that Piketty is a French communist.

The idea is to imprint nasty commie into your brain whenever you hear “Piketty.” Douthat …

Piketty himself is a social democrat who abjures the Marxist label. But as his title suggests, he is out to rehabilitate and recast one of Marx’s key ideas…

Hmm, “…as his title suggests…“? We revisit this in the next section. Douthat continues – Dr. P observes that normal market forces mean the acquisition of new wealth by the currently rich must naturally suck out the income reserves of the poorest part of society. Piketty does indicate this, with the observation that the wealth-to-income ratio converges to the ratio of savings rate to growth rate, and with the implication of what happens when the return (r) on invested wealth exceeds the growth rate (g).
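A compact restatement of the two relations just mentioned, in our own notation (following the book's symbols):

```latex
% First fundamental law: capital's share of national income
\alpha = r \times \beta
% where r is the rate of return on capital and \beta the capital/income ratio.

% Second fundamental law: in the long run the capital/income ratio converges,
\beta \longrightarrow \frac{s}{g}
% where s is the savings rate and g the growth rate of the economy.

% Divergence condition: wealth concentrates relative to incomes whenever
r > g
```

With slow growth (small g) and steady saving, β rises, and if r stays above g the income from accumulated wealth outruns wages — the mechanism Douthat recasts as dogma.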

He tries to convert this into a Piketty dogma. We have discussed this many times, starting with our post Zero Sum Game. This is not an ideological postulate, Ross; it is a data observable stretching back over 200 years (France) and 100 years (US). Other industrialized countries’ data vary as to when their records began. These are matters of economic record. If Douthat denies this, he must assume the robes of the old Soviet censors, expunging and rewriting history.


Fig 2: Scott Winship

Douthat quotes Scott Winship of the Manhattan Institute (a right-edge group founded in 1978 by William Casey and Anthony Fisher, funded by the Koch Family Foundation and others).

Winship’s review is pretty cool. He starts by charging that on page one (the Introduction), C21 quotes from the Declaration of the Rights of Man and the Citizen.

Winship: that Declaration says “… to the effect that all inequality should be viewed as suspect.”


Fig 3 Start up of the Introduction

What he does not explain: the Declaration of the Rights of Man and the Citizen was written in 1789 as a foundation of the French Revolution. Benjamin Franklin helped write it, and it is about as communistic as the American Declaration of Independence.

Golly! What an indictment, Scott! The rest of Winship’s writing is about as accurate so we will move on.

Both efforts were pretty sad, but Winship at least seems to have read the book and tries to glean snippets to attack, comments similar to his attack on the Rights of Man. Douthat, though, appears not to have read C21 at all. In passing: Clive Crook’s April comment (link at top of post) starts with a bust of Karl Marx labeled “This guy wrote a manifesto, too”.

This kind of name calling ‘with intent to damage’ works for Limbaugh and others – why was it not swallowed whole by the American public? My take: by the 1950s, the USSR was recognized for the terrible dictatorship it was, operated by the nomenklatura aristocracy. There were always a few American true-believers and a few U.S.S.R. spies, but by the 1960s most communist penetrations succeeded by appealing to base greed. Two decades after the fall of the USSR, communist agitators do not threaten our Republic.


Fig 4 Simon Kuznets, Kuznets Curve

Actually, Piketty is highly sarcastic about the dogmatists/ideologues of the last several centuries, Marx included.  He dismisses Marxist thinking and praises Simon Kuznets‘ data analysis, but says the Kuznets Curve used too short a timeline and drew faulty inferences.  (Rightists use the K-Curve to justify unregulated capitalism.)

Piketty (C21) says that going from Marx to Kuznets is like going from “Apocalypse to Fairy Tale.” Even so, Piketty repeatedly calls Kuznets his intellectual predecessor.

Why pick on Piketty personally?


Fig 5 Capital In The Twenty First Century 2014

You cannot attack a database unless you have one, too.  So: destroy the person.

The flow of capital is not something American economists have spent much time on since WW-II.  Do capital-flow research, get branded “commie.”   Also, the publisher designed a bright, clean, arresting cover, which happened to cause knee jerks on the Right. Think of the cover as red-meat bait for attack dogs; this one is a doozy.

The actual title is six words: Capital In The Twenty-First Century. The layout makes it appear to be simply Capital. But a true subtitle would have been a standalone phrase like “Economics in the twenty-first century.” The red border gives it the manifesto-look of flaming rhetoric.  These were the traps that caught Ross Douthat.  The actual book is a careful analysis of data, not a flaming manifesto of any sort.

Even the New York Times Sunday Review of Books got caught. Here are two C21 descriptions from 2014 Jul 6.

BestSellers listing: A French economist’s analysis of centuries of economic history predicts worsening inequality and proposes solutions.

Separate display caption on image of top sellers: “Piketty’s neo-Marxist account of capitalism and inequality is No. 3 in its 11th week…

‘BestSellers’ isn’t too bad. But Piketty does not “predict worsening inequality”; he uses data and demonstrates it happened. Most people picture economists as mathematicians, priests, shamans, horoscope readers. Dr. P says the opposite: that economists are more like archeologists than physicists.

‘Display Caption’ fell into the trap. This could have been written by a Heritage Foundation Spin Meister trying to sound neutral.

As Steven Erlanger points out (see top section), Piketty takes on all economists who work from dogma. Erlanger says Piketty’s work is a challenge both to Marxism and laissez-faire economics, because “both count on pure economic forces for harmony or justice to prevail.”

If you hear someone say Piketty’s ideas are a “soft Marxism” you know that he/she is frustrated over the lack of concrete critical points.

Database Integrity


Fig 6: Chris Giles, Financial Times

There have been a few lists drawn up of discovered faults. Near the end of May, Chris Giles, the financial editor of the UK-based Financial Times, published his broadside (ref at top of post): there are, he claims, many structural issues with C21.

Giles’ points are in the reference at the top of this post; Piketty’s response came the same day. Read both for details. A good summary is by Neil Irwin. Highlights: The data files are open and on-line, not hidden. They are diverse and heterogeneous, requiring assumptions to make them usable. The assumptions are described in many places and are the team’s judgment calls; anyone is free to substitute their own. Since everyone lies about income, why use tax data? Piketty: tax data may be the most reliable source, since people lie to every method of gathering income data.

And so it goes. Name calling may work on Talk Radio or Fox TV, but not with the kind of audience that follows inequality issues.  If the well-funded, right-wing gnomes are going to refute the C21 trends, they must present analytical facts that stick. Their candles for midnight work are still lit, but the game keeps enlarging as Piketty, Saez, and others publish newer and newer analyses that continue to validate C21.


Charles J. Armentrout, Ann Arbor
2014 July 12
Listed under Economy  … thread   Economy > Inequality


ABM Test Finally Succeeded, Maybe

Successful test of troublesome ABM used known-flawed kill vehicle. Meaningful or theater?

The U.S. successfully tested its anti-ICBM missile on 2014-Jun-22, with a launch from Vandenberg AFB in California.  The news was trumpeted everywhere!  The LA Times says the launch came just after noon (Pacific Time) and could be seen in nearby towns.  I used to enjoy seeing Minuteman launches from much further south, near San Diego.

The GBI (Ground Based Interceptor, also called GMD for Ground-based Midcourse Defense) dates back to the 1980s SDI program and has a very bad history (see our final ABM report last year).  So we have to ask about fraud, graft, and conspiracies to protect hidden activities and cash flows.


Fig 2    GBI launch,   probably 2014 Jun 22 (source: MDA)


Fig 1    MDA logo from its Website

Apparently, the GBI’s EKV (Exoatmospheric Kill Vehicle) did make contact with a simulated attacking warhead.

But, if you exclude the patriotic hype, the Missile Defense Agency released very little about this test.

Our Under The Dome series notes that a previous Glorious Success was declared when an interceptor partially hit a target that was actually sending radio “look-at-me” beeps.  GBI had no history of working properly when our idiot president (too young to use Reagan’s excuse) declared it operational and started distributing a federal fortune to contractors.  As late as 2011, it was discovered that its EKV could not work correctly … it had been declared operational for a couple of years prior to that …

We pointed out that a tactical ABM need only swat the incoming warhead aside; destruction is not needed to protect its military target, and tactically it is okay if collateral damage happens.  GBIs, though, are used against the horribly expensive ICBM threat; these things deliver H-bombs, and everything a successful attack delivers might be called collateral damage in the tactical theater of war.  The EKV of an effective GBI must stop – halt – inside its target if it is to deposit its energy into the target for destruction.

Did the EKV deliver its energy potential to the target?  What we do know is that one of the early (and kinda functional) EKV units was removed from an ‘operational’ GBI for this test.

We tried to explain the support that the GBI has always received in our post Knights To Battleships.  Pictures of the system are awesome, the program’s verbiage is highly patriotic, and the leaders are tough-lookin’ good old boys, too.  How could you not respond positively? (… unless you are not a real Amurcan.)

So.  Something happened on June 22.  A GBI was launched.  The old and known-defective EKV is reported to have contacted its target.  The question remains – is this supposed to mean something?  ABM defense is a valid concept – wish we had a valid system; but we are paying valid money for it.

Update 2014 Jul 20: Comment by Dean A. Wilkening, LLNL: “If you’re going to rely on that as an operational system, one shouldn’t be too surprised that it does tend to fail more than you’d like.”  A really good one-sentence summary of the GMD system.


Charles J. Armentrout, Ann Arbor
2014 July 3
Listed under Technology  … thread   Technology > Aerospace


Piketty discovers America or vice versa

Piketty describes the growth of U.S. and world inequality with a uniquely large data resource. Our review of Capital In The 21st Century

The French economist Thomas Piketty published the French edition of his new book Capital In The Twenty-First Century in 2013.  (Call it C21.)  The English version came out in April 2014.  Finally.

Click any image for full resolution.  Click here to jump to the review.


Fig 1:  the book causing all the fuss.

Fig 1  is a link to the Harvard University Press website.  Review copies were passed around after Christmas and breathless reports  started appearing by February.

The response was immediate and strong long before the planned release date; I can almost see/hear rioting mobs outside the Harvard offices…  The mobs probably never happened, but by the end of March the publisher agreed to early release of its English translation of C21.

My own impatience pushed me over the top in mid-March when The American Prospect ran its critique.  I pushed until I received one of the first copies in our area from a local bookstore.    So I was ready by the time Krugman and Brooks had face-to-face comments in the New York Times issue of April 25.


Fig 2:   Thomas Piketty (2014)

First Things First.

Americans tend to pronounce Piketty’s name as either PICK-etty, à la Paul Krugman, or pe-KET-y, à la moi.

Oh – no no no No!

Dr. Piketty is a native Frenchman.  As Krugman wrote, “who knew it should be PEE-ket-EE?”

First name?  A guess:  probably “toe-MAS“

The buzz

This book made inequality the word du jour from mid-winter to early summer.  The hype started prior to his April tour of the U.S.; the pre-release descriptions generated standing room only at nearly every appearance.  Stunning – and probably the most stunned was the serious scholar Thomas Piketty himself.  As most people know, C21 hit the top of the Best Seller lists (!) and stayed there.  Copies evaporated from booksellers’ shelves.  Harvard University Press reports the highest sales for any new book in its century of business … this for the hardcover edition of a scholarly tome!

My best guess: most people are not reading it in any detail.  If you can get it, read at least the Introduction.  It has good, interesting background and lays out the ideas underlying economic inequality.

At its end, C21 presents a discussion of how to avoid the successful diversion of the world’s wealth into the hands of an ultra-tiny, ultra-powerful, and ultra-rich group.

Why C21 is important – Review

C21 provides inferences from data on wealth, income and inequality available nowhere else.  It is a data-driven discussion on the diversion of capital and income into the wealthiest segment of our population.

Piketty wrote C21, but he is part of a large cooperative team assembling the WTID (World Top Incomes Database), available to all here.  This is the world’s largest database on trends in the income flowing to the different earning sections of populations world-wide.  He has announced that WTID will be transformed into the much more inclusive WWID (World Wealth and Income Database).  These workers are special – Thomas Piketty in France, Emmanuel Saez in the U.S., Anthony Atkinson in the U.K., and all the others.  The effort provides unrestricted access to information not otherwise obtainable anywhere.

Piketty contributed French data extending back to the late 1700s.  Atkinson did the same for England, using data from the 19th and 20th centuries.  Saez got hold of U.S. tax records from 1913 forward.  Saez’ work caught my attention in 2009 and, to a very important extent, LastTechAge started in 2011 because of those data (see our Zero Sum Games). I shifted from an experience-based belief that our technical competence had eroded for most of my productive life to a data-based understanding of why.

What  Capital In The 21st Century  Says

C21 is a detailed book for non-economists written by a scholarly author; it runs to 577 pages, not counting the notes and a good index.

The May 25, 2014 issue of the journal Science published a 5½ page overview by Piketty and Saez, one of the articles in its 50 page special issue on the science of inequality.  Although aimed at a slightly more technical professional audience, I think it is a really good introduction to C21.

The WTID group works with historic raw data, such as tax returns, and generates many charts.  It does not use the differential-equation modeling typical of many economists.  The team adjusts the raw data using clearly stated assumptions, so that the results are consistent with each other.  Results are placed in Excel files.

Piketty, Saez, and team draw their inferences from those data, not from ideologically “reasonable” models.

The U.S. trends

Cmt-A   Definition of Wealth, Income, Output

Cmt-A gives my non-economist descriptions of the important concepts, though the Output local/global comment is per the book. Don’t blame Dr. P – all errors are mine.

Note: Piketty uses wealth and capital interchangeably and explains his reasons in the book.

I am focused on income inequality trends, but C21 actually shows  global trends on income and wealth, industrial production and growth.  Our comments here will stay with what has been going wrong in America.

Fig 3 shows Piketty during a presentation.  Fig 4 shows the same data from the WTID, independently plotted.


Fig 4:  LTA graph of same data base 1917-2011


Fig 3:  Piketty presenting iconic top-earning 10% of population graph

C21 has a fruitful way to look at the data.  There is always a fixed 100% of all income flowing through the economy.  Divide the total group of earners into various sub-groups, from the lowest-paid earners to the highest.  Then determine what fraction of that 100% total flow goes into each sub-group.  You have to work at it, though – the two percentages are easy to confuse, and many commentators are diverted by the changing total dollars in motion.
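The distinction can be made concrete with a toy calculation (the numbers are illustrative, not WTID values):

```python
# The two easily-confused percentages: total dollars in motion vs. a
# group's share of the flow. Illustrative numbers, not WTID values.
total_then, total_now = 100.0, 300.0   # total income flow, arbitrary units
share_then, share_now = 0.66, 0.50     # bottom-90% share of that flow

dollars_then = total_then * share_then
dollars_now = total_now * share_now

# Dollar income more than doubled while the share fell from 66% to 50%;
# watching dollars alone hides the shrinking fraction.
print(dollars_then, dollars_now)
```

A commentator watching only the dollar totals would see growth everywhere; the share-of-flow view is what exposes the redistribution.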

Informed inferences are there for the making; but you must be willing to correlate WTID results with 100 years of U.S. history.

Post WW-1, the highest-paid 10% raked in ever-larger fractions of the money moving through the economy.  For example, one can see the ’29 crash, the quick resurgence of the wealth percentage, and its growth stopping at 45% of total flow with the 1935 progressive tax law.  Real equilibrium emerged about 1945, when WW-2 ended.

The equilibrium share for the top-compensated 10% was 34% of income (with small fluctuations).  C21 uses the French phrase Trente Glorieuses (“30 glorious” years) for the equilibrium decades between 1945 and about 1980 or ’81.

Reagan’s Economic Recovery Tax Act of 1981 undid Roosevelt’s progressive tax, and you can see the sudden upward shift in share as ever-greater income flow was diverted to our fine & deserving Toffs.  This means that the rest of us had to accept ever-decreasing fractions.

The WTID provides historical data to use for comparisons.  Other data sets may be fruitfully compared with Piketty’s database charts, and LastTechAge has frequently used them for correlative support.  Here are some examples …

Fig 5:   Lowest earning 20% + top 10%

Fig 5 shows Census Bureau data on the lowest-paid 20% of the population, with the highest-paid 10% shown in light blue for visual correlation.  The Census data set started late and is noisy (they are not so interested in the lower earning levels), but it is perfectly believable that the 1967-1980 data are just fluctuations about a flat line.  The lowest 20% start losing income share in 1981, just when the income share for the top 10% begins its spectacular rise.


Fig 6:   Fusion research + Top earning 10%

Fig 6 shows the timeline for the U.S. fusion energy budget – the American Fusion Engineering Act of 1980 allocated a massive increase in funding and set the stage for a shift from basic research on why things happen to the engineering how-to, with Prototype 1 of a net-power-producing magnetic fusion machine.  Correlations to the WTID anchor data suggest why fusion funding never really started up.

LastTechAge calls it a Zero Sum Game when growth by one faction requires a corresponding reduction for another.  With our apologies to game theory, we have a number of posts on the idea.  (Refer to our Index under the header for related posts.)

We will continue to probe other data sets using C21‘s WTID data as a correlation baseline.

Visible even in our example Figs 5 and 6: unrestricted capitalistic acquisition of wealth seems to focus on maximization of the personal family wealth of select individuals, even to the detriment of the society that makes financial success possible (my statement, not Piketty’s).

In the next section, we discuss how Piketty used the measured total Wealth of the U.S. and its total Income to draw inferences.

Now – Inferences from the math

Someone who comments on a book should read at least the introduction; it wouldn’t hurt to read a couple of paragraphs of Chapter 1 and probably the last page or so at the end.  Many C21 critics seem to have done none of those.

Even in his introduction, Piketty states that, without a solid database, economics is not science, just speculation.  On page 2, Piketty dismisses data-free discussions as “a debate without data“; on page 3, he refers to ideological hypotheses as “the dialogue of the deaf“, each side invoking “intellectual laziness” to point at the other side’s lazy habits; on page 11, in reference to Marx and Kuznets, he says that without data one can find “theories” that range from “Apocalypse to Fairy Tale“; page 3 again: “Social scientific research is and always will be tentative and imperfect. It does not claim to transform economics, sociology and history into exact sciences.”  These are not the words of a rigid ideologue, though Greg Mankiw tries to frame him as one.

Cmt-B  Predictive Econ is science fiction

Economic math models (even those with differential equations) start as the verbal ones do: maybe from divine revelation of an axiom set, maybe from a set of blindingly clear but hypothetical truths.

Several of our posts say that Economics is not Science.  Cmt-B discusses the SciFi imagery.

Piketty scorns economists who use complex mathematical models, but he does use a few simple equations.

For me to detail how the math is used would surely need more than the book’s 577 pages.  For good discussions of what it all means, read Piketty’s book or the Science journal summary.


Cmt-C: Algebra of Capital in the 21st Century

The few math relations used are shown in Cmt-C.  The content of the resulting inferences is why some high-wealth people are trying to undermine his credibility.

The second law generates concerns.  It is given without proof, and it means that, over a long time period, a country’s beta value (its total wealth divided by its total income) gradually adjusts to match its average rate of savings divided by its growth rate.  This assumes that everything stays in equilibrium: no awful wars, no asteroid hits, no supervolcano at Yellowstone Park, etc.

If true, and if the savings rate is constant, a country needs continuous growth to keep its wealth from concentrating into a few hands.

If growth g gets really small (no growth: static population size and static manufacturing productivity), the ratio s/g must become really big.  By the 2nd Law, (s/g) = β = (K/Y); since capital wealth K cannot grow to unbounded infinity, the income Y averaged over the population must become very small – lots of earners with subsistence paychecks.
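The 2nd Law arithmetic is easy to play with; the savings rate below is an assumed value for illustration, not a measured one.

```python
# 2nd Law (asymptotic): beta = s / g, with beta = K/Y the ratio of
# total capital to total annual income. Watch beta blow up as the
# growth rate g falls toward zero. s = 10% is an assumed value.
s = 0.10  # fraction of national income saved each year

for g in (0.03, 0.01, 0.005, 0.001):
    beta = s / g
    print(f"g = {g:.1%}  ->  beta = K/Y = {beta:.1f}")
```

At 3% growth the country’s capital stock is about 3 years of income; at 0.1% growth it balloons to 100 years of income, which is the concentration worry in miniature.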

Oh, wow… static societies develop into inherited aristocracies, with the majority as starvation-level serfs!?

I have a small question about the 2nd Law.  Beta is measured in years, but the law’s convergence ratio is a pure number.  Where did the ‘year’ dimension go?  I am a physicist, and all our valid results keep the measurement dimensions consistent: if it starts as ‘years,’ it must remain ‘years.’  Maybe my issue is only because I have not seen how this convergence was proven?

The r > g condition is more serious.  When the rate of return on capital investments exceeds the growth rate of the country, you should also expect wealth concentration over the long run: when the capital return rate r (return on capital investment) is bigger than the rate at which the income of the bulk of the population grows, the folks with inherited wealth acquire resources faster (much faster?) than all others.  For really large r, new capital flows to the very rich and well connected; all others lose their pool of resources.

r > g is serious even without the second law.  With the 2nd Law, it acts as a strong accelerant.  (Think of what an arsonist uses to start building fires.)
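The r > g compounding can be sketched in a few lines; the rates below are illustrative assumptions, not Piketty’s measured values.

```python
# Sketch of r > g compounding: capital grows at return rate r while
# national income grows at g. Rates are illustrative assumptions.
def capital_to_income(years, r, g):
    """How much the capital/income ratio multiplies after `years`."""
    wealth, income = 1.0, 1.0
    for _ in range(years):
        wealth *= 1.0 + r
        income *= 1.0 + g
    return wealth / income

# One 30-year generation at r = 5%, g = 1.5%: capital pulls ~2.8x ahead.
print(round(capital_to_income(30, 0.05, 0.015), 2))
```

Run it over two or three generations and the multiple compounds again, which is the dynastic-wealth mechanism in its barest form.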

Piketty’s team’s best model for capital growth uses an economy with successive shocks – sudden and unanticipated changes in the fortunes of various subsets of our economy.  We are currently passing through such a “shock.”  In 2009, the rich and well connected got huge government bailouts plus monster bonuses.  But manufacturing had been hustled offshore, and most of us found only lower-paying jobs waiting to replace what we lost.

Finally:  The model is not the thing, the  map is not the territory.

The model points toward wealth concentrated into a few hands, especially if you include resource depletion and climate change as forcing elements.   But the issues are complex, and include forces that push toward inequality but also those that push away.

Piketty’s proposal: reinstate a progressive tax rate (PTR).  Suppose the capital return rate r does try to grow above growth g; government can shift the r/g ratio.  After all, turning the PTR off shut down the 30 Glorious Years.  A PTR is a way to reduce the rate of net capital return r so that wealth agglomeration does not happen.  And it is why he is under such extreme attack by the ultra rich and their (much poorer) far-right followers.

These seem to be our choices:  (A) a feudal oligarchy where the very rich rule our impoverished people, (B) a revolutionary oligarchy where the very rich rule our impoverished people, or (C) an increased share of the support for our country paid by those who rose to the top via the American avenue of opportunity.

Piketty says that his model shows a potential direction but does not predict the actual future.  (C) is certainly possible; the world is not required to follow the path of inequality that leads to injustice.  We could, in fact, return to the “Glorious Years.”  It is just that, in his opinion, (C) with its PTR action is the choice the country is least likely to accept.

The final statement of the Piketty and Saez inequality summary:

“To summarize: Inequality does not follow a deterministic process.  … There are powerful forces pushing alternately in the direction of rising or shrinking inequality. Which one dominates depends on the institutions and policies that societies choose to adopt.”


Charles J. Armentrout, Ann Arbor
2014 Jun 26
Listed under   Economics    thread    Economics > Inequality
Have a comment? Click on the title of this post, go to bottom, let us know.
Related posts: Click the INDEX button under the Banner picture

Posted in Economics | 2 Comments

NIF shifts toward success – 4

NIF addresses the universal RT instability that all implosion fusion schemes must face.  Is there more that they can do?

NIF people limited the impact of the RT instability in their most recent report, but they have more to do and surprises to absorb before they achieve their goals.  This is our 4th post in this thread.  A summary of our previous 3:

    1. Gain: Target implosions produced about as much energy as was applied.
    2. Enhanced heating: The fusion reaction is starting to cause more fusion. First step towards an ignited plasma core.
    3. Not in-control:  The stagnation core at the end of the implosion is an inefficient donut with hard peanuts on one side, should be a plum with a hard/hot central core.
    4. Something is not working as expected:  Are there hidden issues that block success?

This is a “popular” discussion, not a truly technical one, but we must estimate one more technical topic if we want to understand.  You may skip the close-up discussion: click Summary and jump past the nitty-gritty details.

Click any image to expand to full size.


Fig 1  RT mixing of water

Rayleigh-Taylor Instabilities

The RT (Rayleigh-Taylor) instability is the hard reality that ICF researchers must accommodate. When a heavy liquid pushes on a lighter one, RT mixes the boundary with growing turbulence.

Fig 1 shows the beginnings of RT driven tendrils from the red (heavier) water into the lighter, clear water.

Click here to see the full discussion of this effect.

RT is a universal instability.
It is insidious because any bump, no matter how small, will grow and never stop growing.


Fig 3   Simulated collapse of a target due to bumps on the target surface


Fig 2  Target cutaway

Fig 2   is the target design (from our previous post).  Click this link to see the Specification Table.

Fig 3 shows a computer model of an imploding ICF target, driven by non-uniform beams; this was done at LLNL and displays the RT turbulent structures. The top half of the target is shown.  The structure causes havoc when the fuel compresses into its stagnation-point core.

RT starts with very small bumps on the interface. In Fig 3, the largest tendrils are from the fuel penetrating into the low pressure in the cavity.  The bumps grow exponentially, driven by RT mixing caused by the monstrous drive acceleration, not Earth gravity.

Tbl A   Formulas to calculate RT effects – simplest model

Now the math:   The answer to:
Will RT appear?  … is always YES, whenever a dense fluid pushes against a lighter one.

The correct question is:
WHEN will RT effects start to  mess things up?  … this  has to be estimated.

Table A defines our terms.  A target has two interfaces that can generate RT instabilities:  • the Shell Wall to Fuel surface and  • the Fuel to inner-cavity surface.

StartHeight is the size of the initial bump.  The smaller the bump, the longer it takes to be noticed, but bumps grow like the penny stacks discussed in the “exponentially” link, above.

BumpHeight is the bump’s size at a time t after the beams have turned on.

Gamma is the rate of growth of a bump in the interface.  It determines how fast the bump rises: the bigger gamma, the faster the growth, and the sooner RT turbulence becomes an issue.

Doubling Time  A bump doubles in size every time the clock ticks through Ln(2)/Gamma.  Doubling times may be easier to understand than growth rates, and most calculators with a [Log] key have an [Ln] key too.

Estimating Gamma means we have to estimate 3 quantities:


Tbl B Atwood values for target shells

Atwd is the Atwood number: the difference between the densities of the 2 materials divided by their sum.  Table B gives some values for this function.

Case A applies to the inner surface of the fuel layer: solid DT on one side, essentially nothing in the cavity.  Cases B, C, D, and F refer to the interface between the pusher shell and the solid fuel.

Case F is for the shell/fuel interface in the current NIF target.  A value of Atwd = 0.5 for F instead of 0.6 would make the RT growth rate 15.5% lower than case D, a simple PVA plastic shell with a cryo fuel layer.  The 0.5 might justify the much larger target expense, but we will stay with the conservative 0.6 guess for F.

KMS built the shells used by most ICF labs in the 1970s and ’80s.  Here is my link for our simple estimates.

WavN is the Wave Number, in units of “per cm”: basically, 2π divided by the average distance between successive RT peaks.  We assume the number of peaks is 10; if it were actually 8 or 12, the square root in the growth rate would change the effects only a bit.

Accel was discussed in the Target Demands table (post 3 of this thread) and is 7.2×10¹² m/s².  It enters the RT growth rate through its square root, too.
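With all three quantities estimated, the Table A growth rate follows in a few lines.  Only the acceleration and the ~0.6 Atwood guess come from this post; the densities and peak spacing below are assumed stand-ins for illustration.

```python
import math

# Back-of-the-envelope RT growth per the Table A relations:
#   Gamma = sqrt(Atwd * WavN * Accel),  BumpHeight = StartHeight * exp(Gamma * t)
# Accel (7.2e12 m/s^2) and the ~0.6 Atwood value come from the post;
# the densities and peak spacing are assumptions.

def atwood(rho_heavy, rho_light):
    """Atwood number: density difference over density sum."""
    return (rho_heavy - rho_light) / (rho_heavy + rho_light)

accel = 7.2e12                 # drive acceleration, m/s^2 (from the text)
atwd = atwood(1.05, 0.25)      # CH shell ~1.05 g/cc on DT ice ~0.25 g/cc -> ~0.6
spacing = 0.7e-3               # assumed ~0.7 mm between RT peaks
wavn = 2 * math.pi / spacing   # wave number, 1/m

gamma = math.sqrt(atwd * wavn * accel)  # growth rate, 1/s
t_double = math.log(2) / gamma          # bump doubling time, s

print(f"Atwd = {atwd:.2f}, Gamma = {gamma:.2e} /s, doubling every {t_double*1e9:.1f} ns")
```

With these assumed inputs a bump doubles every few nanoseconds; how many doublings fit in a shot depends on how long the acceleration actually acts (Table C’s growth factors of 5-6 correspond to between two and three doublings).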


Tbl C    RT Instability growth, Summary

Table C shows that, at both boundaries (Shell|Fuel and Fuel|Cavity), any and all imperfections grow by a factor of 5 or 6.  This means the target fabricators must keep all surface bumps below some maximum size, or the targets will fail to meet the goals for high efficiency: the shell will mix with and “poison” the fuel, or the fuel will not converge into a spherical core.

The same argument goes for the drive beams; they must all turn on at the same time so that implosion irregularity does not cause locally (much) worse RT effects.  Of course, beams that are hugely out of balance will destroy the collapse without any help from RT.  This was the topic of the last post, where we pointed out that ICF labs have been serving up donuts for decades.


Tbl D      RT limits to flaw sizes

Table D is our bottom line on Rayleigh-Taylor effects in ICF implosions. Recall that the target is 2.2 mm in inner diameter (about 1/8 of an inch).  The fuel layer is 70 microns thick (slightly thinner than writing paper).

The Shell | Fuel irregularity limit applies to the ID of the tiny plastic sphere and is about 0.1% of the shell thickness.  The Fuel | Cavity limits represent about 0.1% of the fuel layer, too.

These requirements  cannot  be met.

What NIF Reports Mean

The NIF data are a single step at the start of the unfolding of ICF capabilities, not the final results.  We will see much better ones before they have finished.  … but let’s evaluate the current status.

Seven requirements for fusion success

Target Criteria

    1. Fuel: Cryogenic solid layer, vacuum cavity – Best bang-time convergence (highest density, reduced Atwood number – next point)
    2. Shell: Lowest element mass, Z – Minimize the Atwood number and minimize Rayleigh-Taylor instability of a high mass pusher driving a much lighter fuel layer.
    3. Shell: Symmetric as possible – Avoid instabilities that shred the implosion, mix fuel and shell material. Use perfect spheres (conic eccentricity= 0), perfect surface finish (RMS = 0).

Drive Beam Criteria

    4. Wavelength – make as short as feasible. Minimize fuel preheat (which blocks compression), avoid excessive loss of beam power to the ablating plasma.
    5. Pulse Shape Control – control shocks on ablating shell, initial preheat of fuel, then gentle ramp-up of the drive with a hard push at stagnation.  Follow adiabat (min entropy trajectory) through the implosion phase space.
    6. Symmetry: Spatial – Avoid hot spots that drive instabilities. Every beam with uniform intensity, all beams with same intensity.
    7. Symmetry: Temporal – Avoid time dependent hot spots or regions that implode faster or slower than average.

Items 1-3, NIF does these well, as best as can be accomplished.
Item 4, put a check next to this, too, because of the indirect-drive x-ray strategy.  The conversion from IR to X-ray is lossy,  but at least it is in place at high power.  But a UV laser like a KrF system would be much more efficient.
Item 5, a work in progress, not done yet – the laser appears to have all the control capabilities desired. But as we discuss,  optimal  strategy is still under development.
Item 6 is not under control, as evidenced by the poor shape of the collapsed core. This will never be perfect because each of the 192 beams has spatial hot spots, though uncontrolled, unsmooth beams were expected to be smoothed by the hohlraum. The inner layer of solid DT fuel will self-smooth by what its discoverers called Beta Heating.
Item 7 was never meant to be controlled (IMHO). The 192 separate beamlines turn on not quite together (guess: within slightly less than 1 ns. )  I suspect beam space and time inequalities were meant to be smoothed by the x-ray conversion.

Non-uniformity may be why they use a helium prefill of the hohlraum.  Edwards made a comment to this effect in his report (last post).  However, there may be jetting from the gold, and this would mess up the expected smoothing.  If true, it would add to my arguments against excessive trust in simulations, even when done by the LLNL theory powerhouse.

Beam irregularities and RT have messed up many a target shot.


Fig 4   Square pulse, KMS 1985-86 data

Our simple estimates of the effects of RT make the IFE (fusion energy generation) task appear impossible.  And it is, if one were to use only the square laser pulses as modeled.

Square:  beam OFF at start, beam ON at full power during shot,  beam OFF at end.  [Fig 4, reference.]

The killer assumption is that the stupendous implosive acceleration is fully on at the start and constant during the inward drive.  Acceleration drives RT; without it, there is no RT growth.

Use Fermi-degenerate adiabats  The idea – dating back to the early days of the 1970s – was to push with a force that kept the electron distribution in its quantum lowest states, the “Fermi degenerate condition.”  This is equivalent to saying that we should not push so fast that we increase the entropy of the fuel core; we should move the system along a thermodynamic adiabat.

Fancy words for “start with a low push (acceleration) and gradually increase the pressure, by increasing intensity, until the end.”  I am not aware of any actual Fermi-degenerate implosions having been done, but this is the prescription for low-RT shots.


Fig 5  t-squared pulse, KMS 1985-86 data

Fig 5 shows 30-year-old data, proof that ramped pulses are nothing new.  These are data from the Chroma laser at KMS Fusion.  The shots were 1/10 the length of NIF shots, and the targets were about 1/10 the diameter, too.  Chroma was significantly smaller than NIF/10, however.

Ignition and Burn-wave   At full run-in, the core will be at its most compacted form, will be a plasma, and will be far too cold for fusion.  Now smack it with the highest-power shock to initiate fusion at the center.  The products (helium nuclei, “alphas”) heat the plasma just outside the core; that layer heats the next fuel out from the center, and so forth.  The ignited fusion burn wave produces the energy.

Good news:  NIF has demonstrated the self-heating that means an ignition and burn wave is possible!  Edwards’ Dec 2013 report (see post thread, part 3) indicates that they are working very hard at reduced-RT shots.

I am not at all certain I understand the actual shot profile.  Edwards’ report to the NIF Management Advisory Council (NMAC) shows Fig 6 as the successful “High Foot” beam profile (pg 17 of the report).  “NIC” is the “Low-Foot” profile.  The intent seems to be a shock hitting the capsule and causing an inward drift, with the laser beam held back until the final full-power drive shock at the end.


Fig 7    Park, Hurricane, et al PRL Feb 2014


Fig 6  Edwards High Foot profile Dec 2013


Fig 7 is from the data paper by Park, Hurricane, et al. and shows a slightly different “High Foot” success profile.  Similar, but there is significant drive power in the plateau between the initial shock and the final, full-power drive shock.  These data are in the form of a rough ramp, similar to Fig 5, except for …

    • the initial preheat shock that starts an inward drift.
    • the intermediate power plateau, which does not go away, but continues to gently (?) push the fuel layer inwards.

Fig 8  High Foot with time squared (dotted)

Fig 8 is the High-Foot part of Fig 7, redrawn to display the shape of the supplied laser power.

The actual success pulses roughly follow the dotted T² curve drawn over the data.

First comes the initial preheat shock, then a flat power plateau that starts above the quadratic curve, then falls below it.

What would the results have been if they had followed the T² profile more closely?


The NIF researchers emphasize that they worked hard to reduce the universal RT effects, and the results show that they are on the right path.

    • First they shortened the shot duration – is it short enough yet?
    • Then they ramped the intensity  so that the RT did not rise to destructive levels.
    • Next, they plan to try a larger hohlraum to see if they can remove most drive anisotropies.

The NIF workers have done well.   I wish them success and hope/expect to see really good results before the end of the year.


Charles J. Armentrout, Ann Arbor
2014 Jun 11
Listed under    Technology    …    Technology > ICF/IFE
Have a comment? Click on the title of this post, go to bottom, let us know.
Related posts: Click the INDEX button under the Banner picture

Posted in Technology | Leave a comment

NIF shifts toward success – 3

NIF released positive data.  Is some unexpected effect is blocking full success?

The staff of the National Ignition Facility (NIF) in Livermore, California released papers on very positive new results in their work toward inertial confinement fusion (ICF).  NIF is on a success path, and they need to continue doing what they just did, but more so.  This will actualize the potential for 5 MJ of energy output per target shot, as required for true success.

This is our 3rd post in this thread.  NIF results discussed in Parts 1 and 2:

    1. Gain: Target implosions produced about as much energy as was applied.
    2. Enhanced heating: The fusion reaction is starting to cause more fusion. First step towards an ignited plasma core.
    3. Not in-control:  The stagnation core at the end of the implosion is an inefficient donut with hard peanuts on one side, should be a plum with a hard/hot central core

We continue our discussion by looking at what might have been helped by their new operating changes. This is a “popular” discussion, not a truly technical one, but we must get through some of the technical points if we want to understand.

Click any image to expand to full size.

 Overview:  Beams on Target

Table A outlines specifications for a NIF laser, hohlraum and target.


Fig 1   Target shell cutaway. Compressed core is shown at the center (red)


Tbl A  Specifications for Target, hohlraum, Beams

Fig 1 is an image of the target, less than 1/8″ in diameter.

    • The ablating shell (called the pusher) has a complex construction of 5 plastic layers to provide density gradation between laser and the fuel layer.
    • The inner layer (gray) is the deuterium/tritium fuel mix, frozen onto the inner shell in a layer about 3 thousandths of an inch thick.
    • The compressed core is shown in the center, drawn in as the red circle.
    • The central void is mostly vacuum, except for the thin DT gas that has evaporated from the inner surface.

The target must hold together and collapse smoothly during the extreme environment of the implosion.   What researchers must do is smoothly guide the target plasma (something like an arc welding “flame,” but thousands of times hotter) into a smooth core.


Tbl B   Extreme target environments

Table B  lists several of the severe conditions they must deal with.

    • To reach the bang-point while the laser is on, the target must be pushed smoothly, with an acceleration a trillion times stronger than Earth’s gravitational field.
    • It will be in the plasma state but must be moving at 10 million cm/sec, 5 to 10 times faster than most plasma acceleration schemes in current use.
    • The shell blows off, but it must be evenly removed over the surface, not like a spacecraft’s re-entry.  The shell should be mostly used up by Bang-Time, but the ablation front must not burn through to the fuel layer.

We will look at NIF using a very simple model, the kind called “back of the envelope.”
About modeling with calculations:  models extrapolate what would happen if everything works as expected, but doing new things is hard because unexpected events happen.  So my guiding idea is to check feasibility with simple estimates first; if the simple estimates are good, then do the detailed computer modeling.

Our point: if simple estimates say …

    • it will not work: it probably won’t.  But cross-check your findings!
    • it is “iffy”: maybe it is possible.  Try it, but don’t start by pushing the boundaries.
    • it definitely will work: go for it!  Expect tension and high stress to make it go.

Computation is worthwhile, but over-the-top detailed models do not reveal the actual truth. Consider this slight misquote from General Semantics:  the map is not the territory; the calculation is not the thing.

The data and the simple criteria


Fig 2:  New HF and old LF Shot sequences

Shorter shot time   What the NIF managers did is shown in Fig 2.  The shot length was reduced from 25 to 15 ns, and the power was raised in the trough leading to the ramp-up of laser energy.

This new method is called the “high foot” (HF) mode of operation. (A nanosecond is 1 billionth (US) of a second.  Light moves about a foot ··· 30cm ··· in 1 ns.)

Their change pushes the same energy as before into the target, but faster: 870 GW instead of the previous 520 GW.
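A quick cross-check on those figures, using energy = power × pulse length:

```python
# Cross-check on the quoted figures: energy = power x pulse length.
# The shorter high-foot pulse at higher power delivers essentially the
# same energy as the old low-foot pulse.
E_low_foot  = 520e9 * 25e-9   # 520 GW for 25 ns, in joules
E_high_foot = 870e9 * 15e-9   # 870 GW for 15 ns, in joules
print(E_low_foot, E_high_foot)  # ~1.30e4 J vs ~1.31e4 J
```

The two numbers agree to well under 1%, so “same energy, delivered faster” is an accurate reading of the change.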

Are the laser beams misaligned – or blocked?


Fig 3  1995 Nova data. Alleged RT spikes visible … or are they beam-target instabilities?

Fig 3 is an x-ray image of a target implosion from the large Nova laser, predecessor to NIF.  The image exposure time is about 1/3 nanosecond.  We see strong speckle because not many photons are available in so short a time.  Wikipedia calls these images of the RT instability inside a target.  (We discuss RT in part 4 of this blog thread.)


Fig 4   Nova hohlraum with the 12 laser beam hit points

These 12 spikes in the images do not look like RT inside a target.  Fig 4 shows the Nova hohlraum with its 12 hit points.  It looks to us as though the spikes in Fig 3 are plasma jets from the hit points.


Fig 5 Cold jet from beam on Au

Does beam-on-metal data help make things clear?  One of our last studies at KMS Fusion focused the intense laser beams on flat gold plates. Fig 5 was taken by our imaging holographic interferometer; 1 of 4 images, 1/3 ns exposure.

Result – we observed cold, dense plasma jets from the strike spot, originating between the hot spots in the laser beam: high-density (1020 g/mL) objects moving from the target surface at plasma expansion speeds (> 10⁶ cm/s).  Our working model: higher-intensity beam regions pushed the plasma jets up and out of the colder, lower-intensity nearby regions, similar to squeezing a tube of toothpaste.


Tbl C Jet size, assuming sonic velocity


Fig 6 NIF hohlraum with beam hit points

Table C shows estimates of the size of a beam-plasma jet at the end of a NIF pulse.  Our model is a jet expanding at about the plasma speed of sound, as in the KMS data.  Fig 6 shows the NIF situation: many beam hit points on the walls.  Jets could occur between beam strike points (if close enough) or within any one of the hit areas, between the hot spots.

If jets are a real phenomenon, they are most likely to form in the equatorial band where beams from the two ends overlap (see Part 2, Fig 2).

The target is separated from the hohlraum wall by 1.75 mm (Table A).  The coldest estimate for the plasma indicates that a jet will have crossed about 1/3 of that distance.  Such jets would change laser beam and x-ray propagation in the chamber and the distribution of intensities; they could affect the heating of the target.
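The transit estimate is easy to reproduce; the speeds below are assumptions spanning the “>10⁶ cm/s” range quoted earlier.

```python
# Rough jet-transit check: distance = speed x time, compared with the
# 1.75 mm wall-to-target gap over the 15 ns high-foot pulse. The jet
# speeds are assumptions spanning the ">1e6 cm/s" range in the text.
gap_cm = 0.175    # hohlraum wall to target surface, 1.75 mm
shot_s = 15e-9    # 15 ns pulse

for v_cm_s in (1e6, 2e6, 4e6):
    frac = v_cm_s * shot_s / gap_cm
    print(f"v = {v_cm_s:.0e} cm/s -> jet crosses {frac:.0%} of the gap")
```

The fastest case crosses about a third of the gap over the shot, which is the scale of intrusion described above.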

Blockage by cold jets is a hypothetical process.  I have not read of anyone discussing such a possibility; no one (else) has indicted the hohlraum for misbehavior.  But the Nova data seem to show the event, and the NIF geometry invites it.

Is shell burn-through possible?


Fig 7  Imploding target with beam breakthrough. Blue is x-ray drive illumination.

What if uneven shell erosion is driven by hot spots in the driving illumination?  Is this a question too basic and simple to ask?  The ablation process ought to move through the solid shell at the speed of sound (in the shell material).  But what if shell erosion is boosted by interactions between the shell wall and the beam intensity?  Fig 7 shows the implosion issue.

The fuel perturbation was sketched as a bulge toward the center, but it might well be a depression in the ID.

NIF results look like square donuts, not the intended spherical billiard ball.  Meaning: yes, there are irregularities in the illumination.  Look again at Fig 1.  If the driving light is not symmetrical and uniform, the smooth sphere will immediately be cratered like the surface of the moon.  Once formed, these craters do not vanish but grow proportionately deeper.


Tbl D Time for drive beam to punch through the target shell, into the fuel

Bright spots imprint dents into the pusher shell that do not disappear during collapse – one of the messages from a presentation by a member of Stephen Bodner’s NRL team (Nov 1988).

Table D shows estimates of the time needed for a deeper-than-normal crater to reach the inner fuel layer, assuming the pit propagates inwards at sound speed.

The estimates are for sound-speed expansion through a plasma, and they show no joy.  Hotter than 20 eV might be a bit too hot; 2 eV is the temperature of an arc welder and is a bit too cold.

Pusher-shell burn-through is a regime where the estimates are ambiguous.  A cold pusher shell (the 2 eV case) indicates that the original 25 ns shots were probably too long, and the current 15 ns pulse lengths possibly so.  If the plasma is hotter than shown in the table, the ambiguity goes away; the pusher shell would be burned away in even the best scenario, and the last nanoseconds would be spent ablating fuel.  Bad idea.
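Here is a sketch of that estimate, using the standard plasma ion-sound-speed expression; the shell thickness and the CH charge-state/mass numbers are assumed values, not taken from the specification table.

```python
import math

# Sketch of a Table D-style burn-through estimate: a hot-spot crater
# eats inward at roughly the plasma ion sound speed,
#   c_s ~ 9.79e5 * sqrt(Z * T_eV / mu)  cm/s  (standard formulary form).
# The ~170 micron shell thickness and the CH charge state / mass are
# assumptions for illustration, not values from the spec table.

def sound_speed_cm_s(T_eV, Z=3.5, mu=6.5):
    """Ion sound speed in cm/s; defaults approximate fully ionized CH."""
    return 9.79e5 * math.sqrt(Z * T_eV / mu)

shell_cm = 0.017  # assumed ablator thickness, ~170 microns

for T_eV in (2.0, 20.0):
    t_ns = shell_cm / sound_speed_cm_s(T_eV) * 1e9
    print(f"T = {T_eV:>4} eV -> burn-through in ~{t_ns:.0f} ns")
```

With these assumed numbers, the cold 2 eV case survives roughly 17 ns (longer than the 15 ns pulse but well short of the original 25 ns), while the 20 eV case burns through in a few nanoseconds: exactly the ambiguity described above.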


We have done “back of the envelope” estimates for 2 possible reasons that the NIF shots, good as they were, did not do better.

LLNL has some of the world leaders in detailed modeling; it is inconceivable that they would not have simulated every foreseeable issue in implosion physics.  They are masters at target design; just look at the actual complex target – it takes real skill to design and assemble such a multilayer shell.  Could they have missed any normal occurrence?  Not likely.

The performance limiter must be something arising from an unexpected direction.  These are just guesses; we lack sufficient access to data (or computation power) to put more depth into them.

Things have gone horribly wrong for NIF these past 5 years, but they are working their way out of the depressing pit and toward success.  The next post in this thread will discuss the important RT instability and give an overall summary of the results.


Charles J. Armentrout, Ann Arbor
2014 Jun 6         Update Jun 7 to correct several typographical errors
Listed under   Technology    …   Technology > ICF/IFE
Have a comment? Click on the title of this post, go to bottom, let us know.
Related posts: Click the INDEX button under the Banner picture

Posted in Technology | Leave a comment