
by Gary G. Ford
Let me describe another problem in computing.

    We assume that Numerical Methods are virtually complete and that all is well and ready in this area. It is not so. How can THAT be?!

Take a very simple problem, like solving an n-by-n matrix (linear system) Equation Ax = b, where A and b are known by some method. For instance, A and b may derive from an instantaneous linearization of physical and chemical equations written to try to faithfully represent a real world problem - which is not always so clear or complete in itself! (Know Newton's Method for finding x such that f(x) = 0? When x can be an n-vector X, and f(x) can be another n-vector F(X), there is STILL a Newton's Method for solving F(X) = 0 - trouble is, it often breaks down in real problems.)
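
To see where those intermediate linear systems come from, here is a minimal sketch of Newton's Method for a 2-vector F(X) = 0. The example system is my own invention, chosen only because its root is known; every iteration manufactures exactly the kind of Ax = b problem discussed next:

```python
def newton_2d(F, J, x0, y0, tol=1e-12, max_iter=50):
    """Newton's Method for F(X) = 0 with X a 2-vector.
    Each iteration solves the instantaneous linearization
    J(X) * dX = -F(X); here via Cramer's rule, since J is only 2x2."""
    x, y = x0, y0
    for _ in range(max_iter):
        f1, f2 = F(x, y)
        if abs(f1) + abs(f2) < tol:
            break
        (a, b), (c, d) = J(x, y)
        det = a * d - b * c      # if this nears zero, Newton breaks down
        dx = (-f1 * d + b * f2) / det
        dy = (-a * f2 + f1 * c) / det
        x, y = x + dx, y + dy
    return x, y

# Illustrative system: a circle of radius 2 meets the line x = y.
F = lambda x, y: (x * x + y * y - 4.0, x - y)
J = lambda x, y: ((2.0 * x, 2.0 * y), (1.0, -1.0))
root = newton_2d(F, J, 1.0, 2.0)   # converges to (sqrt(2), sqrt(2))
```

On a tame problem like this, Newton converges quadratically in a handful of steps. The "often breaks down in real problems" part arrives when the Jacobian J becomes nearly singular - which is precisely the ill-conditioning trouble described below.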

Solving for x when A and b are known - provided 'A' is NOT 'singular' (zero determinant) - is 'cut and dried', a real 'piece of cake', taught in a first Linear Algebra or Numerical Methods course - RIGHT?!

Well, a method - there are many - is, well, yes, taught: usually Gaussian 'Elimination and Back-substitution' - or its equivalent, LU (Lower-Upper) factorization of A, using parallel manipulations on 'b'. Simple?!
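
For the record, here is roughly what that taught method looks like - a bare-bones classroom sketch of Gaussian Elimination with partial pivoting and back-substitution, carrying 'b' along in parallel (illustrative code of my own, nothing more):

```python
def solve_gauss(A, b):
    """Gaussian Elimination with partial pivoting, then back-substitution.
    A is a list of n rows of n floats; b is a list of n floats.
    Equivalent to forming an LU factorization of A while carrying b along."""
    n = len(A)
    M = [row[:] for row in A]   # work on copies; the caller's data survives
    x = b[:]
    for k in range(n):
        # partial pivoting: swap in the largest remaining pivot
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        x[k], x[p] = x[p], x[k]
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]     # elimination multiplier (an entry of L)
            for j in range(k, n):
                M[i][j] -= m * M[k][j]
            x[i] -= m * x[k]
    # back-substitution on the now upper-triangular system
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (x[i] - s) / M[i][i]
    return x
```

Twenty-odd lines, and for a nonsingular, WELL-conditioned A it works beautifully. The rest of this piece is about why that is not the end of the story.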

NOT so Simple.

For one thing, in many realistic problems, the generated 'A' matrix may be 'practically' singular, or as better stated, 'ill-conditioned'. This is usually interpreted as the ratio of the magnitude of the Largest Eigenvalue

(remember EIGENVALUES?! These are numbers 'c' such that Av = cv for some nonzero 'Eigenvector' v. Eigenvalues can be REAL. Then 'A' doesn't change the direction of the eigenvector 'v' - it just stretches it, compresses it, or reverses its direction, or does a combo of these.

When an eigenvalue 'c' is 'imaginary' - in the complex numbers sense - then 'A' rotates the Eigenvector 'v' when multiplied on, as in Av - rotated around SOME axis in n-dimensional space.

Generally, Complex Eigenvalues 'c' indicate rotation along with stretching or compression.) to the magnitude of the smallest Eigenvalue. (Strictly, that eigenvalue ratio is the Condition Number only for symmetric matrices; in general the ratio of largest to smallest singular values is used - but the idea is the same.)
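
For a 2x2 matrix the eigenvalues come straight from the quadratic formula, which makes the stretch-versus-rotate distinction easy to see. A small sketch (`eig2` is my own throwaway helper, not a library routine):

```python
import cmath

def eig2(a, b, c, d):
    """Eigenvalues of the 2x2 matrix [[a, b], [c, d]] via its
    characteristic polynomial  c^2 - (a+d)*c + (a*d - b*c) = 0."""
    tr, det = a + d, a * d - b * c
    disc = cmath.sqrt(tr * tr - 4.0 * det)
    return (tr + disc) / 2.0, (tr - disc) / 2.0

# Pure stretch/compress: diag(3, 1/2) keeps both eigenvector directions.
stretch = eig2(3.0, 0.0, 0.0, 0.5)    # real eigenvalues: 3 and 0.5
# Pure rotation by 90 degrees: NO real direction is preserved.
rotate = eig2(0.0, -1.0, 1.0, 0.0)    # complex eigenvalues: +i and -i
```

Real eigenvalues mean stretching or compressing along a fixed direction; purely imaginary ones, as for the rotation matrix, mean no direction survives multiplication by A unchanged.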

It turns out that the Condition Number (let's call it CN - okay?!) tends to get larger and larger for related structured matrices A (say, those which come from finer and finer discretization - more and more elements or grid cells - of an 'attempting to be realistic' PDE - partial differential equation).

And in General, the Classical - and Modern too - Matrix Algebra (Abstract or Numerical) Linear System Solution Algorithms get worse and worse at producing an accurate determination of 'x' from Ax = b, with A and b in hand and 'known', as the CN grows larger!

Classical Gaussian Elimination or LU factorization easily starts to fail when 'too large' and 'too fine' a discretized system of a PDE is attempted.
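
You can watch this happen in a few lines. A sketch using the notoriously ill-conditioned Hilbert matrix (a standard textbook villain - my choice of demonstration, not drawn from any PDE): the SAME elimination code, fed EXACT data, is accurate at n = 4 and has lost nearly everything by n = 12.

```python
from fractions import Fraction

def hilbert(n):
    """The classic ill-conditioned Hilbert matrix, H[i][j] = 1/(i+j+1)."""
    return [[Fraction(1, i + j + 1) for j in range(n)] for i in range(n)]

def solve_float(A, b):
    """Plain Gaussian elimination in double precision (no pivoting
    needed here: Hilbert matrices are symmetric positive definite)."""
    n = len(A)
    M = [[float(v) for v in row] for row in A]
    x = [float(v) for v in b]
    for k in range(n):
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]
            for j in range(k, n):
                M[i][j] -= m * M[k][j]
            x[i] -= m * x[k]
    for i in range(n - 1, -1, -1):
        s = sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = (x[i] - s) / M[i][i]
    return x

def worst_error(n):
    """Solve H x = b where the exact answer is rigged to be x = (1,...,1)."""
    H = hilbert(n)
    b = [sum(row) for row in H]   # exact rational right-hand side
    x = solve_float(H, b)
    return max(abs(xi - 1.0) for xi in x)

# Same algorithm, same exact data - only the conditioning changes:
# worst_error(4) is down near roundoff; worst_error(12) is off by
# many orders of magnitude more. No bug anywhere. Just CN at work.
```

Nothing 'bombs'; the code runs to completion and hands back numbers. They are simply wrong, and the larger the CN, the more wrong they get.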

But NOT only accuracy fails; the method itself may start giving first bad, then just outrageously wrong answers. Or completely break down - often not 'bombing', but just making 'so much progress' and then 'No More' towards a good approximate solution of JUST THIS INTERMEDIATE Linear SYSTEM. Like gears breaking off teeth somewhere in the depths of a '24 Jewel' fine mechanical watch.

Unfortunately, Gaussian Elimination was generally abandoned long ago for the intermediate, iterative, linearization stages of Important Real WORLD Fluid Flow Problems - SUCH AS Advanced Aerodynamics, WEATHER and CLIMATE.

Why?! Because the work and storage required grow outrageously fast with increases in problem size (finer mesh, more discretization cells or elements).

What is really bad is that the Replacement Methods - Classical, Modern and Post-Modern - are often NOT any better behaved under the growth of Condition Number, but often are even MORE seriously affected. Thus, for example, Gauss-Seidel SOR, which was a standby for much of the mid to later 20th Century. All sorts of arcane mathematics was done - really wasted effort, wasted on a lame horse! - to develop 'good ways' of picking an instant relaxation factor 'Omega' to 'make GS-SOR Go!' The only place it needs to go, however, is AWAY - Far, Far Away!
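
For readers who never met it, here is a minimal sketch of the method in question - Gauss-Seidel with Successive Over-Relaxation, written by me for illustration only. Note that this version judges convergence on the RESIDUAL, the 'out of balance' in the equations, not merely on the update size:

```python
def sor_solve(A, b, omega=1.0, tol=1e-12, max_sweeps=100000):
    """Gauss-Seidel with Successive Over-Relaxation (SOR).
    omega = 1.0 is plain Gauss-Seidel; 1 < omega < 2 over-relaxes.
    Returns (x, sweeps_used)."""
    n = len(A)
    x = [0.0] * n
    for sweep in range(1, max_sweeps + 1):
        for i in range(n):
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x_gs = (b[i] - s) / A[i][i]      # the Gauss-Seidel value
            x[i] += omega * (x_gs - x[i])    # relaxed update
        # check the true residual: does A x actually balance b?
        residual = max(abs(b[i] - sum(A[i][j] * x[j] for j in range(n)))
                       for i in range(n))
        if residual < tol:
            return x, sweep
    return x, max_sweeps   # gave up: 'so much progress' and then No More
```

On a small, diagonally dominant system this converges in a handful of sweeps. On the ill-conditioned systems discussed above, the sweep count explodes - and no amount of cleverness in choosing Omega fixes the underlying weakness.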

At Texaco Canada in the mid 1980's, an Engineering Programmer with 8 years' experience and an M.Sc. in Chemical Engineering (lots and lots of hokey numerical calculations are routinely bitten off in Chemical Engineering) was under my supervision, writing a Black Oil Simulator. Of COURSE he insisted on GS-SOR, in which he was a 'past master'. So he made a preliminary simulator and I gave it a test.

I told him the answers looked wrong.

He insisted they were right.

Then I had him substitute the values he found for the physical variables back into the equations he THOUGHT he was solving. Well, they sure didn't balance out - there were huge errors. Many of the flows were only about 30% of the flows he found when he executed the algorithms I insisted upon - and I then checked to see that the New Methods balanced the equations with much, much smaller errors. His GS-SOR simply had Not Enough POWER to solve the ugly systems which naturally evolved from the quite ordinary 'Black Oil Equations' - a very simple type of Oil, Water and Gas simulation regimen.

Of course, we used relatively small test problems.

On 'Larger Problems' - that is, dividing the Reservoir into larger numbers of smaller grid cells -

  NB: Methods are always improving, but the Best tend
  NOT to over-decouple the contributions of
  neighboring regions, and they tend to be highly recursive.

  Unfortunately these are NOT the conditions most
  favorable for efficient uses of large multi-multi-CPU
  'Super Computing Machines'.

  However, ONCE such a machine is Installed, its users,
  and requisitioners, must Make it Look GOOD to justify
  their continuing good performance rating and jobs.

  It is a situation rife with opportunities and incentives
  for both conscious and true-believing unwitting, or even
  subconscious, cheating and other scientific corruption.

- the methods I suggested also deteriorated, but much more slowly (with increasing problem size). Some methods are less sensitive to CN increase, but ALL deteriorate.

I was simply MORE experienced in the particular problems, and am a fanatic for doubt and checking. That is all I really had over the poor, confident, self-superior, proud, Industrial Scientific/Engineering Programmer!

Ning, my temporary supervisee, simply had been used to 'getting away' with GS-SOR, which he had been taught in School, and which he had used - successfully, he claimed - in the pure water flow and low-concentration pollutant convection problems he had worked on in previous jobs in Hydrogeological Simulation.

Of course, because Ning had such Confidence in the value of GS-SOR, he deemed 'convergence' was attained when successive iterations started yielding minuscule adjustments of the proposed 'numerical solution'. And he never bothered to check how small the 'residual errors' - the "out of balance" in the governing equations - were.

This was the basis of a SUBCONSCIOUS cheat - failing to check, and merely presuming that a famed algorithm, once it stopped producing serious adjustments of the estimated solution, had simply "completed its job handily"!
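
The trap is easy to demonstrate. Here is a toy illustration of my own devising (a small 1-D Poisson system, NOT the Black Oil equations): Gauss-Seidel is stopped the moment its adjustments get 'minuscule' - Ning's stopping rule - and only afterwards do we check how far the 'converged' answer sits from the exact solution, which was rigged to be all ones.

```python
def gauss_seidel_poisson(n=50, tol=1e-5, max_sweeps=100000):
    """Gauss-Seidel on the 1-D Poisson system A x = b, where
    A = tridiag(-1, 2, -1) and b is built so the exact answer is
    x = (1,...,1). Stops when the UPDATE gets small, then reports
    (last_update, true_error)."""
    x = [0.0] * n

    def row_times(v, i):              # row i of A applied to vector v
        left = v[i - 1] if i > 0 else 0.0
        right = v[i + 1] if i < n - 1 else 0.0
        return -left + 2.0 * v[i] - right

    ones = [1.0] * n
    b = [row_times(ones, i) for i in range(n)]   # exact right-hand side
    biggest = 0.0
    for _ in range(max_sweeps):
        biggest = 0.0
        for i in range(n):
            left = x[i - 1] if i > 0 else 0.0
            right = x[i + 1] if i < n - 1 else 0.0
            new = (b[i] + left + right) / 2.0
            biggest = max(biggest, abs(new - x[i]))
            x[i] = new
        if biggest < tol:             # the 'minuscule adjustments' rule
            break
    true_error = max(abs(xi - 1.0) for xi in x)
    return biggest, true_error
```

On this run the adjustments drop below the tolerance while the answer is still wrong by hundreds of times that tolerance. Small adjustments certify only that the iteration is CRAWLING - not that it has arrived.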

Why waste Precious Computer Resources CHECKING on such matters, when YOU BELIEVE the Methods you are using are the Very Best?!

Well, the Stinking Oil Industry - it's kind of Sloppy - Right. But AERODYNAMICS!!!! ...

Ok, Ok, Great Aerodynamic Design is done these days - Right?!

Yes, indeed, but the problem is masked by all the many Wind Tunnel Measurements (I have made wind tunnel measurements, by the way, when I was a graduate student in Mechanical/Wind Engineering; and for a couple of decades in the Canadian Oil Industry, I did and advised on Numerical Flow Simulation of Underground and Well Flow for Oil Field Production and Development. Before I studied Mechanical Engineering, I took an M.Sc. in Mathematical Hydrodynamics from the Math Department at the University of BC. Earlier, when I was in Alaska, I had an Introduction to Physical Oceanography. I know quite a bit about a variety of fluid and gas flow, and as well, many aspects of Numerical Simulation of the same.), which allow all sorts of 'fixes' and 'tweaks' to be worked out for the methods - not only on a particular design project, but often for a whole region or type of flow regime application in Aerodynamic Simulation Software Systems.

Yes, even a Black Box with lots of screws for adjustments can be 'tuned in' with enough trials and error-correcting adjustments, when repeatedly compared against physical 'reality'!

Unfortunately, we DON'T have an accurate 'Wind Tunnel' equivalent for the Earth ... for Weather ... and for Climate.

Yet Multi-Multi-Multi-CPU Machines are requisitioned and placed in action by NASA and NOAA for exactly things like Climate, Ocean Currents, and Weather.

Although there are wonderful discoveries in abstract 'Divide and Conquer' Algorithms for solving large linear systems, the logistics and data & computation 'flow' problems (i.e., BUSes & CPUs Waiting for data which Other CPUs are generating) involved in using STRONG, usually highly Recursive, Methods to attack instant linearizations of Non-repeatable Real World Problems - or the Immense Data and Computational Resources required to, say, adequately represent the Earth's Fluid spheres, even in abject oversimplification - promote the Feckless IDEA that FASTER is Better.

Well, the 'Faster' which 512-CPU or 1024-CPU machines elicit is often 'tested' on Small Discretizations of known or sort-of-known Problems, and inspires 'decoupling' of related variables and cell groupings, so as to distribute 'the workload' over the Big Machine and take advantage of its promise in parallel Computing ...

A BIG Obstacle: Known-Solution problems are quite Often Linear, Nearly Linear, or Drastically OverSimplified - or, when recorded by some measuring instruments, of less-than-full-value short time duration.

Yet the question we want answered with Climate, for instance, is what the average weather will be like in 20 or 50 years, so that we can 'plan for it'!

Simulations which 'Look Good in the Shower' over a 5-year or less period of retro-historical 'matching' may be impotent and invalid over fifty years. And as dominating Weather cycles and fluctuations can be 'debalanced' by a spate of 'El Niños/La Niñas' or Volcanic Eruptions, or disturbances of the Jet Stream by Pulses of Auroral Energy falling in the Arctic Atmosphere, what matches for the last 5 or 10 years may become less relevant even for the next 5 or 10!

All we need is for the Sun to enter some somewhat infrequent regime of gross energy output or distribution of its energy flows between Light and Particles, and the whole applecart may be upset!

I have seen many cases in Petroleum Reservoir Engineering, where a Simulation Set Up is well fitted and matched to the processes in the early or mid phase of the depletion of a Field, but somewhat irrelevant for the New effects which occur when water or gas production suddenly starts to bypass Oil flow and changes the distribution of produced product in later phases of a field's production.

Unfortunately, the life of a Weather Cycle can be as long as or longer than the life of an aggressively produced Oil Field. And when Solar, Climate or Mini-Ice Age Cycles (like the one from 1250-1850, which froze out the Vikings of Greenland and impaired food production in much of Northern Europe, stopping - for whatever reason - Long Before the Great American Forest and Prairie Clearings for Agriculture or the massive growth of Coal-Consuming Industry. And Why did it START, even?!) are involved? ... Well, those Simulations can be just stuffed in the 'Circular File'!

Remember the Bogus Club of ROME Simulations of the Late 1960's and Early 1970's?! Well, Today's Club of ROME may be based NOT on a mistake in programming, as the CoR Simulations were, but rather upon undue faith in the Power of More CPUs and FASTER and FASTER 'throughput'.

By the way, what is the Point of Fast Climate Simulations?!

DOES IT MATTER if they take 5 hours or 5 weeks?!

Well, it DOES - if 'Errors and Bugs' are being repeatedly found, and WORSE YET, when Algorithmic Deficiencies NOT Found in the Small, or Too-Linear, or OverSimple TESTS are discovered and have to be corrected, "BEFORE SOMEONE FINDS OUT!" Quick, Quick! THEY Must have SOMETHING to Prove! ...

Ultimately, as I have described, even large LINEAR systems have problems, as the 'Resolution' gets finer and finer.

Another difficulty in Time Dependent PDEs: When spatial resolution gets finer and finer, timesteps often have to be taken finer and finer, also. THAT drastically increases the work which must be done to complete a simulation.
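
The timestep squeeze can be made concrete. A sketch for the simplest case I know of - explicit time stepping of 1-D diffusion; the function and numbers are illustrative, not from any production code. Stability forces dt to shrink like the SQUARE of dx, so doubling spatial resolution multiplies total work by roughly eight in 1-D, and far worse in 3-D where each step also carries eight times the cells:

```python
import math

def explicit_diffusion_cost(n_cells, t_end=1.0, alpha=1.0):
    """Work estimate for EXPLICIT time stepping of the 1-D diffusion
    equation u_t = alpha * u_xx on a unit interval. Stability demands
    dt <= dx^2 / (2*alpha), so halving dx quadruples the timestep
    count - and, with twice the cells per step, total work grows
    roughly 8-fold."""
    dx = 1.0 / n_cells
    dt_max = dx * dx / (2.0 * alpha)     # the stability limit on dt
    n_steps = math.ceil(t_end / dt_max)
    return n_steps * n_cells             # total cell-updates

# Refining the mesh 2x:
cost_coarse = explicit_diffusion_cost(100)   # roughly 2 million updates
cost_fine = explicit_diffusion_cost(200)     # roughly 8x that
```

Implicit schemes relax the stability limit, but they pay for it by requiring - you guessed it - a large, increasingly ill-conditioned linear solve at every timestep.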

And when 'Trouble Spots' get really finely 'gridded', the same applies. Problem Nonlinearity and Difficulty - ultimately the Condition Number of the Linearized Iterates - Climbs like a Rocket bent for the Nearest Star!

Yeh! Bigger, Faster, Distributed, Expensive MultiCPU Monster Computers for Bigger, FASTER, Erroneous Throughput! For SOME Problems, with Zillions of independent cases, it's The Cat's Meow - but what of intrinsically integral, entire-globe problems? ... Where'd that Almanac Go?!

  But you know that one of the BIG PROBLEMS
  of Computers is DATA Buses - with Traffic Control
  on Data Buses, in particular. How are these many
  microscopic units to be interfaced together?!
  This is still a problem for Today's Optically Imaged
  Lithographic, essentially 2-D machines. It will
  be even more trouble with the extra degrees of
  freedom (don't forget pivoting, rotation, and flexure!)
  on jumping to Actual, Full, Bottom-Up 3-D!

Connections Upon Connections!

"The Wring Experts! - WE WIRE YOU GOOD!"

    "The Bionic Connection
    Are we already a lot closer to a mind-machine
    interface than we ever guessed?
    By Jocelyn Selim

    "Kevin Warwick is about to become telepathic,
    luck and technology permitting. His lips are
    parted expectantly as he sits blindfolded and
    perched on a lab stool at Reading University
    in England. One inch below his left wrist,
    a pincushion array of 100 silicon electrodes,
    all of them together about one-sixteenth the
    size of a dime, has been surgically inserted
    into his median nerve. From the electrodes,
    22 wires run eight inches under the skin and
    exit one inch below his elbow. There they are
    soldered to a connector board where a
    2-inch-by-2-inch maze of circuitry amplifies,
    filters, and converts the electrochemical
    impulses coming down his median nerve into
    digital signals."


I will look.

Some Things we may just Never Get Right, so don't be a Sop for all the Overwhelming Optimisms of the Self-Promoting Worldlies - Y'All Hear, Now?!

GGF - The Iowan Idiot
PS: "Legions and Legions of Nanobots,
All Lined in Fine Array, Ready to HELP
The Denti$t, Find all the Tooth Decay!

© 2002 Gary G. Ford

MT © 1999-2002-2013... All Rights Reserved.