James C. McWilliams
13 January 1998
1. Introduction
This is a brief sketch of trends I see in the purposes and practices of numerical modeling for physical oceanography. Unlike some other papers in this series, it is not a scholarly survey of progress in this subject, since there have been several reviews published recently.
The practice of numerical modeling is steadily and rapidly increasing. Its relative importance in physical oceanography is much greater than it was a decade ago, whether measured by scientists' interests, the quality of solutions, scientific publications, or resources expended.
Why is this so?
One obvious part of the answer is steady growth in computer power. To experience a doubling of technical capabilities every few years, with no end in sight, is very exciting to a scientist, and there is no comparable growth in the more traditional approaches of theory and measurement.
Another part is the experience that nearly any problem in physical oceanography that can be cleanly posed mathematically can be solved numerically, subject only to the constraint that the calculation not be too large (often only a rather weak limitation). Fluid dynamics is computable, even in fully turbulent regimes, and we have access to equilibrium solutions for nearly any dynamical sub-system of the ocean, from the micro-scale to the planetary scale. (Breaking surface waves are, so far, an exception.) As a result the boundaries of doable theory---i.e., solvable mathematical problems---are greatly expanded by numerical modeling.
Of course, we can never expect to compute all physical behaviors in a single calculation---a Grand Unified Model---but we can reasonably hope to be comprehensive about including all the important influences on any single behavior. This indicates we should follow a bootstrapping strategy: compute in many regimes, using many model formulations, and the sum of all these computations may be comparable with the ocean as it is observed.
Numerical modeling is a tool for constructing cause-and-effect relations between how a problem is formulated---what is assumed to be relevant---and which phenomena arise in its solutions. It is the construction of hypothetical realities.
Are they truly real? The traditional answer of science is to let experiments test their reality. However, as D. Rudnick remarked in his presentation, we observe the ocean as it is and do not do controlled experiments. Furthermore, oceanic observations usually show complex behavior. So, it can be quite subtle to judge the reality of a numerical model. One part of this comparison is to make it with whole classes of solutions (varying parameters, initial conditions, etc.), not just single solutions. Another is to distinguish between idealized formulations and full simulations:
As far as I am aware, there are no hard failures of numerical models to match known qualitative behavior, apart from breaking surface waves. No dark matter seems to be needed, and maybe there is not even any significant missing diapycnal mixing, although there undoubtedly are more forms of mixing in the ocean than are contained in any model to date. However, neither are there any hard confirmations of complex numerical simulations.
2. Phenomena and Ideas
In place of a review of numerical modeling, I list here what I believe are some of the most significant phenomena and ideas in the recent history of physical oceanography. The important contributions of numerical modeling should be seen as implicit here.
3. Scientific Themes and Model Types
Numerical modeling is relevant to all fluid dynamical phenomena in the ocean, albeit on a piecemeal basis rather than comprehensively. If we grossly categorize the scientific themes of physical oceanography and their associated phenomena by their space- and time-scale content, then we can associate different types of models with each category:
4. The Practice of Numerical Modeling
Numerical modeling requires a large infrastructure to be successful:
Global and coastal-zone observing systems---sustainable and fairly comprehensive---are required to tie models to reality for large-scale phenomena. Most ocean observations, particularly those sponsored by NSF, are made primarily for local purposes and thus do not fall in this category. So there is understandably a wide gap between the observing and modeling/theorizing communities, which goes well beyond their relative unfamiliarity with, and lack of interest in, each other's techniques. Therefore, cultural evolution is needed before we will be worthy of the observing systems our scientific goals require us to have.
Oceanic numerical modeling has, on the whole, not been done with smart algorithms developed in other fields. There is considerable scope here for improving model solutions per computing work unit, and there are signs of growing interest within our community. Among the key algorithmic issues are the following:
The principal intellectual limits on the quality of model solutions are ignorance about the processes not resolved in the calculations, both external (or larger scale) conditions and smaller scale fluid dynamics (parameterization):
Oceanographers need to figure out how to do better at combined observational and numerical process studies, bringing them to the stage of yielding parameterization rules. Data assimilation in an LES?
Discovering what the ocean model must do to achieve stable, credible Climate-System or Earth-System Model solutions is a central scientific challenge for oceanography.
COMs (Coastal Ocean Models) are a new frontier for modeling whole-system behavior, analogous to GCMs but with many new and different phenomena and processes to consider. This also is an excellent context for addressing biogeochemical coupling issues, since the latter have such strong signals in the coastal zone.
There is much confusion among oceanographers about the significance of model resolution. The evidence in hand suggests that it is relatively unimportant for the solution properties (i.e., tracer and material fluxes; water mass properties) that are simulated reasonably well in GCMs that do not resolve the eddies (i.e., with horizontal grid spacings dx > 50 km, say). On the other hand, resolution seems to be extremely important in ERMs, down to a threshold---as yet poorly determined, but at least as small as 10 km---needed to calculate mesoscale eddies and intense, narrow currents credibly (i.e., with qualitative similarity to observations in eddy energy level and current location). For ERMs resolution convergence has not yet been demonstrated. The computational cost scales with horizontal resolution roughly as 1/dx^3, assuming that the vertical resolution, duration of integration, and domain size are not varied; this implies roughly a thousand-fold disparity in computational cost for any given problem configuration.
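The 1/dx^3 scaling is easy to make concrete: halving dx quadruples the number of horizontal grid points and also roughly halves the stable time step. A few lines of Python illustrate the cost disparity; the particular spacings compared here (100 km versus 10 km) are illustrative assumptions, not values asserted above.

```python
# Illustrative cost scaling for horizontal resolution: cost ~ 1/dx^3,
# holding vertical resolution, integration length, and domain size fixed.

def relative_cost(dx_km, dx_ref_km=100.0):
    """Computational cost relative to a reference grid spacing dx_ref_km."""
    return (dx_ref_km / dx_km) ** 3

# A coarse GCM at ~100 km versus an eddy-resolving model at ~10 km:
print(relative_cost(10.0))   # 1000.0 -> the "thousand-fold disparity"
print(relative_cost(50.0))   # 8.0
```

The same function shows why modest refinements are affordable but eddy resolution is not: going from 100 km to 50 km costs a factor of 8, while reaching 10 km costs a factor of 1000.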
Thus, there is currently a clear division in usage, forced by limitations in computing power, between GCMs that are also eddy resolving and those that are not. Although growing computer power will narrow this division, it will be at least a decade, and perhaps much longer, before it disappears and everyone will prefer an eddy-resolving GCM. Eddy-resolving GCMs can currently be used well only for intervals no longer than decades and domains no larger than basins. The fact that GCMs without eddies and with sensible parameterizations can do reasonably well in calculating the large-scale thermohaline circulations, heat and water fluxes, and water-mass distributions remains somewhat mysterious and thus must be accepted only provisionally; however, it suggests that there is something of a dynamical decoupling between these phenomena and the mesoscale eddies, strong currents, and other small-scale phenomena.
Ocean GCMs are necessary components of Climate-System Models and Earth-System Models, and because the latter have global domains and lengthy integration times, only GCMs without eddies are currently affordable for this. In contrast, it's hard to imagine that Coastal Ocean Models (COMs) won't usually be eddy resolving.
The cost-benefit trade-offs in resolution are not as well established for COMs and LESs as they are in basin- and global-scale models. However, because the uses of COMs and LESs are more varied, there may not be any useful general statements about their trade-offs.
Comprehensive models, especially of the GCM or COM type, are sufficiently complex in their formulation and solution behavior and labor intensive in their usage and interpretation that scientific teams are required to bring them to fruition. The time scale for improvement of models is usually quite long (many years at least, and often decades), although I believe the recent rate of progress and foreseeable prospects are quite good.
The public discourse about the value and practice of numerical modeling in our community is often remarkably unsophisticated. Silly statements are too often made without challenge. Time will no doubt ameliorate this, but it certainly slows the acquisition of a more sophisticated understanding by those who primarily are not modelers.
5. Physical Processes
Among the processes that are important to parameterize well are the following:
One way to pose these parameterization problems is relative to the default option: where do the small-scale transports differ from what can usefully be represented as an eddy diffusion process? A partial answer is where the transport behavior is non-local (as with convective boundary layers or gravity waves propagating towards a distant breaking site) or is due to a mean Lagrangian motion (as with surface gravity waves and mesoscale eddies).
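The default option itself is only a few lines of code: a down-gradient flux F = -K dC/dz, whose divergence spreads a tracer profile. The grid, diffusivity, and initial anomaly below are arbitrary illustrative choices, not oceanic values; the point is that anything this simple cannot capture non-local or Lagrangian-mean transport.

```python
import numpy as np

# Minimal sketch of the "default option": unresolved small-scale transport
# represented as down-gradient eddy diffusion of a tracer profile c(z).

def eddy_diffusion_step(c, dz, K, dt):
    """Advance a tracer profile c one explicit time step under eddy diffusion."""
    flux = -K * np.diff(c) / dz          # flux F = -K dC/dz at interior interfaces
    dcdt = np.zeros_like(c)
    dcdt[1:-1] = -np.diff(flux) / dz     # flux divergence; boundary cells held fixed
    return c + dt * dcdt

z = np.linspace(0.0, 100.0, 51)          # depth coordinate (m)
c = np.exp(-((z - 50.0) / 10.0) ** 2)    # an initial tracer anomaly
dz, K = z[1] - z[0], 1e-2                # spacing (m), eddy diffusivity (m^2/s)
dt = 0.4 * dz**2 / K                     # stable explicit time step
c1 = eddy_diffusion_step(c, dz, K, dt)   # the anomaly spreads; its peak decays
```

Any transport that instead moves tracer against its gradient, or deposits it far from its source, falls outside what this representation can express, which is the sense of the question posed above.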
Process experiments that intimately relate observations and models have been insufficiently exploited in oceanography, and this is particularly true for microscale processes and their appropriate LES models.
6. Data Synthesis, Interpretation, and Assimilation
Because ocean currents are often complex, broad-band, and chaotic, many observational data sets are not easily interpretable by themselves or even with the use of simple theoretical constraints (e.g., geostrophy and tracer conservation).
This has led to widespread rhetorical enthusiasm for data assimilation (DA) in GCMs, and possibly other types of models, although as yet there are not many substantial examples to support the rhetoric (largely because DA is computationally so demanding).
DA, by its definition, can provide the best available analysis of the state of the ocean, which is the same thing as an initial condition for a numerical integration. Four-dimensional DA is also possible, although it is still more computationally demanding. Therefore, DA is an important methodology for the future of oceanography, and I can imagine it being used for data syntheses in many contexts.
There are, however, many potential perils in the mis-use of DA. Because of the complexity of its algorithms---as extensive meteorological experience has shown---code errors have plenty of room to hide for quite a long time, even in operational implementations. DA also requires what are often essentially subjective judgements about important quantities in its specification (e.g., weights given to different sources of information in the cost function and error covariances for both the data and model). Thus, there is likely to be a very long learning curve for oceanographic DA, comparable probably to that for GCMs and COMs (which are relatively much more mature).
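The role of those subjective specifications can be made concrete with a minimal variational-style analysis step, which blends a model background with observations according to assumed error covariances. Everything below---the state, the observation operator, and the covariances B and R---is invented for illustration, and B and R are precisely the quantities that must be chosen by judgement.

```python
import numpy as np

# Minimal sketch of a variational (3D-Var-style) analysis: minimize
#   J(x) = (x - x_b)^T B^{-1} (x - x_b) + (H x - y)^T R^{-1} (H x - y),
# whose closed-form minimizer is x_a = x_b + K (y - H x_b), with gain
#   K = B H^T (H B H^T + R)^{-1}.

def analysis(x_b, y, H, B, R):
    """Blend background x_b and observations y, weighted by covariances B and R."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)  # gain: relative weight of obs
    return x_b + K @ (y - H @ x_b)

x_b = np.array([10.0, 12.0])     # background (model) state
H = np.array([[1.0, 0.0]])       # observe only the first state component
y = np.array([11.0])             # the observation
B = np.diag([1.0, 1.0])          # assumed background error covariance
R = np.array([[1.0]])            # assumed observation error covariance
x_a = analysis(x_b, y, H, B, R)  # equal error weights -> analysis splits the difference
```

With B and R equal, the analyzed first component lands halfway between background and observation (10.5); shrinking R pulls it toward the data, shrinking B toward the model. Different, equally defensible choices of these covariances yield different "best" states, which is the subjectivity noted above.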
DA products are more likely to be useful if their underlying measurements are made with favorable sampling scales, as determined by the model solutions with which they are to be combined. This argues in favor of using numerical models to help design experiments, if DA is anticipated. Thus far, this has been done very little in oceanography, but it is likely to increase.
There is also a DA formalism for hypothesis testing, leading to a probabilistic statement about whether particular observations and a model are consistent, within the uncertainties of the comparison. Proponents of this use of DA like its potential for statistical rigor. However, in my view this is usually much less fruitful than less formally defined searches for consistency between observations and model classes by imaginative scientists who understand the uncertainties in each beyond anyone's ability to specify error covariances. Therefore, I expect that most model improvements and assessments of their skill will not be done through DA, but instead by comparison of model solutions and data analyses made as independently of each other as feasible. (Again, meteorological experience provides guidance.) Human intelligence is still better than artificial intelligence, except in simple games like chess.
7. Desirable Alliances
Physical oceanographers often view their subject as somewhat self-contained in its subject matter; sometimes even the sub-disciplines defined by methodology (e.g., numerical modeling) are viewed this way. Looking forward to problems in climate and the human-modified environment, however, these disciplinary borders are unlikely to endure. This is perhaps especially true for numerical modeling, since it requires whole-system formulations to achieve well-posed calculations. We therefore should be seeking to build alliances of several types:
8. Summary
Much progress has been made in the quantity and scope of numerical solutions of physical oceanographic phenomena, in the computing power utilized, and in devising the algorithms and parameterizations that enable the calculations of the fluid dynamical equations. The prospects are very good for continuing progress in all these areas.
Comparing observations with model solutions is a problematic enterprise; nevertheless, these are challenges that should be confronted. Oceanic observations are often too sparse in their information content, and new measurement techniques are desirable that could increase sampling density and breadth. Since both the ocean and numerical solutions exhibit complex behavior, new comparison procedures are desirable that better accommodate this complexity; the formalism of data assimilation is one approach to this problem, but other approaches are also desirable.