Title: Optima
Full Citation
Permanent Link: http://ufdc.ufl.edu/UF00090046/00046
 Material Information
Title: Optima
Series Title: Optima
Physical Description: Serial
Language: English
Creator: Mathematical Programming Society, University of Florida
Publisher: Mathematical Programming Society, University of Florida
Place of Publication: Gainesville, Fla.
Publication Date: July 1995
 Record Information
Bibliographic ID: UF00090046
Volume ID: VID00046
Source Institution: University of Florida
Holding Location: University of Florida
Rights Management: All rights reserved by the source institution and holding location.









International Mathematical
Programming Symposium
The Symposium is held every three years un-
der the auspices of the Mathematical Pro-
gramming Society. By a tradition of the So-
ciety, the site of the Symposium has usually
been alternated between places in and outside
of North America. Thus, since the 1997 Sym-
posium is to be held in Lausanne, locations
within North America are preferred for the
2000 Symposium. Proposals for any site will
be considered, however. Meeting dates during
the month of August are also preferred.
The main criteria for selection of the
Symposium site are:
- Presence of mathematical programming researchers in the geographic area who are interested in organizing the symposium.
- Attendance open to prospective participants from all nations.
- Availability of an attractive facility with a sufficient number of meeting rooms, standard lecture equipment, and other facilities required by a Symposium.
- Availability of a sufficient supply of reasonably economical hotel and/or university dormitory rooms fairly near the meeting facility.
A copy of the Society's "Guidelines for Sub-
mission of Proposals" and further information
can be obtained from the chairman of the Ad-
visory Committee for the 2000 Symposium:
Robert Fourer (4er@iems.nwu.edu)
Department of Industrial Engineering and
Management Sciences
Northwestern University
2225 North Campus Drive
Evanston, IL 60208-3119, U.S.A.
Other members of the Advisory Committee
are Jens Clausen, Copenhagen
(clausen@diku.dk), and Katta G. Murty,
Ann Arbor (Katta.Murty@umich.edu).

The following memorial resolution was
adopted by the Faculty of Princeton
University at its monthly meeting on
March 6, 1995. It was written and
presented to the Faculty by Harold
Kuhn on behalf of a committee
composed of Professors Kuhn,
Joseph J. Kohn and Hale Trotter.
Albert William Tucker, who was Albert
Baldwin Dod Professor Emeritus of Math-
ematics, died at the age of 89 on January 25,
1995, after a long illness. He was one of the
last surviving members of a group of math-
ematicians who, in the 1930's and '40's,
transformed the Mathematics Department of
Princeton University into an internationally
renowned center for mathematical research.
It is an impossible task to try to capture the
unique quality of this man in a short memo-
rial resolution. The lists of positions held, of
honors conferred, and of projects completed
are far too cold and impersonal to convey his
essential achievements. Although much of
this impressive record will be recounted, it is
the human evidence that is most real and im-
portant. He was known to his friends, his
colleagues, his students, and his children as
"Al" and so I shall call him here.



Honors First Rank Statesman of Mathematical Programming
software & computation 6
conference notes 7-10
book reviews
gallimaufry 16





Al Tucker
CONTINUED

Born and reared a Canadian, Al
Tucker came to Princeton for
graduate study in 1929 after
receiving his B.A. and M.A. from
the University of Toronto. This
choice was made against the advice
of his professors in Canada, who
wanted him to study in Europe or, if
he insisted on the United States, at
Harvard or Chicago. His selection
was made on the basis of a graduate
catalog that listed courses by Veblen,
Lefschetz, Alexander and Eisenhart
in various areas of geometry that
excited him. His single letter of rec-
ommendation, sent to Dean Henry
B. Fine, went unanswered for many
weeks; it was not known in Toronto
that Fine had been killed in the first
automobile related fatality on
Nassau Street. When the letter was
found, it was too late to apply in the
regular way; however, on the basis of
his teaching experience in Canada,
Tucker was appointed a part-time
instructor for the 1929-30 school
year with a salary of $1,000 and free
tuition. Thus began Al Tucker's
distinguished career as a teacher.
After completing his Ph.D. under
Lefschetz, he joined the Princeton
Faculty in 1933, was appointed
Assistant Professor in 1934, Associ-
ate Professor in 1938 and Professor
in 1946. He succeeded Emil Artin as
Dod Professor in 1954, retiring to
emeritus status in 1974. Princeton
shared his teaching talents on a
number of occasions, with Stanford,
MIT, Dartmouth, Arizona State, as
Phillips Visitor at Haverford
College, as Visiting Lecturer for the
Mathematical Association, as guest
Lecturer at the Rockefeller Institute.
He was Fulbright Lecturer at four
Australian universities and lecturer
at several European universities for
the OEEC. In the year following his
retirement, he was Mary Shepard
Upson Visiting Professor of Engi-
neering at Cornell University.

Behind this formal record stands a
veritable army of active mathemati-
cians who give evidence of a man
who taught them with exquisite care
and precision. Among them, one
may cite John Milnor, who heard
from Al Tucker the Borsuk conjec-
ture on the total curvature of a knot
and went on to solve the problem while
still a freshman. As another ex-
ample: Marvin Minsky, a pioneer in
artificial intelligence, who was taken
on by Tucker as a Ph.D. student
when no one else was either quali-
fied or courageous enough to do so.
As a last example: John Nash, who
shared the 1994 Nobel Memorial
Award in Economics for his Ph.D.
thesis on noncooperative game
theory written under Al Tucker's
supervision in 1950.
His lectures sparkled with penetrat-
ing examples. Perhaps the most fa-
mous is an illustration of non-zero-
sum game theory constructed for an
expository seminar before a general
audience of psychologists at
Stanford in 1949-50. This game,
which goes by the name of the
Prisoner's Dilemma, has inspired
countless research papers and several
entire books. He was generous with
his teaching materials as well. His
examples and teaching exercises
have found their way to the math-
ematical public more often than not
through the books of others. With a
tendency for procrastination that
came from a deep-seated perfection-
ism, his last book, coauthored with
Evar Nering, appeared last year,
some eighteen years after it was begun.
Al Tucker was a statesman of the
mathematical community of the
first rank. He was a Council mem-
ber and Trustee of the American
Mathematical Society, President of
the Mathematical Association of
America, a Vice President of the
American Association for the Ad-
vancement of Science (AAAS),
Chairman of the Conference Board
of the Mathematical Sciences
(CBMS), and Chairman of the
Mathematical Programming Soci-

ety. It was characteristic that his ser-
vice extended beyond the confines of
individual mathematical organiza-
tions to strengthen our ties with
other sciences through the AAAS

and to unite the mathematical com-
munity through the CBMS. At the
highest national level, his wisdom
was sought as a member of the first
President's Committee on the Na-
tional Medal of Science and as a
consultant to the President's Science
Advisory Committee. In all of these
positions, his colleagues learned to
appreciate the wise deliberation with
which he confronted any task, large
or small.
Al Tucker's initial research interests
were in topology as a student of
Solomon Lefschetz. Although he had
a substantial introduction to applied
mathematics during World War II, a
major shift in his research occurred
in the summer of 1948, when he ini-
tiated a research project with two
graduate students on the relation-
ship between linear programming
and the theory of games. That
project, with major support from the
Office of Naval Research, continued
until 1972. The special quality of
Tucker's leadership is shown by the
large number of young people who
produced basic results in game
theory, mathematical programming,
and combinatorics while participants
in this project.
In the '50's and '60's, Al Tucker was
in the forefront of efforts to reform
the mathematical curriculum at all
levels. Again the measure of his ser-
vice is not in the number of offices
held, which were numerous, but
rather in the quality of the final
products and in the many jewels of
expository mathematics that bear the
imprint of his clarity and elegance.
Out of his many achievements, I
know that he was most proud of two
which must be cited here. In 1934-
39 he was in charge of the Princeton
Mathematical Notes and then orga-
nized and managed their successor,
the Annals of Mathematics Studies
from 1941 to 1949. This was the
first paperback series produced by

photo-offset technology from type-
written manuscripts that brought
important works of higher math-
ematics to a wide audience of stu-
dents and researchers at a reasonable
cost. He co-edited six of the first
100 volumes himself.
A second project was undertaken
when Al Tucker was 79. He then
initiated an oral history of math-
ematics at Princeton in the 1930's.
Funded by the Sloan Foundation, it
consists of roughly three dozen in-
terviews and more than 100 hours of
tape recordings. The interviews ex-
plore the conditions that enabled
Princeton to develop the best math-
ematics department in the world. Of
the three dozen interviewees, he was
acknowledged by the director of this
project to be the most responsive
and to have the best memory.
One of the most fitting tributes to
the qualities of Al Tucker's life in
mathematics was composed by John
Sloan Dickey, President of
Dartmouth College, and read on the
occasion of the award of an honor-
ary degree of Doctor of Science in
June, 1961. It captures such a large
share of the debt we owe him that it
seems proper to paraphrase it here:
"[Over six] decades ago [he] began
an academic career at Princeton
which became a mission to math-
ematics. In a field where scholarship
scores only if the idea is both new
and demonstrably true, [his] ideas
have won their way in topology, in
the theory of games, and in linear
[and nonlinear] programming. But
even in mathematics a mission is
more than ideas; it is also always a
man, a man who cares to the point
of dedication, whose concern is that
others should care too, and who can
minister to the other fellow, as the
need may be, either help or forbear-
ance. Because [Al Tucker embodied]
in extraordinary measure both [the]
profession's love of precision and
man's need for conscientious leader-
ship, mathematics in America at all
levels is today higher than it was and
tomorrow will be higher."





Roger J.-B. Wets received the 1994 Dantzig Prize, for his contri-
butions to all aspects of stochastic programming and to variational
convergence in the approximation of infinite-dimensional problems.
The prize was jointly awarded to Claude Lemarechal who was in-
terviewed in the previous issue of OPTIMA. The Dantzig Prize is
awarded once every three years by the Mathematical Programming
Society and the Society for Industrial and Applied Mathematics to
recognize original, broad and deep research making a major impact
on the field.
Among Wets' many important contributions to stochastic program-
ming are fundamental investigations into the geometry of the solu-
tion set, the properties of the value function, conditions for exist-
ence and stability of optimal solutions, and the structure of dual
problems. On the algorithmic side his contributions include the ba-
sic and fundamental L-shaped method. Wets has also been very ac-
tive in applications ranging from the environment to finance.
Wets received his Ph.D. in Engineering Sciences in 1964 from the
University of California at Berkeley. From 1964 until 1970 he was a
staff member of the Mathematics Group of the Boeing Scientific Re-
search Laboratories in Seattle. He then joined the University of
Chicago, and in 1972 he joined the University of Kentucky. Since
1984 he has been at the Department of Mathematics at the Univer-
sity of California at Davis.

OPTIMA: How did you start
your work in stochastic programming?
RW: When I finished high-
school, due to special circum-
stances I had to get involved in
the family's business that in-
cluded a small cardboard fac-
tory. While I was taking care of
the daily management of the
company, I also studied for a de-
gree in applied economics. After
I had worked for the company
for about five years and realized
that I had only limited inter-
est in staying in the business
world, I went back to studying
some mathematics. Because of
my background, I took a course
in operations research from
Jacques Dreze at Universite
Catholique de Louvain. I think it
was the only such course offered

in western Europe at that time.
After I indicated to Dreze that I
was interested in optimization,
he suggested that I should go
and study with Dantzig at Ber-
keley. So in some way he was
the person responsible for get-
ting me into this field. At Berkeley,
I eventually got to the stage
where I had to discuss a possible
thesis topic with George
Dantzig. While spraying the
roses in his garden, Dantzig
said, "You know, Roger, you like
optimization, probability theory
and economics, so why don't
you work in uncertainty?" Al-
though I had no idea what he
meant by "work in uncertainty,"
the subject seemed like it had the
right mix. So I started to look at
the few papers dealing with the subject.
OPTIMA: Once you had sur-
veyed the literature, what was
the first problem you decided to
work on?
RW: The first problem I looked
at that was beyond the problems
that Dantzig had studied was
the manufacturing of antifreeze
for an oil company. That prob-
lem led naturally to introducing
the notion of induced constraints.
These constraints are not explic-
itly included in the formulation
of the problem but are induced
by the dynamics of the system.
They are generated by the need
to only choose decisions (at the
present time) that will guarantee
future feasibility, i.e., limit the
choice to decisions that are such
that, whatever occurs, it will be
possible to find feasible solu-
tions in the future. In fact, what
always motivated my work,
even the most theoretical parts,
was a specific application that
would represent a class of poten-
tial applications.
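The induced-constraint idea can be sketched in a few lines. The mini-model below is a hypothetical inventory example of our own construction, not the antifreeze problem Wets describes; it shows how requiring second-stage feasibility in every scenario induces a first-stage constraint that appears nowhere in the written formulation.

```python
# Toy illustration of "induced constraints" in two-stage stochastic
# programming (hypothetical data, not Wets' antifreeze model).
# First stage: choose a production level x at unit cost c.
# Second stage: in scenario s, demand d_s must be served from x alone,
# so x is admissible only if x >= d_s in every scenario.

scenarios = {"low": 80.0, "medium": 110.0, "high": 140.0}  # demand d_s

def induced_lower_bound(demands):
    """The constraint x >= max_s d_s is never written in the first-stage
    model; it is *induced* by requiring second-stage feasibility under
    every possible realization."""
    return max(demands)

bound = induced_lower_bound(scenarios.values())
c = 2.5
x = bound   # cheapest first-stage decision that guarantees future feasibility
print(f"induced constraint: x >= {bound}, first-stage cost = {c * x}")
```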

OPTIMA: What is the main
question in stochastic programming?
RW: Solving stochastic program-
ming problems and being able to
give a full analysis/interpreta-
tion of the solution is the only
question that really matters. If
we concentrate on solving prob-
lems, I consider the question of
designing valid approximation
schemes as central to the subject.
Stochastic programming prob-
lems are basically infinite-di-
mensional optimization prob-
lems, so to solve them one has to
approximate them somehow.
And that's quite difficult, be-
cause in order to approximate
you should have the possibility
of calculating bounds, and in the
case of some large problems it
might be very difficult to get to
that point. The basic approach to
solving stochastic programming
problems is almost always the
same: you construct a very effi-
cient algorithm for a certain class
of deterministic problems with a
given structure. The special
structure is that of a stochastic
programming problem involv-
ing a distribution for random el-
ements with finite (manageable)
support, i.e., there are only a fi-
nite number of possible realiza-
tions for the random parameters
of the problem. Let's call such an
algorithm the core algorithm. The
next step is to justify the
discretization. By this I mean
that once you have a solution
provided by the core algorithm,
you need to be able to claim that
this solution is reasonably good
for the originally formulated
problem. This is done in two dif-
ferent ways: either you try to es-
tablish bounds by an approxi-
mation technique, or you use
sampling which gives a probabi-
listic justification saying that the
solution you obtained is going to
be valid with a high level of reliability.



OPTIMA: Could you mention
and explain some of your results
that had a big impact on the field?
RW: On the algorithmic side, i.e.,
in the design of the "core" algo-
rithm, there is the so-called L-
shaped method and the progressive
hedging algorithm, based on the
aggregation principle for sto-
chastic optimization problems.
In 1965-66, R. Van Slyke was
working at the interface between
optimal control and mathemati-
cal programming and was inter-
ested in the design of an algo-
rithmic procedure that would
handle state-constraints effi-
ciently. Once the problem was
formulated as a mathematical
programming problem, the lin-
ear constraints had an L-shaped
structure. So did the formulation
of a two-stage linear stochastic
programming problem. We de-
signed a decomposition proce-
dure that would make efficient
use of this special structure. It is
really a cutting plane method
and, as was pointed out by
E. Balas when we showed him
the galley-proofs, was along the
lines of a method suggested by
Benders for mixed integer pro-
gramming. But in the case of sto-
chastic programs with recourse,
one is able to simplify tremen-
dously the work required to gen-
erate both optimality cuts by re-
lying on bunching/sifting tech-
niques, and feasibility cuts by re-
lying on a partial ordering of the
possible realizations. There are
now a number of variants and
extensions of this method
(Ruszczynski's regularized ver-
sion, Birge's nested decomposi-
tion for multistage problems,
Gassmann MSLiP, for example)
that have been suggested and
implemented, including some
methods relying on sampling to
generate the cuts such as the sto-
chastic decomposition method

(Higle and Sen) or the method
proposed by Dantzig, Glynn and
Infanger that relies on impor-
tance sampling.
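The cut-generation idea behind the L-shaped method can be sketched on a toy two-stage problem with simple recourse. Everything here is illustrative: the scenario data are invented, and the "master problem" is minimized by brute force over a grid purely for readability, where a real implementation would solve an LP.

```python
probs = {80.0: 0.3, 110.0: 0.5, 140.0: 0.2}   # demand: probability (invented)
c, q = 1.0, 3.0                                # first-stage cost, shortage penalty
grid = [float(i) for i in range(201)]          # stand-in for the master's domain

def recourse(x):
    """Expected second-stage cost E[q * max(d - x, 0)] and one subgradient."""
    val = sum(p * q * max(d - x, 0.0) for d, p in probs.items())
    sub = -q * sum(p for d, p in probs.items() if d > x)
    return val, sub

cuts = [(0.0, 0.0)]     # theta >= 0: the recourse cost is nonnegative
x = 0.0
while True:
    val, g = recourse(x)
    cuts.append((val - g * x, g))              # optimality cut at the trial point
    lower = lambda y: c * y + max(a + s * y for a, s in cuts)
    x = min(grid, key=lower)                   # "master problem" by grid search
    if lower(x) >= c * x + recourse(x)[0] - 1e-9:
        break                                  # cutting-plane bound meets true cost

print(x, c * x + recourse(x)[0])               # optimal first-stage decision, cost
```

For these data the loop stops after a handful of cuts, which is the point of the method: each cut summarizes the whole second stage at one trial decision.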
Scenario analysis is an alterna-
tive approach to stochastic pro-
gramming when there is uncer-
tainty about some of the param-
eters of an optimization prob-
lem. In scenario analysis, one
considers all possible scenarios
(possible values, or realizations,
of the parameters). For each of
these scenarios, one solves the
corresponding optimization
problem which will yield a num-
ber of different solutions. How-
ever, such a solution isn't very
good if the actual values as-
sumed by the parameters are
different from the specific sce-
nario used to generate the solu-
tion. The progressive hedging al-
gorithm enables us to proceed in
a manner similar to scenario
analysis and then modify the in-
dividual "scenario problems" so
that their solutions converge to
the optimal solution of the sto-
chastic optimization problem. I
had been impressed at a confer-
ence in Germany, by a report
about the use
of nonlinear programming to de-
sign an optimal management
plan for a hydroelectric power
system which involved solving a
nonlinear programming prob-
lem nearly 18,000 times to take
potentially different weather
patterns into account. The idea
that users might be willing to
solve a large number of optimi-
zation problems to gain some in-
sight in what might be a robust
decision suggested that they
would also be willing to con-
sider methods for stochastic pro-
gramming problems that would
arrive at a solution by exploiting
the information gained from
solving (deterministically) a
large number of cases. In 1986,
R.T. Rockafellar and I had to
stay overnight in Beijing. Prob-
ably in order to give ourselves

some justification for not getting
involved in some urgent (but
less pleasant) tasks, we started to
discuss the possible design of an
algorithmic procedure for sto-
chastic optimization problems
that would never require more
than solving deterministic ver-
sions of the given problem, al-
beit numerous times. We quickly
came to the realization that the
key was to exploit a certain de-
composition of the problem
based on relaxing the
nonanticipativity restrictions
that model the fact that a deci-
sion at stage t can only take into
consideration the information
available at time t. Our work in
the early '70s on optimality and
duality for stochastic optimiza-
tion problems had for the first
time shown that a price system
could be attached to these re-
strictions, i.e., that
nonanticipativity could be intro-
duced explicitly as a constraint.
This eventually led to the pro-
gressive hedging algorithm
where "progressive hedging" is
supposed to suggest that al-
though one solves the problem
scenario by scenario, the
nonanticipativity restrictions are
progressively enforced through
a price mechanism.
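The progressive hedging iteration described above can be sketched on a toy quadratic problem, where each scenario subproblem has a closed-form minimizer. The data and penalty parameter are invented for illustration; what matters is the price update on the relaxed nonanticipativity constraint.

```python
# Toy progressive hedging (illustrative data; quadratic scenario costs so
# each subproblem solves in closed form):  min_x E[(x - d)^2].
scenarios = {10.0: 0.3, 14.0: 0.5, 20.0: 0.2}   # outcome d_s: probability p_s
r = 1.0                                          # proximal penalty parameter
w = {d: 0.0 for d in scenarios}                  # prices on nonanticipativity
xbar = 0.0                                       # current implementable policy

for _ in range(200):
    # scenario subproblems: min_x (x - d)^2 + w_s*x + (r/2)*(x - xbar)^2
    xs = {d: (2 * d - w[d] + r * xbar) / (2 + r) for d in scenarios}
    xbar = sum(p * xs[d] for d, p in scenarios.items())   # aggregation step
    for d in scenarios:
        w[d] += r * (xs[d] - xbar)                        # progressive price update

print(xbar)   # tends to E[d] = 14.0, the minimizer of the expected cost
```

Each pass solves the problem "scenario by scenario," exactly as in scenario analysis, yet the prices w_s progressively force the scenario solutions to agree on a single first-stage decision.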
On the approximation side, very
early on researchers such as A.
Madansky, W. Ziemba, P. Kall,
etc., in stochastic programming
had been concerned about
"bounds." It was clear that one
couldn't expect that finding a
"solution" to a stochastic pro-
gramming problem would have
the same meaning as finding a
solution of a deterministic opti-
mization problem. A general sto-
chastic programming problem
can't be solved, at least not in fi-
nite time! In '73 G. Salinetti came
to work on her research under
my supervision, and we thought
we should try to develop some

convergence results for stochas-
tic programming problems that
in particular would justify, or
not, the discretization of the
measure associated with the ran-
dom events. Since random pa-
rameters in a stochastic optimi-
zation problem possibly affect
both the objective and the con-
straints, we were led to work
with a convergence notion that
would handle optimization
problems from a global view-
point. The work in convex analy-
sis in the '60s had already
stressed the importance and the
usefulness of the "epigraphical"
approach, and this led us to rely
on epi-convergence (a term only
coined at a later time) to study
and obtain convergence results
for stochastic programming
problems. As this technique be-
came more familiar, it was ex-
ploited to obtain convergence re-
sults for the optimal solutions of
"sampled" problems, to make
the connection between stochas-
tic programming and math-
ematical statistics. Now there is
even a powerful law of large
numbers for stochastic optimiza-
tion problems.
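The connection to sampling that Wets mentions can be illustrated with the simplest possible sample-average approximation: for min over x of E[(x - d)^2], the SAA solution is just the sample mean, and its convergence to the true minimizer E[d] is a bare-bones instance of these consistency results. The distribution below is invented for illustration.

```python
import random

random.seed(1)

def saa_minimizer(samples):
    # Minimizer of the sample-average problem  min_x (1/N) * sum (x - d_i)^2,
    # which is simply the sample mean.
    return sum(samples) / len(samples)

true_mean = 5.0   # the true problem min_x E[(x - d)^2] has solution E[d] = 5.0
for n in (10, 1000, 100000):
    draws = [random.gauss(true_mean, 2.0) for _ in range(n)]
    print(n, round(saa_minimizer(draws), 3))   # approaches 5.0 as n grows
```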
OPTIMA: If you look at the
applications you have worked
on, could you say that we are
now at a point where we have
"standard" tools available to
solve them?
RW: There are now quite effi-
cient computer codes that will
solve linear multistage stochastic
programming problems, and al-
though there are no commercial
packages available at this time,
at least one of them, the SP/OSL
code of A. King at IBM, is of that
quality. There are a number of
high quality experimental codes
that have been written, even for
parallel processors. Among the


developers of computer codes I
should mention: A. Gaivoronski,
T. Szantai, J. Birge,
G. Gassmann, L. Nazareth,
K. Ariyawansa, G. Infanger,
J. Mulvey, A. Ruszczynski,
H. Vladimirou, S. Zenios,
J. Higle, S. Sen,..., and I am cer-
tainly forgetting some.
OPTIMA: What are the areas
of stochastic programming
where still a lot of work has to
be done, and what is the most
important open theoretical question?
RW: There is probably no single
question that can be identified as
"the most important theoretical
question" because there are
quite a number of stochastic pro-
gramming models (recourse
models, models with chance
constraints, and variants
thereof), each one with its own
particular collection of open
questions. After this disclaimer, I
think that the most important
questions are related to solution
validation. Since it's almost al-
ways impossible to solve the full
version of a stochastic program-
ming problem, and one has to be
satisfied with a solution gener-
ated by solving an approximat-
ing problem, one must be able to
accept or reject this proposed so-
lution as a reasonably good solu-
tion of the full version. For cer-
tain classes of problems, bounds
can be computed, but even then
obtaining sharp bounds might
involve almost as much work as
solving an only slightly reduced
version of the full problem. For
multistage stochastic program-
ming problems, even the calcula-
tion of rough bounds might be
quite involved. But stochastic

programming problems have
properties that we have not yet
been able to translate in quanti-
tative terms and exploit
computationally. For example,
the objective of the deterministic
equivalent problem is usually
quite flat in a region, say R, sur-
rounding the optimal solution.
Although Lipschitz continuity
results have been obtained for
this objective function, the
Lipschitz constant in these re-
sults would certainly not allow
us to conclude anything about
this function being "nearly flat"
in the region R. But there are
also many other exciting/open
questions in this area. I even re-
cently wrote a paper about
"Challenges in Stochastic Pro-
gramming" which is to appear
in Mathematical Programming
in a special issue devoted to sto-
chastic programming. Let me
just mention a few of the issues
mentioned in this paper: the dis-
tribution problem, stochastic in-
teger programming, the relation-
ship between recourse and
chance-constrained models, the
evaluation of information, and
the extension of the multistage
models to continuous time models.
OPTIMA: With which other
fields does stochastic program-
ming interact?
RW: Since all deterministic opti-
mization models have a stochas-
tic version, there is a high level
of interaction with the method-
ology of linear and nonlinear
programming, both at the theo-
retical and computational level.
And I should even add combina-
torial optimization now that
there are researchers that have
started to investigate stochastic
integer programming. The work
on large-scale systems can easily
be justified by the need to solve

stochastic programming prob-
lems. So contributions to sto-
chastic programming methods
turn out to also yield contribu-
tions to decomposition methods
and related topics. The approxi-
mation theory for variational
problems, in particular, the
theory of epi-convergence, has
three roots. One of them is the
classical calculus of variations,
dealing with the limits of inte-
gral functions. Another motiva-
tion came from the study of par-
tial differential equations with
highly oscillating coefficients
(homogenization). The third root
is the design of approximation
schemes for stochastic programming.
Stochastic programming has also
contributed a number of new
ideas in probability and statis-
tics, and I expect this interaction
to bear many fruits in the future.
One of them was the work on
exponential families of distribu-
tions and log-concave measures
initiated by A. Prekopa. There
are now also new and quite gen-
eral laws of large numbers for
function spaces that came from
what was needed in stochastic programming.
OPTIMA: What is your feeling
about the future of optimization
and, more particularly, of sto-
chastic programming?
RW: Optimization is, of course, a
fundamental human activity and
has been a major modeling tool
in the physical and social sci-
ences. Because of this, it has con-
stantly motivated the develop-
ment of mathematical theories
and techniques (calculus, the cal-
culus of variations, significant

portions of analysis, algorithmic
procedures, etc.). There seems to
be no reason why this shouldn't
go on as long as there is some in-
terest in finding "best" solutions.
From a more limited viewpoint,
I see some shift in the center of
interest for the mathematical
programming community. Our
motivation has mostly come
from the operations research-
type problems. I suspect that en-
gineering-type problems will be-
come a more important source of
motivating problems, and that
will affect both the theoretical
and computational develop-
ments. Typically, engineering-
type problems are infinite di-
mensional in nature and not nec-
essarily convex. This should
stimulate work on approxima-
tions, large scale systems, and
global optimization. As far as
stochastic programming is con-
cerned, I think the future is very
bright now that the computa-
tional tools are becoming avail-
able. Since most important deci-
sion problems almost always in-
volve some uncertainties, and
since the solution of any model
that doesn't take these uncer-
tainties into account can be seri-
ously flawed, even sometimes
suggesting just the opposite
course of action, I suspect that in
the future any good decision
maker confronted with an im-
portant decision will only let
himself be guided by solutions
generated by stochastic optimi-
zation models. Not only are sto-
chastic programming problems
of immediate practical interest,
they are challenging conceptu-
ally, mathematically and
computationally. That's what
makes it so stimulating.


THIS is the first regular
article on computation
and software. With time,
I hope to develop this
column into a standard
format which will feature descrip-
tions of optimization software, in-
terviews with their developers,
and informational pieces on mod-
eling and computational issues.
From time to time I will also invite
a prominent researcher to write a
feature article on the state of the
art for solving specific classes of
problems. The content of the col-
umn in this issue is primarily in-
formational. The goals envi-
sioned for this column are: (1) to
provide readers with information
for identifying and locating the
best available software for their
work and (2) to keep readers
abreast of developments in com-
putational technology. The latter
would include both new math-
ematical procedures and their
implementations on various archi-
tecture computers. Much of the in-
formation in (1) is published in
newsletters of different profes-
sional societies. As it comes to my
attention, I will provide it to the
readers of OPTIMA. Please for-
ward to me by e-mail any informa-
tion and announcements which
you would like to see receive
wider circulation. As for the infor-
mation in (2), I will rely on authors
of new methods and vendors of
software and hardware to contrib-
ute items to be communicated to
members of the Mathematical Pro-
gramming Society. I welcome e-
mail correspondence on any mat-
ter of interest.
THE COLUMN in this issue will be
devoted to problems and solutions
in nonlinear and nonconvex pro-
gramming. These problems have
traditionally been relegated to aca-
demic pursuits with few real-
world applications since large-
scale instances were effectively in-
tractable. But with the increasing
use of heuristics coupled with the
availability of fast and affordable
computers, some promising
progress is being made on a num-
ber of problems. Because most

nonlinear and nonconvex pro-
gramming techniques are based
on solving a sequence of approxi-
mating linear or quadratic pro-
grams or models, our ability to
solve efficiently large instances of
these latter problems largely gov-
erns the tractability of the former
problems. Unfortunately, popular-
ity of a software product is not al-
ways positively correlated with its
effectiveness, especially when it is
too difficult to use correctly. Early
nonlinear programming codes re-
quired the user to supply subrou-
tines for all problem functions and
their derivatives. Chang-
ing problem parameters
involved editing and
recompiling subroutines
which added to the tedium of the
task. Recent developments in alge-
braic modeling languages and au-
tomatic differentiation have done
more to popularize nonlinear pro-
gramming than decades of theo-
retical advances, because now the
power of the methods is placed
in the hands of end-users who do
not have to be nonlinear optimiza-
tion researchers in order to under-
stand how to input the problem
and interpret the solution output.
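To suggest why automatic differentiation removes the burden of hand-coded derivative subroutines, here is a minimal forward-mode sketch using dual numbers. This is purely illustrative; the names `Dual` and `value_and_derivative` are my own and do not come from any package mentioned above.

```python
class Dual:
    """Dual number a + b*eps with eps**2 == 0; b carries the derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)
    def __add__(self, other):
        o = self._lift(other)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, other):
        # product rule: (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps
        o = self._lift(other)
        return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)
    __rmul__ = __mul__

def value_and_derivative(f, x):
    """Evaluate f(x) and f'(x) in one pass, with no hand-coded derivative."""
    y = f(Dual(x, 1.0))
    return y.val, y.dot

# f(x) = 3*x*x + 2*x has f'(x) = 6*x + 2, so at x = 2 this yields (16.0, 14.0)
val, der = value_and_derivative(lambda x: 3 * x * x + 2 * x, 2.0)
```

The user writes only the function itself; the derivative falls out of the overloaded arithmetic, which is the effect the column attributes to modern modeling-language and automatic-differentiation tools.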
THERE IS still a long way to go be-
fore methods, software and hard-
ware will come together to rou-
tinely solve the most persistent en-
gineering design and real time
control problems. However, the
rate of progress should accelerate
as more scientists and researchers
embrace the many possibilities
that are opening up to explore
new paths for surmounting previ-
ously unreachable peaks. In order
to contribute to that end, I offer
below a sampling of software and
reference material to (hopefully)
launch some of our readers into
this fruitful field of endeavor.
THE APRIL 1995 issue of OR/MS
Today (the INFORMS news maga-
zine) contains a survey of nonlin-
ear programming software by
Stephen Nash. It lists 30 packages,
both commercial and public do-
main, and includes pricing and all
correspondence information. The
packages are compared by the
modeling languages they support,
the platforms they run on, the al-
gorithms they use, and the size of
the problems they can solve. The
listed software (usually by acro-
nym) is as :..II. AMPL, Con-
strained Maximum Likelihood,
Constrained Optimization, CUTE,
DFNLP, DOC (Direct Optimal
Control), DOC/DOT, FSQP/
IMSL FORTRAN and C Libraries,
Toolbox, MATLAB Optimization
Toolbox, MINOS 5.4, NAG C Li-
brary, NAG FortMP, NAG FOR-
4.06, OPTIMA (no relation to this
newsletter!), SLP, Smart Optimizer
(SOPT) V1.2, SPRNLP, SQP, and
What's Best! Future software sur-
veys in OR/MS Today are sched-
uled for:
June: Algebraic Modeling
August: Simulation
October: Linear Programming
December: Scheduling.
devoted to computational issues in
nonlinear and nonconvex pro-
gramming continue to grow un-
abated, especially in focused ap-
plication areas. Some conferences
are more like workshops dedi-
cated to a single family of prob-
lems. Established journals for
methodological developments, in
addition to Mathematical Pro-
gramming Series B, include the
INFORMS Journal on Computing,
SIAM Journal on Scientific Com-
puting, Computational Optimiza-
tion and Applications, Optimiza-
tion Methods and Software, and
the Journal of Global Optimiza-
tion, to name only a few. New
journals that come to mind in-
clude the Journal of Heuristics
(edited by Fred Glover), Compu-
tational and Mathematical Orga-
nization Theory, and Reliable
Computing (formerly Interval
Computations). Recently con-
cluded conferences include the
State of the Art in Global Optimi-
zation: Computational Methods
and Applications held at
Princeton University in April and
the DIMACS Workshop on Global
Minimization of Nonconvex En-
ergy Functions: Molecular Con-
formation and Protein Folding
held at Rutgers University in
March. Proceedings of these two
conferences are forthcoming.
UPCOMING conferences in-
clude the IMACS/GAMM Inter-
national Symposium on Scientific
Computing, Computer Arithmetic
and Validated Numerics to be
held at Bergische Universität,
Wuppertal, Germany, in Septem-
ber 1995, the Third Workshop on
Global Optimization to be held in
Szeged, Hungary, in December
1995, and the Fifth INFORMS
Computer Science Technical Sec-
tion Conference to be held in Dal-
las, Texas, in January 1996.
IN A FUTURE issue, we plan to publish a list of
internet sites that provide either
access to or information on opti-
mization software, test problems
and problem generators, both
commercial and public domain.
Readers are encouraged to supply
information for this list; I would
like it to be as complete as pos-
sible.

N° 46    JULY 1995





• Franco-Japanese and
Franco-Chinese Conference on
Combinatorics and Computer
Science, Brest, France, July 3-5,
1995
• Conference on Optimization
'95, Braga, Portugal,
July 17-19, 1995
• International Symposium
on Operations Research with
Applications in Engineering,
Technology, and Management
(ISORA), Beijing, Aug. 19-22,
1995
• International Workshop on
Parallel Algorithms for Irregu-
larly Structured Problems,
Lyon, France, Sept. 4-6, 1995
• Symposium on Operations
Research 1995, University of
Passau, Germany, Sept. 13-15
• AIRO '95 Annual Confer-
ence, Operational Research
Society of Italy, Ancona, Italy,
Sept. 20-22, 1995
• ICCP-95: International Con-
ference on Complementarity
Problems: Engineering &
Economic Applications, and
Computational Methods,
Baltimore, Maryland, U.S.A.,
Nov. 1-4, 1995
• Third Workshop on Global
Optimization, Szeged, Hun-
gary, December 10-14, 1995
• Conference on Network
Optimization, Feb. 12-14,
1996, Center for Applied Opti-
mization, Gainesville, Florida
• Workshop on the Satisfiability
(SAT) Problem,
Rutgers University,
March 11-13, 1996
• IPCO V, Vancouver, British
Columbia, Canada, June 3-5,
1996
• IFORS 96 14th Triennial
Conference, Vancouver, Brit-
ish Columbia, Canada, July 8-
12, 1996
• International Conference
on Nonlinear Programming,
September 2-5, 1996,
Beijing, China
• XVI International
Symposium on Mathematical
Programming, Lausanne,
Switzerland, Aug. 1997


Report on the Oberwolfach Conference on
Computational and Applied Convexity
January 29 February 4, 1995
Oberwolfach, Germany

This conference, organized by P. Gritzmann (Trier),
V. Klee (Seattle) and P. Kleinschmidt (Passau) was at-
tended by 38 participants from classical convexity theory,
mathematical programming, computational geometry and
computer science.
The presentations revealed exciting new developments in a
field where, typically, the problems are algorithmic in na-
ture, and the underlying structures are geometric with a
special emphasis on convexity. The questions are usually
motivated by practical applications in mathematical pro-
gramming and computer science as well as other, less
mathematical, areas of science.
Some lectures were devoted to integer programming and
polyhedral combinatorics (William R. Pulleyblank,
Andreas Hefner, Panos Pardalos). A new approach (Rekha
R. Thomas) based on Gröbner bases and Newton-
polytopes was presented. New results, some general, some
related to particular applications, were given which utilize
polyhedral approaches for solving large-scale combinato-
rial optimization problems. Various lattice point problems
were studied, in part from the viewpoint of integer pro-
gramming.
Other talks dealt with linear optimization (Robert M.
Freund, Uriel G. Rothblum), convex optimization prob-
lems (Farid Alizadeh, Dorit Hochbaum), and semi-defi-
nite programming. New algorithms, partly motivated by
results from classical convexity theory, were presented,
and new insight was gained in known methods. There
were also reports on some special purpose approaches
which were tailored to particular practical applications.

Geometric aspects of nonlinear (smooth and nonsmooth)
optimization were scrutinized in some other lectures. Geo-
metric partitioning and covering problems turned out to
be particularly relevant to global optimization (Pierre
Hansen, Reiner Horst).
Another group of talks focused on the computation and
optimization of certain geometric functionals. One of
these was motivated by the Hadamard determinant prob-
lem. Some centered on algorithmic reconstruction prob-
lems which are related to problems in computer vision or
computer tomography (Alexander Hufnagel, Peter
Gritzmann). In this context the algorithmic theory of con-
vex bodies played an important role.
Also presented were new algorithmic and theoretical re-
sults in geometric graph theory, the theory of polytopes,
tilings, and related combinatorial objects, (Ludwig
Danzer, Marek Karpinski, Victor Klee, Jeffrey C. Lagarias,
János Pach, Shmuel Onn). One of the lectures (Günter M.
Ziegler) surveyed the outstanding new results of Richter-
Gebert on the realization space of convex polytopes, which
solve a large variety of long-standing open problems in
polyhedral theory.
The conference showed that even though the participants
belonged to different fields with quite different tool-boxes,
approaches, and ideas for solving their problems, there is a
deep and close connection which is centered around the
basic concept of convexity.
In addition to those mentioned above, lectures were given
by Imre Bárány, Jürgen Bokowski, Thomas Burger, James
V. Burke, Dietmar Cieslik, Komei Fukuda, Martin Henk,
Petar Kenderov, David G. Larman, Horst Martini, Jiří
Matoušek, Diethard Pallaschke, Svatopluk Poljak, Ricky
Pollack, Nagabhushane Prabhu, Peter Recht, Joseph Stoer,
and Eckhard Weidner.

Franco-Japanese and Franco-
Chinese Conference on
Combinatorics and Computer
Science, Brest, France, July 3-5, 1995
The 8th Franco-Japanese and 4th Franco-
Chinese Conference will focus on pre-
sentation of new results from various
branches of Combinatorics and Com-
puter Science and discussion of new
problems of common interest. Topics
will include combinatorial (optimiza-
tion) problems in architectural synthe-
sis, artificial intelligence, image process-
ing, logic synthesis, parallel and distrib-
uted computing, scientific computing,
and theoretical computer science.
There will be a series of 30-minute con-
tributed talks. The preliminary list of
speakers includes C. Berge,
H. Fleischner, P. Hell, T.C. Hu, and
H. Noltemeier.
Publication of the proceedings in a rec-
ognized journal will be considered.
Laboratoire d'Informatique de Brest
(LIBr), Faculté des Sciences, 6, Avenue
Victor Le Gorgeu, B.P. 809, 29285
Brest Cedex, France
fax: +33 98016131,
e-mail: ccs95@univ-brest.fr

First Announcement and
Call for Papers
International Conference on
Nonlinear Programming
September 2-5, 1996
Beijing, China
An International Conference on
Nonlinear Programming will be held
at the Institute of Computational
Mathematics and Scientific/Engineer-
ing Computing, Chinese Academy of
Sciences, Beijing, China, from Sep-
tember 2-5, 1996. It is organized by
the Chinese Academy of Sciences and
the Chinese Natural Science Founda-
tion.
Invited lectures on recent advances in
nonlinear programming will be given.
A preliminary list of invited speakers
includes: J. Burke, R. Byrd,
T. Coleman, A.R. Conn, J. More,
J. Nocedal, M.J.D. Powell,
R.B. Schnabel, M. Overton,

K. Tanabe, R. Tapia, Ph. Toint,
H. Wolkowicz and M.H. Wright.
A limited number of short (20 min-
utes) papers will be accepted for pre-
sentation. Papers on theoretical, com-
putational and practical aspects of
nonlinear programming are welcome.
In part, this meeting is intended to
honor the many contributions of Pro-
fessor M.J.D. Powell to Optimiza-
tion. It is hoped that this meeting
will be similar to the one which Pro-
fessor Powell organized in Cambridge
in 1981. There will be no parallel ses-
sions. Apart from the invited lectures
and submitted short talks, there will
be also discussion sessions. The con-
ference proceedings will be published
by an international publisher, and all
the papers will be reviewed.
One or two sightseeing tours, includ-
ing visiting the Great Wall, will be
organized by the conference. There is


also a possibility of a post conference
tour to Xi-an, an ancient capital of
China, depending upon the number
of interested participants.
Prospective participants other than
invited speakers should send their
pre-registration giving address (postal
and e-mail, if available) and accom-
modation preference (single or
double bedroom in hotel) to the ad-
dress below by post or e-mail before
July 31, 1995. A second announce-
ment will be sent in September 1995
to all those who pre-register.

International Organizing Committee:
A.R. Conn (IBM T.J. Watson Research
Center, Yorktown Heights, USA);
J. Nocedal (Northwestern University,
USA); Ph. Toint (University of
Namur, BELGIUM); and Y. Yuan
(Chinese Academy of Sciences,
China).
For further information, please con-
tact the following address or any of
the international organizing commit-
tee members:
Prof. Ya-xiang Yuan, State Lab. of Sci-
entific and Engineering Computing
ICMSEC, Chinese Academy of Sci-
ences, P.O. Box 2719, Beijing
100080, China
FAX: +86-10-254-2285
e-mail: yyx@lsec.cc.ac.cn


Third Workshop on Global Optimization
Szeged, Hungary
December 10-14, 1995

This workshop is being organized by
the Austrian and Hungarian Opera-
tions Research Societies. Many tech-
nical, environmental and economic
problems have challenging optimiza-
tion aspects that require reliable and
efficient solution methods. Many of
these problems belong to the class of
nonlinear and nonconvex optimiza-
tion problems where standard opti-
mization methods frequently fail since
local optima, different from the glo-
bal ones that we aim to find, exist.
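A toy illustration of this difficulty (my own sketch, not drawn from the workshop material): a crude multistart scheme restarts a local descent from random points and keeps the best result, since any single descent can be trapped in a non-global local minimum. The test function and all parameter choices below are hypothetical.

```python
import math
import random

def f(x):
    # a multimodal test function: many local minima, one global minimum
    return math.sin(3 * x) + 0.1 * x * x

def local_descent(x, lr=0.01, steps=2000, h=1e-6):
    # crude gradient descent with a central finite-difference gradient;
    # it converges only to the local minimum nearest its basin
    for _ in range(steps):
        grad = (f(x + h) - f(x - h)) / (2 * h)
        x -= lr * grad
    return x

random.seed(0)
starts = [random.uniform(-5.0, 5.0) for _ in range(20)]
minima = [local_descent(x0) for x0 in starts]
best = min(minima, key=f)   # best local minimum found over all restarts
```

Different starting points land in different local minima; only the minimum over all restarts approximates the global one, which is exactly the gap between standard local methods and the global methods this workshop addresses.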
The workshop focuses on theoretical,
methodological, and algorithmic issues of
global optimization problems with
special emphasis on their real-life ap-
plications. The workshop aims at dis-
cussing and further developing the
most recent results in the wide range
of diverse approaches to global opti-
mization problems.
After the first (1985) and the second
(1990) Workshops held in Sopron,
Hungary, we are glad to announce
the Third Workshop on Global Op-
timization. From our preliminary dis-
cussions at various occasions during
the last two years, we know that the
overwhelming majority of the earlier
participants and many other col-

leagues are interested. Thus we look
forward to a meeting that is very
likely to match or even surpass the
very successful two earlier meetings.
Program Committee:
Pierre Hansen, Reiner Horst and
Panos M. Pardalos
Organization Committee:
Immanuel Bomze and Gabriele
Danninger, University of Vienna,
Vienna, Austria,
András Erik Csallner and Tibor
Csendes, József Attila University,
Szeged, Hungary,
Organizing Committee Address:
Tibor Csendes, József Attila Univer-
sity, Institute of Informatics, H-6701
Szeged, P.O. Box 652, Hungary
Phone: +36 62 310 011 (ext. 3839),
Fax: +36 62 312 292
E-mail: globopt@inf.u-szeged.hu
More information, including a regis-
tration form, can be found on the
WorldWideWeb at site http://
www.inf.u-szeged.hu/~globopt/ or
obtained via anonymous ftp from
ftp.jate.u-szeged.hu, in the direc-
tory pub/math/optimization/globopt.

Workshop on the Satisfiability (SAT) Problem
Rutgers University
March 11-13, 1996
The satisfiability (SAT) problem is central in mathematical logic, computing theory,
and many industrial application problems. The main focus of this workshop is to
bring together the best theorists, algorithmists, and practitioners working on the
SAT problem and its industrial applications. This workshop will feature the ap-
plication of theoretical and algorithmic results to practical problems and the pre-
sentation of practical problems for theoretical/algorithmic study. Major topics to
be covered in the workshop include: practical and industrial SAT problems and
benchmarks, case studies and practical applications of the SAT prob-
lem and SAT algorithms, new algorithms and improved techniques for satisfiability
testing, specific data structures and implementation details of the algorithms, and
the theoretical study of the problem and algorithms. As an important activity of
the workshop, a set of SAT problem benchmarks derived from practical industrial
engineering applications will be provided for algorithm benchmarking.
Ding-Zhu Du, Jun Gu, and Panos Pardalos
E-mail: dzd@cs.umn.edu, gu@enel.ucalgary.ca, pardalos@ufl.edu
Advisory Committee:
Bob Johnson, David Johnson, Christos Papadimitriou, Paul Purdom,
Benjamin Wah
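As a minimal illustration of the problem the workshop addresses, the sketch below decides satisfiability of a small CNF formula by brute-force enumeration. It is nothing like the specialized algorithms the workshop targets, which must scale far beyond what exhaustive search allows; the function name and clause encoding are my own.

```python
from itertools import product

def satisfiable(clauses, n):
    """Brute-force SAT check over n variables.

    A clause is a list of integer literals: i > 0 means x_i,
    i < 0 means NOT x_|i| (a DIMACS-like convention).
    """
    for bits in product([False, True], repeat=n):
        if all(any(bits[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            return True
    return False

# (x1 OR NOT x2) AND (x2 OR x3) AND (NOT x1 OR NOT x3)
cnf = [[1, -2], [2, 3], [-1, -3]]
sat = satisfiable(cnf, 3)   # True; e.g. x1=True, x2=True, x3=False works
```

The 2^n enumeration makes the exponential worst case of SAT concrete, and motivates the improved algorithms, data structures, and benchmarks listed among the workshop topics.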

During the three days approximately
30 papers will be presented in a series
of sequential sessions.
Each lecture will be 30 minutes long.
The program committee will select
the papers to be presented on the ba-
sis of extended abstracts which should
be submitted as described below.

IPCO V
Vancouver, British Columbia,
June 3-5, 1996

The Fifth IPCO Conference On In-
teger Programming and Combinato-
rial Optimization will be held in
Vancouver, British Columbia, and is
sponsored by The Mathematical Pro-
gramming Society. The conference
will be held on the campus of the
University of British Columbia. The
campus is situated on the tip of Point
Grey, with easy access to beaches,
parks, and mountains. There will be a
welcoming reception Sunday night
and a social dinner on Tuesday night
on top of Grouse Mountain which
overlooks Vancouver. Reduced-rate
dormitory as well as hotel
accommodations will be available.
This series is held every year in which
no MPS International Symposium
takes place, and this meeting will
highlight recent developments in
theory, computation, and applica-
tions of integer programming and
combinatorial optimization.
Topics include but are not limited to:
polyhedral combinatorics
integer programming
cutting planes
branch and bound
geometry of numbers
semi-definite relaxations
computational complexity
network flows
matroids and submodular functions
0,1 matrices
approximation algorithms
scheduling theory and algorithms
In all these areas, we welcome struc-
tural and algorithmic results, reveal-
ing computational studies, and novel
applications of these techniques to
practical problems. The algorithms
studied may be sequential or parallel,
deterministic or randomized.

The proceedings of the conference
will contain full texts of all presented
papers. Copies will be provided to all
participants at the time of registra-
tion.
A World-Wide Web page has been
set up describing the history and
scope of IPCO, with pointers to
pages about the site and local univer-
sities. It also will be updated to con-
tain the most recent information
about the conference. The URL is
Program Committee:
William H. Cunningham (Chair),
University of Waterloo; William J.
Cook, Columbia University; Gerard
Cornuejols, Carnegie Mellon Univer-
sity; Jan Karel Lenstra, Eindhoven
University of Technology; Laszlo
Lovasz, Yale University; Thomas L.
Magnanti, Massachusetts Institute of
Technology; Maurice Queyranne,
University of British Columbia; and
Giovanni Rinaldi, Istituto de Analisi
dei Sistemi ed Informatica, Rome.
Organizing Committee:
Maurice Queyranne (Chair),
Frieda Granot,
and S. Thomas McCormick,
Faculty of Commerce University of
British Columbia Vancouver, BC
Canada V6T 1Z2
Fax: +1 (604) 822-9574
Persons wishing to submit a paper
should send eight copies of an ex-
tended abstract before October 31,
1995, to the :..11.. .i, address:
Ms. Jessie Lam, H.A. 459, Faculty of
Commerce, University of British Co-
lumbia, 2053 Main Mall, Vancouver,
BC Canada V6T 1Z2
telephone +1 (604) 822-8505 fax +1
(604) 822-9574



The extended abstract should be be-
tween five and 10 pages (typed,
double spaced), i.e., approximately
2,000 words. TeX and LaTeX ab-
stracts must use the single-column
"article" style in at least eleven-point
size. The abstract must provide suffi-
cient detail concerning the results and
their significance to enable the pro-
gram committee to make its selec-
tion. Please include an e-mail address
with your submission if possible.
Authors of all accepted papers will be
notified by January 31, 1996. Notifi-
cation will be by e-mail if an address
is supplied; titles and authors of ac-
cepted papers will also be posted on
the conference's Web page.
Final versions of all accepted pa-
pers must be provided, in camera-
ready format, by March 10, 1996.
This will enable the proceedings to be
printed and made available at the

time of the meeting.
It is intended to publish the proceed-
ings in the Springer Lecture Notes in
Computer Science. Further details
concerning the format of the final
versions of the papers for the pro-
ceedings will be provided with notifi-
cation of acceptance. The format will
also be available from the
conference's Web page, see above.
Papers in the proceedings will not be
refereed, and it is expected that re-
vised versions will subsequently be
submitted for publication in appro-
priate journals.
October 31, 1995: Deadline for sub-
mission of abstracts; January 31,
1996: Notification of acceptance of
papers; March 10, 1996: Deadline
for submission of papers and deadline
for early registration.

IFORS 96 14th Triennial Conference
Vancouver, British Columbia, Canada
July 8-12, 1996

The 1996 International Federation of Operational Research Societies conference
will provide a bridge to link researchers and practitioners in OR.
A number of sessions have been arranged, but we welcome further suggestions.
Papers will be invited from prominent researchers and practitioners by session
organizers. Every national and kindred member society of IFORS is also invited
to present a national contribution. Authors wishing to contribute papers are re-
quested to submit an abstract not later than October 31, 1995.
Several issues
of Vol. 4 (1997) of International Transactions in Operational Research (ITOR) have
been reserved for publication of papers presented. Authors are encouraged to prepare
full papers for submission to this journal. All submitted papers will be refereed.
Submission details and deadlines will be included in the Invitation Programme.
A subscription to Vol. 4 of ITOR will be provided to each registrant
as part of the registration package. Commercial exhibitors are invited to show books
and computer software that have relevance to OR.
The Invitation Programme will be mailed during the Fall of 1995 to all who sent
an Abstract Submission or request for information to the IFORS 96 secretariat
and registration will be made using the registration
form included in the Invitation Programme.
Submit three copies of the abstract in English or French, including title, 50-word
abstract, author name(s), organization, and mailing address. The presenting author
should be listed first or underlined. Include a cheque or money order for $100 CAD
(Canadian Dollars) payable to IFORS 96. To pay by VISA or Mastercard, include
card number, expiration date and signature on one submission page. The fee will
be returned if paper is not accepted. Otherwise, it will be credited toward regis-
tration fee. Abstract fee is non-refundable after December 31, 1995.
Conference Secretariat, IFORS 96, Venue West Conference Services Ltd.,
645-375 Water Street, Vancouver, British Columbia, Canada V6B 5C6
Phone: (604) 681-5226 FAX: (604) 681-2503

Mathematical Programming
Vol. 65, No. 1

J.E. Martinez-Legaz and A.
Seeger, "A general cone decompo-
sition theory based on efficiency."
S. Helbig, "Duality in disjunctive
programming via vector optimiza-
tion."
M. Kojima, T. Noma and A.
Yoshise, "Global convergence in
infeasible-interior-point algo-
rithms."
J. Renegar, "Some perturbation
theory for linear programming."
P.L. Erdős and L.A. Székely, "On
weighted multiway cuts in trees."
S.D. Flåm and A. Seeger, "Solving
cone-constrained convex programs
by differential inclusions."

Vol. 65, No. 2

A.S. Lewis, "Facial reduction in
partially finite convex program-
ming."
W.H. Cunningham and J. Green-
Kr6tki, "A separation algorithm
for the matchable set polytope."
B. Xiao and P.T. Harker, "A
nonsmooth Newton method for
variational inequalities, I:
Theory."
B. Xiao and P.T. Harker, "A
nonsmooth Newton method for
variational inequalities, II:
Numerical results."
M.J. Todd, "Interior-point
algorithms for semi-infinite
programming."

Vol. 65, No. 3

R. Mifflin and J.L. Nazareth,
"The least prior deviation quasi-
Newton update."
B.C. Eaves and U.G. Rothblum,
"Formulation of linear problems
and solution by a universal
machine."
I. Averbakh, "Probabilistic
properties of the dual structure of
the multidimensional knapsack
problem and fast statistically
efficient algorithms."
A. Migdalas, "A regularization of
the Frank-Wolfe method and
unification of certain nonlinear
programming methods."
O. Güler, "Limiting behavior of
the weighted central paths in linear
programming."
K.T. Medhi, "A two-stage
successive overrelaxation
algorithm for solving the symmet-
ric linear complementarity
problem."

Vol. 66, No. 1

A. Jourani, "Qualification
conditions for multivalued
functions in Banach spaces with
applications to nonsmooth vector
optimization problems."
L. Qi and J. Sun, "A trust region
algorithm for minimization of
locally Lipschitzian functions."
D. Siegel, "Modifying the BFGS
update by a new column scaling
technique."



D. Medhi, "Bundle-based
decomposition for large-scale
convex optimization: Error
estimate and application to block-
angular linear programs."
F. Gamboa and E. Gassiat, "The
maximum entropy method on the
mean: Applications to linear
programming and superresolution."
I.J. Lustig, R.E. Marsten and D.F.
Shanno, "Computational
experience with a globally
convergent primal-dual predictor-
corrector algorithm for linear
programming."
Vol. 66, No. 2

B. He, "A new method for a class
of linear variational inequalities."
L. Tunçel, "Constant potential
primal-dual algorithms: A
framework."
D. Pallaschke and R. Urbański,
"Reduction of quasidifferentials
and minimal representations."
C.P. Schnorr and M. Euchner,
"Lattice basis reduction: Im-
proved practical algorithms and
solving subset sum problems."
A. Tamir, "A distance constrained
p-facility location problem on the
real line."
P. Barcia and J.D. Coelho, "A
bound-improving approach to
discrete programming problems."
J. Falkner, F. Rendl and H.
Wolkowicz, "A computational
study of graph partitioning."
O.L. Mangasarian and J. Ren,
"New improved error bounds for
the linear complementarity
problem."
K.A. Andersen, "Characterizing
consistency in probabilistic logic
for a class of Horn clauses."

Vol. 66, No. 3

I. Bárány, R. Howe and H.E.
Scarf, "The complex of maximal
lattice free simplices."
I. Bongartz, P.H. Calamai and
A.R. Conn, "A projection method
for l-p norm location-allocation
problems."
A.V. Karzanov, "Minimum cost
multiflows in undirected net-
works."

R. Fletcher and S. Leyffer,
"Solving mixed integer nonlinear
programs by outer approxima-
tion."
R. Hirabayashi, H.Th. Jongen and
M. Shida, "Stability for linearly
constrained optimization
problems."
Y. Zhang and D. Zhang,
"Superlinear convergence of
infeasible-interior-point methods
for linear programming."
L.R. Huang and K.F. Ng, "Second-
order necessary and sufficient
conditions in nonsmooth optimi-
zation."
J.O. Cerdeira, "Matroids and a
forest cover problem."
Vol. 67, No. 1

Z.-Q. Luo and J.-S. Pang, "Error
bounds for analytic systems and
their applications."
S.J. Wright, "An infeasible-
interior-point algorithm for linear
complementarity problems."
C. Zhu, "Solving large-scale
minimax problems with the
primal-dual steepest descent
algorithm."
M. van Rooyen, X. Zhou and S.
Zlobec, "A saddle-point charac-
terization of Pareto optima."
A.R. Conn, N. Gould and Ph.L.
Toint, "A note on exploiting
structure when using slack
variables."
A. Shapiro, "Quantitative
stability in stochastic program-
ming."
S. Mizuno, "Polynomiality of
infeasible-interior-point algo-
rithms for linear programming."
K.T. Talluri and D.K. Wagner,
"On the k-cut subgraph polytope."
F. Güder and J.G. Morris,
"Optimal objective function
approximation for separable
convex quadratic programming."

Vol. 67, No. 2

J.L. Higle and S. Sen, "Finite
master programs in regularized
stochastic decomposition."
R. Cominetti and J. San Martin,
"Asymptotic analysis of the
exponential penalty trajectory in
linear programming."

T.F. Coleman and Y. Li, "On the
convergence of interior-reflective
Newton methods for nonlinear
minimization subject to bounds."
J.-P. Penot, "Optimality condi-
tions in mathematical program-
ming and composite optimiza-
tion."
T. Bannert, "A trust region
algorithm for nonsmooth
optimization."
M.H. Wright, "Some properties of
the Hessian of the logarithmic
barrier function."

Vol. 67, No. 3

Y. Pochet and L.A. Wolsey,
"Polyhedra for lot-sizing with
Wagner-Whitin costs."
H. Nagamochi, T. Ono and T.
Ibaraki, "Implementing an
efficient minimum capacity cut
algorithm."
J.R. Brown, "Bounded knapsack
sharing."
F.A. Potra, "A quadratically
convergent predictor-corrector
method for solving linear
programs from infeasible starting
points."
R.W. Freund and F. Jarre, "An
interior-point method for
fractional programs with convex
constraints."

Vol. 68, No. 1

P. Kleinschmidt and H.
Schannath, "A strongly polyno-
mial algorithm for the transporta-
tion problem."
J.F. Bonnans, J.Ch. Gilbert, C.
Lemaréchal and C.A.
Sagastizábal, "A family of
variable metric proximal meth-
ods."
G. Zhao, "On the choice of
parameters for power-series
interior point algorithms in linear
programming."
A. Marchetti-Spaccamela and C.
Vercellis, "Stochastic on-line
knapsack problems."
J. Outrata and J. Zowe, "A
numerical approach to optimiza-
tion problems with variational
inequality constraints."

Vol. 68, No. 2

S.-P. Hong and S. Verma, "A note
on the strong polynomiality of
convex quadratic programming."
R.A. Tapia, Y. Zhang and Y. Ye,
"On the convergence of the
iteration sequence in primal-dual
interior-point methods."
E.A. Boyd, "Resolving degeneracy
in combinatorial linear programs:
Steepest edge, steepest ascent, and
parametric ascent."
W.T. Obuchowska and R.J.
Caron, "Minimal representation
of quadratically constrained
convex feasible regions."
G.S.R. Murthy, T. Parthasarathy
and G. Ravindran, "On co-
positive, semi-monotone Q-
matrices."
P. Ruiz-Canales and A. Rufián-
Lizana, "A characterization of
weakly efficient points."
D. Bienstock and O. Günlük,
"Computational experience with a
difficult mixed-integer
multicommodity flow problem."

Vol. 68, No. 3

E. Balas, M. Fischetti and W.R.
Pulleyblank, "The precedence-
constrained asymmetric traveling
salesman polytope."
J.M. Martinez and S.A. Santos, "A
trust-region strategy for minimi-
zation on arbitrary domains."
Y. Zhang and D. Zhang, "On
polynomiality of the Mehrotra-
type predictor-corrector interior-
point algorithms."
J.E. Dennis Jr., S.-B.B. Li and R.A.
Tapia, "A unified approach to
global convergence of trust region
methods for nonsmooth optimiza-
tion."
C. Zhu, "Solving large-scale
minimax problems with the
primal-dual steepest descent
algorithm." [Erratum to Math.
Programming 67 (1) (1994) 53-76]




Eigenvalues ofMatrices

by F. Chatelin
John Wiley and Sons, Chichester, 1993
ISBN 0-471-93538-7
A solver of eigenvalue problems is one of the fundamental blocks in many computer soft-
ware packages to analyze and simulate the behavior of structures, various processes in physics
and chemistry, economy, etc. In recent years the efficiency of the solvers of eigenproblems
has improved dramatically, resulting in solutions and computer simulations of very
complex and large problems. This improvement has become possible mainly by combin-
ing progress in computers and progress in the theory of matrices and eigenvalues.
This book presents a modern and complete mathematical theory of the eigenvalue prob-
lems for matrices. Based on functional analysis and linear algebra, the author discusses
various strategies and approximation methods in a general, though relatively simple, way.
The material is organized in compact, clear patterns; many theoretical and
numerical concepts are illustrated throughout.
The first two chapters present fundamentals of linear algebra and functional analysis.
Invariant subspaces, bases of and distances between two subspaces, spectral decomposi-
tion and projection are introduced and related to convergence of various iterative pro-
cedures. The third chapter shows examples of practical applications of the eigenvalues
analysis, from mathematics to economy, chemistry and structural dynamics. Chapter 4
discusses the spectral conditioning and stability of the iterative processes. The analyses
of a priori and a posteriori errors are presented. In Chapter 5 the convergence of a Krylov
sequence of subspaces is analyzed. The last two chapters (Ch. 6 and 7) discuss methods
of calculating eigenvalues for large matrices based on subspace iteration techniques. These
methods are currently among the most popular in
computer applications. Chapter 6 presents the numerical methods of obtaining a set of
extreme eigenvalues for large Hermitian (the Lanczos method) or non-Hermitian (the
Arnoldi method) matrices. In Chapter 7 methods that use the concept of the Chebyshev
polynomials for computing the eigenvalues of greatest real part of non-symmetric ma-
trices are outlined.
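The Lanczos idea mentioned above is easy to sketch. The fragment below is an illustrative textbook-style sketch, not code from the book under review; the test matrix and the choice of full reorthogonalization (simple, but not what production codes do) are this reviewer's assumptions. It builds an orthonormal Krylov basis for a Hermitian matrix and reads off approximations to the extreme eigenvalues from a small tridiagonal matrix.

```python
import numpy as np

def lanczos_extreme(A, k=30, seed=0):
    """Approximate extreme eigenvalues of a symmetric/Hermitian matrix A
    using k steps of the Lanczos process with full reorthogonalization."""
    n = A.shape[0]
    rng = np.random.default_rng(seed)
    q = rng.standard_normal(n)
    q /= np.linalg.norm(q)
    Q = np.zeros((n, k))
    alpha, beta = np.zeros(k), np.zeros(k - 1)
    for j in range(k):
        Q[:, j] = q
        w = A @ q
        alpha[j] = q @ w
        # Orthogonalize against all previous Lanczos vectors (full reorth).
        w -= Q[:, :j + 1] @ (Q[:, :j + 1].T @ w)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            if beta[j] < 1e-12:          # invariant subspace found: stop early
                alpha, beta = alpha[:j + 1], beta[:j]
                break
            q = w / beta[j]
    # Eigenvalues of the small tridiagonal matrix T (the Ritz values)
    # approximate the extreme eigenvalues of the large matrix A.
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    return np.linalg.eigvalsh(T)

# Example: a diagonal test matrix with one well-separated eigenvalue.
eigs = np.concatenate([np.linspace(1.0, 10.0, 199), [100.0]])
A = np.diag(eigs)
ritz = lanczos_extreme(A, k=30)
print(ritz[-1])   # largest Ritz value, close to 100.0
```

Well-separated extreme eigenvalues, as here, converge after only a few iterations; clustered interior eigenvalues are much harder, which is part of what the book's convergence analysis quantifies.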

N° 46


JULY 1995



Most concepts and methods discussed are illustrated by numerous exercises and examples,
some of them solved in the appendix.
The book, due to its mathematical vocabulary, is addressed primarily
to mathematicians. However, researchers and students from other areas, who want either
a deeper theoretical understanding of eigenvalue computations or guidance among the numer-
ous alternative iterative procedures now available, should also find it interesting and useful.

Convex Analysis and Minimization Algorithms I and II
by J.-B. Hiriart-Urruty and C. Lemaréchal
Springer-Verlag, Berlin, 1993
ISBN 3-540-56850-6 and 3-540-56852-2
These two volumes comprise a comprehensive introduction to those areas of convex analysis
and optimization that bear on the practical problem of instructing a computer to locate
the minimizer of a convex function over n-dimensional Euclidean space. This innovative
text is well written, mathematically rigorous, and accessible to a wide audience. However, it
is devoid of exercises and fails to deliver a reliable minimization algorithm. The former
omission is important given the authors' pedagogical ambitions, while the latter is a
consequence of the state of the art and the authors' intent "to demonstrate a technical
framework, rather than to establish a particular result." Let us take a closer look.
A smooth convex function f is minimized at a zero of its gradient, ∇f. Minimization
algorithms for such problems enjoy powerful convergence properties and wide use.
Dropping the smoothness assumption, we recall that at a minimizer zero is an element of the
subdifferential, ∂f, the collection of vectors dominated by the directional derivative of
f. The design of an effective minimization algorithm in this case is considerably more
difficult. The progress made to date has required a much closer study of convex analysis
than was necessary in the smooth case. Hiriart-Urruty and Lemaréchal attempt to present
this closer study and its numerical application in a manner accessible to one whose
mathematical background may not exceed calculus. Given their prominent roles in the
development of the field and their open-faced enthusiasm for its interplay between theory
and practice, it should come as little surprise that the authors have largely succeeded in
their task.
Volume I constitutes an in-depth introduction to the fields of convex analysis and
minimization algorithms. The text begins with a chapter on each, the first on convex
functions of one variable and the second on optimization algorithms for smooth func-
tions. Convex sets, convex functions of several variables, and constrained convex mini-
mization are then considered. The volume closes with an application of a modified steep-
est descent algorithm to a general nonsmooth convex function, the modification being
that the entire subdifferential is assumed available at each iterate. The authors, by this time, have
prepared the reader to the degree that he understands why such an application is satis-
factory neither on theoretical nor on practical grounds. This whets the reader's appetite
for the more refined techniques of Volume II.
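The difficulty alluded to here is easy to reproduce numerically. The toy sketch below is this reviewer's illustration, not code from the book: it applies a plain normalized-subgradient step with a fixed step length to the nonsmooth convex function f(x) = |x1| + 10|x2|, whose minimizer is the origin. The iterates drift toward the "valley" x2 = 0 but then oscillate across it, and with a fixed step they never approach the minimizer closely, which is exactly why the naive application is unsatisfactory.

```python
import numpy as np

def f(x):
    # A simple nonsmooth convex function, minimized at the origin.
    return abs(x[0]) + 10.0 * abs(x[1])

def subgradient(x):
    # One (arbitrary) element of the subdifferential: the sign pattern,
    # with the convention sign(0) = +1.
    g = np.where(x >= 0, 1.0, -1.0)
    g[1] *= 10.0
    return g

x = np.array([5.0, 0.1])
step = 0.5                                  # fixed step length, for illustration
for _ in range(100):
    g = subgradient(x)
    x = x - step * g / np.linalg.norm(g)    # normalized subgradient step

# After 100 iterations the iterate still oscillates across the line
# x2 = 0 and f(x) remains well above the optimal value 0.
print(x, f(x))
```

A diminishing step size restores convergence in theory but is hopelessly slow in practice; the bundle-style machinery of Volume II exists precisely to do better.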
The second volume opens with a conceptual algorithm for producing a descent direction
without necessarily computing the entire subdifferential. The authors make this concept
implementable by substituting a computable alternative. In particular, they
require the ε-subdifferential, ∂εf, the collection of vectors dominated by ε plus the
directional derivative of f. The calculus of such ε-subdifferentials and their relation to the
convex conjugate of f is developed in some detail prior to their proposal of a preliminary
ε-descent algorithm. Its careful scrutiny leads the authors to the improved Algorithm
3.4.2 of Chapter XIV. After a number of numerical tests, the authors conclude that the
"algorithm can be viewed as a robust minimization method," though "the need for a
parameter hard to choose (namely ε) has not been totally eliminated," and hence that
the algorithm does not yet work entirely satisfactorily. The reader took the
bait in the final chapter of Volume I, having accepted the ε-subdifferential and its
analytical baggage. Has the investment in this machinery really paid off? The authors,
I believe, would argue that they never promised a general purpose algorithm and that
a reader who insists on extracting a practical tool from a mathematics text will find
frustration more often than not.
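The ε-subdifferential is nonetheless a very concrete object. As a one-dimensional illustration of my own (not the authors'), take f(x) = |x|: a number g lies in ∂εf(x) exactly when f(y) ≥ f(x) + g(y − x) − ε for all y, and for x > 0 one can check that this works out to the interval [max(−1, 1 − ε/x), 1]. The sketch below verifies membership numerically on a sample grid.

```python
import numpy as np

def in_eps_subdiff(g, x, eps, f, ys):
    # g belongs to the eps-subdifferential of f at x iff
    #     f(y) >= f(x) + g * (y - x) - eps    for all y
    # (checked here only on the finite sample grid ys).
    return bool(np.all(f(ys) >= f(x) + g * (ys - x) - eps))

f = np.abs
ys = np.linspace(-100.0, 100.0, 200001)
x, eps = 2.0, 1.0

# For f = |.| at x = 2 with eps = 1 the interval above is [0.5, 1].
print(in_eps_subdiff(0.5, x, eps, f, ys))    # lower endpoint: inside
print(in_eps_subdiff(0.49, x, eps, f, ys))   # just below it: outside
print(in_eps_subdiff(1.0, x, eps, f, ys))    # upper endpoint: inside
```

Note that the ordinary subdifferential of |·| at x = 2 is just {1}; enlarging it to an interval is what gives ε-descent directions something robust to work with.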
Indeed, one cannot judge a text that purports to light the entrance of the monument
of convex analysis and minimization in terms of its ability to deliver a working
piece of code. The authors have in fact lit considerably more than just the entrance. Who
shall benefit? This dialectic of theory and algorithm seems to me especially well suited
to a sophomore/junior level year-long introduction to computational science, a major
that has begun to stand on its own. The first volume raises all the right questions but
in a sufficiently narrow context, namely convex. This permits the young reader to
acquire a depth of understanding often missing from a survey-
like introductory course. The second volume, being perhaps too narrow to constitute
the remainder of such an introduction, is amenable to hopping about, with many oppor-
tunities to address broader issues in the larger arenas of nonlinear programming and
nonsmooth analysis. The appeal of these volumes is in no way limited, however, to an
undergraduate audience. One can easily imagine building a graduate seminar around
Volume II, while the ample bibliography together with the bibliographic comments
associated with each of the chapters makes the set an invaluable resource for research.





Nonconvex Optimization and Its Applications

This new series publishes monographs and state-of-the-art ex-
pository works which focus on algorithms for solving nonconvex
problems and which study applications that involve such prob-
lems. The following list of topics best describes the aim of
Nonconvex Optimization and Its Applications.
Nonlinear optimization; nonconvex network flow problems;
stochastic optimization; optimal control; discrete optimization;
multi-objective programming; descriptions of software packages;
and sub-optimal algorithms (e.g. simulated annealing, genetic al-
gorithms, heuristics).

Series Managing Editors:
Panos M. Pardalos, University of Florida
Reiner Horst, University of Trier
Advisory Editorial Board: Ding-Zhu Du; Chris Floudas;
Gerd Infanger; Jonas Mockus; Hanif D. Sherali

For more information on books in the series, contact the publisher. In North
America-Kluwer Academic Publishers 101 Philip Dr. Norwell, MA 02061
e-mail: jmkluwer@world.std.com FX: 617-871-6528

Published in the series:
Advances in Optimization and Approximation
edited by D-Z. Du and J. Sun, ISBN 0-7923-2785-3
1994 404 pages, cloth $141.50
The papers in this book provide a broad spectrum of research
on optimization problems including location, assign-
ment, linear and nonlinear programming problems as well as
problems in molecular biology.
Handbook of Global Optimization
edited by R. Horst and P.M. Pardalos
ISBN 0-7923-3120-6
1994 500 pages, cloth $269.00
The Handbook is the first comprehensive book to cover
recent developments in global optimization; contributions are
essentially expository in nature but scholarly in treatment.
Soon to be Published:
Introduction to Global Optimization
by R. Horst, P.M. Pardalos and N.V. Thoai
1995 ISBN 0-7923-3556-2 318 pages, cloth
1995 ISBN 0-7923-3557-0 318 pages, paper
The primary goal of this textbook is to provide an introduction
to constrained global optimization. Recent developments in the
area including nonconvex quadratic programming, general
concave minimization, network optimization, Lipschitz and D.C.
programming are addressed.

New Journal for 1995 Lifetime Data Analysis: An International Journal devoted to Statistical
Methods and Applications for Time-to-Event Data Editor-in-Chief: Mei-Ling Ting Lee
For more information or to request a free sample copy e-mail: jmkluwer@world.std.com


Mail to:

The Mathematical Programming Society, Inc.
c/o International Statistical Institute
428 Prinses Beatrixlaan
2270 AZ Voorburg
The Netherlands

Cheques or money orders should be made
payable to The Mathematical Programming
Society, Inc., in one of the currencies listed
below. Dues for 1995, including subscription
to the journal Mathematical Programming,
are HFL100.00 (or USD55.00 or DEM85.00
or GBP32.50 or FRF300.00 or CHF80.00).
Student applications: Dues are ½ the above rates.
Have a faculty member verify your student status
and send application with dues to above address.


I wish to enroll as a member of the Society. My subscription is for
my personal use and not for the benefit of any library or institution.
I enclose payment as follows:
Dues for 1995






New Book Series ...




Egon Balas (Carnegie-Mellon) received the
1995 John von Neumann Theory Prize for his
career work in integer programming. The
award was made at the Spring INFORMS
meeting in Los Angeles. ◆ The executive
committee of MPS is planning to create a
home page on the World Wide Web. More
news follows soon. ◆ The book, Network
Flows, by R.K. Ahuja, T.L. Magnanti and
J.B. Orlin, received the Lanchester Prize at the
Fall 1994 ORSA/TIMS meeting. ◆ Deadline
for the next OPTIMA is September 15, 1995.

Donald W. Hearn, EDITOR
e-mail: hearn@ise.ufl.edu
Karen Aardal, ASSOCIATE EDITOR
Utrecht University
Department of Computer Science
P.O. Box 80089
3508 TB Utrecht
The Netherlands
e-mail: aardal@cs.ruu.nl
Faiz Al-Khayyal, ASSOCIATE EDITOR
Georgia Tech
Industrial and Systems Engineering
Atlanta, GA 30332-0205
e-mail: faiz@isye.gatech.edu
Dolf Talman, ASSOCIATE EDITOR
Department of Econometrics
Tilburg University
P.O. Box 90153
5000 LE Tilburg
The Netherlands
e-mail: talman@kub.nl
Elsa Drake, DESIGNER

Journal contents are subject to change
by the publisher.


Center for Applied Optimization
371 Weil Hall
PO Box 116595
Gainesville FL 32611-6595 USA

