NA Digest Sunday, November 25, 1990 Volume 90 : Issue 41

Today's Editor: Cleve Moler

Today's Topics:

     Jeff Speiser
     Brent's Directed Rounding
     Sparse-BLAS Implementations
     Positions at Montana State University
     IMACS Symposium on Iterative Methods
     Neural Networks Course and Conference

-------------------------------------------------------


From: Frank Luk <luk@jacobi.EE.CORNELL.EDU>
Date: Fri, 23 Nov 90 12:04:00 EST
Subject: Jeff Speiser

A Tribute to Jeff Speiser

Jeffrey Speiser died on Wednesday, November 14, during a heart
operation. He turned fifty in September.

Jeff was an undergraduate at MIT, and a graduate student at
Berkeley. He joined the research staff at the Naval Ocean
Systems Center in San Diego, California, right after Berkeley and
worked there for twenty-seven years.

Trained as an electrical engineer, Jeff developed a keen interest
in applications and computing, particularly accurate matrix
techniques as applied to signal processing problems. He repeated
to me a few times the following story about his favorite
colleague Harper Whitehouse. Harper spent a study leave at the
University of Southern California in Los Angeles. One day,
Harper drove back to San Diego all excited because he had
discovered a computational means to achieve high accuracy in
signal processing computations. The technique was the singular
value decomposition (SVD), and the year was 1981. True to form,
Jeff proceeded to do a thorough literature search and a detailed
study of the SVD. He found my paper on computing the
SVD on the Illiac IV, called a few people to obtain my telephone
number, and gave me a cold call. We talked for a long time,
during which he told me of the important uses of the SVD in
signal processing and urged me to look at an emerging field
called systolic computing. The conversation initiated a nine
year friendship that I will treasure for the rest of my life.

For the past five years, Jeff and I have co-organized an annual
SPIE Conference on Signal Processing Algorithms and
Architectures. Although I was the Conference Chair in name, Jeff
was the real brains behind the scenes. Right up to a month ago, we
still corresponded feverishly on our plans for next year's
conference.

Jeff was always cheerful and outgoing, and he loved to talk and
joke. At this past July's SPIE Conference, he laughed when he
described how shocked his doctor was when he discovered that
Jeff's pulse was zero during an examination. Two weeks ago I was
away at the SIAM Conference in San Francisco and at the NA Day at
Stanford. When I returned, I read a message from Jeff sent on
November 7, a message that was uncharacteristic of the always
upbeat person that I knew. For the first time, Jeff said that he
was suffering from his heart ailments. Two days later, he died.

In this era of fierce competition in the research field, Jeff
stood out as a totally selfless scientist who would tell you
everything he knew. He always tried to bring together signal
processing practitioners and numerical analysts. During his
plenary lecture at the SIAM Conference on November 7, Tom Kailath
credited Jeff as the one who pointed his research group to the
technique of total least squares. We will all do much better
research if we can behave a little bit more like Jeff. I believe
he would like us to remember him by the example that he had set:
A dedicated, knowledgeable, broad, and selfless scientist.

-- Frank Luk


------------------------------

From: George Corliss <georgec@boris.mscs.mu.edu>
Date: Sat, 24 Nov 90 6:40:55 CST
Subject: Brent's Directed Rounding

A student of mine has been looking at Richard Brent's multiple precision
arithmetic package. The code claims to do directed roundings.
One calls the routine MPSETR to set the rounding to nearest
(default), toward plus infinity, toward minus infinity, or toward
zero. Our experiments suggest that this rounding is not performed
as advertised. In particular, we get exactly the same answer from
setting rounding toward minus infinity and adding 10000 copies of
pi as from setting the rounding toward plus infinity and adding.
The rounding mode DOES affect the output routine, though, so a
casual test might conclude that rounding works. We are currently
studying the code; at first glance, the addition routine appears
to be doing the right things, so we are looking more carefully.
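
For concreteness, here is the kind of check we have in mind, sketched
in C with IEEE double precision and the standard <fenv.h> rounding
controls rather than Brent's package (so everything below illustrates
the test, not the MP code itself): summing many copies of pi toward
minus infinity and toward plus infinity must give two different
results that bracket the exact sum.

/* Sketch of the directed-rounding test in IEEE double precision,
   using the standard C <fenv.h> interface (illustrative only; this
   is not Brent's MP package).  Compile with, e.g.,
   gcc -std=c99 -frounding-math roundtest.c                        */
#include <stdio.h>
#include <fenv.h>

static double sum_pi(int mode, int n)
{
    const double pi = 3.14159265358979323846;
    double s = 0.0;
    fesetround(mode);               /* set directed rounding mode  */
    for (int i = 0; i < n; i++)
        s += pi;
    fesetround(FE_TONEAREST);       /* restore the default         */
    return s;
}

int main(void)
{
    double down = sum_pi(FE_DOWNWARD, 10000);
    double up   = sum_pi(FE_UPWARD,   10000);

    /* If directed rounding is honoured, down < up and the exact
       value of 10000*pi lies between them.                        */
    printf("toward -infinity: %.17g\n", down);
    printf("toward +infinity: %.17g\n", up);
    printf("difference      : %.17g\n", up - down);
    return 0;
}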

Has anyone else looked at this? Any experiences would be welcome.

George F. Corliss
georgec@boris.mscs.mu.edu


------------------------------

From: David Bernholdt <bernhold@qtp.ufl.edu>
Date: 24 Nov 90 22:28:03 GMT
Subject: Sparse-BLAS Implementations

There is a paper, "Sparse Extensions to the Fortran Basic Linear
Algebra Subprograms" by Dodson, Grimes, and Lewis, which is available,
along with a model implementation in Fortran, from netlib
(netlib@ornl.gov) in the 'sparse-blas' directory. The paper describes
a set of BLAS1 routines for sparse vectors. It was written in 1985 or 1986.
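
For readers who have not seen the proposal, the flavour of these
routines is a sparse vector stored in compressed form (an array of
nonzero values plus a parallel array of their positions) operated on
together with a dense vector. The sketch below shows the core of a
sparse AXPY, y(indx(i)) = y(indx(i)) + a*x(i), written in C for
brevity; the function name, argument order, and 0-based indexing are
for illustration only, not the proposal's actual calling sequence.

#include <stdio.h>

/* Illustrative sparse AXPY: y[indx[i]] += a * x[i] for a compressed
   sparse vector (x, indx) with nz nonzeros and a dense vector y.
   (Name and argument order are illustrative; see the Dodson, Grimes
   & Lewis paper for the actual calling sequences.)                 */
static void sparse_axpy(int nz, double a,
                        const double *x, const int *indx, double *y)
{
    for (int i = 0; i < nz; i++)
        y[indx[i]] += a * x[i];
}

int main(void)
{
    double y[6]    = {1, 1, 1, 1, 1, 1};   /* dense vector           */
    double x[3]    = {2.0, -1.0, 4.0};     /* nonzero values         */
    int    indx[3] = {0, 3, 5};            /* 0-based positions in y */

    sparse_axpy(3, 10.0, x, indx, y);

    for (int i = 0; i < 6; i++)
        printf("y[%d] = %g\n", i, y[i]);   /* 21, 1, 1, -9, 1, 41    */
    return 0;
}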

I am wondering how widely this definition of the sparse-blas1 has
caught on: What vendors implement it or plan to? Is there another
formulation which has caught on instead, or is this area still
developing too much to settle on a particular model for the sparse
blas1?

For example, I know that Cray's SCILIB includes routines which
duplicate the functionality of several routines in the Dodson, Grimes
& Lewis proposal, but have different names and slightly different
argument arrangements. Maybe these have caught on instead? (I think
they predate the proposal to which I refer.)

Thanks for any light you can shed.

David Bernholdt bernhold@qtp.ufl.edu
Quantum Theory Project bernhold@ufpine.bitnet
University of Florida
Gainesville, FL 32611 904/392 6365

------------------------------

From: K. J. Tiahrt <umsfkbow%msu.dnet@deimos.oscs.montana.edu>
Date: Mon, 19 Nov 90 16:02:42 MST
Subject: Positions at Montana State University

FACULTY POSITIONS

MONTANA STATE UNIVERSITY
Department of Mathematical Sciences

The Department of Mathematical Sciences at Montana State University invites
applications for tenure-track Assistant Professor of Mathematics positions, to
begin in August 1991. Requirements include a PhD in Mathematics or a related
field and evidence of strong research potential and teaching abilities.
Applicants should complement the department's Ph.D. programs in dynamical
systems and applied numerical analysis. Appropriate areas include numerical
analysis, dynamical systems, ordinary or partial differential equations,
applied mathematics, geometric analysis or control theory. The department has
a large graduate program, with a strong PhD component centered on research efforts in
the above areas. An NSF Engineering Research Center provides many opportunities
for interdisciplinary work.

Outstanding outdoor recreation including fishing, hunting, skiing and
backpacking is abundant in this lovely mountain valley which lies only 90
miles from Yellowstone National Park. Local schools are much above average
by national standards.

Send resume and names of three references to Dr. K. J. Tiahrt, Department of
Mathematical Sciences, Montana State University, Bozeman, MT 59717-0240. For
further information on the positions, write Dr. Tiahrt or e-mail your request
to umsfnegg@mtsunix1 (BITnet). Applications will be accepted until February 1,
1991, or until the positions are filled. Veterans preference. AA/EO.


------------------------------

From: R. Beauwens <beauwens@bbrnsf11.bitnet>
Date: Mon, 19 Nov 90 12:53:53 PST
Subject: IMACS Symposium on Iterative Methods

IMACS International Symposium on Iterative Methods in Linear Algebra

April 2-4, 1991, Brussels, Belgium.

CO-CHAIRMEN:
Robert Beauwens (Universit\'e Libre de Bruxelles)
Pieter de Groen (Vrije Universiteit Brussel)

SCOPE:
The purpose of the symposium is to provide a forum for the
presentation and the discussion of recent advances in the analysis
and implementation of iterative methods for solving large linear
systems of equations and for determining eigenvalues, eigenvectors
or singular values of large matrices.

TOPICS:
Matrix analysis:           convergence acceleration - preconditioning -
                           methods for nonsymmetric, singular and
                           overdetermined systems - sparse eigenvalue problems
Boundary value problems:   multigrid methods - domain decomposition -
                           spectral methods
Implementation techniques: on vector processors - on multiprocessors -
                           on massively parallel systems
Software developments:     for sparse linear systems - for sparse
                           eigenproblems
Mathematical applications: partial differential equations - systems
                           theory - least squares problems

INVITED LECTURES:

O. Axelsson, On multilevel iteration methods for problems in
elasticity theory.
F. Chatelin, The Arnoldi Chebyshev iterative method for the
stability of evolution equations.
D. Kincaid, Second degree iterative methods.
A. van der Sluis, The convergence behaviour of Conjugate
Gradients in various situations.
H. van der Vorst, Conjugate gradient type methods for
non-symmetric systems.
E.L. Wachspress, Consistent sparse factorisations.
Yu. Yeremin, To be confirmed.

SPECIAL SESSIONS

Coupled inner-outer iteration methods (O. Axelsson);

Numerical methods for the analysis of Markov models (G. Latouche)

Spectral Methods (M. Delville & E. Mund)

Complex Variable methods for solving non-positive definite
linear systems (M. Eiermann and W. Niethammer)

Parallel iterative methods (D. Kincaid & C. Wu)

Iterative solution of unsymmetric systems (H. van der Vorst)

The Lyapunov equation (E. Wachspress)


CONTRIBUTED LECTURES:

More than 70 papers have been submitted, covering a large number of
subjects within the scope of the conference.
No more than three parallel sessions will be scheduled at the
same time.
All papers published in the proceedings will be refereed.

CALENDAR:

Monday, 1st April: Advance registration and informal get-together
in one of the hotels listed below.

Tuesday, 2nd April: 9.00 am. Registration.
10.00 am. Opening of the conference.

Wednesday, 3rd April: Reception in the Gothic Brussels town hall.
Admission free for participants.
Conference Dinner, BF 2000,- per person.

Thursday, 4th April: 5.00 pm. Closure.

CONFERENCE HALL: Aula of the VUB (Vrije Universiteit Brussel),
Pleinlaan 2, B-1050 Brussel.

SPONSORS: NFWO-NFRS (Belgian National Science Foundation),
IBM, Honeywell and IMACS.

MORE INFORMATION:

R. Beauwens
IMACS International Symposium on
Iterative Methods in Linear Algebra
Universit\'e Libre de Bruxelles, CP 165
Av. F.D. Roosevelt 50, B-1050 Brussels, Belgium

R. Beauwens: fax +31-2-6503564, phone +31-2-6502085, email beauwens@bbrnsf11.bitnet
P. de Groen: fax +31-2-6413495, phone +31-2-6413307, email pieter@tena2.vub.ac.be
Note the change of phone number at the ULB: 650xxxx instead of 642xxxx.


------------------------------

From: Michael Cohen <mike@park.bu.edu>
Date: Wed, 21 Nov 90 12:40:33 -0500
Subject: Neural Networks Course and Conference

Short Conference Announcement for Your Information

NEURAL NETWORKS COURSE AND CONFERENCE AT

BOSTON UNIVERSITY


NEURAL NETWORKS: FROM FOUNDATIONS TO APPLICATIONS

May 5-10, 1991

This self-contained 5-day course is sponsored by the Boston University
Wang Institute, Center for Adaptive Systems, and Graduate Program in
Cognitive and Neural Systems. The course provides a systematic
interdisciplinary introduction to the biology, computation, mathematics,
and technology of neural networks. Boston University tutors are
Stephen Grossberg, Gail Carpenter, Ennio Mingolla, Michael Cohen, Dan
Bullock, and John Merrill. Guest tutors are Federico Faggin,
Robert Hecht-Nielsen, Michael Jordan, Andy Barto, and Alex Waibel.
Registration fee: $985 (professional) and $275 (student). Fee includes
lectures, course notebooks, receptions, meals, coffee services, and
evening discussion sessions.

NEURAL NETWORKS FOR VISION AND IMAGE PROCESSING

May 10-12, 1991

This research conference at the Wang Institute will present invited
lectures and contributed posters, herewith solicited, ranging from visual
neurobiology and psychophysics through computational modelling to
technological applications. Invited speakers include: Jacob Beck, Gail
A. Carpenter, David Casasent, John Daugman, Robert Desimone, Stephen
Grossberg, Robert Hecht-Nielsen, Ralph Linsker, Ennio Mingolla, Alex
Pentland, V.S. Ramachandran, Eric Schwartz, George Sperling, James
Todd, and Alex Waxman. A featured Poster Session will be held
on May 11. To present a poster, submit 3 copies of an abstract
(1 single-spaced page), postmarked by March 1, 1991, for refereeing.
Include with the abstract the author's name, address, and telephone number.
Mail to VIP Poster Session, Neural Networks Conference, Wang Institute of
Boston University, 72 Tyng Road, Tyngsboro, MA 01879. Authors will be
informed of abstract acceptance by March 31, 1991. Registration fee:
$95 (professional) and $75 (student). Fee includes lectures and
poster session, reception, meals, and coffee services.

TO REGISTER: For one or both events by phone, call (508) 649-9731 with VISA
or MasterCard between 9 a.m. and 5 p.m. (EST). For a meeting brochure, call
as above or write: Neural Networks, Wang Institute of Boston University,
72 Tyng Road, Tyngsboro, MA 01879.


------------------------------

End of NA Digest

**************************