From dongarra@CS.UTK.EDU Fri Apr 12 16:50:11 1996
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id QAA07592; Fri, 12 Apr 1996 16:50:11 -0400
Received: from mail.cs.utexas.edu (root@mail.cs.utexas.edu [128.83.139.10])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id QAA19509; Fri, 12 Apr 1996 16:53:27 -0400
Received: from daffy.cs.utexas.edu (rvdg@daffy.cs.utexas.edu [128.83.143.203]) by mail.cs.utexas.edu (8.7.1/8.7.1) with ESMTP id PAA09872; Fri, 12 Apr 1996 15:45:34 -0500 (CDT)
From: Robert van de Geijn
Received: by daffy.cs.utexas.edu (8.7.1/Client-1.4)
id PAA27599; Fri, 12 Apr 1996 15:45:33 -0500
Date: Fri, 12 Apr 1996 15:45:33 -0500
Message-Id: <199604122045.PAA27599@daffy.cs.utexas.edu>
To: blast-core@CS.UTK.EDU, blast-parallel@CS.UTK.EDU, blast-ob@CS.UTK.EDU
CC: plapackers@cs.utexas.edu, rvdg@cs.utexas.edu, stewart@cs.umd.edu
Subject: Recent advances
Folks,
We at UT-Austin have been playing with a number of issues that may be
of interest to the BLAST forum.
As part of the PLAPACK project, we have implemented a Simple Library
(SL_library) which uses Physically Based Matrix Distribution (PBMD),
Object Based Programming (OBP), and recent advances in parallel linear
algebra algorithms. The results are quite encouraging:
 * We currently have about 1200 lines of code, primarily providing
   the infrastructure for the object-based library.
 * With this we should be able to implement all parallel BLAS in
   another 1200 lines of code,
 * and the rest of "core" LAPACK in yet another 1200 lines.
 * What is also interesting is that we are getting the same high
   performance as our hand-coded parallel level-3 BLAS on the Intel
   Paragon.
For more information, the webpage for the SL_library can be accessed
through my home page:
http://www.cs.utexas.edu/users/rvdg
For a more direct path to the reference manual, an example of
matrix-matrix multiplication, and performance data, you can access
http://www.cs.utexas.edu/users/gunnels/SL_Library/library.html
We would welcome comments.
Robert
From dongarra@CS.UTK.EDU Tue Jan 28 16:04:47 1997
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id QAA03566; Tue, 28 Jan 1997 16:04:47 -0500
Received: from mail.cs.utexas.edu (root@mail.cs.utexas.edu [128.83.139.10])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id QAA13240; Tue, 28 Jan 1997 16:09:35 -0500
From:
Received: from daffy.cs.utexas.edu (rvdg@daffy.cs.utexas.edu [128.83.143.203])
by mail.cs.utexas.edu (8.8.5/8.8.5) with ESMTP id PAA06570;
Tue, 28 Jan 1997 15:09:16 -0600 (CST)
Received: by daffy.cs.utexas.edu (8.8.5/Client-1.5)
id PAA16250; Tue, 28 Jan 1997 15:09:15 -0600
Date: Tue, 28 Jan 1997 15:09:15 -0600
Message-Id: <199701282109.PAA16250@daffy.cs.utexas.edu>
To: blast-parallel@CS.UTK.EDU
CC: blast-comm@CS.UTK.EDU
Subject: Parallel BLAS standard prototype
Dear Forum participants,
We have been busy with our PLAPACK package, which we believe can be
used as at least an example of how standardization can proceed.
The PLAPACK project represents an effort to provide an infrastructure
for implementing application-friendly, high-performance linear algebra
algorithms. The package uses a more application-centric data
distribution, which we call Physically Based Matrix Distribution, as
well as an object-based (MPI-like) style of programming. It is this
style of programming that allows for highly compact codes, written in
C but usable from FORTRAN, that more closely reflect the underlying
blocked algorithms. We show that this can be attained without
sacrificing high performance.
The following example shows how PLAPACK code reflects the natural
description of a linear algebra algorithm. Consider the Cholesky
factorization of matrix A using a level-2 right-looking variant. One
can explain this algorithm as follows: Partition

        / a_11   *    \   / l_11   0    \ / l_11  l_21^T \
    A = \ a_21  A_22  / = \ l_21  L_22  / \  0    L_22^T /

where a_11 and l_11 are scalars. Then the algorithm can be
described as given below on the left, which translates to the PLAPACK
code given on the right:
let A_cur = A                       PLA_Obj_all_view( a, &acur );
do until done                       while ( TRUE ){
                                      PLA_Obj_global_length( acur, &size );
                                      if ( 0 == size ) break;
  - partition
    A_cur = / a_11   *   \            PLA_Obj_split_4( acur, 1, 1, &a11, &a12,
            \ a_21  A_22 /                             &a21, &acur );
  - a_11 <- sqrt( a_11 )              Take_sqrt( a11 );
  - a_21 <- 1/a_11 * a_21             PLA_Inv_Scal( a11, a21 );
  - A_22 <- A_22 - l_21 l_21^T        PLA_Syr( PLA_LOW_TRIAN, min_one, a21, acur );
  - let A_cur = A_22
enddo                               }
Here A_cur becomes a reference into the original matrix, rather than a
copy. Level-3 BLAS versions can be coded similarly, and achieve
within 30% of peak on most parallel architectures.
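For reference, the same right-looking variant can be written
sequentially in a few lines of C (an illustrative sketch only, not
PLAPACK code):

    #include <math.h>

    /* Minimal sequential sketch of the right-looking Cholesky
       variant above: A is n x n, symmetric positive definite,
       column-major with leading dimension lda; the lower triangle
       is overwritten by the Cholesky factor L. */
    void chol_right_looking( int n, double *A, int lda )
    {
        int i, j, k;
        for ( j = 0; j < n; j++ ) {
            A[ j+j*lda ] = sqrt( A[ j+j*lda ] );   /* a_11 <- sqrt( a_11 )       */
            for ( i = j+1; i < n; i++ )            /* a_21 <- 1/a_11 * a_21      */
                A[ i+j*lda ] /= A[ j+j*lda ];
            for ( k = j+1; k < n; k++ )            /* A_22 <- A_22 - l_21 l_21^T */
                for ( i = k; i < n; i++ )          /* (lower triangle only)      */
                    A[ i+k*lda ] -= A[ i+j*lda ] * A[ k+j*lda ];
        }
    }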
The web page for PLAPACK is
http://www.cs.utexas.edu/users/plapack
The web page for the Users' Guide is
http://www.cs.utexas.edu/users/plapack/Guide
Unfortunately, we cannot give out PostScript, since we have signed a
contract with The MIT Press.
Hopefully this material will provide food for discussion before the
meeting.
Best Regards
Robert
======================================================================
Robert A. van de Geijn Taylor Hall 4.115C
Associate Professor (512) 471-9720 (office)
Department of Computer Sciences (512) 471-8885 (fax)
The University of Texas rvdg@cs.utexas.edu
Austin, Texas 78712 http://www.cs.utexas.edu/users/rvdg
From dongarra@CS.UTK.EDU Wed Nov 26 02:39:12 1997
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id CAA02367; Wed, 26 Nov 1997 02:39:12 -0500
Received: from sgi1.ccrl-nece.technopark.gmd.de (sgi1.ccrl-nece.technopark.gmd.de [193.175.160.69])
by CS.UTK.EDU with SMTP (cf v2.9s-UTK)
id CAA26261; Wed, 26 Nov 1997 02:46:00 -0500 (EST)
From:
Received: from laplace.ccrl-nece.technopark.gmd.de by sgi1.ccrl-nece.technopark.gmd.de via ESMTP (950413.SGI.8.6.12/940406.SGI.AUTO)
id HAA17818; Wed, 26 Nov 1997 07:36:23 +0100
Received: by laplace.ccrl-nece.technopark.gmd.de (950413.SGI.8.6.12) id IAA23008; Wed, 26 Nov 1997 08:30:13 +0100
Date: Wed, 26 Nov 1997 08:30:13 +0100
Message-Id: <199711260730.IAA23008@laplace.ccrl-nece.technopark.gmd.de>
To: blast-parallel@CS.UTK.EDU
Subject: In preparation for the coming meeting ...
Cc: petitet@ccrl-nece.technopark.gmd.de
Dear BLAST participant,
Please find enclosed a draft introduction/layout to the
Distributed-memory Dense BLAS chapter. I will present its
contents at the coming meeting. Please feel free to send
me comments/suggestions.
See you in Knoxville next week,
Best Regards,
Antoine
\chapter{Distributed-memory Dense BLAS Interface} \label{chap:dmblas}
%%%%%%%%%%%%%%%%%%%%%%%%%% Introduction to Distributed-memory Dense BLAS %%%%%%%
\section{Introduction to Distributed-memory Dense BLAS} \label{sec:dmblas_intro}
\subsection{Overview and Goals} \label{subsec:dmblas_overview}
There has been much interest in the past few years in developing
versions of the BLAS for distributed-memory
computers\cite{cmssl90a,elster90a,aboelaze91a,falgout93a,brent93a,choi94a,lawn100,chtchelkanova95a,bangalore95a,geijn97a}.
Some of this research proposed parallelizing the
BLAS\cite{velde87a,cmssl90a,choi94a,lawn100,slug,chtchelkanova95a,strazdins96a,geijn97a},
and some implemented a few important BLAS
routines\cite{cmssl90a,lichtenstein93a,choi94a,lawn100,slug,chtchelkanova95a,strazdins96a,geijn97a},
such as matrix-matrix
multiplication\cite{fox87a,agarwal94a,huss94a,mathur94a,agarwal95a,lawn57,lawn96}
or triangular system solve\cite{heath88a,li88a,li89a,bisseling91b}.
This document has been inspired to a large extent by these research
initiatives.
This document proposes to reuse the interface design of the dense part
of the ScaLAPACK project\cite{lawn100,slug} as a basis for the
distributed-memory dense BLAS interface. This choice is motivated by
the following reasons.
\begin{itemize}
\item The simplicity of this interface allows for ease-of-use
and similar calling sequence definitions in all targeted
programming languages.
\item This interface has been shown\cite{slug} to be effective
      for the development of large and high-quality dense linear
      algebra software for distributed-memory computers.
\item The broad functionality permitted by this interface enables,
facilitates and encourages the development of current and
related research projects such as the Distributed BLAS
project\cite{strazdins96a}, the Multicomputer Toolbox
project\cite{falgout93a,bangalore95a} and the
PLAPACK project\cite{geijn97a}.
\item The publicly available implementation containing source
      code, testing and timing programs considerably facilitates
      the task of providing a reference implementation of the
      distributed-memory dense BLAS interface.
\item A large number of machine and software vendors such as
Fujitsu, Hewlett-Packard/Convex, Hitachi, IBM Parallel ESSL,
NAG Numerical PVM and MPI library, NEC Scientific Software
Library, SGI Cray Scientific Library, SUN Scientific Software
Library and Visual Numerics (IMSL) have already adopted this
interface design for their own products.
\end{itemize}
The distributed-memory dense BLAS standardization effort involved
about 30 people from 20 organizations, mainly from the United States
and Europe. Most of the major vendors of concurrent computers were
involved in this work, along with researchers from universities,
government laboratories, and industry.

The standardization process began with the BLAS Technical Workshop,
sponsored by the University of Tennessee Knoxville, held November
13-14, 1995, in Knoxville, Tennessee. At this workshop the basic
features essential to a standard distributed-memory dense BLAS
interface were discussed, and a working group was established to
continue the standardization process.
The main advantages of establishing a distributed-memory dense BLAS
standard are portability and ease-of-use. In a distributed-memory
environment in which the higher-level routines and/or abstractions
are built upon lower-level message-passing and computational routines,
the benefits of standardization are particularly apparent.
Furthermore, the definition of distributed-memory dense basic linear
algebra subprograms, such as those proposed here, provides vendors
with a clearly defined base set of routines that they can implement
efficiently, or in some cases provide hardware support for, thereby
enhancing scalability.
The goal of the distributed-memory dense BLAS interface, simply
stated, is to develop a widely used standard for writing
message-passing programs performing dense basic linear algebra
operations. As such, the interface should establish a practical,
portable, efficient and flexible standard for distributed-memory
dense basic linear algebra operations. A complete list of goals
follows.
\begin{itemize}
\item Design an Application Programming Interface
(API) (not necessarily for compilers or a
system implementation library) well suited
for distributed-memory dense basic linear
algebra computations.
\item Allow efficient communication and computation:
minimizing communication startup overhead and
volume, while maximizing load balance and local
computational performance.
\item Allow for re-use of the existing message-passing
      interface standard\cite{mpi94a} as well as local
      basic linear algebra computational kernels such
      as the {\em de facto} standard BLAS.
\item Allow for implementations that can be used in a
heterogeneous environment.
\item Allow convenient Fortran~77, Fortran~90, High
Performance Fortran (HPF), C and C++ bindings
for the interface.
\item Define an interface that is not too different
from current practice and provide extensions
that allow greater flexibility.
\item Define an interface that can be implemented on
many vendors' platforms, with no significant
changes in the underlying system software.
\item Semantics of the interface should be language-
and data-distribution independent.
\item The interface should be designed to allow for
thread-safety.
\end{itemize}
\subsection{Who Should Use This Standard?}
This standard is intended for use by all those who want to write
portable programs performing dense linear algebra operations in
Fortran~77, Fortran~90, High Performance Fortran (HPF), C or C++.
This includes individual application programmers, developers of dense
linear algebra software designed to run on parallel machines, and
creators of computational environments and tools. In order to be
attractive to this wide audience, the standard must provide a simple,
easy-to-use interface for the basic user while not semantically
precluding the high-performance computation and communication
operations available on advanced machines.
\subsection{What Platforms Are Targets for Implementation?}
The attractiveness of the distributed-memory dense BLAS at least
partially stems from its wide portability as well as the common
occurrence of dense linear algebra operations in numerical
simulations. These programs may run on distributed-memory
multiprocessors, networks of workstations, and combinations of all of
these. In addition, shared-memory implementations are possible. The
message-passing paradigm will not be made obsolete by architectures
combining the shared- and distributed-memory views, or by increases
in network speeds. It thus should be both possible and useful to
implement this standard on a great variety of machines, including
those ``machines'', parallel or not, connected by a communication
network.
The distributed-memory dense BLAS interface provides many features
intended to improve performance on scalable parallel computers with
specialized interprocessor communication hardware. Thus, we expect
that native, high-performance implementations of this interface will
be provided on such machines. At the same time, implementations of
this standard on top of MPI or PVM will provide portability to
workstation clusters and heterogeneous networks of workstations.
\subsection{What Is Included in the Standard?}
The standard includes:
\begin{itemize}
\item A set of basic dense linear algebra computational operations
\item Data-redistribution operations
\item Environmental management and inquiry
\item Bindings for Fortran~77 and C that can also be used in
      Fortran~90 and C++ programs, respectively
\item Bindings for High Performance Fortran (HPF)
\end{itemize}
\subsection{What Is Not (Yet?) Included in the Standard?}
The standard does not (yet) specify:
\begin{itemize}
\item Debugging facilities
\item Specific I/O and data generation operations
\item Out-of-core computational routines
\item Packed and banded storage computational routines
\item Extended precision computational routines
\end{itemize}
There are many features and interface variants that have been
considered and not included in this standard. This happened for a
number of reasons, one of which is the time constraint that was
self-imposed in finishing the standard. Features that are not
included can always be offered as extensions by specific
implementations. Perhaps future versions of the distributed-memory
dense BLAS interface will address some of these issues.
\subsection{Organization of This Chapter} \label{subsec:dmblas_organization}
The following is a list of the remaining sections in this chapter,
along with a brief description of each.
\begin{itemize}
\item Section~\ref{sec:dmblas_terminology}, Terminology and
      Conventions, explains notational terms and conventions used
      throughout this chapter.
\item Section~\ref{sec:dmblas_functionality}, Functionality,
      describes the operations performed by the distributed-memory
      dense BLAS functions.
\item Section~\ref{sec:dmblas_lis}, Language and Data-distribution
      Independent Specifications, defines the operations of the
      distributed-memory dense BLAS independently of the
      data-decomposition scheme used for distributing the operands.
      Matrix-matrix multiply, matrix-vector multiply and dot-product
      are found there, along with many associated functions designed
      to make the basic linear algebra computational operations
      powerful, easy to use and efficient.
\item Section~\ref{sec:dmblas_lds}, Language- and Data-Distribution
      Dependent Specifications, defines the Fortran~77 and C bindings
      of the distributed-memory dense BLAS functions for the
      two-dimensional block-cyclic and block-cartesian data
      decomposition schemes (the block-cyclic mapping is sketched
      after this list).
\item Section~\ref{sec:dmblas_ref_impl}, A Reference Implementation,
      describes a model implementation of the distributed-memory
      dense BLAS, as well as testing and timing programs for the
      two-dimensional block-cyclic mapping, that aims at supporting
      and encouraging the use of the distributed-memory dense BLAS.
\item Section~\ref{sec:dmblas_lang_bindings}, Language Bindings,
      gives specific syntax in Fortran~77 and C for all
      distributed-memory dense BLAS functions for the two-dimensional
      block-cyclic and block-cartesian data-decomposition schemes.
\item The distributed-memory dense BLAS Index is a simple index
      showing the location of the precise definition of each function
      defined by this chapter, together with both C and Fortran~77
      bindings.
\end{itemize}
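For illustration only (the formal definition appears in
Section~\ref{sec:dmblas_lds}), the one-dimensional form of the
block-cyclic mapping may be sketched as follows, using zero-based
indexing for simplicity: given a block size $nb$ and $P$ processes,
the entry with global index $i$ is assigned to process
\[
  p = \left\lfloor i / nb \right\rfloor \bmod P ,
\]
where it is stored at local index
\[
  \left\lfloor \frac{i}{P \, nb} \right\rfloor nb + ( i \bmod nb ) .
\]
The two-dimensional scheme applies this mapping independently to the
row and column indices of a matrix over a $P \times Q$ process grid.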
From dongarra@CS.UTK.EDU Fri Dec 5 10:57:23 1997
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id KAA09292; Fri, 5 Dec 1997 10:57:22 -0500
Received: from concorde.inria.fr (concorde.inria.fr [192.93.2.39])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id LAA17761; Fri, 5 Dec 1997 11:04:08 -0500 (EST)
Received: from arte.inria.fr (arte.inria.fr [128.93.12.53])
by concorde.inria.fr (8.8.7/8.8.5) with ESMTP id RAA02474;
Fri, 5 Dec 1997 17:04:06 +0100 (MET)
Received: from localhost (scott@localhost) by arte.inria.fr (8.7.6/8.7.3) with SMTP id RAA19777; Fri, 5 Dec 1997 17:04:05 +0100 (MET)
Date: Fri, 5 Dec 1997 17:04:04 +0100 (MET)
From: Tony Scott
To: blast-comm@CS.UTK.EDU
cc: blast-sparse@CS.UTK.EDU, blast-parallel@CS.UTK.EDU
Subject: Maple/BLAS link
Message-ID:
MIME-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII
Dear Sirs,
My name is Tony Scott and I am one of the developers of
the Maple Computer Algebra System. I am currently working on
a project which attempts to link Maple with Scilab (a fast
numerical software package similar to MATLAB). In
particular, we are also interested in the links between Maple
and Scilab with FORTRAN, C and BLAS (Scilab is freeware, by the
way; it can be accessed via anonymous ftp and is partly based on
the BLAS routines). This project has started here at
INRIA-Rocquencourt (which is in the Paris region of France).
For this reason, I would like to communicate with you
so as to exchange information that could benefit the outcome of
this project.
Dr. Claude Gomez has started a package called transfor, which
transforms matrix operations into BLAS routine calls. So far it works,
but it is based on the standard routines. We are also
interested in sparse BLAS routines and routines which exploit
parallel processing. However, keeping freeware in mind,
we want to make sure that the conversion into BLAS is done in
terms of standard, available (and free) routines.
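To make the idea concrete, an update of the form C := A*B + C would
map onto the level-3 BLAS routine DGEMM; the following C sketch is
hypothetical (it is not actual transfor output, and the CBLAS
interface is used only for brevity):

    #include <cblas.h>

    /* Illustrative only: the kind of call a matrix expression like
       C := A . B + C could be translated into.  A, B, C are n x n,
       column-major, with leading dimension n. */
    void matmul_update( int n, const double *A, const double *B, double *C )
    {
        cblas_dgemm( CblasColMajor, CblasNoTrans, CblasNoTrans,
                     n, n, n,
                     1.0, A, n,    /* alpha = 1, op(A) = A  */
                          B, n,    /* op(B) = B             */
                     1.0, C, n );  /* beta  = 1: C += A*B   */
    }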
I know that Cray has a huge directory of sparse
BLAS routines and in the past few years, there has been some
intensive work on sparse BLAS routines. However, I don't
know if these sparse BLAS routines are now "standard" and
available. Can anyone tell me if they are, and where a
directory of such routines is available?
Thank you for your time.
Best Wishes,
Tony Scott
Editor-in-Chief
MapleTech
From dongarra@CS.UTK.EDU Sat Dec 6 02:35:16 1997
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id CAA20279; Sat, 6 Dec 1997 02:35:16 -0500
Received: from timbuk.cray.com (timbuk-anet.cray.com [128.162.19.7])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id CAA11338; Sat, 6 Dec 1997 02:42:06 -0500 (EST)
Received: from ironwood.cray.com (root@ironwood-fddi.cray.com [128.162.21.36]) by timbuk.cray.com (8.8.7/CRI-gate-news-1.3) with ESMTP id BAA27165; Sat, 6 Dec 1997 01:42:05 -0600 (CST)
Received: from [204.73.50.35] (eagan-rip17 [204.73.50.17]) by ironwood.cray.com (8.8.4/CRI-ironwood-news-1.0) with ESMTP id BAA01658; Sat, 6 Dec 1997 01:42:00 -0600 (CST)
X-Sender: mamh@ironwood.cray.com
Message-Id:
In-Reply-To:
Mime-Version: 1.0
Content-Type: text/plain; charset="us-ascii"
Date: Sat, 6 Dec 1997 01:28:28 -0600
To: Tony Scott , blast-comm@CS.UTK.EDU
From: Mike Heroux
Subject: Re: Maple/BLAS link
Cc: blast-sparse@CS.UTK.EDU, blast-parallel@CS.UTK.EDU
Tony,
The sparse BLAS are not standard at this point. There is ongoing
discussion about it, and progress is being made. There are reference
implementations of a Toolkit and a User Level sparse BLAS. The best
starting point is probably to visit the website
http://math.nist.gov/spblas which is maintained by Karin Remington
and Roldan Pozo. The reference implementation of the Toolkit is
there, as are pointers to Iain Duff's work on the User Level sparse
BLAS.
Best regards,
Mike
From dongarra@CS.UTK.EDU Wed Mar 4 15:07:28 1998
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id PAA25817; Wed, 4 Mar 1998 15:07:27 -0500
Received: from mail.cs.utexas.edu (root@mail.cs.utexas.edu [128.83.139.10])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id PAA25480; Wed, 4 Mar 1998 15:07:47 -0500 (EST)
Received: from bigbird.cs.utexas.edu (rvdg@bigbird.cs.utexas.edu [128.83.143.202])
by mail.cs.utexas.edu (8.8.5/8.8.5) with ESMTP id OAA21180;
Wed, 4 Mar 1998 14:07:10 -0600 (CST)
Received: by bigbird.cs.utexas.edu (8.8.5/Client-1.5)
id OAA10616; Wed, 4 Mar 1998 14:07:08 -0600
Date: Wed, 4 Mar 1998 14:07:08 -0600
Message-Id: <199803042007.OAA10616@bigbird.cs.utexas.edu>
From: Robert van de Geijn
To: blast-parallel@CS.UTK.EDU
CC: blast-comm@CS.UTK.EDU, morrow@cs.utexas.edu
Subject: Distributed BLAS standardization
Folks,
I would like to start by stating that in my personal opinion, we who
are strictly in academia should stay out of the standardization
business. We should prototype ideas and give advice. Ultimately, the
mission of standardization is better met by the government labs and
industry.
As you are all well aware, we at Texas have long held the opinion that
standardization of the distributed BLAS along the lines of the
ScaLAPACK PBLAS should not be done in haste. Thus, we are quite
surprised to see that a first reading of the chapter on distributed
BLAS is planned at the next meeting, and that that chapter essentially
proposes to use the PBLAS interface as the standard.
A few years ago, we started the PLAPACK project to investigate
the merits of raising the level of abstraction for programming
parallel BLAS and codes that use parallel BLAS. All along we agreed
that this may incur a certain overhead, but maintained that this could
be overcome by the fact that the simplicity of the programming
interface would allow for more sophisticated algorithms to be
programmed. We are finally at a stage where we can report on our
experiences.
The following performance comparison between PLAPACK and the
equivalent ScaLAPACK routine on a 16-node Cray T3E-600 demonstrates
our point:
          Performance of dense linear solve
             (in MFLOPS/sec/processor)

        n       PLAPACK      ScaLAPACK
              (PLA_Gesv)     (PSGESV)
      2000         66           101
      5000        174           168
      7500        228           201
     10000        268           209
The ScaLAPACK performance numbers were derived from those reported in
the ScaLAPACK Users' Guide. We conducted our own experiments on the
T3E used to collect the PLAPACK numbers and found the ScaLAPACK
numbers to be representative. Of course, 16 nodes is too small to
prove a point; however, we have incomplete data suggesting that the
same behavior can be observed on other platforms and larger systems.
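As an aside, MFLOPS/sec/processor presumably follows the customary
dense-solve operation count of 2n^3/3 + 2n^2 floating-point operations
(the message does not state which count was used); the arithmetic, as
an illustrative C helper:

    /* Illustrative only (the flop count is our assumption, not
       stated in this message): per-processor MFLOPS for a dense
       solve of order n that took `seconds' on `nprocs' processors. */
    double mflops_per_proc( double n, double seconds, int nprocs )
    {
        double flops = (2.0/3.0)*n*n*n + 2.0*n*n;   /* LU factor + solve */
        return flops / ( seconds * nprocs * 1.0e6 );
    }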
Since performance is sometimes a determining factor in whether or
not people pay attention to research, we hope that the above
numbers will encourage people to look beyond the ScaLAPACK PBLAS
and consider alternatives. Our advice is that a more comprehensive
interface to an infrastructure be considered, along the lines of MPI.
In particular, indices are evil.
More information on PLAPACK:
http://www.cs.utexas.edu/users/plapack
plapack@cs.utexas.edu
The PLAPACK Users' Guide is available from The MIT Press:
Robert van de Geijn,
"Using PLAPACK: Parallel Linear Algebra Package",
The MIT Press, 1997
http://mitpress.mit.edu/
Best Regards
Greg Morrow and Robert van de Geijn
for the PLAPACK team
======================================================================
Robert A. van de Geijn Taylor Hall 4.115C
Associate Professor (512) 471-9720 (office)
Department of Computer Sciences (512) 471-8885 (fax)
The University of Texas rvdg@cs.utexas.edu
Austin, Texas 78712 http://www.cs.utexas.edu/users/rvdg
From dongarra@CS.UTK.EDU Wed Mar 4 16:03:13 1998
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id QAA26240; Wed, 4 Mar 1998 16:03:13 -0500
Received: from cupid.cs.utk.edu (CUPID.CS.UTK.EDU [128.169.94.221])
by CS.UTK.EDU with SMTP (cf v2.9s-UTK)
id QAA29446; Wed, 4 Mar 1998 16:04:39 -0500 (EST)
From: R Clint Whaley
Received: by cupid.cs.utk.edu (cf v2.11c-UTK)
id QAA04618; Wed, 4 Mar 1998 16:04:33 -0500
Date: Wed, 4 Mar 1998 16:04:33 -0500
Message-Id: <199803042104.QAA04618@cupid.cs.utk.edu>
To: blast-parallel@CS.UTK.EDU, rvdg@cs.utexas.edu
Subject: Re: Distributed BLAS standardization
Cc: blast-comm@CS.UTK.EDU, morrow@cs.utexas.edu
>The following performance comparison between PLAPACK and the
>equivalent ScaLAPACK routine on a 16 node Cray T3E-600 demonstrates
>our point:
>
> Performance of dense linear solve
> (in MFLOPS/sec/processor)
>
> n PLAPACK ScaLAPACK
> (PLA_Gesv) (PSGESV)
> 2000 66 101
> 5000 174 168
> 7500 228 201
> 10000 268 209
>
>The ScaLAPACK performance numbers were derived from those reported in
>the ScaLAPACK Users' Guide. We conducted our own experiments on the
>T3E used to collect the PLAPACK numbers and found the ScaLAPACK
>numbers to be representative. Of course, 16 nodes is too small to
>prove a point, however, we have incomplete data to show that the same
>behavior can be observed on other platforms and larger systems.
I'm a little confused. The numbers in the SLUG (which you are using)
are for a 300 MHz T3E; i.e., the hardware is running half as fast as
the hardware for your PLAPACK timings. I don't have access to a
600 MHz T3E, but I do have access to a 450 MHz T3E. Here are the
numbers, as you've plotted them:
               600 MHz       450 MHz
        n      PLAPACK      ScaLAPACK
              (PLA_Gesv)     (PSGESV)
      2000         66           122
      5000        174           257
      7500        228           315
     10000        268           344
Taken at face value, these numbers seem to indicate that ScaLAPACK is strikingly
faster . . .
Cheers,
Clint
From dongarra@CS.UTK.EDU Wed Mar 4 16:27:52 1998
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id QAA26570; Wed, 4 Mar 1998 16:27:51 -0500
Received: from mail.cs.utexas.edu (root@mail.cs.utexas.edu [128.83.139.10])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id QAA01348; Wed, 4 Mar 1998 16:29:06 -0500 (EST)
Received: from bigbird.cs.utexas.edu (morrow@bigbird.cs.utexas.edu [128.83.143.202])
by mail.cs.utexas.edu (8.8.5/8.8.5) with ESMTP id PAA24265;
Wed, 4 Mar 1998 15:28:24 -0600 (CST)
From: Greg Morrow
Received: by bigbird.cs.utexas.edu (8.8.5/Client-1.5)
id PAA22566; Wed, 4 Mar 1998 15:28:23 -0600
Message-Id: <199803042128.PAA22566@bigbird.cs.utexas.edu>
Subject: Re: Distributed BLAS standardization
To: rwhaley@CS.UTK.EDU
Date: Wed, 4 Mar 1998 15:28:22 -0600 (CST)
Cc: blast-parallel@CS.UTK.EDU, blast-comm@CS.UTK.EDU,
rvdg@cs.utexas.edu (Robert van de Geijn)
In-Reply-To: <199803042104.QAA04618@cupid.cs.utk.edu> from "rwhaley@cs.utk.edu" at Mar 4, 98 04:04:33 pm
X-Mailer: ELM [version 2.4 PL25]
MIME-Version: 1.0
Content-Type: text/plain; charset=US-ASCII
Content-Transfer-Encoding: 7bit
rwhaley@cs.utk.edu wrote :
>
> >The following performance comparison between PLAPACK and the
> >equivalent ScaLAPACK routine on a 16 node Cray T3E-600 demonstrates
> >our point:
> >
> > Performance of dense linear solve
> > (in MFLOPS/sec/processor)
> >
> > n PLAPACK ScaLAPACK
> > (PLA_Gesv) (PSGESV)
> > 2000 66 101
> > 5000 174 168
> > 7500 228 201
> > 10000 268 209
> >
...
>
> I'm a little confused. The numbers in the SLUG (which you are using)
> are for a 300 MHz T3E; i.e., the hardware is running half as fast as the
> hardware for your plapack timings. I don't have access to a 600Mhz T3E,
> but I do have access to a 450Mhz T3E. Here are the numbers, as you've plotted
> them:
>
> 600Mhz 450Mhz
> n PLAPACK ScaLAPACK
> (PLA_Gesv) (PSGESV)
> 2000 66 122
> 5000 174 257
> 7500 228 315
> 10000 268 344
>
> Taken at face value, these numbers seem to indicate that ScaLAPACK is strikingly
> faster . . .
>
> Cheers,
> Clint
>
Clint,
There is a bit of confusion here. The T3E-600 is the
model number for the 300 MHz machine. The T3E-900 is
the 450 MHz machine. The PLAPACK numbers are from
the 300 MHz machine, so the comparison is fair.
Thanks,
Greg
--
Greg Morrow morrow@cs.utexas.edu
Texas Institute for Computational and Applied Mathematics
University of Texas at Austin
From dongarra@CS.UTK.EDU Wed Mar 4 16:59:03 1998
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id QAA26845; Wed, 4 Mar 1998 16:58:59 -0500
Received: from antares.mcs.anl.gov (mcs.anl.gov [140.221.9.6])
by CS.UTK.EDU with SMTP (cf v2.9s-UTK)
id RAA03672; Wed, 4 Mar 1998 17:00:25 -0500 (EST)
Received: from eagle.mcs.anl.gov (eagle.mcs.anl.gov [140.221.3.47]) by antares.mcs.anl.gov (8.6.10/8.6.10) with SMTP
id QAA17923; Wed, 4 Mar 1998 16:00:20 -0600
Date: Wed, 4 Mar 1998 16:00:18 -0600 (CST)
From: Barry Smith
To: blast-parallel@CS.UTK.EDU, blast-comm@CS.UTK.EDU
Subject: Re: Distributed BLAS standardization
In-Reply-To: <199803042007.OAA10616@bigbird.cs.utexas.edu>
Message-ID:
MIME-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII
It would be very unfortunate if a standard were implemented
just because "any standard is better than no standard". Clearly
the research community is still exploring various approaches to
parallelizing dense matrix computations; good new ideas are still
emerging, and they should be fostered, not crushed.
I withdrew from the BLAST Forum because it was clear to me that
it would never have the community-wide participation required to
foster the best possible approaches and standards (which, for example,
MPI did have). Unless the BLAST Forum has wide participation,
it will always be irrelevant to the end users.
I hope that Robert can propose the PLAPACK approach as an alternative
to the PBLAS as a chapter in the distributed BLAS. The technical
discussions and learning as one compares and contrasts the two
approaches are fundamental to developing any standards; no one loses
and everyone wins if you can select the best parts of several proposals.
This was the great success of MPI.
Barry Smith
From dongarra@CS.UTK.EDU Wed Mar 4 17:09:59 1998
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id RAA26952; Wed, 4 Mar 1998 17:09:59 -0500
Received: from timbuk.cray.com (timbuk-fddi.cray.com [128.162.8.102])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id RAA04272; Wed, 4 Mar 1998 17:08:23 -0500 (EST)
Received: from ironwood.cray.com (root@ironwood-fddi.cray.com [128.162.21.36]) by timbuk.cray.com (8.8.8/CRI-gate-news-1.3) with ESMTP id QAA16328; Wed, 4 Mar 1998 16:08:07 -0600 (CST)
Received: from song.cray.com (song [128.162.174.153]) by ironwood.cray.com (8.8.4/CRI-ironwood-news-1.0) with ESMTP id QAA20911; Wed, 4 Mar 1998 16:08:05 -0600 (CST)
From: Guangye Li
Received: by song.cray.com (8.8.0/btd-b3)
id WAA14728; Wed, 4 Mar 1998 22:08:04 GMT
Message-Id: <199803042208.WAA14728@song.cray.com>
Subject: Re: Distributed BLAS standardization
To: rwhaley@CS.UTK.EDU (R Clint Whaley)
Date: Wed, 4 Mar 1998 16:08:03 -0600 (CST)
Cc: blast-parallel@CS.UTK.EDU, rvdg@cs.utexas.edu, blast-comm@CS.UTK.EDU,
morrow@cs.utexas.edu
In-Reply-To: <199803042104.QAA04618@cupid.cs.utk.edu> from "R Clint Whaley" at Mar 4, 98 04:04:33 pm
X-Mailer: ELM [version 2.4 PL24-CRI-d]
MIME-Version: 1.0
Content-Type: text/plain; charset=US-ASCII
Content-Transfer-Encoding: 7bit
Just a clarification on the T3E speed.
The naming of the T3E series is based on the peak megaflop rate
of the processor rather than on megahertz. (The Alpha processor in
the T3E can complete two floating-point operations per clock cycle,
so a 300 MHz processor peaks at 600 MFLOPS.) So a 300 MHz T3E is
called the T3E-600, and the 450 MHz and 600 MHz T3Es are called the
T3E-900 and T3E-1200, respectively.
Guangye Li
Cray Research, a Silicon Graphics Company
From dongarra@CS.UTK.EDU Fri Mar 13 16:03:09 1998
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id QAA10979; Fri, 13 Mar 1998 16:03:09 -0500
Received: from mail.cs.utexas.edu (root@mail.cs.utexas.edu [128.83.139.10])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id QAA00522; Fri, 13 Mar 1998 16:06:26 -0500 (EST)
Received: from bigbird.cs.utexas.edu (rvdg@bigbird.cs.utexas.edu [128.83.143.202])
by mail.cs.utexas.edu (8.8.5/8.8.5) with ESMTP id PAA18524;
Fri, 13 Mar 1998 15:05:23 -0600 (CST)
Received: by bigbird.cs.utexas.edu (8.8.5/Client-1.5)
id PAA37802; Fri, 13 Mar 1998 15:05:21 -0600
Date: Fri, 13 Mar 1998 15:05:21 -0600
Message-Id: <199803132105.PAA37802@bigbird.cs.utexas.edu>
From: Robert van de Geijn
To: blast-parallel@CS.UTK.EDU
CC: blast-comm@CS.UTK.EDU, plapackers@cs.utexas.edu
In-reply-to: <199803042007.OAA10616@bigbird.cs.utexas.edu> (message from
Robert van de Geijn on Wed, 4 Mar 1998 14:07:08 -0600)
Subject: Re: Distributed BLAS standardization
Folks,
Just to show that high-level abstraction can be achieved from FORTRAN
as well, we have just completed an initial stab at a PLAPACK FORTRAN
interface. As with the C PLAPACK interface itself, the FORTRAN
interface has an MPI-like feel to it, which is also what made the
interface quite simple to implement. Comments are of course welcome.
For details, see http://www.cs.utexas.edu/users/plapack/FORTRAN
Best Regards
Robert
======================================================================
Robert A. van de Geijn Taylor Hall 4.115C
Associate Professor (512) 471-9720 (office)
Department of Computer Sciences (512) 471-8885 (fax)
The University of Texas rvdg@cs.utexas.edu
Austin, Texas 78712 http://www.cs.utexas.edu/users/rvdg
From dongarra@CS.UTK.EDU Tue Apr 14 04:05:13 1998
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id EAA12187; Tue, 14 Apr 1998 04:05:12 -0400
Received: from sky.fit.qut.edu.au (rajkumar@sky.fit.qut.edu.au [131.181.2.4])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id EAA07245; Tue, 14 Apr 1998 04:15:07 -0400 (EDT)
Received: (from rajkumar@localhost)
by sky.fit.qut.edu.au (8.8.8/8.8.8/tony) id SAA10846;
Tue, 14 Apr 1998 18:14:58 +1000 (EST)
Date: Tue, 14 Apr 1998 18:14:58 +1000 (EST)
From: Rajkumar Buyya
Message-Id: <199804140814.SAA10846@sky.fit.qut.edu.au>
To: blast-comm@CS.UTK.EDU, blast-parallel@CS.UTK.EDU
Subject: CFP for PDCP Journal special issue
Special Issue on
High Performance Computing on Clusters
Parallel and Distributed Computing Practices (PDCP) Journal
February 1999 (Vol 2, No 2), Nova Science Publishers, USA
Call for Papers
----------------------------------------------------------------------------
In recent years, high-speed networks and improved microprocessor
performance have made networks of workstations an appealing vehicle for
parallel computing. Clusters/networks of computers (workstations/PCs) built
using commodity hardware or software are playing a major role in redefining
the concept of supercomputing. As a whole, clusters are becoming a
viable alternative to MPPs and supercomputers. The focus of this special
issue will be on both hardware and software aspects of computing on
clusters. Topics of interest include, but are not limited to:
* Cluster Hardware (Cluster of Workstations or PCs)
* Active Messages and Light Weight Protocols
* Cluster Operating System
* Single System Image
* Network Capabilities for Fast Communication
* Message Passing Systems such as MPI and PVM for Clusters
* Characterization of Communication and Synchronization Traffic
* Operating Environments
* Data Distribution and Load Balancing
* Programming Paradigms/Environment for Clusters
* HPCC Models for Graphics and Multimedia
* Problem Solving Environment for Clusters
* Tools for Operating and Managing Clusters
* Parallel and Distributed Computing/HPC in Java
* Algorithms for Solving Problems on Clusters
* Building Applications on Clusters
* Large Scale System Administration
* Issues in Building Scalable Services
* Fault Tolerance Issues for Clusters
Submissions should include the authors' names, affiliations, addresses,
fax and phone numbers, and email addresses on the cover page. Please
submit the full paper (not exceeding 10 single-spaced pages) to one of
the guest editors electronically. Email a PostScript file of the paper
(preferably viewable with ghostview); also send a separate email with
the authors' names and addresses and the title and abstract of the
paper. Hard copies should be sent only if electronic submission is not
possible.
Please submit the full paper for consideration for this special issue
to one of the guest editors by September 15, 1998.
Guest Editors:
RAJKUMAR Clemens Szyperski
School of Computing Science School of Computing Science
Faculty of Information Technology Faculty of Information Technology
Queensland University of Technology Queensland University of Technology
825a, Level 8, ITE Building 735, Level 7, ITE Building
Gardens Point Campus Gardens Point Campus
Brisbane, Australia, QLD 4001 Brisbane, Australia, QLD 4001
Office phone: +61 7 3864 1290 Phone: +61 7 3864 2132
Office fax: +61 7 3864 1801 Fax: +61 7 3864 1801
Email: rajkumar@fit.qut.edu.au Email: szypersk@fit.qut.edu.au
Important Dates:
Draft Papers due on: 15th September, 1998
Notification of acceptance: 15th November, 1998
Final paper in LaTeX format due on: 15th December, 1998
Important URLs:
PDCP Journal: http://orca.st.usm.edu/pdcp
CFP: http://www.fit.qut.edu.au/~rajkumar/pdcp.html
CFP: http://www.fit.qut.edu.au/~szypersk/pdcp.html
Instructions for
Contributors: http://orca.st.usm.edu/pdcp/InstructionsForContributors.html
Nova Science Publishers: http://www.nexusworld.com/nova
----------------------------------------------------------------------------
From dongarra@CS.UTK.EDU Thu Apr 16 15:43:53 1998
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id PAA02022; Thu, 16 Apr 1998 15:43:53 -0400
Received: from mail.cs.utexas.edu (root@mail.cs.utexas.edu [128.83.139.10])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id PAA24407; Thu, 16 Apr 1998 15:43:35 -0400 (EDT)
Received: from bigbird.cs.utexas.edu (rvdg@bigbird.cs.utexas.edu [128.83.143.202])
by mail.cs.utexas.edu (8.8.5/8.8.5) with ESMTP id OAA29473;
Thu, 16 Apr 1998 14:42:32 -0500 (CDT)
Received: by bigbird.cs.utexas.edu (8.8.5/Client-1.5)
id OAA18392; Thu, 16 Apr 1998 14:42:31 -0500
Date: Thu, 16 Apr 1998 14:42:31 -0500
Message-Id: <199804161942.OAA18392@bigbird.cs.utexas.edu>
From: Robert van de Geijn
To: blast-comm@CS.UTK.EDU
CC: blast-parallel@CS.UTK.EDU, rvdg@cs.utexas.edu
Subject: Distributed BLAS standardization
Folks,
We recently reported performance numbers for PLAPACK vs. ScaLAPACK on
our Cray T3E-600. In compiling numbers for ScaLAPACK, we measured
performance of the ScaLAPACK library that is part of the Cray
scientific library. We were not aware that the Cray scientific
library is compiled with "streams" turned off. The default for our
system is to have streams turned on for both compilation and
execution, and thus PLAPACK was compiled and executed with streams
turned on.
Corrected performance numbers (for the LU factorization based
solvers) are given below.
            Cray T3E-600 (300 MHz), 16 nodes

              PLAPACK     ScaLAPACK    ScaLAPACK (Empirical)
     size   streams on      Guide     streams off   streams on
   ------------------------------------------------------------
     2000       66           101           91           110
     5000      174           168          169           215
     7500      228           201          202           259
    10000      268           209          225           285
Version information:
    PLAPACK              Release 1.1
    ScaLAPACK            as part of Cray Scientific Library
    Standard C compiler  Cray Standard C Version
    Cray Assembler       CAM Version 2.3.0.0
    f90                  Cray CF90
    CrayLibs             Version 3.0.0.0
    CrayTools            Version 3.0.0.0
    Compiler Options     -O3
While the performance numbers we quoted for the ScaLAPACK version
included in the Cray scientific library were not measured under
optimal circumstances, the conclusion that high performance can be
attained by raising the level of abstraction is nonetheless correct.
We had merely stopped optimizing PLAPACK at the point where we felt
there was a comfortable gap between PLAPACK performance and ScaLAPACK
performance.
After measuring ScaLAPACK with streams turned on, I implemented a few
minor optimizations to PLAPACK to demonstrate this point. So far, we
have only done this for the Cholesky factorization. The following
table reports numbers for several versions each of ScaLAPACK and
PLAPACK:
            Cray T3E-600 (300 MHz), 16 processors

                 PLAPACK                    ScaLAPACK
          Previously  Optimized    Cray Sci Library     netlib version
       n    Reported              streams off     on     streams on
   -------------------------------------------------------------------
    1000                  49           83         98          85
    2000        104      115          137        165         149
    3000                 168          164        199         191
    4000                 200          184        226         222
    5000        219      232          198        243         240
    7500        268      283          220        269         273
   10000        302      315          234        287         292
   12500                 334          245        299         306
   14000                 348
A comment on what was measured:
The measurements for PLAPACK and for the Cray Sci Library were
performed by us on the Cray T3E-600 at UT-Austin. The compiler
versions etc. are given above. The numbers in the last column were
sent to us by Jack Dongarra, and were collected on the NERSC Cray
T3E-600.
Version info for that machine:
UNICOS/mk 2.0.2.18
LIBSCI 3.0.1.4
C compiler = cc
C flags = -O3
F77 compiler = f90
F77 flags = -dp -O3 -X m
Cray MPI (mpt 1.2.0.1)
The ScaLAPACK code was what is available on netlib:
ScaLAPACK, version 1.5 + update1.6 + t3epatch
The Default Programming Environment on the Cray was:
craylibs 3.0.1.4 craytools 3.0.1.1
cf90 3.0.1.4 scc 6.0.1.3
CC 3.0.1.3 CCmathlib 3.0.1.0
CCtoollib 3.0.1.0 cam 2.3.0.1
mpt 1.2.0.1
A few comments about performance:
Notice that the asymptotic performance of PLAPACK is better than that
of the current version of ScaLAPACK. This can be (speculatively)
attributed to two things:

1) We use an algorithmic block size that is larger than our
distribution block size, which appears to improve performance.

2) The local symmetric rank-k update performed as part of the Cholesky
is inherently tricky: locally on one node, the matrix is neither a
clean rectangle nor a clean triangle, and thus the matrix is generally
updated a panel at a time (where the panel width equals the
distribution block size). Our implementation instead uses a recursive
approach, which recreates a clean rectangle for the bulk of the
computation, thereby improving performance of the local BLAS calls (a
sketch of this recursive idea follows below).
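The following is a minimal sketch of that recursive idea (our own
illustration, not the PLAPACK source; CUTOFF is an arbitrary crossover
point and the CBLAS interface is used only for brevity):

    #include <cblas.h>

    #define CUTOFF 64   /* arbitrary crossover, for illustration */

    /* Recursive symmetric rank-k update of the lower triangle:
       C := C - A*A^T, with C n x n and A n x k, both column-major.
       Splitting C as [ C11 0; C21 C22 ] turns the bulk of the work
       (the update of C21) into a clean rectangular matrix-matrix
       multiply, which the local BLAS handle efficiently. */
    void syrk_lower_rec( int n, int k, const double *A, int lda,
                         double *C, int ldc )
    {
        if ( n <= CUTOFF ) {   /* small clean triangle: plain rank-k */
            cblas_dsyrk( CblasColMajor, CblasLower, CblasNoTrans,
                         n, k, -1.0, A, lda, 1.0, C, ldc );
            return;
        }
        int n1 = n / 2, n2 = n - n1;
        /* C11 := C11 - A1*A1^T (recurse on top-left triangle)      */
        syrk_lower_rec( n1, k, A, lda, C, ldc );
        /* C21 := C21 - A2*A1^T (clean rectangle: one big GEMM)     */
        cblas_dgemm( CblasColMajor, CblasNoTrans, CblasTrans,
                     n2, n1, k, -1.0, A + n1, lda, A, lda,
                     1.0, C + n1, ldc );
        /* C22 := C22 - A2*A2^T (recurse on bottom-right triangle)  */
        syrk_lower_rec( n2, k, A + n1, lda, C + n1 + n1*ldc, ldc );
    }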
Notice that for smaller matrices, PLAPACK clearly still carries a large
overhead. With some effort this can still be improved.
Limitations of the reported experiment: The data is for one machine,
one mesh size, and one algorithm.
Conclusion (as before): There is still much research to be done in
this area. It thus continues to be our view that standardization of
the distributed BLAS is premature, regardless of the interface being
proposed.
Regards
Robert
From dongarra@CS.UTK.EDU Thu Apr 16 16:35:10 1998
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id QAA02537; Thu, 16 Apr 1998 16:35:08 -0400
Received: from Aurora.CS.MsState.Edu (aurora.cs.msstate.edu [130.18.209.4])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id QAA28634; Thu, 16 Apr 1998 16:25:29 -0400 (EDT)
Received: from localhost (tony@localhost);
by Aurora.CS.MsState.Edu using SMTP (8.8.8/7.0m-FWP-MsState);
id PAA20697; Thu, 16 Apr 1998 15:34:49 -0500 (CDT)
Date: Thu, 16 Apr 1998 15:34:48 -0500 (CDT)
From: Tony Skjellum
To: Robert van de Geijn
cc: blast-comm@CS.UTK.EDU, blast-parallel@CS.UTK.EDU
Subject: Re: Distributed BLAS standardization
In-Reply-To: <199804161942.OAA18392@bigbird.cs.utexas.edu>
Message-ID:
MIME-Version: 1.0
Content-Type: TEXT/PLAIN; charset=US-ASCII
Folks,
Aren't we just comparing how clever Robert is vs. how clever Cray is here,
on a given day, or how many minion hours have been applied to a specific
solver in specific situations in either academia or industry?
I see no way to draw scientific inteferences (new word) other than: for
dense linear algebra on 1 CPU, modulo overheads, the right abstraction
levels work pretty well (subject to all the neat issues we have learned
from each other over the years about caches, prefetching, pages,
TLBs, etc.), and there are important issues (like poly-algorithms) that
come into play, particularly in parallel. I think most people involved in
BLAST share these values, so I am not sure that the letter argues for
anything in terms of a persuasive argument, other than a dictum that we
not standardize :-)
It is quite likely that standardizing the API would increase the
number of users of solvers, and, if done right, would enhance rather
than detract from the competition, because it would allow (if general
enough) all the interesting options to function efficiently under one
tent: for instance, with early binding of the problem, to allow
poly-algorithmic selection in the parallel case, or to avoid repeated
error checking in the case of small, multiple-instance problems.
Indeed, standardization democratizes the situation, by giving more play to
those who would not have been willing to use existing interfaces, because
they prohibit efficient programmatic approaches, including but not limited
to those above. Robert and others benefit if they can, within the canon of
an interface, show off their new ideas without breaking user code.
I submit that the BLAST Activity therefore can accelerate and otherwise
benefit the ongoing effort to understand how to reduce the price of
portability, and increase the market for reusable solvers like these.
Hope you have a good meeting in April; I'll see you at the next one
in October!
Tony
A. Skjellum, PhD, Assoc. Prof. of Computer Science; Mississippi State University
http://www.cs.msstate.edu/~tony; tony@cs.msstate.edu; 601-325-8435 (FAX -8997)
"Mississippi (n): small US state where opportunities abound." [Try MPI/RT!]
From dongarra@CS.UTK.EDU Thu Apr 16 19:02:07 1998
Return-Path:
Received: from CS.UTK.EDU by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id TAA03846; Thu, 16 Apr 1998 19:02:07 -0400
Received: from timbuk.cray.com (timbuk-fddi.cray.com [128.162.8.102])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id TAA06842; Thu, 16 Apr 1998 19:02:08 -0400 (EDT)
Received: from ironwood.cray.com (root@ironwood-fddi.cray.com [128.162.21.36]) by timbuk.cray.com (8.8.8/CRI-gate-news-1.3) with ESMTP id SAA05188; Thu, 16 Apr 1998 18:02:03 -0500 (CDT)
Received: from horta.cray.com (horta [128.162.173.163]) by ironwood.cray.com (8.8.4/CRI-ironwood-news-1.0) with ESMTP id SAA23487; Thu, 16 Apr 1998 18:02:01 -0500 (CDT)
From: Ed Anderson
Received: by horta.cray.com (8.8.0/btd-b3)
id XAA01112; Thu, 16 Apr 1998 23:02:00 GMT
Message-Id: <199804162302.XAA01112@horta.cray.com>
Subject: Re: Distributed BLAS standardization
To: rvdg@cs.utexas.edu (Robert van de Geijn)
Date: Thu, 16 Apr 1998 18:02:00 -0500 (CDT)
Cc: blast-comm@CS.UTK.EDU, blast-parallel@CS.UTK.EDU
In-Reply-To: <199804161942.OAA18392@bigbird.cs.utexas.edu> from "Robert van de Geijn" at Apr 16, 98 02:42:31 pm
X-Mailer: ELM [version 2.4 PL24-CRI-d]
MIME-Version: 1.0
Content-Type: text/plain; charset=US-ASCII
Content-Transfer-Encoding: 7bit
Robert van de Geijn wrote:
>
> Folks,
>
> We recently reported performance numbers for PLAPACK vs. ScaLAPACK on
> our Cray T3E-600. In compiling numbers for ScaLAPACK, we measured
> performance of the ScaLAPACK library that is part of the Cray
> scientific library. We were not aware that the Cray scientific
> library is compiled with "streams" turned off. The default for our
> system is to have streams turned on for both compilation and
> execution, and thus PLAPACK was compiled and executed with streams
> turned on.
>
I'd like to clarify Robert's comment that the Cray scientific library
is "compiled with streams turned off". Because of a hardware issue
that affected CRAY T3E-600 (300 MHz) systems, the programming
environment made an effort to determine whether an application was "stream
safe" by checking it for usage of Cray's proprietary "shmem" library.
If the application used the shmem library (or made some other use of
the E-registers), then the stream buffers were turned off by default;
however, some sites allowed users to enable the streams anyway by
setting the environment variable SCACHE_D_STREAMS to 1.
Robert is correct that the Cray Scientific Library, specifically the
BLACS, has been optimized using shmem. On a CRAY T3E-600 system, the
loader will detect that a program calling ScaLAPACK is using shmem and
disable the streams. The netlib version, using MPI BLACS, would have
the streams on by default because the MPI and PVM libraries have been
made "stream safe". We believe that libsci is "stream safe" too, so it
was OK for him to re-enable the streams.
The stream coherence problem was corrected in the next generation, so
all executables on CRAY T3E-900 and CRAY T3E-1200 systems have the
streams on by default.
--Ed
-------------------------------------------------------------------
Ed Anderson Cray Research/Silicon Graphics
Benchmarking Group 655F Lone Oak Drive
email: eca@cray.com Eagan, Minnesota 55121
Phone: 612-683-5238 Fax: 612-683-5599
From dongarra@CS.UTK.EDU Tue Jan 12 08:48:34 1999
Return-Path:
Received: from CS.UTK.EDU (CS.UTK.EDU [128.169.94.1])
by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id IAA07399; Tue, 12 Jan 1999 08:48:34 -0500
Received: from gate.ispras.ru (gate.ispras.ru [194.67.37.200])
by CS.UTK.EDU with ESMTP (cf v2.9s-UTK)
id IAA27923; Tue, 12 Jan 1999 08:53:34 -0500 (EST)
Received: from ispserv.ispras.ru (ispserv [194.67.37.72])
by gate.ispras.ru (8.9.1a/8.9.1) with ESMTP id QAA17950
for ; Tue, 12 Jan 1999 16:52:42 +0300 (GMT)
Received: from beta (beta [194.67.37.156])
by ispserv.ispras.ru (8.8.8+Sun/8.8.8) with SMTP id PAA14555
for ; Tue, 12 Jan 1999 15:59:47 +0200 (EET)
Sender: ka@ispras.ru
Message-ID: <369B618E.6DE8@ispras.ru>
Date: Tue, 12 Jan 1999 16:51:58 +0200
From: Alexey Kalinov
X-Mailer: Mozilla 3.0 (X11; I; SunOS 5.5.1 sun4m)
MIME-Version: 1.0
To: blast-parallel@CS.UTK.EDU
Subject: (no subject)
Content-Type: text/plain; charset=us-ascii
Content-Transfer-Encoding: 7bit
subscribe blast-comm
From postmaster@cs.utk.edu Fri Oct 6 09:06:40 2000
Return-Path:
Received: from cs.utk.edu (LOCALHOST.cs.utk.edu [127.0.0.1])
by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id JAA22584; Fri, 6 Oct 2000 09:06:39 -0400
Received: from cs.utk.edu (128.169.94.1 -> CS.UTK.EDU)
by netlib2.cs.utk.edu (smtpshim v1.0); Fri, 6 Oct 2000 09:06:40 -0400
Received: from m1.cs.man.ac.uk (marvin@localhost)
by cs.utk.edu with ESMTP (cf v2.9s-UTK)
id JAA22220; Fri, 6 Oct 2000 09:15:37 -0400 (EDT)
Received: from m1.cs.man.ac.uk (130.88.192.2 -> m1.cs.man.ac.uk)
by cs.utk.edu (smtpshim v1.0); Fri, 6 Oct 2000 09:15:38 -0400
Received: from random by m1.cs.man.ac.uk (8.8.8/AL/MJK-2.0)
id OAA04330; Fri, 6 Oct 2000 14:15:34 +0100 (BST)
Received: from bane by random with local (Exim 2.05 #2)
id 13hXLU-0004qI-00; Fri, 6 Oct 2000 14:15:28 +0100
Subject: wanted: (parallel) sparse Mv BLAS for O2K
To: blast-comm@cs.utk.edu, blast-parallel@cs.utk.edu, blast-sparse@cs.utk.edu
Date: Fri, 6 Oct 2000 14:15:28 +0100 (BST)
From: Michael
Reply-to: m.bane@cs.man.ac.uk
X-Mailer: ELM [version 2.4ME+ PL66 (25)]
MIME-Version: 1.0
Content-Type: text/plain; charset=US-ASCII
Content-Transfer-Encoding: 7bit
Message-Id:
Sender: Michael Bane
Does a (parallel) sparse Mv "BLAS" yet exist for the SGI Origin2000?
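(By "sparse Mv" I mean the usual y = A*x kernel. A minimal sketch of
what I am after -- CSR storage, one OpenMP worksharing loop, nothing
like a tuned library routine -- would be:

/* y = A*x with A in compressed sparse row (CSR) form.  The matrix
   has n rows; rowptr[i]..rowptr[i+1]-1 index the nonzeros of row i. */
void csr_matvec(int n, const int *rowptr, const int *colind,
                const double *val, const double *x, double *y)
{
    int i, j;
#pragma omp parallel for private(j) schedule(static)
    for (i = 0; i < n; i++) {
        double sum = 0.0;
        for (j = rowptr[i]; j < rowptr[i+1]; j++)
            sum += val[j] * x[colind[j]];
        y[i] = sum;
    }
}

ideally with tuning for the O2K memory hierarchy on top of that.)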
Thanks in advance.
- --- ----- ------- ----- --- -
Michael Bane
Centre for Novel Computing
University of Manchester
Tel: 0161 275 6134
http://www.cs.man.ac.uk/~bane
UK OpenMP website:
http://www.mcc.ac.uk/hpc/OpenMP
- --- ----- ------- ----- --- -
From postmaster@cs.utk.edu Sun Mar 4 01:34:59 2001
Return-Path:
Received: from cs.utk.edu (LOCALHOST.cs.utk.edu [127.0.0.1])
by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id BAA19641; Sun, 4 Mar 2001 01:34:59 -0500
Received: from cs.utk.edu (160.36.56.56 -> cs.utk.edu)
by netlib2.cs.utk.edu (smtpshim v1.0); Sun, 4 Mar 2001 01:34:59 -0500
Received: from pop509-ec.mail.com (marvin@localhost)
by cs.utk.edu with ESMTP (cf v2.9s-UTK)
id BAA13395; Sun, 4 Mar 2001 01:34:58 -0500 (EST)
Received: from pop509-ec.mail.com (165.251.32.56 -> pop509-ec.mail.com)
by cs.utk.edu (smtpshim v1.0); Sun, 4 Mar 2001 01:34:59 -0500
Received: from data (unknown [62.98.152.178])
by pop509-ec.mail.com (Postfix) with SMTP
id 2800A154C62; Sun, 4 Mar 2001 01:29:23 -0500 (EST)
From: "Nancy Howard"
Subject: RE: about britney spears
Message-Id: <20010304062923.2800A154C62@pop509-ec.mail.com>
Date: Sun, 4 Mar 2001 01:29:23 -0500 (EST)
Apparently-To:
Apparently-To:
Apparently-To:
Apparently-To:
Apparently-To:
Hello, I thinl personally that britney spears did some hard movie in her past
I know this for sure.. just check by yourself this and let me know,.... You will see
I don't say bullshits check out at http://www.sex4many.com
From postmaster@cs.utk.edu Wed Mar 14 23:23:41 2001
Return-Path:
Received: from cs.utk.edu (localhost.cs.utk.edu [127.0.0.1])
by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id XAA18458; Wed, 14 Mar 2001 23:23:40 -0500
Received: from cs.utk.edu (160.36.56.56 -> cs.utk.edu)
by netlib2.cs.utk.edu (smtpshim v1.0); Wed, 14 Mar 2001 23:23:40 -0500
Received: from project (marvin@localhost)
by cs.utk.edu with ESMTP (cf v2.9s-UTK)
id XAA08211; Wed, 14 Mar 2001 23:23:41 -0500 (EST)
From:
Received: from project (211.104.88.203)
by cs.utk.edu (smtpshim v1.0); Wed, 14 Mar 2001 23:23:41 -0500
Received: by project id GAA507592; Thu, 15 Mar 2001 06:17:29 +0900 (KST)
To: blast-parallel@cs.utk.edu
Subject: FREE Biotech Stock Info! 156
Date: Wed, 14 Mar 2001 16:17:10
Message-Id: <345.789452.450103@hotmail.com>
Reply-To: biotechinfo2007@yahoo.com
Mime-Version: 1.0
Content-Type: text/html; charset="us-ascii"
Do you want to capitalize on the Biotech Revolution

Do
you want to capitalize on the Biotech Revolution? Would you like to add
groundbreaking biotech, pharmaceutical and medical device companies to your
portfolio mix? Does hearing about exciting IPO and private placement offerings
from life sciences companies interest you?
The
exclusive Ruddy-Carlisle Biotech Infoline service keeps you abreast of
investment opportunities in the life sciences space. Just sign up for it once
and get important information instantly delivered to study at your leisure. Our
service is 100% FREE! Sign
up!
Ruddy-Carlisle
Biotech Infoline:
- Instantly
delivers key life sciences investment information directly to you!
- Learn
about biotech, pharmaceutical & medical device investment opportunities
before others!
- Includes
IPO & private placement information!
- 100%
FREE!
For
the entire last decade there were only three profitable biotech companies. At
the end of this year, ten are projected. At the end of 2003, over forty
are projected! The genomic promise is about to be delivered and investors know
it. The Ruddy-Carlisle Biotech Infoline provides you with critical,
decision-making, information that aids the chance of investment success in this
lucrative space. Sign
up!
Please
Note- Your information will only be
shared with companies that are in the life sciences space and pass our
rigorous inspection. Only the best opportunities will come to you.
Ruddy-Carlisle respects your privacy. Sign
up!
List Removal Instructions- Simply click here: remove
to be instantly and permanently removed from our list. Send the blank email to
the address specified. Please do not try to reply to this message.
From postmaster@cs.utk.edu Thu Apr 12 01:16:11 2001
Return-Path:
Received: from cs.utk.edu (localhost.cs.utk.edu [127.0.0.1])
by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id BAA26309; Thu, 12 Apr 2001 01:16:11 -0400
Received: from cs.utk.edu (160.36.56.56 -> cs.utk.edu)
by netlib2.cs.utk.edu (smtpshim v1.0); Thu, 12 Apr 2001 01:16:11 -0400
Received: from yahoo.com (marvin@localhost)
by cs.utk.edu with SMTP (cf v2.9s-UTK)
id BAA19021; Thu, 12 Apr 2001 01:16:08 -0400 (EDT)
Received: from yahoo.com (216.112.165.60 -> ts006d48.scr-pa.concentric.net)
by cs.utk.edu (smtpshim v1.0); Thu, 12 Apr 2001 01:16:09 -0400
From:
To: blast-parallel@cs.utk.edu
Subject: 68. (ADV) FREE GIFT LOCATOR!!!
Date: Thu, 12 Apr 2001 01:14:43
Message-Id: <399.814166.704377@yahoo.com>
Reply-To: christmaseveryday2001@yahoo.com
Mime-Version: 1.0
Content-Type: text/html; charset="us-ascii"
What
What?
Christmas Shopping!!!
The
two reasons to start thinking about Christmas gifts so soon are:
-
Great
Deals
-
Original
Gifts
That's
right ladies (ok...and gentlemen), the best deals come to the early
shopper. And if you are going to find that unique gift for a special
person, you can't start looking to soon. I know...I am not telling you anything
that you don't already know. I'm sure that many of you already have those after
Christmas deals packed away for next year.
That's
exactly the point...
Wouldn't
It be great to get occasional email notifications about huge Christmas
bargains, or unique handmade gifts that you can't find in the mall.
I
love Christmas...
Christmas
is a wonderful time, especially when you have found that perfect gift for
your loved ones. Sometimes it's difficult shopping for so many family members
and friends.
Throughout
the year I come across both individual craftspeople and companies that
are trying to reach customers. I have seen some wonderful products, and sadly
some truly unique gifts that have never made it to market. So this year I
am hoping to spread the word, make some friends and become a part of your
holiday celebration.
What
you get...
Just
click here and put "yes" in the
subject line to start your service now! You will receive a confirmation
email within 24 - 48 hrs with detailed information on how to use this service.
Please
Note: Only use these links to
subscribe or unsubscribe. DO NOT use the Reply function of your email
program. This will help me more efficiently handle your request.
To
unsubscribe click here. You
do not need to put anything in the subject line to unsubscribe.
By
subscribing to this service you agree to receive a confirmation email with
detailed instruction on how to use this service, a bi-monthly newsletter, and
periodic email notifications about available products. You will always have
the option to easily remove yourself from any future mailing. You will never
be asked to pay anything for any of the above mentioned services.
From memorytogo@memorytogo4.com Wed Oct 17 18:20:55 2001
Return-Path:
Received: from cs.utk.edu ([127.0.0.1])
by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id SAA25302; Wed, 17 Oct 2001 18:20:55 -0400
From:
Received: from cs.utk.edu (160.36.56.56 -> cs.cs.utk.edu)
by netlib2.cs.utk.edu (smtpshim v1.0); Wed, 17 Oct 2001 18:20:55 -0400
Received: from pltn13.pbi.net (marvin@localhost)
by cs.utk.edu with ESMTP (cf v2.9s-UTK)
id SAA07862; Wed, 17 Oct 2001 18:20:56 -0400 (EDT)
Received: from pltn13.pbi.net (64.164.98.8 -> mta7.pltn13.pbi.net)
by cs.utk.edu (smtpshim v1.0); Wed, 17 Oct 2001 18:20:56 -0400
Received: from sales@mail.voyager.com ([216.100.177.30])
by mta7.pltn13.pbi.net (iPlanet Messaging Server 5.1 (built May 7 2001))
with SMTP id <0GLD004M0FCGUF@mta7.pltn13.pbi.net>; Wed,
17 Oct 2001 15:20:51 -0700 (PDT)
Date: Wed, 17 Oct 2001 15:20:48 -0700 (PDT)
Date-warning: Date header was inserted by mta7.pltn13.pbi.net
Subject: **SPECIAL**COMPUTER MEMORY
Reply-to: memorytogo@memorytogo4.com
Message-id: <0GLD004SFFEAUF@mta7.pltn13.pbi.net>
MIME-version: 1.0
Content-type: text/html; charset=us-ascii
Content-transfer-encoding: 7BIT
Memory to Go - Specials - October
Dear Customer,
Memory to Go carries memory for
all systems. We offer a huge selection of memory for
both PC and MAC computers, printers, Cams and other devices. Just check
out a few of the outstanding prices below on memory and upgrade
processors for the MAC. Can't find something you're looking for? Call us
TOLL FREE at (877) 308-9800 for a fast, free quote.
|
MEMORY
PB TITANIUM iBOOK
512MB-$150 256MB-$60 128MB-$29
NEW PM G4
512MB-$115 256MB-$58
KINGSTON
PC133 256MB-$80 128MB-$40
DELL INSPIRON 3800,7500,4000, 5000,5000E,8000
32MB-$50 16MB-$26
Compaq Armada M300,E500,M700 E700 notebook
128MB-$35 256MB-$108
POWERBOOK G3 WALL
ST. 128MB-$29 256MB-$54
4 PORT USB
HUB $18
Rambus
PC133
PC100
Flash
SIMM
Dimm

|
|
Cresendo/G3
400MHz PM6100 To 8100 $290 |
Cresendo/G3
350MHz PM7300 TO 9600 $174 |
Boost your computer's performance up to thirteen times as
fast as the original system! The Crescendo 7200 G3 incorporates a G3
PowerPC processor and ultra high-speed Level 2 backside cache to
achieve truly modern performance levels.
|
Gain G3
performance for your "upgrade-challenged" Power Macintosh. The
Crescendo utilizes a G4 PowerPC processor that can take advantage of
AltiVec-enhanced applications for even more impressive performance
gains.
|
|
Cresendo/PBG3
333MHz PB1400 $299 |
Encore/ZIF
G3 500MHz PMG3 $349 |
Boost your computer's performance up to thirteen times as
fast as the original system! The Crescendo 7200 G3 incorporates a G3
PowerPC processor and ultra high-speed Level 2 backside cache to
achieve truly modern performance levels.
|
Gain G3
performance for your "upgrade-challenged" Power Macintosh. The
Crescendo utilizes a G4 PowerPC processor that can take advantage of
AltiVec-enhanced applications for even more impressive performance
gains.
|
|

Flash
Cards - as low as $50! |

Rambus
Memory as low as $36! |
Complies with CompactFlashTM specification 1.4. Compatible
with PC Card ATA standard. NAND type or AND type flash memory.
Minimum 1,000,000 erase cycles. Minimum 10,000 Insertions. Low power
consumption and automatic power saving.
Memory to Go (310) 385-7373 Toll
Free: (877) 308-9800 Fax: (310) 385-9111
| Cutting-edge
technology. Largest bandwith For 820, 840 & 850 Chipset (Pentium
4) Lifetime Warranty. Enables data rates of 800 Mbits per second
(two bits transferred per each clock cycle, at the leading and the
trailing edge of the clock.) Aka: RIMM or RDRAM.
Memory to Go
respects your right to privacy. This email was sent to you because
you are a regular MTG customer. To unsubscribe to this service, click here and enter
Unsubscribe as the subject of your
email. | | |
From postmaster@cs.utk.edu Mon Jan 14 16:25:53 2002
Return-Path:
Received: from cs.utk.edu (LOCALHOST.cs.utk.edu [127.0.0.1])
by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id QAA11210; Mon, 14 Jan 2002 16:25:52 -0500
Received: from cs.utk.edu (160.36.56.56 -> cs.utk.edu)
by netlib2.cs.utk.edu (smtpshim v1.0); Mon, 14 Jan 2002 16:25:52 -0500
Received: from boy (marvin@localhost)
by cs.utk.edu with SMTP (cf v2.9s-UTK)
id QAA22817; Mon, 14 Jan 2002 16:25:51 -0500 (EST)
Message-Id: <200201142125.QAA22817@cs.utk.edu>
Received: from boy (211.202.71.98)
by cs.utk.edu (smtpshim v1.0); Mon, 14 Jan 2002 16:25:52 -0500
From: =?ks_c_5601-1987?B?vsbAzLX7tfu1+w==?=
To: blast-parallel@cs.utk.edu
Subject: =?ks_c_5601-1987?B?W8irurhdILPXxrzB8MDMILi4tecgsMu79r+jwfggvsbAzLX7tfu1+8DUtM+02S4=?=
Date: Tue, 15 Jan 2002 06:24:21 +0900
MIME-Version: 1.0
Content-Type: multipart/alternative;
boundary="----=_NextPart_000_0067_01C0F24A.93A21C00"
X-Priority: 3
X-MSMail-Priority: Normal
X-Mailer: Microsoft Outlook Express 6.00.2600.0000
X-MimeOLE: Produced By Microsoft MimeOLE V6.00.2600.0000
This is a multi-part message in MIME format.
------=_NextPart_000_0067_01C0F24A.93A21C00
Content-Type: text/plain;
charset="ks_c_5601-1987"
Content-Transfer-Encoding: base64
sMu79r+jwfggvsbAzLX7tfu1+yAgICAgICAgICAgICAgICAgICAgICAgIMDOxc2z3b+htMIg
uLnAuiDBpLq4v80gsdcgwaS6uLimIMOjvsbB1rTCILDLu/a/o8H4wMwgwNa9wLTPtNkuDQog
IMfPwfa4uCCwy7v2v6PB+LXpwMwgs8q5q7OqILi5wLogwaS6uLimIMGmsPjH2CDB1rTCILDh
sPogv8DI97fBIMGkuri4piDDo7TCtaUguLnAuiCz67fCsPogvcOwo8C7DQogICDH47rxx8+0
wiCw4bD6uKYgw8q3ocfPsO0gwNa9wLTPtNkuDQoNCiAgICAgIMDMwaa0wiC+8sC7ILz2IL74
tMIguLnAuiC+58DHILDLu/aw4bD6uri02bTCIL3Ft9q8uiDA1rTCIMGkuri4piC/5LG4x8+0
wiC9w7TrsKEgtce++r3AtM+02S4NCiAgIMDOxc2z3b+hILvqwOfH2CDA1rTCILvnwMzGriDB
37+htMIgv+y4rrChILLAIMfKv+TH0SDBpLq4tenAuyC047DtIMDWtMIgu+fAzMausKEguLnA
zCDA1rTCtaUsDQogICDAzCC758DMxq4gtenAuyCxuLrQx8+46SDG98W7LLq4xbssx+O66iC7
58DMxq6287DtIMfVtM+02S4NCg0KICAgICAgvsbAzLX7tfu1+7TCIMDMt7Egu+fAzMauuKYg
w6O+xsHWtMIgxKvF17DtuK4gudcgxbC/9rXlILDLu/a/o8H4wNS0z7TZLg0KICAgsKIgxKvF
17DtuK4gurC3ziC9xbfavLogwNa0wiC+9ryxtcggu+fAzMauuLggs9fGvMHwwMcgvue9ycC4
t8617rfPsPy4rsfPtMIgsMu79r+jwfjAzLjnDQogILHNx8+ysryttbUgxKvF17DtuK4gtOO0
58DasKEgtce9xyC89iDA1r3AtM+02S4NCg0KICDEq8XXsO24riC047TnwMwgtce9w7jpIKLf
vsbAzL+jwKXAxyDB1r3EIDHB1rimILmru/PAuLfOILXluK645yBwb3AzIGUtbWFpbCCw6MGk
wLsgteW4s7TPtNkuDQogICAoIr+5IiBhYmNAaXd3dy5uZXQiKQ0KICAgtsfH0Syx1yDEq8XX
sO24rrimILD8uK7H0iC89iDA1rTCILHHx9Gw+iDH2LTnIMSrxdew7biuv6EgtOO058DaIL7G
wMy18LimILXut8/H1bTPtNkuDQogICAote63z73Fw7vAuyDHz73FyMQgtOO057D8uK7A2rfO
IGxvZ2luIMfPvcO46SDEq8XXsO24rrimIMH3waIgsPy4rsfPvccgvPYgwNa9wLTPtNkuKQ0K
ICAgyLi/+LChwNTAuyDHz73DsO0gyLi/+MDMILXHvcO46SCi3yC+xsDMv6PApcDHIMHWvcQg
McHWuKYguau788C4t84gteW4s7TPtNkuDQogICDAzsXNs93AuiCz18a8wfDAzCDB1sDOwMyw
7SC+xsDMtfu1+7X7tMIgs9fGvMHwwMcgsM3AzLHiILanua7A1LTPtNkhIA0KDQogICAgaHR0
cDovL2l3d3cubmV0ICi+xsDMtfu1+7X7KbfOILnmua7H2CDB1ry8v+QgICAgIL7GwMy1+7X7
tfvAxyDAzLPkwLogv+y4riCz18a8wfDAzCCwrrDtwNa0wiDAr8DNx9EgwaS6uLimILytt84g
sPjAr8fPsO0gu/W3zr/uILPXxrzB8LmuyK24pg0KICAgw6LD4sfPtMKwzcDUtM+02S4gsc3H
z7KyvK21tSC+xsDMtfu1+7X7wMcgx9EgsKHBt8DMILXHvu7B1r3DseYgus7FubXluLO0z7TZ
Lg0KDQogILTDILDHsK3Hz73DsO0gx+C6ucfPvLy/5H5+frCou+fH1bTPtNkuDQoNCiAgICAg
ICAgICAgICAgICAgIDHAzyDG8rHVILnmua4gICA3NzQsNTAwIGhpdCgyMDAyLjAxLjA3KSAg
ICAgICAgICAgILPXxrzB8CC047TnIMSrxdew7biuICAgNzQ1ILCzICAgICAgIA0KICDB98Gi
ILnmua7Hz7zFvK0gxvKwocfYIMHWvcq9w7/AISA9PT09PT0+aHR0cDovL2l3d3cubmV0wK/A
zcfRILvnwMzGrrbzsO0gxvKwobXHvcO46SANCiAgwdbAp7rQtem/obDUIL7Lt8HB1r3Dsea5
2bb4tM+02S4gKCC+xsDMtfu1+7X7ID0gaXd3dyApDQoNCiAgICAgICAgICANCiAgsc3Hz7Ky
ILrSxu3AuyCzosPEILXlt8i02bjpIL/rvK24piC52bb4tM+02S4NCiAgsc3Hz8DHILjewM/A
uiDAzsXNs92/obytIMClvK3HzsHfIMPrtebHz7+0wLi45yCxzcfPwMcgvu62sMfRIMGkuri1
tSCwrrDtwNbB9iC+yr3AtM+02S4NCg0KICC02cC9us7FzbTCIMDOxc2z3SzBpLq4xeu9xSy5
2cDMt6+9urnpvcUgte4gwK/AzcfRIMGkuri4uMC7ILq4s7u15biztM+02S4gDQogIL7GwMy1
+7X7tfvAxyCwocG3wMwgtce9w7jpIMD8w7ywocG3ILjewM/AuyDF68fPv6kgwK/AzcfRIMGk
uri4piC53r7Guri9xyC89iDA1r3AtM+02S4NCiAgsPjB9rvnx9fAuyDC/LDtx8+9w7jpIL7G
wMy1+7X7tfsgs7u6zrvnwaTAuyC+xr3HILz2IMDWvcC0z7TZLiC52bfOsKG8rSC6uLHiDQog
ILPXxrzB8MDHILDtsN/AuyC89rfFx8+0wiCw+LCzsNS9w8bHwLsgv+6/tcHfwNS0z7TZLrnZ
t86wobytILq4seINCiAgILHXt6G1tSC89r3FwLsgv/jEoSC+ysC4vccgsOa/7CC89r3FsMW6
zrimIMWsuK/Hz73KvcO/wCG89r3FsMW6zg0KDQogICAgICAgICAg
------=_NextPart_000_0067_01C0F24A.93A21C00
Content-Type: text/html;
charset="ks_c_5601-1987"
Content-Transfer-Encoding: base64
DQo8aHRtbD4NCjxoZWFkPg0KPHRpdGxlPrDLu/a/o8H4IL7GwMy1+7X7tfs8L3RpdGxlPg0K
PHgtbWV0YSBodHRwLWVxdWl2PSJDb250ZW50LVR5cGUiIGNvbnRlbnQ9InRleHQvaHRtbDsg
Y2hhcnNldD1ldWMta3IiPg0KPC9oZWFkPg0KPHN0eWxlIHR5cGU9InRleHQvY3NzIj4NCjwh
LS0NCkE6bGluaywgQTphY3RpdmUsIEE6dmlzaXRlZCB7DQpmb250LXNpemU6IDlwdDsNCmNv
bG9yOiByZWQ7DQp0ZXh0LWRlY29yYXRpb246IG5vbmU7DQp9DQpBOmhvdmVyIHsgDQpmb250
LXNpemU6IDlwdDsNCmNvbG9yOjAwMDAwMDsNCnRleHQtZGVjb3JhdGlvbjogdW5kZXJsaW5l
Ow0KfQ0KVEQgew0KZm9udC1mYW1pbHk6ILG8uLI7DQpmb250LXNpemU6IDlwdDsNCmNvbG9y
OiAwMDAwMDA7DQp9DQotLT4NCjwvc3R5bGU+DQo8dGFibGUgd2lkdGg9MTAwJSAgYmdjb2xv
cj0iI0I0QjRCNCIgdGV4dD0iIzAwMDAwMCI+PHRkIHZhbGlnbj10b3A+DQo8dGFibGUgd2lk
dGg9IjYxOSIgYm9yZGVyPSIwIiBjZWxsc3BhY2luZz0iMCIgY2VsbHBhZGRpbmc9IjAiIGFs
aWduPSJjZW50ZXIiIA0KYmFja2dyb3VuZD0iaHR0cDovL2l3d3cuY28ua3IvaXd3d19pbmZv
L2JhY2suZ2lmIj4NCiAgPHRyPiANCiAgICA8dGQgYWxpZ249ImNlbnRlciIgdmFsaWduPSJ0
b3AiIHdpZHRoPSIxMSI+PGltZyBzcmM9Imh0dHA6Ly9pd3d3LmNvLmtyL2l3d3dfaW5mby9s
ZWZ0LmdpZiIgd2lkdGg9IjExIiANCmhlaWdodD0iMjA5Ij48L3RkPg0KICAgIDx0ZCBhbGln
bj0iY2VudGVyIiB2YWxpZ249InRvcCIgd2lkdGg9IjU4NiI+IA0KICAgICAgPHRhYmxlIHdp
ZHRoPSIxMDAlIiBib3JkZXI9IjAiIGNlbGxzcGFjaW5nPSIwIiBjZWxscGFkZGluZz0iMCIg
DQpiYWNrZ3JvdW5kPSJodHRwOi8vaXd3dy5jby5rci9pd3d3X2luZm8vY19iYWNrLmdpZiI+
DQogICAgICAgIDx0cj4gDQogICAgICAgICAgPHRkPjxhIGhyZWY9Imh0dHA6Ly9pd3d3Lm5l
dCIgdGFyZ2V0PSJfYmxhbmsiPjxpbWcgDQpzcmM9Imh0dHA6Ly9pd3d3LmNvLmtyL2l3d3df
aW5mby9sb2dvLmdpZiIgd2lkdGg9IjU4NiIgaGVpZ2h0PSI0MCIgYm9yZGVyPSIwIj48L2E+
PC90ZD4NCiAgICAgICAgPC90cj4NCiAgICAgICAgPHRyPiANCiAgICAgICAgICA8dGQ+PGlt
ZyBzcmM9Imh0dHA6Ly9pd3d3LmNvLmtyL2l3d3dfaW5mby9sb2dvX3R4dC5naWYiIHdpZHRo
PSI1ODYiIGhlaWdodD0iMTAwIj48L3RkPg0KICAgICAgICA8L3RyPg0KICAgICAgICA8dHI+
IA0KICAgICAgICAgIDx0ZD4gDQogICAgICAgICAgICA8dGFibGUgd2lkdGg9IjEwMCUiIGNl
bGxzcGFjaW5nPSIwIiBjZWxscGFkZGluZz0iMCIgYm9yZGVyPSIxIiBib3JkZXJjb2xvcj0i
IzYxOTRERCIgYm9yZGVyY29sb3JkYXJrPSIjRkZGRkZGIj4NCiAgICAgICAgICAgICAgPHRy
PiANCiAgICAgICAgICAgICAgICA8dGQgd2lkdGg9IjEwMCUiIGFsaWduPSJjZW50ZXIiIGhl
aWdodD0iNjkiPiANCiAgICAgICAgICAgICAgICAgIDx0YWJsZSBiZ2NvbG9yPSIjOThCMUQx
IiB3aWR0aD0iMTAwJSIgYm9yZGVyPSIwIiBjZWxsc3BhY2luZz0iMCIgY2VsbHBhZGRpbmc9
IjAiPg0KICAgICAgICAgICAgICAgICAgICA8dHI+IA0KICAgICAgICAgICAgICAgICAgICAg
IDx0ZCB3aWR0aD0iOTYlIj4mbmJzcDs8L3RkPg0KICAgICAgICAgICAgICAgICAgICA8L3Ry
Pg0KICAgICAgICAgICAgICAgICAgICA8dHI+IA0KICAgICAgICAgICAgICAgICAgICAgIDx0
ZCB3aWR0aD0iOTYlIiBoZWlnaHQ9IjE4Ij4mbmJzcDsmbmJzcDvAzsXNs92/obTCILi5wLog
waS6uL/NILHXIMGkuri4piDDo77Gwda0wiCwy7v2v6PB+MDMIMDWvcC0z7TZLjxicj4NCiZu
YnNwOyZuYnNwO8fPwfa4uCCwy7v2v6PB+LXpwMwgs8q5q7OqILi5wLogwaS6uLimIMGmsPjH
2CDB1rTCILDhsPogv8DI97fBIMGkuri4piDDo7TCtaUguLnAuiCz67fCsPogvcOwo8C7PGJy
PiAgDQombmJzcDsmbmJzcDvH47rxx8+0wiCw4bD6uKYgw8q3ocfPsO0gwNa9wLTPtNkuPGJy
Pjxicj4gDQo8L3RkPg0KICAgICAgICAgICAgICAgICAgICA8L3RyPg0KICAgICAgICAgICAg
ICAgICAgICA8dHI+IA0KICAgICAgICAgICAgICAgICAgICAgIDx0ZCB3aWR0aD0iOTYlIiBo
ZWlnaHQ9IjE4Ij4NCiZuYnNwOyZuYnNwO8DMwaa0wiC+8sC7ILz2IL74tMIguLnAuiC+58DH
ILDLu/aw4bD6uri02bTCIL3Ft9q8uiDA1rTCIMGkuri4piC/5LG4x8+0wiC9w7TrsKEgtce+
+r3AtM+02S48YnI+ICANCiZuYnNwOyZuYnNwO8DOxc2z3b+hILvqwOfH2CDA1rTCILvnwMzG
riDB37+htMIgv+y4rrChILLAIMfKv+TH0SDBpLq4tenAuyC047DtIMDWtMIgu+fAzMausKEg
uLnAzCDA1rTCtaUsPGJyPiAgIA0KJm5ic3A7Jm5ic3A7wMwgu+fAzMauILXpwLsgsbi60MfP
uOkgPGEgaHJlZj0iaHR0cDovL2l3d3cubmV0L3BvcnRhbC5odG1sIiB0YXJnZXQ9Il9ibGFu
ayI+xvfFuyy6uMW7LMfjuuo8L2E+ILvnwMzGrrbzsO0gx9W0z7TZLjxicj48YnI+ICANCjwv
dGQ+DQogICAgICAgICAgICAgICAgICAgIDwvdHI+DQogICAgICAgICAgICAgICAgICAgIDx0
cj4gDQogICAgICAgICAgICAgICAgICAgICAgPHRkIHdpZHRoPSI5NiUiPg0KJm5ic3A7Jm5i
c3A7vsbAzLX7tfu1+7TCIMDMt7Egu+fAzMauuKYgw6O+xsHWtMIgxKvF17DtuK4gudcgxbC/
9rXlILDLu/a/o8H4wNS0z7TZLjxicj4gIA0KJm5ic3A7Jm5ic3A7sKIgxKvF17DtuK4gurC3
ziC9xbfavLogwNa0wiC+9ryxtcggu+fAzMauuLggs9fGvMHwwMcgvue9ycC4t84NCjxhIGhy
ZWY9Imh0dHA6Ly9pd3d3Lm5ldC9kYXRhL2NhdF9saXN0Lmh0bWwiIHRhcmdldD0iX2JsYW5r
Ij617rfPsPy4rjwvYT7Hz7TCILDLu/a/o8H4wMy45zxicj4NCiZuYnNwOyZuYnNwO7HNx8+y
sryttbUgxKvF17DtuK4gtOO058DasKEgtce9xyC89iDA1r3AtM+02S48YnI+PGJyPg0KDQoN
CiZuYnNwOyZuYnNwOzxhIGhyZWY9Imh0dHA6Ly9pd3d3Lm5ldC9teV9zaWdudXAuaHRtbCIg
dGFyZ2V0PSJfYmxhbmsiPsSrxdew7biuILTjtOc8L2E+wMwgtce9w7jpIKLfvsbAzL+jwKXA
xyDB1r3EIDHB1rimILmru/PAuLfOILXluK645yBwb3AzIGUtbWFpbCCw6MGkwLsgteW4s7TP
tNkuPGJyPiAgDQombmJzcDsmbmJzcDsoIr+5IiBhYmNAaXd3dy5uZXQiKTxicj4gICANCiZu
YnNwOyZuYnNwO7bHx9EssdcgxKvF17DtuK64piCw/Liux9IgvPYgwNa0wiCxx8fRsPogx9i0
5yDEq8XXsO24rr+hILTjtOfA2iC+xsDMtfC4piC17rfPx9W0z7TZLjxicj4gICANCiZuYnNw
OyZuYnNwOyi17rfPvcXDu8C7IMfPvcXIxCC047TnsPy4rsDat84gbG9naW4gx8+9w7jpIMSr
xdew7biuuKYgwffBoiCw/Liux8+9xyC89iDA1r3AtM+02S4pPGJyPiAgIA0KDQombmJzcDsm
bmJzcDs8YSBocmVmPSJodHRwOi8vaXd3dy5uZXQvc2lnbnVwLmh0bWwiIHRhcmdldD0iX2Js
YW5rIj7IuL/4sKHA1DwvYT7AuyDHz73DsO0gyLi/+MDMILXHvcO46SCi3yC+xsDMv6PApcDH
IMHWvcQgMcHWuKYguau788C4t84gteW4s7TPtNkuPGJyPiAgDQombmJzcDsmbmJzcDvAzsXN
s93AuiCz18a8wfDAzCDB1sDOwMyw7SC+xsDMtfu1+7X7tMIgs9fGvMHwwMcgsM3AzLHiILan
ua7A1LTPtNkhIDxicj4gIDxicj4gIA0KDQoNCg0KPC90ZD48L3RyPg0KIDx0cj4gDQogICAg
ICAgICAgICAgICAgICAgICAgPHRkIHdpZHRoPSI5NiUiIGhlaWdodD0iMTkiPjxiPjxmb250
IGNvbG9yPSIjRkY5OTAwIj4NCiZuYnNwOyZuYnNwOzxhIGhyZWY9Imh0dHA6Ly9pd3d3Lm5l
dCIgdGFyZ2V0PSJfYmxhbmsiPmh0dHA6Ly9pd3d3Lm5ldDwvYT48L2ZvbnQ+PC9iPiAovsbA
zLX7tfu1+ym3ziC55rmux9ggwda8vL/kPC90ZD4NCiAgICAgICAgICAgICAgICAgICAgPC90
cj4NCiA8dHI+IA0KICAgICAgICAgICAgICAgICAgICAgIDx0ZCB3aWR0aD0iOTYlIiBoZWln
aHQ9IjE5Ij4NCiZuYnNwOyZuYnNwO77GwMy1+7X7tfvAxyDAzLPkwLogv+y4riCz18a8wfDA
zCCwrrDtwNa0wiDAr8DNx9EgwaS6uLimILytt84gsPjAr8fPsO0gu/W3zr/uILPXxrzB8Lmu
yK24pjxicj4gDQombmJzcDsmbmJzcDvDosPix8+0wrDNwNS0z7TZLiCxzcfPsrK8rbW1IL7G
wMy1+7X7tfvAxyDH0SCwocG3wMwgtce+7sHWvcOx5iC6zsW5teW4s7TPtNkuPGJyPjxicj4N
Cg0KJm5ic3A7Jm5ic3A7tMMgsMewrcfPvcOw7SDH4Lq5x8+8vL/kfn5+sKi758fVtM+02S48
YnI+PGJyPg0KDQoNCjwvdGQ+DQogICAgICAgICAgICAgICAgICAgIDwvdHI+DQogICAgICAg
ICAgICAgICAgICA8L3RhYmxlPg0KICAgICAgICAgICAgICAgIDwvdGQ+DQogICAgICAgICAg
ICAgICAgDQogICAgICAgICAgICAgIDwvdHI+DQogICAgICAgICAgICAgIDx0cj4gDQogICAg
ICAgICAgICAgICAgPHRkIHdpZHRoPSIxMDAlIiBhbGlnbj0iY2VudGVyIiB2YWxpZ249InRv
cCI+IA0KICAgICAgICAgICAgICAgICAgICA8dGFibGUgd2lkdGg9IjEwMCUiIGJvcmRlcj0i
MCIgY2VsbHNwYWNpbmc9IjAiIGNlbGxwYWRkaW5nPSIwIj4NCiAgICAgICAgICAgICAgICAg
ICAgPHRyIGJnY29sb3I9IiMwMDAwMDAiPiANCiAgICAgICAgICAgICAgICAgICAgICA8dGQg
d2lkdGg9IjclIiBoZWlnaHQ9IjEiPjwvdGQ+DQogICAgICAgICAgICAgICAgICAgICAgPHRk
IHdpZHRoPSI0MyUiIGhlaWdodD0iMSI+PC90ZD4NCiAgICAgICAgICAgICAgICAgICAgICA8
dGQgd2lkdGg9IjQlIiBoZWlnaHQ9IjEiPjwvdGQ+DQogICAgICAgICAgICAgICAgICAgICAg
PHRkIHdpZHRoPSI0NiUiIGhlaWdodD0iMSI+PC90ZD4NCiAgICAgICAgICAgICAgICAgICAg
PC90cj4NCiAgICAgICAgICAgICAgICAgICAgPHRyPiANCiAgICAgICAgICAgICAgICAgICAg
ICA8dGQgd2lkdGg9IjclIiBoZWlnaHQ9IjIwIiBiZ2NvbG9yPSIjOTlDQ0NDIj4mbmJzcDs8
L3RkPg0KICAgICAgICAgICAgICAgICAgICAgIDx0ZCB3aWR0aD0iNDMlIiBoZWlnaHQ9IjIw
IiBiZ2NvbG9yPSIjOTlDQ0NDIj48Yj4xwM8gxvKx1SC55rmuPC9iPjwvdGQ+DQogICAgICAg
ICAgICAgICAgICAgICAgPHRkIHdpZHRoPSI0JSIgaGVpZ2h0PSIyMCIgYmdjb2xvcj0iI2Vk
ZWRlZCI+Jm5ic3A7PC90ZD4NCiAgICAgICAgICAgICAgICAgICAgICA8dGQgd2lkdGg9IjQ2
JSIgaGVpZ2h0PSIyMCIgYmdjb2xvcj0iI2VkZWRlZCI+PGI+PGZvbnQgY29sb3I9IiM2NjY2
NjYiPjc3NCw1MDAgaGl0KDIwMDIuMDEuMDcpDQo8L2ZvbnQ+PC9iPjwvdGQ+DQogICAgICAg
ICAgICAgICAgICAgIDwvdHI+DQogICAgICAgICAgICAgICAgICAgDQogICAgICAgICAgICAg
ICAgICAgIDx0cj4gDQogICAgICAgICAgICAgICAgICAgICAgPHRkIHdpZHRoPSI3JSIgaGVp
Z2h0PSIxIiBiZ2NvbG9yPSIjMDAwMDAwIj48L3RkPg0KICAgICAgICAgICAgICAgICAgICAg
IDx0ZCB3aWR0aD0iNDMlIiBoZWlnaHQ9IjEiIGJnY29sb3I9IiMwMDAwMDAiPjwvdGQ+DQog
ICAgICAgICAgICAgICAgICAgICAgPHRkIHdpZHRoPSI0JSIgaGVpZ2h0PSIxIiBiZ2NvbG9y
PSIjMDAwMDAwIj48L3RkPg0KICAgICAgICAgICAgICAgICAgICAgIDx0ZCB3aWR0aD0iNDYl
IiBoZWlnaHQ9IjEiIGJnY29sb3I9IiMwMDAwMDAiPjwvdGQ+DQogICAgICAgICAgICAgICAg
ICAgIDwvdHI+DQogICAgICAgICAgICAgICAgICAgIDx0cj4gDQogICAgICAgICAgICAgICAg
ICAgICAgPHRkIHdpZHRoPSI3JSIgaGVpZ2h0PSIyMCIgYmdjb2xvcj0iIzk5Q0NDQyI+Jm5i
c3A7PC90ZD4NCiAgICAgICAgICAgICAgICAgICAgICA8dGQgd2lkdGg9IjQzJSIgaGVpZ2h0
PSIyMCIgYmdjb2xvcj0iIzk5Q0NDQyI+PGI+s9fGvMHwILTjtOcgxKvF17DtuK48L2I+PC90
ZD4NCiAgICAgICAgICAgICAgICAgICAgICA8dGQgd2lkdGg9IjQlIiBoZWlnaHQ9IjIwIiBi
Z2NvbG9yPSIjZWRlZGVkIj4mbmJzcDs8L3RkPg0KICAgICAgICAgICAgICAgICAgICAgIDx0
ZCB3aWR0aD0iNDYlIiBoZWlnaHQ9IjIwIiBiZ2NvbG9yPSIjZWRlZGVkIj48Yj48Zm9udCBj
b2xvcj0iIzY2NjY2NiI+DQo8YSBocmVmPSJodHRwOi8vaXd3dy5uZXQvZGF0YS9jYXRfbGlz
dC5odG1sIiB0YXJnZXQ9Il9uZXciPjc0NSCwszwvYT48L2ZvbnQ+PC9iPjwvdGQ+DQogICAg
ICAgICAgICAgICAgICAgIDwvdHI+DQogICAgICAgICAgICAgICAgICAgIA0KICAgICAgICAg
ICAgICAgICAgPC90YWJsZT4NCiAgICAgICAgICAgICAgICAgIDx0YWJsZSB3aWR0aD0iMTAw
JSIgYm9yZGVyPSIwIiBjZWxsc3BhY2luZz0iMCIgY2VsbHBhZGRpbmc9IjAiPg0KICAgICAg
ICAgICAgICAgICAgIA0KICAgICAgICAgICAgICAgICAgICA8dHI+IA0KICAgICAgICAgICAg
ICAgICAgICAgIDx0ZCBiZ2NvbG9yPSIjQ0VDRUNFIiBoZWlnaHQ9IjE4Ij48Zm9udCBjb2xv
cj0iIzAwMDA4MCI+PGJyPg0KJm5ic3A7Jm5ic3A7wffBoiC55rmux8+8xbytIMbysKHH2CDB
1r3KvcO/wCEgPT09PT09Jmd0OzxhIGhyZWY9Imh0dHA6Ly9pd3d3Lm5ldCIgdGFyZ2V0PSJf
bmV3Ij48Zm9udCBjb2xvcj0iIzA2MDZGRiI+aHR0cDovL2l3d3cubmV0PC9mb250PjwvYT4N
CsCvwM3H0SC758DMxq6287DtIMbysKG1x73DuOkgPGJyPg0KJm5ic3A7Jm5ic3A7wdbAp7rQ
tem/obDUIL7Lt8HB1r3Dsea52bb4tM+02S4gKCC+xsDMtfu1+7X7ID0gaXd3dyApPGJyPg0K
PGJyPjwvdGQ+DQogICAgICAgICAgICAgICAgICAgIDwvdHI+DQogICAgICAgICAgICAgICAg
ICA8L3RhYmxlPg0KICAgICAgICAgICAgICAgIDwvdGQ+DQogICAgICAgICAgICAgICAgIA0K
ICAgICAgICAgICAgICA8L3RyPg0KICAgICAgICAgICAgPC90YWJsZT4NCiAgICAgICAgICA8
L3RkPg0KICAgICAgICA8L3RyPg0KDQogPHRyPiANCiAgICAgICAgICA8dGQgaGVpZ2h0PSIy
MCIgYmdjb2xvcj0iIzQwNTY4MCIgYWxpZ249ImxlZnQiPjxmb250IGNvbG9yPSIjZmZmZmZm
Ij48YnI+DQoNCiZuYnNwOyZuYnNwO7HNx8+ysiC60sbtwLsgs6LDxCC15bfItNm46SC/67yt
uKYgudm2+LTPtNkuPGJyPg0KDQombmJzcDsmbmJzcDuxzcfPwMcguN7Az8C6IMDOxc2z3b+h
vK0gwKW8rcfOwd8gw+u15sfPv7TAuLjnILHNx8/AxyC+7rawx9EgwaS6uLW1ILCusO3A1sH2
IL7KvcC0z7TZLjxicj48YnI+DQombmJzcDsmbmJzcDu02cC9us7FzbTCIMDOxc2z3SzBpLq4
xeu9xSy52cDMt6+9urnpvcUgte4gwK/AzcfRIMGkuri4uMC7ILq4s7u15biztM+02S4gPGJy
Pg0KJm5ic3A7Jm5ic3A7vsbAzLX7tfu1+8DHILChwbfAzCC1x73DuOkgwPzDvLChwbcguN7A
z8C7IMXrx8+/qSDAr8DNx9EgwaS6uLimILnevsa6uL3HILz2IMDWvcC0z7TZLjxicj4NCiZu
YnNwOyZuYnNwO7D4wfa758fXwLsgwvyw7cfPvcO46SC+xsDMtfu1+7X7ILO7us6758GkwLsg
vsa9xyC89iDA1r3AtM+02S48QSBIUkVGPSJodHRwOi8vaXd3dy5uZXQvYmJzLmh0bWwiIHRh
cmdldD0iX25ldyI+DQo8Zm9udCBjb2xvcj0iYmx1ZSI+ILnZt86wobytILq4seI8L2ZvbnQ+
PC9hPjxicj4NCiZuYnNwOyZuYnNwO7PXxrzB8MDHILDtsN/AuyC89rfFx8+0wiCw+LCzsNS9
w8bHwLsgv+6/tcHfwNS0z7TZLg0KPGEgaHJlZj0iaHR0cDovL2l3d3cubmV0L3d3d2IvQ3Jh
enlXV1dCb2FyZC5jZ2k/ZGI9Ym9hcmQxIiB0YXJnZXQ9Il9uZXciPjxmb250IGNvbG9yPSJi
bHVlIj652bfOsKG8rSC6uLHiPC9mb250PjwvYT48YnI+DQogIA0KJm5ic3A7Jm5ic3A7sde3
obW1ILz2vcXAuyC/+MShIL7KwLi9xyCw5r/sILz2vcWwxbrOuKYgxay4r8fPvcq9w7/AITwv
Zm9udD4NCjxBIEhSRUY9bWFpbHRvOml3ZWJtYXN0ZXJAaXd3dy5uZXQ/c3ViamVjdD289r3F
sMW6ziZib2R5PbjewM+89r3FsMW6zj4NCjxmb250IGNvbG9yPSJibHVlIj48Yj689r3FsMW6
zjwvYj48L2ZvbnQ+PC9BPjxicj48YnI+IA0KICAgICAgICAgIDwvdGQ+DQogICAgICAgIDwv
dHI+DQogICAgICA8L3RhYmxlPg0KICAgIDwvdGQ+DQogICAgPHRkIGFsaWduPSJjZW50ZXIi
IHZhbGlnbj0idG9wIiB3aWR0aD0iMjIiPjxpbWcgc3JjPSJodHRwOi8vaXd3dy5jby5rci9p
d3d3X2luZm8vcmlnaHQuZ2lmIiB3aWR0aD0iMjIiIA0KaGVpZ2h0PSIyMDkiPjwvdGQ+DQog
IDwvdHI+DQo8L3RhYmxlPg0KPHRhYmxlIHdpZHRoPSI2MTkiIGJvcmRlcj0iMCIgY2VsbHNw
YWNpbmc9IjAiIGNlbGxwYWRkaW5nPSIwIiBhbGlnbj0iY2VudGVyIj4NCiAgPHRyPg0KICAg
IDx0ZD48YSBocmVmPSJodHRwOi8vaXd3dy5uZXQiIHRhcmdldD0iX25ldyI+PGltZyBzcmM9
Imh0dHA6Ly9pd3d3Lm5ldC9pd3d3X2luZm8vYnV0dG9tMS5naWYiIA0Kd2lkdGg9IjYxOSIg
aGVpZ2h0PSI0NiIgYm9yZGVyPSIwIj48L2E+PC90ZD4NCiAgPC90cj4NCjwvdGFibGU+DQo8
L3RkPjwvdGFibGU+DQo8L2h0bWw+
------=_NextPart_000_0067_01C0F24A.93A21C00--
From postmaster@cs.utk.edu Tue Jan 15 00:25:54 2002
Return-Path:
Received: from cs.utk.edu (LOCALHOST.cs.utk.edu [127.0.0.1])
by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id AAA17742; Tue, 15 Jan 2002 00:25:53 -0500
Received: from cs.utk.edu (160.36.56.56 -> cs.utk.edu)
by netlib2.cs.utk.edu (smtpshim v1.0); Tue, 15 Jan 2002 00:25:53 -0500
Received: from yahoo.co.kr (marvin@localhost)
by cs.utk.edu with SMTP (cf v2.9s-UTK)
id AAA05904; Tue, 15 Jan 2002 00:25:52 -0500 (EST)
Message-Id: <200201150525.AAA05904@cs.utk.edu>
Received: from yahoo.co.kr (211.215.9.40)
by cs.utk.edu (smtpshim v1.0); Tue, 15 Jan 2002 00:25:53 -0500
Reply-To: kkkk5012@yahoo.co.kr
From: "1hA$@:"
To:
Subject: (A$:8) !Z E7D+ , D}D+8& C#@8<
Mime-Version: 1.0
Content-Type: text/html; charset="ks_c_5601-1987"
Date: Tue, 15 Jan 2002 14:28:11 +0900
X-User: 2.5-
Content-Transfer-Encoding: base64
X-MIME-Autoconverted: from 8bit to base64 by cs.utk.edu id AAA05906
PEhUTUw+DQo8SEVBRD4NCjxNRVRBIGNvbnRlbnQ9InRleHQvaHRtbDsgY2hhcnNldD1rc19j
XzU2MDEtMTk4NyIgaHR0cC1lcXVpdj1Db250ZW50LVR5cGU+DQo8U1RZTEU+IHAsIGZvbnQs
IHNwYW4geyBsaW5lLWhlaWdodDoxMjAlOyBtYXJnaW4tdG9wOjA7IG1hcmdpbi1ib3R0b206
MDsgfTwvU1RZTEU+DQo8L0hFQUQ+PEJPRFk+DQo8UD4oyKjG5MDMwfYgsbiw5sfPseIpJm5i
c3A7IC0tLS0tLSZndDsmbmJzcDsgPEEgDQpocmVmPSJodHRwOi8vd3d3LnNlYXJjaGNvcmVh
LmNvbSI+aHR0cDovL3d3dy5zZWFyY2hjb3JlYS5jb208L0E+PC9QPg0KPFA+Jm5ic3A7PC9Q
Pg0KPFA+odpgxbfEqyzE/cSrYCDA2r3FwMwgv/jHz73DtMIgwMy788f8wLsgw6O+xrXluLO0
z7TZodo8L1A+DQo8UD6+yLPnx8+8vL/kLsGmsKEgwaS4uyC/qby6utCw+iCzsry6utCysiDB
wcC6vNK9xCC+y7fBteW4sbKyv+ReXjwvUD4NCjxQPrOyvLq60MDMs6ogv6m8urrQwMyzqiC/
5MHyIMW3xKsgxP3Eq7imILi4s6q9w7HiIMj7teW9w8HSPzwvUD4NCjxQPsGmsKEgw9/DtcfY
ILXluK60wiC758DMxq6/oSDH0bn4ILChuri8vL/kLjwvUD4NCjxQPsDMIMi4u+e0wiDC+MfR
ILOywNoswMy727+pwNostMm3wsDWtMKzssDaLMfQurDBwcC6v6nA2iy4xbPKwcHAurOywNos
LCwsPC9QPg0KPFA+te7AuLfOILCiwNrAxyC/+MfPvcO0wiDAzLvzx/zAuyDDo77GvK0gv6yw
4cfYIMHWtMIgPC9QPg0KPFA+yLi757G4v+QsILmwt9AgsKHA1LrxtMIgvvi9wLTPtNkuPC9Q
Pg0KPFA+sde4rrDtILvnwfjAuLfOILvztOu55sDHIL7zsbzAuyDIrsDOx9K89iDA1r7uvK0s
PC9QPg0KPFA+wNq9xcDHIL/4x8+9w7TCIMDMu/PH/MC7ILLAIMOjwLi9x7z2IMDWwLi9x7Ko
v7m/5CxeXio8L1A+DQo8UD6x17iusO0gwNq8vMfRILO7v+vAuyC6uLDtvc3AuL3DuOkgPC9Q
Pg0KPFA+yKjG5MDMwfYgtem+7rzFvK0gsbiw5sfPvcO46SC1x7G4v+QsPC9QPg0KPFA+wcHA
uiDBpLq4sKEgtce8y8C4uOkgwcGw2rPXv+QsIMHxsMW/7iDHz7fnILXHvLy/5C48L1A+DQo8
UD48QlI+KMioxuTAzMH2ILG4sObHz7HiKSZuYnNwOyAtLS0tLS0mZ3Q7Jm5ic3A7IDxBIA0K
aHJlZj0iaHR0cDovL3d3dy5zZWFyY2hjb3JlYS5jb20iPmh0dHA6Ly93d3cuc2VhcmNoY29y
ZWEuY29tPC9BPjwvUD4NCjxQPiZuYnNwOzwvUD4NCjxQPiZuYnNwOzwvUD4NCjxQPjxCUj5w
cy4gx+O29L74wMwgwMy43sDPwLsgurizu7ytIMHLvNvH1bTPtNkuIMDMuN7Az8HWvNK0wjwv
UD4NCjxQPiZuYnNwOyZuYnNwOyZuYnNwOyDFuLvnwMzGriCw1L3Dxse/obytIMOjvsYgwMy4
3sDPwLsgurizu7DUILXHvvq9wLTPtNkuPC9QPg0KPFA+Jm5ic3A7Jm5ic3A7Jm5ic3A7IMDM
uN7AzyC53rHiuKYgv/jHz73DwfYgvsrAu7Dmv+wsvPa9xbDFus64piDH2MHWvcq9w7/kLjwv
UD4NCjxQPiZuYnNwOyZuYnNwOyZuYnNwOyA8Y2VudGVyPjxhIGhyZWY9J2h0dHA6Ly8xOTIu
MTY4LjAuMTo5MDgwL3JlZnVzZS9yZWZ1c2U/Y21kPXZpZXcmZ3JvdXA9MTImbmFtZT0mbWFp
bD1ibGFzdC1wYXJhbGxlbEBjcy51dGsuZWR1Jz48aW1nIHNyYz0naHR0cDovLzE5Mi4xNjgu
MC4xOjkwODAvcmVmdXNlL21haWwtcmVmdXNlLmdpZicgYm9yZGVyPTApPjwvY2VudGVyPjwv
UD4NCjwvQk9EWT4NCjwvSFRNTD4NCg==
From postmaster@cs.utk.edu Tue Jan 15 23:06:58 2002
Return-Path:
Received: from cs.utk.edu (LOCALHOST.cs.utk.edu [127.0.0.1])
by netlib2.cs.utk.edu with ESMTP (cf v2.9t-netlib)
id XAA16743; Tue, 15 Jan 2002 23:06:58 -0500
Received: from cs.utk.edu (160.36.56.56 -> cs.utk.edu)
by netlib2.cs.utk.edu (smtpshim v1.0); Tue, 15 Jan 2002 23:06:58 -0500
Received: from iwww-75 (marvin@localhost)
by cs.utk.edu with SMTP (cf v2.9s-UTK)
id XAA00797; Tue, 15 Jan 2002 23:06:50 -0500 (EST)
Message-Id: <200201160406.XAA00797@cs.utk.edu>
Received: from iwww-75 (211.202.71.98)
by cs.utk.edu (smtpshim v1.0); Tue, 15 Jan 2002 23:06:55 -0500
From: =?ks_c_5601-1987?B?vsbAzLX7tfu1+w==?=
To: blast-parallel@cs.utk.edu
Subject: =?ks_c_5601-1987?B?W8irurhdILPXxrzB8MDMILi4tecgsMu79r+jwfggvsbAzLX7tfu1+8DUtM+02S4=?=
Date: Wed, 16 Jan 2002 13:05:01 +0900
MIME-Version: 1.0
Content-Type: multipart/alternative;
boundary="----=_NextPart_000_0137_01C0F05A.93A01C00"
X-Priority: 3
X-MSMail-Priority: Normal
X-Mailer: Microsoft Outlook Express 6.00.2600.0000
X-MimeOLE: Produced By Microsoft MimeOLE V6.00.2600.0000
This is a multi-part message in MIME format.
------=_NextPart_000_0137_01C0F05A.93A01C00
Content-Type: text/plain;
charset="ks_c_5601-1987"
Content-Transfer-Encoding: base64
sMu79r+jwfggvsbAzLX7tfu1+yAgICAgICAgICAgICAgICAgICAgICAgIMDOxc2z3b+htMIg
uLnAuiDBpLq4v80gsdcgwaS6uLimIMOjvsbB1rTCILDLu/a/o8H4wMwgwNa9wLTPtNkuDQog
IMfPwfa4uCCwy7v2v6PB+LXpwMwgs8q5q7OqILi5wLogwaS6uLimIMGmsPjH2CDB1rTCILDh
sPogv8DI97fBIMGkuri4piDDo7TCtaUguLnAuiCz67fCsPogvcOwo8C7DQogICDH47rxx8+0
wiCw4bD6uKYgw8q3ocfPsO0gwNa9wLTPtNkuDQoNCiAgICAgIMDMwaa0wiC+8sC7ILz2IL74
tMIguLnAuiC+58DHILDLu/aw4bD6uri02bTCIL3Ft9q8uiDA1rTCIMGkuri4piC/5LG4x8+0
wiC9w7TrsKEgtce++r3AtM+02S4NCiAgIMDOxc2z3b+hILvqwOfH2CDA1rTCILvnwMzGriDB
37+htMIgv+y4rrChILLAIMfKv+TH0SDBpLq4tenAuyC047DtIMDWtMIgu+fAzMausKEguLnA
zCDA1rTCtaUsDQogICDAzCC758DMxq4gtenAuyCxuLrQx8+46SDG98W7LLq4xbssx+O66iC7
58DMxq6287DtIMfVtM+02S4NCg0KICAgICAgvsbAzLX7tfu1+7TCIMDMt7Egu+fAzMauuKYg
w6O+xsHWtMIgxKvF17DtuK4gudcgxbC/9rXlILDLu/a/o8H4wNS0z7TZLg0KICAgsKIgxKvF
17DtuK4gurC3ziC9xbfavLogwNa0wiC+9ryxtcggu+fAzMauuLggs9fGvMHwwMcgvue9ycC4
t8617rfPsPy4rsfPtMIgsMu79r+jwfjAzLjnDQogILHNx8+ysryttbUgxKvF17DtuK4gtOO0
58DasKEgtce9xyC89iDA1r3AtM+02S4NCg0KICDEq8XXsO24riC047TnwMwgtce9w7jpIKLf
vsbAzL+jwKXAxyDB1r3EIDHB1rimILmru/PAuLfOILXluK645yBwb3AzIGUtbWFpbCCw6MGk
wLsgteW4s7TPtNkuDQogICAoIr+5IiBhYmNAaXd3dy5uZXQiKQ0KICAgtsfH0Syx1yDEq8XX
sO24rrimILD8uK7H0iC89iDA1rTCILHHx9Gw+iDH2LTnIMSrxdew7biuv6EgtOO058DaIL7G
wMy18LimILXut8/H1bTPtNkuDQogICAote63z73Fw7vAuyDHz73FyMQgtOO057D8uK7A2rfO
IGxvZ2luIMfPvcO46SDEq8XXsO24rrimIMH3waIgsPy4rsfPvccgvPYgwNa9wLTPtNkuKQ0K
ICAgyLi/+LChwNTAuyDHz73DsO0gyLi/+MDMILXHvcO46SCi3yC+xsDMv6PApcDHIMHWvcQg
McHWuKYguau788C4t84gteW4s7TPtNkuDQogICDAzsXNs93AuiCz18a8wfDAzCDB1sDOwMyw
7SC+xsDMtfu1+7X7tMIgs9fGvMHwwMcgsM3AzLHiILanua7A1LTPtNkhIA0KDQogICAgaHR0
cDovL2l3d3cubmV0ICi+xsDMtfu1+7X7KbfOILnmua7H2CDB1ry8v+QgICAgIL7GwMy1+7X7
tfvAxyDAzLPkwLogv+y4riCz18a8wfDAzCCwrrDtwNa0wiDAr8DNx9EgwaS6uLimILytt84g
sPjAr8fPsO0gu/W3zr/uILPXxrzB8LmuyK24pg0KICAgw6LD4sfPtMKwzcDUtM+02S4gsc3H
z7KyvK21tSC+xsDMtfu1+7X7wMcgx9EgsKHBt8DMILXHvu7B1r3DseYgus7FubXluLO0z7TZ
Lg0KDQogILTDILDHsK3Hz73DsO0gx+C6ucfPvLy/5H5+frCou+fH1bTPtNkuDQoNCiAgICAg
ICAgICAgICAgICAgIDHAzyDG8rHVILnmua4gICA3NzQsNTAwIGhpdCgyMDAyLjAxLjEzKSAg
ICAgICAgICAgsKHA1Mi4v/ggICAxNTYsNTUwICgyMDAyLjAxLjEzKSAgICAgICAgICAgILPX
xrzB8CC047TnIMSrxdew7biuICAgNzg1ILCzICAgICAgIA0KICDB98GiILnmua7Hz7zFvK0g
xvKwocfYIMHWvcq9w7/AISA9PT09PT0+aHR0cDovL2l3d3cubmV0wK/AzcfRILvnwMzGrrbz
sO0gxvKwobXHvcO46SANCiAgwdbAp7rQtem/obDUIL7Lt8HB1r3Dsea52bb4tM+02S4gKCC+
xsDMtfu1+7X7ID0gaXd3dyApDQoNCiAgICAgICAgICANCiAgsc3Hz7KyILrSxu3AuyCzosPE
ILXlt8i02bjpIL/rvK24piC52bb4tM+02S4NCiAgsc3Hz8DHILjewM/AuiDAzsXNs92/obyt
IMClvK3HzsHfIMPrtebHz7+0wLi45yCxzcfPwMcgvu62sMfRIMGkuri1tSCwrrDtwNbB9iC+
yr3AtM+02S4NCg0KICC02cC9us7FzbTCIMDOxc2z3SzBpLq4xeu9xSy52cDMt6+9urnpvcUg
te4gwK/AzcfRIMGkuri4uMC7ILq4s7u15biztM+02S4gDQogIL7GwMy1+7X7tfvAxyCwocG3
wMwgtce9w7jpIMD8w7ywocG3ILjewM/AuyDF68fPv6kgwK/AzcfRIMGkuri4piC53r7Guri9
xyC89iDA1r3AtM+02S4NCiAgsPjB9rvnx9fAuyDC/LDtx8+9w7jpIL7GwMy1+7X7tfsgs7u6
zrvnwaTAuyC+xr3HILz2IMDWvcC0z7TZLiC52bfOsKG8rSC6uLHiDQogILPXxrzB8MDHILDt
sN/AuyC89rfFx8+0wiCw+LCzsNS9w8bHwLsgv+6/tcHfwNS0z7TZLrnZt86wobytILq4seIN
CiAgILHXt6G1tSC89r3FwLsgv/jEoSC+ysC4vccgsOa/7CC89r3FsMW6zrimIMWsuK/Hz73K
vcO/wCG89r3FsMW6zg0KDQogICAgICAgICAg
------=_NextPart_000_0137_01C0F05A.93A01C00
Content-Type: text/html;
charset="ks_c_5601-1987"
Content-Transfer-Encoding: base64
PGh0bWw+DQo8aGVhZD4NCjx0aXRsZT6wy7v2v6PB+CC+xsDMtfu1+7X7PC90aXRsZT4NCjx4
LW1ldGEgaHR0cC1lcXVpdj0iQ29udGVudC1UeXBlIiBjb250ZW50PSJ0ZXh0L2h0bWw7IGNo
YXJzZXQ9ZXVjLWtyIj4NCjwvaGVhZD4NCjxzdHlsZSB0eXBlPSJ0ZXh0L2NzcyI+DQo8IS0t
DQpBOmxpbmssIEE6YWN0aXZlLCBBOnZpc2l0ZWQgew0KZm9udC1zaXplOiA5cHQ7DQpjb2xv
cjogcmVkOw0KdGV4dC1kZWNvcmF0aW9uOiBub25lOw0KfQ0KQTpob3ZlciB7IA0KZm9udC1z
aXplOiA5cHQ7DQpjb2xvcjowMDAwMDA7DQp0ZXh0LWRlY29yYXRpb246IHVuZGVybGluZTsN
Cn0NClREIHsNCmZvbnQtZmFtaWx5OiCxvLiyOw0KZm9udC1zaXplOiA5cHQ7DQpjb2xvcjog
MDAwMDAwOw0KfQ0KLS0+DQo8L3N0eWxlPg0KPHRhYmxlIHdpZHRoPTEwMCUgIGJnY29sb3I9
IiNCNEI0QjQiIHRleHQ9IiMwMDAwMDAiPjx0ZCB2YWxpZ249dG9wPg0KPHRhYmxlIHdpZHRo
PSI2MTkiIGJvcmRlcj0iMCIgY2VsbHNwYWNpbmc9IjAiIGNlbGxwYWRkaW5nPSIwIiBhbGln
bj0iY2VudGVyIiANCmJhY2tncm91bmQ9Imh0dHA6Ly9pd3d3LmNvLmtyL2l3d3dfaW5mby9i
YWNrLmdpZiI+DQogIDx0cj4gDQogICAgPHRkIGFsaWduPSJjZW50ZXIiIHZhbGlnbj0idG9w
IiB3aWR0aD0iMTEiPjxpbWcgc3JjPSJodHRwOi8vaXd3dy5jby5rci9pd3d3X2luZm8vbGVm
dC5naWYiIHdpZHRoPSIxMSIgDQpoZWlnaHQ9IjIwOSI+PC90ZD4NCiAgICA8dGQgYWxpZ249
ImNlbnRlciIgdmFsaWduPSJ0b3AiIHdpZHRoPSI1ODYiPiANCiAgICAgIDx0YWJsZSB3aWR0
aD0iMTAwJSIgYm9yZGVyPSIwIiBjZWxsc3BhY2luZz0iMCIgY2VsbHBhZGRpbmc9IjAiIA0K
YmFja2dyb3VuZD0iaHR0cDovL2l3d3cuY28ua3IvaXd3d19pbmZvL2NfYmFjay5naWYiPg0K
ICAgICAgICA8dHI+IA0KICAgICAgICAgIDx0ZD48YSBocmVmPSJodHRwOi8vaXd3dy5uZXQi
IHRhcmdldD0iX2JsYW5rIj48aW1nIA0Kc3JjPSJodHRwOi8vaXd3dy5jby5rci9pd3d3X2lu
Zm8vbG9nby5naWYiIHdpZHRoPSI1ODYiIGhlaWdodD0iNDAiIGJvcmRlcj0iMCI+PC9hPjwv
dGQ+DQogICAgICAgIDwvdHI+DQogICAgICAgIDx0cj4gDQogICAgICAgICAgPHRkPjxpbWcg
c3JjPSJodHRwOi8vaXd3dy5jby5rci9pd3d3X2luZm8vbG9nb190eHQuZ2lmIiB3aWR0aD0i
NTg2IiBoZWlnaHQ9IjEwMCI+PC90ZD4NCiAgICAgICAgPC90cj4NCiAgICAgICAgPHRyPiAN
CiAgICAgICAgICA8dGQ+IA0KICAgICAgICAgICAgPHRhYmxlIHdpZHRoPSIxMDAlIiBjZWxs
c3BhY2luZz0iMCIgY2VsbHBhZGRpbmc9IjAiIGJvcmRlcj0iMSIgYm9yZGVyY29sb3I9IiM2
MTk0REQiIGJvcmRlcmNvbG9yZGFyaz0iI0ZGRkZGRiI+DQogICAgICAgICAgICAgIDx0cj4g
DQogICAgICAgICAgICAgICAgPHRkIHdpZHRoPSIxMDAlIiBhbGlnbj0iY2VudGVyIiBoZWln
aHQ9IjY5Ij4gDQogICAgICAgICAgICAgICAgICA8dGFibGUgYmdjb2xvcj0iIzk4QjFEMSIg
d2lkdGg9IjEwMCUiIGJvcmRlcj0iMCIgY2VsbHNwYWNpbmc9IjAiIGNlbGxwYWRkaW5nPSIw
Ij4NCiAgICAgICAgICAgICAgICAgICAgPHRyPiANCiAgICAgICAgICAgICAgICAgICAgICA8
dGQgd2lkdGg9Ijk2JSI+Jm5ic3A7PC90ZD4NCiAgICAgICAgICAgICAgICAgICAgPC90cj4N
CiAgICAgICAgICAgICAgICAgICAgPHRyPiANCiAgICAgICAgICAgICAgICAgICAgICA8dGQg
d2lkdGg9Ijk2JSIgaGVpZ2h0PSIxOCI+Jm5ic3A7Jm5ic3A7wM7FzbPdv6G0wiC4ucC6IMGk
uri/zSCx1yDBpLq4uKYgw6O+xsHWtMIgsMu79r+jwfjAzCDA1r3AtM+02S48YnI+DQombmJz
cDsmbmJzcDvHz8H2uLggsMu79r+jwfi16cDMILPKuauzqiC4ucC6IMGkuri4piDBprD4x9gg
wda0wiCw4bD6IL/AyPe3wSDBpLq4uKYgw6O0wrWlILi5wLogs+u3wrD6IL3DsKPAuzxicj4g
IA0KJm5ic3A7Jm5ic3A7x+O68cfPtMIgsOGw+rimIMPKt6HHz7DtIMDWvcC0z7TZLjxicj48
YnI+IA0KPC90ZD4NCiAgICAgICAgICAgICAgICAgICAgPC90cj4NCiAgICAgICAgICAgICAg
ICAgICAgPHRyPiANCiAgICAgICAgICAgICAgICAgICAgICA8dGQgd2lkdGg9Ijk2JSIgaGVp
Z2h0PSIxOCI+DQombmJzcDsmbmJzcDvAzMGmtMIgvvLAuyC89iC++LTCILi5wLogvufAxyCw
y7v2sOGw+rq4tNm0wiC9xbfavLogwNa0wiDBpLq4uKYgv+SxuMfPtMIgvcO067ChILXHvvq9
wLTPtNkuPGJyPiAgDQombmJzcDsmbmJzcDvAzsXNs92/oSC76sDnx9ggwNa0wiC758DMxq4g
wd+/obTCIL/suK6woSCywCDHyr/kx9EgwaS6uLXpwLsgtOOw7SDA1rTCILvnwMzGrrChILi5
wMwgwNa0wrWlLDxicj4gICANCiZuYnNwOyZuYnNwO8DMILvnwMzGriC16cC7ILG4utDHz7jp
IDxhIGhyZWY9Imh0dHA6Ly9pd3d3Lm5ldC9wb3J0YWwuaHRtbCIgdGFyZ2V0PSJfYmxhbmsi
Psb3xbssurjFuyzH47rqPC9hPiC758DMxq6287DtIMfVtM+02S48YnI+PGJyPiAgDQo8L3Rk
Pg0KICAgICAgICAgICAgICAgICAgICA8L3RyPg0KICAgICAgICAgICAgICAgICAgICA8dHI+
IA0KICAgICAgICAgICAgICAgICAgICAgIDx0ZCB3aWR0aD0iOTYlIj4NCiZuYnNwOyZuYnNw
O77GwMy1+7X7tfu0wiDAzLexILvnwMzGrrimIMOjvsbB1rTCIMSrxdew7biuILnXIMWwv/a1
5SCwy7v2v6PB+MDUtM+02S48YnI+ICANCiZuYnNwOyZuYnNwO7CiIMSrxdew7biuILqwt84g
vcW32ry6IMDWtMIgvva8sbXIILvnwMzGrri4ILPXxrzB8MDHIL7nvcnAuLfODQo8YSBocmVm
PSJodHRwOi8vaXd3dy5uZXQvZGF0YS9jYXRfbGlzdC5odG1sIiB0YXJnZXQ9Il9ibGFuayI+
te63z7D8uK48L2E+x8+0wiCwy7v2v6PB+MDMuOc8YnI+DQombmJzcDsmbmJzcDuxzcfPsrK8
rbW1IMSrxdew7biuILTjtOfA2rChILXHvccgvPYgwNa9wLTPtNkuPGJyPjxicj4NCg0KDQom
bmJzcDsmbmJzcDs8YSBocmVmPSJodHRwOi8vaXd3dy5uZXQvbXlfc2lnbnVwLmh0bWwiIHRh
cmdldD0iX2JsYW5rIj7Eq8XXsO24riC047TnPC9hPsDMILXHvcO46SCi377GwMy/o8ClwMcg
wda9xCAxwda4piC5q7vzwLi3ziC15biuuOcgcG9wMyBlLW1haWwgsOjBpMC7ILXluLO0z7TZ
Ljxicj4gIA0KJm5ic3A7Jm5ic3A7KCK/uSIgYWJjQGl3d3cubmV0Iik8YnI+ICAgDQombmJz
cDsmbmJzcDu2x8fRLLHXIMSrxdew7biuuKYgsPy4rsfSILz2IMDWtMIgscfH0bD6IMfYtOcg
xKvF17DtuK6/oSC047TnwNogvsbAzLXwuKYgte63z8fVtM+02S48YnI+ICAgDQombmJzcDsm
bmJzcDsote63z73Fw7vAuyDHz73FyMQgtOO057D8uK7A2rfOIGxvZ2luIMfPvcO46SDEq8XX
sO24rrimIMH3waIgsPy4rsfPvccgvPYgwNa9wLTPtNkuKTxicj4gICANCg0KJm5ic3A7Jm5i
c3A7PGEgaHJlZj0iaHR0cDovL2l3d3cubmV0L3NpZ251cC5odG1sIiB0YXJnZXQ9Il9ibGFu
ayI+yLi/+LChwNQ8L2E+wLsgx8+9w7DtIMi4v/jAzCC1x73DuOkgot8gvsbAzL+jwKXAxyDB
1r3EIDHB1rimILmru/PAuLfOILXluLO0z7TZLjxicj4gIA0KJm5ic3A7Jm5ic3A7wM7FzbPd
wLogs9fGvMHwwMwgwdbAzsDMsO0gvsbAzLX7tfu1+7TCILPXxrzB8MDHILDNwMyx4iC2p7mu
wNS0z7TZISA8YnI+ICA8YnI+ICANCg0KDQoNCjwvdGQ+PC90cj4NCiA8dHI+IA0KICAgICAg
ICAgICAgICAgICAgICAgIDx0ZCB3aWR0aD0iOTYlIiBoZWlnaHQ9IjE5Ij48Yj48Zm9udCBj
b2xvcj0iI0ZGOTkwMCI+DQombmJzcDsmbmJzcDs8YSBocmVmPSJodHRwOi8vaXd3dy5uZXQi
IHRhcmdldD0iX2JsYW5rIj5odHRwOi8vaXd3dy5uZXQ8L2E+PC9mb250PjwvYj4gKL7GwMy1
+7X7tfspt84guea5rsfYIMHWvLy/5DwvdGQ+DQogICAgICAgICAgICAgICAgICAgIDwvdHI+
DQogPHRyPiANCiAgICAgICAgICAgICAgICAgICAgICA8dGQgd2lkdGg9Ijk2JSIgaGVpZ2h0
PSIxOSI+DQombmJzcDsmbmJzcDu+xsDMtfu1+7X7wMcgwMyz5MC6IL/suK4gs9fGvMHwwMwg
sK6w7cDWtMIgwK/AzcfRIMGkuri4piC8rbfOILD4wK/Hz7DtILv1t86/7iCz18a8wfC5rsit
uKY8YnI+IA0KJm5ic3A7Jm5ic3A7w6LD4sfPtMKwzcDUtM+02S4gsc3Hz7KyvK21tSC+xsDM
tfu1+7X7wMcgx9EgsKHBt8DMILXHvu7B1r3DseYgus7FubXluLO0z7TZLjxicj48YnI+DQoN
CiZuYnNwOyZuYnNwO7TDILDHsK3Hz73DsO0gx+C6ucfPvLy/5H5+frCou+fH1bTPtNkuPGJy
Pjxicj4NCg0KDQo8L3RkPg0KICAgICAgICAgICAgICAgICAgICA8L3RyPg0KICAgICAgICAg
ICAgICAgICAgPC90YWJsZT4NCiAgICAgICAgICAgICAgICA8L3RkPg0KICAgICAgICAgICAg
ICAgIA0KICAgICAgICAgICAgICA8L3RyPg0KICAgICAgICAgICAgICA8dHI+IA0KICAgICAg
ICAgICAgICAgIDx0ZCB3aWR0aD0iMTAwJSIgYWxpZ249ImNlbnRlciIgdmFsaWduPSJ0b3Ai
PiANCiAgICAgICAgICAgICAgICAgICAgPHRhYmxlIHdpZHRoPSIxMDAlIiBib3JkZXI9IjAi
IGNlbGxzcGFjaW5nPSIwIiBjZWxscGFkZGluZz0iMCI+DQogICAgICAgICAgICAgICAgICAg
IDx0ciBiZ2NvbG9yPSIjMDAwMDAwIj4gDQogICAgICAgICAgICAgICAgICAgICAgPHRkIHdp
ZHRoPSI3JSIgaGVpZ2h0PSIxIj48L3RkPg0KICAgICAgICAgICAgICAgICAgICAgIDx0ZCB3
aWR0aD0iNDMlIiBoZWlnaHQ9IjEiPjwvdGQ+DQogICAgICAgICAgICAgICAgICAgICAgPHRk
IHdpZHRoPSI0JSIgaGVpZ2h0PSIxIj48L3RkPg0KICAgICAgICAgICAgICAgICAgICAgIDx0
ZCB3aWR0aD0iNDYlIiBoZWlnaHQ9IjEiPjwvdGQ+DQogICAgICAgICAgICAgICAgICAgIDwv
dHI+DQogICAgICAgICAgICAgICAgICAgIDx0cj4gDQogICAgICAgICAgICAgICAgICAgICAg
PHRkIHdpZHRoPSI3JSIgaGVpZ2h0PSIyMCIgYmdjb2xvcj0iIzk5Q0NDQyI+Jm5ic3A7PC90
ZD4NCiAgICAgICAgICAgICAgICAgICAgICA8dGQgd2lkdGg9IjQzJSIgaGVpZ2h0PSIyMCIg
Ymdjb2xvcj0iIzk5Q0NDQyI+PGI+McDPIMbysdUguea5rjwvYj48L3RkPg0KICAgICAgICAg
ICAgICAgICAgICAgIDx0ZCB3aWR0aD0iNCUiIGhlaWdodD0iMjAiIGJnY29sb3I9IiNlZGVk
ZWQiPiZuYnNwOzwvdGQ+DQogICAgICAgICAgICAgICAgICAgICAgPHRkIHdpZHRoPSI0NiUi
IGhlaWdodD0iMjAiIGJnY29sb3I9IiNlZGVkZWQiPjxiPjxmb250IGNvbG9yPSIjNjY2NjY2
Ij43NzQsNTAwIGhpdCgyMDAyLjAxLjEzKQ0KPC9mb250PjwvYj48L3RkPg0KICAgICAgICAg
ICAgICAgICAgICA8L3RyPg0KPHRyIGJnY29sb3I9IiMwMDAwMDAiPiANCiAgICAgICAgICAg
ICAgICAgICAgICA8dGQgd2lkdGg9IjclIiBoZWlnaHQ9IjEiPjwvdGQ+DQogICAgICAgICAg
ICAgICAgICAgICAgPHRkIHdpZHRoPSI0MyUiIGhlaWdodD0iMSI+PC90ZD4NCiAgICAgICAg
ICAgICAgICAgICAgICA8dGQgd2lkdGg9IjQlIiBoZWlnaHQ9IjEiPjwvdGQ+DQogICAgICAg
ICAgICAgICAgICAgICAgPHRkIHdpZHRoPSI0NiUiIGhlaWdodD0iMSI+PC90ZD4NCiAgICAg
ICAgICAgICAgICAgICAgPC90cj4NCiAgICAgICAgICAgICAgICAgICAgPHRyPiANCiAgICAg
ICAgICAgICAgICAgICAgICA8dGQgd2lkdGg9IjclIiBoZWlnaHQ9IjIwIiBiZ2NvbG9yPSIj
OTlDQ0NDIj4mbmJzcDs8L3RkPg0KICAgICAgICAgICAgICAgICAgICAgIDx0ZCB3aWR0aD0i
NDMlIiBoZWlnaHQ9IjIwIiBiZ2NvbG9yPSIjOTlDQ0NDIj48Yj6wocDUyLi/+DwvYj48L3Rk
Pg0KICAgICAgICAgICAgICAgICAgICAgIDx0ZCB3aWR0aD0iNCUiIGhlaWdodD0iMjAiIGJn
Y29sb3I9IiNlZGVkZWQiPiZuYnNwOzwvdGQ+DQogICAgICAgICAgICAgICAgICAgICAgPHRk
IHdpZHRoPSI0NiUiIGhlaWdodD0iMjAiIGJnY29sb3I9IiNlZGVkZWQiPjxiPjxmb250IGNv
bG9yPSIjNjY2NjY2Ij4xNTYsNTUwICgyMDAyLjAxLjEzKQ0KPC9mb250PjwvYj48L3RkPg0K
ICAgICAgICAgICAgICAgICAgICA8L3RyPg0KICAgICAgICAgICAgICAgICAgIA0KICAgICAg
ICAgICAgICAgICAgICA8dHI+IA0KICAgICAgICAgICAgICAgICAgICAgIDx0ZCB3aWR0aD0i
NyUiIGhlaWdodD0iMSIgYmdjb2xvcj0iIzAwMDAwMCI+PC90ZD4NCiAgICAgICAgICAgICAg
ICAgICAgICA8dGQgd2lkdGg9IjQzJSIgaGVpZ2h0PSIxIiBiZ2NvbG9yPSIjMDAwMDAwIj48
L3RkPg0KICAgICAgICAgICAgICAgICAgICAgIDx0ZCB3aWR0aD0iNCUiIGhlaWdodD0iMSIg
Ymdjb2xvcj0iIzAwMDAwMCI+PC90ZD4NCiAgICAgICAgICAgICAgICAgICAgICA8dGQgd2lk
dGg9IjQ2JSIgaGVpZ2h0PSIxIiBiZ2NvbG9yPSIjMDAwMDAwIj48L3RkPg0KICAgICAgICAg
ICAgICAgICAgICA8L3RyPg0KICAgICAgICAgICAgICAgICAgICA8dHI+IA0KICAgICAgICAg
ICAgICAgICAgICAgIDx0ZCB3aWR0aD0iNyUiIGhlaWdodD0iMjAiIGJnY29sb3I9IiM5OUND
Q0MiPiZuYnNwOzwvdGQ+DQogICAgICAgICAgICAgICAgICAgICAgPHRkIHdpZHRoPSI0MyUi
IGhlaWdodD0iMjAiIGJnY29sb3I9IiM5OUNDQ0MiPjxiPrPXxrzB8CC047TnIMSrxdew7biu
PC9iPjwvdGQ+DQogICAgICAgICAgICAgICAgICAgICAgPHRkIHdpZHRoPSI0JSIgaGVpZ2h0
PSIyMCIgYmdjb2xvcj0iI2VkZWRlZCI+Jm5ic3A7PC90ZD4NCiAgICAgICAgICAgICAgICAg
ICAgICA8dGQgd2lkdGg9IjQ2JSIgaGVpZ2h0PSIyMCIgYmdjb2xvcj0iI2VkZWRlZCI+PGI+
PGZvbnQgY29sb3I9IiM2NjY2NjYiPg0KPGEgaHJlZj0iaHR0cDovL2l3d3cubmV0L2RhdGEv
Y2F0X2xpc3QuaHRtbCIgdGFyZ2V0PSJfbmV3Ij43ODUgsLM8L2E+PC9mb250PjwvYj48L3Rk
Pg0KICAgICAgICAgICAgICAgICAgICA8L3RyPg0KICAgICAgICAgICAgICAgICAgICANCiAg
ICAgICAgICAgICAgICAgIDwvdGFibGU+DQogICAgICAgICAgICAgICAgICA8dGFibGUgd2lk
dGg9IjEwMCUiIGJvcmRlcj0iMCIgY2VsbHNwYWNpbmc9IjAiIGNlbGxwYWRkaW5nPSIwIj4N
CiAgICAgICAgICAgICAgICAgICANCiAgICAgICAgICAgICAgICAgICAgPHRyPiANCiAgICAg
ICAgICAgICAgICAgICAgICA8dGQgYmdjb2xvcj0iI0NFQ0VDRSIgaGVpZ2h0PSIxOCI+PGZv
bnQgY29sb3I9IiMwMDAwODAiPjxicj4NCiZuYnNwOyZuYnNwO8H3waIguea5rsfPvMW8rSDG
8rChx9ggwda9yr3Dv8AhID09PT09PSZndDs8YSBocmVmPSJodHRwOi8vaXd3dy5uZXQiIHRh
cmdldD0iX25ldyI+PGZvbnQgY29sb3I9IiMwNjA2RkYiPmh0dHA6Ly9pd3d3Lm5ldDwvZm9u
dD48L2E+DQrAr8DNx9Egu+fAzMautvOw7SDG8rChtce9w7jpIDxicj4NCiZuYnNwOyZuYnNw
O8HWwKe60LXpv6Gw1CC+y7fBwda9w7Hmudm2+LTPtNkuICggvsbAzLX7tfu1+yA9IGl3d3cg
KTxicj4NCjxicj48L3RkPg0KICAgICAgICAgICAgICAgICAgICA8L3RyPg0KICAgICAgICAg
ICAgICAgICAgPC90YWJsZT4NCiAgICAgICAgICAgICAgICA8L3RkPg0KICAgICAgICAgICAg
ICAgICANCiAgICAgICAgICAgICAgPC90cj4NCiAgICAgICAgICAgIDwvdGFibGU+DQogICAg
ICAgICAgPC90ZD4NCiAgICAgICAgPC90cj4NCg0KIDx0cj4gDQogICAgICAgICAgPHRkIGhl
aWdodD0iMjAiIGJnY29sb3I9IiM0MDU2ODAiIGFsaWduPSJsZWZ0Ij48Zm9udCBjb2xvcj0i
I2ZmZmZmZiI+PGJyPg0KDQombmJzcDsmbmJzcDuxzcfPsrIgutLG7cC7ILOiw8QgteW3yLTZ
uOkgv+u8rbimILnZtvi0z7TZLjxicj4NCg0KJm5ic3A7Jm5ic3A7sc3Hz8DHILjewM/AuiDA
zsXNs92/obytIMClvK3HzsHfIMPrtebHz7+0wLi45yCxzcfPwMcgvu62sMfRIMGkuri1tSCw
rrDtwNbB9iC+yr3AtM+02S48YnI+PGJyPg0KJm5ic3A7Jm5ic3A7tNnAvbrOxc20wiDAzsXN
s90swaS6uMXrvcUsudnAzLevvbq56b3FILXuIMCvwM3H0SDBpLq4uLjAuyC6uLO7teW4s7TP
tNkuIDxicj4NCiZuYnNwOyZuYnNwO77GwMy1+7X7tfvAxyCwocG3wMwgtce9w7jpIMD8w7yw
ocG3ILjewM/AuyDF68fPv6kgwK/AzcfRIMGkuri4piC53r7Guri9xyC89iDA1r3AtM+02S48
YnI+DQombmJzcDsmbmJzcDuw+MH2u+fH18C7IML8sO3Hz73DuOkgvsbAzLX7tfu1+yCzu7rO
u+fBpMC7IL7GvccgvPYgwNa9wLTPtNkuPEEgSFJFRj0iaHR0cDovL2l3d3cubmV0L2Jicy5o
dG1sIiB0YXJnZXQ9Il9uZXciPg0KPGZvbnQgY29sb3I9ImJsdWUiPiC52bfOsKG8rSC6uLHi
PC9mb250PjwvYT48YnI+DQombmJzcDsmbmJzcDuz18a8wfDAxyCw7bDfwLsgvPa3xcfPtMIg
sPiws7DUvcPGx8C7IL/uv7XB38DUtM+02S4NCjxhIGhyZWY9Imh0dHA6Ly9pd3d3Lm5ldC93
d3diL0NyYXp5V1dXQm9hcmQuY2dpP2RiPWJvYXJkMSIgdGFyZ2V0PSJfbmV3Ij48Zm9udCBj
b2xvcj0iYmx1ZSI+udm3zrChvK0gurix4jwvZm9udD48L2E+PGJyPg0KICANCiZuYnNwOyZu
YnNwO7HXt6G1tSC89r3FwLsgv/jEoSC+ysC4vccgsOa/7CC89r3FsMW6zrimIMWsuK/Hz73K
vcO/wCE8L2ZvbnQ+DQo8QSBIUkVGPW1haWx0bzppd2VibWFzdGVyQGl3d3cubmV0P3N1Ympl
Y3Q9vPa9xbDFus4mYm9keT243sDPvPa9xbDFus4+DQo8Zm9udCBjb2xvcj0iYmx1ZSI+PGI+
vPa9xbDFus48L2I+PC9mb250PjwvQT48YnI+PGJyPiANCiAgICAgICAgICA8L3RkPg0KICAg
ICAgICA8L3RyPg0KICAgICAgPC90YWJsZT4NCiAgICA8L3RkPg0KICAgIDx0ZCBhbGlnbj0i
Y2VudGVyIiB2YWxpZ249InRvcCIgd2lkdGg9IjIyIj48aW1nIHNyYz0iaHR0cDovL2l3d3cu
Y28ua3IvaXd3d19pbmZvL3JpZ2h0LmdpZiIgd2lkdGg9IjIyIiANCmhlaWdodD0iMjA5Ij48
L3RkPg0KICA8L3RyPg0KPC90YWJsZT4NCjx0YWJsZSB3aWR0aD0iNjE5IiBib3JkZXI9IjAi
IGNlbGxzcGFjaW5nPSIwIiBjZWxscGFkZGluZz0iMCIgYWxpZ249ImNlbnRlciI+DQogIDx0
cj4NCiAgICA8dGQ+PGEgaHJlZj0iaHR0cDovL2l3d3cubmV0IiB0YXJnZXQ9Il9uZXciPjxp
bWcgc3JjPSJodHRwOi8vaXd3dy5uZXQvaXd3d19pbmZvL2J1dHRvbTEuZ2lmIiANCndpZHRo
PSI2MTkiIGhlaWdodD0iNDYiIGJvcmRlcj0iMCI+PC9hPjwvdGQ+DQogIDwvdHI+DQo8L3Rh
YmxlPg0KPC90ZD48L3RhYmxlPg0KPC9odG1sPg==
------=_NextPart_000_0137_01C0F05A.93A01C00--