1997 Technical Reports | SCS | UW
CS-97-35 |
Title |
Adaptive Protocols for
Negotiating Non-Deterministic Choice over Synchronous Channels |
Authors |
Erik D. Demaine |
Abstract |
In this paper, we propose
several deadlock-free protocols for implementing the generalized alternative
construct, where a process non-deterministically chooses between sending
or receiving among various synchronous channels. We consider general many-to-many
channels and examine in detail the special case of fan (many-to-one and
one-to-many) channels, which are common and can be implemented much more
efficiently. We propose a protocol that achieves an optimal number of message
cycles per user-level communication, significantly improving on previous
results. We propose several other "less aggressive" protocols, which may
be more suitable for some applications and networks, and demonstrate how
to adaptively switch between them and modify protocol parameters. Finally,
we show how to maintain dynamic membership of channels, while avoiding race
conditions that would prevent garbage collection of processes. |
Comments |
A version of this paper
has been submitted to the 1st Merged Symposium of the 12th International
Parallel Processing Symposium and 9th Symposium on Parallel and Distributed
Processing, Orlando, Florida, March 30-April 3, 1998. |
Report | Adobe PDF | Compressed PostScript |
CS-97-32 |
Title |
Penalty Methods for American
Options With Stochastic Volatility |
Authors |
R. Zvan, P.A. Forsyth
and K.R. Vetzal |
Abstract |
The American early exercise
constraint can be viewed as transforming the two dimensional stochastic
volatility option pricing PDE into a differential algebraic equation (DAE).
Several methods are described for enforcing the algebraic constraint by adding
a penalty source term to the discrete equations. The resulting nonlinear
algebraic equations are solved using an approximate Newton iteration. The
Jacobian system is solved using an incomplete LU (ILU) preconditioned
PCG method. Some example computations are presented for option pricing problems
based on a stochastic volatility model, including an exotic American chooser
option written on a put and call with discrete double knockout barriers
and discrete dividends.
|
Date |
October 1997 |
Report | Adobe PDF | Compressed PostScript |
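As an editorial illustration of the penalty idea summarized in the CS-97-32 abstract, the sketch below applies a penalty iteration to the simpler one-factor (constant-volatility) American put. The grid, payoff and penalty weight are made-up examples, and this is not the report's two-factor stochastic-volatility scheme or its ILU-preconditioned solver.

```python
import numpy as np

# Hypothetical one-factor American put with made-up parameters (illustration only)
r, sigma, K, T = 0.05, 0.3, 100.0, 1.0
N, M, rho = 200, 100, 1e6            # S-nodes, time steps, penalty weight

S = np.linspace(0.0, 4 * K, N)
dS, dt = S[1] - S[0], T / M
payoff = np.maximum(K - S, 0.0)
V = payoff.copy()

# Fully implicit Black-Scholes step: A V_new = V_old on interior nodes
i = np.arange(1, N - 1)
lower = 0.5 * dt * (sigma**2 * S[i]**2 / dS**2 - r * S[i] / dS)
upper = 0.5 * dt * (sigma**2 * S[i]**2 / dS**2 + r * S[i] / dS)
A = np.zeros((N, N))
A[i, i - 1], A[i, i + 1] = -lower, -upper
A[i, i] = 1.0 + dt * (sigma**2 * S[i]**2 / dS**2 + r)
A[0, 0] = A[-1, -1] = 1.0            # Dirichlet boundary rows

for _ in range(M):
    rhs = V.copy()
    rhs[0], rhs[-1] = K, 0.0         # V(0)=K (immediate exercise), V(S_max)=0 for a put
    Vk = V.copy()
    for _ in range(50):              # penalty (approximate Newton) iteration
        pen = rho * (Vk < payoff)    # penalty active where the constraint is violated
        Vnew = np.linalg.solve(A + np.diag(pen), rhs + pen * payoff)
        if np.max(np.abs(Vnew - Vk)) < 1e-8:
            Vk = Vnew
            break
        Vk = Vnew
    V = Vk

print("American put value at S = K:", np.interp(K, S, V))
```

Each pass re-activates the penalty only where the current iterate violates the early-exercise constraint, which is the basic mechanism the report analyses in the stochastic-volatility setting.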
CS-97-31 |
Title |
The Suffix-Signature method
for Searching for Phrases in Text |
Authors |
M. Zhou |
Abstract |
Finding all occurrences
of a given word in a large static text is a well-studied problem. Most solutions,
however, are not well-suited for phrase-searching. In this thesis, we investigate
a new algorithm to find all occurrences of a given phrase in a large, static
text, based on the data structure known as a suffix array. Using this algorithm,
phrases of bounded length can be found with expected search time of one
disk access to the text and one disk access to an index. To achieve this
performance for phrases of up to five words in length requires an index
having total size of approximately 120% of the size of the text. The algorithm
guarantees a worst case search performance of 2 disk accesses to the text
per phrase search.
The method augments a suffix array with a parallel signature array, so that
every indexed phrase has an associated signature. To search for a phrase,
we search a block of the index in memory to locate matching signatures.
Then we read one or two phrases corresponding to matching signatures from
disk and compare them to the target phrase to filter out false matches.
We present theoretical properties of the data structure and algorithm derived
from a suitable model. The theoretical results have been validated through
experimentation with actual data ranging in size from 6Mb to 550Mb and also
including actual query patterns derived from logs of searches on the World
Wide Web. These experiments show that the approach is applicable in practice
to a variety of texts and realistic phrase searches. |
Date |
October 1997 |
Comments |
This report is a reprint
of a thesis that embodies the results of research done in partial fulfillment
of the requirements for the degree of Doctor of Philosophy of Mathematics
in Computer Science at the University of Waterloo.
|
Report | Adobe PDF | Compressed PostScript |
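To make the suffix-signature idea of CS-97-31 concrete, here is a hypothetical in-memory toy: a word-level suffix array with a parallel signature array of per-word hashes, searched by signature filtering followed by verification against the text. The hash choice, the constant K and the linear scan over signatures are simplifications of this sketch; the thesis works with disk blocks and a single signature per indexed phrase.

```python
import hashlib

K = 5  # index supports phrases of up to K words (assumption for this sketch)

def h(word):
    """Short per-word hash; stands in for the thesis's signature bits."""
    return hashlib.md5(word.encode()).hexdigest()[:4]

def build_index(text):
    words = text.split()
    # Word-level suffix array with a parallel signature array: for each suffix,
    # the hashes of its first K words.
    order = sorted(range(len(words)), key=lambda i: words[i:])
    sigs = [tuple(h(w) for w in words[i:i + K]) for i in order]
    return words, order, sigs

def find_phrase(index, phrase):
    words, order, sigs = index
    target = phrase.split()
    tsig = tuple(h(w) for w in target)
    # Signature filtering: keep suffixes whose signature starts with the target's
    # signature, then verify against the text to drop false matches (collisions).
    hits = [order[j] for j, s in enumerate(sigs) if s[:len(tsig)] == tsig]
    return sorted(i for i in hits if words[i:i + len(target)] == target)

idx = build_index("to be or not to be that is the question")
print(find_phrase(idx, "to be"))   # [0, 4]
```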
CS-97-30 |
Title |
The Use of 3D Sound as
a Navigational Aid in Virtual Environments |
Authors |
R. Guenther |
Abstract |
Because virtual environments are less visually rich than real-world environments,
careful consideration must be given to their design; otherwise much of the potential
for virtual reality will be lost. One important design criterion is to make certain
that adequate navigational cues are incorporated into a complex virtual world.
The usability of virtual environments will be decreased if users often become lost
or disoriented, as is frequently the case now.
This thesis investigates the use of spatialized sound as a navigational aid for
people using virtual reality systems. An experiment was performed to determine
if incorporating spatialized sound into a virtual environment a) helps people find
specific objects in the environment, and b) influences the rate at which people
acquire spatial knowledge.
The results show that the addition of spatialized sound to a virtual environment
improved the ability of people to find objects, although the addition of sound did
not increase the amount of configurational knowledge a person was able to acquire.
In fact, it appears that the addition of sound may have suppressed the development
of configurational knowledge. |
Report | Adobe PDF | Compressed PostScript |
CS-97-29 |
Title |
A Parametric Hybrid Triangular
Bezier Patch |
Authors |
M.F. Davidchuk |
Abstract |
The triangular Bezier
patch presents a natural primitive for modeling surfaces of arbitrary topology,
but does not enjoy the same widespread use as the tensor product patch.
Existing triangular Bezier patch interpolation schemes that are designed
to fit surfaces to parametric data produce interpolants that suffer from
noticeable visual flaws. Patch boundaries are often visible as creases in
the resulting surface.
Interpolation of functional data is simpler than interpolation of parametric
data. However, schemes such as the Clough-Tocher technique have many of the
same visual flaws as parametric schemes. Foley and Opitz introduced an improvement
over Clough-Tocher. Central to the Foley-Opitz scheme is the cross boundary
construction that produces quadratically varying cross boundary derivatives.
My work involves attempting to extend Foley's scheme to the parametric setting.
Modifying this scheme requires the formulation of an extension to the standard
rational blend technique. In addition to blending interior control points,
boundary control points must also be blended to incorporate Foley's cross
boundary construction, which relies on the natural parameterization of the
functional setting.
When interpolating irregularly scattered data and when increasing the tessellation
of the data mesh, the new scheme shows improvement over representative parametric
data fitting schemes. The quality of the resulting surfaces depends largely
on the correlation between the normals and the triangles of the underlying
mesh. |
Date |
September 1997 |
Report | Adobe PDF | Compressed PostScript |
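The primitive at the heart of CS-97-29 is the triangular Bezier patch; as a reference point, the sketch below evaluates such a patch by de Casteljau's algorithm in barycentric coordinates. The control points are invented for illustration, and the report's actual contribution (the hybrid parametric interpolation scheme) is not reproduced here.

```python
import numpy as np

def tri_indices(n):
    """All multi-indices (i, j, k) with i + j + k = n."""
    return [(i, j, n - i - j) for i in range(n + 1) for j in range(n + 1 - i)]

def de_casteljau_triangle(control, u, v, w):
    """Evaluate a triangular Bezier patch at barycentric (u, v, w), u + v + w = 1.
    `control` maps multi-indices (i, j, k) with i + j + k = n to control points."""
    pts = {ijk: np.asarray(p, float) for ijk, p in control.items()}
    n = sum(next(iter(pts)))
    for m in range(n, 0, -1):                      # repeated barycentric averaging
        pts = {(i, j, k): u * pts[i + 1, j, k]
                        + v * pts[i, j + 1, k]
                        + w * pts[i, j, k + 1]
               for (i, j, k) in tri_indices(m - 1)}
    return pts[0, 0, 0]

# Hypothetical quadratic (degree-2) patch with made-up control points
control = {(2, 0, 0): (0, 0, 0), (0, 2, 0): (1, 0, 0), (0, 0, 2): (0, 1, 0),
           (1, 1, 0): (0.5, 0, 0), (1, 0, 1): (0, 0.5, 0), (0, 1, 1): (0.5, 0.5, 0.3)}
print(de_casteljau_triangle(control, 1/3, 1/3, 1/3))   # a point on the patch
```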
CS-97-28 |
Title |
A Technique for Finding
and Verifying Speed-Dependences in Gate Circuits |
Authors |
R. Negulescu |
Abstract |
Before sizing the delays
of the components in a circuit, it is often necessary to find delay constraints
that would ensure the correctness of the circuit, and to verify the sufficiency
of these constraints. We formalize constraints of a certain type as processes
in a metric-free model. The circuit specification and components are also
represented as processes, to be coupled and compared with the constraint
processes. We use a BDD-based tool to verify a circuit together with a set
of constraints, and to find hints as to which constraints are needed. We
illustrate this technique on a circuit that uses extended isochronic forks. |
Date |
July 1997 |
Report | Adobe PDF | Compressed PostScript |
CS-97-27 |
Title |
PDE Methods for Pricing
Barrier Options |
Authors |
R. Zvan, K.R. Vetzal and
P.A. Forsyth |
Abstract |
The paper presents an
implicit method for solving PDE models of contingent claims prices with
general algebraic constraints on the solution. Examples of constraints include
barriers and early exercise features. In this unified framework, barrier
options with or without American-style features can be handled in the same
way. Either continuously or discretely monitored barriers can be accommodated,
as can time-varying barriers. The underlying asset may pay out either a
constant dividend yield or a discrete dollar dividend. The use of the implicit
method leads to convergence in fewer time steps compared to explicit schemes.
This paper also discusses extending the basic methodology to the valuation
of two asset barrier options and the incorporation of automatic time stepping. |
Date |
July 1997 |
Report | Adobe PDF | Compressed PostScript |
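A minimal sketch of the kind of constraint handled in CS-97-27: a discretely monitored up-and-out call priced with a fully implicit one-factor step, with the knock-out applied only at monitoring dates. The parameters and the crude boundary treatment are assumptions for illustration; the paper's general constrained-PDE framework, its stochastic-volatility and two-asset extensions, and automatic time stepping are not shown.

```python
import numpy as np

# Hypothetical discretely monitored up-and-out call (illustration only)
r, sigma, K, B, T = 0.05, 0.25, 100.0, 130.0, 1.0
N, M = 300, 250
monitor = {50, 100, 150, 200}                # time-step indices of monitoring dates

S = np.linspace(0.0, 2 * B, N)
dS, dt = S[1] - S[0], T / M
V = np.maximum(S - K, 0.0)
V[S >= B] = 0.0                              # expiry is also a monitoring date here

# Fully implicit Black-Scholes step: A V_new = V_old
i = np.arange(1, N - 1)
lower = 0.5 * dt * (sigma**2 * S[i]**2 / dS**2 - r * S[i] / dS)
upper = 0.5 * dt * (sigma**2 * S[i]**2 / dS**2 + r * S[i] / dS)
A = np.eye(N)
A[i, i - 1], A[i, i + 1] = -lower, -upper
A[i, i] = 1.0 + dt * (sigma**2 * S[i]**2 / dS**2 + r)

for step in range(M, 0, -1):
    rhs = V.copy()
    rhs[0], rhs[-1] = 0.0, 0.0               # crude Dirichlet boundaries for this sketch
    V = np.linalg.solve(A, rhs)
    if step - 1 in monitor:                  # apply the knock-out constraint only
        V[S >= B] = 0.0                      # at the discrete monitoring dates

print("up-and-out call value at S = K:", np.interp(K, S, V))
```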
CS-97-26 |
Title |
A Structured Text ADT
for Object-Relational Databases |
Authors |
L.J. Brown, M.P.Consens,
I.J. Davis, C.R. Palmer and F.W. Tompa |
Abstract |
There is a growing need,
both for use within corporate intranets and within the rapidly evolving
World Wide Web, to develop tools that are able to retrieve relevant textual
information rapidly, to present textual information in a meaningful way,
and to integrate textual information with related data retrieved from other
sources.
This paper introduces a model for structured text and presents a small set
of operations that may be applied against this model. Using these operations,
structured text may be selected, marked, fragmented, and transformed into
relations for use in relational and object-oriented database systems.
The extended functionality has been accepted for inclusion within the SQL/MM
standard, and a prototype database engine that supports SQL with extensions
to incorporate the proposed text operations has been implemented. This prototype
serves as a proof of concept intended to address industrial concerns, and
it demonstrates the power of the proposed abstract data type for structured
text.
|
Date |
July 1997 |
Report | Adobe PDF | Compressed PostScript |
CS-97-24 |
Title |
Preconditioning Methods
for Very Ill-conditioned Three-dimensional Linear Elasticity Problems |
Authors |
E. Graham and P.A. Forsyth |
Abstract |
Finite element models
of linear elasticity arise in many application areas of structural analysis.
Solving the resulting system of equations accounts for a large portion of
the total cost for large, three-dimensional models, for which direct methods
can be prohibitively expensive. Preconditioned conjugate gradient (PCG)
methods are used to solve difficult problems with small (≪ 1) average
element aspect ratios. Incomplete LU (ILU) factorisations based on a drop
tolerance parameter are used to form the preconditioning matrices. Various
new techniques known as reduction techniques are examined. Combinations
of these reduction techniques result in highly effective preconditioners
for problems with very poor aspect ratios. Standard and hierarchical triquadratic
basis functions are used on hexahedral elements, and test problems comprising
a variety of geometries with up to 50,000 degrees of freedom are considered.
Manteuffel's method of perturbing the stiffness matrix to ensure positive
pivots occur during factorisation is used, and its effects on the convergence
of the preconditioned system are discussed. |
Date |
July 1997 |
Comments |
We have submitted the
paper (25 pages) to the International Journal for Numerical Methods in Engineering. |
Report | Adobe PDF | Compressed PostScript |
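The solver pipeline described in CS-97-24 (drop-tolerance ILU used to precondition conjugate gradients) can be sketched with SciPy on a stand-in SPD matrix. The 3D Laplacian below is a made-up test problem, not an elasticity stiffness matrix, and none of the report's reduction techniques or Manteuffel perturbation appear here.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# Made-up SPD test problem: 3D finite-difference Laplacian on an n^3 grid,
# standing in for the elasticity stiffness matrices discussed in the report.
n = 20
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
A = sp.kronsum(sp.kronsum(T, T), T).tocsc()
b = np.ones(A.shape[0])

# Drop-tolerance incomplete LU factorisation used as a preconditioner for CG.
ilu = spla.spilu(A, drop_tol=1e-4, fill_factor=10)
M = spla.LinearOperator(A.shape, matvec=ilu.solve)

def counter(store):
    def cb(_xk):
        store[0] += 1          # count one CG iteration per callback
    return cb

with_ilu, without = [0], [0]
x1, info1 = spla.cg(A, b, M=M, callback=counter(with_ilu))
x2, info2 = spla.cg(A, b, callback=counter(without))
print("converged:", info1 == 0 and info2 == 0,
      "| CG iterations with ILU:", with_ilu[0], "without:", without[0])
```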
CS-97-23 |
Title |
A Finite Element Approach
to the Pricing of Discrete Lookbacks with Stochastic Volatility |
Authors |
P.A. Forsyth, K. Vetzal
and R. Zvan |
Abstract |
Finite element methods
are described for valuing lookback options under stochastic volatility.
Particular attention is paid to the method for handling the boundary equations.
For some boundaries, the equations reduce to first order hyperbolic equations
which must be discretized to ensure that outgoing waves are correctly modelled.
Some example computations show that for certain choices of parameters, the
option price computed for a lookback under stochastic volatility can differ
from the price under the usual constant volatility assumption by as much
as 35% (i.e., $7.30 compared with $5.45 for an at-the-money put), even
though the models are calibrated so as to produce exactly the same price
for an at-the-money vanilla European option with the same time remaining
until expiry. |
Date |
July 1997 |
Comments |
Submitted to: Appl. Math
Finance (July, 1997) |
Report | Adobe PDF | Compressed PostScript |
CS-97-22 |
Title |
Computing Extreme Origami
Bases |
Authors |
Erik D. Demaine and Martin
L. Demaine |
Abstract |
In this paper, we examine
extreme origami bases that fold the boundary of a polygonal sheet of paper
to a common plane, generalizing the Husimi and Meguro molecules used in
origami design. This also solves the folding-and-cutting problem, where
we want to make a specified polygon by one complete straight cut after arbitrary
folding of a square sheet of paper. We present an algorithm to construct
the crease pattern of an extreme base for paper in the shape of an arbitrary
simple polygon. It is based on the straight skeleton, a variant on the medial
axis. For convex polygons, we describe exactly how the folding process can
be performed. This proves that the folding can even be done with paper that
is rigid except at the creases, and allows animation of the folding process.
Such descriptions have not been achieved before for an infinite class of
origamis. |
Date |
December 1997 |
Comments |
We are extending this
work to cutting out non-convex polygons and more generally arrangements
of line segments. |
Report | Adobe PDF | Compressed PostScript |
CS-97-19 |
Title |
Using Ray Tracing for
Site Specific Indoor Radio Signal Strength Analysis |
Authors |
M. Nidd |
Abstract |
As wireless networks are
installed at more locations, it will become more important that they be
organised in an efficient layout. This requires prediction tools that use
site-specific information to give better results than the traditional statistical
models. While counting the number of walls between the transmitter and receiver
can provide a rough approximation, more comprehensive techniques can make
better predictions.
This report explores ray tracing as a method for predicting signal strength
in an indoor wireless radio environment. It presents a dynamic "cone tracing"
algorithm that offers a reasonable compromise between execution time and
accuracy. This technique bypasses the duplication of effort present in similar
previous attempts to analyze the signal strength in an entire room, instead
of just at a single receiver location. The more complete view gives the
planner seemingly continuous information about the region of interest, providing
more information about the areas of poor coverage, and allowing for more
informed decisions to be made in attempting corrections. This improved quality
is achieved without sacrificing response time. |
Date |
April 1997 |
Report | Adobe PDF | Compressed PostScript |
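For contrast with the cone-tracing approach of CS-97-19, the baseline it mentions (counting walls between transmitter and receiver) can be sketched as a log-distance model with a fixed per-wall loss. The floor plan, frequency and attenuation constants below are invented for illustration and are not taken from the report.

```python
import math

def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 properly crosses segment q1-q2 (2D)."""
    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])
    d1, d2 = cross(q1, q2, p1), cross(q1, q2, p2)
    d3, d4 = cross(p1, p2, q1), cross(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def received_power_dbm(tx, rx, walls, p_tx=20.0, f_mhz=2400.0, wall_loss_db=3.5):
    """Wall-count model: free-space path loss plus a fixed loss per wall crossed.
    (Illustrative constants; the report's cone tracer replaces this model.)"""
    d = max(math.dist(tx, rx), 0.1)                       # metres
    fspl = 20 * math.log10(d) + 20 * math.log10(f_mhz) - 27.55
    n_walls = sum(segments_intersect(tx, rx, *w) for w in walls)
    return p_tx - fspl - wall_loss_db * n_walls

# Hypothetical floor plan: two interior walls given as segments ((x1, y1), (x2, y2))
walls = [((5, 0), (5, 10)), ((10, 2), (10, 12))]
print(received_power_dbm((1, 5), (14, 5), walls))         # path crosses both walls
```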
CS-97-18 |
Title |
NP-hardness of some map
labeling problems |
Authors |
C. Iturriaga and A. Lubiw |
Abstract |
One of the most challenging
tasks of cartographic map lettering is the optimal placement of the region
information on a map. We show the NP-hardness of two formulations of this
task: the Elastic Labelling Problem and the Sliding Labels Problem. In the elastic
labelling problem we are given a set of elastic rectangles as labels, each
associated with a point in the plane. An elastic rectangle has a specified
area but its width and height may vary. The problem then is to choose the
height and width of each label, and the corner of the label to place at
the associated point, so that no two labels overlap. This problem is known
to be NP-hard even when there is no elasticity (just because of the choice
of the corners). We show that the problem remains NP-hard when we have elasticity
but no choice about which corner of the label to use--we call this the
One-corner Elastic Labelling Problem. In the sliding labels problem each
label has fixed dimension but any boundary point (not just a corner) of
the label can be placed at the associated point in the plane. This problem
is known to be NP-hard. We show that it remains NP-hard when only points
on the left or right boundary of the label can be placed at the associated
point --we call this the Left-right Sliding Labels Problem. A companion
paper will give algorithms for special cases of the elastic labelling problem
and the sliding labels problem. |
Report | Adobe PDF | Compressed PostScript |
CS-97-17 |
Title |
Montana Smart Pointers: They're Smart and They're Pointers |
Authors |
J. Hamilton |
Abstract |
The Montana C++ programming
environment provides an API to the compiler, which allows the
compilation process to be extended through programmer-supplied tools. This
paper investigates the feasibility of that interface, using smart pointers
as an example. Smart pointers are a powerful feature of the C++ language
that enable a variety of applications, such as garbage collection, persistence,
and distributed objects. However, while smart pointers can be used in much
the same way as built-in pointers, they are not interchangeable. Using the
Montana API, smart pointer functionality can be introduced for built-in
pointers, thus enabling built-in pointers that act like smart pointers.
We provide an overview of the Montana programming environment and describe
how smart pointers can be implemented using the Montana API. |
Date |
July 1997 |
Comments |
Submitted to: 1997 USENIX
COOTS (accepted) |
Report | Adobe PDF | Compressed PostScript |
CS-97-16 |
Title |
Operation Data Storage
Unification |
Authors |
J.R. Hamilton |
Abstract |
We see a world where personal
system software is self-installing, self-configuring, self-personalizing,
self-administering, and self-healing. In this world, personal system data
is visible in a single name space regardless of owning application and data
location, supports uniform non-procedural query, transactions, and security,
is automatically backed up, is recoverable, and supports portable and disconnected
operation.
This paper focuses on how database technology can be applied to help achieve
many of these goals. We propose an integrated storage manager, the Virtual
Object File System (VOFS), which supports structured files, unstructured
files, and relational access. VOFS also supports disconnected operations
by caching and replicating all data stored via any of the supported persistence
APIs. We show how the combination of support for storage integration and
disconnected and portable operation found in VOFS can substantially reduce
administrative costs and support more compelling client applications.
The VOFS system can be extended to support alternative caching systems,
new access methods based upon past navigational paths and temporal access,
and different software and data billing schemes.
|
Date |
July 1997 |
Report | Adobe PDF | Compressed PostScript |
CS-97-15 |
Title |
A Coordinate
Free Geometry ADT |
Authors |
S. Mann, N. Litke and
A. DeRose |
Abstract |
An algebra for geometric
reasoning is developed that is amenable to software implementation. The
features of the algebra are chosen to support geometric programming of the
variety found in computer graphics and computer aided geometric design applications.
The implementation of the algebra in C++ is described, and several examples
illustrating the use of this software are given.
|
Date |
June 1997 |
Report | Adobe PDF | Compressed PostScript |
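The flavour of a coordinate-free geometry ADT such as the one in CS-97-15 can be suggested by a small type discipline that distinguishes points from vectors and only admits affine combinations of points. The Python classes below illustrate the general idea only; they are not the report's C++ API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Vector:
    coords: tuple
    def __add__(self, other):                   # Vector + Vector -> Vector
        return Vector(tuple(a + b for a, b in zip(self.coords, other.coords)))
    def __rmul__(self, s):                      # scalar * Vector -> Vector
        return Vector(tuple(s * a for a in self.coords))

@dataclass(frozen=True)
class Point:
    coords: tuple
    def __sub__(self, other):                   # Point - Point -> Vector
        return Vector(tuple(a - b for a, b in zip(self.coords, other.coords)))
    def __add__(self, v):                       # Point + Vector -> Point
        if not isinstance(v, Vector):
            raise TypeError("can only add a Vector to a Point")
        return Point(tuple(a + b for a, b in zip(self.coords, v.coords)))

def affine_combination(points, weights):
    """Weighted combination of points; only legal when the weights sum to 1."""
    if abs(sum(weights) - 1.0) > 1e-12:
        raise ValueError("affine combination requires weights summing to 1")
    base = points[0]
    v = Vector((0.0,) * len(base.coords))
    for p, w in zip(points[1:], weights[1:]):
        v = v + w * (p - base)
    return base + v

p, q = Point((0.0, 0.0)), Point((4.0, 2.0))
print(affine_combination([p, q], [0.5, 0.5]))   # midpoint of p and q
# p + q would raise TypeError: adding two Points is geometrically meaningless
```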
CS-97-11 |
Title |
Delay-Insensitivity and
Semi-Modularity |
Authors |
J.A. Brzozowski and H.
Zhang |
Abstract |
A network is delay-dense
if, of each pair of adjacent components in the network, at least one is
a delay. In this paper, we prove that a delay-dense asynchronous network
is delay-insensitive if and only if its behavior is semi-modular. We consider
autonomous networks, i.e., networks without external inputs, whose components
can be arbitrary sequential machines of the Moore type. A formal model for
such networks and their behaviors is established. We define delay-insensitivity
of networks using the notion of bisimulation. The definition of semi-modularity
is generalized to reflect non-determinism in network behaviors.
KEYWORDS: asynchronous, bisimulation, delay-dense, delay-insensitive,
delay extension, isochronic, module, network, semi-modular, speed-independent
|
Date |
March 1997 |
Comments |
Also available here. |
Report | Adobe PDF | Compressed PostScript |
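As a toy companion to CS-97-11, the check below tests a deterministic, gate-level version of semi-modularity (an excited signal may not be disabled by another signal's transition) by brute force over all states. It is far simpler than the report's Moore-machine model with non-determinism and bisimulation-based delay-insensitivity; the examples are invented.

```python
from itertools import product

def is_semi_modular(gates):
    """Brute-force check of a deterministic semi-modularity condition: once a
    signal is excited, firing any other excited signal must not disable it.
    gates[i] maps a global state (tuple of bits) to the next value of signal i;
    signal i is excited in state s when gates[i](s) != s[i]."""
    n = len(gates)
    for s in product((0, 1), repeat=n):
        excited = [i for i in range(n) if gates[i](s) != s[i]]
        for j in excited:
            t = list(s)
            t[j] = gates[j](s)
            t = tuple(t)
            for i in excited:
                if i != j and gates[i](t) == t[i]:
                    return False          # firing j disabled the pending change on i
    return True

# A ring of three inverters: from state (0, 0, 0) every signal is excited,
# but firing one inverter disables its successor, so the check fails.
ring = [lambda s, k=k: 1 - s[(k - 1) % 3] for k in range(3)]
print(is_semi_modular(ring))              # False

# A single self-inverting signal has no other signal to disable it.
print(is_semi_modular([lambda s: 1 - s[0]]))   # True
```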
CS-97-04 |
Title |
Planar Mesh Refinement
Cannot be Both Local and Regular |
Authors |
J.F. Buss and R.B. Simpson |
Abstract |
We show that two desirable
properties for planar mesh refinement techniques are incompatible. Mesh
refinement is a common technique for adaptive error control in generating
unstructured planar triangular meshes for piecewise polynomial representations
of data. Local refinements are modifications of the mesh that involve a
fixed maximum amount of computation independent of the number of triangles
in the mesh. Regular meshes are meshes for which every interior vertex has
degree 6.
At least for some simple model meshing problems, optimal meshes are known
to be regular, hence it would be desirable to have a refinement technique
that, if applied to a regular mesh, produced a larger regular mesh. We call
such a technique a regular refinement. In this paper, we prove that no refinement
technique can be both local and regular. Our results also have implications
for non-local refinement techniques such as Delaunay insertion or Rivara's
refinement. |
Date |
January 1997 |
Report | Adobe PDF | Compressed PostScript |
CS-97-03 |
Title |
A Comparison of 2D and
3D Interfaces for Editing Surfaces Reconstructed from Contours |
Authors |
J.F. Waterhouse |
Abstract |
The last decade of computer
technology has seen the proliferation of computer graphics applications.
As technology advances, there is a growing fascination with three-dimensional
(3D) object representations that likely comes from their greater ability
to match "real life" than their two-dimensional (2D) counterparts. Unfortunately,
the benefits of 3D editing are not without a price. Most techniques for
manipulating objects in a 3D environment are developed for conventional
hardware configurations that use 2D input devices and CRT displays. The
difficulties lie in mapping 3D spatial relationships to 2D displays, and
in mapping 2D user input to 3D object manipulation. This mapping problem
is somewhat mitigated by adding constraints to the degrees of freedom in
the manipulation task. 3D surfaces that have been reconstructed from contours
are interesting to consider as targets of 3D interaction because they provide
an inherent constraint on manipulation: point motion is restricted to a
plane.
As part of my research, I implemented an interactive contour editor to edit
3D surfaces that were reconstructed from planar contours. More precisely,
the editor is a tool for visualising a surface derived from a set of serial
sections, and for removing deformations from this surface. It was designed
specifically to remove artefacts from medical images of arteries.
I used the interface from my editor in an experiment that tested whether
users were faster and more accurate at manipulating surfaces in a 2D environment
or a 3D environment. At the outset of this study, I predicted that 2D would
be better for editing deformations of a 2D nature. That prediction was borne
out by my experimental results. I had also hoped that 3D would be superior
as an editing environment for correcting deformations of a 3D nature. However,
the 2D character of the data had a stronger effect on performance than did
the 3D character of the deformation. Despite the inherent constraints in
the surfaces, participants were faster at editing in 2D for all types of
deformations, while maintaining a consistent accuracy between 2D and 3D.
Participants did perceive a 3D environment to be better than a 2D environment
for manipulating a group of points that spanned multiple contours, although
this was not reflected in the quantitative results. The intuitive preference
for 3D in this situation leads me to believe that it is worth continuing
the search for a natural and effective interface for editing surfaces in
a 3D environment. |
Date |
January 1997 |
Report | Adobe PDF | Compressed PostScript |
CS-97-02 |
Title |
Response Time Properties
of Some Asynchronous Circuits |
Authors |
J. Ebergen and R. Berks |
Abstract |
We discuss response time
properties of linear arrays of cells with various handshake communication
behaviours. The response times of a linear array are the delays between
requests and succeeding acknowledgments for the first cell. We derive simple
formulas for the worst-case response time and amortized response time of
linear arrays using a general variable-delay model, where delays may vary
between a lower and upper bound. The properties are independent of any particular
implementation of the cells of the network.
|
Date |
May 1997 |
Report | Adobe PDF | Compressed PostScript |
CS-97-01 |
Title |
Decomposition of Boolean
Functions Specified by Cubes |
Authors |
J.A. Brzozowski and T.
Luba |
Abstract |
We study the problem of
decomposing a Boolean function into a set of functions with fewer arguments.
This problem has considerable practical importance in VLSI. For example,
for digital circuits designed with field-programmable gate arrays, it is
necessary to express Boolean functions in terms of `smaller' functions that
fit the cells of the array. The decomposition problem is old, and well understood
when the function to be decomposed is specified by a truth table listing
the function's minterms, or has one output only. However, modern design
tools, such as Berkeley's Espresso, handle functions with many outputs and
represent them by Boolean cubes rather than minterms, for reasons of efficiency.
In this paper we develop a decomposition theory for multiple-output, partially
specified Boolean functions represented by cubes. The theory uses ternary
algebra and generalized set systems.
KEYWORDS: blanket, Boolean function, cube, decomposition, disjunctive, don't care,
multiple-output, set system, ternary algebra, type fr |
Date |
January 1997 |
Report | Adobe PDF | Compressed PostScript |
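CS-97-01 generalizes classical decomposition theory to multiple-output, cube-specified functions; for orientation, the sketch below implements only the classical special case it builds on, Ashenhurst's decomposition-chart test for a completely specified single-output function with a disjoint bound set. The example function is invented.

```python
from itertools import product

def decomposable(f, n, bound):
    """Classical Ashenhurst test: f(x) = g(h(x_bound), x_free) with a single
    binary-valued h exists iff the decomposition chart has at most 2 distinct
    columns. f is a function on n-bit tuples; bound lists the bound-set indices."""
    free = [i for i in range(n) if i not in bound]
    columns = set()
    for b in product((0, 1), repeat=len(bound)):
        col = []
        for r in product((0, 1), repeat=len(free)):
            x = [0] * n
            for i, v in zip(bound, b):
                x[i] = v
            for i, v in zip(free, r):
                x[i] = v
            col.append(f(tuple(x)))
        columns.add(tuple(col))          # one chart column per bound-set assignment
    return len(columns) <= 2

# Example: f(a, b, c, d) = (a XOR b) AND (c OR d) decomposes with bound set {a, b}
f = lambda x: (x[0] ^ x[1]) & (x[2] | x[3])
print(decomposable(f, 4, [0, 1]))   # True
print(decomposable(f, 4, [0, 2]))   # False: no such decomposition on {a, c}
```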