2011 Technical Reports | SCS | UW

CS-2011-01
Title Semantics and Encoding of the kell-m Calculus
Authors Rolando Blanco and Paulo Alencar
Abstract We present kell-m, an asynchronous higher-order process algebra with hierarchical localities. The main focus of this report is on the operational semantics and behavioural equivalences for kell-m. The operational semantics determine how systems represented using kell-m evolve; the behavioural equivalences determine what it means for two kell-m processes to behave similarly. We also present an encoding of kell-m into MMC, the variation of the π-calculus implemented in the Mobility Model Checker (MMC).
Date January 13, 2011
Comments
Report     Adobe PDF
CS-2011-02
Title Modelling Distributed Event-Based Systems Using the kell-m Calculus
Authors Rolando Blanco and Paulo Alencar
Abstract In this report we use the kell-m process algebra to develop three models for Distributed Event-Based Systems (DEBSs). The first model is of the DEBS API standard proposed by Pietzuch et al. The second model is for the hierarchical structuring mechanism for components in the REBECA DEBS. The third model is for the internal structure of administrative components in the NaradaBrokering DEBS. These models support the specification of DEBS properties previously proposed in the area using other formalisms. We also show how new properties, based on the locality features provided by kell-m and the ability to passivate kells, can now be specified.

Date January 13, 2011
Comments
Report     Adobe PDF
CS-2011-03
Title Universal Top k Keyword Search over Relational Databases
Authors Ning Zhang, Ihab F. Ilyas and M. Tamer Özsu
Abstract

Keyword search is one of the most effective paradigms for information discovery. One of the key advantages of keyword search querying is its simplicity. There is an increasing need for allowing ordinary users to issue keyword queries without any knowledge of the database schema. The retrieval unit of keyword search queries over relational databases is different from that in IR systems. While the retrieval unit in IR systems is a document, in our case the result is a synthesized document formed by joining a number of tuples (a joined tuple tree, or JTT). We measure result quality using two metrics: structural quality and content quality. The content quality of a JTT is an IR-style score that indicates how well the information nodes match the keywords, while the structural quality of a JTT is a score that evaluates the meaningfulness/semantics of connecting information nodes, for example, the closeness of the corresponding relationship. We design a hybrid approach and develop a buffer system that dynamically maintains a partial data graph in memory. To reuse intermediate results of SQL queries, we break complex SQL queries into two types of simple queries. This allows us to support very large databases and reduce redundant computation. In addition, we conduct extensive experiments on large-scale real datasets to study the performance of the proposed approaches. Experiments show that our approach is better than previous approaches, especially in terms of result quality.

Date Feb 18, 2011
Comments
Report     Adobe PDF
CS-2011-04
Title A Study of Ontology-based Query Expansion
Authors Jiewen Wu, Ihab Ilyas, and Grant Weddell
Abstract With enormous data emerging on the Web, traditional keyword searching is challenged by short queries posed by users to vaguely describe their information need. Query expansion has been researched for decades and a variety of expansion strategies have improved retrieval effectiveness. At present, knowledge-based query expansion approaches are popular as the Web becomes more semantic. This paper studies the state of the art in ontology-based query expansion approaches, and expands on practical strategies to exploit the rich semantics of domain ontologies. This paper, on the one hand, focuses on finding out the success factors for ontology-based query expansion; on the other hand, it emphasizes the tradeoff between the gained retrieval effectiveness and the incurred computation cost.
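As a rough illustration of the general idea (an editorial sketch, not the paper's method), the following Python fragment expands query terms using a toy ontology of related terms; the ontology contents, the union-based expansion, and the cap on expansions are all assumptions.

```python
# Illustrative sketch of ontology-based query expansion (not the paper's method).
# The toy ontology below and the simple union-based expansion are assumptions.

TOY_ONTOLOGY = {
    # term -> related terms (synonyms and narrower concepts)
    "heart attack": {"myocardial infarction", "cardiac arrest"},
    "car": {"automobile", "sedan", "hatchback"},
}

def expand_query(terms, ontology=TOY_ONTOLOGY, max_expansions=3):
    """Return the original terms plus up to `max_expansions` related terms each."""
    expanded = list(terms)
    for term in terms:
        for related in sorted(ontology.get(term, set()))[:max_expansions]:
            if related not in expanded:
                expanded.append(related)
    return expanded

if __name__ == "__main__":
    print(expand_query(["car", "insurance"]))
    # ['car', 'insurance', 'automobile', 'hatchback', 'sedan']
```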
Date February 09, 2011
Comments
Report     Adobe PDF
CS-2011-05
Title Specifying Search Queries for Web Service Discovery
Authors Shahram Esmaeilsabzali, Nancy A. Day and Farhad Mavaddat
Abstract Web services are meant to be easily accessible software systems. With the emergence of Web service technologies and the presence of thousands, or perhaps millions, of Web services in the coming years, the challenge is searching for a Web service that meets a specified requirement. In this paper, we propose methods and models for effectively specifying the search queries in a Web service discovery system. We show that while search queries should be specified precisely enough to contain information for appropriate matching, they should not be too detailed. Users often do not have a clear vision of their desired Web service and a search query should be at a high enough level of abstraction. We propose two models and show how they are appropriate for capturing search queries.
Date February 10, 2011
Comments
Report     Adobe PDF
CS-2011-06
Title SMURFEN: A Knowledge Sharing Intrusion Detection Network
Authors Carol Fung, Quanyan Zhu, Raouf Boutaba and Tamer Basar
Abstract The problem of Internet intrusions has become a world-wide security concern. To protect computer users from malicious attacks, Intrusion Detection Systems (IDSs) are designed to monitor network traffic and computer activities in order to alert users about suspicious intrusions. Collaboration among IDSs allows users to benefit from the collective knowledge and information from their collaborators and achieve more accurate intrusion detection. However, most existing collaborative intrusion detection networks rely on the exchange of intrusion data which raises the privacy concern of participants. To overcome this problem, we propose SMURFEN: a knowledge-based intrusion detection network, which provides a platform for IDS users to effectively share their customized detection knowledge in an IDS community. An automatic knowledge propagation mechanism is proposed based on a decentralized two-level optimization problem formulation, leading to a Nash equilibrium solution which is proved to be scalable, incentive compatible, fair, efficient and robust. We evaluate our rule sharing mechanism through simulations and compare our results to existing knowledge sharing methods such as random gossiping and fixed neighbors sharing schemes.
Date February 11, 2011
Comments
Report   Adobe PDF  
CS-2011-07
Title Multi-target Ray Searching Problems
Authors Spyros Angelopoulos, Alejandro Lopez-Ortiz and Konstantinos Panagiotou
Abstract

We consider the problem of exploring m concurrent rays (i.e., branches) using a single searcher. The rays are disjoint with the exception of a single common point, and in each ray a potential target may be located. The objective is to design efficient search strategies for locating t targets (with t ≤ m). This setting generalizes the extensively studied ray search (or star search) problem, in which the searcher seeks a single target. In addition, it is motivated by applications such as the interleaved execution of heuristic algorithms, when a certain number of heuristics are required to terminate successfully. We apply two different measures for evaluating the efficiency of the search strategy. The first measure is the standard metric in the context of ray-search problems, and compares the total search cost to the cost of an optimal algorithm that has full information on the targets. We present a strategy that achieves optimal competitive ratio under this metric. The second measure is based on a weakening of the optimal cost as proposed by Kirkpatrick [ESA 2009] and McGregor et al. [ESA 2009]. For this model, we present an asymptotically optimal strategy which is within a multiplicative factor of Θ(log(m - t)) from the optimal search cost. Interestingly, our strategy incorporates three fundamental search paradigms, namely uniform search, doubling and hyperbolic dovetailing. Moreover, for both measures, our results demonstrate that the problem of locating t targets in m rays is essentially as difficult as the problem of locating a single target in m - (t - 1) rays.
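For intuition about the setting (an editorial sketch, not the strategies analysed in the report), the following simulation runs a simple round-robin doubling search on m rays until t targets are found; the doubling base and the cost accounting are assumptions.

```python
# Toy round-robin doubling search on m rays (illustrative only; not the
# competitively optimal strategy analysed in the report).
# Assumes at least t of the rays actually contain a target.

def doubling_search(target_depths, t, base=2.0):
    """target_depths[i] is the distance to the target on ray i, or None.
    Returns (total distance walked, set of rays whose targets were found)."""
    m = len(target_depths)
    found = set()
    cost = 0.0
    depth = 1.0
    while len(found) < t:
        for ray in range(m):
            if ray in found:
                continue
            d = target_depths[ray]
            if d is not None and d <= depth:
                cost += 2 * d          # walk out to the target and back
                found.add(ray)
            else:
                cost += 2 * depth      # walk out to the current depth and back
            if len(found) == t:
                break
        depth *= base                  # increase the exploration depth each round
    return cost, found

if __name__ == "__main__":
    print(doubling_search([3.0, None, 7.5, 1.0], t=2))   # (26.0, {0, 3})
```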

Date January 28, 2011
Comments
Report     Adobe PDF
CS-2011-08
Title Faster Optimal Algorithms For Segment Minimization With Small Maximal Value
Authors Therese Biedl, Stephane Durocher, Celine Engelbeen, Samuel Fiorini and Maxwell Young
Abstract

The segment minimization problem consists of finding the smallest set of integer matrices (segments) that sum to a given intensity matrix, such that each summand has only one non-zero value (the segment-value), and the non-zeroes in each row are consecutive. This has direct applications in intensity-modulated radiation therapy, an effective form of cancer treatment. We study here the special case when the largest value H in the intensity matrix is small. We show that for an intensity matrix with one row, this problem is fixed-parameter tractable (FPT) in H; our algorithm obtains a significant asymptotic speedup over the previous best FPT algorithm. We also show how to solve the full-matrix problem faster than all previously known algorithms. Finally, we address a closely related problem that deals with minimizing the number of segments subject to a minimum beam-on-time, defined as the sum of the segment-values. Here, we obtain an almost-quadratic speedup over the previous best algorithm.
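To make the problem statement concrete, here is a small validity checker (an editorial sketch, not the report's algorithm): it verifies that a proposed set of segments, each with a single value and consecutive non-zero columns per row, sums to a given intensity matrix. The data format is an assumption.

```python
# Checker for the segment-minimization problem statement (not the report's
# algorithm). A segment is given as (value, {row: (start_col, end_col)}), with
# end_col exclusive; rows absent from the dict contribute nothing.

def is_valid_decomposition(intensity, segments):
    rows, cols = len(intensity), len(intensity[0])
    total = [[0] * cols for _ in range(rows)]
    for value, row_intervals in segments:
        for r, (lo, hi) in row_intervals.items():
            for c in range(lo, hi):      # consecutive non-zeros in this row
                total[r][c] += value
    return total == intensity

if __name__ == "__main__":
    intensity = [[2, 3, 3, 1],
                 [0, 2, 2, 2]]
    segments = [(2, {0: (0, 3), 1: (1, 4)}),   # value 2 on both rows
                (1, {0: (1, 4)})]              # value 1 on row 0 only
    print(is_valid_decomposition(intensity, segments))  # True
```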

Date February 21, 2011
Comments
Report     Adobe PDF
CS-2011-09
Title Overcoming Adversaries in Sensor Networks: A Survey of Theoretical Models and Algorithmic Approaches for Tolerating Malicious Interference
Authors Maxwell Young and Raouf Boutaba
Abstract Interference is an unavoidable property of the wireless communication medium and, in sensor networks, such interference is exacerbated due to the power-starved nature of the network devices themselves. In the presence of antagonistic interference, reliable communication in sensor networks becomes an extremely challenging problem that, in recent years, has attracted significant attention from the research community. This survey presents the current state of affairs in the formulation of theoretical models for adversarial interference in sensor networks and the different algorithmic remedies developed by the research community. There is a particular focus on jamming adversaries and Byzantine faults as these capture a wide range of benign faults as well as malicious attacks. The models in the literature are examined and contrasted with the aim of discerning the underlying assumptions that dictate analytical bounds with regards to feasibility and a number of performance metrics such as communication complexity, latency, and energy efficiency. Limitations are also highlighted with a focus on how various results impact real world applications and, conversely, how the current sensor network technology informs newer models. Finally, directions for future research are discussed.
Date February 28, 2011
Comments
Report     Adobe PDF
CS-2011-10
Title Improving Health Care with a Virtual Human Sleep Coach
Authors Cristina Ribeiro, Gaurav Mehrotra, Gregory Vey, Areej Alhothali and Chrysanne DiMarco
Abstract Persuasive technology can have a significant effect on people’s health. The sleep coach application is a persuasive technology that aims to make students’ awareness of, and attitudes towards, getting a full night’s sleep (6-9 hours) more positive. Students use the application to attempt to change their sleep patterns as a direct result of the application. NOTE: We will have the results of this research by the final camera-ready paper deadline. We are currently in the process of conducting the experiment.
Date March 01, 2011
Comments
Report     Adobe PDF
CS-2011-11
Title Proportional Contact Representations of Planar Graphs
Authors Md. J. Alam, T. Biedl, S. Felsner, M. Kaufmann and S. G. Kobourov
Abstract We study contact representations for planar graphs, with vertices represented by simple polygons and adjacencies represented by a point-contact or a side-contact between the corresponding polygons. Specifically, we consider proportional contact representations, where given vertex weights are represented by the areas of the corresponding polygons. Several natural optimization goals for such representations include minimizing the complexity of the polygons, the cartographic error, and the unused area. We describe optimal (with respect to complexity) constructive algorithms for proportional contact representations for general planar graphs and planar 2-segment graphs, which include maximal outerplanar graphs and partial 2-trees. Specifically, we show that: (a) 4-sided polygons are necessary and sufficient for a point-contact proportional representation for any planar graph; (b) triangles are necessary and sufficient for point-contact proportional representation of partial 2-trees; (c) trapezoids are necessary and sufficient for side-contact proportional representation of partial 2-trees; (d) convex quadrilaterals are necessary and sufficient for hole-free side-contact proportional representation for maximal outer-planar graphs.
Date March 04, 2011
Comments
Report     Adobe PDF
CS-2011-12
Title Paging for Multicore Processors
Authors Alejandro López-Ortiz and Alejandro Salinger
Abstract Paging for multicore processors extends the classical paging problem to a setting in which several processes simultaneously share the cache. Recently, Hassidim [16] studied cache eviction policies for multicores under the traditional competitive analysis metric, showing that LRU is not competitive against an offline policy that has the power of arbitrarily delaying request sequences to its advantage. In this paper we study caching under the more conservative model in which requests must be served as they arrive. We study the problem of minimizing the number of faults, deriving bounds on the competitive ratios of natural strategies to manage the cache. We then study the offline paging problem and show that deciding if a request can be served such that at a given time each sequence has faulted at most a given number of times is NP-complete and that its maximization version is APX-hard (for an unbounded number of sequences). Lastly, we describe offline algorithms for the decision problem and for minimizing the total number of faults that run in polynomial time in the length of the sequences.
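A minimal simulation of the online model described above (an editorial sketch, not one of the strategies analysed in the paper): several request sequences share one LRU-managed cache and every request is served as it arrives; the interleaving, page names, and cache size are assumptions.

```python
# Shared-cache LRU simulation for interleaved request sequences (illustrative
# of the model only; not the algorithms analysed in the report).
from collections import OrderedDict

def shared_lru_faults(interleaving, cache_size):
    """interleaving: list of (sequence_id, page) pairs in arrival order.
    Returns a dict mapping sequence_id -> number of faults it incurred."""
    cache = OrderedDict()                 # page -> None, in LRU order
    faults = {}
    for seq_id, page in interleaving:
        faults.setdefault(seq_id, 0)
        if page in cache:
            cache.move_to_end(page)       # hit: refresh recency
        else:
            faults[seq_id] += 1           # fault: bring the page in
            if len(cache) >= cache_size:
                cache.popitem(last=False) # evict the least recently used page
            cache[page] = None
    return faults

if __name__ == "__main__":
    reqs = [(0, "a"), (1, "x"), (0, "b"), (1, "y"), (0, "a"), (1, "x")]
    print(shared_lru_faults(reqs, cache_size=3))  # {0: 3, 1: 3}
```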
Date April 26, 2011
Comments
Report     Adobe PDF
CS-2011-13
Title A Bayesian Approach to Online Performance Modeling for Database Appliances using Gaussian Models
Authors Muhammad Bilal Sheikh, Umar Farooq Minhas, Omar Zia Khan, Ashraf Aboulnaga, Pascal Poupart and David J Taylor
Abstract In order to meet service level agreements (SLAs) and to maintain peak performance for database management systems (DBMS), database administrators (DBAs) need to implement policies for effective workload scheduling, admission control, and resource provisioning. Accurately predicting response times of DBMS queries is necessary for a DBA to effectively achieve these goals. This task is particularly challenging due to the fact that a database workload typically consists of many concurrently running queries and an accurate model needs to capture their interactions. Additional challenges are introduced when DBMSes are run in dynamic cloud computing environments, where workload, data, and physical resources can change frequently, on-the-fly. Building an efficient and highly accurate online DBMS performance model that is robust in the face of changing workloads, data evolution, and physical resource allocations is still an unsolved problem. In this work, our goal is to build such an online performance model for database appliances using an experiment-driven modeling approach. We use a Bayesian approach and build novel Gaussian models that take into account the interaction among concurrently executing queries and predict response times of individual DBMS queries. A key feature of our modeling approach is that the models can be updated online in response to new queries or data, or changing resource allocations. We experimentally demonstrate that our models are accurate and effective – our best models have an average prediction error of 16.3% in the worst case.
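For a rough sense of the modelling machinery (an editorial sketch; the report's models, features, and online-update scheme are more involved), the fragment below performs minimal Gaussian-process regression with an RBF kernel in numpy. The hypothetical workload features and all hyperparameters are assumptions.

```python
# Minimal Gaussian-process regression sketch (illustrative only; the report's
# models, features, and online updates are more involved).
import numpy as np

def rbf_kernel(A, B, length_scale=1.0, variance=1.0):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / length_scale**2)

def gp_predict(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean of GP regression at X_test."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_star = rbf_kernel(X_test, X_train)
    return K_star @ np.linalg.solve(K, y_train)

if __name__ == "__main__":
    # Hypothetical features: [number of concurrent scans, number of concurrent joins]
    X = np.array([[1.0, 0.0], [2.0, 1.0], [4.0, 2.0]])
    y = np.array([0.8, 1.9, 4.1])        # observed response times (seconds)
    print(gp_predict(X, y, np.array([[3.0, 1.0]])))
```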
Date December 20, 2011
Comments
Report     Adobe PDF
CS-2011-14
Title Context-Awareness and Adaptation in Distributed Event-Based Systems
Authors Eduardo S. Barrenechea, Paulo S. C. Alencar, Rolando Blanco and Don Cowan
Abstract Context-aware and proactive technologies have been continuously used over the past years to improve user interaction in areas such as searching and information retrieval, health care and mobile computing. In such systems, context information is typically propagated through events. Although there have been significant advances in distributed context-aware systems, there is still a lack of approaches that model and implement context-aware proactive applications involving the combination of context and distributed events. In our research we plan to address these issues by providing a context-aware event model with support for a new context-aware publish-subscribe scheme. Such a context-aware model would introduce context as a first-class element, allowing components to specify the context in advertisements and subscriptions. The goal of our research is to be able to leverage context as part of our event model and bring behaviour adaptation to publication and subscription of events.
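As an illustration of what a context-aware publish-subscribe scheme might look like (an editorial sketch, not the authors' event model), the broker below matches publications to subscriptions only when both the topic and a context predicate are satisfied; all names and structure are assumptions.

```python
# Sketch of context-aware publish/subscribe (not the authors' event model):
# subscriptions name a topic and a context predicate; events are delivered only
# when both the topic and the publication context match.

class ContextBroker:
    def __init__(self):
        self.subscriptions = []   # (topic, context_predicate, callback)

    def subscribe(self, topic, context_predicate, callback):
        self.subscriptions.append((topic, context_predicate, callback))

    def publish(self, topic, context, payload):
        for sub_topic, predicate, callback in self.subscriptions:
            if sub_topic == topic and predicate(context):
                callback(payload, context)

if __name__ == "__main__":
    broker = ContextBroker()
    broker.subscribe("temperature",
                     lambda ctx: ctx.get("room") == "lab",
                     lambda payload, ctx: print("lab reading:", payload))
    broker.publish("temperature", {"room": "lab"}, 21.5)      # delivered
    broker.publish("temperature", {"room": "office"}, 23.0)   # filtered out
```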
Date April 20, 2011
Comments
Report     Adobe PDF
CS-2011-15
Title Compact Navigation and Distance Oracles for Graphs with Small Treewidth
Authors Arash Farzan and Shahin Kamali
Abstract

Given an unlabeled, unweighted, and undirected graph with n vertices and small (but not necessarily constant) treewidth k, we consider the problem of preprocessing the graph to build space-efficient encodings (oracles) to perform various queries efficiently. We assume the word RAM model, where the size of a word is Ω(log n) bits. The first oracle we present is a navigation oracle, which facilitates the primitive navigation operations of adjacency, neighborhood, and degree queries. By way of an enumeration argument, which is of independent interest, we show that the space requirement of the oracle is optimal to within lower-order terms for all treewidths. The oracle supports the mentioned queries in constant worst-case time. The second oracle we present is an exact distance oracle, which facilitates distance queries between any pair of vertices (i.e., an all-pairs shortest-path oracle). The space requirement of this oracle is also optimal to within lower-order terms. Moreover, distance queries are answered in O(k² log³ k) time. In particular, for the class of graphs of our interest, graphs of bounded treewidth (where k is constant), distances are reported in constant worst-case time.

Date May 6, 2011
Comments
Report     Adobe PDF
CS-2011-16
Title Query-Feature Graphs: Bridging User Vocabulary and System Functionality
Authors Adam Fourney, Richard Mann and Michael Terry
Abstract This paper introduces query-feature graphs, or QF-graphs. QF-graphs encode associations between high-level descriptions of user goals (articulated as natural language search queries) and the specific features of an interactive system relevant to achieving those goals. For example, a QF-graph for the GIMP software links the query “GIMP black and white” to the commands “desaturate” and “grayscale.” We demonstrate how QF-graphs can be constructed using search query logs, search engine results, web page content, and localization data from interactive systems. An analysis of QF-graphs shows that the associations produced by our approach exhibit levels of accuracy that make them eminently usable in a range of real-world applications. Finally, we present three hypothetical user interface mechanisms that illustrate the potential of QF-graphs: search-driven interaction, dynamic tooltips, and app-to-app analogy search. Keywords: Query-feature Graph, Search-driven Interaction
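A minimal data-structure sketch of a QF-graph is shown below (an assumed representation for illustration; the paper constructs the associations from query logs, search results, web pages, and localization data). It reuses the GIMP example from the abstract; the weights are assumptions.

```python
# Minimal query-feature graph sketch: weighted edges from search queries to
# system features/commands (an assumed representation; not the paper's
# construction pipeline).
from collections import defaultdict

class QFGraph:
    def __init__(self):
        self.edges = defaultdict(dict)   # query -> {feature: weight}

    def add_association(self, query, feature, weight=1.0):
        self.edges[query][feature] = self.edges[query].get(feature, 0.0) + weight

    def features_for(self, query, top_k=5):
        ranked = sorted(self.edges.get(query, {}).items(),
                        key=lambda kv: kv[1], reverse=True)
        return [feature for feature, _ in ranked[:top_k]]

if __name__ == "__main__":
    g = QFGraph()
    g.add_association("GIMP black and white", "desaturate", 0.9)
    g.add_association("GIMP black and white", "grayscale", 0.7)
    print(g.features_for("GIMP black and white"))  # ['desaturate', 'grayscale']
```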
Date May 6, 2011
Comments
Report     Adobe PDF
CS-2011-17
Title Planar Open Rectangle-of-Influence Drawings with Non-Aligned Frames
Authors Soroush Alamdari and Therese Biedl
Abstract A straight-line drawing of a graph is an open weak rectangle-of-influence (RI) drawing if there is no vertex in the relative interior of the axis-parallel rectangle induced by the endpoints of each edge. No algorithm is known to test whether a graph has a planar open weak RI-drawing, not even for inner triangulated graphs. In this paper, we study RI-drawings that must have a non-aligned frame, i.e., the graph obtained from removing the interior of every filled triangle is drawn such that no two vertices have the same coordinate. We give a polynomial algorithm to test whether an inner triangulated graph has an open weak RI-drawing with non-aligned frame.
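To make the drawing condition concrete, the following editorial sketch (not the paper's testing algorithm) checks whether a given straight-line drawing is an open weak RI-drawing by testing, for every edge, that no other vertex lies in the relative interior of the axis-parallel rectangle spanned by its endpoints.

```python
# Checker for the open weak rectangle-of-influence condition (a sketch, not the
# paper's algorithm): for every edge, no other vertex may lie in the relative
# interior of the axis-parallel rectangle spanned by its endpoints.

def in_relative_interior(p, a, b):
    (px, py), (ax, ay), (bx, by) = p, a, b
    x_lo, x_hi = min(ax, bx), max(ax, bx)
    y_lo, y_hi = min(ay, by), max(ay, by)
    # Degenerate rectangles (shared coordinate) reduce to an open segment.
    inside_x = x_lo < px < x_hi if x_lo < x_hi else px == x_lo
    inside_y = y_lo < py < y_hi if y_lo < y_hi else py == y_lo
    return inside_x and inside_y and p not in (a, b)

def is_open_weak_ri_drawing(positions, edges):
    """positions: vertex -> (x, y); edges: iterable of (u, v) pairs."""
    for u, v in edges:
        for w, p in positions.items():
            if w not in (u, v) and in_relative_interior(p, positions[u], positions[v]):
                return False
    return True

if __name__ == "__main__":
    pos = {"a": (0, 0), "b": (2, 2), "c": (1, 1)}
    print(is_open_weak_ri_drawing(pos, [("a", "b")]))            # False: c is inside
    print(is_open_weak_ri_drawing(pos, [("a", "c"), ("c", "b")]))  # True
```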
Date June 7, 2011
Comments
Report     Adobe PDF
CS-2011-18
Title Sizing the Electrical Grid
Authors Omid Ardakanian, S. Keshav and Catherine Rosenberg
Abstract Transformers and storage batteries in the electrical grid must be provisioned or sized just as routers and buffers must be sized in the Internet. We prove the formal equivalence between these two systems and use this insight to apply teletraffic theory to sizing the electrical grid, obtaining the capacity region corresponding to a given transformer and storage size. To validate our analysis, we conduct a fine-grained measurement study of household electrical load. We compare numerical simulations using traces from this study with results from teletraffic theory. We show not only that teletraffic theory agrees well with numerical simulations but also that it closely matches the heuristics used in current practice. Moreover, our analysis permits us to develop sizing rules for battery storage in the electrical grid, advancing the state of the art.
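The router/buffer analogy can be illustrated with a toy simulation (an editorial sketch under simplified assumptions, not the paper's teletraffic analysis): the transformer acts like a link of fixed capacity and the battery like a buffer that absorbs load above that capacity; losses, efficiency, and charge-rate limits are ignored.

```python
# Toy simulation of the transformer-as-link / battery-as-buffer analogy
# (illustrative assumptions only; not the teletraffic analysis in the report).

def is_adequately_sized(load_kw, transformer_kw, battery_kwh, step_hours=1.0):
    """Return True if the battery can cover every interval in which the
    household load exceeds the transformer capacity."""
    drawn = 0.0                               # energy currently drawn from the battery
    for load in load_kw:
        surplus_energy = (load - transformer_kw) * step_hours
        if surplus_energy > 0:
            drawn += surplus_energy           # battery must supply the excess
            if drawn > battery_kwh:
                return False                  # battery would be exhausted
        else:
            # recharge with spare transformer capacity, never below empty
            drawn = max(0.0, drawn + surplus_energy)
    return True

if __name__ == "__main__":
    trace = [2.0, 3.5, 6.0, 5.5, 2.0, 1.0]    # hypothetical hourly load in kW
    print(is_adequately_sized(trace, transformer_kw=4.0, battery_kwh=3.5))  # True
```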
Date July 4, 2011
Comments
Report     Adobe PDF
CS-2011-19
Title Linear-Time Algorithms for Proportional Contact Graph Representations
Authors Md. J. Alam, T. Biedl, S. Felsner, A. Gerasch, M. Kaufmann and S.G. Kobourov
Abstract In a proportional contact representation of a planar graph, each vertex is represented by a simple polygon with area proportional to a given weight, and edges are represented by adjacencies between the corresponding pairs of polygons. In this paper we study proportional contact representations that use rectilinear polygons without wasted areas (white space). In this setting, the best known algorithm for proportional contact representation of a maximal planar graph uses 12-sided rectilinear polygons and takes O(n log n) time. We describe a new algorithm that guarantees 10-sided rectilinear polygons and runs in O(n) time. We also describe a linear-time algorithm for proportional contact representation of planar 3-trees with 8-sided rectilinear polygons and show that this is optimal, as there exist planar 3-trees that require 8-sided polygons. Finally, we show that a maximal outer-planar graph admits a proportional contact representation with 6-sided rectilinear polygons when the outer-boundary is a rectangle and with 4 sides otherwise.
Date July 5, 2011
Comments
Report     Adobe PDF
CS-2011-20
Title The Return On Investment for Taxi Companies Transitioning to Electric Vehicles
Authors Tommy Carpenter, Andrew R. Curtis, S. Keshav
Abstract We study whether taxi companies can simultaneously save petroleum and money by transitioning to electric vehicles. We propose a process to compute the return on investment (ROI) of transitioning a taxi corporation's fleet to electric vehicles. We use Bayesian data analysis to infer the revenue changes associated with the transition. We do not make any assumptions about the vehicles' mobility patterns; instead, we use a time-series of GPS co-ordinates of the company's existing petroleum-based vehicles to derive our conclusions. As a case study, we apply our process to a major taxi corporation, Yellow Cab San Francisco (YCSF). Using current prices, we find that transitioning their fleet to battery electric vehicles (BEVs) and plug-in hybrid electric vehicles (PHEVs) is profitable for the company. Furthermore, given that gasoline prices in San Francisco are only 5.4% higher than in the rest of the United States, but electricity prices are 75% higher, taxi companies with similar practices and mobility patterns in other cities are likely to profit more than YCSF by transitioning to EVs.
Date Aug 3, 2011
Comments
Report     Adobe PDF
CS-2011-21
Title REWIRE: An Optimization-based Framework for Data Center Network Design
Authors Andrew R. Curtis, Tommy Carpenter, Mustafa Elsheikh, Alejandro López-Ortiz and S. Keshav
Abstract Despite the many proposals for data center network (DCN) architectures, designing a DCN remains challenging. DCN design is especially difficult when expanding an existing network, because traditional DCN design places strict constraints on the topology (e.g., a fat-tree). Recent advances in routing protocols allow data center servers to fully utilize arbitrary networks, so there is no need to require restricted, regular topologies in the data center. Therefore, we propose a data center network design framework, REWIRE, that designs networks using a local search-based algorithm. Our algorithm finds a network with maximal bisection bandwidth and minimal end-to-end latency while meeting user-defined constraints and accurately modeling the cost of the network. We evaluate REWIRE on a wide range of inputs and find that it significantly outperforms previous solutions: its network designs have up to 100–500% more bisection bandwidth and less end-to-end network latency than best-practice data center networks.
Date Aug 4, 2011
Comments
Report     Adobe PDF
CS-2011-22
Title Using Model Checking to Analyze Static Properties of Declarative Models: Extended Version
Authors Amirhossein Vakili and Nancy A. Day
Abstract We show how static properties of declarative models can be efficiently analyzed in a symbolic model checker; in particular, we use Cadence SMV to analyze Alloy models by translating Alloy to SMV. The computational paths of the SMV models represent interpretations of the Alloy models. The produced SMV model satisfies its LTL specifications if and only if the original Alloy model is inconsistent with respect to its finite scopes; counterexamples produced by the model checker are valid instances of the Alloy model. Our experiments show that the translation of many frequently used constructs of Alloy to SMV results in optimized models such that their analysis in SMV is much faster than in the Alloy Analyzer. Model checking is faster than SAT solving for static problems when an interpretation can be eliminated by early decisions in the model checking search.
Date Aug 29, 2011
Comments
Report     Adobe PDF
CS-2011-23
Title Orthogonal Query Expansion
Authors Margareta Ackerman, David Loker, and Alejandro Lopez-Ortiz
Abstract Over the last fifteen years, web searching has seen tremendous improvements. Starting from a nearly random collection of matching pages in 1995, today, search engines tend to satisfy the user's informational need on well-formulated queries. One of the main remaining challenges is to satisfy the users' needs when they provide a poorly formulated query. When the pages matching the user's original keywords are judged to be unsatisfactory, query expansion techniques are used to alter the result set. These techniques find keywords that are similar to the keywords given by the user, which are then appended to the original query leading to a perturbation of the result set. However, when the original query is sufficiently ill-posed, the user's informational need is best met using entirely different keywords, and a small perturbation of the original result set is bound to fail. We propose a novel approach that is not based on the keywords of the original query. We intentionally seek out orthogonal queries, which are related queries that have low similarity to the user's query. The result sets of orthogonal queries intersect with the result set of the original query on a small number of pages. An orthogonal query can access the user's informational need while consisting of entirely different terms than the original query. We illustrate the effectiveness of our approach by proposing a query expansion method derived from these observations that improves upon results obtained using the Yahoo BOSS infrastructure.
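The notion of orthogonality can be made concrete with a short sketch (editorial illustration; the similarity measure and threshold are assumptions, and this is not the paper's expansion method or its use of the Yahoo BOSS infrastructure): two queries are treated as orthogonal when the Jaccard overlap of their result sets is small.

```python
# Sketch of the "orthogonal query" notion (assumed Jaccard measure and
# threshold; not the paper's query expansion method).

def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a or b else 0.0

def is_orthogonal(results_original, results_candidate, threshold=0.1):
    """A candidate query is 'orthogonal' if its result set barely overlaps
    with the original query's result set."""
    return jaccard(results_original, results_candidate) < threshold

if __name__ == "__main__":
    original = {"url1", "url2", "url3", "url4"}
    candidate = {"url4", "url9", "url10", "url11", "url12"}
    print(jaccard(original, candidate))           # 0.125
    print(is_orthogonal(original, candidate))     # False at threshold 0.1
```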
Date Aug 31, 2011
Comments
Report     Adobe PDF
CS-2011-24
Title Group Behaviours around Public Displays
Authors Alec Azad, Jaime Ruiz, Mark Hancock, Edward Lank
Abstract Information kiosks often decorate large public areas to provide basic information to inquisitive patrons. This paper presents an observational study examining groups interacting with public kiosks. We identify fundamental issues regarding patterns in user orientation and layout, group identification, and behaviour both within and between social groups during the entire period of interaction. Based on observations from our study, we present a foundation of guidelines and principles that informs the design of public (vertical) large-screen surfaces.
Date Nov 11, 2011
Comments
Report     Adobe PDF
CS-2011-25
Title A Recognition Safety Net: Bi-Level Threshold Recognition for Mobile Motion Gestures
Authors Matei Negulescu, Jaime Ruiz and Edward Lank
Abstract Designers of motion gestures for mobile devices face the difficult challenge of building a recognizer that can separate gestural input from motion noise. A threshold value is often used to classify motion and effectively balances the rates of false positives and false negatives. We present a bi-level threshold recognizer that is built to lower the rate of recognition failures by accepting either a tightly thresholded gesture or two consecutive possible gestures recognized by a looser model. We evaluate bi-level thresholding with a pilot study in order to gauge its effectiveness as a recognition safety net for users who have difficulty activating a motion gesture. Lastly, we suggest the use of bi-level thresholding to scaffold learning of motion gestures.
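A minimal sketch of the bi-level idea as we read the abstract (not the authors' recognizer; the thresholds and the reset rule are assumptions): a motion is accepted when its recognition score clears a tight threshold, or when two consecutive attempts clear a looser one.

```python
# Bi-level threshold acceptance sketch (an interpretation of the abstract, not
# the authors' recognizer): accept on one high-confidence score, or on two
# consecutive medium-confidence scores.

class BiLevelRecognizer:
    def __init__(self, tight=0.9, loose=0.7):
        assert loose <= tight
        self.tight = tight
        self.loose = loose
        self.previous_was_loose = False

    def feed(self, score):
        """Return True if the motion is accepted as a gesture."""
        if score >= self.tight:
            self.previous_was_loose = False
            return True
        if score >= self.loose:
            if self.previous_was_loose:        # second consecutive loose match
                self.previous_was_loose = False
                return True
            self.previous_was_loose = True
            return False
        self.previous_was_loose = False        # motion noise resets the window
        return False

if __name__ == "__main__":
    r = BiLevelRecognizer()
    print([r.feed(s) for s in [0.75, 0.4, 0.95, 0.8, 0.78]])
    # [False, False, True, False, True]
```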
Date Nov 24, 2011
Comments
Report     Adobe PDF
CS-2011-26
Title Stop Using “Users”! An Examination of Word Usage in CHI Literature and the Impact of Objectifying People
Authors Adam Bradley, Mark Hancock and Sheelagh Carpendale
Abstract In this paper we describe, through a philological and philosophical investigation, what we think the effects of the word “user” are on HCI research. By recognizing that words themselves carry historical meanings and semiotic/semantic suggestions, we will attempt to show that a shift in terminology, replacing “user” with “human” or “person”, will positively affect not just our use of jargon, but our fundamental concerns with design. Through a study of the word itself, its historical usage in the English language, and then its usage in the corpus of CHI scholarship, we will attempt to show that a consideration of these factors could drastically change how we envision and interpret the work of our own community.
Date Nov 23, 2011
Comments
Report     Adobe PDF
CS-2011-27
Title The Point-set embeddability Problem for Plane Graphs
Authors Therese Biedl and Martin Vatshelle
Abstract In this paper, we study the Point-set embeddability problem, i.e., given a planar graph and a set of points, is there a mapping of the vertices to the points such that the resulting straight-line drawing is planar? This problem is NP-hard if the embedding can be chosen, but becomes polynomial for triangulated graphs of treewidth 3. We show here that in fact it can be answered for all planar graphs with a fixed embedding that have constant treewidth and constant face-degree. We also prove that as soon as one of the conditions is dropped (i.e., either the treewidth is unbounded or some faces have large degrees), Point-set embeddability with a fixed embedding becomes NP-hard. The NP-hardness holds even for a 3-connected planar graph with constant treewidth, triangulated planar graphs, or 2-connected outer-planar graphs.
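For the problem statement only, here is a brute-force editorial sketch (nothing like the paper's treewidth-based algorithms, and it ignores the fixed-embedding variant): it tries every assignment of vertices to points and checks that no two vertex-disjoint edges properly cross, assuming points in general position.

```python
# Brute-force point-set embeddability check (illustrates the problem only; the
# report gives polynomial algorithms for bounded treewidth and face-degree).
# Assumes points in general position (no three collinear).
from itertools import permutations

def orient(p, q, r):
    return (q[0] - p[0]) * (r[1] - p[1]) - (q[1] - p[1]) * (r[0] - p[0])

def cross(a, b, c, d):
    """True if open segments ab and cd properly cross."""
    return (orient(a, b, c) * orient(a, b, d) < 0 and
            orient(c, d, a) * orient(c, d, b) < 0)

def embeddable(vertices, edges, points):
    for mapping in permutations(points, len(vertices)):
        pos = dict(zip(vertices, mapping))
        ok = True
        for i, (u, v) in enumerate(edges):
            for x, y in edges[i + 1:]:
                if {u, v} & {x, y}:
                    continue                    # edges sharing a vertex never "cross"
                if cross(pos[u], pos[v], pos[x], pos[y]):
                    ok = False
                    break
            if not ok:
                break
        if ok:
            return pos                          # a planar straight-line mapping
    return None

if __name__ == "__main__":
    K4 = [("a", "b"), ("a", "c"), ("a", "d"), ("b", "c"), ("b", "d"), ("c", "d")]
    pts = [(0, 0), (4, 0), (2, 3), (2, 1)]      # one point inside the triangle
    print(embeddable(list("abcd"), K4, pts) is not None)  # True
```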
Date Nov 29, 2011
Comments
Report     Adobe PDF
CS-2011-28
Title Information Technology to Support Indigenous Peoples
Authors D.D. Cowan, F.M. McGarry, H. Moran, D.D. McCarthy and C. King
Abstract Indigenous peoples face challenges retaining their traditional knowledge and culture and in negotiating with governments and proponents of resource exploitation, settlement and infrastructure development on their traditional lands. This paper outlines the issues faced by indigenous peoples and describes the Dreamcatcher software system. Dreamcatcher is a powerful web‐based information system containing interactive mapping tools, mediated social networks, geo‐spatial consultation services and a security model that can support indigenous peoples in the endeavours just described. Interactive mapping is a key tool since landscapes valued by indigenous peoples can be integral elements of indigenous cultural identity.
Date Nov 29, 2011
Comments
Report     Adobe PDF
CS-2011-29
Title Mapping Big-Step Modeling Languages to SMV
Authors Fathiyeh Faghih and Nancy A. Day
Abstract We propose an algorithm for creating a semantics-based, parameterized translator from the family of big-step modeling languages (BSMLs) to the input language of the model checker SMV. Our translator takes as input a specification in the CHTS notation and a set of user-provided parameters that encode the specification's semantics; it produces an SMV model suitable for model checking. We use a modular approach for translation, which means that the structure of the resulting SMV model matches the source CHTS structure.
Date Dec 23, 2011
Comments
Report     Adobe PDF
CS-2011-30
Title On the Modeling of Light Interactions with Human Blood
Authors D. Yim, G.V.G. Baranoski, T.F. Chen, B.W. Kimmel and E. Miranda
Abstract The development of predictive appearance models for organic tissues is a challenging task due to the inherent complexity of these materials. In this report, we closely examine the biophysical processes responsible for the appearance attributes of whole blood, one of the most fundamental of these materials. We describe a new appearance model that simulates the mechanisms of light propagation and absorption within the cellular and fluid portions of this specialized tissue. The proposed model employs a comprehensive, and yet flexible first principles approach based on the morphological, optical and biochemical properties of blood cells. This approach allows for environment-driven changes in the cells’ anatomy and orientation to be appropriately included into the light transport simulations. The correctness and predictive capabilities of the proposed model are quantitatively and qualitatively evaluated through comparisons of modeled results with actual measured data and experimental observations reported in the scientific literature. Its incorporation into rendering systems is illustrated through images of blood samples depicting appearance variations controlled by physiologically meaningful parameters. Besides the contributions to the modeling of material appearance, the research presented in this report is also expected to have applications in a wide range of biomedical areas, from optical diagnostics to the visualization and noninvasive imaging of blood-perfused tissues.
Date Dec 21, 2011
Comments
Report     Adobe PDF