Browsing by Author "Denzinger, Jörg"
Now showing 1 - 20 of 35
Item Open Access
A cooperative distributed data mining model and its application to medical data on diabetes (2004)
Gao, Jie; Denzinger, Jörg
We present CoLe, a cooperative distributed system model for mining knowledge from heterogeneous data. CoLe allows for the cooperation of different learning algorithms and the combination of the mined knowledge into knowledge structures no individual learner can produce. CoLe organizes the work in rounds, so that knowledge discovered by one learner can help the others in the next round. We implemented a CoLe-based system for mining diabetes data, including a genetic algorithm for learning event sequences, improvements to the PART algorithm for our problem, and combination methods to produce hybrid rules containing both conjunctive and sequence conditions. In our experiments, the CoLe-based system outperformed the individual learners, producing better rules and more rules of a given quality. Our improvements to the individual learners also proved useful. From the medical perspective, our system confirmed that hypertension is closely related to diabetes, and it also suggested connections that were new to medical doctors.
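Editor's note: the round-based cooperation described above can be sketched in a few lines. This is a minimal illustration only; the names (Learner, mine, combine) and the quality threshold are ours, not from the paper:

```python
# Minimal sketch of CoLe-style round-based cooperative mining.
# All names and parameters here are illustrative, not from the paper.

def combine(rules):
    """Stand-in for CoLe's combination methods, which build hybrid
    rules (e.g. conjunctive + sequence conditions) from learner output."""
    return [r for r in rules if r.get("quality", 0.0) > 0.5]

def cole_rounds(data, learners, rounds=3):
    shared = []        # knowledge visible to every learner
    hybrid = []
    for _ in range(rounds):
        fresh = []
        for learner in learners:
            # Each learner may seed its search with what the other
            # learners discovered in earlier rounds.
            fresh.extend(learner.mine(data, shared))
        hybrid = combine(shared + fresh)
        shared = fresh  # this round's discoveries feed the next round
    return hybrid
```

The essential point is the shared hand-off between rounds, which is what lets one algorithm's partial results steer another's search.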
Item Open Access
A daily scrum meeting summarizer for agile software development teams (2007)
Park, Shelly; Maurer, Frank Oliver; Denzinger, Jörg

Item Open Access
A hybrid agent architecture for learning good cooperative behaviours for game characters (2012)
Paskaradevan, Sanjeev; Denzinger, Jörg
Creating intelligent game agents is a difficult problem, especially in non-deterministic games where the outcome of an action cannot be determined with certainty. When we add the complexity of team-based games and the need for cooperation between the agents, the problem becomes even more complex. In this work, we provide a hybrid agent architecture that can be used to create and train agents to play team-based games. Our architecture uses role-based agent pools, communicated intentions, reinforcement learning, and evolutionary learning to train a team of agents. We test our architecture by applying it to the turn-based strategy game Battle for Wesnoth and demonstrate that we can train effective agents that work together in teams to win against the built-in Wesnoth artificial intelligence.

Item Open Access
Agent-based cooperative heterogeneous data mining (2012)
Gao, Jie; Denzinger, Jörg
This thesis presents an agent-based cooperative data mining model named CoLe2. CoLe2 is targeted at performing data mining on large, heterogeneous data sets. It employs multiple different types of data mining algorithms, enables cooperation among these algorithms, and produces combined results in the form of rules. CoLe2 is a multi-agent system with three types of agents, whose respective roles are running data mining algorithms, combining mining results, and driving the entire CoLe2 work flow with knowledge-based strategies. The system has a work flow with two levels of loops. The outer loop performs data selection, mining algorithm selection and expectation adjustment strategies. The inner loop performs data mining execution and result combination, with additional knowledge-based strategies implemented in the agents. The agents exchange useful information during the running of the work flow to help each other. A prototype system of the CoLe2 model is described. This prototype contains four different data mining algorithms (a classification algorithm, a sequence mining algorithm, an association rule mining algorithm and a descriptive mining algorithm), two combination strategies and instantiations of the knowledge-based strategies. These instantiations include data selection based on a clustering algorithm, an asynchronous work flow for better turnaround time, relevance factor calculation, fuzzy condition matching, prediction-histogram-based rule similarity and rule grouping. Experiments have been performed with two data sets: a medium-sized data set of billing data from the Calgary Health Region, and a large data set from the Alberta Kidney Disease Network. The experimental results show advantages of CoLe2 over individual data mining algorithms in terms of efficiency and result quality, as well as advantages over the CoLe model with only one level of work flow. Specialized experiments also confirm the effectiveness of the individual knowledge-based strategies.

Item Open Access
Automatic Inspection of Radio Astronomical Surveys (AIRAS) (2016)
Said, Dina Adel; Barker, Kenneth Edwin; Stil, Jeroen Maarten; Fiege, Jason; Rokne, Jon; Denzinger, Jörg; Leahy, Denis
This research investigates the problem of analyzing radio astronomical surveys (RAS) to automatically identify groups of objects forming patterns that astronomers are interested in finding. The visual inspection of RAS to find these interesting patterns requires a great deal of time and effort to go through thousands of images, and can be infeasible in very crowded and noisy images. To tackle this problem, this research presents AIRAS, the first reported system for the automatic inspection of RAS. AIRAS consists of two main stages: (i) object finding, in which all objects in the RAS are found and represented in a graph-based representation called the astronomy graph (AG); and (ii) pattern querying and retrieval, in which astronomers specify the characteristics of interesting patterns in a query form, after which AIRAS finds patterns matching these characteristics in the AG and presents them to astronomers for further investigation. Astronomers can use AIRAS to detect patterns known to be suspicious (i.e., consisting of false astronomical objects or artifacts). Among these patterns are the hexagonal pattern (HP) and the zigzag pattern (ZP). In the HP, objects form a hexagon shape with an object in the middle, similar to the shape of the front end of the Arecibo telescope horn. In the ZP, objects are aligned at an orientation to the horizontal axis similar to the scanning line of the radio telescope. These two patterns are used as case studies to evaluate AIRAS's performance using images from the GALFACTS project, a project carried out at the University of Calgary in cooperation with several research institutes worldwide. The experimental studies show that AIRAS is a promising system that finds patterns in RAS in response to astronomers' queries with acceptable accuracy. Additionally, AIRAS can be extended to connect the patterns found with their physical signals to provide more insight into the nature of these patterns.
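Editor's note: stage (ii) of AIRAS, pattern querying over the astronomy graph, can be illustrated with a toy query. Here the AG is reduced to a dict of object positions and the hexagon test is deliberately crude; the thesis's representation and matching are much richer:

```python
# Toy sketch of AIRAS-style pattern querying over an "astronomy
# graph" (AG). The AG is reduced to {object_id: (x, y)} here.
import math

def neighbours(ag, obj, radius):
    """All objects within `radius` of `obj`."""
    x0, y0 = ag[obj]
    return [o for o, (x, y) in ag.items()
            if o != obj and math.hypot(x - x0, y - y0) <= radius]

def hexagonal_candidates(ag, radius, tolerance=1):
    """Objects with roughly six close neighbours are candidates for
    the hexagonal pattern (HP) of false detections."""
    return [o for o in ag
            if abs(len(neighbours(ag, o, radius)) - 6) <= tolerance]

# Example: a central object surrounded by a hexagon of artifacts.
ag = {i: (math.cos(i * math.pi / 3), math.sin(i * math.pi / 3))
      for i in range(6)}
ag["centre"] = (0.0, 0.0)
print(hexagonal_candidates(ag, radius=1.1))   # ['centre'] expected
```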
Item Open Access
Automatically Characterizing Logging Usage: An Application of Anti-Unification (2016)
Zirakchianzadeh, Narges; Walker, Robert; Ruhe, Guenther; Denzinger, Jörg
Logging is a common practice for recording the runtime behaviour of a software system, typically performed by inserting log statements in its source code. While several frameworks have been created specifically to help developers perform logging tasks, these do not provide guidance on where log statements should be located in the source code, so developers usually rely on their common sense to decide where to log. If logging is done properly, it can provide valuable information for software development and maintenance; if it is done poorly, system performance can degrade and maintenance can become more difficult. Few studies have been conducted to characterize logging usage in real-world applications. This work addresses the problem of where to log by proposing an automated approach that characterizes the location of log statements through the approximation of an anti-unification approach (specifically, higher-order anti-unification modulo theories) and a hierarchical clustering technique, constructing a set of anti-unifiers that each describe the commonalities and differences between source code fragments that embody log statements. This approach has been reified in a prototype tool, called ELUS, that greedily identifies the best structural correspondences with respect to the highest similarity and some constraints. An empirical study was conducted by applying the tool to the source code of four open source systems and manually examining the generated anti-unifiers; the analysis resulted in five main categories of anti-unifiers in logging usage. Two empirical evaluations were conducted: (1) an experiment evaluating the effectiveness of the proposed approach through the application of its supporting tool on a test suite; and (2) an experiment evaluating the quality of the anti-unifiers in describing the location of log statements in source code.
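Editor's note: the core computation, greedy clustering of log-bearing code fragments into anti-unifiers, can be sketched as follows. Real ELUS anti-unifies syntax trees using higher-order anti-unification modulo theories; flat, equal-length token sequences with a hole marker are a deliberately crude stand-in:

```python
# Crude sketch of greedy anti-unifier clustering over token
# sequences (real ELUS works on ASTs, not flat tokens).
HOLE = "?"

def anti_unify(a, b):
    """Generalize two equal-length token sequences: keep agreements,
    put a hole where they differ."""
    return [x if x == y else HOLE for x, y in zip(a, b)]

def similarity(a, b):
    matches = sum(1 for x, y in zip(a, b) if x == y and x != HOLE)
    return matches / max(len(a), len(b))

def cluster(fragments, threshold=0.6):
    work = [list(f) for f in fragments]
    while len(work) > 1:
        # Greedily pick the most similar remaining pair.
        i, j = max(((i, j) for i in range(len(work))
                    for j in range(i + 1, len(work))),
                   key=lambda p: similarity(work[p[0]], work[p[1]]))
        if similarity(work[i], work[j]) < threshold:
            break   # nothing similar enough left to generalize
        merged = anti_unify(work[i], work[j])
        work = [f for k, f in enumerate(work) if k not in (i, j)]
        work.append(merged)
    return work   # each survivor is an anti-unifier for one cluster

frags = [["log", "error", "(", "e", ")"],
         ["log", "warn", "(", "e", ")"],
         ["db", "commit", "(", ")", ";"]]
print(cluster(frags))  # first two merge into ['log', '?', '(', 'e', ')']
```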
Item Open Access
Cooperative control of multiple airborne agents: a GIS-centered approach (2005)
Carroll, Seamus; Denzinger, Jörg

Item Open Access
Design and Analysis of an Intelligent Decision Support System for Trading and its Application to Electricity Trading (2013-09-23)
Maurice, Sebastian Augustine; Ruhe, Guenther; Denzinger, Jörg
The financial trading market is a highly complex and dynamic system, which is the limiting factor preventing any model from accurately predicting its movements. Because of this, trading in a market can be risky for individuals and institutions, which may experience financial losses that impact the overall economy; finding approaches to minimize the risk in trading remains an ongoing research challenge. The focus of this PhD research is a multi-agent based simulation approach that provides decision support to traders to help minimize the risk from trading. We address four problems in the existing research on decision support systems for trading: 1) lack of a modeling framework, 2) lack of direction on modeling personas, 3) lack of direction on how to provide decision support to traders, and 4) lack of analysis of forecast quality. The main contributions of the research are the design, analysis, development and validation of a new decision support paradigm for trading called T-Evolve*, together with a new intelligent decision support technology called TRAMAS. The paradigm together with TRAMAS supports traders by allowing them to simulate different market models composed of agents with different personas and forecast beliefs. Exploring and analysing the simulation results provides guidance to traders on potential market outcomes, which is then used to develop a trading plan for tomorrow's market. The core element of TRAMAS is the incorporation of actors with different personas and forecast beliefs, instantiated by agents. The advantage of our approach is twofold. First, different personas of participants exist in every market, so incorporating personas is a natural representation of the real market. Second, forecast beliefs are natural to participants, because forecasts about the future market play a critical role in how a market may develop tomorrow. Two industry-oriented case studies with empirical evidence support our approach. The validation of our approach by nine industry experts confirms, within the context of this research, that our approach has merit and can be useful in a real-world setting. TRAMAS also predicts the direction of future markets with 77% accuracy over six different days.
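Editor's note: the central TRAMAS idea, agents that instantiate personas with forecast beliefs and a market simulated repeatedly to map out possible outcomes, might look like the following toy sketch. The two personas, the price-update rule, and every parameter are invented for illustration:

```python
# Toy persona-based market simulation in the spirit of TRAMAS.
# Personas, price dynamics and all constants are invented.
import random

class TraderAgent:
    def __init__(self, persona, forecast):
        self.persona = persona        # e.g. "aggressive" or "cautious"
        self.forecast = forecast      # belief about tomorrow's price

    def order(self, price):
        edge = self.forecast - price  # buy if the agent expects a rise
        return edge * (2.0 if self.persona == "aggressive" else 0.5)

def simulate(agents, price=100.0, steps=50):
    for _ in range(steps):
        net_demand = sum(a.order(price) for a in agents)
        price += 0.01 * net_demand + random.gauss(0, 0.1)
    return price

agents = [TraderAgent(random.choice(["aggressive", "cautious"]),
                      forecast=100 + random.gauss(0, 5))
          for _ in range(20)]
# Repeated runs give the trader a distribution of possible outcomes.
closing = [simulate(agents) for _ in range(200)]
print("mean simulated close:", sum(closing) / len(closing))
```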
Item Open Access
DisSLib:CC (2011)
Kendon, Tyson; Denzinger, Jörg
DisSLib:CC (Distributed Search Library: Common Central) is a library for creating distributed search systems. It allows developers to take advantage of the now-common access to multiple processors, whether as part of a multi-core processor in a single machine or as part of a cluster of computational power. Some software can exploit this hardware configuration without much extra work, but some software needs special consideration to benefit from multiple processors, especially knowledge-based search. Knowledge-based search lets us find good solutions to hard problems: it combines knowledge about how to solve problems, and knowledge about the problems to be solved, with the raw processing power of a computer. In knowledge-based search systems, solutions are found by taking small computational steps, some of which may be in the right direction and some of which may be wrong. Because of these small steps and the need to evaluate which choices were good, simply spreading the steps across multiple processors without a plan does not work. Fortunately, there are paradigms for distributed knowledge-based search that let us use multiple processors to speed search up, and given the universality of these paradigms, it is possible to use them to design libraries that let developers create distributed search systems. DisSLib:CC was created to let developers build distributed search systems that solve their problems as quickly as possible, without having to do the hard parts (handling communication) or the repetitive parts (handling configuration, writing logs, and search protocols). DisSLib:CC allows developers to create distributed search systems from sequential search systems with minimal extra code. Two systems developed using DisSLib:CC were used to test aspects of the configuration of distributed search systems and showed speedups once the right conditions were found.

Item Open Access
DisSLib:ICA (2013-09-13)
Hoang, Dang; Denzinger, Jörg
Knowledge-based search is a set of concepts that can help software developers build better search systems. Until recently, developers often built sequential knowledge-based search systems (making use of only a single processor core) because sequential systems are easier to develop, and for further speed improvements they could rely on new processors running at faster clock rates. However, with the recent trend in processor design, clock rates have stagnated while processors have moved towards multiple processing cores. To make use of these additional processing capabilities, developers now need to distribute their search across the available cores. Building distributed search systems is a difficult and time-consuming process: not only are distributed systems hard to develop, but knowledge-based search systems are not readily distributable. Fortunately, different distribution paradigms provide guidelines for how the search process can be distributed. This thesis introduces DisSLib:ICA, a software library for building distributed knowledge-based search systems based on the improving-on-the-competition-approach paradigm. The main goal of DisSLib:ICA is to allow developers to build distributed search systems in the same manner, and with the same amount of effort, as it would normally take to build a sequential search system. It achieves this by handling the communication and multi-threading tasks and by providing developers a skeleton structure of a search system that can be extended to fit the developer's concrete search problem. To evaluate DisSLib:ICA, we built three search systems that solve different problems using the library. Our results show that the library allows developers to build distributed systems with approximately the same amount of effort as building a sequential system. In addition, our experiments show that by using the improving-on-the-competition-approach paradigm, the library produces synergistic speedups.
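Editor's note: the "skeleton" idea shared by both libraries, where the developer supplies the sequential search pieces and the library supplies distribution, communication and bookkeeping, can be gestured at like this. The hook names below are our invention; the actual APIs of DisSLib:CC and DisSLib:ICA will differ:

```python
# Sketch of the skeleton idea behind DisSLib:CC / DisSLib:ICA.
# All hook names are invented for illustration.
from abc import ABC, abstractmethod
from concurrent.futures import ProcessPoolExecutor

class SearchSkeleton(ABC):
    # --- the part the developer fills in, as for a sequential system ---
    @abstractmethod
    def initial_state(self, seed):
        """Build a starting search state from a seed."""

    @abstractmethod
    def step(self, state):
        """One small computational step; returns the next state."""

    @abstractmethod
    def quality(self, state):
        """Score a state so competing searches can be compared."""

    # --- the part the library provides ---
    def run_one(self, seed, steps=1000):
        state = self.initial_state(seed)
        for _ in range(steps):
            state = self.step(state)
        return self.quality(state), seed

    def run_distributed(self, seeds):
        # Fan competing searches out over processes, keep the best;
        # subclasses must be module-level to be picklable.
        with ProcessPoolExecutor() as pool:
            return max(pool.map(self.run_one, seeds))
```

A developer would subclass SearchSkeleton, implement the three hooks for their concrete problem, and call run_distributed, which is roughly the "same effort as a sequential system" claim both theses make.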
Item Open Access
EEIA: The Extended Efficiency Improvement Advisor (2018-07-25)
Nygren, Nicholas; Denzinger, Jörg; Aycock, John; Kattan, Lina
In the past, the Efficiency Improvement Advisor (EIA) has been successfully applied to several dynamic problems: by learning recurring tasks, it was able to correct inefficient behavior in multi-agent systems. We present an extension to the advisor which allows certain known-ahead knowledge to be exploited. This extension unobtrusively guides autonomous agents to follow a plan while retaining the dynamic abilities of those agents, and, unlike other similar approaches that introduce planning functionality, it does not require always-on communications. The extended advisor's planning abilities work in tandem with the original learning abilities to create additional efficiency gains. The abilities of the extended advisor (including the introduction of planning, the preservation of dynamism, and the mixing of certain knowledge with learned knowledge) are evaluated in two different problem domains: first the familiar arcade game Whack-a-Mole, and then Pickup and Delivery, which is similar to coordinating a taxi service.

Item Open Access
EUKARYO: An Agent-Based, Interactive, Virtual Reality Simulation of a Eukaryotic Cell (2016)
Yuen, Douglas Wing-Kwok; Jacob, Christian; Denzinger, Jörg; Katz, Larry
Game engines provide sophisticated tools for rendering virtual environments, which makes them well suited to constructing detailed virtual worlds. In this thesis, we present Eukaryo, an interactive, 3D model of a eukaryotic cell implemented in Unity and Unreal Engine. Eukaryo utilises mathematical modelling for enzyme kinetics and agent-based modelling to illustrate the interactions between the individual proteins in the cell. Using this hybrid modelling, Eukaryo is able to model the interactions of a diverse range of biological processes, such as protein activation pathways and cytoskeleton self-assembly. Further, Eukaryo provides support for virtual reality devices, such as the CAVE and the Oculus Rift headset, so users can immerse themselves in a virtual biomolecular environment. By combining state-of-the-art biological simulations with 3D visualisations and real-time interactivity, Eukaryo provides an innovative environment for exploring cellular processes and conveying the complexity of the systems that constitute the machinery of life.
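Editor's note: the mathematical half of Eukaryo's hybrid model concerns enzyme kinetics. The abstract does not say which rate law is used, so as a hedged illustration here is the standard Michaelis-Menten equation, advanced with a fixed time step such as an agent scheduler might use between frames; all constants are made up:

```python
# Illustrative Michaelis-Menten enzyme kinetics (the standard rate
# law; Eukaryo's actual equations are not specified in the abstract,
# and all constants here are invented).
def michaelis_menten(substrate, v_max=1.0, k_m=0.5):
    """Reaction rate v = v_max * [S] / (K_m + [S])."""
    return v_max * substrate / (k_m + substrate)

# Advance substrate depletion with a fixed time step, as a per-frame
# agent scheduler might:
s, dt = 2.0, 0.01
for _ in range(5):
    s -= michaelis_menten(s) * dt
    print(f"[S] = {s:.4f}")
```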
Item Open Access
Evaluating the Emergent Effects of (Multiple) Security Mechanisms via Evolutionary Algorithms (2018-11-30)
Hudson, Jonathan William; Denzinger, Jörg; Williamson, Carey L.; Safavi-Naeini, Reyhaneh
Security mechanisms provide protection against system penetration and exploitation by providing coverage for vulnerabilities. However, security mechanisms often have demanding operational requirements that necessitate access to system resources and control of monitoring points. At the same time, users have particular requirements for the programs they install, how they interact with these programs, and what performance they expect from their computing system. These combined requirements create a selection problem in which the user wants to balance security coverage, through a choice of security mechanism(s), with system performance and functionality; this is known as the Effective Security-in-Depth problem. First, this thesis introduces a genetic algorithm enabling an evolutionary search for interaction event sequences for the Effective Security-in-Depth problem. This methodology required the development of a fitness function that integrates numerous system metrics while addressing the variance found in event sequence simulation and measurement. Next, the steps for effectively implementing this methodology as a software tool are described. Finally, this thesis introduces three processes that use the tool to select between single security mechanisms for different usage profiles, to compare and contrast subsets of security mechanisms, and to evaluate examples of emergent misbehaviour such as system failure. The initial experimental evaluation validates the ability of the search for interaction event sequences to make progress despite the challenges of stochastic system measurement. The remaining experimental evaluations demonstrate a successful application of each of the three processes. The evaluation supports the conclusion that the developed method, tool, and processes are a viable solution to the Effective Security-in-Depth problem.
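Editor's note: the shape of that evolutionary search, with fitness averaged over repeated stochastic measurements, is easy to sketch. The event alphabet, the measurement stand-in, and all parameters below are placeholders, not details from the thesis:

```python
# Sketch of a genetic search over interaction event sequences with a
# variance-aware fitness. Everything concrete here is invented.
import random

EVENTS = ["open_app", "browse", "download", "install", "scan"]  # invented

def measure(seq):
    # Stand-in for simulating the sequence on the instrumented system
    # and reading back metrics; deliberately noisy.
    return sum(len(e) for e in seq) / len(seq) + random.gauss(0, 1)

def fitness(seq, repeats=5):
    # Average repeated measurements to tame the simulation variance.
    return sum(measure(seq) for _ in range(repeats)) / repeats

def mutate(seq):
    child = list(seq)
    child[random.randrange(len(child))] = random.choice(EVENTS)
    return child

def evolve(pop_size=20, length=8, generations=30):
    pop = [[random.choice(EVENTS) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(pop_size - len(parents))]
    return pop[0]   # best event sequence found

print(evolve())
```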
Item Open Access
Evoattack - testing operating systems by learning harmful system call sequences (2003)
Williams, Timothy Raymond; Denzinger, Jörg

Item Open Access
evoExplore: Multiscale Visualization for Evolutionary Design Histories (2018-01-26)
Kelly, Justin; Jacob, Christian; Alim, Usman; Denzinger, Jörg
evoExplore is a system designed to assist evolutionary design projects. Built in the Unity 3D game engine and designed with future development and expansion in mind, evoExplore allows the user to collect and visualize data from evolutionary design experiments in 3D. The user is given the tools needed to breed their own evolving populations, record the results of such evolutionary experiments, and then visualize the recorded data as a series of 3D columns and rings representing the experiments and their populations over time. evoExplore allows the user to dynamically explore their own evolutionary experiments, as well as those produced by other users, in order to better understand the data these evolutionary systems produce. In this document we describe the features of evoExplore, the engine it was built in, the use of virtual reality in evoExplore, and the contributions our system brings to the field.

Item Open Access
Evolutionary Algorithm for Adaptive Quantum-Channel Control (2019-01-23)
Palittapongarnpim, Pantita; Sanders, Barry C.; Wiseman, Howard M.; Simon, Ch; Hobill, David W.; Denzinger, Jörg
The key to successful implementation of quantum technologies is quantum control, whose aim is to steer quantum dynamics such that the desired outcome is achieved. Quantum control techniques rely on models of the quantum dynamics to generate control policies that attain the control targets. In a practical situation, the dynamical model may not match the dynamics of the implementation, and this mismatch can lead to reduced performance or even a failed control procedure. Data-driven control has been proposed as an alternative to model-based control design: measurement outcomes from the system are used to generate a policy, which enables robust control without the need for a noise model. The potential of data-driven quantum control has been demonstrated on the problem of quantum-enhanced adaptive phase estimation; however, the performance and robustness of data-driven policies have never been compared with those of model-based control techniques. In this thesis, we aim to determine the advantages and disadvantages of model-based and data-driven policy generation, using simulated quantum-enhanced adaptive phase estimation as an example of a quantum control task. In the process, we explore the connection between an adaptive quantum-enhanced metrological procedure and a decision-making process, which offers an alternative model of the dynamics during the control task. We also devise a robust search algorithm, based on an evolutionary algorithm, that is ignorant of the properties of the phase noise but is still able to deliver quantum-enhanced precision. We then compare feedback control policies designed using Bayesian inference, a model-based technique, with policies generated by this robust evolutionary algorithm, in both noisy and noiseless interferometers. We also assess the resources used in generating and implementing a control policy, and use the complexities of the time and space costs as part of selecting a practical control procedure.
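Editor's note: as a rough illustration of that noise-agnostic evolutionary policy search, the toy below evolves a real-valued feedback policy against a simulated estimation task. The interferometer model, the policy encoding, and every constant are ours; only the overall loop structure reflects the approach described:

```python
# Toy noise-agnostic evolutionary policy search for adaptive phase
# estimation. The measurement model and all constants are invented.
import math
import random

def run_estimation(policy, true_phase, noise=0.2):
    """Each policy weight steers one adaptive feedback step; returns
    the circular estimation error."""
    estimate = 0.0
    for weight in policy:
        signal = math.cos(true_phase - estimate) + random.gauss(0, noise)
        estimate += weight * signal
    diff = (true_phase - estimate + math.pi) % (2 * math.pi) - math.pi
    return abs(diff)

def fitness(policy, trials=40):
    # Average error over random phases; note: no model of the noise.
    total = sum(run_estimation(policy, random.uniform(0, 2 * math.pi))
                for _ in range(trials))
    return -total / trials

def evolve(dim=6, pop_size=16, generations=40):
    pop = [[random.uniform(-1, 1) for _ in range(dim)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:pop_size // 2]
        pop = parents + [[w + random.gauss(0, 0.1)
                          for w in random.choice(parents)]
                         for _ in range(pop_size - len(parents))]
    return pop[0]

best_policy = evolve()
```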
Item Open Access
Exploring Context for Privacy Protection in North American Higher Education and Beyond (2020-01)
Wu, Leanne; Barker, Ken E.; Denzinger, Jörg; Oehlberg, Lora A.; Lock, Jennifer V.; Veletsianos, George
Undergraduate students in North American post-secondary institutions are subject to a wide range of data collection. This includes data generated in the course of teaching and learning, but can also include a wide range of other aspects of modern life, such as closed-circuit security cameras, internet and wireless network use, and what students buy and consume. This makes the post-secondary institution an ideal model for understanding the privacy impact of modern and future technologies, as a single organization which collects, and potentially uses, wide-ranging amounts and kinds of data about our daily lives. This thesis proposes a framework which separates context into three interrelated layers so that systems can be designed that more fully protect the privacy of individuals, examines the ways in which we collect and use data about undergraduate students, and makes a quantitative study of undergraduate privacy behaviours and attitudes. We thus present the case that context is a core concept for designing privacy protections that better serve undergraduate students.

Item Open Access
Exploring Higher Dimensional Photography Using Programmable Cameras (2013-07-10)
Willson, Christopher; Boyd, Jeffrey; Denzinger, Jörg
Consumer-grade cameras can be programmed by technical users with an interest in photography, but the potential of this capability is still being investigated. We explore some of the capabilities of programmable cameras through the example of capturing what we term a four-dimensional (4D) volume of image data. Access to a 4D volume of image data allows for the creation of photographs that are visually interesting but technically challenging to work with. We deal with the inherent complexity by building tools, using computer vision and image processing techniques, both at the level of desktop software and inside the camera itself; our intention is to assist artists and photographers in managing this complexity. The assistance we provide is evaluated through a user study with artists and photographers. Now that programmable cameras have become available, our work shows how embedding image processing inside a consumer camera, along with other computer vision tools, can expand photographers' creative possibilities.

Item Open Access
Haptic Flesh: Aesthetics of Electronic Touch (2018-02-27)
Kryzhanivska, Oksana; Hushlak, Gerald; Boyd, Jeffrey E.; Cahill, Susan; Denzinger, Jörg; Kolarevic, Branko; Grimm, Cindy; Taron, Joshua M.
Identifying the need for novel design approaches to developing vibrotactile interfaces, this study explores ways to integrate tactile sensing and feedback into interactive three-dimensional organic sculptures. With tactile responsive materials and electronics, this practice-based inquiry explores the question of how haptic technology can become a platform of generative, creative expression. Specifically, this study focuses on generating and conveying meaning during different stages of art production, with emphasis on the design of haptic experiences with flexible sculpture; it responds to the concerns of interactive art practices, scientific inquiries into haptics, and new media studies. In this inquiry, the exploration of embedding tactile sensing and feedback technology manifests through the development of composite responsive materials and the generation of sculptural form. Following a bottom-up approach to design, the variations of physical elements, such as materials, vibrotactile feedback, electronic components, human interactions, and the understanding of tactile aesthetics, become artistic media, each equally significant as part of a production system. Furthermore, this system organizes these variables with reference to a three-dimensional model of thinking based on designing with topological forms. The abstract notions materialize in this system through a metaphoric sensory encoding and are actualized in computer-controlled tactile responses emitted from within the sculptural objects. The resulting interactive artifacts demonstrate this approach and share the topological way of three-dimensional thinking with participating audiences, who engage with these interfaces by touching the soft, flesh-like surfaces of organ-like sculptures. From these interactions, new understandings arise about the ability to generate tactile sensory experiences and their meanings. Furthermore, these works begin to pose questions about the new norm of perceiving three-dimensional art, and about what these experiences mean for the understanding of technological body augmentation and for the future of electronically mediated tactile interaction in embedded, prosthetic, and organic flexible interfaces.
Item Open Access
Identifying the Problems of Software Re-architecting and a Knowledge Representation Framework to Address Them (2018-06-25)
Moazzen, Elham; Walker, Robert J.; Denzinger, Jörg; Oehlberg, Lora A.; Anvik, John; Hu, Yaoping
Real-world software undergoes constant change: to fix bugs; to extend functionality; to interact with the changing "ecosystem" around it; and to make internal improvements. Non-trivial software must possess a software architecture: a division into smaller pieces, how those pieces are meant to interact, and how those pieces are deployed physically. Because a software architecture can have a significant impact on important properties of the software, the architecture may need to change as the system itself changes: this is software re-architecting. Unfortunately, software re-architecting is poorly understood; without understanding what it involves and what problems people encounter in approaching it, we cannot help solve or avoid those problems. I begin this thesis with a case study of a real-world software re-architecting, for which documentation and records of discussions were available, to find basic issues that arose during the process. I also conducted a series of interviews with software engineers centred on those issues to deepen our understanding of the process, and discovered the notion of discrete change steps that must be organized and coordinated. I identify a set of critical challenges that must be addressed by any concrete solution. The software engineers lacked a systematic approach to the communication and record management of change steps, suggesting a set of design guidelines for future collaboration tools tailored to re-architecting: such tools should facilitate viewing, recording, and retrieving change steps, and should support communication within and between the levels of the development team. I then propose a knowledge representation framework for the change process in asynchronous collaboration; this framework is a first step toward a re-architecting collaboration tool that would help systematize the change process without disrupting it. I developed a paper prototype of the framework and conducted a user evaluation study to determine whether the new approach meets the needs of software engineers working on a re-architecting. My study suggests that the approach supported by the prototype allows software engineers to present changes to their team better than traditional mechanisms, thereby enabling them to consider more detail. I illustrate the potential value of the framework as a platform for deeper study and further investment in tools, highlighting promising areas for future research.