Graduate Theses & Dissertations

ADAPT
This thesis focuses on the design of a modelling framework that loosely couples a sequence of spatial and process models and procedures to predict future flood events for the years 2030 and 2050 in Tabasco, Mexico. Temperature and precipitation data for those future years from the Hadley Centre Coupled Model (HadCM3) were downscaled using the Statistical DownScaling Model (SDSM 4.2.9). These data were then used, along with a variety of digital spatial data and models (current land use, soil characteristics, surface elevation and rivers), to parameterize the Soil and Water Assessment Tool (SWAT) model and predict flows. Flow data were then input into the Hydrologic Engineering Center's River Analysis System (HEC-RAS) model, which mapped the areas expected to be flooded based on the predicted flow values. Results from this modelling sequence are images of flood extents, which are then ported to an online tool (ADAPT) for display. The results of this thesis indicate that, under current predictions of climate change, the city of Villahermosa, Tabasco, Mexico, and the surrounding area will experience a substantial amount of flooding. Therefore, adaptation planning needs to begin immediately. Author Keywords: Adaptation Planning, Climate Change, Extreme Weather Events, Flood Planning, Simulation Modelling
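The loose-coupling pattern described above can be sketched compactly: each model runs as an independent stage, and the framework only passes files between stages. The following Python sketch is illustrative only; the command names (swat, hec-ras) and file names are hypothetical stand-ins, not the actual interfaces used in the thesis.

    import subprocess

    def run_stage(command, output_file):
        """Run one external model, then hand its output file to the next stage."""
        subprocess.run(command, check=True)   # fail fast if a stage breaks
        return output_file

    # Stage 1: statistically downscaled temperature/precipitation (SDSM output)
    climate_2050 = "sdsm_downscaled_2050.csv"
    # Stage 2: SWAT consumes the climate series plus spatial inputs, emits flows
    flows = run_stage(["swat", "--weather", climate_2050], "predicted_flows.csv")
    # Stage 3: HEC-RAS consumes the flows and emits a flood-extent raster
    extent = run_stage(["hec-ras", "--flows", flows], "flood_extent_2050.tif")
    print("flood-extent image ready for display in ADAPT:", extent)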
An Investigation of Load Balancing in a Distributed Web Caching System
With the exponential growth of the Internet, performance is an issue as bandwidth is often limited. A scalable solution that reduces the amount of bandwidth required is Web caching. Web caching (especially at the proxy level) has been shown to be quite successful at addressing this issue. However, as the number and needs of the clients grow, it becomes infeasible and inefficient to rely on a single Web cache. To address this concern, the Web caching system can be set up in a distributed manner, allowing multiple machines to work together to meet the needs of the clients. Further efficiency could be achieved by balancing the workload across all the Web caches in the system. This thesis investigates the benefits of load balancing in a distributed Web caching environment in order to improve response times and reduce bandwidth usage. Author Keywords: adaptive load sharing, Distributed systems, Load Balancing, Simulation, Web Caching
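One common way to combine caching with load balancing, sketched below in Python, is to hash each URL to a "home" cache (so each object tends to live in one place) and divert requests to the least-loaded cache when the home cache is overloaded. This is an illustrative sketch of adaptive load sharing in general, not the specific policy evaluated in the thesis; the cache names and overload threshold are invented.

    import hashlib

    caches = {"cache-a": 0, "cache-b": 0, "cache-c": 0}  # name -> current load
    OVERLOAD = 10                                        # requests in flight

    def pick_cache(url):
        names = sorted(caches)
        home = names[int(hashlib.md5(url.encode()).hexdigest(), 16) % len(names)]
        if caches[home] >= OVERLOAD:               # adaptive load-sharing step
            home = min(caches, key=caches.get)     # divert to least-loaded cache
        caches[home] += 1
        return home

    print(pick_cache("http://example.com/index.html"))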
An Investigation of the Impact of Big Data on Bioinformatics Software
As the generation of genetic data accelerates, Big Data has an increasing impact on the way bioinformatics software is used. Experiments are becoming larger and more complex than software designers originally envisioned. One way to deal with this problem is parallel computing. Using the program Structure as a case study, we investigate ways to counteract the challenges created by growing datasets. We propose an OpenMP and an OpenMP-MPI hybrid parallelization of the MCMC steps, and analyse their performance in various scenarios. The results indicate that the parallelizations produce significant speedups over the serial version in all scenarios tested. This allows the available hardware to be used more efficiently by adapting the program to the parallel architecture. This is important not only because it reduces the time required to perform existing analyses, but also because it opens the door to new analyses that were previously impractical. Author Keywords: Big Data, HPC, MCMC, parallelization, speedup, Structure
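The thesis parallelizes Structure's MCMC steps with OpenMP and, in the hybrid version, MPI across nodes. As a language-neutral illustration of the underlying idea (not the thesis code), the Python sketch below runs independent Metropolis chains on separate workers; each worker stands in for what an OpenMP thread or MPI rank would do.

    import math
    import random
    from multiprocessing import Pool

    def mcmc_chain(seed, steps=100_000):
        """Toy Metropolis sampler targeting a standard normal distribution."""
        rng = random.Random(seed)
        x, accepted = 0.0, 0
        for _ in range(steps):
            prop = x + rng.gauss(0, 1)
            if rng.random() < min(1.0, math.exp((x * x - prop * prop) / 2)):
                x, accepted = prop, accepted + 1
        return accepted / steps

    if __name__ == "__main__":
        with Pool(4) as pool:                 # 4 workers ~ 4 threads/ranks
            rates = pool.map(mcmc_chain, [1, 2, 3, 4])
        print("acceptance rate per chain:", rates)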
Augmented Reality Sandbox (Aeolian Box)
The AeolianBox is an educational and presentation tool, extended in this thesis to represent atmospheric boundary layer (ABL) flow over a deformable surface in a sandbox. It is a hybrid hardware and mathematical model which helps users visually, interactively and spatially grasp the natural laws governing ABL airflow. The AeolianBox uses a Kinect V1 camera to capture a Digital Elevation Model (DEM) of the topography within the sandbox, and a short-focal-length projector to project imagery back onto that topography. The captured DEM is used to generate a Computational Fluid Dynamics (CFD) model, and the resulting ABL flow is projected back onto the surface topography within the sandbox. The AeolianBox is designed to be used in a classroom setting, which requires a low time cost for the ABL flow simulation to keep students engaged. Thus, the DEM capture and CFD modelling processes were investigated to lower the time cost while maintaining key features of the ABL flow structure. A mesh-time sensitivity analysis was conducted to investigate the tradeoff between the number of cells in the mesh and the time cost of both the meshing process and the CFD modelling. This allows the user to make an informed decision about the level of detail desired in the ABL flow structure by changing the number of cells in the mesh. There are infinitely many surface topographies that can be created by molding the sand inside the sandbox. Therefore, in addition to keeping the time cost low while maintaining key features of the ABL flow structure, the meshing process and CFD modelling are required to be robust to a variety of surface topographies. To achieve these research objectives, the meshing process and CFD modelling were parameterized in this thesis. The accuracy of the CFD model for ABL flow used in the AeolianBox was qualitatively validated against airflow profiles captured in the Trent Environmental Wind Tunnel (TEWT) at Trent University using a Laser Doppler Anemometer (LDA). Three simple geometries, namely a hemisphere, a cube and a ridge, were selected since they are well studied in the literature. The CFD model was scaled to the dimensions of the grid where the airflow was captured in TEWT, and the boundary conditions were kept the same as in the model used in the AeolianBox. The ABL flow is simulated with OpenFOAM and visualized with ParaView. The AeolianBox is interactive and capable of detecting hands using the Kinect camera, which allows a user to change the topography of the sandbox in real time. The AeolianBox software built for this thesis uses only open-source tools and is accessible to anyone with an existing hardware model of its predecessors. Author Keywords: Augmented Reality, Computational Fluid Dynamics, Kinect Projector Calibration, OpenFoam, Paraview
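The mesh-time tradeoff above can be made concrete with a small sketch: coarsening the captured DEM before meshing reduces the cell count, and therefore the meshing and CFD time, at the cost of surface detail. The Python below uses a synthetic array as a stand-in for a Kinect depth frame; it illustrates the tradeoff and is not the AeolianBox code.

    import numpy as np

    dem = np.random.rand(480, 640)       # stand-in for a Kinect V1 depth frame

    def coarsen(dem, factor):
        """Average non-overlapping factor x factor blocks of the DEM."""
        h = (dem.shape[0] // factor) * factor
        w = (dem.shape[1] // factor) * factor
        blocks = dem[:h, :w].reshape(h // factor, factor, w // factor, factor)
        return blocks.mean(axis=(1, 3))

    for factor in (1, 2, 4, 8):
        cells = coarsen(dem, factor).size
        print(f"coarsening factor {factor}: {cells} surface cells to mesh")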
Characteristics of Models for Representation of Mathematical Structure in Typesetting Applications and the Cognition of Digitally Transcribing Mathematics
The digital typesetting of mathematics can present many challenges to users, especially those at novice to intermediate experience levels. Through a series of experiments, we show that two models used to represent mathematical structure in these typesetting applications, the 1-dimensional structure-based model and the 2-dimensional freeform model, cause interference with users' working memory during the process of transcribing mathematical content. This is a notable finding, as a connection between working memory and mathematical performance has been established in the literature. Furthermore, we find that elements of these models allow them to handle various types of mathematical notation with different degrees of success. Notably, the 2-dimensional freeform model allows users to insert and manipulate exponents with increased efficiency and reduced cognitive load and working memory interference, while the 1-dimensional structure-based model handles the fraction structure with greater efficiency and decreased cognitive load. Author Keywords: mathematical cognition, mathematical software, user experience, working memory
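As an illustration of the distinction (the specific editors studied in the thesis are not named here), LaTeX's linear syntax is a familiar example of a 1-dimensional structure-based encoding: a 2-D layout such as a fraction is transcribed as a nested, left-to-right string, whereas a 2-D freeform editor lets the user place numerator, bar and denominator spatially.

    % 1-dimensional structure-based transcription of the quadratic formula:
    x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
    % the \frac{...}{...} nesting is typed linearly, then rendered in 2-D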
Cloud Versus Bare Metal
A comparison of two high-performance computing clusters, running on AWS and on SHARCNET, was done to determine which scenarios yield the best performance. Algorithm complexity ranged from O(n) to O(n³). Data sizes ranged from 195 KB to 2 GB. The SHARCNET hardware consisted of Intel E5-2683 and Intel E7-4850 processors with memory sizes ranging from 256 GB to 3072 GB. On AWS, C4.8xlarge instances were used, which run on Intel Xeon E5-2666 processors with 60 GB of memory per instance. AWS was able to launch jobs immediately regardless of job size. The only limiting factors on AWS were algorithm complexity and memory usage, suggesting a memory bottleneck. SHARCNET had the best performance but could be hampered by its job scheduler. In conclusion, SHARCNET is best used when the algorithm is complex and has high memory usage, while AWS is best used when immediate processing is required. Author Keywords: AWS, cloud, HPC, parallelism, Sharcnet
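A harness in the spirit of that benchmark is sketched below: time workloads of different algorithmic complexity and watch where runtime starts to dominate. The sizes here are illustrative toys, far smaller than the 195 KB to 2 GB inputs used in the actual comparison.

    import time
    import numpy as np

    def o_n(v):
        return v.sum()                 # O(n) reduction

    def o_n3(m):
        return m @ m @ m               # roughly O(n^3) matrix-multiply chain

    for n in (100, 200, 400):
        v, m = np.random.rand(n * n), np.random.rand(n, n)
        for name, fn, arg in (("O(n)", o_n, v), ("O(n^3)", o_n3, m)):
            start = time.perf_counter()
            fn(arg)
            print(f"n={n:4d} {name:7s} {time.perf_counter() - start:.4f}s")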
Development of a Cross-Platform Solution for Calculating Certified Emission Reduction Credits in Forestry Projects under the Kyoto Protocol of the UNFCCC
This thesis presents an exploration of the requirements for, and development of, a software tool to calculate Certified Emission Reduction (CER) credits for afforestation and reforestation projects conducted under the Clean Development Mechanism (CDM). We examine the relevant methodologies and tools to determine what is required to create a software package that can support a wide variety of projects involving diverse data and computations. During requirements gathering, it was determined that the software package would need to support entering and editing equations at runtime. To create the software we used Java as the programming language, an H2 database to store our data, and an XML file to store our configuration settings. Through these choices, we built a cross-platform software solution for the purpose outlined above. The end result is a versatile software tool through which users can create and customize projects to meet their unique needs, as well as use the features provided to streamline the management of their CDM projects. Author Keywords: Carbon Emissions, Climate Change, Forests, Java, UNFCCC, XML
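The tool itself is written in Java, but its key requirement, equations entered and edited at runtime, can be illustrated compactly in Python: parse an equation string, reject anything that is not plain arithmetic, then evaluate it against project variables. The equation and variable names below are invented for illustration and are not taken from a CDM methodology.

    import ast

    ALLOWED = (ast.Expression, ast.BinOp, ast.UnaryOp, ast.Constant, ast.Name,
               ast.Load, ast.Add, ast.Sub, ast.Mult, ast.Div, ast.Pow, ast.USub)

    def safe_eval(equation, variables):
        """Evaluate a runtime-entered arithmetic equation over named variables."""
        tree = ast.parse(equation, mode="eval")
        for node in ast.walk(tree):               # whitelist the syntax used
            if not isinstance(node, ALLOWED):
                raise ValueError(f"disallowed syntax: {type(node).__name__}")
        return eval(compile(tree, "<equation>", "eval"),
                    {"__builtins__": {}}, variables)

    # e.g. carbon stock = area * biomass density * carbon fraction
    print(safe_eval("area * density * 0.47", {"area": 120.0, "density": 95.5}))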
Educational Data Mining and Modelling on Trent University Students’ Academic Performance
Higher education is important. It enhances both individual and social welfare by improving productivity, life satisfaction, and health outcomes, and by reducing rates of crime. Universities play a critical role in providing that education. Because academic institutions face resource constraints, it is important that they deploy resources in support of student success in the most efficient ways possible. To inform that efficient deployment, this research analyzes institutional data reflecting undergraduate student performance to identify predictors of student success, measured by GPA, rates of credit accumulation, and graduation rates. Using methods of cluster analysis and machine learning, the analysis yields predictions of the probability of individual success. Author Keywords: Educational data mining, Students' academic performance modelling
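A minimal sketch of the clustering step follows: group student records by a few performance features, then inspect outcomes per cluster. The data here is synthetic and the feature names are illustrative; they are not the institutional variables analyzed in the thesis.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # columns: high-school average, first-year GPA, first-year credits earned
    students = rng.normal(loc=[80, 2.8, 4.0], scale=[8, 0.6, 0.8], size=(500, 3))

    model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(students)
    for k in range(3):
        members = students[model.labels_ == k]
        print(f"cluster {k}: n={len(members)}, mean GPA={members[:, 1].mean():.2f}")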
Exploring the Scalability of Deep Learning on GPU Clusters
In recent years, we have observed an unprecedented rise in the popularity of AI-powered systems. They have become ubiquitous in modern life, used by countless people every day. Many of these AI systems are powered, entirely or partially, by deep learning models. From language translation to image recognition, deep learning models are being used to build systems with unprecedented accuracy. The primary downside is the significant time required to train the models. Fortunately, training time is reduced through the use of GPUs rather than CPUs. However, with model complexity ever increasing, training times even with GPUs are on the rise. One possible solution to ever-increasing training times is to use parallelization to enable the distributed training of models on GPU clusters. This thesis investigates how to utilise clusters of GPU-accelerated nodes to achieve the best scalability possible, thus minimising model training times. Author Keywords: Compute Canada, Deep Learning, Distributed Computing, Horovod, Parallel Computing, TensorFlow
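The keywords point at one standard recipe for this: data-parallel training with Horovod on TensorFlow, where each process drives one GPU, gradients are averaged across processes every step, and the learning rate is scaled with the worker count. The sketch below follows Horovod's usual Keras pattern with a placeholder model and random data; it is illustrative, not the thesis configuration, and would be launched with something like horovodrun -np 4 python train.py.

    import tensorflow as tf
    import horovod.tensorflow.keras as hvd

    hvd.init()                                        # one process per GPU
    gpus = tf.config.list_physical_devices("GPU")
    if gpus:
        tf.config.set_visible_devices(gpus[hvd.local_rank()], "GPU")

    model = tf.keras.Sequential([tf.keras.layers.Dense(10, activation="softmax")])
    opt = hvd.DistributedOptimizer(                   # allreduce on gradients
        tf.keras.optimizers.SGD(0.01 * hvd.size()))   # scale LR with workers
    model.compile(loss="sparse_categorical_crossentropy", optimizer=opt)

    x = tf.random.normal((1024, 32))
    y = tf.random.uniform((1024,), maxval=10, dtype=tf.int64)
    model.fit(x, y, batch_size=64, epochs=1,
              verbose=1 if hvd.rank() == 0 else 0,
              callbacks=[hvd.callbacks.BroadcastGlobalVariablesCallback(0)])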
Fraud Detection in Financial Businesses Using Data Mining Approaches
The purpose of this research is to apply four methods to two data sets, a synthetic dataset and a real-world dataset, and compare the results with the intention of arriving at methods to prevent fraud. The methods used are Logistic Regression, Isolation Forest, an Ensemble Method and Generative Adversarial Networks (GANs). Results show that all four models achieve accuracies between 91% and 99%, except Isolation Forest, which gave 69% accuracy on the synthetic dataset. The four models detect fraud well when built on a training set and evaluated on a test set. Logistic Regression achieves good results with less computational effort. Isolation Forest achieves lower accuracy when the data is sparse and not preprocessed correctly. The Ensemble Method achieves the highest accuracy for both datasets. The GAN achieves good results but overfits if a large number of epochs is used. Future work could incorporate other classifiers. Author Keywords: Ensemble Method, GAN, Isolation forest, Logistic Regression, Outliers
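Two of the four approaches can be sketched briefly: supervised Logistic Regression versus the unsupervised Isolation Forest, which flags anomalies rather than learning from labels. The dataset below is generated for illustration and is neither of the datasets studied in the thesis.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import IsolationForest
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # imbalanced two-class data: roughly 3% "fraud" (class 1)
    X, y = make_classification(n_samples=5000, weights=[0.97], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    logreg = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("logistic regression accuracy:", logreg.score(X_te, y_te))

    iso = IsolationForest(contamination=0.03, random_state=0).fit(X_tr)
    pred = (iso.predict(X_te) == -1).astype(int)      # -1 = anomaly = fraud
    print("isolation forest accuracy:", (pred == y_te).mean())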
Historic Magnetogram Digitization
The conversion of historical analog images to time series data was performed using deconvolution for pre-processing, followed by custom-built digitization algorithms. These algorithms were developed to be user-friendly, with the objective of aiding in the creation of a data set from decades of mechanical observations collected at the Agincourt and Toronto geomagnetic observatories beginning in the 1840s. The algorithms follow a structure which begins with pre-processing, followed by tracing and pattern detection. Each digitized magnetogram was then visually inspected and the algorithm's performance verified to ensure accuracy, allowing the data to later be joined into a long-running time series. Author Keywords: Magnetograms
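The tracing step admits a compact sketch: after pre-processing, each time step (image column) is reduced to the position of the darkest pixel, turning the pen trace into a one-value-per-column series. The image below is synthetic; the real pipeline also performs deconvolution and pattern detection around this stage.

    import numpy as np

    img = np.full((200, 600), 255.0)             # white scan, 600 time steps
    rows = (100 + 40 * np.sin(np.linspace(0, 6, 600))).astype(int)
    img[rows, np.arange(600)] = 0.0              # dark curve = the pen trace

    trace = img.argmin(axis=0)                   # darkest row in each column
    print("recovered series, first 5 samples:", trace[:5])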
Machine Learning Using Topology Signatures For Associative Memory
This thesis presents a technique to produce signatures from topologies generated by the Growing Neural Gas algorithm. The generated signatures have the following characteristics: a signature's memory footprint is smaller than that of the "real object", and it represents a point in an n × m multidimensional space. Signatures can be compared by Euclidean distance, and the distances between signatures provide a measure of the differences between models. A signature can be associated with a concept and then used as a learning step for a classification algorithm. The signatures are normalized and vectorized for clustering in a multidimensional space. Although the technique is generic in essence, it was tested by classifying handwritten alphabetic and numeric characters and 2D figures, obtaining good accuracy and precision. It can be used for many other purposes related to the classification of shapes and abstract topologies, and for associative memory. Future work could incorporate other classifiers. Author Keywords: Associative memory, Character recognition, Machine learning, Neural gas, Topological signatures, Unsupervised learning
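The signature idea can be illustrated with a sketch: reduce a learned topology, here a stand-in set of node positions rather than a real Growing Neural Gas run, to a fixed-length normalized vector so that any two shapes can be compared with a single Euclidean distance. The ordering and resampling choices below are invented for the illustration.

    import numpy as np

    def signature(nodes, size=16):
        """Normalize node positions to [0, 1] and resample to a fixed length."""
        lo = nodes.min(axis=0)
        nodes = (nodes - lo) / (nodes.max(axis=0) - lo + 1e-9)
        order = np.argsort(nodes[:, 0])          # impose a canonical order
        idx = np.linspace(0, len(nodes) - 1, size).astype(int)
        return nodes[order][idx].ravel()         # point in a 2*size-dim space

    a = signature(np.random.rand(40, 2))         # topologies of different sizes
    b = signature(np.random.rand(55, 2))         # map to equal-length vectors
    print("distance between signatures:", np.linalg.norm(a - b))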
