Graduate Theses & Dissertations

Solving Differential and Integro-Differential Boundary Value Problems using a Numerical Sinc-Collocation Method Based on Derivative Interpolation
In this thesis, a new sinc-collocation method based upon derivative interpolation is developed for solving linear and nonlinear boundary value problems involving differential as well as integro-differential equations. The sinc-collocation method is chosen for its ease of implementation, exponential convergence of error, and ability to handle singularities in the BVP. We present a unique method of treating boundary conditions and introduce the concept of the stretch factor into the conformal mappings of domains. The result is a method that achieves high accuracy while reducing computational cost; in most cases, its results greatly exceed the published results of comparable methods in both accuracy and efficiency. The method is tested on the Blasius and Lane-Emden problems and generalised to cover Fredholm-Volterra integro-differential problems. The results show that the sinc-collocation method with derivative interpolation is a viable and preferable method for solving nonlinear BVPs. Author Keywords: Blasius, Boundary Value Problem, Exponential convergence, Integro-differential, Nonlinear, Sinc
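The abstract cites exponential convergence without reproducing the formulas. For orientation only, the standard sinc basis and cardinal expansion underlying any sinc-collocation scheme (textbook notation, not the thesis's own derivation) are:

    \[
      S(j,h)(x) = \operatorname{sinc}\!\left(\frac{x - jh}{h}\right),
      \qquad
      \operatorname{sinc}(x) = \frac{\sin(\pi x)}{\pi x},
    \]
    \[
      f(x) \approx \sum_{j=-N}^{N} f(jh)\, S(j,h)(x).
    \]

With a suitable step size h proportional to 1/sqrt(N), the approximation error decays like O(exp(-c sqrt(N))), which is the exponential convergence the abstract refers to; collocation imposes the equation at the sinc nodes x = jh after conformally mapping the problem domain onto the real line.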
Fraud Detection in Financial Businesses Using Data Mining Approaches
The purpose of this research is to apply four methods to two datasets, a synthetic dataset and a real-world dataset, and compare the results with the intention of arriving at methods to prevent fraud. The methods used are Logistic Regression, Isolation Forest, an Ensemble Method, and Generative Adversarial Networks (GANs). Results show that all four models achieve accuracies between 91% and 99%, except Isolation Forest, which gave 69% accuracy on the synthetic dataset. All four models detect fraud well when built on a training set and evaluated on a test set. Logistic Regression achieves good results with less computational effort. Isolation Forest achieves lower accuracy when the data is sparse and not preprocessed correctly. The Ensemble Model achieves the highest accuracy on both datasets. The GAN achieves good results but overfits if a large number of epochs is used. Future work could incorporate other classifiers. Author Keywords: Ensemble Method, GAN, Isolation forest, Logistic Regression, Outliers
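A minimal sketch of the comparison pattern the abstract describes, using two of the four methods on an imbalanced synthetic stand-in (scikit-learn; the thesis's datasets, features, and tuning are not reproduced here):

    # Hypothetical sketch: supervised vs. unsupervised fraud detection on
    # a synthetic imbalanced dataset (~3% "fraud" class).
    from sklearn.datasets import make_classification
    from sklearn.ensemble import IsolationForest
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=10_000, n_features=20,
                               weights=[0.97, 0.03], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                              random_state=0)

    # Supervised baseline: logistic regression on labelled transactions.
    clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
    print("logistic regression:", accuracy_score(y_te, clf.predict(X_te)))

    # Unsupervised baseline: isolation forest flags outliers as fraud.
    iso = IsolationForest(contamination=0.03, random_state=0).fit(X_tr)
    pred = (iso.predict(X_te) == -1).astype(int)  # -1 = outlier -> fraud
    print("isolation forest:  ", accuracy_score(y_te, pred))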
Problem Solving as a Path to Understanding Mathematics Representations
Little is actually known about how people cognitively process and integrate information when solving complex mathematical problems. In this thesis, eye-tracking was used to examine how people read and integrate information from mathematical symbols and complex formulae, with eye fixations being used as a measure of their current focus of attention. Each participant in the studies was presented with a series of stimuli in the form of mathematical problems, and their eyes were tracked as they worked through the problem mentally. From these examinations, we were able to demonstrate differences in both comprehension and problem-solving, with the results suggesting that which information is selected, and how, is responsible for a large portion of success in solving such problems. We were also able to examine how different mathematical representations of the same mathematical object are attended to by students. Author Keywords: eye-tracking, mathematical notation, mathematical representations, problem identification, problem-solving, symbolism
Framework for Testing Time Series Interpolators
The spectrum of a given time series is a characteristic function describing its frequency properties. Spectrum estimation methods require time series data to be contiguous in order for robust estimators to retain their performance. This poses a fundamental challenge, especially for real-world scientific data that is often plagued by missing values and/or irregularly recorded measurements. One area of research devoted to this problem seeks to repair the original time series through interpolation. Several algorithms have proven successful for the interpolation of considerably large gaps of missing data, but most are only valid for stationary time series: processes whose statistical properties are time-invariant, which is not a common property of real-world data. The Hybrid Wiener interpolator is a method designed for repairing nonstationary data, rendering it suitable for spectrum estimation. This thesis presents a computational framework designed for conducting systematic testing of the statistical performance of this method in light of changes to gap structure and departures from the stationarity assumption. A comprehensive audit of the Hybrid Wiener interpolator against other state-of-the-art algorithms is also presented. Author Keywords: applied statistics, hybrid wiener interpolator, imputation, interpolation, R statistical software, time series
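The framework itself is built in R, but the core loop it systematizes is easy to sketch: simulate a series with known values, impose a controlled gap structure, interpolate, and score the repair against the truth. A minimal Python stand-in (numpy linear interpolation substitutes for the Hybrid Wiener interpolator, which is not reproduced here):

    # Minimal sketch of the systematic-testing idea: impose a known gap
    # on a simulated series, repair it, and score the repair.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 1_000
    x = np.zeros(n)
    for t in range(1, n):              # stationary AR(1) test series
        x[t] = 0.8 * x[t - 1] + rng.normal()

    gap = slice(400, 450)              # one controlled 50-point gap
    observed = x.copy()
    observed[gap] = np.nan

    idx = np.arange(n)
    ok = ~np.isnan(observed)
    repaired = observed.copy()
    repaired[gap] = np.interp(idx[gap], idx[ok], observed[ok])

    rmse = np.sqrt(np.mean((repaired[gap] - x[gap]) ** 2))
    print(f"RMSE over the gap: {rmse:.3f}")

Varying the gap length, gap placement, and the generating process (e.g. adding a trend to break stationarity) and recording the score each time is exactly the kind of systematic audit the thesis framework automates.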
Historic Magnetogram Digitization
Historical analog magnetogram images were converted to time series data using deconvolution for pre-processing, followed by custom-built digitization algorithms. These algorithms were developed to be user-friendly, with the objective of aiding in the creation of a dataset from decades of mechanical observations collected at the Agincourt and Toronto geomagnetic observatories beginning in the 1840s. The algorithms follow a structure that begins with pre-processing, followed by tracing and pattern detection. Each digitized magnetogram was then visually inspected and the algorithm's performance verified to ensure accuracy, and to allow the data to later be connected into a long-running time series. Author Keywords: Magnetograms
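A hedged sketch of the tracing step only: after deconvolution, a magnetogram trace can be followed column by column as (roughly) the darkest pixel in each image column, converting the scan into a raw time series. The thesis algorithms are more involved (pattern detection, handling of overlapping traces), so this is illustrative, not the actual pipeline:

    # Illustrative trace extraction from a synthetic "scanned" image.
    import numpy as np

    def trace_curve(image: np.ndarray) -> np.ndarray:
        """Return, per column, the row index of the darkest pixel."""
        return image.argmin(axis=0)

    # Stand-in for a scanned magnetogram: a bright page carrying a dark
    # sinusoidal trace.
    h, w = 200, 800
    img = np.full((h, w), 255.0)
    rows = (100 + 40 * np.sin(np.linspace(0, 6 * np.pi, w))).astype(int)
    img[rows, np.arange(w)] = 0.0

    series = trace_curve(img)      # pixel positions -> raw time series
    print(series[:10])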
Augmented Reality Sandbox (Aeolian Box)
The AeolianBox is an educational and presentation tool, extended in this thesis to represent atmospheric boundary layer (ABL) flow over a deformable surface in a sandbox. It is a hybrid hardware and mathematical model that helps users visually, interactively, and spatially grasp the natural laws governing ABL airflow. The AeolianBox uses a Kinect V1 camera and a short-focal-length projector to capture the Digital Elevation Model (DEM) of the topography within the sandbox. The captured DEM is used to generate a Computational Fluid Dynamics (CFD) model and project the ABL flow back onto the surface topography within the sandbox. The AeolianBox is designed to be used in a classroom setting, which requires a low time cost for the ABL flow simulation to keep students engaged. Thus, the processes of DEM capture and CFD modelling were investigated to lower the time cost while maintaining key features of the ABL flow structure. A mesh-time sensitivity analysis was also conducted to investigate the tradeoff between the number of cells in the mesh and the time cost of both the meshing process and the CFD modelling. This allows the user to make an informed decision about the level of detail desired in the ABL flow structure by changing the number of cells in the mesh. Infinitely many surface topographies can be created by moulding sand inside the sandbox; therefore, in addition to keeping the time cost low while maintaining key features of the ABL flow structure, the meshing process and CFD modelling are required to be robust to a variety of surface topographies. To achieve these research objectives, the meshing process and CFD modelling are parametrized in this thesis. The accuracy of the CFD model for ABL flow used in the AeolianBox was qualitatively validated against airflow profiles captured in the Trent Environmental Wind Tunnel (TEWT) at Trent University using a Laser Doppler Anemometer (LDA). Three simple geometries, namely a hemisphere, a cube, and a ridge, were selected since they are well studied. The CFD model was scaled to the dimensions of the grid where the airflow was captured in TEWT, and the boundary conditions were kept the same as in the model used in the AeolianBox. The ABL flow is simulated using software such as OpenFoam and Paraview to build and visualize the CFD model. The AeolianBox is interactive and capable of detecting hands using the Kinect camera, which allows a user to change the topography of the sandbox in real time. The AeolianBox software built for this thesis uses only open-source tools and is accessible to anyone with an existing hardware model of its predecessors. Author Keywords: Augmented Reality, Computational Fluid Dynamics, Kinect Projector Calibration, OpenFoam, Paraview
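As a hedged illustration of the DEM-capture step, a Kinect depth frame can be block-averaged into a coarser elevation grid before meshing; the reduction below is an assumption made for illustration, not the thesis's calibrated Kinect-projector pipeline:

    # Illustrative downsampling of a depth frame into a DEM grid.
    import numpy as np

    def depth_to_dem(depth: np.ndarray, factor: int) -> np.ndarray:
        """Block-average a depth frame into a grid `factor`x coarser."""
        h, w = depth.shape
        h, w = h - h % factor, w - w % factor
        blocks = depth[:h, :w].reshape(h // factor, factor,
                                       w // factor, factor)
        return blocks.mean(axis=(1, 3))

    # Stand-in for a 480x640 Kinect V1 depth frame (metres).
    frame = np.random.default_rng(0).uniform(0.8, 1.2, size=(480, 640))
    dem = depth_to_dem(frame, factor=8)   # 60 x 80 elevation grid
    print(dem.shape)

The downsampling factor here plays the same role as the cell count in the thesis's mesh-time tradeoff: a coarser grid meshes and solves faster at the cost of topographic detail.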
Representation Learning with Restorative Autoencoders for Transfer Learning
Deep Neural Networks (DNNs) have reached human-level performance in numerous computer vision tasks. DNNs are effective for both classification and the more complex task of image segmentation. These networks are typically trained on thousands of images, which are often hand-labelled by domain experts. This bottleneck creates a promising research area: training accurate segmentation networks with fewer labelled samples. This thesis explores effective methods for learning deep representations from unlabelled images. We train a Restorative Autoencoder Network (RAN) to denoise synthetically corrupted images. The weights of the RAN are then fine-tuned on a labelled dataset from the same domain for image segmentation. We use three different segmentation datasets to evaluate our methods. In our experiments, we demonstrate that with our methods only a fraction of the data is required to achieve the same accuracy as a network trained on a large labelled dataset. Author Keywords: deep learning, image segmentation, representation learning, transfer learning
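A minimal sketch of the pretrain-then-fine-tune pattern the abstract describes, in TensorFlow/Keras; the layer sizes, noise model, and names are illustrative assumptions, not the thesis's RAN architecture:

    # Sketch: denoising pretraining on unlabelled images, then reuse of
    # the pretrained features for segmentation.
    import numpy as np
    import tensorflow as tf

    x = np.random.rand(256, 64, 64, 1).astype("float32")  # unlabelled images
    noisy = np.clip(x + np.random.normal(0, 0.1, x.shape), 0, 1)
    noisy = noisy.astype("float32")                       # synthetic corruption

    # Stage 1: restorative (denoising) autoencoder learns to undo the
    # corruption, yielding reusable representations.
    inp = tf.keras.layers.Input((64, 64, 1))
    feat = tf.keras.layers.Conv2D(16, 3, padding="same",
                                  activation="relu")(inp)
    out = tf.keras.layers.Conv2D(1, 3, padding="same",
                                 activation="sigmoid")(feat)
    ran = tf.keras.Model(inp, out)
    ran.compile(optimizer="adam", loss="mse")
    ran.fit(noisy, x, epochs=1, batch_size=32, verbose=0)

    # Stage 2: keep the pretrained feature layer, add a fresh per-pixel
    # head, and fine-tune on the (much smaller) labelled set.
    seg_out = tf.keras.layers.Conv2D(1, 1, activation="sigmoid")(feat)
    seg = tf.keras.Model(inp, seg_out)
    seg.compile(optimizer="adam", loss="binary_crossentropy")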
Support Vector Machines for Automated Galaxy Classification
Support Vector Machines (SVMs) are deterministic, supervised machine learning algorithms that have been successfully applied to many areas of research. They are heavily grounded in mathematical theory and are effective at processing high-dimensional data. This thesis models a variety of galaxy classification tasks using SVMs and data from the Galaxy Zoo 2 project. SVM parameters were tuned in parallel using resources from Compute Canada, and a total of four experiments were completed to determine whether invariance training and ensembles can be used to improve classification performance. It was found that SVMs performed well at many of the galaxy classification tasks examined, and that the additional techniques explored did not provide a considerable improvement. Author Keywords: Compute Canada, Kernel, SDSS, SHARCNET, Support Vector Machine, SVM
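A small sketch of the tuning setup: an RBF-kernel SVM with its C and gamma parameters searched in parallel, as the abstract describes doing at scale on Compute Canada. Scikit-learn's built-in digits data stands in for the Galaxy Zoo 2 features, which are not reproduced here:

    # Parallel SVM hyperparameter search on a stand-in dataset.
    from sklearn.datasets import load_digits
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    grid = GridSearchCV(
        SVC(kernel="rbf"),
        {"C": [1, 10, 100], "gamma": ["scale", 1e-3, 1e-4]},
        n_jobs=-1,           # evaluate parameter combinations in parallel
    )
    grid.fit(X_tr, y_tr)
    print(grid.best_params_, grid.score(X_te, y_te))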
Predicting Irregularities in Arrival Times for Toronto Transit Buses with LSTM Recurrent Neural Networks Using Vehicle Locations and Weather Data
Public transportation systems play an important role in the quality of life of citizens in any metropolitan city. However, public transportation authorities face criticism from commuters due to irregularities in bus arrival times. For example, transit bus users often complain when they miss the bus because it arrived too early or too late at the bus stop. Due to these irregularities, commuters may miss important appointments, wait too long at the bus stop, or arrive late for work. This thesis seeks to predict the occurrence of irregularities in bus arrival times by developing machine learning models that use GPS locations of transit buses provided by the Toronto Transit Commission (TTC) and hourly weather data. We found that nearly 37% of the time, buses arrive either early or late by more than 5 minutes, suggesting room for improvement in the current strategies employed by transit authorities. We compared the performance of three machine learning models, of which our Long Short-Term Memory (LSTM) [13] model outperformed the others in accuracy: its error rate was lower than those of the Artificial Neural Network (ANN) and Support Vector Regression (SVR) models. The improved accuracy achieved by the LSTM is due to its ability to adjust and update the weights of neurons while maintaining long-term dependencies when encountering new streams of data. Author Keywords: ANN, LSTM, Machine Learning
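A hedged sketch of the LSTM formulation: a window of recent observations (GPS-derived delay plus weather features) predicts the next deviation from schedule. The feature set, window length, and layer sizes below are illustrative assumptions, not the thesis configuration:

    # Sketch: sequence-to-one LSTM regression on dummy data.
    import numpy as np
    import tensorflow as tf

    steps, feats = 10, 4             # e.g. delay, temperature, rain, wind
    X = np.random.rand(512, steps, feats).astype("float32")
    y = np.random.rand(512, 1).astype("float32")

    model = tf.keras.Sequential([
        tf.keras.layers.Input((steps, feats)),
        tf.keras.layers.LSTM(32),    # carries long-term dependencies in state
        tf.keras.layers.Dense(1),    # predicted deviation from schedule
    ])
    model.compile(optimizer="adam", loss="mae")
    model.fit(X, y, epochs=1, batch_size=64, verbose=0)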
Relationship Between Precarious Employment, Behaviour Addictions and Substance Use Among Canadian Young Adults
This thesis utilized a unique dataset, the Quinte Longitudinal Survey, to explore relationships between precarious employment and a range of mental health problems in a representative sample of Ontario young adults. Study 1 focused on various behavioural addictions (such as problem gambling, video gaming, internet use, exercise, compulsive shopping, and sex) and precarious employment. The results showed that precariously employed men were preoccupied with gambling and sex, while their female counterparts preferred shopping. Gambling and excessive shopping diminished over time, while excessive sexual practices increased. Study 2 focused on the association between precarious employment and substance abuse (involving tobacco, alcohol, cannabis, hallucinogens, stimulants, and other substances). The results showed that men used cannabis more than women, and that the non-precariously employed group abused alcohol more than individuals in the precariously employed group. This research has implications for both health care professionals and intervention program developers working with young adults in precarious jobs. Author Keywords: Behaviour Addictions, Precarious Employment, Substance Abuse, Young Adults
Exploring the Scalability of Deep Learning on GPU Clusters
In recent years, we have observed an unprecedented rise in the popularity of AI-powered systems. They have become ubiquitous in modern life, used by countless people every day. Many of these AI systems are powered, entirely or partially, by deep learning models. From language translation to image recognition, deep learning models are being used to build systems of unprecedented accuracy. The primary downside is the significant time required to train the models. Fortunately, training time is reduced through the use of GPUs rather than CPUs. However, with model complexity ever increasing, training times even with GPUs are on the rise. One possible solution to ever-increasing training times is to use parallelization to enable the distributed training of models on GPU clusters. This thesis investigates how to utilise clusters of GPU-accelerated nodes to achieve the best scalability possible, thus minimising model training times. Author Keywords: Compute Canada, Deep Learning, Distributed Computing, Horovod, Parallel Computing, TensorFlow
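A minimal sketch of the data-parallel pattern the keywords point to, using Horovod over TensorFlow/Keras: one process per GPU, gradients averaged across workers each step, and the learning rate scaled with worker count. The model and data here are dummies:

    # Sketch: Horovod data-parallel training of a dummy Keras model.
    import horovod.tensorflow.keras as hvd
    import numpy as np
    import tensorflow as tf

    hvd.init()                                   # one process per GPU
    gpus = tf.config.list_physical_devices("GPU")
    if gpus:                                     # pin each process to a GPU
        tf.config.set_visible_devices(gpus[hvd.local_rank()], "GPU")

    model = tf.keras.Sequential([
        tf.keras.layers.Input((20,)),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    # Scale the learning rate with worker count; wrap the optimizer so
    # gradients are averaged across all workers each step.
    opt = hvd.DistributedOptimizer(tf.keras.optimizers.Adam(1e-3 * hvd.size()))
    model.compile(optimizer=opt, loss="sparse_categorical_crossentropy")

    X = np.random.rand(1024, 20).astype("float32")
    y = np.random.randint(0, 10, 1024)
    model.fit(X, y, batch_size=64, verbose=0,
              callbacks=[hvd.callbacks.BroadcastGlobalVariablesCallback(0)])

Launched with, for example, horovodrun -np 4 python train.py, this spreads each step's work across four GPUs; scalability then hinges on how well the allreduce communication overlaps with computation.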
Population-Level Ambient Pollution Exposure Proxies
The Air Health Trend Indicator (AHTI) is a joint Health Canada / Environment and Climate Change Canada initiative that seeks to model the Canadian national population health risk due to acute exposure to ambient air pollution. The common model in the field uses averages of local ambient air pollution monitors to produce a population-level exposure proxy variable. This method is applied to ozone, nitrogen dioxide, particulate matter, and other similar air pollutants. We examine the representative nature of these proxy averages on a large-scale Canadian data set, representing hundreds of monitors and dozens of city-level populations. The careful determination of temporal and spatial correlations between the disparate monitors allows for more precise estimation of population-level exposure, taking inspiration from the land-use regression models commonly used in geography. We conclude this work with an examination of the risk estimation differences between the original, simplistic population exposure metric and our new, revised metric. Author Keywords: Air Pollution, Population Health Risk, Spatial Process, Spatio-Temporal, Temporal Process, Time Series
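For concreteness, the common proxy the abstract starts from is simply a per-day average over a city's monitors; a toy sketch with invented readings (the thesis replaces this with a spatio-temporal, correlation-aware estimate):

    # Toy illustration of the simple exposure proxy: average the city's
    # ambient monitors for each day. Monitor names and values are invented.
    import pandas as pd

    readings = pd.DataFrame({
        "date":    ["2018-06-01"] * 3 + ["2018-06-02"] * 3,
        "monitor": ["A", "B", "C"] * 2,
        "no2_ppb": [11.2, 14.8, 9.5, 12.1, 15.3, 10.0],
    })

    proxy = readings.groupby("date")["no2_ppb"].mean()
    print(proxy)   # one population-level exposure value per day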
