Document Type

Honors Thesis

Major

Computer Science

Advisor(s)

Salimeh Yasaei Sekeh

Committee Members

Terry Yoo, Melissa Ladenheim

Graduation Year

May 2024

Publication Date

Spring 5-2024

Abstract

Deep neural network (DNN) approaches excel in many real-world applications, such as robotics and computer vision, yet their computational and memory demands hinder deployment on resource-constrained devices. Moreover, larger models heighten the risk of overparameterization, making networks more vulnerable to input perturbations. Recent studies aim to boost DNN efficiency by trimming redundant neurons or filters based on task relevance. Rather than introducing a new pruning method, this project enhances existing techniques by introducing a companion network, Ghost Connect-Net (GC-Net), which monitors the connections in the original network. The initial weights of GC-Net are set equal to the connectivity measurements between consecutive layers of the original network. Once these connectivity weights have been created and loaded, a pruning method is applied to GC-Net, and the pruned weights are mapped back to the original network to determine which connections to prune. This approach allows magnitude-based and connectivity-based pruning methods to be combined, by applying magnitude-based pruning to the connectivity-weight values of GC-Net. Experimental results on the CIFAR-10 dataset with common CNN architectures, ResNet-18 and VGG16, show promising results for hybridizing the method: using GC-Net for the later layers of a network and direct pruning for the earlier layers.
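To make the pipeline concrete, the following is a minimal PyTorch sketch of the idea described above, not the thesis implementation: the connectivity measure (a mean co-activation proxy), the layer sizes, and the map-back rule (reusing GC-Net's pruning mask on the original layer) are all illustrative assumptions standing in for the thesis' own connectivity measurements and mapping.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

def connectivity(layer_a: nn.Linear, layer_b: nn.Linear, x: torch.Tensor) -> torch.Tensor:
    """Hypothetical connectivity proxy: mean co-activation between the
    units of two consecutive layers over a calibration batch x.
    (Stands in for the thesis' connectivity measurements.)"""
    a = torch.relu(layer_a(x))        # layer-l activations,     shape (B, 32)
    b = torch.relu(layer_b(a))        # layer-(l+1) activations, shape (B, 10)
    return (b.T @ a) / x.shape[0]     # (10, 32): one value per connection

# Two consecutive layers standing in for part of the original network.
fc1, fc2 = nn.Linear(64, 32), nn.Linear(32, 10)
x = torch.randn(128, 64)              # calibration inputs

# GC-Net companion layer: its initial weights equal the connectivity
# measurements of the consecutive layers in the original network.
ghost = nn.Linear(32, 10, bias=False)
with torch.no_grad():
    ghost.weight.copy_(connectivity(fc1, fc2, x))

# Apply an off-the-shelf magnitude-based method (L1 unstructured) to
# GC-Net, i.e. magnitude pruning over connectivity values.
prune.l1_unstructured(ghost, name="weight", amount=0.5)

# Map the pruned entries back: connections masked out in GC-Net are
# marked as pruned in the corresponding original-network layer.
prune.custom_from_mask(fc2, name="weight", mask=ghost.weight_mask)
print(f"fc2 sparsity: {(fc2.weight == 0).float().mean():.2f}")
```

The key step is that the magnitude criterion operates on connectivity values rather than raw weights, which is how GC-Net hybridizes the two pruning families.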
