DNN_S24_HW02.pdf
CMSC-636, Computer Science, Virginia Commonwealth University (5 pages, May 7, 2024)

CMSC 636 Neural Nets and Deep Learning, Spring 2024
Instructor: Dr. Milos Manic, http://www.people.vcu.edu/~mmanic

Homework No. 2
Due Wednesday, Feb. 21, 2024, noon

Student certification:

Team member 1: Print Name: ___________________ Date: __________
I have contributed by doing the following: ________________________________________
Signed: ________________________ (you can sign/scan or use an e-signature)

Team member 2: Print Name: ___________________ Date: __________
I have contributed by doing the following: ________________________________________
Signed: ________________________ (you can sign/scan or use an e-signature)

Team member 3: Print Name: ___________________ Date: __________
I have contributed by doing the following: ________________________________________
Signed: ________________________ (you can sign/scan or use an e-signature)

2.1 Python, hard/soft activation function, delta rule (5 pts)

2.1.0 Download two Python programs (0 pts).
Download the two Python programs perceptron_hard.py and perceptron_soft.py (save each script under the name listed in its header). Run them and inspect their output files. Make sure you understand how these programs work; you will use them as skeletons for the remaining sections of this homework.
Report: None.

2.1.1 Write a Python program (based on 2.1.0) that trains a neuron implementing the truth table from HW1.4 (Homework 1, problem 4). Use a hard activation function and the perceptron learning rule. Use (1, 1, 1, 1) as the initial weight set. Experiment with different values of the learning constant so that training completes in the fewest iterations. Print your results to files during the learning process.
Note: in the given initial weight set, the last "1" is the bias weight.
Report:
1. Save your Python program as H211_hard.py and the result files as H211_hard.txt.
2. Explain your results.
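The training loop asked for in 2.1.1 can be sketched as follows. Since HW1.4's truth table is not reproduced in this handout, a 3-input majority table is used purely as a stand-in; the learning constant alpha = 0.5 is likewise an assumed starting value to experiment with, not a prescribed one.

```python
import numpy as np

# Stand-in truth table (HW1.4's table is not shown here): 3-input majority.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], dtype=float)
d = (X.sum(axis=1) >= 2).astype(float)          # desired outputs

Xb = np.hstack([X, np.ones((len(X), 1))])       # append constant bias input of 1
w = np.array([1.0, 1.0, 1.0, 1.0])              # initial weights; last entry is the bias weight
alpha = 0.5                                     # learning constant (assumed; experiment with it)

for it in range(1, 101):
    errors = 0
    for x, t in zip(Xb, d):
        y = 1.0 if np.dot(w, x) >= 0 else 0.0   # hard unipolar activation
        if y != t:
            w += alpha * (t - y) * x            # perceptron learning rule
            errors += 1
    if errors == 0:                             # converged: all patterns classified correctly
        break

print(f"converged after {it} iterations, weights = {w}")
```

In your submission this loop would also write the weights and errors of each iteration to H211_hard.txt, following the output format of the perceptron_hard.py skeleton.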
2.1.2 For the previous problem (2.1.1), change the activation function to a soft activation function and repeat the training procedure. Start with the weight set (1, 1, 1, 1). Add computation of the total error (TE), defined as the sum of squared errors over all patterns. Experiment with the learning coefficient to reach TE < 0.01 in the fewest iterations.
Report:
1. Save your Python program as H212_soft.py and the result files as H212_soft.txt.
2. Explain and discuss your results.

2.1.3 For problem 2.1.1, use a soft activation function and modify the script to implement the DELTA training rule. Train the network using the initial weights (1, 1, 1, 1). Experiment with the learning constant to reach TE < 0.01 in the fewest iterations.
Report:
1. Save your Python program as H213_delta.py and the result files as H213_delta.txt.
2. Explain and discuss your results.
Note: please read the problem (data) carefully. For example, if the output values are 0 or 1, you should not use a bipolar activation function.

2.2 Design a network that solves XOR (4 pts)

Design a neural network with 2 inputs, 1 output, and 3 neurons that performs the XOR function:

A B | out
0 0 |  0
0 1 |  1
1 0 |  1
1 1 |  0

Note: design means "by hand", not by running an algorithm.
Report:
1. Provide the network diagram with weights.
2. Explain and discuss your results.
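The delta rule of 2.1.3 differs from the perceptron rule in that the update is scaled by the derivative of the activation function. A minimal sketch, again using a stand-in majority truth table (HW1.4's table is not reproduced here) and assumed values for the sigmoid gain k and the learning constant alpha:

```python
import numpy as np

def sigmoid(net, k=4.0):
    # unipolar soft activation; the gain k is an assumed value to tune
    return 1.0 / (1.0 + np.exp(-k * net))

# Stand-in truth table (HW1.4's table is not shown here): 3-input majority.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)], dtype=float)
d = (X.sum(axis=1) >= 2).astype(float)
Xb = np.hstack([X, np.ones((len(X), 1))])       # append constant bias input of 1

w = np.array([1.0, 1.0, 1.0, 1.0])              # last entry is the bias weight
alpha, k = 0.5, 4.0                             # assumed values; experiment with them

for it in range(1, 50001):
    te = 0.0                                    # total error: sum of squared errors per epoch
    for x, t in zip(Xb, d):
        y = sigmoid(np.dot(w, x), k)
        err = t - y
        # DELTA rule: gradient of the squared error includes f'(net) = k * y * (1 - y)
        w += alpha * err * k * y * (1.0 - y) * x
        te += err ** 2
    if te < 0.01:                               # stopping criterion from the assignment
        break

print(f"TE = {te:.4f} after {it} iterations")
```

For 2.1.2 (soft activation with the perceptron-style rule) the same loop applies with the f'(net) factor dropped from the update.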
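Although 2.2 must be solved by hand, a hand-designed answer can be checked mechanically. The weights below are one standard 3-neuron decomposition (XOR = AND(OR, NAND)), offered as an illustration, not the only valid design:

```python
def step(net):
    # hard unipolar activation
    return 1 if net >= 0 else 0

def xor_net(a, b):
    h1 = step(1 * a + 1 * b - 0.5)      # neuron 1: OR(A, B)
    h2 = step(-1 * a - 1 * b + 1.5)     # neuron 2: NAND(A, B)
    return step(1 * h1 + 1 * h2 - 1.5)  # neuron 3: AND(h1, h2) = XOR(A, B)

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_net(a, b))
```

Tracing all four input pairs through these three thresholds reproduces the XOR truth table above, which is what the network diagram in the report should encode.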
2.3 Multi-layer perceptron network

2.3.0 Download Python program (0 pts).
Download the Python program H230_mlp.py from Canvas. Run it and inspect the output. Make sure you understand the functionality included in the script; it is the starting point for the following sections of this homework.
Report: None.

To run a Python script, type in a command prompt: python3 script_name.py (for example, python3 H230_mlp.py, to use Python 3.0 or above).

READING MATERIALS:
Scikit-learn is a Python library that provides implementations of various machine learning algorithms, including built-in neural network functionality such as the Multi-layer Perceptron (MLP).
Read section 1.17.1 for details on the MLP algorithm, its architecture, advantages, and disadvantages.
Read section 1.17.2 to learn how to use scikit-learn's MLP functionality for classification, provide inputs to the algorithm, and predict outcomes.
Reading resource: multi-layer perceptron in scikit-learn link.
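The scikit-learn usage pattern that section 1.17.2 describes follows the standard fit/predict interface. The snippet below mirrors the small classification example from those docs (it is not the contents of H230_mlp.py, which is only available on Canvas):

```python
from sklearn.neural_network import MLPClassifier

# Tiny two-sample training set, as in the scikit-learn 1.17.2 example
X = [[0., 0.], [1., 1.]]
y = [0, 1]

# lbfgs converges well on small datasets; hyperparameters follow the docs example
clf = MLPClassifier(solver='lbfgs', alpha=1e-5,
                    hidden_layer_sizes=(5, 2), random_state=1)
clf.fit(X, y)

print(clf.predict([[2., 2.], [-1., -2.]]))   # predicts a class for each new sample
```

H230_mlp.py will presumably wrap the same fit/predict calls around its own dataset; inspect the script to see which inputs and hyperparameters it uses.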