NTA UGC NET Dec 2023 Paper-2
May 23, 2024

Question 18
Consider a feedforward neural network with the following specifications:
The input layer has 4 neurons, the hidden layer has 3 neurons, and the output layer has 2 neurons, using the sigmoid activation function. The input values are [0.5, 0.8, 0.2, 0.6], and the initial weights for the connections are:
Input layer to hidden layer weights:
W1: [0.1, 0.3, 0.5, 0.2]
W2: [0.2, 0.4, 0.6, 0.2]
W3: [0.3, 0.5, 0.7, 0.2]
Hidden layer to output layer weights:
W4: [0.4, 0.1, 0.3]
W5: [0.5, 0.2, 0.4]
What is the output of the output layer when the given input values are passed through the neural network?
Round the answer to two decimal places.
A
[0.62, 0.68]
B
[0.72, 0.78]
C
[0.82, 0.88]
D
[0.92, 0.98]
Question 18 Explanation: 
Here’s the step-by-step calculation:
1. Input Layer to Hidden Layer:
Calculate weighted sums for each hidden neuron:
H1 = 0.1*0.5 + 0.3*0.8 + 0.5*0.2 + 0.2*0.6 = 0.51
H2 = 0.2*0.5 + 0.4*0.8 + 0.6*0.2 + 0.2*0.6 = 0.66
H3 = 0.3*0.5 + 0.5*0.8 + 0.7*0.2 + 0.2*0.6 = 0.81
Apply sigmoid activation function to each hidden neuron:
A1 = 1 / (1 + exp(-H1)) = 1 / (1 + exp(-0.51)) ≈ 0.625
A2 = 1 / (1 + exp(-H2)) = 1 / (1 + exp(-0.66)) ≈ 0.659
A3 = 1 / (1 + exp(-H3)) = 1 / (1 + exp(-0.81)) ≈ 0.692
2. Hidden Layer to Output Layer:
Calculate weighted sums for each output neuron:
O1 = 0.4*A1 + 0.1*A2 + 0.3*A3 ≈ 0.523
O2 = 0.5*A1 + 0.2*A2 + 0.4*A3 ≈ 0.721
Apply sigmoid activation function to each output neuron:
Y1 = 1 / (1 + exp(-O1)) ≈ 0.628
Y2 = 1 / (1 + exp(-O2)) ≈ 0.673
Rounded to two decimal places this gives [0.63, 0.67], so the closest option is A: [0.62, 0.68].
Correct Answer: A
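The two-step forward pass above can be checked with a short Python sketch. It uses only the values given in the question (no bias terms, since none are specified); the variable names are illustrative, not part of the question.

```python
import math

def sigmoid(x):
    """Standard logistic sigmoid: 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

# Inputs and weights from the question
x = [0.5, 0.8, 0.2, 0.6]
W_hidden = [
    [0.1, 0.3, 0.5, 0.2],  # W1
    [0.2, 0.4, 0.6, 0.2],  # W2
    [0.3, 0.5, 0.7, 0.2],  # W3
]
W_output = [
    [0.4, 0.1, 0.3],  # W4
    [0.5, 0.2, 0.4],  # W5
]

# Input layer -> hidden layer: weighted sum per neuron, then sigmoid
hidden = [sigmoid(sum(w_i * x_i for w_i, x_i in zip(w, x))) for w in W_hidden]

# Hidden layer -> output layer: same computation on the hidden activations
output = [sigmoid(sum(w_i * h_i for w_i, h_i in zip(w, hidden))) for w in W_output]

print([round(h, 3) for h in hidden])  # hidden activations A1..A3
print([round(y, 2) for y in output])  # final outputs Y1, Y2
```

Running this yields hidden activations of approximately [0.625, 0.659, 0.692] and final outputs of approximately [0.63, 0.67], which match the hand calculation and are closest to option A.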