conv neural network - Forward Propagation with Dropout


I'm working through Andrew Ng's new Deep Learning Coursera course.

We are implementing the following code:

def forward_propagation_with_dropout(x, parameters, keep_prob = 0.5):

    np.random.seed(1)

    # retrieve parameters
    w1 = parameters["w1"]
    b1 = parameters["b1"]
    w2 = parameters["w2"]
    b2 = parameters["b2"]
    w3 = parameters["w3"]
    b3 = parameters["b3"]

    # linear -> relu -> linear -> relu -> linear -> sigmoid
    z1 = np.dot(w1, x) + b1
    a1 = relu(z1)
    ### start code here ### (approx. 4 lines)
    # steps 1-4 below correspond to steps 1-4 described above.
    d1 = np.random.rand(*a1.shape)   # step 1: initialize matrix d1 = np.random.rand(..., ...)
    d1 = (d1 < 0.5)                  # step 2: convert entries of d1 to 0 or 1 (using keep_prob as the threshold)
    a1 = a1 * d1                     # step 3: shut down some neurons of a1
    a1 = a1 / keep_prob              # step 4: scale the value of neurons that haven't been shut down
    ### end code here ###
    z2 = np.dot(w2, a1) + b2
    a2 = relu(z2)
    ### start code here ### (approx. 4 lines)
    d2 = np.random.rand(*a2.shape)   # step 1: initialize matrix d2 = np.random.rand(..., ...)
    d2 = (d2 < 0.5)                  # step 2: convert entries of d2 to 0 or 1 (using keep_prob as the threshold)
    a2 = a2 * d2                     # step 3: shut down some neurons of a2
    a2 = a2 / keep_prob              # step 4: scale the value of neurons that haven't been shut down
    ### end code here ###
    z3 = np.dot(w3, a2) + b3
    a3 = sigmoid(z3)

    cache = (z1, d1, a1, w1, b1, z2, d2, a2, w2, b2, z3, a3, w3, b3)

    return a3, cache
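
The relu and sigmoid helpers are provided by the course notebook and aren't part of the snippet above; a minimal stand-in (my assumption, the course versions may differ in detail) would look like:

import numpy as np

def relu(z):
    # element-wise max(0, z)
    return np.maximum(0, z)

def sigmoid(z):
    # element-wise logistic function
    return 1 / (1 + np.exp(-z))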

x_assess, parameters = forward_propagation_with_dropout_test_case()

a3, cache = forward_propagation_with_dropout(x_assess, parameters, keep_prob = 0.7)
print("a3 = " + str(a3))
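
For reference, this is how I understand the four inverted-dropout steps described in the exercise, applied to a single activation matrix (a standalone sketch, not the notebook code; keep_prob is the probability of keeping a unit):

import numpy as np

np.random.seed(1)
keep_prob = 0.7
a = np.maximum(0, np.random.randn(3, 5))   # pretend post-relu activations

d = np.random.rand(*a.shape)   # step 1: uniform random matrix with the same shape as a
d = (d < keep_prob)            # step 2: True with probability keep_prob, False otherwise
a = a * d                      # step 3: zero out the dropped units
a = a / keep_prob              # step 4: rescale so the expected activation stays the same

print(d.mean())                # roughly keep_prob of the entries end up kept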

My output:

a3 = [[ 0.36974721 0.49683389 0.04565099 0.49683389 0.36974721]]

The expected output should be:

a3 = [[ 0.36974721 0.00305176 0.04565099 0.49683389 0.36974721]]

Only one number is different. Any ideas why?

I think it's because of the way I shaped d1 and d2.
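
To rule out the shape suspicion, unpacking the cache returned above lets me inspect the masks directly (a hypothetical sanity check; the unpacking order follows the cache tuple in the function):

z1, d1, a1, w1, b1, z2, d2, a2, w2, b2, z3, a3_out, w3, b3 = cache
print("d1 shape:", d1.shape, "a1 shape:", a1.shape)
print("d2 shape:", d2.shape, "a2 shape:", a2.shape)
print("fraction kept in d1:", d1.mean())   # would expect something close to keep_prob = 0.7
print("fraction kept in d2:", d2.mean())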

