A Python proof for the exercise above:
"""
NEURAL NETWORKS AND DEEP LEARNING by Michael Nielsen
Chapter 1
http://neuralnetworksanddeeplearning.com/chap1.html#exercise_513527
Exercise:
There is a way of determining the bitwise representation of a digit by adding an extra layer to the three-layer network above. The extra layer converts the output from the previous layer into a binary representation, as illustrated in the figure below. Find a set of weights and biases for the new output layer. Assume that the first 3 layers of neurons are such that the correct output in the third layer (i.e., the old output layer) has activation at least 0.99, and incorrect outputs have activation less than 0.01.
"""
import numpy as np
def sigmoid(x):
    return 1 / (1 + np.exp(-x))
def new_representation(activation_vector):
    # weighted input to each bit neuron, using the weight vectors w_0 .. w_3
    # defined below; all biases are 0
    a_0 = np.sum(w_0 * activation_vector)
    a_1 = np.sum(w_1 * activation_vector)
    a_2 = np.sum(w_2 * activation_vector)
    a_3 = np.sum(w_3 * activation_vector)
    return a_3, a_2, a_1, a_0  # most significant bit first
def new_repr_binary_vec(new_representation_vec):
    # sigmoid(z) > 0.5 exactly when z > 0, so positive weighted inputs become 1-bits
    sigmoid_op = np.apply_along_axis(sigmoid, 0, new_representation_vec)
    return (sigmoid_op > 0.5).astype(int)
# w_k is +1 for the digits whose k-th binary bit is 1, and -1 otherwise
w_0 = np.full(10, -1, dtype=np.int8)  # bit 0 (LSB): odd digits
w_0[[1, 3, 5, 7, 9]] = 1
w_1 = np.full(10, -1, dtype=np.int8)  # bit 1: 2, 3, 6, 7
w_1[[2, 3, 6, 7]] = 1
w_2 = np.full(10, -1, dtype=np.int8)  # bit 2: 4, 5, 6, 7
w_2[[4, 5, 6, 7]] = 1
w_3 = np.full(10, -1, dtype=np.int8)  # bit 3 (MSB): 8, 9
w_3[[8, 9]] = 1
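# Why this works: the correct digit's activation (>= 0.99) contributes +/-0.99 to each
# weighted sum, with the sign given by that digit's bit, while the nine incorrect
# activations (< 0.01) contribute less than 0.09 in total, so each sum is at least
# 0.90 in absolute value and its sign is always decided by the correct digit's bit.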
activation_vec = np.full(10, 0.01, dtype=float)
# correct number is 5
activation_vec[5] = 0.99
new_representation_vec = new_representation(activation_vec)
print(new_representation_vec)
# approximately (-1.04, 0.96, -1.0, 0.98)
print(new_repr_binary_vec(new_representation_vec))
# [0 1 0 1]
# to convert the binary vector back to an int:
b = new_repr_binary_vec(new_representation_vec)
print(b.dot(2**np.arange(b.size)[::-1]))
# 5
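As a quick sanity check (a small sketch in the same style, not part of the original exercise), the same weights can be tested against all ten digits:

# Sanity check: every digit from 0 to 9 should decode back to itself
for digit in range(10):
    a = np.full(10, 0.01)
    a[digit] = 0.99
    bits = new_repr_binary_vec(new_representation(a))
    assert bits.dot(2 ** np.arange(bits.size)[::-1]) == digit
print("all ten digits decode to the correct binary representation")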