A pitfall in Andrew Ng's Deep Learning Course 2, Week 2 programming assignment


import numpy as np

def initialize_parameters(layer_dims):
    """
    Arguments:
    layer_dims -- python array (list) containing the dimensions of each layer in our network

    Returns:
    parameters -- python dictionary containing your parameters "W1", "b1", ..., "WL", "bL":
                    Wl -- weight matrix of shape (layer_dims[l], layer_dims[l-1])
                    bl -- bias vector of shape (layer_dims[l], 1)

    Tips:
    - For example: the layer_dims for the "Planar Data classification model" would have been [2,2,1].
      This means W1's shape was (2,2), b1 was (2,1), W2 was (1,2) and b2 was (1,1). Now you have to generalize it!
    - In the for loop, use parameters['W' + str(l)] to access Wl, where l is the iterative integer.
    """
    np.random.seed(3)
    parameters = {}
    L = len(layer_dims)  # number of layers in the network

    for l in range(1, L):
        # <------- The pitfall is here: the original factor was 2; we changed it to 2.0
        # so the division stays floating-point (under Python 2, 2 / layer_dims[l-1] is
        # integer division and silently evaluates to 0, zeroing out every weight).
        parameters['W' + str(l)] = np.random.randn(layer_dims[l], layer_dims[l-1]) * np.sqrt(2.0 / layer_dims[l-1])
        parameters['b' + str(l)] = np.zeros((layer_dims[l], 1))

        assert parameters['W' + str(l)].shape == (layer_dims[l], layer_dims[l-1])
        assert parameters['b' + str(l)].shape == (layer_dims[l], 1)

    return parameters
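As a quick sanity check (a minimal sketch, assuming the initialize_parameters function above is already defined and numpy is imported; the layer sizes [2, 4, 1] are just an illustrative choice), you can confirm the shapes and verify that the He-style scaling with 2.0 actually produces non-zero weights:

# Hypothetical layer sizes chosen only for illustration.
parameters = initialize_parameters([2, 4, 1])

for name in sorted(parameters):
    print(name, parameters[name].shape)
# Expected shapes: W1 (4, 2), b1 (4, 1), W2 (1, 4), b2 (1, 1)

# With the 2.0 fix the weights are non-zero; with the integer 2 under Python 2,
# 2 / layer_dims[l-1] would be 0 and every W matrix would come out all zeros.
assert all(np.any(parameters[name] != 0) for name in parameters if name.startswith('W'))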

