Weight Initialization¶
Weight initialization is a technique for presetting the weight parameters of a network prior to training, in a manner that favors the eventual convergence of the neural network.
Featured Initializers¶
One (One)
Zero (Zero)
He Normal (HeNormal)
He Uniform (HeUniform)
Glorot Normal (GlorotNormal)
Glorot Uniform (GlorotUniform)
LeCun Normal (LeCunNormal)
LeCun Uniform (LeCunUniform)
Random Normal (RandomNormal)
Random Uniform (RandomUniform)
Function Descriptions¶

class ztlearn.initializers.GlorotNormal[source]¶
Bases: ztlearn.initializers.WeightInitializer
Glorot Normal (GlorotNormal)
GlorotNormal, better known as Xavier initialization, aims to maintain the same variance of the weight gradients across all layers. Glorot normal is an implementation based on a Gaussian distribution.
References
 [1] Understanding the difficulty of training deep feedforward neural networks
 [Xavier Glorot, 2010] http://proceedings.mlr.press/v9/glorot10a.html
 [PDF] http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf
 [2] Initialization Of Deep Feedforward Networks
 [DeepGrid Article, Jefkine Kafunah] https://goo.gl/E2XrGe

init_name¶
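The scheme can be sketched with NumPy (the `glorot_normal` helper below is illustrative, not the ztlearn API; `shape` is assumed to be `(fan_in, fan_out)`):

```python
import numpy as np

def glorot_normal(shape, random_seed=None):
    # Glorot/Xavier normal: N(0, std^2) with std = sqrt(2 / (fan_in + fan_out))
    fan_in, fan_out = shape
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return np.random.RandomState(random_seed).normal(0.0, std, size=shape)

weights = glorot_normal((256, 128), random_seed=0)
```

Averaging fan-in and fan-out is what balances the variance of activations on the forward pass against the variance of gradients on the backward pass.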

class ztlearn.initializers.GlorotUniform[source]¶
Bases: ztlearn.initializers.WeightInitializer
Glorot Uniform (GlorotUniform)
GlorotUniform, better known as Xavier initialization, aims to maintain the same variance of the weight gradients across all layers. Glorot uniform is an implementation based on a Uniform distribution.
References
 [1] Understanding the difficulty of training deep feedforward neural networks
 [Xavier Glorot, 2010] http://proceedings.mlr.press/v9/glorot10a.html
 [PDF] http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf
 [2] Initialization Of Deep Feedforward Networks
 [DeepGrid Article, Jefkine Kafunah] https://goo.gl/E2XrGe

init_name¶
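The uniform variant draws from a symmetric range chosen so the variance matches the Glorot normal case. A NumPy sketch (the helper is illustrative, not the ztlearn API; `shape` is assumed to be `(fan_in, fan_out)`):

```python
import numpy as np

def glorot_uniform(shape, random_seed=None):
    # Glorot/Xavier uniform: U(-limit, limit) with limit = sqrt(6 / (fan_in + fan_out))
    # Var[U(-l, l)] = l^2 / 3 = 2 / (fan_in + fan_out), matching Glorot normal
    fan_in, fan_out = shape
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return np.random.RandomState(random_seed).uniform(-limit, limit, size=shape)

weights = glorot_uniform((256, 128), random_seed=0)
```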

class ztlearn.initializers.HeNormal[source]¶
Bases: ztlearn.initializers.WeightInitializer
He Normal (HeNormal)
HeNormal is a robust initialization method that specifically accounts for rectifier (ReLU) nonlinearities. He normal is an implementation based on a Gaussian distribution.
References
 [1] Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification
 [Kaiming He, 2015] https://arxiv.org/abs/1502.01852
 [PDF] https://arxiv.org/pdf/1502.01852.pdf
 [2] Initialization Of Deep Networks Case of Rectifiers
 [DeepGrid Article, Jefkine Kafunah] https://goo.gl/TBNw5t

init_name¶
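Because a ReLU zeroes roughly half of its inputs, He initialization doubles the variance relative to the LeCun scheme. A NumPy sketch (the helper is illustrative, not the ztlearn API; the first entry of `shape` is assumed to be `fan_in`):

```python
import numpy as np

def he_normal(shape, random_seed=None):
    # He normal: N(0, std^2) with std = sqrt(2 / fan_in), suited to ReLU layers
    fan_in = shape[0]
    std = np.sqrt(2.0 / fan_in)
    return np.random.RandomState(random_seed).normal(0.0, std, size=shape)

weights = he_normal((512, 256), random_seed=0)
```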

class ztlearn.initializers.HeUniform[source]¶
Bases: ztlearn.initializers.WeightInitializer
He Uniform (HeUniform)
HeUniform is a robust initialization method that specifically accounts for rectifier (ReLU) nonlinearities. He uniform is an implementation based on a Uniform distribution.
References
 [1] Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification
 [Kaiming He, 2015] https://arxiv.org/abs/1502.01852
 [PDF] https://arxiv.org/pdf/1502.01852.pdf
 [2] Initialization Of Deep Networks Case of Rectifiers
 [DeepGrid Article, Jefkine Kafunah] https://goo.gl/TBNw5t

init_name¶
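The uniform variant keeps the same variance as He normal by bounding the range accordingly. A NumPy sketch (the helper is illustrative, not the ztlearn API; the first entry of `shape` is assumed to be `fan_in`):

```python
import numpy as np

def he_uniform(shape, random_seed=None):
    # He uniform: U(-limit, limit) with limit = sqrt(6 / fan_in)
    # Var[U(-l, l)] = l^2 / 3 = 2 / fan_in, matching He normal
    fan_in = shape[0]
    limit = np.sqrt(6.0 / fan_in)
    return np.random.RandomState(random_seed).uniform(-limit, limit, size=shape)

weights = he_uniform((512, 256), random_seed=0)
```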

class ztlearn.initializers.Identity[source]¶
Bases: ztlearn.initializers.WeightInitializer
Identity (Identity)
Identity is an implementation of weight initialization that returns an identity matrix of the given (square) shape.

init_name¶
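A minimal NumPy sketch (the helper is illustrative, not the ztlearn API):

```python
import numpy as np

def identity_init(shape):
    # Identity initialization: only valid for square weight matrices
    rows, cols = shape
    assert rows == cols, "identity initialization requires a square shape"
    return np.eye(rows)

weights = identity_init((4, 4))
```

With identity weights, a linear layer initially passes its input through unchanged: `x @ weights` equals `x`.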


class ztlearn.initializers.LeCunNormal[source]¶
Bases: ztlearn.initializers.WeightInitializer
LeCun Normal (LeCunNormal)
Weights should be randomly chosen, but in such a way that the sigmoid is primarily activated in its linear region. LeCun normal is an implementation based on a Gaussian distribution.
References
 [1] Efficient Backprop
 [LeCun, 1998][PDF] http://yann.lecun.com/exdb/publis/pdf/lecun98b.pdf

init_name¶
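Scaling the variance by 1/fan_in keeps pre-activations small enough to stay in the sigmoid's linear region. A NumPy sketch (the helper is illustrative, not the ztlearn API; the first entry of `shape` is assumed to be `fan_in`):

```python
import numpy as np

def lecun_normal(shape, random_seed=None):
    # LeCun normal: N(0, std^2) with std = sqrt(1 / fan_in)
    fan_in = shape[0]
    std = np.sqrt(1.0 / fan_in)
    return np.random.RandomState(random_seed).normal(0.0, std, size=shape)

weights = lecun_normal((512, 256), random_seed=0)
```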

class ztlearn.initializers.LeCunUniform[source]¶
Bases: ztlearn.initializers.WeightInitializer
LeCun Uniform (LeCunUniform)
Weights should be randomly chosen, but in such a way that the sigmoid is primarily activated in its linear region. LeCun uniform is an implementation based on a Uniform distribution.
References
 [1] Efficient Backprop
 [LeCun, 1998][PDF] http://yann.lecun.com/exdb/publis/pdf/lecun98b.pdf

init_name¶
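The uniform variant bounds the range so the variance still equals 1/fan_in. A NumPy sketch (the helper is illustrative, not the ztlearn API; the first entry of `shape` is assumed to be `fan_in`):

```python
import numpy as np

def lecun_uniform(shape, random_seed=None):
    # LeCun uniform: U(-limit, limit) with limit = sqrt(3 / fan_in)
    # Var[U(-l, l)] = l^2 / 3 = 1 / fan_in, matching LeCun normal
    fan_in = shape[0]
    limit = np.sqrt(3.0 / fan_in)
    return np.random.RandomState(random_seed).uniform(-limit, limit, size=shape)

weights = lecun_uniform((512, 256), random_seed=0)
```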

class ztlearn.initializers.One[source]¶
Bases: ztlearn.initializers.WeightInitializer
One (One)
One is an implementation of weight initialization that returns a weight matrix of all ones.

init_name¶
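A minimal NumPy sketch (the helper is illustrative, not the ztlearn API):

```python
import numpy as np

def one_init(shape):
    # All-ones initialization; mainly useful for biases or sanity checks,
    # since identical weights break symmetry-dependent learning
    return np.ones(shape)

weights = one_init((3, 2))
```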


class ztlearn.initializers.RandomNormal[source]¶
Bases: ztlearn.initializers.WeightInitializer
Random Normal (RandomNormal)
Random normal, an implementation of weight initialization based on a Gaussian distribution.

init_name¶
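A NumPy sketch (the helper is illustrative, not the ztlearn API; the default `scale=0.05` is an assumed value, not the library's default):

```python
import numpy as np

def random_normal(shape, scale=0.05, random_seed=None):
    # Plain Gaussian initialization: N(0, scale^2), independent of fan-in/fan-out
    # NOTE: the default scale here is an assumption for illustration
    return np.random.RandomState(random_seed).normal(0.0, scale, size=shape)

weights = random_normal((256, 128), random_seed=0)
```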


class ztlearn.initializers.RandomUniform[source]¶
Bases: ztlearn.initializers.WeightInitializer
Random Uniform (RandomUniform)
Random uniform, an implementation of weight initialization based on a Uniform distribution.

init_name¶
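A NumPy sketch (the helper is illustrative, not the ztlearn API; the default bounds `(-0.05, 0.05)` are assumed values, not the library's defaults):

```python
import numpy as np

def random_uniform(shape, low=-0.05, high=0.05, random_seed=None):
    # Plain uniform initialization: U(low, high), independent of fan-in/fan-out
    # NOTE: the default bounds here are assumptions for illustration
    return np.random.RandomState(random_seed).uniform(low, high, size=shape)

weights = random_uniform((256, 128), random_seed=0)
```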
