import FullyConnected from '@jsmlt/jsmlt/src/supervised/neuralnetwork/fullyconnected.js'
FullyConnected
Extends: Estimator
Constructor Summary
Public Constructors
public constructor(optionsUser: Object) Constructor.
Member Summary
Public Members
public connectivity: Array<Array<Array<boolean>>> Boolean matrix of connectivity between each pair of nodes in subsequent layers.
public hiddenLayers: *
public layers: Array<number> Number of nodes (including bias nodes) in each layer of the network.
public learningRate: *
public numEpochs: *
public numInputs: *
public numOutputs: *
public weights: Array<Array<Array<number>>> Weights between each pair of nodes in subsequent layers.
Method Summary
Public Methods
public activationFunction(a: number): number Get the activation function value for the specified input.
public activationFunctionDerivative(a: number): number Get the function value for the derivative of the activation function for the specified input.
public calculateError(x: Array<number>, y: number): number Calculate the squared error between the network outputs for a sample and the specified outputs.
public calculateRMSE(X: Array<Array<number>>, y: Array<mixed>): number Calculate the root-mean-square error of the network on a data set.
public deltaRule(activations: Array<Array<number>>, outputs: Array<Array<number>>, targets: Array<number>): Array<Array<number>> Apply the delta rule to the result of a forward pass through the network, expressed by the specified activations and outputs.
public forwardPass(x: Array<number>): Array Pass a sample through the network, calculating the activations and outputs for all nodes in the network.
public initializeWeights() Randomly initialize the weights for the neural network.
public predict(X: *): *
public setWeights(weights: *) Manually set the weights matrices of the network.
public train(X: *, y: *)
public trainEpoch(X: Array<Array<number>>, y: Array<mixed>) Train the network for one epoch.
public trainSample(x: Array<number>, y: number) Train the network on a single sample.
Inherited Summary
From class Estimator  
public abstract predict(X: *): * Make a prediction for a data set.
public abstract train(X: *, y: *) Train the supervised learning algorithm on a dataset.
Public Constructors
public constructor(optionsUser: Object) source
Constructor. Initialize class members and store user-defined options.
Params:
optionsUser (Object): User-defined options
optionsUser.numInputs (number): Number of features each input sample has. The first layer of the network has this many nodes (plus one bias node). Defaults to 'auto', which determines the number of input nodes from the dimensionality of the training data at training time
optionsUser.numOutputs (number): Number of possible outputs for the network. The final layer of the network has this as its number of nodes. Defaults to 'auto', which determines the number of output nodes from the number of unique labels in the data at training time
optionsUser.hiddenLayers (Array<number>): Number of nodes in the hidden layers. Each entry in this array corresponds to a single hidden layer
optionsUser.numEpochs (number): Number of epochs (i.e., passes over all training data) to train the network for
optionsUser.learningRate (number): Learning rate for training
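As a sketch of how these options fit together, a typical configuration object might look like the following (the concrete values are made up for illustration):

```javascript
// Hypothetical options object for a 4-feature, 3-class network with two
// hidden layers of 10 and 5 nodes. Keys mirror the optionsUser parameters.
const optionsUser = {
  numInputs: 4,          // or 'auto' to infer from the training data
  numOutputs: 3,         // or 'auto' to infer from the unique labels
  hiddenLayers: [10, 5], // two hidden layers
  numEpochs: 20,
  learningRate: 0.01,
};
```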
Public Members
public connectivity: Array<Array<Array<boolean>>> source
Boolean matrix of connectivity between each pair of nodes in subsequent layers. For format, see FullyConnected#weights.
public hiddenLayers: * source
public layers: Array<number> source
Number of nodes (including bias nodes) in each layer of the network. Filled at the start of training.
public learningRate: * source
public numEpochs: * source
public numInputs: * source
public numOutputs: * source
public weights: Array<Array<Array<number>>> source
Weights between each pair of nodes in subsequent layers. Each entry in the main array contains a matrix of weights between the nodes in that layer and the nodes in the next layer. This includes entries for pairs of unconnected nodes (e.g., where the output node is a bias node).
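The layout described above can be illustrated with a toy example. The numbers here are made up; only the indexing scheme follows the description:

```javascript
// weights[k][i][j]: weight from node i in layer k to node j in layer k + 1.
// Toy network: layer 0 has 3 nodes (including the bias node),
// layer 1 has 2 nodes.
const weights = [
  [               // matrix between layer 0 and layer 1
    [0.5, -0.2],  // from node 0 in layer 0 to nodes 0 and 1 in layer 1
    [0.1, 0.7],   // from node 1
    [-0.3, 0.4],  // from node 2 (e.g., the bias node)
  ],
];

// Weight from node 2 in layer 0 to node 1 in layer 1:
const w = weights[0][2][1]; // 0.4
```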
Public Methods
public activationFunction(a: number): number source
Get the activation function value for the specified input.
Params:
a (number): Input value
public activationFunctionDerivative(a: number): number source
Get the function value for the derivative of the activation function for the specified input.
Params:
a (number): Input value
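The documentation does not say which activation function is used; assuming the common logistic sigmoid, the pair of functions would look like this sketch:

```javascript
// Logistic sigmoid (an assumption; the activation actually used by
// FullyConnected may differ).
function activationFunction(a) {
  return 1 / (1 + Math.exp(-a));
}

// For the sigmoid, the derivative can be expressed via the function value:
// f'(a) = f(a) * (1 - f(a)).
function activationFunctionDerivative(a) {
  const f = activationFunction(a);
  return f * (1 - f);
}
```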
public calculateError(x: Array<number>, y: number): number source
Calculate the squared error between the network outputs for a sample and the specified outputs.
Return:
number: Sum of squared errors between the outputs corresponding to the sample label and the outputs obtained by passing the sample through the network
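The sum of squared errors it describes can be sketched as follows, where outputs holds the final-layer outputs of a forward pass and targets the target outputs for the sample label (both names are illustrative, not the library's internals):

```javascript
// Sum of squared errors between network outputs and target outputs.
function squaredError(outputs, targets) {
  return outputs.reduce(
    (sum, o, i) => sum + (o - targets[i]) ** 2,
    0
  );
}
```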
public calculateRMSE(X: Array<Array<number>>, y: Array<mixed>): number source
Calculate the root-mean-square error of the network on a data set.
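Assuming calculateError returns the per-sample squared error, the RMSE over a data set would be computed along these lines (a sketch, not the library's actual code):

```javascript
// Root-mean-square error over a data set, given a per-sample
// squared-error function (such as calculateError bound to the network).
function rmse(X, y, calculateError) {
  const totalSquaredError = X.reduce(
    (sum, x, i) => sum + calculateError(x, y[i]),
    0
  );
  return Math.sqrt(totalSquaredError / X.length);
}
```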
public deltaRule(activations: Array<Array<number>>, outputs: Array<Array<number>>, targets: Array<number>): Array<Array<number>> source
Apply the delta rule to the result of a forward pass through the network, expressed by the specified activations and outputs. The network targets corresponding to the forward pass need to be specified too.
Params:
activations (Array<Array<number>>): Network activations for each node in each layer
outputs (Array<Array<number>>): Network outputs (i.e., the activations passed through the activation function) for each node in each layer
targets (Array<number>): Network targets for the final layer
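For the output layer, the delta rule reduces to an elementwise computation; hidden-layer deltas additionally back-propagate through the outgoing weights. A sketch of the output-layer case, with an assumed activation-derivative function and a sign convention that varies between implementations:

```javascript
// Output-layer deltas: delta_j = f'(a_j) * (o_j - t_j), where a_j is the
// node's activation, o_j its output, and t_j its target value.
function outputLayerDeltas(activations, outputs, targets, fPrime) {
  return activations.map(
    (a, j) => fPrime(a) * (outputs[j] - targets[j])
  );
}
```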
public forwardPass(x: Array<number>): Array source
Pass a sample through the network, calculating the activations and outputs for all nodes in the network.
Return:
Array: Array with two elements, containing the activations and outputs, respectively, for each node in the network
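A minimal forward pass over the weight layout described under FullyConnected#weights might look like this sketch (bias handling simplified; the real method's bookkeeping may differ):

```javascript
// One forward pass: for each layer, activations are the weighted sums of
// the previous layer's outputs; outputs are activations passed through f.
function forwardPass(x, weights, f) {
  const activations = [x];
  const outputs = [x];
  for (const W of weights) {
    const prev = outputs[outputs.length - 1];
    // a_j = sum over i of prev[i] * W[i][j]
    const a = W[0].map((_, j) =>
      prev.reduce((sum, p, i) => sum + p * W[i][j], 0)
    );
    activations.push(a);
    outputs.push(a.map(f));
  }
  return [activations, outputs];
}
```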
public initializeWeights() source
Randomly initialize the weights for the neural network. For each subsequent pair of layers, where the first has n nodes and the second n' nodes, initialize a matrix with n rows and n' columns. Each cell in the matrix is assigned a random value in the range [-1, 1]. Furthermore, the connectivity of each pair of nodes in subsequent layers is stored (where all nodes in each layer are connected to all non-bias nodes in the next layer).
The weights between layer k and layer k + 1 are stored in element k (starting at k = 0) of the weights array.
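The initialization it describes, an n-by-n' matrix per pair of adjacent layers with values drawn uniformly from [-1, 1], can be sketched as:

```javascript
// Random weight initialization: for each adjacent pair of layers with
// n and nPrime nodes, build an n-by-nPrime matrix of values in [-1, 1].
function initializeWeights(layers) {
  const weights = [];
  for (let k = 0; k < layers.length - 1; k += 1) {
    weights.push(
      Array.from({ length: layers[k] }, () =>
        Array.from({ length: layers[k + 1] }, () => Math.random() * 2 - 1)
      )
    );
  }
  return weights;
}
```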
public predict(X: *): * source
Make a prediction for a data set.
Override: Estimator#predict
Params:
X (*)
Return:
*
public setWeights(weights: *) source
Manually set the weights matrices of the network.
Params:
weights (*)
public train(X: *, y: *) source
Train the supervised learning algorithm on a dataset.
Override: Estimator#train
Params:
X (*)
y (*)
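Putting the training loop together: judging from the options, train makes numEpochs passes over the data, each pass visiting every (sample, label) pair. A plain-JavaScript sketch of that structure, with trainSample as a stand-in callback rather than the library's internals:

```javascript
// Outer training loop: numEpochs passes over the data, each pass
// delegating every (sample, label) pair to a per-sample update.
function train(X, y, numEpochs, trainSample) {
  for (let epoch = 0; epoch < numEpochs; epoch += 1) {
    X.forEach((x, i) => trainSample(x, y[i]));
  }
}
```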