Run#

class whobpyt.run.model_fitting.ModelFitting(model: AbstractNeuralModel, cost: AbstractLoss, device=device(type='cpu'))[source]#

This ModelFitting class fits resting-state or evoked-potential data, where the training input is either empty or a stimulus delivered to one or more NMM nodes, and the training label is an associated empirical neuroimaging recording.

Studies that use other kinds of input, for example when structural connectivity (SC) or another variable is paired with an empirical recording, require a different fitting class.

Attributes:
model: AbstractNeuralModel

Whole Brain Model to Simulate

cost: AbstractLoss

The objective function against which the model will be optimized.

trainingStats: TrainingStats

Information about objective function loss and parameter values over training windows/epochs

lastRec: Recording

The simulated Recording from the most recent training, evaluation, or simulation run

device: torch.device

Whether the fitting is to run on CPU or GPU

__init__(model: AbstractNeuralModel, cost: AbstractLoss, device=device(type='cpu'))[source]#
Parameters:
model: AbstractNeuralModel

Whole Brain Model to Simulate

cost: AbstractLoss

The objective function against which the model will be optimized.

device: torch.device

Whether the fitting is to run on CPU or GPU
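
A minimal construction sketch is shown below. It uses the import path from the class signature above; my_model and my_cost are hypothetical placeholders for an AbstractNeuralModel subclass instance and an AbstractLoss instance built elsewhere.

import torch
from whobpyt.run.model_fitting import ModelFitting

# my_model and my_cost are hypothetical placeholders for an AbstractNeuralModel
# subclass instance and an AbstractLoss instance created elsewhere.
fitter = ModelFitting(
    model=my_model,              # whole-brain model to simulate
    cost=my_cost,                # objective function to optimize against
    device=torch.device('cpu'),  # run the fitting on CPU (the default)
)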

evaluate(u, empRec: list, TPperWindow: int, base_window_num: int = 0, transient_num: int = 10)[source]#
Parameters:
u: int or Tensor

External input or stimulus

empRec: list of Recording

The empirical recordings used as the ML “Training Labels”

TPperWindow: int

Number of empirical time points per window; model.forward() processes one window at a time.

base_window_num: int

Number of baseline (resting-state) windows to simulate before the evaluation windows

transient_num: int

The number of initial time points to exclude from some metrics
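
A usage sketch for evaluate() follows. It assumes a fitter constructed and trained as in the other examples on this page, that emp_recording is a hypothetical Recording of empirical data, and that passing the integer 0 for u denotes an empty (resting-state) input; the remaining values are illustrative.

# Evaluate the fitted model against empirical data (sketch; names and values are illustrative).
fitter.evaluate(
    u=0,                     # assumed convention: 0 for no external stimulus
    empRec=[emp_recording],  # list of Recording objects used as labels
    TPperWindow=50,          # empirical time points per simulated window
    base_window_num=5,       # baseline windows simulated before evaluation
    transient_num=10,        # initial time points excluded from some metrics
)
# The simulated output of this run is stored in fitter.lastRec.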

save(filename)[source]#
Parameters:
filename: String

The filename to use when saving the object to a file
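
A brief usage sketch, with an illustrative filename:

# Persist the fitted ModelFitting object to disk (filename is illustrative).
fitter.save('my_fitting_run.pkl')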

train(u, empRecs: list, num_epochs: int, TPperWindow: int, warmupWindow: int = 0, learningrate: float = 0.05, lr_2ndLevel: float = 0.05, lr_scheduler: bool = False)[source]#
Parameters:
u: int or Tensor

External input or stimulus; this is the ML “Training Input”

empRecs: list of Recording

The empirical recordings used as the ML “Training Labels”

num_epochs: int

The number of passes through the entire training data set

TPperWindow: int

Number of empirical time points per window; model.forward() processes one window at a time.

learningrate: float

The learning rate used for gradient descent

lr_2ndLevel: float

Learning rate for the priors of model parameters, and possibly other parameters

lr_scheduler: bool

Whether to use the learning rate scheduler
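
A training sketch follows, reusing the hypothetical fitter and emp_recording placeholders from the examples above; argument values are illustrative, and passing 0 for u is again assumed to denote an empty (resting-state) input.

# Train the model on empirical recordings (sketch; values are illustrative).
fitter.train(
    u=0,                      # assumed convention: 0 for no external stimulus
    empRecs=[emp_recording],  # list of Recording objects used as training labels
    num_epochs=20,            # passes through the entire training data set
    TPperWindow=50,           # empirical time points per simulated window
    learningrate=0.05,        # learning rate for gradient descent (default 0.05)
    lr_2ndLevel=0.05,         # learning rate for priors of model parameters
    lr_scheduler=False,       # do not use a learning-rate scheduler
)
# Loss values and parameter trajectories over training are recorded in fitter.trainingStats.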