mvpa2.clfs.warehouse.GPR

Inheritance diagram of GPR

class mvpa2.clfs.warehouse.GPR(kernel=None, **kwargs)

Gaussian Process Regression (GPR).

Notes

Available conditional attributes:

  • calling_time+: Time (in seconds) it took to call the node
  • estimates+: Internal classifier estimates that the most recent predictions are based on
  • log_marginal_likelihood: Log Marginal Likelihood
  • log_marginal_likelihood_gradient: Log Marginal Likelihood Gradient
  • predicted_variances: Variance per each predicted value
  • predicting_time+: Time (in seconds) it took the classifier to predict
  • predictions+: Most recent set of predictions
  • raw_results: Computed results before invoking postproc. Stored only if postproc is not None.
  • trained_dataset: The dataset it has been trained on
  • trained_nsamples+: Number of samples it has been trained on
  • trained_targets+: Set of unique targets it has been trained on
  • training_stats: Confusion matrix of learning performance
  • training_time+: Time (in seconds) it took to train the learner

(Conditional attributes enabled by default suffixed with +)
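
For illustration, a minimal sketch of enabling and reading conditional attributes. Import paths such as mvpa2.clfs.gpr and mvpa2.datasets.base are assumptions (mvpa2.clfs.warehouse merely re-exports the classifier); adjust to the installed PyMVPA version:

    import numpy as np
    from mvpa2.datasets.base import Dataset
    from mvpa2.clfs.gpr import GPR          # assumed import path

    # toy regression dataset: 30 samples, 2 features, float targets
    X = np.random.rand(30, 2)
    ds = Dataset(X, sa={'targets': X[:, 0] + 0.05 * np.random.randn(30)})

    clf = GPR(enable_ca=['log_marginal_likelihood', 'predicted_variances'])
    clf.train(ds)
    clf.predict(ds.samples)

    print(clf.ca.training_time)             # enabled by default ('+' above)
    print(clf.ca.predictions)               # enabled by default ('+' above)
    print(clf.ca.log_marginal_likelihood)   # enabled explicitly via enable_ca
    print(clf.ca.predicted_variances)       # one variance per predicted value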

Methods

clone() Create full copy of the classifier.
compute_gradient_log_marginal_likelihood() Compute gradient of the log marginal likelihood.
compute_gradient_log_marginal_likelihood_logscale() Compute gradient of the log marginal likelihood when hyperparameters are in logscale.
compute_log_marginal_likelihood() Compute log marginal likelihood using self.train_fv and self.targets.
generate(ds) Yield processing results.
get_postproc() Returns the post-processing node or None.
get_sensitivity_analyzer([flavor]) Returns a sensitivity analyzer for GPR.
get_space() Query the processing space name of this node.
is_trained([dataset]) Whether the classifier has already been trained.
predict(obj, data, *args, **kwargs)
repredict(obj, data, *args, **kwargs)
reset()
retrain(dataset, **kwargs) Helper to retrain without checking whether the data has actually changed
set_hyperparameters(hyperparameter) Set hyperparameters’ values.
set_postproc(node) Assigns a post-processing node.
set_space(name) Set the processing space name of this node.
summary() Provide a summary of the classifier.
train(ds) The default implementation calls _pretrain(), _train(), and finally _posttrain().
untrain() Reverts changes in the state of this node caused by previous training

Initialize a GPR regression analysis.

Parameters :

kernel : Kernel

a kernel object defining the covariance between instances. (Defaults to SquaredExponentialKernel if None.)

sigma_noise :

the standard deviation of the Gaussian noise. (Default: 0.001)

lm :

The regularization term lambda. Increase this when the kernel matrix is not positive definite. If None, some regularization will be provided upon necessity. (Default: None)

retrainable :

Whether to enable retraining of the classifier. (Default: False)

enable_ca : None or list of str

Names of the conditional attributes which should be enabled in addition to the default ones

disable_ca : None or list of str

Names of the conditional attributes which should be disabled

auto_train : bool

Flag whether the learner will automatically train itself on the input dataset when called untrained.

force_train : bool

Flag whether the learner will enforce training on the input dataset upon every call.

space : str, optional

Name of the ‘processing space’. The actual meaning of this argument heavily depends on the sub-class implementation. In general, this is a trigger that tells the node to compute and store information about the input data that is “interesting” in the context of the corresponding processing in the output dataset.

postproc : Node instance, optional

Node to perform post-processing of results. This node is applied in __call__() to perform a final processing step on the resulting dataset. If None, nothing is done.

descr : str

Description of the instance
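
A minimal construction sketch for the parameters above. The kernel class and import paths are assumptions here; adjust to the installed PyMVPA version:

    import numpy as np
    from mvpa2.datasets.base import Dataset
    from mvpa2.clfs.gpr import GPR                          # assumed import path
    from mvpa2.kernels.np import SquaredExponentialKernel   # assumed import path

    # 1-D toy regression problem
    X = np.linspace(0, 1, 50)[:, None]
    ds = Dataset(X, sa={'targets': np.sin(6 * X[:, 0]) + 0.1 * np.random.randn(50)})

    clf = GPR(kernel=SquaredExponentialKernel(),  # also the default if kernel=None
              sigma_noise=0.1,                    # std of the Gaussian noise
              lm=None,                            # regularize only upon necessity
              descr='toy GPR on a 1-D sine')
    clf.train(ds)
    predictions = clf.predict(ds.samples)         # regression estimates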

Methods

compute_gradient_log_marginal_likelihood()

Compute gradient of the log marginal likelihood. This version uses a more compact formula from the Rasmussen and Williams book.

compute_gradient_log_marginal_likelihood_logscale()

Compute gradient of the log marginal likelihood when hyperparameters are in logscale. This version uses a more compact formula from the Rasmussen and Williams book.

compute_log_marginal_likelihood()

Compute log marginal likelihood using self.train_fv and self.targets.
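
These methods rely on quantities stored during training (self.train_fv, self.targets), so they should be called on an already trained instance. A hedged sketch, continuing from a trained clf as in the earlier examples:

    # log marginal likelihood of the training data under the current
    # hyperparameters (a scalar); useful for comparing kernel settings
    lml = clf.compute_log_marginal_likelihood()
    print(lml)

    # The gradient variants are mainly useful when optimizing hyperparameters;
    # depending on the PyMVPA version they may require a kernel that supports
    # gradient computation, so treat the call below as an assumption.
    # grad = clf.compute_gradient_log_marginal_likelihood()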

get_sensitivity_analyzer(flavor='auto', **kwargs)

Returns a sensitivity analyzer for GPR.

Parameters :

flavor : str

What sensitivity to provide. Valid values are ‘linear’, ‘model_select’, and ‘auto’. With ‘auto’, ‘linear’ is selected for a linear kernel and ‘model_select’ for all other kernels. ‘linear’ corresponds to GPRLinearWeights and ‘model_select’ to GPRWeights.
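
A hedged sketch of obtaining per-feature sensitivities from a GPR with a linear kernel, in which case ‘auto’ resolves to ‘linear’. GeneralizedLinearKernel and its import path are assumptions; `ds` is the dataset from the earlier sketches:

    from mvpa2.kernels.np import GeneralizedLinearKernel  # assumed import path
    from mvpa2.clfs.gpr import GPR

    lin_clf = GPR(kernel=GeneralizedLinearKernel())
    lin_clf.train(ds)
    sens = lin_clf.get_sensitivity_analyzer(flavor='auto')  # -> 'linear' here
    weights = sens(ds)                     # Dataset with one value per feature
    print(weights.samples)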

kernel
set_hyperparameters(hyperparameter)

Set hyperparameters’ values.

Note that ‘hyperparameter’ is a sequence, so the order of its values is important. The first value must be sigma_noise; the kernel’s own hyperparameter values follow in the exact order the kernel expects them.
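
A hedged sketch (the values are illustrative only; how many kernel hyperparameters follow sigma_noise, and in what order, depends on the kernel in use):

    import numpy as np

    # index 0 is always sigma_noise; any further values are passed on to the
    # kernel in the order that particular kernel expects them
    clf.set_hyperparameters(np.asarray([0.01]))   # update sigma_noise only
    clf.train(ds)                                 # retrain with the new value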
