MetaModelAnalysis

class persalys.MetaModelAnalysis(*args)

Base class for the creation of metamodels.

Notes

This class can only be used through its derived classes; see FunctionalChaosAnalysis and KrigingAnalysis.

Methods

analyticalValidation()

Whether an analytical validation is requested.

getClassName()

Accessor to the object's class name.

getDesignOfExperiment()

Design of experiments accessor.

getEffectiveInputSample()

Effective input sample accessor.

getEffectiveOutputSample()

Effective output sample accessor.

getErrorMessage()

Error message accessor.

getInterestVariables()

Get the variables to analyse.

getKFoldValidationNumberOfFolds()

Number of folds accessor.

getKFoldValidationSeed()

Seed accessor.

getName()

Accessor to the object's name.

getPythonScript()

Python script accessor.

getTestSampleValidationPercentageOfPoints()

Percentage of points accessor.

getTestSampleValidationSeed()

Seed accessor.

getWarningMessage()

Warning message accessor.

hasName()

Test if the object is named.

hasValidResult()

Whether the analysis has been run.

isReliabilityAnalysis()

Whether the analysis involves reliability.

isRunning()

Whether the analysis is running.

kFoldValidation()

Whether a k-Fold cross-validation is requested.

leaveOneOutValidation()

Whether a validation by leave-one-out is requested.

run()

Launch the analysis.

setAnalyticalValidation(validation)

Whether an analytical validation is requested.

setInterestVariables(variablesNames)

Set the variables to analyse.

setKFoldValidation(validation)

Whether a k-Fold cross-validation is requested.

setKFoldValidationNumberOfFolds(nbFolds)

Number of folds accessor.

setKFoldValidationSeed(seed)

Seed accessor.

setLeaveOneOutValidation(validation)

Whether a validation by leave-one-out is requested.

setName(name)

Accessor to the object's name.

setTestSampleValidation(validation)

Whether a validation with a test sample is requested.

setTestSampleValidationPercentageOfPoints(...)

Percentage of points accessor.

setTestSampleValidationSeed(seed)

Seed accessor.

testSampleValidation()

Whether a validation with a test sample is requested.

canBeLaunched

getElapsedTime

getParentObserver

__init__(*args)
analyticalValidation()

Whether an analytical validation is requested.

Returns
validation : bool

Whether an analytical validation is requested. The analytical validation corresponds to an approximation of the leave-one-out result.

getClassName()

Accessor to the object’s class name.

Returns
class_name : str

The object class name (object.__class__.__name__).

getDesignOfExperiment()

Design of experiments accessor.

Returns
model : DesignOfExperiment

Design of experiments

getEffectiveInputSample()

Effective input sample accessor.

Returns
sample : openturns.Sample

Sample of all the input variables if all of them are deterministic. Otherwise, sample of the stochastic input variables.

getEffectiveOutputSample()

Effective output sample accessor.

Returns
sample : openturns.Sample

Sample of the interest output variables.

getErrorMessage()

Error message accessor.

Returns
message : str

Error message if the analysis failed

getInterestVariables()

Get the variables to analyse.

Returns
variablesNames : sequence of str

Names of the variables to analyse

getKFoldValidationNumberOfFolds()

Number of folds accessor.

Returns
folds : int

Number of folds. By default it is 3.

getKFoldValidationSeed()

Seed accessor.

Returns
seed : int

Seed value for k-Fold cross-validation
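The k-fold strategy configured by these accessors can be sketched in plain Python. This is an illustration of the general technique only, not the persalys implementation; the `n_folds=3` default and the `seed` argument mirror the defaults described above, while the function name `kfold_indices` is hypothetical:

```python
import random

def kfold_indices(n_points, n_folds=3, seed=0):
    """Partition sample indices into n_folds disjoint validation folds."""
    rng = random.Random(seed)
    indices = list(range(n_points))
    rng.shuffle(indices)
    # Slice round-robin so fold sizes differ by at most one point
    return [indices[i::n_folds] for i in range(n_folds)]

folds = kfold_indices(12, n_folds=3, seed=1)
# Each fold serves once as the validation set; the remaining folds
# are used to build the metamodel that is validated against it.
for validation_fold in folds:
    training = [i for fold in folds if fold is not validation_fold for i in fold]
    assert len(validation_fold) + len(training) == 12
```

Fixing the seed makes the fold assignment reproducible from one run to the next, which is why the seed accessors above exist.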

getName()

Accessor to the object’s name.

Returns
name : str

The name of the object.

getPythonScript()

Python script accessor.

Returns
script : str

Python script to replay the analysis

getTestSampleValidationPercentageOfPoints()

Percentage of points accessor.

Returns
percentage : int

Percentage of points used to validate the metamodel. By default it is 20%.

getTestSampleValidationSeed()

Seed accessor.

Returns
seed : int

Seed value for the validation with a test sample

getWarningMessage()

Warning message accessor.

Returns
message : str

Warning message which can appear during the analysis computation

hasName()

Test if the object is named.

Returns
hasName : bool

True if the name is not empty.

hasValidResult()

Whether the analysis has been run.

Returns
hasValidResult : bool

Whether the analysis has already been run

isReliabilityAnalysis()

Whether the analysis involves reliability.

Returns
isReliabilityAnalysis : bool

Whether the analysis involves a reliability analysis

isRunning()

Whether the analysis is running.

Returns
isRunning : bool

Whether the analysis is running

kFoldValidation()

Whether a k-Fold cross-validation is requested.

Returns
validation : bool

Whether a k-Fold cross-validation is requested

leaveOneOutValidation()

Whether a validation by leave-one-out is requested.

Returns
validation : bool

Whether a validation by leave-one-out is requested
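Leave-one-out is the limiting case of k-fold cross-validation in which every fold holds exactly one point: each sample point is left out in turn and predicted from a metamodel built on all the others. A minimal sketch of the splitting (illustrative only; `leave_one_out_splits` is a hypothetical helper, not a persalys function):

```python
def leave_one_out_splits(n_points):
    """Yield (training_indices, left_out_index) pairs, one per sample point."""
    for left_out in range(n_points):
        training = [i for i in range(n_points) if i != left_out]
        yield training, left_out

splits = list(leave_one_out_splits(4))
assert len(splits) == 4              # one split per point
assert splits[0] == ([1, 2, 3], 0)   # point 0 is left out first
```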

run()

Launch the analysis.

setAnalyticalValidation(validation)

Whether an analytical validation is requested.

Parameters
validation : bool

Whether an analytical validation is requested. The analytical validation corresponds to an approximation of the leave-one-out result.

setInterestVariables(variablesNames)

Set the variables to analyse.

Parameters
variablesNames : sequence of str

Names of the variables to analyse

setKFoldValidation(validation)

Whether a k-Fold cross-validation is requested.

Parameters
validation : bool

Whether a k-Fold cross-validation is requested

setKFoldValidationNumberOfFolds(nbFolds)

Number of folds accessor.

Parameters
nbFolds : int

Number of folds. By default it is 3.

setKFoldValidationSeed(seed)

Seed accessor.

Parameters
seed : int

Seed value for k-Fold cross-validation

setLeaveOneOutValidation(validation)

Whether a validation by leave-one-out is requested.

Parameters
validation : bool

Whether a validation by leave-one-out is requested

setName(name)

Accessor to the object’s name.

Parameters
name : str

The name of the object.

setTestSampleValidation(validation)

Whether a validation with a test sample is requested.

Parameters
validation : bool

Whether a validation with a test sample is requested. The data sample is divided into two sub-samples: a training sample (default: 80% of the sample points) and a test sample (default: 20% of the sample points). A new metamodel is built with the training sample and is validated with the test sample. The points are randomly picked from the data sample (by default the seed is 1).

setTestSampleValidationPercentageOfPoints(percentage)

Percentage of points accessor.

Parameters
percentage : int

Percentage of points used to validate the metamodel. By default it is 20%.

setTestSampleValidationSeed(seed)

Seed accessor.

Parameters
seed : int

Seed value for the validation with a test sample

testSampleValidation()

Whether a validation with a test sample is requested.

Returns
validation : bool

Whether a validation with a test sample is requested. The data sample is divided into two sub-samples: a training sample (default: 80% of the sample points) and a test sample (default: 20% of the sample points). A new metamodel is built with the training sample and is validated with the test sample. The points are randomly picked from the data sample (by default the seed is 1).