PolynomialRegressionAnalysis¶
- class persalys.PolynomialRegressionAnalysis(*args)¶
Polynomial stepwise regression.
Allows one to select the most suitable polynomial (canonical) basis for a linear regression model with the help of the stepwise algorithm.
See also LinearModelStepwiseAlgorithm.
- Parameters
- name : str
Name
- designOfExperiment : DesignOfExperiment
Design of experiments
Examples
>>> import openturns as ot
>>> import persalys
>>> from math import pi
>>> ot.RandomGenerator.SetSeed(0)
Create the model:
>>> xi1 = persalys.Input('xi1', ot.Uniform(-pi, pi))
>>> xi2 = persalys.Input('xi2', ot.Uniform(-pi, pi))
>>> xi3 = persalys.Input('xi3', ot.Uniform(-pi, pi))
>>> y0 = persalys.Output('y0')
>>> myPhysicalModel = persalys.SymbolicPhysicalModel('myPhysicalModel', [xi1, xi2, xi3], [y0], ['sin(xi1) + 7. * (sin(xi2)) ^ 2 + 0.1 * xi3^4 * sin(xi1)'])
Create the design of experiments:
>>> aDesign = persalys.ProbabilisticDesignOfExperiment('aDesign', myPhysicalModel, 100, 'LHS')
>>> aDesign.run()
Create the polynomial regression analysis:
>>> lm = persalys.PolynomialRegressionAnalysis('lm', aDesign)
>>> lm.setDegree(1)
>>> lm.setInteraction(False)
>>> lm.run()
Get the result:
>>> result = lm.getResult()
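The accessors documented below can then be used to inspect the state of the analysis; a minimal sketch, using only methods listed on this page:

>>> hasResult = lm.hasValidResult()   # True once run() has completed
>>> warning = lm.getWarningMessage()  # warning raised during the computation, if any
>>> script = lm.getPythonScript()     # Python script to replay the analysis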
Methods

- analyticalValidation(): Whether an analytical validation is requested.
- getClassName(): Accessor to the object's class name.
- getDegree(): Accessor to the basis degree.
- getDesignOfExperiment(): Design of experiments accessor.
- getEffectiveInputSample(): Effective input sample accessor.
- getEffectiveOutputSample(): Effective output sample accessor.
- getErrorMessage(): Error message accessor.
- getInteraction(): Accessor to the interaction flag.
- getInterestVariables(): Get the variables to analyse.
- getKFoldValidationNumberOfFolds(): Number of folds accessor.
- getKFoldValidationSeed(): Seed accessor.
- getName(): Accessor to the object's name.
- getPythonScript(): Python script accessor.
- getResult(): Results accessor.
- getTestSampleValidationPercentageOfPoints(): Percentage of points accessor.
- getTestSampleValidationSeed(): Seed accessor.
- getWarningMessage(): Warning message accessor.
- hasName(): Test if the object is named.
- hasValidResult(): Whether the analysis has been run.
- isReliabilityAnalysis(): Whether the analysis involves reliability.
- isRunning(): Whether the analysis is running.
- kFoldValidation(): Whether a k-Fold cross-validation is requested.
- leaveOneOutValidation(): Whether a validation by leave-one-out is requested.
- run(): Launch the analysis.
- setAnalyticalValidation(validation): Whether an analytical validation is requested.
- setDegree(degree): Accessor to the basis degree.
- setInteraction(interaction): Accessor to the interaction flag.
- setInterestVariables(variablesNames): Set the variables to analyse.
- setKFoldValidation(validation): Whether a k-Fold cross-validation is requested.
- setKFoldValidationNumberOfFolds(nbFolds): Number of folds accessor.
- setKFoldValidationSeed(seed): Seed accessor.
- setLeaveOneOutValidation(validation): Whether a validation by leave-one-out is requested.
- setName(name): Accessor to the object's name.
- setTestSampleValidation(validation): Whether a validation with a test sample is requested.
- setTestSampleValidationPercentageOfPoints(percentage): Percentage of points accessor.
- setTestSampleValidationSeed(seed): Seed accessor.
- testSampleValidation(): Whether a validation with a test sample is requested.
- canBeLaunched()
- getElapsedTime()
- getParentObserver()
- __init__(*args)¶
- analyticalValidation()¶
Whether an analytical validation is requested.
- Returns
- validation : bool
Whether an analytical validation is requested. This validation corresponds to an analytical approximation of the leave-one-out result.
- getClassName()¶
Accessor to the object's class name.
- Returns
- class_name : str
The object class name (object.__class__.__name__).
- getDegree()¶
Accessor to the basis degree.
- Returns
- degree : int
Basis degree
- getDesignOfExperiment()¶
Design of experiments accessor.
- Returns
- model : DesignOfExperiment
Design of experiments
- getEffectiveInputSample()¶
Effective input sample accessor.
- Returns
- sample : openturns.Sample
Sample of all the input variables if all of them are deterministic. Otherwise, sample of the stochastic input variables.
- getEffectiveOutputSample()¶
Effective output sample accessor.
- Returns
- sample : openturns.Sample
Sample of the interest output variables.
- getErrorMessage()¶
Error message accessor.
- Returns
- message : str
Error message if the analysis failed
- getInteraction()¶
Accessor to the interaction flag.
- Returns
- interaction : bool
Whether to include interaction terms in the basis
- getInterestVariables()¶
Get the variables to analyse.
- Returns
- variablesNames : sequence of str
Names of the variables to analyse
- getKFoldValidationNumberOfFolds()¶
Number of folds accessor.
- Returns
- folds : int
Number of folds. By default it is 3.
- getKFoldValidationSeed()¶
Seed accessor.
- Returns
- seed : int
Seed value for k-Fold cross-validation
- getName()¶
Accessor to the object’s name.
- Returns
- name : str
The name of the object.
- getPythonScript()¶
Python script accessor.
- Returns
- script : str
Python script to replay the analysis
- getResult()¶
Results accessor.
- Returns
- result : PolynomialRegressionAnalysisResult
Results
- getTestSampleValidationPercentageOfPoints()¶
Percentage of points accessor.
- Returns
- percentage : int
Percentage of points used to validate the metamodel. By default it is 20%.
- getTestSampleValidationSeed()¶
Seed accessor.
- Returns
- seed : int
Seed value for the validation with a test sample
- getWarningMessage()¶
Warning message accessor.
- Returns
- message : str
Warning message which can appear during the analysis computation
- hasName()¶
Test if the object is named.
- Returns
- hasName : bool
True if the name is not empty.
- hasValidResult()¶
Whether the analysis has been run.
- Returns
- hasValidResult : bool
Whether the analysis has already been run
- isReliabilityAnalysis()¶
Whether the analysis involves reliability.
- Returns
- isReliabilityAnalysis : bool
Whether the analysis involves a reliability analysis
- isRunning()¶
Whether the analysis is running.
- Returns
- isRunning : bool
Whether the analysis is running
- kFoldValidation()¶
Whether a k-Fold cross-validation is requested.
- Returns
- validation : bool
Whether a k-Fold cross-validation is requested
- leaveOneOutValidation()¶
Whether a validation by leave-one-out is requested.
- Returns
- validation : bool
Whether a validation by leave-one-out is requested
- run()¶
Launch the analysis.
- setAnalyticalValidation(validation)¶
Whether an analytical validation is requested.
- Parameters
- validation : bool
Whether an analytical validation is requested. This validation corresponds to an analytical approximation of the leave-one-out result.
- setDegree(degree)¶
Accessor to the basis degree.
- Parameters
- degree : int
Basis degree
- setInteraction(interaction)¶
Accessor to the interaction flag.
- Parameters
- interaction : bool
Whether to include interaction terms in the basis
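As an illustrative sketch combining setDegree and setInteraction (reusing the design of experiments aDesign from the Examples section; the name 'lm2' is an arbitrary choice), a degree-2 basis with interaction terms adds cross terms such as xi1*xi2 to the candidate basis:

>>> lm2 = persalys.PolynomialRegressionAnalysis('lm2', aDesign)
>>> lm2.setDegree(2)          # candidate polynomial basis of degree 2
>>> lm2.setInteraction(True)  # also include interaction terms such as xi1*xi2
>>> lm2.run()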
- setInterestVariables(variablesNames)¶
Set the variables to analyse.
- Parameters
- variablesNames : sequence of str
Names of the variables to analyse
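For instance, the analysis of the Examples section can be restricted explicitly to the output y0 (a sketch; y0 is the only output there, so this only makes the default explicit):

>>> lm.setInterestVariables(['y0'])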
- setKFoldValidation(validation)¶
Whether a k-Fold cross-validation is requested.
- Parameters
- validation : bool
Whether a k-Fold cross-validation is requested
- setKFoldValidationNumberOfFolds(nbFolds)¶
Number of folds accessor.
- Parameters
- nbFolds : int
Number of folds. By default it is 3.
- setKFoldValidationSeed(seed)¶
Seed accessor.
- Parameters
- seed : int
Seed value for k-Fold cross-validation
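A sketch of enabling k-Fold cross-validation with these settings, on the analysis of the Examples section (5 folds and a seed of 0 are arbitrary choices; the default number of folds is 3):

>>> lm.setKFoldValidation(True)
>>> lm.setKFoldValidationNumberOfFolds(5)  # default is 3
>>> lm.setKFoldValidationSeed(0)           # fix the seed to get reproducible folds
>>> lm.run()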
- setLeaveOneOutValidation(validation)¶
Whether a validation by leave-one-out is requested.
- Parameters
- validation : bool
Whether a validation by leave-one-out is requested
- setName(name)¶
Accessor to the object’s name.
- Parameters
- name : str
The name of the object.
- setTestSampleValidation(validation)¶
Whether a validation with a test sample is requested.
- Parameters
- validation : bool
Whether a validation with a test sample is requested. The data sample is divided into two sub-samples: a training sample (default: 80% of the sample points) and a test sample (default: 20% of the sample points). A new metamodel is built with the training sample and validated with the test sample. The points are randomly picked from the data sample (by default the seed is 1).
- setTestSampleValidationPercentageOfPoints(percentage)¶
Percentage of points accessor.
- Parameters
- percentage : int
Percentage of points used to validate the metamodel. By default it is 20%.
- setTestSampleValidationSeed(seed)¶
Seed accessor.
- Parameters
- seed : int
Seed value for the validation with a test sample
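A sketch of the test-sample validation settings documented above, on the analysis of the Examples section (the values below simply restate the documented defaults of a 20% test sample and a seed of 1):

>>> lm.setTestSampleValidation(True)
>>> lm.setTestSampleValidationPercentageOfPoints(20)  # 20% of the points form the test sample
>>> lm.setTestSampleValidationSeed(1)                 # seed used to pick the test points
>>> lm.run()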
- testSampleValidation()¶
Whether a validation with a test sample is requested.
- Returns
- validation : bool
Whether a validation with a test sample is requested. The data sample is divided into two sub-samples: a training sample (default: 80% of the sample points) and a test sample (default: 20% of the sample points). A new metamodel is built with the training sample and validated with the test sample. The points are randomly picked from the data sample (by default the seed is 1).