Additive Structure/Additive Model in ML
- Python Automation and Machine Learning for ICs -
- An Online Book -
Python Automation and Machine Learning for ICs                                                           http://www.globalsino.com/ICs/        



=================================================================================

In machine learning, Additive Structure typically refers to a model or function that is composed of the sum of several components or features. An additive model assumes that the overall output is the sum of the individual contributions from each component.

Mathematically, an additive model can be represented as follows:

          f(x) = Σ_{i=1}^{n} f_i(x_i) -------------------------------------- [3745a]

where:

  • f(x) is the overall function or prediction.
  • n is the number of components or features.
  • x_i represents the individual features.
  • f_i(x_i) is the contribution of the i-th feature to the overall prediction.
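
As a quick illustration of equation [3745a], the sketch below evaluates such a sum in Python; the component functions f_i are made up purely for demonstration and are not from this book:

```python
import numpy as np

# Hypothetical per-feature component functions f_i, chosen only for illustration
components = [
    lambda x: np.sin(x),    # f_1(x_1)
    lambda x: 0.5 * x**2,   # f_2(x_2)
    lambda x: -0.3 * x,     # f_3(x_3)
]

def additive_model(x):
    """Return the sum of the per-feature contributions f_i(x_i)."""
    return sum(f_i(x_i) for f_i, x_i in zip(components, x))

x = np.array([1.0, 2.0, 3.0])   # one sample with three features
print(additive_model(x))        # sin(1.0) + 0.5*2.0**2 + (-0.3)*3.0
```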

Additive models are commonly used in various machine learning algorithms, such as additive regression models and tree-based models like gradient boosting. These models are often flexible and can capture complex relationships between features and the target variable by combining simpler functions. The additive structure allows for a modular representation of the problem, where each component focuses on a specific aspect of the input data.
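
For instance, a fitted GradientBoostingRegressor in scikit-learn is literally a sum: with the default squared-error loss, its prediction equals the initial estimate plus the learning-rate-weighted contributions of the individual trees. A minimal sketch of that check, on assumed synthetic data:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

X, y = make_regression(n_samples=200, n_features=4, noise=5.0, random_state=0)

model = GradientBoostingRegressor(n_estimators=50, learning_rate=0.1,
                                  max_depth=3, random_state=0).fit(X, y)

# Reassemble the prediction as an additive sum:
# initial estimate + learning_rate * (sum of the individual trees' predictions)
additive_sum = model.init_.predict(X).ravel() + model.learning_rate * sum(
    tree.predict(X) for tree in model.estimators_[:, 0]
)

print(np.allclose(additive_sum, model.predict(X)))  # expected: True
```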

For example, in additive regression, each term might represent the effect of a specific feature on the output, and the final prediction is the sum of these individual effects.
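
A sketch of this idea on assumed toy data: each feature gets its own simple polynomial basis, one linear model is fitted on all the expanded columns, and the prediction can then be read back as the sum of per-feature effects:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(300, 2))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.1, size=300)

# Expand each feature separately (degree-3 polynomial basis, no bias column)
expanders = [PolynomialFeatures(degree=3, include_bias=False) for _ in range(2)]
blocks = [e.fit_transform(X[:, [j]]) for j, e in enumerate(expanders)]
model = LinearRegression().fit(np.hstack(blocks), y)

# Split the fitted coefficients back into per-feature effects f_1(x_1), f_2(x_2)
sizes = [b.shape[1] for b in blocks]
coef_blocks = np.split(model.coef_, np.cumsum(sizes)[:-1])
effects = [b @ c for b, c in zip(blocks, coef_blocks)]

# The prediction is the intercept plus the sum of the individual effects
pred = model.intercept_ + sum(effects)
print(np.allclose(pred, model.predict(np.hstack(blocks))))  # expected: True
```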

Figure 3745 shows an ensemble of decision trees. It was built with the scikit-learn library, using a decision tree for regression together with gradient boosting, which can be considered an additive model.


Figure 3745. Ensemble of decision trees (Code).
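
The linked code is not reproduced here; a minimal sketch in the same spirit, on assumed synthetic data, fits a single decision tree for regression and a gradient-boosted ensemble with scikit-learn and plots both:

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import GradientBoostingRegressor

# Toy 1-D regression data (assumed; not the data behind Figure 3745)
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(0, 6, size=(120, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(scale=0.15, size=120)

tree = DecisionTreeRegressor(max_depth=3).fit(X, y)
boost = GradientBoostingRegressor(n_estimators=100, learning_rate=0.1,
                                  max_depth=3, random_state=0).fit(X, y)

grid = np.linspace(0, 6, 500).reshape(-1, 1)
plt.scatter(X, y, s=10, color="gray", label="data")
plt.plot(grid, tree.predict(grid), label="single decision tree")
plt.plot(grid, boost.predict(grid), label="gradient boosting (additive ensemble)")
plt.xlabel("x")
plt.ylabel("y")
plt.legend()
plt.show()
```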

=================================================================================