Important Machine Learning Concepts Part – 2

Ensemble Learning

Training multiple models (often with different parameters or algorithms) to solve the same problem and combining their predictions, which usually performs better than any single model on its own.
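
As a rough illustration (assuming scikit-learn is available), the sketch below trains three different classifiers on the same data and combines them by majority vote:

# Minimal ensemble sketch: three different models vote on each prediction.
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)

ensemble = VotingClassifier(estimators=[
    ("lr", LogisticRegression(max_iter=1000)),
    ("knn", KNeighborsClassifier()),
    ("tree", DecisionTreeClassifier()),
], voting="hard")  # "hard" = majority vote across the three models

ensemble.fit(X, y)
print(ensemble.predict(X[:5]))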

A/B Testing

Statistical way of comparing two or more techniques to determine which one performs better, and whether the difference is statistically significant.
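
As a hedged sketch (assuming SciPy is available; the conversion numbers are made up), one common approach is a chi-squared test on the outcome counts of the two variants:

# Hypothetical A/B test: compare conversion counts of two variants.
from scipy.stats import chi2_contingency

# Rows: variant A, variant B; columns: converted, not converted.
observed = [[120, 880],   # A: 120 conversions out of 1000 visitors
            [150, 850]]   # B: 150 conversions out of 1000 visitors

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"p-value = {p_value:.4f}")  # a small p-value suggests the difference is significant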

Baseline Model

Simple model or heuristic used as a reference point for judging how well a model is performing.
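
For instance (assuming scikit-learn), a classifier that always predicts the most frequent class gives a floor that any real model should beat:

# Baseline sketch: always predict the majority class.
from sklearn.datasets import load_iris
from sklearn.dummy import DummyClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

baseline = DummyClassifier(strategy="most_frequent")
baseline.fit(X_train, y_train)
print("baseline accuracy:", baseline.score(X_test, y_test))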

Bias

Prejudice or favouritism towards some things, people, or groups over others that can affect the sampling and interpretation of data, the design of a system, and how users interact with that system.

Dynamic Model

Model that is trained online in a continuously updating fashion.

Static model

Model that is trained offline, i.e. trained once on a fixed dataset and then used without further updates.
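
A rough sketch of the contrast (assuming scikit-learn; the data and batches are made up): a dynamic model is updated incrementally as new batches arrive, while a static model is fit once offline:

# Online (dynamic) vs. offline (static) training.
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(0)

# Dynamic model: updated incrementally as new batches of data arrive.
dynamic_model = SGDClassifier()
for _ in range(5):
    X_batch = rng.normal(size=(100, 3))
    y_batch = (X_batch[:, 0] > 0).astype(int)
    dynamic_model.partial_fit(X_batch, y_batch, classes=[0, 1])

# Static model: trained once on a fixed offline dataset.
X_all = rng.normal(size=(500, 3))
y_all = (X_all[:, 0] > 0).astype(int)
static_model = SGDClassifier().fit(X_all, y_all)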

Normalization

Process of converting an actual range of values into a standard range of values, typically -1 to +1.
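
A minimal sketch, assuming NumPy and made-up values, of rescaling a feature to the range -1 to +1:

# Min-max normalization to [-1, +1].
import numpy as np

values = np.array([10.0, 20.0, 35.0, 50.0])
lo, hi = values.min(), values.max()
normalized = 2 * (values - lo) / (hi - lo) - 1  # maps the minimum to -1 and the maximum to +1
print(normalized)  # -1.0, -0.5, 0.25, 1.0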

Independently and Identically Distributed

Data drawn from a distribution that doesn’t change, and where each value drawn doesn’t depend on previously drawn values; ideal but rarely found in real life.

Hyperparameters

In machine learning, a hyperparameter is a parameter whose value is used to control the learning process. By contrast, the values of other parameters are derived via training.
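
For example (assuming scikit-learn), the regularization strength below is a hyperparameter chosen before training, while the coefficients are parameters derived by training:

# Hyperparameters are set before training; parameters are learned from data.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression

X, y = load_iris(return_X_y=True)

model = LogisticRegression(C=0.1, max_iter=1000)  # C and max_iter are hyperparameters
model.fit(X, y)
print(model.coef_)  # coef_ holds the parameters learned during training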

Generalization

Refers to a model’s ability to make correct predictions on new, previously unseen data as opposed to the data used to train the model.
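
One way to gauge this, sketched below under the assumption that scikit-learn is available, is to compare accuracy on the training data with accuracy on held-out test data:

# Checking generalization: evaluate on data the model has never seen.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = DecisionTreeClassifier().fit(X_train, y_train)
print("train accuracy:", model.score(X_train, y_train))
print("test accuracy:", model.score(X_test, y_test))  # gauges generalization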

Cross-entropy

Cross-entropy is commonly used in machine learning as a loss function. It is a measure from the field of information theory, building on entropy and, broadly, measuring the difference between two probability distributions.
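
A minimal sketch with NumPy (values made up): the cross-entropy between a true distribution p and a predicted distribution q is -sum(p * log(q)):

# Cross-entropy H(p, q) = -sum(p * log(q)).
import numpy as np

p = np.array([1.0, 0.0, 0.0])  # true distribution (one-hot label)
q = np.array([0.7, 0.2, 0.1])  # predicted probabilities
cross_entropy = -np.sum(p * np.log(q + 1e-12))  # small epsilon avoids log(0)
print(cross_entropy)  # ~0.357, i.e. -log(0.7)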
