Gradient Descent


In this topic, you will get a mathematical explanation and derivation of gradient descent.

http://mccormickml.com/2014/03/04/gradient-descent-derivation/
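
To complement the derivation, here is a minimal sketch of batch gradient descent for simple linear regression in Python. The toy data, learning rate, and iteration count are illustrative assumptions, not values taken from the article.

    import numpy as np

    # Toy data: y = 2x + 1 with a little noise (illustrative only)
    rng = np.random.default_rng(0)
    x = np.linspace(0, 1, 50)
    y = 2 * x + 1 + rng.normal(scale=0.05, size=x.shape)

    # Parameters of the line h(x) = w * x + b, initialized at zero
    w, b = 0.0, 0.0
    lr = 0.1          # learning rate (assumed)
    n_iters = 1000    # number of gradient steps (assumed)

    for _ in range(n_iters):
        y_pred = w * x + b
        error = y_pred - y
        # Gradients of the mean squared error cost with respect to w and b
        grad_w = (2 / len(x)) * np.dot(error, x)
        grad_b = (2 / len(x)) * error.sum()
        # Step in the direction of steepest descent
        w -= lr * grad_w
        b -= lr * grad_b

    print(f"Learned w={w:.3f}, b={b:.3f}")  # should be close to 2 and 1

Each iteration computes the gradient of the mean squared error over the whole dataset before taking a step; stochastic and mini-batch variants update on subsets of the data instead.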


Bias Variance Tradeoff


Supervised machine learning algorithms can best be understood through the lens of the bias-variance trade-off. In this topic, you will discover the bias-variance trade-off and how to use it to better understand machine learning algorithms and get better performance on your data.

https://machinelearningmastery.com/gentle-introduction-to-the-bias-variance-trade-off-in-machine-learning/
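
To make the trade-off concrete, the sketch below (an illustrative assumption, not code from the linked post) fits polynomial models of increasing degree with scikit-learn and compares their training and test scores; a low degree underfits (high bias) while a very high degree tends to overfit (high variance).

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression

    # Noisy samples from a smooth curve (illustrative data)
    rng = np.random.default_rng(0)
    X = np.sort(rng.uniform(0, 1, 80)).reshape(-1, 1)
    y = np.sin(2 * np.pi * X).ravel() + rng.normal(scale=0.2, size=80)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    for degree in (1, 4, 15):  # low, moderate, and high model complexity
        model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        model.fit(X_train, y_train)
        print(degree,
              round(model.score(X_train, y_train), 3),   # training R^2
              round(model.score(X_test, y_test), 3))     # test R^2
    # Degree 1 typically underfits (high bias); degree 15 typically overfits (high variance).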


Ridge and Lasso Regression


In this topic, you will get a comprehensive beginner's guide to linear, ridge, and lasso regression in Python.

https://www.analyticsvidhya.com/blog/2017/06/a-comprehensive-guide-for-linear-ridge-and-lasso-regression/
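
As a companion to the guide, here is a minimal scikit-learn sketch that fits ordinary linear, ridge, and lasso regression on the same toy data so you can compare their coefficients. The data and the alpha values are assumptions chosen for illustration.

    import numpy as np
    from sklearn.linear_model import LinearRegression, Ridge, Lasso

    # Toy data: two informative features and one irrelevant feature (illustrative)
    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))
    y = 3 * X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=100)

    for name, model in [("Linear", LinearRegression()),
                        ("Ridge", Ridge(alpha=1.0)),    # L2 penalty shrinks coefficients
                        ("Lasso", Lasso(alpha=0.1))]:   # L1 penalty can zero them out
        model.fit(X, y)
        print(name, np.round(model.coef_, 3))

The L2 penalty in ridge shrinks all coefficients toward zero, while the L1 penalty in lasso can drive some coefficients exactly to zero, performing feature selection.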



Cross Validation


In this topic, you will get an overview of how to split data into training and test sets and how to evaluate models with cross-validation.

https://towardsdatascience.com/train-test-split-and-cross-validation-in-python-80b61beca4b6
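
For a quick, concrete companion to the article, the sketch below shows both a simple hold-out train/test split and 5-fold cross-validation using scikit-learn. The choice of dataset and classifier is an assumption for illustration.

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split, cross_val_score

    X, y = load_iris(return_X_y=True)

    # Hold-out split: 70% of the data for training, 30% for evaluation
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("Hold-out accuracy:", model.score(X_test, y_test))

    # 5-fold cross-validation: every sample is used for testing exactly once
    scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
    print("CV accuracy per fold:", scores, "mean:", scores.mean())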



Hyperparameter Tuning


Machine learning models are parameterized so that their behavior can be tuned for a given problem. Models can have many parameters, and finding the best combination of parameters can be treated as a search problem. In this topic, you will discover how to tune the parameters of machine learning algorithms in Python using the scikit-learn library.

https://machinelearningmastery.com/how-to-tune-algorithm-parameters-with-scikit-learn/
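
As one concrete way to frame the search, the sketch below uses scikit-learn's GridSearchCV to try several combinations of C and kernel for a support vector classifier. The parameter grid and dataset are illustrative assumptions, not the exact example from the post.

    from sklearn.datasets import load_iris
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = load_iris(return_X_y=True)

    # Grid of candidate parameter combinations to search over (assumed values)
    param_grid = {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}

    # Exhaustive search: each combination is scored with 5-fold cross-validation
    search = GridSearchCV(SVC(), param_grid, cv=5)
    search.fit(X, y)

    print("Best parameters:", search.best_params_)
    print("Best cross-validated accuracy:", round(search.best_score_, 3))

GridSearchCV evaluates every combination in the grid with cross-validation; RandomizedSearchCV is the usual alternative when the grid becomes too large to search exhaustively.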