Hi 👋, I'm Laura, a Software Engineer with a love for Machine Learning and Maths. As a book lover I c...

# 2021

Sep 06, 2021
Studied Mathematics
Obtained a certificate
Just finished the last assignment of Week 4 - PCA  :)

This course was the hardest of the whole Mathematics for Machine Learning specialization, but also the most useful of the three.

I truly enjoyed it and will continue studying PCA in depth, as I think it's a technique everyone who studies ML should know about.

Check out my certificate: https://www.coursera.org/account/accomplishments/specialization/certificate/B5VANAZBL8PG

#ProudMoment :)
Sep 04, 2021
Learning PCA
Studying Machine Learning
Just finished the last course of Mathematics for Machine Learning by Imperial College London.

The last course covered Principal Component Analysis, with the main content being:

1. Measures of center and spread, such as the mean and variance.
2. How to calculate inner products in 1D and their generalization to multiple dimensions.
3. Orthogonal projections in 1D and their generalization to multiple dimensions.
4. PCA: how to project multidimensional data onto lower-dimensional subspaces while losing as little information as possible.
• The idea behind PCA is to compress data.
• The data should be normalized.
• Compute the covariance matrix.
• Compute the eigenvectors and eigenvalues; the eigenvectors with the largest eigenvalues point in the directions that maximize the variance.
• Obtain the principal components.
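The steps above can be sketched in NumPy. This is a minimal illustration on made-up random data; the shapes, names, and choice of k are my own, not from the course:

```python
import numpy as np

# Hypothetical toy data: 100 samples, 3 features (purely illustrative).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))

# 1. Normalize: center each feature at zero mean.
X_centered = X - X.mean(axis=0)

# 2. Compute the covariance matrix (rowvar=False: columns are features).
cov = np.cov(X_centered, rowvar=False)

# 3. Eigendecomposition; eigh suits the symmetric covariance matrix.
eigvals, eigvecs = np.linalg.eigh(cov)

# 4. Sort by decreasing eigenvalue: directions of maximum variance first.
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 5. Project onto the top-k principal components.
k = 2
X_projected = X_centered @ eigvecs[:, :k]
print(X_projected.shape)  # (100, 2)
```

Sorting the eigenpairs first is what makes "keep the top k" equivalent to keeping the directions of maximum variance.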

This last section of the course was really hard and challenging, and through it I came to appreciate the importance of understanding the foundations of PCA and its applications.
Sep 01, 2021
Studying Machine Learning
Worked on the assignment for week 3 of the Mathematics for ML Specialization by Imperial College London.

The assignment was fairly simple: you apply the concepts to calculate projection matrices for the 1D and general cases.

Key points:
• Be careful when using vector operations in NumPy. For example, for a 1-D array b, b @ b.T calculates the dot product, when what we need is the outer product. For that, use np.outer, which returns the corresponding matrix.
• Check the dimensions of your vector operations, as you might need to transpose some arrays for them to work correctly.
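Both points can be seen in a short sketch. The vector b and the test vector x below are made up for illustration:

```python
import numpy as np

b = np.array([1.0, 2.0, 2.0])

# For a 1-D array, .T is a no-op, so b @ b.T collapses to the dot product.
dot = b @ b.T  # 9.0, a scalar

# np.outer returns the full n x n matrix b b^T that the projection formula needs.
outer = np.outer(b, b)

# Projection matrix onto the line spanned by b: P = b b^T / (b^T b).
P = outer / (b @ b)

# P is idempotent (P @ P == P), the defining property of a projection.
x = np.array([3.0, 0.0, 0.0])
print(P @ x)  # the component of x along b
```

Reshaping b to a column vector, e.g. b[:, None], is an alternative that makes b @ b.T produce the outer product directly.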
Aug 30, 2021
Linear Regression
Studying Machine Learning
Week 3 - Statistical Learning Course.

Just started the Linear Regression chapter, in which the basics of how linear regression works were explained.

• Residual sum of squares (RSS): the sum of the squared differences between the real values (targets) and the predicted values.
• The idea is to estimate the parameters that minimize the RSS, so that the predicted values are as close as possible to the real values.
• On the other hand, we want to know how far our estimates are from the true parameter values; this can be quantified through the standard error of each parameter and the resulting confidence intervals.
• A confidence interval is a calculated range of values: a 95% interval means that if we repeated the sampling and fitting many times, about 95% of the intervals constructed this way would contain the true parameter.
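The RSS idea can be illustrated with a small least-squares fit. The data here is synthetic, generated from a made-up line plus noise, so the "true" parameters are known:

```python
import numpy as np

# Hypothetical data from y = 2x + 1 plus Gaussian noise (illustrative only).
rng = np.random.default_rng(42)
x = rng.uniform(0, 10, size=50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.5, size=50)

# Least-squares fit: the slope/intercept that minimize the RSS.
slope, intercept = np.polyfit(x, y, deg=1)

# RSS: squared differences between the real and predicted values.
y_pred = slope * x + intercept
rss = np.sum((y - y_pred) ** 2)

print(slope, intercept, rss)
```

With enough data the fitted slope and intercept land close to the true 2 and 1, and the RSS reflects only the noise left over.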
Aug 28, 2021
Bayesian Statistics
Weekend :-)

Spent the morning in a coffee shop studying Orthogonal projections in 1D and 2D.

Later in the day I moved to another coffee shop and started drafting a blog post on Bayes classifiers, while reading around the same topic in Deep Learning by Ian Goodfellow.

Stay tuned :)
Aug 23, 2021
Learning statistics
Continued working on Statistical Learning - Week 2

Notes:
• Intro to Nearest Neighbours and Linear Regression
• Nearest Neighbours modeling is cool, but it has a few disadvantages. For example, in high dimensions the principle of locality breaks down: you need to expand the neighbourhood further and further to cover enough points, so the "nearest" neighbours are no longer truly local.
• The trade-offs between Linear Regression and thin-plate splines were also explained. Linear models are:
• Way easier to interpret
• Simpler when explaining the effect of each predictor
• Not as accurate, but they don't overfit as much as thin-plate splines
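The locality point can be illustrated numerically. This is a small made-up experiment (the helper name and sample sizes are my own): for points drawn uniformly in a unit hypercube, the average distance to the nearest neighbour grows as the dimension increases, so "nearest" stops meaning "nearby":

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_nn_distance(n_points, dim):
    """Average distance from each point to its nearest neighbour,
    for points drawn uniformly in the unit hypercube of given dimension."""
    pts = rng.uniform(size=(n_points, dim))
    # All pairwise distances; the diagonal (self-distance) is masked out.
    diffs = pts[:, None, :] - pts[None, :, :]
    dists = np.sqrt((diffs ** 2).sum(axis=-1))
    np.fill_diagonal(dists, np.inf)
    return dists.min(axis=1).mean()

for dim in (2, 10, 100):
    print(dim, mean_nn_distance(200, dim))
```

With the same 200 points, the nearest neighbour is very close in 2D but far away in 100D, which is exactly why Nearest Neighbours struggles in high dimensions.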