Attended [30 days of ML] by Kaggle
Bummer. Once again I am an almost-was. It was fun to imagine applications for XGBoost. It doesn't seem very different from what I am used to for image classification, though that is only true when I am running the linear regression algorithm instead of the random forest algorithm. I have a lot to learn about random forest hyperparameters.
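To make the comparison above concrete, here is a minimal sketch contrasting a plain linear regression with a random forest and its most common hyperparameters. This is not the competition data or my actual notebook, just a toy example on synthetic data, and the parameter values shown are illustrative assumptions.

```python
# Toy comparison: linear regression vs. random forest on synthetic data.
# Everything here (dataset, parameter values) is illustrative, not from
# the actual competition notebook.
from sklearn.datasets import make_regression
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=10, noise=0.1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

linear = LinearRegression().fit(X_tr, y_tr)
forest = RandomForestRegressor(
    n_estimators=200,    # number of trees in the ensemble
    max_depth=None,      # let trees grow until leaves are pure
    min_samples_leaf=2,  # mild regularization on leaf size
    random_state=0,
).fit(X_tr, y_tr)

print(linear.score(X_te, y_te), forest.score(X_te, y_te))
```

On purely linear synthetic data like this the linear model wins easily; the forest's hyperparameters start to matter on messier, nonlinear tabular data.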

I cannot run XGBoost on the dataset in any form without running out of memory. I was also constantly delayed by lack of access to compute. I get that they are having very high usage right now. I might consider submitting my phone number or budgeting some money for a few hours on GCP. I think I want to be a part of the Kaggle community. I feel a little disappointed that I couldn't participate with a free notebook instance.
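One trick that might have helped with the out-of-memory problem is downcasting the numeric columns of the DataFrame before training, a pattern that shows up in many Kaggle kernels. This is a hypothetical sketch with a toy DataFrame standing in for the competition data:

```python
# Sketch of the dtype-downcasting trick often used on Kaggle to fit a
# large tabular dataset in memory. The DataFrame below is a toy
# stand-in, not the actual competition data.
import numpy as np
import pandas as pd

def downcast(df: pd.DataFrame) -> pd.DataFrame:
    """Shrink numeric columns to the smallest dtype that holds them."""
    out = df.copy()
    for col in out.select_dtypes(include="integer").columns:
        out[col] = pd.to_numeric(out[col], downcast="integer")
    for col in out.select_dtypes(include="float").columns:
        out[col] = pd.to_numeric(out[col], downcast="float")
    return out

df = pd.DataFrame({
    "id": np.arange(100_000, dtype=np.int64),
    "feature": np.random.rand(100_000),  # float64 by default
})
small = downcast(df)
print(df.memory_usage(deep=True).sum(), small.memory_usage(deep=True).sum())
```

Halving float64 columns to float32 roughly halves the footprint of a numeric dataset, which can be the difference between a free notebook instance finishing and dying.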

In any case, it was exciting to look at how other people prepare and analyze what is basically unlabeled data. I am used to a more hands-on approach with a much deeper understanding of the data. I also feel pretty good that I got close, and now I know how to work with a Kaggle dataset and potentially enter a competition. I think that Kaggle is a great place to network with other data scientists and learn from them.

I tried to get involved with Kaggle about 3 years ago but was so bored with the process that I moved on to something else. I think the 30 Days of ML challenge was well done.