We are able to reduce the training time by almost 2.5x, which is quite significant when training large models over long periods.
And that’s when we seriously began exploring distributed training — a shift that fundamentally changed how we build and scale our models.
20 July 2022
I hear a lot about self-attention in the papers I read every now and then, and almost every time I get puzzled when the word comes up. Whether it is action recognition, image translation, image segmentation or anything else (PS: sorry for being computer vision biased), self-attention is the new trend, I suppose.
25 December 2021
Fellowship.Ai is a four-month-long "unpaid" fellowship covering various machine learning topics. The program is 100% remote, and anyone can apply. (PS: Keep reading to find out whether you should or not!) You can apply directly by submitting your resume, but completing one of the challenges listed on the website tends to increase your chances of getting an interview. The challenges focus on different machine learning topics such as image segmentation, Natural Language Processing (NLP), one-shot learning, etc.