Linear Algebra for Machine Learning

The TensorFlow channel on YouTube recently uploaded a video I made on some elementary ideas from linear algebra and how they're used in machine learning (ML). It's a very nontechnical introduction — more of a bird's-eye view of some basic concepts and standard applications — with the simple goal of whetting the viewer's appetite to learn more.

I've decided to share it here, too, in case it may be of interest to anyone!

I imagine the content here might be helpful for undergraduate students getting their first exposure to linear algebra and/or ML, or for anyone else who's new to the topic and wants to get an idea of what it is and some of the ways it's used.

The video covers three basic concepts: vectors, matrix factorizations, and eigenvectors/eigenvalues. It then explains a few ways each of these arises in ML, namely as data representations, for finding vector embeddings, and in dimensionality reduction techniques, respectively.
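
To make that last connection a bit more concrete, here is a minimal sketch (not taken from the video) of dimensionality reduction via eigenvectors and eigenvalues: principal component analysis on synthetic data using NumPy. The data, shapes, and variable names are purely illustrative assumptions.

```python
import numpy as np

# Illustrative example: PCA via an eigendecomposition of the covariance matrix.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))           # 100 data points, each represented as a 3-dimensional vector
X = X - X.mean(axis=0)                  # center the data

cov = (X.T @ X) / (len(X) - 1)          # 3x3 sample covariance matrix
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues/eigenvectors of the symmetric covariance

order = np.argsort(eigvals)[::-1]       # sort directions by variance explained (largest eigenvalue first)
top2 = eigvecs[:, order[:2]]            # keep the two leading eigenvectors

X_reduced = X @ top2                    # project the data onto them: 3D reduced to 2D
print(X_reduced.shape)                  # (100, 2)
```

The idea is the same one mentioned above: the eigenvectors of the covariance matrix point along the directions of greatest variance, so projecting onto the leading few gives a lower-dimensional representation that keeps as much of the data's structure as possible.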

Enjoy!
