Word Embeddings

  • 0 Rating
  • 0 Reviews
  • 144 Students Enrolled

Introduction to word embeddings methods: word2vec, GloVe, and BERT.

  • Free
Tags:
P2P



Courselet Content

2 components

Requirements

  • None

General Overview

Description

Word embeddings are a way to let a computer learn the meanings of words. We start from the one-hot encoding representation and move on to an accidental yet very important finding, word2vec (w2v). Then we demonstrate the engineering thinking behind GloVe. Finally, we briefly go through the cornerstone of current research on word embeddings, BERT.
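The gap between one-hot encoding and learned embeddings can be sketched in a few lines. The snippet below uses a toy three-word vocabulary and hand-picked dense vectors (illustrative assumptions only, not vectors learned by word2vec or GloVe) to show why one-hot vectors carry no notion of meaning while dense vectors can:

```python
import numpy as np

# Toy vocabulary; the index assignment is arbitrary.
vocab = {"king": 0, "queen": 1, "apple": 2}

def one_hot(word, size=len(vocab)):
    """Sparse one-hot vector: a 1 at the word's index, 0 elsewhere."""
    v = np.zeros(size)
    v[vocab[word]] = 1.0
    return v

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Distinct one-hot vectors are orthogonal: every pair of different
# words has similarity exactly 0, regardless of meaning.
print(cosine(one_hot("king"), one_hot("queen")))  # 0.0

# Hypothetical dense embeddings (hand-picked for illustration):
# semantically related words get nearby vectors.
dense = {
    "king":  np.array([0.90, 0.80, 0.10]),
    "queen": np.array([0.85, 0.82, 0.12]),
    "apple": np.array([0.10, 0.05, 0.90]),
}
# "king" is closer to "queen" than to "apple" under cosine similarity.
print(cosine(dense["king"], dense["queen"]) > cosine(dense["king"], dense["apple"]))  # True
```

Methods such as word2vec and GloVe learn dense vectors like these automatically from co-occurrence statistics in large corpora, rather than having them assigned by hand.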

Courses that include this CL

Last Updated 3rd September 2025

Meet the Instructor

About the Instructor

Research Interest:

  • Robust hedging and trading methods
  • Cryptocurrency derivatives
  • Quantitative finance 

Work in Progress

  • Hedging Cryptos with Bitcoin Futures
  • Crypto-backed Peer-to-Peer Lending
  • On dynamics of CP2P Interest Rate