Word Embedding | | |
---|---|---|---
Description | Represents words or phrases as dense numerical vectors such that words appearing in similar contexts lie close together in the vector space. | |
Why to use | Machine learning models cannot process raw text directly, so textual data must be converted into numerical form; word embeddings perform this conversion while preserving semantic relationships between words. | |
When to use | When words or phrases need to be represented in a vector space with several dimensions. | When not to use | In applications that must distinguish antonyms (or handle synonyms explicitly), since embeddings can place such words close together.
Prerequisites | None | |
Input | One textual column | Output | Word vectors; the higher the frequency of a word, the larger the norm of its embedding vector.
Statistical Methods Used | None | Limitations | Words with multiple meanings (polysemy) are collapsed into a single representation.
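As a minimal sketch of the idea, the following pure-Python example builds simple count-based embeddings from a co-occurrence matrix over a toy corpus (the corpus, window size, and helper names are illustrative assumptions, not part of the source). It also illustrates the Output row's point that more frequent words tend to have larger vector norms:

```python
import math

# Toy corpus (hypothetical data, for illustration only)
corpus = [
    "the cat sat on the mat",
    "the dog sat on the rug",
    "the cat chased the dog",
]

window = 2  # co-occurrence window: count neighbors up to 2 positions away

# Build the vocabulary and a word -> dimension index mapping
tokens = [sent.split() for sent in corpus]
vocab = sorted({w for sent in tokens for w in sent})
index = {w: i for i, w in enumerate(vocab)}

# Each word's embedding has one dimension per vocabulary word,
# holding how often that word appears within the window
vectors = {w: [0.0] * len(vocab) for w in vocab}
for sent in tokens:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                vectors[w][index[sent[j]]] += 1.0

def norm(u):
    return math.sqrt(sum(a * a for a in u))

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu, nv = norm(u), norm(v)
    return dot / (nu * nv) if nu and nv else 0.0

# "cat" and "dog" occur in similar contexts, so their vectors are similar
print(cosine(vectors["cat"], vectors["dog"]))
# The frequent word "the" accumulates larger counts, hence a larger norm
print(norm(vectors["the"]), norm(vectors["mat"]))
```

Trained embeddings (e.g. word2vec or GloVe) replace the raw counts with learned dense vectors, but the principle is the same: a word's vector is determined by the contexts it appears in, which is also why multiple senses of one word end up merged into a single vector.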