spaCy: Tokenization (sentences -> words), detecting entities, detecting nouns, lemmatization (been -> be), phrase matching, stop words, parts of speech (POS) … Read More
Convolutional Neural Network
Neuron (node/unit): input + weight passed through an activation function f. Sigmoid σ = [0, 1], tanh = [-1, 1], ReLU f(x) … Read More
Natural Language Processing (NLP) study
Pre-processing: 1) Tokenization (word/sentence), bigrams / trigrams / n-grams; 2) Stemming: the word stem "studi" forms new words like "studies," "studied," "studying" … Read More
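The pre-processing steps above can be sketched in plain Python. This is a toy illustration only: real projects typically use NLTK or spaCy, and the suffix-stripping stemmer here is a crude stand-in I wrote to show how "studies," "studied," and "studying" can all reduce to the stem "studi".

```python
def tokenize(text):
    # Word tokenization: lowercase, split on whitespace, strip simple punctuation
    return [w.strip('.,!?"\'').lower() for w in text.split()]

def ngrams(tokens, n):
    # Sliding window of n consecutive tokens (n=2 bigrams, n=3 trigrams, ...)
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def toy_stem(word):
    # Crude suffix stripping: "studies"/"studied"/"studying" -> "studi"
    for suffix in ("ying", "ies", "ied", "ing", "es", "ed", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            stem = word[: -len(suffix)]
            # Restore the "i" for -ies/-ied/-ying forms
            return stem + ("i" if suffix in ("ies", "ied", "ying") else "")
    return word

tokens = tokenize("Studies show studying helps.")
print(ngrams(tokens, 2))              # bigrams
print([toy_stem(t) for t in tokens])  # stems
```

A production stemmer (e.g. NLTK's `PorterStemmer`) handles far more suffix rules, but the pipeline shape is the same: tokenize first, then build n-grams and stems from the token list.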