This is a port of pragmatic_segmenter, a Ruby gem that provides rule-based sentence boundary detection with support for multiple languages.
Tokenization of raw text is a standard pre-processing step for many NLP tasks. For English, tokenization usually involves punctuation splitting and separation of some affixes like possessives. Other languages require more extensive token pre-processing, which is usually called segmentation.
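To make the idea of rule-based sentence boundary detection concrete, here is a minimal sketch of the technique in Python. It is purely illustrative: the abbreviation list, function name, and rules are assumptions for this example, not the library's actual implementation, which uses a far larger, per-language rule set.

```python
import re

# A tiny sample of abbreviations whose trailing period should NOT
# end a sentence. (Illustrative only; real rule sets are much larger.)
ABBREVIATIONS = {"dr", "mr", "mrs", "ms", "prof", "etc"}

def segment(text):
    """Split text into sentences on ., !, or ? followed by whitespace
    (or end of input), unless the period ends a known abbreviation."""
    sentences = []
    start = 0
    for match in re.finditer(r"[.!?]+(?=\s+|$)", text):
        end = match.end()
        candidate = text[start:end].strip()
        # Inspect the token just before the candidate boundary.
        words = re.findall(r"[\w.]+", candidate)
        if words and words[-1].rstrip(".").lower() in ABBREVIATIONS:
            continue  # likely an abbreviation, keep accumulating text
        if candidate:
            sentences.append(candidate)
            start = end
    tail = text[start:].strip()
    if tail:
        sentences.append(tail)
    return sentences
```

For example, `segment("Dr. Smith went to Washington. He arrived!")` keeps "Dr." attached to its sentence instead of splitting after it, which is the kind of decision a rule-based segmenter makes at every candidate boundary.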