Bug: white spaces missing between tokens
During tokenization, all white space is discarded. As a result, the produced spans have no white space between them, and the individual words are glued together.
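A minimal sketch of the behavior (the `tokenize`/`detokenize` helpers here are hypothetical stand-ins, not the project's actual API): splitting on white space drops it, so naively joining the spans back glues the words together, while reinserting a separator restores the text.

```python
def tokenize(text):
    # White space is discarded entirely, so span boundaries lose it.
    return text.split()

def detokenize(spans):
    # Buggy reconstruction: spans are concatenated with no separator.
    return "".join(spans)

def detokenize_fixed(spans):
    # Possible fix: reinsert a single space between spans.
    return " ".join(spans)

spans = tokenize("white spaces missing")
print(detokenize(spans))        # glued together: "whitespacesmissing"
print(detokenize_fixed(spans))  # "white spaces missing"
```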
mentioned in merge request !97 (merged)
mentioned in commit 8102d53c
closed with merge request !97 (merged)