Tokenization, the process by which many generative AI models make sense of data, is flawed in key ways.
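To make the term concrete: a tokenizer chops raw text into pieces drawn from a fixed vocabulary, and the model only ever sees those pieces, not the characters themselves. The sketch below is a minimal, made-up greedy longest-match tokenizer; the vocabulary and function are purely illustrative, not any production scheme such as BPE or WordPiece. It shows how text that matches the vocabulary splits cleanly while unfamiliar text fragments into many small pieces, one of the behaviors behind the flaws the article points to.

```python
# Illustrative only: a toy greedy longest-match subword tokenizer with a
# hand-picked vocabulary. Real tokenizers learn their vocabularies from data,
# but the splitting behavior sketched here is broadly similar.

def tokenize(text: str, vocab: set[str]) -> list[str]:
    """Greedily match the longest vocabulary entry at each position.

    Characters not covered by the vocabulary fall back to single-character
    tokens, mirroring how byte-level tokenizers never fail outright but can
    fragment unfamiliar input into many tiny pieces.
    """
    tokens = []
    i = 0
    while i < len(text):
        # Try the longest possible piece first, shrinking until one fits.
        for j in range(len(text), i, -1):
            piece = text[i:j]
            if piece in vocab or j - i == 1:
                tokens.append(piece)
                i = j
                break
    return tokens


TOY_VOCAB = {"token", "ization", "is", "flawed", " "}

print(tokenize("tokenization is flawed", TOY_VOCAB))
# ['token', 'ization', ' ', 'is', ' ', 'flawed']

print(tokenize("détokenisation", TOY_VOCAB))
# Unfamiliar spelling fragments into many short tokens:
# ['d', 'é', 'token', 'is', 'a', 't', 'i', 'o', 'n']
```

The second example hints at why this matters: two strings a human would treat as near-identical can be carved into very different token sequences, so the model's view of the input depends heavily on how the vocabulary happens to cover it.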