Tokenizer mem
It looks like the tokenizer code reads past the end of its memory block.
Should we just give the tokenizer some extra padding memory, or fix the code so it stops at the end of the block?