pub fn tokenize(input: &str) -> Vec<Token>
Tokenize a CQL input string into a sequence of tokens with grammar context.
This is the crate's main entry point. It performs a single O(n) pass over the input, classifying each token using both lexical rules and grammar-position tracking.
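To illustrate the shape of a single-pass tokenizer like this, here is a minimal self-contained sketch. The `Token` variants and classification rules below are hypothetical stand-ins, not the crate's real API; the real implementation additionally tracks grammar position, which this sketch omits.

```rust
// Hypothetical sketch: a one-pass tokenizer that classifies each
// whitespace-separated word by simple lexical rules. Illustrative only;
// `Token` and the keyword set are assumptions, not the crate's real types.
#[derive(Debug, PartialEq)]
enum Token {
    Keyword(String),
    Number(String),
    Identifier(String),
}

fn tokenize(input: &str) -> Vec<Token> {
    // A tiny illustrative keyword set.
    const KEYWORDS: [&str; 3] = ["SELECT", "FROM", "WHERE"];
    input
        .split_whitespace() // single pass over the input
        .map(|w| {
            if KEYWORDS.contains(&w.to_ascii_uppercase().as_str()) {
                Token::Keyword(w.to_string())
            } else if w.chars().all(|c| c.is_ascii_digit()) {
                Token::Number(w.to_string())
            } else {
                Token::Identifier(w.to_string())
            }
        })
        .collect()
}

fn main() {
    let tokens = tokenize("SELECT name FROM users");
    println!("{:?}", tokens);
}
```

Calling `tokenize("SELECT name FROM users")` yields a keyword, an identifier, a keyword, and an identifier, in input order.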