Tokenizer in sqlparser::tokenizer - Rust

pub struct Tokenizer<'a> { /* private fields */ }

SQL Tokenizer

Create a new SQL tokenizer for the specified SQL statement

use sqlparser::dialect::GenericDialect;
use sqlparser::tokenizer::{Token, Tokenizer, Whitespace};

let dialect = GenericDialect {};
let query = r#"SELECT 'foo'"#;

// Tokenize the query
let tokens = Tokenizer::new(&dialect, query).tokenize().unwrap();

assert_eq!(tokens, vec![
  Token::make_word("SELECT", None),
  Token::Whitespace(Whitespace::Space),
  Token::SingleQuotedString("foo".to_string()),
]);

Set unescape mode

When true (the default), the tokenizer unescapes literal values (for example, "" in SQL is unescaped to the literal ").

When false, the tokenizer returns the strings exactly as they appear in the query. This can be helpful for programs that want to recover the exact original query text without normalizing the escaping.

§Example
use sqlparser::dialect::GenericDialect;
use sqlparser::tokenizer::{Token, Tokenizer};

let dialect = GenericDialect {};
let query = r#""Foo "" Bar""#;
let unescaped = Token::make_word(r#"Foo " Bar"#, Some('"'));
let original  = Token::make_word(r#"Foo "" Bar"#, Some('"'));

// Parsing with unescaping (default)
let tokens = Tokenizer::new(&dialect, &query).tokenize().unwrap();
assert_eq!(tokens, vec![unescaped]);

// Parsing with unescape = false
let tokens = Tokenizer::new(&dialect, &query)
    .with_unescape(false)
    .tokenize()
    .unwrap();
assert_eq!(tokens, vec![original]);

Tokenize the statement and produce a vector of tokens


Tokenize the statement and produce a vector of tokens with location information


Tokenize the statement and append tokens with location information into the provided buffer. If an error occurs, the buffer will contain all tokens that were successfully parsed before the error.


Tokenize the statement and produce a vector of tokens, mapping each token with the provided mapper.
