Getting Started

Logos can be included in your Rust project using the cargo add logos command, or by directly modifying your Cargo.toml file:

[dependencies]
logos = "0.16.0"

Then, you can automatically derive the Logos trait on your enum using the Logos derive macro:

use logos::Logos;

#[derive(Logos, Debug, PartialEq)]
#[logos(skip r"[ \t\n\f]+")] // Ignore this regex pattern between tokens
enum Token {
    // Tokens can be literal strings, of any length.
    #[token("fast")]
    Fast,

    #[token(".")]
    Period,

    // Or regular expressions.
    #[regex("[a-zA-Z]+")]
    Text,
}

Then, you can use the Logos::lexer method to turn any &str into an iterator of tokens [1]:

let mut lex = Token::lexer("Create ridiculously fast Lexers.");

assert_eq!(lex.next(), Some(Ok(Token::Text)));
assert_eq!(lex.span(), 0..6);
assert_eq!(lex.slice(), "Create");

assert_eq!(lex.next(), Some(Ok(Token::Text)));
assert_eq!(lex.span(), 7..19);
assert_eq!(lex.slice(), "ridiculously");

assert_eq!(lex.next(), Some(Ok(Token::Fast)));
assert_eq!(lex.span(), 20..24);
assert_eq!(lex.slice(), "fast");

assert_eq!(lex.next(), Some(Ok(Token::Text)));
assert_eq!(lex.span(), 25..31);
assert_eq!(lex.slice(), "Lexers");

assert_eq!(lex.next(), Some(Ok(Token::Period)));
assert_eq!(lex.span(), 31..32);
assert_eq!(lex.slice(), ".");

assert_eq!(lex.next(), None);

Because the Lexer returned by Logos::lexer implements the Iterator trait, you can use a for .. in construct:

for result in Token::lexer("Create ridiculously fast Lexers.") {
    match result {
        Ok(token) => println!("{:#?}", token),
        // The default error type is (), which implements Debug but not Display.
        Err(e) => panic!("some error occurred: {:?}", e),
    }
}
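
And since this is an ordinary Iterator, the standard adapters apply as well. As a minimal sketch (assuming the default error type, which is ()), you can collect the whole input into a Vec of tokens, stopping at the first error:

let tokens: Result<Vec<Token>, ()> = Token::lexer("Create ridiculously fast Lexers.").collect();

assert_eq!(
    tokens,
    Ok(vec![
        Token::Text,   // "Create"
        Token::Text,   // "ridiculously"
        Token::Fast,   // "fast"
        Token::Text,   // "Lexers"
        Token::Period, // "."
    ])
);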

[1] Each item is actually a Result<Token, _>, because the lexer returns an error if some part of the string slice does not match any variant of Token.
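
To make that concrete, here is a minimal sketch of the error case (again assuming the default error type ()): a character that matches neither a token nor the skip pattern comes back as an Err item.

let mut lex = Token::lexer("fast!");

assert_eq!(lex.next(), Some(Ok(Token::Fast)));

// '!' matches no token and is not covered by the skip pattern,
// so the lexer yields an error item instead of panicking.
assert_eq!(lex.next(), Some(Err(())));
assert_eq!(lex.next(), None);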