I took the Token module from the Arcanum project and brought it over here. It was a nice, data-oriented way of handling tokens. I then created a Lexer that can scan a file or text and lets the user transform the scanned tokens before the final token array is returned. This should allow more complex, domain-specific tokens to be created for whatever domain is being targeted. I also added basic library examples and testing. Finally, I made sure the documentation generated nicely. This is now marked as version 0.1.0.
use std::path::PathBuf;

use rune::{Lexer, TokenStream, TokenType};

// Define how you want to interpret base tokens.
fn transform(tokens: &TokenStream) -> Vec<(TokenType, String)>
{
    let mut new_tokens = Vec::new();

    for token in tokens
    {
        new_tokens.push((*token.variant, token.lexeme.to_string()));
    }

    new_tokens
}

fn main() -> Result<(), Box<dyn std::error::Error>>
{
    let mut path = PathBuf::from(env!("CARGO_MANIFEST_DIR"));
    path.push("examples/example.txt");

    let tokens = Lexer::scan_file(path, transform)?;

    // The tuple here comes from the transform function's return type.
    for (ty, lexeme) in tokens
    {
        println!("{:?}: {:?}", ty, lexeme);
    }

    Ok(())
}
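As a rough illustration of the "more complex and specific tokens" idea, here is a minimal sketch of a transform that layers a domain-specific token kind on top of the base tokens. It assumes only the API shown in the example above (Lexer::scan_file, TokenStream, token.lexeme); the DomainToken enum and the keyword list are hypothetical and not part of the library.

use std::path::PathBuf;

use rune::{Lexer, TokenStream};

// Hypothetical domain-specific token kinds, layered on top of the base tokens.
#[derive(Debug)]
enum DomainToken
{
    Keyword(String),
    Word(String),
}

// Promote a few lexemes to keywords; everything else stays a plain word.
fn transform(tokens: &TokenStream) -> Vec<DomainToken>
{
    let mut new_tokens = Vec::new();

    for token in tokens
    {
        let lexeme = token.lexeme.to_string();

        if lexeme == "let" || lexeme == "fn"
        {
            new_tokens.push(DomainToken::Keyword(lexeme));
        }
        else
        {
            new_tokens.push(DomainToken::Word(lexeme));
        }
    }

    new_tokens
}

fn main() -> Result<(), Box<dyn std::error::Error>>
{
    let mut path = PathBuf::from(env!("CARGO_MANIFEST_DIR"));
    path.push("examples/example.txt");

    // scan_file returns whatever the transform produced, as in the example above.
    for token in Lexer::scan_file(path, transform)?
    {
        println!("{:?}", token);
    }

    Ok(())
}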