I took the Token module from the Arcanum project and brought it over here; it was a nice, data-oriented way of handling tokens. I then created a Lexer that can scan a file or text and lets the user transform the scanned tokens before the final token array is returned. This should allow more complex, domain-specific tokens to be created for whatever domain is being targeted. I also added basic library examples and tests, and made sure the documentation generates cleanly. This is now marked as version 0.1.0.
Rune
Rune is a high-performance, customizable lexical analysis library written in Rust.
It transforms source files into tokens using a fast, cache-friendly design.
“Turn raw text into structured meaning — like spellcraft for source code.”
Features
- Basic tokenization: Whitespace, text, numbers, symbols, and newlines.
- Flat TokenStream design: Optimized for speed and cache locality.
- Custom transforms: Supply your own function to turn base tokens into domain-specific ones (like Markdown, HTML, or custom domain-specific languages); see the sketch below.
- Iterators and mutation: Traverse or modify tokens efficiently.
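To make the transform hook concrete, here is a minimal, self-contained sketch of the pattern described above: a scanning pass that produces flat base tokens, followed by a user-supplied transform that reshapes them before the final vector is returned. The names used here (`Token`, `TokenKind`, `scan_with_transform`) are illustrative placeholders, not Rune's actual API.

```rust
// Conceptual sketch only: type and function names are placeholders,
// not necessarily the identifiers exposed by the Rune crate.

/// Kinds of base tokens produced by the scanning pass.
#[derive(Debug, Clone, PartialEq)]
enum TokenKind {
    Whitespace,
    Text,
    Number,
    Symbol,
    Newline,
}

/// A token stored in a flat vector for cache-friendly traversal.
#[derive(Debug, Clone)]
struct Token {
    kind: TokenKind,
    lexeme: String,
}

/// Scan raw text into base tokens, then hand the whole stream to a
/// user-supplied transform that may merge or relabel tokens before the
/// final vector is returned.
fn scan_with_transform<F>(input: &str, transform: F) -> Vec<Token>
where
    F: Fn(Vec<Token>) -> Vec<Token>,
{
    let mut tokens = Vec::new();
    for ch in input.chars() {
        let kind = match ch {
            '\n' => TokenKind::Newline,
            c if c.is_whitespace() => TokenKind::Whitespace,
            c if c.is_ascii_digit() => TokenKind::Number,
            c if c.is_alphabetic() => TokenKind::Text,
            _ => TokenKind::Symbol,
        };
        tokens.push(Token { kind, lexeme: ch.to_string() });
    }
    transform(tokens)
}

fn main() {
    // Example transform: drop whitespace tokens, as a Markdown-style
    // lexer might do before building domain-specific tokens.
    let tokens = scan_with_transform("let x = 42;\n", |base| {
        base.into_iter()
            .filter(|t| t.kind != TokenKind::Whitespace)
            .collect()
    });

    for token in &tokens {
        println!("{:?}", token);
    }
}
```

The same idea scales to richer transforms: because the base tokens arrive as one flat vector, a transform can merge runs, relabel kinds, or drop tokens in a single pass without walking a tree.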
Getting Started
Add to your project
This library is hosted on the CyberMages registry. To add it to a project, the CyberMages registry needs to be added to Cargo, as described in the Cargo Book.
First, add the registry to your Cargo configuration file (.cargo/config.toml):
```toml
[registries.cybermages]
index = "sparse+https://workshop.cybermages.tech/api/packages/CyberMages/cargo/"
```
Then add this to your Cargo.toml file.
```toml
[dependencies]
rune = { version = "0.1.0", registry = "cybermages" }
```
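If your toolchain includes `cargo add` (Cargo 1.62 or later), running `cargo add rune --registry cybermages` from the project directory should produce an equivalent dependency entry.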