Crate python_ast

Modules

ast

See Grammar.txt for the Python reference grammar.

doc

This module contains excerpts about the grammar taken directly from CPython's documentation.

fmt

Format python_ast types as pretty-printable strings.

lexer

Transform bytes into tokens

macros
parser

Take a slice of tokens TkSlice and convert it into an Ast.

preprocessor

Useful TkSlice transformations to apply before parsing with Parser, handling cases that would otherwise complicate tokenizing or parsing.

slice
token
traits
util

Macros

drop_tokens

Generalized form of nom's eat_separator! macro.

ignore_spaces

Redefinition of nom's ws!() macro that filters based on Tk::id() instead of bytes. Ignores spaces, tabs, and other whitespace for the scope of the wrapped subparser.
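As a rough illustration of filtering tokens by id before running a subparser, here is a self-contained sketch. The `TokenId` enum and `strip_whitespace` helper are stand-ins invented for this example, not the crate's actual API:

```rust
// Hypothetical token id, standing in for Tk::id(); not the crate's real type.
#[derive(Debug, Clone, Copy, PartialEq)]
enum TokenId { Space, Tab, Name, Number }

// Drop whitespace tokens before handing the slice to a subparser,
// mirroring the effect ignore_spaces!() has for a wrapped parser.
fn strip_whitespace(tokens: &[TokenId]) -> Vec<TokenId> {
    tokens.iter()
        .copied()
        .filter(|t| !matches!(t, TokenId::Space | TokenId::Tab))
        .collect()
}

fn main() {
    let input = [TokenId::Name, TokenId::Space, TokenId::Number, TokenId::Tab];
    let filtered = strip_whitespace(&input);
    assert_eq!(filtered, vec![TokenId::Name, TokenId::Number]);
    println!("{:?}", filtered);
}
```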

tk_is_none_of

Matches any token that is not one of the provided tokens.

tk_is_one_of

Matches one of the provided tokens.

tk_method

Makes a method from a parser combination

tk_named

Makes a function from a parser combination

tk_named_args

Makes a function from a parser combination with arguments.

tk_named_attr

Makes a function from a parser combination, with attributes.

tk_tag

Structs

Lexer

Struct that provides the operations to take a slice of bytes and convert them into Tk tokens.

Op
OwnedTk

An owned token, created to get out of lifetime hell. Trouble arose when rewriting and injecting values into the token slice during the parsing phase, which is needed to determine block scopes because of Python's whitespace-based scoping.
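The borrowed-versus-owned distinction behind OwnedTk can be sketched as follows. The `Tk<'a>` and `OwnedTk` definitions here are simplified stand-ins, not the crate's actual types:

```rust
// A borrowed token ties its lifetime to the source buffer it points into.
#[derive(Debug, Clone, PartialEq)]
struct Tk<'a> { bytes: &'a [u8] }

// An owned token copies the bytes, so it can outlive the buffer and be
// freely injected into or rewritten within a token slice.
#[derive(Debug, Clone, PartialEq)]
struct OwnedTk { bytes: Vec<u8> }

impl<'a> Tk<'a> {
    // Copying the bytes detaches the token from the source's lifetime.
    fn to_owned_tk(&self) -> OwnedTk {
        OwnedTk { bytes: self.bytes.to_vec() }
    }
}

fn main() {
    let owned = {
        let source = b"def".to_vec();
        let tok = Tk { bytes: &source };
        tok.to_owned_tk() // survives after `source` is dropped
    };
    assert_eq!(owned.bytes, b"def".to_vec());
    println!("{:?}", owned);
}
```

Owning the data trades an extra allocation per token for freedom from the borrow checker during later parse-phase rewrites.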

ParsedAst

Wraps an Ast to provide extra debug information when parsing goes wrong.

Parser

Creates a Python AST from a slice of tokens produced by lexer::Lexer.

Tk

Enums

Ast
Expr
Id
LexResult

Holds the result of parsing functions

Module
Num
ParserResult

The result type returned by Parser. Both the Ok and Error variants contain an instance of ParsedAst; the Ok variant represents the case where the TkSlice was fully consumed.
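A result type where both variants carry diagnostic context can be sketched like this. The names `ParsedAstSketch` and `ParserResultSketch` are illustrative only and do not match the crate's real definitions:

```rust
// Stand-in for ParsedAst: the (possibly partial) AST plus debug context.
#[derive(Debug)]
struct ParsedAstSketch {
    ast: String,      // stand-in for the real Ast
    remaining: usize, // tokens left unconsumed
}

// Both variants keep the ParsedAstSketch, so a failed parse still
// carries everything needed for diagnostics.
#[derive(Debug)]
enum ParserResultSketch {
    Ok(ParsedAstSketch),    // token slice fully consumed
    Error(ParsedAstSketch), // partial parse, kept for debugging
}

fn finish(parsed: ParsedAstSketch) -> ParserResultSketch {
    if parsed.remaining == 0 {
        ParserResultSketch::Ok(parsed)
    } else {
        ParserResultSketch::Error(parsed)
    }
}

fn main() {
    let full = finish(ParsedAstSketch { ast: "Module".into(), remaining: 0 });
    assert!(matches!(full, ParserResultSketch::Ok(_)));

    let partial = finish(ParsedAstSketch { ast: "Module".into(), remaining: 3 });
    assert!(matches!(partial, ParserResultSketch::Error(_)));
}
```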

Stmt
Tag