xonsh.parsers.lexer

Lexer for xonsh code.

Written using a hybrid of tokenize and PLY.
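For background, the standard-library tokenize half of that hybrid yields (type, string) pairs, which the xonsh lexer then maps onto PLY tokens. A minimal illustration using only the stdlib:

```python
import io
import tokenize

# Tokenize a small piece of Python source with the stdlib tokenizer,
# the same machinery the xonsh lexer builds on.
src = "x = 42\n"
toks = [
    (tokenize.tok_name[t.type], t.string)
    for t in tokenize.generate_tokens(io.StringIO(src).readline)
]
# The stream contains NAME, OP, NUMBER, NEWLINE, and ENDMARKER tokens.
```

The xonsh lexer consumes a stream like this and re-labels it with PLY token names (for example, the `=` OP token becomes an EQUALS token).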

class xonsh.parsers.lexer.Lexer(tolerant=False, pymode=True)[source]

Implements a lexer for the xonsh language.

Attributes:

fname : str

Filename.

last : token

The last token seen.

lineno : int

The last line number seen.

tolerant : bool

Tokenize without extra checks (e.g. paren matching). When True, ERRORTOKEN contains the erroneous string instead of an error message.

pymode : bool

Start the lexer in Python mode.

build(**kwargs)[source]

Part of the PLY lexer API.

input(s)[source]

Calls the lexer on the string s.

reset()[source]

Resets the lexer to its initial state.

split(s)[source]

Splits a string into a list of whitespace-separated token strings.

token()[source]

Retrieves the next token.

property tokens
property tolerant
xonsh.parsers.lexer.get_tokens(s, tolerant, pymode=True, tokenize_ioredirects=True)[source]

Given a string containing xonsh code, generates a stream of relevant PLY tokens using handle_token.

xonsh.parsers.lexer.handle_double_amps(state, token)[source]
xonsh.parsers.lexer.handle_double_pipe(state, token)[source]
xonsh.parsers.lexer.handle_error_linecont(state, token)[source]

Function for handling special line continuations as whitespace characters in subprocess mode.

xonsh.parsers.lexer.handle_error_space(state, token)[source]

Function for handling special whitespace characters in subprocess mode.

xonsh.parsers.lexer.handle_error_token(state, token)[source]

Function for handling error tokens.

xonsh.parsers.lexer.handle_ignore(state, token)[source]

Function for handling tokens that should be ignored.

xonsh.parsers.lexer.handle_name(state, token)[source]

Function for handling name tokens.

xonsh.parsers.lexer.handle_rbrace(state, token)[source]

Function for handling }

xonsh.parsers.lexer.handle_rbracket(state, token)[source]

Function for handling ]

xonsh.parsers.lexer.handle_redirect(state, token)[source]
xonsh.parsers.lexer.handle_rparen(state, token)[source]

Function for handling )

xonsh.parsers.lexer.handle_token(state, token)[source]

General-purpose token handler. Makes use of token_map or special_map to yield one or more PLY tokens from the given input.

Parameters:
state

The current state of the lexer, including information about whether we are in Python mode or subprocess mode, which changes the lexer’s behavior. Also includes the stream of tokens yet to be considered.

token

The token (from tokenize) currently under consideration.
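The table-driven dispatch that handle_token performs can be sketched generically. The names and shapes below are hypothetical illustrations of the idea, not xonsh's actual internals:

```python
# Hypothetical sketch of token_map-style dispatch: exact (type, string)
# keys take priority, with a type-only wildcard entry as the fallback.
token_map = {
    ("OP", "="): "EQUALS",
    ("NAME", None): "NAME",
    ("NUMBER", None): "NUMBER",
}

def dispatch(tok_type, tok_string):
    # Try the exact (type, string) pair first, e.g. ("OP", "=").
    key = (tok_type, tok_string)
    if key in token_map:
        return token_map[key]
    # Fall back to a type-only wildcard, e.g. any NAME token.
    return token_map.get((tok_type, None))
```

In the real lexer, tokens without a mapping are routed through special-case handlers (such as handle_rparen or handle_error_token above) based on the lexer state.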