Lexer (xonsh.lexer)

Lexer for xonsh code.

Written using a hybrid of tokenize and PLY.
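Since the lexer is built on top of the standard library's tokenize module, a quick way to see the raw token stream it starts from is to run tokenize directly. This is a minimal stdlib-only illustration, not xonsh's own API:

```python
import io
import tokenize

# Tokenize a small snippet with the stdlib tokenizer; xonsh's lexer
# post-processes a stream like this into PLY-style tokens.
src = "x = 42\n"
toks = [(tokenize.tok_name[t.type], t.string)
        for t in tokenize.generate_tokens(io.StringIO(src).readline)]
print(toks[:3])  # [('NAME', 'x'), ('OP', '='), ('NUMBER', '42')]
```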

class xonsh.lexer.Lexer[source]

Implements a lexer for the xonsh language.

Attributes

last

The last token seen.

lineno

The last line number seen.

build(self, **kwargs)[source]

Part of the PLY lexer API.

input(self, s)[source]

Calls the lexer on the string s.

split(self, s)[source]

Splits a string into a list of strings which are whitespace-separated tokens.
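The general grouping rule can be sketched with the stdlib tokenizer: tokens that touch (no whitespace between them) belong to the same word. This simplified stand-in is an assumption about the behavior, not xonsh's actual implementation:

```python
import io
import tokenize

def split_simple(s):
    """Simplified sketch of whitespace-based splitting: adjacent tokens
    with no whitespace between them are joined into one word."""
    skip = {tokenize.NEWLINE, tokenize.NL, tokenize.ENDMARKER,
            tokenize.INDENT, tokenize.DEDENT}
    parts, last_end = [], None
    for tok in tokenize.generate_tokens(io.StringIO(s).readline):
        if tok.type in skip:
            continue
        if parts and tok.start == last_end:
            parts[-1] += tok.string   # touches previous token: same word
        else:
            parts.append(tok.string)  # whitespace gap: new word
        last_end = tok.end
    return parts

print(split_simple("ls -l /tmp"))  # ['ls', '-l', '/tmp']
```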


token(self)[source]

Retrieves the next token.

property tokens

xonsh.lexer.get_tokens(s, tolerant)[source]

Given a string containing xonsh code, generates a stream of relevant PLY tokens using handle_token.

xonsh.lexer.handle_double_amps(state, token)[source]
xonsh.lexer.handle_double_pipe(state, token)[source]
xonsh.lexer.handle_error_linecont(state, token)[source]

Function for handling special line continuations as whitespace characters in subprocess mode.

xonsh.lexer.handle_error_space(state, token)[source]

Function for handling special whitespace characters in subprocess mode.

xonsh.lexer.handle_error_token(state, token)[source]

Function for handling error tokens.

xonsh.lexer.handle_ignore(state, token)[source]

Function for handling tokens that should be ignored.

xonsh.lexer.handle_name(state, token)[source]

Function for handling name tokens.

xonsh.lexer.handle_rbrace(state, token)[source]

Function for handling the } token.

xonsh.lexer.handle_rbracket(state, token)[source]

Function for handling the ] token.

xonsh.lexer.handle_redirect(state, token)[source]
xonsh.lexer.handle_rparen(state, token)[source]

Function for handling the ) token.

xonsh.lexer.handle_token(state, token)[source]

General-purpose token handler. Makes use of token_map or special_map to yield one or more PLY tokens from the given input.
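The dispatch pattern can be sketched as follows; the table contents and handlers are illustrative placeholders, not xonsh's actual token_map or special_map:

```python
# Illustrative dispatch tables (placeholders, not xonsh's real tables).
token_map = {"NAME": "NAME", "NUMBER": "NUMBER", "OP": "OP"}

def handle_rbrace_sketch(state, token):
    # A special handler may consult state and yield one or more PLY tokens.
    yield ("RBRACE", token[1])

special_map = {"}": handle_rbrace_sketch}

def handle_token_sketch(state, token):
    """Translate one tokenize-style (kind, value) pair into PLY tokens."""
    kind, value = token
    if value in special_map:
        yield from special_map[value](state, token)
    elif kind in token_map:
        yield (token_map[kind], value)
    else:
        raise SyntaxError(f"unexpected token {token!r}")

print(list(handle_token_sketch({}, ("NAME", "echo"))))  # [('NAME', 'echo')]
```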

Parameters

state :

The current state of the lexer, including information about whether we are in Python mode or subprocess mode, which changes the lexer's behavior. Also includes the stream of tokens yet to be considered.

token :

The token (from tokenize) currently under consideration.
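A minimal sketch of what such a state object might look like, with a handler that branches on the mode flag. The field and function names here are assumptions for illustration, not xonsh's actual attributes:

```python
from dataclasses import dataclass, field

@dataclass
class LexerState:
    pymode: bool = True                         # Python mode vs. subprocess mode
    stream: list = field(default_factory=list)  # tokens yet to be considered

def handle_error_space_sketch(state, token):
    # In subprocess mode, stray whitespace becomes an ordinary separator
    # instead of an error; in Python mode it is simply dropped.
    if not state.pymode:
        yield ("WS", token[1])

state = LexerState(pymode=False)
print(list(handle_error_space_sketch(state, ("ERRORTOKEN", " "))))  # [('WS', ' ')]
```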