
Add trivial lexer for output (using Token.Generic.Output for the complete code block) #1835

Closed
doerwalter opened this issue Jun 7, 2021 · 2 comments · Fixed by #1836

@doerwalter
Contributor

If I have the following Sphinx source:

	Do the following in your Python shell:

	.. sourcecode:: pycon

		>>> for i in range(3):
		...	print(i)
		...
		0
		1
		2

	You can also run the following code as a script:

	.. sourcecode:: pycon

		for i in range(3):
			print(i)

	This will output:

	.. sourcecode:: text

		0
		1
		2

the output in the first code snippet is marked up with Token.Generic.Output, but the output of the second code snippet is marked up with Token.Text (in fact it isn't marked up at all).
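
A quick way to see the difference is to run both lexers over the example by hand. This is a minimal sketch using the public pygments.lex API; the session text is just the snippet from above:

from pygments import lex
from pygments.lexers import PythonConsoleLexer, TextLexer

session = ">>> for i in range(3):\n...     print(i)\n...\n0\n1\n2\n"

# The pycon lexer recognizes the prompts and tags the result lines
# as Token.Generic.Output.
for tok, value in lex(session, PythonConsoleLexer()):
    print(tok, repr(value))

# The plain text lexer has no such notion and emits everything as Token.Text.
for tok, value in lex("0\n1\n2\n", TextLexer()):
    print(tok, repr(value))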

In my current style the result looks like this:

[screenshot of the rendered output]

i.e. the two outputs look different. I could, of course, use the console lexer for the second output and hope that it doesn't find anything it interprets as a prompt. The better solution, however, would be a new lexer that simply marks up everything with the token Token.Generic.Output.

Code for that might look something like this:

from pygments import lexer, token


class OutputLexer(lexer.Lexer):
    """
    Simple lexer that highlights everything as output.
    """
    name = 'Text output'
    aliases = ['output']
    filenames = ['*.txt']
    mimetypes = ['text/plain']
    priority = 0.005

    def get_tokens_unprocessed(self, text):
        # Emit the entire input as one Generic.Output token.
        yield 0, token.Generic.Output, text

    def analyse_text(text):
        # Note: Pygments lexers define analyse_text() without self.
        return OutputLexer.priority
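
For illustration, a minimal sketch of how the class above would behave (assuming it is importable as OutputLexer; get_tokens() is the standard Lexer API that wraps get_tokens_unprocessed()):

from pygments.token import Generic

# Every character of the input ends up in a single Generic.Output token.
for tok, value in OutputLexer().get_tokens("0\n1\n2\n"):
    assert tok is Generic.Output
    print(tok, repr(value))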

If this is accepted I can work on a pull request implementing OutputLexer.

@birkenfeld
Member

Sounds good, it should be very similar to the TextLexer. But don't give it a priority, please. It should only be used when explicitly requested.
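
Without a priority or analyse_text(), the lexer would never be chosen by guess_lexer(); it would only be used when requested by name. A minimal sketch using the public get_lexer_by_name API, assuming the lexer is registered under the alias "output" from the proposal above:

from pygments import highlight
from pygments.formatters import HtmlFormatter
from pygments.lexers import get_lexer_by_name

# Explicit request by alias; no guessing involved.
lexer = get_lexer_by_name("output")
print(highlight("0\n1\n2\n", lexer, HtmlFormatter()))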

@doerwalter
Contributor Author

Done: #1836

@Anteru Anteru linked a pull request Aug 8, 2021 that will close this issue
@Anteru Anteru added this to the 2.10 milestone Aug 8, 2021
@Anteru Anteru closed this as completed Aug 8, 2021
wmfgerrit pushed a commit to wikimedia/mediawiki-extensions-SyntaxHighlight_GeSHi that referenced this issue Sep 1, 2021
New lexers include ansys, apdl, asc, gcode, gsql, jslt, julia-repl,
kuin, meson, nestedtext, nodejsrepl, nt, omg-idl, output, pem, procfile,
pwsh, smithy, teal, thingsdb, ti, wast, wat.

Also "golang" is now an accepted alias for go.

The output lexer is a generic lexer that just makes everything look like
output, see <pygments/pygments#1835>.

Full upstream changelogs:
* https://github.com/pygments/pygments/releases/tag/2.10.0
* https://github.com/pygments/pygments/releases/tag/2.9.0
* https://github.com/pygments/pygments/releases/tag/2.8.1

Bug: T280117
Change-Id: I162dff1e3f3eb6f01e87dc09509b508f52aff46c
wmfgerrit pushed a commit to wikimedia/mediawiki-extensions that referenced this issue Sep 1, 2021
* Update SyntaxHighlight_GeSHi from branch 'master'
  to b4f53c7ad17c3438f187378652d083bc675dbe19
  - Update Pygments to 2.10.0
    