We currently provide lexers for a few text editors, with plans to implement more (see #2752). At present these lexers are manually tested; there is no standard test to run.
Options for testing:
Manual testing with standard reference files.
What we do now, but better.
Automated checking of generated HTML tags.
This will work for editors/lexers which can convert a reference file into an HTML file with style tags. The Pygments lexer, for instance, could easily be tested in this way. Editors such as vi and emacs might not be testable in this way.
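For instance, the Pygments check could be sketched like this. The stock `IniLexer` stands in for the project's own lexer here (an assumption for illustration, since suite.rc files are INI-like):

```python
# Sketch: render a reference file with Pygments and assert that the
# expected style tags appear in the generated HTML.
from pygments import highlight
from pygments.lexers import IniLexer
from pygments.formatters import HtmlFormatter

def render(source):
    """Return Pygments HTML output for a reference file's contents."""
    return highlight(source, IniLexer(), HtmlFormatter())

reference = "[scheduling]\ninitial cycle point = 2000\n"
html = render(reference)

# The section header should be wrapped in a styled <span> tag.
assert "<span" in html
assert "scheduling" in html
```

The obvious weakness, as noted below, is that these assertions only check what we already expect.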
Personally I'm not too keen on this type of testing: you are typically only testing what you expect, and the tests can be broken by minor syntax changes in the editor. For me the real acid test for syntax highlighting is a visual overview.
Taking a quick look around, there are extensions for other editors which might enable us to use this approach, for instance the ScreenShot plugin for vim.
Manual visual testing.
Compare the visual output before and after a changeset; if differences are present, display them to the user as part of the review process.
1. Open a reference file with each text editor in turn (e.g. `vim suite.rc`).
2. Take a screenshot of the text editor (e.g. using `xwd`).
3. Repeat steps 1 & 2 using the version of the syntax file from an earlier reference commit.
4. Feed the two images through a visual diff program (e.g. JS Image Diff).
5. If differences are present, generate an HTML report including the before & after screenshots along with the visual diff output.
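The diff step (4) could be sketched in Python without any external service. This assumes the two `xwd` dumps have first been converted to a flat raw-RGB form (an assumption for illustration; the conversion tool is not specified here):

```python
# Minimal visual-diff sketch: compare two same-size screenshots pixel by
# pixel and report which pixels changed, enough to decide whether a
# before/after report needs generating.

def diff_pixels(before, after):
    """Return the indices of pixels that differ between two screenshots.

    `before` and `after` are flat lists of (r, g, b) tuples.
    """
    if len(before) != len(after):
        raise ValueError("screenshots must be the same size")
    return [i for i, (a, b) in enumerate(zip(before, after)) if a != b]

# Example: a 2x2 "screenshot" in which one pixel's highlight colour changed.
before = [(0, 0, 0), (255, 0, 0), (0, 0, 0), (0, 0, 0)]
after  = [(0, 0, 0), (0, 255, 0), (0, 0, 0), (0, 0, 0)]

changed = diff_pixels(before, after)
assert changed == [1]  # only the re-highlighted pixel differs
```

A real implementation would also want a tolerance threshold so that anti-aliasing or font-rendering noise doesn't flag every commit.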
Automated visual testing.
An automated visual testing framework would be highly desirable for our future work with #1873, so whilst it may seem an overblown solution for testing syntax files, it is work we may undertake anyway.
I can't seem to find any free off-the-shelf visual testing integrations that suit our purposes, but there are plenty of tools/frameworks which could be assembled as required. Automated visual testing could easily be performed on CI (e.g. Travis), but we would need somewhere to host the reports.
We will require standard reference files whichever way we go, so I'll have a go at drafting something on this issue in due course. We would require at least two reference files due to the following mutually exclusive tests:
- Cycling
  - Integer
  - ISO8601
- Templating
  - Jinja2
  - EmPy
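For illustration, the two reference files might pin down those mutually exclusive choices along these lines (hypothetical fragments, nothing drafted yet):

```
# reference-1: integer cycling + Jinja2 templating
#!jinja2
[scheduling]
    cycling mode = integer
    initial cycle point = 1

# reference-2: ISO8601 cycling + EmPy templating
#!empy
[scheduling]
    initial cycle point = 20000101T00
```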