
#713, test multiple languages #721

Merged
merged 7 commits · Jul 8, 2023

Conversation

@plocket (Collaborator) commented Jun 30, 2023

This is minimally invasive, just enough to get this working again, and it's mostly user-initiated: languages don't get tested by default. If the user wants to test languages, they can use the @al_language tag, at which point ALKiln will step aside and let the language tests run.

Getting this into circulation lets people get started if they need it. If this interface is acceptable for folks to start with (see the new tests), then that's good. I still think it would be great to have a deeper discussion about what interfaces we want to provide for languages.

Closes #713
Closes #360

plocket added 2 commits June 28, 2023 16:55
forces the author to use `Examples:` in every scenario, which is a bit burdensome. Working on a solution involving env vars and bash loops in the action, as well as another tag that will let authors skip language looping for tests that shouldn't run in multiple languages (such as language-specific tests, like checking for particular phrases).
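The "env vars and bash loops in the action" idea from that commit message could be sketched roughly like this. The `LANGUAGES` variable and the commented-out test invocation are assumptions for illustration, not ALKiln's actual interface:

```shell
# Illustrative only: loop over languages supplied by the workflow.
# LANGUAGES and the commented-out command are assumed names, not ALKiln's API.
LANGUAGES="${LANGUAGES:-en es}"
for lang in $LANGUAGES; do
  echo "Running tests for language: $lang"
  # ALKILN_LANGUAGE="$lang" npm run test   # hypothetical invocation
done
```

The point of the sketch is that the loop lives in the action, so authors wouldn't have to repeat an `Examples:` table in every scenario.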
@BryceStevenWilley BryceStevenWilley left a comment


I really like the Examples API of running things; it'll need to be documented, but once I understood what was going on it's really nice, and it would make it easy to compare and edit different languages in the interview. I don't think we need to treat @al_language as special, which is why I didn't approve. Feel free to skip over the required approval, though, if you think this is the right way.
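For context, an Examples-driven scenario might look something like the sketch below. The tag name comes from this PR, but the step wording, file name, and phrases are illustrative guesses, not ALKiln's actual step definitions:

```gherkin
# Hypothetical sketch; only the @al_language tag is from this PR,
# the steps and values here are illustrative.
@al_language
Scenario Outline: The greeting appears in each language
  Given I start the interview at "test_languages.yml"
  And the language is "<language>"
  Then I should see the phrase "<greeting>"

  Examples:
    | language | greeting |
    | en       | Hello    |
    | es       | Hola     |
```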

Comment on lines 11 to 12
# Not sure how to test urls like this: http://es.<other stuff>interview.yml
# That's how wikipedia does it.
Collaborator


Wikipedia has a very different infrastructure than docassemble; I doubt that someone running a multi-lingual interview would have different copies of the interview on different servers, which they would need in order to use es.<other stuff>interview.yml, so I don't think that needs to be noted in the file.

The other way that I would recommend testing or supporting (eventually, at least) is the Accept-Language header, which docassemble uses if there isn't a lang parameter in the URL:

If there is no lang parameter in the URL, docassemble will look for an Accept-Language header in the request from the user’s browser. This is typically set by the user’s web browser to whatever language is defined in the web browser settings. So if the user is a German speaker who is using a web browser that is set up to use the language de, then the language de will be used for translations. In interviews, you can read this language by calling language_from_browser().
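The two mechanisms can be illustrated with a quick sketch; the server URL and interview path below are made up, only the query parameters matter:

```shell
# Hypothetical interview URL; only the query string matters here.
base='https://example.com/interview?i=docassemble.test:data/questions/interview.yml'

# 1. An explicit lang parameter in the URL takes priority:
echo "${base}&lang=de"

# 2. With no lang parameter, docassemble falls back to the browser's
#    Accept-Language request header, which curl can simulate:
#    curl -H 'Accept-Language: de' "$base"
```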

docassemble/ALKilnTests/data/questions/test_languages.yml (outdated, resolved)
docassemble/ALKilnTests/data/sources/languages.feature (outdated, resolved)
# Tag use examples:
# # No tags for default language only
# @al_language # for all languages
# @al_language @es # to isolate one language
Collaborator


I think this would be confusing behavior. From my reading of things, @es on its own wouldn't work (since `and not @al_language` would be added to the tag expression), and IMO it should just work.

Collaborator


I read through the other issues (#713 and #360), and I don't really agree with the idea that we should only run English tests by default and ignore everything else. Running only the English tests by default will lead to cases where the other-language tests aren't run, and eventually break because of that. The argument that runtime will automatically double when adding a language doesn't hold here, since with this interface the tests need to be manually added for each language, for each scenario. Also, as I commented before, it makes for confusing behavior when trying to use the language-specific tags.

Collaborator Author


To record the user conversation I had: they didn't recall expressing a desire to only run language tests manually, and said they must not have been that attached to the idea, so I'm fine leaving it out.

I do think the tests will take longer, especially if users want to thoroughly test all languages, but if the user isn't concerned about it, I'm not either. I will say I'm surprised, since this user's interview has 5 languages to test, which seems like it would really expand the run time, but again, I'm fine if that's at their discretion. I'll post about this in the issue too to outline the current plan.

@plocket (Collaborator, Author) commented Jul 8, 2023

I'm going to go ahead and merge. Anyone can feel free to make an issue or comment if something needs adjusting.

@plocket (Collaborator, Author) commented Jul 8, 2023

Also, nothing really changed in ALKiln itself; this is just authors using Cucumber features that we didn't know about before. We should edit the CHANGELOG to reflect this.
