Various translations updated, some keys for some strings fixed, and linked with Crowdin. #1921
Conversation
Update messages.json
Others branch
Multi translations
New Crowdin updates
New Crowdin updates
New Crowdin updates
♡♡♡
Thank you! @unnamed-orbert Sorry(!), I didn't want to bloat your workload! 😳 But our code currently loads only one locale file, all at once, so it has to be complete (possibly reduces I/O load?). So starting today our function will consider all (up to) 3 files: 5254632
So we can undo the pre-population.
We have only 376 messages with 850 words.
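For context, here is a minimal sketch (in Python, purely illustrative; the directory layout and function name are assumptions, not the extension's actual code) of the "consider up to 3 files" idea: merge the region-specific locale over its base language over the English fallback, so an individual locale file no longer has to be complete.

```python
# Sketch of the "up to 3 files" fallback idea -- illustrative only;
# paths and names are assumptions, not this project's real loader.
import json
from pathlib import Path

LOCALES_DIR = Path("_locales")  # hypothetical location of the locale folders

def load_messages(locale: str) -> dict:
    """Merge English fallback <- base language <- region-specific locale."""
    candidates = ["en"]              # always start from the complete English file
    base = locale.split("_")[0]
    if base != "en":
        candidates.append(base)      # e.g. "pt"
    if locale != base:
        candidates.append(locale)    # e.g. "pt_BR"

    merged: dict = {}
    for name in candidates:
        path = LOCALES_DIR / name / "messages.json"
        if path.is_file():
            # Later (more specific) files override earlier (more general) ones.
            merged.update(json.loads(path.read_text(encoding="utf-8")))
    return merged

# Example: load_messages("pt_BR") reads en, pt, and pt_BR (whichever exist).
```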
RE: #1928
Sorry, I didn't mean to ask you to remove pt_BR & pt_PT. es_419: 1 unique message (yet)
Various translations updated, some keys for some strings fixed, and linked with Crowdin.
Original pull request: #1916
Hey, I realized that the problem we're trying to solve here is a very old one.
I propose the following solutions:
Use Crowdin for mass duplication and translation into the most popular languages (would solve 30% of the problem)
Unfortunately, Crowdin currently has a restriction of 60,000 words!
Use manual translation by requesting a notice on the website, on the project page, and in the extension for languages that are not on Crowdin
(Manual checking will be complicated)
Update localy.py file list manually
I'm sure we don't have the time or patience to replicate text for more than 100 files, and I don't have the knowledge at the moment to make a bot.
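For what it's worth, a rough sketch (Python, hypothetical, not part of this repo) of what such a "bot" could look like: a small script that copies any key missing from a locale file out of the English source, instead of editing 100+ files by hand.

```python
# Rough sketch of the "bot" idea: fill keys missing from each locale file
# with the English text as a placeholder. Hypothetical helper, not repo code.
import json
from pathlib import Path

LOCALES_DIR = Path("_locales")
SOURCE = json.loads((LOCALES_DIR / "en" / "messages.json").read_text(encoding="utf-8"))

for messages_file in LOCALES_DIR.glob("*/messages.json"):
    if messages_file.parent.name == "en":
        continue  # never overwrite the source locale
    target = json.loads(messages_file.read_text(encoding="utf-8"))
    missing = {k: v for k, v in SOURCE.items() if k not in target}
    if missing:
        target.update(missing)  # untranslated keys fall back to the English text
        messages_file.write_text(
            json.dumps(target, ensure_ascii=False, indent=2) + "\n",
            encoding="utf-8",
        )
        print(f"{messages_file}: added {len(missing)} English placeholder(s)")
```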
If possible, I would be happy if you could merge the changes I made and correct the keys by adding them to the _locales/en/messages.json file.