We used to use a similar JSON-formatted dictionary until it built up to over several thousand phrases and became horrendous to maintain. Even after breaking it down it was still a nightmare. We implemented the keyed array solution and it's working great. We see no difference in translation speed at all, even on huge pages. Keeping it all up to date is simple, as it's all managed via a translation area in the site, broken down into zones so only what is necessary is retrieved, defined by the zone id. We then cache this retrieved data so no further calls to the DB are required.
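Rough sketch of how a zone-keyed, cached lookup like that could look (the endpoint, column names and cache shape here are my own assumptions, not the actual setup described above):

```ts
// Zone-keyed translation lookup with a simple in-memory cache.
// '/api/translations' and the row shape are illustrative assumptions.
type TranslationMap = Record<string, string>;

const cache = new Map<string, TranslationMap>();

async function getZoneTranslations(zoneId: string, lang: string): Promise<TranslationMap> {
  const key = `${zoneId}:${lang}`;
  const cached = cache.get(key);
  if (cached) return cached;                      // no further DB calls for this zone/lang

  // One query per zone, so only the phrases the page actually needs are fetched.
  const res = await fetch(`/api/translations?zone=${zoneId}&lang=${lang}`);
  const rows: { phrase_key: string; text: string }[] = await res.json();

  const map: TranslationMap = Object.fromEntries(rows.map(r => [r.phrase_key, r.text]));
  cache.set(key, map);
  return map;
}

// Usage: (await getZoneTranslations('checkout', 'pt'))['btn_submit']
```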
Thanks for the input. Nice to see that there is no big hit on performance using the DB approach.
I will probably end up implementing a mixed solution: JSON for labels and static pages that don't access the database (marketing pages), and DB for dynamic content.
How do you guys handle language priority?
For marketing pages I will be using URL parameter (or route) > browser language > fallback language
For the app (logged-in) pages I will be using user profile language (DB) > fallback/default language
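Roughly what I have in mind for the resolution order (the supported-language list and function names are just placeholders):

```ts
// Priority chains for the two page types; everything here is a sketch, not final code.
const FALLBACK_LANG = 'en';
const SUPPORTED = ['en', 'pt', 'es', 'fr'];        // assumed list of supported locales

function pick(candidate: string | null | undefined): string | null {
  if (!candidate) return null;
  const lang = candidate.toLowerCase().split('-')[0];   // "pt-BR" -> "pt"
  return SUPPORTED.includes(lang) ? lang : null;
}

// Marketing pages: URL parameter (or route) > browser language > fallback
function marketingLang(urlParam?: string): string {
  return pick(urlParam) ?? pick(navigator.language) ?? FALLBACK_LANG;
}

// App (logged-in) pages: profile language from the DB > fallback
function appLang(profileLang?: string): string {
  return pick(profileLang) ?? FALLBACK_LANG;
}
```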
This is the only drawback I can see: a lot of people, myself included, do not like automatic language selection. Being an English speaker living overseas and travelling a lot, it annoys me when this happens. That's not to say some don't like it.

We go via the profile route for users and allow them to select their own language as they require, and have very few complaints. When we did use the browser language we had repeated requests to let users select their own language preference. On the public-facing side we offer a simple flag-based selection: click the flag to set the language preference for the session, and this works very well. Either way, we feel offering content in multiple languages is essential these days. Whichever deployment you select, as long as the user can easily switch, everyone should be happy. The Adobe site is a good example: it asks whether you would like to continue browsing in the browser language or select an alternative. Fallback languages are easy, as without a selection being made the pages display in English by default in our case.

We did see some issues using JSON when it came to select lists and some other dynamic content not being translated without a mouseover or onclick event to trigger the switch, which was not favourable by any means and not reliable in some circumstances depending on browser plugins and user configuration on the browser side. Using the DB to overcome this worked from the outset.

Not sure if that helps with the decision making, but it's the path we found ourselves on and have now adopted across all our multi-language platforms. The users are very happy with this too, which makes us happy as developers.
Thanks for the detailed response. It gives a lot of insight into what users actually want.
Yeah, that's the approach I will take. But I can only do that on the app pages where there is a user. For marketing pages (landing, contact, about, etc.) there is no signed-in user, so I can't retrieve anything from the profile.
On first visit I'm thinking of serving the browser language, presenting a language selection and remembering the choice (cookies or local storage). Then, if they register, I will set the choice or browser language as the profile language.
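Something along these lines (the storage key and profile endpoint are placeholders, not final):

```ts
// First-visit flow: remember the visitor's choice locally, then copy it into
// the profile at registration. 'preferredLang' and the endpoint are assumptions.
const LANG_KEY = 'preferredLang';

function currentLang(): string {
  const stored = localStorage.getItem(LANG_KEY);          // previously remembered choice
  if (stored) return stored;
  return (navigator.language || 'en').split('-')[0];      // first visit: browser language, then fallback
}

function rememberLang(lang: string): void {
  localStorage.setItem(LANG_KEY, lang);                   // a cookie would work just as well
}

// At registration, persist the remembered (or browser) language to the profile.
async function saveProfileLanguage(userId: number): Promise<void> {
  await fetch(`/api/users/${userId}/language`, {
    method: 'PUT',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ language: currentLang() }),
  });
}
```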
Quick update: I have dropped i18next in favour of App Connect. Although i18next is great, it seems overkill for what I need. Additionally, I'm fairly confident I can replicate some of the neat features i18next has, like plurals and context, with App Connect and some custom formatters.
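For plurals I'm thinking of leaning on the standard Intl.PluralRules API. A rough sketch (the message shape is just illustrative, and this is not App Connect's own formatter API):

```ts
// i18next-style plural selection built on Intl.PluralRules.
type PluralForms = Partial<Record<Intl.LDMLPluralRule, string>>;

const messages: Record<string, PluralForms> = {
  itemCount: { one: '{count} item', other: '{count} items' },
};

function plural(key: string, count: number, locale: string): string {
  const rule = new Intl.PluralRules(locale).select(count);   // 'one' | 'few' | 'many' | 'other' ...
  const forms = messages[key] ?? {};
  const template = forms[rule] ?? forms.other ?? key;
  return template.replace('{count}', String(count));
}

// plural('itemCount', 1, 'en') -> "1 item"
// plural('itemCount', 5, 'en') -> "5 items"
```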
The fact that Wappler also supports moment with locales is great and a time saver.
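A quick example of what moment with locales gives (the 'pt' locale is just an illustration):

```ts
// Locale-aware date formatting with moment; output shown is approximate.
import moment from 'moment';
import 'moment/locale/pt';                         // load the locale data

moment.locale('pt');
console.log(moment('2024-03-01').format('LL'));    // e.g. "1 de março de 2024"
```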
@George how does Wappler manage number formats with locales? I haven't checked in depth so I don't know my options. Is there anything similar to Numeraljs that can be achieved out of the box?
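In case it helps, the built-in Intl.NumberFormat already covers a lot of what Numeraljs does. A small sketch (this is no claim about what Wappler ships out of the box):

```ts
// Locale-aware number/currency formatting with the standard Intl.NumberFormat API.
function formatNumber(value: number, locale: string, currency?: string): string {
  const options: Intl.NumberFormatOptions = currency
    ? { style: 'currency', currency }
    : { maximumFractionDigits: 2 };
  return new Intl.NumberFormat(locale, options).format(value);
}

// formatNumber(1234567.891, 'en-US')        -> "1,234,567.89"
// formatNumber(1234567.891, 'de-DE')        -> "1.234.567,89"
// formatNumber(1234567.891, 'pt-PT', 'EUR') -> roughly "1 234 567,89 €"
```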
I must admit, though, that it would be nice at some point down the road to have something like this for App Connect.
No bother. The files were static JSON, manually updated and uploaded when new entries were added. Not our choice at all, but the client's. These files contained thousands upon thousands of entries.
Would you consider that deploying the translation files via Git, together with a tool like i18n manager or Weblate, would have made the task of maintaining translations less cumbersome?
Asking this because when you say "manually" I don't know whether you mean you opened each file in a text editor and also had to make sure, manually and one file at a time, that they were all in sync, or whether you mean you didn't have API help to translate.
I want to make sure that even if I have the right tools and deployment process there is no roadblock at some point in the future because I didn't consider something.
The standard used by all i18n libraries is text files (.po, .mo, .json, .xml, .strings, .yml, etc.), so when you are all using a database I wonder why you are following that path (besides it being suggested in this thread). It's an OK path, don't get me wrong. It's just not what most websites and apps use. And for obvious reasons it's not a valid approach for static websites/pages, of course.
Not to mention that if the DB connection is lost for whatever reason, you get no content.
This may not be an issue on "the app side" of your website; if the database doesn't connect you have no app at all. But on the sales/marketing side of a website it's a bad idea not to render content (even if the database is down).
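A rough sketch of what I mean by still rendering something when the DB is unreachable (file and endpoint names are assumptions):

```ts
// Marketing pages keep rendering from a bundled JSON dictionary even when the
// DB-backed endpoint fails. Requires "resolveJsonModule" in tsconfig.
import staticDictionary from './locales/en.json';    // shipped with the build

async function loadDictionary(lang: string): Promise<Record<string, string>> {
  try {
    const res = await fetch(`/api/translations?lang=${lang}`);
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return await res.json();                          // normal path: DB-backed translations
  } catch {
    return staticDictionary;                          // DB down: render the static fallback
  }
}
```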
The client had numerous individuals working on translations, so the files were mailed back and forth, which was a total pain in the arse. This was in place for quite a few years.
No problem there at all, as we run replication across several load-balanced servers, allowing for multiple levels of redundancy. A privilege, and essential when working with enterprise data.
Perfectly clear! It was more an issue of process within the organization than of the technical implementation. I can live with that, given it's a one-man task right now.
Anyway, as I commented previously, I need to implement both approaches, as the end user can also translate their generated content.
I will continue posting pieces of my implementation in this thread.
And likewise, Jon. All of us bouncing ideas off each other really helps a lot and is a pleasure. We inherit a lot of legacy code and often end up cleaning it up and deploying it back to secure, reliable locations, as quite a bit of it is historic data relating to accounting divisions operating in multiple countries (hence all the translation files).
Quick example: last week we had a huge 1990s mixture of FoxPro and DB4 databases to reorganise. Millions of rows. Our database engineer really does earn his keep. Then this week we may have a touch-up on a dashboard or registration… or it could be anything in between, hehe. Wappler is my love in the office and I tend to undertake the more simple, straightforward front-end stuff with a little dynamic side thrown in. Our specialties really lie in older technologies and conversions. I started in VAX/VMS architecture and assembly, and older mainframe-based thin-client networks: Novell, IBM, DEC etc. You'd be super surprised at the number of these systems propping up big business and banks (and government). Interesting work, but now I see myself as too addled to concentrate so much on predominantly text and terminals. I like a GUI.
Oh man, I wish! In demand and on very good money. A friend here in Portugal is one such guy, always on a plane to some exciting location. He gets his hands on amazing equipment that should really be in a museum. We also see a lot of security-by-obscurity in place: VMs running ancient systems, with lots of tweaks and customisation for the modern era and protocols (very clever stuff which totally blows my mind), sitting on the front line, and some of the older systems show uptimes running into a decade and more!
We were among the first UK consultancies to offer BS5750/ISO9002. With this came the requirement to have a full inventory and stock count for everything in the building! Our sister company was a tape streamer repair enterprise fixing backup tape drives (Wangtek, Kennedy, Jumbo etc.), so we had thousands of components all over the place. The first true DB I ever created (FoxPro) was to manage the inventories of those components. Proud to say it was used by a lot of very well known companies to comply with Quality Standards accreditation. We were hated, in all honesty, as we removed middle management and replaced them with technology. A real insight into how tech can have a major impact on people. I spent years feeling guilty, as I kept bumping into the same "managers" all over the UK, and they knew exactly why we were there… Aside from the guilt, I did love the job and the perks, as the owner operated his own plane, so we flew all over the place in this tiny thing. He was also an adrenaline junkie, so most flights were terrifying to say the least, but I'm still here now to tell the tale, so I can't complain.