I’m not a fan of “client-side” for websites (I’m going to use Wappler just for hybrid apps), but I’d like to know if it is possible to have a few examples of “real” websites with “App Connect” integration, just to see how the search engines see/process the HTML code.
I’ve tried to test the dmxzone.com and wappler.io websites, but it seems that neither uses “client-side” to process the data.
(O.T.) For your information: if I use www.wappler.io to enter the website, I receive an error about the SSL certificate:
Hello Michele,
Google has been rendering all the JavaScript and data provided by client-side frameworks like App Connect (or Angular, Vue, React etc.) for years now - so there are no SEO issues.
It is also true that Google asks you to avoid client-side JavaScript as much as possible, because it is much harder for the bot to figure out the content of the page.
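To make the disagreement concrete, here is a minimal sketch (plain Node.js, with an illustrative binding attribute loosely modelled on App Connect's style - not its real API) of the difference between what a non-rendering crawler sees and what a rendering crawler sees:

```javascript
// A crawler that does NOT execute JavaScript only sees the raw markup,
// where the dynamic binding is still an empty placeholder:
const rawHtml = '<h1 dmx-text="product.name"></h1>'; // binding syntax is illustrative

// A rendering crawler (e.g. Googlebot's headless Chromium) effectively sees
// the page AFTER the framework has filled the binding in. This function is a
// naive stand-in for a client-side template engine:
function renderBinding(html, data) {
  return html.replace(/dmx-text="([\w.]+)"><\/h1>/, (match, path) => {
    // Resolve a dotted path like "product.name" against the data object.
    const value = path.split('.').reduce((obj, key) => obj[key], data);
    return `dmx-text="${path}">${value}</h1>`;
  });
}

console.log(renderBinding(rawHtml, { product: { name: 'Wappler' } }));
// → <h1 dmx-text="product.name">Wappler</h1>
```

A non-rendering bot indexes the empty `<h1>`; only a bot that runs the JavaScript indexes the actual content - which is the whole point of the debate here.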
Of course it is
I am not sure what this blog is, but Google blogs and developer articles DO CONFIRM that it's great to use JavaScript and that Googlebot reads everything. It's been years since this was introduced...
You can even Google 'wappler seo test' and explore the source code of the first result
The blog only gathers comments made by a Google Search employee called Martin Splitt (mentioned at the beginning of the article):
“I think most of the time it is because [pre-rendering] has benefits for the user on top of just the crawlers. […]
Giving more content over is always a great thing. That doesn’t mean that you should always give us a page with a bazillion images right away because that’s just not going to be good for the users. […]
It should always be a mix between getting as much crucial content in as possible, but then figuring out which content [images and other non-crucial content] you can load lazily at the end of it.
So for SEOs that would be, you know, we know that different queries are different intents. Informational, transactional… so elements critical to that intent should really be in that initial rush [i.e. pre-rendered].”
So Google does not like client-side Javascript rendering pages...
So the href follow-through is done using client-side Javascript then?
I'm sorry, but I don't agree. I don't know who this guy is or whether he ever said anything like this.
Googlebot has been crawling and rendering JavaScript really well for more than 5 years already:
And during these 5 years things have improved even more. Google handles JavaScript template engines just like any other content, so that's actually the last thing you should be worried about
I think one thing is that GoogleBot can understand (some, or most) JavaScript, and a different thing is that GoogleBot can understand any JavaScript you throw at it. Although, GoogleBot might be using a headless Chromium to parse the pages.
Anyway, I’m sure the hrefs are understood by GoogleBot, otherwise you would have known by now
But the important thing, I think, is not to over-rely on GoogleBot understanding all JavaScript, and also to be careful that the JavaScript does not create a bad user experience (taking too long to display the page, for example) - hence this guy recommends pre-rendering the page server-side.
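The "critical content first" idea from the quoted interview can be sketched as a tiny pre-rendering helper: the content that matters for the query intent goes into the initial HTML, while non-crucial images are marked for lazy loading. A minimal Node.js sketch; the function name and page fields are illustrative, not part of any real Wappler/App Connect API:

```javascript
// Pre-render the crucial content into the initial HTML and defer the rest.
// `loading="lazy"` is a standard HTML attribute: the browser postpones
// fetching these images until they approach the viewport.
function prerenderPage({ title, body, images }) {
  const critical = `<h1>${title}</h1><p>${body}</p>`;
  const lazy = images
    .map((src) => `<img src="${src}" loading="lazy">`)
    .join('');
  return `<main>${critical}${lazy}</main>`;
}

console.log(prerenderPage({
  title: 'Blue running shoes',
  body: 'In stock, free shipping.',
  images: ['shoe-side.jpg', 'shoe-top.jpg'],
}));
```

Both a non-rendering crawler and a slow connection get the title and body immediately; only the images are deferred.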
We’ve been working on these tools (our front end tools based on App Connect) since 2013 and until now (at DMXzone and Wappler) we never ever had any issue with Google and SEO.
All the dynamic attributes used by our framework (App Connect) are really well ‘seen’ by Google and there’s really nothing to worry about
All modern search engines like Google, Bing and others work perfectly with JavaScript websites, so SEO is not a problem. Social sites like Facebook and Twitter, however, do not currently execute JavaScript when crawling pages, so dynamic meta tags will not work for them. Hopefully they will also update their crawlers to support JavaScript websites soon.
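A quick sketch of why this matters: since Facebook's and Twitter's crawlers do not run JavaScript, Open Graph / Twitter Card tags must already be present in the HTML the server sends, rather than injected client-side. The helper name and input fields below are illustrative; the `og:*` and `twitter:card` meta tags themselves are the standard ones those crawlers read:

```javascript
// Generate social-preview meta tags server-side, so non-rendering
// crawlers (Facebook, Twitter) see them in the initial HTML response.
function socialMetaTags({ title, description, image }) {
  return [
    `<meta property="og:title" content="${title}">`,
    `<meta property="og:description" content="${description}">`,
    `<meta property="og:image" content="${image}">`,
    `<meta name="twitter:card" content="summary_large_image">`,
  ].join('\n');
}

console.log(socialMetaTags({
  title: 'My article',
  description: 'A short summary.',
  image: 'https://example.com/cover.png',
}));
```

If the same tags were filled in by a client-side framework after page load, the search engines would still see them, but the social crawlers would not.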