Data Bindings SEO... examples?

I’m not a fan of “client-side” rendering for websites (I’m going to use Wappler just for hybrid apps), but I’d like to know if it is possible to see a few examples of “real” websites with App Connect integration, just to learn how the search engines see/process the HTML code.

I’ve tried to test the dmxzone.com and wappler.io websites, but it seems that neither uses client-side rendering to process the data.

(O.T.) For your information: if I use www.wappler.io to enter the website, I receive an error about the SSL certificate:

Hello Michele,
It’s been years now since Google started rendering all the JavaScript and data provided by client-side frameworks like App Connect (or Angular, Vue, React, etc.) - so there are no SEO issues.

Ok, I just wanted to test a website on Google, but no problem.

Well you can already do this here:
https://www.google.com/webmasters/tools/googlebot-fetch?

Just try any of your pages using App Connect with dynamic data and you will see how it renders.

It is also true that Google advises avoiding client-side JavaScript as much as possible, because it is much harder for the bot to figure out the content of the page.

https://blog.seoprofiler.com/official-google-seo-and-javascript-2019/

I see that hrefs in Wappler are coded like this:

<a href="#" dmx-bind:href="the url here">click</a>
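
If I understand correctly, after App Connect evaluates the binding in the browser, the rendered DOM should end up looking something like this (a sketch on my part - the /products/42 URL is just an assumed example, not taken from a real project):

<!-- The evaluated expression replaces the placeholder href="#",
     so a JavaScript-capable crawler sees the final URL. -->
<a href="/products/42" dmx-bind:href="the url here">click</a>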

Could you please confirm that this href is understood by Google’s bot and that the linked page is then followed by the bot accordingly?

Many thanks.

Of course it is :slight_smile:
I am not sure what this blog is, but Google's own blogs and developer articles DO CONFIRM that it's great to use JavaScript and that the Google bot reads everything. It's been years since this was introduced...
You can even google for 'wappler seo test' and explore the source code of the first result :slight_smile:

Also, you can test/preview and see how Google renders and crawls your sites in Google Search Console :slight_smile:
https://search.google.com/search-console

The blog only gathers comments made by a Google Search employee called Martin Splitt (mentioned at the beginning of the article):

“I think most of the time it is because [pre-rendering] has benefits for the user on top of just the crawlers. […]

Giving more content over is always a great thing. That doesn’t mean that you should always give us a page with a bazillion images right away because that’s just not going to be good for the users. […]

It should always be a mix between getting as much crucial content in as possible, but then figuring out which content [images and other non-crucial content] you can load lazily at the end of it.

So for SEOs that would be, you know, we know that different queries have different intents. Informational, transactional… so elements critical to that intent should really be in that initial rush [i.e. pre-rendered].”

So Google does not like pages rendered with client-side JavaScript...

So the href follow-through is done using client-side JavaScript then?

I'm sorry but I don't agree :slight_smile: I don't know who this guy is or whether he ever said anything like this.

It's already been more than 5 years since the Google bot started crawling and rendering JavaScript really well:

And during these 5 years things have improved even more. Google treats JavaScript template engines just like any other content, so that's actually the last thing you should be worried about :smiley:

And also, I believe André (@swf) is pretty experienced in SEO and in working with App Connect and dynamic data, so he can explain how the SEO is going :slight_smile:

I think one thing is that GoogleBot can understand (some, or most) JavaScript, and a different thing is that GoogleBot can understand any JavaScript you throw at it :slight_smile: Although GoogleBot might be using a headless Chromium to parse the pages.

This guy is a Webmaster Trends Analyst at Google. Here’s the complete video: https://www.youtube.com/watch?v=EZtCgrpa6ss

Anyway, I’m sure the hrefs are understood by GoogleBot, otherwise you would have known by now :slight_smile:

But the important thing, I think, is not to over-rely on GoogleBot understanding all JavaScript, and also to be careful that the JavaScript does not create a bad user experience (a page that takes too long to display, for example) - hence this guy recommends pre-rendering the page server-side.

Thanks for your input by the way.

We’ve been working on these tools (our front-end tools based on App Connect) since 2013, and until now (at DMXzone and Wappler) we have never had any issue with Google and SEO.
All the dynamic attributes used by our framework (App Connect) are really well ‘seen’ by Google and there’s really nothing to worry about :slight_smile:

App Connect/Data Bindings works fine with Google, but not with Facebook. So we’re currently using “classic” PHP for the meta tags/SEO code.

All modern search engines like Google, Bing and others work perfectly with JavaScript websites, so SEO is not a problem. Social websites like Facebook and Twitter, however, do not currently execute JavaScript when crawling, so dynamic meta tags will not work for them. Hopefully they will also update their crawlers to support JavaScript websites soon.
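
To make the workaround concrete, here is a minimal sketch of the two approaches side by side (the dmx-bind:content syntax and the sample values are assumptions for illustration, modeled on the href example earlier in the thread, not copied from a real project):

<!-- Dynamic meta tag: App Connect fills in the content in the browser.
     Googlebot executes JavaScript, so it sees the final value;
     Facebook's crawler does not, so it sees an empty tag. -->
<meta property="og:title" content="" dmx-bind:content="the page title expression here">

<!-- Server-rendered meta tag (the "classic" PHP approach): the value is
     already present in the HTML that is sent, so every crawler sees it. -->
<meta property="og:title" content="My Product Page">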