I got the same results when I opened the link in Chrome (and Edge, which is Chromium based) and then viewed the page source. The data appears correctly in Firefox and Firefox Developer when the source is viewed. If you also try to post that link to Facebook, LinkedIn or Mastodon, you’ll see that it shows the tags and not the actual data. Twitter will most likely be the same. This is a problem for you if you need to market your site through social media.
Do you have your App Root set to ‘Body’ or ‘Page’ …?
And that IS the expected result. When you view the page source code, it is supposed to look like that.
But when you inspect the rendered page, all the data is rendered, and the dynamic data IS parsed by Google for its search results and indexing.
Why does the dynamic data appear in the source in Firefox and Firefox Developer? I’d guess they are two totally different engines, but I would have expected similar behavior from all of them.
I did a Google check on the link and it renders correctly, which is good:
Because unlike Google, Facebook, LinkedIn and WhatsApp do not render the dynamic bindings. And this has nothing to do with the initial question in this topic, or with SEO and the Google crawler, which reads and indexes these pretty well.
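For example, this is roughly the difference between what such a crawler fetches and what it would need in order to build a preview (a simplified sketch: the og: properties are standard Open Graph tags, and the {{...}} binding names are just placeholders, not taken from the actual page):

```html
<!-- What Facebook/LinkedIn/WhatsApp fetch: the raw HTML, before any JavaScript runs.
     The binding expressions are still unresolved, so the preview shows the tags themselves. -->
<meta property="og:title"       content="{{event.title}}">
<meta property="og:description" content="{{event.summary}}">

<!-- What a social crawler would need: values already present in the served HTML. -->
<meta property="og:title"       content="Example event title">
<meta property="og:description" content="Short example description of the event">
```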
I beg to differ here @Teodor. Links shared across social media have a direct impact on SEO: if a website can’t be shared on social media, its SEO will be a lot worse than that of a website that can. I’ve been marketing online since the mid-90s; if I’d created sites for my business that could not be shared, I’d have gone out of business a long time ago.
I keep saying Wappler is a fantastic tool; however, it has a big problem with .NET pages and social media sharing. This is critical for SEO.
It’s not related to .NET and, again, has nothing to do with the initial question here. Please open a separate topic if you want to discuss social media sharing.
We are certainly still far from perfect.
The biggest problem, which is underestimated, is the management of pages via sitemap.xml in Google Search Console.
Periodically I get reports with 404 errors on dynamic pages, for links like https://www.villarperosa.net/events-2.aspx?id='+id
This means that even though the homepage links are correct and have the right syntax to point to those pages, they are not taken into account by the Google Search Console scan unless I add them manually via the ‘check URL’ panel.
Only in that case do I see them correctly.
For me this is unrealistic to do on a site where news pages are added daily by the users themselves through the private area.
But sitemap.xml is supposed to be what keeps your site up to date in Google, right?
I mean, this SEO situation with dynamic pages really seems very limiting to me, because for a whole series of other things Wappler is really great.
Maybe it’s because of my bad English, but I’m talking about using sitemap.xml in Google Search Console with dynamic pages in Wappler, as mentioned in my previous posts.
It’s a tool I have been using without problems… before Wappler.
As I said, I get reports from Google Search Console with 404 errors on dynamic pages, such as https://www.villarperosa.net/eventi-2.aspx?id='+id
The links from the homepage (to these pages) are correct.
Index.asp is listed in the sitemap and indexed correctly, but the links to the detail pages are not automatically picked up.
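Just to illustrate where a URL ending in ?id='+id can come from (an assumption on my part, not something confirmed for this site): Google can also extract URL-like strings from JavaScript, so a link whose query string is built by concatenation in script may end up in the crawl report literally, while a link rendered in the HTML does not have this problem:

```html
<!-- Server-rendered link: the crawler sees a complete, crawlable URL (the id value is just an example) -->
<a href="eventi-2.aspx?id=42">Dettagli</a>

<!-- Hypothetical link assembled in script: a crawler may pick up the literal,
     unfinished fragment eventi-2.aspx?id='+id and report it as a 404 -->
<a href="#" onclick="location.href='eventi-2.aspx?id='+id">Dettagli</a>
```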
Where do you get these reports? And what does this have to do with a generated and submitted sitemap.xml file? I really don’t understand what you are trying to explain.
Sorry, can you please explain this in more detail and/or with screenshots? I asked you two questions above and you just posted a link with no description.
Sorry, I was typing on my phone and it was awkward, but now I can take some screenshots.
Let me explain briefly.
I go into Google Search Console and, as you can see, my sitemap.xml has been submitted successfully.
The sitemap of course contains all the paths of the main pages, including eventi.aspx (the page that lists the titles of the articles that users insert).
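To give an idea of the structure, the file looks roughly like this (a simplified sketch rather than the actual file; the eventi-dettagli.aspx entry is only an example of how a detail page could be listed individually, one entry per article):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Main pages, as described above -->
  <url>
    <loc>https://www.villarperosa.net/index.asp</loc>
  </url>
  <url>
    <loc>https://www.villarperosa.net/eventi.aspx</loc>
  </url>
  <!-- Hypothetical: a detail page listed explicitly (the id value is just an example) -->
  <url>
    <loc>https://www.villarperosa.net/eventi-dettagli.aspx?id=123</loc>
  </url>
</urlset>
```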
Some errors appear in the ‘check URL’ report, including the eventi-dettagli.aspx pages (the detail pages that link to the eventi.aspx events, with links similar to eventi.aspx?id=XXX).
If I check one of the pages I’m talking about, manually copying and pasting the URL, I get the message that the page exists but isn’t indexed.
Then I can request indexing manually (which works).
Obviously the screenshots are in Italian, which may be a problem for you, but if needed I can give you more information in English.
Thank you for your support
But that is pretty normal! It takes time for the Google crawler to crawl and index all the pages.
Even Google says that you can’t “make” the Google crawler read and process your sitemap, nor can you make it process it any quicker.
This all depends on internal linking, page content, etc.
I think you should just check the articles about how the Google crawler works and stop panicking without a reason.
And most of the things you mention as “problems” in this topic have nothing to do with the {{expressions}} used in your pages. It’s just how Google crawling and indexing work.