I see a lot of discussion about SEO on the forum but little about performance, and I wondered how users measure or improve it.
My understanding is that Google expects First Contentful Paint (FCP) to occur within 1.8 seconds, and that anything over 3 seconds is considered slow and can lead to poor rankings.
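If you want to sanity-check that number on your own site, browsers expose FCP through the Performance API. A minimal sketch you can paste into the DevTools console, assuming a reasonably modern browser:

```ts
// Minimal sketch: log First Contentful Paint in the browser console.
// "buffered: true" replays the paint entry even if it already happened,
// so this works after the page has loaded.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    if (entry.name === 'first-contentful-paint') {
      console.log(`FCP: ${entry.startTime.toFixed(0)} ms`); // aim for <= 1800 ms
    }
  }
}).observe({ type: 'paint', buffered: true });
```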
Personally I use GTmetrix ( https://gtmetrix.com/ ) as my "go to" testing site (which also incorporates Lighthouse functionality).
I have been looking at a number of sites mentioned by forum members and running them through GTmetrix, and I'm seeing some pretty slow sites.
Perhaps a thread of tips on improving performance would be really useful?
I won't name and shame, but this is what we should be aiming for (or better).
I really think this is an area which is not getting the attention it needs.
Good servers are important, as are fast CDNs. Optimization and edge caching are really must-dos nowadays.
We have run this on our websites and usually received poor results, mostly owing to DMX AppConnect and other third-party JS libraries and integrations like GTM or FB for analytics.
I have no deep knowledge of improving SEO scores, but in real-world usage the websites seem fine to me.
We have used a CDN on one of the websites as well, but saw no visible performance improvement.
If you look at our company website slashash.co, it performs great on such benchmarks, because it's basically just text with a couple of images.
But if I add all the analytics and widgets and other things, it's bound to fail.
If someone could share some best practices for Wappler projects, that would be great.
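One general pattern that helps with the analytics drag (not Wappler-specific) is deferring third-party scripts until after the page has loaded. A rough sketch; the GTM-XXXXXXX container ID is a placeholder, and a real GTM install also sets up the dataLayer, so treat this as illustrative only:

```ts
// Rough sketch: delay a third-party script (here GTM) until the browser is
// idle, so it doesn't compete with the initial render.
// "GTM-XXXXXXX" is a placeholder container ID.
function loadScript(src: string): void {
  const s = document.createElement('script');
  s.src = src;
  s.async = true;
  document.head.appendChild(s);
}

window.addEventListener('load', () => {
  const loadGtm = () =>
    loadScript('https://www.googletagmanager.com/gtm.js?id=GTM-XXXXXXX');
  if ('requestIdleCallback' in window) {
    requestIdleCallback(loadGtm); // wait for an idle moment after load
  } else {
    setTimeout(loadGtm, 2000);    // fallback for browsers without the API
  }
});
```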
A lot is down to servers, but a lot is also down to caching, image optimization and decent CDNs.
Yes, slashash.co performs well, as you say.
I discovered Gumlet (or, more accurately, @psweb told me about it). It's a great caching tool that can make a huge difference, bringing heavy CMS sites to performance levels similar to those of simple HTML sites.
PHP sites can be greatly improved by adding image/JS/CSS caching via the .htaccess file with a long cache lifetime; just remember to clear it when making significant updates (search for "leverage browser caching").
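For example, something like this (a minimal sketch, assuming Apache with mod_expires enabled; adjust lifetimes to taste):

```apache
# Minimal "leverage browser caching" sketch for Apache; assumes mod_expires
# is enabled. Bust caches (e.g. by renaming files or versioning query
# strings) when you ship significant updates.
<IfModule mod_expires.c>
  ExpiresActive On
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png  "access plus 1 year"
  ExpiresByType image/webp "access plus 1 year"
  ExpiresByType text/css   "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```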
The problem with GTmetrix is the distance between its test server and my location. I am sure I get results inferior to what someone whose site is hosted near Vancouver would see.
I tend to go to Lighthouse instead, where I see the same kind of data as GTmetrix but with more favourable results. Here I have used https://slashash.co/ as the guinea pig.
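If the test location skews things, you can also run Lighthouse locally from the command line; assuming Node.js is installed, something like:

```bash
# Runs Lighthouse from your own machine and opens the report in a browser.
npx lighthouse https://slashash.co/ --only-categories=performance --view
```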
I agree with Sid: when running our site through these tools (I've mostly used Lighthouse), I get a bad performance score due to various external things, which is a bit frustrating because I don't know how to optimize them.
For me the two biggest issues are Stripe and AppConnect.
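One thing that can help with Stripe specifically is loading Stripe.js only on the pages that actually need it rather than site-wide. A sketch of the idea; note that Stripe recommends loading it on every page to improve fraud detection, so this is a trade-off, and the #checkout selector below is a placeholder:

```ts
// Sketch: load Stripe.js on demand instead of site-wide.
// The '#checkout' selector is a placeholder for your real trigger.
let stripeJs: Promise<void> | null = null;

function loadStripeJs(): Promise<void> {
  stripeJs ??= new Promise((resolve, reject) => {
    const s = document.createElement('script');
    s.src = 'https://js.stripe.com/v3/'; // official Stripe.js URL
    s.onload = () => resolve();
    s.onerror = () => reject(new Error('Stripe.js failed to load'));
    document.head.appendChild(s);
  });
  return stripeJs; // cached so repeated calls don't re-add the script
}

document.querySelector('#checkout')?.addEventListener('click', async () => {
  await loadStripeJs();
  // window.Stripe is now available; initialise it here.
});
```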