"Load more" functionality... Like Wappler chat

Hey guys,

I'm happily playing with my new "NodeJS" toy and enjoying its benefits...

On this one I need your insights!
I'm building a Notifications functionality like FB. Users and admins will be informed of various types of actions.
On the click of the Notification button, a dropdown is shown containing the last (date-sorted) 20 records. At the bottom of the dropdown there is a "View All" button that directs to the Notifications page.
It works fine so far, with the "new notification" sign and sound involved, but I load all the Notifications and this is not going to work well...
There will be a large amount of records. So there, I need to load the last 20 records again, plus some kind of "load more" trigger that loads the next 20 records, and so on...

I thought that a paged query and a paging generator on my page would do the trick, BUT this way only the current 20 records will be viewed...

Then I saw the way our chat topics are loaded and shown, and I liked the idea...
I don't need scrolling to trigger the load more... A simple button is perfect!

With this functionality, every time "load more" is clicked, does the query load all the records or only the next 20? Is this a NodeJS trick as well?

How could we accomplish this kind of functionality?

Any help/idea/suggestion would be much appreciated!

I tried to check how this is accomplished in Discourse...

So, what I noticed is that on the initial page load, "featured.json" is called, which fetches the first 30 records...
Then every time we reach the bottom, "latest.json" is called with a "page" parameter whose value is the next page number.
Example:
page=1 -> records 30-59,
page=2 -> records 60-89,
page=3 -> records 90-119,
and so on...
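In other words, the offset can be derived from the page number alone. A minimal sketch of the mapping observed above (the helper name `pageToOffset` is made up for illustration; the batch size of 30 is taken from the example):

```javascript
// Hypothetical helper illustrating the Discourse-style paging above:
// the initial load fetches records 0-29, then "latest.json?page=N"
// fetches the batch starting at N * 30.
const PER_PAGE = 30; // batch size, matching the observation above

function pageToOffset(page) {
  // page=1 -> 30 (records 30-59), page=2 -> 60, page=3 -> 90, ...
  return page * PER_PAGE;
}
```

So the client only ever asks for one fresh batch per call; the already-rendered records are never re-fetched.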

So, my question is:

How are the results of "latest.json" added to the previously shown results instead of overwriting them?

From what I understand, it acts like adding the results from multiple ServerConnects or JSON DataSources to a DataView?

Is there a way of accomplishing something like this?

Maybe this will help you:

Thanks brother,

I've read that great solution from @franse and bookmarked it...

But I suppose this solution has a single data source, and I suppose that every time you click "load more" the API runs and fetches all the "requested" records:

  1. getRecords -> 30 records
  2. getRecords -> 60 records
  3. getRecords -> 90 records
  4. getRecords -> 120 records
  ...
  getRecords -> 600 records

Is that correct?

And what happens if that button is somehow triggered continuously (bot, hack) and there are 2 million records in the Notifications table?
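One way to defuse that worry is a server-side cap: no matter how often the button fires, the server refuses to page past a maximum. A minimal sketch under assumed numbers (the function name, cap, and page size are illustrative, not anything Wappler provides):

```javascript
// Hypothetical guard: clamp the requested page so a runaway client
// (bot, script) can never walk through millions of rows.
const MAX_RECORDS = 600; // assumed hard cap on pageable rows
const PAGE_SIZE = 30;    // assumed records per call

function clampPage(requestedPage) {
  const maxPage = Math.ceil(MAX_RECORDS / PAGE_SIZE); // 20 pages here
  const page = Math.max(1, Math.floor(requestedPage) || 1);
  return Math.min(page, maxPage);
}
```

With a guard like this, even a looping bot tops out at `MAX_RECORDS` rows instead of draining the whole table.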

Sorry if I'm overthinking it...

Yes, unfortunately this way the data source will always be reloaded.


There's a FR here for that, it's been requested a lot of times:

There's an excellent infinite scroll extension that I use a lot; it's not supported anymore though, which is why a native Wappler component would be more than welcome:

Pagination no longer provides a decent UX (and hasn't for years now), so please vote on the FR.


I see... Thanks Tom!

But I think I found some "food for thought"... :wink:

I'm gonna do a few tests in my free time and see how it goes...


Brother, I think I got it working somehow.

I used an array to hold the serverconnect's results on each call.
I don't know how "heavy" it will be, how large a dataset it can handle, or the impact on mobile devices...
Anyway, please have a look and tell me if there can be improvements or changes/corrections.

(I'm preparing a walkthrough with short descriptions and screenshots)


I have a Notifications table with some columns, and a subtable that is updated when a notification is SENT to the user (not_usr_sent==1) and when it is READ by the user (not_usr_read==1).
(screenshot: Notifications table and subtable)

So, first I created the serveraction that reads the notifications.

We need 2 $_GET variables:
page: holds the current page of results
limit: defines the amount of records returned on each call

We define 2 setValue for those two variables and set their default values

We have a query just for getting the total count of records

And finally, we add a custom query that pulls the current page of (limit) records.
Let's look at it...

A basic select with the addition of LIMIT :P2 OFFSET (:P3-1)*:P2

Nothing special...
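For anyone who wants to see the arithmetic behind that clause, here is a sketch of what the custom query computes, where P2 = limit and P3 = page. The helper name is made up, and the sort column `not_date` is an assumption (the real date column isn't shown in the post):

```javascript
// Sketch of the paged query: LIMIT :P2 OFFSET (:P3 - 1) * :P2
// P2 = limit (records per call), P3 = page (1-based page number).
function buildNotificationsQuery(page, limit) {
  const offset = (page - 1) * limit; // (:P3 - 1) * :P2
  return {
    // "not_date" is an assumed column name for the date sort
    sql: "SELECT * FROM Notifications ORDER BY not_date DESC LIMIT ? OFFSET ?",
    params: [limit, offset],
  };
}
```

So page=1 starts at offset 0, page=2 at offset `limit`, and so on; each call fetches exactly one batch.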

Now in our page.

  1. PageFlow that calls the serverconnect
  2. Our serverconnect that we created before
  3. Variable var_limit that holds the LIMIT for our serverconnect, value=10 for our case.
  4. Variable var_CurPage that holds the current page for our serverconnect, value=1 (first page of results).
  5. Our assistant Array, to which the results of each serverconnect call will be added.
  6. A DataView, a mirror of our Array data (DataSource=arr_Data.items).
  7. A Table Generator that lists our results.
    A trick is needed here because the Table Generator accepts neither the Array nor the DataView:
    (screenshot)
    We pointed it at the ServerConnect, since it has exactly the same structure, and...
  8. In our table body's tablerepeat we bound the DataView.

Let's see in detail the above steps.

We have a PageFlow that handles the calls to the serverconnect.
It is set to Auto Run.
But let me talk about the serverconnect first...

It is set to "No Auto Load" because it will be called only through the PageFlow, and its parameters are empty.

Back to the PageFlow

We define the parameters needed for the limit and offset of our serveraction,
and we add a condition to check whether the parameters have values.
If they have no value, we call the serveraction and pass the default values of page=1 and limit=10.
(We could skip this step if the default values we have set in our serveraction are fine for our case.)

So, the serverconnect is called for the first time and we set some actions on its success event:

On Success event, we set an inline flow and in there:

  1. We add a repeat step based on the serveraction's custom query that returns our results
  2. On each loop, we insert into the array
    - at the current index: (var_CurPage.value*var_Limit.value)+$index
    - the current row's value: srvc_getUsrNotList_Pages.data.qr_getAllUsrNots[$index]
  3. After the repeat (outside it), we increase the value of the variable: var_CurPage = var_CurPage.value + 1
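The append step above can be sketched in plain JavaScript. Note that the flow uses `var_CurPage.value * var_Limit.value + $index`; a zero-based variant, for a page counter that starts at 1, is `(page - 1) * limit + i`, which makes the batches land back to back from index 0. Function and variable names here are made up for illustration:

```javascript
// Sketch of the "append each batch into one array" idea from the flow.
// With 1-based page numbering, the zero-based slot for row i of a
// batch is (page - 1) * limit + i, so batch 1 fills 0..limit-1,
// batch 2 fills limit..2*limit-1, and so on.
function appendBatch(arr, batch, page, limit) {
  batch.forEach((row, i) => {
    arr[(page - 1) * limit + i] = row;
  });
  return arr;
}
```

Because each batch writes to its own index range, re-running a page simply overwrites the same slots instead of duplicating rows.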

And we are ready...
You can check the result in the short video on my previous post

Please, can anyone try it and give feedback if there is a problem/issue?
Especially if you have a "heavy" dataset to work on, so we can see how it behaves...

Any suggestions, corrections or ideas are always welcome

Thanks for reading


The implementation looks good. I'll try it when I get a chance. Will you give the user the option to adjust the load per page?

Hey @Buggy,

I adjusted it to my needs but everything is open here buddy...
Concerning the load per page ("limit" in my example), it could be driven by any kind of input field:

  • a select with predefined options (e.g. 10, 25, 50 etc)
  • a radio group
  • or whatever you like...

Of course it would then be wise, inside the PageFlow, to check that the "limit" value is under a reasonable number (for example "limit"<200)
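That "limit" check can be sketched as follows; the function name, bounds, and fallback are just example values, not part of the actual setup:

```javascript
// Hypothetical guard for a user-supplied "limit": fall back to a
// default when it isn't a number, and clamp it into a sensible
// range so nobody can request an enormous batch in one call.
function sanitizeLimit(raw, { min = 1, max = 200, fallback = 10 } = {}) {
  const n = parseInt(raw, 10);
  if (Number.isNaN(n)) return fallback;
  return Math.min(Math.max(n, min), max);
}
```

Doing this server-side (not just in the PageFlow) matters, since a bot can call the API directly with any "limit" it likes.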

Of course this workaround can be used with any kind of dataset...

  1. It can be a heavy dataset per record (each record contains enough basic columns and a few sets/subtables of different data)
  2. Or it can be a light dataset whose source has a loooot of records

In case one I would call the PageFlow with a low "limit" (e.g. 30), and in the second case I would go with a higher one (e.g. 100 or 200).

In all cases another parameter should be set here: "Max_recs", which should also be chosen according to the above cases... Heavy recordset = low "Max_recs", light recordset = higher "Max_recs". Something like that.
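The "Max_recs" idea boils down to a stop condition for the "load more" button, which could be sketched like this (all names are illustrative, not Wappler components):

```javascript
// Sketch of a "Max_recs" stop condition: hide or disable "load more"
// once we've loaded either everything or the configured maximum,
// whichever comes first.
function hasMore(page, limit, totalRecords, maxRecs) {
  const loaded = page * limit; // records fetched after `page` calls
  return loaded < Math.min(totalRecords, maxRecs);
}
```

The page could bind the button's visibility to this expression, using the total-count query from the serveraction as `totalRecords`.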

I don't have time right now, but with the new @Hyperbytes GREAT tutorials, this is an AppConnect module that I would like to try to build. (Trying = Learning)

Anyway, all the numbers I mentioned above are just for reference, not real...
The final setup is in the hands of the developer, who knows approximately the "weight" of the query this "load more" is applied to each time.

1 Like