If you are not using the database manager and its schema change tracking (migrations), how do you track and deploy your database schema changes to different targets, and roll back if needed?
Manually keeping track of SQL changes can be very error-prone.
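For context, the usual answer from dedicated migration tools is a table in the database itself that records which migration scripts have been applied, with each migration carrying an "up" and a "down" step. The sketch below is illustrative only (names and structure are my own, not any particular tool's API), using Python's built-in sqlite3:

```python
import sqlite3

# Hypothetical migration list: each entry is (name, up-SQL, down-SQL).
# The "down" script lets a migration be rolled back in reverse order.
MIGRATIONS = [
    ("001_create_users",
     "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)",
     "DROP TABLE users"),
    ("002_create_orders",
     "CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER)",
     "DROP TABLE orders"),
]

def migrate(conn):
    """Apply every migration not yet recorded in schema_migrations."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)")
    applied = {r[0] for r in conn.execute("SELECT name FROM schema_migrations")}
    for name, up, _down in MIGRATIONS:
        if name not in applied:
            conn.execute(up)
            conn.execute("INSERT INTO schema_migrations (name) VALUES (?)",
                         (name,))
    conn.commit()

def rollback(conn, steps=1):
    """Undo the last `steps` applied migrations, newest first."""
    applied = [r[0] for r in
               conn.execute("SELECT name FROM schema_migrations ORDER BY name")]
    downs = {name: down for name, _up, down in MIGRATIONS}
    for name in reversed(applied[-steps:]):
        conn.execute(downs[name])
        conn.execute("DELETE FROM schema_migrations WHERE name = ?", (name,))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)      # creates users and orders, records both
rollback(conn, 1)  # drops orders again; users remains
```

The key point is that the database carries its own record of what has been applied, so any machine connecting to it can tell which scripts are still pending.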
I use Navicat for all my database needs. I don't have a rollback system in place, but with proper backups I have never needed one since 1997 (knock on wood). I have all my databases duplicated on a local MAMP server as well. When I am going to make drastic changes, I just back everything up on my MAMP server, including the data.
I just find the DB Manager confusing to use. Having to write a comment for every change is also more work. I am sure the DB Manager is fantastic for those that use it. I appreciate the work that went into it. It just doesn't solve any problems for me or help my workflow.
I have attempted to use the db manager at least half a dozen times since it was released, and again after several updates. Each time it ends up with migration problems that require me to manually "deal with it". I also just prefer a robust db manager where I can do everything in one place: analyzing queries for performance, looking up data, and so on, all in an extensive UI. For what it's worth, my tool of choice is DataGrip.
Most db managers have some type of schema sync that compares a source schema to a target. So I work in dev and make all the changes I need to the dev db.
When I'm ready to push to production, I click a few buttons and an SQL script is generated and executed that migrates any changes. I then push my code. The scripts can be tracked in git, although I don't even bother.
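A schema sync of this kind is essentially a diff between two schemas that emits the SQL closing the gap. This is only a rough sketch of the idea (real tools like Navicat or DataGrip are far more thorough, handling types, indexes, constraints, and drops); here schemas are plain dicts and only additive changes are generated:

```python
def diff_schemas(dev: dict, prod: dict) -> list:
    """Return SQL statements that migrate `prod` toward `dev`.

    Schemas are {table: {column: type}} dicts. Only additive changes
    (new tables, new columns) are emitted in this simplified sketch.
    """
    stmts = []
    for table, cols in dev.items():
        if table not in prod:
            body = ", ".join(f"{c} {t}" for c, t in cols.items())
            stmts.append(f"CREATE TABLE {table} ({body})")
        else:
            for col, typ in cols.items():
                if col not in prod[table]:
                    stmts.append(f"ALTER TABLE {table} ADD COLUMN {col} {typ}")
    return stmts

# Dev has gained a column and a whole table that prod lacks:
dev = {"users": {"id": "INTEGER", "email": "TEXT"},
       "logs": {"id": "INTEGER"}}
prod = {"users": {"id": "INTEGER"}}
script = diff_schemas(dev, prod)
```

The generated `script` is exactly the kind of artifact you could commit to git alongside the code push, if you wanted the sync to be tracked.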
Because I have mobile apps in the wild, I never remove or rename table columns, as that would break previous versions of the app, which also use versioned APIs (folders in the root of the API).
Everything I do is adding columns, modifying indexes, and the occasional stored procedure or function, so even if I pull back a code push, I never have any db changes to pull back. And all of these syncs are tested first when moving to staging.
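The reason additive-only changes are safe for clients already in the wild is that a query written before the migration keeps returning exactly what it asked for. A minimal sqlite3 demonstration (illustrative table names, not from any real app):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO products (name) VALUES ('widget')")

# A "v1 app" query, written before the schema change, naming its
# columns explicitly:
v1_query = "SELECT id, name FROM products"

# Additive migration: a new column with a default, so existing rows
# stay valid and the v1 query is unaffected.
conn.execute("ALTER TABLE products ADD COLUMN price REAL DEFAULT 0.0")

rows = conn.execute(v1_query).fetchall()  # still [(1, 'widget')]
```

Renaming or dropping `name` instead would have made `v1_query` fail, which is precisely why those operations are off the table once old clients exist.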
And although this topic is about targets and their management, I can't help but react to your question, George.
At the moment there is a very painful part of migration and database version control in Wappler. The main problem is that the database migrations are stored in local files. This is a crazy headache: because of it, you often have to give up working with migrations through Wappler, use obscene language, and run third-party migration software. If I change my workplace, I need to copy all the migration files to a USB flash drive and transfer them to the new machine. And if you forget to do this, it is impossible to work with the database on the new machine unless you reset the change tracking. And it's a pain! It's just a lot of pain!
And the solution is quite simple: save not only the names of the local migration files and their index in the database, but also their contents. Then, when you switch to another workplace, the missing files can be created automatically, and the Wappler database manager will work at 100%.
I usually use Git. But I have scenarios where using Git is impossible, and I have to fall back on a bad practice (though the only possible one): transferring changes on a USB flash drive. That's why this is relevant to me.
But stepping back, coupling different features together seems like bad practice to me. Right now, without using Git, the database manager does not work 100%. Any feature should be isolated and work regardless of whether other features are used. If the contents of the migration scripts were saved in the database, the database manager in Wappler would be independent of other features and fully self-sufficient.
That's right, those are the names of the local files stored in the project folder. I suggest adding a field to this table in which the contents of each file would be saved. Then you wouldn't have to worry about anything if you don't use Git: on another computer, Wappler would compare the contents of this table with the local files and create the missing ones.
Moreover, there is a scenario where this is relevant even if you do use Git. Imagine you are working on one project, and then you need to connect another project to the same database. In the current version, Git will not help you: you will have to open the migration folder and manually copy the migration files into the new project. If the contents of the files were stored directly in the database, this procedure could be automated and the migration would always run without errors, under any scenario.
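The proposal above is straightforward to sketch: if the migrations table carried a `content` column alongside the file name, any machine missing the local files could rebuild them before running. Table and column names here are hypothetical, not Wappler's actual schema:

```python
import pathlib
import sqlite3
import tempfile

# Hypothetical migrations table that stores each file's contents,
# not just its name (the suggestion made above).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE migrations (name TEXT PRIMARY KEY, content TEXT)")
conn.executemany("INSERT INTO migrations VALUES (?, ?)", [
    ("001_users.js", "exports.up = ...;"),
    ("002_orders.js", "exports.up = ...;"),
])

def restore_missing(conn, folder: pathlib.Path) -> list:
    """Write any migration file recorded in the DB but absent on disk."""
    restored = []
    for name, content in conn.execute("SELECT name, content FROM migrations"):
        path = folder / name
        if not path.exists():
            path.write_text(content)
            restored.append(name)
    return restored

folder = pathlib.Path(tempfile.mkdtemp())
# Simulate a new workplace that only has the first file:
(folder / "001_users.js").write_text("exports.up = ...;")
restored = restore_missing(conn, folder)  # only 002_orders.js is recreated
```

With this in place, neither Git nor a USB stick is needed to move between machines, and a second project connecting to the same database could bootstrap its migration folder automatically.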
+1 to Ken's comments about the db manager. I have tried to use it multiple times and have consistently run into errors that led me back to using Navicat's structure sync.
Well, make sure you report any problems with it, as the known stability issues were solved a long time ago.
Also note that the issues requiring "manual fixing" usually arise because you make changes and quick fixes directly in the database, which the database manager isn't aware of and so can't reconcile with its own migration log. So you really should use the database manager fully, or not at all.
The database manager is really the easiest way to work with multiple targets and data migrations.
It also automatically runs the migrations on publish, as well as on git checkout or branch change, to make sure your database is always up to date.
That can’t be matched with manual db comparison tools that make a controlled version deploy really difficult.
Our views on the usefulness of the db manager are, shall I say, different.
Fortunately, Wappler doesn't lock me in, so I can use my tool of choice. If I could not use DataGrip (and GitHub Desktop), I would have been forced to leave a long time ago.
This reminded me of a comment I made 4 years ago, when db manager was released:
There is a problem with MySQL and virtual columns: Wappler deletes the virtual column definition.
I hadn't checked since, until now, and this is still a problem. This alone would prevent me from using the db manager, if I wanted to. (Probably not a priority, as I don't think anyone has mentioned it since, but it should be fixed. Not supporting a feature is one thing... but destroying column definitions...)
This goes back to the core question - how much should be expected from Wappler for database admin? Maybe for very basic applications db manager is ok but I'd much rather see Wappler invest their precious resources in stabilizing the core product and leaving database administration to pro tools that exist in the marketplace already.
For us, being able to easily connect to a database is the most important thing. For all database management we use Workbench (free), and for knocking up more complex queries we pay for a month's subscription to Navicat or dbForge. (At the host level we back up our databases every few hours, and locally we dump to S3 using a cron job.)

We did try the Wappler way, but it caused us some real issues in the early days and, to be honest, we don't have confidence in it. Likewise, our early usage of the Resource Manager put us off using it again.

Target management is never going to please everyone. Maybe offer a choice, like the tree or list view in the Server Connect panel, and let the user pick old or new? BUT please do not break the current method! It must be backward compatible and WORK. I could easily go in-depth, but some of the previous responses to this topic cover our concerns, so there's no point repeating them.