Introduction
In this blog post, we will take a look at the mysqldump command that ships as part of the MySQL client tools.
Data Backup
We normally back up data by exporting it as JSON / CSV or as SQL files. This way, if a change we are working on has an unintended impact, we can immediately use the backup to undo any unexpected database operation.
When working with JSON / CSV, we sometimes have to deal with issues in the data formats. With SQL scripts, on the other hand, we get the option to rewrite the queries, insert them into a local database, perform the required transformations or operations, and then move the data to the server.
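For example, once we have such a SQL dump, it can be loaded into a local database with the mysql command-line client before making any changes. Below is a minimal sketch, assuming a local MySQL instance, a database named local_db, and a dump file named script_name.sql (both names are placeholders):
mysql.exe --host=localhost --port=3306 --user=root -p local_db < script_name.sql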
MySQL Workbench provides an export option that can generate SQL scripts, but it has issues such as the target table names being left empty and a separate INSERT statement being generated for each row in the table.
Example Command
mysqldump.exe
--host=servername --port=3306 --default-character-set=utf8
--user=(db_user_name) -p --protocol=tcp --column-statistics=FALSE --skip-triggers
--set-gtid-purged=OFF --no-create-info --quick --max_allowed_packet=512M --skip-lock-tables
"db_name" "table_name" --where="column_name='value'" > script_name.sql
As we can see in the above command, we have provided the following arguments:
--host -> the server host name
--user -> the database user name used to connect
--port -> 3306 by default, or a custom port if one is used
--default-character-set -> we are using utf8; it can be changed based on the character set defined in the table schema
-p -> to prompt for the database user's password
--protocol -> we are explicitly connecting over the TCP protocol
--column-statistics=FALSE -> to disable the column statistics queries (these can fail when a newer client dumps from an older server)
--skip-triggers -> to exclude triggers from the dump
--set-gtid-purged=OFF -> to leave the global transaction identifiers out of the dump
--no-create-info -> to not include the CREATE TABLE statements in the generated SQL file
--quick -> to retrieve rows one at a time instead of buffering whole tables in memory, which helps with large volumes of data
--max_allowed_packet -> to set the maximum packet size; we are using 512 MB
--skip-lock-tables -> to avoid locking the tables while the dump runs; the default is to lock them
--where -> adds a filter condition to the generated query so that we can split the export into chunks or focus on the required subset of data (see the example after this list)
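For example, to split a large table into chunks using --where, the dump can be run more than once with different ranges. This is only a sketch, assuming the table has a numeric id column; chunk_1.sql and chunk_2.sql are placeholder file names:
mysqldump.exe --host=servername --port=3306 --user=(db_user_name) -p --no-create-info --quick --skip-lock-tables "db_name" "table_name" --where="id BETWEEN 1 AND 100000" > chunk_1.sql
mysqldump.exe --host=servername --port=3306 --user=(db_user_name) -p --no-create-info --quick --skip-lock-tables "db_name" "table_name" --where="id BETWEEN 100001 AND 200000" > chunk_2.sql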
I have tried this and it works fine. Hope this helps anyone in need.