SYNOPSIS
mysqldbimport [options] import_file ...
DESCRIPTION
If an object exists on the destination server with the same name as an imported object, it is dropped first before importing the new object.
To skip objects by type, use the --skip option with a list of the objects to skip. This enables you to import a particular set of objects, say, only events (by excluding all other types). Similarly, to skip import of BLOB data, specify the --skip-blobs option.
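For example, a command along the following lines (the file name data.sql is illustrative) imports everything except grants and events, and leaves out BLOB data:
shell> mysqldbimport --server=root@localhost --skip=GRANTS,EVENTS \
          --skip-blobs data.sql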
To specify the input format, use one of the following values with the --format option. These correspond to the output formats of the mysqldbexport utility:
• sql (default)
Input consists of SQL statements. For definitions, this consists of the appropriate CREATE and GRANT statements. For data, this is an INSERT statement (or bulk insert if the --bulk-insert option is specified).
• grid
Input is formatted in grid or table format like that produced by the mysql client command-line tool.
• csv
Input is formatted in comma-separated values format.
• raw_csv
Input is a simple CSV file containing uniform rows with values separated by commas. The file can contain a header (the first row) that lists the table columns. The --table option is required to use this format; see the example following this list.
• tab
Input is formatted in tab-separated format.
• vertical
Input is formatted in single-column format like that produced by the \G command of the mysql client command-line tool.
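For example, to load a plain CSV file into a single table with the raw_csv format, a command along these lines could be used (the database, table, and file names are illustrative):
shell> mysqldbimport --server=root@localhost --format=raw_csv \
          --table=util_test,t1 data.csv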
To indicate that input in csv or tab format does not contain column headers, specify the --no-headers option.
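For instance, a headerless tab-delimited file might be imported with a command such as this (the file name is a placeholder):
shell> mysqldbimport --server=root@localhost --format=tab \
          --no-headers data.tab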
To turn off all feedback information, specify the --quiet option.
By default, the utility creates each table on the destination server using the same storage engine as the original table. To override this and specify the storage engine to use for all tables created on the destination server, use the --new-storage-engine option. If the destination server supports the new engine, all tables use that engine.
To specify the storage engine to use for tables for which the destination server does not support the original storage engine on the source server, use the --default-storage-engine option.
The --new-storage-engine option takes precedence over --default-storage-engine if both are given.
If the --new-storage-engine or --default-storage-engine option is given and the destination server does not support the specified storage engine, a warning is issued and the server's default storage engine setting is used instead.
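For example, assuming the destination server supports InnoDB, a command like the following (engine and file names are illustrative) creates every imported table with that engine:
shell> mysqldbimport --server=root@localhost --new-storage-engine=InnoDB \
          data.sql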
You must provide connection parameters (user, host, password, and so forth) for an account that has the appropriate privileges to access all objects in the operation. For details, see NOTES.
If you attempt to import databases on a server with GTIDs enabled (GTID_MODE = ON), a warning is generated if the import file does not include the GTID statements generated by mysqldbexport.
The utility also generates a warning if you import databases on a server without GTIDs enabled and there are GTID statements present in the file. Use the --skip-gtid option to ignore the GTID statements.
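For example, to import an export file that contains GTID statements into a server that does not have GTIDs enabled, the statements can be ignored with a command such as this (the file name is illustrative):
shell> mysqldbimport --server=root@localhost --skip-gtid data.sql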
To make the most of GTIDs with export and import, export all of the databases on the server with the --all option of mysqldbexport. This generates an export file containing all of the databases and the GTIDs executed to that point. Importing this file on another server ensures that the server has all of the data as well as all of the GTIDs recorded correctly in its logs.
OPTIONS
mysqldbimport accepts the following command-line options:
• --help
Display a help message and exit.
• --license
Display license information and exit.
• --autocommit
Enable autocommit for data import. By default, autocommit is off and data changes are only committed once at the end of each imported file.
• --bulk-insert, -b
Use bulk insert statements for data.
• --character-set=<charset>
Sets the client character set. The default is retrieved from the server variable character_set_client.
• --default-storage-engine=<def_engine>
The engine to use for tables if the destination server does not support the original storage engine on the source server.
• --drop-first, -d
Drop each database to be imported, if it exists, before importing anything into it.
• --dryrun
Import the files and generate the statements but do not execute them. This is useful for testing input file validity.
• --format=<format>, -f <format>
Specify the input format. Permitted format values are sql (default), grid, tab, csv, raw_csv, and vertical.
• --import=<import_type>, -i <import_type>
Specify the import format. Permitted format values are:
Table 5.1. mysqldbimport Import Types
Import Type             Definition
definitions (default)   Only import the definitions (metadata) for the objects in the database list
data                    Only import the table data for the tables in the database list
both                    Import both the definitions (metadata) and data
If you attempt to import objects into an existing database, the result depends on the import format. If the format is definitions or both, an error occurs unless --drop-first is given. If the format is data, imported table data is added to existing table data.
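For example, to replace an existing database with the definitions and data from an export file, a command along these lines could be used (the file name is illustrative):
shell> mysqldbimport --server=root@localhost --import=both \
          --drop-first data.sql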
• --max-bulk-insert
Specify the maximum number of INSERT statements to bulk together; the default is 30,000. This option is only used with --bulk-insert.
• --multiprocess
Specify the number of processes to use to import the specified files concurrently. Special values: 0 (use as many processes as there are detected CPUs) and 1 (the default; no concurrency). Multiprocessing works at the file level on all operating systems.
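For example, a sketch of importing several files concurrently, using as many processes as there are detected CPUs (the file names are illustrative):
shell> mysqldbimport --server=root@localhost --multiprocess=0 \
          db1.sql db2.sql db3.sql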
• --new-storage-engine=<new_engine>
The engine to use for all tables created on the destination MySQL server.
• --no-headers, -h
Input does not contain column headers. This option only applies to the csv and tab file formats.
• --quiet, -q
Turn off all messages for quiet execution.
• --server=<server>
Connection information for the server.
To connect to a server, it is necessary to specify connection parameters such as user name, host name, password, and either a port or socket. MySQL Utilities provides a number of ways to supply this information. All of the methods require specifying your choice via a command-line option such as --server, --master, or --slave. The methods include the following, in order of most secure to least secure:
• Use login-paths from your .mylogin.cnf file (encrypted, not visible). Example: <login-path>[:<port>][:<socket>]
• Use a configuration file (unencrypted, not visible). Note: available as of release 1.5.0. Example: <configuration-file-path>[:<section>]
• Specify the data on the command line (unencrypted, visible). Example: <user>[:<passwd>]@<host>[:<port>][:<socket>]
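For example, assuming a login-path named instance_3306 has been configured in .mylogin.cnf, a connection can be made without exposing credentials on the command line:
shell> mysqldbimport --server=instance_3306 --import=both data.sql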
• --skip=<skip_objects>
Specify objects to skip in the operation as a comma-separated list (no spaces). Permitted values for this list are: CREATE_DB, DATA, EVENTS, FUNCTIONS, GRANTS, PROCEDURES, TABLES, TRIGGERS, and VIEWS.
• --skip-blobs
Do not import BLOB data.
• --skip-gtid
Skip execution of GTID_PURGED statements.
• --skip-rpl
Do not execute replication commands.
• --ssl-ca
The path to a file that contains a list of trusted SSL CAs.
• --ssl-cert
The name of the SSL certificate file to use for establishing a secure connection.
• --ssl-key
The name of the SSL key file to use for establishing a secure connection.
• --ssl
Specifies whether the server connection requires use of SSL. If an encrypted connection cannot be established, the connection attempt fails. The default setting is 0 (SSL not required).
• --table=<db>,<table>
Specify the table for importing. This option is required when using --format=raw_csv.
• --verbose, -v
Specify how much information to display. Use this option multiple times to increase the amount of information. For example, -v = verbose, -vv = more verbose, -vvv = debug.
• --version
Display version information and exit.
NOTES
Actual privileges needed may differ from installation to installation, depending on the security privileges present, whether the database contains certain objects such as views or events, and whether binary logging is enabled.
Some combinations of the options may result in errors during the operation. For example, excluding tables but not views may result in an error when a view is imported.
The --new-storage-engine and --default-storage-engine options apply to all destination tables in the operation.
For the --format and --import options, the permitted values are not case sensitive. In addition, values may be specified as any unambiguous prefix of a valid value. For example, --format=g specifies the grid format. An error occurs if a prefix matches more than one valid value.
When importing data and including the GTID commands, you may encounter an error similar to "GTID_PURGED can only be set when GTID_EXECUTED is empty". This occurs because the destination server is not in a clean replication state. To solve this problem, you can issue a "RESET MASTER" command on the destination prior to executing the import.
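For example, a command along these lines (the destination host name is a placeholder) clears the GTID execution history before the import; note that RESET MASTER also removes the binary log files, so use it only when that is acceptable:
shell> mysql --host=dest_host --user=root --password -e "RESET MASTER"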
The path to the MySQL client tools should be included in the PATH environment variable in order to use the authentication mechanism with login-paths. This allows the utility to use the my_print_defaults tool, which is required to read the login-path values from the login configuration file (.mylogin.cnf).
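For example, a login-path can be created with the mysql_config_editor tool that ships with the MySQL client tools (the login-path name and connection values are illustrative):
shell> mysql_config_editor set --login-path=instance_3306 \
          --host=localhost --user=root --password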
Keep in mind that you can only take advantage of multiprocessing if your system has multiple CPUs available for concurrent execution. Also note that multiprocessing is applied at the file level for the mysqldbimport utility, which means that only different files can be imported concurrently.
EXAMPLES
To import the metadata of the util_test database to the server on the local host using a file in CSV format, use this command:
shell> mysqldbimport --server=root@localhost --import=definitions \
          --format=csv data.csv
# Source on localhost: ... connected.
# Importing definitions from data.csv.
#...done.
Similarly, to import the data of the util_test database to the server on the local host using bulk insert statements, use this command:
shell> mysqldbimport --server=root@localhost --import=data \
          --bulk-insert --format=csv data.csv
# Source on localhost: ... connected.
# Importing data from data.csv.
#...done.
To import both data and definitions from the util_test database, importing the data using bulk insert statements from a file that contains SQL statements, use this command:
shell> mysqldbimport --server=root@localhost --import=both \
          --bulk-insert --format=sql data.sql
# Source on localhost: ... connected.
# Importing definitions and data from data.sql.
#...done.
PERMISSIONS REQUIRED
You also need permissions to create the new data directory and write data to it.
COPYRIGHT
Copyright © 2006, 2016, Oracle and/or its affiliates. All rights reserved.
This documentation is free software; you can redistribute it and/or modify it only under the terms of the GNU General Public License as published by the Free Software Foundation; version 2 of the License.
This documentation is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General Public License for more details.
You should have received a copy of the GNU General Public License along with the program; if not, write to the Free Software Foundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA or see http://www.gnu.org/licenses/.