Oracle's logical export and import tools: exp and imp

An introduction to exp/imp:
exp and imp are the two oldest surviving command-line backup tools in Oracle. They are widely used for dumping small databases, migrating tablespaces, extracting tables, and detecting logical and physical corruption. For a small database they can serve as a logical backup alongside the physical backup, and they work across platforms and across versions.
How exp/imp work:
The exp user process connects to the database through a server (shadow) process and issues SELECT statements for the data to be exported. The rows pass through the buffer cache and the SQL processing layers before being written to the export file, so an exp session consumes SGA and PGA resources on the server.
imp reads the .dmp file produced by exp, reconstructs DDL statements from it, and then executes the statements that create the tables and other objects and insert the data.
exp export modes:
1. Full database export (rarely used)
2. Export by user
3. Export by table
4. Export table structure only, with no row data

Common exp parameters: exp help=y
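As a sketch, the export modes above map to exp invocations like these; the connect string orcl, the user lck, and the Windows file paths are assumptions for illustration, not values from a real system:

```shell
# 1. Full database export (rarely used)
exp system/oracle@orcl full=y file=d:\full.dmp log=d:\full_exp.log

# 2. Export by user: everything owned by lck
exp system/oracle@orcl owner=lck file=d:\lck.dmp log=d:\lck_exp.log

# 3. Export by table
exp system/oracle@orcl tables=lck.tab1 file=d:\tab1.dmp log=d:\tab1_exp.log

# 4. Table structure only, no row data
exp system/oracle@orcl owner=lck rows=n file=d:\lck_ddl.dmp
```

The log parameter is worth adding in every case: exp can continue past some errors, and the log file is the complete record of what was skipped.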
Drawbacks of exp:
It is slow: exp must first SELECT the data to be exported inside the database, then transfer it through the SGA and PGA to the exp client.
If the connection is dropped, exp has to start the export over from the beginning; there is no resume capability.
It consumes server-side resources, so it should only be run when it will not impact the server's workload.
Common imp parameters: imp help=y
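A few imp parameters that come up in the walkthrough below, sketched with the same assumed connect string and paths:

```shell
# Remap objects from one schema to another
imp system/oracle@orcl fromuser=lck touser=lck file=d:\lck.dmp log=d:\lck_imp.log

# ignore=y suppresses "object already exists" errors so the rows still load
imp system/oracle@orcl fromuser=lck touser=lck file=d:\lck.dmp ignore=y

# show=y only lists the dump's contents; nothing is actually imported
imp system/oracle@orcl file=d:\lck.dmp show=y full=y
```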
Importing the data: load the export into Oracle 11g 64-bit on a Windows platform
Importing with an 11g client
1. Import by user
imp system/oracle@orcl fromuser=lck touser=lck file=d:\lck_tables.dmp
Run the import again.

It errors out again: lck has no privileges on the data tablespace, because the lck user was never given a quota on that tablespace. The table structures, however, have already been imported; take a look.
Fine, grant lck a quota on the tablespace and run the import again.
Ah, it still fails: the objects already exist. Add the parameter ignore=y (ignore creation errors) and import again.
OK, the import succeeds. So importing data is not always smooth sailing.
Summary: before importing, create the corresponding user in the target database, grant that user the necessary privileges, and give it a quota on its default tablespace.
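The preparation described in the summary can be sketched in SQL*Plus as a DBA; the password and the tablespace name data are assumptions matching this walkthrough:

```sql
-- Create the target user with a default tablespace
CREATE USER lck IDENTIFIED BY lck DEFAULT TABLESPACE data;
-- Privileges needed to connect and create objects
GRANT CONNECT, RESOURCE TO lck;
-- The quota whose absence caused the import error above
ALTER USER lck QUOTA UNLIMITED ON data;
```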
2. Import by table
imp system/oracle@orcl tables=tab1 fromuser=lck,test touser=lck,test file=d:\lck_test_tables.dmp
OK, the import succeeds!
This article comes from the "挨踢小蝌蚪" blog; please keep this attribution when reposting.
