One of the oldest but still very useful Linux tricks for Oracle DBAs is using mknod to create a named pipe (FIFO). This allows you to export data and compress it in a single step — without writing a huge uncompressed dump file to disk first.
This technique was very popular in the days of the classic exp utility and remains relevant even today with Data Pump.
What Are mknod and Named Pipes?
mknod is a Linux command used to create special files, including named pipes (FIFOs).
A named pipe acts as a temporary buffer between two processes — one process writes to the pipe, and another reads from it simultaneously.
This is perfect for export → compress pipelines because the dump file never touches the disk in uncompressed form.
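The idea can be demonstrated with plain shell tools before involving Oracle at all. The sketch below uses a throwaway temp directory and mkfifo (the modern equivalent of `mknod <name> p`); one process compresses in the background while another writes into the pipe:

```shell
# Create a scratch directory and a named pipe in it
workdir=$(mktemp -d)
mkfifo "$workdir/demo_pipe"        # same effect as: mknod "$workdir/demo_pipe" p

# Reader: compress whatever arrives on the pipe, in the background
gzip -c < "$workdir/demo_pipe" > "$workdir/out.gz" &

# Writer: data flows straight into gzip, never hitting disk uncompressed
printf 'hello from the pipe\n' > "$workdir/demo_pipe"
wait                               # wait for the background gzip to finish

# Round-trip check: decompress and show what came through
roundtrip=$(gunzip -c "$workdir/out.gz")
echo "$roundtrip"                  # prints: hello from the pipe
rm -rf "$workdir"
```

Note how the writer blocks until the reader has the pipe open — that hand-off is what lets the export and the compression run in lockstep.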
Why Use mknod for Oracle Exports?
- Save huge amounts of disk space (no large intermediate .dmp file)
- Export and compression happen in parallel
- Reduce I/O load on the server
- Very useful when you have limited free space on the filesystem
- Classic technique that still works perfectly in 19c/21c
Classic Example – Using exp + mknod + gzip
Here is the original script style from 2012 (still works today):
[oracle@ORACLEDBASECRETS01 ~]$ . $HOME/.bash_profile
[oracle@ORACLEDBASECRETS01 ~]$ cd /home/oracle/backup/

# Create the named pipe
[oracle@ORACLEDBASECRETS01 backup]$ mknod exp_pipe p

# Verify pipe creation
[oracle@ORACLEDBASECRETS01 backup]$ ls -l exp_pipe
prw-r--r-- 1 oracle oinstall 0 Apr 19 18:10 exp_pipe

# Start compression in the background
[oracle@ORACLEDBASECRETS01 backup]$ gzip -cNf < exp_pipe > exp_data.dmp.gz &
[1] 24567

# Run the export, writing directly to the pipe
[oracle@ORACLEDBASECRETS01 backup]$ exp dba/dbaOra file=exp_pipe log=exp_data_TEAM.log owner=TEAM statistics=none

Export: Release 19.0.0.0.0 - Production on Fri Apr 19 18:11:05 2026
Version 19.3.0.0.0
Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0 - Production
Export done in WE8MSWIN1252 character set and AL16UTF16 NCHAR character set

About to export specified users ...
. exporting pre-schema procedural objects and actions
. exporting foreign function library names for user TEAM
. exporting PUBLIC type synonyms
. exporting private type synonyms
. exporting object type definitions for user TEAM
About to export TEAM's objects ...
. exporting database links
. exporting sequence numbers
. exporting cluster definitions
. exporting tables
. . exporting table                     EMP_DATA       10500 rows exported
. . exporting table                   SALES_DATA       24890 rows exported
. exporting indexes
. exporting grants
. exporting constraints
. exporting triggers
Export terminated successfully without warnings.

# Check the compressed dump file
[oracle@ORACLEDBASECRETS01 backup]$ ls -lh exp_data.dmp.gz
-rw-r--r-- 1 oracle oinstall 1.2G Apr 19 18:20 exp_data.dmp.gz

# Verify gzip integrity
[oracle@ORACLEDBASECRETS01 backup]$ gzip -t exp_data.dmp.gz

# Remove the named pipe
[oracle@ORACLEDBASECRETS01 backup]$ rm -f exp_pipe

# Background job completed
[1]+  Done                    gzip -cNf < exp_pipe > exp_data.dmp.gz
Modern Version – Using Data Pump (expdp) + mknod
[oracle@ORACLEDBASECRETS01 backup]$ . $HOME/.bash_profile
[oracle@ORACLEDBASECRETS01 backup]$ cd /u01/backup/

# Create the named pipe
[oracle@ORACLEDBASECRETS01 backup]$ mknod mypipe p

# Verify the pipe
[oracle@ORACLEDBASECRETS01 backup]$ ls -l mypipe
prw-r--r-- 1 oracle oinstall 0 Apr 19 19:05 mypipe

# Start compression in the background
[oracle@ORACLEDBASECRETS01 backup]$ gzip -c < mypipe > full_export.dmp.gz &
[1] 31245

# Run the Data Pump export directly to the pipe
# (DATA_PUMP_DIR must point to /u01/backup so dumpfile=mypipe resolves to the pipe)
[oracle@ORACLEDBASECRETS01 backup]$ expdp dba/dbaOra directory=DATA_PUMP_DIR dumpfile=mypipe logfile=full_export.log full=Y parallel=4

Export: Release 19.0.0.0.0 - Production on Fri Apr 19 19:06:10 2026
Version 19.3.0.0.0
Connected to: Oracle Database 19c Enterprise Edition Release 19.0.0.0.0

Starting "SYSTEM"."SYS_EXPORT_FULL_01": system/******** directory=DATA_PUMP_DIR dumpfile=mypipe logfile=full_export.log full=Y parallel=4
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/INDEX/INDEX
Processing object type DATABASE_EXPORT/SCHEMA/TABLE/CONSTRAINT
. . exported "HR"."EMPLOYEES"          107 rows
. . exported "HR"."DEPARTMENTS"         27 rows
. . exported "SCOTT"."EMP"              14 rows
. . exported "SCOTT"."DEPT"              4 rows
Master table "SYSTEM"."SYS_EXPORT_FULL_01" successfully loaded/unloaded
******************************************************************************
Dump file set for SYSTEM.SYS_EXPORT_FULL_01 is:
  mypipe
Job "SYSTEM"."SYS_EXPORT_FULL_01" successfully completed at Fri Apr 19 19:15:22 2026 elapsed 0 00:09:12

# Verify the compressed dump file
[oracle@ORACLEDBASECRETS01 backup]$ ls -lh full_export.dmp.gz
-rw-r--r-- 1 oracle oinstall 2.5G Apr 19 19:15 full_export.dmp.gz

# Validate the gzip file
[oracle@ORACLEDBASECRETS01 backup]$ gzip -t full_export.dmp.gz

# Clean up the pipe
[oracle@ORACLEDBASECRETS01 backup]$ rm -f mypipe

# Background compression job completed
[1]+  Done                    gzip -c < mypipe > full_export.dmp.gz
Tip: For even faster compression, replace gzip with pigz (parallel gzip).
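Swapping in pigz is a one-word change because pigz produces standard gzip-format output. A small sketch (throwaway temp paths) that prefers pigz when installed and falls back to gzip otherwise:

```shell
# Prefer pigz (parallel gzip) when installed; fall back to plain gzip.
# The pipeline shape is identical either way.
COMPRESS=$(command -v pigz || command -v gzip)
workdir=$(mktemp -d)
mkfifo "$workdir/pipe"

"$COMPRESS" -c < "$workdir/pipe" > "$workdir/out.gz" &
printf 'same pipeline, faster compressor\n' > "$workdir/pipe"
wait

# gunzip reads pigz output too, since both write the gzip format
pigz_check=$(gunzip -c "$workdir/out.gz")
echo "$pigz_check"                 # prints: same pipeline, faster compressor
rm -rf "$workdir"
```

In the real export script you would simply replace `gzip -c < exp_pipe` with `pigz -c < exp_pipe` (optionally `pigz -p <n>` to cap the thread count).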
Step-by-Step Implementation
- Create a dedicated backup directory
- Write the shell script (make it executable: chmod +x export_pipe.sh)
- Run the script as the oracle user
- Monitor the process with ps -ef | grep gzip and tail -f *.log
- Verify the final compressed file
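The steps above can be wrapped into a reusable helper. This is a hedged sketch (the function name and calling convention are my own, not from the original script): the export command receives the pipe path as its final argument, and a stand-in "exporter" is used in the demo so the sketch is self-contained and testable without an Oracle installation.

```shell
# Hypothetical helper: stream any export command through a FIFO into gzip.
# For classic exp you might call it like (credentials/owner are placeholders):
#   run_through_pipe exp_data.dmp.gz \
#       sh -c 'exp dba/dbaOra file="$1" log=exp.log owner=TEAM statistics=none' _
run_through_pipe() {
    out=$1; shift
    pipe=$(mktemp -u) || return 1
    mknod "$pipe" p                 # create the FIFO
    gzip -c < "$pipe" > "$out" &    # compressor reads in the background
    gzpid=$!
    "$@" "$pipe"                    # exporter writes into the pipe path
    rc=$?
    wait "$gzpid"                   # let the compressor drain the pipe
    rm -f "$pipe"                   # always clean up the pipe
    return $rc
}

# Demo with a stand-in "exporter" writing one line into the pipe:
run_through_pipe /tmp/demo_export.gz sh -c 'echo "demo export" > "$1"' _
wrapper_check=$(gunzip -c /tmp/demo_export.gz)
echo "$wrapper_check"              # prints: demo export
rm -f /tmp/demo_export.gz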
Important Notes & Best Practices
- Create the named pipe in the same directory the export writes to (for expdp, the directory object must point at the pipe's location)
- Always remove the pipe (rm -f pipe_name) after the job completes
- Compression is CPU-intensive, so make sure the server has spare CPU before running this on very large exports
- Test in non-production first
- The technique works reliably with classic exp; Data Pump expects seekable regular files for its dump files, so pipe-based expdp can fail on some versions and configurations. Test it carefully, or use expdp's built-in COMPRESSION=ALL parameter (part of the Advanced Compression option) as the supported alternative
- The same trick works in reverse for import (imp/impdp) with gunzip
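The import direction looks like the export pipeline mirrored: gunzip feeds the pipe in the background while the importer reads from it. In this self-contained sketch `cat` stands in for imp; with Oracle it would be e.g. `imp dba/dbaOra file=imp_pipe log=imp.log fromuser=TEAM touser=TEAM`:

```shell
# Reverse direction: decompress into a pipe while the importer reads from it
workdir=$(mktemp -d)
mknod "$workdir/imp_pipe" p

# Stand-in for an existing compressed dump file
printf 'pretend dump contents\n' | gzip -c > "$workdir/exp_data.dmp.gz"

# Decompress into the pipe in the background...
gunzip -c "$workdir/exp_data.dmp.gz" > "$workdir/imp_pipe" &

# ...while the "importer" (cat here, imp in real life) reads from it
imported=$(cat "$workdir/imp_pipe")
wait
echo "$imported"                   # prints: pretend dump contents
rm -rf "$workdir"
```

The compressed dump never has to be fully inflated on disk, which is exactly the space saving the export side gives you.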
Key Takeaways
• mknod exp_pipe p creates a named pipe for streaming
• Export writes directly to the pipe → gzip reads and compresses simultaneously
• Saves disk space and reduces I/O
• Still a very useful trick in 2025 for space-constrained environments
• Combine with pigz for even better performance
Conclusion
The mknod named pipe technique is a simple yet powerful way to export and compress Oracle data in one smooth operation. Even though it looks old-school, it continues to solve real-world problems when you need to export large schemas or full databases with limited disk space.
Toufique Khan
