Tuesday 21 January 2014

p7zip E_FAIL when uncompressing

Problem

Uncompressing a multi-part zip file on Linux using p7zip throws the error "E_FAIL"

x@y:/media/z/../..$ 7z x ~/Documents/abc-2.0.zip.001

7-Zip 9.20  Copyright (c) 1999-2010 Igor Pavlov  2010-11-18
p7zip Version 9.20 (locale=en_GB.UTF-8,Utf16=on,HugeFiles=on,2 CPUs)

Processing archive: /home/x/Documents/abc-2.0.zip.001

Extracting  ...
...
...
Extracting  abc/def.vmdk
ERROR: E_FAIL  

Description

I have a Linux box running Ubuntu 13.10 and it has a large multi-part zip file: the compressed files total more than 40GB, the largest single file is more than 25GB, and the archive is split into 80 parts. I wanted to unzip this to a 128GB flash drive. As I was unzipping it I got the error E_FAIL. Some simple digging around solved this.
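Before extracting, it's worth listing the archive contents and their uncompressed sizes so you can spot a file that won't fit on the target. A quick check, using the archive path from the session above (substitute your own):

$ 7z l ~/Documents/abc-2.0.zip.001    # lists each file's uncompressed size

Compare the largest file in the listing against the target drive's free space and its filesystem's per-file size limit.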

E_FAIL broadly means "out of disk", "unable to create a file of a certain size", or some variant of that: p7zip failed to write the uncompressed files to the target disk. There are three main reasons for this (the shell checks after the list cover all three):

1. You are extracting from a read-only location, such as a CD-ROM, and have asked p7zip to write its output there. It can't do this, so it fails as if there were no disk space.
2. You are trying to uncompress to a drive with insufficient disk space.
3. You are trying to uncompress a large file to a file system that cannot support a file that large, so it fails as if there were no disk space.
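You can check for all three causes from the shell. A quick sketch, using the /media/z mount point from the session above (substitute your own target):

$ mount | grep /media/z    # "ro" in the mount options means read-only
$ df -h /media/z           # free space on the target drive
$ df -T /media/z           # filesystem type; "vfat" means FAT32 and its 4GB file limit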

My problem was that the brand-new flash drive was formatted as exFAT or FAT32; it appeared to my Linux box as vfat. FAT32 cannot hold a single file larger than 4GB, so the big (>25GB) file failed as soon as it grew past that limit.

Solution

I reformatted the drive as NTFS, because NTFS handles files larger than 4GB and the drive is going to be used on Windows anyway. I re-ran the uncompress process and it worked.
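For reference, a minimal sketch of the reformat, assuming the stick shows up as /dev/sdb1 (check with lsblk first; this erases everything on the drive):

$ lsblk -f                                  # confirm which device is the flash drive
$ sudo umount /media/z                      # unmount it before formatting
$ sudo mkfs.ntfs -f -L backup /dev/sdb1     # quick NTFS format with the label "backup"

mkfs.ntfs comes with the ntfs-3g package; the -f flag does a fast format rather than zeroing the whole drive.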

SOURCES

Ubuntu Forums Post

6 comments:

  1. I encountered the same error whilst trying to unzip the latest Raspberry Pi image (<4GB) on a vfat filesystem. After reading your blog post I was successful doing it on an ext4 drive. Cheers!

  2. I just want to add that you can also run into this error if your archive exceeds the maximum number of open files allowed per user.

    I have an archive consisting of 1200 1MB files that was transmitted over an unreliable connection (small 1MB chunks, so network retries have a minimal impact).

    The default maximum number of open files is 1024 (ulimit -n). I raised it with ulimit -n 2048.
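    A quick sketch of that fix; the raised limit only applies to the current shell session, and the archive name here is a placeholder:

    $ ulimit -n              # show the current soft limit (typically 1024)
    $ ulimit -n 2048         # raise it for this shell
    $ 7z x archive.7z.001    # re-run the extraction in the same shell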

  3. Just to be clear, all I have to do is clear some storage space?

  4. Thanks, I added a file to a backup dir that was too big to fit in the temporary /dev/shm RAM filesystem folder used for tape backup...

    Replies
    1. Removed the file from there, but /dev/shm is still full. I have to investigate and check for files that are too large, and/or use a different temp dir for my raw tar file and encrypted 7z file before they are backed up to tape (/dev/st0).
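      A quick way to check, assuming the default /dev/shm tmpfs mount:

      $ df -h /dev/shm       # how full the RAM-backed filesystem is
      $ du -sh /dev/shm/*    # which files are taking the space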
