bug-parted
print <number> gives an error


From: Debarshi 'Rishi' Ray
Subject: print <number> gives an error
Date: Sun, 10 Sep 2006 23:57:07 +0530

Whenever I give the 'print <number>' command in Parted, it complains
either that the file system was not cleanly unmounted or that it has an
incompatible feature enabled.

address@hidden ~]# umount /dev/hda7
address@hidden ~]# parted
GNU Parted 1.7.1
Using /dev/hda
Welcome to GNU Parted! Type 'help' to view a list of commands.
(parted) print

Disk /dev/hda: 30.0GB
Sector size (logical/physical): 512B/512B
Partition Table: msdos

Number  Start   End     Size    Type      File system  Flags
1      32.3kB  98.7MB  98.7MB  primary   ext3         boot
2      98.7MB  19.4GB  19.3GB  primary   ext3
3      19.4GB  25.9GB  6440MB  primary   ext3
4      25.9GB  30.0GB  4137MB  extended
5      25.9GB  28.0GB  2147MB  logical   ext3
6      28.0GB  29.1GB  1077MB  logical   ext3
7      29.1GB  29.3GB  239MB   logical   ext3
8      29.3GB  30.0GB  674MB   logical   linux-swap

(parted) print 7
Error: File system has an incompatible feature enabled.
(parted) quit
Information: Don't forget to update /etc/fstab, if necessary.

address@hidden ~]# e2fsck /dev/hda7
e2fsck 1.38 (30-Jun-2005)
/development: recovering journal
/development: clean, 6407/58232 files, 69743/232908 blocks
address@hidden ~]#

If the same print command is issued with the partition mounted, I get:
Error: File system was not cleanly unmounted!  You should run e2fsck.
Modifying an unclean file system could cause severe
corruption.
Ignore/Cancel?
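My guess (not verified) is that both messages trace back to the ext3
needs_recovery flag, which the kernel sets in the superblock while the
journal is active or dirty, and which Parted 1.7.1 may be treating as an
incompatible feature. The flags can be inspected with dumpe2fs from
e2fsprogs; here is a sketch against a throwaway image file (the /tmp path
is illustrative) rather than /dev/hda7:

```shell
# Build a small ext3 file system inside a regular file (no root needed).
dd if=/dev/zero of=/tmp/ext3-test.img bs=1M count=8 2>/dev/null
mkfs.ext3 -F -q /tmp/ext3-test.img

# dumpe2fs -h prints only the superblock; the "Filesystem features" line
# lists flags such as has_journal.  On a live or uncleanly unmounted
# ext3 file system, needs_recovery shows up there as well.
dumpe2fs -h /tmp/ext3-test.img | grep 'Filesystem features'
```

On the real device this would be `dumpe2fs -h /dev/hda7`, run after
unmounting, to see which feature Parted is objecting to.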

I am using GNU Parted 1.7.1 RPM from the Fedora Core 5 repositories.
address@hidden ~]# parted --version
GNU Parted 1.7.1
address@hidden ~]#

Any ideas?

Happy hacking,
Debarshi
--
It matters not whether you win or lose; what matters is whether I win or lose.
                                           - Darrin Weinberg
