
Apologies for the title. To be more specific, I recall reading this somewhere in the last year when I was trying to repair data damage to a hard drive.

I have a Windows XP hard drive that suffered a head crash, leaving unrecoverable bad sectors in certain files. I'm cloning the drive to a duplicate drive and replacing the damaged files from a backup I made long ago.

What I've done already: I booted into Parted Magic from the Ultimate Boot CD and used Linux's ddrescue to clone the damaged drive to a disk image, then used the logfile.txt to write DEADBEEF into every sector it couldn't read, producing a complete image. I then used grep, I believe, to search the entire file system for the string DEADBEEF and list every file containing it, though it kept quitting hours into the search due to some odd error.
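To illustrate the fill-and-search idea on something tiny (a made-up 4 KiB file, not a real device or my actual image):

```shell
# Make a 4 KiB all-zero "disk image" (8 sectors of 512 bytes)
dd if=/dev/zero of=demo.img bs=512 count=8 2>/dev/null

# Pretend sector 3 (byte offset 1536) was unreadable: fill it with the
# DEADBEEF marker, much as the fill step did on the real image
printf 'DEADBEEF%.0s' $(seq 64) > marker.bin      # 64 x 8 bytes = 512 bytes
dd if=marker.bin of=demo.img bs=512 seek=3 conv=notrunc 2>/dev/null

# Find the marker again: -a treats binary as text, -b prints byte offsets
grep -abo DEADBEEF demo.img | head -n 1           # first match is at 1536
```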

I also manually repaired the errors in the $MFT (Master File Table), everything except a single unimportant picture file's data, so that I could scan the entire file system properly and see all files (some were not showing because of the damage).

What I need to do is this:

I want to scan the entire drive at the byte level (as if viewing the disk image in a hex editor), every sector, for the string DEADBEEF, and then have it list every file that each DEADBEEF-filled bad sector belongs to according to the filesystem. I remember reading about a tool that scans the drive and, when it finds the string, reports the offset/location/sector of the match and which file owns the data in that sector. Alternatively, something that takes every bad sector and lists which file that sector belongs to would work.

The ddrescue logfile lists every sector that was detected as unreadable (about 1000 200-byte sectors), and those are the sectors it wrote DEADBEEF to. If I know which files own those bad sectors, I can replace them from my old backup.
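For reference, pulling the bad areas and their sector numbers out of a ddrescue logfile can be sketched like this (the sample mapfile below is made up, and 512-byte sectors are assumed):

```shell
# Made-up ddrescue mapfile; lines with status '-' are unreadable areas
cat > sample.log <<'EOF'
# Mapfile. Created by GNU ddrescue
# current_pos  current_status
0x00010200     +
#      pos        size  status
0x00000000  0x00010000  +
0x00010000  0x00000200  -
0x00010200  0x00020000  +
EOF

# Convert each bad area to a byte offset, length, and 512-byte sector range
grep -v '^#' sample.log | while read pos size status; do
    [ "$status" = "-" ] || continue
    p=$((pos)); l=$((size))
    echo "offset $p len $l sectors $((p / 512))-$(((p + l) / 512 - 1))"
done
```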

Before you ask, I can't just use the old backup alone because it's about 3 years old. The old backup is actually the original drive from this computer, which I cloned to this drive that I'm trying to rescue. Most of the bad sectors from the crash were in a part of the drive that only held files carried over from the original drive. I can easily copy those files from the original to the new drive to fix all the DEADBEEF bad sectors, but I need to know which file each of those bad sectors belongs to.

Again, I recall reading something about scanning all the sectors of a drive and having it list which file a given sector belongs to. So how do I do this from Parted Magic? If I have to mount the drive there, it needs to be mounted read-only.

DChronosX

1 Answer


You could do what the helpful comment from steeldriver suggests and use ddrutility.

It doesn't appear to be in the Ubuntu repos, but its home page is https://sourceforge.net/projects/ddrutility/
Specifically, use its tool ddru_findbad.
Here's a clip from its wiki page:

ddru_findbad
It is a bash script that will try to find which files are related to bad sectors in a ddrescue log file.
It relies on 3rd party utilities for its functions. It may not work on all systems. It can be slow, and can be very slow if not unusable if there are a lot of bad sectors in the list (it does not work well with a large error size).


I'm tempted to forget about sector numbers and just mount the cloned image & search all files for "DEADBEEF" with find, xargs & grep in Ubuntu (or Xubuntu, Lubuntu, Debian, most any Linux).

Whether it's easier or faster than trying ddru_findbad probably depends on how big & fast your disk image is.

find /mnt/x -type f -print0 | xargs -0 grep --files-with-matches "DEADBEEF" >> list

Where the image is mounted at /mnt/x. The file list then contains all the matching filenames. Any DEADBEEF in free space is ignored.
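As a sanity check, you can try the same pipeline on an ordinary directory first (the paths and filenames below are made up for the demo):

```shell
# Stand-in for the mounted image: a directory with one "damaged" file
mkdir -p mntdemo/docs
printf 'intact data' > mntdemo/docs/good.txt
printf 'data DEADBEEF data' > mntdemo/docs/damaged.txt

# Same idea as the command above, pointed at the demo tree
find mntdemo -type f -print0 | xargs -0 grep --files-with-matches "DEADBEEF" > list
cat list    # prints mntdemo/docs/damaged.txt
```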

Xen2050