
I have a large directory with lots of files and subdirectories on a remote Windows system where I only have FTP access.

I had to make some modifications to some of the files, so I downloaded the whole directory and ran a find-and-replace command on the files recursively.

Using Git locally, I was able to get the list of modified files. (There is no Git on the Windows system, so I can't push/pull. Besides, I only have FTP access.)
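For reference, a list like this can be produced with something along these lines (assuming the changes are still uncommitted, so they show up when comparing against HEAD):

git diff --name-only HEAD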

Now I need to update the files on the Windows system. Uploading the files one by one manually would be frustrating, as there are a lot of modified files.
Some of the files are located in subdirectories and sub-subdirectories.

For example, this structure (there are many more files):

./
 |--- file1
 |--- file2
 |--- dir1
 |     |--- file1.1
 |     |--- dir1.1
 |           |---- file1.1.1
 |--- dir2
       |--- file2.1
       |--- file2.2
       |--- file2.3

How can I copy them (or even move them, that's fine too) to a new location while preserving their directory structure?
That way, in my FTP client, I could just upload the whole folder and the client would recreate the directory structure by itself.

Keep in mind that the subdirectories also contain other, unmodified files, which I do not want to copy.

Git gave me this list, so it did half the work:

file1
file2
dir1/file1.1
dir1/dir1.1/file1.1.1
dir2/file2.1
dir2/file2.2
dir2/file2.3

3 Answers


Why don't you use curlftpfs? I think that would be simpler than uploading and downloading the whole directory structure.

sudo apt-get install curlftpfs

Mount the remote FTP filesystem:

curlftpfs -v user:password@ftp-server.com/path/ /mnt/

Then edit or modify whatever you have to, and unmount the filesystem. Be aware, though, that the filesystem is very slow.
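As a rough sketch, the modified files could then be copied over the mount in one go before unmounting; the list filename foo.log and the mount point /mnt/ here are just placeholders:

# copy only the files named in the list, keeping their relative paths
rsync -av --files-from=foo.log . /mnt/
# unmount the FTP filesystem when finished
fusermount -u /mnt/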

Edit: If your password has special characters like "@", use a file called ~/.netrc with the following format:

machine ftp-server.com
login user
password p@ssword

And mount the filesystem via:

curlftpfs -v ftp-server.com/path/ /mnt/

Interesting problem. I searched a bit, and here is another approach: when you have a file list and a lot of files to upload, use wput.

apt-get install wput

cat /your/large/file/list | wput ftp://host/ -i -

I would recommend turning on verbose output with -v.

From the man page:

-i file
       --input-file=file
           Reads URLs and filenames from file. ...

You can also use find to pipe filenames into wput.

find | wput ftp://host/ -i -

But take care with shell-escaped characters in your list.
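Combining this with the list from Git, a sketch might look like the following (the host, credentials, and path are placeholders, and it assumes the local working directory mirrors the remote layout):

git diff --name-only HEAD | wput -v ftp://user:password@ftp-server.com/path/ -i -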


This is what I ended up doing, although if there is a better way, please add it as an answer:

I saved the list of files in a file called foo.log

and ran the following command:

tar -zcf bar.tar.gz --files-from foo.log

When archiving with tar, the directory structure of the files being added is kept intact.
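Before unpacking, the archive contents can be listed to double-check that the relative paths made it in:

tar -tzf bar.tar.gz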

I then extracted the bar.tar.gz file into an empty directory:

mkdir tempdir
tar -zxf bar.tar.gz -C tempdir
# Delete the archive
rm bar.tar.gz

Now all my modified files have been copied into tempdir with their directory structure intact.
