
Automating FTP with ANT

This article describes how to automate the transfer of files between servers via FTP using the Java-based ANT build tool. I have been using ANT for automated FTP for a number of years now, and will share some of the lessons and limitations I have discovered.

ANT provides an FTP task. It requires the installation of two external libraries into ANT's \lib directory: Jakarta ORO and Jakarta Commons Net. The FTP task connects to a remote server via FTP and provides all the basic FTP operations: send files, get files, delete files, create directories, delete directories, and change permissions. For full details on how to use the FTP task, see the ANT manual's entry on the FTP task.

For my website, I use the FTP task to transfer my server logs to my local workstation for analysis. Here is an example (using ANT version 1.6.5):

<ftp action="recv"
  server="ftp.server"
  remotedir="/logs"
  userid="my-userid"
  password="my-password"
  verbose="yes"
>
  <fileset dir="${log.dir}">
      <include name="*.log"/>
  </fileset>
</ftp>

This task transfers the files matching *.log from the directory /logs on the server ftp.server to the ${log.dir} directory on your local machine.

At work, I usually use the FTP task for deploying application changes - sending a set of files built on my local workstation or development server to the test or production server. Here is an example:

<ftp action="send"
  server="ftp.server"
  userid="my-userid"
  password="my-password"
  remotedir="/incoming"
  depends="yes"
  binary="no"
  chmod="755"
>
  <fileset dir="${ftp.source.dir}">
      <include name="**/*"/>
  </fileset>
</ftp>

This task transfers all the files within the local directory ${ftp.source.dir} to the directory /incoming of the server ftp.server. Files in sub-directories are also transferred into corresponding sub-directories on the remote server, and the sub-directories are created if they do not already exist. Only new or changed files are actually transferred (depends="yes"). The files are transferred in ASCII mode (binary="no"). Assuming the remote server runs UNIX, the permissions of the files are set to 755 (chmod="755").

Using the FTP task to deploy a set of files has a number of limitations, especially if the files are organized into a hierarchy of directories. While the FTP task will automatically create sub-directories as required, I have not found a way to ensure that the required permissions are assigned to new directories (again assuming the remote server runs UNIX). In particular, the chmod attribute applies only to new files, not to new directories. The only workaround I have found is to execute a script on the server that assigns the necessary directory permissions. Another limitation is that the FTP task will not create empty directories. If you want a particular directory structure created on the remote server, each directory must contain at least one file. The workaround is to create a dummy file to ensure the directory is created.
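
The dummy-file workaround can be scripted before the send. This is just a sketch - the directory and file names are illustrative, not part of my actual build:

<!-- Create a placeholder file so the otherwise-empty directory
     is created on the remote server by the ftp send. -->
<touch file="${ftp.source.dir}/empty-dir/.placeholder" mkdirs="true"/>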

Deleting obsolete files and directories on the remote server is also difficult, for a couple of reasons. The FTP task does provide all the basic FTP operations, including deleting files and directories. But each is handled as a separate action requiring a separate invocation of the FTP task, rather than performing all the operations within a single FTP session. This is a minor inconvenience. The more significant limitation is determining which obsolete files and directories to delete. The development build usually assembles a set of files to transfer to the server without caring about the individual files or directories - it just assembles a directory structure. If certain files or directories are deleted or renamed, this simply results in a new set of files, without the build process knowing which old ones should be removed. Automating the deletion of these obsolete files by listing each one is therefore quite difficult. An easier approach would be to delete all the files and directories deployed to the server and then re-deploy the new set. This solution also has its problems. First, it requires recreating all the sub-directories, which as I explained above leads to the problem of ensuring their permissions are properly configured. Second, the directory structure into which files are deployed may include files created by server processes which cannot or should not be deleted, such as server logs or application data files. The workaround I use for obsolete files and directories is simply to leave them in place - they typically have no impact on the operation of the system. But I am on the lookout for a better solution to this problem - if you know of one, please let me know via a comment.
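
To illustrate the separate-invocation point, here is a sketch of deleting remote files and then a remote directory - note the two separate FTP sessions, one per action (the include patterns are illustrative):

<ftp action="del"
  server="ftp.server"
  userid="my-userid"
  password="my-password"
  remotedir="/incoming"
>
  <fileset>
      <include name="**/*.old"/>
  </fileset>
</ftp>

<ftp action="rmdir"
  server="ftp.server"
  userid="my-userid"
  password="my-password"
  remotedir="/incoming"
>
  <fileset>
      <include name="obsolete-dir"/>
  </fileset>
</ftp>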

Despite the limitations, I have been pleased with my experiences using ANT to automate FTP tasks. I encourage you to give it a try.

Update June, 2009: I have had questions about what version of the ORO and Commons Net libraries to use, as not all versions are compatible. I have had success with Ant 1.7, Commons Net 1.4.0, and Jakarta ORO 2.0.8.

If you find this article helpful, please make a donation.

11 Comments on “Automating FTP with ANT”

  1. Fabio says:

    Nice idea. I’ve tried it a few times, but I’m always stopped by the fact that ANT does not have an SFTP task…

  2. Thanks. Using secure FTP may be a requirement on one of my projects sometime soon as well.

  3. Fabio says:

    follow-up: interested by your post, I dug deeper and found I could use scp instead of sftp:

    http://wiki.apache.org/ant/NewAntFeaturesInDetail/Ssh

    thanks
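
    A sketch of the scp task described on that page, assuming the jsch jar is installed in ANT's \lib directory - the file, host, and path here are illustrative:

    <scp file="${dist.dir}/app.war"
      todir="my-userid@host.example:/opt/deploy"
      password="my-password"
      trust="true"/>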

  4. [...] of human error and ensures a consistent process is followed. Last year I wrote an article about automatically deploying code via FTP using the Java-based Apache Ant build tool. Since then I have needed to deploy to servers which do [...]

  5. Jimmycav says:

    I had the same problem setting directory permissions when doing a send, but instead of chmod="755" I used umask="022" and it seems to do the right thing.
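
    For reference, a sketch of the article's send example with umask in place of chmod (same illustrative server and directory names as above):

    <ftp action="send"
      server="ftp.server"
      userid="my-userid"
      password="my-password"
      remotedir="/incoming"
      umask="022"
    >
      <fileset dir="${ftp.source.dir}">
          <include name="**/*"/>
      </fileset>
    </ftp>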

  6. Jose says:

    On Linux you can use the exec task with rsync and avoid the password prompt by using authorized keys. This should solve the problems with permissions and deleted files.
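
    A sketch of this rsync approach, assuming rsync is on the path and SSH keys are already set up - the host and paths are illustrative:

    <exec executable="rsync" failonerror="true">
      <arg value="-az"/>
      <arg value="--delete"/>
      <arg value="${ftp.source.dir}/"/>
      <arg value="my-userid@host.example:/incoming/"/>
    </exec>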

  7. Ponic says:

    After deploying files from windows to unix, I would like to execute shell scripts which would compile the files. How can I do this?

    Thanks

  8. @Ponic, you can use Ant’s sshexec task to execute any remote command via SSH. After you transfer the files, simply use this task to execute the script that does the compile.
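
     For example - the host and script path here are illustrative, and like scp, sshexec requires the jsch jar in ANT's \lib directory:

     <sshexec host="host.example"
       username="my-userid"
       password="my-password"
       trust="true"
       command="/path/to/compile.sh"/>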

  9. Ponic says:

    Basil,

    Thanks for the reply.

    I have the following, and I would like to specify the directory in which to execute the commands in command. How can I do that?

  10. Ponic says:

    Basil,

    Thanks for the reply.

    I have the following, and I would like to specify the directory in which to execute the commands in command. How can I do that?

    <sshexec
      host="server"
      username="user"
      password="password"
      command=". ./.setenv"/>

  11. Ponic says:

    Basil,

    I have resolved the above issue by command=”cd

    I have yet another issue with path.

    When I run . ./.setenv on the server, it sets the environment path correctly; the content of this file is "export SRC_PATH=$project_app_home/oracle/src".

    I set the environment in sshexec by the following:

    command=”cd /location; ./setenv;

    However, when I execute it I get the path as /oracle/src. It is omitting project_app_home/. How can I resolve this issue?
