  1. #1
    Join Date
    Feb 2009

    Fetch files from a web server

    Hi, I am just a newbie and I need to write some programs.

    I have a directory on an Apache web server, and under that directory I have 35 XML files. The path to the directory is like //localhost:8081/dirxml/

    And the paths to the XML files are like..
    .... and so on....

    I would like to fetch them from my Java program and then produce an XML file that is a combination of those XML files.

    So, how can I fetch those XML files from the web server simultaneously? Is it OK to use the java.net.URL class? From what I have found so far, it needs an exact URL path like //localhost:8081/dirxml/mycontacts.xml. But I can't do that, because I don't want to hard-code the XML file names. What I want is to fetch the files under that directory without knowing their names, i.e. fetch every file under that directory that ends with .xml.

    How can I do this? Is there any reference for it?

    Thanks in advance.
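    For example, fetching one file whose exact name I already know (the mycontacts.xml name from above) would be a sketch roughly like this:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.URL;

public class FetchOneFile {

    // Read the entire contents of a URL into a String
    public static String fetch(URL url) throws Exception {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(url.openStream(), "UTF-8"))) {
            int c;
            while ((c = in.read()) != -1) {
                sb.append((char) c);
            }
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        // Works only because the exact file name is known in advance
        String xml = fetch(new URL("http://localhost:8081/dirxml/mycontacts.xml"));
        System.out.println(xml);
    }
}
```

    But this needs one hard-coded name per file, which is exactly what I want to avoid.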

  2. #2
    Join Date
    Jan 2009
    As far as I know, web servers do not send files by wildcard. Imagine that you enter http://www.example.com/* into your browser and there are 100 matches. What would you expect to see on the page? 100 files displayed back to back?

  3. #3
    Join Date
    Feb 2009
    I don't mean to display those files. I just want to fetch them, process them, and create another XML file from their records. And actually, I already know how to fetch one file given a specific URL path.

    The problem is that I would like to fetch more than one file (say 30) at once, so I cannot give a specific URL in my Java program. What I want to know is how to fetch multiple files from the web server in my Java program.

  4. #4
    Join Date
    Jan 2009
    What I meant was that, by default, the web server doesn't know how to respond to a request intended for multiple files. If your Java program tried to download multiple files with one URL, the same would happen when you enter that URL into your browser: how is the browser supposed to respond?

    However, you can arrange the web server to list the file names in a directory when a directory (as opposed to a file) is requested. Then you can follow the links to download each file in turn (or in parallel if you use multiple threads). That way you only need to know the directory name, not the name of each and every file.

    For Apache, you can set the Indexes option:

    <Directory /web/docs>
        Options Indexes
    </Directory>
  5. #5
    Join Date
    Feb 2009

    Thumbs up This is what you are looking for!

    There is no simple way to do that, because the directory URL is not a real file-system path. Doing it by hand takes a fair amount of code, since you have to parse the HTML returned for that URL and extract the files linked inside it. Fortunately, Apache has already done that work in a package called Ivy, which includes utilities for exactly this.
    You can download the binary from this location:

    Then you can use it as simply as:
    package networkanddatabase;

    import java.io.BufferedInputStream;
    import java.io.BufferedOutputStream;
    import java.io.File;
    import java.io.FileOutputStream;
    import java.io.IOException;
    import java.net.URL;
    import java.net.URLConnection;
    import java.util.Iterator;
    import java.util.List;

    import org.apache.ivy.util.url.ApacheURLLister;

    public class FetchFilesFromHttpURL {

        public static void main(String[] args) {
            try {
                URL url = new URL("http://engineeringserver.com/fetchfiles/");
                File destFolder = new File("c:\\test");
                ApacheURLLister lister = new ApacheURLLister();
                // Returns a List of URL objects found in the directory listing
                List files = lister.listAll(url);
                System.out.println("list file is complete.." + files);
                for (Iterator iter = files.iterator(); iter.hasNext();) {
                    URL fileUrl = (URL) iter.next();
                    httpFileDownload(fileUrl, destFolder);
                }
                System.out.println("download is complete..");
            } catch (Exception e) {
                e.printStackTrace();
            }
        }

        public static void httpFileDownload(URL url, File destFolder) throws Exception {
            // Keep only the file name so everything lands directly in destFolder
            File destination = new File(destFolder, new File(url.getFile()).getName());
            BufferedInputStream bis = null;
            BufferedOutputStream bos = null;
            try {
                URLConnection urlc = url.openConnection();
                bis = new BufferedInputStream(urlc.getInputStream());
                bos = new BufferedOutputStream(new FileOutputStream(destination));
                int i;
                while ((i = bis.read()) != -1) {
                    bos.write(i);
                }
            } finally {
                if (bis != null) {
                    try {
                        bis.close();
                    } catch (IOException ioe) {
                        // ignore
                    }
                }
                if (bos != null) {
                    try {
                        bos.close();
                    } catch (IOException ioe) {
                        // ignore
                    }
                }
            }
        }
    }
    That is everything I can do.

