
Re: [Orekit Users] crawler build failure




MAISONOBE Luc <luc.maisonobe@c-s.fr> wrote:

w.grossman@ieee.org wrote:

I still get the build failure with the network crawler. It is supposed to
find two files, but it finds zero.

The files are there. Other similar url() tests pass, so the base directory
is probably correct. Maybe it is a problem with my OS. I am stuck for now.
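
To help rule out the environment independently of the test harness, a
minimal standalone check along these lines may be useful. The class name
and the two file URLs are placeholders (the messages above do not name the
actual test files), so point them at the files the crawler test expects:

    import java.io.InputStream;
    import java.net.URL;

    /** Minimal environment check: can the two test resources be opened
     *  and read at all, outside of the Orekit test harness?
     *  The URLs below are placeholders; edit them to match the two
     *  files the network crawler test uses. */
    public class CrawlerEnvCheck {
        public static void main(String[] args) throws Exception {
            URL[] urls = {
                new URL("file:/path/to/first-test-file"),    // placeholder
                new URL("file:/path/to/second-test-file.gz") // placeholder
            };
            int found = 0;
            for (URL url : urls) {
                try (InputStream in = url.openStream()) {
                    in.read(); // force an actual read, not just a connection
                    found++;
                } catch (Exception e) {
                    System.out.println("cannot open " + url + ": " + e);
                }
            }
            System.out.println("found " + found + " of " + urls.length + " files");
        }
    }

If this reports both files as found, the OS and the URLs themselves are
fine, and the problem is more likely in the test setup.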

If the local() test passes and the compressed() test does not, that is
strange: the two tests are very similar. What bothers me is that I cannot
reproduce the problem.

Here are a few things you could try to identify the problem on your
side:

 1) check that the file names match exactly, including case (see the
    directory-listing sketch after this list)
 2) try to (temporarily!) rename the three '.gz' files to a different
    suffix (say '.gzz'), both in the workspace and in the test source,
    so the automatic decompression layer is not triggered
 3) try to surround the decompression filtering code with print
    statements near line 127 of NetworkCrawler.java:

      System.out.println("before filtering = " + data.getName());
      data = DataProvidersManager.getInstance().applyAllFilters(data);
      System.out.println("after filtering = " + data.getName());

    with these prints, the local() test should show that regular files
    are not filtered at all, whereas the compressed() test should show
    that the names are stripped of the '.gz' suffix (and in fact the
    decompression layer is inserted at the same time)
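
As an illustration of hint 1, a quick case-sensitive listing of a data
directory can expose mismatched names. This is only a generic sketch, not
part of the Orekit tests, and ListDataNames is a made-up name:

    import java.io.File;

    /** Case-sensitive listing of a data directory, to compare the names
     *  on disk with the names the test expects. */
    public class ListDataNames {
        public static void main(String[] args) {
            File[] files = new File(args[0]).listFiles();
            if (files == null) {
                System.out.println("not a readable directory: " + args[0]);
                return;
            }
            for (File file : files) {
                // brackets make stray spaces and case differences visible
                System.out.println("[" + file.getName() + "]");
            }
        }
    }

Running it as 'java ListDataNames /path/to/data' prints each name in
brackets, which also exposes stray leading or trailing spaces.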
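
To make explicit what the prints in hint 3 should reveal, here is a
conceptual sketch of the gzip filtering behaviour. It is not Orekit's
actual filter class (GzipFilterSketch, filteredName, and filteredStream
are hypothetical names), just the name-stripping and stream-wrapping
described above:

    import java.io.IOException;
    import java.io.InputStream;
    import java.util.zip.GZIPInputStream;

    /** Conceptual sketch of the automatic decompression layer: strip the
     *  '.gz' suffix from the name and wrap the raw stream in a
     *  GZIPInputStream. */
    public class GzipFilterSketch {

        /** Name after filtering: 'foo.txt.gz' becomes 'foo.txt'. */
        public static String filteredName(final String name) {
            return name.endsWith(".gz")
                   ? name.substring(0, name.length() - 3)
                   : name;
        }

        /** Stream after filtering: gzip data is transparently decompressed. */
        public static InputStream filteredStream(final String name,
                                                 final InputStream raw)
            throws IOException {
            return name.endsWith(".gz") ? new GZIPInputStream(raw) : raw;
        }
    }

With the prints in place, a healthy compressed() run should therefore show
something like "before filtering = foo.gz" followed by
"after filtering = foo".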

Tell us what you get from these tests.


best regards,
Luc