advice on what to do with ongoing pull request

(yann) #1

Dear All,

I have a couple of commits for which the code is more or less written. It is about a bugfix in image IO palette handling, and adding palette write capabilities for TIFF and PNG. The thing is, I have no idea when I’ll be able to write the proper tests so that the work can be accepted as a pull request.

What should I do in this case? I don’t like seeing these sitting on my computer…

(Matt McCormick) #2

Dear Yann,

Could you discuss in detail, and show the code for, the feature / issue the commits address? We should be able to design a test.


(yann) #3

Sure, shall we discuss it here?

(yann) #4

One of my problems with testing is that if I enable testing on an everyday basis, it takes a very long time to recompile everything each time I modify something. Also, I’m in a very time-consuming project right now…

I made these corrections more than one year ago, so it is not so fresh in my head.

First, I found some bugs while testing how ITK reads metadata on a bunch of TIFF files provided by libtiff.

  • When asked not to expand the palette, some TIFFs with a 16-bit palette (not "directly" supported by the reader) were detected and read as 8-bit, because the piece of code testing the palette type did not run.
  • In one place, setExpandPalette is used instead of GetIsReadAsScalarPlusPalette.
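The first bug above could be sketched like this. This is a stdlib-Python illustration of the idea, not the actual ITK code; the function and type names are hypothetical:

```python
# Hypothetical sketch of the palette-type check described above: even when
# palette expansion is disabled, the reader must still inspect the palette's
# bit depth before fixing the component type.

def read_component_type(bits_per_sample, palette_bits, expand_palette):
    """Return the type an indexed image should be read as (illustrative)."""
    if expand_palette:
        # Expanded output is RGB at the palette's depth.
        return "rgb16" if palette_bits > 8 else "rgb8"
    # Not expanding: the scalar type depends on the palette depth too.
    # The bug was that this check did not run, so images with a 16-bit
    # palette came back as 8-bit.
    if bits_per_sample > 8 or palette_bits > 8:
        return "uint16"
    return "uint8"
```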

Ideally, I think all the files provided by libtiff should be tested for correctness of metadata reading, and also for regressions. But I have no idea how this could be done…

I’ve also implemented writing images with a palette for the TIFF and PNG formats. By writing a palette image, I mean writing a scalar image along with its colormap, instead of converting to RGB, as ITK was doing before.
I tried to apply the same logic as for reading palette images:

  • a boolean WritePalette macro in itkImageIOBase.h
  • a Set macro for the color palette in itkPNGImageIO.h and itkTIFFImageIO.h
  • in itkPNGImageIO.cpp and itkTIFFImageIO.cpp, the logic for allocating the palette and writing it to the file

It is limited to one palette per TIFF file for multipage TIFFs, mostly because I have no clear idea of how to design a proper interface for this. It would be trivial to adapt, though.
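To illustrate the difference between the two write paths, here is a stdlib-Python sketch of the concept (not the actual ITK implementation; all names are illustrative):

```python
# Conceptual sketch: expanding a palette image to RGB (the old behavior)
# versus keeping the scalar indices plus their colormap (the new behavior).

def expand_to_rgb(indices, palette):
    """Old behavior: flatten a palette image to per-pixel RGB tuples."""
    return [palette[i] for i in indices]

def write_as_palette(indices, palette):
    """New behavior: keep the compact scalar data and store the colormap."""
    return {"pixels": list(indices), "colormap": list(palette)}
```

The payoff is the compactness mentioned later in the thread: one index per pixel plus a small colormap, instead of three color samples per pixel.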

(yann) #5

Ok, I made a first round of pull requests for a few small bugs:

I hope it is the right place…

(Matt McCormick) #6

A time-saving approach is to avoid building the entire toolkit and its tests during development. Instead, only build the target module and its regression tests. The target name is <ITKModuleName>-all. For example:

make ITKIOTIFF-all

To only run the tests for a specific module, use the CTest label for that module, e.g.

ctest -L ITKIOTIFF
Make sure to rebase on master in case other improvements have been made to the toolkit.

Tests that examine reading and writing images with the desired characteristics should work well. ImageJ / Fiji has great TIFF support, so the test files could be generated there, or there may be some example tests in their regression data.

Cool! Since this feature is currently not supported by most of the other formats, it is best to add methods on itk::PNGImageIO and itk::TIFFImageIO with consistent names. Then we can evaluate whether it makes sense to expand support into itk::ImageIOBase and itk::ImageFileReader / itk::ImageFileWriter.

(Jon Haitz Legarreta Gorroño) #7

As an addition to what Matt said about building tests and enabling them on an everyday basis: if you have enough disk space, you could keep two builds, one for “everyday” development use with tests disabled, and another with tests enabled for this sort of question.

Less elegant than what Matt proposed, though.

(yann) #8

I did it consistently with the palette reading, which I wrote only for TIFF, PNG, and BMP. I seem to remember that for reading it was necessary to have the variable in the base class, but I’m not sure.

Here I just didn’t bother implementing it for BMP. I guessed that people don’t use this format much.

If you want, I can already push the branch as it is to Gerrit, so you can see what I’m talking about. It would be easier to check, and it’s an easy change anyway, as you said.

By the way, I personally think palette images are great: they make very compact files and make visualization easy, without having to alter the original data too much!

(yann) #9

Thanks for the tips !
I feel I can do that only if I’m not changing the interface; otherwise I may break classes using it, right?

Did that already :wink:

Because libtiff is the library ITK uses to read TIFF, I guessed their test suite was a good start. At the time of my testing, out of the 75 files, around 20 gave an error, including:

caspian 279x220 64-bit floating point(deflate).tif

These are mostly specific bit types. I have a patch that fixes a couple of them for palette recognition.
Another 15 give a warning.

The TIFF class seems not to be well tested for specific formats. Testing all those files would give close to 100% coverage…

(Matt McCormick) #10

If we have good unit test coverage on the module, then we can be pretty confident in the code.

And, when contributing it upstream, there are cross-platform continuous integration builds and nightly builds on many systems, which help discover issues.

Nice. :candy:

(yann) #11

If we test the libtiff images, plus some others I generated, let’s say around 80 images, how would we do this?

Some files are explicitly not supported in ITK (like specific bit types), some give an error but should be supported (the purpose of one incoming patch), some give a warning that someone may want to correct (e.g. EXIFIFDOffset has unsupported data type (18) for meta-data dictionary), etc.

What I did at that time was read all the files, output

  • all metadata as read by ITK
  • warnings
  • errors
    and check by hand that these were correct…

It seems like quite a bit of work to make this work in a clear and flexible way.
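The manual check above could be automated along these lines. This is a stdlib-Python sketch, not part of ITK; the report format and function names are made up for illustration:

```python
# Sketch: dump each file's metadata, warnings and errors into a stable text
# report, review the report once by hand, then diff future runs against that
# reviewed baseline so regressions show up automatically.

def make_report(results):
    """results: dict mapping filename -> dict with 'meta', 'warnings', 'errors'."""
    lines = []
    for name in sorted(results):
        r = results[name]
        lines.append(f"== {name}")
        for key, value in sorted(r["meta"].items()):
            lines.append(f"meta {key} = {value}")
        lines += [f"warning {w}" for w in r["warnings"]]
        lines += [f"error {e}" for e in r["errors"]]
    return "\n".join(lines) + "\n"

def differs_from_baseline(report, baseline):
    """True when the current run no longer matches the reviewed baseline."""
    return report != baseline
```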

I don’t know what you think about it

(Matt McCormick) #12

How about a test that takes in two files:

  1. The tiff file to test.
  2. A plain text file with the metadata expected for the given file.

The test

  1. Reads in the file.
  2. Verifies the metadata.
  3. Writes out the file.

Then the test compares the output file MD5 to what is expected.
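The proposed test could be sketched like this. This is a stdlib-Python illustration of the structure only; the metadata parsing is fake, and a real test would read and write the TIFF through ITK:

```python
# Sketch of the two-input regression test: compare the metadata a reader
# reports against a plain-text expectations file, then check the MD5 of the
# re-written image against a known hash.
import hashlib

def parse_expected(text):
    """Expected-metadata file: one 'key = value' pair per line (illustrative)."""
    pairs = {}
    for line in text.splitlines():
        if "=" in line:
            key, value = line.split("=", 1)
            pairs[key.strip()] = value.strip()
    return pairs

def run_test(metadata, expected_text, written_bytes, expected_md5):
    """Return True when both the metadata and the written output match."""
    if metadata != parse_expected(expected_text):
        return False
    return hashlib.md5(written_bytes).hexdigest() == expected_md5
```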

What do you think?