|
From: Benjamin R. <ben...@ou...> - 2011-01-21 21:54:43
|
2011/1/17 Eoghan Harrington <eo...@gm...>

> Hi,
>
> I noticed some erroneous behaviour when using a
> LinearSegmentedColormap with an "under" color and different numbers of
> color levels. The attached script replicates the behaviour, whereby
> lowering the number of colors causes fewer of the values to be
> considered "under" the vmin. I tracked the problem back to the
> Colormap class, where the results of Normalize are multiplied by the
> number of color levels (N) and cast to int to be used as indices
> into the color array. The expected behaviour is that all negative
> values should be considered "under"; however, the cast
> means that anything between -0.5 and 0 will be set to 0 and therefore
> will fall in the normal color range for the colormap. The attached patch
> overcomes this by setting all negative values to -1 before applying
> the cast.
>
> Thanks for your help,
> Eoghan

Thanks for catching this one. Blindly casting to int is wrong (it mishandles values between -1 and 0, not just between -0.5 and 0, since the cast truncates toward zero). This has been committed to v1_0_maint as r8931 and to the development trunk as r8932.

Ben Root
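[For readers of the archive: a minimal NumPy sketch of the mechanism being discussed. This is not matplotlib's actual Colormap code; the array names and the level count N=10 are made up for illustration. It shows why truncating negative normalized values to int leaks them into the normal color range, and how clamping negatives to -1 before the cast, as in the patch, fixes that.]

```python
import numpy as np

N = 10  # hypothetical number of color levels
# Normalized data; negative entries are "under" vmin and should
# all map to the special under-range index.
xa = np.array([-0.25, -0.05, 0.0, 0.5, 0.95])

# Buggy approach: astype(int) truncates toward zero, so any value
# in (-1, 0) becomes index 0 and picks up a normal colormap color.
buggy = (xa * N).astype(int)

# Patched approach: clamp all negatives to -1 before casting, so
# every under-range value ends up strictly below index 0.
clamped = xa.copy()
clamped[clamped < 0] = -1
fixed = (clamped * N).astype(int)

print(buggy)  # [-2  0  0  5  9]  -- -0.05 wrongly lands on index 0
print(fixed)  # [-10 -10   0   5   9] -- all negatives stay below 0
```

Note that in the buggy version only -0.25 survives as a negative index; -0.05 is silently absorbed into the colormap proper, which matches the behaviour Eoghan observed when lowering the number of color levels.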