From: Eric F. <ef...@ha...> - 2011-02-16 19:19:38
|
On 02/16/2011 08:38 AM, Darren Dale wrote:
> On Wed, Feb 16, 2011 at 12:39 PM, Fernando Perez <fpe...@gm...> wrote:
>> are you guys planning on transferring the old bugs to github?
>
> Probably at some point.
>
>> As I mentioned, I have code lying around for the upload (and to download
>> from launchpad, but that's irrelevant here). I'm going to be mostly
>> offline til Monday (conference trip), but if someone pings me on my
>> Berkeley email address, which I monitor even while traveling, I'll be
>> happy to help out.
>
> Thanks, that would be very helpful. I'll follow up once I figure out
> how to extract the information from sourceforge.

Darren,

Just a heads-up on that: in November the tracker was heavily spammed. Recently I marked a few hundred items with the "delete" disposition, but I don't think that actually gets rid of them. If it doesn't, then maybe they can be filtered out during the transfer.

Eric

>> Glad to see everything moving over to github! (since scipy is also
>> about to do the same, as soon as 0.9 is out, for which things are
>> already at the RC stage).
>>
>> A huge thank you to
>
> Hang on, don't jinx it.
>
> Darren
>
> ------------------------------------------------------------------------------
> The ultimate all-in-one performance toolkit: Intel(R) Parallel Studio XE:
> Pinpoint memory and threading errors before they happen.
> Find and fix more than 250 security defects in the development cycle.
> Locate bottlenecks in serial and parallel code that limit performance.
> http://p.sf.net/sfu/intel-dev2devfeb
> _______________________________________________
> Matplotlib-devel mailing list
> Mat...@li...
> https://lists.sourceforge.net/lists/listinfo/matplotlib-devel |
From: John H. <jd...@gm...> - 2011-02-16 18:57:32
|
On Wed, Feb 16, 2011 at 7:00 AM, Darren Dale <dsd...@gm...> wrote:
> John, could you freeze the svn repo around noon on Friday? I'll
> convert the repositories and push them up to github on Saturday. Is it
> possible to close the sourceforge bugtracker, feature requests, etc to
> new issues as well?

I did some poking around and do not see an easy way to freeze the repo. I can disable it, but then I think it won't be available for read access. One thing I could do is remove commit privs for every developer, making the repo read-only going forward. This seems like a reasonable approach.

As for the tracker, similarly, I see how to disable it but not how to freeze it so that no new issues can be added.

Anyone seeing things differently?

JDH |
From: Darren D. <dsd...@gm...> - 2011-02-16 18:38:23
|
On Wed, Feb 16, 2011 at 12:39 PM, Fernando Perez <fpe...@gm...> wrote:
> are you guys planning on transferring the old bugs to github?

Probably at some point.

> As I mentioned, I have code lying around for the upload (and to download
> from launchpad, but that's irrelevant here). I'm going to be mostly
> offline til Monday (conference trip), but if someone pings me on my
> Berkeley email address, which I monitor even while traveling, I'll be
> happy to help out.

Thanks, that would be very helpful. I'll follow up once I figure out how to extract the information from sourceforge.

> Glad to see everything moving over to github! (since scipy is also
> about to do the same, as soon as 0.9 is out, for which things are
> already at the RC stage).
>
> A huge thank you to

Hang on, don't jinx it.

Darren |
From: Maximilian T. <fa...@tr...> - 2011-02-16 18:21:36
|
Hi there,

I just created my first patch for matplotlib; it addresses bug 3176823. I sent the patch along with an example (as suggested in the FAQ). You can see the effect in the example: before the patch, the tz argument is ignored and the marks are set at 23:00. The example dates are at 23:00 UTC, but they should be displayed as 0:00, because the example "lives" in the timezone Europe/Berlin.

Hopefully I got this converter stuff right. In dates.DateConverter I changed:

    def default_units(x, axis):
        'Return the default unit for *x* or None'
   -    return None
   +    return x

and in axis.py I did:

   -def axis_date(self):
   +def axis_date(self, tz):
        """
        Sets up x-axis ticks and labels that treat the x data as dates.
   +    *tz* is the time zone to use in labeling dates.
        """
        import datetime
        # should be enough to inform the unit conversion interface
        # dates are coming in
   -    self.update_units(datetime.date(2009,1,1))
   +    self.update_units(tz.localize(datetime.datetime(2009,1,1)))

so that DateConverter.axisinfo can now know something about timezones:

    def axisinfo(unit, axis):
        'return the unit AxisInfo'
        # make sure that the axis does not start at 0
   -    majloc = AutoDateLocator(tz=unit)
   -    majfmt = AutoDateFormatter(majloc, tz=unit)
   +    tz = None
   +    if getattr(unit, "tzinfo", None):
   +        tz = unit.tzinfo
   +
   +    majloc = AutoDateLocator(tz=tz)
   +    majfmt = AutoDateFormatter(majloc, tz=tz)

Cheers,
Maximilian

PS: Sorry, I'm not that used to writing mails in English. |
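The behavior the patch is after can be sanity-checked with stdlib datetimes alone. This sketch uses a fixed UTC+1 offset as a stand-in for Europe/Berlin (winter time); real code would use pytz (as the patch's tz.localize suggests) or zoneinfo.ZoneInfo("Europe/Berlin"):

```python
from datetime import datetime, timedelta, timezone

# Fixed UTC+1 offset standing in for Europe/Berlin in winter; this is a
# simplification -- a real tz database zone also handles DST transitions.
berlin = timezone(timedelta(hours=1), "Europe/Berlin (approx)")

# A tick at 23:00 UTC should be labeled 0:00 of the next day in Berlin,
# which is exactly what the un-patched code fails to do.
utc_tick = datetime(2011, 2, 15, 23, 0, tzinfo=timezone.utc)
local_tick = utc_tick.astimezone(berlin)

print(local_tick.strftime("%Y-%m-%d %H:%M"))  # → 2011-02-16 00:00
```

The same astimezone() conversion is what a tz-aware locator/formatter pair effectively performs for every tick it labels.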
From: Fernando P. <fpe...@gm...> - 2011-02-16 17:39:49
|
Hey,

On Wed, Feb 16, 2011 at 5:00 AM, Darren Dale <dsd...@gm...> wrote:
>
> John, could you freeze the svn repo around noon on Friday? I'll
> convert the repositories and push them up to github on Saturday. Is it
> possible to close the sourceforge bugtracker, feature requests, etc to
> new issues as well?

Are you guys planning on transferring the old bugs to github? As I mentioned, I have code lying around for the upload (and to download from launchpad, but that's irrelevant here). I'm going to be mostly offline til Monday (conference trip), but if someone pings me on my Berkeley email address, which I monitor even while traveling, I'll be happy to help out.

Glad to see everything moving over to github! (since scipy is also about to do the same, as soon as 0.9 is out, for which things are already at the RC stage).

A huge thank you to Darren for putting so much hard work into this. I admire your attention to detail (and I wish I'd been so thorough when I transitioned ipython, where we could have recovered from some old history problems, but I'm too lazy for that :).

Cheers,
f |
From: Darren D. <dsd...@gm...> - 2011-02-16 13:00:48
|
On Sun, Jan 30, 2011 at 8:10 AM, Darren Dale <dsd...@gm...> wrote:
> On Thu, Jan 27, 2011 at 9:34 PM, Darren Dale <dsd...@gm...> wrote:
>> Hi Folks,
>>
>> I'm planning on freezing the sourceforge svn repository Friday evening
>> at 8:00 (NY time), and moving the git repository to its new home on
>> Saturday morning.
>>
>> If you have concerns, please speak up.
>
> John discovered a problem with some very early project history that
> was lost several years ago during the CVS to Subversion migration. We
> have an opportunity to recover it during the git migration. However,
> due to a recent attack, Sourceforge has taken their CVS service down,
> and based on the latest information at http://sourceforge.net/blog/ ,
> they do not expect it to be back before late this week. I do not think
> I will be available to work on the migration this upcoming weekend, Feb
> 4-6. So it will probably be February 7 or 8 before I have a chance to
> try to recover the old history, convert the repos to git, and post
> them to github.

It looks like the history we were looking for does not exist in the CVS repository either.

John, could you freeze the svn repo around noon on Friday? I'll convert the repositories and push them up to github on Saturday. Is it possible to close the sourceforge bugtracker, feature requests, etc. to new issues as well?

Darren |
From: Benjamin R. <ben...@ou...> - 2011-02-15 19:42:02
|
On Tue, Feb 15, 2011 at 1:17 PM, Eric Firing <ef...@ha...> wrote:
> [...]
> One way to solve the problem would be to start each draw() method with a
> short-circuit return in case there is nothing to draw. It would be needed
> only for classes for which empty self._paths is not a valid test.
>
> (Collection.draw returns nothing, so there is no inconsistency in the
> return value.)
>
> Alternatively, it looks like maybe an empty self._transforms could be used
> in a short-circuit test at the start of Collection.draw.
>
> Eric

That wouldn't completely solve the problem. Affine transforms are being done for get_datalim() as well. The other issue (and I see that I mixed this up myself) is the assumption elsewhere that a non-empty self._path attribute means that there is something to plot. This assumption is made in axes.py on line 1413, and it is invalid.

As for your proposed solution in draw(), I prefer short-circuiting in Collection.draw(). This makes for less work for new Collection subclasses.

Ben Root |
From: Eric F. <ef...@ha...> - 2011-02-15 19:17:21
|
On 02/15/2011 08:50 AM, Benjamin Root wrote:
> [...]
> Ok, some more digging deepens the mystery. While an empty-friendly
> _path.cpp would be nice, it appears that the collections and axes
> objects are already doing all they can to avoid doing transforms for
> empty collections.
>
> However, it appears that the supposedly empty collection object from
> scatter([], []) is not really empty. Its self._paths member contains a
> list of unit_circle() from Path. This is also the case for
> EllipseCollection. Meanwhile, LineCollection and PatchCollection
> initialize their self._paths in accordance with their given data.

One way to solve the problem would be to start each draw() method with a short-circuit return in case there is nothing to draw. It would be needed only for classes for which empty self._paths is not a valid test. So for CircleCollection it would be:

    @allow_rasterization
    def draw(self, renderer):
        # sizes is the area of the circle circumscribing the polygon
        # in points^2
        if len(self._sizes) == 0:
            return
        self._transforms = [
            transforms.Affine2D().scale(
                (np.sqrt(x) * self.figure.dpi / 72.0) / np.sqrt(np.pi))
            for x in self._sizes]
        return Collection.draw(self, renderer)

(Collection.draw returns nothing, so there is no inconsistency in the return value.)

Alternatively, it looks like maybe an empty self._transforms could be used in a short-circuit test at the start of Collection.draw.

Eric |
From: Benjamin R. <ben...@ou...> - 2011-02-15 18:50:48
|
On Tue, Feb 15, 2011 at 12:19 PM, Benjamin Root <ben...@ou...> wrote:
> On Tue, Feb 15, 2011 at 11:54 AM, Eric Firing <ef...@ha...> wrote:
>> On 02/15/2011 07:40 AM, Benjamin Root wrote:
>> [...]
>>
>> I agree; a call with empty data should simply not plot anything.
>>
>> Eric
>
> Digging further, it appears that the problem is in _path.cpp for
> _path_module::affine_transform(), which explicitly checks for an empty
> vertices array and throws an exception if it is empty.
>
> So, do we want to make _path.cpp empty-friendly, or should we just make
> empty collections objects avoid doing anything that requires an affine
> transform?
>
> Ben Root

Ok, some more digging deepens the mystery. While an empty-friendly _path.cpp would be nice, it appears that the collections and axes objects are already doing all they can to avoid doing transforms for empty collections.

However, it appears that the supposedly empty collection object from scatter([], []) is not really empty. Its self._paths member contains a list of unit_circle() from Path. This is also the case for EllipseCollection. Meanwhile, LineCollection and PatchCollection initialize their self._paths in accordance with their given data.

Ben Root |
From: Benjamin R. <ben...@ou...> - 2011-02-15 18:20:00
|
On Tue, Feb 15, 2011 at 11:54 AM, Eric Firing <ef...@ha...> wrote:
> On 02/15/2011 07:40 AM, Benjamin Root wrote:
> [...]
>
> I agree; a call with empty data should simply not plot anything.
>
> Eric

Digging further, it appears that the problem is in _path.cpp for _path_module::affine_transform(), which explicitly checks for an empty vertices array and throws an exception if it is empty.

So, do we want to make _path.cpp empty-friendly, or should we just make empty collections objects avoid doing anything that requires an affine transform?

Ben Root |
From: Eric F. <ef...@ha...> - 2011-02-15 17:54:50
|
On 02/15/2011 07:40 AM, Benjamin Root wrote:
> I have come across a little inconsistency that was unexpected in the
> matplotlib API. The following is perfectly valid:
>
>     import matplotlib.pyplot as plt
>     plt.plot([], [])
>     plt.show()
>
> However, this is not valid:
>
>     import matplotlib.pyplot as plt
>     plt.scatter([], [])
>     plt.show()
>
> The immediate issue that comes up in scatter is that it attempts to find
> min/max of the input data for the purpose of autoscaling (this can
> probably be done better by just using set_xmargin(0.05) and
> set_ymargin(0.05)). This can easily be bypassed with an if statement.
> However, then we discover that polygon collections do not like having
> empty offsets, which leads to a failure in the affine transformation.
>
> So, the question is, is this a bug or a feature? I personally believe
> that empty data is a perfectly valid scenario and given that other
> matplotlib functions handle it gracefully, we should make the
> collections object more friendly to empty data.

I agree; a call with empty data should simply not plot anything.

Eric |
From: Ben G. <bga...@gm...> - 2011-02-15 17:52:07
|
On Tue, 15 Feb 2011 11:40:01 -0600, Benjamin Root <ben...@ou...> wrote:
> So, the question is, is this a bug or a feature? I personally believe that
> empty data is a perfectly valid scenario and given that other matplotlib
> functions handle it gracefully, we should make the collections object more
> friendly to empty data.

I agree that this is certainly a bug. For instance, I have a short script which reads a stream of data from stdin and plots in realtime. Unfortunately, this bug made it prohibitively difficult to add support for scatter plots. I'd definitely appreciate it if this were fixed.

Cheers,
- Ben |
From: Eric F. <ef...@ha...> - 2011-02-15 17:46:08
|
On 02/12/2011 12:11 PM, Michael Albert wrote:
> Greetings!
>
> First, my personal thanks to you good folks who make
> a wonderful tool like matplotlib available.
>
> I am currently trying to build matplotlib-1.0.1 against
> libpng 1.5.1, and _png.cpp failed to compile. Apparently,
> libpng's info_ptr is now opaque, so the code required
> multiple changes of this nature:
> [...]

Mike,

A quick check indicates that the png_get* functions are available way back in libpng 1.2.x, so I suspect we can support everything we need with a single modern version. Does anyone know of a reason we need to support libpng prior to 1.2.x?

Would you attach your complete diff, please? Thanks.

Eric |
From: Benjamin R. <ben...@ou...> - 2011-02-15 17:40:32
|
I have come across a little inconsistency that was unexpected in the matplotlib API. The following is perfectly valid:

    import matplotlib.pyplot as plt
    plt.plot([], [])
    plt.show()

However, this is not valid:

    import matplotlib.pyplot as plt
    plt.scatter([], [])
    plt.show()

The immediate issue that comes up in scatter is that it attempts to find the min/max of the input data for the purpose of autoscaling (this can probably be done better by just using set_xmargin(0.05) and set_ymargin(0.05)). This can easily be bypassed with an if statement. However, then we discover that polygon collections do not like having empty offsets, which leads to a failure in the affine transformation.

So, the question is: is this a bug or a feature? I personally believe that empty data is a perfectly valid scenario, and given that other matplotlib functions handle it gracefully, we should make the collections objects more friendly to empty data.

Ben Root |
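The autoscaling failure described above is easy to see without matplotlib: min() and max() raise on empty sequences, so the autoscale step needs an explicit guard. A minimal sketch of such a guard (autoscale_limits is a hypothetical helper for illustration, not a matplotlib function):

```python
def autoscale_limits(data, fallback=(0.0, 1.0)):
    """Return (lo, hi) axis limits for data, or a fallback range when empty."""
    if len(data) == 0:
        # This is the guard scatter() is missing: empty input gets a
        # default range instead of crashing inside min()/max().
        return fallback
    return (min(data), max(data))


print(autoscale_limits([]))         # → (0.0, 1.0)
print(autoscale_limits([3, 1, 2]))  # → (1, 3)
```

As the thread notes, a guard like this only moves the failure: the polygon collection still has to tolerate empty offsets downstream.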
From: Benjamin R. <ben...@ou...> - 2011-02-15 01:33:07
|
On Mon, Feb 14, 2011 at 6:56 PM, Gökhan Sever <gok...@gm...> wrote:
> [...]
> ERROR: Failure: IOError ([Errno 2] No such file or directory:
> '/home/gsever/Desktop/python-repo/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.png')
> [...]
> ImageComparisonFailure: images not close:
> /home/gsever/Desktop/result_images/test_image/image_interps_pdf.png vs.
> /home/gsever/Desktop/result_images/test_image/expected-image_interps_pdf.png
> (RMS 281.963)
>
> Ran 152 tests in 71.340s
>
> FAILED (KNOWNFAIL=46, errors=2)

I reported that first error a few months ago, and nobody has commented on it. I hope somebody knows what image that is supposed to be.

The second error has popped up several times before, and we don't seem to address it properly. The difference between the expected and the resulting images has a definite structure to it, suggesting that something changed. Either there is something wrong with the original image, or there is something wrong with the current image. If we decide that the current image is correct, and that there was something wrong with the previous image, then we should update the image in the test suite.

Ben Root |
From: Gökhan S. <gok...@gm...> - 2011-02-15 00:56:21
|
python
Python 2.7 (r27:82500, Sep 16 2010, 18:02:00)
[GCC 4.5.1 20100907 (Red Hat 4.5.1-3)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import matplotlib
>>> matplotlib.test()
Warning: divide by zero encountered in log
Warning: divide by zero encountered in log
Warning: divide by zero encountered in log
/home/gsever/Desktop/python-repo/matplotlib/lib/matplotlib/axes.py:2389: UserWarning: Attempting to set identical left==right results
in singular transformations; automatically expanding.
left=730139.0, right=730139.0
  + 'left=%s, right=%s') % (left, right))
======================================================================
ERROR: Failure: IOError ([Errno 2] No such file or directory: '/home/gsever/Desktop/python-repo/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.png')
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/nose-1.0.0-py2.7.egg/nose/loader.py", line 231, in generate
    for test in g():
  File "/home/gsever/Desktop/python-repo/matplotlib/lib/matplotlib/testing/decorators.py", line 91, in compare_images_generator
    shutil.copyfile(src,dst)
  File "/usr/lib64/python2.7/shutil.py", line 81, in copyfile
    with open(src, 'rb') as fsrc:
IOError: [Errno 2] No such file or directory: '/home/gsever/Desktop/python-repo/matplotlib/lib/matplotlib/tests/baseline_images/test_axes/canonical.png'

======================================================================
ERROR: make the basic nearest, bilinear and bicubic interps
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python2.7/site-packages/nose-1.0.0-py2.7.egg/nose/case.py", line 187, in runTest
    self.test(*self.arg)
  File "/home/gsever/Desktop/python-repo/matplotlib/lib/matplotlib/testing/decorators.py", line 32, in failer
    result = f(*args, **kwargs)
  File "/home/gsever/Desktop/python-repo/matplotlib/lib/matplotlib/testing/decorators.py", line 126, in decorated_compare_images
    '(RMS %(rms).3f)'%err)
ImageComparisonFailure: images not close: /home/gsever/Desktop/result_images/test_image/image_interps_pdf.png vs. /home/gsever/Desktop/result_images/test_image/expected-image_interps_pdf.png (RMS 281.963)

----------------------------------------------------------------------
Ran 152 tests in 71.340s

FAILED (KNOWNFAIL=46, errors=2)
False

--
Gökhan |
From: Jae-Joon L. <lee...@gm...> - 2011-02-13 00:43:45
On Sun, Feb 13, 2011 at 6:08 AM, Benjamin Root <ben...@ou...> wrote:
> In particular, the following just produces an empty box:

This has been fixed on my side, and the fix will be included when I commit.

> Another use-case that doesn't seem addressed yet (and isn't quite right in
> regular mpl either) is legends for stemplots.

So far, I have only implemented legends for "bars" and "errorbars". For
other plots, such as stemplot, while I'll try to implement some of them
eventually, contributions from others will be needed.

Regards,

-JJ
From: Michael A. <m_a...@ya...> - 2011-02-12 22:11:51
Greetings!

First, my personal thanks to you good folks who make a wonderful tool like
matplotlib available.

I am currently trying to build matplotlib-1.0.1 against libpng-1.5.1, and
_png.cpp failed to compile. Apparently, libpng's info_ptr is now opaque, so
the code required multiple changes of this nature:

-- _png.cpp.orig	2011-02-12 16:42:42.000000000 -0500
***************
*** 350,362 ****
      png_set_sig_bytes(png_ptr, 8);
      png_read_info(png_ptr, info_ptr);

!     /*png_uint_32 width = info_ptr->width;*/
!     /*png_uint_32 height = info_ptr->height;*/
!     png_uint_32 width = png_get_image_width( png_ptr, info_ptr );
!     png_uint_32 height = png_get_image_height( png_ptr, info_ptr );

!     /*int bit_depth = info_ptr->bit_depth;*/
!     int bit_depth = png_get_bit_depth( png_ptr, info_ptr );

      // Unpack 1, 2, and 4-bit images
      if (bit_depth < 8)
--- 350,359 ----
      png_set_sig_bytes(png_ptr, 8);
      png_read_info(png_ptr, info_ptr);

!     png_uint_32 width = info_ptr->width;
!     png_uint_32 height = info_ptr->height;

!     int bit_depth = info_ptr->bit_depth;

      // Unpack 1, 2, and 4-bit images
      if (bit_depth < 8)
***************

Sorry to be sending problems :-). I suspect you have probably noticed this
already, but just in case I figured I'd send a heads-up.

Thanks!

Sincerely,
Mike Albert
From: Benjamin R. <ben...@ou...> - 2011-02-12 21:08:32
On Sat, Jan 22, 2011 at 10:18 AM, Benjamin Root <ben...@ou...> wrote:
> On Tue, Jan 18, 2011 at 12:27 AM, Jae-Joon Lee <lee...@gm...> wrote:
>
>> Dear Matplotlib developers,
>>
>> Attached is a patch to improve the functionality of legend.
>> The two biggest changes are as follows:
>>
>> * Drawing of legend is delegated to "legend handlers".
>> * Introduces a new "Container" class. This is primarily to support
>>   legends for complex plots (e.g., bar, errorbar, etc.).
>>
>> The first change is to ease the creation of customized legends. See
>> "legend_demo_custom_handler.py" for an example.
>> The second change is to support legends for complex plots. Axes
>> instances now have a "containers" attribute, and this is only intended
>> to be used for generating a legend. For example, "bar" plots create a
>> series of Rectangle patches. Previously, it returned a list of these
>> patches. With the current change, it creates a container object of
>> these rectangle patches and returns it instead. As the container class
>> is derived from tuple, it should be backward-compatible.
>> Furthermore, the container object is added to the Axes.containers
>> attribute, and the legend command uses this "containers" attribute to
>> properly create a legend for the bar plot.
>>
>> Two example figures are attached.
>>
>> As this patch introduces relatively significant changes, I wanted to
>> get some comments from others before I commit.
>> The change will be divided into four commits.
>>
>> Regards,
>>
>> -JJ
>>
>
> Nice. I will look through it this week and see if I can break it.
>
> Ben Root

Jae-Joon,

I finally got around to doing some testing with your refactor of legend. I
find the concepts behind the refactor interesting; however, the current
implementation doesn't seem to work properly in some basic use-cases.
In particular, the following just produces an empty box:

import matplotlib.pyplot as plt

x = [1, 2, 3, 4, 5]
y1 = [1, 2, 3, 4, 5]
y2 = [5, 4, 3, 2, 1]

plt.plot(x, y1, 'rx-')
plt.plot(x, y2, 'bx-')
plt.legend(('a', 'b'))
plt.show()

However, it does work if I use the label= kwarg in the plot commands.

Another use-case that doesn't seem addressed yet (and isn't quite right in
regular mpl either) is legends for stemplots.

I haven't tried out the new features yet, as I am mostly concerned about
backwards-compatibility right now.

Ben Root
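For reference, the working variant Ben mentions, attaching labels via the label= kwarg so legend() can find them on its own, can be sketched like this (assumes matplotlib is installed; the Agg backend is used so no display is required):

```python
import matplotlib
matplotlib.use("Agg")  # non-GUI backend; safe on headless machines
import matplotlib.pyplot as plt

x = [1, 2, 3, 4, 5]
y1 = [1, 2, 3, 4, 5]
y2 = [5, 4, 3, 2, 1]

# Labels given at plot time are picked up automatically by legend().
plt.plot(x, y1, 'rx-', label='a')
plt.plot(x, y2, 'bx-', label='b')
leg = plt.legend()
print([t.get_text() for t in leg.get_texts()])  # → ['a', 'b']
```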
From: Nicholas D. <mis...@gm...> - 2011-02-12 03:07:02
As suggested on
http://matplotlib.sourceforge.net/faq/howto_faq.html#submit-a-patch, I have
submitted this to the patch tracker and am following it up, with a tracker
link! The patch tracker entry is:

https://sourceforge.net/tracker/?func=detail&aid=3178834&group_id=80706&atid=560722

Apologies if I am sticking rigidly (and annoyingly) to this process. (I have
also been observing the git transition, so I appreciate this probably isn't
the best time to integrate random patches.)

On Mon, Feb 7, 2011 at 1:23 AM, Nicholas Devenish <mis...@gm...> wrote:
> One of the things that bugs me about axes.hist is that with
> histtype='step' the automatic legend style is an empty box, instead of
> a line, as I would expect. This behaviour doesn't seem to make sense,
> because it seems a line would be much more appropriate for this case.
> An example is attached for the code:
>
> import matplotlib.pyplot as plt
> plt.hist([0,1,1,2,2,2], [0,1,2,3], histtype='step', label="histtype='step'")
> plt.legend()
> plt.show()
>
> I can get around this by using proxy Line2D objects in building the
> legend, but as this is an extremely common operation for me (the
> common style of histograms in my field is equivalent to step), this
> becomes messy and annoying. I'd rather not have to roll my own
> histogram-drawing function, as it would be almost entirely duplicating
> the axes.hist code, and I don't know another way to get what I want. I
> understand that the way of setting legend styles is something that has
> been looked at recently, but I don't know the timescale for any changes.
>
> The cause of this is the fact that in axes.py::7799 (current SVN
> head), in axes.hist, patch objects are always created, even for the
> line-based step style. I searched the tracker briefly, and couldn't
> find this mentioned before. I therefore have a few questions:
>
> - Is this intended behaviour that I am just not understanding the
>   requirement for?
> - Would changing this to create (and thus return) Line2D's instead of
>   Patches for this histtype be a horrible breaking of the return
>   interface?
>
> I've attached a patch that makes the simple change of swapping out the
> call to .fill for .plot (only the function is changed here, not yet
> the documentation), and it appears to work, but I haven't tested
> exhaustively.
>
> Thoughts?
>
> Nick
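The proxy-artist workaround Nick mentions can be sketched as follows: a Line2D that is never added to the axes is handed to legend() in place of the step histogram's patch, so the legend entry is drawn as a line instead of an empty box. (Assumes matplotlib is installed; the data and color are illustrative only.)

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; no GUI needed
import matplotlib.pyplot as plt
from matplotlib.lines import Line2D

fig = plt.figure()
ax = fig.add_subplot(111)
ax.hist([0, 1, 1, 2, 2, 2], [0, 1, 2, 3], histtype='step')

# Proxy artist: never drawn on the axes, used only for the legend entry.
proxy = Line2D([], [], color='b')
leg = ax.legend([proxy], ["histtype='step'"])
print(leg.get_texts()[0].get_text())
```

This works with any artist type, which is why proxy artists are the usual escape hatch when the automatic legend handler for a plot type is unsatisfactory; the cost is exactly the repetition Nick describes.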
From: Darren D. <dsd...@gm...> - 2011-02-11 18:53:54
On Thu, Feb 10, 2011 at 5:54 PM, Pauli Virtanen <pa...@ik...> wrote:
> On Thu, 10 Feb 2011 17:34:32 -0500, Darren Dale wrote:
> [clip]
>> Unfortunately, I am getting exactly the same results: the matplotlib/
>> directory is missing in the earliest history. I've tried adding
>> --use-cvs and --keep-trivial-imports, to no avail. I've tried checking
>> out a working copy of the cvs repo (setting CVSROOT to point to the
>> directory I created using rsync), and I *thought* the right way to
>> inspect the r7 working directory is to do "cvs update -R -r 7", but
>> that's not right. So I'm currently having trouble determining whether
>> the history even exists in CVS. Anybody have a longer memory than I do?
>> How can I get cvs to perform this basic operation?
>
> Maybe you can try skipping SVN altogether (needs the "git-cvs" package
> on Ubuntu):
>
> export CVSROOT=/rsynced/directory
> test -d "$CVSROOT/CVSROOT" || echo "Wrong cvsroot..."
> mkdir imported
> cd imported
> git cvsimport matplotlib
>
> This at least shows some files in the first revisions. You can probably
> then just graft the two histories together at a suitable point.
>
> Apparently, it also needs some use of "git filter-branch" to get rid of
> the top-level matplotlib/ directory.

On further inspection, the direct cvs-to-git conversion *also* yields a
repository lacking the matplotlib package directory. It looks like the
history leading up to revision 540 may have been lost from the CVS
repository itself, not during the cvs2svn conversion.

John, do you want some time to continue looking into the cvs repo yourself?
Or should we go ahead with the git migration? If the latter, should we
start the git repo at revision 540, or include all available history, even
though some of it is missing the matplotlib package directory?

If we want to go ahead with the git migration, I can probably work on it
this weekend.

Darren
From: Mark S. <sie...@st...> - 2011-02-11 17:49:37
>> We can't put python-matplotlib in main because of *its* dependencies.
>
> As a digression, I think the python-matplotlib dependencies could be
> significantly reduced. For a number of use cases (this is one of them,
> but there are others), you don't need any GUI backend. Independent of
> this issue, it would be great to be able to install python-matplotlib
> in a headless server environment without pulling in all of those GUI
> bits. Looking at the list of the hard dependencies, I don't understand
> why half of them are there.
>
> http://packages.ubuntu.com/natty/python/python-matplotlib

This web page lists several "dependencies" that are optional. Just flipping
through the list, I can see several packages that I know I do not have, and
several more that I have never heard of. "Never heard of" could mean that
it is in my linux distro and I don't know it, but I am certain that I have
machines that do not have cairo or gtk+ or qt.

A matplotlib application selects one of the available backends to draw the
graphics. If I remember correctly, _all_ backends are optional in
matplotlib, but there is at least one ("agg") that is available everywhere.

When you install matplotlib, it builds support for any backends that it
can. A backend that depends on a missing library does not get a C extension
built, BUT the python code is still installed. The only way to know that a
backend is not supported in this installation is to try to use it. For
example:

>>> import matplotlib.backends.backend_qt
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/stsci/pyssgdev/2.7/matplotlib/backends/backend_qt.py", line 19, in <module>
    raise ImportError("Qt backend requires pyqt to be installed.")
ImportError: Qt backend requires pyqt to be installed.
>>>

Ok - I don't have qt on this machine.

So, you can try this: Get a build machine that has all the packages
required by the various backends. Build a binary distribution of
matplotlib. Install it on a machine that has only some of the libraries
required by the backends. The result is a copy of matplotlib with _some_
working backends and some that fail because of missing libraries. As you
install other supporting packages, additional backends can start working
because their shared library is now present.

So, if I install matplotlib and pyqt is not there, I get a working
matplotlib without Qt support. If I use a package installer to install
pyqt, presumably it will also install the Qt libraries, resulting in a
matplotlib that does have Qt support.

I'm not saying you want to do this, but it is an option. If you want to
experiment in this direction, there is a list that breaks out requirements
for the core and requirements for each of the backends at
http://matplotlib.sourceforge.net/users/installing.html .

Mark S.
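Mark's point, that the only way to know whether a backend is usable is to try importing it, can be illustrated with a small standard-library sketch. The module names probed here are stand-ins (stdlib modules and a deliberately missing name), not matplotlib's real backend modules:

```python
import importlib

def available(names):
    """Return the subset of module names that import without error."""
    found = []
    for name in names:
        try:
            importlib.import_module(name)
            found.append(name)
        except ImportError:
            # The toolkit (or its shared library) is absent; the backend
            # would fail in exactly this way when first used.
            pass
    return found

# Stand-ins: a real check would probe names like "PyQt4" or "gtk".
print(available(["json", "no_such_toolkit", "sqlite3"]))  # → ['json', 'sqlite3']
```

This mirrors matplotlib's behavior as Mark describes it: all backend Python modules are installed, and a missing toolkit only surfaces as an ImportError at first use.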
From: Barry W. <ba...@py...> - 2011-02-11 16:14:53
On Feb 11, 2011, at 05:18 PM, Jouni K. Seppänen wrote:

> [Crossposting to matplotlib devel list]
>
> Robert Kern <rob...@gm...> writes:
>
>> On Thu, Feb 10, 2011 at 11:22, Barry Warsaw <ba...@py...> wrote:
>>
>>> Here's the problem: for Ubuntu, we've had to disable the building of
>>> the numpy documentation package, because its dependencies violate
>>> Ubuntu policy. Numpy is in our "main" archive but the documentation
>>> depends on python-matplotlib, which lives in our "universe"
>>> archive. Such cross archive dependencies break the build.
>>>
>>> We can't put python-matplotlib in main because of *its* dependencies.
>>
>> As a digression, I think the python-matplotlib dependencies could be
>> significantly reduced. For a number of use cases (this is one of them,
>> but there are others), you don't need any GUI backend. Independent of
>> this issue, it would be great to be able to install python-matplotlib
>> in a headless server environment without pulling in all of those GUI
>> bits. Looking at the list of the hard dependencies, I don't understand
>> why half of them are there.
>>
>> http://packages.ubuntu.com/natty/python/python-matplotlib
>
> Would it make sense to split out each interactive backend to its own
> Ubuntu package, e.g. python-matplotlib-tk, etc? Each of these would
> depend on the relevant toolkit packages, and python-matplotlib would
> have a much shorter list of dependencies.

Note that the main/universe distinction happens at the source package
level, so we'd have to have separate source packages, ideally with
different upstream tarballs. We could finesse that with two different
source packages using the same upstream tarball (as suggested in a previous
follow-up), but I think it would be more difficult to get into Debian, thus
precipitating a divergence.

Cheers,
-Barry
From: Benjamin R. <ben...@ou...> - 2011-02-11 15:59:45
On Friday, February 11, 2011, Benjamin Root <ben...@ou...> wrote:
> On Friday, February 11, 2011, Jouni K. Seppänen <jk...@ik...> wrote:
>> [Crossposting to matplotlib devel list]
>>
>> Robert Kern <rob...@gm...> writes:
>>
>>> On Thu, Feb 10, 2011 at 11:22, Barry Warsaw <ba...@py...> wrote:
>>>
>>>> Here's the problem: for Ubuntu, we've had to disable the building of
>>>> the numpy documentation package, because its dependencies violate
>>>> Ubuntu policy. Numpy is in our "main" archive but the documentation
>>>> depends on python-matplotlib, which lives in our "universe"
>>>> archive. Such cross archive dependencies break the build.
>>>>
>>>> We can't put python-matplotlib in main because of *its* dependencies.
>>>
>>> As a digression, I think the python-matplotlib dependencies could be
>>> significantly reduced. For a number of use cases (this is one of them,
>>> but there are others), you don't need any GUI backend. Independent of
>>> this issue, it would be great to be able to install python-matplotlib
>>> in a headless server environment without pulling in all of those GUI
>>> bits. Looking at the list of the hard dependencies, I don't understand
>>> why half of them are there.
>>>
>>> http://packages.ubuntu.com/natty/python/python-matplotlib
>>
>> Would it make sense to split out each interactive backend to its own
>> Ubuntu package, e.g. python-matplotlib-tk, etc? Each of these would
>> depend on the relevant toolkit packages, and python-matplotlib would
>> have a much shorter list of dependencies.
>>
>> --
>> Jouni K. Seppänen
>> http://www.iki.fi/jks
>
> There are a lot of advantages to this idea, and I wonder if it might
> make distributions easier and allow fuller control by the user. In
> particular, kubuntu could default to using the qt backend while
> regular ubuntu could use gym.

stupid autocorrect... Gym -> Gtk

Ben Root

> However, how practical is this to implement? What does it require
> from us upstream?
>
> Ben Root
From: Benjamin R. <ben...@ou...> - 2011-02-11 15:56:49
On Friday, February 11, 2011, Jouni K. Seppänen <jk...@ik...> wrote:
> [Crossposting to matplotlib devel list]
>
> Robert Kern <rob...@gm...> writes:
>
>> On Thu, Feb 10, 2011 at 11:22, Barry Warsaw <ba...@py...> wrote:
>>
>>> Here's the problem: for Ubuntu, we've had to disable the building of
>>> the numpy documentation package, because its dependencies violate
>>> Ubuntu policy. Numpy is in our "main" archive but the documentation
>>> depends on python-matplotlib, which lives in our "universe"
>>> archive. Such cross archive dependencies break the build.
>>>
>>> We can't put python-matplotlib in main because of *its* dependencies.
>>
>> As a digression, I think the python-matplotlib dependencies could be
>> significantly reduced. For a number of use cases (this is one of them,
>> but there are others), you don't need any GUI backend. Independent of
>> this issue, it would be great to be able to install python-matplotlib
>> in a headless server environment without pulling in all of those GUI
>> bits. Looking at the list of the hard dependencies, I don't understand
>> why half of them are there.
>>
>> http://packages.ubuntu.com/natty/python/python-matplotlib
>
> Would it make sense to split out each interactive backend to its own
> Ubuntu package, e.g. python-matplotlib-tk, etc? Each of these would
> depend on the relevant toolkit packages, and python-matplotlib would
> have a much shorter list of dependencies.
>
> --
> Jouni K. Seppänen
> http://www.iki.fi/jks
>
> _______________________________________________
> NumPy-Discussion mailing list
> Num...@sc...
> http://mail.scipy.org/mailman/listinfo/numpy-discussion

There are a lot of advantages to this idea, and I wonder if it might make
distributions easier and allow fuller control by the user. In particular,
kubuntu could default to using the qt backend while regular ubuntu could
use gym.

However, how practical is this to implement? What does it require from us
upstream?

Ben Root