From: Eric F. <ef...@ha...> - 2009-01-27 18:45:52

John Hunter wrote:
> On Mon, Jan 26, 2009 at 6:02 PM, Jae-Joon Lee <lee...@gm...> wrote:
>> Michael,
>>
>> It seems that the gtk backend in the current svn silently ignores ALL
>> exceptions raised during the drawing.
>>
>> http://matplotlib.svn.sourceforge.net/viewvc/matplotlib/trunk/matplotlib/lib/matplotlib/backends/backend_gtk.py?r1=6696&r2=6793
>>
>> Is this necessary? I don't think we want to do this.
>
> No, it is a bug. Catching blanket exceptions and ignoring them is
> never OK -- we need to add a section to the coding guide to this
> effect. If absolutely necessary, one can catch blanket exceptions and
> log them, e.g. using cbook.exception_to_str, but they must be reported.
> Michael has already fixed this (perhaps it was some detritus left in
> from a debugging session?) and I'll make a note in the developer docs
> coding guide.

John,

Not quite "always": I think that for something like cbook.is_string_like
we actually *do* want to silently catch all exceptions. The problem is
that if you know nothing about the type of object that you might have to
deal with, you have no way of knowing what exception it might raise. We
ran into an example of this recently.

Eric

> ------------------------------------------------------------------------------
> This SF.net email is sponsored by:
> SourceForge Community
> SourceForge wants to tell your story.
> http://p.sf.net/sfu/sf-spreadtheword
> _______________________________________________
> Matplotlib-devel mailing list
> Mat...@li...
> https://lists.sourceforge.net/lists/listinfo/matplotlib-devel
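Eric's point about duck typing can be made concrete. A minimal sketch, modeled on (not copied from) cbook.is_string_like: because the probed object can be of any type, the probe operation may raise any exception type, so a blanket except is the only safe net.

```python
def is_string_like(obj):
    """Return True if obj behaves like a string.

    Sketch modeled on matplotlib's cbook.is_string_like (the real
    implementation may differ): the probed object may be of any type,
    so the probe can raise *any* exception -- the blanket except is
    deliberate here.
    """
    try:
        obj + ''  # strings support concatenation with ''
    except Exception:
        return False
    return True


class Odd(object):
    """An object whose __add__ raises an unexpected exception type,
    illustrating why catching only TypeError would not be enough."""
    def __add__(self, other):
        raise KeyError("surprise -- not a TypeError")
```

A `try/except TypeError` version would let `Odd()` propagate a `KeyError` out of the check, which is exactly the failure mode Eric describes.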
From: John H. <jd...@gm...> - 2009-01-27 13:43:31

On Tue, Jan 27, 2009 at 12:28 AM, Eric Firing <ef...@ha...> wrote:
> Martin Spacek wrote:
>> Hi,
>>
>> I just updated my checkout to rev 6829, and it seems
>> lines.Line2D.set_pickradius has been renamed to setpickradius. Is this
>> a typo? get_pickradius still exists. This is on line 318 in lines.py.
>> Renaming it back to set_pickradius seems to make it work the way it
>> used to.
>
> John, you made the change on Dec. 10:
>
> http://currents.soest.hawaii.edu/hg/hgwebdir.cgi/matplotlib_mirror/diff/0a8f5203a8fd/matplotlib/lib/matplotlib/lines.py
>
> Looks accidental to me.

Yep, it must have been a bug because it doesn't follow the naming
conventions for the set_property funcs. Fixed on branch and trunk.
Thanks for tracking me down as the culprit :-)

JDH
From: John H. <jd...@gm...> - 2009-01-27 13:41:56

On Mon, Jan 26, 2009 at 6:02 PM, Jae-Joon Lee <lee...@gm...> wrote:
> Michael,
>
> It seems that the gtk backend in the current svn silently ignores ALL
> exceptions raised during the drawing.
>
> http://matplotlib.svn.sourceforge.net/viewvc/matplotlib/trunk/matplotlib/lib/matplotlib/backends/backend_gtk.py?r1=6696&r2=6793
>
> Is this necessary? I don't think we want to do this.

No, it is a bug. Catching blanket exceptions and ignoring them is never
OK -- we need to add a section to the coding guide to this effect. If
absolutely necessary, one can catch blanket exceptions and log them,
e.g. using cbook.exception_to_str, but they must be reported. Michael
has already fixed this (perhaps it was some detritus left in from a
debugging session?) and I'll make a note in the developer docs coding
guide.
From: Michael D. <md...@st...> - 2009-01-27 13:28:27

Sorry. That was a mistake to commit it -- I did this while I was trying
to track down a segfault. I will revert it.

Mike

Jae-Joon Lee wrote:
> Michael,
>
> It seems that the gtk backend in the current svn silently ignores ALL
> exceptions raised during the drawing.
>
> http://matplotlib.svn.sourceforge.net/viewvc/matplotlib/trunk/matplotlib/lib/matplotlib/backends/backend_gtk.py?r1=6696&r2=6793
>
> Is this necessary? I don't think we want to do this.
>
> Regards,
>
> -JJ
From: Eric F. <ef...@ha...> - 2009-01-27 06:28:49

Martin Spacek wrote:
> Hi,
>
> I just updated my checkout to rev 6829, and it seems
> lines.Line2D.set_pickradius has been renamed to setpickradius. Is this
> a typo? get_pickradius still exists. This is on line 318 in lines.py.
> Renaming it back to set_pickradius seems to make it work the way it
> used to.

John, you made the change on Dec. 10:

http://currents.soest.hawaii.edu/hg/hgwebdir.cgi/matplotlib_mirror/diff/0a8f5203a8fd/matplotlib/lib/matplotlib/lines.py

Looks accidental to me.

Eric
From: Martin S. <sc...@ms...> - 2009-01-27 03:36:17

Hi,

I just updated my checkout to rev 6829, and it seems
lines.Line2D.set_pickradius has been renamed to setpickradius. Is this a
typo? get_pickradius still exists. This is on line 318 in lines.py.
Renaming it back to set_pickradius seems to make it work the way it used
to.

Cheers,

Martin
From: Jae-Joon L. <lee...@gm...> - 2009-01-27 00:02:10

Michael,

It seems that the gtk backend in the current svn silently ignores ALL
exceptions raised during the drawing.

http://matplotlib.svn.sourceforge.net/viewvc/matplotlib/trunk/matplotlib/lib/matplotlib/backends/backend_gtk.py?r1=6696&r2=6793

Is this necessary? I don't think we want to do this.

Regards,

-JJ
From: Michael D. <md...@st...> - 2009-01-26 18:46:21

Okay -- I think I've at least narrowed it down to a cause.

Agg uses fixed-point arithmetic to render at the low level -- by default
it uses 24.8 (i.e. 24 integer bits and 8 fractional bits). Therefore, it
can only handle pixel coordinates in the range -2^23 to 2^23. Both of
the provided examples, after the data has been scaled, draw outside of
this range, which results in integer overflow, hence things going in the
wrong direction etc. We could change the fixed point in agg_basics.h,
but I hesitate to do so, as it's at the expense of fine detail. We could
possibly move to 64 bits, but I'm not sure how easy that would be, or
what the impact might be on 32-bit platforms.

    //----------------------------------------------------poly_subpixel_scale_e
    // These constants determine the subpixel accuracy, to be more precise,
    // the number of bits of the fractional part of the coordinates.
    // The possible coordinate capacity in bits can be calculated by formula:
    // sizeof(int) * 8 - poly_subpixel_shift, i.e., for 32-bit integers and
    // 8-bit fractional part the capacity is 24 bits.
    enum poly_subpixel_scale_e
    {
        poly_subpixel_shift = 8,                        //----poly_subpixel_shift
        poly_subpixel_scale = 1 << poly_subpixel_shift, //----poly_subpixel_scale
        poly_subpixel_mask  = poly_subpixel_scale - 1,  //----poly_subpixel_mask
    };

One thing I will look into is whether the line simplification algorithm
can be extended to actually clip the lines when they go outside of the
image range. At the moment, it does some work to reduce the number of
points outside of the image, but it always draws at least one point
outside at its original location. It looks like Agg has some of the
pieces necessary to do this -- whether it's feasible to integrate that
into our existing line simplification algorithm remains to be seen.

Mike

Michael Droettboom wrote:
> Thanks for this.
>
> I believe both of these examples illustrate a shortcoming in Agg when
> the distance between two points on either end of a line is too great.
>
> I'll do some digging around and see what may be causing this and if
> any limits can be adjusted -- I may not get to this today, however.
>
> Mike
>
> João Luís Silva wrote:
>> Jan Müller wrote:
>>> Hi,
>>>
>>> The simple code snippet at the end of this mail should plot a single
>>> line.
>>
>> I can confirm this bug on Ubuntu running matplotlib svn revision 6827.
>> However I think it doesn't have to do with the log-scale but with the
>> big variations on the x-scale and the custom xscale. I've reproduced a
>> similar behavior with the following script (pan and zoom to see the
>> buggy behavior).
>> --------------------
>> import numpy as np
>> import matplotlib.pyplot as plt
>>
>> x = np.array([1.0, 2.0, 3.0, 1.0E5, 2.0E5])
>> y = np.arange(len(x))
>> plt.plot(x, y)
>> plt.xlim(xmin=2, xmax=6)
>> plt.show()
>> --------------------
>>
>> Best Regards,
>> João Silva

--
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA
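Mike's overflow explanation can be sketched numerically. The following is an illustrative Python model (not Agg's actual C++ code) of 24.8 fixed point stored in a signed 32-bit integer: any pixel coordinate beyond about 2^23 wraps around to a negative value, which is why the stray lines head off in the wrong direction.

```python
POLY_SUBPIXEL_SHIFT = 8  # 24.8 fixed point, as in agg_basics.h

def to_fixed(pixels, shift=POLY_SUBPIXEL_SHIFT):
    """Convert a pixel coordinate to 24.8 fixed-point representation."""
    return int(round(pixels * (1 << shift)))

def wrap_int32(v):
    """Model storing v in a signed 32-bit integer (C-style wraparound)."""
    v &= 0xFFFFFFFF
    return v - (1 << 32) if v >= (1 << 31) else v

# A pixel coordinate well inside +/-2**23 survives the round trip:
safe = wrap_int32(to_fixed(1000.5))           # 256128, still positive
# A scaled coordinate far beyond 2**23 pixels wraps negative -- the
# "things going in the wrong direction" Mike describes:
overflowed = wrap_int32(to_fixed(9_000_000))  # negative after wraparound
```

With `poly_subpixel_shift = 8`, the usable coordinate capacity is `32 - 8 = 24` bits, matching the -2^23 to 2^23 range quoted in the message.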
From: Michael D. <md...@st...> - 2009-01-26 15:33:29

Thanks for this.

I believe both of these examples illustrate a shortcoming in Agg when
the distance between two points on either end of a line is too great.

I'll do some digging around and see what may be causing this and if any
limits can be adjusted -- I may not get to this today, however.

Mike

João Luís Silva wrote:
> Jan Müller wrote:
>> Hi,
>>
>> The simple code snippet at the end of this mail should plot a single
>> line.
>
> I can confirm this bug on Ubuntu running matplotlib svn revision 6827.
> However I think it doesn't have to do with the log-scale but with the
> big variations on the x-scale and the custom xscale. I've reproduced a
> similar behavior with the following script (pan and zoom to see the
> buggy behavior).
> --------------------
> import numpy as np
> import matplotlib.pyplot as plt
>
> x = np.array([1.0, 2.0, 3.0, 1.0E5, 2.0E5])
> y = np.arange(len(x))
> plt.plot(x, y)
> plt.xlim(xmin=2, xmax=6)
> plt.show()
> --------------------
>
> Best Regards,
> João Silva

--
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA
From: João L. S. <js...@fc...> - 2009-01-26 15:05:08

Jan Müller wrote:
> Hi,
>
> The simple code snippet at the end of this mail should plot a single
> line.

I can confirm this bug on Ubuntu running matplotlib svn revision 6827.
However I think it doesn't have to do with the log-scale but with the
big variations on the x-scale and the custom xscale. I've reproduced a
similar behavior with the following script (pan and zoom to see the
buggy behavior).

--------------------
import numpy as np
import matplotlib.pyplot as plt

x = np.array([1.0, 2.0, 3.0, 1.0E5, 2.0E5])
y = np.arange(len(x))
plt.plot(x, y)
plt.xlim(xmin=2, xmax=6)
plt.show()
--------------------

Best Regards,
João Silva
From: Jan M. <mu...@im...> - 2009-01-26 13:58:41

Hi,

The simple code snippet at the end of this mail should plot a single
line. Unfortunately, depending on

- the backend,
- the window size,
- and the pan/zoom position inside the plot,

one or more additional lines appear. Under Windows it looks like this:

http://img217.imageshack.us/my.php?image=matplotlibproblemei6.png
(disable Adblock Plus on Firefox, otherwise the image might not be
visible)

When I pan the plot, the additional lines jump around randomly and
sometimes dis- and reappear at their own will. I get this problem for
all the Agg-based backends (one additional line) and the GTK backend
(several additional lines). GTKCairo does not show this behavior.

The problem appears under an up-to-date Fedora 64-bit with matplotlib
0.98.3 and 0.98.6 SVN (from today). The output of "python setup.py" is
this:

------------------
BUILDING MATPLOTLIB
    matplotlib: 0.98.6svn
        python: 2.5.2 (r252:60911, Sep 30 2008, 15:42:03)
                [GCC 4.3.2 20080917 (Red Hat 4.3.2-4)]
      platform: linux2

REQUIRED DEPENDENCIES
         numpy: 1.2.0
     freetype2: 9.18.3

OPTIONAL BACKEND DEPENDENCIES
        libpng: 1.2.33
       Tkinter: no
      wxPython: 2.8.9.1
                * WxAgg extension not required for wxPython >= 2.8
          Gtk+: gtk+: 2.14.5, glib: 2.18.3, pygtk: 2.13.0,
                pygobject: 2.15.4
           Qt4: Qt: 4.4.3, PyQt4: 4.4.4
         Cairo: 1.4.12

OPTIONAL DATE/TIMEZONE DEPENDENCIES
      datetime: present, version unknown
      dateutil: 1.4
          pytz: 2008i

OPTIONAL USETEX DEPENDENCIES
        dvipng: no
   ghostscript: 8.63
         latex: no
------------------

Additionally, I see this problem under 32-bit Windows XP using the
Enthought EPD Py25 v4.1.30101 distribution, which uses matplotlib
0.98.3.

To reproduce the error:

- switch to GTKAgg
- run the script below
- enlarge or maximize the window (something larger than 1280*1024
  should be fine)
- play around with zoom/pan

Expected result: a single line is plotted.
Actual result: two or more lines are plotted.

If you need more information just contact me,

Jan

-----------------------------------------
import numpy as np
import matplotlib.pyplot as plt

E = np.array((
    1.00000000e+00, 1.50000000e+00, 2.00000000e+00, 3.00000000e+00, 4.00000000e+00,
    5.00000000e+00, 6.00000000e+00, 8.00000000e+00, 1.00000000e+01, 1.50000000e+01,
    2.00000000e+01, 3.00000000e+01, 4.00000000e+01, 5.00000000e+01, 6.00000000e+01,
    8.00000000e+01, 1.00000000e+02, 1.50000000e+02, 2.00000000e+02, 3.00000000e+02,
    4.00000000e+02, 5.00000000e+02, 6.00000000e+02, 8.00000000e+02, 1.00000000e+03,
    1.02200000e+03, 1.25000000e+03, 1.50000000e+03, 2.00000000e+03, 2.04400000e+03,
    3.00000000e+03, 4.00000000e+03, 5.00000000e+03, 6.00000000e+03, 7.00000000e+03,
    8.00000000e+03, 9.00000000e+03, 1.00000000e+04, 1.10000000e+04, 1.20000000e+04,
    1.30000000e+04, 1.40000000e+04, 1.50000000e+04, 1.60000000e+04, 1.80000000e+04,
    2.00000000e+04, 2.20000000e+04, 2.40000000e+04, 2.60000000e+04, 2.80000000e+04,
    3.00000000e+04, 4.00000000e+04, 5.00000000e+04, 6.00000000e+04, 8.00000000e+04,
    1.00000000e+05, 1.50000000e+05, 2.00000000e+05, 3.00000000e+05, 4.00000000e+05,
    5.00000000e+05, 6.00000000e+05, 8.00000000e+05, 1.00000000e+06, 1.50000000e+06,
    2.00000000e+06, 3.00000000e+06, 4.00000000e+06, 5.00000000e+06, 6.00000000e+06,
    8.00000000e+06, 1.00000000e+07, 1.50000000e+07, 2.00000000e+07, 3.00000000e+07,
    4.00000000e+07, 5.00000000e+07, 6.00000000e+07, 8.00000000e+07, 1.00000000e+08))

att = np.array((
    6.81740051e+00, 1.75185086e+00, 6.63815247e-01, 1.67656668e-01, 6.29160626e-02,
    2.93190047e-02, 1.56961535e-02, 5.86499592e-03, 2.72337524e-03, 6.73972636e-04,
    2.49871770e-04, 6.16613263e-05, 2.28421754e-05, 1.05816094e-05, 5.64930077e-06,
    2.10496950e-06, 9.82279267e-07, 2.49513274e-07, 9.62561983e-08, 2.63733618e-08,
    1.11014287e-08, 5.93131769e-09, 3.67936480e-09, 1.86298464e-09, 1.17168470e-09,
    1.12328773e-09, 7.79131487e-10, 5.81480647e-10, 3.70505702e-10, 3.58735080e-10,
    2.10556699e-10, 1.46027404e-10, 1.11492282e-10, 9.01020155e-11, 7.55231748e-11,
    6.50072897e-11, 5.70367268e-11, 5.08048699e-11, 4.57978746e-11, 4.16871195e-11,
    3.82455571e-11, 3.53357639e-11, 3.28322663e-11, 3.06573900e-11, 2.70724292e-11,
    2.42403101e-11, 2.19399603e-11, 2.00399310e-11, 1.84446235e-11, 1.70823384e-11,
    1.59052762e-11, 1.18363457e-11, 9.42247205e-12, 7.82716448e-12, 5.84766861e-12,
    4.66702151e-12, 3.10158861e-12, 2.32245712e-12, 1.54571561e-12, 1.15853984e-12,
    9.26114881e-13, 7.71364072e-13, 5.78493179e-13, 4.62698944e-13, 3.08366381e-13,
    2.31229973e-13, 1.54153316e-13, 1.15555237e-13, 9.24919894e-14, 7.70766578e-14,
    5.77835936e-14, 4.62280699e-14, 3.08187133e-14, 2.31110475e-14, 1.54093566e-14,
    1.15555237e-14, 9.24322401e-15, 7.70169085e-15, 5.77776187e-15, 4.62220950e-15))

plt.figure()
plt.plot(E, att)
plt.yscale("log")
plt.xscale("linear")
plt.xlim(xmin=np.log(20), xmax=np.log(500))
plt.ylim(ymin=-18, ymax=5)
plt.show()
From: Evans, J. R <jam...@jp...> - 2009-01-23 23:27:22

All,

While looking over the polar plot code I came across the following
issue: when plotting something like

    polar([2*pi/180, 358*pi/180], [2.0, 1.0])

the plotted line will actually wrap around the origin of the plot before
reaching its destination. Initially I thought that this was correct
behavior: the line numerically passes through all angles between 2 and
358 degrees in a linear fashion. However, after consulting several
colleagues and textbooks, I believe that the behavior is actually wrong.

It is my understanding that for polar plots there is no linear mapping
of the axes as it is currently implemented. Rather, for a simple
two-point line defined in polar coordinates, the line should essentially
take the direct route. This is highlighted by the two-point equation of
a line in polar coordinates:

    r = (r1*r2*sin(t2-t1)) / ((r1*sin(t-t1)) - (r2*sin(t-t2)))

If you plug in the two points given above, increment theta (t) from 2
degrees to 358 degrees, convert to Cartesian coordinates, and plot the
results, you get the correct line that directly crosses the zero-degree
line, not one that wraps around the origin.

Is the polar plot function implemented this way on purpose? Which way
should it really be implemented?

Thanks,

--James Evans
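James's two-point equation is easy to check numerically. A sketch (the helper name is ours, not matplotlib API) that samples the direct chord between two polar points and converts to Cartesian; for the points (r=1, t=0) and (r=1, t=90 deg) the chord is the line x + y = 1, and every sampled point satisfies it.

```python
import math

def polar_chord(r1, t1, r2, t2, n=50):
    """Sample the straight segment between two polar points (r1, t1) and
    (r2, t2) using the two-point polar line equation from the thread:

        r = r1*r2*sin(t2-t1) / (r1*sin(t-t1) - r2*sin(t-t2))

    Returns a list of Cartesian (x, y) pairs. Angles are in radians."""
    num = r1 * r2 * math.sin(t2 - t1)
    pts = []
    for i in range(n + 1):
        t = t1 + (t2 - t1) * i / n
        den = r1 * math.sin(t - t1) - r2 * math.sin(t - t2)
        r = num / den
        pts.append((r * math.cos(t), r * math.sin(t)))
    return pts
```

At the endpoints the formula reduces to r = r1 and r = r2 respectively (the denominator becomes r2*sin(t2-t1), then r1*sin(t2-t1)), so the chord really does pass through both input points.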
From: Дмитрий Л. <dmi...@gm...> - 2009-01-23 19:36:36

Hello,

I couldn't manage to find how to search the mailing list archive on
SourceForge, so sorry if this has already been discussed/developed.

There are many backends for matplotlib, and the Debian/Ubuntu packages
depend on at least one of the interactive ones [1]. But matplotlibrc as
installed says "backend: TkAgg". This often results in errors upon
importing pylab after installation, saying "Please install python-tk".

Has there been any work done on providing an auto backend -- one that
would select the best backend automagically?

I've poked into the code a little bit. My understanding is to hook into
pylab_setup() (lib/matplotlib/backends/__init__.py): if the backend is
"auto", run a script which returns the name of the best backend, which
is then used. I think I can manage to write something like that, and I
would be glad to collaborate.

[1] One of the dependencies is:
python-tk | python-gtk2 | python-wxgtk2.8 | python-qt3 | python-qt4

ps. cloned the github repo, now updating, takes ages =D Maybe github
needs to be updated with more revisions? =D

--
With regards,
Dmitrijs Ledkovs.
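The proposed "auto" backend boils down to try-importing candidate backends in preference order and taking the first that succeeds. A generic sketch of that selection logic (this is illustrative, not matplotlib's actual pylab_setup() code):

```python
import importlib

def pick_first_importable(candidates):
    """Return the first module name in candidates that imports cleanly,
    or None if none do -- the core of an 'auto' backend selector."""
    for name in candidates:
        try:
            importlib.import_module(name)
        except ImportError:
            continue
        return name
    return None

# An 'auto' backend hook in pylab_setup() could try the GUI backend
# modules in preference order (module names below are placeholders for
# the real matplotlib.backends.backend_* modules) and fall back to a
# non-interactive backend such as Agg when none of them import.
```

Because the probe is just an import attempt, a missing python-tk would silently skip TkAgg instead of raising at pylab import time, which is exactly the failure mode described above.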
From: Eric F. <ef...@ha...> - 2009-01-22 21:27:01

Ryan May wrote:
> Hi,
>
> Does anyone know why set_aspect('equal', 'box') isn't accepted on
> shared axes? I'm trying to make the following type of scenario work:

Ryan,

Mark Bakker keeps asking about this, too. I have never been able to
figure out an algorithm, consistent with the present mpl architecture,
that would cleanly handle shared axes with fixed aspect ratio when the
box is adjustable. If you can specify exactly what should happen under
all such conditions that might arise -- including single or dual shared
axes, and fixed aspect ratio on one or more of the plots with shared
axes, and any kind of zooming or panning -- then we should be able to
come up with the logic to implement it.

Maybe the problem is not that there is no such specification or
algorithm, but that it requires a global solution, involving figuring
out the dimensions of all the boxes at the same time, whereas with
adjustable datalim the calculation can be done on any subplot
individually -- hence it can be in Axes.draw, as at present. I suspect
this is the crux of it: fixed aspect ratio, shared axes, adjustable box
would require a level of code that does not exist at present -- a method
invoked at the Figure.draw() level that would calculate and freeze all
of the axes positions in one whack before any of them are drawn.
Actually, even this is not right, because IIRC axes can be shared across
figures, so this calculation would need to be done at the highest level,
before the Figure.draw() method.

If we go this route -- which sounds like going to full-fledged
sizer/pack type algorithms -- we need to be sure it does not slow down
interactive responsiveness, or burden us with bugs and unmaintainable
code. Sometimes it is worthwhile to accept some limitations and keep
things simple.

Note that the present implementation of shared axes, unlike an earlier
implementation, has no notion of master and slaves; all are equivalent,
and can be calculated and drawn in any order.

Eric

> import numpy as np
> from matplotlib.pyplot import figure, show
>
> fig1 = figure()
> fig2 = figure()
>
> ax1 = fig1.add_subplot(1, 1, 1)
> ax1.set_aspect('equal', 'datalim')
>
> ax2 = fig2.add_subplot(1, 2, 1, sharex=ax1, sharey=ax1)
> ax2.set_aspect('equal', 'datalim')
> ax3 = fig2.add_subplot(1, 2, 2, sharex=ax2, sharey=ax2)
>
> data = np.random.rand(50, 50)
> ax1.pcolormesh(data)
> ax2.pcolormesh(data)
> ax3.pcolormesh(data)
>
> show()
>
> Basically, I have multiple figures with multiple subplots, all of
> which should be displaying the same range. However, the different
> figures have different numbers of subplots. The example above doesn't
> work, because once you zoom into one of the figures, it iteratively
> zooms out, adjusting data limits until both figures have their aspect
> ratio properly set again. I thought using 'box' might alleviate the
> problem, but that's throwing an exception.
>
> I realize making the figures have the same layout would solve the
> problem, I just wasn't sure if there was another way.
>
> Ryan
From: Ryan M. <rm...@gm...> - 2009-01-22 20:40:36

Hi,

Does anyone know why set_aspect('equal', 'box') isn't accepted on shared
axes? I'm trying to make the following type of scenario work:

import numpy as np
from matplotlib.pyplot import figure, show

fig1 = figure()
fig2 = figure()

ax1 = fig1.add_subplot(1, 1, 1)
ax1.set_aspect('equal', 'datalim')

ax2 = fig2.add_subplot(1, 2, 1, sharex=ax1, sharey=ax1)
ax2.set_aspect('equal', 'datalim')
ax3 = fig2.add_subplot(1, 2, 2, sharex=ax2, sharey=ax2)

data = np.random.rand(50, 50)
ax1.pcolormesh(data)
ax2.pcolormesh(data)
ax3.pcolormesh(data)

show()

Basically, I have multiple figures with multiple subplots, all of which
should be displaying the same range. However, the different figures have
different numbers of subplots. The example above doesn't work, because
once you zoom into one of the figures, it iteratively zooms out,
adjusting data limits until both figures have their aspect ratio
properly set again. I thought using 'box' might alleviate the problem,
but that's throwing an exception.

I realize making the figures have the same layout would solve the
problem, I just wasn't sure if there was another way.

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
From: a <the...@gm...> - 2009-01-21 21:20:17

Michael Droettboom <mdroe@...> writes:
> Thanks for the pointers.
>
> The original simplification code was written by John Hunter (I
> believe), and I don't know if it was designed by him also or is a
> replication of something published elsewhere. So I take no credit for
> and have little knowledge of its original goals.

I'm not sure on everything it does, but it seems to do clipping and
removes line segments where the change in slope is less than some limit.
There are probably better algorithms out there, but this one works
surprisingly well and is fast and simple. I think it should be a
requirement that it returns points which are a subset of the original
points -- with the change you've made it does this, right?

> However, IMHO the primary purpose of the path simplification in
> matplotlib is to improve interactive performance (and smaller file
> size is just a convenient side effect of that). I would hesitate to
> use an algorithm that is any worse than O(n), since it must be
> recalculated on every pan or zoom because the simplification is
> related to *pixels*, not data units. Even on modern hardware, it is a
> constant battle keeping the inner drawing loop fast enough. We could,
> of course, make the choice of algorithm user-configurable, or use
> something more precise when using a non-interactive backend, but then
> we would have two separate code paths to keep in sync and bug free ---
> not a choice I take lightly.

I see your point. I originally encountered a problem when preparing a
pdf figure -- I had a lot of high-resolution data, and with path
simplification the resulting pdf looked pretty bad (the lines were
jagged). But the advantage was a massive reduction in the file size of
the pdf. I adjusted perpdNorm2 and got much better results.

> The trick with the present algorithm is to keep the error rate at the
> subpixel level through the correct selection of perpdNorm. It seems to
> me that the more advanced simplification algorithm is only necessary
> when you want to simplify more aggressively than the pixel level. But
> what hasn't been done is a proper study of the error rate along the
> simplified path of the current approach vs. other possible approaches.
> Even this latest change was verified by just looking at the results,
> which seemingly are better on the data I looked at. So I'm mostly
> speaking from my gut rather than evidence here.
>
>> #src/agg_py_path_iterator.h
>>
>> //if the perp vector is less than some number of (squared)
>> //pixels in size, then merge the current vector
>> if (perpdNorm2 < (1.0 / 9.0))
>
> That sounds like a good idea. I'll have a look at doing that.

Right, perhaps the best thing to do is make the tolerance parameter
adjustable, so it can be reduced to speed up drawing in the interactive
backends, but it can also be easily bumped up for extra resolution in
the non-interactive backends like pdf/ps.

a
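For reference, the Ramer-Douglas-Peucker algorithm discussed in this thread fits in a few lines of Python. A sketch: it keeps only points from the original path and guarantees the simplified path deviates from the input by at most eps, but its average O(n log n) (worst-case O(n^2)) cost is exactly the concern raised about recomputing on every pan or zoom.

```python
import math

def _perp_dist(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    if dx == 0 and dy == 0:
        return math.hypot(p[0] - a[0], p[1] - a[1])
    # |cross((b - a), (p - a))| / |b - a|
    return abs(dx * (p[1] - a[1]) - dy * (p[0] - a[0])) / math.hypot(dx, dy)

def rdp(points, eps):
    """Ramer-Douglas-Peucker: recursively keep the point farthest from
    the chord between the endpoints while it deviates more than eps."""
    if len(points) < 3:
        return list(points)
    a, b = points[0], points[-1]
    dists = [_perp_dist(p, a, b) for p in points[1:-1]]
    k = max(range(len(dists)), key=dists.__getitem__)
    if dists[k] <= eps:
        return [a, b]  # the whole run is within tolerance of the chord
    k += 1             # convert to an index into `points`
    return rdp(points[:k + 1], eps)[:-1] + rdp(points[k:], eps)
```

Unlike the slope-change heuristic, eps here is a hard bound on the deviation in whatever units the points are expressed in, which is why it appeals for the publication-quality (pdf/ps) case.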
From: James E. <jre...@ea...> - 2009-01-21 19:38:59

All,

I have a suggestion for the units.ConversionInterface class. It seems to
me that each method of the ConversionInterface should be passed another
parameter: the Axis instance that is requesting the conversion/info.
Doing so would allow any custom conversion code to make checks on the
Axis -- whether it is X or Y -- or even make checks on the Axes the Axis
is attached to. Additionally, this would allow converters to apply any
necessary changes before they are really used (such as DateConverter
making sure that the axis bounds are >= 1).

Comments/suggestions? If it is okay, I would like to apply the necessary
changes.

--James Evans
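A sketch of what the proposed signature change enables (the class and attribute names below are hypothetical stand-ins, not matplotlib's actual API): once the requesting Axis is passed in, a converter can behave differently depending on which axis asked.

```python
class DummyAxis:
    """Stand-in for a matplotlib Axis; only a name attribute matters here.
    (Attribute name is illustrative, not the real Axis API.)"""
    def __init__(self, name):
        self.axis_name = name

class DegreesConverter:
    """Hypothetical converter: maps degrees to radians, but only when the
    x-axis requests the conversion -- the kind of per-axis check the
    extra parameter makes possible."""
    @staticmethod
    def convert(value, unit, axis):
        import math
        if axis is not None and axis.axis_name == "x":
            return math.radians(value)
        return value
```

Without the axis parameter, `convert(value, unit)` has no way to distinguish the two cases, which is the limitation the proposal addresses.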
From: Michael D. <md...@st...> - 2009-01-21 19:09:39

a wrote:
> Michael Droettboom <mdroe@...> writes:
>> I've checked this change into SVN so others can test it out.
>>
>> Assuming we don't discover any cases where this is clearly inferior,
>> it should make it into the next major release.
>>
>> Mike
>
> Hi,
>
> This change looks good -- it has the advantage of choosing points that
> actually lie on the curve, which is better visually, and would seem to
> be a better solution for publication quality plots.
>
> The method for simplifying the paths is quite simple and effective,
> but a bit crude -- there are other algorithms you might look into for
> simplifying lines:
>
> http://en.wikipedia.org/wiki/Ramer-Douglas-Peucker_algorithm
>
> This one is fairly simple to implement and has the advantage that you
> have some control over the errors -- the deviation from your
> simplified path and the actual path.

Thanks for the pointers.

The original simplification code was written by John Hunter (I believe),
and I don't know if it was designed by him also or is a replication of
something published elsewhere. So I take no credit for and have little
knowledge of its original goals.

However, IMHO the primary purpose of the path simplification in
matplotlib is to improve interactive performance (and smaller file size
is just a convenient side effect of that). I would hesitate to use an
algorithm that is any worse than O(n), since it must be recalculated on
every pan or zoom because the simplification is related to *pixels*, not
data units. Even on modern hardware, it is a constant battle keeping the
inner drawing loop fast enough. We could, of course, make the choice of
algorithm user-configurable, or use something more precise when using a
non-interactive backend, but then we would have two separate code paths
to keep in sync and bug free --- not a choice I take lightly.

The trick with the present algorithm is to keep the error rate at the
subpixel level through the correct selection of perpdNorm. It seems to
me that the more advanced simplification algorithm is only necessary
when you want to simplify more aggressively than the pixel level. But
what hasn't been done is a proper study of the error rate along the
simplified path of the current approach vs. other possible approaches.
Even this latest change was verified by just looking at the results,
which seemingly are better on the data I looked at. So I'm mostly
speaking from my gut rather than evidence here.

> Also, you might consider making the path simplification tolerance
> (perpdNorm2) an adjustable parameter in the matplotlibrc file:
>
> #src/agg_py_path_iterator.h
>
> //if the perp vector is less than some number of (squared)
> //pixels in size, then merge the current vector
> if (perpdNorm2 < (1.0 / 9.0))

That sounds like a good idea. I'll have a look at doing that.

Mike

--
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA
From: a <the...@gm...> - 2009-01-21 18:35:15
|
Michael Droettboom <mdroe@...> writes:
> I've checked this change into SVN so others can test it out.
>
> Assuming we don't discover any cases where this is clearly inferior, it
> should make it into the next major release.
>
> Mike

Hi,

This change looks good: it has the advantage of choosing points that actually lie on the curve, which is better visually, and would seem to be a better solution for publication-quality plots.

The method for simplifying the paths is quite simple and effective, but a bit crude. There are other algorithms you might look into for simplifying lines:

http://en.wikipedia.org/wiki/Ramer-Douglas-Peucker_algorithm

This one is fairly simple to implement and has the advantage that you have some control over the error: the deviation of your simplified path from the actual path.

Also, you might consider making the path simplification tolerance (perpdNorm2) an adjustable parameter in the matplotlibrc file:

#src/agg_py_path_iterator.h

//if the perp vector is less than some number of (squared)
//pixels in size, then merge the current vector
if (perpdNorm2 < (1.0 / 9.0))

kind regards,
a |
From: Michael D. <md...@st...> - 2009-01-21 14:53:16
|
I've checked this change into SVN so others can test it out.

Assuming we don't discover any cases where this is clearly inferior, it should make it into the next major release.

Mike

Andrew Hawryluk wrote:
>> -----Original Message-----
>> From: Michael Droettboom [mailto:md...@st...]
>> Sent: 16 Jan 2009 1:31 PM
>> To: Andrew Hawryluk
>> Cc: mat...@li...
>> Subject: Re: [matplotlib-devel] path simplification can decrease the
>> smoothness of data plots
>>
>> Michael Droettboom wrote:
> ...
>> I've attached a patch that will only include points from the original
>> data in the simplified path. I hesitate to commit it to SVN, as these
>> things are very hard to get right -- and just because it appears to
>> work better on this data doesn't mean it doesn't create a regression on
>> something else... ;) That said, it would be nice to confirm that this
>> solution works, because it has the added benefit of being a little
>> simpler computationally. Be sure to blitz your build directory when
>> testing the patch -- distutils won't pick it up as a dependency.
>>
>> I've attached two PDFs -- one with the original (current trunk)
>> behavior, and one with the new behavior. I plotted the unsimplified
>> plot in thick blue behind the simplified plot in green, so you can see
>> how much deviation there is between the original data and the
>> simplified line (you'll want to zoom way in with your PDF viewer to see
>> it.)
>>
>> I've also included a new version of your test script which detects "new"
>> data values in the simplified path, and also seeds the random number
>> generator so that results are comparable. I also set the
>> solid_joinstyle to "round", as it makes the wiggliness less pronounced.
>> (There was another thread on this list recently about making that the
>> default setting).
>>
>> Cheers,
>> Mike
>>
>> --
>> Michael Droettboom
>> Science Software Branch
>> Operations and Engineering Division
>> Space Telescope Science Institute
>> Operated by AURA for NASA
>
> Thanks for looking into this! The new plot is much improved, and the
> simplified calculations are a pleasant surprise. I was also testing the
> previous algorithm with solid_joinstyle set to "round" as it is the
> default in my matplotlibrc.
>
> I am probably not able to build your patch here, unless building
> matplotlib from source on Windows is easier than I anticipate. May I
> send you some data off the list for you to test?
>
> Regards,
> Andrew
>
> NOVA Chemicals Research & Technology Centre
> Calgary, Canada

--
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA |
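The "detects 'new' data values" check Mike describes can be sketched generically. This is an illustrative stand-in, not the actual test script from the thread: it simply verifies that every vertex the simplifier emits was present in the original data, i.e. that nothing was interpolated.

```python
def new_points(original, simplified):
    """Return the vertices of a simplified path that do not appear
    in the original data, i.e. points the simplifier invented
    rather than selected."""
    seen = set(original)
    return [p for p in simplified if p not in seen]
```

With the patched simplifier, which only includes points from the original data, this list should come back empty.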
From: Michael D. <md...@st...> - 2009-01-20 14:17:32
|
I'm not able to see this here (RHEL 4, matplotlib SVN trunk) with either the PDF or TkAgg backends. The attached graphs show that memory usage levels off after the first 3-4 iterations. I only did 10 iterations here to save bandwidth, but I also tested 500 and the results were the same. As I'm not on a Mac, I'm unable to test the Cocoa backend.

What tool are you using to detect the memory leaks? Many tools that work at a high level (such as the Windows Task Manager) are full of pitfalls, so in my experience you really need a tool designed for detecting memory leaks, such as Valgrind/Massif or the Apple one whose name escapes me.

I modified Damon's script to use random data (since his data was not attached). If this script does not leak memory for either of you, then perhaps the leak is data-dependent, in which case I'll either need to obtain or better simulate that data.

Mike

Michiel de Hoon wrote:
> I am also finding the continuing increase in memory usage, but this also occurs with other backends (I tried tkagg and pdf) and also without the call to savefig. One possibility is a circular reference in the quiver function that prevents data from being cleaned up.
>
> --Michiel
>
> --- On Mon, 1/19/09, Damon McDougall <dam...@gm...> wrote:
>
>> From: Damon McDougall <dam...@gm...>
>> Subject: [matplotlib-devel] Memory leak using savefig with MacOSX backend?
>> To: "matplotlib development list" <mat...@li...>
>> Date: Monday, January 19, 2009, 6:09 AM
>>
>> I'm looping over several files (about 1000) to produce a vector field
>> plot for each data file I have. Doing this with the MacOSX backend
>> appears to chew memory. My guess as to the source of the problem is the
>> 'savefig' function (or possibly the way the MacOSX backend handles the
>> saving of plots).
>>
>> I opened Activity Monitor to watch the memory usage increase. Below is
>> code that recreates the problem.
>> [start]
>>
>> import matplotlib
>> matplotlib.use('macosx')
>> matplotlib.rc('font', **{'family': 'serif', 'serif': ['Computer Modern Roman']})
>> matplotlib.rc('text', usetex=True)
>> from pylab import *
>>
>> i = 0
>> x = []
>> y = []
>> v1 = []
>> v2 = []
>>
>> while(True):
>>     f = open("%dresults.dat" % i, "r")
>>     for line in f:
>>         x.append(float(line.split()[0]))
>>         y.append(float(line.split()[1]))
>>         v1.append(float(line.split()[2]))
>>         v2.append(float(line.split()[3]))
>>     f.close()
>>     hold(False)
>>     figure(1)
>>     quiver(x, y, v1, v2, color='b', units='width', scale=1.0)
>>     xlabel('$x$')
>>     ylabel('$y$')
>>     grid(True)
>>     print i
>>     savefig('graph-%05d.pdf' % i)
>>     close(1)
>>     x = []
>>     y = []
>>     v1 = []
>>     v2 = []
>>     i = i + 1
>>
>> [end]
>>
>> Regards,
>> --Damon
>>
>> ------------------------------------------------------------------------------
>> This SF.net email is sponsored by:
>> SourcForge Community
>> SourceForge wants to tell your story.
>> http://p.sf.net/sfu/sf-spreadtheword
>> _______________________________________________
>> Matplotlib-devel mailing list
>> Mat...@li...
>> https://lists.sourceforge.net/lists/listinfo/matplotlib-devel

--
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA |
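For readers without Valgrind/Massif at hand, a rough per-iteration measurement of the kind Mike describes can be scripted with the standard-library tracemalloc module (Python 3.4+). This is a sketch under assumptions: `draw_once` is a hypothetical stand-in for the figure/quiver/savefig body of Damon's loop. Sizes that level off after the first few iterations suggest no leak; monotonic growth points at one.

```python
import tracemalloc

def measure(draw_once, iterations=10):
    """Run draw_once repeatedly and record the traced Python heap
    size (in bytes) after each iteration."""
    tracemalloc.start()
    sizes = []
    for _ in range(iterations):
        draw_once()
        current, _peak = tracemalloc.get_traced_memory()
        sizes.append(current)
    tracemalloc.stop()
    return sizes
```

Note that tracemalloc only sees allocations made through Python's allocator, so a leak inside a C extension or the Cocoa layer would still require a native tool such as Massif or Apple's Instruments.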
From: Michael D. <md...@st...> - 2009-01-20 13:43:21
|
> Thanks for looking into this! The new plot is much improved, and the
> simplified calculations are a pleasant surprise. I was also testing the
> previous algorithm with solid_joinstyle set to "round" as it is the
> default in my matplotlibrc.
>
> I am probably not able to build your patch here, unless building
> matplotlib from source on Windows is easier than I anticipate. May I
> send you some data off the list for you to test?

No problem. I'd also want testing from others -- there aren't a lot of examples in matplotlib itself where simplification even kicks in.

Mike

--
Michael Droettboom
Science Software Branch
Operations and Engineering Division
Space Telescope Science Institute
Operated by AURA for NASA |
From: Andrew H. <HA...@no...> - 2009-01-19 19:07:01
|
> -----Original Message-----
> From: Michael Droettboom [mailto:md...@st...]
> Sent: 16 Jan 2009 1:31 PM
> To: Andrew Hawryluk
> Cc: mat...@li...
> Subject: Re: [matplotlib-devel] path simplification can decrease the
> smoothness of data plots
>
> Michael Droettboom wrote:
...
> I've attached a patch that will only include points from the original
> data in the simplified path. I hesitate to commit it to SVN, as these
> things are very hard to get right -- and just because it appears to
> work better on this data doesn't mean it doesn't create a regression on
> something else... ;) That said, it would be nice to confirm that this
> solution works, because it has the added benefit of being a little
> simpler computationally. Be sure to blitz your build directory when
> testing the patch -- distutils won't pick it up as a dependency.
>
> I've attached two PDFs -- one with the original (current trunk)
> behavior, and one with the new behavior. I plotted the unsimplified
> plot in thick blue behind the simplified plot in green, so you can see
> how much deviation there is between the original data and the
> simplified line (you'll want to zoom way in with your PDF viewer to see
> it.)
>
> I've also included a new version of your test script which detects "new"
> data values in the simplified path, and also seeds the random number
> generator so that results are comparable. I also set the
> solid_joinstyle to "round", as it makes the wiggliness less pronounced.
> (There was another thread on this list recently about making that the
> default setting).
>
> Cheers,
> Mike
>
> --
> Michael Droettboom
> Science Software Branch
> Operations and Engineering Division
> Space Telescope Science Institute
> Operated by AURA for NASA

Thanks for looking into this! The new plot is much improved, and the simplified calculations are a pleasant surprise. I was also testing the previous algorithm with solid_joinstyle set to "round" as it is the default in my matplotlibrc.
I am probably not able to build your patch here, unless building matplotlib from source on Windows is easier than I anticipate. May I send you some data off the list for you to test?

Regards,
Andrew

NOVA Chemicals Research & Technology Centre
Calgary, Canada |
From: Patrick M. <pat...@gm...> - 2009-01-19 18:21:31
|
Greetings,

This weekend I decided to try to build mpl for Windows using Python 2.6.1. I was able to build numpy and mpl, and I thought things were great. However, when I tried to run a script with a show() call, nothing would happen. Looking into this, I discovered that mpl didn't recognize Tkinter during its build process. I checked the setupext.py code and found that Python 2.6 was excluded in the Windows section, even though Tkinter at least imports under Python 2.6. I also noticed what appears to be a holdover from a previous version of mpl: references to Python 2.2, even though the Tcl/Tk 8.3 headers and libs aren't included in win32_static anymore.

I decided to build Tcl/Tk 8.5 (which is what Python 2.6 ships with), dumped the appropriate libs into the lib directory of win32_static and the header files into the include/tcl85 directory, changed the references from Python 2.2 to Python 2.6 (including using Tcl/Tk 8.5 instead of 8.3), rebuilt mpl, and ran it. This time the show window opens, but as the figure is being drawn an error occurs and the figure window remains blank. Below is the traceback:

Exception in Tkinter callback
Traceback (most recent call last):
  File "D:\Python26\lib\lib-tk\Tkinter.py", line 1410, in __call__
    return self.func(*args)
  File "D:\Python26\lib\site-packages\matplotlib\backends\backend_tkagg.py", line 212, in resize
    self.show()
  File "D:\Python26\lib\site-packages\matplotlib\backends\backend_tkagg.py", line 216, in draw
    tkagg.blit(self._tkphoto, self.renderer._renderer, colormode=2)
  File "D:\Python26\lib\site-packages\matplotlib\backends\tkagg.py", line 19, in blit
    tk.call("PyAggImagePhoto", photoimage, id(aggimage), colormode, id(bbox_array))
TclError

This is as far as I've been able to get (so far) in tracking down the issue. I'll admit I haven't looked much further due to time constraints, so if anyone has suggestions as to where to go next, I'm all ears.
I'm not sure who the Windows guru is for mpl, but if I (we) can figure this problem out, I'll work with them to get mpl ready for 2.6 on Windows.

Thanks,
-Patrick

--
Patrick Marsh
Graduate Research Assistant
School of Meteorology
University of Oklahoma
http://www.patricktmarsh.com |
From: Michael A. <mab...@go...> - 2009-01-19 14:37:04
|
On Mon, Jan 19, 2009 at 6:33 AM, Michiel de Hoon <mjl...@ya...> wrote:
> I am also finding the continuing increase in memory usage, but this also
> occurs with other backends (I tried tkagg and pdf) and also without the
> call to savefig. One possibility is a circular reference in the quiver
> function that prevents data from being cleaned up.
>
> --Michiel

I don't know how relevant this is to the problem at hand, but I have observed memory leaks in the Delaunay code in matplotlib 0.98.3. I haven't had the chance to upgrade to 0.98.5.x yet, but judging from the release notes the issue was not fixed. Since the leak shows up from inside Sage, I need to find out exactly what causes it before poking around and fixing it.

Cheers,
Michael |
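The circular-reference hypothesis above can be probed with the standard-library gc module. This is a generic sketch, not tied to quiver's internals: objects that survive refcounting only because they sit in reference cycles show up in the count returned by gc.collect().

```python
import gc

def cycles_collected(make_garbage):
    """Call make_garbage, drop its result, and return the number of
    unreachable objects the cycle collector then finds.  A nonzero
    count means the dropped object was held in a reference cycle."""
    gc.collect()                 # start from a clean slate
    obj = make_garbage()
    del obj                      # refcounting frees acyclic objects here
    return gc.collect()          # cyclic ones are only found now

def self_cycle():
    d = {}
    d['self'] = d                # a deliberate reference cycle
    return d
```

In a real investigation one would wrap the suspect call (here, building and discarding a quiver plot) in `make_garbage` and inspect `gc.garbage` or `gc.get_referrers` to find which object closes the cycle.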