From: Eric F. <ef...@ha...> - 2010-04-02 19:57:04
Jeff Klukas wrote:
> Alright, I have attached a top-level diff that contains the changes to
> axes.py that allow sending multiple colors to the 'color' argument in
> Axes.hist.
>
> Below is a short example that passes lists to 'colors' and 'labels'.

Jeff,

Thanks.  I find that both hist and the patch need some additional
reworking, which I will try to get done this weekend.

Eric

> Cheers,
> Jeff
>
> || Jeff Klukas, Research Assistant, Physics
> || University of Wisconsin -- Madison
> || jeff.klukas@gmail | jeffyklukas@aim | jeffklukas@skype
> || http://www.hep.wisc.edu/~jklukas/
>
> ----------------------------------
> import pylab as P
>
> mu, sigma = 200, 25
> x0 = mu + sigma*P.randn(10000)
> x1 = mu + sigma*P.randn(7000)
> x2 = mu + sigma*P.randn(3000)
>
> P.figure()
>
> colors = ['crimson', 'burlywood', 'chartreuse']
> labels = ['Crimson', 'Burlywood', 'Chartreuse']
> n, bins, patches = P.hist([x0, x1, x2], 10, histtype='bar',
>                           color=colors, label=labels)
>
> P.legend()
> P.show()
> ---------------------------------
>
> On Wed, Mar 31, 2010 at 1:27 PM, Eric Firing <ef...@ha...> wrote:
>> Jeff Klukas wrote:
>>> When plotting multiple datasets with one Axes.hist call, the method's
>>> interface allows you to specify a list of labels to the 'label' kwarg
>>> to distinguish between the datasets.  To get different colors,
>>> however, you cannot give a list of colors to 'color'; instead, you
>>> have to leave out the 'color' kwarg and change the color cycle.
>>>
>>> Is there any reason why the color kwarg can't work like label?  I
>>> spent an hour or two trying to debug a script before I realized that
>>> 'color' wasn't being interpreted as I expected.  I realize that there
>>> is some ambiguity, since a color argument can be an rgb or rgba
>>> sequence.  My proposal would be that 'color' be interpreted as a
>>> list of distinct colors only when multiple datasets are given as
>>> input and len(color) equals the number of datasets.
>>>
>>> I find it hard to imagine a case where you would want to set all
>>> datasets to be the same color, so I don't think the ambiguity would
>>> be a major issue.  I would be happy to write and submit an
>>> implementation if others think this is a reasonable idea.
>>
>> Sounds good to me.  I agree that it makes no sense to have to set the
>> color cycle for hist (although using the color cycle as a default is
>> reasonable), and I think it is just an artifact of the way hist has
>> evolved.
>>
>> Eric
>>
>>> Cheers,
>>> Jeff
>>>
>>> || Jeff Klukas, Research Assistant, Physics
>>> || University of Wisconsin -- Madison
>>> || jeff.klukas@gmail | jeffyklukas@aim | jeffklukas@skype
>>> || http://www.hep.wisc.edu/~jklukas/
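A minimal sketch of the dispatch rule proposed in this thread, using a
hypothetical helper name and the matplotlib.colors.to_rgba converter; this
is not the code in the attached diff:

----------------------------------
import numpy as np
from matplotlib import colors as mcolors

def resolve_hist_colors(color, nx):
    # Hypothetical helper, not the attached patch.  nx is the number of
    # datasets passed to Axes.hist.
    if color is None:
        # defer to the axes color cycle, one entry per dataset
        return [None] * nx
    if (nx > 1 and np.iterable(color) and not isinstance(color, str)
            and len(color) == nx):
        # a sequence whose length matches the number of datasets is
        # taken as one color per dataset
        return [mcolors.to_rgba(c) for c in color]
    # anything else is a single color applied to every dataset; note the
    # ambiguity Jeff mentions: a bare rgb tuple with nx == 3 would be
    # caught by the branch above instead
    return [mcolors.to_rgba(color)] * nx
----------------------------------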
From: Peter B. <bu...@gm...> - 2010-04-02 19:25:25
Any feedback on the submitted patch?  I've now added the possibility to
switch autoscale off.

On Sun, Mar 28, 2010 at 9:26 PM, Peter Butterworth <bu...@gm...> wrote:
> please find attached the 2 patched files and the diff vs trunk.
From: Eric F. <ef...@ha...> - 2010-04-02 18:46:06
Ryan May wrote:
> On Fri, Apr 2, 2010 at 11:42 AM, Eric Firing <ef...@ha...> wrote:
>> Ryan May wrote:
>>> On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>>>>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>>>>> I just hit a problem with using quiver with Basemap when
>>>>>> angles='xy'.  Because Basemap's x,y units are in meters, you end
>>>>>> up with angles that are quantized due to floating point truncation
>>>>>> (30000. + 0.001*u = 30000.).  Changing to angles='uv' fixes the
>>>>>> problem, but it probably should be automatically scaled, as noted
>>>>>> in the comments:
>>>>>>
>>>>>> elif self.angles == 'xy' or self.scale_units == 'xy':
>>>>>>     # We could refine this by calculating eps based on
>>>>>>     # the magnitude of U, V relative to that of X, Y,
>>>>>>     # to ensure we are always making small shifts in X, Y.
>>>>>>
>>>>>> I managed to fix the problem locally by setting:
>>>>>>
>>>>>> angles, lengths = self._angles_lengths(U, V, eps=0.0001 *
>>>>>> self.XY.max())
>>>>>>
>>>> I don't think this will work in all cases.  For example, there could
>>>> be a single arrow at (0,0).
>>> Good point.
>>>
>>>> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
>>> Wouldn't this have problems if we zoom in sufficiently that the width
>>> is much less than the magnitude of the values?  Not exactly sure what
>>> data set would sensibly yield this, so I'm not sure if we should
>>> worry about it.
>>>
>>> If we do care, we could just put a minimum bound on eps:
>>>
>>> eps = max(1e-8, 0.0001 * self.XY.max())
>> I don't like taking the max of a potentially large array every time;
>> and one needs the max absolute value in any case.  I think the
>> following is better:
>>
>> eps = np.abs(self.ax.dataLim.extents).max() * 0.001
>
> I hadn't thought about performance.  I think that's more important
> than any worries about bounds being disproportionately smaller.  I'll
> check this in.

Sorry for the piecemeal approach in thinking about this -- but now I
realize that to do this right, as indicated by the comment in the
original code, we need to take the magnitude of U and V into account.
The maximum magnitude could be calculated once in set_UVC and then saved
so that it does not have to be recalculated every time it is used in
make_verts.

Maybe I am still missing some simpler way to handle this well.

Eric

> Ryan
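One hedged sketch of the refinement Eric describes above: cache the
maximum vector magnitude when the data are set, then derive eps from the
data limits and that cached value.  Names mirror the attributes discussed
in the thread (ax.dataLim, U, V); this is not the committed fix:

----------------------------------
import numpy as np

def quiver_eps(ax, max_uv):
    # max_uv is the maximum of hypot(U, V); per the suggestion above it
    # would be computed once in set_UVC, e.g.
    #     self._max_uv = np.hypot(self.U, self.V).max()
    # and reused here on every make_verts call.
    span = np.abs(ax.dataLim.extents).max()
    # keep eps * |(U, V)| a small shift relative to the x, y scale
    return 0.001 * span / max(max_uv, np.finfo(float).tiny)
----------------------------------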
From: Ryan M. <rm...@gm...> - 2010-04-02 18:05:38
On Fri, Apr 2, 2010 at 11:42 AM, Eric Firing <ef...@ha...> wrote:
> Ryan May wrote:
>>
>> On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>>>>
>>>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>>>>
>>>>> I just hit a problem with using quiver with Basemap when
>>>>> angles='xy'.  Because Basemap's x,y units are in meters, you end up
>>>>> with angles that are quantized due to floating point truncation
>>>>> (30000. + 0.001*u = 30000.).  Changing to angles='uv' fixes the
>>>>> problem, but it probably should be automatically scaled, as noted
>>>>> in the comments:
>>>>>
>>>>> elif self.angles == 'xy' or self.scale_units == 'xy':
>>>>>     # We could refine this by calculating eps based on
>>>>>     # the magnitude of U, V relative to that of X, Y,
>>>>>     # to ensure we are always making small shifts in X, Y.
>>>>>
>>>>> I managed to fix the problem locally by setting:
>>>>>
>>>>> angles, lengths = self._angles_lengths(U, V, eps=0.0001 *
>>>>> self.XY.max())
>>>>>
>>> I don't think this will work in all cases.  For example, there could
>>> be a single arrow at (0,0).
>>
>> Good point.
>>
>>> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
>>
>> Wouldn't this have problems if we zoom in sufficiently that the width
>> is much less than the magnitude of the values?  Not exactly sure what
>> data set would sensibly yield this, so I'm not sure if we should worry
>> about it.
>>
>> If we do care, we could just put a minimum bound on eps:
>>
>> eps = max(1e-8, 0.0001 * self.XY.max())
>
> I don't like taking the max of a potentially large array every time;
> and one needs the max absolute value in any case.  I think the
> following is better:
>
> eps = np.abs(self.ax.dataLim.extents).max() * 0.001

I hadn't thought about performance.  I think that's more important than
any worries about bounds being disproportionately smaller.  I'll check
this in.

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
From: Eric F. <ef...@ha...> - 2010-04-02 17:42:27
Ryan May wrote:
> On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>>> I just hit a problem with using quiver with Basemap when
>>>> angles='xy'.  Because Basemap's x,y units are in meters, you end up
>>>> with angles that are quantized due to floating point truncation
>>>> (30000. + 0.001*u = 30000.).  Changing to angles='uv' fixes the
>>>> problem, but it probably should be automatically scaled, as noted in
>>>> the comments:
>>>>
>>>> elif self.angles == 'xy' or self.scale_units == 'xy':
>>>>     # We could refine this by calculating eps based on
>>>>     # the magnitude of U, V relative to that of X, Y,
>>>>     # to ensure we are always making small shifts in X, Y.
>>>>
>>>> I managed to fix the problem locally by setting:
>>>>
>>>> angles, lengths = self._angles_lengths(U, V, eps=0.0001 *
>>>> self.XY.max())
>>>>
>> I don't think this will work in all cases.  For example, there could
>> be a single arrow at (0,0).
>
> Good point.
>
>> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?
>
> Wouldn't this have problems if we zoom in sufficiently that the width
> is much less than the magnitude of the values?  Not exactly sure what
> data set would sensibly yield this, so I'm not sure if we should worry
> about it.
>
> If we do care, we could just put a minimum bound on eps:
>
> eps = max(1e-8, 0.0001 * self.XY.max())

I don't like taking the max of a potentially large array every time; and
one needs the max absolute value in any case.  I think the following is
better:

eps = np.abs(self.ax.dataLim.extents).max() * 0.001

Eric

> Ryan
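For context, Bbox.extents is the flat (x0, y0, x1, y1) array, so the
suggested expression reduces to a fraction of the largest absolute corner
coordinate; a quick check with made-up limits:

----------------------------------
import numpy as np
from matplotlib.transforms import Bbox

bb = Bbox.from_extents(-1000.0, -50.0, 30000.0, 75.0)
print(bb.extents)                        # [-1000.  -50.  30000.  75.]
print(np.abs(bb.extents).max() * 0.001)  # 30.0, scaled to the data span
----------------------------------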
From: Ryan M. <rm...@gm...> - 2010-04-02 14:02:57
On Fri, Apr 2, 2010 at 1:23 AM, Eric Firing <ef...@ha...> wrote:
>> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>>> I just hit a problem with using quiver with Basemap when
>>> angles='xy'.  Because Basemap's x,y units are in meters, you end up
>>> with angles that are quantized due to floating point truncation
>>> (30000. + 0.001*u = 30000.).  Changing to angles='uv' fixes the
>>> problem, but it probably should be automatically scaled, as noted in
>>> the comments:
>>>
>>> elif self.angles == 'xy' or self.scale_units == 'xy':
>>>     # We could refine this by calculating eps based on
>>>     # the magnitude of U, V relative to that of X, Y,
>>>     # to ensure we are always making small shifts in X, Y.
>>>
>>> I managed to fix the problem locally by setting:
>>>
>>> angles, lengths = self._angles_lengths(U, V, eps=0.0001 *
>>> self.XY.max())
>>>
> I don't think this will work in all cases.  For example, there could be
> a single arrow at (0,0).

Good point.

> Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?

Wouldn't this have problems if we zoom in sufficiently that the width is
much less than the magnitude of the values?  Not exactly sure what data
set would sensibly yield this, so I'm not sure if we should worry about
it.

If we do care, we could just put a minimum bound on eps:

eps = max(1e-8, 0.0001 * self.XY.max())

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
From: Eric F. <ef...@ha...> - 2010-04-02 07:24:10
Ryan May wrote:
> Ping.  Not sure if you missed it the first time around or are just
> that busy.
>
I looked, but decided I needed to look again, and then lost it in the
stack.  See below.

> Ryan
>
> On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
>> Eric,
>>
>> I just hit a problem with using quiver with Basemap when
>> angles='xy'.  Because Basemap's x,y units are in meters, you end up
>> with angles that are quantized due to floating point truncation
>> (30000. + 0.001*u = 30000.).  Changing to angles='uv' fixes the
>> problem, but it probably should be automatically scaled, as noted in
>> the comments:
>>
>> elif self.angles == 'xy' or self.scale_units == 'xy':
>>     # We could refine this by calculating eps based on
>>     # the magnitude of U, V relative to that of X, Y,
>>     # to ensure we are always making small shifts in X, Y.
>>
>> I managed to fix the problem locally by setting:
>>
>> angles, lengths = self._angles_lengths(U, V, eps=0.0001 *
>> self.XY.max())
>>

I don't think this will work in all cases.  For example, there could be
a single arrow at (0,0).

Instead of self.XY.max(), how about abs(self.ax.dataLim.width)?

Eric

>> but I'm not sure if you would want a different fix.  If you're happy
>> with this fix, I'll go ahead and check it in.
>
> Ryan
From: Ryan M. <rm...@gm...> - 2010-04-02 03:53:56
Hi,

A "quick afternoon hack" developed into what seems to me to be a useful
and simple framework for doing animations in matplotlib, utilizing the
timed idle event in GTK (currently).  It also supports writing out a
movie file using ffmpeg.

Particular issues:

1) Supporting backends other than gtk.  I'm not sure how to mimic the
behavior of gobject.idle_add() with Qt and Wx.  Also, I'm not sure if
code to mimic this belongs with the animation class or should be kept
with the backend to allow use elsewhere.

2) The FuncAnimation class is written to allow an easy way to make a
more "procedural" animation, such as one that displays data while
reading from a live source, or draws a line by repeatedly adding points.
My question is whether the interface makes sense, or if it's even
worthwhile, since it only saves the couple of lines of code needed to do
a straightforward Animation subclass.

The code still needs quite a bit of cleanup and thought to make sure
that the classes are broken up into the proper parts, as well as
documentation, but I wanted to see if this seems like a good way to go
to add easy animation support to matplotlib.

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
From: Ryan M. <rm...@gm...> - 2010-04-02 02:15:11
Ping.  Not sure if you missed it the first time around or are just that
busy.

Ryan

On Fri, Mar 26, 2010 at 12:13 PM, Ryan May <rm...@gm...> wrote:
> Eric,
>
> I just hit a problem with using quiver with Basemap when
> angles='xy'.  Because Basemap's x,y units are in meters, you end up
> with angles that are quantized due to floating point truncation
> (30000. + 0.001*u = 30000.).  Changing to angles='uv' fixes the
> problem, but it probably should be automatically scaled, as noted in
> the comments:
>
> elif self.angles == 'xy' or self.scale_units == 'xy':
>     # We could refine this by calculating eps based on
>     # the magnitude of U, V relative to that of X, Y,
>     # to ensure we are always making small shifts in X, Y.
>
> I managed to fix the problem locally by setting:
>
> angles, lengths = self._angles_lengths(U, V, eps=0.0001 *
> self.XY.max())
>
> but I'm not sure if you would want a different fix.  If you're happy
> with this fix, I'll go ahead and check it in.

Ryan

--
Ryan May
Graduate Research Assistant
School of Meteorology
University of Oklahoma
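The truncation in the report above is easy to reproduce when the
coordinates are held in single precision (assuming float32 intermediates,
which the 30000-meter example suggests): the eps*u shift falls below the
float32 grid at that magnitude, so distinct vectors collapse onto the
same quantized angles:

----------------------------------
import numpy as np

x = np.float32(30000.0)           # Basemap-style coordinate in meters
print(np.spacing(x))              # 0.001953125: float32 grid size here
# a shift below half the grid size is rounded away entirely
print(x + np.float32(0.0004) == x)   # True: the shift vanished
# larger shifts survive, but snap to multiples of the grid, which is
# what quantizes the computed angles
print(float(x + np.float32(0.0025)) - float(x))  # 0.001953125, not 0.0025
----------------------------------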