From: elmar <el...@ne...> - 2011-03-26 20:50:19
|
On 14.03.2011 03:49, John F. Gibson wrote:
>
> I would like to construct a 3d plot consisting of several 2d quiver plots on
> orthogonal, intersecting planes. Is this possible with matplotlib? In MATLAB
> I do it by constructing several 2d graphs and then reorienting them in the
> 3d space using the 'rotate' function. E.g.
>
> xaxis = [1 0 0];
> h = quiver('v6', z, y, w, v, 'k');
> rotate(h, xaxis, 90, [0 0 0]);
>
> This produces a 2d quiver plot of [v,w](y,z) oriented along the y,z axes of
> the 3d space, and then I do the same for x,y and x,z quiver plots.
>
> Any ideas for matplotlib 3d? Thanks!
>
> John Gibson
Have a look at the "Volumetric Slice Plot" example in the Easyviz tutorial
(http://code.google.com/p/scitools/).
Elmar
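For reference, the same idea can be sketched directly in matplotlib with
mpl_toolkits.mplot3d, assuming a version new enough to provide Axes3D.quiver;
the sample field and the choice of planes below are illustrative only, not
taken from the original post:
# Minimal sketch: 2D vector fields drawn on orthogonal planes of one 3D axes.
import numpy as np
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 -- registers the '3d' projection
y, z = np.meshgrid(np.linspace(-1, 1, 10), np.linspace(-1, 1, 10))
v, w = -z, y                              # illustrative field [v, w](y, z)
fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
# [v, w](y, z) on the y-z plane at x = 0: the x-component of every arrow is zero.
ax.quiver(np.zeros_like(y), y, z, np.zeros_like(v), v, w, length=0.15, color='k')
# Reuse the same sample field on the x-z plane at y = 0.
x, u = y, v
ax.quiver(x, np.zeros_like(x), z, u, np.zeros_like(u), w, length=0.15, color='b')
ax.set_xlabel('x'); ax.set_ylabel('y'); ax.set_zlabel('z')
plt.show()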
|
|
From: Paul I. <pi...@be...> - 2011-03-26 01:06:57
|
Blast from the past!
I just ran into this and it comes from the fact that
'matplotlib.tests.test_text' is not in the default_test_modules
variable inside matplotlib's __init__.py
Here's the necessary diff:
index 82633a5..649e4d8 100644
--- a/lib/matplotlib/__init__.py
+++ b/lib/matplotlib/__init__.py
@@ -968,7 +968,8 @@ default_test_modules = [
'matplotlib.tests.test_spines',
'matplotlib.tests.test_image',
'matplotlib.tests.test_simplification',
- 'matplotlib.tests.test_mathtext'
+ 'matplotlib.tests.test_mathtext',
+ 'matplotlib.tests.test_text'
]
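For what it's worth, here is a quick way to list the test modules that nose
discovers but matplotlib.test() skips; a minimal sketch, assuming a matplotlib
old enough to expose default_test_modules at the top level of
matplotlib/__init__.py, as in the diff above:
# Sketch: compare nose-discoverable test modules with default_test_modules.
import pkgutil
import matplotlib
import matplotlib.tests
discovered = {
    'matplotlib.tests.' + name
    for _, name, _ in pkgutil.iter_modules(matplotlib.tests.__path__)
    if name.startswith('test_')
}
registered = set(matplotlib.default_test_modules)
print("Discovered by nose but skipped by matplotlib.test():")
for mod in sorted(discovered - registered):
    print("   ", mod)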
I added a pull request for this two-line change; or was there a
specific reason to *exclude* test_text from the test modules?
For instance, right now I get one failure in the test suite if I
include it. The failure is in test_text:test_font_styles, but
this has been the case for a while; it's just that these tests
weren't running before.
Any developers want to chime in on this?
best,
--
Paul Ivanov
http://pirsquared.org | GPG/PGP key id: 0x0F3E28F7
Michael Droettboom, on 2010-07-27 11:19, wrote:
> Hmm... surprisingly, I am actually able to reproduce this sort of
> behaviour here. I'll look into it further.
>
> Mike
>
> On 07/27/2010 09:49 AM, Michael Droettboom wrote:
> > Of course, we'll prefer to see all of the tests pass...
> >
> > I'm surprised the two modes of running the tests give different
> > results. Are you sure they are running the same python? Does
> >
> > python `which nosetests` matplotlib.tests
> >
> > give you the same result as
> >
> > nosetests matplotlib.tests
> >
> > ?
> >
> > There must be some environmental difference between the two to cause the
> > different results.
> >
> > Mike
> >
> > On 07/24/2010 05:09 PM, Adam wrote:
> >
> >> Hello, I have just updated to v1.0.0 and am trying to run the test
> >> suite to make sure everything is OK. There seem to be two different
> >> suites and I am not sure which is correct/current:
> >>
> >> $python -c 'import matplotlib; matplotlib.test()'
> >> [...snipped output...]
> >> Ran 138 tests in 390.991s
> >> OK (KNOWNFAIL=2)
> >>
> >> With $nosetests matplotlib.tests I get:
> >> [...snipped output]
> >> Ran 144 tests in 380.165s
> >> FAILED (errors=4, failures=1)
> >>
> >> Two of these errors are the known failures from above, and the other
> >> two are in "matplotlib.tests.test_text.test_font_styles":
> >> ImageComparisonFailure: images not close:
> >> /home/adam/result_images/test_text/font_styles.png vs.
> >> /home/adam/result_images/test_text/expected-font_styles.png (RMS
> >> 23.833)
> >> ImageComparisonFailure: images not close:
> >> /home/adam/result_images/test_text/font_styles_svg.png vs.
> >> /home/adam/result_images/test_text/expected-font_styles_svg.png (RMS
> >> 12.961)
> >>
> >> The test that fails is:
> >>
> >> FAIL: matplotlib.tests.test_mlab.test_recarray_csv_roundtrip
> >> ----------------------------------------------------------------------
> >> Traceback (most recent call last):
> >> File "/usr/local/lib/python2.6/dist-packages/nose-0.11.4-py2.6.egg/nose/case.py",
> >> line 186, in runTest
> >> self.test(*self.arg)
> >> File "/usr/local/lib/python2.6/dist-packages/matplotlib/tests/test_mlab.py",
> >> line 24, in test_recarray_csv_roundtrip
> >> assert np.allclose( expected['x'], actual['x'] )
> >> AssertionError
> >>
> >>
> >>
> >> I am not sure of the importance level of these - but I wanted to ask
> >> to see if I should do anything or if they can safely be ignored.
> >>
> >> Thanks,
> >> Adam.
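For context, a recarray CSV roundtrip check of that kind boils down to
something like the following; this is a minimal sketch, not the actual
matplotlib test, and it assumes a matplotlib old enough to still ship
mlab.rec2csv / mlab.csv2rec:
# Sketch: write a record array to CSV, read it back, compare the float column.
import os
import tempfile
import numpy as np
from matplotlib import mlab
expected = np.rec.fromrecords(
    [(1, 1.5, 'a'), (2, 2.5, 'b')], names=['i', 'x', 's'])
fd, path = tempfile.mkstemp(suffix='.csv')
os.close(fd)
try:
    mlab.rec2csv(expected, path)   # write the record array out as CSV
    actual = mlab.csv2rec(path)    # read it back in
finally:
    os.remove(path)
# The failing assertion compares the float column after the roundtrip.
assert np.allclose(expected['x'], actual['x'])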
|