Math NSERC Liaison Committee

Representing the Canadian Mathematics Community to NSERC

Report on the 2012 NSERC Discovery Grants Competition for Math/Stats Section 1508

2012-04-27 By nmlcadmin

Dear colleagues in the mathematics and statistics community,

This year we conducted the second annual Discovery Grant survey, collecting basic information on the outcomes of the EG 1508 (Math & Stats) DG competition. While our data are not complete, we have had a very good response rate to our requests for information, and have been able to assemble a basic picture of the outcome of this year’s competition; the data are given below.

Our impression is that the overall outcome was much more orderly and fair than that of 2011, with evenly distributed bin levels and reasonable success rates. Part of the explanation is that the total budget coming into EG 1508 for the competition was not unreasonably and unexpectedly cut; another part may be that there were fewer applicants to EG 1508 than last year. A further point is that, for a given bin, bin values for mathematics grants did not differ from those for statistics grants.

Another noteworthy feature is the number of members of mathematics and statistics departments who are applying to EGs other than EG 1508, with good success rates and good award amounts. This testifies to the continuing underfunding of math & stats grants as compared with those of our fellow scientists.

Because of the sensitivity of our community to the value of research performed at smaller institutions, we have also subdivided the results of this competition by institution size. Since our data are not absolutely complete, the numbers below are underestimates of the total applicants, and since the missing data come principally from smaller institutions, the success rates and grant award averages are probably overestimates.

Best regards,

Walter Craig for the MNLC


Bin levels (\$ values) for EG 1508

  • A = \$60K
  • B = \$52-51K
  • C = \$48-45K
  • D = \$40K
  • E = \$35K
  • F = \$30K
  • G = \$26-24K
  • H = \$22-20K
  • I = \$17-15K
  • J = \$13-12K
  • below this, applicants received \$0K

Note: The variation in some bins by one or two \$K is probably partially due to small supplements to early career researchers, as in an NSERC Contact announcement of October 2011. Some senior researchers have also received small additional increments, which we understand less well, although at the top of the scale this is probably due to ‘grandfathering’ of prior awards.

Basic data

Total applications from Math and Stats departments = 235
(this is a lower bound, because of underreporting in our survey)
Total applications to EG 1508 = 211
Total grants awarded in EG 1508 = 150
Total grants awarded from other EGs = 20
Total budget (annual) allocated by EG 1508 in this competition = \$3,238K

Bin populations: A=1, B=3, C=5, D=1, E=8, F=16, G=21, H=25, I=32, J=38
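
As a quick sanity check (ours, not part of the survey itself), the bin populations above reproduce the reported totals: they sum to the 150 grants awarded, and combined with approximate midpoint bin values they come within a few percent of the \$3,238K budget. A minimal sketch in Python, where the per-bin dollar values are our own midpoint approximations of the quoted ranges:

```python
# Bin populations and approximate midpoint bin values ($K) from the report.
bin_population = {"A": 1, "B": 3, "C": 5, "D": 1, "E": 8,
                  "F": 16, "G": 21, "H": 25, "I": 32, "J": 38}
bin_value_k = {"A": 60, "B": 51.5, "C": 46.5, "D": 40, "E": 35,
               "F": 30, "G": 25, "H": 21, "I": 16, "J": 12.5}

total_grants = sum(bin_population.values())
approx_budget_k = sum(n * bin_value_k[b] for b, n in bin_population.items())

print(total_grants)            # -> 150, matching the reported grant count
print(round(approx_budget_k))  # -> 3284, within ~1.5% of the reported $3,238K
```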

Statistical data

EG 1508 Math:   #applicants = 151 (this again is a lower bound, because of underreporting)
average grant \$22.7K
#grants = 111
success rate 73.5%

EG 1508 Statistics: #applicants = 60 (ditto)
average grant \$18.4K
#grants = 39
success rate 65%

Other EGs:  #applicants = 24 (completeness is even harder to verify; this is a lower bound)
average grant \$28.2K
#grants = 20
success rate 83.3%

Percent of applications going to other EGs: just over 10%
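
The reported rates follow directly from the counts above; a minimal check, using only the figures given:

```python
outcomes = {  # (grants, applicants) pairs from the figures above
    "EG 1508 Math": (111, 151),
    "EG 1508 Statistics": (39, 60),
    "Other EGs": (20, 24),
}
for group, (grants, apps) in outcomes.items():
    print(group, f"{100 * grants / apps:.1f}%")  # 73.5%, 65.0%, 83.3%

print(f"{100 * 24 / 235:.1f}%")  # 10.2%: share of applications to other EGs
```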

Data subdivided by institution size

EG 1508 Mathematics:
Size     Av Grant (\$K)   Success Rate (%)   # Apps
3        28.31            80.00              60
2        19.22            70.59              69
1        15.71            63.64              22
Overall  22.71            73.51              151

EG 1508 Statistics:
Size     Av Grant (\$K)   Success Rate (%)   # Apps
3        19.65            85.00              20
2        17.65            62.50              32
1        15.00            25.00              8
Overall  18.38            65.00              60

Size categories: Institutions are considered here to be of size 3 when their total NSERC (annualized) grants exceed \$3M, of size 2 when total NSERC grants are between \$225K and \$3M, and of size 1 below this.

Note: In putting together these statistics subdivided by institution size, we have (somewhat arbitrarily) classified institutions in terms of their total NSERC grant profile. It might have been better to do this with the tri-council profile instead, but that information is less available. In any case, a list of total NSERC grant awards per institution is available either from us or from NSERC.
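
The “Overall” rows in the tables above are consistent with the size-category rows: inferring grant counts by rounding #Apps × success rate, the grant-weighted mean of the category averages recovers the overall average. A minimal check for the mathematics table:

```python
rows = [  # (avg grant $K, success rate, #apps) for sizes 3, 2, 1
    (28.31, 0.8000, 60),
    (19.22, 0.7059, 69),
    (15.71, 0.6364, 22),
]
grants = [round(rate * apps) for _, rate, apps in rows]        # [48, 49, 14]
weighted = sum(avg * g for (avg, _, _), g in zip(rows, grants)) / sum(grants)

print(sum(grants), round(weighted, 2))  # 111 grants, $22.71K, as reported
```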

Filed Under: 2012

Response to Physicists’ Public Letter to NSERC President Fortier

2012-02-23 By nmlcadmin

(Sent on 2012-02-22 to the physicists who signed an open letter to NSERC President Suzanne Fortier.)

 

Dear physics colleagues,

We were interested and pleased to read your letter to NSERC President Suzanne Fortier, which was published last week in the journal Physics in Canada. Your letter, your advice to NSERC, and your requests regarding Discovery Grant competitions and the running of the selection committees cover very familiar territory, given our own letter to NSERC on the same topic. I would like to share our letter with you for comparison; it is attached to this e-mail.

To give a brief background, the Mathematics & Statistics Evaluation Group has been seriously underfunded for a decade if not more: average Math & Stats grant levels, as well as minimum and certainly maximum Discovery Grants, have been by far the lowest in the NSERC portfolio of sciences. In the 2011 DG competition, the budget was cut even further, and in an unequal way, resulting in what was essentially a funding crisis for many mathematics researchers, and certainly a great loss of confidence in the fairness and equality of funding for Canadian research in the mathematical sciences. This became very clear with the 2011 announcements of successful grants, and the community sent an open letter to Suzanne Fortier expressing our concerns. Here is a copy of our original letter.

https://nmlc.math.ca/blog/blog/2011/04/26/canadian-mathematics-community-statement-about-nserc-discovery-grants/

Much of this inequity can be traced directly to the newly imposed and rigid system for judging grants in the Evaluation Group meetings. In preparing for the 2012 competition, A.-M. Thompson of NSERC sought the advice of Nancy Reid (Math & Stats LRP) and myself on several issues having to do with the EG procedures for Math & Stats. We formulated a response in consultation with a broad representation of the community, and sent the attached letter back to NSERC on October 19, 2011, giving our advice. It is also available online at the following address:

https://nmlc.math.ca/blog/blog/2011/12/12/recommendations-for-nserc-eg-1508/

Upon reading your own letter, I am struck by how similar our demands to NSERC have been. And I cannot say that the response or flexibility of NSERC has been overwhelming. I would be happy to learn what responses you yourselves have received to your letters to Suzanne Fortier and to NSERC in general.

Best regards,
Walter Craig

chair, Math NSERC Liaison Committee

Filed Under: 2012 Tagged With: Physicists

Recommendations for NSERC EG 1508

2011-12-12 By nmlcadmin

(The following post is extracted from the letter sent to NSERC dated October 18, 2011. This letter is also posted on the Long Range Plan web space.)

Recommendations for EG 1508

In July NSERC staff asked the Long Range Plan (LRP) steering committee and the chair of the NSERC Mathematics and Statistics Liaison Committee for input on the policies and operating procedures of Evaluation Group (EG) 1508. We were in particular asked to comment on the community’s tolerance around the bin levels themselves [1] and on priorities if difficult decisions have to be made regarding trade-offs; i.e. guidance around maintaining bin levels vs. maintaining success rates. The attached document is our response. It was developed following consultations with the Mathematics NSERC Liaison Committee (MNLC), and revised following discussions with the LRP steering committee, who also consulted with the statistics representatives to the NSERC – Mathematics and Statistics Liaison Committee. However, this letter should be viewed as reflecting the views of the signatories, following consultation with the various groups above, as several members expressed concern with some of the more detailed recommendations. We were guided by the following principles, for which there is a general consensus among those consulted.

  1. Openness and transparency: It is important that the process of proposal assessment permits EG members to form a global perspective of the spectrum of dossiers in the competition, at least within the three sections. EG members should be informed by NSERC about the budget constraints they will be operating under, at least approximately.
  2. Fairness: The proposal evaluation process should provide sufficiently detailed comparisons of all proposals in the competition, and should result in grant award amounts that reflect research merit and are equitable across the EG.
  3. Professionalism: The members of the Evaluation Group require appropriate tools for their deliberations, and respect for their judgements when reviewing dossiers.

We are including more detailed recommendations below for adapting the procedures of EG 1508 to more closely reflect these principles in future Discovery Grant competitions. We divide these into two sets: (1) changes for the upcoming 2012 Discovery Grant competition, and (2) aims for EG practice in future competitions. We feel that implementation of the suggestions would not entail major changes in policy or practice.

Nancy Reid

Walter Craig


Recommendations for the 2012 DG competition

We are recommending the following for the upcoming Discovery Grant competition.

  1. Openness and transparency.
    (i) Evaluation Group members should be given the right to participate in the discussion of, and to review the dossier of, each DG proposal, aside from direct conflicts of interest. This would normally apply to EG members of the same section.
    (ii) The EG should be given some idea of the budget allocated for the competition before beginning the process of rating proposals. The purpose of this is so that the EG members have an understanding of the funding consequences of concentrating many proposals in the same bin.
  2. Fairness.
    (i) As the process of proposal evaluation proceeds during the evaluation meeting, all rankings/bin assignments should be made available to the full EG. The bin rankings could be posted during the evaluation week as they are made. [2]
    (ii) Each section should review the rankings of all of its dossiers before they are finalized into bin ratings.
    (iii) Major budget decisions should be made by the full EG, as representatives of the communities of mathematics and statistics. On the last day of the evaluation meeting, the full EG should remain to take part in decisions on bin funding.
  3. Professionalism.
    During their deliberations, relevant web-based reference materials should be available for consultation by EG members.
  4. Success rates.
    It is important that top researchers be given adequate funds to maintain their research programs, while at the same time it is vital to maintain high-impact research programs at institutions with smaller graduate programs. In practical terms, the success rate should not be held hostage to unacceptable rigidity arising from a misplaced precept that a bin is indivisible. In the 2011 competition the success rate was highly constrained by the large number of proposals in bin J and the inability to either split the bin or revisit the ratings. Thus the choice seemed to be a success rate of either 48% or 66%. We recommend below a relaxation of this rigidity, in part to avoid such a crisis in future. Success rates of 45% would be unacceptably low, given the quality of the proposals, and in relation to both success rates in EG 1508 since 2009 and success rates in other EGs in 2011.
  5. Bin values.
    This is very difficult to address in isolation: the average grant size in EG 1508 is the lowest among the evaluation groups, and the bin levels must be set within the competition budget set by NSERC. This budget varies from year to year, and has experienced serious (and unexpected) decreases in recent years. Thus we have recommended above that all the members of EG 1508, who are the communities’ representatives for the DG competition, should be the advocates for mathematics and statistics, and should be in a position to discuss fair bin levels given the constraints of the competition budget. In order for this to be acceptable, NSERC would need to ensure that the EG appointments do indeed reflect a broad representation of the mathematics and statistics communities. Under fair and equitable budget conditions the ranges should fall reasonably close to those of the other theoretical physical sciences. The following is a tabulation of the approximate 2011 bin ranges for Computer Science and Physics, onto which we place an approximate projection of the range that we expect should be the norm for mathematics and statistics, at least within \$5K on either side of these amounts.
bin                   A-C      D       E       F       G       H       I       J
Computer Science EG   >\$60K   \$49K   \$42K   \$33K   \$29K   \$24K   \$20K   \$14K
Physics EG            >\$98K   \$69K   \$57K   \$48K   \$37K   \$28K   \$22K   \$15K
Math & Stats EG       >\$80K   \$60K   \$50K   \$40K   \$34K   \$26K   \$22K   \$15K

 

Until the Mathematics & Statistics budget can accommodate at least this range, some pro-rating will be needed. Many senior members of our communities felt that in the circumstances of a large cut in the competition budget, such as occurred in 2011, a proportional cut in all grants would have been preferable to large changes in some bin levels from previous years. In the happy event of an increase in the competition budget, the goal would be to work towards a distribution consistent with the numbers above.
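
As an illustration of the pro-rating alternative described above (a hedged sketch; the budget totals here are hypothetical placeholders, not actual competition figures):

```python
# Proportional pro-rating: scale every bin value by the ratio of the
# available budget to the budget the target bin values would require,
# rather than cutting a few bin levels drastically.
target_k = {"A-C": 80, "D": 60, "E": 50, "F": 40,
            "G": 34, "H": 26, "I": 22, "J": 15}  # Math & Stats row above
available_k, required_k = 3300.0, 3900.0          # hypothetical totals ($K)

scale = available_k / required_k                  # ~0.85, i.e. a ~15% cut
prorated_k = {b: round(v * scale, 1) for b, v in target_k.items()}
print(prorated_k)  # every bin reduced proportionally; ordering preserved
```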
Recommendations for future DG competitions

  1. Openness and transparency.
    (i) Early in the ranking process the NSERC staff should present to the EG an outline of the budget available for the competition, at least in approximate terms, relating it to the mathematics and statistics budgets of the past several years.
    (ii) The evaluation process of the competition should allow each EG member to form a global view of the spectrum of dossiers that are submitted to the 2012 competition, including access to each dossier and a review of the final rankings.
    (iii) While responsibility for reading and reporting on individual proposals is assigned to a smaller subset (a committee) of each of the three sections (pure mathematics, applied mathematics, and statistics), members of the EG (aside from those with direct conflicts of interest) should be able to participate in the discussion of the merits of each dossier.
    (iv) The final ranking of each dossier should be supported by a written summary outlining the basis of the EG’s decision. This summary is to be returned to the applicant along with the mail reviews of their proposal.
    (v) During the evaluation process NSERC should explain the basis of the calculation of the available budget for the present year. This disclosure should take place before the EG enters into its final rating process of the dossiers under consideration.
  2. Fairness.
    (i) The EG should make multiple passes through the proposals in the competition, avoiding the inequities that result from one-pass/serial rating methods. [3] We recommend that a full day be set aside for an overall review, after the detailed evaluations are made and also after NSERC has disclosed the approximate budget figures for the competition.
    (ii) We recommend that there be finer grading of the proposals (e.g. in the form of marks such as $J_- < J_0 < J_+$), at least in the critical bins in which it is possible that the funding cutoff will fall. Such a multiple-scale analysis will help in reaching sensible final budget outcomes, when difficult decisions have to be made by the EG, by allowing bins to be appropriately subdivided. [4] (See the sketch following these recommendations.)
    (iii) The full EG should be consulted in decisions on dollar amounts assigned to bins, early career researcher awards, and other major grant awards under the aegis of the EG.
    (iv) In fact many people in our community feel that the quality and impact of a grant proposal should be judged independently of the HQP element, with the latter better treated as a function of the cost of research.
  3. Professionalism.
    (i) Members of the Evaluation Group are professional scientists, and the exercise of their scientific knowledge and judgements made from their scientific experience needs to be treated as important information, and taken into account in the proposal rankings.
    (ii) During the EG meeting, its members should have the tools available to them to make important decisions about the quality of the research they are evaluating. This includes the ability to consult reference texts and any other professional material that will aid them in their work, which is to evaluate and compare research proposals as diligently as they can.
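
The following is a hedged sketch of the bin-subdivision idea in recommendation 2(ii); all counts and dollar values are hypothetical:

```python
# With sub-marks J+ > J0 > J-, the funding cutoff can fall inside bin J
# rather than forcing an all-or-nothing decision on the whole bin.
bin_j = {"J+": 14, "J0": 15, "J-": 9}  # hypothetical proposals per sub-mark
grant_k = 12                           # hypothetical $K per funded grant
budget_k = 300                         # hypothetical budget left for bin J

funded = 0
for mark in ("J+", "J0", "J-"):        # fund in decreasing order of merit
    n = min(bin_j[mark], budget_k // grant_k)
    funded += n
    budget_k -= n * grant_k

print(funded)  # 25 of the 38 bin-J proposals funded, rather than 0 or all 38
```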

Additional comments.

Points of particular sensitivity to the mathematics and statistics communities are the following.
(i) The EG should be sensitive to systematic rating differences between pure mathematics, applied mathematics, and particularly statistics. Recent experience indicates that the calibration process may not be as effective as it is intended to be, and this needs to be considered in discussion of budget decisions as suggested in 2(iii) above.
(ii) If going forward the NSERC adopts separate budgets for mathematics and statistics, the basis of forming these budgets should be treated openly and permit broad consultation. Careful attention will be needed with respect to proposals in probability and in mathematical finance, in particular.
(iii) The possibility of having separate budgets for mathematics and statistics has already raised the question in the community about whether or not it may be appropriate to have separate budgets for pure and for applied mathematics; any decision on this would require broad consultation.

 

Walter Craig FRSC

Chair, Mathematics – NSERC Liaison Committee

Professor of Mathematics and

Canada Research Chair

McMaster University

 

Nancy Reid FRSC

Chair, Long Range Planning Committee

University Professor of Statistics and

Canada Research Chair

University of Toronto


  1. E.g. this bin should not go below \$X.
  2. This is much as it is in practice in NSF Grant Review Panels.
  3. We recommend that each dossier be addressed at least three times: (1) a preliminary coarse-grained rating, (2) the detailed evaluation, and (3) a global comparison and adjustment of rankings. In fact the first step of this evaluation can already be made in the late fall term, at the point when the proposals are submitted and are ready for evaluation. The final step (3) is critical in making fair judgements on the dossiers, after EG members have acquired a global perspective of the competition as a whole, have made a detailed scrutiny of all of the dossiers, and understand the impact of the budget on the funding level.
  4. In the 2011 competition, the fact that bins could be neither reopened nor split, combined with the fact that the populations of the bins are normally largest in the middle of the distribution, led directly to budget decisions that resulted in anomalously low grant allocations for dossiers in the middle bins in pure mathematics and applied mathematics.

 

Filed Under: 2011

S. Fortier Reply

2011-11-22 By nmlcadmin2

Originally posted on June 30, 2011

NSERC President Suzanne Fortier has replied to the math/stats community’s public letter.

Here is a link to President Fortier’s letter in PDF format.

Filed Under: Archive

Analytical Overview

2011-11-22 By nmlcadmin2

Originally posted on May 27, 2011

Analytical overview of the three previous Evaluation Group 1508 meetings

Presentation at the May 20, 2011 conference call NSERC-Mathematics & Statistics Liaison Committee

Walter Craig
Department of Mathematics & Statistics
McMaster University

Introduction: The past three years of NSERC Discovery Grants have not been good ones for Canadian mathematics and statistics. Average Discovery Grant (DG) amounts in mathematics and statistics are \$20K, as opposed to the NSERC DG average of more than \$34K. The total budget allocated for mathematics and statistics Discovery Grants declines each year; indeed, over a five-year period this amount has decreased by 17.5%. [1] And many Discovery Grant award amounts do not make sense with respect to the international stature of the researcher. In general, the median grant does not support a research program that is reasonable and sustainable for an active high-quality researcher. [2]

On consultation with members of the Canadian mathematics and statistics community, there emerge three basic causes for this situation, pointing to actions that we must take:

(1) The Discovery Grant budget for Mathematics & Statistics (EG 1508) is insufficient. This has been recognized by the research community and NSERC alike; however, action to remedy the situation is forever put off to the future. For this I quote Shakespeare’s Macbeth, ‘…tomorrow and tomorrow and tomorrow…’, as the response to the problem. We need NSERC to show a good faith effort for action on this problem today.

(2) The Evaluation Group (EG) and its chairs and co-chairs are expected to play a leadership role in allocating research funding for the community, while being given no actual control over, nor even any prior knowledge of, the budget. The real role that they play at this point is that of a scapegoat (in French, le bouc émissaire).

(3) The dynamics of the EG decision-making process under the current bin-conference model relies upon too few parameters to make its subtle choices, and is excessively and rigidly controlled through rules imposed upon it by NSERC.

All three of these causes need to be studied and ameliorated. Fortunately, item (3) can be addressed in a straightforward manner, with a certain amount of thought, planning and good will. I will talk about this today. At the end of this presentation I will make several suggestions for changes, all of them implementable within the bin-conference model, none of which is too big. The result will be a more fair, more stable, and more transparent process of Discovery Grant allocation. Regarding item (3), the EG must make difficult decisions about comparing the quality of grant proposals and funding them, and must do so in as fair a way as possible, as do all grant evaluation committees. Under the current procedures, they are not given the tools to do so. This is the aspect that I am suggesting we change, and it can be accomplished with relatively minor modifications to the current methods.

Outline of the dynamics of decision making in the EG meetings:
Just as is the case with most other scientific grant awarding agencies, proposals are solicited and received. In mathematics and statistics there were about 274 this year (the numbers here and below are for the most part estimates, as the actual figures were not made available at the time of this meeting). And as is the case with each such grant competition, they fall into three categories.

  1. proposals from high quality researchers, which will be funded at some level (perhaps 20% of the proposals)
  2. proposals which, despite their possible merit, will most likely not be funded because of the competitive level of the process (perhaps another 20%)
  3. proposals which deserve to be funded, if the budget allows (the remaining middle 60% of the proposals)

This division, explicit or implicit, takes place in every grant reviewing panel that I have experienced. In my mind, the third ‘middle’ category corresponds to the NSERC bins H (VVS), I (VSS), J (SSS) and K (SSM). The funding cutoff will have to be placed somewhere within this large group of proposals.

The current procedures of the NSERC EGs: proposals are inspected serially (by conferences or sub-committees made up of the EG members) and rated (with three ratings, in fact), placing them in a ‘bin’. As it is known from the prior two competitions that bin J (SSS) is most likely to be the lowest bin that will be funded, while the budget is not known, it is very difficult for an EG member to give one of the middle proposals a rank lower than that for bin J. The factual evidence for this is that a full 33% of the successful grants in Math & Stats are in bin J; indeed, about 68% are in the combined bins H, I and J. In the 2011 Discovery Grant competition this exceeds most other EGs by quite a lot, as one can check from the NSERC data release at

http://www.nserc-crsng.gc.ca/doc/Funding-Financement/DGStat-SDStateng.pdf

As the EG members rank proposals, there is basically only one pass, with no construction of a global picture of their spectrum of quality. It is certainly realized that proposals should be judged fairly with respect to each other, and with the new bin system there is attention to a ‘calibration’ of the meaning of the different letter grades (EOVSMI). Truly problematic cases are allowed to be reinspected, but these are a small minority of the total. The hard decision, which is where to draw the grant cutoff line once the budget is revealed, lies somewhere between bins H and K, and given its size, probably in the middle of bin J. However, the rules are rigid in treating a bin as indivisible. In my opinion this is a very important point: the EG executive committee is forced to decide between the inclusion or the exclusion of all proposals that have been placed in this critical bin, leading to the depletion of the budget or, respectively, an unacceptably low success rate. This is precisely because the dynamics of the decision making process has resulted in so many proposals being allocated to bin J.

There are quite a few other (and larger) grant awarding agencies, and it is a good idea to discuss what procedures they have developed, and their sense of best practices. I will start with a comparison with the NSF of the United States. [3] At a meeting of an NSF grants panel, all proposals are visible to all panel members, and can be read and judged. A subcommittee is charged to read and recommend each proposal (4 people, a ‘reporter’ and three ‘readers’); however, the full panel plays a role in the decision. Each proposal is compared with all others. Indeed, they are labeled on post-its and placed on a white board at the front of the committee room, in three columns, where the middle one corresponds roughly to the bins H, I, J, and K. In rank ordering the proposals, the panel makes at least three passes through the whole list. And indeed the tough decisions are made on the proposals in the middle column; this is where the panel spends most of its discussion time. At the end of the panel meeting (normally lasting 3 days) a summary review of each proposal is written by the relevant subcommittee, and the panelists approve it. The global ranking system, the transparency of the judging decisions, and the summary review are all part of the accountability of the NSF process.

The biggest difference between NSERC’s present procedures and the time-honored ones of the NSF is that the latter offers a global picture of the spectrum of quality of the proposals. The current NSERC practice is weak precisely on this point. There is effectively only one pass through the proposal dossiers, and the internal quality of proposals within the same bin is not compared. If, however, the NSERC Evaluation Group members were able to construct a global picture before proposals were rated into bins, the difficult decisions could be made. This would allow them to avoid the situation in which bin J contained approximately 58 proposals (as it did this year) while the funding decision rules required either all or none of them to be awarded a grant.

The funding agency that I know second best is the CNRS-administered ANR grants of France; I served on this panel last year. There were two weeks of meetings, one in February and one in April, and in all there were five passes through all of the proposals, in which all panelists participated. On questions of conflict of interest, Canada may have quite a few, but France has very many, due to the prominence of the École Normale Supérieure and the École Polytechnique; yet they are able to handle them in a relatively graceful way. As far as accountability goes, it is important in these sessions that the reader, that is the subcommittee chair, write and sign each proposal’s summary review.

There is one further comparison to make, this time something close to every university instructor, because we all have to mark final exams and give grades in large sections of freshman courses. The time-honored ‘best practice’ is to assign each problem of the final a number of points, calibrated to its difficulty or importance. As the problems are graded, points are tallied for each exam script. A fair amount of effort goes into calibrating the standards of grading between different graders and between scripts marked earlier and later in the grading process. A global distribution is tabulated, often graphically. Only then are the grades A, B, C, D, F assigned. Finally, marks are reviewed; any mark that does not make sense with respect to the known qualities of a given student is revisited. This procedure is considered the best way to mark a course exam fairly. Without this procedure, or something similar, an instructor of a large course knows that he or she will get into difficulties. Grading exams serially as A, B, …, F, with no backward looks nor comparisons against the global distribution, would definitely lead to injustices.
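
As a hedged illustration of this distribution-first approach (the grade cutoffs below are arbitrary, chosen only for the example):

```python
def assign_grades(scores, cuts=(0.15, 0.35, 0.65, 0.85)):
    # Rank all scripts first (the global distribution), then map each
    # script's rank quantile to a letter grade, rather than grading serially.
    order = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    grades = [None] * len(scores)
    for rank, i in enumerate(order):
        q = rank / len(scores)                     # quantile of this script
        grades[i] = "ABCDF"[sum(q >= c for c in cuts)]
    return grades

print(assign_grades([92, 81, 77, 70, 64, 58, 40]))
# ['A', 'A', 'B', 'C', 'C', 'D', 'F']
```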

Demands for action:

Regarding item (3) on EG procedures, we must make changes to the decision-making process so that grant awards are fair and make sense. It is a procedural question that can be worked out, as long as good will is exhibited by all sides. It should contain the following elements:

  • EG members should be given a global picture of the proposals in the competition, which should be ranked according to global comparisons.
  • Rules governing proposal rating should be transparent, flexible, and make scientific sense.
  • The ranking system must require that EG members and NSERC staff be accountable for their decisions. In particular, a summary review of the panel’s decisions should be forthcoming for every proposal.

Regarding item (2), which has to do with EG chairs and co-chairs, we have started to work on suggestions of members for next year’s committee, in correspondence with Patricia Rouillard of NSERC. We are ready to give more input to this process and further suggestions for new EG members. We appreciate that NSERC is consulting us, and we are willing to respond; more is needed.

Regarding item (1), having to do with the Mathematics & Statistics EG 1508 budget:

  • Returning grants to EG 1508 amount to about \$3,828K, while we have estimated the outgoing budget of this year’s DG competition at about \$3,334K; this is a difference of almost \$500K, just about enough to restore the mathematics bins to values comparable to the 2010 DG competition. NSERC should do this immediately.
  • We of the Canadian mathematics and statistics community need to have accountability from NSERC for its budget decisions regarding EG 1508. In particular, we need to know the origin of such deep budget cuts as have occurred this year, and why they were not signaled to the community nor to the Math & Stats EG during the main phase of the rating of our proposals.
  • The separation of the mathematics budget from the statistics budget is controversial and divisive. Recognizing the deep level of commonality in mathematics and statistics, this should not have been done, and we should not be placed in our current position of two grant award values for the same quality standards, one for mathematics and another for statistics.

Respectfully,

Walter Craig

Chair, Mathematics – NSERC Liaison Committee


  1. Coupled with inflation of between 10% and 15% over five years, this means that NSERC is spending over 25% less on mathematics and statistics in real terms (worked out below).
  2. Bruce Gaulin, chair EG 1505.
  3. In response to a query as to whether any other people present had served on grants panels of other agencies, only Rachel Kuske had done so. Those present included all members of the executive committee of the EG 1508.
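
For note 1, the real-terms arithmetic works out roughly as follows (taking 12.5% as the midpoint of the quoted five-year inflation range; the exact figure depends on the inflation estimate used):

$$1 - \frac{1 - 0.175}{1.125} \approx 1 - 0.733 \approx 0.27,$$

i.e. a real decrease of about 27%, consistent with the “over 25%” stated above.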

 

Filed Under: Archive

EG Letter to S. Fortier

2011-11-22 By nmlcadmin2

Originally posted on May 19, 2011

The Math-NSERC Liaison Committee has received the following letter from 16 members of the EG 1508:

May 19, 2011

Dr. Suzanne Fortier
President, NSERC

Dear Dr. Fortier,

We would like to draw your attention to the distressing results of the 2011 Discovery Grants Program in Mathematics, a competition in which all the signatories of this letter played a role, as members of the Evaluation Group for Mathematics and Statistics (1508). We have strong reasons to believe that the outcome of the competition in 2011 is not consistent with NSERC’s clearly stated goal to have a “process that is responsive to merit and is objective, transparent and fair”, as quoted from your recent presentation of the competition summary.

We detail below each aspect that seems seriously problematic in the hope and expectation that NSERC will rapidly correct the situation for the applicants from the recent competition of 2011 or, at the very least, ensure that these issues are adequately resolved in time for next year’s competition.

Lack of fairness: The members of the Evaluation Group (hereafter, the EG) have been deeply concerned by the fact that there are huge funding differences for comparable files in the 2010 and 2011 competitions. Indeed, funding for the same bin in 2010 and 2011 dropped by as much as 40% ($12,000, in this particular case), even though we, as members of the EG, agree that rating standards for the two successive years were not significantly different. Researchers expect to be judged on the merit of their file when compared to the whole community, not only to the applicants in the same competition. Variations such as those observed between 2010 and 2011 are highly unfair.

Responsiveness to merit: In addition to the bin funding discrepancies between 2010 and 2011, we have observed that funding differences between successive bins (from \$1,000 between bins I and J, to \$9,000 between bins D and E) are extraordinarily large and appear, frankly, arbitrary from the perspective of merit. Researchers with remarkable achievements and outstanding potential are provided with insufficient funds to develop and carry out their research projects as a consequence of this extremely unbalanced funding model. As committee members, we were asked to use the full scale in our ratings, and we did so, but with the expectation that scientific merit would ultimately be rewarded with proportional grant amounts. The bin values selected by NSERC distort the rating scale in such a drastic fashion that the lengthy and rigorous evaluation process that took place was rendered almost meaningless.

Transparency: The 2011 competition and its aftermath also raise questions about the transparency of the present system, in which critical funding choices are exclusively the purview of the Executive committee of the EG and NSERC staff; we feel that Isabelle Blain’s letter to the applicants hardly addresses any of the concerns expressed by the EG members and the community.

The Executive of our EG was placed in the very difficult situation of having to choose between two unpalatable and unfair scenarios: a huge drop in success rates from one year to the next (from 62% to under 45%), or drastically reducing bin values in mid-range bins. Throughout this process as well as after it, for confidentiality reasons, they were unable to consult the rest of the EG or explain the whole situation to them in detail.

One example of how the general lack of openness affected the process and its aftermath is related to the decision made by the Executive and NSERC staff to have different funding levels per bin for mathematics and for statistics, even though the two subjects share the same EG. The difference in funding levels was swiftly explained in Ms. Blain’s letter by claiming a different interpretation of the merit criteria. However, there was no comparison of files with the same rankings in mathematics and in statistics during or after the end of the competition to justify such a broad claim – in fact, bins cannot even be opened after the end of the competition. After noticing a markedly different bin distribution in mathematics and statistics, the Executive decided to split the budgets between these two groups, a decision justified by the fact that, historically, these two subjects have been separated from one another in the funding process, and that the respective subgroups of the EG had worked in complete separation in this evaluation exercise as well. As a consequence, the bin values in statistics became higher than in mathematics and reached levels comparable to those in 2010. As the confidentiality principle did not allow the Executive to explain to the EG, not to mention consult with its members on, the full details of the various scenarios involved, this led to considerable unease in the EG and beyond.

Another example of lack of transparency is apparent in the fact that Ms. Blain’s letter hardly makes any attempt to justify grant funding differences between 2010 and 2011. Her letter indicates that one of the priorities in the current year was to fund bin J. This is commendable and in keeping with the wishes of our community, as we understand them. But it does not explain why this led to hugely different results from 2010, when the same criteria were applied and the same bin J was funded, with a similar overall success rate. We are distressed that her letter did not refer to any of the other variables entirely in NSERC’s hands and involved in this outcome: first and foremost the budget, the values of the lowest funded and the top bins, and the gradations between funding amounts for different bins.

As members of the EG, we were involved in a process that is of great importance to our scientific community and, more generally, for the advancement of research and innovation in Canada. We have made considerable efforts to implement the new grant distribution system in ways consistent with the highest scientific standards, in a context made particularly difficult by the chronic and increasingly severe underfunding of the mathematical and statistical sciences when compared with other NSERC-supported disciplines. We established our ratings with the best of our scientific expertise and we naturally expected this to be reflected accurately in terms of budgetary allocations. This was far from being the case.

As reviewers, we were highly impressed with the quality of the 2011 applicants. Canada, through NSERC, has supported many of them in past years and can take justifiable pride in their achievements. Moreover, the Discovery Grants Program is essential for the progress of science in Canada and, for the outright majority of applicants reviewed by the EG, is the only available source of significant research funding. Unfortunately, the 2011 Discovery Grants Program in Mathematics failed many of the most promising of these applicants. As a consequence, our confidence in the program, as currently administered, is regrettably shaken. We urge you to take rapid corrective action so that the 2012 competition will not suffer from the same problems.

Yours Sincerely,

Mike Bennett (UBC)
Nantel Bergeron (York)
Lia Bronsard (McMaster)
Thomas Brüstle (Sherbrooke and Bishop’s)
Olivier Collin (UQAM)
Benoit Collins (Ottawa)
Octav Cornea (Montreal)
Alan Dow (North Carolina, Charlotte)
Hermann Eberl (Guelph)
Christopher Godsil (Waterloo)
Eyal Goren (McGill)
Robert McCann (Toronto)
Matthias Neufang (Carleton and Fields)
John Stockie (Simon Fraser)
Holger Teismann (Acadia)
Xiaoqiang Zhao (Memorial University)

cc: Ms. Isabelle Blain, Vice-President, Research Grants & Scholarships; Ms. Madeleine Bastien, Team Leader, NSERC Evaluation Group 1508; Mathematics-NSERC Liaison Committee.

Filed Under: Archive

CRC

2011-11-22 By nmlcadmin2

Originally posted on May 8, 2011

The following Canada Research Chairs have signed the 2011 public statement about NSERC Discovery Grants.

  1. Alejandro Adem, UBC
  2. Heinz Bauschke, UBC Okanagan
  3. Nantel Bergeron, York (member 2011 Evaluation Group)
  4. Vladimir Chernousov, UAlberta
  5. Hugh Chipman, Acadia
  6. Walter Craig, McMaster
  7. Hermann Eberl, Guelph (member 2011 Evaluation Group)
  8. George Elliott, Toronto
  9. Jim Geelen, Waterloo
  10. Andrew Granville, U. Montreal
  11. Pengfei Guan, McGill
  12. Adrian Iovita, Concordia
  13. Steve Kudla, Toronto
  14. Rachel A. Kuske, UBC
  15. Francois Lalonde, U. Montreal
  16. Mark Lewis, UAlberta
  17. Edwin Perkins, UBC
  18. Ian Putnam, Victoria
  19. Anthony Quas, Victoria
  20. Sujatha Ramdorai, UBC
  21. Nancy Reid, Toronto
  22. Alexander Shnirelman, Concordia
  23. Patrick Speissegger, McMaster
  24. Stevo Todorcevic, Toronto
  25. Nicole Tomczak-Jaegermann, UAlberta
  26. Bálint Virág, Toronto
  27. Jianhong Wu, York

To suggest additions or edits to this list, please send an email message to: nmlcadmin@nmlc.math.ca.

Filed Under: Archive

FRSC

2011-11-22 By nmlcadmin2

Originally posted on May 8, 2011

The following Fellows of the Royal Society of Canada have signed the 2011 public statement about NSERC Discovery Grants.


  1. Jim Arthur
  2. Martin Barlow
  3. Ed Bierstone
  4. Tom Bloom
  5. David Boyd
  6. Walter Craig
  7. Ken Davidson
  8. Henri Darmon
  9. Ivar Ekeland
  10. George Elliott
  11. Joel Feldman
  12. Donald Fraser
  13. John Friedlander
  14. Nassif Ghoussoub
  15. Andrew Granville
  16. Pengfei Guan
  17. Victor Ivrii
  18. Lisa Jeffrey
  19. Niky Kamran
  20. Francois Lalonde
  21. Jerry Lawless
  22. Eckhard Meinrenken
  23. Pierre Milman
  24. Kumar Murty
  25. Ram Murty
  26. Edwin Perkins
  27. Ian Putnam
  28. Nancy Reid
  29. Gordon Semenoff
  30. I. Michael Sigal
  31. Gordon Slade
  32. Cameron Stewart
  33. Mary Thompson
  34. Nicole Tomczak-Jaegermann
  35. Alfred Weiss

To suggest additions or edits to this list, please send an email message to: nmlcadmin@nmlc.math.ca.

Filed Under: Archive

EC Statement

2011-11-22 By nmlcadmin2

Originally posted on May 6, 2011

Statement from the Executive Committee of the Mathematics & Statistics Evaluation Group (EG 1508):

Mathematics & Statistics Evaluation Group 1508, Executive Committee

The Executive Committee is part of, stands with, and advocates to NSERC for the mathematical and statistical community. It had the extremely difficult task of choosing between different unsatisfactory outcomes. The following principles guided the Executive and NSERC during budget deliberations:

  • maintaining success rates in response to the extensive feedback received from the community after the 2010 competition;
  • ensuring researchers with strong evaluations across the board receive non-zero funding, again based on extensive feedback received in the past from the community;
  • equitable treatment of all sections;
  • keeping exceptional researchers funded at internationally competitive levels.

Trying to manage all of the above under the ever-increasing pressure on our budget led to very hard decisions. The final scenario provides funding to the maximum possible number of deserving researchers within the above constraints.

While the Executive must abstain from signing the open letter due to the obvious conflict of interest, we will address the issues in a separate communication to NSERC.

Filed Under: Archive

Canadian Mathematics Community Statement about NSERC Discovery Grants

2011-04-26 By nmlcadmin

The following public statement is endorsed by the signatories appearing beneath it.

The signatories include 35 Fellows of the Royal Society of Canada and 27 Canada Research Chairs.

Over the past decade Canada has enriched its world-class research community in the mathematical and statistical sciences through strategic recruitment by Universities aided by the Canada Research Chairs Program, and by the prospect of a supportive and relatively stable Canadian research funding environment. NSERC’s Discovery Grants program plays a continuing vital role as these scientists advance their research programs and train the next generation of innovators.

However, recent changes to the Discovery Grants Program threaten this investment in mathematical and statistical talent, and could precipitate a reversal of all the gains that Canada made in the last decade. We are therefore calling on the Minister of Industry and on NSERC’s President to:

  • Rebuild confidence in the merit-based peer review system for funding research through NSERC Discovery Grants.
  • Reconfirm Canada’s long-term commitment to fund research in the mathematical and statistical sciences at internationally competitive levels.

The mathematical and statistical sciences community calls upon the NSERC leadership to proceed promptly with the following specific actions:

1. As stressed in the NSERC peer review manual, research grant review and subsequent funding decisions should be equitable and fair, considered from year to year and across Canada’s diverse university research environments. We therefore request that an equalizing bin-to-funding map be applied to the anomalous 2011 results in mathematics and statistics in order to restore fairness relative to the 2009 and 2010 results. This requires a one-time-only investment and should be announced as soon as possible to signal Canada’s commitment to fairness and long-term investment in the mathematical sciences.
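
A minimal sketch of what such an equalizing bin-to-funding map could look like (all dollar values here are hypothetical placeholders, not actual bin levels):

```python
bins_2010 = {"H": 27, "I": 24, "J": 22}  # hypothetical 2010 bin values ($K)
bins_2011 = {"H": 21, "I": 16, "J": 15}  # hypothetical 2011 bin values ($K)

# One-time top-up restoring each 2011 award to its 2010 bin level.
top_up = {b: max(0, bins_2010[b] - bins_2011[b]) for b in bins_2010}
print(top_up)  # {'H': 6, 'I': 8, 'J': 7} -- per-grant correction ($K)
```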

2. NSERC should recognize that its investment in mathematical and statistical research is not at internationally competitive levels and make appropriate changes:

  • There is inadequate funding for a mathematical or statistical scientist in Canada to run a standard research program.
  • There is an inequity between research funding in the mathematical sciences (including computer sciences) and other NSERC-supported disciplines. A scientist in Canada has an average Discovery Grant that exceeds \$34K/year; a mathematical scientist has an average Discovery Grant of less than \$20K/year. (These grants are mostly used in the same way by all sciences, to fund the training of junior scholars.)
  • In pure mathematics, the Discovery Grant Program is the only research resource available, and pure mathematicians are less able to leverage their Discovery Grants to obtain other grants than scientists in other disciplines.

3. A large segment of the research community is losing confidence in the system for assigning grant awards presently used in NSERC’s Discovery Grants competition.

  • The peer review process, translating proposal evaluation into funding amounts, should be reconstructed to ensure fairness.
  • The scientific community should be intimately involved in choosing high caliber scientists to serve on evaluation and granting committees.
  • Scientists should be actively involved in setting the research investment strategy at every step from proposal evaluation through the assignment of dollar amounts on proposals, not just in the assignment of scientists into bins.
  • Research grants should be funded with amounts sufficient to carry out the proposed studies.

We therefore request an immediate review of the newly implemented review system by an independent international panel to consider its impact on Canadian mathematical sciences.

Finally, we call for a robust national strategy to stabilize the research infrastructure and investment necessary for basic research and scientific innovation so as to safeguard Canada’s past investment in recruiting talented scientists.

Signatories:

  1. Alejandro Adem, UBC, CRC
  2. Martial Agueh, Victoria
  3. Amir Akbary, Lethbridge
  4. Walter Allegretto, UAlberta
  5. S. Twareque Ali, Concordia
  6. Stephen Anco, Brock
  7. Razvan Anisca, Lakehead
  8. Richard Anstee, UBC
  9. Vestislav Apostolov, UQAM
  10. Tom Archibald, SFU
  11. Jim Arthur, Toronto, FRSC
  12. Tom Baird, Memorial
  13. Catharine Baker, Mount Allison
  14. Martin Barlow, UBC, FRSC, FRS
  15. Michael Barr, McGill
  16. Heinz Bauschke, UBC Okanagan, CRC
  17. Margaret Beattie, Mount Allison
  18. Kai Behrend, UBC
  19. Mike Bennett, UBC (member 2011 Evaluation Group)
  20. François Bergeron, UQAM, CRC
  21. Nantel Bergeron, York (member 2011 Evaluation Group)
  22. Ed Bierstone, Toronto and Fields Institute, FRSC
  23. Ilia Binder, Toronto
  24. Ted Bisztriczky, Calgary
  25. Tom Bloom, Toronto, FRSC
  26. George Bluman, UBC
  27. Richard Blute, Ottawa
  28. Hans Boden, McMaster
  29. Anthony Bonato, Ryerson
  30. Chris Bose, Victoria
  31. Christian Boudreau, Waterloo
  32. David Boyd, UBC, FRSC
  33. Steven Boyer, UQAM
  34. Fred Brauer, UBC
  35. Elena Braverman, Calgary
  36. Murray R. Bremner, Saskatchewan
  37. Berndt Brenken, Calgary
  38. Nils Bruin, SFU
  39. Carmen Bruni, UBC
  40. Thomas Brüstle, Sherbrooke (member 2011 Evaluation Group)
  41. Ragnar-Olaf Buchweitz, Toronto
  42. Ryan Budney, Victoria
  43. Almut Burchard, Toronto
  44. Sue Ann Campbell, Waterloo
  45. James B. Carrell, UBC
  46. Bill Casselman, UBC
  47. Dong Eui Chang, Waterloo
  48. Hugo Chapdelaine, Laval
  49. Albert Chau, UBC
  50. Cedric Chauvet, SFU
  51. Jingyi Chen, UBC
  52. Vladimir Chernousov, UAlberta, CRC
  53. Hugh Chipman, Acadia
  54. Stephen Choi, SFU
  55. Yemon Choi, Saskatchewan
  56. Tahir Choulli, UAlberta
  57. Dan Christensen, Western
  58. Gerald Cliff, UAlberta
  59. Bernard Colin, Sherbrooke
  60. James Colliander, Toronto
  61. Olivier Collin, UQAM (member 2011 Evaluation Group)
  62. Octav Cornea, U. Montreal (member 2011 Evaluation Group)
  63. Helene Cossette, Laval
  64. Walter Craig, McMaster, FRSC, CRC
  65. Bill Cunningham, Waterloo
  66. Clifton Cunningham, Calgary
  67. James Currie, Winnipeg
  68. Henri Darmon, McGill, FRSC
  69. Ken Davidson, Waterloo, FRSC
  70. Robert Dawson, St. Mary’s
  71. Jean-Marie De Koninck, Laval
  72. Alex de Leon, Calgary
  73. Gerda de Vries, Alberta
  74. Andrew P. Dean, Lakehead
  75. Charmaine Dean, SFU
  76. Dejan Delic, Ryerson
  77. Florin Diacu, Victoria
  78. Charles Doran, UAlberta
  79. Alan Dow, UNCC (member 2011 Evaluation Group)
  80. Danny Dyer, Memorial
  81. Hermann Eberl, Guelph, CRC (member 2011 Evaluation Group)
  82. Charles Edmunds, MSVU
  83. Roderick Edwards, Victoria
  84. Ivar Ekeland, UBC, FRSC
  85. George Elliott, Toronto, FRSC,
  86. Heath Emerson, Victoria
  87. Mike Evans, Toronto
  88. Ilijas Farah, York
  89. Joel Feldman, UBC, FRSC
  90. Shui Feng, McMaster
  91. Sebastian Ferrando, Ryerson
  92. Razvan Fetecau, SFU
  93. Ailana Fraser, UBC
  94. Don Fraser, Toronto, FRSC
  95. John Friedlander, Toronto, FRSC
  96. Marlene Frigon, U. Montreal
  97. Richard Froese, UBC
  98. Ricardo Fukasawa, Waterloo
  99. Jose Garrido, Concordia
  100. Paul Gauthier, U. Montreal
  101. Jim Geelen, Waterloo, CRC
  102. Anthony V. Geramita, Queens
  103. Nassif Ghoussoub, UBC, FRSC
  104. Alison Gibbs, Toronto
  105. Thierry Giordano, Ottawa
  106. Michael Goldstein, Toronto
  107. Edit Gombay, UAlberta
  108. Edgar Goodaire, Memorial
  109. Pawel Gora, Concordia
  110. Eyal Goren, McGill (member 2011 Evaluation Group)
  111. Shonda Gosselin, Winnipeg & Regina
  112. Colin C. Graham, UBC
  113. Ian Graham, Toronto
  114. Andrew Granville, U. Montreal, CRC
  115. Justin Gray, SFU
  116. Priscilla Greenwood, UBC
  117. Marco Gualtieri, Toronto
  118. Pengfei Guan, McGill, FRSC, CRC
  119. Stephen Gustafson, UBC
  120. Lucien Haddad, Royal Military College
  121. Ian Hambleton, McMaster
  122. Megumi Harada, McMaster
  123. Nadia Hardy, Concordia
  124. Kathryn Hare, Waterloo
  125. Warren Hare, UBC Okanagan
  126. Bradd Hart, McMaster
  127. Deirdre Haskell, McMaster
  128. Wenqing He, Western
  129. Florian Herzig, IAS and Toronto
  130. Thomas Hillen, UAlberta
  131. Christophe Hohlweg, UQAM
  132. Monica Ilie, Lakehead
  133. Reinhard Illner, Victoria
  134. Adrian Iovita, Concordia, CRC
  135. Victor Ivrii, Toronto, FRSC
  136. David M. Jackson, Waterloo
  137. Ken Jackson, Toronto
  138. Sebastian Jaimungal, Toronto
  139. Dmitry Jakobson, McGill
  140. Jeannette Janssen, Dalhousie
  141. Lisa Jeffrey, Toronto, FRSC
  142. Robert Jerrard, Toronto
  143. Barry Jessup, Ottawa
  144. Tomasz Kaczynski, Sherbrooke
  145. Vadim Kaimanovich, Ottawa
  146. Niky Kamran, McGill, FRSC
  147. Spiro Karigiannis, Waterloo
  148. Yael Karshon, Toronto
  149. Lee Keener, UNBC
  150. Nicholas Kevlahan, McMaster
  151. Kostya Khanin, Toronto
  152. Askold Khovanskii, Toronto
  153. Omar Kihel, Brock
  154. Young-Heon Kim, UBC
  155. Hershy Kisilevsky, Concordia
  156. Robert Klinzmann, UBC
  157. Jochen Koenemann, Waterloo
  158. Manfred Kolster, McMaster
  159. Dmitry Korotkin, Concordia
  160. Michael Kozdron, Regina
  161. David Kribs, Guelph
  162. Lilia Krivodonova, Waterloo
  163. Mary-Catherine Kropinski, SFU
  164. Steve Kudla, Toronto, CRC
  165. Wentang Kuo, Waterloo
  166. Rachel Kuske, UBC, CRC
  167. Jochen Kuttler, Alberta
  168. Marcelo Laca, Victoria
  169. Claude Laflamme, Calgary
  170. Matilde Lalín, U. Montreal
  171. Francois Lalonde, U. Montreal, FRSC, CRC
  172. Kevin Lamb, Waterloo
  173. Michael Lamoureux, Calgary
  174. William Langford, Guelph
  175. Anthony Lau, Alberta (also representing Canadian Mathematical Society)
  176. Michael Lau, Laval
  177. Jerry Lawless, Waterloo, FRSC
  178. Victor LeBlanc, Ottawa
  179. Jerome Lefebvre, UBC
  180. Christiane Lemieux, Waterloo
  181. Ghislain Léveillé, Laval
  182. Claude Levesque, Laval
  183. Andrew D. Lewis, Queens
  184. Mark Lewis, UAlberta, CRC
  185. Sheldon Lin, Toronto (member 2011 Evaluation Group)
  186. Yu-Ru Liu, Waterloo
  187. Frithjof Lutscher, Ottawa
  188. Gordon MacDonald, PEI
  189. Neal Madras, York
  190. Rogemar Mamon, Western
  191. Brian Marcus, UBC
  192. Greg Martin, UBC
  193. Javad Mashregi, Laval
  194. David E. Matthews, Waterloo
  195. Robert McCann, Toronto (member 2011 Evaluation Group)
  196. Connell McCluskey, Laurier
  197. David McKinnon, Waterloo
  198. Don McLeish, Waterloo
  199. Eckhard Meinrenken, Toronto, FRSC
  200. Franklin Mendivil, Acadia
  201. Paul Mezo, Carleton University
  202. Pierre Milman, Toronto, FRSC
  203. Ján Mináč, Western
  204. Maung Min-oo, McMaster
  205. Jamie Mingo, Queen’s
  206. Richard A. Mollin, Calgary
  207. Michael Monagan, SFU
  208. Rahim Moosa, Waterloo
  209. Ruxandra Moraru, Waterloo
  210. Joy Morris, Lethbridge
  211. Fiona Murnaghan, Toronto
  212. Kumar Murty, Toronto, FRSC
  213. Ram Murty, Queens, FRSC
  214. Kieka Mynhardt, Victoria
  215. Alex Nabutovsky, Toronto
  216. Adrian Nachman, Toronto
  217. Erhard Neher, Ottawa
  218. Zhuang Niu, Memorial
  219. Ortrud Oellermann, Winnipeg
  220. Pablo Oliveras, Ryerson
  221. B. Doug Park, Waterloo
  222. Anthony Peirce, Toronto
  223. Edwin Perkins, UBC, FRSC, CRC
  224. Alexandra Pettet, UBC
  225. John Phillips, Victoria
  226. Arturo Pianzola, U. Alberta
  227. Chris Pickles, Queens
  228. Martin Pinsonnault, Western
  229. Iosif Polterovich, U. Montreal
  230. David Poole, Trent
  231. Dorette Pronk, Dalhousie
  232. Bartosz Protas, McMaster
  233. Mary Pugh, Toronto
  234. Kevin Purbhoo, Waterloo
  235. Ian Putnam, Victoria, FRSC, CRC
  236. Anthony Quas, Victoria, CRC
  237. Jeremy Quastel, Toronto
  238. Chris Radford, Memorial
  239. Sujatha Ramdorai, UBC, CRC
  240. Robert Raphael, Concordia
  241. Andrew Rechnitzer, UBC
  242. Nancy Reid, Toronto, FRSC, CRC
  243. Jean-Francois Renaud, UQAM
  244. Bruce Richter, Waterloo
  245. Louis-Paul Rivest, Laval
  246. Dale Rolfsen, UBC
  247. Mike Roth, Queens
  248. Regina Rotman, Toronto
  249. Christiane Rousseau, U. Montreal
  250. Damien Roy, Ottawa
  251. Volker Runde, Alberta
  252. Peter Russell, McGill
  253. Hassan Safouhi, UAlberta
  254. Yvan Saint-Aubin, U. Montreal
  255. Mateja Sajna, Ottawa
  256. Tom Salisbury, York
  257. Hadi Salmasian, Ottawa
  258. Bill Sands, Calgary
  259. Manuele Santoprete, Laurier
  260. David Saunders, Waterloo
  261. Alistair Savage, Ottawa
  262. Nabil Sayari, Moncton
  263. Renate Scheidler, Calgary
  264. Dana Schlomiuk, U. Montreal
  265. David Scollnik, Calgary
  266. Matthew Scott, Waterloo
  267. Jonathan P. Seldin, Lethbridge
  268. Paul Selick, Toronto
  269. Gordon Semenoff, UBC, FRSC
  270. Karen Seyffarth, Calgary
  271. Vasilisa Shramchenko, Sherbrooke
  272. Alexander Shnirelman, Concordia, CRC
  273. Israel Michael Sigal, Toronto, FRSC
  274. Gordon Slade, UBC, FRSC
  275. Christopher G. Small, Waterloo
  276. Gregory G. Smith, Queen’s
  277. Robert Smith, Ottawa
  278. Gary Sneddon, Memorial –> MSVU
  279. Jozsef Solymosi, UBC
  280. Juan Souto, UBC
  281. Artur P. Sowa, Saskatchewan
  282. Blair Spearman, UBC Okanagan
  283. Roland Speicher, Queen’s
  284. Patrick Speissegger, McMaster, CRC
  285. Nico Spronk, Waterloo
  286. Muni Srivastava, Toronto
  287. Alina Stancu, Concordia
  288. Juris Steprans, York
  289. Brett Stevens, Carleton
  290. Cameron Stewart, Waterloo, FRSC
  291. Anna Stokke, Winnipeg
  292. Ross Stokke, Winnipeg
  293. Catherine Sulem, Toronto
  294. Wei Sun, Concordia
  295. Ken Seng Tan, Waterloo
  296. Keith Taylor, Dalhousie
  297. Hugh Thomas, UNB
  298. Mary Thompson, Waterloo, FRSC
  299. Aaron Tikuisis, Toronto
  300. James G. Timourian, Alberta
  301. Stevo Todorcevic, Toronto, CRC
  302. Nicole Tomczak-Jaegermann, UAlberta, FRSC, CRC
  303. Vladimir Troitsky, Alberta
  304. Manfred Trummer, SFU
  305. TaiPeng Tsai, UBC
  306. Murat Tuncali, Nipissing
  307. Adam Van Tuyl, Lakehead
  308. Stephanie van Willigenburg, UBC
  309. Yevgeniy Vasilyev, Memorial
  310. Bálint Virág, Toronto, CRC
  311. Roman Viveros-Aguilera, McMaster
  312. Edward Vrscay, Waterloo
  313. McKenzie Wang, McMaster
  314. Shawn X. Wang, UBC Okanagan
  315. Michael Ward, UBC
  316. David Wehlau, Queens
  317. Alfred Weiss, UAlberta, FRSC
  318. William Weiss, Toronto
  319. Walter Whiteley, York
  320. Ross Willard, Waterloo
  321. Hugh Williams, Calgary
  322. Gord Willmot, Waterloo
  323. Shelly Wismath, Lethbridge
  324. Gail Wolkowicz, McMaster
  325. Henry Wolkowicz, Waterloo
  326. Douglas G. Woolford, Laurier
  327. Jianhong Wu, York, CRC
  328. Michael Yampolsky, Toronto
  329. Vlad Yaskin, UAlberta
  330. Grace Yi, Waterloo
  331. Ozgur Yilmaz, UBC
  332. Noriko Yui, Queens
  333. Imed Zaguia, Royal Military College
  334. Kirill Zainoulline, Ottawa
  335. Ping Zhou, SFXU
  336. Zhou Zhou, Toronto

Additional signatories are listed on a separate page.

Filed Under: 2011 Tagged With: Discovery Grants, NSERC

© Copyright 2011-2012. Math NSERC Liaison Committee.