Christie Rowe, McGill University
The quest for funding: it is a constant concern, often a source of unease. This is a unique aspect of academic life – we have the opportunity, and the burden, of funding our own research programs. Some of us raise all or part of our own salaries, as well as those of our coworkers. The stress can be compounded by the feeling of not knowing how reviewers might perceive a proposal, or not knowing how to directly pitch the value of one’s proposed research to the goals of a funding program or current trends.
Raising funds to support my students and their (sometimes costly) fieldwork is a continuous responsibility. This obligation is balanced by the chance to pursue my own research interests and the opportunity to choose my preferred field sites and investigate the scientific questions that interest me. But, when I’m worried about my ability to come up with funds, or I spend nights and weekends working on a proposal that is eventually rejected, the whole system can be very discouraging. I worry that I won’t be able to support the students who are counting on me and provide the opportunities and research experiences I think they should have. Everyone I know in this business – even those who are unambiguously successful – has suffered disappointment and lean years when it comes to research funding.
In the last five years, I’ve been applicant, reviewer, and an occasional member of evaluation panels for the national funding agencies in three countries where I’ve lived and worked as an academic structural geologist. South Africa, Canada, and the USA have somewhat different approaches toward funding research in Structural Geology & Tectonics. In addition, the funding landscape is evolving in each country with changes to policy and budget. For me, starting out as a new faculty member (twice) brought a slew of challenges. One of the most difficult was to try to make sense of the different funding programs and identify opportunities. The subtleties of tone and pitch that make a proposal “sound like a winner” can vary for different contexts, and it’s also important to have a sense of the overall priorities built into the funding system policies.
Over time, the perceived importance of various aspects of grant proposals necessarily evolves – and different countries assign different weight to these aspects. The relative weighting of novelty, innovativeness, researcher track record, contributions to community, and return on investment reflects the priorities of funding programs. A well-pitched proposal must appeal to the priorities of reviewers and panel evaluators. A proposal that might fare well in one context could fail in another.
One benefit of knowing this is largely emotional – I find it easier to face rejection when I can reflect on the fact that rejection represents a failure to satisfy one set of expectations, not necessarily a universal critique, and that no single set of reviewers holds a monopoly on judging the value of my ideas or past accomplishments. The flip side is that, without changing my research direction to follow trends, it is still possible to spin a proposal toward an emphasis that feels more cutting-edge, provided I understand what the trends are within the local research community. Learning how to pitch my research program in field-based structural geology in terms of the interests of three different nations (i.e. in terms of regional tectonics, ore deposits, or earthquake mechanics) has helped me diversify my research portfolio and find ways to continue to do what I love.
The other potential value of comparing different countries’ approaches to funding in my field is that I might someday have the opportunity to affect the direction in which policy evolves. The philosophies built into funding policies have far-reaching effects – they may stimulate student training, reward industry partnerships or discourage them, support the formation of large research groups under one PI, or allow the distribution of money across many smaller groups and individual PIs. The culture and structure of academia are strongly affected by the values imposed by funding policies. So I think it’s important to consider what other countries prioritize, how they develop policies to encourage or enforce certain approaches, and how the domestic community responds.
There are two factors that I think really set the three countries’ approaches apart:
1. Track Record or Proposed Work?
Funding programs necessarily weight the relative importance of two factors:
• Track record of the researcher – in terms of publication productivity and impact, and past success in mentoring.
• Proposed work – originality and potential impact of the science, innovativeness of new ideas and methods, and plans for student training and knowledge dissemination. Programs that prioritize the proposed work award funds according to the requested budget.
In South Africa and Canada, researchers’ track records are explicitly prioritized. South African researchers can apply for a Rating. If successful, the researcher will then receive a fixed amount of money each year, depending on their Rating Group. Ratings are based on several metrics, some subjective, like reviewers’ estimation of “impact in the field”. The Canadian Discovery Grant program also awards a fixed sum each year and places highest priority on past accomplishments (assessment is based 1/3 on past contributions to research, 1/3 on past mentoring of students and postdocs, and 1/3 on the research plan).
Prioritizing the track record of the researcher rewards past success and acknowledges that someone has proven their ability to make discoveries and pursue research ideas to completion. It also recognizes that it is often difficult to project research developments years into the future. Funding that is not tethered to a particular project therefore helps with planning for students, while enabling changes of focus along the way.
In the US, proposals that describe and budget very specific projects are submitted to the National Science Foundation (NSF). These proposals are evaluated on Intellectual Merit and Broader Impacts (where Broader Impacts include societal impact of the research, public outreach, improvements to teaching, and training of students and postdocs). There is no explicit instruction to take into account the past successes of the researchers, but these factors are integrated into the assessment of the research viability and potential for success, and especially the Broader Impacts and mentoring plans.
Early-career researchers with a shorter track record might be at a disadvantage in a CV-focused funding program. In Canada this inequity is partially addressed by boosting the awards for superstar young applicants, but broadly, funding increases with seniority. In South Africa, separate pots of money are set aside for applications from different Rating Groups so that competition is only within a group – effectively separating researchers of different backgrounds. In both countries, a researcher must make do with what they are awarded until they apply for renewal (typically on 5-year cycles).
South Africa and Canada have created programs of endowed prestige Research Chairs. The Research Chair positions are allocated by research administration to universities that lobby for them. In both countries, the Research Chairs programs are sometimes viewed as taking funds and allocating them to candidates selected by political processes rather than by open competition and peer review (CAUT, 2013).
Holding a 5-year fixed grant as a researcher in Canada gives me a certain amount of stability, but I think a proposal-centric system stimulates innovation more aggressively. The Canadian Discovery Grant program (mentioned above) currently funds about half of the applicants in the earth sciences (compared with about 15% for NSF Tectonics).
I think an ideal system would be a hybrid: a baseline funding program (perhaps with a minimum award equivalent to about the cost of supporting one student, with a very high success rate), combined with a proposal-based call for collaborative research (larger funds but lower success rates). This might stimulate primary productivity and student training, but also allow the high-powered groups to grow faster. Both Canada and South Africa have made significant changes to their research funding structures in the last few years, and have moved in the direction of hybrid systems, although they vary substantially in detail.
2. Industry and International Collaborations
Canada encourages collaborations between industry and academia by matching money from industry donations. There is a program under which a researcher can apply for a few thousand dollars for travel and meetings to try to set up industry partnerships, with no commitment from the industry partner. This emphasis reflects a desire to develop “benefits to Canada” through research funds, but it has generated controversy recently due to the perception that these programs take funds out of academic competition and funnel them into industry R&D (cf. Colliander, 2012–2013).
Neither South Africa nor the US provides much federal funding directly to stimulate industry collaboration. In the US, the petroleum industry sometimes supports individual schools or programs, but there is little widespread opportunity for funding of individual researchers or students. In South Africa, the mining and energy industries fund direct grants to individual researchers and also award scholarships to students, tied to employment contracts. While this enables many students who would not otherwise be able to attend university to earn science degrees, it also prohibits them from continuing to graduate studies until they have satisfied their employment contracts. These scholarships usually target black students – and effectively siphon off many promising future scientists who might otherwise have been among the still very small number of aspiring black South African academics in the earth sciences.
South Africa has also been very proactive in establishing bilateral research agreements with other countries. In some cases this involves transfer of funds from a partner to South Africa, some involve matching funds from cross-border teams, and sometimes South African funds are spread to researchers in the surrounding region. These programs bring South African students in contact with the international research communities in their fields. They also facilitate collaborative research around regional problems.
Bilateral and multilateral opportunities for Canadian or American researchers also exist, and my adopted province of Québec has several agreements of its own. There is no specific federal US–Canada joint research agreement open to the earth sciences, in spite of an abundance of active collaborations across the border. The Discovery Grant funds are sufficiently unencumbered that I have used them for collaboration with colleagues in the US, which is helpful because my US colleagues cannot list me as a co-applicant on proposals in most NSF programs. Given that so many Canadians work closely with US colleagues, I wish there were a formal way to propose bilaterally funded research between Canada and the US.
I don’t like the feeling that students are obligated to industry, or that they can only choose from the potential advisors with the most grant money when choosing their direction in science. Ideally, I would like to see all three countries put a higher priority on freestanding graduate student and postdoctoral fellowships. This might allow the funding agencies to directly create opportunities for increasing diversity and supporting low-income students’ entry into graduate studies – instead of waiting for industry to do it (e.g. South Africa), or doing it only indirectly by encouraging PIs to consider reaching out to under-represented groups as one of many potential “broader impacts” (e.g. the US).
Universally good proposals?
Being able to compare the three systems has left me with one useful insight: the priorities of funding programs may vary, but there are essential elements of good proposals that are universal. A good proposal should be easy and fun to read! After reviewing way too many this year, I’ve put together a few notes on what makes the best ones stand out in my memory. It didn’t seem to matter whether they were five or 25 pages; the difference in length could be ascribed to the difference in level of detail.
• Big Issues – To be interesting (/novel /transformative /innovative), a good proposal promises something substantially new. However, it’s highly unusual that a single proposal can promise to complete a massive scientific leap, so the Big Issues must be broken down clearly.
• Organic design – where the work plan directly follows from the Big Issues. I’ve both written and reviewed proposals of the form: “I want to go back to my PhD research area, so here’s a weakly linked question which would probably be better investigated elsewhere.” The same goes for techniques, methods, instruments – anything we get very invested in and want to keep using over time.
• Question-driven work plan – Specific investigations, which will answer specific questions, which will cumulatively build understanding toward (or eliminate hypotheses for) the Big Issues. The linkages from the individual tasks, all the way back to the Big Issues, need to be spelled out succinctly and specifically. Likely turning points where ideas can be falsified or supported by evidence should be highlighted so that it’s clear that a path to Big Answers is possible.
• Bold but not Brazen – I have reviewed more than one proposal which raised potentially contentious issues but gave equal lip service to different sides of the controversy, such that I wondered if the authors were being careful not to offend potential reviewers from either side. This can make a proposal sound flimsy or timid, especially from an early-career researcher. On the flip side, I’ve seen researchers propose to continue the same work they’ve been pursuing for decades, using the same techniques and testing the same hypotheses, and still insist the research is novel, timely, and terribly important. Other researchers outline the Big Issues and promise to solve them all (skipping over the links to specific data to be collected). This is pretty unconvincing.
By my own criteria, I’ve written some good proposals and some rather pathetic ones. Really novel, interesting ideas are everywhere – but without a well-defined, clearly expressed vision, they don’t turn into great proposals. My style of writing proposals has evolved as I’ve learned more over the last few years – instead of laboriously and slowly writing, I spend days or weeks reading, scribbling notes on post-its, and phoning other researchers (people I know and people I have never met) to ask follow-up questions after reading their papers. Once the idea crystallizes, writing the actual proposal can happen in a couple of days. If anyone knows a magic way to be sure that these couple of days occur at a comfortable margin before a grant deadline, I would certainly appreciate it if you could leave this information in the comments.
References:
Canadian Association of University Teachers (CAUT) Bulletin, 2013, Editorial: CAUT launches campaign to overhaul federal science policy, v. 60, n. 5, May 5, 2013.
Colliander, J., 2012–2013, Several blog posts on the issue of NSERC funding for industry via academic collaborations: http://blog.math.toronto.edu/colliand/category/administration/nserc/
_________________________________________________________________
Christie Rowe is an assistant professor and Wares Faculty Scholar in Economic Geology at McGill University in Montréal, Canada. If you want to learn about her and her students’ past and current research in the geology and rheology of faults, check out her website. This essay benefited from conversations with, and suggestions from, Rick Allmendinger, Steve Harlan, Laurent Godin, Chris Harris, Jodie Miller, Jamie Kirkpatrick, Heather Savage, and Eric Rowe, but all errors and opinions are entirely mine.