Pre-publication planning and proofing

Authors
Mary-Alice Doyle
Summary

This resource is intended for research teams who have drafted a paper with results from a randomized evaluation and are preparing to submit to a journal or publish a working paper.

Randomized evaluations may have many stakeholders, each of whom may have requirements for how results are presented or for what must happen before results are published.

In this resource, we provide two template documents that can be used in planning for publication. The first is a pre-submission plan, used to list stakeholders’ requirements and other best practices for projects that are nearing completion. The second is a quality assurance and proofing checklist. Both are used by staff at J-PAL North America to prepare a paper for publication.

Introduction

Randomized evaluations may have many stakeholders. Before publishing a working paper or submitting a manuscript to a journal, it can be helpful to map out each of these stakeholders and their requirements.1 Doing so at least a month in advance of submitting to a journal or publishing a working paper can help to avoid unanticipated delays.

Mapping stakeholders

Stakeholders may include:

  • Research partners

  • Funders

  • Institutional Review Boards (IRBs)

  • Data providers

  • Target journal

  • Media

  • Project staff

Assign a member of the research team to reread all data use agreements (DUAs), memoranda of understanding (MOUs), IRB documents, grants, and other agreements, and to list any formal requirements these documents contain relating to the end of the project. This is also a good time to think about actions the research team can take to maintain good relationships with partners or other stakeholders, even where there is no formal requirement to do so.

Pre-publication plans can look different for different projects. Below is an example of one used by J-PAL North America to map out formal requirements and other good practices prior to publication. Following that, we provide an example of a quality assurance and proofing checklist used by research staff at J-PAL North America.  

Pre-submission planning—an example

Below is an example adapted from a pre-submission plan used by J-PAL North America staff in a recent journal submission. Requirements specific to that project’s data providers, IRBs, funders and target journal have been replaced by questions to guide research teams through reading the relevant documents.

Most important tasks

  • Share draft manuscript with partners and, if different, data providers
  • Obtain approval from the partners to use any of their figures or images contained in the paper
  • Report results to funders
  • Complete quality assurance and fact-checking

Details

Data providers and Institutional Review Boards (IRBs)

  • Is there a review period for data providers? Many DUAs stipulate that data providers must have a chance (often 30 days) to review any papers, presentations or use of their name prior to publication or submission for publication.
  • Do the DUA and IRB allow researchers to publish data alongside the results paper? If so, are there any conditions?
  • Does the DUA or IRB specify that data must be deleted or returned to the provider?
  • Does the IRB require researchers to notify them when a paper is published?

If there are multiple data providers, a table like this one might help to summarize the key details:

Table 1. Data providers – key details

Data provider | Review period required? | Does the DUA specify how data should be cited? | Can we publish de-identified data? | Data destruction/return instructions
Example: Provider A | Yes (30 days) | Yes; must state that results do not reflect the opinions of Provider A. | Unclear from DUA; check with provider. | Destroy after 5 years, except if stored in a data registry.

 

Research partner

Even if there are not formal requirements, walking partners through the analysis can help them to understand and act on any findings, and is important in maintaining good relationships.

  • Present results to the partner.2
  • Distribute a plain language summary of results to the partner’s staff (if desired by the partner).
  • Is there a Memorandum of Understanding (MOU) requiring the research team to provide papers and presentations to the partner for review prior to submission or publication? Even if there is no formal requirement, it may still be a good idea to build in time for partners to review the draft paper prior to submission.
  • Seek approval to use the name of the partner and figures or images from the partner in the paper.

Funders

  • Do funders specify that they must be notified of research findings or of publication within a certain time period?
  • Does the grant specify any suggestion or requirement to publish data and code for replication?
  • Is there a requirement to make findings public within a certain time period?
  • Do funders specify guidelines around how they should be acknowledged in a manuscript?

Target journal

  • Check the target journal’s embargo rules: is it OK to release a working paper?3
  • Does the journal have other requirements for how the results are presented and discussed?4

Media

Are the results likely to attract media attention, or is the Principal Investigator (PI) interested in obtaining media coverage? If so:

  • Talk to the research partner about communications strategy.

  • It can be helpful to find a media outlet interested in ‘breaking’ the story, to whom the research team can provide advance notice of the results and their release date. However, note that some journals (e.g. certain medical journals) do not allow researchers to share results with the media outside of the formal embargo period.

  • Research teams can consult with their university’s media office, or with J-PAL staff if the PIs are within the J-PAL network.5

  • Consider preparing a press release or writing a blog post.

Other good practices

  • Complete a thorough fact-check and proofread of the paper (see checklist below).

  • Prepare a public use dataset and code.

  • Consider submitting data and code for re-analysis by J-PAL.

  • Update registries to notify of study completion (e.g., the AEA RCT Registry, ClinicalTrials.gov, OSF).

  • Complete internal documentation (e.g. project logs, internal memos and reflections on lessons learned).

  • Ensure that project staff, data providers and any academic or policy experts who were involved in developing the project are listed in the acknowledgements.

  • Send a ‘thank you’ email to former Research Assistants, project staff and staff at partner organizations who contributed.

Quality assurance and proofing checklist

Below is a checklist used by J-PAL North America staff in a recent journal submission. The tasks and instructions are intended for research assistants or other project staff. Time estimates and priority levels will vary depending on the project.

Instructions

Please complete each task for all documents (i.e., main text, tables and figures, paper appendix, and online appendix). This is a guide only. If you notice issues that should be addressed but are not listed here, please add them to the list.

You can complete each task in the way that works best for you (e.g., print a copy to mark up manually, or use track changes or color coding in an electronic version). Please add your initials in the table next to each item when it is complete; this document will serve as a record that each task has been completed. When you are finished, please notify the rest of the research team.

If you identify things that need to be edited or corrected, please do so directly in the draft text/tables/graphs. If they are substantial changes, notify the rest of the team. Please also record Y/N and any relevant details under the ‘Are edits needed?’ column below. The final output from this process will be a single marked-up electronic version of the paper and accompanying materials. PIs will then review and accept/reject changes.

Tasks

Review code

Time estimate: two people, 16 hours each

Conduct a final audit of the code base to ensure that every code file that generates numbers, tables or variables used in analysis has been reviewed. This time estimate assumes that review of code has been ongoing throughout the project. Below are some examples of specific items to review, but please tailor this list to the specifics of your project.

Table 2. Review code

Task | Are edits needed? | Responsible individual | Complete? (Initials)
Review for accuracy
1 Ensure that code runs to completion if you move it to a different working directory.      
2 Create a list of each code file to be reviewed. For each file, read through and test it. Don’t forget to:      
   - Check that any data/other code files called from this file are the most up-to-date versions.
   - Check definitions of each variable used in analysis.
   - Check that you agree with the treatment of missing values.
3 Check output from reviewed code against each table and figure in the draft paper. Ensure that the final code replicates:
   - Number of observations
   - For summary statistics: mean, standard deviation, etc.
   - For regression results: point estimate, standard error, statistical significance, confidence intervals
Review for readability
1 Are comments in the code accurate? Update or delete those that are no longer true.      
2 Is there commented-out code that should be deleted?      
3 Will somebody else be able to follow the code?  If not, consider changing formatting, separating the code into several files, and writing ‘readme’ files to ensure clarity.      
4 If one does not exist already, generate a data dictionary and/or a codebook.      
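A first draft of the data dictionary in task 4 can be seeded programmatically and then annotated by hand. Below is a minimal Python sketch using only the standard library; the dataset, column names, and missing-value codes are hypothetical, and real projects will want richer summaries (value labels, units, source files) written from the survey instruments or DUAs.

```python
import csv
import io

def build_data_dictionary(csv_text):
    """Summarize each column of a CSV: count of non-missing values
    and a few example values. A starting point only; variable
    descriptions still need to be written by hand.
    """
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    dictionary = {}
    for column in rows[0].keys():
        # Treat empty strings and "NA" as missing (adjust to your data's codes).
        values = [r[column] for r in rows if r[column] not in ("", "NA")]
        dictionary[column] = {
            "n_nonmissing": len(values),
            "examples": sorted(set(values))[:3],
        }
    return dictionary

# Example with a tiny inline (hypothetical) dataset:
sample = "id,age,treated\n1,34,1\n2,NA,0\n3,29,1\n"
for variable, info in build_data_dictionary(sample).items():
    print(variable, info)
```

The output can be pasted into a shared document so that whoever wrote each variable's construction code can fill in its description.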

Journal requirements

Time estimate: one person, 10 hours

List the requirements for the target journal and check them off. Often journals specify requirements around reporting of pre-analysis plans and other details specific to randomized evaluations; as a general priority, ensure that deviations from a pre-analysis plan (if one exists) are noted or justified in the text.

Table 3. Journal requirements

Task | Are edits needed? | Responsible individual | Complete? (Initials)
1 Example: Is everything that is not in the pre-analysis plan marked with a ^ (and is everything without ^ in the pre-analysis plan)?      
2 Example: Create a list of analyses in the pre-analysis plan that are not in the text or appendices.      

 

Check tables, graphs, and figures

Time estimate: one person, 5 hours

Table 4. Check tables, graphs, and figures

Task | Are edits needed? | Responsible individual | Complete? (Initials)
1 Check spelling in graph titles, axes, footnotes, variable labels, etc.
2 Check table, graph and figure titles for clarity and consistency.      
3 Check colors—will it be possible to interpret the graphs or figures if printed in black and white, and for readers who are colorblind?      
4 Check that tables and graphs used in the paper are the latest versions.      
5 If graphs or figures include data points that are also reported in tables, are they consistent?      
6 Check table notes to ensure that they are informative and up to date. If not stated elsewhere, ensure that they include details such as: which dataset is used, what type of standard errors are used, whether regressions were weighted, which control variables were used, etc.      
7 Consistency of tables—check that each table that describes the same category of information is formatted similarly (e.g. regression type and standard error type reported in subtitle or table notes, consistent rounding, etc.).      
8 Check spelling and grammar in table notes.      

 

Accuracy in text

Time estimate: two people, 16 hours each

Table 5. Accuracy in text

Task | Are edits needed? | Responsible individual | Complete? (Initials)
1 Highlight/underline each fact, number or assertion made in the text.      
2 For each highlighted fact, satisfy yourself that it can be backed up by a valid external reference or by tables/figures in the main paper or appendix.      
3 If a fact comes from our dataset but does not appear in a table or figure, ensure that it is documented in the code.      
4 If a fact comes from the implementing partner, make sure this is documented (e.g. save email/call notes in shared drive).      
5 For each citation, check that the source unambiguously says the thing that it is cited for.      
6 Check that statements made in the paper about definitions of samples and variables are consistent with code.      
7 Check any equations—Are they correct? Is each element defined? Do the sub/superscripts make sense?      

 

Check reference list

Time estimate: one person, 5 hours

Table 6. Check reference list

Task | Are edits needed? | Responsible individual | Complete? (Initials)
1 All in-text citations match the reference list.      
2 Every item in the reference list is cited in text. If not, remove it from the reference list.
3 Is each reference accurate? (Check spelling of authors’ names, accuracy of name of journal, journal volume etc.)      
4 Can each reference be found (e.g. in an online search) by a reader?      
5 If working papers are cited, have they been updated or published elsewhere?      
6 In-text references and the reference list follow a consistent style.      
7 Does the reference style conform with the journal’s style guide?      

 

Proofreading

Time estimate: two people, 16 hours each

At least one person should do a full read through of the ‘final’ clean copy before sending it to PIs for sign-off.

Table 7. Proofreading

Task | Are edits needed? | Responsible individual | Complete? (Initials)
1 Check for and fix spelling and grammar errors in text.      
2 Look out for unclear wording—suggest edits to clarify.      
3 Check commonly confused terms, such as percent vs. percentage points.      
4 Check that titles of sections are clear and follow a consistent format. Suggest edits if needed.      
5 Check that acronyms and abbreviations are spelled out in their first use, and used consistently thereafter. Edit if needed.      
6 Check that variable names, notation and concepts are used consistently. The same term should not be used to refer to multiple concepts. If multiple terms are used to refer to a single concept, make sure this is clear. Edit if needed.      
7 Ensure that each figure and table is referred to at least once in the text. If not, add a reference to it in the text or remove it.      
8 Check that each table/figure is referred to accurately in the text. E.g. does each in-text reference to ‘Table 1’ actually refer to Table 1 and not Table 2?      
9 Check for inconsistent numbering. E.g., if Appendix Table 1 is followed by Table A2, edit the numbering system to ensure consistency.
10 Check that material in appendices appears in the same order as it is referenced in the paper.
11 Check that names or other terms are used and spelled consistently in acknowledgments and throughout the text (e.g., Health Care vs. Healthcare).

 

Acknowledgments

The quality assurance and proofing checklist was originally developed based on the Paper Production wiki page in Professors Matthew Gentzkow and Jesse Shapiro’s Research Assistant manual.

Thanks to Maya Duru, Laura Feeney, Sarah Kopper and Vincent Quan for insights and advice. Caroline Garau copyedited this document. This work was made possible by support from the Alfred P. Sloan Foundation and Arnold Ventures. Any errors are our own.   

1.
While drafting a document or manuscript, J-PAL staff find it helpful to add comment bubbles to in-text references, citing the exact location (e.g., chapter, page number) and including the relevant sentence from the source document. Doing this work upfront reduces the time involved in quality assurance and fact-checking later on.
2.
J-PAL’s resource on Communicating with a partner about results includes further considerations and guidance related to creating a communications strategy.
3.
See ‘Embargo Policies’ in Communicating with a partner about results.
4.
For example, medical journals may require authors to submit a Consolidated Standard of Reporting Trials (CONSORT) checklist alongside a manuscript.
5.
Taylor & Francis provides a guide for researchers on engaging with the media.
Additional Resources

Gentzkow, Matthew and Jesse Shapiro. 2018. “Paper Production.” RA-manual wiki. Accessed June 7, 2019. https://github.com/gslab-econ/ra-manual/wiki/Paper-Production.

 

Taylor & Francis Group. 2019. “Engaging with the media: Reaching beyond academia.” Author Services. Accessed June 6, 2019. https://authorservices.taylorandfrancis.com/engaging-with-the-media/#.