Research methodology vs. research methods
The research methodology, or design, is the overall strategy and rationale you use to carry out your research, whereas research methods are the specific tools and processes you use to gather and analyze the data needed to test your hypothesis.
To further understand research methodology, let’s explore some examples:
a. Qualitative research methodology example: A study exploring the impact of author branding on author popularity might utilize in-depth interviews to gather personal experiences and perspectives.
b. Quantitative research methodology example: A research project investigating the effects of a book promotion technique on book sales could employ a statistical analysis of profit margins and sales before and after the implementation of the method.
c. Mixed-Methods research methodology example: A study examining the relationship between social media use and academic performance might combine both qualitative and quantitative approaches. It could include surveys to quantitatively assess the frequency of social media usage and its correlation with grades, alongside focus groups or interviews to qualitatively explore students’ perceptions and experiences regarding how social media affects their study habits and academic engagement.
These examples show how methodology guides the research process, from data collection to analysis, ensuring the study’s objectives are met efficiently.
When it comes to writing your study, the methodology in research papers or a dissertation plays a pivotal role. A well-crafted methodology section of a research paper or thesis not only enhances the credibility of your research but also provides a roadmap for others to replicate or build upon your work.
Wondering how to write the research methodology section? Follow these steps to create a strong methods chapter:
At the start of a research paper, you will have provided the background of your research and stated your hypothesis or research problem. In this section, you elaborate on your research strategy.
Begin by restating your research question and proceed to explain what type of research you opted for to test it. Depending on your research, here are some questions you can consider:
a. Did you use qualitative or quantitative data to test the hypothesis?
b. Did you perform an experiment where you collected data, or are you writing a descriptive/theoretical dissertation without data collection?
c. Did you collect primary data yourself, or did you analyze secondary or existing data as part of your study?
These questions will help you establish the rationale for your study on a broader level, which you will follow by elaborating on the specific methods you used to collect and understand your data.
Now that you have told your reader what type of research you’ve undertaken for the dissertation, it’s time to dig into specifics. State what specific methods you used and explain the conditions and variables involved. Explain what the theoretical framework behind the method was, what samples you used for testing it, and what tools and materials you used to collect the data.
Once you have explained the data collection process, explain how you analyzed and studied the data. Here, your focus is simply to explain the methods of analysis rather than the results of the study.
Here are some questions you can answer at this stage:
a. What tools or software did you use to analyze your results?
b. What parameters or variables did you consider while understanding and studying the data you’ve collected?
c. Was your analysis based on a theoretical framework?
Your mode of analysis will change depending on whether you used a quantitative or qualitative research methodology in your study. If you’re working within the hard sciences or physical sciences, you are likely to use a quantitative research methodology (relying on numbers and hard data). If you’re doing a qualitative study, in the social sciences or humanities, your analysis may rely on understanding language and socio-political contexts around your topic. This is why it’s important to establish what kind of study you’re undertaking at the onset.
Now that you have gone through your research process in detail, you’ll also have to make a case for it. Justify your choice of methodology and methods, explaining why it is the best choice for your research question. This is especially important if you have chosen an unconventional approach or are studying an existing research problem from a different perspective. Compare it with other methodologies, especially those attempted by previous researchers, and discuss the contributions your methodology makes.
No matter how thorough a methodology is, it doesn’t come without hurdles. These are a natural part of scientific research, and documenting them ensures that your peers and future researchers are aware of them. It also tells your evaluator that you actively worked to overcome the pitfalls that came your way and refined the research process accordingly.
1. Remember who you are writing for. Keeping sight of the reader/evaluator will help you decide what to elaborate on and what information they are already likely to have. You’re condensing months’ worth of research into just a few pages, so omit basic definitions and information about general phenomena readers already know.
2. Do not give an overly elaborate explanation of every single condition in your study.
3. Skip details and findings irrelevant to the results.
4. Cite references that back your claim and choice of methodology.
5. Consistently emphasize the relationship between your research question and the methodology you adopted to study it.
To sum it up, what is methodology in research? It’s the blueprint of your research, essential for ensuring that your study is systematic, rigorous, and credible. Whether your focus is on qualitative research methodology, quantitative research methodology, or a combination of both, understanding and clearly defining your methodology is key to the success of your research.
Once you write the research methodology and complete the entire research paper, the next step is to edit your paper. As experts in research paper editing and proofreading services, we’d love to help you perfect your paper!
From successful product launches or software releases to planning major business decisions, research reports serve many vital functions. They can summarize evidence and deliver insights and recommendations to save companies time and resources. They can reveal the most value-adding actions a company should take.
However, poorly constructed reports can have the opposite effect! Taking the time to learn established research-reporting rules and approaches will equip you with in-demand skills. You’ll be able to capture and communicate information applicable to numerous situations and industries, adding another string to your bow.
A research report is a collection of contextual data, gathered through organized research, that provides new insights into a particular challenge (which, for this article, is business-related). Research reports are a time-tested method for distilling large amounts of data into a narrow band of focus.
Their effectiveness often hinges on whether the report provides:
Strong, well-researched evidence
Comprehensive analysis
Well-considered conclusions and recommendations
Though the topic possibilities are endless, an effective research report keeps a laser-like focus on the specific questions or objectives the researcher believes are key to achieving success. Many research reports begin as research proposals, which usually include the need for a report to capture the findings of the study and recommend a course of action.
A research proposal will typically specify:
A description of the research method used, e.g., qualitative, quantitative, or other
The planned statistical analysis
The type of research, such as:
Causal (or explanatory) research (i.e., research identifying relationships between two variables)
Inductive research, also known as ‘theory-building’
Deductive research, such as that used to test theories
Action research, where the research is actively used to drive change
Research reports can unify and direct a company's focus toward the most appropriate strategic action. Of course, spending resources on a report takes up some of the company's human and financial resources. Choosing when a report is called for is a matter of judgment and experience.
Some development models used heavily in the engineering world, such as Waterfall development, are notorious for over-relying on research reports. With Waterfall development, there is a linear progression through each step of a project, and each stage is precisely documented and reported on before moving to the next.
The pace of the business world is faster than the speed at which your authors can produce and disseminate reports. So how do companies strike the right balance between creating and acting on research reports?
The answer lies, again, in the report's defined objectives. By paring down your most pressing interests and those of your stakeholders, your research and reporting skills will be the lenses that keep your company's priorities in constant focus.
Honing your company's primary objectives can save significant amounts of time and align research and reporting efforts with ever-greater precision.
Some examples of well-designed research objectives are:
Determining whether a product or service meets customer expectations
Demonstrating the value of a service, product, or business process to your stakeholders and investors
Improving business decision-making when faced with a lack of time or other constraints
Clarifying the relationship between a critical cause and effect for problematic business processes
Prioritizing the development of a backlog of products or product features
Comparing business or production strategies
Evaluating past decisions and predicting future outcomes
Research reports generally require a research design phase, where the report author(s) determine the most important elements the report must contain.
Just as there are various kinds of research, there are many types of reports.
Here are the standard elements of almost any research-reporting format:
Report summary. A broad but comprehensive overview of what readers will learn in the full report. Summaries are usually no more than one or two paragraphs and address all key elements of the report. Think of the key takeaways your primary stakeholders will want to know if they don’t have time to read the full document.
Introduction. Include a brief background of the topic, the type of research, and the research sample. Consider the primary goal of the report, who is most affected, and how far along the company is in meeting its objectives.
Methods. A description of how the researcher carried out data collection, analysis, and final interpretations of the data. Include the reasons for choosing a particular method. The methods section should strike a balance between clearly presenting the approach taken to gather data and discussing how it is designed to achieve the report's objectives.
Data analysis. This section contains interpretations that lead readers through the results relevant to the report's thesis. If there were unexpected results, include here a discussion on why that might be. Charts, calculations, statistics, and other supporting information also belong here (or, if lengthy, as an appendix). This should be the most detailed section of the research report, with references for further study. Present the information in a logical order, whether chronologically or in order of importance to the report's objectives.
Conclusion. This should be written with sound reasoning, often containing useful recommendations. The conclusion must be backed by a continuous thread of logic throughout the report.
With a clear outline and robust pool of research, a research paper can start to write itself, but what's a good way to start a research report?
Research report examples are often the quickest way to gain inspiration for your report. Look for the types of research reports most relevant to your industry and consider which makes the most sense for your data and goals.
The research report outline will help you organize the elements of your report. One of the most time-tested report outlines is the IMRaD structure:
Introduction
Methods
Results
...and Discussion
Pay close attention to the most well-established research reporting format in your industry, and consider your tone and language from your audience's perspective. Learn the key terms inside and out; incorrect jargon could easily harm the perceived authority of your research paper.
Along with a foundation in high-quality research and razor-sharp analysis, the most effective research reports will also demonstrate well-developed:
Internal logic
Narrative flow
Conclusions and recommendations
Readability, striking a balance between simple phrasing and technical insight
The validity of research data is critical. Because the research phase usually occurs well before the writing phase, you normally have plenty of time to vet your data.
However, some research reports involve ongoing research, where report authors (sometimes the researchers themselves) write portions of the report as the work progresses.
One such research-report example would be an R&D department that knows its primary stakeholders are eager to learn about a lengthy work in progress and any potentially important outcomes.
However you choose to manage the research and reporting, your data must meet robust quality standards before you can rely on it. Vet any research with the following questions in mind:
Does it use statistically valid analysis methods?
Do the researchers clearly explain their research, analysis, and sampling methods?
Did the researchers provide any caveats or advice on how to interpret their data?
Have you gathered the data yourself or were you in close contact with those who did?
Is the source biased?
Usually, flawed research methods become more apparent the further you get through a research report.
It's perfectly natural for good research to raise new questions, but the reader should have no uncertainty about what the data represents. There should be no doubt about matters such as:
Whether the sampling or analysis methods were based on sound and consistent logic
What the research samples are and where they came from
The accuracy of any statistical functions or equations
Validation of testing and measuring processes
A robust design validation process is often a gold standard in highly technical research reports. Design validation ensures the objects of a study are measured accurately, which lends more weight to your report and makes it valuable to more specialized industries.
Product development and engineering projects are the most common research-report examples that typically involve a design validation process. Depending on the scope and complexity of your research, you might face additional steps to validate your data and research procedures.
If you’re including design validation in the report (or report proposal), explain and justify your data-collection processes. Good design validation builds greater trust in a research report and lends more weight to its conclusions.
Just as the quality of your report depends on properly validated research, a useful conclusion requires the most contextually relevant analysis method. This means comparing different statistical methods and choosing the one that makes the most sense for your research.
Most broadly, research analysis comes down to quantitative or qualitative methods (respectively: measurable by a number vs subjectively qualified values). There are also mixed research methods, which bridge the need for merging hard data with qualified assessments and still reach a cohesive set of conclusions.
Some of the most common analysis methods in research reports include:
Significance testing (aka hypothesis testing), which compares test and control groups to determine how likely it is that the observed data arose by random chance.
Regression analysis, to establish relationships between variables, control for extraneous variables, and support correlation analysis.
Correlation analysis (aka bivariate testing), a method to identify and determine the strength of linear relationships between variables. It’s effective for detecting patterns in complex data, but take care not to confuse correlation with causation.
With any analysis method, it's important to justify which method you chose in the report. You should also provide estimates of the statistical accuracy (e.g., the p-value or confidence level of quantifiable data) of any data analysis.
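As a rough sketch of the kind of statistical accuracy estimates described above, the snippet below computes a t-test p-value (significance testing) and a Pearson correlation coefficient. All datasets and variable names here are hypothetical, and the example assumes SciPy is available:

```python
# Sketch: quantifying statistical accuracy for a report's data analysis.
# The datasets below are illustrative only, not from any real study.
from scipy import stats

# Significance testing: compare a test group against a control group.
control = [4.1, 3.8, 4.0, 4.3, 3.9, 4.2]
test = [4.6, 4.9, 4.7, 5.0, 4.8, 4.5]
t_stat, p_value = stats.ttest_ind(test, control)

# Correlation analysis: strength of a linear relationship between variables.
ad_spend = [10, 20, 30, 40, 50]   # hypothetical predictor
sales = [12, 24, 33, 39, 52]      # hypothetical outcome
r, r_p_value = stats.pearsonr(ad_spend, sales)

print(f"t-test p-value: {p_value:.4f}")  # small p => unlikely due to chance
print(f"Pearson r: {r:.3f}")             # near 1 => strong positive relationship
```

Reporting both the statistic and its p-value or confidence level, as here, lets readers judge how much weight the analysis can bear.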
This requires a commitment to the report's primary aim. For instance, this may be achieving a certain level of customer satisfaction by analyzing the cause and effect of changes to how service is delivered. Even better, use statistical analysis to calculate which change is most positively correlated with improved levels of customer satisfaction.
There's endless good advice for writing effective research reports, and it almost all depends on the subjective aims of the people behind the report. Due to the wide variety of research reports, the best tips will be unique to each author's purpose.
Consider the following research report tips in any order, and take note of the ones most relevant to you:
No matter how in-depth or detailed your report might be, provide a well-considered, succinct summary. At the very least, give your readers a quick and effective way to get up to speed.
Pare down your target audience (e.g., other researchers, employees, laypersons), and adjust your voice for their background knowledge and interest levels.
For all but the most open-ended research, clarify your objectives, both for yourself and within the report.
Leverage your team members’ talents to fill in any knowledge gaps you might have. Your team is only as good as the sum of its parts.
Justify why your research proposal’s topic will endure long enough to derive value from the finished report.
Consolidate all research and analysis functions onto a single user-friendly platform. There's no reason to settle for less than developer-grade tools suitable for non-developers.
The research-reporting format is how the report is structured—a framework the authors use to organize their data, conclusions, arguments, and recommendations. The format heavily determines how the report's outline develops, because the format dictates the overall structure and order of information (based on the report's goals and research objectives).
A good report outline gives form and substance to the report's objectives, presenting the results in a readable, engaging way. For any research-report format, the outline should create momentum along a chain of logic that builds up to a conclusion or interpretation.
There are several key differences between research reports and essays:
Research report:
Ordered into separate sections
More commercial in nature
Often includes infographics
Heavily descriptive
More self-referential
Usually provides recommendations
Research essay:
Does not rely on research report formatting
More academically minded
Normally text-only
Less detailed
Omits discussion of methods
Usually non-prescriptive
BMC Medical Research Methodology volume 20 , Article number: 226 ( 2020 ) Cite this article
Methodological studies – studies that evaluate the design, analysis or reporting of other research-related reports – play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste.
We provide an overview of some of the key aspects of methodological studies such as what they are, and when, how and why they are done. We adopt a “frequently asked questions” format to facilitate reading this paper and provide multiple examples to help guide researchers interested in conducting methodological studies. Some of the topics addressed include: is it necessary to publish a study protocol? How to select relevant research reports and databases for a methodological study? What approaches to data extraction and statistical analysis should be considered when conducting a methodological study? What are potential threats to validity and is there a way to appraise the quality of methodological studies?
Appropriate reflection and application of basic principles of epidemiology and biostatistics are required in the design and analysis of methodological studies. This paper provides an introduction for further discussion about the conduct of methodological studies.
The field of meta-research (or research-on-research) has proliferated in recent years in response to issues with research quality and conduct [ 1 , 2 , 3 ]. As the name suggests, this field targets issues with research design, conduct, analysis and reporting. Various types of research reports are often examined as the unit of analysis in these studies (e.g. abstracts, full manuscripts, trial registry entries). Like many other novel fields of research, meta-research has seen a proliferation of use before the development of reporting guidance. For example, this was the case with randomized trials for which risk of bias tools and reporting guidelines were only developed much later – after many trials had been published and noted to have limitations [ 4 , 5 ]; and for systematic reviews as well [ 6 , 7 , 8 ]. However, in the absence of formal guidance, studies that report on research differ substantially in how they are named, conducted and reported [ 9 , 10 ]. This creates challenges in identifying, summarizing and comparing them. In this tutorial paper, we will use the term methodological study to refer to any study that reports on the design, conduct, analysis or reporting of primary or secondary research-related reports (such as trial registry entries and conference abstracts).
In the past 10 years, there has been an increase in the use of terms related to methodological studies (based on records retrieved with a keyword search [in the title and abstract] for “methodological review” and “meta-epidemiological study” in PubMed up to December 2019), suggesting that these studies may be appearing more frequently in the literature. See Fig. 1 .
Fig. 1 Trends in the number of studies that mention “methodological review” or “meta-epidemiological study” in PubMed.
The methods used in many methodological studies have been borrowed from systematic and scoping reviews. This practice has influenced the direction of the field, with many methodological studies including searches of electronic databases, screening of records, duplicate data extraction and assessments of risk of bias in the included studies. However, the research questions posed in methodological studies do not always require the approaches listed above, and guidance is needed on when and how to apply these methods to a methodological study. Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research.
The objectives of this paper are to provide some insights on how to conduct methodological studies so that there is greater consistency between the research questions posed, and the design, analysis and reporting of findings. We provide multiple examples to illustrate concepts and a proposed framework for categorizing methodological studies in quantitative research.
Any study that describes or analyzes methods (design, conduct, analysis or reporting) in published (or unpublished) literature is a methodological study. Consequently, the scope of methodological studies is quite extensive and includes, but is not limited to, topics as diverse as: research question formulation [ 11 ]; adherence to reporting guidelines [ 12 , 13 , 14 ] and consistency in reporting [ 15 ]; approaches to study analysis [ 16 ]; investigating the credibility of analyses [ 17 ]; and studies that synthesize these methodological studies [ 18 ]. While the nomenclature of methodological studies is not uniform, the intents and purposes of these studies remain fairly consistent – to describe or analyze methods in primary or secondary studies. As such, methodological studies may also be classified as a subtype of observational studies.
Parallel to this are experimental studies that compare different methods. Even though they play an important role in informing optimal research methods, experimental methodological studies are beyond the scope of this paper. Examples of such studies include the randomized trials by Buscemi et al., comparing single data extraction to double data extraction [ 19 ], and Carrasco-Labra et al., comparing approaches to presenting findings in Grading of Recommendations, Assessment, Development and Evaluations (GRADE) summary of findings tables [ 20 ]. In these studies, the unit of analysis is the person or groups of individuals applying the methods. We also direct readers to the Studies Within a Trial (SWAT) and Studies Within a Review (SWAR) programme operated through the Hub for Trials Methodology Research, for further reading as a potential useful resource for these types of experimental studies [ 21 ]. Lastly, this paper is not meant to inform the conduct of research using computational simulation and mathematical modeling for which some guidance already exists [ 22 ], or studies on the development of methods using consensus-based approaches.
Methodological studies occupy a unique niche in health research that allows them to inform methodological advances. Methodological studies should also be conducted as pre-cursors to reporting guideline development, as they provide an opportunity to understand current practices, and help to identify the need for guidance and gaps in methodological or reporting quality. For example, the development of the popular Preferred Reporting Items of Systematic reviews and Meta-Analyses (PRISMA) guidelines were preceded by methodological studies identifying poor reporting practices [ 23 , 24 ]. In these instances, after the reporting guidelines are published, methodological studies can also be used to monitor uptake of the guidelines.
These studies can also be conducted to inform the state of the art for design, analysis and reporting practices across different types of health research fields, with the aim of improving research practices, and preventing or reducing research waste. For example, Samaan et al. conducted a scoping review of adherence to different reporting guidelines in health care literature [ 18 ]. Methodological studies can also be used to determine the factors associated with reporting practices. For example, Abbade et al. investigated journal characteristics associated with the use of the Participants, Intervention, Comparison, Outcome, Timeframe (PICOT) format in framing research questions in trials of venous ulcer disease [ 11 ].
There is no clear answer to this question. Based on a search of PubMed, the use of related terms (“methodological review” and “meta-epidemiological study”) – and therefore, the number of methodological studies – is on the rise. However, many other terms are used to describe methodological studies. There are also many studies that explore design, conduct, analysis or reporting of research reports, but that do not use any specific terms to describe or label their study design in terms of “methodology”. This diversity in nomenclature makes a census of methodological studies elusive. Appropriate terminology and key words for methodological studies are needed to facilitate improved accessibility for end-users.
Methodological studies provide information on the design, conduct, analysis or reporting of primary and secondary research and can be used to appraise quality, quantity, completeness, accuracy and consistency of health research. These issues can be explored in specific fields, journals, databases, geographical regions and time periods. For example, Areia et al. explored the quality of reporting of endoscopic diagnostic studies in gastroenterology [ 25 ]; Knol et al. investigated the reporting of p -values in baseline tables in randomized trial published in high impact journals [ 26 ]; Chen et al. describe adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement in Chinese Journals [ 27 ]; and Hopewell et al. describe the effect of editors’ implementation of CONSORT guidelines on reporting of abstracts over time [ 28 ]. Methodological studies provide useful information to researchers, clinicians, editors, publishers and users of health literature. As a result, these studies have been at the cornerstone of important methodological developments in the past two decades and have informed the development of many health research guidelines including the highly cited CONSORT statement [ 5 ].
Methodological studies can be found in most common biomedical bibliographic databases (e.g. Embase, MEDLINE, PubMed, Web of Science). However, the biggest caveat is that methodological studies are hard to identify in the literature due to the wide variety of names used and the lack of comprehensive databases dedicated to them. A handful can be found in the Cochrane Library as “Cochrane Methodology Reviews”, but these studies only cover methodological issues related to systematic reviews. Previous attempts to catalogue all empirical studies of methods used in reviews were abandoned 10 years ago [ 29 ]. In other databases, a variety of search terms may be applied with different levels of sensitivity and specificity.
In this section, we have outlined responses to questions that might help inform the conduct of methodological studies.
Q: How should I select research reports for my methodological study?
A: Selection of research reports for a methodological study depends on the research question and eligibility criteria. Once a clear research question is set and the nature of literature one desires to review is known, one can then begin the selection process. Selection may begin with a broad search, especially if the eligibility criteria are not apparent. For example, a methodological study of Cochrane Reviews of HIV would not require a complex search as all eligible studies can easily be retrieved from the Cochrane Library after checking a few boxes [ 30 ]. On the other hand, a methodological study of subgroup analyses in trials of gastrointestinal oncology would require a search to find such trials, and further screening to identify trials that conducted a subgroup analysis [ 31 ].
The strategies used for identifying participants in observational studies can apply here. One may use a systematic search to identify all eligible studies. If the number of eligible studies is unmanageable, a random sample of articles can be expected to provide comparable results if it is sufficiently large [ 32 ]. For example, Wilson et al. used a random sample of trials from the Cochrane Stroke Group’s Trial Register to investigate completeness of reporting [ 33 ]. It is possible that a simple random sample would lead to underrepresentation of units (i.e. research reports) that are smaller in number. This is relevant if the investigators wish to compare multiple groups but have too few units in one group. In this case a stratified sample would help to create equal groups. For example, in a methodological study comparing Cochrane and non-Cochrane reviews, Kahale et al. drew random samples from both groups [ 34 ]. Alternatively, systematic or purposeful sampling strategies can be used and we encourage researchers to justify their selected approaches based on the study objective.
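The contrast between simple and stratified random sampling of research reports can be sketched in a few lines of Python. This is an illustrative sketch only: the sampling frame, the group labels and the `stratified_sample` helper are hypothetical, not from any of the cited studies.

```python
import random

def stratified_sample(reports, key, n_per_stratum, seed=0):
    """Draw an equal-sized random sample from each stratum of reports.

    `reports` is a list of dicts; `key` names the stratum field
    (e.g. Cochrane vs non-Cochrane reviews).
    """
    rng = random.Random(seed)
    strata = {}
    for r in reports:
        strata.setdefault(r[key], []).append(r)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(n_per_stratum, len(members))))
    return sample

# Hypothetical sampling frame: 90 Cochrane and 10 non-Cochrane reviews.
frame = [{"id": i, "group": "Cochrane" if i < 90 else "non-Cochrane"}
         for i in range(100)]
picked = stratified_sample(frame, key="group", n_per_stratum=10)
# Each group contributes 10 reports, avoiding the underrepresentation
# of the smaller group that a simple random sample might produce.
```

A simple random sample of 20 from this frame would, on average, include only 2 non-Cochrane reviews; stratification guarantees 10 per group for the comparison.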
Q: How many databases should I search?
A: The number of databases one should search would depend on the approach to sampling, which can include targeting the entire “population” of interest or a sample of that population. If you are interested in including the entire target population for your research question, or drawing a random or systematic sample from it, then a comprehensive and exhaustive search for relevant articles is required. In this case, we recommend using systematic approaches for searching electronic databases (i.e. at least 2 databases with a replicable and time stamped search strategy). The results of your search will constitute a sampling frame from which eligible studies can be drawn.
Alternatively, if your approach to sampling is purposeful, then we recommend targeting the database(s) or data sources (e.g. journals, registries) that include the information you need. For example, if you are conducting a methodological study of high impact journals in plastic surgery and they are all indexed in PubMed, you likely do not need to search any other databases. You may also have a comprehensive list of all journals of interest and can approach your search using the journal names in your database search (or by accessing the journal archives directly from the journal’s website). Even though one could also search journals’ web pages directly, using a database such as PubMed has multiple advantages, such as the use of filters, so the search can be narrowed down to a certain period, or study types of interest. Furthermore, individual journals’ web sites may have different search functionalities, which do not necessarily yield a consistent output.
Q: Should I publish a protocol for my methodological study?
A: A protocol is a description of intended research methods. Currently, only protocols for clinical trials require registration [35]. Protocols for systematic reviews are encouraged but no formal recommendation exists. The scientific community welcomes the publication of protocols because they help protect against selective outcome reporting and the use of post hoc methodologies to embellish results, and help avoid duplication of effort [36]. While the latter two risks exist in methodological research, the negative consequences may be substantially less than for clinical outcomes. In a sample of 31 methodological studies, 7 (22.6%) referenced a published protocol [9]. In the Cochrane Library, there are 15 protocols for methodological reviews (as of 21 July 2020). This suggests that publishing protocols for methodological studies is not uncommon.
Authors can consider publishing their study protocol in a scholarly journal as a manuscript. Advantages of such publication include obtaining peer-review feedback about the planned study, and easy retrieval by searching databases such as PubMed. The disadvantages of trying to publish protocols include delays associated with manuscript handling and peer review, as well as costs: few journals publish study protocols, and those that do mostly charge article-processing fees [37]. Authors who would like to make their protocol publicly available without publishing it in a scholarly journal can deposit it in a publicly available repository, such as the Open Science Framework (https://osf.io/).
Q: How should I appraise the quality of a methodological study?
A: To date, there is no published tool for appraising the risk of bias in a methodological study, but in principle, a methodological study could be considered as a type of observational study. Therefore, during conduct or appraisal, care should be taken to avoid the biases common in observational studies [ 38 ]. These biases include selection bias, comparability of groups, and ascertainment of exposure or outcome. In other words, to generate a representative sample, a comprehensive reproducible search may be necessary to build a sampling frame. Additionally, random sampling may be necessary to ensure that all the included research reports have the same probability of being selected, and the screening and selection processes should be transparent and reproducible. To ensure that the groups compared are similar in all characteristics, matching, random sampling or stratified sampling can be used. Statistical adjustments for between-group differences can also be applied at the analysis stage. Finally, duplicate data extraction can reduce errors in assessment of exposures or outcomes.
Q: Should I justify a sample size?
A: In all instances where one is not using the target population (i.e. the group to which inferences from the research report are directed) [ 39 ], a sample size justification is good practice. The sample size justification may take the form of a description of what is expected to be achieved with the number of articles selected, or a formal sample size estimation that outlines the number of articles required to answer the research question with a certain precision and power. Sample size justifications in methodological studies are reasonable in the following instances:
Comparing two groups
Determining a proportion, mean or another quantifier
Determining factors associated with an outcome using regression-based analyses
For example, El Dib et al. computed a sample size requirement for a methodological study of diagnostic strategies in randomized trials, based on a confidence interval approach [ 40 ].
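A confidence-interval (precision-based) sample size justification of the kind mentioned above can be illustrated with the standard normal-approximation formula for a proportion. The function name and the assumed proportion are illustrative; El Dib et al.'s actual calculation may differ.

```python
import math

def n_for_proportion(p, margin, conf=0.95):
    """Number of articles needed to estimate a proportion p with a given
    margin of error, using the normal approximation: n = z^2 p(1-p) / d^2.
    """
    # z-value for a two-sided confidence level (1.96 for 95%)
    z = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}[conf]
    return math.ceil(z**2 * p * (1 - p) / margin**2)

# e.g. to estimate a reporting-quality proportion assumed to be ~50%
# to within +/- 5 percentage points with 95% confidence:
n_for_proportion(0.5, 0.05)  # 385 articles
```

Assuming p = 0.5 is conservative (it maximizes p(1-p)), so 385 articles would suffice for any true proportion at that precision and confidence level.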
Q: What should I call my study?
A: Other terms that have been used to describe or label methodological studies include "methodological review", "methodological survey", "meta-epidemiological study", "systematic review", "systematic survey", "meta-research", "research-on-research" and many others. We recommend that the study nomenclature be clear, unambiguous, informative and allow for appropriate indexing. Nomenclature that should be avoided includes "systematic review", as this will likely be confused with a systematic review of a clinical question. "Systematic survey" may also lead to confusion about whether the survey was systematic (i.e. using a preplanned methodology) or a survey using "systematic" sampling (i.e. a sampling approach using specific intervals to determine who is selected) [32]. Any of the above meanings of the word "systematic" may be true for methodological studies and could be potentially misleading. "Meta-epidemiological study" is ideal for indexing but not very informative, as it describes an entire field. The term "review" may point towards an appraisal or "review" of the design, conduct, analysis or reporting (or methodological components) of the targeted research reports, yet it has also been used to describe narrative reviews [41, 42]. The term "survey" is also in line with the approaches used in many methodological studies [9], and would be indicative of the sampling procedures of this study design. However, in the absence of guidelines on nomenclature, the term "methodological study" is broad enough to capture most of these scenarios.
Q: Should I account for clustering in my methodological study?
A: Data from methodological studies are often clustered. For example, articles coming from a specific source may have different reporting standards (e.g. the Cochrane Library). Articles within the same journal may be similar due to editorial practices and policies, reporting requirements and endorsement of guidelines. There is emerging evidence that these are real concerns that should be accounted for in analyses [ 43 ]. Some cluster variables are described in the section: “ What variables are relevant to methodological studies?”
A variety of modelling approaches can be used to account for correlated data, including marginal, fixed or mixed effects regression models with appropriate computation of standard errors [44]. For example, Kosa et al. used generalized estimating equations to account for correlation of articles within journals [15]. Not accounting for clustering could lead to incorrect p-values, unduly narrow confidence intervals, and biased estimates [45].
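A small standard-library simulation shows why ignoring clustering understates uncertainty. The numbers (20 journals, 30 articles each, journal-level variation) are illustrative assumptions, not drawn from any cited study; in practice a marginal or mixed model, as discussed above, is the principled fix.

```python
import math
import random
import statistics

random.seed(1)

# Simulate 20 journals with 30 articles each; each journal has its own
# baseline (a journal-level random effect), so articles are clustered.
journals = []
for _ in range(20):
    journal_effect = random.gauss(0, 1.0)
    journals.append([journal_effect + random.gauss(0, 1.0) for _ in range(30)])

articles = [x for j in journals for x in j]

# Naive SE treats all 600 articles as independent observations.
naive_se = statistics.stdev(articles) / math.sqrt(len(articles))

# Cluster-level SE treats the 20 journal means as the independent units.
journal_means = [statistics.mean(j) for j in journals]
cluster_se = statistics.stdev(journal_means) / math.sqrt(len(journal_means))

# Ignoring the journal-level correlation understates the true uncertainty.
understated = naive_se < cluster_se
```

With these assumed variances the cluster-aware standard error is several times larger than the naive one, which is exactly the mechanism behind the unduly narrow confidence intervals noted above.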
Q: Should I extract data in duplicate?
A: Yes. Duplicate data extraction takes more time but results in fewer errors [19]. Data extraction errors in turn affect the effect estimate [46], and should therefore be mitigated. Duplicate data extraction should be considered in the absence of other approaches to minimize extraction errors. Much like systematic reviews, however, this area will likely see rapid advances as machine learning and natural language processing technologies emerge to support researchers with screening and data extraction [47, 48]. Even so, experience plays an important role in the quality of extracted data, and inexperienced extractors should be paired with experienced extractors [46, 49].
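Agreement between duplicate extractors on a categorical item is commonly quantified with Cohen's kappa, which corrects raw agreement for chance. A minimal pure-Python sketch, using hypothetical coding data (whether each of 10 trials reported blinding):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa for two extractors' categorical judgments
    on the same set of articles."""
    assert len(rater1) == len(rater2)
    n = len(rater1)
    # Observed agreement: proportion of articles coded identically.
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    # Expected agreement by chance, from each extractor's marginal counts.
    c1, c2 = Counter(rater1), Counter(rater2)
    expected = sum(c1[k] * c2[k] for k in c1) / n**2
    return (observed - expected) / (1 - expected)

# Two extractors coding whether 10 trials reported blinding (Y/N):
a = ["Y", "Y", "N", "Y", "N", "Y", "Y", "N", "Y", "Y"]
b = ["Y", "Y", "N", "Y", "Y", "Y", "Y", "N", "Y", "N"]
round(cohens_kappa(a, b), 2)  # 0.52
```

Here the extractors agree on 8 of 10 trials (80%), but after correcting for chance agreement the kappa is a more modest 0.52, which is why disagreements between paired extractors are typically resolved by discussion or a third reviewer.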
Q: Should I assess the risk of bias of research reports included in my methodological study?
A: Risk of bias is most useful in determining the certainty that can be placed in the effect measure from a study. In methodological studies, risk of bias may not serve the purpose of determining the trustworthiness of results, as effect measures are often not the primary goal of methodological studies. Determining risk of bias in methodological studies is likely a practice borrowed from systematic review methodology, but its intrinsic value is not obvious in methodological studies. When it is part of the research question, investigators often focus on one aspect of risk of bias. For example, Speich investigated how blinding was reported in surgical trials [50], and Abraha et al. investigated the application of intention-to-treat analyses in systematic reviews and trials [51].
Q: What variables are relevant to methodological studies?
A: There is empirical evidence that certain variables may inform the findings in a methodological study. We outline some of these and provide a brief overview below:
Country: Countries and regions differ in their research cultures, and the resources available to conduct research. Therefore, it is reasonable to believe that there may be differences in methodological features across countries. Methodological studies have reported loco-regional differences in reporting quality [ 52 , 53 ]. This may also be related to challenges non-English speakers face in publishing papers in English.
Authors’ expertise: The inclusion of authors with expertise in research methodology, biostatistics, and scientific writing is likely to influence the end-product. Oltean et al. found that among randomized trials in orthopaedic surgery, the use of analyses that accounted for clustering was more likely when specialists (e.g. statistician, epidemiologist or clinical trials methodologist) were included on the study team [ 54 ]. Fleming et al. found that including methodologists in the review team was associated with appropriate use of reporting guidelines [ 55 ].
Source of funding and conflicts of interest: Some studies have found that funded studies report better [ 56 , 57 ], while others do not [ 53 , 58 ]. The presence of funding would indicate the availability of resources deployed to ensure optimal design, conduct, analysis and reporting. However, the source of funding may introduce conflicts of interest and warrant assessment. For example, Kaiser et al. investigated the effect of industry funding on obesity or nutrition randomized trials and found that reporting quality was similar [ 59 ]. Thomas et al. looked at reporting quality of long-term weight loss trials and found that industry funded studies were better [ 60 ]. Kan et al. examined the association between industry funding and “positive trials” (trials reporting a significant intervention effect) and found that industry funding was highly predictive of a positive trial [ 61 ]. This finding is similar to that of a recent Cochrane Methodology Review by Hansen et al. [ 62 ]
Journal characteristics: Certain journals’ characteristics may influence the study design, analysis or reporting. Characteristics such as journal endorsement of guidelines [ 63 , 64 ], and Journal Impact Factor (JIF) have been shown to be associated with reporting [ 63 , 65 , 66 , 67 ].
Study size (sample size/number of sites): Some studies have shown that reporting is better in larger studies [ 53 , 56 , 58 ].
Year of publication: It is reasonable to assume that design, conduct, analysis and reporting of research will change over time. Many studies have demonstrated improvements in reporting over time or after the publication of reporting guidelines [ 68 , 69 ].
Type of intervention: In a methodological study of reporting quality of weight loss intervention studies, Thabane et al. found that trials of pharmacologic interventions were reported better than trials of non-pharmacologic interventions [ 70 ].
Interactions between variables: Complex interactions between the previously listed variables are possible. High income countries with more resources may be more likely to conduct larger studies and incorporate a variety of experts. Authors in certain countries may prefer certain journals, and journal endorsement of guidelines and editorial policies may change over time.
Q: Should I focus only on high impact journals?
A: Investigators may choose to investigate only high impact journals because they are more likely to influence practice and policy, or because they assume that methodological standards would be higher. However, the JIF may severely limit the scope of articles included and may skew the sample towards articles with positive findings. The generalizability and applicability of findings from a handful of journals must be examined carefully, especially since the JIF varies over time. Even among journals that are all “high impact”, variations exist in methodological standards.
Q: Can I conduct a methodological study of qualitative research?
A: Yes. Even though a lot of methodological research has been conducted in the quantitative research field, methodological studies of qualitative studies are feasible. Certain databases that catalogue qualitative research including the Cumulative Index to Nursing & Allied Health Literature (CINAHL) have defined subject headings that are specific to methodological research (e.g. “research methodology”). Alternatively, one could also conduct a qualitative methodological review; that is, use qualitative approaches to synthesize methodological issues in qualitative studies.
Q: What reporting guidelines should I use for my methodological study?
A: There is no guideline that covers the entire scope of methodological studies. One adaptation of the PRISMA guidelines has been published, which works well for studies that aim to use the entire target population of research reports [71]. However, it is not widely used (40 citations in 2 years as of 09 December 2019), and methodological studies that are designed as cross-sectional or before-after studies require a more fit-for-purpose guideline. A more encompassing reporting guideline for a broad range of methodological studies is currently under development [72]. In the absence of formal guidance, however, the general requirements for scientific reporting should be respected, and authors of methodological studies should focus on transparency and reproducibility.
Q: What are the potential threats to validity and how can I avoid them?
A: Methodological studies may be compromised by a lack of internal or external validity. The main threats to internal validity in methodological studies are selection and confounding bias. Investigators must ensure that the methods used to select articles do not make them differ systematically from the set of articles to which they would like to make inferences. For example, attempting to make extrapolations to all journals after analyzing only high-impact journals would be misleading.
Many factors (confounders) may distort the association between the exposure and outcome if the included research reports differ with respect to these factors [ 73 ]. For example, when examining the association between source of funding and completeness of reporting, it may be necessary to account for journals that endorse the guidelines. Confounding bias can be addressed by restriction, matching and statistical adjustment [ 73 ]. Restriction appears to be the method of choice for many investigators who choose to include only high impact journals or articles in a specific field. For example, Knol et al. examined the reporting of p -values in baseline tables of high impact journals [ 26 ]. Matching is also sometimes used. In the methodological study of non-randomized interventional studies of elective ventral hernia repair, Parker et al. matched prospective studies with retrospective studies and compared reporting standards [ 74 ]. Some other methodological studies use statistical adjustments. For example, Zhang et al. used regression techniques to determine the factors associated with missing participant data in trials [ 16 ].
With regard to external validity, researchers interested in conducting methodological studies must consider how generalizable or applicable their findings are. This should tie in closely with the research question and should be explicit. For example, findings from methodological studies on trials published in high impact cardiology journals cannot be assumed to be applicable to trials in other fields. In all cases, investigators must ensure that their sample truly represents the target population, either by a) conducting a comprehensive and exhaustive search, or b) using an appropriate and justified, randomly selected sample of research reports.
Even applicability to high impact journals may vary based on the investigators' definition, and over time. For example, for high impact journals in the field of general medicine, Bouwmeester et al. included the Annals of Internal Medicine (AIM), BMJ, the Journal of the American Medical Association (JAMA), Lancet, the New England Journal of Medicine (NEJM), and PLoS Medicine (n = 6) [75]. In contrast, the high impact journals selected in the methodological study by Schiller et al. were BMJ, JAMA, Lancet, and NEJM (n = 4) [76]. Another methodological study by Kosa et al. included AIM, BMJ, JAMA, Lancet and NEJM (n = 5). In the methodological study by Thabut et al., journals with a JIF greater than 5 were considered high impact. Riado Minguez et al. used first quartile journals in the Journal Citation Reports (JCR) for a specific year to determine "high impact" [77]. Ultimately, the definition of high impact will depend on the number of journals the investigators are willing to include, the year of impact and the JIF cut-off [78]. We acknowledge that the term "generalizability" may apply differently to methodological studies, especially since in many instances it is possible to include the entire target population in the sample studied.
Finally, methodological studies are not exempt from information bias which may stem from discrepancies in the included research reports [ 79 ], errors in data extraction, or inappropriate interpretation of the information extracted. Likewise, publication bias may also be a concern in methodological studies, but such concepts have not yet been explored.
In order to inform discussions about methodological studies and the development of guidance for what should be reported, we have outlined some key features of methodological studies that can be used to classify them. For each of the categories outlined below, we provide an example. In our experience, the choice of approach to completing a methodological study can be informed by asking the following four questions:
What is the aim?
Methodological studies that investigate bias
A methodological study may be focused on exploring sources of bias in primary or secondary studies (meta-bias), or on how bias is analyzed. We have taken care to distinguish bias (i.e. systematic deviations from the truth irrespective of the source) from reporting quality or completeness (i.e. not adhering to a specific reporting guideline or norm). An example of where this distinction matters is a randomized trial with no blinding. This study (depending on the nature of the intervention) would be at risk of performance bias; however, if the authors report that their study was not blinded, they would have reported adequately. In fact, some methodological studies attempt to capture both "quality of conduct" and "quality of reporting", such as Richie et al., who reported on the risk of bias in randomized trials of pharmacy practice interventions [80]. Babic et al. investigated how risk of bias was used to inform sensitivity analyses in Cochrane reviews [81]. Further, biases related to the choice of outcomes can also be explored. For example, Tan et al. investigated differences in treatment effect size based on the outcome reported [82].
Methodological studies that investigate quality (or completeness) of reporting
Methodological studies may report quality of reporting against a reporting checklist (i.e. adherence to guidelines) or against expected norms. For example, Croituro et al. report on the quality of reporting in systematic reviews published in dermatology journals based on their adherence to the PRISMA statement [ 83 ], and Khan et al. described the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals based on the CONSORT extension for harms [ 84 ]. Other methodological studies investigate reporting of certain features of interest that may not be part of formally published checklists or guidelines. For example, Mbuagbaw et al. described how often the implications for research are elaborated using the Evidence, Participants, Intervention, Comparison, Outcome, Timeframe (EPICOT) format [ 30 ].
Methodological studies that investigate the consistency of reporting
Sometimes investigators may be interested in how consistent reports of the same research are, as it is expected that there should be consistency between: conference abstracts and published manuscripts; manuscript abstracts and manuscript main text; and trial registration and published manuscript. For example, Rosmarakis et al. investigated consistency between conference abstracts and full text manuscripts [ 85 ].
Methodological studies that investigate factors associated with reporting
In addition to identifying issues with reporting in primary and secondary studies, authors of methodological studies may be interested in determining the factors that are associated with certain reporting practices. Many methodological studies incorporate this, albeit as a secondary outcome. For example, Farrokhyar et al. investigated the factors associated with reporting quality in randomized trials of coronary artery bypass grafting surgery [ 53 ].
Methodological studies that investigate methods
Methodological studies may also be used to describe methods or compare methods, and the factors associated with methods. Muller et al. described the methods used for systematic reviews and meta-analyses of observational studies [ 86 ].
Methodological studies that summarize other methodological studies
Some methodological studies synthesize results from other methodological studies. For example, Li et al. conducted a scoping review of methodological reviews that investigated consistency between full text and abstracts in primary biomedical research [ 87 ].
Methodological studies that investigate nomenclature and terminology
Some methodological studies may investigate the use of names and terms in health research. For example, Martinic et al. investigated the definitions of systematic reviews used in overviews of systematic reviews (OSRs), meta-epidemiological studies and epidemiology textbooks [ 88 ].
Other types of methodological studies
In addition to the types of methodological studies mentioned above, there may exist other types of methodological studies not captured here.
What is the design?
Methodological studies that are descriptive
Most methodological studies are purely descriptive and report their findings as counts (percent) and means (standard deviation) or medians (interquartile range). For example, Mbuagbaw et al. described the reporting of research recommendations in Cochrane HIV systematic reviews [ 30 ]. Gohari et al. described the quality of reporting of randomized trials in diabetes in Iran [ 12 ].
Methodological studies that are analytical
Some methodological studies are analytical wherein “analytical studies identify and quantify associations, test hypotheses, identify causes and determine whether an association exists between variables, such as between an exposure and a disease.” [ 89 ] In the case of methodological studies all these investigations are possible. For example, Kosa et al. investigated the association between agreement in primary outcome from trial registry to published manuscript and study covariates. They found that larger and more recent studies were more likely to have agreement [ 15 ]. Tricco et al. compared the conclusion statements from Cochrane and non-Cochrane systematic reviews with a meta-analysis of the primary outcome and found that non-Cochrane reviews were more likely to report positive findings. These results are a test of the null hypothesis that the proportions of Cochrane and non-Cochrane reviews that report positive results are equal [ 90 ].
What is the sampling strategy?
Methodological studies that include the target population
Methodological reviews with narrow research questions may be able to include the entire target population. For example, in the methodological study of Cochrane HIV systematic reviews, Mbuagbaw et al. included all of the available studies (n = 103) [30].
Methodological studies that include a sample of the target population
Many methodological studies use random samples of the target population [ 33 , 91 , 92 ]. Alternatively, purposeful sampling may be used, limiting the sample to a subset of research-related reports published within a certain time period, or in journals with a certain ranking or on a topic. Systematic sampling can also be used when random sampling may be challenging to implement.
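Systematic sampling, as mentioned above, selects every k-th report from an ordered sampling frame. A minimal sketch (the helper function and the frame of 100 report IDs are illustrative):

```python
def systematic_sample(frame, n, start=0):
    """Select approximately n reports from an ordered sampling frame
    by taking every k-th item, where k = len(frame) // n."""
    k = max(1, len(frame) // n)
    return frame[start::k][:n]

# From a frame of 100 report IDs, select 10 at a fixed interval of 10.
systematic_sample(list(range(1, 101)), 10)  # [1, 11, 21, ..., 91]
```

This is straightforward to implement when drawing a true random sample is challenging, though the frame's ordering should be unrelated to the outcome of interest to avoid periodicity bias.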
What is the unit of analysis?
Methodological studies with a research report as the unit of analysis
Many methodological studies use a research report (e.g. full manuscript of study, abstract portion of the study) as the unit of analysis, and inferences can be made at the study-level. However, both published and unpublished research-related reports can be studied. These may include articles, conference abstracts, registry entries etc.
Methodological studies with a design, analysis or reporting item as the unit of analysis
Some methodological studies report on items which may occur more than once per article. For example, Paquette et al. report on subgroup analyses in Cochrane reviews of atrial fibrillation in which 17 systematic reviews planned 56 subgroup analyses [ 93 ].
This framework is outlined in Fig. 2 .
Fig. 2 A proposed framework for methodological studies
Methodological studies have examined different aspects of reporting such as quality, completeness, consistency and adherence to reporting guidelines. As such, many of the methodological study examples cited in this tutorial are related to reporting. However, as an evolving field, the scope of research questions that can be addressed by methodological studies is expected to increase.
In this paper we have outlined the scope and purpose of methodological studies, along with examples of instances in which various approaches have been used. In the absence of formal guidance on the design, conduct, analysis and reporting of methodological studies, we have provided some advice to help make methodological studies consistent. This advice is grounded in good contemporary scientific practice. Generally, the research question should tie in with the sampling approach and planned analysis. We have also highlighted the variables that may inform findings from methodological studies. Lastly, we have provided suggestions for ways in which authors can categorize their methodological studies to inform their design and analysis.
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
CONSORT: Consolidated Standards of Reporting Trials
EPICOT: Evidence, Participants, Intervention, Comparison, Outcome, Timeframe
GRADE: Grading of Recommendations, Assessment, Development and Evaluations
PICOT: Participants, Intervention, Comparison, Outcome, Timeframe
PRISMA: Preferred Reporting Items for Systematic reviews and Meta-Analyses
SWAR: Studies Within a Review
SWAT: Studies Within a Trial
Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet. 2009;374(9683):86–9.
Chan AW, Song F, Vickers A, Jefferson T, Dickersin K, Gotzsche PC, Krumholz HM, Ghersi D, van der Worp HB. Increasing value and reducing waste: addressing inaccessible research. Lancet. 2014;383(9913):257–66.
Ioannidis JP, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, Schulz KF, Tibshirani R. Increasing value and reducing waste in research design, conduct, and analysis. Lancet. 2014;383(9912):166–75.
Higgins JP, Altman DG, Gotzsche PC, Juni P, Moher D, Oxman AD, Savovic J, Schulz KF, Weeks L, Sterne JA. The Cochrane Collaboration's tool for assessing risk of bias in randomised trials. BMJ. 2011;343:d5928.
Moher D, Schulz KF, Altman DG. The CONSORT statement: revised recommendations for improving the quality of reports of parallel-group randomised trials. Lancet. 2001;357.
Liberati A, Altman DG, Tetzlaff J, Mulrow C, Gotzsche PC, Ioannidis JP, Clarke M, Devereaux PJ, Kleijnen J, Moher D. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. PLoS Med. 2009;6(7):e1000100.
Shea BJ, Hamel C, Wells GA, Bouter LM, Kristjansson E, Grimshaw J, Henry DA, Boers M. AMSTAR is a reliable and valid measurement tool to assess the methodological quality of systematic reviews. J Clin Epidemiol. 2009;62(10):1013–20.
Shea BJ, Reeves BC, Wells G, Thuku M, Hamel C, Moran J, Moher D, Tugwell P, Welch V, Kristjansson E, et al. AMSTAR 2: a critical appraisal tool for systematic reviews that include randomised or non-randomised studies of healthcare interventions, or both. Bmj. 2017;358:j4008.
Lawson DO, Leenus A, Mbuagbaw L. Mapping the nomenclature, methodology, and reporting of studies that review methods: a pilot methodological review. Pilot Feasibility Studies. 2020;6(1):13.
Puljak L, Makaric ZL, Buljan I, Pieper D. What is a meta-epidemiological study? Analysis of published literature indicated heterogeneous study designs and definitions. J Comp Eff Res. 2020.
Abbade LPF, Wang M, Sriganesh K, Jin Y, Mbuagbaw L, Thabane L. The framing of research questions using the PICOT format in randomized controlled trials of venous ulcer disease is suboptimal: a systematic survey. Wound Repair Regen. 2017;25(5):892–900.
Gohari F, Baradaran HR, Tabatabaee M, Anijidani S, Mohammadpour Touserkani F, Atlasi R, Razmgir M. Quality of reporting randomized controlled trials (RCTs) in diabetes in Iran; a systematic review. J Diabetes Metab Disord. 2015;15(1):36.
Wang M, Jin Y, Hu ZJ, Thabane A, Dennis B, Gajic-Veljanoski O, Paul J, Thabane L. The reporting quality of abstracts of stepped wedge randomized trials is suboptimal: a systematic survey of the literature. Contemp Clin Trials Commun. 2017;8:1–10.
Shanthanna H, Kaushal A, Mbuagbaw L, Couban R, Busse J, Thabane L: A cross-sectional study of the reporting quality of pilot or feasibility trials in high-impact anesthesia journals Can J Anaesthesia 2018, 65(11):1180–1195.
Kosa SD, Mbuagbaw L, Borg Debono V, Bhandari M, Dennis BB, Ene G, Leenus A, Shi D, Thabane M, Valvasori S, et al. Agreement in reporting between trial publications and current clinical trial registry in high impact journals: a methodological review. Contemporary Clinical Trials. 2018;65:144–50.
Zhang Y, Florez ID, Colunga Lozano LE, Aloweni FAB, Kennedy SA, Li A, Craigie S, Zhang S, Agarwal A, Lopes LC, et al. A systematic survey on reporting and methods for handling missing participant data for continuous outcomes in randomized controlled trials. J Clin Epidemiol. 2017;88:57–66.
CAS PubMed Google Scholar
Hernández AV, Boersma E, Murray GD, Habbema JD, Steyerberg EW. Subgroup analyses in therapeutic cardiovascular clinical trials: are most of them misleading? Am Heart J. 2006;151(2):257–64.
Samaan Z, Mbuagbaw L, Kosa D, Borg Debono V, Dillenburg R, Zhang S, Fruci V, Dennis B, Bawor M, Thabane L. A systematic scoping review of adherence to reporting guidelines in health care literature. J Multidiscip Healthc. 2013;6:169–88.
Buscemi N, Hartling L, Vandermeer B, Tjosvold L, Klassen TP. Single data extraction generated more errors than double data extraction in systematic reviews. J Clin Epidemiol. 2006;59(7):697–703.
Carrasco-Labra A, Brignardello-Petersen R, Santesso N, Neumann I, Mustafa RA, Mbuagbaw L, Etxeandia Ikobaltzeta I, De Stio C, McCullagh LJ, Alonso-Coello P. Improving GRADE evidence tables part 1: a randomized trial shows improved understanding of content in summary-of-findings tables with a new format. J Clin Epidemiol. 2016;74:7–18.
The Northern Ireland Hub for Trials Methodology Research: SWAT/SWAR Information [ https://www.qub.ac.uk/sites/TheNorthernIrelandNetworkforTrialsMethodologyResearch/SWATSWARInformation/ ]. Accessed 31 Aug 2020.
Chick S, Sánchez P, Ferrin D, Morrice D. How to conduct a successful simulation study. In: Proceedings of the 2003 winter simulation conference: 2003; 2003. p. 66–70.
Google Scholar
Mulrow CD. The medical review article: state of the science. Ann Intern Med. 1987;106(3):485–8.
Sacks HS, Reitman D, Pagano D, Kupelnick B. Meta-analysis: an update. Mount Sinai J Med New York. 1996;63(3–4):216–24.
CAS Google Scholar
Areia M, Soares M, Dinis-Ribeiro M. Quality reporting of endoscopic diagnostic studies in gastrointestinal journals: where do we stand on the use of the STARD and CONSORT statements? Endoscopy. 2010;42(2):138–47.
Knol M, Groenwold R, Grobbee D. P-values in baseline tables of randomised controlled trials are inappropriate but still common in high impact journals. Eur J Prev Cardiol. 2012;19(2):231–2.
Chen M, Cui J, Zhang AL, Sze DM, Xue CC, May BH. Adherence to CONSORT items in randomized controlled trials of integrative medicine for colorectal Cancer published in Chinese journals. J Altern Complement Med. 2018;24(2):115–24.
Hopewell S, Ravaud P, Baron G, Boutron I. Effect of editors' implementation of CONSORT guidelines on the reporting of abstracts in high impact medical journals: interrupted time series analysis. BMJ. 2012;344:e4178.
The Cochrane Methodology Register Issue 2 2009 [ https://cmr.cochrane.org/help.htm ]. Accessed 31 Aug 2020.
Mbuagbaw L, Kredo T, Welch V, Mursleen S, Ross S, Zani B, Motaze NV, Quinlan L. Critical EPICOT items were absent in Cochrane human immunodeficiency virus systematic reviews: a bibliometric analysis. J Clin Epidemiol. 2016;74:66–72.
Barton S, Peckitt C, Sclafani F, Cunningham D, Chau I. The influence of industry sponsorship on the reporting of subgroup analyses within phase III randomised controlled trials in gastrointestinal oncology. Eur J Cancer. 2015;51(18):2732–9.
Setia MS. Methodology series module 5: sampling strategies. Indian J Dermatol. 2016;61(5):505–9.
Wilson B, Burnett P, Moher D, Altman DG, Al-Shahi Salman R. Completeness of reporting of randomised controlled trials including people with transient ischaemic attack or stroke: a systematic review. Eur Stroke J. 2018;3(4):337–46.
Kahale LA, Diab B, Brignardello-Petersen R, Agarwal A, Mustafa RA, Kwong J, Neumann I, Li L, Lopes LC, Briel M, et al. Systematic reviews do not adequately report or address missing outcome data in their analyses: a methodological survey. J Clin Epidemiol. 2018;99:14–23.
De Angelis CD, Drazen JM, Frizelle FA, Haug C, Hoey J, Horton R, Kotzin S, Laine C, Marusic A, Overbeke AJPM, et al. Is this clinical trial fully registered?: a statement from the International Committee of Medical Journal Editors*. Ann Intern Med. 2005;143(2):146–8.
Ohtake PJ, Childs JD. Why publish study protocols? Phys Ther. 2014;94(9):1208–9.
Rombey T, Allers K, Mathes T, Hoffmann F, Pieper D. A descriptive analysis of the characteristics and the peer review process of systematic review protocols published in an open peer review journal from 2012 to 2017. BMC Med Res Methodol. 2019;19(1):57.
Grimes DA, Schulz KF. Bias and causal associations in observational research. Lancet. 2002;359(9302):248–52.
Porta M (ed.): A dictionary of epidemiology, 5th edn. Oxford: Oxford University Press, Inc.; 2008.
El Dib R, Tikkinen KAO, Akl EA, Gomaa HA, Mustafa RA, Agarwal A, Carpenter CR, Zhang Y, Jorge EC, Almeida R, et al. Systematic survey of randomized trials evaluating the impact of alternative diagnostic strategies on patient-important outcomes. J Clin Epidemiol. 2017;84:61–9.
Helzer JE, Robins LN, Taibleson M, Woodruff RA Jr, Reich T, Wish ED. Reliability of psychiatric diagnosis. I. a methodological review. Arch Gen Psychiatry. 1977;34(2):129–33.
Chung ST, Chacko SK, Sunehag AL, Haymond MW. Measurements of gluconeogenesis and Glycogenolysis: a methodological review. Diabetes. 2015;64(12):3996–4010.
CAS PubMed PubMed Central Google Scholar
Sterne JA, Juni P, Schulz KF, Altman DG, Bartlett C, Egger M. Statistical methods for assessing the influence of study characteristics on treatment effects in 'meta-epidemiological' research. Stat Med. 2002;21(11):1513–24.
Moen EL, Fricano-Kugler CJ, Luikart BW, O’Malley AJ. Analyzing clustered data: why and how to account for multiple observations nested within a study participant? PLoS One. 2016;11(1):e0146721.
Zyzanski SJ, Flocke SA, Dickinson LM. On the nature and analysis of clustered data. Ann Fam Med. 2004;2(3):199–200.
Mathes T, Klassen P, Pieper D. Frequency of data extraction errors and methods to increase data extraction quality: a methodological review. BMC Med Res Methodol. 2017;17(1):152.
Bui DDA, Del Fiol G, Hurdle JF, Jonnalagadda S. Extractive text summarization system to aid data extraction from full text in systematic review development. J Biomed Inform. 2016;64:265–72.
Bui DD, Del Fiol G, Jonnalagadda S. PDF text classification to leverage information extraction from publication reports. J Biomed Inform. 2016;61:141–8.
Maticic K, Krnic Martinic M, Puljak L. Assessment of reporting quality of abstracts of systematic reviews with meta-analysis using PRISMA-A and discordance in assessments between raters without prior experience. BMC Med Res Methodol. 2019;19(1):32.
Speich B. Blinding in surgical randomized clinical trials in 2015. Ann Surg. 2017;266(1):21–2.
Abraha I, Cozzolino F, Orso M, Marchesi M, Germani A, Lombardo G, Eusebi P, De Florio R, Luchetta ML, Iorio A, et al. A systematic review found that deviations from intention-to-treat are common in randomized trials and systematic reviews. J Clin Epidemiol. 2017;84:37–46.
Zhong Y, Zhou W, Jiang H, Fan T, Diao X, Yang H, Min J, Wang G, Fu J, Mao B. Quality of reporting of two-group parallel randomized controlled clinical trials of multi-herb formulae: A survey of reports indexed in the Science Citation Index Expanded. Eur J Integrative Med. 2011;3(4):e309–16.
Farrokhyar F, Chu R, Whitlock R, Thabane L. A systematic review of the quality of publications reporting coronary artery bypass grafting trials. Can J Surg. 2007;50(4):266–77.
Oltean H, Gagnier JJ. Use of clustering analysis in randomized controlled trials in orthopaedic surgery. BMC Med Res Methodol. 2015;15:17.
Fleming PS, Koletsi D, Pandis N. Blinded by PRISMA: are systematic reviewers focusing on PRISMA and ignoring other guidelines? PLoS One. 2014;9(5):e96407.
Balasubramanian SP, Wiener M, Alshameeri Z, Tiruvoipati R, Elbourne D, Reed MW. Standards of reporting of randomized controlled trials in general surgery: can we do better? Ann Surg. 2006;244(5):663–7.
de Vries TW, van Roon EN. Low quality of reporting adverse drug reactions in paediatric randomised controlled trials. Arch Dis Child. 2010;95(12):1023–6.
Borg Debono V, Zhang S, Ye C, Paul J, Arya A, Hurlburt L, Murthy Y, Thabane L. The quality of reporting of RCTs used within a postoperative pain management meta-analysis, using the CONSORT statement. BMC Anesthesiol. 2012;12:13.
Kaiser KA, Cofield SS, Fontaine KR, Glasser SP, Thabane L, Chu R, Ambrale S, Dwary AD, Kumar A, Nayyar G, et al. Is funding source related to study reporting quality in obesity or nutrition randomized control trials in top-tier medical journals? Int J Obes. 2012;36(7):977–81.
Thomas O, Thabane L, Douketis J, Chu R, Westfall AO, Allison DB. Industry funding and the reporting quality of large long-term weight loss trials. Int J Obes. 2008;32(10):1531–6.
Khan NR, Saad H, Oravec CS, Rossi N, Nguyen V, Venable GT, Lillard JC, Patel P, Taylor DR, Vaughn BN, et al. A review of industry funding in randomized controlled trials published in the neurosurgical literature-the elephant in the room. Neurosurgery. 2018;83(5):890–7.
Hansen C, Lundh A, Rasmussen K, Hrobjartsson A. Financial conflicts of interest in systematic reviews: associations with results, conclusions, and methodological quality. Cochrane Database Syst Rev. 2019;8:Mr000047.
Kiehna EN, Starke RM, Pouratian N, Dumont AS. Standards for reporting randomized controlled trials in neurosurgery. J Neurosurg. 2011;114(2):280–5.
Liu LQ, Morris PJ, Pengel LH. Compliance to the CONSORT statement of randomized controlled trials in solid organ transplantation: a 3-year overview. Transpl Int. 2013;26(3):300–6.
Bala MM, Akl EA, Sun X, Bassler D, Mertz D, Mejza F, Vandvik PO, Malaga G, Johnston BC, Dahm P, et al. Randomized trials published in higher vs. lower impact journals differ in design, conduct, and analysis. J Clin Epidemiol. 2013;66(3):286–95.
Lee SY, Teoh PJ, Camm CF, Agha RA. Compliance of randomized controlled trials in trauma surgery with the CONSORT statement. J Trauma Acute Care Surg. 2013;75(4):562–72.
Ziogas DC, Zintzaras E. Analysis of the quality of reporting of randomized controlled trials in acute and chronic myeloid leukemia, and myelodysplastic syndromes as governed by the CONSORT statement. Ann Epidemiol. 2009;19(7):494–500.
Alvarez F, Meyer N, Gourraud PA, Paul C. CONSORT adoption and quality of reporting of randomized controlled trials: a systematic analysis in two dermatology journals. Br J Dermatol. 2009;161(5):1159–65.
Mbuagbaw L, Thabane M, Vanniyasingam T, Borg Debono V, Kosa S, Zhang S, Ye C, Parpia S, Dennis BB, Thabane L. Improvement in the quality of abstracts in major clinical journals since CONSORT extension for abstracts: a systematic review. Contemporary Clin trials. 2014;38(2):245–50.
Thabane L, Chu R, Cuddy K, Douketis J. What is the quality of reporting in weight loss intervention studies? A systematic review of randomized controlled trials. Int J Obes. 2007;31(10):1554–9.
Murad MH, Wang Z. Guidelines for reporting meta-epidemiological methodology research. Evidence Based Med. 2017;22(4):139.
METRIC - MEthodological sTudy ReportIng Checklist: guidelines for reporting methodological studies in health research [ http://www.equator-network.org/library/reporting-guidelines-under-development/reporting-guidelines-under-development-for-other-study-designs/#METRIC ]. Accessed 31 Aug 2020.
Jager KJ, Zoccali C, MacLeod A, Dekker FW. Confounding: what it is and how to deal with it. Kidney Int. 2008;73(3):256–60.
Parker SG, Halligan S, Erotocritou M, Wood CPJ, Boulton RW, Plumb AAO, Windsor ACJ, Mallett S. A systematic methodological review of non-randomised interventional studies of elective ventral hernia repair: clear definitions and a standardised minimum dataset are needed. Hernia. 2019.
Bouwmeester W, Zuithoff NPA, Mallett S, Geerlings MI, Vergouwe Y, Steyerberg EW, Altman DG, Moons KGM. Reporting and methods in clinical prediction research: a systematic review. PLoS Med. 2012;9(5):1–12.
Schiller P, Burchardi N, Niestroj M, Kieser M. Quality of reporting of clinical non-inferiority and equivalence randomised trials--update and extension. Trials. 2012;13:214.
Riado Minguez D, Kowalski M, Vallve Odena M, Longin Pontzen D, Jelicic Kadic A, Jeric M, Dosenovic S, Jakus D, Vrdoljak M, Poklepovic Pericic T, et al. Methodological and reporting quality of systematic reviews published in the highest ranking journals in the field of pain. Anesth Analg. 2017;125(4):1348–54.
Thabut G, Estellat C, Boutron I, Samama CM, Ravaud P. Methodological issues in trials assessing primary prophylaxis of venous thrombo-embolism. Eur Heart J. 2005;27(2):227–36.
Puljak L, Riva N, Parmelli E, González-Lorenzo M, Moja L, Pieper D. Data extraction methods: an analysis of internal reporting discrepancies in single manuscripts and practical advice. J Clin Epidemiol. 2020;117:158–64.
Ritchie A, Seubert L, Clifford R, Perry D, Bond C. Do randomised controlled trials relevant to pharmacy meet best practice standards for quality conduct and reporting? A systematic review. Int J Pharm Pract. 2019.
Babic A, Vuka I, Saric F, Proloscic I, Slapnicar E, Cavar J, Pericic TP, Pieper D, Puljak L. Overall bias methods and their use in sensitivity analysis of Cochrane reviews were not consistent. J Clin Epidemiol. 2019.
Tan A, Porcher R, Crequit P, Ravaud P, Dechartres A. Differences in treatment effect size between overall survival and progression-free survival in immunotherapy trials: a Meta-epidemiologic study of trials with results posted at ClinicalTrials.gov. J Clin Oncol. 2017;35(15):1686–94.
Croitoru D, Huang Y, Kurdina A, Chan AW, Drucker AM. Quality of reporting in systematic reviews published in dermatology journals. Br J Dermatol. 2020;182(6):1469–76.
Khan MS, Ochani RK, Shaikh A, Vaduganathan M, Khan SU, Fatima K, Yamani N, Mandrola J, Doukky R, Krasuski RA: Assessing the Quality of Reporting of Harms in Randomized Controlled Trials Published in High Impact Cardiovascular Journals. Eur Heart J Qual Care Clin Outcomes 2019.
Rosmarakis ES, Soteriades ES, Vergidis PI, Kasiakou SK, Falagas ME. From conference abstract to full paper: differences between data presented in conferences and journals. FASEB J. 2005;19(7):673–80.
Mueller M, D’Addario M, Egger M, Cevallos M, Dekkers O, Mugglin C, Scott P. Methods to systematically review and meta-analyse observational studies: a systematic scoping review of recommendations. BMC Med Res Methodol. 2018;18(1):44.
Li G, Abbade LPF, Nwosu I, Jin Y, Leenus A, Maaz M, Wang M, Bhatt M, Zielinski L, Sanger N, et al. A scoping review of comparisons between abstracts and full reports in primary biomedical research. BMC Med Res Methodol. 2017;17(1):181.
Krnic Martinic M, Pieper D, Glatt A, Puljak L. Definition of a systematic review used in overviews of systematic reviews, meta-epidemiological studies and textbooks. BMC Med Res Methodol. 2019;19(1):203.
Analytical study [ https://medical-dictionary.thefreedictionary.com/analytical+study ]. Accessed 31 Aug 2020.
Tricco AC, Tetzlaff J, Pham B, Brehaut J, Moher D. Non-Cochrane vs. Cochrane reviews were twice as likely to have positive conclusion statements: cross-sectional study. J Clin Epidemiol. 2009;62(4):380–6 e381.
Schalken N, Rietbergen C. The reporting quality of systematic reviews and Meta-analyses in industrial and organizational psychology: a systematic review. Front Psychol. 2017;8:1395.
Ranker LR, Petersen JM, Fox MP. Awareness of and potential for dependent error in the observational epidemiologic literature: A review. Ann Epidemiol. 2019;36:15–9 e12.
Paquette M, Alotaibi AM, Nieuwlaat R, Santesso N, Mbuagbaw L. A meta-epidemiological study of subgroup analyses in cochrane systematic reviews of atrial fibrillation. Syst Rev. 2019;8(1):241.
Download references
This work did not receive any dedicated funding.
Authors and affiliations
Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON, Canada
Lawrence Mbuagbaw, Daeria O. Lawson & Lehana Thabane
Biostatistics Unit/FSORC, 50 Charlton Avenue East, St Joseph’s Healthcare—Hamilton, 3rd Floor Martha Wing, Room H321, Hamilton, Ontario, L8N 4A6, Canada
Lawrence Mbuagbaw & Lehana Thabane
Centre for the Development of Best Practices in Health, Yaoundé, Cameroon
Lawrence Mbuagbaw
Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Ilica 242, 10000, Zagreb, Croatia
Livia Puljak
Department of Epidemiology and Biostatistics, School of Public Health – Bloomington, Indiana University, Bloomington, IN, 47405, USA
David B. Allison
Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, ON, Canada
Lehana Thabane
Centre for Evaluation of Medicine, St. Joseph’s Healthcare-Hamilton, Hamilton, ON, Canada
Population Health Research Institute, Hamilton Health Sciences, Hamilton, ON, Canada
LM conceived the idea and drafted the outline and paper. DOL and LT commented on the idea and draft outline. LM, LP and DOL performed literature searches and data extraction. All authors (LM, DOL, LT, LP, DBA) reviewed several draft versions of the manuscript and approved the final manuscript.
Correspondence to Lawrence Mbuagbaw.
Ethics approval and consent to participate
Not applicable.
Competing interests
DOL, DBA, LM, LP and LT are involved in the development of a reporting guideline for methodological studies.
Publisher's note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
Cite this article
Mbuagbaw, L., Lawson, D.O., Puljak, L. et al. A tutorial on methodological studies: the what, when, how and why. BMC Med Res Methodol 20, 226 (2020). https://doi.org/10.1186/s12874-020-01107-7
Received : 27 May 2020
Accepted : 27 August 2020
Published : 07 September 2020
DOI : https://doi.org/10.1186/s12874-020-01107-7
ISSN: 1471-2288
When you’re working on your first piece of academic research, there are many different things to focus on, and it can be overwhelming to stay on top of everything. This is especially true of budding or inexperienced researchers.
If you’ve never put together a research proposal before or find yourself in a position where you need to explain your research methodology decisions, there are a few things you need to be aware of.
Once you understand the ins and outs, handling academic research in the future will be less intimidating. We break down the basics below:
A research methodology encompasses the way in which you intend to carry out your research. This includes how you plan to tackle things like collection methods, statistical analysis, participant observations, and more.
You can think of your research methodology as being a formula. One part will be how you plan on putting your research into practice, and another will be why you feel this is the best way to approach it. Your research methodology is ultimately a methodological and systematic plan to resolve your research problem.
In short, you are explaining how you will take your idea and turn it into a study, which in turn will produce valid and reliable results that are in accordance with the aims and objectives of your research. This is true whether your paper plans to make use of qualitative methods or quantitative methods.
The purpose of a research methodology is to explain the reasoning behind your approach to your research: you'll need to justify your choice of collection methods, methods of analysis, and other key aspects of your work.
Think of it like writing a plan or an outline of what you intend to do.
When carrying out research, it can be easy to go off-track or depart from your standard methodology.
Tip: Having a methodology keeps you accountable and on track with your original aims and objectives, and gives you a suitable and sound plan to keep your project manageable, smooth, and effective.
With all that said, how do you actually write a research methodology?
As a general plan, your methodology should include the following information:
In any dissertation, thesis, or academic journal, you will always find a chapter dedicated to explaining the research methodology of the person who carried out the study, also referred to as the methodology section of the work.
A good research methodology will explain what you are going to do and why, while a poor methodology will lead to a messy or disorganized approach.
You should also use this section to justify your reasoning for carrying out your research in a particular way, especially if your method is unusual.
Having a sound methodology in place can also help you with the following:
A research instrument is a tool you will use to help you collect, measure and analyze the data you use as part of your research.
The choice of research instrument will usually be yours to make as the researcher and will be whichever best suits your methodology.
There are many different research instruments you can use in collecting data for your research.
Generally, they can be grouped into interviews, surveys and questionnaires, observations, and focus groups.
These are the most common ways of carrying out research, but it is really dependent on your needs as a researcher and what approach you think is best to take.
It is also possible to combine a number of research instruments if this is necessary and appropriate in answering your research problem.
There are three different types of methodologies, and they are distinguished by whether they focus on words, numbers, or both.
| Data type | What is it? | Methodology |
|---|---|---|
| Quantitative | Focuses on measuring and testing numerical data. | Surveys, tests, existing databases. |
| Qualitative | The process of collecting and analyzing words and textual data. | Observations, interviews, focus groups. |
| Mixed-method | Combines both of the above approaches, pairing exact numerical results with exploratory insight. | Any combination of the above, e.g. surveys plus interviews. |
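To make the distinction concrete, here is a minimal sketch (Python standard library only; the survey data, theme names, and keyword cues are all invented for illustration) that treats the same hypothetical study two ways: a quantitative summary of numeric ratings and a crude qualitative coding of free-text comments:

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical mixed-method survey data (invented for illustration)
scores = [4, 5, 3, 4, 5, 2, 4]   # quantitative: 1-5 satisfaction ratings
comments = ["too slow", "loved it", "confusing menu", "loved the design"]

# Quantitative: measure and summarize numerical data
print(f"mean={mean(scores):.2f}, sd={stdev(scores):.2f}")

# Qualitative: code textual data into themes (crude keyword-based pass)
themes = {"usability": ("slow", "confusing"), "satisfaction": ("loved",)}
counts = Counter(
    theme
    for comment in comments
    for theme, keywords in themes.items()
    if any(keyword in comment for keyword in keywords)
)
print(counts)
```

In real qualitative work the coding would be done by human analysts rather than keyword matching; the point is only that the two data types call for different analysis tools.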
➡️ Want to learn more about the differences between qualitative and quantitative research, and how to use both methods? Check out our guide for that!
If you've done your due diligence, you'll have an idea of which methodology approach is best suited to your research.
It’s likely that you will have done considerable reading before reaching this point, and you may have taken inspiration from other similar studies that yielded good results.
Still, it is important to consider different options before setting your research in stone. Exploring different options available will help you to explain why the choice you ultimately make is preferable to other methods.
If your research problem requires you to gather large volumes of numerical data to test hypotheses, a quantitative research method is likely to provide you with the most usable results.
If instead you’re looking to try and learn more about people, and their perception of events, your methodology is more exploratory in nature and would therefore probably be better served using a qualitative research methodology.
It helps to always bring things back to the question: what do I want to achieve with my research?
Once you have conducted your research, you need to analyze it. Here are some helpful guides for qualitative data analysis:
➡️ How to do a content analysis
➡️ How to do a thematic analysis
➡️ How to do a rhetorical analysis
Research methodology refers to the techniques used to find and analyze information for a study, ensuring that the results are valid, reliable and that they address the research objective.
Data can typically be organized into four different categories or methods: observational, experimental, simulation, and derived.
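As a sketch of two of those categories, the snippet below (all values invented, Python standard library only) generates simulation data from a toy coin-flip model and computes a derived variable, BMI, from hypothetical observational records:

```python
import random
from statistics import mean

random.seed(1)  # reproducible toy example

# Simulation data: generated from a model rather than observed.
# Here, 1,000 simulated fair-coin flips.
flips = [random.random() < 0.5 for _ in range(1000)]
print(f"simulated heads rate: {mean(flips):.3f}")

# Derived data: computed from existing measurements.
# Hypothetical observational records of (height in m, weight in kg):
records = [(1.70, 68), (1.82, 90), (1.60, 55)]
bmi = [round(weight / height**2, 1) for height, weight in records]
print(bmi)
```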
Writing a methodology section is a process of introducing your methods and instruments, discussing your analysis, providing more background information, addressing your research limitations, and more.
Your research methodology section will need a clear research question and proposed research approach. You'll need to add a background, introduce your research question, write your methodology, and add the works you cited during your data collection phase.
The research methodology section of your study will indicate how valid your findings are and how well-informed your paper is. It also assists future researchers planning to use the same methodology, who want to cite your study or replicate it.
Research methodology 1,2 is a structured and scientific approach used to collect, analyze, and interpret quantitative or qualitative data to answer research questions or test hypotheses. A research methodology is like a plan for carrying out research and helps keep researchers on track by limiting the scope of the research. Several aspects must be considered before selecting an appropriate research methodology, such as research limitations and ethical concerns that may affect your research.
The research methodology section in a scientific paper describes the different methodological choices made, such as the data collection and analysis methods, and why these choices were selected. The reasons should explain why the methods chosen are the most appropriate to answer the research question. A good research methodology also helps ensure the reliability and validity of the research findings. There are three types of research methodology—quantitative, qualitative, and mixed-method, which can be chosen based on the research objectives.
A research methodology describes the techniques and procedures used to identify and analyze information regarding a specific research topic. It is a process by which researchers design their study so that they can achieve their objectives using the selected research instruments. It includes all the important aspects of research, including research design, data collection methods, data analysis methods, and the overall framework within which the research is conducted. While these points can help you understand what is research methodology, you also need to know why it is important to pick the right methodology.
Having a good research methodology in place has the following advantages: 3
Types of research methodology
There are three types of research methodology, based on the type of research and the data required: quantitative, qualitative, and mixed-method.1
Sampling 4 is an important part of a research methodology and involves selecting a representative sample of the population to conduct the study, making statistical inferences about them, and estimating the characteristics of the whole population based on these inferences. There are two types of sampling designs in research methodology—probability and nonprobability.
In this type of sampling design, a sample is chosen from a larger population using some form of random selection; that is, every member of the population has an equal chance of being selected. The main types of probability sampling are simple random, systematic, stratified, and cluster sampling.
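The equal-chance property of probability sampling is easy to see in code. The sketch below (a hypothetical numbered sampling frame, Python standard library only) draws a simple random sample and, for comparison, a systematic sample:

```python
import random

random.seed(42)  # reproducible toy example

# Hypothetical sampling frame: 500 numbered members of a population
population = list(range(1, 501))

# Simple random sampling: every member has an equal chance of selection
srs = random.sample(population, k=20)

# Systematic sampling: every k-th member after a random start
interval = len(population) // 20       # sampling interval of 25
start = random.randrange(interval)     # random start in the first interval
systematic = population[start::interval]

print(len(srs), len(systematic))
```

`random.sample` draws without replacement, so each member of the frame is equally likely to end up in the sample.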
During research, data are collected using various methods depending on the research methodology being followed and the research methods being undertaken. Both qualitative and quantitative research have different data collection methods, as listed below.
Qualitative research:5 interviews, focus groups, participant observation, and analysis of documents or texts.
Quantitative research:6 surveys and questionnaires, experiments, structured observations, and existing datasets or records.
What are data analysis methods?
The data collected using the various methods for qualitative and quantitative research need to be analyzed to generate meaningful conclusions. These data analysis methods 7 also differ between quantitative and qualitative research.
Quantitative research involves a deductive method for data analysis where hypotheses are developed at the beginning of the research and precise measurement is required. The methods include statistical analysis applications to analyze numerical data and are grouped into two categories—descriptive and inferential.
Descriptive analysis is used to describe the basic features of different types of data and to present them in a way that makes the patterns meaningful. Common descriptive methods include measures of frequency (counts, percentages), central tendency (mean, median, mode), dispersion (range, variance, standard deviation), and position (percentiles, quartiles).
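As a sketch of descriptive analysis using only Python's standard library (the exam scores are hypothetical):

```python
from statistics import mean, median, mode, stdev, quantiles

# Hypothetical exam scores from a small quantitative study
scores = [62, 71, 71, 75, 80, 84, 90, 95]

print("mean:", mean(scores))                  # central tendency
print("median:", median(scores))
print("mode:", mode(scores))
print("range:", max(scores) - min(scores))    # dispersion
print("stdev:", round(stdev(scores), 2))
print("quartiles:", quantiles(scores, n=4))   # position
```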
Inferential analysis is used to make predictions about a larger population based on data collected from a sample of that population, and to study the relationships between different variables. Commonly used inferential methods include t-tests, analysis of variance (ANOVA), regression analysis, and chi-square tests.
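To make the distinction concrete, here is a minimal Python sketch that pairs descriptive summaries with a simple inferential statistic (Welch's t). The sales figures are invented for illustration:

```python
import math
import statistics as stats

# Hypothetical monthly sales before and after a book promotion
before = [120, 135, 128, 140, 132, 125, 138, 130]
after = [150, 162, 155, 170, 158, 149, 165, 160]

# Descriptive analysis: summarize each group
for name, data in [("before", before), ("after", after)]:
    print(name, round(stats.mean(data), 1), stats.median(data),
          round(stats.stdev(data), 1))

# Inferential analysis: Welch's t statistic for the difference in means
n1, n2 = len(before), len(after)
v1, v2 = stats.variance(before), stats.variance(after)
t = (stats.mean(after) - stats.mean(before)) / math.sqrt(v1 / n1 + v2 / n2)
print(round(t, 2))  # a large |t| suggests a real difference in means
```

A full analysis would also report a p-value or confidence interval for the difference, typically via a statistics package.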
Qualitative research involves an inductive method for data analysis where hypotheses are developed after data collection. The methods include:
Here are some important factors to consider when choosing a research methodology: 8
How to write a research methodology
A research methodology should include the following components: 3,9
The methods section is a critical part of a research paper: other researchers use it to understand your findings and to replicate your work in their own research. However, it is usually also the most difficult section to write. This is where Paperpal can help you overcome writer's block and create a first draft in minutes with Paperpal Copilot, its secure generative AI feature suite.
With Paperpal you can get research advice, write and refine your work, rephrase and verify the writing, and ensure submission readiness, all in one place. Here’s how you can use Paperpal to develop the first draft of your methods section.
You can repeat this process to develop each section of your research manuscript, including the title, abstract and keywords. Ready to write your research papers faster, better, and without the stress? Sign up for Paperpal and start writing today!
Q1. What are the key components of research methodology?
A1. A good research methodology has the following key components:
Q2. Why is ethical consideration important in research methodology?
A2. Ethical consideration is important in research methodology to assure readers of the reliability and validity of the study. Researchers must clearly state the ethical norms and standards followed during the research, and mention whether the study was cleared by an institutional review board. 10
Q3. What is the difference between methodology and method?
A3. Although the two terms are often confused, research methodology is different from a research method. Research methods are the specific techniques, procedures, and tools used to collect, analyze, and interpret data, for instance, surveys, questionnaires, and interviews. The research methodology, in contrast, provides the framework for how the research is planned, conducted, and analyzed, and guides researchers in choosing the most appropriate methods for their study.
Research methodology is, thus, an integral part of a research study. It helps ensure that you stay on track to meet your research objectives and answer your research questions using the most appropriate data collection and analysis tools based on your research design.
Paperpal is a comprehensive AI writing toolkit that helps students and researchers achieve 2x the writing in half the time. It leverages 21+ years of STM experience and insights from millions of research articles to provide in-depth academic writing, language editing, and submission readiness support to help you write better, faster.
Get accurate academic translations, rewriting support, grammar checks, vocabulary suggestions, and generative AI assistance that delivers human precision at machine speed. Try for free or upgrade to Paperpal Prime starting at US$19 a month to access premium features, including consistency, plagiarism, and 30+ submission readiness checks to help you succeed.
Experience the future of academic writing – Sign up to Paperpal and start writing for free!
Last Updated: May 27, 2024
This article was co-authored by Alexander Ruiz, M.Ed. and by wikiHow staff writer, Jennifer Mueller, JD. Alexander Ruiz is an Educational Consultant and the Educational Director of Link Educational Institute, a tutoring business based in Claremont, California that provides customizable educational plans, subject and test prep tutoring, and college application consulting. With over a decade and a half of experience in the education industry, Alexander coaches students to increase their self-awareness and emotional intelligence while achieving their goals of skill development and higher education. He holds a BA in Psychology from Florida International University and an MA in Education from Georgia Southern University. wikiHow marks an article as reader-approved once it receives enough positive feedback. In this case, several readers have written to tell us that this article was helpful to them, earning it our reader-approved status. This article has been viewed 528,365 times.
The research methodology section of any academic research paper gives you the opportunity to convince your readers that your research is useful and will contribute to your field of study. An effective research methodology is grounded in your overall approach – whether qualitative or quantitative – and adequately describes the methods you used. Justify why you chose those methods over others, then explain how those methods will provide answers to your research questions. [1]
To write a research methodology, start with a section that outlines the problems or questions you'll be studying, including your hypotheses or whatever it is you're setting out to prove. Then, briefly explain why you chose to use either a qualitative or quantitative approach for your study. Next, go over when and where you conducted your research and what parameters you used to ensure you were objective. Finally, cite any sources you used to decide on the methodology for your research.
The method section of a report details how the research was conducted, the research methods used, and the reasons for choosing those methods.
The methodology is a step-by-step explanation of the research process. It should be factual and is mainly written in the past tense.
The research used a quantitative methodology based on the approach advocated by Williams (2009). This study was conducted by questionnaire and investigated university teaching staff attitudes to the use of mobile phones in tutorials (see Appendix 1). The questionnaire used Likert scales to assess social attitudes (Jones 2007) to student mobile phone use and provided open-ended responses for additional comments. The survey was voluntary and anonymous. A total of 412 questionnaires were distributed online to randomly selected staff from each of the three colleges within the university. The completed questionnaires were returned by email.
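A first pass at analyzing such Likert-scale data can be sketched in plain Python. The responses below are invented for illustration; only the figure of 412 distributed questionnaires comes from the example above:

```python
# Hypothetical 5-point Likert responses (1 = strongly disagree ... 5 = strongly agree)
responses = [4, 5, 3, 4, 2, 5, 4, 3, 4, 5, 1, 4]

distributed = 412          # questionnaires sent out (as in the example above)
returned = len(responses)  # completed questionnaires (illustrative only)

response_rate = returned / distributed
mean_score = sum(responses) / len(responses)
agree_share = sum(1 for r in responses if r >= 4) / len(responses)

print(f"response rate: {response_rate:.1%}")
print(f"mean attitude score: {mean_score:.2f}")
print(f"share agreeing (4 or 5): {agree_share:.0%}")
```

A real analysis would also report per-item distributions and, where appropriate, treat Likert data as ordinal rather than interval.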
Lawrence Mbuagbaw
1 Department of Health Research Methods, Evidence and Impact, McMaster University, Hamilton, ON Canada
2 Biostatistics Unit/FSORC, 50 Charlton Avenue East, St Joseph’s Healthcare—Hamilton, 3rd Floor Martha Wing, Room H321, Hamilton, Ontario L8N 4A6 Canada
3 Centre for the Development of Best Practices in Health, Yaoundé, Cameroon
Livia Puljak
4 Center for Evidence-Based Medicine and Health Care, Catholic University of Croatia, Ilica 242, 10000 Zagreb, Croatia
5 Department of Epidemiology and Biostatistics, School of Public Health – Bloomington, Indiana University, Bloomington, IN 47405 USA
6 Departments of Paediatrics and Anaesthesia, McMaster University, Hamilton, ON Canada
7 Centre for Evaluation of Medicine, St. Joseph’s Healthcare-Hamilton, Hamilton, ON Canada
8 Population Health Research Institute, Hamilton Health Sciences, Hamilton, ON Canada
Data sharing is not applicable to this article as no new data were created or analyzed in this study.
Methodological studies – studies that evaluate the design, analysis or reporting of other research-related reports – play an important role in health research. They help to highlight issues in the conduct of research with the aim of improving health research methodology, and ultimately reducing research waste.
We provide an overview of some of the key aspects of methodological studies such as what they are, and when, how and why they are done. We adopt a "frequently asked questions" format to facilitate reading this paper and provide multiple examples to help guide researchers interested in conducting methodological studies. Some of the topics addressed include: Is it necessary to publish a study protocol? How should relevant research reports and databases be selected for a methodological study? What approaches to data extraction and statistical analysis should be considered? What are potential threats to validity, and is there a way to appraise the quality of methodological studies?
Appropriate reflection and application of basic principles of epidemiology and biostatistics are required in the design and analysis of methodological studies. This paper provides an introduction for further discussion about the conduct of methodological studies.
The field of meta-research (or research-on-research) has proliferated in recent years in response to issues with research quality and conduct [ 1 – 3 ]. As the name suggests, this field targets issues with research design, conduct, analysis and reporting. Various types of research reports are often examined as the unit of analysis in these studies (e.g. abstracts, full manuscripts, trial registry entries). Like many other novel fields of research, meta-research has seen a proliferation of use before the development of reporting guidance. For example, this was the case with randomized trials for which risk of bias tools and reporting guidelines were only developed much later – after many trials had been published and noted to have limitations [ 4 , 5 ]; and for systematic reviews as well [ 6 – 8 ]. However, in the absence of formal guidance, studies that report on research differ substantially in how they are named, conducted and reported [ 9 , 10 ]. This creates challenges in identifying, summarizing and comparing them. In this tutorial paper, we will use the term methodological study to refer to any study that reports on the design, conduct, analysis or reporting of primary or secondary research-related reports (such as trial registry entries and conference abstracts).
In the past 10 years, there has been an increase in the use of terms related to methodological studies (based on records retrieved with a keyword search [in the title and abstract] for “methodological review” and “meta-epidemiological study” in PubMed up to December 2019), suggesting that these studies may be appearing more frequently in the literature. See Fig. 1 .
Fig. 1 Trends in the number of studies that mention "methodological review" or "meta-epidemiological study" in PubMed.
The methods used in many methodological studies have been borrowed from systematic and scoping reviews. This practice has influenced the direction of the field, with many methodological studies including searches of electronic databases, screening of records, duplicate data extraction and assessments of risk of bias in the included studies. However, the research questions posed in methodological studies do not always require the approaches listed above, and guidance is needed on when and how to apply these methods to a methodological study. Even though methodological studies can be conducted on qualitative or mixed methods research, this paper focuses on and draws examples exclusively from quantitative research.
The objectives of this paper are to provide some insights on how to conduct methodological studies so that there is greater consistency between the research questions posed, and the design, analysis and reporting of findings. We provide multiple examples to illustrate concepts and a proposed framework for categorizing methodological studies in quantitative research.
Any study that describes or analyzes methods (design, conduct, analysis or reporting) in published (or unpublished) literature is a methodological study. Consequently, the scope of methodological studies is quite extensive and includes, but is not limited to, topics as diverse as: research question formulation [ 11 ]; adherence to reporting guidelines [ 12 – 14 ] and consistency in reporting [ 15 ]; approaches to study analysis [ 16 ]; investigating the credibility of analyses [ 17 ]; and studies that synthesize these methodological studies [ 18 ]. While the nomenclature of methodological studies is not uniform, the intents and purposes of these studies remain fairly consistent – to describe or analyze methods in primary or secondary studies. As such, methodological studies may also be classified as a subtype of observational studies.
Parallel to this are experimental studies that compare different methods. Even though they play an important role in informing optimal research methods, experimental methodological studies are beyond the scope of this paper. Examples of such studies include the randomized trials by Buscemi et al., comparing single data extraction to double data extraction [ 19 ], and Carrasco-Labra et al., comparing approaches to presenting findings in Grading of Recommendations, Assessment, Development and Evaluations (GRADE) summary of findings tables [ 20 ]. In these studies, the unit of analysis is the person or groups of individuals applying the methods. We also direct readers to the Studies Within a Trial (SWAT) and Studies Within a Review (SWAR) programme operated through the Hub for Trials Methodology Research, for further reading as a potential useful resource for these types of experimental studies [ 21 ]. Lastly, this paper is not meant to inform the conduct of research using computational simulation and mathematical modeling for which some guidance already exists [ 22 ], or studies on the development of methods using consensus-based approaches.
Methodological studies occupy a unique niche in health research that allows them to inform methodological advances. Methodological studies should also be conducted as precursors to reporting guideline development, as they provide an opportunity to understand current practices, and help to identify the need for guidance and gaps in methodological or reporting quality. For example, the development of the popular Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines was preceded by methodological studies identifying poor reporting practices [23, 24]. In these instances, after the reporting guidelines are published, methodological studies can also be used to monitor uptake of the guidelines.
These studies can also be conducted to inform the state of the art for design, analysis and reporting practices across different types of health research fields, with the aim of improving research practices, and preventing or reducing research waste. For example, Samaan et al. conducted a scoping review of adherence to different reporting guidelines in health care literature [ 18 ]. Methodological studies can also be used to determine the factors associated with reporting practices. For example, Abbade et al. investigated journal characteristics associated with the use of the Participants, Intervention, Comparison, Outcome, Timeframe (PICOT) format in framing research questions in trials of venous ulcer disease [ 11 ].
There is no clear answer to this question. Based on a search of PubMed, the use of related terms (“methodological review” and “meta-epidemiological study”) – and therefore, the number of methodological studies – is on the rise. However, many other terms are used to describe methodological studies. There are also many studies that explore design, conduct, analysis or reporting of research reports, but that do not use any specific terms to describe or label their study design in terms of “methodology”. This diversity in nomenclature makes a census of methodological studies elusive. Appropriate terminology and key words for methodological studies are needed to facilitate improved accessibility for end-users.
Methodological studies provide information on the design, conduct, analysis or reporting of primary and secondary research and can be used to appraise the quality, quantity, completeness, accuracy and consistency of health research. These issues can be explored in specific fields, journals, databases, geographical regions and time periods. For example, Areia et al. explored the quality of reporting of endoscopic diagnostic studies in gastroenterology [25]; Knol et al. investigated the reporting of p-values in baseline tables in randomized trials published in high impact journals [26]; Chen et al. described adherence to the Consolidated Standards of Reporting Trials (CONSORT) statement in Chinese journals [27]; and Hopewell et al. described the effect of editors' implementation of CONSORT guidelines on the reporting of abstracts over time [28]. Methodological studies provide useful information to researchers, clinicians, editors, publishers and users of health literature. As a result, these studies have been a cornerstone of important methodological developments over the past two decades and have informed the development of many health research guidelines, including the highly cited CONSORT statement [5].
Methodological studies can be found in most common biomedical bibliographic databases (e.g. Embase, MEDLINE, PubMed, Web of Science). However, the biggest caveat is that methodological studies are hard to identify in the literature due to the wide variety of names used and the lack of comprehensive databases dedicated to them. A handful can be found in the Cochrane Library as “Cochrane Methodology Reviews”, but these studies only cover methodological issues related to systematic reviews. Previous attempts to catalogue all empirical studies of methods used in reviews were abandoned 10 years ago [ 29 ]. In other databases, a variety of search terms may be applied with different levels of sensitivity and specificity.
In this section, we have outlined responses to questions that might help inform the conduct of methodological studies.
Q: How should I select research reports for my methodological study?
A: Selection of research reports for a methodological study depends on the research question and eligibility criteria. Once a clear research question is set and the nature of literature one desires to review is known, one can then begin the selection process. Selection may begin with a broad search, especially if the eligibility criteria are not apparent. For example, a methodological study of Cochrane Reviews of HIV would not require a complex search as all eligible studies can easily be retrieved from the Cochrane Library after checking a few boxes [ 30 ]. On the other hand, a methodological study of subgroup analyses in trials of gastrointestinal oncology would require a search to find such trials, and further screening to identify trials that conducted a subgroup analysis [ 31 ].
The strategies used for identifying participants in observational studies can apply here. One may use a systematic search to identify all eligible studies. If the number of eligible studies is unmanageable, a random sample of articles can be expected to provide comparable results if it is sufficiently large [ 32 ]. For example, Wilson et al. used a random sample of trials from the Cochrane Stroke Group’s Trial Register to investigate completeness of reporting [ 33 ]. It is possible that a simple random sample would lead to underrepresentation of units (i.e. research reports) that are smaller in number. This is relevant if the investigators wish to compare multiple groups but have too few units in one group. In this case a stratified sample would help to create equal groups. For example, in a methodological study comparing Cochrane and non-Cochrane reviews, Kahale et al. drew random samples from both groups [ 34 ]. Alternatively, systematic or purposeful sampling strategies can be used and we encourage researchers to justify their selected approaches based on the study objective.
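The stratified approach described above can be sketched in Python. The sampling frame, group labels and per-group sample size here are hypothetical:

```python
import random

random.seed(7)  # reproducible illustration

# Hypothetical sampling frame of screened research reports
reports = [{"id": i, "group": "Cochrane" if i % 5 == 0 else "non-Cochrane"}
           for i in range(1, 201)]

# Stratified random sampling: draw equal-sized random samples from each group
def stratified_sample(frame, key, n_per_group):
    groups = {}
    for record in frame:
        groups.setdefault(record[key], []).append(record)
    return {g: random.sample(members, n_per_group)
            for g, members in groups.items()}

sample = stratified_sample(reports, "group", n_per_group=20)
print({g: len(s) for g, s in sample.items()})
```

Equal group sizes make between-group comparisons straightforward even when one group (here, Cochrane reviews) is much rarer in the frame.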
Q: How many databases should I search?
A: The number of databases one should search would depend on the approach to sampling, which can include targeting the entire “population” of interest or a sample of that population. If you are interested in including the entire target population for your research question, or drawing a random or systematic sample from it, then a comprehensive and exhaustive search for relevant articles is required. In this case, we recommend using systematic approaches for searching electronic databases (i.e. at least 2 databases with a replicable and time stamped search strategy). The results of your search will constitute a sampling frame from which eligible studies can be drawn.
Alternatively, if your approach to sampling is purposeful, then we recommend targeting the database(s) or data sources (e.g. journals, registries) that include the information you need. For example, if you are conducting a methodological study of high impact journals in plastic surgery and they are all indexed in PubMed, you likely do not need to search any other databases. You may also have a comprehensive list of all journals of interest and can approach your search using the journal names in your database search (or by accessing the journal archives directly from each journal's website). Even though one could also search journals' web pages directly, using a database such as PubMed has multiple advantages, such as filters that can narrow the search to a certain period or to study types of interest. Furthermore, individual journals' websites may have different search functionalities, which do not necessarily yield consistent output.
Q: Should I publish a protocol for my methodological study?
A: A protocol is a description of intended research methods. Currently, only protocols for clinical trials require registration [35]. Protocols for systematic reviews are encouraged but no formal recommendation exists. The scientific community welcomes the publication of protocols because they help protect against selective outcome reporting and the use of post hoc methodologies to embellish results, and because they help avoid duplication of effort [36]. While the latter two risks exist in methodological research, the negative consequences may be substantially less than for clinical outcomes. In a sample of 31 methodological studies, 7 (22.6%) referenced a published protocol [9]. In the Cochrane Library, there are 15 protocols for methodological reviews (21 July 2020). This suggests that publishing protocols for methodological studies is not uncommon.
Authors can consider publishing their study protocol in a scholarly journal as a manuscript. Advantages of such publication include obtaining peer-review feedback about the planned study and easy retrieval by searching databases such as PubMed. The disadvantages of trying to publish protocols include delays associated with manuscript handling and peer review, as well as costs, as few journals publish study protocols and those that do mostly charge article-processing fees [37]. Authors who would like to make their protocol publicly available without publishing it in a scholarly journal could deposit their study protocols in publicly available repositories, such as the Open Science Framework ( https://osf.io/ ).
Q: How to appraise the quality of a methodological study?
A: To date, there is no published tool for appraising the risk of bias in a methodological study, but in principle, a methodological study could be considered as a type of observational study. Therefore, during conduct or appraisal, care should be taken to avoid the biases common in observational studies [ 38 ]. These biases include selection bias, comparability of groups, and ascertainment of exposure or outcome. In other words, to generate a representative sample, a comprehensive reproducible search may be necessary to build a sampling frame. Additionally, random sampling may be necessary to ensure that all the included research reports have the same probability of being selected, and the screening and selection processes should be transparent and reproducible. To ensure that the groups compared are similar in all characteristics, matching, random sampling or stratified sampling can be used. Statistical adjustments for between-group differences can also be applied at the analysis stage. Finally, duplicate data extraction can reduce errors in assessment of exposures or outcomes.
Q: Should I justify a sample size?
A: In all instances where one is not using the target population (i.e. the group to which inferences from the research report are directed) [39], a sample size justification is good practice. The sample size justification may take the form of a description of what is expected to be achieved with the number of articles selected, or a formal sample size estimation that outlines the number of articles required to answer the research question with a certain precision and power.
For example, El Dib et al. computed a sample size requirement for a methodological study of diagnostic strategies in randomized trials, based on a confidence interval approach [ 40 ].
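As an illustration of a confidence-interval-based sample size justification, the sketch below computes the number of articles needed to estimate a proportion with a given precision. This is the standard formula for a proportion, not necessarily the exact approach used by El Dib et al.; the inputs are hypothetical:

```python
import math

def sample_size_proportion(p, margin, z=1.96):
    """Articles needed to estimate a proportion p within +/- margin
    (95% confidence by default, via n = z^2 * p * (1 - p) / margin^2)."""
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

# e.g. expecting ~30% of trials to report a given item,
# estimated to within 5 percentage points
n = sample_size_proportion(p=0.30, margin=0.05)
print(n)
```

When the expected proportion is unknown, p = 0.5 gives the most conservative (largest) sample size; hypothesis-testing designs would instead require a formal power calculation.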
Q: What should I call my study?
A: Other terms which have been used to describe or label methodological studies include "methodological review", "methodological survey", "meta-epidemiological study", "systematic review", "systematic survey", "meta-research", "research-on-research" and many others. We recommend that the study nomenclature be clear, unambiguous, informative and allow for appropriate indexing. Nomenclature that should be avoided includes "systematic review", as this will likely be confused with a systematic review of a clinical question. "Systematic survey" may also lead to confusion about whether the survey was systematic (i.e. using a preplanned methodology) or used "systematic" sampling (i.e. a sampling approach using specific intervals to determine who is selected) [32]. Any of the above meanings of the word "systematic" may be true for methodological studies and could be potentially misleading. "Meta-epidemiological study" is ideal for indexing but not very informative, as it describes an entire field. The term "review" may point towards an appraisal or "review" of the design, conduct, analysis or reporting (or methodological components) of the targeted research reports, yet it has also been used to describe narrative reviews [41, 42]. The term "survey" is also in line with the approaches used in many methodological studies [9], and would be indicative of the sampling procedures of this study design. However, in the absence of guidelines on nomenclature, the term "methodological study" is broad enough to capture most such studies.
Q: Should I account for clustering in my methodological study?
A: Data from methodological studies are often clustered. For example, articles coming from a specific source may have different reporting standards (e.g. the Cochrane Library). Articles within the same journal may be similar due to editorial practices and policies, reporting requirements and endorsement of guidelines. There is emerging evidence that these are real concerns that should be accounted for in analyses [43]. Some cluster variables are described in the section "What variables are relevant to methodological studies?"
A variety of modelling approaches can be used to account for correlated data, including marginal, fixed or mixed effects regression models with appropriate computation of standard errors [44]. For example, Kosa et al. used generalized estimating equations to account for correlation of articles within journals [15]. Not accounting for clustering could lead to incorrect p-values, unduly narrow confidence intervals, and biased estimates [45].
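A small simulation can show why clustering matters. The sketch below invents journal-level clusters and compares a naive standard error (treating all articles as independent) with one computed from journal means, the independent units here; all numbers are synthetic:

```python
import random
import statistics as stats

random.seed(1)

# Simulate a hypothetical methodological study: 20 journals (clusters)
# of 30 articles each, where journals differ systematically in a score.
journals = []
for _ in range(20):
    journal_effect = random.gauss(0, 2)  # shared within-journal effect
    articles = [5 + journal_effect + random.gauss(0, 1) for _ in range(30)]
    journals.append(articles)

all_articles = [score for j in journals for score in j]

# Naive SE: treats all 600 articles as independent observations
naive_se = stats.stdev(all_articles) / len(all_articles) ** 0.5

# Cluster-aware SE: analyze journal means, the independent units here
journal_means = [stats.mean(j) for j in journals]
cluster_se = stats.stdev(journal_means) / len(journal_means) ** 0.5

print(round(naive_se, 3), round(cluster_se, 3))
```

With a strong journal effect, the naive standard error comes out far smaller than the cluster-aware one, which is exactly the overconfidence (unduly narrow confidence intervals) that the text warns about.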
Q: Should I extract data in duplicate?
A: Yes. Duplicate data extraction takes more time but results in fewer errors [19]. Data extraction errors in turn affect the effect estimate [46], and should therefore be mitigated. Duplicate data extraction should be considered in the absence of other approaches to minimize extraction errors, although, much like systematic reviews, this area will likely see rapid advances as machine learning and natural language processing technologies come to support researchers with screening and data extraction [47, 48]. In the meantime, experience plays an important role in the quality of extracted data, and inexperienced extractors should be paired with experienced extractors [46, 49].
Q: Should I assess the risk of bias of research reports included in my methodological study?
A: Risk of bias is most useful in determining the certainty that can be placed in the effect measure from a study. In methodological studies, risk of bias may not serve the purpose of determining the trustworthiness of results, as effect measures are often not the primary goal of methodological studies. Determining risk of bias in methodological studies is likely a practice borrowed from systematic review methodology, but its intrinsic value is not obvious in methodological studies. When it is part of the research question, investigators often focus on one aspect of risk of bias. For example, Speich investigated how blinding was reported in surgical trials [50], and Abraha et al. investigated the application of intention-to-treat analyses in systematic reviews and trials [51].
Q: What variables are relevant to methodological studies?
A: There is empirical evidence that certain variables may inform the findings of a methodological study. We outline some of these below.
Q: Should I focus only on high impact journals?
A: Investigators may choose to investigate only high impact journals because they are more likely to influence practice and policy, or because they assume that methodological standards there are higher. However, restricting to journals with a high journal impact factor (JIF) may severely limit the scope of articles included and may skew the sample towards articles with positive findings. The generalizability and applicability of findings from a handful of journals must be examined carefully, especially since the JIF varies over time. Even among journals that are all “high impact”, methodological standards vary.
Q: Can I conduct a methodological study of qualitative research?
A: Yes. Even though a lot of methodological research has been conducted in the quantitative research field, methodological studies of qualitative studies are feasible. Certain databases that catalogue qualitative research including the Cumulative Index to Nursing & Allied Health Literature (CINAHL) have defined subject headings that are specific to methodological research (e.g. “research methodology”). Alternatively, one could also conduct a qualitative methodological review; that is, use qualitative approaches to synthesize methodological issues in qualitative studies.
Q: What reporting guidelines should I use for my methodological study?
A: No guideline covers the entire scope of methodological studies. One adaptation of the PRISMA guidelines has been published, which works well for studies that aim to use the entire target population of research reports [ 71 ]. However, it is not widely used (40 citations in 2 years as of 09 December 2019), and methodological studies designed as cross-sectional or before-after studies require a more fit-for-purpose guideline. A more encompassing reporting guideline for a broad range of methodological studies is currently under development [ 72 ]. In the absence of formal guidance, the requirements for scientific reporting should be respected, and authors of methodological studies should focus on transparency and reproducibility.
Q: What are the potential threats to validity and how can I avoid them?
A: Methodological studies may be compromised by a lack of internal or external validity. The main threats to internal validity in methodological studies are selection and confounding bias. Investigators must ensure that the methods used to select articles do not make them differ systematically from the set of articles to which they would like to make inferences. For example, attempting to make extrapolations to all journals after analyzing only high-impact journals would be misleading.
Many factors (confounders) may distort the association between the exposure and outcome if the included research reports differ with respect to these factors [ 73 ]. For example, when examining the association between source of funding and completeness of reporting, it may be necessary to account for journals that endorse the guidelines. Confounding bias can be addressed by restriction, matching and statistical adjustment [ 73 ]. Restriction appears to be the method of choice for many investigators who choose to include only high impact journals or articles in a specific field. For example, Knol et al. examined the reporting of p -values in baseline tables of high impact journals [ 26 ]. Matching is also sometimes used. In the methodological study of non-randomized interventional studies of elective ventral hernia repair, Parker et al. matched prospective studies with retrospective studies and compared reporting standards [ 74 ]. Some other methodological studies use statistical adjustments. For example, Zhang et al. used regression techniques to determine the factors associated with missing participant data in trials [ 16 ].
With regard to external validity, researchers conducting methodological studies must consider how generalizable or applicable their findings are. This should tie in closely with the research question and should be made explicit. For example, findings from methodological studies of trials published in high impact cardiology journals cannot be assumed to apply to trials in other fields. Investigators must also ensure that their sample truly represents the target population, either by (a) conducting a comprehensive and exhaustive search, or (b) using an appropriate and justified, randomly selected sample of research reports.
Even applicability to high impact journals may vary based on the investigators’ definition, and over time. For example, for high impact journals in the field of general medicine, Bouwmeester et al. included the Annals of Internal Medicine (AIM), BMJ, the Journal of the American Medical Association (JAMA), Lancet, the New England Journal of Medicine (NEJM), and PLoS Medicine (n = 6) [ 75 ]. In contrast, the high impact journals selected in the methodological study by Schiller et al. were BMJ, JAMA, Lancet, and NEJM (n = 4) [ 76 ]. Another methodological study by Kosa et al. included AIM, BMJ, JAMA, Lancet and NEJM (n = 5). In the methodological study by Thabut et al., journals with a JIF greater than 5 were considered to be high impact. Riado Minguez et al. used first quartile journals in the Journal Citation Reports (JCR) for a specific year to determine “high impact” [ 77 ]. Ultimately, the definition of high impact will be based on the number of journals the investigators are willing to include, the year of impact and the JIF cut-off [ 78 ]. We acknowledge that the term “generalizability” may apply differently for methodological studies, especially when in many instances it is possible to include the entire target population in the sample studied.
Finally, methodological studies are not exempt from information bias which may stem from discrepancies in the included research reports [ 79 ], errors in data extraction, or inappropriate interpretation of the information extracted. Likewise, publication bias may also be a concern in methodological studies, but such concepts have not yet been explored.
To inform discussions about methodological studies and the development of guidance on what should be reported, we have outlined some key features of methodological studies that can be used to classify them. For each of the categories outlined below, we provide an example. In our experience, the choice of approach to completing a methodological study can be informed by asking the following four questions:
A methodological study may be focused on exploring sources of bias in primary or secondary studies (meta-bias), or how bias is analyzed. We have taken care to distinguish bias (i.e. systematic deviations from the truth irrespective of the source) from reporting quality or completeness (i.e. not adhering to a specific reporting guideline or norm). An example of where this distinction would be important is in the case of a randomized trial with no blinding. This study (depending on the nature of the intervention) would be at risk of performance bias. However, if the authors report that their study was not blinded, they would have reported adequately. In fact, some methodological studies attempt to capture both “quality of conduct” and “quality of reporting”, such as Richie et al., who reported on the risk of bias in randomized trials of pharmacy practice interventions [ 80 ]. Babic et al. investigated how risk of bias was used to inform sensitivity analyses in Cochrane reviews [ 81 ]. Further, biases related to choice of outcomes can also be explored. For example, Tan et al. investigated differences in treatment effect size based on the outcome reported [ 82 ].
Methodological studies may report quality of reporting against a reporting checklist (i.e. adherence to guidelines) or against expected norms. For example, Croituro et al. report on the quality of reporting in systematic reviews published in dermatology journals based on their adherence to the PRISMA statement [ 83 ], and Khan et al. described the quality of reporting of harms in randomized controlled trials published in high impact cardiovascular journals based on the CONSORT extension for harms [ 84 ]. Other methodological studies investigate reporting of certain features of interest that may not be part of formally published checklists or guidelines. For example, Mbuagbaw et al. described how often the implications for research are elaborated using the Evidence, Participants, Intervention, Comparison, Outcome, Timeframe (EPICOT) format [ 30 ].
Sometimes investigators may be interested in how consistent reports of the same research are, as it is expected that there should be consistency between: conference abstracts and published manuscripts; manuscript abstracts and manuscript main text; and trial registration and published manuscript. For example, Rosmarakis et al. investigated consistency between conference abstracts and full text manuscripts [ 85 ].
In addition to identifying issues with reporting in primary and secondary studies, authors of methodological studies may be interested in determining the factors that are associated with certain reporting practices. Many methodological studies incorporate this, albeit as a secondary outcome. For example, Farrokhyar et al. investigated the factors associated with reporting quality in randomized trials of coronary artery bypass grafting surgery [ 53 ].
Methodological studies may also be used to describe methods or compare methods, and the factors associated with methods. Muller et al. described the methods used for systematic reviews and meta-analyses of observational studies [ 86 ].
Some methodological studies synthesize results from other methodological studies. For example, Li et al. conducted a scoping review of methodological reviews that investigated consistency between full text and abstracts in primary biomedical research [ 87 ].
Some methodological studies may investigate the use of names and terms in health research. For example, Martinic et al. investigated the definitions of systematic reviews used in overviews of systematic reviews (OSRs), meta-epidemiological studies and epidemiology textbooks [ 88 ].
In addition to the previously mentioned types of methodological studies, there may exist other types not captured here.
Most methodological studies are purely descriptive and report their findings as counts (percent) and means (standard deviation) or medians (interquartile range). For example, Mbuagbaw et al. described the reporting of research recommendations in Cochrane HIV systematic reviews [ 30 ]. Gohari et al. described the quality of reporting of randomized trials in diabetes in Iran [ 12 ].
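The descriptive summaries mentioned above can be sketched with the Python standard library; the item counts below are hypothetical, not taken from any study cited here.

```python
import statistics

def describe(values):
    """Median and interquartile range (IQR bounds), the summaries
    typically reported in descriptive methodological studies."""
    q1, _, q3 = statistics.quantiles(values, n=4)  # quartile cut points
    return statistics.median(values), (q1, q3)

# Hypothetical data: number of reporting-checklist items satisfied
# by each of 10 included reviews.
items_satisfied = [12, 15, 9, 22, 18, 14, 11, 20, 16, 13]
median, (q1, q3) = describe(items_satisfied)
# Reported as: median 14.5 (IQR 11.75 to 18.5)
```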
Some methodological studies are analytical wherein “analytical studies identify and quantify associations, test hypotheses, identify causes and determine whether an association exists between variables, such as between an exposure and a disease.” [ 89 ] In the case of methodological studies all these investigations are possible. For example, Kosa et al. investigated the association between agreement in primary outcome from trial registry to published manuscript and study covariates. They found that larger and more recent studies were more likely to have agreement [ 15 ]. Tricco et al. compared the conclusion statements from Cochrane and non-Cochrane systematic reviews with a meta-analysis of the primary outcome and found that non-Cochrane reviews were more likely to report positive findings. These results are a test of the null hypothesis that the proportions of Cochrane and non-Cochrane reviews that report positive results are equal [ 90 ].
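A test of equal proportions like the one described above can be sketched as a pooled two-proportion z-test; the counts below are hypothetical and not taken from Tricco et al.

```python
import math

def two_proportion_z_test(x1, n1, x2, n2):
    """Pooled two-proportion z-test of H0: p1 == p2."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)  # pooled proportion under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: 30/100 Cochrane reviews vs 55/100 non-Cochrane
# reviews with positive conclusion statements.
z, p = two_proportion_z_test(30, 100, 55, 100)
```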
Methodological reviews with narrow research questions may be able to include the entire target population. For example, in the methodological study of Cochrane HIV systematic reviews, Mbuagbaw et al. included all of the available studies ( n = 103) [ 30 ].
Many methodological studies use random samples of the target population [ 33 , 91 , 92 ]. Alternatively, purposeful sampling may be used, limiting the sample to a subset of research-related reports published within a certain time period, or in journals with a certain ranking or on a topic. Systematic sampling can also be used when random sampling may be challenging to implement.
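The sampling strategies above can be sketched as follows; the sampling frame of report identifiers is hypothetical.

```python
import random

def simple_random_sample(frame, k, seed=2024):
    """Simple random sample of k reports; fixed seed for reproducibility."""
    return random.Random(seed).sample(frame, k)

def systematic_sample(frame, k):
    """Every i-th report from the frame, where i = len(frame) // k."""
    interval = len(frame) // k
    return frame[::interval][:k]

# Hypothetical sampling frame of 500 eligible research reports.
frame = [f"report_{i:03d}" for i in range(1, 501)]
random_50 = simple_random_sample(frame, 50)
systematic_50 = systematic_sample(frame, 50)
```

A fixed seed (or a documented starting point for systematic sampling) keeps the sample reproducible, which supports the transparency goals discussed later.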
Many methodological studies use a research report (e.g. full manuscript of study, abstract portion of the study) as the unit of analysis, and inferences can be made at the study-level. However, both published and unpublished research-related reports can be studied. These may include articles, conference abstracts, registry entries etc.
Some methodological studies report on items which may occur more than once per article. For example, Paquette et al. report on subgroup analyses in Cochrane reviews of atrial fibrillation in which 17 systematic reviews planned 56 subgroup analyses [ 93 ].
This framework is outlined in Fig. 2 (a proposed framework for methodological studies).
Methodological studies have examined different aspects of reporting such as quality, completeness, consistency and adherence to reporting guidelines. As such, many of the methodological study examples cited in this tutorial are related to reporting. However, as an evolving field, the scope of research questions that can be addressed by methodological studies is expected to increase.
In this paper we have outlined the scope and purpose of methodological studies, along with examples of instances in which various approaches have been used. In the absence of formal guidance on the design, conduct, analysis and reporting of methodological studies, we have provided some advice to help make methodological studies consistent. This advice is grounded in good contemporary scientific practice. Generally, the research question should tie in with the sampling approach and planned analysis. We have also highlighted the variables that may inform findings from methodological studies. Lastly, we have provided suggestions for ways in which authors can categorize their methodological studies to inform their design and analysis.
Abbreviations
CONSORT: Consolidated Standards of Reporting Trials
EPICOT: Evidence, Participants, Intervention, Comparison, Outcome, Timeframe
GRADE: Grading of Recommendations, Assessment, Development and Evaluations
PICOT: Participants, Intervention, Comparison, Outcome, Timeframe
PRISMA: Preferred Reporting Items for Systematic reviews and Meta-Analyses
SWAR: Studies Within a Review
SWAT: Studies Within a Trial
LM conceived the idea and drafted the outline and paper. DOL and LT commented on the idea and draft outline. LM, LP and DOL performed literature searches and data extraction. All authors (LM, DOL, LT, LP, DBA) reviewed several draft versions of the manuscript and approved the final manuscript.
This work did not receive any dedicated funding.
Ethics approval and consent to participate.
Not applicable.
Competing interests.
DOL, DBA, LM, LP and LT are involved in the development of a reporting guideline for methodological studies.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
A research report is a concise document that summarizes the findings, methods, and conclusions of a research study or investigation. There are various types of research reports available for different purposes.
It typically includes details on the research question, methodology, data analysis, and results, providing a structured and informative account of the research process and outcomes.
1. Technical or Scientific Reports
Technical and scientific reports communicate research findings to experts and professionals in a particular field.
Popular reports are designed for a general audience and aim to inform, educate, or entertain on a wide range of topics.
Survey reports include data collected through surveys and focus on presenting insights and opinions on specific issues or questions.
Market research reports provide insights into consumer behavior, market trends, and industry analysis.
Case study reports focus on an in-depth examination of a single entity, often to explore complex, real-life situations.
Analytical research reports involve a deep analysis of data to uncover patterns, trends, or relationships.
Literature review reports provide an overview of existing research on a specific topic, highlighting gaps and trends.
Experimental research reports involve controlled experiments to test hypotheses and determine if the results support or reject the hypothesis.
Descriptive research reports aim to provide a comprehensive picture of a phenomenon, group, or situation. They seek to answer the “what” and “how” questions.
Exploratory research reports are conducted when there is little prior knowledge about a subject. They aim to identify key variables and research questions.
Explanatory research reports seek to understand the relationships between variables and explain why certain phenomena occur.
Policy or white papers aim to inform policymakers, stakeholders, and the public about specific issues and recommend actions.
Below are the common components to include when writing any type of research report.
1. Title Page:
2. Abstract: A concise summary of the research, including the research question or objective, methodology, key findings, and implications. It is typically 150-250 words long.
3. Table of Contents: Include a list of sections and subsections with page numbers.
4. List of Figures and Tables: If your research includes figures or tables, list them with their corresponding page numbers, similar to a table of contents for visual and numerical material.
5. List of Abbreviations and Symbols: Include any abbreviations or symbols you have used in the report and their meanings.
6. Introduction:
7. Literature Review:
8. Methodology:
9. Results:
10. Discussion:
11. Conclusion:
12. References: Include a list of all the sources cited in your report in a standardized citation style (e.g., APA, MLA, Chicago).
Let us see an example of a research report.
Research Report: The Impact of Artificial Intelligence on the Labor Market
This research study explores the profound changes occurring in the labor market due to the increasing adoption of artificial intelligence (AI) technologies. The study examines the potential benefits and challenges AI poses for the workforce, job displacement, and the skills required in the future job market.
The introduction section provides an overview of the research topic. It explains the significance of studying the impact of AI on the labor market, outlines the research questions, and previews the structure of the report.
The literature review section reviews existing research on the effects of AI on employment and the labor market. It discusses the different perspectives on whether AI will create new jobs or lead to job displacement. It also explores the skills and education required for the future workforce.
This section explains the research methods used, such as data collection methods, sources, and analytical techniques. It outlines how data on AI adoption, job displacement, and future job projections were gathered and analyzed.
The results section presents the key findings of the study. It includes data on the extent of AI adoption across industries, job displacement rates, and projections for AI-related occupations.
The discussion section interprets the results in the context of the research questions. It analyzes the potential benefits and challenges AI poses for the labor market, discusses policy implications, and explores the role of education and training in preparing the workforce for the AI era.
In conclusion, this research highlights the transformative impact of artificial intelligence on the labor market. While AI brings opportunities for innovation and efficiency, it also presents challenges related to job displacement and workforce adaptation. Preparing for this evolving job landscape is crucial for individuals and policymakers.
Given below are various types of research reports that researchers and organizations use to present findings, progress, and other information.
| Description | Example |
| --- | --- |
| Outlines a plan for a project or research for approval or funding. | Research proposal submitted to study the impact of climate change on local ecosystems. |
| Generated at regular intervals to provide project updates. | Weekly sales reports summarizing product sales figures. |
| Detailed, structured reports often used in academic, scientific, or business settings. | Formal business report analyzing a company’s financial performance for the year. |
| Less structured reports for quick internal communication. | Email summarizing key takeaways from a team meeting. |
| Concise documents offering a brief overview of a specific topic. | A one-page summary of customer feedback from a product launch. |
| Comprehensive reports with in-depth analysis and information. | 100-page research report on the effects of a new drug on a medical condition. |
| Focus on data analysis and provide insights or recommendations. | Market research report analyzing consumer behavior trends and recommending marketing strategies. |
| Convey information without providing analysis or recommendations. | Report detailing the steps of a manufacturing process for new employees. |
| Flow within the organizational hierarchy, moving up or down. | Report from a department manager to the company’s vice president on department performance. |
| Sent between individuals or departments at the same organizational level. | Report from one project manager to another project manager in a different department. |
| Created and distributed within an organization for internal purposes. | Internal audit report examining the company’s financial records for compliance. |
| Prepared for external audiences, such as clients, investors, or regulators. | A publicly traded company publishes an annual report for shareholders and the general public. |
Here is why the different types of research reports are important.
Listed below are some limitations of different types of research reports.
Different types of research reports are important for sharing knowledge, making smart choices, and moving forward in different areas of study. It’s vital for both researchers and those who use research to grasp the different kinds of reports, what goes into them, and why they matter.
Q1. Are research reports the same as research papers? Answer: Research reports and research papers share similarities but have distinct purposes and structures. Research papers are often more academic and can vary in structure, while research reports are typically more structured and cater to a broader audience.
Q2. How do I choose the right type of research report for my study? Answer: The choice of research report type depends on your research goals, audience, and the nature of your study. Consider whether you are conducting scientific research, market analysis, academic research, or policy analysis, and select the format that aligns with your objectives.
Q3. Can research reports be used as references in other research reports? Answer: Yes, research reports can be cited and used as references in other research reports as long as they are credible sources. Citing previous research reports adds depth and credibility to your work.
This article has covered the main types of research reports used across research methodologies, along with their format, components, and examples.
Research reporting is the oral or written presentation of research findings in such detail and form that they can be readily understood and assessed by society and, in particular, by other researchers.
As noted earlier, report writing is the final stage of the research process, and its purpose is to convey the whole result of the study to interested persons. Report writing is common to both academic and managerial situations. In academics, a research report is prepared for comprehensive and application-oriented learning. In businesses and organisations, reports form the basis of decision making.
According to C. A. Brown , “A report is a communication from someone who has information to someone who wants to use that information.”
According to Goode and Hatt, “The preparation of report is the final stage of research, and its purpose is to convey to the interested persons the whole result of the study, in sufficient detail and so arranged as to enable each reader to comprehend the data and to determine for himself the validity of the conclusions.”
It is clear from the above definitions that a research report is a brief account of the problem investigated, the justification for its selection, and the procedure of analysis and interpretation. It is a summary of the entire research proceedings.
In other words, it can be defined as a written document which presents information in a specialized and concise manner.
Although no hard and fast rules can be laid down, the report must contain the following points.
The preliminary part may have seven major components – cover, title, preface, acknowledgement, table of contents, list of tables, list of graphs. Long reports presented in book form have a cover made up of a card sheet. The cover contains title of the research report, the authority to whom the report is submitted, name of the author, etc.
The preface introduces the report to the readers. It gives a very brief introduction of the report. In the acknowledgements author mention names of persons and organisations that have extended co-operation and helped in the various stages of research. Table of contents is essential. It gives the title and page number of each chapter.
The introduction of the research report should clearly and logically bring out the background of the problem addressed in the research. The purpose of the introduction is to introduce the research project to the readers. A clear statement of the problem with specific questions to be answered is presented in the introduction. It contains a brief outline of the chapters.
The third section reviews the important literature related to the study. A comprehensive review of the research literature referred to must be made. Previous research studies and the important writings in the area under study should be reviewed. Review of literature is helpful to provide a background for the development of the present study.
The researcher may review concerned books, articles published in edited books, journals and periodicals. Researcher may also take review of articles published in leading newspapers. A researcher should study working papers/discussion papers/study reports. It is essential for a broad conclusion and indications for further research.
Research methodology is an integral part of the research. It should clearly indicate the universe and the selection of samples, techniques of data collection, analysis and interpretation, statistical techniques, etc.
Results contain the pilot study, processing of data, hypothesis/model testing, data analysis and interpretation, tables and figures, etc. This is the heart of the research report. If a pilot study was used, its purpose should be given in the research methodology.
The collected data and the information should be edited, coded, tabulated and analysed with a view to arriving at a valid and authentic conclusion. Tables and figures are used to clarify the significant relationship. The results obtained through tables, graphs should be critically interpreted.
The concluding remarks should discuss the results obtained in the earlier sections, as well as their usefulness and implications. It contains findings, conclusions, shortcomings, suggestions to the problem and direction for future research. Findings are statements of factual information based upon the data analysis.
Conclusions must clearly explain whether the hypotheses have been supported or rejected. This part requires great expertise and preciseness. A report should also refer to the limitations of the applicability of the research inferences. It is essential to suggest the theoretical, practical and policy implications of the research. The suggestions should be supported by scientific and logical arguments. The future direction of research based on the work completed should also be outlined.
The bibliography is an alphabetic list of books, journal articles, reports, etc, published or unpublished, read, referred to, examined by the researcher in preparing the report. The bibliography should follow standard formats for books, journal articles, research reports.
The end of the research report may consist of appendices, listed in respect of all technical data. Appendices are for the purpose of providing detailed data or information that would be too cumbersome within the main body of the research report.
Report writing is an important communication medium in organisations, and the most crucial findings may come to light through a research report. Reports are common to both academics and managers: in academics they are used for comprehensive and application-oriented learning, while in organisations they form the basis of decision making. The importance of report writing can be discussed as follows.
Through research reports, a manager or an executive can quickly get an idea of a current scenario which improves his information base for making sound decisions affecting future operations of the company or enterprise. The research report acts as a means of communication of various research findings to the interested parties, organisations and general public.
Good report writing plays a significant role in conveying unknown facts about the phenomenon to the concerned parties. This may provide new insights and new opportunities to people. Research reports play a key role in effective decision making in marketing, production, banking, materials, human resource development and government. Good report writing is also used for economic planning and optimum utilisation of resources for the development of a nation.
Report writing facilitates the validation of generalisation. A research report is an end product of research. As earlier said that report writing provides useful information in arriving at rational decisions that may reform the business and society. The findings, conclusions, suggestions and recommendations are useful to academicians, scholars and policymakers. Report writing provides reference material for further research in the same or similar areas of research to the concerned parties.
While preparing a research report, a researcher should take proper precautions. Report writing should be simple, lucid and systematic. Reports should be written promptly, without interrupting the continuity of thought, and should sustain the interest of readers.
Report writing is a highly skilled job. It is a process of analysing, understanding and consolidating the findings and projecting a meaningful view of the phenomenon studied. Good report writing is essential for effective communication.
A good report must possess certain essential qualities. Report writing is a time-consuming and expensive exercise; reports therefore have to be sharply focused in purpose, content and readership. There is no single universally accepted method of writing a research report.
Following are the general steps in writing a research report: preparing the research outline, preparing the rough draft, rewriting and polishing, and writing the final draft.
This is the first and most important step in writing a research report. It is concerned with the development of the subject. The subject matter should be written in a clear, logical and concise manner. The style adopted should be open, straightforward and dignified; colloquial language should be avoided.
The data, and the reliability and validity of the results of the statistical analysis, should be presented in the form of tables, figures and equations. Any redundancy in the data or results should be eliminated.
The research outline is an organisational framework prepared by the researcher well in advance. It is an aid to the logical organisation of material and a reminder of the points to be stressed in the report. In the process of writing, the outline may be revised if need be.
The time and place of the study, its scope and limitations, the study design, a summary of the pilot study, and the methods of data collection, analysis and interpretation may all be included in a research outline.
Having prepared the primary and secondary data, the researcher has to prepare a rough draft. While preparing the rough draft, the researcher should keep the objectives of the research in mind and focus on one objective at a time. The researcher should make a checklist of the important points that need to be covered in the manuscript, and should use a dictionary and relevant reference materials as and when required.
This is an important step in writing a research report, and it takes more time than the rough draft. While rewriting and polishing, the researcher should check the report for weaknesses in logical development or presentation, and should take breaks between rounds of rewriting and polishing, since this gives time for ideas to incubate.
The last step is writing the final draft. The language of the report should be simple, employing appropriate words and expressions, and should avoid vague phrases such as "it seems" and "there may be".
The report should avoid personal pronouns such as I, we, my and us, substituting expressions such as "the researcher" or "the investigator". Before drafting the final report, it is advisable to prepare a first draft for critical consideration and possible improvement; this will be helpful in writing the final draft. Finally, the report should be logically outlined, with future directions for the research based on the work completed.
A research report is a means of conveying the research study to a specific target audience, and certain precautions should be taken while preparing it.
A research report is designed to convey and record information that will be of practical use to the reader. It is organized into distinct units of specific and highly visible information. The kind of audience addressed decides the type of report.
Research reports can be categorized on two bases: the information contained and the mode of representation. On the basis of the information contained, the results of a research report can be presented in the following ways:
A technical report is written for other researchers. In a technical report, importance is given mainly to the methods used to collect the information and data, the assumptions made, and the techniques used to present the findings and data; these are the main features of a technical report.
A popular report is formulated when the conclusions of the research findings need to be conveyed to a general readership. The main point to keep in mind while formulating a popular report is that it must be simple and attractive: it should be written in a manner understandable to all, and made attractive through large print, ample sub-headings and the occasional use of cartoons.
On the basis of representation, the results of a research report can be presented in the following ways:
A written report plays a vital role in every business operation. The manner in which an organization writes business letters and business reports creates an impression of its standards. An organization should therefore emphasize improving the writing skills of its employees in order to maintain effective relations with its customers.
Writing effective written reports requires a lot of hard work. Before you begin writing, it is therefore important to know the objective, i.e., the purpose of writing, and to collect and organize the required data.
At times, oral presentation of the results that are drawn out of research is considered effective, particularly in cases where policy recommendations are to be made. This approach proves beneficial because it provides a medium of interaction between a listener and a speaker. This leads to a better understanding of the findings and their implications.
However, the main drawback of oral presentation is the lack of any permanent record of the research. Oral presentation is more effective when supported by visual devices, such as slides, wall charts and whiteboards, which help in better understanding of the research report.
Report writing in research methodology
A report is a well-written formal document that briefly describes the process and findings of a research study. It outlines the systematic investigation, the recommendations, and the gaps that need further inquiry. A well-crafted research report covers all the main areas of the research process. In this article, we discuss how to write a report in research methodology.
Below are some points that make the report crucial in research methodology:
A report contributes to the existing body of knowledge and communicates the findings of the investigation effectively.
A research report identifies knowledge gaps that can be investigated further. The report shows what and how much has been done.
A research report enables you to present research information in a concise and precise manner.
A report is a time-efficient document: rather than detailing the findings at length, it is written briefly and can be emailed to the people concerned.
Structure of a report in research methodology
You can write the report in the following structure:
The title of your research should point to the objectives, aims, and findings of your systematic investigation.
The table of contents enables readers to navigate your research report.
In the abstract, the reader gets an overview of the important aspects of the research, such as the method, data collection, and findings. While writing the abstract, you should follow the format of the 5 Ws and 1 H: what, where, when, who, why, and how.
State the aims of the research and the problems that motivated it. You should also indicate whether you have achieved the objectives of the research or whether further work is required.
In the literature review, you survey the existing knowledge about the research topic. Here you can also present the research hypothesis and its implications.
In this portion, describe the research process concisely but in sufficient depth, covering the methodology, data collection, sample, research subjects, and analysis.
In this portion, you are expected to show the results and findings of your systematic investigation.
Now you will further explain the results of the research outlined earlier. Justify each finding and show whether the outcomes agree with the hypothesis.
Finally, write a summary of your research in which you recap the whole report.
In this section, mention all the primary and secondary sources used during research.
Before writing a report in research methodology, you must create an outline of its core areas and then fill in the details concisely. Below are some tips to follow while writing a report:
Always keep your audience in mind so that you can set the right tone for the report. If the report is for a general audience, present the information simply; if it is for a specialist audience, you can use field-specific or technical terms as well.
In report writing, exclude all irrelevant information and highlight only the important findings and data. Present an abridged version of the systematic investigation.
You can use illustrations and visual presentations to make your data easier to absorb. Charts, graphs, and relevant images also bring additional credibility to the systematic investigation.
The title of the report should be clear and precise. It must contain keywords of your research. The title should show a clear idea of the investigation so the readers can easily grasp the focus of the research.
After completing the report, you must proofread and edit it as needed before publishing. A second look helps ensure the information is valid and authentic. You can ask someone to go through your report or use editing and proofreading software.
A report is a concise document that captures the essence of the research, so you should be very careful while writing it. It should be accurate, clear, and concise, so that its findings communicate effectively to readers.
Methodology
The American Trends Panel (ATP), created by Pew Research Center, is a nationally representative panel of randomly selected U.S. adults. Panelists participate via self-administered web surveys. Panelists who do not have internet access at home are provided with a tablet and wireless internet connection. Interviews are conducted in both English and Spanish. The panel is being managed by Ipsos.
Data in this report is drawn from ATP Wave 141, conducted from Jan. 22 to 28, 2024, and includes an oversample of non-Hispanic Asian adults, non-Hispanic Black men and Hispanic men in order to provide more precise estimates of the opinions and experiences of these smaller demographic subgroups. These oversampled groups are weighted back to reflect their correct proportions in the population. A total of 5,146 panelists responded out of 5,604 who were sampled, for a response rate of 92%. The cumulative response rate accounting for nonresponse to the recruitment surveys and attrition is 3%. The break-off rate among panelists who logged on to the survey and completed at least one item is 1%. The margin of sampling error for the full sample of 5,146 respondents is plus or minus 1.7 percentage points.
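The headline figures in this paragraph can be reproduced with a short calculation. The sketch below is illustrative only: Pew's published margin of error also incorporates a design effect from weighting, which is why it is larger than the simple-random-sampling value computed here.

```python
import math

def response_rate(completed, sampled):
    """Share of sampled panelists who responded, as a percentage."""
    return 100.0 * completed / sampled

def srs_margin_of_error(n, z=1.96):
    """Margin of sampling error (percentage points) for a proportion under
    simple random sampling at the worst case p = 0.5, 95% confidence."""
    return z * math.sqrt(0.25 / n) * 100.0

print(round(response_rate(5146, 5604)))     # 92 (percent)
print(round(srs_margin_of_error(5146), 2))  # 1.37 percentage points
```

The reported plus or minus 1.7 points implies a design effect of roughly (1.7 / 1.37) squared, about 1.5, which is typical for a weighted panel survey.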
This is a Pew Research Center report from the Pew-Knight Initiative, a research program funded jointly by The Pew Charitable Trusts and the John S. and James L. Knight Foundation. Find related reports online at https://www.pewresearch.org/pew-knight/ .
The ATP was created in 2014, with the first cohort of panelists invited to join the panel at the end of a large, national, landline and cellphone random-digit-dial survey that was conducted in both English and Spanish. Two additional recruitments were conducted using the same method in 2015 and 2017, respectively. Across these three surveys, a total of 19,718 adults were invited to join the ATP, of whom 9,942 (50%) agreed to participate.
In August 2018, the ATP switched from telephone to address-based sampling (ABS) recruitment. A study cover letter and a pre-incentive are mailed to a stratified, random sample of households selected from the U.S. Postal Service’s Delivery Sequence File. This Postal Service file has been estimated to cover as much as 98% of the population, although some studies suggest that the coverage could be in the low 90% range. 1
Within each sampled household, the adult with the next birthday is asked to participate. Other details of the ABS recruitment protocol have changed over time but are available upon request. 2
We have recruited a national sample of U.S. adults to the ATP approximately once per year since 2014. In some years, the recruitment has included additional efforts (known as an “oversample”) to boost sample size with underrepresented groups. For example, Hispanic adults, Black adults and Asian adults were oversampled in 2019, 2022 and 2023, respectively.
Across the six address-based recruitments, a total of 23,862 adults were invited to join the ATP, of whom 20,917 agreed to join the panel and completed an initial profile survey. Of the 30,859 individuals who have ever joined the ATP, 11,927 remained active panelists and continued to receive survey invitations at the time this survey was conducted.
The American Trends Panel never uses breakout routers or chains that direct respondents to additional surveys.
The overall target population for this survey was noninstitutionalized persons ages 18 and older living in the U.S., including Alaska and Hawaii. It featured a stratified random sample from the ATP in which Hispanic men, non-Hispanic Black men and non-Hispanic Asian adults were selected with certainty. The remaining panelists were sampled at rates designed to ensure that the share of respondents in each stratum is proportional to its share of the U.S. adult population to the greatest extent possible. Respondent weights are adjusted to account for differential probabilities of selection as described in the Weighting section below.
The questionnaire was developed by Pew Research Center in consultation with Ipsos. The web program was rigorously tested on both PC and mobile devices by the Ipsos project management team and Pew Research Center researchers. The Ipsos project management team also populated test data that was analyzed in SPSS to ensure the logic and randomizations were working as intended before launching the survey.
All respondents were offered a post-paid incentive for their participation. Respondents could choose to receive the post-paid incentive in the form of a check or a gift code to Amazon.com or could choose to decline the incentive. Incentive amounts ranged from $5 to $20 depending on whether the respondent belongs to a part of the population that is harder or easier to reach. Differential incentive amounts were designed to increase panel survey participation among groups that traditionally have low survey response propensities.
The data collection field period for this survey was Jan. 22 to Jan. 28, 2024. Postcard notifications were mailed to a subset of ATP panelists with a known residential address on Jan. 22. 3
Invitations were sent out in two separate launches: soft launch and full launch. Sixty panelists were included in the soft launch, which began with an initial invitation sent on Jan. 22. The ATP panelists chosen for the initial soft launch were known responders who had completed previous ATP surveys within one day of receiving their invitation. All remaining English- and Spanish-speaking sampled panelists were included in the full launch and were sent an invitation on Jan. 23.
All panelists with an email address received an email invitation and up to two email reminders if they did not respond to the survey. All ATP panelists who consented to SMS messages received an SMS invitation and up to two SMS reminders.
To ensure high-quality data, the Center’s researchers performed data quality checks to identify any respondents showing clear patterns of satisficing. This includes checking for whether respondents left questions blank at very high rates or always selected the first or last answer presented. As a result of this checking, three ATP respondents were removed from the survey dataset prior to weighting and analysis.
The ATP data is weighted in a multistep process that accounts for multiple stages of sampling and nonresponse that occur at different points in the survey process. First, each panelist begins with a base weight that reflects their probability of selection for their initial recruitment survey. These weights are then rescaled and adjusted to account for changes in the design of ATP recruitment surveys from year to year. Finally, the weights are calibrated to align with the population benchmarks in the accompanying table to correct for nonresponse to recruitment surveys and panel attrition. If only a subsample of panelists was invited to participate in the wave, this weight is adjusted to account for any differential probabilities of selection.
Among the panelists who completed the survey, this weight is then calibrated again to align with the population benchmarks identified in the accompanying table and trimmed at the 2nd and 98th percentiles to reduce the loss in precision stemming from variance in the weights. This trimming is performed separately among non-Hispanic Black, non-Hispanic Asian, Hispanic and all other respondents. Sampling errors and tests of statistical significance take into account the effect of weighting.
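As an illustration of the trimming step described above, weights can be clipped at the 2nd and 98th percentiles and then rescaled so the total weight is preserved. This is a generic sketch, not Pew's production code, and it glosses over the group-by-group trimming mentioned in the text.

```python
import numpy as np

def trim_weights(w, lower_pct=2, upper_pct=98):
    """Clip survey weights at the given percentiles, then rescale so the
    trimmed weights sum to the same total as the originals."""
    w = np.asarray(w, dtype=float)
    lo, hi = np.percentile(w, [lower_pct, upper_pct])
    trimmed = np.clip(w, lo, hi)
    return trimmed * (w.sum() / trimmed.sum())

# A skewed toy weight distribution with a few very large weights
weights = np.array([1.0] * 95 + [5.0, 10.0, 20.0, 40.0, 80.0])
trimmed = trim_weights(weights)
print(trimmed.max() < weights.max())  # True: extreme weights pulled in
```

Trimming trades a small amount of bias for a reduction in the variance contributed by extreme weights, which is why it is applied after calibration.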
The following table shows the unweighted sample sizes and the error attributable to sampling that would be expected at the 95% level of confidence for different groups in the survey.
Sample sizes and sampling errors for other subgroups are available upon request. In addition to sampling error, one should bear in mind that question wording and practical difficulties in conducting surveys can introduce error or bias into the findings of opinion polls.
ABOUT PEW RESEARCH CENTER Pew Research Center is a nonpartisan fact tank that informs the public about the issues, attitudes and trends shaping the world. It conducts public opinion polling, demographic research, media content analysis and other empirical social science research. Pew Research Center does not take policy positions. It is a subsidiary of The Pew Charitable Trusts .
© 2024 Pew Research Center
A new technology can extract lithium from brines at an estimated cost of under 40% that of today’s dominant extraction method, and at just a fourth of lithium’s current market price. The new technology would also be much more reliable and sustainable in its use of water, chemicals, and land than today’s technology, according to a study published today in Matter by Stanford University researchers.
Global demand for lithium has surged in recent years, driven by the rise of electric vehicles and renewable energy storage. The dominant source of lithium extraction today relies on evaporating brines in huge ponds under the sun for a year or more, leaving behind a lithium-rich solution, after which heavy use of potentially toxic chemicals finishes the job. Water with a high concentration of salts, including lithium, occurs naturally in some lakes, hot springs, and aquifers, and as a byproduct of oil and natural gas operations and of seawater desalination.
Many scientists are searching for less expensive, more efficient, reliable, and environmentally friendly lithium extraction methods. These are generally forms of direct lithium extraction, which bypasses the big evaporation ponds. The new study reports results from a method using an approach known as "redox-couple electrodialysis," or RCE, along with cost estimates.
“The benefits to efficiency and cost innate to our approach make it a promising alternative to current extraction techniques and a potential game changer for the lithium supply chain,” said Yi Cui , the study’s senior author and a professor of materials science and engineering in the School of Engineering .
The research team estimates its approach costs $3,500 to $4,400 per ton of high-purity lithium hydroxide, which can be converted to battery-grade lithium carbonate inexpensively, compared with costs of about $9,100 per ton for the dominant technology for extracting lithium from brine. The current market price for battery-grade lithium carbonate is almost $15,000 per ton, but a shortage in late 2022 drove the volatile lithium market price to $80,000.
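These cost claims are easy to check with back-of-the-envelope arithmetic. The sketch below is a quick sanity check using the figures quoted in the article, not part of the study's techno-economic model.

```python
rce_cost_low, rce_cost_high = 3500, 4400  # $/ton, RCE method (study estimate)
brine_cost = 9100                          # $/ton, dominant brine-evaporation method
market_price = 15000                       # $/ton, approx. battery-grade lithium carbonate

# Low-end RCE cost is under 40% of the conventional brine method...
print(round(rce_cost_low / brine_cost, 2))    # 0.38
# ...and roughly a quarter of the current market price.
print(round(rce_cost_low / market_price, 2))  # 0.23
```

Note that the "under 40%" and "a fourth" claims hold for the low end of the study's estimate range; the high end comes to roughly 48% of conventional brine-extraction cost.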
Lithium, so far, has had a critical role in the global transition to sustainable energy. The demand for lithium is expected to rise from approximately half a million metric tons in 2021 to an estimated 3 million to 4 million metric tons by 2030, according to a report by McKinsey & Co. This sharp increase is driven mostly by the rapid adoption of electric vehicles and renewable energy storage systems, both of which rely heavily on batteries.
Traditionally, lithium has been extracted from mined rocks, a method that is even more expensive, energy intensive, and driven by toxic chemicals than brine extraction. As a result, the dominant method for lithium extraction today has switched to evaporating salt-lake brines, though still at high financial and environmental costs. This method is also heavily dependent on specific climatic conditions that limit the number of commercially viable salt lakes, throwing into doubt the lithium industry’s ability to meet rising demand.
The new method from Cui and his team uses electricity to move lithium through a solid-state electrolyte membrane from water with a low lithium concentration to a more concentrated, high-purity solution. Each of a series of cells increases the lithium concentration to a solution from which final chemical isolation is relatively easy. This approach uses less than 10% of the electricity required by current brine extraction technology and has a lithium selectivity of almost 100%, making it very efficient.
“The advantages displayed by our approach over conventional lithium extraction techniques enhance its feasibility in eco-friendly and cost-effective lithium production,” said co-lead author of the study, Rong Xu , a former postdoctoral researcher in Cui’s lab, now a faculty member at Xi'an Jiaotong University in China. “Eventually, we hope our method will significantly advance electrified transportation and the ability to store renewable energy.”
The study includes a brief techno-economic analysis comparing the costs of current lithium extraction with those of the RCE approach. The new method is expected to be relatively inexpensive due mostly to lower capital costs. It eliminates the need for large-scale solar evaporation ponds, which are expensive to build and maintain. The new method’s use of significantly less electricity, water, and chemical agents – aside from the sustainability benefits – further lowers costs.
By avoiding the extensive land use and water consumption of traditional methods, the RCE approach also reduces the ecological footprint of lithium production.
The RCE method works with a variety of saline waters, including those with varying concentrations of lithium, sodium, and potassium. Study experiments showed that the new technology could extract lithium, for example, from wastewater resulting from oil production. It could potentially be used to extract lithium from seawater, which has lower lithium concentrations than brines. Lithium extraction from seawater using conventional methods is not commercially viable today.
“Direct lithium extraction techniques like ours have been in development for a while. The main contending technologies to date have significant drawbacks, like the inability to operate continuously, high energetic costs, or relatively low efficiency,” said Ge Zhang , a Stanford postdoctoral scholar and co-author of the study. “Our method seems to have none of these drawbacks. Its continuous operation could contribute to a more reliable lithium supply and calm the volatile lithium market.”
The scalability of the RCE method is also promising. In experiments where the scale of the device was increased fourfold, the RCE method continued to perform well, with both energy efficiency and lithium selectivity remaining very high.
“This suggests that the method could be applied on an industrial scale, making it a viable alternative to current extraction technologies,” said Cui.
Nevertheless, the study highlights some areas for further research and development. The researchers experimented with two versions of their method. One extracted the lithium more quickly and used more electricity. The other was slower and used less electricity. The slower extraction resulted in lower costs and a more stable membrane for extracting the lithium continuously and for a long time, compared with the faster extraction. Under high current densities and faster water flow, the membranes degraded, leading to reduced efficiency over time. Even though this was not evident in the slower extracting experiment, the researchers want to optimize the design of their device for potentially faster extraction. They are already testing other promising materials for the membrane.
Also, the researchers did not demonstrate lithium extraction from seawater in this study.
“In principle, our method is applicable for seawater as well, but there could be stability problems for the membrane in seawater,” said Zhang.
Still, the team remains quite optimistic.
“As our research continues, we think our method could soon move from the laboratory to large-scale industrial applications,” said Xu.
The other co-lead author of the study, Xin Xiao, was a postdoc at Stanford when this work was done, and is now a faculty member at Zhejiang University. Other co-authors are Yusheng Ye, Pu Zhang, Yufei Yang, and Sanzeeda Baig Shuchi, all at Stanford. Yi Cui is also the Fortinet Founders Professor in the School of Engineering, faculty director of the Sustainability Accelerator in the Stanford Doerr School of Sustainability , a professor of energy science and engineering and of photon science, senior fellow and former director of the Precourt Institute for Energy , and senior fellow of the Woods Institute for the Environment . This research was funded by the StorageX Initiative , an industrial affiliates program within Stanford’s Precourt Institute for Energy.
A feasibility study and preliminary framework for an alternative heritage sector statistics methodology.
PDF, 867 KB, 59 pages
The Department for Culture, Media & Sport commissioned Alma Economics to carry out a feasibility study of different approaches that could be used to produce a single reliable estimate of the economic contribution of heritage organisations to the UK economy.
The report considers 4 approaches:
The research assesses these approaches against 6 criteria (coverage, disaggregation, robustness, feasibility, replicability, comparability) and sets out recommendations for improving economic estimates of the heritage sector.
To show a full picture of network performance in each market, our reports are informed by millions of daily consumer-initiated tests taken on Speedtest, along with quality of experience (QoE) metrics that offer insight into the daily connected activities that matter most to end-users.
Data Collection Period: January – June 2024
Dhiraagu was the fastest mobile provider in Maldives during 1H 2024, across all technologies combined, and also specifically for 5G. Dhiraagu recorded a median 5G download speed of 236.69 Mbps, and upload speed of 27.27 Mbps.
While Dhiraagu led on performance, Ooredoo offered the best mobile gaming experience, with a Game Score of 78.03. It was also the most consistent network in Maldives.
Maldives boasts comprehensive mobile network coverage, reaching 100% of its scattered archipelago. Dhiraagu was the fastest mobile provider in Maldives in the first half of 2024, with the best overall speed, for both all network technologies combined and 5G. Ooredoo recorded the best mobile gaming experience in Maldives, with a Game Score of 78.03 and a median game latency to key gaming server locations of 107 ms, well below next placed Dhiraagu with 120 ms. Ooredoo was also the most consistent network in Maldives during the same period.
Dhiraagu was the fastest mobile provider in Maldives during this period, based on Speedtest Intelligence® data for all technologies combined, with a Speed Score of 105.16.
Speed Score™ Speedtest® Intelligence | January – June 2024
Ookla’s methodology for fastest network is determined by Speed Score™, which combines download and upload performance, and is based on modern chipsets, in order to remove the impact on network performance of older devices. Read our detailed methodology for more details.
Dhiraagu was the fastest 5G provider in Maldives during this period, based on Speedtest Intelligence® data, with a Speed Score of 203.00. Dhiraagu led on median 5G download speed, recording 236.69 Mbps, ahead of next placed Ooredoo, which recorded 175.22 Mbps. Dhiraagu recorded a median 5G upload speed of 27.27 Mbps, and led on 5G latency, with 28.35 ms.
5G Speed Score™ Speedtest® Intelligence | January – June 2024
Ooredoo recorded the best mobile network Consistency in Maldives, with 93.9% of its samples meeting or exceeding the threshold of 5 Mbps download and 1 Mbps upload throughput. Ooredoo also recorded the best 5G Consistency in the market, with 85.7% of samples meeting or exceeding the threshold of 25 Mbps download and 3 Mbps upload.
Consistency Score™ Speedtest® Intelligence | January – June 2024
Ookla’s methodology for the most Consistent network is determined by Consistency Score™, which reflects the percentage of samples that meet minimum thresholds for download and upload speeds, depending on the type of network. The higher a provider’s Consistency Score, the more likely a consumer is to enjoy acceptable internet performance and quality. Read our detailed methodology for more details.
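The Consistency Score described above follows directly from its definition: the share of samples meeting both throughput thresholds. A minimal sketch, using the report's stated thresholds (5 Mbps down / 1 Mbps up for all technologies, 25 / 3 for 5G); the sample pairs are made up for illustration:

```python
def consistency_score(samples, min_down_mbps=5.0, min_up_mbps=1.0):
    """Percentage of samples meeting or exceeding both throughput thresholds.

    `samples` is an iterable of (download_mbps, upload_mbps) pairs.
    Defaults reflect the all-technology thresholds; for 5G the report
    uses 25 Mbps download / 3 Mbps upload.
    """
    samples = list(samples)
    passing = sum(1 for down, up in samples
                  if down >= min_down_mbps and up >= min_up_mbps)
    return round(100.0 * passing / len(samples), 1)

# Four hypothetical samples; two meet both thresholds:
print(consistency_score([(50.2, 10.1), (4.8, 2.0), (25.0, 1.0), (80.0, 0.9)]))  # 50.0
```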
There was no best provider for video experience in Maldives during this period, based on Speedtest Intelligence® data, with no statistical difference between the mobile providers.
Video Score™ Speedtest® Intelligence | January – June 2024
Video Score™ is based on Ookla’s consumer-initiated video experience test. Video Score is a weighted sum of five components, each measuring a different aspect of the consumer video experience. These components are evaluated and then scored on a scale of 0-100 for each provider. Read our detailed methodology for more details.
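Ookla does not publish the five components or their weights here, so the names and weights in this sketch are hypothetical placeholders; it only illustrates the weighted-sum pattern over 0-100 component scores that Video Score (and, with eight components, Game Score) describes:

```python
def weighted_score(components: dict, weights: dict) -> float:
    """Weighted sum of 0-100 component scores; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return round(sum(components[name] * weights[name] for name in weights), 2)

# Hypothetical component names and equal weights, for illustration only:
weights = {"start_time": 0.2, "resolution": 0.2, "buffering": 0.2,
           "bitrate": 0.2, "stalls": 0.2}
scores = {"start_time": 90, "resolution": 80, "buffering": 70,
          "bitrate": 85, "stalls": 95}
print(weighted_score(scores, weights))  # equal weights -> the mean, 84.0
```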
Ooredoo recorded the best mobile gaming experience in Maldives, with a Game Score of 78.03 and a median game latency to key gaming server locations of 107 ms.
Game Score™ Speedtest® Intelligence | January – June 2024
Game Score™ is based on Ookla’s consumer-initiated Speedtest results as well as Consumer QoE™ latency and jitter measurements taken to real-world game servers. Game Score is composed of eight components, each measuring a different aspect of the consumer gaming experience. Each component is evaluated and scored on a scale of 0-100 for each provider. Read our detailed methodology for more details.
There was no top-rated mobile provider in Maldives during this period, based on Speedtest Intelligence® data, with no statistical difference between the mobile providers.
5-Star Rating Speedtest® Intelligence | January – June 2024
Ookla’s five-star ratings system, driven by our consumer-initiated samples, gauges a consumer’s overall satisfaction with their provider. Read our detailed methodology for more details.
Data Collection Period: January – June 2024
Capital Region (Malé) had the fastest median mobile download speed among Maldives regions, recording 126.49 Mbps. South Province was second, followed by Upper North Province in third. At the other end of the scale, South Central Province had the slowest median download speed at 64.41 Mbps, followed by Upper South Province and North Central Province. Dhiraagu was the fastest provider in Capital Region (Malé).
Malé had the fastest median mobile download speed among Maldives cities, recording 123.33 Mbps. Fuvahmulah was second at 90.20 Mbps, and Kulhudhuffushi third at 87.17 Mbps. Dhiraagu was the fastest provider in Malé.
The contents of this report are the property of Ookla, LLC and may not be copied, redistributed, published, displayed, performed, modified, exploited or used for commercial purposes, including use in advertisements or other promotional content, without express written permission. This includes, but is not limited to, data, written analysis, images, logos, charts and graphs and other items that may appear on this page. Members of the press, academics, non-profit researchers and others using the findings in this report for non-commercial purposes are welcome to publicly share and link to report information with attribution to Ookla. For more information, please contact [email protected] .
In July 2024, the median asking rent for all rental properties listed on Realtor.com® in New York City was $3,421. In contrast to the overall declining trend seen across the top 50 markets, the median asking rent in New York City continues to rise annually, increasing by $73, or 2.2%, compared with a year ago. Although New York City was one of the rental markets that saw the steepest rent declines during the COVID-19 pandemic, its median asking rent rebounded to pre-pandemic levels by spring 2022 and has continued to rise annually since then. As of July 2024, the median asking rent in New York City was $413, or 13.7%, higher than at the same time in 2019 (pre-pandemic).
There was greater demand for smaller rental units with 0-2 bedrooms compared with those with 3 or more bedrooms in New York City. In July 2024, the median asking rent for 0-2 bedrooms in the city was $3,322, marking an increase of $72, or 2.2%, from the previous year. Meanwhile, the median asking rent among larger units with 3-plus bedrooms fell to $4,996, experiencing a year-over-year rent decline of $262, or 5.0%, compared with July 2023.
| Unit size | Median asking rent | YoY change | Change vs. July 2019 |
| --- | --- | --- | --- |
| Overall | $3,421 | 2.2% | 13.7% |
| 0-2 beds | $3,322 | 2.2% | 10.6% |
| 3+ beds | $4,996 | -5.0% | 14.9% |
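The year-over-year figures follow from the dollar change and the current median; for example, a $73 rise to $3,421 implies a prior-year median of $3,348, i.e. a 2.2% increase. A quick check:

```python
def yoy_change(current: float, prior: float):
    """Dollar and percentage change versus the prior-year value."""
    dollars = current - prior
    return dollars, round(100.0 * dollars / prior, 1)

# Citywide overall median: $3,421 now, $3,348 a year ago:
print(yoy_change(3421, 3421 - 73))  # -> (73, 2.2)
```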
In July 2024, the median asking rent for all rental units in Manhattan was $4,489, down $91, or 2.0%, from a year ago. It was the 13th consecutive month of annual declines, and rent was $362 (-7.5%) below the peak seen in August 2023.
Additionally, in July 2024, Manhattan’s median asking rent was still $171 (-3.7%) lower than its pre-pandemic level, suggesting relatively softer demand in the city’s most expensive borough. This may reflect an ongoing willingness of workers to commute and leverage flexible working arrangements to find housing affordability, as Realtor.com previously found in the for-sale market.
In fact, to afford renting a typical home in Manhattan without spending more than 30% of income on housing (including utilities)—the standard measure of affordability—a gross household income of $14,963 per month, or $179,560 per year, is required.
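The required-income figure follows mechanically from the 30% rule: gross monthly income is the rent divided by 0.30, and annual income is twelve times that. A quick check against Manhattan's $4,489 median:

```python
def required_income(monthly_rent: float):
    """Gross income needed so rent is no more than 30% of income.

    Returns (monthly income, annual income rounded to the nearest $10),
    matching the rounding the report's figures appear to use.
    """
    monthly = monthly_rent / 0.30
    return round(monthly), round(monthly * 12, -1)

print(required_income(4489))  # -> (14963, 179560.0)
```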
Unlike the cooling rental market in Manhattan, the three relatively lower-rent boroughs of the Bronx, Brooklyn, and Queens saw rents continue to increase yearly. Among these three, Queens saw the fastest annual rental growth in July, where the median asking rent reached $3,380, up $256 or 8.2% from the same time last year. It was the highest rent level seen in our data history and was $967 (40.1%) higher than five years ago.
Meanwhile, the median asking rent in the Bronx increased by 7.7%, or $226, to $3,175 from a year ago. It was the second-highest rent level seen since March 2019 and was $1,202 (60.9%) higher than five years ago.
In Brooklyn, the median asking rent increased by 3.5%, or $124, on an annual basis, to $3,718 from a year ago. It was also the highest rent level seen in our data history and was $916 (32.7%) higher than five years ago.
To afford renting a typical home in these three boroughs while adhering to the 30% rule of thumb, the gross monthly household income required for tenants in Queens, Brooklyn, and the Bronx was $11,267, $12,393, and $10,583, respectively, or annual incomes of $135,200, $148,720, and $127,000.
| Borough | Median asking rent | YoY change | Change vs. 5 years ago | Required annual income |
| --- | --- | --- | --- | --- |
| Manhattan | $4,489 | -2.0% | -3.7% | $179,560 |
| Brooklyn | $3,718 | 3.5% | 32.7% | $148,720 |
| Queens | $3,380 | 8.2% | 40.1% | $135,200 |
| The Bronx | $3,175 | 7.7% | 60.9% | $127,000 |
Methodology
New York City rental data as of July 2024 for all units advertised as for rent on Realtor.com®. Rental units include apartments as well as private rentals (condos, townhomes, single-family homes). We use rental sources that reliably report data each month within New York City and each of its boroughs. Data for Staten Island is currently under review.
Realtor.com began publishing regular monthly rental trends reports for New York City in August 2024 with data history stretching back to March 2019.
A monthly roundup of healthcare-focused AI news and research.
By Michael DePeau-Wilson, Enterprise & Investigative Writer, MedPage Today | August 29, 2024
Welcome to MedAI Roundup, highlighting the latest news and research in healthcare-related artificial intelligence each month.
These health systems have hired Chief AI Officers. (Becker's Health IT)
AI-generated images of doctors were more often white and male than the U.S. physician population, potentially reinforcing stereotypes and undermining diversity, equity, and inclusion initiatives in healthcare, according to a study in JAMA Network Open .
When patients wrote summaries of their genetic conditions, large language models struggled to come up with a diagnosis, according to a study in the American Journal of Human Genetics .
The editors of Academic Medicine have asked researchers publishing on machine learning to include a "sufficient description" of the methods of the model, so that readers can better evaluate and replicate the results, and consider the generalizability and potential further applications of the model.
Europe's AI Act went into effect, requiring companies to ensure their AI systems are safe, transparent, nondiscriminatory, traceable, and environmentally friendly. Experts say the law could shape how AI regulation is drafted in the U.S. as well. (CNET)
Most Americans believe AI can improve healthcare by minimizing human errors (75%), reducing wait times (71%), or assisting with clinical note taking during appointments (70%), according to a survey by The Ohio State University.
Researchers in the U.K. will analyze more than a million brain scans with AI to develop a tool for predicting a person's risk of dementia. (The Guardian)
The race to adopt ambient AI documentation technology continues as Ochsner Health announced an agreement with DeepScribe to provide the technology for thousands of providers across its health system. (Fierce Healthcare)
Not to be outdone, Kaiser Permanente announced a partnership with Abridge to make the company's ambient AI documentation technology available at 40 hospitals and more than 600 medical offices.
And Microsoft's Nuance announced a partnership with Northwestern Medicine to implement its ambient AI documentation technology, Dragon Ambient eXperience Copilot, which will be embedded into Epic.
Michael DePeau-Wilson is a reporter on MedPage Today’s enterprise & investigative team. He covers psychiatry, long COVID, and infectious diseases, among other relevant U.S. clinical news.