The term inter-rater reliability describes the degree of agreement among multiple raters or judges. Using an inter-rater reliability formula provides a consistent way to quantify that consensus, which lets you gauge how dependable both the judges and the ratings they give are in any given situation. Too little consensus often indicates that the criteria on which the judges base their ratings need to be changed.
Type the ratings into a text document. Each rating should be preceded by a judge identifier, such as a name or numeral, with a comma and a single space separating the identifier from the rating. Each rating should occupy its own line; if a judge gives more than one rating, put each additional rating on a new line as well.
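Based on that description, a ratings file for two judges might look something like this (the judge names and scores are made up for illustration; check the calculator's own instructions for the exact format it expects):

```
Judge1, 4
Judge2, 5
Judge1, 3
Judge2, 3
```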
Save the text document to your computer.
Go to the Inter-rater Reliability Calculator listed in the Resources.
Scroll to the middle of the page where it says "Specify the text file with ratings" and click "Browse".
Select the text document you saved in Step 2.
Select the number of raters from the drop-down menu next to the "Browse" button.
Tips & Warnings
- The calculated reliability falls somewhere between 0 and 1, with 1 indicating a high degree of reliability and 0 indicating no reliability.
- By translating qualitative measurements (e.g., bad, neutral and good) into quantitative ones (e.g., 1, 2 and 3), you can use this same method to measure the inter-rater reliability of non-numerical ratings as well.
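To make the two tips above concrete, here is a minimal sketch of one common inter-rater reliability statistic, Cohen's kappa for two raters. This is an illustrative example, not necessarily the formula the linked calculator uses; the `scale` mapping shows the qualitative-to-quantitative translation the tip describes, and the judge ratings are invented sample data.

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Agreement between two raters, corrected for chance agreement.

    Returns a value between -1 and 1; values near 1 mean strong
    agreement, values near 0 mean agreement no better than chance.
    """
    assert len(ratings_a) == len(ratings_b)
    n = len(ratings_a)
    # Observed proportion of items where both raters agree
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Expected agreement by chance, from each rater's label frequencies
    counts_a, counts_b = Counter(ratings_a), Counter(ratings_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Translate qualitative labels into numbers, as the tip suggests
scale = {"bad": 1, "neutral": 2, "good": 3}
judge1 = [scale[r] for r in ["good", "good", "neutral", "bad", "good"]]
judge2 = [scale[r] for r in ["good", "neutral", "neutral", "bad", "good"]]
print(round(cohens_kappa(judge1, judge2), 4))  # prints 0.6875
```

Note that this sketch assumes the two raters do not agree perfectly by chance (otherwise the denominator would be zero) and handles only two raters; statistics such as Fleiss' kappa extend the idea to more.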