Measuring Heats of Reaction: Calorimetry
The changes in temperature caused by a reaction, combined with the values of the specific heat and the mass of the reacting system, make it possible to determine the heat of reaction.
 
Heat energy can be measured by observing how the temperature of a known mass of water (or other substance) changes when heat is added or removed. This is basically how most heats of reaction are determined. The reaction is carried out in some insulated container, where the heat absorbed or evolved by the reaction causes the temperature of the contents to change. This temperature change is measured and the amount of heat that caused the change is calculated by multiplying the temperature change by the heat capacity of the system.
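The relationship described above can be sketched in a few lines of code. The heat capacity and temperature values here are made-up illustrations, not taken from any experiment in this lesson.

```python
# Heat measured by calorimetry:
#   q = (heat capacity of the system) x (temperature change)

def heat_from_temperature_change(heat_capacity, delta_t):
    """Return the heat (J) corresponding to a temperature change (°C),
    given the heat capacity of the system (J/°C)."""
    return heat_capacity * delta_t

# Illustration: a system with heat capacity 500 J/°C that warms by 2.0 °C
# has absorbed 500 x 2.0 = 1000 J of heat.
q = heat_from_temperature_change(500.0, 2.0)
```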
 
The apparatus used to measure the temperature change for a reacting system is called a calorimeter (that is, a calorie meter). The science of using such a device and the data obtained with it is called calorimetry. There is no single standard design; different calorimeters are used depending on the precision required. One very simple design used in many general chemistry labs is the styrofoam "coffee cup" calorimeter, which usually consists of two nested styrofoam cups.
 
When a reaction occurs at constant pressure inside a Styrofoam coffee-cup calorimeter, the heat of reaction equals the enthalpy change, and little heat is lost to the lab (or gained from it). If the reaction evolves heat, for example, very nearly all of it stays inside the calorimeter, so the amount of heat absorbed or evolved by the reaction can be calculated from the temperature change of the contents.
 
Example Problem
The reaction of an acid such as HCl with a base such as NaOH in water involves the exothermic reaction
 
HCl(aq) + NaOH(aq) ---> NaCl(aq) + H2O(l)

 
In one experiment, a student placed 50.0 mL of 1.00 M HCl in a coffee-cup calorimeter and carefully measured its temperature to be 25.5°C. To this was added 50.0 mL of 1.00 M NaOH solution whose temperature was also 25.5°C. The mixture was quickly stirred, and the student noticed that the temperature of the mixture rose to 32.4°C. What was the heat of reaction?
 
Assumptions
These are solutions, not pure water. The specific heat of water is 4.184 J/g°C. Assume that these solutions are close enough to being like water that their specific heats are also 4.184 J/g°C.
 
The density of water is 1.00 g/mL and even though these are solutions we can assume that they are close enough to water to have the same density.
 
Solution
Calculate the heat actually evolved.

                                            q = mcΔt
 

Fill in the missing information. We have mL and we need grams.
 
Use density. (50 mL + 50 mL ) = 100 mL of solution.
 
100 mL X 1.00 g/mL = 100 grams of solution. (m = V X D)
 
Find the temperature change.
 
       Δt = tfinal - tinitial = 32.4°C - 25.5°C = 6.9°C
 
    q = mcΔt
       = 100 grams X 4.184 J/g°C X 6.9°C
       = 2.9 x 10^3 J
 
This is the heat gained by the water, but it is equal in magnitude to the heat lost by the reacting HCl and NaOH; therefore, for the reaction, q = -2.9 x 10^3 J.
 
i.e. it is an exothermic reaction: heat was lost to the water, and the water got warmer.
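The calculation so far can be checked with a short script. The volumes, temperatures, and specific heat are the ones used above, with water's density and specific heat assumed for the solutions.

```python
# Heat absorbed by the solution in the coffee-cup calorimeter: q = mcΔt.
density = 1.00            # g/mL, assumed equal to pure water
volume = 50.0 + 50.0      # mL of combined acid + base solution
mass = volume * density   # 100 g of solution (m = V x D)

c = 4.184                 # J/g°C, specific heat assumed equal to water's
delta_t = 32.4 - 25.5     # °C, final minus initial temperature

q_water = mass * c * delta_t   # heat gained by the solution, about 2.9 x 10^3 J
q_reaction = -q_water          # heat of reaction is negative: exothermic
```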
 
This only gets us part way. This is the heat evolved for the specific amounts used. (Notice we used identical volumes and concentrations to keep the calculation simple.) We need to find the amount of heat released per mole.
 
How much HCl did we actually use, anyway?
 
50.0 mL of HCl X (1.00 mol HCl / 1000 mL HCl) = 0.0500 mol HCl
 
The same quantity of base, 0.0500 mole NaOH, was used.
 
To calculate the energy per mole of acid or base, divide the number of joules by the number of moles.
 
i.e. molar enthalpy = -2.9 x 10^3 J / 0.0500 mol
                    = -5.8 x 10^4 J/mol
                    = -58000 J/mol
                    = -58 kJ/mol
 
Therefore, for the neutralization of HCl and NaOH, the enthalpy change, often called the enthalpy of reaction, is ΔH = -58 kJ/mol.
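The mole conversion and the final division above can be expressed as a short sketch, using the quantities from the example.

```python
# Molar enthalpy of neutralization from the measured heat.
moles_hcl = (50.0 / 1000.0) * 1.00   # L of solution x mol/L = 0.0500 mol HCl

q_reaction = -2.9e3                # J, heat of reaction measured above
delta_h = q_reaction / moles_hcl   # J per mole of acid neutralized
delta_h_kj = delta_h / 1000.0      # -58 kJ/mol
```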
 
The Bomb Calorimeter
A type of calorimeter used in very precise measurements of heats of reaction is called the bomb calorimeter. It is used to measure energy changes for reactions that will not happen until they are deliberately initiated, for example, combustions which must be ignited. The reactants are put into the "bomb", which is then sealed and immersed in a large, well-insulated vat of water. When the reaction is set off, any heat that is liberated is absorbed by the bomb, the water, and any piece of the equipment sticking into the water, and the temperature of the entire contents of the vat rises. The stirrer ensures that any heat released becomes uniformly distributed before the final temperature is read. From the temperature change and the heat capacity of the calorimeter (water plus everything in the water), the heat liberated is calculated.
 

 
Example Problem:
A sample of sucrose (table sugar) with a mass of 1.32 g is burned in a bomb calorimeter. The heat capacity of this calorimeter had been previously found to be 9.43 kJ/°C. The temperature changed from 25.00°C to 27.31°C. Calculate the heat of combustion of sucrose in kilojoules per mole. The formula of sucrose is C12H22O11.
 
Solution
The Δt is 2.31°C. For each degree increase, the reaction has evolved 9.43 kJ, as we know from the heat capacity. Therefore the total heat evolved is
 
                          E = 2.31°C X 9.43 kJ/°C = 21.8 kJ
 
This heat was produced by the combustion of 1.32 g of sucrose.
 
               moles = mass / molar mass
                         = 1.32 g / 342.3 g/mol
                         = 3.86 x 10^-3 mol of sucrose.
 
Therefore, the heat evolved per mole of sucrose is
 
              21.8 kJ / 3.86 x 10^-3 mol = 5.65 x 10^3 kJ/mol
 
Since the combustion is exothermic, this should be given a minus sign and reported as -5.65 x 10^3 kJ/mol for the heat of combustion of sucrose.
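The bomb-calorimeter arithmetic above can likewise be checked in a few lines; all values are taken from the example problem.

```python
# Heat of combustion of sucrose from bomb-calorimeter data.
heat_capacity = 9.43            # kJ/°C, calorimeter constant (water + hardware)
delta_t = 27.31 - 25.00         # 2.31 °C temperature rise
q_evolved = heat_capacity * delta_t   # about 21.8 kJ released

molar_mass = 342.3              # g/mol for sucrose, C12H22O11
moles = 1.32 / molar_mass       # about 3.86 x 10^-3 mol burned

heat_of_combustion = -q_evolved / moles   # about -5.65 x 10^3 kJ/mol (exothermic)
```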
 