Measuring Total Acidity or Titratable Acidity


Total Acidity (TA), also called Titratable Acidity, is the total amount of available hydrogen ions in a solution, i.e. both the ions free in solution (as H+) and those still bound to undissociated acids and anions (e.g. H2T and HT- for tartaric acid). In grapes, must or wine, TA is expressed as grams of tartaric acid per 100 mL (percent weight/volume). Note that this measure expresses acidity in terms of tartaric acid even though wine contains a mixture of acids (mostly tartaric, malic, lactic, citric and acetic). Some regulatory bodies prescribe a minimum TA (e.g. 0.45 g tartaric/100 mL for red wine in the EU). A general range for TA in red wine is 0.6 to 0.8; for red wine must it is 0.7 to 0.9.
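Because TA lumps several acids into a single number expressed as tartaric, a back-of-the-envelope conversion can make the idea concrete. The sketch below is illustrative only: the function name is made up, and the equivalent weights (tartaric 75, malic 67, lactic 90 g per equivalent of H+) are standard chemistry values, not figures from this text.

```python
# Illustrative sketch: reporting a mixture of acids as one "tartaric" number.
# Equivalent weights (g per equivalent of H+) are assumed standard values,
# not taken from the text: tartaric 75 (diprotic), malic 67 (diprotic),
# lactic 90 (monoprotic).
EQ_WEIGHT = {"tartaric": 75.0, "malic": 67.0, "lactic": 90.0}

def ta_as_tartaric(grams_per_litre):
    """grams_per_litre: dict of acid -> g/L. Returns TA in g/100 mL, as tartaric."""
    equivalents = sum(g / EQ_WEIGHT[acid] for acid, g in grams_per_litre.items())
    return equivalents * 75.0 / 10.0  # express as tartaric, then g/L -> g/100 mL

# A hypothetical must with 4 g/L tartaric and 2 g/L malic acid:
print(round(ta_as_tartaric({"tartaric": 4.0, "malic": 2.0}), 2))  # 0.62 g/100 mL
```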



During the final weeks of grape development, the TA of the berries drops from over 1.5 to around 0.7 while the sugar content rises from roughly 12 to 24 °Brix. If the TA is not in the 0.7 to 0.9 range by harvest time, consider adjusting it: add tartaric acid to increase TA, or potassium bicarbonate to reduce it.
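As a rough sketch of the adjustment arithmetic: since TA is already expressed as tartaric acid, adding 1 g/L of tartaric acid raises TA by about 0.1 g/100 mL, and a commonly quoted winemaking rule of thumb (an assumption here, not a figure from this text) is that 1 g/L of potassium bicarbonate lowers TA by about 0.09 g/100 mL. The helper names below are invented for illustration.

```python
# Hedged sketch of TA adjustment arithmetic; function names are made up.
# Assumption: 1 g/L tartaric acid raises TA by ~0.1 g/100 mL (true by
# definition, since TA is expressed as tartaric). Assumption: 1 g/L potassium
# bicarbonate lowers TA by ~0.09 g/100 mL (a common winemaking rule of thumb).

def tartaric_to_add(current_ta, target_ta):
    """Grams of tartaric acid per litre to raise TA; TA values in g/100 mL."""
    return max(0.0, (target_ta - current_ta) * 10.0)

def bicarbonate_to_add(current_ta, target_ta):
    """Grams of potassium bicarbonate per litre to lower TA; TA in g/100 mL."""
    return max(0.0, (current_ta - target_ta) * 10.0 / 0.9)

# Raising a must from TA 0.55 to 0.70:
print(round(tartaric_to_add(0.55, 0.70), 2))  # 1.5 g/L
```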

Although TA and pH are interrelated, they are not the same thing. A solution containing a specific quantity of a relatively weak acid such as malic will have a different (higher) pH than a solution containing the same quantity of a stronger acid such as tartaric.



We measure TA with a titration procedure: starting with a given amount of must or wine, we slowly add a measured amount of an alkaline solution (the titrant) until the pH of the mixture reaches 8.2.

The titration procedure has five steps (the picture shows the laboratory setup using either of the pH meters).

  • Step 1: Calibrate the benchtop pH meter, first with a buffer solution of pH 7, then pH 4. This makes sure that the instrument measures correctly.
  • Step 2: Fill the titration pipette to a given Start Point with the alkaline solution, 0.1 normal sodium hydroxide (0.1 N NaOH).
  • Step 3: Fill a beaker with 100 mL of purified water, put it on a stir plate and dip the electrodes of the pH meter into the water. Make sure the electrodes are properly submerged, do not touch the beaker wall and keep a safe distance from the magnetic stirrer.
  • Step 4: Pipette exactly 10 mL of juice, must or wine into the beaker. To increase accuracy, you may want to degas the sample first by heating it to a boil for just a few seconds and then letting it cool to room temperature.
  • Step 5: Turn on the stir plate and slowly drip the alkaline solution into the beaker. Stop when the pH has risen to 8.2. Note that the pH rises only slowly at first but then rises quickly after it passes pH 6. When 8.2 is reached, note the End Point and calculate the amount of 0.1 N NaOH used (in mL).

The following formula calculates the TA in grams per liter:  TA = 75 * V * N / S. Divide the result by 10 to express it in grams per 100 mL.

        V = mL of sodium hydroxide solution used for titration (i.e. End Point – Start Point)
        N = normality of the sodium hydroxide solution (e.g. 0.1)
        S = sample juice/must/wine volume in mL (e.g. 10)

So, if for example you used 8 mL of 0.1 N NaOH to titrate a 10 mL wine sample, the TA of that sample would be 75 * 8 * 0.1 / 10 = 6 g/L, i.e. 0.6 g/100 mL.
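The formula and the worked example can be checked with a few lines of code. The function and parameter names below are mine, not from the text; the 75 is the equivalent weight of tartaric acid in g/eq.

```python
# TA from titration: equivalent weight of tartaric acid = 75 g/eq.
def titratable_acidity(end_point_ml, start_point_ml, normality=0.1, sample_ml=10.0):
    """Return (TA in g/L, TA in g/100 mL), expressed as tartaric acid."""
    v = end_point_ml - start_point_ml          # mL of NaOH used in the titration
    ta_g_per_litre = 75.0 * v * normality / sample_ml
    return ta_g_per_litre, ta_g_per_litre / 10.0

# Worked example from the text: 8 mL of 0.1 N NaOH on a 10 mL wine sample.
g_per_l, g_per_100ml = titratable_acidity(end_point_ml=8.0, start_point_ml=0.0)
print(g_per_l, g_per_100ml)  # 6.0 g/L, i.e. 0.6 g/100 mL
```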


For more detailed instructions go to this site


Last updated: December 27, 2014