Measuring Total Acidity or Titratable Acidity

Definition

Total Acidity "TA," also called Titratable Acidity, is the total amount of all the available Hydrogen ions in a solution, i.e., ions that are both free in solution (as H+) and those that are bound to undissociated acids and anions (e.g., H2T and HT- for tartaric acids and anions). In grapes, must, or wine, TA is often expressed as grams of tartaric acid per 100 mL of wine or % weight. We prefer to represent it in parts per million [ppm]. Note this measure expresses Acidity in terms of tartaric acid, although we know that the wine contains a mixture of acids (mostly tartaric, malic, lactic, citric, and acetic acids). Some regulatory bodies prescribe a minimum TA (like 0.45 grams tartaric /100mL for red wine in the EU). A general range for TA in red wine is 0.6 to 0.8 [600-800 ppm]; for red wine must, it is 0.7 to 0.9 [700-900 ppm].

Relevance

During the final weeks of grape development, the TA of the berries drops from over 1500 ppm to around 700 ppm while the sugar content rises from about 12 to 24 Brix. If the TA is not in the 700-900 ppm range by harvest time, consider adjusting it by adding tartaric acid to increase TA or potassium bicarbonate to reduce it.
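As a rough illustration of the addition side of such an adjustment, here is a minimal Python sketch; the function name and its assumptions are ours, not part of the original procedure. It assumes that 1 g/L of added tartaric acid raises TA by about 1 g/L and ignores buffering and later bitartrate precipitation, so any real addition should be confirmed with a bench trial.

```python
def tartaric_addition_g(current_ta, target_ta, volume_l):
    """Estimate grams of tartaric acid to add to raise TA.

    current_ta, target_ta: TA in g/100 mL (e.g. 0.60 -> 0.70).
    volume_l: must/wine volume in liters.
    Simplifying assumption: 1 g/L of added tartaric acid raises TA by ~1 g/L.
    """
    delta_g_per_l = (target_ta - current_ta) * 10  # convert g/100 mL to g/L
    if delta_g_per_l <= 0:
        return 0.0                                 # no addition needed
    return delta_g_per_l * volume_l

# Example: raise a 100 L must from 0.60 to 0.70 g/100 mL -> about 100 g tartaric acid
print(round(tartaric_addition_g(0.60, 0.70, 100), 1))
```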

Although TA and pH are interrelated, they are not the same thing. A solution containing a given quantity of a relatively weak acid, such as malic acid, will have a higher pH than a solution containing the same amount of a stronger acid, such as tartaric acid.
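To put a number on that difference, the following Python sketch (our illustration, not part of the measurement procedure) estimates the pH of 6 g/L solutions of each acid using only the first dissociation step and approximate pKa1 values of 2.98 (tartaric) and 3.40 (malic); the weaker malic acid comes out roughly 0.2 pH units higher.

```python
import math

def ph_monoprotic(conc_mol_l, pka1):
    """Approximate pH of a weak acid solution, first dissociation only."""
    ka = 10 ** -pka1
    h = (-ka + math.sqrt(ka * ka + 4 * ka * conc_mol_l)) / 2  # quadratic solution for [H+]
    return -math.log10(h)

grams_per_l = 6.0  # same mass concentration of each acid
print("tartaric:", round(ph_monoprotic(grams_per_l / 150.09, 2.98), 2))  # ~2.2
print("malic:   ", round(ph_monoprotic(grams_per_l / 134.09, 3.40), 2))  # ~2.4
```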

Measurement

Until 2017, we measured TA with a titration procedure: we start with a given amount of must or wine and measure how much of an alkaline solution of known strength it takes (the titration) to bring the pH of the mixture to 8.2.

The titration procedure has five steps:

  • Step 1: Calibrate the benchtop pH meter, first with a pH 7 buffer solution and then with a pH 4 buffer. This ensures that the instrument measures correctly.

  • Step 2: Fill the titration pipette to a given Start Point with the alkaline solution, 0.1 normal sodium hydroxide (0.1 N NaOH).

  • Step 3: Fill a beaker with 100 mL of purified water, put it on a stir plate, and dip the electrodes of the pH meter into the water. Ensure the electrodes are adequately submerged, do not touch the beaker wall, and are at a safe distance from the magnetic stir bar.

  • Step 4: Pipette precisely 10 mL of juice, must, or wine into the beaker. To increase accuracy, you may want to degas the sample by heating it to a boil for a few seconds and then letting it cool down to room temperature.

  • Step 5: Turn on the stir plate and slowly drip the alkaline solution into the beaker. Stop when the pH has risen to 8.2. Note that the pH rises only slowly at first but climbs quickly once it passes about 6. When 8.2 is reached, note the End Point and calculate the amount of 0.1 N NaOH used (in mL).

The following equation calculates the TA, expressed as tartaric acid, in grams per liter: TA (g/L) = 75 * V * N / S

V = mL of sodium hydroxide solution used for the titration (i.e., End Point - Start Point)
N = normality of the sodium hydroxide solution (e.g., 0.1)
S = sample juice/must/wine volume in mL (e.g., 10)

So if, for example, you used 8 mL of 0.1 N NaOH to titrate a 10 mL wine sample, the TA of that sample would be 75 * 8 * 0.1 / 10 = 6 g/L, i.e., 0.6 g/100 mL.
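If you prefer to let a script do the arithmetic, here is a minimal Python sketch of the same calculation; the function name and argument defaults are our own, and the formula is simply the one above.

```python
def titratable_acidity(v_naoh_ml, normality=0.1, sample_ml=10.0):
    """TA expressed as tartaric acid, from an NaOH titration to pH 8.2.

    v_naoh_ml: mL of NaOH used (End Point - Start Point).
    Returns a tuple (g/L, g/100 mL).
    """
    ta_g_per_l = 75 * v_naoh_ml * normality / sample_ml
    return ta_g_per_l, ta_g_per_l / 10

# Example from the text: 8 mL of 0.1 N NaOH for a 10 mL sample
ta_g_l, ta_g_100ml = titratable_acidity(8)
print(round(ta_g_l, 2), "g/L =", round(ta_g_100ml, 2), "g/100 mL")  # 6.0 g/L = 0.6 g/100 mL
```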

Since 2017, we have measured TA with the OenoFoss analyzer.


 
