Food Authenticity – high tariff testing?

“In the spring a food analyst’s fancy quickly turns to thoughts of authenticity” – began no great poem ever. However, this spring does see a number of conferences discussing the subject of food authenticity and how associated concerns can be addressed. This is certainly an indication that it remains one of the most significant issues for consumers, regulatory bodies, food manufacturers, food retailers and government.

Food authenticity testing is undoubtedly a branch of analytical chemistry that demands both attention and interest. The last 20 years have seen an unrivalled period in the development of analytical technology and instrumentation, with many applications available for food testing. These are not just tools to identify gross substitution, as in the cases of horsemeat, whisky or olive oil. It is now quite possible to differentiate between Welsh and Scottish lamb, between farmed and wild salmon, and between pure and sweetened fruit juice. A mere hair from the tail of a cow may be enough to track its large-scale geographical movement over time. It is amazing, fascinating and inspiring.

So how does one access this remarkable technology?

In the UK that very ordinary question turns out to be rather more complicated than might be hoped. There are laboratories in the UK that are at the forefront of the refinement of these techniques and the development of these applications. However, it would be fair to describe those laboratories as centres of research rather than commercial operations. Commercial laboratories in the UK have relatively little capacity for food authenticity testing – certainly in terms of the advanced analytical chemistry highlighted above. Most of the results reported on UK samples submitted for such complex testing will probably have been obtained in Germany or France. Most of the DNA analysis for meat species will probably end up in continental Europe, as will most of the generic and less complex authenticity testing.

Generally speaking this is not too much of a day-to-day problem. The UK routine food-testing market is genuinely competitive, but is dominated by a few large laboratories. A number of those are owned by large multi-national organisations which also own laboratories across Europe. These businesses tend to have well-organised internal systems, and samples can easily be moved throughout Europe to appropriate laboratory facilities. Even if testing capability is not available in-house, reputable third party subcontractors can readily be sourced across the continent.

The reasons behind this gap in UK testing capability are inevitably complex. However, it has been suggested that part of it may be down to how little money is spent on enforcement food sampling and analysis, including food authenticity testing. Countries such as Germany have many times the funding available. There is therefore a viable, long-standing market in Germany for complex food authenticity analysis, and hence suitably equipped and appropriately skilled testing laboratories. It follows that it is much more efficient and cost-effective to send samples from the UK to Germany for testing. It would require a considerable change in current market conditions to set up equivalent, commercially justifiable laboratories in the UK.

Is the lack of UK capability in food authenticity testing something that is sustainable in the medium to long term? Might the practical efficiency currently in place be threatened by any imminent changes? It is a knotty one, that.

Food Testing: All the Fun of Finding the Fat.

Analysing a sample of food for its fat content can be very simple or it can be very complicated; it can require highly skilled analysts or the most basically trained operative; it can be the bane of the analytical chemist’s life or a triumph of their art. One way or the other it can be considered as a lot of “fun”, regardless of how you define the word.

There are a number of different issues associated with fat testing, but two that inevitably lead to much of the fun are the beautifully paradoxical –

  1. fat is a bit tricky to define
  2. there are lots of different ways of finding it

So, how come we are not too sure what we are looking for but we have lots of ways of doing it?

Well, of course we do pretty much know what we are looking for. We even have food legislation that defines fat for us. The EU Food Information to Consumers Regulation defines fat as total lipids, and includes phospholipids. Therefore, fat really means fatty acids, neutral fats, glycerides, compound lipids (glycolipids, phospholipids etc.), waxes and sterols; not one thing, but many, and a mixed bag at that.

I have to admit that I rather like definitions of fat that talk about anything that is soluble in non-polar organic solvents and that is, along with protein and carbohydrate, a main constituent of living cells. I like this definition because I am an analytical chemist, and I now have something to use as a way of selectively defining my analyte of interest. It is only fat that holds this property of solubility in those organic solvents. I don’t really care if it is any of the different compounds listed in the previous paragraph. I do care if it will dissolve in a non-polar solvent, and if it does I am going to claim it as “fat”.

Having defined the analyte, we are now faced with the question of how the fat is exposed to the solvent. The vast majority of food materials contain water, and most contain significant proportions of it. In addition, some of the fat may be physically bound into the food material itself. This will all interfere with our extraction process. However, if we in turn interfere with our sample to resolve this issue then might we not affect our results in some way? In addition, might some solvents be better to use than others?

And that I suppose is the key to it all – the fact that different approaches may end up giving different results because they may do different things to the sample. The determination of fat in milk using the Rose-Gottlieb method is almost an art form, and a skilled analyst may achieve remarkable levels of precision. However, if that method were applied to minced bacon then I would be very wary of the results. Conversely, a good old-fashioned acid-hydrolysis approach that is perfect for bacon might leave the milk a quivering mess. This is not to say that happy mediums are not to be found, but that they have to be applied carefully, and with thought, skill and technique.

The determination of fat content is therefore challenging. Methods are usually multi-step processes in which much can go wrong. Poorly executed fat determinations will almost always give low results as the solvent fails to extract the fat, for whatever reason or reasons. Unlike analysis for moisture content, not only does the sample itself govern the process applied, but the analyst is almost certain to be a significant source of method uncertainty as well.

One final thought concerns inference methods, such as IR or nmr. I am a great fan of nmr technology and would suggest that it is an excellent solution for the routine testing of food. Indeed it is a preferable approach for many laboratories. However, for the most accurate and precise measurements of fat content, we may still have to turn to our more traditional methods that can deal with the fact that we are never exactly sure what we are looking for.

“Without good sample preparation we are wasting our time”


This statement should be used in every training session for food testing laboratory staff. Sample preparation is the most important laboratory activity of all. Indeed, whenever it is necessary to investigate anomalous results, all too often the cause turns out to be simply poor sample preparation.

Generally speaking, poorly homogenised sample material will affect analytical data in two ways. Either the test result will be directly affected because a non-representative test portion has been taken, or the test material may interfere with the analytical process and thereby give rise to an anomalous result. In food analysis this can be illustrated by reference to determinations of total nitrogen and total dietary fibre.

The determination of nitrogen is often performed with combustion Dumas instruments. The most reliable of these instruments tend to be gravity-fed units operating with a test portion of up to 0.5 g. If the test portion is not representative then widely differing results for the same sample may be obtained. This is not resolved by taking a larger test portion, but by initial satisfactory homogenisation of the test material.
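The effect of homogenisation on a small test portion can be illustrated with a toy Monte Carlo simulation. All figures below (particle sizes, nitrogen contents, the fraction of protein-rich particles) are invented purely for illustration and are not Dumas performance data:

```python
import random

def simulate_nitrogen(portion_g, particle_g, frac_rich, n_rich_pct,
                      n_base_pct, runs=2000):
    """Toy model: a test portion is built from discrete particles, some
    protein-rich and some not. Coarser grinding means fewer particles per
    portion, so each portion is less representative of the whole sample."""
    n_particles = max(1, round(portion_g / particle_g))
    results = []
    for _ in range(runs):
        rich = sum(1 for _ in range(n_particles)
                   if random.random() < frac_rich)
        n_pct = (rich * n_rich_pct
                 + (n_particles - rich) * n_base_pct) / n_particles
        results.append(n_pct)
    mean = sum(results) / runs
    sd = (sum((x - mean) ** 2 for x in results) / runs) ** 0.5
    return mean, sd

random.seed(1)
# Coarse grind: ~10 mg particles -> only ~50 particles in a 0.5 g portion
coarse = simulate_nitrogen(0.5, 0.010, 0.2, 10.0, 1.0)
# Fine grind: ~1 mg particles -> ~500 particles in the same portion
fine = simulate_nitrogen(0.5, 0.001, 0.2, 10.0, 1.0)
print(f"coarse grind: mean {coarse[0]:.2f}% N, sd {coarse[1]:.2f}")
print(f"fine grind:   mean {fine[0]:.2f}% N, sd {fine[1]:.2f}")
```

Both grinds give the same mean nitrogen content, but the coarse grind gives a markedly wider spread between nominally identical test portions – which is exactly the "widely differing results for the same sample" described above, and why a larger portion is no substitute for proper homogenisation.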

Total dietary fibre analysis is a complex procedure. Challenges can arise with what might otherwise be considered homogeneous samples – nuts, grains, beans and seeds – which can require specialised milling equipment. Full dispersion within a buffered medium to allow for serial enzyme digestions is essential to successful method performance. Finely homogenised test material is therefore crucial, as larger particles remain untouched by this process. Such undigested material usually causes inappropriately high final results. However, in some cases, if these larger particles are then also carried through into the analysis for protein correction, spuriously low or even negative values can result.
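The final arithmetic helps show why: in an AOAC 991.43-style calculation the fibre value is the enzyme-resistant residue corrected for residual protein, ash and a reagent blank, so anything that survives digestion is reported as fibre. A simplified sketch of that calculation, with invented weights (not data from a real determination):

```python
def total_dietary_fibre(sample_g, residue_g, protein_g, ash_g, blank_g):
    """Simplified AOAC 991.43-style final calculation: fibre is the
    digestion residue, corrected for residual protein, ash and a reagent
    blank, expressed as a percentage of the sample weight."""
    fibre_g = residue_g - protein_g - ash_g - blank_g
    return 100.0 * fibre_g / sample_g

# Well-homogenised sample: the residue really is fibre
good = total_dietary_fibre(1.000, 0.160, 0.020, 0.015, 0.010)

# Coarse particles survive the digestion, inflating the residue,
# so undigested material is reported as fibre
bad = total_dietary_fibre(1.000, 0.260, 0.020, 0.015, 0.010)

print(f"fine grind: {good:.1f}% TDF, coarse grind: {bad:.1f}% TDF")
```

The same mechanism runs the other way on the correction terms: if undigested particles also land in the protein-correction aliquot, the protein figure is inflated and the subtraction can drive the result spuriously low.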

In both these examples the key to good analysis is good sample preparation – the most important activity there is within a laboratory.

To Innovation, With Love

Generally speaking, managers of chemistry laboratories operating in the contract food testing sector prefer a simple life. It is a highly competitive industry in which there is a strong requirement for efficient and cost-effective analysis. This leads to an emphasis on rugged and widely applicable methods allied to process control, and is reflected in the fact that the cost of nutritional testing is probably less than half, in real terms, of what it was 20 years ago. Accompanying this revolution in scientific practice have been genuine technological advances from instrument suppliers. Even the most routine testing facility may be using nuclear magnetic resonance (nmr), microwave units, and the latest advances in chromatography.

Such progress is based upon the fortunate fact that the analytical requirements for nutritional labelling have been broadly the same for the last 20 years. This allows for the development of “fit for purpose” methods that are widely applicable across a vast range of food types and sample matrices. The inevitable result is that laboratories become a production facility and benefit from all the controls and efficiencies that follow.

However, sometimes problems can arise, and sometimes this can be due to the fact that the food innovation community and the contract food testing community are not necessarily facing in the same direction. It is in such times that the experienced and resourceful analytical chemist may find that life becomes interesting again.

The main weakness within modern efficient laboratories is the Rumsfeldian concept of “unknown unknowns”. A classic example of this can be illustrated by the increased use of industrially produced soluble fibre compounds within food. There are standard methods for the determination of total dietary fibre (TDF) in food. These will usually be based upon procedures accepted by the AOAC (Association of Analytical Communities) such as method 991.43 for soluble, insoluble and total dietary fibre in foods. Unfortunately, this method will only really recover the relatively high molecular weight fibre components that occur naturally. Low molecular weight soluble fibre molecules, clearly defined within the regulatory framework as dietary fibre, are not detected. Therefore, if a product to which inulin has been added is sent for standard labelling analysis, the reported results will almost certainly not be those expected by the submitting food business. However, if the laboratory does not know such ingredients are present then it is not going to look for them. It will only test for “known unknowns”.

It must be pointed out that suitable test methods can usually be sourced, and in this case a number of alternative appropriate analytical approaches are available from specialist laboratories. A simple conversation between the food business and its chosen laboratory prior to sample submission may well save many anxious phone calls and missed deadlines.

Therefore, such an example highlights that engagement between new product developers and analytical technical specialists is important. This is especially true prior to new product development or innovative approaches to food production. It should help give everyone a simple life, and hopefully an interesting one too.

Food Testing: Moisture Analysis – How Easy It Isn’t

“You can’t trust water: Even a straight stick turns crooked in it.” So said W.C. Fields, and that is a lesson for food testing too. Analysis to determine the water content of food can be terribly tricky without a little thought and consideration.

This may seem like a strange assertion to make. Surely moisture testing is one of the easiest tests? People start working in labs by doing moisture tests, don’t they? Don’t you just dry a sample in an oven? Well, all this is probably true. However, let us look just a little bit further.

The first clue to this unlikely complexity is that there are about 20 different methods for the determination of water content in food on the British Standards website. It is most likely that some of the methods are very similar, and a few have been withdrawn. Nonetheless, there are about 20 methods listed. The second clue is that many of these methods are matrix dependent: meat, oilseed residues, spices, coffee, cereals, pulses and so on. What is it all about?

Let us consider a basic method for determination of moisture, and one that has already been alluded to: drying a sample in an oven. Water will evaporate from the test portion and therefore any loss of mass is taken to be equivalent to the moisture content. However, even such a simple process is actually very complicated. Other volatile non-water species such as acetic acid (found in vinegar) may also evaporate, and volatile flavour compounds could be driven off too. Thermally unstable compounds may break down, perhaps even charring in the process. Some water may be very tightly bound in the sample material at a molecular level, and will not be driven off the test material without higher temperature drying. Some compounds may oxidise over the drying time and actually increase their final mass. The test material may even “cook” and trap moisture within its structure.
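The loss-on-drying calculation itself is trivial, which is part of the trap: everything that leaves the dish is counted as water, and anything gained (oxidation) or retained (bound water, trapped moisture) biases the result. A minimal sketch, with illustrative weights:

```python
def moisture_loss_on_drying(dish_g, dish_plus_wet_g, dish_plus_dry_g):
    """Moisture (%) by loss of mass on drying. Note the built-in
    assumption: ALL mass lost is water, and ALL water is lost."""
    wet = dish_plus_wet_g - dish_g    # test portion before drying
    dry = dish_plus_dry_g - dish_g    # test portion after drying
    return 100.0 * (wet - dry) / wet

# e.g. a 5 g test portion losing 3.4 g of mass on drying
print(f"{moisture_loss_on_drying(25.000, 30.000, 26.600):.1f}%")  # 68.0%
```

Every complication listed above acts on one of those two weighings: volatile acids and flavours make the loss too big, bound or trapped water makes it too small, and oxidation can even push the dry weight up.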

Therefore, it becomes very easy to appreciate how parameters such as oven temperature, time of drying, the environment within the oven, and presentation of the test material must be clearly defined and controlled. These parameters must be selected in order to deal with the challenges presented by differing sample matrices. This leads to the scenario of drying methods for determination of moisture carrying marked differences in conditions. For example, meat samples classically require a fan oven set at 103 °C, with the test material mixed with sand and a drying time of several hours, whereas a sample of baking powder demands desiccation under vacuum at no more than room temperature.

It may be possible to obtain differing, apparently precise data for differing conditions. If meat samples are dried at 103 °C for 4 hours then they may give slightly different results than if they are dried for 16 hours. Generally speaking, it is considered best practice to keep the drying time to a minimum to prevent chemical changes in the test material. However, there is then a risk of incomplete drying, particularly in a large oven containing many wet samples. Therefore, it may be necessary to carefully balance the applied conditions.

Adding to this complexity is the fact that there are alternatives to physically drying a sample in order to determine the moisture content. Water may be determined by titration, classically using the Karl Fischer titration and applying it to samples of low water content such as confectionery. It may be determined by distillation, which tends to be proposed for herbs and spices. There is even a further step, which is to use instrumentation such as nmr or infrared, although these inference procedures usually require calibration based upon directly analysed data from drying, distillation or chemical techniques.

The next question to be addressed is one too often overlooked. It should be thought through both by food manufacturers submitting samples and by the laboratories performing the testing. It concerns the purpose of the testing. That is to say, what is the result to be used for? It is at this point that all concerned must consider the fitness for purpose of the method to be applied. If a result is to be used for the presentation of food labelling data then an absolute method uncertainty of ±0.5% (which for most foods is still only a relative value of around 1% or less) would be most satisfactory. In point of fact, the vast majority of moisture testing falls into such a bracket, for which basic, rugged testing methods are perfectly adequate. However, in a production control environment this might well be an unacceptable level of uncertainty. This is particularly true for products with low levels of moisture such as crisps or biscuits. It may well be that slightly different conditions will need to be applied in order to obtain data with appropriate levels of precision for specific samples.
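The arithmetic behind that distinction is worth making explicit: a fixed absolute uncertainty translates into very different relative uncertainties depending on the moisture level of the product. The moisture figures below are purely illustrative:

```python
def relative_uncertainty_pct(result_pct, absolute_u_pct=0.5):
    """Relative uncertainty (%) implied by a fixed absolute
    uncertainty on a percentage result."""
    return 100.0 * absolute_u_pct / result_pct

# Illustrative moisture contents for three product types
for label, moisture in [("cooked ham", 70.0),
                        ("bread", 38.0),
                        ("crisps", 2.0)]:
    rel = relative_uncertainty_pct(moisture)
    print(f"{label}: ±0.5% absolute = ±{rel:.1f}% relative")
```

For wet foods the ±0.5% absolute figure is comfortably under a couple of percent relative, which is fine for labelling; for a low-moisture product such as crisps it balloons to a large fraction of the result itself, which is why production control on such products may demand tighter conditions.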

The final question is one that is relevant to all food testing: that of sample preparation, or homogenisation. This can be particularly significant for resistant test material, or materials that take quite a lot of effort to homogenise. Unfortunately, the first law of thermodynamics will come into play: as more work is done on a system, its temperature will usually increase. Therefore, volatiles such as moisture may well be lost to evaporation during an extended or vigorous grinding process. It is not possible to over-emphasise the importance of the correct and controlled homogenisation of test material within food analysis.

In summary, although moisture testing is fairly standardised, there really is no such thing as one standard moisture test. One thing is certain, which is that the test material and the purpose of testing should both be considered before a method is selected, as it is likely to be the applied method that will actually determine the result obtained.