What are the advantages and disadvantages of different particle sizing methods?
| Method | Advantages | Disadvantages |
|---|---|---|
| Laser diffraction | Simple to use; fast analysis; wide measurement range; good repeatability and accuracy; flexible sampling options (wet, dry, online, small volume) | Lower resolution for bimodal distributions whose peaks lie close together; less suitable for nanoparticles |
| Static image analysis | Morphology analysis; cost-effective; clear images | Unsuitable for small particles (<2 μm); more complex operation; slower analysis |
| Dynamic image analysis | Morphology analysis; simple to use; fast analysis; good repeatability and accuracy; suitable for large particles | Unsuitable for small particles (<2 μm); representativeness depends on sampling |
| Dynamic light scattering | Wide measurement range; fast analysis; simple to use; excellent for nanoparticles | Measurement errors for wide particle size distributions; only suitable for transparent samples |
| Gravity sedimentation | Continuous measurement; cost-effective; wide measurement range | Long measurement time; undersizes non-spherical particles; inaccurate for particles <1 μm |
| Sieving | Simple to use; cost-effective | Unsuitable for small particles (<38 μm); results strongly dependent on operator technique; sieve apertures degrade over time; long measurement times for particles <100 μm |
| Coulter counter | Particle counting, so higher resolution of closely spaced peaks in a bimodal distribution; fast analysis; good repeatability; suitable for cellular analysis | Unsuitable for small particles and for wide particle size distributions; aperture must be changed for different size ranges; demanding maintenance; requires regular calibration |
| Scanning electron microscope | Accurate sizing of ultrafine particles; clear images of surface texture; high resolution; a standard technique for characterizing nanoparticles | Low representativeness; very expensive apparatus |
| Light obscuration | Particle counting; fast analysis; can measure low-concentration samples in liquid or gas | Unsuitable for small particles; complicated sample introduction; requires regular calibration |
| Ultrasonic extinction | Measures concentrated slurries without dilution; online measurement available | Measurement errors for wide particle size distributions; expensive apparatus |
The laser diffraction technique is widely regarded as the most reliable technique for most industrial applications. The measurements are fast, repeatable, accurate, reproducible, and sensitive. It accurately measures the size of irregular as well as regular-shaped particles, and it is not affected by a particle's density or porosity. It can measure wet samples, dry powders, or sprays, either in the laboratory or online. Laser diffraction can also be combined with dynamic image analysis, which gives more accurate results for particles whose apparent size depends on their orientation relative to the laser source, such as rod-like particles.
Static and dynamic image analysis are techniques for measuring particle size that cover a wide size range without the need to change lenses or other components. In dynamic image analysis, wet and dry samples can be measured automatically with minimal human intervention, making the technique easy to use and providing fast analysis with good repeatability, reproducibility, and accuracy.
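The morphology information that image analysis provides is typically reported through shape descriptors computed from each particle's projected outline. A minimal sketch of two common descriptors, using illustrative function and parameter names (not tied to any particular instrument software):

```python
import math

def shape_descriptors(area_um2: float, perimeter_um: float) -> dict:
    """Common image-analysis shape descriptors derived from a particle's
    projected area and perimeter (both in micrometre units)."""
    # Equivalent circular diameter: diameter of a circle with the same area.
    ecd = 2.0 * math.sqrt(area_um2 / math.pi)
    # Circularity: 1.0 for a perfect circle, smaller for irregular shapes.
    circularity = 4.0 * math.pi * area_um2 / perimeter_um**2
    return {"ecd_um": ecd, "circularity": circularity}

# Sanity check with a circle of diameter 10 um:
# area = pi * 5^2, perimeter = pi * 10.
d = shape_descriptors(math.pi * 25.0, math.pi * 10.0)
# d["ecd_um"] is 10.0 and d["circularity"] is 1.0
```

Sizing methods based on light scattering report only an equivalent diameter; descriptors like circularity are what distinguish image analysis from them.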
Dynamic light scattering is primarily used for measuring sub-micron particles. However, it is unsuitable for particles larger than about 3 μm: their Brownian motion is so slow that the sedimentation speed of the particles exceeds the speed of their Brownian motion.
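This crossover can be checked numerically by comparing the RMS Brownian displacement (from the Stokes-Einstein diffusion coefficient) with the distance settled at the Stokes terminal velocity. The sketch below assumes water at 25 °C and a silica-like particle density of 2500 kg/m³; these values are illustrative:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def brownian_rms_um(d_um: float, t_s: float = 1.0,
                    temp_k: float = 298.0, visc_pas: float = 1.0e-3) -> float:
    """1-D RMS Brownian displacement sqrt(2*D*t) in micrometres,
    with D from the Stokes-Einstein relation D = kT / (3*pi*mu*d)."""
    d_m = d_um * 1e-6
    diff = K_B * temp_k / (3.0 * math.pi * visc_pas * d_m)
    return math.sqrt(2.0 * diff * t_s) * 1e6

def settling_um(d_um: float, t_s: float = 1.0, rho_p: float = 2500.0,
                rho_f: float = 1000.0, visc_pas: float = 1.0e-3) -> float:
    """Distance settled in t_s seconds at the Stokes terminal velocity
    v = (rho_p - rho_f) * g * d^2 / (18 * mu), in micrometres."""
    d_m = d_um * 1e-6
    v = (rho_p - rho_f) * 9.81 * d_m**2 / (18.0 * visc_pas)
    return v * t_s * 1e6

# For a 0.1 um particle, Brownian motion dominates; for a 3 um particle,
# settling outpaces it, which is why DLS breaks down at larger sizes.
assert brownian_rms_um(0.1) > settling_um(0.1)
assert brownian_rms_um(3.0) < settling_um(3.0)
```

Because diffusion scales as 1/d while settling velocity scales as d², the two curves cross in the low-micron range, consistent with the ~3 μm limit quoted above.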
Gravity sedimentation particle size analysis is a technique that relies on Stokes' law and has been a very popular method for those applications in which it is applicable. Calculating the particle size requires the density of the material, so the method is not suitable for low-density emulsions, where the material does not settle, or for very dense materials, which settle too quickly. For particles smaller than 2 μm the technique is limited because Brownian motion dominates over settling.
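The size reported by this technique is the Stokes diameter, obtained by inverting Stokes' law once the settling velocity is measured. A minimal sketch, assuming water at 25 °C and a quartz-like particle density (illustrative values only), which also makes clear why the particle density must be known:

```python
import math

def stokes_diameter_um(v_m_s: float, rho_p: float, rho_f: float = 1000.0,
                       visc_pas: float = 1.0e-3) -> float:
    """Invert Stokes' law, v = (rho_p - rho_f) * g * d^2 / (18 * mu),
    to get the Stokes diameter (in micrometres) from a measured
    settling velocity. The particle density rho_p must be known."""
    return math.sqrt(18.0 * visc_pas * v_m_s / ((rho_p - rho_f) * 9.81)) * 1e6

# A quartz-like particle (rho_p ~ 2650 kg/m^3) settling at 100 um/s
# in water corresponds to a Stokes diameter of roughly 10.5 um.
d = stokes_diameter_um(100e-6, rho_p=2650.0)
```

Note that a non-spherical particle settles more slowly than a sphere of the same volume, which is why the table above lists undersizing of non-spherical particles as a drawback.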
The Coulter principle was developed for sizing blood cells, which are virtually monodisperse suspensions in a dilute electrolyte. Although both particle counts and volume-based particle sizes can be obtained, the orifice has to be changed when measuring samples of different size ranges, which makes operation cumbersome. In addition, calibration must be carried out regularly.
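Because a Coulter counter senses the volume of electrolyte each particle displaces, the reported size is an equivalent spherical diameter computed from that volume. A minimal sketch of the conversion (the example volume is a rough, assumed red-blood-cell scale, not a measured value):

```python
import math

def esd_um(volume_um3: float) -> float:
    """Equivalent spherical diameter (micrometres) from the particle
    volume a Coulter-type counter senses: d = (6*V/pi)**(1/3)."""
    return (6.0 * volume_um3 / math.pi) ** (1.0 / 3.0)

# A cell-scale volume of ~90 um^3 maps to a sphere of roughly 5.6 um.
d = esd_um(90.0)
```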
Light obscuration is a particle counting method primarily used for measuring small amounts of contamination in clean-room facilities, such as pharmaceutical labs and silicon chip manufacturing plants. Detecting contamination levels in aircraft fuels is another important application. In short, it is a low-concentration detection technique that requires regular calibration and is unsuitable for most industrial applications.
Sieving is an old technique used to separate particles into different size ranges; it is easy to use and cheap. However, the measurement results are strongly affected by operator error.
Scanning electron microscopy requires elaborate sample preparation and is slow. Although it provides clear images of the particles, there can be large operator-to-operator variability on the same sample, because the particles are analyzed manually and the observed area varies greatly, leading to poor representativeness.
Ultrasonic extinction is primarily used in online systems for measuring undiluted samples. It works well for some applications, but the apparatus is very expensive. To function properly it must be set up with up to 13 different parameters, which are difficult to determine and sometimes unavailable.