User Meeting 8 or 9 September 2015: Laser Diffraction Data Quality: Do's and Don'ts (Sysmex)


1 User Meeting 8 or 9 September 2015: Laser Diffraction Data Quality: Do's and Don'ts

2 Things to consider during and after the measurement:
- Result: particle size distribution
- Background
- Raw data
- Calculation *): model fit
- Obscuration *)
- Repeatability *)
- Measurement settings *)
*) Optimized during method development, but always good to check during and after the measurement.

3 Check the result. Particle size and particle size distribution: is this a logical, expected distribution?

4 Check the result. Mastersizer 2000: Result Analysis (M); Mastersizer 3000: Analysis.

5 Check the data: background and sample data. (Graph traces: background, sample data, background subtracted.)

6 Check the data: background and sample data.

7 Check the data. Mastersizer 2000: Data (M); Mastersizer 3000: Data. (Graph traces: background data, sample data.)

8 Check the data and background. (Graph traces: background data, sample data.)

9 Check the background data and system cleanliness. A good measurement requires a clean, stable background, which should show a progressive decrease across the detector range: less than 100 units on detector 1, and less than 20 scattering units by detector 20.
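The rules of thumb above can be sketched as a small check. This is a minimal illustration, assuming the background trace is available as a plain list of light-energy units per detector; the function name and thresholds are hypothetical, not an instrument API.

```python
# Hypothetical sketch: test a background trace against the guidelines above
# (clean, progressively decreasing; < 100 units on detector 1, < 20 by detector 20).
def background_ok(scattering, limit_det1=100, limit_det20=20):
    """scattering: light-energy units, index 0 = detector 1."""
    if scattering[0] >= limit_det1:        # detector 1 below 100 units
        return False
    if scattering[19] >= limit_det20:      # detector 20 below 20 units
        return False
    # progressive decrease across the inner detector range (no humps)
    return all(a >= b for a, b in zip(scattering[:20], scattering[1:20]))

# A clean, monotonically decaying background passes:
clean = [90 - 4 * i for i in range(20)]
print(background_ok(clean))  # True
```

A hump around detector 20 (as in background example 1) would break the monotonic decrease and fail this check.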

10 Background example 1. What does the background look like? What could be the reason? What could be done?

11 Background example 1 - solution. What does the background look like? A hump around detector 20. What could be the reason? Material stuck to the windows. What could be done? An extra flush, or clean the windows.

12 Background example 2. What does the background look like? What could be the reason? What could be done?

13 Background example 2 - solution. Poor background: contaminated dispersant. Particulate contamination causes fluctuations in the background over time; bubbles in the dispersant can also result in a similar background.

14 Background example 2 - solution View after background measurement, before sample addition

15 Background example 3. What does the background look like? What could be the reason? What could be done?

16 Poor background: misaligned system. Castellation indicates a misaligned system. This may be caused by dirt on the cell windows or changes in the temperature of the dispersant. Address these and then re-align.

17 Poor backgrounds: thermal gradients in organic dispersants. The sample dispersion unit may be at 30 °C or more, while the solvent is generally much cooler. This gives rise to thermal gradients, resulting in fluctuations in the laser beam, alignment problems and poor backgrounds. (Figure: thermal gradient in dispersant.)

18 Background example 3. What does the background look like? What could be the reason? What could be done?

19 Background example 3 In direct view: wild instability of the scattering on the first few detectors. After a few minutes the background should stabilise as the temperature differences equilibrate. If it does not stabilise, fit the cover of the dispersion unit to reduce surface evaporation.

20 Check the data. The data is already corrected for the background. Negative data (avoid this!) indicates an unstable background; results are affected! (Figure: light-scattering data graph, light energy vs detector number; background data; pharmaceutical in propanol; no background signal data.)

21 Check the raw data, Mastersizer 3000. The data is already corrected for the background. Negative data (avoid this!) indicates an unstable background; results are affected!
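Since the displayed data is sample minus background, negative channels are easy to flag programmatically. A minimal sketch, assuming sample and background are plain per-detector lists (the function name is hypothetical, not an instrument feature):

```python
# Hypothetical sketch: flag background-corrected channels that went negative,
# which points to an unstable background during the measurement.
def negative_channels(sample, background):
    """Return detector numbers (1-based) where sample - background < 0."""
    corrected = [s - b for s, b in zip(sample, background)]
    return [i + 1 for i, c in enumerate(corrected) if c < 0]

# Detector 2 here collected less light than its own background did:
print(negative_channels([5, 3, 10], [2, 4, 1]))  # [2]
```

Any non-empty result means the background should be re-measured before trusting the size distribution.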

22 Check the data: idea about the particle size?
First detector numbers: small-angle scatter, higher light energy, big particles.
Higher detector numbers: big-angle scatter, lower light energy, small particles.
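The inverse relationship between particle size and scattering angle can be illustrated with a simple diffraction rule of thumb. This is only a toy sketch using the Airy first-minimum angle for a circular particle (sin θ = 1.22 λ/d); the instrument itself uses full Mie theory, and the helper name and default wavelength (red He-Ne, 0.633 µm) are assumptions for illustration.

```python
import math

# Toy illustration (Airy diffraction, not the Mie model the instrument uses):
# bigger particles scatter into smaller angles, hence onto the first detectors.
def first_minimum_deg(diameter_um, wavelength_um=0.633):
    """Angle of the first diffraction minimum, in degrees."""
    return math.degrees(math.asin(1.22 * wavelength_um / diameter_um))

# A 100 um particle scatters at a much smaller angle than a 5 um particle:
print(first_minimum_deg(100) < first_minimum_deg(5))  # True
```

This is why high energy on the first (small-angle) detectors suggests coarse material, while energy on the outer detectors suggests fines.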

23 Check the fit: what is the fit? The fit page shows the measured scattering data and the data predicted by the scattering model. The precision of the overlay of these curves is known as the data fit; the goodness of the data fit is quantified by the residual. (Figure: light-scattering data graph, light energy vs detector number, with weighted fit data.)

24 How do we get the size distribution?

25 Check the fit. Weighted residual and residual < 2% (if possible). A bad fit could be caused by:
- Poor selection of optical properties
- Negative data
- Too high obscuration (multiple scattering)
- Wrong selection of the calculation model
- The sample itself (mixture of components, etc.)
Optimize the fit, but still check that you use logical optical properties and get a logical particle size distribution.
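To make the "< 2%" target concrete, here is a sketch of a residual-style metric: the summed absolute difference between measured and modelled data as a percentage of the measured total. This is an assumed, simplified formula for illustration; the instrument's exact residual definition (and its weighted variant) may differ.

```python
# Illustrative residual metric (assumed formula, not the instrument's exact one):
# how far the model's predicted scattering is from the measured data, in percent.
def residual_percent(measured, fitted):
    diff = sum(abs(m - f) for m, f in zip(measured, fitted))
    return 100.0 * diff / sum(measured)

measured = [100.0, 80.0, 60.0, 40.0]
fitted   = [ 99.0, 81.0, 59.0, 40.0]
print(round(residual_percent(measured, fitted), 2))  # ~1.07, under the 2% target
```

A residual well above 2%, or a weighted residual of a different order than the unweighted one, is the cue to revisit the optical properties or calculation model listed above.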

26 Example of unweighted and weighted fit: good. (Figures: example - unweighted; example - weighted.)

27 Assessing the data fit: absorption (MS2000). Misfits to the extinction detectors indicate an incorrect absorption value: detector 51 in the red light, detector 52 in the blue light. (Figure: light-scattering data graph, detectors 51 and 52 highlighted.)

28 Assessing the data fit: RI (MS2000). A poor fit to the focal plane detectors (<40) suggests an incorrect choice of refractive index. (Figure: light-scattering data graph.)

29 Assessing the data fit: absorption (MS3000). Misfits to the extinction detectors indicate an incorrect absorption value: detector 51 in the red light, detector 63 in the blue light. A poor data fit here indicates a poor choice of absorption.

30 Assessing the data fit: RI (MS3000). A poor fit to the focal plane detectors (<40) suggests an incorrect choice of refractive index.

31 Check the fit: not good, MS3000. Residuals > 1-2%; residual and weighted residual not of the same order. No good fit for detectors 51 and 63 (extinction, red and blue): optimize the optical properties (absorption).

32 Check the fit: not good, MS3000. Residuals > 1-2%; residual and weighted residual not of the same order. No good fit for the other detectors: optimize the optical properties (refractive index).

33 MS3000 development: the optical property optimiser (OPO) offers a quick way to adjust optical properties and assess the fit and result. (Screens: optical property selection; data and result views.)

34 Check the obscuration. Stable between measurements? Compare red/blue obscuration.

35 Check the obscuration: red and blue obscuration. (Figures: particle size distributions, volume (%) vs particle size (µm), for test samples X, Y and Z.)

36 Check obscuration. Stable between measurements? Compare red/blue obscuration (indicative information).

37 Check the obscuration. Obscuration too high: multiple scattering. Obscuration too low: low signal-to-noise ratio, bad reproducibility. Blue > red: small particles, risk of multiple scattering; avoid blue obscuration above ca. 15%. Find the right obscuration with an obscuration titration.

38 Guideline obscuration ranges for sizes:
Dv(50) value    Obscuration range
Submicron       2-5%
1-10 µm         5-10%
Over 10 µm      5-15%
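The guideline table above can be expressed as a small lookup. This is an illustrative helper only (assumed name and return convention, not an instrument feature), encoding exactly the three rows of the table:

```python
# Illustrative lookup of the guideline table above: recommended obscuration
# range (low %, high %) by Dv(50) in micrometres.
def obscuration_range(dv50_um):
    if dv50_um < 1:
        return (2, 5)    # submicron: 2-5%
    if dv50_um <= 10:
        return (5, 10)   # 1-10 um: 5-10%
    return (5, 15)       # over 10 um: 5-15%

print(obscuration_range(0.5))  # (2, 5)
print(obscuration_range(25))   # (5, 15)
```

These are starting points; the deck's advice still applies — confirm the working obscuration with an obscuration titration during method development.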

39 Obscuration guidelines. Obscuration is tested during method development.
Method  Distribution  Particle size  Obscuration
Dry     Wide          Coarse         0.5-6%
Dry     Narrow        Fine           0.5-3%
Wet     Narrow        Nano           5-6%
Wet     Narrow        <4 µm          10%
Wet     Wide          Coarse         15-25%

40 MS3000 development: data quality.

41 Check the repeatability. A good measurement is repeatable. (Figure: overlaid particle size distributions, volume (%) vs particle size (µm), of four repeat measurements of Solvent1.)

42 Check the repeatability. If not repeatable: did the measurement settings change? Did the calculation change? Did the dispersion change? (Figure: overlaid particle size distributions of repeat measurements of product1, product1+ and product1++.)

43 Check the repeatability: ISO guideline.
Limits suggested in ISO 13320: Dv(50): RSD < 3%; Dv(10) and Dv(90): RSD < 5%. Below 10 µm, these maximum values should be doubled. In ideal conditions: 0.5% COV on parameters > 1 µm, 1% COV on parameters < 1 µm.
Limits suggested in USP<429> and EP: Dv(50): RSD < 10%; Dv(10) and Dv(90): RSD < 15.0%. Below 10 µm, these maximum values should be doubled.
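The ISO 13320 acceptance check above is straightforward to compute from repeat measurements. A minimal sketch with assumed function names, using the slide's limits (Dv(50) RSD < 3%, Dv(10)/Dv(90) RSD < 5%, doubled below 10 µm):

```python
from statistics import mean, stdev

def rsd_percent(values):
    """Relative standard deviation (COV) in percent."""
    return 100.0 * stdev(values) / mean(values)

# Sketch of the ISO 13320 repeatability limits quoted on the slide
# (hypothetical helper; not part of the instrument software).
def within_iso13320(dv10s, dv50s, dv90s):
    factor = 2 if mean(dv50s) < 10 else 1   # limits doubled below 10 um
    return (rsd_percent(dv50s) < 3 * factor
            and rsd_percent(dv10s) < 5 * factor
            and rsd_percent(dv90s) < 5 * factor)

# Three repeat measurements of Dv(10)/Dv(50)/Dv(90), in micrometres:
print(within_iso13320([12.1, 12.0, 12.2],
                      [45.0, 45.3, 44.8],
                      [98.0, 99.1, 97.5]))  # True
```

The same structure applies to the USP<429>/EP limits; only the 3%/5% thresholds change.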

44 Check the repeatability.

45 Check repeatability: a good measurement is repeatable.

46 Check repeatability. If not repeatable: did the measurement settings change? Did the calculation change? Did the dispersion change?

47 Summary - data quality.
Background data - make sure that:
- Material is not stuck to the cell windows
- There is no dispersant contamination
- There are no thermal gradients
- The system has been properly aligned
- The inner detector data is free from castellation
Sample data - check that:
- There are reasonable signal-to-noise levels
- There is no multiple scattering
- There is no negative data
- There is no noisy data
- There is no beam steering

48 Questions? Please contact: Sandra Remijn, Application Specialist; Tineke Mink, Application Specialist.