Chemical, Physical, Environmental Analyzers
The industrial CHEMICAL ANALYZERS we provide are: CHROMATOGRAPHS, MASS SPECTROMETERS, RESIDUAL GAS ANALYZERS, GAS DETECTORS, MOISTURE ANALYZERS, DIGITAL GRAIN AND WOOD MOISTURE METERS, ANALYTICAL BALANCES.
The industrial PHYSICAL ANALYSIS INSTRUMENTS we offer are: SPECTROPHOTOMETERS, POLARIMETERS, REFRACTOMETERS, LUX METERS, GLOSS METERS, COLOR READERS, COLOR DIFFERENCE METERS, DIGITAL LASER DISTANCE METERS, LASER RANGEFINDERS, ULTRASONIC CABLE HEIGHT METERS, SOUND LEVEL METERS, ULTRASONIC DISTANCE METERS, DIGITAL ULTRASONIC FLAW DETECTORS, HARDNESS TESTERS, METALLURGICAL MICROSCOPES, SURFACE ROUGHNESS TESTERS, ULTRASONIC THICKNESS GAUGES, VIBRATION METERS, TACHOMETERS.
For the highlighted products, please visit our related pages by clicking on the corresponding colored text above.
The ENVIRONMENTAL ANALYZERS we provide are: TEMPERATURE & HUMIDITY CYCLING CHAMBERS, ENVIRONMENTAL TESTING CHAMBERS.
To download the catalog of our SADT brand metrology and test equipment, please CLICK HERE. There you will find some models of the equipment listed above.
CHROMATOGRAPHY is a physical method of separation that distributes the components of a mixture between two phases, one stationary (the stationary phase) and the other (the mobile phase) moving in a definite direction. In other words, it refers to laboratory techniques for the separation of mixtures. The mixture is dissolved in a fluid called the mobile phase, which carries it through a structure holding another material called the stationary phase. The various constituents of the mixture travel at different speeds, which causes them to separate. The separation is based on differential partitioning between the mobile and stationary phases. Small differences in the partition coefficients of compounds result in differential retention on the stationary phase and thus change the separation. Chromatography can be used to separate the components of a mixture for further use (such as purification) or to measure the relative proportions of analytes (the substances to be separated during chromatography) in a mixture. Several chromatographic methods exist, such as paper chromatography, gas chromatography and high performance liquid chromatography. ANALYTICAL CHROMATOGRAPHY is used to determine the existence and the concentration of analyte(s) in a sample. In a chromatogram, different peaks or patterns correspond to different components of the separated mixture. In an optimal system each signal is proportional to the concentration of the corresponding analyte that was separated. An instrument called a CHROMATOGRAPH enables a sophisticated separation. There are specialized types according to the physical state of the mobile phase, such as GAS CHROMATOGRAPHS and LIQUID CHROMATOGRAPHS. Gas chromatography (GC), also sometimes called gas-liquid chromatography (GLC), is a separation technique in which the mobile phase is a gas.
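The differential retention described above is usually quantified with the retention factor and the resolution between adjacent peaks. The following sketch uses illustrative numbers and function names of our own choosing, not tied to any particular instrument:

```python
def retention_factor(t_r, t_0):
    """Retention factor k: how long an analyte is retained
    relative to an unretained compound (dead time t_0)."""
    return (t_r - t_0) / t_0

def resolution(t_r1, t_r2, w1, w2):
    """Resolution between two adjacent chromatogram peaks, from
    their retention times and base peak widths (same time units).
    Rs > 1.5 is conventionally considered baseline separation."""
    return 2.0 * (t_r2 - t_r1) / (w1 + w2)

# Two peaks eluting at 4.0 and 5.0 min, dead time 1.0 min,
# base widths 0.4 min each (illustrative values):
k1 = retention_factor(4.0, 1.0)
rs = resolution(4.0, 5.0, 0.4, 0.4)
print(k1, rs)
```

With these numbers the second peak is well resolved from the first, which is what the "different speeds" of the constituents ultimately buy you.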
The high temperatures used in gas chromatography make it unsuitable for the high molecular weight biopolymers or proteins encountered in biochemistry, because heat denatures them. The technique is, however, well suited for use in the petrochemical, environmental monitoring, chemical research and industrial chemical fields. Liquid Chromatography (LC), on the other hand, is a separation technique in which the mobile phase is a liquid.
In order to measure the characteristics of individual molecules, a MASS SPECTROMETER converts them to ions so that they can be accelerated and moved about by external electric and magnetic fields. Mass spectrometers are used in the chromatographs explained above, as well as in other analysis instruments. The main components of a typical mass spectrometer are:
Ion Source: A small sample is ionized, usually to cations by loss of an electron.
Mass Analyzer: The ions are sorted and separated according to their mass and charge.
Detector: The separated ions are measured and results displayed on a chart.
Ions are very reactive and short-lived, therefore their formation and manipulation must be conducted in a vacuum. The pressure under which ions may be handled is roughly 10⁻⁵ to 10⁻⁸ torr. The three tasks listed above may be accomplished in different ways. In one common procedure, ionization is effected by a high energy beam of electrons, and ion separation is achieved by accelerating and focusing the ions in a beam, which is then bent by an external magnetic field. The ions are then detected electronically and the resulting information is stored and analyzed in a computer. The heart of the spectrometer is the ion source. Here molecules of the sample are bombarded by electrons emanating from a heated filament, known as an electron source. Gases and volatile liquid samples are allowed to leak into the ion source from a reservoir, and non-volatile solids and liquids may be introduced directly. Cations formed by the electron bombardment are pushed away by a charged repeller plate (anions are attracted to it) and accelerated toward other electrodes having slits through which the ions pass as a beam. Some of these ions fragment into smaller cations and neutral fragments. A perpendicular magnetic field deflects the ion beam in an arc whose radius is inversely proportional to the mass of each ion. Lighter ions are deflected more than heavier ions. By varying the strength of the magnetic field, ions of different mass can be focused progressively on a detector fixed at the end of a curved tube under a high vacuum. A mass spectrum is displayed as a vertical bar graph, each bar representing an ion having a specific mass-to-charge ratio (m/z), with the length of the bar indicating the relative abundance of the ion. The most intense ion is assigned an abundance of 100 and is referred to as the base peak. Most of the ions formed in a mass spectrometer have a single charge, so the m/z value is equivalent to the mass itself.
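The magnetic deflection described above follows from two textbook relations: the accelerating voltage gives each ion a speed via qV = mv²/2, and the magnetic field bends it on a radius r = mv/qB, so r = sqrt(2mV/q)/B. The sketch below, with voltage and field values chosen purely for illustration, shows why lighter ions follow a tighter arc:

```python
import math

E = 1.602176634e-19      # elementary charge, C
AMU = 1.66053906660e-27  # atomic mass unit, kg

def sector_radius(mass_amu, charge=1, accel_volts=2000.0, b_tesla=0.5):
    """Radius of an ion's arc in a magnetic sector:
    r = sqrt(2*m*V/q) / B, from qV = m*v^2/2 and r = m*v/(q*B)."""
    m = mass_amu * AMU
    q = charge * E
    return math.sqrt(2.0 * m * accel_volts / q) / b_tesla

# m/z 28 (e.g. N2+) vs m/z 44 (e.g. CO2+): the lighter ion is
# deflected more, i.e. travels on the smaller radius.
r28 = sector_radius(28)
r44 = sector_radius(44)
print(round(r28, 4), round(r44, 4))
```

Sweeping B while keeping the detector fixed brings ions of successive masses onto the detector, which is exactly the scanning scheme the paragraph describes.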
Modern mass spectrometers have very high resolutions and can easily distinguish ions differing by only a single atomic mass unit (amu).
A RESIDUAL GAS ANALYZER (RGA) is a small and rugged mass spectrometer of the kind explained above. RGAs are designed for process control and contamination monitoring in vacuum systems such as research chambers, surface science setups, accelerators and scanning microscopes. Utilizing quadrupole technology, there are two implementations, using either an open ion source (OIS) or a closed ion source (CIS). RGAs are used in most cases to monitor the quality of the vacuum and can easily detect minute traces of impurities, with sub-ppm detectability in the absence of background interferences. These impurities can be measured down to 10⁻¹⁴ Torr levels. Residual gas analyzers are also used as sensitive in-situ helium leak detectors. Vacuum systems require checking of the integrity of the vacuum seals and the quality of the vacuum for air leaks and contaminants at low levels before a process is initiated. Modern residual gas analyzers come complete with a quadrupole probe, an electronics control unit, and a real-time Windows software package that is used for data acquisition, analysis and probe control. Some software supports multiple head operation when more than one RGA is needed. A simple design with a small number of parts minimizes outgassing and reduces the chances of introducing impurities into your vacuum system. Probe designs using self-aligning parts ensure easy reassembly after cleaning. LED indicators on modern devices provide instant feedback on the status of the electron multiplier, filament, electronics system and probe. Long-life, easily changeable filaments are used for electron emission. For increased sensitivity and faster scan rates, an optional electron multiplier is sometimes offered that detects partial pressures down to 5 × 10⁻¹⁴ Torr. Another attractive feature of residual gas analyzers is the built-in degassing feature.
Using electron impact desorption, the ion source is thoroughly cleaned, greatly reducing the ionizer's contribution to background noise. With a large dynamic range the user can make measurements of small and large gas concentrations simultaneously.
A MOISTURE ANALYZER weighs the original matter, dries it with infrared energy, and determines the dry mass remaining after the drying process. Humidity is calculated in relation to the weight of the wet matter. During the drying process, the decrease of moisture in the material is shown on the display. The moisture analyzer determines moisture and the amount of dry mass, as well as the consistency of volatile and fixed substances, with high accuracy. The weighing system of the moisture analyzer possesses all the properties of modern balances. These metrology tools are used in the industrial sector to analyze pastes, wood, adhesive materials, dust, etc. There are many applications where trace moisture measurements are necessary for manufacturing and process quality assurance. Trace moisture in solids must be controlled for plastics, pharmaceuticals and heat treatment processes. Trace moisture in gases and liquids needs to be measured and controlled as well. Examples include dry air, hydrocarbon processing, pure semiconductor gases, bulk pure gases and natural gas in pipelines. Loss on drying (LOD) type analyzers incorporate an electronic balance with a sample tray and surrounding heating element. If the volatile content of the solid is primarily water, the LOD technique gives a good measure of moisture content. An accurate method for determining the amount of water is the Karl Fischer titration, developed by the German chemist of that name. This method detects only water, contrary to loss on drying, which detects any volatile substance. Yet for natural gas there are specialized methods for the measurement of moisture, because natural gas poses a unique situation by having very high levels of solid and liquid contaminants as well as corrosives in varying concentrations.
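The loss-on-drying calculation the analyzer performs is straightforward: compare the weight before and after drying, relative to the wet weight, as the text states. A minimal sketch (the function names and the sample masses are our own, for illustration only):

```python
def moisture_percent_wet_basis(wet_g, dry_g):
    """Loss-on-drying moisture content, wet basis: the decrease
    in mass is taken relative to the original wet weight."""
    if wet_g <= 0:
        raise ValueError("wet mass must be positive")
    return 100.0 * (wet_g - dry_g) / wet_g

def dry_mass_percent(wet_g, dry_g):
    """Complementary figure: remaining dry mass as a percentage."""
    return 100.0 * dry_g / wet_g

# A 10.00 g sample drying down to 8.73 g:
print(round(moisture_percent_wet_basis(10.0, 8.73), 2))
```

Note that, as the text points out, this figure counts all volatile losses as "moisture", unlike Karl Fischer titration, which is specific to water.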
MOISTURE METERS are test equipment for measuring the percentage of water in a substance or material. Using this information, workers in various industries determine whether the material is ready for use, too wet or too dry. For example, wood and paper products are very sensitive to their moisture content; physical properties including dimensions and weight are strongly affected by it. If you are purchasing large quantities of wood by weight, it is wise to measure the moisture content to make sure the wood has not been intentionally watered to increase the price. Generally two basic types of moisture meters are available. One type measures the electrical resistance of the material, which becomes increasingly lower as the moisture content rises. With the electrical resistance type of moisture meter, two electrodes are driven into the material and the electrical resistance is translated into moisture content on the device's electronic output. A second type of moisture meter relies on the dielectric properties of the material and requires only surface contact with it.
The ANALYTICAL BALANCE is a basic tool in quantitative analysis, used for the accurate weighing of samples and precipitates. A typical balance should be able to determine differences in mass of 0.1 milligram. In microanalyses the balance must be about 1,000 times more sensitive, and for special work balances of even higher sensitivity are available. The measuring pan of an analytical balance is inside a transparent enclosure with doors so that dust does not collect and air currents in the room do not affect the balance's operation. A smooth, turbulence-free airflow and ventilation prevent balance fluctuation and allow the measurement of mass down to 1 microgram without fluctuations or loss of product. Consistent response throughout the useful capacity is achieved by maintaining a constant load on the balance beam, and thus the fulcrum, by subtracting mass on the same side of the beam to which the sample is added. Electronic analytical balances measure the force needed to counter the mass being measured rather than using actual masses; therefore they must have calibration adjustments made to compensate for gravitational differences. An electronic analytical balance uses an electromagnet to generate a force to counter the sample being measured, and outputs the result by measuring the force needed to achieve balance.
SPECTROPHOTOMETRY is the quantitative measurement of the reflection or transmission properties of a material as a function of wavelength, and a SPECTROPHOTOMETER is the test equipment used for this purpose. The spectral bandwidth (the range of colors it can transmit through the test sample), the percentage of sample transmission, the logarithmic range of sample absorption and the percentage of reflectance measurement are critical for spectrophotometers. These test instruments are widely used in optical component testing, where optical filters, beam splitters, reflectors, mirrors and similar components need to be evaluated for their performance. There are many other applications of spectrophotometers, including the measurement of transmission and reflection properties of pharmaceutical and medical solutions, chemicals, dyes, colors, etc. These tests ensure consistency from batch to batch in production. A spectrophotometer is able to determine, depending on the control or calibration, what substances are present in a target and their quantities through calculations using observed wavelengths. The range of wavelengths covered is generally between 200 nm and 2500 nm using different controls and calibrations. Within these ranges of light, calibrations are needed on the machine using specific standards for the wavelengths of interest. There are two major types of spectrophotometers, namely single beam and double beam. Double beam spectrophotometers compare the light intensity between two light paths, one path containing a reference sample and the other path containing the test sample. A single-beam spectrophotometer, on the other hand, measures the relative light intensity of the beam before and after a test sample is inserted. Although measurements from double-beam instruments are easier to compare and more stable, single-beam instruments can have a larger dynamic range and are optically simpler and more compact.
Spectrophotometers can also be installed into other instruments and systems, which can help users to perform in-situ measurements during production. The typical sequence of events in a modern spectrophotometer can be summarized as follows: first the light source is imaged upon the sample, and a fraction of the light is transmitted or reflected from the sample. Then the light from the sample is imaged upon the entrance slit of the monochromator, which separates the wavelengths of light and focuses each of them onto the photodetector sequentially. The most common spectrophotometers are UV & VISIBLE SPECTROPHOTOMETERS, which operate in the ultraviolet and 400–700 nm visible wavelength range. Some of them cover the near-infrared region too. IR SPECTROPHOTOMETERS, on the other hand, are more complicated and expensive because of the technical requirements of measurement in the infrared region. Infrared photosensors are more expensive, and infrared measurement is also challenging because almost everything emits IR light as thermal radiation, especially at wavelengths beyond about 5 μm. Many materials used in other types of spectrophotometers, such as glass and plastic, absorb infrared light, making them unfit as the optical medium. Ideal optical materials are salts such as potassium bromide, which do not absorb strongly.
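The percentage-transmission and logarithmic-absorption quantities mentioned above are linked by A = -log10(T), and the Beer-Lambert law A = εlc is what lets the instrument turn an absorbance reading into a concentration. A small sketch under assumed calibration values (the molar absorptivity here is illustrative, not a real compound's):

```python
import math

def absorbance_from_transmittance(t_percent):
    """Convert a %T reading into absorbance: A = -log10(T)."""
    return -math.log10(t_percent / 100.0)

def concentration_mol_per_l(absorbance, molar_absorptivity, path_cm=1.0):
    """Beer-Lambert law A = eps * l * c solved for c.  The molar
    absorptivity eps (L/(mol*cm)) must come from a calibration
    standard at the wavelength of interest."""
    return absorbance / (molar_absorptivity * path_cm)

# A sample transmitting 10 % of the light has absorbance 1.0;
# with an assumed eps = 100 L/(mol*cm) and a 1 cm cuvette:
a = absorbance_from_transmittance(10.0)
c = concentration_mol_per_l(a, 100.0)
print(a, c)
```

This is also why calibration against known standards, as the text stresses, is essential: without a measured ε the absorbance alone gives no concentration.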
A POLARIMETER measures the angle of rotation caused by passing polarized light through an optically active material. Some chemical materials are optically active, and polarized (unidirectional) light will rotate either to the left (counter-clockwise) or to the right (clockwise) when passed through them. The amount by which the light is rotated is called the angle of rotation. In one popular application, concentration and purity measurements are made to determine product or ingredient quality in the food, beverage and pharmaceutical industries. Samples that display specific rotations which can be evaluated for purity with a polarimeter include steroids, antibiotics, narcotics, vitamins, amino acids, polymers, starches and sugars. Many chemicals exhibit a unique specific rotation which can be used to distinguish them. A polarimeter can identify unknown specimens on this basis if other variables, like the concentration and the length of the sample cell, are controlled or at least known. On the other hand, if the specific rotation of a sample is already known, then the concentration and/or purity of a solution containing it can be calculated. Automatic polarimeters perform these calculations once the relevant variables are entered by the user.
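The calculation an automatic polarimeter performs is the standard specific-rotation relation [α] = α / (l·c), with the path length in decimeters and the concentration in g/mL. A sketch, using sucrose's well-known specific rotation of roughly +66.5° as the example:

```python
def specific_rotation(observed_deg, path_dm, conc_g_per_ml):
    """[alpha] = alpha / (l * c): the observed rotation normalized
    to a 1 dm path and a concentration of 1 g/mL."""
    return observed_deg / (path_dm * conc_g_per_ml)

def concentration_from_rotation(observed_deg, specific_rot, path_dm):
    """If [alpha] is already known, solve the same relation for
    the concentration of the solution (g/mL)."""
    return observed_deg / (specific_rot * path_dm)

# Sucrose ([alpha] ~ +66.5 deg) in a 2 dm tube reading +13.3 deg:
c = concentration_from_rotation(13.3, 66.5, 2.0)
print(round(c, 3))  # concentration in g/mL
```

This mirrors the two use cases in the text: known concentration and cell length identify the substance; known specific rotation yields concentration or purity.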
A REFRACTOMETER is a piece of optical test equipment for the measurement of the index of refraction. These instruments measure the extent to which light is bent, i.e. refracted, when it moves from air into the sample, and are typically used to determine the refractive index of samples. There are five types of refractometers: traditional handheld refractometers, digital handheld refractometers, laboratory or Abbe refractometers, inline process refractometers and finally Rayleigh refractometers for measuring the refractive indices of gases. Refractometers are widely used in various disciplines such as mineralogy, medicine, veterinary science and the automotive industry to examine products as diverse as gemstones, blood samples, auto coolants and industrial oils. The refractive index is an optical parameter used to analyze liquid samples. It serves to identify or confirm the identity of a sample by comparing its refractive index to known values, helps assess the purity of a sample by comparing its refractive index to the value for the pure substance, and helps determine the concentration of a solute in a solution by comparing the solution's refractive index to a standard curve. Let us go briefly over the types of refractometers: TRADITIONAL REFRACTOMETERS take advantage of the critical angle principle, by which a shadow line is projected onto a small glass scale through prisms and lenses. The specimen is placed between a small cover plate and a measuring prism, and the point at which the shadow line crosses the scale indicates the reading. Automatic temperature compensation is provided, because the refractive index varies with temperature. DIGITAL HANDHELD REFRACTOMETERS are compact, lightweight, water-resistant and high-temperature-resistant testing devices. Measurement times are very short, in the range of only two to three seconds. LABORATORY REFRACTOMETERS are ideal for users planning to measure multiple parameters, get the outputs in various formats and take printouts.
Laboratory refractometers offer a wider range and higher accuracy than handheld refractometers. They can be connected to computers and controlled externally. INLINE PROCESS REFRACTOMETERS can be configured to constantly collect specified statistics of the material remotely. Microprocessor control provides computing power that makes these devices very versatile, time-saving and economical. Finally, the RAYLEIGH REFRACTOMETER is used for measuring the refractive indices of gases.
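The critical angle principle that Abbe-type refractometers exploit comes from Snell's law: at the critical angle θc, the sample index satisfies n_sample = n_prism · sin(θc), so reading θc off the shadow line gives the index directly. A sketch, where the prism index of 1.72 is an assumed flint-glass value rather than any particular instrument's specification:

```python
import math

N_PRISM = 1.72  # assumed high-index measuring prism (illustrative)

def index_from_critical_angle(theta_c_deg, n_prism=N_PRISM):
    """Snell's law at grazing incidence: at the critical angle,
    n_sample = n_prism * sin(theta_c)."""
    return n_prism * math.sin(math.radians(theta_c_deg))

def critical_angle_deg(n_sample, n_prism=N_PRISM):
    """Inverse relation: the shadow-line angle a given sample
    would produce against this prism."""
    return math.degrees(math.asin(n_sample / n_prism))

# Water has n ~ 1.333; against the assumed prism the shadow line
# falls at the corresponding critical angle:
theta = critical_angle_deg(1.333)
print(round(theta, 2), round(index_from_critical_angle(theta), 3))
```

The temperature compensation the text mentions simply applies a correction to this reading, since n varies measurably with temperature.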
The quality of light is very important in the workplace, on the factory floor, and in hospitals, clinics, schools, public buildings and many other places. LUX METERS are used to measure luminous intensity (brightness). Special optical filters match the spectral sensitivity of the human eye. Luminous intensity is measured and reported in foot-candles or lux (lx). One lux is equal to one lumen per square meter and one foot-candle is equal to one lumen per square foot. Modern lux meters are equipped with internal memory or a data logger to record the measurements, cosine correction of the angle of incident light, and software to analyze readings. There are also lux meters for measuring UVA radiation. High-end lux meters offer Class A status meeting CIE requirements, graphic displays, statistical analysis functions, a large measurement range up to 300 klx, manual or automatic range selection, and USB and other outputs.
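Since one lux is a lumen per square meter and one foot-candle a lumen per square foot, the two units differ exactly by the area ratio of a square foot to a square meter. A small conversion sketch (function names are ours):

```python
SQM_PER_SQFT = 0.09290304  # 1 ft^2 in m^2 (exact, from 1 ft = 0.3048 m)

def lux_to_footcandles(lx):
    """fc = lm/ft^2, lx = lm/m^2, so fc = lx * (m^2 per ft^2 ratio)."""
    return lx * SQM_PER_SQFT

def footcandles_to_lux(fc):
    """Inverse conversion: 1 fc is about 10.764 lx."""
    return fc / SQM_PER_SQFT

# A reading of 50 fc on an imperial-scale meter in lux:
print(round(footcandles_to_lux(50.0), 1))
```

This is the same conversion a dual-unit lux meter applies when you toggle its display between lx and fc.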
A LASER RANGEFINDER is a test instrument which uses a laser beam to determine the distance to an object. The operation of most laser rangefinders is based on the time of flight principle: a laser pulse is sent in a narrow beam towards the object, and the time taken by the pulse to be reflected off the target and returned to the sender is measured. This equipment is not suitable, however, for high precision sub-millimeter measurements. Some laser rangefinders use the Doppler effect technique to determine whether the object is moving towards or away from the rangefinder, as well as the object's speed. The precision of a laser rangefinder is determined by the rise or fall time of the laser pulse and the speed of the receiver. Rangefinders that use very sharp laser pulses and very fast detectors are capable of measuring the distance to an object to within a few millimeters. Laser beams will eventually spread over long distances due to beam divergence. Distortions caused by turbulent air also make it difficult to get an accurate reading of the distance to an object over distances of more than 1 km in open and unobscured terrain, and over even shorter distances in humid and foggy conditions. High-end military rangefinders operate at ranges up to 25 km, are combined with binoculars or monoculars, and can be connected to computers wirelessly. Laser rangefinders are used in 3-D object recognition and modelling and in a wide variety of computer vision-related fields, such as time-of-flight 3D scanners offering high-precision scanning abilities. The range data retrieved from multiple angles of a single object can be used to produce complete 3-D models with as little error as possible. Laser rangefinders used in computer vision applications offer depth resolutions of tenths of millimeters or less. Many other application areas for laser rangefinders exist, such as sports, construction, industry and warehouse management.
Modern laser measurement tools include functions such as capability to make simple calculations, such as the area and volume of a room, switching between imperial and metric units.
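The time-of-flight principle behind these devices is simply distance = c · t / 2, since the pulse travels out and back. The sketch below also shows why the receiver's speed limits precision, as noted above: millimeter-level ranging demands picosecond-level timing. Function names and example values are illustrative:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s):
    """Time-of-flight ranging: the pulse covers the distance
    twice (out and back), so distance = c * t / 2."""
    return C * round_trip_s / 2.0

def timing_resolution_s(distance_resolution_m):
    """Timing precision the receiver needs to resolve a given
    range difference: t = 2 * d / c."""
    return 2.0 * distance_resolution_m / C

# A pulse echo arriving after ~667 ns corresponds to ~100 m:
print(round(tof_distance_m(667e-9), 1))
# Resolving 3 mm of range requires timing on the order of 20 ps:
print(timing_resolution_s(0.003))
```

The picosecond figure makes concrete why "very sharp laser pulses and very fast detectors" are the ingredients of millimeter-class rangefinders.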
An ULTRASONIC DISTANCE METER works on a similar principle to a laser distance meter, but instead of light it uses sound with a pitch too high for the human ear to hear. The speed of sound is only about one third of a kilometer per second, so the time measurement is easier. Ultrasound has many of the same advantages as a laser distance meter, namely single-person, one-handed operation with no need to access the target personally. However, ultrasonic distance meters are intrinsically less accurate, because sound is far more difficult to focus than laser light. Accuracy is typically several centimeters or even worse, while it is a few millimeters for laser distance meters. Ultrasound needs a large, smooth, flat surface as the target, which is a severe limitation: you cannot measure to a narrow pipe or similarly small targets. The ultrasound signal spreads out in a cone from the meter, and any objects in the way can interfere with the measurement. Even with laser aiming, one cannot be sure that the surface from which the sound reflection is detected is the same as the one the laser dot is marking. This can lead to errors. Range is limited to tens of meters, whereas laser distance meters can measure hundreds of meters. Despite all these limitations, ultrasonic distance meters cost much less.
The handheld ULTRASONIC CABLE HEIGHT METER is a test instrument for measuring cable sag, cable height and overhead clearance to ground. It is the safest method for cable height measurement because it eliminates cable contact and the use of heavy fiberglass poles. Similar to other ultrasonic distance meters, the cable height meter is a simple one-person device that sends ultrasound waves to the target, measures the time to echo, calculates the distance based on the speed of sound, and adjusts itself for air temperature.
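The temperature adjustment mentioned above matters because the speed of sound in air changes by roughly 0.6 m/s per degree Celsius. A sketch of the echo-to-distance calculation with that compensation, using the common linear approximation for dry air near room temperature (function names are ours):

```python
def speed_of_sound_mps(temp_c):
    """Speed of sound in dry air, linear approximation valid
    near room temperature: ~331.3 m/s at 0 C, +0.606 m/s per C."""
    return 331.3 + 0.606 * temp_c

def ultrasonic_distance_m(echo_time_s, temp_c=20.0):
    """Out-and-back echo, so distance = v * t / 2, with v taken
    at the measured air temperature (the meter's internal
    temperature adjustment)."""
    return speed_of_sound_mps(temp_c) * echo_time_s / 2.0

# The same 58 ms echo read at 0 C vs 30 C differs by about half
# a meter, which is why uncompensated readings would be useless:
d_cold = ultrasonic_distance_m(58e-3, 0.0)
d_warm = ultrasonic_distance_m(58e-3, 30.0)
print(round(d_cold, 2), round(d_warm, 2))
```

The half-meter spread at a ten-meter range also illustrates why ultrasonic accuracy sits at the centimeter level rather than the millimeter level of laser meters.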
A SOUND LEVEL METER is a testing instrument that measures sound pressure level. Sound level meters are useful in noise pollution studies for the quantification of different kinds of noise. The measurement of noise pollution is important in construction, aerospace and many other industries. The American National Standards Institute (ANSI) specifies sound level meters as three different types, namely 0, 1 and 2. The relevant ANSI standards set performance and accuracy tolerances according to three levels of precision: Type 0 is used in laboratories, Type 1 is used for precision measurements in the field, and Type 2 is used for general-purpose measurements. For compliance purposes, readings with an ANSI Type 2 sound level meter and dosimeter are considered to have an accuracy of ±2 dBA, whereas a Type 1 instrument has an accuracy of ±1 dBA. A Type 2 meter is the minimum requirement by OSHA for noise measurements and is usually sufficient for general-purpose noise surveys. The more accurate Type 1 meter is intended for the design of cost-effective noise controls. International industry standards related to frequency weighting, peak sound pressure levels, etc. are beyond the scope of this page. Before purchasing a particular sound level meter, we advise you to make sure you know what standards compliance your workplace requires, so that you choose the right model of test instrument.
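The sound pressure level these meters display is a logarithmic ratio against the standard reference pressure of 20 µPa (roughly the threshold of human hearing): SPL = 20·log10(p/p₀). A minimal sketch of the conversion (the function names are our own):

```python
import math

P_REF = 20e-6  # reference sound pressure, 20 micropascals

def spl_db(pressure_pa):
    """Sound pressure level in dB: SPL = 20 * log10(p / p_ref)."""
    return 20.0 * math.log10(pressure_pa / P_REF)

def pressure_from_spl(db):
    """Inverse: RMS sound pressure in Pa for a given SPL."""
    return P_REF * 10.0 ** (db / 20.0)

# An RMS pressure of 1 Pa corresponds to roughly 94 dB, the
# level produced by common acoustic calibrators:
print(round(spl_db(1.0), 1))
```

Frequency weighting (the "A" in dBA) filters the pressure signal to mimic the ear's sensitivity before this conversion, which is why compliance readings are quoted in dBA rather than plain dB.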
ENVIRONMENTAL ANALYZERS like TEMPERATURE & HUMIDITY CYCLING CHAMBERS and ENVIRONMENTAL TESTING CHAMBERS come in a variety of sizes, configurations and functions depending on the area of application, the specific industrial standards compliance needed and the end users' needs. They can be configured and manufactured according to custom requirements. There is a broad range of test specifications, such as MIL-STD, SAE and ASTM, to help determine the most appropriate temperature-humidity profile for your product. Temperature / humidity testing is generally carried out for:
Accelerated Aging: Estimates the life of a product under normal use when the actual lifespan is unknown. Accelerated aging exposes the product to high levels of controlled temperature, humidity and pressure within a timeframe much shorter than the expected lifespan of the product. Instead of waiting for years to see a product's real lifespan, one can estimate it with these chambers within a much shorter and more reasonable time.
Accelerated Weathering: Simulates exposure to moisture, dew, heat, UV, etc. Weathering and UV exposure cause damage to coatings, plastics, inks, organic materials, devices and more. Fading, yellowing, cracking, peeling, brittleness, loss of tensile strength and delamination occur under prolonged UV exposure. Accelerated weathering tests are designed to determine whether products will stand the test of time.
Thermal Shock: Aims to determine the ability of materials, parts and components to withstand sudden changes in temperature. Thermal shock chambers rapidly cycle products between hot and cold temperature zones to see the effect of multiple thermal expansions and contractions, as would occur in natural or industrial environments over many seasons and years.
Pre & Post Conditioning: For conditioning of materials, containers, packages, devices, etc.
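One common way accelerated aging chamber time is planned is the Q10 model (used, for example, in ASTM F1980 for medical packaging shelf-life studies): the aging rate is assumed to double for every 10 °C above the normal storage temperature. This is a simplifying assumption, not a universal law, and the temperatures below are illustrative:

```python
def accelerated_aging_factor(t_chamber_c, t_ambient_c, q10=2.0):
    """Q10 acceleration model: the aging rate is assumed to
    multiply by q10 (typically 2) for every 10 C of elevation
    above the ambient storage temperature."""
    return q10 ** ((t_chamber_c - t_ambient_c) / 10.0)

def chamber_days_for_shelf_life(shelf_life_days, t_chamber_c,
                                t_ambient_c=22.0, q10=2.0):
    """Chamber dwell time needed to simulate a target real-time
    shelf life at the given chamber temperature."""
    aaf = accelerated_aging_factor(t_chamber_c, t_ambient_c, q10)
    return shelf_life_days / aaf

# Simulating a 2-year (730-day) shelf life at 55 C against a
# 22 C ambient compresses the test to roughly 74 chamber days:
print(round(chamber_days_for_shelf_life(730, 55.0)))
```

This is why accelerated aging "within a relatively shorter timeframe" is practical: modest temperature elevation compresses years of storage into weeks, at the cost of trusting the Q10 assumption for the material in question.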
For details and other similar equipment, please visit our equipment website: http://www.sourceindustrialsupply.com