Measurement Library

Measurement Science Conference Publications (1996)

Radiation Thermometry
Author(s): Russell N. Bigelow, Paul S. Carlson, Aaron M. Hunter, Stephen R. King
Abstract/Introduction:
Radiation thermometry is the measurement of the temperature of a body by quantification of the electromagnetic radiation emitted therefrom. Instruments are commercially available to measure temperatures from -100 C to more than 3,500 C with accuracies of 0.5 to 1.5 percent of the temperature reading in Kelvin. RMS repeatabilities are the greater of 0.2 percent of reading or 0.2 C. This paper presents the theory of radiation thermometers and a discussion of the sources of error that can affect their use for target temperatures below 600 C. The sources of error are calibration, size-of-source effect, emissivity of the target, background radiation, and path absorption.
Document ID: 89BD2BCB
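
As context for the error sources listed in the abstract (emissivity and background radiation in particular), the following Python sketch models a single-wavelength radiation thermometer using Planck's law and a grey-body target. The 3.9 um operating wavelength, 0.85 emissivity, and 25 C background are illustrative assumptions, not values taken from the paper.

    # Minimal model of a single-wavelength radiation thermometer (illustrative
    # values only).  Planck's law gives the spectral radiance; the measured
    # signal mixes target emission with reflected background radiation.
    import math

    C2 = 1.4388e-2  # second radiation constant, m*K

    def planck_radiance(wavelength_m, temp_k):
        """Spectral radiance on an arbitrary scale (the first radiation
        constant cancels when only ratios are used)."""
        return 1.0 / (wavelength_m**5 * (math.exp(C2 / (wavelength_m * temp_k)) - 1.0))

    def measured_radiance(wavelength_m, target_k, emissivity, background_k):
        """Grey-body target emission plus reflected background."""
        return (emissivity * planck_radiance(wavelength_m, target_k)
                + (1.0 - emissivity) * planck_radiance(wavelength_m, background_k))

    def invert_for_temperature(wavelength_m, radiance, lo_k=200.0, hi_k=2000.0):
        """Solve planck_radiance(wavelength_m, T) = radiance by bisection."""
        for _ in range(100):
            mid = 0.5 * (lo_k + hi_k)
            if planck_radiance(wavelength_m, mid) < radiance:
                lo_k = mid
            else:
                hi_k = mid
        return 0.5 * (lo_k + hi_k)

    # A 500 C (773.15 K) target viewed at 3.9 um with emissivity 0.85, while the
    # instrument assumes a blackbody and ignores the 25 C background.
    wl, true_t, eps, bg = 3.9e-6, 773.15, 0.85, 298.15
    indicated_t = invert_for_temperature(wl, measured_radiance(wl, true_t, eps, bg))
    print(f"true {true_t:.2f} K, indicated {indicated_t:.2f} K, "
          f"error {indicated_t - true_t:+.2f} K")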

Biases In Mass Measurement Caused By Altitude
Author(s): Mark Fritz
Abstract/Introduction:
Here in Denver, we deal with the effects of altitude on our measurement process on a daily basis. In the field of mass measurement, values are expressed in two ways: the first is the mass in a vacuum, often referred to as true mass; the second is apparent mass, also called conventional mass or mass in air. It is not practical to weigh in a vacuum, and air density varies with altitude and weather. Therefore, apparent mass is, by convention, reported as the result of weighing a mass standard with a density of 8.0 g/cm3 in 20 C air with a density of 1.2 mg/cm3. The problem with this is that mass standards are not exactly 8.0 g/cm3 and, at altitude, air is not 1.2 mg/cm3. If corrections are not applied, the measurement will have a bias. In this paper I will show the effects of uncorrected measurements.
Document ID: 46CFEE45
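
The bias the abstract describes can be illustrated with the standard air-buoyancy relation for a two-weight comparison. The sketch below assumes a hypothetical 1 kg comparison between an 8.0 g/cm3 standard and a 7.84 g/cm3 test weight, and a Denver air density of roughly 1.0 mg/cm3; these specific values are illustrative, not taken from the paper.

    # Buoyancy effect on a two-weight comparison (illustrative values only).
    RHO_CONV_AIR = 1.2e-3   # g/cm^3, conventional reference air density
    RHO_CONV_STD = 8.0      # g/cm^3, conventional reference weight density

    def mass_from_comparison(m_std_g, rho_std, rho_unknown, rho_air):
        """True mass of an unknown that balances a standard of true mass
        m_std_g in air of density rho_air (buoyancy on both sides included)."""
        return m_std_g * (1.0 - rho_air / rho_std) / (1.0 - rho_air / rho_unknown)

    # A 1 kg standard of density 8.0 g/cm^3 balances a 1 kg test weight of
    # density 7.84 g/cm^3.  Compare the result using an assumed Denver air
    # density of ~1.0 mg/cm^3 with the result using the conventional 1.2 mg/cm^3.
    m_local = mass_from_comparison(1000.0, RHO_CONV_STD, 7.84, 1.0e-3)
    m_conv = mass_from_comparison(1000.0, RHO_CONV_STD, 7.84, RHO_CONV_AIR)
    print(f"bias from assuming 1.2 mg/cm^3 air: {(m_conv - m_local) * 1000.0:+.3f} mg")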

With And Without Sensitivity Weights
Author(s): Emil Hazarian
Abstract/Introduction:
The paper describes the possibility of employing methods of mass calibration without using sensitivity weights with regard to the traditional equal-arm balance, the one-pan optical-mechanical balance, and new electronic balances. The possibility of accelerating the mass calibration process without significant effect on reported uncertainty is also explored as an option, based on accumulated data as well as new experiments. The old direct-observation method is offered as an attractive alternative due to the short time involved and therefore lower calibration cost.
Document ID: 0DF0A2C9

A Measurement Assurance Program (MAP) Using Sonic Nozzles
Author(s): Richard W. Caron, Charles L. Britton
Abstract/Introduction:
Ford Motor Company is involved in accurately measuring the air flow to internal combustion engines. In this endeavor, Ford has constructed many different flow test stands, and to be assured that each flow test stand is measuring the air mass flow rate correctly, a Measurement Assurance Program (MAP) has been implemented.
Document ID: 1D5A1BCA
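
The abstract does not detail the nozzle equations, but the ideal-gas critical-flow relation sketched below indicates why sonic nozzles are attractive MAP transfer standards: once the flow is choked, the mass flow depends only on throat area, upstream pressure and temperature, and a discharge coefficient. The 10 mm throat, 200 kPa, 293.15 K, and 0.995 discharge coefficient are illustrative assumptions; real MAP work uses real-gas critical flow factors and calibrated coefficients.

    # Ideal-gas critical (choked) nozzle relation (illustrative values only).
    import math

    GAMMA = 1.4        # ratio of specific heats for air
    R_AIR = 287.05     # specific gas constant for air, J/(kg K)

    def critical_flow_function(gamma=GAMMA):
        return math.sqrt(gamma * (2.0 / (gamma + 1.0)) ** ((gamma + 1.0) / (gamma - 1.0)))

    def sonic_nozzle_mass_flow(throat_area_m2, p0_pa, t0_k, discharge_coeff=0.995):
        """Choked mass flow rate in kg/s under the ideal-gas approximation."""
        return (discharge_coeff * throat_area_m2 * p0_pa *
                critical_flow_function() / math.sqrt(R_AIR * t0_k))

    # Example: a 10 mm diameter throat at 200 kPa and 293.15 K upstream.
    area = math.pi * (0.010 / 2.0) ** 2
    print(f"{sonic_nozzle_mass_flow(area, 200e3, 293.15):.4f} kg/s")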

Developments In High Bandwidth Power Amplifier Technology For Compact, Cost-Effective Calibrator Applications
Author(s): Paul C. A. Roberts
Abstract/Introduction:
The workload of a modern calibration laboratory has placed ever greater demands on the calibration source, both in terms of functionality and the combination of voltage, current and frequency required to calibrate today's diverse range of instruments.
Document ID: 0639085C

An Introduction To Magnetic Metrology At The Navy Primary Standards Laboratory
Author(s): Donald W. Matson
Abstract/Introduction:
Magnetics is an area of metrology that is not often encountered. This paper will give a very brief description of some of the more frequently encountered magnetic units. It then describes the intrinsic standard (nuclear magnetic resonance) for this area of metrology and the work being carried out at the United States Navy Primary Standards Laboratory.
Document ID: 01242478
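
As a pointer to why nuclear magnetic resonance serves as an intrinsic standard, the short sketch below converts a measured proton precession frequency to flux density. The gyromagnetic ratio used is an approximate textbook value, not one taken from the paper.

    # NMR as an intrinsic flux-density standard: the proton precession
    # frequency is proportional to the field.  The gyromagnetic ratio here is
    # approximate; precision work uses the current CODATA figure.
    GAMMA_P_MHZ_PER_T = 42.577      # proton gyromagnetic ratio / 2*pi, approx.

    def flux_density_from_nmr(freq_mhz):
        """Magnetic flux density in tesla from a proton NMR frequency in MHz."""
        return freq_mhz / GAMMA_P_MHZ_PER_T

    # A probe reporting 21.29 MHz corresponds to roughly 0.5 T.
    print(f"{flux_density_from_nmr(21.29):.4f} T")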

Cost-Effective Traceability For Oscilloscope Calibration
Author(s): Peter B. Crisp
Abstract/Introduction:
The widespread adoption of ISO 9000 has brought an increased awareness of the need for traceable oscilloscope calibration. However, in-depth knowledge of the traceability requirements has tended to lie mainly with the oscilloscope manufacturers rather than with calibration companies or instrument users.
Document ID: AE3A0423

Design And Development Of An Integrated Oscilloscope Calibrator
Author(s): Tzafrir Tee Sheffer
Abstract/Introduction:
Integrated oscilloscope calibrators have been a long time in coming. Some of the complicating design and development issues associated with the creation of a full-featured oscilloscope calibrator, such as the Fluke 5500-SC option, are discussed. Maintaining an economical solution while providing a fully programmable instrument was the challenge. Features such as common signal and trigger ports for all functions, and programmable amplitude and frequency sources, added to the complexity of the design. A unique architecture as well as some high-quality, high-bandwidth switches, attenuators, and power sensors are described. The implemented solutions are discussed down to the block-diagram level.
Document ID: 1174155F

Uncertainty Procedure For NIST Surface Finish And Microform Calibrations
Author(s): J.F. Song, T.V. Vorburger
Abstract/Introduction:
An uncertainty procedure is used for reporting the NIST surface finish and microform calibration uncertainties. The combined standard uncertainty is a combination of the uncertainty from the geometric non-uniformity of the measured surface, and the uncertainty from the measuring system, which includes the uncertainties from the calibration and check standards, instruments, environment, measuring setup, and from the long-term variations. Theoretical and experimental methods for testing instrument repeatability and noise, stylus tip size, filter and nonlinearity are described. The relationships between the combined standard uncertainty and the calibration traceability, repeatability, and reproducibility are also discussed.
Document ID: 7FDC964B
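
The combination of components described in the abstract follows the usual root-sum-square rule for uncorrelated standard uncertainties. The sketch below is a generic illustration of that rule; the nanometre-level component values are invented for the example and are not NIST figures.

    # Root-sum-square combination of uncorrelated standard uncertainties
    # (component values invented for illustration).
    import math

    def combined_standard_uncertainty(components_nm):
        return math.sqrt(sum(u * u for u in components_nm))

    surface_nonuniformity = 1.2          # nm, geometric non-uniformity term
    measuring_system = combined_standard_uncertainty(
        [0.8,    # calibration and check standards
         0.5,    # instrument repeatability and noise
         0.3,    # environment and measuring setup
         0.4])   # long-term variation
    u_c = combined_standard_uncertainty([surface_nonuniformity, measuring_system])
    print(f"u(system) = {measuring_system:.2f} nm, combined u_c = {u_c:.2f} nm")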

Calibration Data Management: Collecting And Reporting Quality Information
Author(s): Nicholas B. Mason
Abstract/Introduction:
This paper is about collecting and reporting quality information using computer software in the calibration lab. This task could be accomplished with paper and pencil, but the advantages of automating data collection and reporting have been previously documented.
Document ID: FFF0298B

Automated Product Traceability
Author(s): Phillip T. Chase
Abstract/Introduction:
For years, the requirement to report significant out-of-tolerance conditions back to the customer has been an inviolate part of the calibration process. Current standards such as ANSI/NCSL Z540 require that the customer be notified promptly in writing whenever there is doubt as to the validity of either the calibration equipment itself or the customer's measuring/test equipment. While this approach fulfills the responsibilities of the calibration laboratory, the corrective action process necessitates that the customer keep detailed records of the use of each test instrument against each product tested in order to trace back possible risk situations.
Document ID: 3C67205B

Realization Of ITS-90 From 273.15 K Through 1234.93 K: One Company's Approach
Author(s): Xumo Li, Steve Iman, Mike Hirst, Mingjian Zhao
Abstract/Introduction:
All of the fixed points of the International Temperature Scale of 1990 (ITS-90) in the range from 273.15 K to 1234.93 K have been realized in specially-designed, permanently- sealed cells. The purity of all of the metals used was 99.9999% or greater. Many improvements in design and techniques used were made in order to get the highest possible accuracy. Since it is important that all of the fixed points be traceable to NIST (National Institute of Standards and Technology), these fixed points were compared with NIST data by using two NIST-calibrated standard platinum resistance thermometers (SPRTs).
Document ID: 56449AF5

Information Technology And Its Impact On Calibration Measurement Data Analysis And Control
Author(s): Kevin Lench
Abstract/Introduction:
Calibration Laboratories are experiencing increasing demand to document, analyse and manage calibration data. Can the utilisation of information technology improve and simplify this process?
Document ID: 534E7778

Documenting Calibrators Extend The Reach Of Metrology Professionals
Author(s): Richard Pirret
Abstract/Introduction:
Metrology professionals have witnessed an increased demand for documented, traceable calibrations, driven by quality standards and government regulations. In many cases, those regulations are encouraging the rigorous calibration of a new category of workload, process control instrumentation. At the same time, metrologists have seen continued pressure to control expenses. An emerging class of tools, documenting calibrators, permits the metrology professional to extend the reach of metrology services to the world of process instrumentation. The documenting calibrator allows the metrologist to enlist the process instrumentation technician as a calibration technician to perform consistent, documented calibrations during the course of normal maintenance rounds.
Document ID: 6C3D2B6A

The Effect Of Offset Or Intercept On The Treatment Of Calibration Data
Author(s): Frank E. Jones
Abstract/Introduction:
In the treatment of calibration data for measurement devices such as turbine flow meters and anemometers, the dependent variable (frequency, pulse rate, or other output) is conventionally divided by the independent variable (flowrate, airspeed, or other quantity) to arrive at a calibration factor that can be used to infer values of the independent variable from values of the dependent variable. For example, for a turbine flow meter the dependent variable is f, the frequency or pulse rate, and the independent variable is Q, the volume flowrate through the meter. Conventionally, f/Q, the K factor of the meter, is determined.
Document ID: 79E007B4
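
The consequence of an offset can be shown with a toy linear meter response: dividing f = a + bQ by Q turns a constant intercept a into a flow-dependent K factor. The slope and intercept in the sketch below are arbitrary illustrative numbers.

    # A constant offset makes the conventional K factor flow-dependent
    # (slope and intercept are arbitrary illustrative numbers).
    def frequency(q_flow, slope=100.0, intercept=2.0):
        """Hypothetical linear meter response f = intercept + slope * Q."""
        return intercept + slope * q_flow

    for q in (0.5, 1.0, 5.0, 10.0):        # flowrates, arbitrary units
        k_factor = frequency(q) / q         # conventional K = f / Q
        print(f"Q = {q:5.1f}   K = f/Q = {k_factor:8.3f}")
    # K drifts from 104.0 toward 100.2 as Q increases: dividing by Q turns the
    # constant intercept into a flow-dependent calibration factor.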

An Efficient Calculation For Indirect Measurements And A New Approach To The Theory Of Indirect Measurements
Author(s): S. G. Rabinovich
Abstract/Introduction:
An indirect measurement is a measurement in which the value of an unknown quantity (a measurand) is estimated using measurements of other quantities (arguments) related to the measurand by a known functional dependency. This paper addresses the basic problems inherent in indirect measurements. It demonstrates that the traditional approach to estimating a measurand is biased when the dependency between the measurand and any relevant arguments is non-linear. In addition, the calculation of the variance of this estimate of a measurand requires the calculation of a correlation coefficient, which entails many well-known problems. This paper describes an alternative technique, the Method of Reduction, which is free from the above-mentioned deficiencies.
Document ID: B2A2308E
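
The following sketch contrasts the two estimates discussed in the abstract for a simple non-linear dependency (a ratio of two arguments). The data are invented, and the Method of Reduction step follows the general description given above: reduce each simultaneous set of argument readings to a value of the measurand, then average those values.

    # Traditional estimate vs. Method of Reduction for y = x1 / x2
    # (invented data, for illustration only).
    import statistics

    def f(x1, x2):
        return x1 / x2   # a simple non-linear dependency of two arguments

    # Simultaneous readings of the two arguments.
    x1_readings = [9.8, 10.2, 10.1, 9.9, 10.0]
    x2_readings = [5.1, 4.9, 5.2, 4.8, 5.0]

    # Traditional approach: evaluate f at the argument means.
    y_traditional = f(statistics.mean(x1_readings), statistics.mean(x2_readings))

    # Method of Reduction: reduce each simultaneous pair to a value of the
    # measurand, then treat those values as direct observations of y.
    y_rows = [f(a, b) for a, b in zip(x1_readings, x2_readings)]
    y_reduction = statistics.mean(y_rows)
    std_error = statistics.stdev(y_rows) / len(y_rows) ** 0.5

    print(f"traditional estimate : {y_traditional:.4f}")
    print(f"method of reduction  : {y_reduction:.4f} +/- {std_error:.4f}")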

Laser Thread Measurement System
Author(s): Glen M. Castore
Abstract/Introduction:
A method for making dimensional measurements of screw thread geometry using laser triangulation technology is described. The introduction of this type of technology for dimensional measurements of threads creates a need for evolution in the dimensional standards for thread form. An assessment is given of the advancements in measurement methods for screw thread geometry which can be expected to enter the market over the next several years.
Document ID: 4E8D3C0C

Development Of 6,453.20 Ohm And 100 Ohm Resistance Standards For The Transfer Of The Quantum Hall Resistance Value To Decade Resistance Values
Author(s): Armen C. Grossenbacher
Abstract/Introduction:
Reference (3) states that the U.S. representation of the ohm is based on the quantized Hall resistance (QHR) of 6,453.20175 ohms. This odd value of the QHR has to be scaled, or transferred, to 1 ohm in order to be used to calibrate the U.S. legal ohm. The currently used method of scaling requires the use of resistance standards of 6,453.2 ohms and 100 ohms. This method provides the capability to perform the calibration of the legal U.S. unit of resistance with a combined uncertainty of 20 ppb.
Document ID: E4597C6D
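
A short note on the arithmetic behind the quoted value: 6,453.20175 ohms is the i = 4 quantized Hall plateau, R_K-90 / 4. The sketch below shows that arithmetic and the nominal 6,453.2:100 and 100:1 step-down ratios; it does not reproduce the ratio-bridge procedure used in the actual transfer.

    # Arithmetic behind the quoted QHR value and the nominal scaling ratios.
    R_K90 = 25812.807            # conventional von Klitzing constant, ohms
    qhr_i4 = R_K90 / 4           # i = 4 quantized Hall plateau
    print(f"R_K-90 / 4 = {qhr_i4:.5f} ohm")        # 6453.20175 ohm

    # Nominal ratios in the 6,453.2 ohm -> 100 ohm -> 1 ohm step-down:
    print(f"6453.2 / 100 = {6453.2 / 100.0:.4f}")
    print(f"100 / 1      = {100.0 / 1.0:.1f}")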

The Oak Ridge Metrology Center Resistance Standards At The Y-12 Plant
Author(s): Ross Endsley
Abstract/Introduction:
This paper is a description of the renovation and upgrading of a resistance measurement station in a standards laboratory. The laboratory has a long history in many measurement disciplines, and the historical records on resistance standards have provided a good foundation for the future. Innovative use is made of an unusual mix of stable resistance standards. Measurements are automated using personal computer-based graphical interface software. Statistical analysis is applied to the data to provide uncertainty estimates and to improve confidence. Proven ratio techniques avoid dependency on instrument accuracy.
Document ID: 49664D0E

Statistical Manufacturing Control (SMC) And Process Control Studies (PCS) In The Electronic Card Assembly And Test (ECAT)
Author(s): Moin Ansari
Abstract/Introduction:
Statistics has been in existence since time immemorial. Since the backbone of U.S. manufacturing was in Detroit, the early implementation of Statistical Process Control (SPC) was conducted in the automobile industry. Most of the literature on SPC is for variable data as it is applied in the heavily mechanized automobile industry or by its suppliers. In many cases the cost of measuring variable data does not justify the application of SPC in the electronics industries.
Document ID: 41961DAA

An Update Of The 11 Year History Of A 35 mm Piston-Cylinder Used As Reference Standard For Pressure
Author(s): Martin Girard, Pierre Delajoud, Michael Bair
Abstract/Introduction:
In the early 1980s, 10 cm piston-cylinders were developed for piston gauges used as primary pressure standards in the range of 10 kPa to 1 MPa (1.5 to 150 psi). In 1988, we reported on the characterization and history of one of these piston-cylinders used as a reference standard. Since then, the piston-cylinder has repeatedly been intercompared with standards at NIST and France's LNE. Most recently, a comparison was made with a newly developed 20 cm piston-cylinder with very low uncertainty derived from NIST dimensional measurements. Analysis of the eleven-year history, ending with recent, significantly reduced uncertainties, allows the very long term stability of such a piston-cylinder to be evaluated and leads to interesting conclusions on the effectiveness of piston-cylinder dimensional measurement and previous uncertainty estimates.
Document ID: 7B20CFC8
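
For readers unfamiliar with piston gauges, the basic relation below shows why the effective area, and hence dimensional measurement of the piston-cylinder, dominates the uncertainty discussion: pressure is force over effective area. Buoyancy, thermal-expansion, and distortion corrections are omitted, and the numerical values are illustrative rather than taken from the paper.

    # Basic piston-gauge relation (corrections omitted; values illustrative).
    G_LOCAL = 9.80620            # assumed local gravity, m/s^2

    def piston_gauge_pressure(total_mass_kg, effective_area_m2):
        """Pressure generated by loaded masses on a piston of given effective area."""
        return total_mass_kg * G_LOCAL / effective_area_m2

    # An illustrative piston-cylinder with a 10 cm^2 (1.0e-3 m^2) effective
    # area loaded with 10.2 kg generates roughly 100 kPa; a relative error in
    # the effective area maps one-to-one into the generated pressure.
    print(f"{piston_gauge_pressure(10.2, 1.0e-3):.1f} Pa")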

Computer Timekeeping
Author(s): Michael A. Lombardi
Abstract/Introduction:
The personal computer revolution that began in the 1970s created a huge new group of time and frequency users: those people who need to keep computer clocks on time. As you probably know, computer clocks aren't particularly good at keeping time. Simple clocks like your wristwatch and most of the clocks in your home usually keep better time than a computer clock.
Document ID: 842DA0C8

Closing The Loop On Embedded Software Testing
Author(s): Alex Dorchak
Abstract/Introduction:
In a modern T&M instrument, increased functionality and accuracy requirements are mirrored internally by more computing power and more sophisticated state sequencing. Relays, switches, and other software controllable devices, such as DACs, must be transitioned in a precise order to a precise state to reach the desired instrument function, range, and value. The testing of this internal sequencing has become extraordinarily complex.
Document ID: 8144B4BF

Effective Arm's Length Automated Calibration
Author(s): Peter Dack
Abstract/Introduction:
The task of providing a fully traceable calibration service world-wide to support a broad range of multifunction calibrators can be expensive and difficult to implement. A portable multifunction transfer standard designed by Wavetek has proved to be an effective tool for automating the calibration process. The system accommodates automated calibration procedures that enable inexperienced operators to simulate the calibration routines recommended by the manufacturer.
Document ID: B36F3D46

An Efficient Method For Measuring The Density (Or Volume) Of Similar Objects
Author(s): Randall M. Schoonover
Abstract/Introduction:
Presented here, with supporting data, is a gravimetric method to determine the density of nearly identical objects. The method readily lends itself to automation, and because it utilizes a fluorocarbon liquid in place of water, bubble formation is not a problem. The use of an artifact density standard eliminates the expensive air density instrumentation and water temperature measurement associated with the use of water as the density standard. A precision of 1 part per million was achieved for repeated volume determinations of a stainless steel 1 kilogram laboratory mass standard.
Document ID: BE01FDBA
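
The hydrostatic-weighing relations underlying a method of this kind can be sketched as follows, ignoring air buoyancy on the balance readings for brevity. The liquid density, reference-artifact volume, and balance readings in the example are invented for illustration; they are not data from the paper.

    # Hydrostatic weighing with an artifact density standard (air buoyancy on
    # the balance readings ignored; all numbers invented for illustration).

    def liquid_density(m_std_g, v_std_cm3, m_std_in_liquid_g):
        """Liquid density from a standard of known mass and volume: the
        buoyant loss of weight equals rho_liquid * V_standard."""
        return (m_std_g - m_std_in_liquid_g) / v_std_cm3

    def object_volume(m_obj_g, m_obj_in_liquid_g, rho_liquid):
        """Volume of the unknown from its own buoyant loss of weight."""
        return (m_obj_g - m_obj_in_liquid_g) / rho_liquid

    # A 1 kg steel density standard (volume 125.000 cm^3) and a nearly
    # identical unknown weighed in the same fluorocarbon bath.
    rho_liq = liquid_density(1000.000, 125.000, 775.000)    # about 1.8 g/cm^3
    v_x = object_volume(1000.020, 774.900, rho_liq)
    print(f"liquid density  : {rho_liq:.5f} g/cm^3")
    print(f"unknown volume  : {v_x:.4f} cm^3")
    print(f"unknown density : {1000.020 / v_x:.5f} g/cm^3")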

