
Platinum Metals Rev., 1989, 33, (2), 55

Cryogenic Temperature Measurement

A Thin Film Rhodium-Iron Temperature Sensor

  • By B. W. A. Ricketson
  • Consultant to Oxford Instruments Limited, Eynsham, Oxford

Article Synopsis

Applications of the rhodium-iron resistance thermometer over the last two decades have demonstrated that this sensor has the widest temperature range known; it has been used between 0.01 and 800 K, nearly five orders of magnitude. Recent developments have resulted in the production of a small planar device, which is finding many uses in cryogenic instrumentation.

Platinum is the material most generally used for the construction of resistance thermometers. At liquid helium temperatures, however, its resistance is very low: the resistance of a sensor rated 25 ohms at room temperature falls to around 0.01 ohms at 4.2 K, so specialised and expensive equipment is required to measure it to sufficient accuracy. Pure rhodium has a resistance-temperature curve similar to that of platinum, but when a small quantity of iron is dissolved in the rhodium an extra temperature dependent resistance appears below approximately 40 K. This resistance is associated with certain dilute magnetic alloys and is related to the Kondo effect. There is also a temperature independent resistance that is always present when another element is added to a pure metal.
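The contrast between the two materials can be sketched numerically. The toy model below is purely illustrative (all coefficients are invented to give roughly the magnitudes quoted in the text, not calibration data): a pure metal's resistance collapses to a tiny residual value at liquid helium temperatures, while the extra temperature dependent term in the dilute alloy keeps dR/dT usable where the phonon contribution has vanished.

```python
# Illustrative toy model of the two resistance-temperature curves.
# Coefficients are invented to reproduce rough magnitudes from the text
# (a 25 ohm platinum sensor falling to ~0.01 ohm at 4.2 K; a ~100 ohm
# rhodium-iron wire unit at the ice point, ~5 ohms at helium temperatures).

def r_pure_metal(t_k):
    """Pure metal (platinum-like): tiny residual resistance plus a
    phonon term that vanishes rapidly as the temperature falls."""
    return 0.01 + 25.0 * (t_k / 273.15) ** 3

def r_rhodium_iron(t_k):
    """Dilute Rh-Fe alloy: alloying residual, phonon term, plus an
    extra temperature dependent contribution below ~40 K that keeps
    dR/dT finite where the phonon term has died away."""
    kondo_like = 0.8 * min(t_k, 40.0)
    return 1.0 + 62.0 * (t_k / 273.15) ** 1.2 + kondo_like

def sensitivity(r_func, t_k, dt=0.01):
    """Numerical dR/dT by forward difference."""
    return (r_func(t_k + dt) - r_func(t_k)) / dt
```

With these invented numbers, dR/dT at 4.2 K is of order 1 ohm per kelvin for the alloy but only tens of micro-ohms per kelvin for the pure metal, which is why the alloy remains usable with ordinary resistance-measuring equipment.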

Professor B. R. Coles at Imperial College first found the anomalous resistance in the 1960s (1). At the beginning of the 1970s, R. L. Rusby at the National Physical Laboratory (N.P.L.) systematically examined the various possible iron concentrations from the point of view of thermometry between 0.5 and 30 K (2, 3). He chose an alloy with 0.5 atomic per cent iron as a good compromise between the temperature dependent and independent resistances; the resistance of this alloy more than doubles over this temperature range. Sensors made from freely suspended coils of rhodium-iron wire were encapsulated in a platinum sheath filled with helium gas. The thermometer had a resistance of about 100 ohms at the ice point and 5 ohms at liquid helium temperatures (4). Finding physical effects which change appreciably with temperature is relatively easy compared with finding an effect that is reproducible after handling, temperature cycling and long term storage. The reproducibility of the rhodium-iron thermometer has proved to be excellent, giving a stability over 15 years of around 0.2 mK. Sensors in this form are used by N.P.L. to hold the Échelle Provisoire de Température (EPT-76), which is the temperature scale at present specified for international use from 0.5 to 27 K, in the same manner as the International Practical Temperature Scale (IPTS-1968) is used from 13 K upwards.
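In use, such a calibrated sensor converts a measured resistance into a temperature through its individual calibration table. The sketch below shows the basic interpolation step; the table entries are invented round numbers in the spirit of the wire thermometer's figures (about 100 ohms at the ice point, 5 ohms at helium temperatures), not real calibration data.

```python
import bisect

# Hypothetical calibration table of (resistance_ohm, temperature_K)
# pairs, monotone increasing in resistance; a real sensor carries an
# individually measured table.
CAL = [(5.0, 4.2), (7.0, 10.0), (12.0, 30.0), (40.0, 100.0), (100.0, 273.15)]

def temperature_from_resistance(r_ohm):
    """Piecewise-linear interpolation of the calibration table,
    clamped to its end points."""
    resistances = [r for r, _ in CAL]
    i = bisect.bisect_left(resistances, r_ohm)
    if i == 0:
        return CAL[0][1]
    if i == len(CAL):
        return CAL[-1][1]
    (r0, t0), (r1, t1) = CAL[i - 1], CAL[i]
    return t0 + (t1 - t0) * (r_ohm - r0) / (r1 - r0)
```

Real calibrations use smoother fitted functions over many more points; linear interpolation is simply the shortest faithful sketch of the resistance-to-temperature step.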

Cryogenic machines often work in a high vacuum to ensure that heat conducted from the surroundings is kept to a minimum. Sensors, as small as possible, are mounted at the required positions, and electrical leads are wound around thermal heat sinks to ensure that no extraneous heat reaches the sensor. In 1980, a 27 ohm ceramic sensor was introduced having a 20 mm length and 3.2 mm diameter (5), smaller than the sensors discussed above. This sensor is usable from 0.5 to 800 K, and is in consequence useful for ultra-high vacuum equipment where the temperature of the apparatus is raised to 600 K for degassing, before being cooled to perhaps 4 K. It is stable to 10 mK and has therefore been installed in many low temperature industrial processes. To make a higher resistance unit from wire without increasing the size would be very difficult, so in 1986 an experimental programme was initiated to deposit a planar film of rhodium-iron alloy on a substrate.

The Planar Sensor

Quite a few attempts have been made over the last five years to produce a thermometer as a thin layer of rhodium-iron. The two greatest difficulties encountered were the production of a defect-free, almost single crystal layer, and the deposition of an alloy free from appreciable impurities, as both defects and impurities give rise to a large temperature independent resistance.

The deposition process was carried out by Dr. J. E. Evetts’ group in the Department of Materials Science and Metallurgy at Cambridge University, and the experimental units were calibrated by Cryogenic Calibrations Ltd. (6). Sapphire was chosen as the substrate as it is a good thermal conductor at low temperatures and provides a good matching lattice for the rhodium-iron. Ultra-high vacuum getter sputtering was used, with precise control of the argon gas pressure, power input and substrate temperature. Films from 0.1 to 1 micrometre thick and containing various percentages of iron were deposited. The resulting units were calibrated from 1.5 to 300 K, or subjected to a large number of temperature cycles between 4.2 K and room temperature. The sensitivity (dR/dT) of the 0.1 micrometre thick film at 4.2 K was low compared with the value of the resistance at this temperature, indicating a high surface scattering, but reproducibility under thermal cycling was very good. The 0.3 micrometre film had better sensitivity, but the 0.5 micrometre layer matched the behaviour of wire in both sensitivity and resistance; the former is shown in Figure 1. At this thickness there is strain in the layer, but this can be removed by cycling the device; about 20 to 30 cycles are required to reduce the change in resistance below that which would represent 1 mK. The overall reduction in resistance resulting from the removal of stress is equivalent to about 30 to 50 mK at a temperature of 4.2 K. The wire in the 27 ohm sensor mentioned above also behaves in this way, except that the overall change is about 5 mK and 5 cycles are sufficient to stabilise the unit, as shown by the data in Figure 2.
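The 1 mK stabilisation criterion is a resistance drift divided by the local sensitivity: ΔT ≈ ΔR / (dR/dT). A minimal helper makes the conversion explicit; the sensitivity figure used in the example is the roughly 1 ohm per kelvin quoted later for the planar sensor.

```python
def equivalent_temperature_error(delta_r_ohm, sensitivity_ohm_per_k):
    """Convert a resistance drift into the temperature error it mimics:
    delta_T = delta_R / (dR/dT)."""
    return delta_r_ohm / sensitivity_ohm_per_k

# With ~1 ohm/K sensitivity, a drift of 0.001 ohm between thermal
# cycles corresponds to 1 mK, the stabilisation criterion in the text.
drift_mk = equivalent_temperature_error(0.001, 1.0) * 1000.0
```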

Fig. 1

The normalised sensitivity, dW/dT (where W = R/R273), of films of different thicknesses and compositions varies as a function of temperature, but follows the same general pattern as that of a wire wound sensor. The 0.1 μm thick film had a very large temperature independent resistance

—— wire

—.— 0.1 μm film with reduced iron content

–..– 0.3 μm Rh-0.5% Fe

- - - - 1 μm film with reduced iron content

- — - 1 μm Rh-0.5% Fe

....... 0.5 μm Rh-0.5% Fe


Fig. 2

The resistance of rhodium-iron wire and a 0.5 μm thick film varies significantly with temperature cycling from room temperature to 4.2 K, but a 0.1 μm layer is largely unchanged


This research has allowed the design of a sensor with a resistance of 273 ohms at the ice-point, and 18 to 24 ohms at 4.2 K, see Figure 3. The sapphire substrate is 8 mm long and 2.5 mm wide, a size suitable for mounting in a protective copper can 3.2 mm in diameter. Exact control of thickness and line width is difficult, and therefore small links provided in the resistor path may be broken to trim the resistor to the required ice point resistance.
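The trimming step can be pictured as a simple selection problem: each breakable link, when opened, adds a fixed increment of resistance to the path, and links are broken until the ice-point target is reached. The greedy sketch below is a hypothetical illustration of the idea only; the link values, the binary weighting and the selection rule are invented, as the article does not describe the actual layout.

```python
def choose_links(r_initial, r_target, link_increments):
    """Greedily break links, largest increment first, accepting each
    one only while the running resistance stays at or below the target.
    Returns the trimmed resistance and the increments actually used.
    (Hypothetical scheme; the real trimming procedure is not described.)"""
    r = r_initial
    broken = []
    for inc in sorted(link_increments, reverse=True):
        if r + inc <= r_target:
            r += inc
            broken.append(inc)
    return r, broken

# Invented example: an as-deposited film of 265 ohms trimmed towards
# the 273 ohm ice-point target with binary-weighted link increments.
trimmed, used = choose_links(265.0, 273.0, [8.0, 4.0, 2.0, 1.0, 0.5])
```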

Fig. 3

The Oxford Instruments rhodium-iron resistance thermometer is mounted on a sapphire substrate 8 mm long × 2.5 mm wide. It is a four terminal device, the leads being soldered to the pads. The sensor has a resistance of 273 ohms at the ice-point and 18 to 24 ohms at 4.2 K. This ensures a sensitivity of about 1 ohm per degree at both 4.2 and 273 K. The scale shows 0.5 mm divisions


When the unit is mounted on a copper surface with a low temperature adhesive, heat dissipation is substantially improved; it shows excellent self-heating characteristics, about ten times better than the 27 ohm ceramic sensor, although not as good as the standard helium-filled units. The choice of a high resistance gives a high voltage for a given self-heating.
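The closing point follows from P = V²/R: for a fixed permissible self-heating power P dissipated in the sensor, the available measurement voltage is V = √(PR), so the 273 ohm planar element gives roughly √10 times the signal of the 27 ohm unit for the same heat load. A one-line sketch, with an assumed illustrative power budget:

```python
import math

def signal_voltage(power_w, resistance_ohm):
    """Measurement voltage across a sensor dissipating a fixed
    self-heating power P: V = sqrt(P * R)."""
    return math.sqrt(power_w * resistance_ohm)

p = 1e-7  # assumed 0.1 microwatt self-heating budget (illustrative)
v_ceramic = signal_voltage(p, 27.0)   # 27 ohm ceramic sensor
v_planar = signal_voltage(p, 273.0)   # planar sensor at the ice point
```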

This thin film rhodium-iron thermometer is now finding approval with customers as a simple stick-on sensor (5).


References

  1. B. R. Coles, Phys. Lett., 1964, 8, (4), 243
  2. R. L. Rusby, Platinum Metals Rev., 1981, 25, (2), 57
  3. R. L. Rusby, in “Temperature, Its Measurement and Control in Science and Industry”, ed. J. F. Schooley, American Inst. of Physics, New York, 1982, 5, p. 829
  4. Made by H. Tinsley & Co., 61 Imperial Way, Croydon CR0 4RR
  5. Available from Oxford Instruments Ltd., Eynsham, Oxford, OX8 1TL
  6. “Resistance Measurements on Rhodium-Iron Thin Films”, Z. H. Barber, J. E. Evetts, R. E. Somekh, B. W. Ricketson and J. A. Good, in “Thermal and Temperature Measurement in Science and Industry”, Inst. of Measurement and Control, London, 1987, p. 149. This work was funded by Cryogenic Calibrations Ltd., Cryogenic Consultants Ltd. and a grant from the S.E.R.C.
