CHAPTER FIVE: 1965 - ANOTHER CAREER CHANGE

Atomic Energy of Canada Ltd. (AECL)
By 1964 the integration of the Army, Navy and Air Force into the "Canadian Armed Forces", with the hated green uniform, had started to happen. It was the personal crusade of the Defence Minister in the Pearson government, Paul Hellyer. The military fought it tooth and nail because it threatened a way of life and a lot of empires. I had some sympathy for their point of view; it meant, for example, the end of the autonomous RCAF with its distinctive uniforms, ranks and traditions inherited from the RAF. The Group Captains would henceforth become Colonels, the Squadron Leaders would be Majors and the Flight Lieutenants would be Captains. As far as I was concerned this was one more nail in the coffin of Canadian independence, because it was an obvious case of "harmonising" our air force with the American hierarchical structure, presumably to make us fit in even more seamlessly (and even more invisibly) with NORAD (North American Air Defence), which was (and still is) essentially an American organisation, with Canada as the Bell-Boy.

One of the consequences of this was that the military took care to consolidate its position in every way it could, and one of the ways open to it was to assert every iota of authority that it had with respect to the day-to-day aspects of departmental operations. Inevitably this included projects that I was involved in. I began to resent having some Major or Lt. Colonel automatically appointed as "project manager" for projects which were the result of ideas that I had dreamed up, and about which they knew little or nothing. Of course they took credit for any successes, which became brownie points in their promotion stakes.

Until that point my career had given me immense satisfaction, but I decided to follow the old adage "Quit while you're ahead", and began casting round for something else. I made an application to Atomic Energy of Canada Limited (AECL), Commercial Products Division, in reply to a job advertisement for a development physicist to work on the peaceful uses of atomic energy. Specifically, they were looking for someone with a background in electronics to design and develop radiation measurement instrumentation. I went for an interview with the leader of the development group, a chap about my own age by the name of Ron Tolmie. At that point the group consisted of himself, another physicist and two technicians. He showed me what they were doing, which looked very interesting. It was obvious to me that I would have a lot of learning to do to get up to speed in this field. I knew less than nothing about nuclear radiation measurement techniques and said so, but Tolmie did not seem too concerned about that.

I was a bit leery about leaving the field of weapons testing (in which by now I had built up a solid international reputation), not to mention the very congenial environment that I worked in. It was impossible to know if the AECL group would flower and prosper, or if I would be able to "convert" to this new field quickly enough to pull my weight in a reasonable time. I was also a little concerned about the emphasis on revenue generation. I had never worked for a private company with a bottom line objective, and although AECL was something called a "Crown Corporation" (similar to a state-owned utility), it was pretty clear to me that the bottom line was a major factor in its modus operandi. After much soul searching I decided that either way I was in a bit of a trap. The present happy and fruitful arrangements were under obvious threat from the new order, and it would only be a matter of time before all civilian scientists and engineers in the Department of National Defence were reduced to the role of support staff for the military. The only alternative was to cast my bread on the waters and do a bit of risk-taking. I decided to go for it and accepted the position with AECL effective February 1965.

The building which AECL (Commercial Products) occupied was in a newly created government complex named "Tunney's Pasture", close to the very scenic Ottawa River. There were no cows in the pasture, only squat buildings designed by cut-price government contractors to maximise the number of bureaucrats per square foot. The AECL building was a bit different from these, and was actually quite light and modern by comparison with the gloomy and over-built nineteenth century printing bureau where I had spent the last four years. The office I was given was tiny in terms of square footage, but had large windows and modern office furniture. In fact the whole place had a purposeful and real-world orientation that the National Defence environment lacked, something which had never occurred to me before.

Nuclear radiation - some basics
As I had anticipated, I had a lot of learning to do on the science of nuclear radiation measurement. Because it is a major part of the next piece of this story, it is worth sketching out the fundamentals of what this radiation is and how it has been put to very good use in a variety of industrial and medical applications. The term means what it says: radiation coming from the atomic nucleus. There is an uncanny similarity between the solar system, with all the planets orbiting around the sun, and a single atom of matter. The atom has a nucleus analogous to the sun, with electrons orbiting around it like the planets. The scale is also similar, in that there is lots and lots of empty space between the nucleus and the orbiting electrons. Each one of the ninety-two naturally occurring elements is like a unique solar system, with a different number of electrons orbiting around the nucleus.

If one of the electrons far away from the nucleus (like the planets Pluto or Neptune in our solar system) gets knocked out of the atom, then a minuscule "quantum" of radiation is released as visible light. The light will have a characteristic colour relating to which electron it was. For example, sodium vapour lamps use an electric discharge to knock one of the electrons out of the atoms of sodium vapour, giving the sickly yellow colour which makes us all look as if we have jaundice. Mercury vapour lamps aren't much better, with the greenish hue which is characteristic of the visible light from the element mercury. It takes progressively more and more energy to knock out electrons that are closer to the nucleus of an atom, and the colour of the radiation that is released progresses more and more toward the blue end of the visible spectrum and finally moves right out of it altogether. The part of the radiation spectrum immediately above the visible is the ultra-violet or "UV".

In all cases the radiation released is itself a form of energy called "electromagnetic radiation". Its manifestations cover all the familiar forms, from radio waves, to microwaves (the ones that unfreeze the TV dinner), to infrared and visible light, with its range of colours enriching the lives of all of us lucky enough not to be blind or colour blind. The visible portion of the electromagnetic spectrum is just a tiny band in the whole range. The low end of the spectrum has low energy radiation; the old "longwave" radio transmissions, for example, are in this category. The microwave portion by contrast is many orders of magnitude more energetic. By the time we get to the UV band, immediately above the visible portion, the radiation has enough clout that it can dislodge electrons from any atoms that it intercepts.

When radiation has enough energy to exceed this threshold it is referred to as "ionising radiation" (creating "ions" by separating electrons from their parent atoms). UV is a weak ionising radiation, meaning that it cannot penetrate very far. Nevertheless it can damage the outer layer of body tissue that it can penetrate, the skin, by rearranging the individual atoms in it. This is why there is concern about the thinning of the ozone layer, which provides a shield against the ionising UV radiation from the sun. Nobody knows exactly how ionising radiation causes cancer, but there is enough statistical evidence to show that it does. Damage to the DNA genetic codes carried by cells is almost certainly involved, causing them to pass on faulty blueprints for the copying process whereby cells are replicated.

As the energy of the radiation released by knocking electrons out of atomic orbits increases, it moves out of the UV and into the portion of the spectrum labelled "X-radiation". X-rays have enough energy to pass right through soft matter but are absorbed by denser material - like bones for example, which is why they are so useful in medical applications. A chemical element can actually be identified by the energy or "colour" of the X-rays that it emits when bombarded by radiation with enough energy to knock the inner electrons of its atoms out of their orbits. This has spawned a whole science known as "X-ray Fluorescence", a technique for identifying the elements in unknown samples. Examples include the identification of fake copies of paintings by the old masters. A beam of radiation is directed onto a tiny corner, allowing the elements in the paint to be identified. Certain elements that were used for pigments in the Renaissance period were known not to have been used later on, and vice-versa, making it easy to detect the fakes. This technique is "non-destructive", since it causes no damage to the substance being irradiated, whereas a chemical analysis would require chipping off a piece of the paint.

So much for what happens when the electrons in an atom are dislodged. It is much more difficult to break the nucleus apart, the "sun" of the atom. The fundamental particles that are good at doing this are called "neutrons", and they can only be generated with a lot of very high tech equipment. There is one naturally occurring element, however, in which the atoms have such large nuclei that they break apart all by themselves at random intervals, like popcorn being cooked in a pot. That element is Uranium, and each time this happens neutrons are released. The principle of "controlled nuclear fission" is to assemble a large enough mass of uranium in a single piece so that the "popcorn" effect produces enough neutrons to break apart atoms that were not ready to "pop" on their own.

These additional pops produce still more neutrons in a multiplier effect, and the whole thing escalates to produce a self-sustaining "chain reaction", releasing useful energy (nuclear energy) in the process. If the chain reaction is controlled, then you have a nuclear reactor producing heat, which can then be used to power steam turbines. The controlling is done by having material between the uranium fuel rods which absorbs excess neutrons, thereby ensuring that the popcorn multiplier effect is vigorous - but not catastrophic. If on the other hand the reaction is uncontrolled, then there is the monumental release of energy of every conceivable form that levelled Hiroshima and Nagasaki in 1945.
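For readers who like to see the arithmetic, the multiplier effect reduces to a single number: how many neutrons, on average, each generation produces per neutron in the one before. Here is a minimal sketch in a modern programming language (Python); the multiplication factors and starting count are made up purely for illustration.

    # Minimal sketch of the "popcorn multiplier" effect: each generation of
    # neutrons produces k times as many neutrons as the one before. The
    # values of k below are illustrative, not reactor physics data.
    def neutron_population(k, generations, start=1000):
        """Neutron count after each generation, for multiplication factor k."""
        counts = [start]
        for _ in range(generations):
            counts.append(counts[-1] * k)
        return counts

    print(neutron_population(0.9, 5))  # k < 1: the reaction dies away
    print(neutron_population(1.0, 5))  # k = 1: controlled and self-sustaining
    print(neutron_population(1.5, 5))  # k > 1: runaway escalation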

When neutrons penetrate the nucleus of an atom all sorts of reactions can take place. One of the most useful ones causes the substance being irradiated to become radioactive and give off penetrating gamma radiation. The portion of the electromagnetic spectrum occupied by gamma rays overlaps with the X-ray band at the low end, but goes up to many hundreds of times the energies of X-rays. Like visible light and X-rays, gamma rays have a spectrum of "colours". A substance that has been irradiated by neutrons and made radioactive can be analysed for the elements it contains, by using equipment to detect the characteristic gamma rays that it then emits and sort them into a spectrum according to their "colours", or more accurately their energies, just as a prism would do for the colours in the visible spectrum. This method of analysis is known as "Neutron Activation" and, like X-ray fluorescence, it is non-destructive.

Substances that are "cooked up" in a nuclear reactor to make them radioactive do not stay that way for very long. The radioactivity dies away to insignificant levels in times ranging from a few seconds to hours, days and in some cases years. The time for the activity to decay to half of what it was at the beginning of the interval is known as the "half life". After one half life the activity is halved; after two it has dropped to half of that again, which is a quarter of the original amount, and so on. After, say, eight half lives the activity would be down to 1/256 of what it was.
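The rule is simple enough to restate in a couple of lines of code. This sketch (Python again, with illustrative numbers) just repeats the arithmetic of the paragraph above.

    # Half-life rule: after n half-lives the activity is 1/2**n of the
    # original, so after 8 half-lives it is down to 1/256.
    def remaining_activity(initial, elapsed, half_life):
        """Activity remaining after 'elapsed' time (same units as half_life)."""
        return initial * 0.5 ** (elapsed / half_life)

    print(remaining_activity(1000.0, 8.0, 1.0))   # 8 half-lives: 1000/256, about 3.9
    print(remaining_activity(1000.0, 10.0, 5.0))  # two half-lives: 250.0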

Many elements have "isotopes", which are chemically indistinguishable from the normal form of the element but have extra neutrons in their atomic nuclei. The famous "heavy water" is a case in point. Chemically there is no test that will tell it apart from ordinary water, but it is very slightly denser. When an element such as Cobalt, for example, is put into a reactor and irradiated with neutrons, some of the atomic nuclei capture extra neutrons, which creates a radioactive isotope of Cobalt. This is then referred to as a "radioisotope" of Cobalt. That particular one has a half life of about five years and is used extensively as the source of radiation in cancer treatment equipment.

The industrial and medical uses of radioisotopes are very wide ranging indeed. Radioactive fluids (having half lives of only a few hours) are used to sort out problems in paper mill digesters, for example, where there is a need to find out just how long the slurry remains in a particular processing tank before being completely replaced by incoming material. Detectors can be set up to monitor the movement of the radioactive fluid through the processing plant from outside the various digesters, holding tanks and so on, without interfering with the process. The same technique is used in hospitals to track the movement of body fluids, to aid in the diagnosis of problems that would be very difficult if not impossible to solve in any other way. Other applications involve gauges using radioisotopes to measure such things as the levels of dangerous chemicals in sealed tanks, or the thickness of ferromagnetic coatings on magnetic recording tape.

One intriguing application of X-ray fluorescence I remember was the on-line sorting of Canadian quarters minted before 1967. These had a relatively high silver content and, some years later, were worth more than their face value. An automated analysis system allowed each coin arriving from a hopper to be irradiated with gamma rays from a suitable radioisotope. A detector analysed the X-rays induced in the material of the coin and, if the silver X-rays exceeded a predetermined intensity, a mechanism was activated to tip that coin into a different bin from the others.
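The decision at the heart of such a sorter is easy to sketch in code. This is not the actual implementation (which was done in hardware); the threshold figure and the names here are hypothetical, purely to show the logic.

    # Hypothetical sketch of the coin sorter's decision logic: if the
    # silver X-ray intensity from a coin exceeds a preset threshold, the
    # coin is tipped into the "silver" bin.
    SILVER_THRESHOLD = 500  # detector counts; an illustrative calibration value

    def sort_coin(silver_xray_counts):
        """Choose a bin for a coin based on its measured silver X-ray intensity."""
        return "silver bin" if silver_xray_counts > SILVER_THRESHOLD else "ordinary bin"

    for counts in (120, 480, 950):
        print(counts, "->", sort_coin(counts))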

Tuning in to a new environment
Ron Tolmie was primarily a "concept" man. He was always on the lookout for the latest developments and kept everyone perched precariously on what is now known as the "leading edge" of technology. He was particularly interested in the application of digital techniques to solving some of the problems that AECL faced. One of these was the accurate positioning and collimation of the beams of radiation from the medical "teletherapy" machines, pioneered by AECL and used in the treatment of cancer. The basis of the treatment was to bombard a cancerous tumour with the gamma radiation from a radioactive source. In order to do this the source (the radioisotope Cobalt-60) had to be housed in a heavily shielded container, with a tunnel-like aperture (the collimator) through which a pencil-like beam of the radiation could be directed at the tumour, without harming the surrounding tissue.

The machine to do this consisted of a horizontal platform on which the patient lay, and a rotating head containing the heavily shielded radioactive source. The positioning and orientation of this massive structure had to be done with considerable precision to ensure that the radiation went exactly where it was supposed to. The collimator consisted of several motor driven wedges, made of lead, which controlled the shape and area of the beam. These control functions were done with conventional electric motors, which were stopped by limit-switches as the moving parts reached their prescribed positions.

Control systems based on this "analogue" principle had limited accuracy and required a lot of costly and complex mechanical logic (implemented with relays and other gadgetry) to prevent collisions between the various moving parts as they all moved to their prescribed positions. Tolmie came up with a scheme to do all of this using digital techniques and devices called "stepping motors", which was a totally novel concept. These stepping motors looked just like the common or garden electric motors that were already being used to do the job.

The difference was that instead of their shafts just spinning in response to the application of electrical power like ordinary motors, the shafts of these motors turned in precise one-degree increments in response to individual pulses applied to them. This in turn meant that the mechanisms they were driving would also move in precise increments. The whole control problem could then be handled by electronic logic, which would decide which motors should be activated and in what sequence. The business of preventing collisions between moving parts then became almost a trivial problem. Furthermore, the positions of all the moving parts would be known to a very high degree of accuracy, because electronic counters could easily keep track of the number of pulses fed to each motor.
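A toy example makes the bookkeeping concrete. The one-degree step comes from the description above; the class and its names are hypothetical, just to show how a pulse counter gives exact knowledge of position.

    # Sketch of digital position tracking with a stepping motor: every
    # pulse advances the shaft by a fixed increment, so a simple counter
    # always knows exactly where the mechanism is.
    class SteppingAxis:
        def __init__(self, degrees_per_step=1.0):
            self.degrees_per_step = degrees_per_step
            self.pulses = 0  # the electronic counter

        def step(self, n):
            """Send n pulses to the motor (negative n reverses direction)."""
            self.pulses += n

        @property
        def position(self):
            return self.pulses * self.degrees_per_step

    collimator_wedge = SteppingAxis()
    collimator_wedge.step(90)         # 90 pulses -> 90 degrees
    collimator_wedge.step(-15)        # back off 15 degrees
    print(collimator_wedge.position)  # 75.0, known exactly from the pulse count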

He had designed and constructed a working mock-up model for demonstration purposes which proved his point, but the senior management in the production side felt that it was too risky to incorporate electronic intelligence to replace the tried and true electro-mechanical analogue systems. It was a classic case of coming up with something which (to paraphrase Churchill) was "too much and too soon". Years later of course they finally accepted the idea, but by then digital controls were being incorporated on most new system designs involving moving parts, from sewing machines to space craft. In the interim they lost a lot of profit that would have accrued from the enormous simplification and the superior reliability of the design concept proposed by Ron Tolmie.

One of the things which made any sort of digital substitute so attractive and advantageous was the impact of the newly available integrated circuits. The very first ones were being delivered in quantity at about the time I arrived at AECL. Ron Tolmie was quick to pick up on this new development and ordered in some for evaluation. They were made by Fairchild, a company that had made its name originally as the "Fairchild Camera and Instrument Corporation", but which had now suddenly become a leading semiconductor house.

The first devices were logic gates, which allowed a single output to be controlled by several inputs. They replaced fifteen or twenty transistors and the same number of resistors with a single plastic button the size of an aspirin tablet, with eight pins sticking out of the bottom. They required a power supply of 3.6 volts, which seemed a little strange. There were standard families of these things which enabled designers to produce incredibly complicated logic functions on a single circuit board that would have required ten times the space using individual transistors and resistors. They were crude by today's standards and were implemented in something called "Resistor-Transistor Logic" (RTL). You had to be careful not to come up with a design that had the output of one chip connected to too many inputs of other ones, otherwise it would become overloaded and cause problems.
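That loading limit (what designers call "fan-out") is the kind of rule that is trivial to check mechanically. Here is a hypothetical sketch, with an illustrative limit of four inputs per output rather than anything from the Fairchild data sheets.

    # Sketch of a fan-out check: each gate output may only drive a limited
    # number of inputs before it is overloaded. The limit here is illustrative.
    MAX_FANOUT = 4

    def check_fanout(connections):
        """connections maps each gate output to the list of inputs it drives."""
        for output, inputs in connections.items():
            status = "overloaded!" if len(inputs) > MAX_FANOUT else "OK"
            print(f"{output} drives {len(inputs)} inputs: {status}")

    check_fanout({"gate_A": ["g1", "g2", "g3"],
                  "gate_B": ["g1", "g2", "g3", "g4", "g5"]})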

These early versions were quickly eclipsed by more sophisticated ones made by Texas Instruments, known as "Transistor-Transistor Logic" or "TTL". They came in little oblong packages with pins along either edge, giving them a startlingly insect-like appearance, and were of course immediately dubbed "bugs". It is interesting that the TTL family of logic required a five volt power supply, which quickly became the industry standard and has remained so ever since. Just recently however (1993), there has been a push to save power in battery operated processors by developing logic that can operate on 3.3 volts. Plus ça change...!

Computers circa 1965
The major impact of the new integrated circuits (ICs) was in reducing the size of computers from two or three six-foot racks to single boxes which could be moved around on a lab dolly. The market for scientific computing was dominated at that time by the Digital Equipment Corporation ("DEC"), based in Maynard, Massachusetts. The company had been formed (as so many companies had) by a group of bright young engineers from the Massachusetts Institute of Technology (MIT). They competed head-to-head with IBM in the scientific computing market and soon created a comfortable niche for themselves.

In those days, once a computer manufacturer managed to get a mainframe installed in a large institution, the orders for upgrades and additional installations would follow as surely as night followed day, because of the need to maintain compatibility across the board within the institution. DEC took full advantage of this, as did other manufacturers. The main AECL Nuclear Research Laboratories at Chalk River, about 60 miles north-west of Ottawa, were a "DEC shop", with a huge investment in the "PDP" (Programmed Data Processor) series of computers. It was not very long before Ron Tolmie decided that we too needed a computer, and a DEC PDP-9 was ordered. It was the last of the "discrete component" computers, with literally hundreds of little circuit boards, each one carrying twenty or thirty resistors and transistors.

I went on the only computer programming course that I have ever taken, to learn the basics of this relatively new science. At that time (early 1965) DEC was in its heyday, carpeting the world with its innovative and scientifically oriented products. The company had adopted a shrewd policy of buying up old water-powered mills in rural settings close to major markets. The strategy was that such buildings were invariably cheap to acquire, had the structural strength to house lots of fairly heavy equipment, and were in congenial settings well suited to the outdoorsy lifestyles of the young engineers who were so essential to the continued prosperity of a rapidly growing high-tech company.

The one they had chosen to serve the Ottawa market was in Carleton Place, a thirty five mile drive from the City. It was a setting right in line with the company philosophy: an old mill on a stream running through the little town, a traditional solid stone structure, complete with water-wheel, which (following some renovation) was a perfect and very picturesque habitat for their operations. The course was a week-long affair, after which I began to get the idea. The "assembly language" method of programming was primitive in the extreme, and the tools to do it (editors, assemblers and linkers) were poorly documented and incredibly tedious and unforgiving to use. In today's parlance it would probably be dubbed "User Hostile".

There were higher level languages like "FORTRAN", which allowed a programmer to write lines of code with statements that had some obvious meaning in the English language, but they were oriented toward data processing. We (and many others like us at that time) needed to use the newly available computers to control and acquire data from instruments which we ourselves had designed, and for that purpose the nuts-and-bolts assembly language programming was the only game in town. It is a very different story more than thirty years later, with personal computers in every laboratory, but even now, if it is imperative for some reason to wring the maximum speed and performance out of a computer, engineers and scientists still get back to basics and write the routines in assembly language.

To put things in perspective a few comparisons are in order. The PDP-9 machine which we bought consisted of three six-foot racks, one of which contained the 32 kilobytes of memory which was all it had. The standard method of loading and saving software was paper tape. The PDP-9 was very advanced in that it had an optical reader for the paper tape and also "DEC-tape", a magnetic tape system developed by DEC which could read and write in both directions. That tape system was the most innovative one that I have ever come across and was streets ahead of the pedantic seven-track IBM tape system which was (of course) the industry standard at the time. About a year later we decided to buy another 32 kilobytes of memory, and it arrived in a truck. What a difference thirty-odd years has made in that field. Now my notebook computer, on which this narrative is being written, has 4 megabytes of memory (more than a hundred times as much) on a little card about three inches square.

Radiation Detectors - how they work
I realise that yet another primer on some abstruse technical subject is not exactly going to keep any readers on the edges of their seats at this stage. Nevertheless, a short description of the gadgets that actually sense gamma and X-radiation is really necessary at this point, and is not at all difficult to follow.

In spite of the breath-taking advances in the acquisition and processing of all sorts of data in many fields (thanks to computers), there really has not been any great revolution in the sensors that record whatever the physical or chemical parameters may be. In most areas the same ones are being used (albeit with some improvement) that were in use forty years ago. In that time period, for example, the number of sensors capable of recording a "colour chart" (or more accurately a "spectrum") of gamma and X-radiation has doubled - from one, to two! The Grand-Daddy, which was invented in the late 1940's, is the "Scintillation Detector". It is a transparent crystal of Sodium Iodide, which is chemically similar to common salt (Sodium Chloride). When X-rays or gamma rays are absorbed by it, the individual atoms are ionised and tiny flashes of light or "scintillations" occur.


[Photo: The big scintillation detector has four photomultiplier tubes]
The intensities of these little flashes are proportional to the energies or "colours" of the individual gamma or X-rays which cause them. The flashes are recorded by a light-sensing device called a "photomultiplier tube", which converts them to electrical pulses. The amplitude of each pulse is thus a measure of the energy or colour of the individual gamma or X-ray that generated it in the first place. These detectors are packaged up with the photomultiplier tube attached to one end of a cylindrical crystal. A typical crystal would be three inches high by two inches in diameter. The photomultiplier tube to record the scintillations from such a crystal would be about five inches long, so that the whole package, in a light-tight metal cover, would be about three inches in diameter by perhaps eight inches long.

The "solid state detector" is the second of the two devices which can be used to record a radiation spectrum. It also provides electrical pulses according to radiation energy, but it is vastly superior to the scintillation detector in its ability to distinguish fine differences in energy. Unfortunately it has to be in a vacuum, which is bad enough, but even more inconvenient is the requirement that it must be at the temperature of liquid Nitrogen to operate. These are daunting requirements indeed, but because the performance is so stunning, in terms of being able to recognise the radiation from a great many radioisotopes in a sample of material, a great deal of effort has been put in over the years to make it an affordable device for most laboratories. In 1965 it was in the very early stages of development.

Until about 1975, solid state detectors had to be kept permanently at the temperature of liquid Nitrogen; if they were allowed to warm up they would be ruined. Liquid Nitrogen is available more or less on tap in most labs, so that is not as impossible as it sounds. The detector itself (about the size of one of those little tins of tomato paste) is inside an evacuated tube attached to a "cold finger" in the form of a long copper rod, which dips into a giant thermos flask holding typically 31 litres (about seven gallons) of liquid Nitrogen. That amount lasts for nearly a month, which is reasonable.


[Photo: An oscilloscope displays a blur of pulses from a detector; the brightness of the bands at different amplitudes is proportional to the radioactivities of the isotopes at the corresponding energies]
Electromagnetic radiation, including visible light, arrives in individual lumps ("photons" in physics lingo) at random intervals, like rain drops. The more intense the radiation, the faster the rain drops arrive. The photons from different radioisotopes all arrive thick and fast in any old order, generating individual electrical pulses in a detector. What is needed to make any sense of all this is some means of recording and sorting them all according to amplitude. Only then can the data be displayed to show the intensity from each radioisotope, and hence the concentration of the various elements in the sample.



[Figure: A gamma-ray spectrum showing the peaks for the radioisotopes of Potassium, Uranium and Thorium]
The sorting is done by a device called an Analogue-to-Digital Converter, the same sort of device that I had worked on in developing the small arms pressure measurement system. It sorts the pulses on the fly by assigning them digital numbers proportional to their amplitudes. At that time a range of one to 256 was adequate, and the associated digital electronics then recorded the tallies in a memory with 256 pigeon holes. The figure shown here is a chart or "spectrum" of the data in the memory. In this particular case it is the spectrum of the three radioisotopes which occur naturally in the environment: Potassium, Uranium and Thorium. Each one emits gamma radiation at several different energies, and these cause the peaks which are seen in the spectrum. The most prominent one is due to radioactive Potassium; the lesser ones are from Uranium and Thorium. They should be well resolved needle-like lines, but because the detector is much less than perfect they tend to be triangular and overlap quite a bit, making the analysis less than perfect as well.
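For the technically curious, the whole pulse-height sorting scheme fits in a few lines of code. This sketch (Python, with fabricated pulse data) mimics what the hardware did: the ADC assigns each pulse a channel number from 1 to 256 in proportion to its amplitude, and a 256-slot memory tallies the pulses in each channel. The full-scale voltage and the simulated pulses are invented for illustration.

    # Sketch of a 256-channel pulse-height analyser.
    import random

    CHANNELS = 256
    FULL_SCALE = 10.0  # volts; hypothetical maximum pulse amplitude

    def to_channel(amplitude):
        """The ADC step: map a pulse amplitude to a channel number (1..256)."""
        return min(int(amplitude / FULL_SCALE * CHANNELS) + 1, CHANNELS)

    spectrum = [0] * (CHANNELS + 1)  # slots 1..256 are the "pigeon holes"
    for _ in range(10000):
        pulse = max(0.0, random.gauss(4.0, 0.3))  # fake photopeak near channel 103
        spectrum[to_channel(pulse)] += 1

    peak = max(range(1, CHANNELS + 1), key=lambda c: spectrum[c])
    print("most populated channel:", peak)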