To celebrate the international year of chemistry, James Mitchell Crow looks back at some of the discoveries and developments made by chemists over the past six decades
1950s: radiocarbon dating
Where do we come from? What was life like for our ancestors? Such questions have long played on the minds of humans, and thanks to chemistry - and the action of cosmic rays - we now have a good part of the answer.
Shortly before the second world war, researchers studying Earth’s upper atmosphere made a discovery that would revolutionise our ability to peer into the past, allowing us to date ancient sites and artefacts produced by people as much as 50,000 years ago. Cosmic rays colliding with the top of the atmosphere were found to be generating neutrons, which react with nitrogen-14 to form carbon-14, ejecting a proton in the process. This produces a steady input of radioactive carbon into the biosphere, which decays with a half-life of about 5,600 years.
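In nuclear shorthand, the production step and the subsequent beta decay back to nitrogen-14 are:

$$ {}^{14}_{7}\mathrm{N} + \mathrm{n} \;\rightarrow\; {}^{14}_{6}\mathrm{C} + \mathrm{p} \qquad\qquad {}^{14}_{6}\mathrm{C} \;\rightarrow\; {}^{14}_{7}\mathrm{N} + \beta^{-} + \bar{\nu} $$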
It was Willard Libby and his colleagues at the University of Chicago, US, who turned this observation into a tool to accurately date objects from our distant past - a discovery that won Libby the 1960 Nobel prize in chemistry.
All organisms absorb carbon-14 during their lives, through photosynthesis or by consuming food, but stop absorbing any more at the point of death. From that point onwards, their radiocarbon component slowly falls as it decays back to nitrogen-14. Libby showed that this steady decline in carbon-14 reveals any organic object’s age.
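To make the arithmetic concrete, here is a minimal sketch of the decay-law calculation the technique builds on, using the approximate 5,600-year half-life quoted above (in practice, labs correct this raw figure with calibration curves):

```python
import math

HALF_LIFE_YEARS = 5_600  # approximate carbon-14 half-life quoted above

def radiocarbon_age(fraction_remaining: float) -> float:
    """Raw radiocarbon age from the fraction of carbon-14 still present.

    Rearranges the decay law N(t) = N0 * (1/2)**(t / half_life)
    to t = half_life * log2(N0 / N).
    """
    return HALF_LIFE_YEARS * math.log2(1.0 / fraction_remaining)

print(radiocarbon_age(0.25))   # ~11,200 years: two half-lives have passed
print(radiocarbon_age(0.002))  # ~50,000 years: near the technique's limit
```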
During his work to establish the technique’s reliability, Libby discovered just how little firmly dated history there was to test it against. ‘The first shock we had was when our advisors informed us that history extended back only 5,000 years,’ Libby recalled during his Nobel acceptance speech. ‘We had thought initially that we would be able to get samples all along the curve back to 30,000 years. In fact, it is at about the time of the First Dynasty in Egypt (around 3100 BC) that the first historical date of any real certainty has been established.’
The laborious early techniques for measuring the amount of radiocarbon in a sample required literally counting the slow decay of its radiocarbon atoms with a Geiger counter. ‘Today, using accelerator mass spectrometry [AMS] we actually measure the carbon-14 atoms individually, and not the decay that comes off of them, so we can now measure much older samples in a much shorter amount of time - instead of months, something in the order of tens of minutes,’ says Stewart Fallon, head of the radiocarbon dating laboratory at the Australian National University in Canberra. AMS needs so little material that specific compounds can be chromatographically extracted from a sample and dated individually.
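Fallon’s point about speed is easy to see with a rough back-of-envelope sketch. The milligram sample size and the modern carbon-14 to carbon-12 ratio of about 1.2 × 10⁻¹² below are illustrative assumptions, not Fallon’s figures:

```python
import math

AVOGADRO = 6.022e23
C14_RATIO = 1.2e-12                    # approx. modern 14C/12C ratio (assumed)
HALF_LIFE_S = 5_600 * 365.25 * 86_400  # half-life in seconds

sample_g = 0.001  # a milligram of modern carbon, typical of an AMS sample
carbon_atoms = (sample_g / 12.0) * AVOGADRO
c14_atoms = carbon_atoms * C14_RATIO

decays_per_hour = c14_atoms * math.log(2) / HALF_LIFE_S * 3600

print(f"carbon-14 atoms in the sample: {c14_atoms:.1e}")       # ~6e7 atoms
print(f"expected decays per hour:      {decays_per_hour:.2f}")  # under one
```

Waiting for a statistically useful number of decays from tens of millions of atoms that produce less than one count an hour takes weeks to months; counting those atoms directly is what lets AMS do the job in minutes.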
Thanks to Libby’s work, history now extends far further back in time. The technique has also proved to be useful in fields outside archaeology. Fallon uses it to study the uptake of radiocarbon by giant corals in the sea since the burst of carbon-14 released into our atmosphere during the era of nuclear weapons testing. This data gives an accurate picture of ocean circulation over the past 50 years, information that can be used to test ocean current models, which are being developed in part to help predict the impact of climate change.
1960s: silicon chips
It is hard to imagine life without the now-ubiquitous silicon chip. Chemists might not be the first band of researchers to get the credit for its development, but in fact many chemical breakthroughs were needed before the electrical engineers could take over to produce silicon-based integrated circuits.
‘Many of the initial developments in the field were made by chemists learning about silicon and its material properties,’ says Ali Javey, who researches nanomaterials for technological applications at the University of California, Berkeley, US. From finding ways to make ultra-pure semiconductor-grade silicon, to doping the material with atoms of other elements to tune its semiconducting properties, chemists initially played a leading role. ‘Developing reactions to chemically etch the silicon and ways to pattern the integrated circuit - that’s all chemistry,’ Javey adds.
It was in the 1960s that all these developments finally came together to produce the first practical silicon chips. In 1961, US company Texas Instruments, one of the pioneers of integrated circuit silicon technology, demonstrated their ‘Molecular Electronic Computer’ to the US Air Force, which was ‘150 times smaller and 48 times lighter’ than a conventional computer of the time with the same processing power. The technology really took off thanks to interest from the US military and the space race. Some of the earliest chips were used in the first guidance computers to be fitted inside missiles, and for the Apollo moon missions.
In the intervening years, silicon became the domain of engineers rather than scientists. But today the input of chemists is becoming crucial again, says Javey. ‘All of a sudden, the dimensions of silicon chips have come down to a size that chemists have been working at for decades - at the level of macromolecules. So chemists have become involved again, developing new nanoscale materials, fabrication techniques, and doping with control down to a few atomic layers.’
This is one of the areas where Javey himself works. ‘We need to be able to place dopants within 1-2nm thick layers. Conventional techniques are no good for that.’
However, it could be in fundamental materials chemistry that the next big chip technology breakthrough comes. ‘The coming direction - where chemists are playing a major role - is in developing new types of architecture based on flexible substrates rather than on rigid silicon wafers,’ says Javey.
1970s: the statins
In 1971, Japanese pharmaceutical chemist Akira Endo set himself an ambitious challenge: to identify compounds that could lower cholesterol levels in the blood - a recently identified risk factor for coronary heart disease. Forty years later, tens of millions of people have benefited from the family of drugs that Endo discovered, known as the statins. In the future, many millions more could be taking these medicines, as their use increasingly shifts from treatment to prevention.
‘I think the discovery of the statins is comparable in importance to the discovery of antibiotics,’ says Nicholas Wald, director of the Wolfson Institute of Preventive Medicine at Queen Mary University of London, UK. ‘The story of penicillin is well known - everyone has heard of Alexander Fleming. But not of Akira Endo - he’s an unsung hero of pharmaceutical chemistry.’
In 1968, Endo returned to Japan to work for the Sankyo pharmaceutical company, after a two-year postdoctoral stint in the US. While abroad he had learned about cholesterol biosynthesis, and the link between high cholesterol levels and coronary heart disease. Today, cardiovascular disease is the leading cause of death worldwide, in countries rich and poor, but at the time it was primarily a disease of the western world.
Cholesterol is taken up from the diet, but can also be produced in the liver if dietary sources are lacking. Endo hypothesised that, rather than blocking cholesterol uptake from food, the best way to reduce cholesterol levels in the blood would be to inhibit the body’s cholesterol-producing pathway.
Endo started by looking in microbes, reasoning that they might have evolved metabolites to block steroid biosynthesis for self-defence. Two years and 6,000 microbes later, Endo had a hit - the fungus Penicillium citrinum was producing a compound, dubbed ML-236B, which blocked lipid synthesis. That compound is now known as mevastatin.
By the end of the decade, the first clinical trials of the drug were underway - by which time other medicinal chemistry groups were also working in the area. In 1987, lovastatin, developed by scientists at Merck and Co, became the first statin to receive regulatory approval by the US Food and Drug Administration (FDA). Pfizer’s Lipitor (atorvastatin) has become the biggest selling pharmaceutical of all time.
‘The challenge now is moving statins from treatment to prevention,’ says Wald, who in 2000 along with colleague Malcolm Law first proposed the idea of a cardiovascular ‘polypill’, a combination of cheap, safe and effective drugs proven to lower cholesterol and blood pressure.
Currently, statins are prescribed based on various risk factors. But Wald argues that reaching the age of 55 is sufficient risk alone, and is about to start a polypill trial to test the theory. ‘The problem is that beyond middle age, everyone’s blood pressure and cholesterol are high - certainly higher than when that person was 20. That increase is the driver behind the high rates of stroke and heart disease.’
Whatever the next step in the statin story, it is clear that Akira Endo deserves greater recognition for his leading role in its development.
1980s: the polymerase chain reaction (PCR)
As with many of the great scientific discoveries, there’s a story behind the discovery of the polymerase chain reaction (PCR). Kary Mullis was driving through California, US, late one Friday evening in April 1983 when the technique that became PCR first occurred to him. Ten years later, Mullis was awarded the Nobel prize in chemistry for his discovery.
At the time of PCR’s invention, much of the work involving DNA was rather a messy business, with researchers trying to pick out the small amount of DNA of interest from the mass of strands isolated from any biological sample. As he drove along Route 128, Mullis wasn’t pondering a way to solve this fundamental problem, but a way to identify an unknown base next to a known sequence. However, he suddenly realised that he had hit upon something rather bigger.
‘I had solved the most annoying problems in DNA chemistry in a single lightning bolt,’ Mullis recalled during his Nobel speech. ‘With two oligonucleotides, DNA polymerase, and the four nucleoside triphosphates I could make as much of a DNA sequence as I wanted. Somehow, I thought, it had to be an illusion. Otherwise it would change DNA chemistry forever.’
PCR relies on the fact that, thanks to base-pairing, each strand of double-stranded DNA can act as a template for a new copy of its partner - and, as Mullis discovered, it also relies on temperature cycling. It took until December 1983 for Mullis to achieve the first successful reaction. Raising the temperature separates the two strands; on cooling, the oligonucleotides added to the mixture anneal to each strand and act as primers, which DNA polymerase enzymes extend into new full-length complementary strands.
Every time this cycle takes place, the amount of DNA doubles. After 30 cycles, that first piece of DNA has become more than a billion copies.
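A two-line sketch of that idealised doubling (real cycles fall somewhat short of 100 per cent efficiency):

```python
def copies_after(cycles: int, initial_copies: int = 1) -> int:
    """DNA copies after a given number of ideal PCR doubling cycles."""
    return initial_copies * 2 ** cycles

for n in (10, 20, 30):
    print(f"{n} cycles: {copies_after(n):,} copies")
# 30 cycles: 1,073,741,824 - the billion or so copies mentioned above
```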
The impact of the discovery of PCR on DNA research is hard to overstate. As Carl-Ivar Brändén of the Royal Swedish Academy of Sciences said when presenting Mullis the Nobel prize: ‘Cloning and sequencing of genes as well as site-directed mutagenesis have been facilitated and made more efficient. Genetic and evolutionary relationships are easily studied, even from ancient fossils containing only fragments of DNA. Biotechnology applications of PCR are numerous. In addition to being an indispensable research tool in drug design, the PCR method is now used in diagnosis of viral and bacterial infections including HIV. The method is so sensitive that it is used in forensic medicine to analyse the DNA content of a drop of blood or a strand of hair.’
1990s: antiretroviral drugs
The sudden emergence of HIV-Aids in the early 1980s, and its shocking subsequent global spread, is perhaps the greatest challenge ever to have faced medicinal and biological chemists. In the intervening 30 years, the Joint United Nations Programme on HIV-Aids (UNAIDS) estimates that more than 60 million people have been infected with HIV and nearly 30 million people have died as a result of this global pandemic.
However, the number of people dying from the disease is now falling. In 2009 there were 1.8 million Aids-related deaths, down from 2.1 million in 2004, according to UNAIDS. That fall is in no small part a result of the now-broad range of drugs that have been developed.
HIV has proven vulnerable to drug treatment at several points in its lifecycle. As a retrovirus, HIV carries its genetic information as RNA rather than DNA. To reproduce, the virus transcribes its RNA into DNA, a process involving a viral enzyme called reverse transcriptase. This is the enzyme targeted by the first successful HIV-Aids drug, azidothymidine (AZT). In 1987, AZT became the first drug approved by the US Food and Drug Administration to extend the lives of Aids patients, and in 1990 the drug was approved to delay the development of HIV into Aids.
However, it is in blocking a later step in the HIV lifecycle that chemical scientists have played a particularly important role.1 Once the viral DNA has been produced by the reverse transcriptase, it is incorporated into the DNA of the host cell, where it gets copied. When the infected cell transcribes and translates this DNA into long polyproteins, an enzyme called a protease chops them up to release the active viral proteins, which go on to form new virions ready to infect more cells.
The drugs developed to inhibit the protease are one of the great success stories of rational drug design. HIV protease was validated as a drug target in the mid-1980s, and by the end of the decade x-ray crystal structures of the enzyme had been published by several groups (see Chemistry World, November 2009, p50).
By studying the structure and mechanism of the enzyme, researchers could design drugs to block it. HIV protease works by inserting water molecules into particular amide groups along the polyprotein backbone, which cleaves the chain at those points to release the functioning viral proteins. The drugs that successfully target this enzyme mimic the tetrahedral intermediate of the hydrolysis reaction, and are designed not to be cleaved in two by the enzyme, so they remain lodged in the active site. For example, saquinavir, which in 1995 became the first HIV protease inhibitor to be approved by the FDA, sports a hydroxyl group in place of the amide usually cleaved by the protease.
These drugs have been particularly successful because their emergence led to the rise of drug combination treatments known as highly active antiretroviral therapy (HAART). These cocktails allow doctors to hit HIV at multiple points in its lifecycle at once, and have dramatically extended the life expectancy of HIV-positive people.
‘HIV protease inhibitors have had wide benefits for those who can access them,’ says Andrew Abell, who studies enzyme inhibitors at the University of Adelaide in Australia. Abell began working on HIV protease inhibitors in the early 1990s, and worked on the problem as a visiting scientist at SmithKline Beecham Pharmaceuticals (now part of GSK) in Philadelphia, US.
‘A huge amount of research effort went into developing HIV protease inhibitors - it’s a real chemistry success story,’ says Abell. The benefits continue to be felt widely today, he adds. ‘The HIV protease inhibitor work really set the scene for chemists to interact with other scientific disciplines to quickly drive forward medicinal chemistry projects.’
The chemist’s job in the area isn’t done, he says. ‘Retroviruses develop resistance to drugs, so we always need to develop new ones to stay one step ahead.’
2000s: thin film solar cells
As alternative energy rose inexorably up the political agenda throughout the 2000s, so did the performance of a new breed of cheap, efficient solar cells that could help meet the demand - if the price is right.
‘Solar cells are all about cost,’ says Ali Javey, who studies light-harvesting nanomaterials at the University of California, Berkeley, US. ‘Back in the 1950s, the first cells cost around $1,700 (£1,060 at today’s exchange rates) per watt. Today we’re at about $3 per watt, and it’s continuing to fall.’
The US government, which invested heavily in solar technology research as part of its economic stimulus package, predicts that solar energy could reach grid parity with fossil fuel generated electricity by 2015. Subsequent improvements should see the costs fall still further.
These developments are in no small part the result of emerging thin film technologies, which in 2009 dropped below the cost of traditional silicon panels for the first time. ‘Today, the installed cost of a crystalline silicon solar panel is $5 per peak watt. For thin film cadmium telluride it is $4.50,’ says Javey.
The breakthrough for cadmium telluride cells has been finding ways to grow them uniformly, reliably and reproducibly over large areas at low cost. The US firm First Solar, for example, has developed a solution-based process for large scale manufacture of these cells.
However, other thin film technologies aren’t far behind. Amorphous silicon, organic dye-sensitised cells, and copper indium selenide (CIS) or copper indium gallium selenide (CIGS) cells are also beginning to be commercialised. In some cases the first products might be rather niche, but that’s just the beginning, says Udo Bach, who researches dye-sensitised solar cells at Monash University in Melbourne, Australia.
‘People like me who work in this area wouldn’t put in all the effort if they thought that all we could make is a solar-powered backpack that could charge a mobile phone. We have something much bigger in mind, and I definitely see the potential for these organic solar cells to be applied in large scale solar farms,’ says Bach.
Javey is also looking ahead. ‘We’re trying to grow single crystal materials using a cheap process on low cost substrates,’ he says. ‘We start with a cheap aluminium foil, and anodise it, which generates an array of pores in the surface. We use these pores like little test tubes to grow single crystals in. The test tube confines the growth of the crystal, so acts as a template.’ The resulting material is an array of light-absorbing nanopillars, all single crystals.
‘It is very difficult to say which technology will make it in 10 or 20 years’ time - which is why we have to work on all of them,’ says Bach. ‘That’s why it’s such an exciting area to work in,’ says Javey. ‘One major technological advance could change the story completely.’
References
1 E De Clercq, Rev. Med. Virol., 2009, 19, 287