
pharma-cology

Monday, December 17, 2007

Screening’s age of insecurity
High-throughput screening hasn’t given us more drugs. Can better data management make a difference?

[Photo: Tony Fernandez]

Researchers at today’s biotechnology and pharmaceutical companies should be brimming with confidence. The data are fruitful and multiplying. Many crucial target systems have been well characterized, and druglike behavior is better understood. Besides this knowledge, we have the technology—the robust assays, automation, and detection methods—necessary to move from targets, to hits, to leads, to drugs.

Yet the pace of research remains frenetic, and the competitive landscape cutthroat. That’s because organizations have little to show for all of the innovations of the 1990s, at least in terms of new chemical entities. High-throughput screening (HTS), in particular, has not lived up to the expectations that greeted its initial adoption in the first half of the decade. HTS has matured as a field, but lead attrition rates remain high, and little time has been shaved off R&D pipelines. Screeners seem confident in their programs’ ability to generate data. But they are less sure about whether they and their organizations are doing the best things with data generated.

“What we’ve entered now is an age of insecurity,” says Peter Hecht, senior vice president of discovery research operations at Tripos. “We have more than ever to choose from in terms of targets, compounds, and options for what we should be doing next. But we’re discovering that all the data around us isn’t making the answers come any easier.”

Maturation and industrialization: In the beginning, there was screening. Then came the 96-well plate and the concept of increasing throughput. The appeal was obvious: Why screen just a few compounds in an assay when you could screen whole plates at a time, particularly when a convenient new technique called combinatorial chemistry was letting you synthesize new compounds faster than ever before? By the mid-1990s, most pharmaceutical and biotechnology organizations had initiated HTS programs, and it was not long before the “ultra” modifier was added to the acronym.

Yet while screening’s modifiers have become increasingly superlative, emphasizing both speed and miniaturization, today’s screeners often reject these labels. “We don’t have a badge that says, ‘We are high throughput,’” points out Mike Snowden, head of the molecular discovery department at Glaxo Wellcome.

Snowden credits HTS’s active “conference culture” with perpetuating the notion that the technique is somehow set apart. In practice, Snowden and others say screening is screening, scaled according to the project at hand and balanced on a continuum with respect to quality and throughput. Says Bob Burrier, senior director of biochemical technologies at GelTex Pharmaceuticals, “We chose to focus on developing an infrastructure that would support our various projects, rather than a platform for a particular volume of screening. We have the ability to screen 100,000 data points a day, but we only do it when the project requires it.”

Such considerations demonstrate that screening has come of age—an industrial age in which high-throughput techniques are no longer adopted faddishly but applied as just one more tool in the biologist’s research repertoire. Much of screening’s maturation can be credited to experience, which has revealed the technique’s strengths as well as its limitations. But technological advances have also played an important role. Factors influencing screening’s maturation and industrialization include the following:

screening scientists having more experience managing the logistics of scaling assays to a high-throughput environment;

access to more sensitive detection devices, particularly the wide range of fluorescence techniques that are now commonplace even in primary screening;

standardization of the 96- and 384-well plate formats, which can now accommodate most assays. Rather than pursuing further miniaturization, most screening groups are looking at pooling or high-content screening as methods for increasing throughput;

better-engineered automation, with more options available both in terms of workstations and fully integrated automated systems; and

the emergence of software systems tailored specifically to the needs of screening scientists and their organizations.
Managing data, then versus now: Screening’s industrialization, particularly the availability of robust data management tools, has freed screeners to adopt a more systematic, process-oriented mind-set toward the technique. “If you’d asked me five years ago to name my crucial data management need, I’d have answered, ‘Just get the data into Excel, please,’” says Snowden. “Today, getting data out of the readers, processing it for percent inhibition, and updating Oracle tables are of little interest to anyone because we can all do it, ad nauseam. It’s what you do with the data in your database, how you mine it—that’s where the competitive advantage is.”
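The routine processing step Snowden describes—turning raw reader signals into percent inhibition—can be sketched as follows. This is a hypothetical illustration, not any vendor’s pipeline; the control-based normalization shown is just one common convention, with high (uninhibited) and low (fully inhibited) control wells on each plate.

```python
# Hypothetical sketch: convert a raw plate-reader signal to percent
# inhibition using the plate's high and low control wells.
# The normalization convention is an assumption, not a specific product's.

def percent_inhibition(signal, high_ctrl_mean, low_ctrl_mean):
    """Percent inhibition relative to plate controls.

    0% = signal at the uninhibited (high) control level;
    100% = signal at the fully inhibited (low) control level.
    """
    window = high_ctrl_mean - low_ctrl_mean
    if window == 0:
        raise ValueError("control window is zero; assay has no signal")
    return 100.0 * (high_ctrl_mean - signal) / window

# Example: high control reads 1000 counts, low control 100 counts.
# A compound well reading 550 counts is halfway down the window.
print(percent_inhibition(550, 1000, 100))  # 50.0
```

In a production system this value would then be written to the corporate database (the “Oracle tables” of Snowden’s quote) alongside the compound and plate identifiers.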

The need to pause and think more carefully about data explains why increasing throughput has become less of a priority even as screening technology has made it easier to achieve. In the early days of screening, most assays generated single data points, such as a percent-inhibition value. Not surprisingly, higher-throughput techniques are at their most productive when generating data that can be easily interpreted. In fact, today’s data management systems are often customized to identify and automatically advance compounds from a screening run possessing a particular percent inhibition value.
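The automatic-advancement rule mentioned above is, at its core, a simple threshold filter. A minimal sketch, assuming an illustrative data layout and a 50% cutoff (both are assumptions, not a description of any real system):

```python
# Minimal sketch of rule-based hit advancement: wells whose percent
# inhibition clears a fixed cutoff are flagged for follow-up.
# The list-of-dicts layout and the 50% cutoff are illustrative only.

HIT_CUTOFF = 50.0  # percent inhibition required to advance

def pick_hits(wells, cutoff=HIT_CUTOFF):
    """Return well IDs whose percent inhibition meets the cutoff."""
    return [w["well"] for w in wells if w["pct_inhibition"] >= cutoff]

run = [
    {"well": "A01", "pct_inhibition": 12.3},
    {"well": "A02", "pct_inhibition": 87.5},
    {"well": "A03", "pct_inhibition": 55.0},
]
print(pick_hits(run))  # ['A02', 'A03']
```

Single-value rules like this are exactly why simple percent-inhibition assays scale so well—and why richer, multi-value data resists the same automation.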

But the high-throughput model works less well as the data get more complex—and complexity comes in many forms. Modern assay techniques, such as fluorescence and pooling, generate multiple data points for a well that must be resolved before scientists can act on the results. Cell-based assays and other high-content screens also tend to generate data that are not readily mappable back to a standard database unit. And then there are the logistical concerns arising from the focus on projects rather than particular screening platforms. Compounds stored on 96-well plates, for example, may be run in 384-well plates, requiring some way to map back to the original compounds before scientists can begin comparing plates or deciding which compounds show the most promise in an assay.
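The 96-to-384 mapping problem can be made concrete. One common convention is quadrant interleaving: four 96-well source plates are stamped into alternating rows and columns of a 384-well plate. The sketch below maps a 384-well address back to its source plate and well; the quadrant ordering is an assumption, since conventions vary between labs and liquid handlers.

```python
# Hypothetical back-mapping for quadrant-interleaved 384-well plates.
# Assumes source plate 1 fills odd-row/odd-col wells, plate 2 odd/even,
# plate 3 even/odd, plate 4 even/even -- one convention among several.

import string

def map_384_to_96(well_384):
    """Map e.g. 'B03' on a 384-well plate to (source_plate_index, well_96)."""
    row = string.ascii_uppercase.index(well_384[0])  # 0..15 for rows A..P
    col = int(well_384[1:]) - 1                      # 0..23 for columns 1..24
    quadrant = (row % 2) * 2 + (col % 2)             # which source plate (0..3)
    well_96 = f"{string.ascii_uppercase[row // 2]}{col // 2 + 1:02d}"
    return quadrant + 1, well_96

print(map_384_to_96("A01"))  # (1, 'A01')
print(map_384_to_96("B02"))  # (4, 'A01')
```

Without a mapping like this stored in the data management system, results from a 384-well run cannot be traced back to the compounds on the original 96-well storage plates.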

“As you generate more and more interrelated data points, you end up spending much more time interpreting that data,” explains Neil Carlson, a senior software engineer in the advanced discovery sciences group at Applied Biosystems. “The value of focusing purely on the volume of data produced drops as the amount of knowledge you gain from each data point increases.”
Screeners have confidence that they can generate data. They know they can put the results into a database, even those generated by newer detection devices or by more complicated assays. The bottlenecks—and the key opportunities for creative solutions to speed the process—now center on two key questions: Are the data produced by screening any good? And how can the good data be used to make better, faster decisions?
Online, unseen QC: The emphasis on quality and process control—on tracking HTS data in real time during the course of a project—is a direct outgrowth of screening’s industrialization. Not that quality is a new concern for screening scientists, who have always had tools available to help them spot problems with an assay. But screeners today don’t just want to weed out the junk; they want to stop it before it starts.

“A long time ago, when screening was a manual process, you could set up your 50 plates to run, get a cup of tea, wait for the reader to read the plates, and then sit down to work out whether the experiment had failed or not,” says Snowden. But today, Snowden and colleague Chris Molloy, head of HTS information technology (IT) and automation, note that online, immediate analysis of assay performance saves organizations critical time and money. Glaxo Wellcome’s online quality control system monitors and examines plates as they come off robots. Potential problems can be identified immediately and the entire operation shut down to make necessary corrections. It is a process that hearkens to a manufacturing assembly line rather than a research laboratory, and Molloy points out that such online, unseen data management has changed the screening work flow by removing tedious tasks. “Screeners today can spend time on more cerebral problem solving rather than wading through the mass of high-quality data that doesn’t need their attention,” Molloy says.
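The kind of online plate-quality check described above can be sketched with the Z′-factor, a widely used statistic that compares the separation of a plate’s control wells with their noise. Plates scoring below roughly 0.5 are commonly flagged for rework; the threshold and the control readings below are illustrative assumptions, not Glaxo Wellcome’s actual criteria.

```python
# Sketch of an online QC gate using the Z'-factor plate statistic:
# Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|.
# A plate with well-separated, tight controls scores close to 1.

from statistics import mean, stdev

def z_prime(pos_controls, neg_controls):
    """Z'-factor for one plate's positive and negative control wells."""
    return 1.0 - 3.0 * (stdev(pos_controls) + stdev(neg_controls)) / abs(
        mean(pos_controls) - mean(neg_controls)
    )

def plate_passes(pos_controls, neg_controls, threshold=0.5):
    """Gate a plate as it comes off the robot; failures halt the run."""
    return z_prime(pos_controls, neg_controls) >= threshold

pos = [980, 1010, 1005, 995]  # uninhibited control reads (illustrative)
neg = [102, 98, 105, 95]      # fully inhibited control reads (illustrative)
print(round(z_prime(pos, neg), 3))  # 0.941 -- a healthy plate
```

Running a check like this on every plate, as it is read, is what lets an operation shut down a failing run immediately instead of discovering the problem days later.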

The availability of commercial software designed for screening has helped companies implement their own unique data management tools. Glaxo Wellcome’s internal IT staff built the quality control functionality as a module on top of an underlying commercial system; they also created software for hit definition and cheminformatics. All three of the major suppliers of screening data management software claim to provide fully functional packages along with tools and consulting services for helping customers build custom solutions (see the “Screening data management” box). The reason is simple, according to GelTex’s Burrier. “For commercial software to be useful, it must not only work out of the box, but adapt easily. Drug discovery is arguably the most data-intensive industry out there, so we need products that will move with us as our needs change,” Burrier says.

Yet some companies dislike the overhead associated with out-of-the-box commercial applications. “We found that most commercial solutions forced us to follow a number of procedures to get the data into the database and knowledge back out of the database that just weren’t relevant to our existing processes,” says Applied Biosystems’ Carlson. Applied Biosystems chose to build its own visualization system for assessing screening performance, which enables a scientist to validate and analyze screens in just two hours rather than eight.
Screening data management

IDBS (www.idbs.co.uk): Offers ActivityBase, an integrated chemical and biological data management system, and two data analysis packages: XLfit, for curve-fitting, and the SARgen reporting tool.

MDL Information Systems, Inc. (www.mdli.com): Offers Assay Explorer for biological data management and a plate management system, Apex, which was released this April.
Pharmacopeia, Inc. (www.oxmol.com): Pharmacopeia acquired Oxford Molecular last year and with it two software packages: OMMM, a screening plate and data processing system, and the RS3 Discovery HTS biological data management system.

Other providers: Each of the vendors listed above provides complementary systems for integrating screening data with related data, particularly that from chemistry, toxicology, pharmacology, and genomics. They are joined in this endeavor by the following:
Spotfire, Inc. (www.spotfire.com): Released in April, Spotfire DecisionSite for Lead Discovery brings Spotfire’s patented data visualization tools into a single, Web-based environment for analyzing chemical and biological data.

Tripos, Inc. (www.tripos.com): Tripos is currently partnering with Pfizer and Bayer to build new informatics methods for analyzing chemical and biological data.
Integration expectations: Although data management systems have helped streamline many of the tasks essential to screening, they have been less successful at providing the context to assist medicinal chemists and others with interpreting and acting on screening results. The problem is not new, according to Trevor Heritage, senior vice president of discovery technology operations and marketing at Tripos. “Even when we were screening in tubes, we had trouble storing the metadata associated with each data point,” Heritage says. “Moving to 96-well plates made the problem worse, and now, at really high throughputs, it’s become nearly impossible.”

Lack of integration could be tolerated during screening’s infancy, when other data management and logistical issues had yet to be resolved. But with screening itself no longer a bottleneck, organizations can now afford to ask why screening has failed to jump-start stalled R&D pipelines. Jack Elands, vice president of marketing at IDBS, points out that disillusionment with HTS has led scientists back to another technique that was once viewed as a drug discovery panacea: rational drug design.

“In the early 1990s, computational chemistry was the technique that would lead to new drugs, and testing would simply confirm these new leads,” Elands recalls. “But when pure rational design didn’t pay off, the pendulum swung wildly the other way to blitz-screening of everything in a compound library. Today, both techniques have matured. Perhaps together, they will be able to achieve what they couldn’t separately.”

Better integration may be the solution, but it is not something that can be easily provided in a one-size-fits-all, out-of-the-box package. “The real questions that you need to answer during HTS are, ‘What do you want to test, and what do you want to make?’” says GelTex’s Burrier. Different companies answer these questions in different ways, and they are not interested in sharing their particular approaches. As a result, organizations expect vendors to provide open systems that readily hook into the components that scientists might need to access.

Possible solutions typically involve rationalizing screening using available data from genomics; cheminformatics; and absorption, distribution, metabolism, and excretion studies. Outsourcing screening organizations, such as Applied Biosystems and Discovery Technologies, Ltd., have implemented integrated lead-finding processes to make their services more attractive to customers. Tripos recently patented a method of characterizing the structural diversity of large combinatorial libraries, which can aid screeners in selecting compounds for testing. And Glaxo Wellcome points to its in-house cheminformatics system, which lets scientists consider chemical information immediately after primary screening.

A security blanket? HTS’s growth from fad to industrialized process has been mirrored by a change in mind-set. More data, it turns out, does not mean more answers—and it certainly does not translate automatically into more viable leads. “You can’t count on serendipity,” says Burrier. Whether an organization is intent on focused screening or backing out to run broader diversity screens, the emphasis today is on collecting good data and doing good things with the information. Only time will tell whether this increased attention on data management will help screening outgrow its age of insecurity.

A BRIEF HISTORY OF PHARMACOLOGY
Originating in the 19th century, the discipline makes drug development possible.
Pharmacology is one of the cornerstones of the drug discovery process. The medicinal chemist may create the candidate compound, but the pharmacologist is the one who tests it for physiologic activity. A promising compound is investigated by many other scientists—toxicologists, microbiologists, clinicians—but only after the pharmacologist has documented a potential therapeutic effect. This article briefly presents the historical development of pharmacology and some of the basic methods used.

Etymologically, pharmacology is the science of drugs (Greek pharmakos, medicine or drug; and logos, study). In actual use, however, its meaning is limited to the study of the actions of drugs. Pharmacology has been defined as “an experimental science which has for its purpose the study of changes brought about in living organisms by chemically acting substances (with the exception of foods), whether used for therapeutic purposes or not.”

Pharmacology studies the effects of drugs and how they exert their effects. There is a distinction between what a drug does and how it acts. Thus, amoxicillin cures a strep throat, and cimetidine promotes the healing of duodenal ulcers. Pharmacology asks, “How?” Amoxicillin inhibits the synthesis of cell wall mucopeptide by the bacteria that cause the infection, and cimetidine inhibits gastric acid secretion by its antagonist action on histamine H2 receptors.

The main tasks of pharmacologists in the search for and development of new medicines are

screening for desired activity,

determining mode of action, and

quantifying drug activity when chemical methods are not available.
Historical development: Synthetic organic chemistry was born in 1828, when Friedrich Wöhler synthesized urea from inorganic substances and thus demolished the vital force theory. The birth date of pharmacology is not as clear-cut. In the early 19th century, physiologists performed many pharmacologic studies. Thus, François Magendie studied the action of nux vomica (a strychnine-containing plant drug) on dogs, and showed that the spinal cord was the site of its convulsant action. His work was presented to the Paris Academy in 1809. In 1842, Claude Bernard discovered that the arrow poison curare acts at the neuromuscular junction to interrupt the stimulation of muscle by nerve impulses.

Nevertheless, pharmacology is held to have emerged as a separate science only when the first university chair was established. According to Walter Sneader, this occurred in 1847, when Rudolf Buchheim was appointed professor of pharmacology at the University of Dorpat in Estonia (then a part of Russia). Lacking outside funding, Buchheim built a laboratory at his own expense in the basement of his home. Although Buchheim is credited with turning the purely descriptive and empirical study of medicines into an experimental science, his reputation is overshadowed by that of his student, Oswald Schmiedeberg.

Oswald Schmiedeberg (1838–1921) is generally recognized as the founder of modern pharmacology. The son of a Latvian forester, Schmiedeberg obtained his medical doctorate in 1866 with a thesis on the measurement of chloroform in blood. He worked at Dorpat under Buchheim, succeeding him in 1869. In 1872, he became professor of pharmacology at the University of Strassburg, receiving generous government support in the form of a magnificent institute of pharmacology. He studied the pharmacology of chloroform and chloral hydrate. In 1869, Schmiedeberg showed that muscarine evoked the same effect on the heart as electrical stimulation of the vagus nerve. In 1878, he published a classic text, Outline of Pharmacology, and in 1885, he introduced urethane as a hypnotic.

In his 46 years at Strassburg, Schmiedeberg trained most of the men who became professors at other German universities and in several foreign countries. He was largely responsible for the preeminence of the German pharmaceutical industry up to World War II.

In the United States, the first chair in pharmacology was established at the University of Michigan in 1890 under John Jacob Abel, an American who had trained under Schmiedeberg. In 1893, Abel went to Johns Hopkins University in Baltimore, where he had a long and brilliant career. His major accomplishments include the isolation of epinephrine from adrenal gland extracts (1897–1898), isolation of histamine from pituitary extract (1919), and preparation of pure crystalline insulin (1926). His student Reid Hunt discovered acetylcholine in adrenal extracts in 1906.

Today, there is a pharmacology department in every college of medicine or pharmacy.

Animal studies: Pharmacology depends largely on experiments conducted in laboratory animals, but even the human animal may be used as a test subject. Friedrich Sertürner, the German pharmacist who isolated the first alkaloid from opium in 1805, administered a whopping dose (100 mg) to himself and three friends. All experienced the symptoms of severe opium poisoning for several days. The alkaloid was named morphine, for Morpheus, the Greek god of dreams.
An interesting example of the use of humans for testing occurred in the 1940s. Although digitalis had been a standard medication for heart disease for more than a century, there were still no reliable methods for evaluating its potency. Biological assays (bioassays) were performed on frogs, pigeons, and cats, but none were totally satisfactory.

In 1942, a group of cardiologists published “a method for bioassay of digitalis in humans”. The assay was based on quantitative changes in the electrocardiogram (ECG) of patients in the cardiac clinics of two New York City hospitals. It was hard to find patients whose ECGs could be standardized. Of 97 patients in whom calibration of the ECG was tried, only 18 proved to be satisfactory assay subjects. Fortunately, chemical research on the active glycosides of digitalis, and development of analytical methods, soon rendered all digitalis bioassays obsolete.

Although humans are no longer used as ad hoc laboratory animals, they are essential in clinical pharmacology. When a new drug compound has gone through sufficient preclinical testing to show potential therapeutic action and reasonable safety on short-term administration, and the data have been reviewed by the FDA, the compound is administered to a small number of human volunteers under closely controlled and monitored conditions. These Phase I clinical trials provide information about dosage and the most common side effects to be expected.
The animals most frequently used in pharmacologic studies are mammals. Mice are preferred because of their small size, ease of breeding, and short generation time. Rats, guinea pigs, rabbits, and dogs are also used; each has special characteristics that make it optimal for certain types of tests.

Basic techniques: Experimental pharmacology uses animals in various ways. Intact animals are essential for the acute, subacute, and chronic toxicity tests that a new drug substance must undergo, and for important special tests such as teratology and carcinogenicity. Pharmacology per se tends to use excised (isolated) organs or tissues and animals that are surgically prepared in various ways to aid in the detection and study of target activities.

Early in the development of pharmacologic techniques, it was found that an isolated organ or tissue remained functional for several hours in a bath containing a physiologic solution of salts through which oxygen was bubbled. Henrick Magnus (1802–1870) first applied this method to a strip of small intestine, Jean-François Heymans (1904) worked with the mammalian heart, and Claude Bernard experimented with isolated nerve–muscle preparations.

The organ or tissue is so suspended that the contraction or relaxation of the muscle is mechanically transmitted to a stylet. The stylet writes on a drum covered with smoked paper rotated by clockwork at a constant speed. This device, called a kymograph, graphically records motion or pressure. The effects of drug substances added to the bath can thus be visualized. The kymograph is a relatively crude device. In modern laboratories, organ and tissue movements are transmitted by force transducers to polygraph machines, which produce similar tracings. Or the polygraph is replaced by computerized equipment that issues a digital record.

The surgical preparation of animals is illustrated by the following examples. As early as 1849, the German anatomist Arnold Berthold transplanted testicular tissue into a capon (a castrated rooster) and showed that this induced growth of the comb. This basic method was used in the 20th century to isolate and study the male sex hormones.

Similarly, in 1924, Americans Edgar Allen and Edward Doisy used ovariectomized rats to test the action of estrogenic hormones. To study anti-inflammatory agents, rats can be made arthritic by injection of an oily suspension of killed bacteria (Freund’s adjuvant).
Drugs affecting gastric secretion may be studied in animals by forming a Heidenhain pouch—a small sac of the stomach, vagally denervated and closed off from the main cavity, but with an opening through the abdominal wall.

Rational design: Screening of candidate compounds and mode-of-action studies may focus on specific tissues, organs, or systems or on actions, such as antihistaminic or anticonvulsant. As knowledge of human biochemistry and molecular biology advances, pharmacology zeroes in more often on enzymatic action and receptors.

Captopril (Capoten), developed by M. Ondetti and co-workers at Squibb in the 1970s, exemplifies a molecule that was rationally designed to fit the active site of an enzyme—angiotensin-converting enzyme (ACE). This drug and subsequent ACE inhibitors reduce blood pressure.

Knowledge of cell receptors is now on the cutting edge of pharmacology and drug discovery. The concept was first proposed about a hundred years ago by Paul Ehrlich, the great bacteriologist and chemist who synthesized salvarsan (also known as “606”) for the treatment of syphilis. On the basis of his research on bacterial toxins, Ehrlich postulated that the body’s cells possess a great many “receptors” by which they combine with the food substances in the body fluids. He theorized that the metabolic products of certain bacteria combine with the receptors of some cells, thus injuring the cells. Ehrlich visualized receptors as unsatisfied chemical side chains. This is not far from the modern idea of receptors as domains in enzymes or other proteins, with which drugs of appropriate structure can combine.

Illustrating the importance of receptor research are drugs that act on the adrenergic (sympathetic) nervous system. This system has both α- and β-receptors. Propranolol (Inderal) was the first specific β-adrenergic receptor blocking agent. Marketed in 1964, it ended a long drought in new heart medicines and soon became a major therapy for angina pectoris, cardiac arrhythmias, hypertension, and essential tremor. However, all β-adrenergic receptors are not identical, and propranolol is nonselective. Second-generation drugs such as atenolol (Tenormin) and metoprolol (Lopressor), developed in the late 1970s, have a preferential effect on β1 receptors, which are chiefly located in heart muscle. At higher doses, they also inhibit β2 receptors, which are found mainly in the bronchial and vascular musculature. We also have blockers of the α-adrenoreceptors, such as prazosin (Minipress; early 1980s), and α1-blockers, such as terazosin (Hytrin; 1987). And there are α/β-blockers: Labetalol (Normodyne) and carvedilol (Coreg), developed in the mid-1990s, exhibit selective α1- and nonselective β-blocking action.

The methods and approaches touched on in this article are merely a sampling. Pharmacology is similar to medicinal chemistry in that it has developed a vast array of techniques, both general and specialized. Building on its past, the ongoing progress of pharmacology supports its critical role in modern drug discovery and augurs well for the future.