QSAR - Re: CCL:Quantum Chemistry in Drug Design

From: Dr. N. SUKUMAR <nagams@rpi.edu>
Date: Thu, 05 Dec 2002 16:48:25 EST

The CPU bottleneck has certainly been a major factor, but I believe a
second factor is the descriptors commonly employed in drug design. If all
one uses in modeling are molecular-geometry-derived descriptors, atom
counts, topological descriptors and electrostatic potentials, then it
hardly seems worthwhile performing accurate quantum chemical computations,
especially in view of the enormous computational overhead. Ab initio
computations, however, can generate far more information at a fundamental
level, derived from the molecular wavefunction or electron density
distribution. A few research groups (ours among them) have investigated
the use of electron-density-derived descriptors in drug design. In the
Transferable Atom Equivalents (TAE) method, first introduced by Curt
Breneman, we employ, besides electrostatic potentials, electronic kinetic
energy densities, the Laplacian of the electron density introduced by
Bader, the Fukui function and Politzer's local average ionization
potential. The distributions of these electronic properties on the
molecular van der Waals surface (binned as histograms or encoded as
wavelets) are used as descriptors; a sketch of the histogram step is
given below. These electron-density-derived descriptors have found
success in a number of applications, especially when used in combination
with other traditional descriptors.
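
To illustrate the binning step (an illustrative sketch only, not our TAE
code; the bin range and sampled values are invented for the example),
each property sampled on the van der Waals surface is reduced to a
fixed-length vector of bin occupancies, computed with the same bin edges
for every molecule so the descriptors are comparable across a dataset:

import numpy as np

def histogram_descriptor(surface_values, bin_edges):
    # Bin one surface property distribution into a fixed-length descriptor:
    # the fraction of sampled surface points falling in each bin.
    counts, _ = np.histogram(surface_values, bins=bin_edges)
    return counts / max(len(surface_values), 1)

# Hypothetical usage: the same bin edges are used for every molecule, one
# histogram per property, and the histograms are concatenated into the
# final descriptor vector. Points outside the bin range are simply dropped.
bin_edges = np.linspace(-0.15, 0.15, 17)            # 16 bins over an assumed ESP range (a.u.)
esp_on_surface = np.random.normal(0.0, 0.05, 2000)  # stand-in for real surface ESP values
descriptor = histogram_descriptor(esp_on_surface, bin_edges)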
For small datasets of small molecules, electron-density-derived
descriptors can be readily determined from ab initio computations, but
for large pharmaceutical datasets and for macromolecules these descriptors
can still be computed with an atomic-fragment-based approach using the
theory of Atoms in Molecules. This is done in our RECON program, which
employs atomic descriptors computed at the HF/6-31+G* level and is
available for download from our website; a toy illustration of the
fragment-reconstruction idea follows below. Typical CPU timings for RECON
on a 1.7 GHz Intel Pentium under Linux are about 90 sec. for a set of 25
proteins and 7.5 min. for a 42,689-molecule dataset from NCI --
comparable to the times for computing topological descriptors. So I would
have to say that for such applications, CPU is no longer a limiting
factor.
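
As a rough illustration of the reconstruction idea (not the RECON code
itself; the atom-type labels and numerical values below are invented for
the example), molecular descriptors are assembled by summing precomputed
per-atom-type contributions, so no new wavefunction calculation is needed
for each molecule:

from collections import Counter

# Hypothetical fragment library: per-atom-type contributions to a few
# descriptors. RECON's actual atomic descriptors are computed at the
# HF/6-31+G* level; these entries are made-up placeholders.
FRAGMENT_LIBRARY = {
    "C.aromatic": {"surface_area": 10.2, "kinetic_energy": 37.6},
    "N.amine":    {"surface_area":  8.9, "kinetic_energy": 54.1},
    "H":          {"surface_area":  4.1, "kinetic_energy":  0.6},
}

def reconstruct_descriptors(atom_types):
    # Sum the precomputed atomic contributions over all atoms in the molecule.
    totals = Counter()
    for atom_type in atom_types:
        for name, value in FRAGMENT_LIBRARY[atom_type].items():
            totals[name] += value
    return dict(totals)

# Usage on a toy molecule given as a list of typed atoms:
print(reconstruct_descriptors(["C.aromatic"] * 6 + ["H"] * 5 + ["N.amine"]))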

Our protein chromatography studies are published in Mazza et al.,
Anal. Chem. 73, 5457-5461 (2001) and Song et al., J. Chem. Inf. Comput.
Sci. 42, 1347-1357 (2002), while the drug design applications are in
press or in various stages of going to press.

Dr. N. Sukumar
http://www.drugmining.com/
Rensselaer Department of Chemistry

On Sat, 30 Nov 2002 07:11:34 +0530 "Parthiban" wrote:

> Dear Friends:
> While several QSAR-related techniques and methodologies are appearing
> in drug design journals, very few address the more accurate quantum
> chemical methods in the drug design arena.
>
> * What are the bottlenecks preventing quantum chemical methods from
> entering the area of drug design?
>
> * For small molecules QC methods play a greater role, but for handling
> drug-like molecules and several thousands of compounds, QC methods do
> not see the limelight (correct me if I am wrong). Is CPU-intensiveness
> alone the reason, or is there some conceptual gap here? [I hear someone
> saying that CPU-intensiveness is the reason and that one has to wait
> for months to get results.]
>
> * Based on your experience/insight, can you think of some timeframe --
> say 5 or 10 years down the line -- when quantum chemical methods would
> play a major role in lead identification/optimization, or would you
> say "prediction of the future is difficult!"?
>
> I look forward to reading your views. Thanks.
>
> S. Parthiban
> Jubilant Biosys Ltd.
> http://www.jubilantbiosys.com
>
>
>
Received on 2002-12-05 - 17:35 GMT