
Export
  • 1
    UID: kobvindex_HPB1321798121
    Format: 1 online resource (431 p.)
    ISBN: 9789811250477, 9811250472
    Note: Description based upon print version of record.
    Contents: Intro -- Content -- Preface
      AI-DRIVEN ADVANCES IN MODELING OF PROTEIN STRUCTURE
      Session Introduction: AI-Driven Advances in Modeling of Protein Structure -- 1. A short retrospect -- 2. A brief outline of current research -- 3. Future developments (complexes, ligand interactions, other molecules, dynamics, language models, geometry models, sequence design) -- 4. What is needed for further progress? -- 5. Overview of papers in this session -- 5.1. Evaluating significance of training data selection in machine learning -- 5.2. Geometric pattern transferability -- 5.3. Supervised versus unsupervised sequence to contact learning -- 5.4. Side chain packing using SE(3) transformers -- 5.5. Feature selection in electrostatic representations of ligand binding sites -- References
      Training Data Composition Affects Performance of Protein Structure Analysis Algorithms -- 1. Introduction -- 2. Methods -- 2.1. Experimental Design -- 2.2. Task-specific Methods -- 3. Results -- 3.1. Performance on NMR and cryo-EM structures is consistently lower than performance on X-ray structures, independent of training set -- 3.2. Inclusion of NMR data in the training set improves performance on held-out NMR data and does not degrade performance on X-ray data -- 3.3. Known biochemical and biophysical effects are replicated in trained models -- 3.4. Downsampling X-ray structures during training negatively affects performance on all types of data -- 4. Conclusion -- 5. Acknowledgments -- References
      Transferability of Geometric Patterns from Protein Self-Interactions to Protein-Ligand Interactions -- 1. Introduction -- 2. Related Work -- 3. Methods -- 3.1. Datasets -- 3.2. Contact extraction -- 3.3. Representing contact geometry -- 4. Results -- 4.1. Protein self-contacts exhibit clear geometric clustering -- 4.2. Many geometric patterns transfer to protein-ligand contacts -- 4.3. Application to protein-ligand docking -- 5. Conclusion and Future Work -- Supplemental Material, Code, and Data Availability -- Acknowledgments -- References
      Interpreting Potts and Transformer Protein Models Through the Lens of Simplified Attention -- 1. Introduction -- 2. Background -- 3. Methods -- 3.1. Potts Models -- 3.2. Factored Attention -- 3.3. Single-layer attention -- 3.4. Pretraining on Sequence Databases -- 3.5. Extracting Contacts -- 4. Results -- 5. Discussion -- Acknowledgements -- References
      Side-Chain Packing Using SE(3)-Transformer -- 1. Introduction -- 2. Methods -- 2.1. Neighborhood Graph Representation -- 2.2. The SE(3)-Transformer Architecture -- 2.3. Node Features -- 2.4. Final Layer -- 2.5. Rotamer Selection -- 2.6. Experiments -- 3. Results -- 4. Conclusion -- 5. Acknowledgements -- 6. References
      DeepVASP-E: A Flexible Analysis of Electrostatic Isopotentials for Finding and Explaining Mechanisms that Control Binding Specificity
    Additional Edition: Print version: Altman, Russ B. Biocomputing 2022 - Proceedings Of The Pacific Symposium. Singapore: World Scientific Publishing Company, c2021. ISBN 9789811250460
    Language: English
    Keywords: Electronic books.