Kooperativer Bibliotheksverbund Berlin-Brandenburg (KOBV)

  • 1
    Language: English
    In: PLoS ONE, 01 January 2014, Vol.9(5), p.e97530
    Description: To obtain predictive genes with lower redundancy and better interpretability, a hybrid gene selection method encoding prior information is proposed in this paper. First, the prior information, referred to as the gene-to-class sensitivity (GCS) of each gene in the microarray data, is extracted by a single-hidden-layer feedforward neural network (SLFN). Then, to select more representative and less redundant genes, all genes are grouped into clusters by the K-means method, and genes with low sensitivity are filtered out according to their GCS values. Finally, a modified binary particle swarm optimization (BPSO) encoding the GCS information is proposed to perform further gene selection on the remaining genes. By considering the GCS information, the proposed method selects genes that are highly correlated with the sample classes. Thus, the low-redundancy gene subsets obtained by the proposed method also help improve classification accuracy on microarray data. Experimental results on several open microarray datasets verify the effectiveness and efficiency of the proposed approach.
    Keywords: Sciences (General)
    E-ISSN: 1932-6203
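The selection pipeline described above (sensitivity scoring, K-means clustering of the genes, then filtering within each cluster) can be sketched as follows. This is a simplified illustration, not the authors' implementation: the gene-to-class sensitivity is replaced by a correlation proxy (the paper derives it from a trained SLFN), the final BPSO refinement step is omitted, and all sizes and thresholds are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(points, k, iters=50):
    # Minimal K-means; each row of `points` is one gene's expression profile.
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((points[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return labels

def select_genes(X, y, k=5, keep_per_cluster=2):
    # Score each gene, cluster the genes, keep the top-scoring genes per cluster.
    genes = X.T
    sens = np.abs([np.corrcoef(g, y)[0, 1] for g in genes])  # correlation proxy for GCS
    labels = kmeans(genes, k)
    keep = []
    for j in range(k):
        idx = np.where(labels == j)[0]
        keep.extend(idx[np.argsort(sens[idx])[::-1][:keep_per_cluster]])
    return sorted(int(i) for i in keep)

# Toy data: 40 samples x 30 genes; only genes 3 and 7 carry class information.
X = rng.normal(size=(40, 30))
y = (X[:, 3] + X[:, 7] > 0).astype(float)
picked = select_genes(X, y)
print(picked)
```

The per-cluster cap is what keeps redundancy low: two highly correlated genes tend to land in the same cluster, so at most `keep_per_cluster` of them survive.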
  • 2
    Language: English
    In: PLoS ONE, May 20, 2014, Vol.9(5)
    Description: To obtain predictive genes with lower redundancy and better interpretability, a hybrid gene selection method encoding prior information is proposed in this paper. First, the prior information, referred to as the gene-to-class sensitivity (GCS) of each gene in the microarray data, is extracted by a single-hidden-layer feedforward neural network (SLFN). Then, to select more representative and less redundant genes, all genes are grouped into clusters by the K-means method, and genes with low sensitivity are filtered out according to their GCS values. Finally, a modified binary particle swarm optimization (BPSO) encoding the GCS information is proposed to perform further gene selection on the remaining genes. By considering the GCS information, the proposed method selects genes that are highly correlated with the sample classes. Thus, the low-redundancy gene subsets obtained by the proposed method also help improve classification accuracy on microarray data. Experimental results on several open microarray datasets verify the effectiveness and efficiency of the proposed approach.
    Keywords: Genetic Research ; Artificial Neural Networks ; Genes ; Optimization Theory
    ISSN: 1932-6203
    Source: Cengage Learning, Inc.
  • 3
    Language: English
    In: PLoS ONE, 01 January 2016, Vol.11(11), p.e0165803
    Description: In ensemble learning, how to select and how to combine the candidate classifiers are two key issues that dramatically influence the performance of the ensemble system. Random vector functional link networks (RVFL) without direct input-to-output links are suitable base classifiers for ensemble systems because of their fast learning speed, simple structure, and good generalization performance. In this paper, to obtain a more compact ensemble system with improved convergence performance, an improved ensemble of RVFL based on attractive and repulsive particle swarm optimization (ARPSO) with a double optimization strategy is proposed. In the proposed method, ARPSO is applied to select and combine the candidate RVFL. When selecting the optimal base RVFL, ARPSO considers both the convergence accuracy on the validation data and the diversity of the candidate ensemble system. When combining the RVFL, the ensemble weights corresponding to the base RVFL are initialized by the minimum-norm least-squares method and then further optimized by ARPSO. Finally, a few redundant RVFL are pruned, yielding a more compact ensemble of RVFL. Moreover, a theoretical analysis and justification of how to prune the base classifiers on classification problems is presented, and a simple and practically feasible strategy for pruning redundant base classifiers on both classification and regression problems is proposed. Since the double optimization is performed on top of the single optimization, the ensemble of RVFL built by the proposed method outperforms those built by some single-optimization methods. Experimental results on function approximation and classification problems verify that the proposed method improves convergence accuracy and reduces the complexity of the ensemble system.
    Keywords: Sciences (General)
    E-ISSN: 1932-6203
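A minimal sketch of the base learner named above: an RVFL network without direct input-to-output links reduces to a fixed random hidden layer followed by a least-squares-trained linear readout. The ARPSO-based selection, weighting, and pruning of ensemble members is not reproduced here; the class name and all hyperparameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

class RVFL:
    """Random vector functional link net (no direct input-output links):
    a random, untrained hidden layer plus a least-squares readout."""

    def __init__(self, n_in, n_hidden=50):
        self.W = rng.normal(size=(n_in, n_hidden))  # random, never trained
        self.b = rng.normal(size=n_hidden)

    def _h(self, X):
        return np.tanh(X @ self.W + self.b)

    def fit(self, X, y):
        # Minimum-norm least-squares readout via the pseudoinverse.
        self.beta = np.linalg.pinv(self._h(X)) @ y
        return self

    def predict(self, X):
        return self._h(X) @ self.beta

# Toy binary classification: two well-separated Gaussian blobs.
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.r_[np.zeros(50), np.ones(50)]
model = RVFL(2).fit(X, y)
acc = ((model.predict(X) > 0.5) == y).mean()
print(f"train accuracy: {acc:.2f}")
```

Because only the readout is trained, fitting is a single pseudoinverse solve, which is what makes RVFL cheap enough to use as an ensemble base learner.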
  • 4
    Language: English
    In: Journal of the American Chemical Society, Feb 23, 2011, Vol.133(7), p.2052-2055
    Description: One-pot colloidal synthesis of nanodisk heterostructures (Cu1.94S-ZnxCd1-xS) consisting of monoclinic Cu1.94S and wurtzite CdS is described. The two-component nanostructures could find potential application in the fabrication of heterojunction solar cells.
    Keywords: Chemical Synthesis -- Analysis ; Cadmium Compounds -- Chemical Properties ; Cadmium Compounds -- Structure ; Solar Cells -- Analysis ; Copper Compounds -- Chemical Properties ; Copper Compounds -- Structure
    ISSN: 0002-7863
    Source: Cengage Learning, Inc.
  • 5
    Language: English
    In: Neurocomputing, Sept 20, 2013, Vol.116, p.87(7)
    Description: Full text: http://dx.doi.org/10.1016/j.neucom.2011.12.062
    Byline: Fei Han (a), Hai-Fen Yao (a), Qing-Hua Ling (b)
    Keywords: Extreme learning machine; Particle swarm optimization; Generalization performance; Convergence rate
    Abstract: Recently, the Extreme Learning Machine (ELM) for single-hidden-layer feedforward neural networks (SLFN) has been attracting attention for its faster learning speed and better generalization performance compared with traditional gradient-based learning algorithms. However, ELM may need a large number of hidden neurons and can suffer from ill-conditioning due to the random determination of the input weights and hidden biases. In this paper, a hybrid learning algorithm is proposed to overcome these drawbacks: it uses an improved particle swarm optimization (PSO) algorithm to select the input weights and hidden biases, and the Moore-Penrose (MP) generalized inverse to analytically determine the output weights. To obtain an optimal SLFN, the improved PSO optimizes the input weights and hidden biases according to not only the root mean squared error (RMSE) on the validation set but also the norm of the output weights. The proposed algorithm has better generalization performance than traditional ELM and other evolutionary ELMs, and the conditioning of the SLFN trained by the proposed algorithm is also improved. Experimental results have verified the efficiency and effectiveness of the proposed method.
    Author Affiliation: (a) School of Computer Science and Telecommunication Engineering, Jiangsu University, Zhenjiang, Jiangsu 212013, China; (b) School of Computer Science and Engineering, Jiangsu University of Science and Technology, Zhenjiang, Jiangsu 212003, China
    Keywords: Artificial Neural Networks ; Data Mining ; Algorithms ; Computer Science ; Optimization Theory
    ISSN: 0925-2312
    Source: Cengage Learning, Inc.
  • 6
    Language: English
    In: Tetrahedron, Jan 23, 2010, Vol.66(4), p.789(55)
    Description: Full text: http://dx.doi.org/10.1016/j.tet.2009.11.001
    Byline: Xiao-Long Qiu (a), Xiu-Hua Xu (a), Feng-Ling Qing (a)(b)
    Abbreviations: Ac, acetyl; AIBN, 2,2'-azobis(isobutyronitrile); AIDS, acquired immune deficiency syndrome; Bn, benzyl; Boc, tert-butoxycarbonyl; BOM, benzyloxymethyl; BSA, N,O-bis(trimethylsilyl)acetamide; BSTFA, N,O-bis(trimethylsilyl)trifluoroacetamide; Bz, benzoyl; CAN, ceric ammonium nitrate; Cbz, benzyloxycarbonyl; CMV, cytomegalovirus; DAST, diethylaminosulfur trifluoride; DBU, 1,8-diazabicyclo[5.4.0]undec-7-ene; DCC, 1,3-dicyclohexylcarbodiimide; DEAD, diethyl azodicarboxylate; DIAD, diisopropyl azodicarboxylate; DIBAL-H, diisobutylaluminium hydride; DIPEA, diisopropylethylamine; DMAP, 4-(dimethylamino)pyridine; DMF, N,N-dimethylformamide; DMP, 2,2-dimethoxypropane; DMS, dimethyl sulfide; DMSO, dimethylsulfoxide; DMTr, dimethoxytrityl; DNA, deoxyribonucleic acid; DNP, 2,4-dinitrophenyl; DTBMP, 2,6-di-tert-butyl-4-methylpyridine; EBV, Epstein-Barr virus; F-TEDA-BF4, 1-(chloromethyl)-4-fluoro-1,4-diazabicyclo[2.2.2]octane bis(tetrafluoroborate); HBV, hepatitis B virus; HCV, hepatitis C virus; HIV, human immunodeficiency virus; HMDS, hexamethyldisilazane; HMPA, hexamethylphosphoramide; HMPT, hexamethylphosphorous triamide; HSV, herpes simplex virus; HWE, Horner-Wadsworth-Emmons; LDA, lithium diisopropylamide; LiHMDS, lithium bis(trimethylsilyl)amide; LTMP, lithium tetramethylpiperidide; m-CPBA, meta-chloroperbenzoic acid; MEM, methoxyethoxymethyl; MMT (MMTr), p-methoxyphenyldiphenylmethyl; Ms, methanesulfonyl; NaHMDS, sodium bis(trimethylsilyl)amide; NBA, N-bromoacetamide; NBS, N-bromosuccinimide; NCS, N-chlorosuccinimide; NfF, perfluorobutanesulfonyl fluoride; NFSI, N-fluorobenzenesulfonimide; NMO (NMMO), N-methylmorpholine N-oxide; N-PSP, N-(phenylseleno)phthalimide; RCM, ring-closing metathesis; p-An, p-methoxyphenyl; PCC, pyridinium chlorochromate; PDC, pyridinium dichromate; Piv, pivaloyl; PMB, p-methoxybenzyl; p-TSA, p-toluenesulfonic acid; Py, pyridine; RNA, ribonucleic acid; SAR, structure-activity relationship; SEM, trimethylsilylethoxymethyl; TASF, tris(dimethylamino)sulfur (trimethylsilyl)difluoride; TBAF, tetrabutylammonium fluoride; TBAI, tetrabutylammonium iodide; TBDMS (TBS), tert-butyldimethylsilyl; TBDPS, tert-butyldiphenylsilyl; TCDI, 1,1'-thiocarbonyldiimidazole; TEA, triethylamine; TEMPO, 2,2,6,6-tetramethylpiperidinyloxy; TEPA, triethyl phosphonoacetate; Tf, trifluoromethanesulfonyl; TFA, trifluoroacetic acid; TFAA, trifluoroacetic anhydride; THF, tetrahydrofuran; THP, tetrahydropyranyl; TIPDS (TPDS), 1,3-(1,1,3,3-tetraisopropyldisiloxanylidene); TMP, 2,2,6,6-tetramethylpiperidine; TMS, trimethylsilyl; TMSOTf, trimethylsilyl triflate; Tol, toluoyl; TPP, triphenylphosphine; Tr, triphenylmethyl (trityl); Ts, p-toluenesulfonyl (tosyl); VZV, varicella zoster virus
    Abstract: (display omitted in this record)
    Author Affiliation: (a) Key Laboratory of Organofluorine Chemistry, Shanghai Institute of Organic Chemistry, Chinese Academy of Sciences, 345 Lingling Lu, Shanghai 200032, China; (b) College of Chemistry, Chemistry Engineering and Biotechnology, Donghua University, 2999 North Renmin Lu, Shanghai 201620, China
    Article History: Received 22 October 2009
    Keywords: Dimethyl Sulfide ; Sulfonic Acids ; Hiv ; Toluene ; Pyridinium Compounds ; Dimethylformamide ; Pyridine ; Tetrahydrofuran ; Hepatitis C ; Rna ; Nucleosides ; Reagents ; Fluorides ; Lithium Compounds ; Nitrogen Compounds ; Organic Compounds
    ISSN: 0040-4020
    E-ISSN: 1464-5416
  • 7
    Language: English
    In: Neurocomputing, 20 September 2013, Vol.116, pp.87-93
    Description: Recently, the Extreme Learning Machine (ELM) for single-hidden-layer feedforward neural networks (SLFN) has been attracting attention for its faster learning speed and better generalization performance compared with traditional gradient-based learning algorithms. However, ELM may need a large number of hidden neurons and can suffer from ill-conditioning due to the random determination of the input weights and hidden biases. In this paper, a hybrid learning algorithm is proposed to overcome these drawbacks: it uses an improved particle swarm optimization (PSO) algorithm to select the input weights and hidden biases, and the Moore–Penrose (MP) generalized inverse to analytically determine the output weights. To obtain an optimal SLFN, the improved PSO optimizes the input weights and hidden biases according to not only the root mean squared error (RMSE) on the validation set but also the norm of the output weights. The proposed algorithm has better generalization performance than traditional ELM and other evolutionary ELMs, and the conditioning of the SLFN trained by the proposed algorithm is also improved. Experimental results have verified the efficiency and effectiveness of the proposed method.
    Keywords: Extreme Learning Machine ; Particle Swarm Optimization ; Generalization Performance ; Convergence Rate ; Computer Science
    ISSN: 0925-2312
    E-ISSN: 1872-8286
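The hybrid PSO-ELM scheme described in the abstract above can be sketched as follows: PSO searches over the input weights and hidden biases, the Moore-Penrose pseudoinverse gives the output weights analytically, and the fitness combines validation RMSE with the output-weight norm. This is a toy sketch under assumed settings (standard global-best PSO, inertia 0.7, acceleration 1.5, penalty weight `mu`), not the paper's improved PSO.

```python
import numpy as np

rng = np.random.default_rng(2)

def fitness(params, n_in, n_hid, Xtr, ytr, Xval, yval, mu=1e-3):
    # Decode a particle into input weights W and hidden biases b, solve the
    # output weights analytically, then score on held-out validation data.
    W = params[: n_in * n_hid].reshape(n_in, n_hid)
    b = params[n_in * n_hid :]
    beta = np.linalg.pinv(np.tanh(Xtr @ W + b)) @ ytr   # Moore-Penrose step
    pred = np.tanh(Xval @ W + b) @ beta
    rmse = np.sqrt(np.mean((pred - yval) ** 2))
    return rmse + mu * np.linalg.norm(beta)             # RMSE + weight-norm term

def pso_elm(Xtr, ytr, Xval, yval, n_hid=10, n_part=20, iters=30):
    n_in = Xtr.shape[1]
    dim = n_in * n_hid + n_hid
    f = lambda p: fitness(p, n_in, n_hid, Xtr, ytr, Xval, yval)
    pos = rng.uniform(-1, 1, (n_part, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_f = pos.copy(), np.array([f(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_part, 1))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        fs = np.array([f(p) for p in pos])
        better = fs < pbest_f
        pbest[better], pbest_f[better] = pos[better], fs[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, float(pbest_f.min())

# Toy regression task: fit y = sin(x) on [-3, 3].
X = rng.uniform(-3, 3, (120, 1))
y = np.sin(X[:, 0])
best, best_f = pso_elm(X[:80], y[:80], X[80:], y[80:])
print(f"best validation fitness: {best_f:.3f}")
```

Penalizing the output-weight norm alongside the RMSE is what discourages the ill-conditioned solutions the abstract warns about.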
  • 8
    In: Complexity, 2018, Vol.2018, 22 pages
    Description: Although multiobjective particle swarm optimization (MOPSO) performs well on multiobjective optimization problems, obtaining more accurate solutions and improving the distribution of the solution set remain a challenge. In this paper, to improve the convergence performance of MOPSO, an improved multiobjective quantum-behaved particle swarm optimization based on a double search strategy and a circular transposon mechanism (MOQPSO-DSCT) is proposed. On the one hand, to counter the dramatic loss of diversity in the solution set in later iterations caused by the single search pattern of quantum-behaved particle swarm optimization (QPSO), a double search strategy is proposed in MOQPSO-DSCT: the particles mainly learn from their personal best position in earlier iterations and mainly from the global best position in later iterations, balancing the exploration and exploitation abilities of the swarm. Moreover, to keep the swarm from converging to local minima during the local search, an improved attractor construction mechanism based on opposition-based learning is introduced to search locally for a better position to serve as a new attractor for each particle. On the other hand, to improve the accuracy of the solution set, the circular transposon mechanism is introduced into the external archive to improve the communication ability of the particles, which guides the population toward the true Pareto front. The proposed algorithm generates a set of more accurate and better-distributed solutions than traditional MOPSO. Finally, experiments on a set of benchmark test functions verify that the proposed algorithm has better convergence performance than some state-of-the-art multiobjective optimization algorithms.
    Keywords: International Conferences ; State of the Art ; Circularity ; Swarm Intelligence ; Strategy ; Convergence ; Search Methods ; Genetic Algorithms ; Optimization ; Archives & Records ; Decomposition ; Algorithms ; Multiple Objective Analysis ; Objectives ; Mathematical Problems ; Optimization Algorithms ; Intelligence
    ISSN: 1076-2787
    E-ISSN: 1099-0526
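The opposition-based learning primitive mentioned in the abstract above is simple to state: for a candidate position x in a box [lo, hi], also evaluate the "opposite" point lo + hi - x and keep whichever scores better. The sketch below shows only this primitive on a toy objective; the paper's full attractor construction inside MOQPSO-DSCT is more involved, and the function and bounds here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def obl_step(x, f, lo, hi):
    # Opposition-based learning: evaluate the "opposite" point
    # lo + hi - x and keep whichever candidate scores better.
    x_opp = lo + hi - x
    return x if f(x) <= f(x_opp) else x_opp

f = lambda x: float(np.sum((x - 2.0) ** 2))  # toy objective, minimum at (2, 2, 2)
x = rng.uniform(-5, 5, 3)
better = obl_step(x, f, -5.0, 5.0)
print(f(better) <= f(x))  # prints True: the kept point is never worse
```

The step never hurts (it keeps the better of two candidates), which is why it is a cheap way to improve attractors during a local search.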
  • 9
    Language: English
    In: Neural Computing and Applications, 2010, Vol.19(2), pp.255-261
    Description: In this paper, an improved approach that incorporates adaptive particle swarm optimization (APSO) and a priori information into feedforward neural networks for function approximation problems is proposed. It is well known that gradient-based learning algorithms such as backpropagation have good local search ability, whereas PSO has good global search ability. In the improved approach, the APSO algorithm, which encodes the first-order derivative information of the approximated function, is therefore used to train the network toward the global minimum. Then, starting from the connection weights produced by APSO, the network is trained with a modified gradient-based algorithm using a magnified gradient function. The modified gradient-based algorithm reduces input-to-output mapping sensitivity and lessens the chance of being trapped in local minima. By combining APSO with a local search algorithm and considering a priori information, the improved approach achieves better approximation accuracy and a faster convergence rate. Finally, simulation results verify the efficiency and effectiveness of the proposed approach.
    Keywords: Function approximation ; Particle swarm optimization ; A priori information ; Approximation accuracy ; Convergence rate
    ISSN: 0941-0643
    E-ISSN: 1433-3058
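The two-stage idea above, a population-based global search followed by gradient-based local refinement, can be illustrated on a one-dimensional multimodal function. Plain global-best PSO and ordinary gradient descent stand in for the paper's APSO and magnified-gradient variants; the objective, step sizes, and swarm settings are all illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

# A 1-D multimodal objective: global minimum at x = 0, local minima near the integers.
f = lambda x: x ** 2 + 3.0 * (1.0 - np.cos(2.0 * np.pi * x))
df = lambda x: 2.0 * x + 6.0 * np.pi * np.sin(2.0 * np.pi * x)

def pso(n_part=25, iters=60):
    # Stage 1: plain global-best PSO over [-5, 5] finds a promising basin.
    pos = rng.uniform(-5, 5, n_part)
    vel = np.zeros(n_part)
    pbest, pbest_f = pos.copy(), f(pos)
    gbest = pbest[pbest_f.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_part))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        mask = f(pos) < pbest_f
        pbest[mask] = pos[mask]
        pbest_f = np.minimum(pbest_f, f(pos))
        gbest = pbest[pbest_f.argmin()]
    return float(gbest)

def refine(x, lr=0.002, steps=500):
    # Stage 2: local gradient descent; the small step keeps the descent monotone here.
    for _ in range(steps):
        x = x - lr * df(x)
    return x

g = pso()
x = refine(g)
print(f"{f(g):.4f} -> {f(x):.4f}")
```

The division of labor mirrors the abstract: the swarm supplies a starting point near the global minimum, and the gradient stage polishes it to high accuracy.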
  • 10
    Language: English
    In: Neurocomputing, 08 March 2017, Vol.228, pp.133-142
    Description: How to determine the network structure is an open problem in extreme learning machine (ELM). The error-minimized extreme learning machine (EM-ELM) is a simple and efficient approach to determining the number of hidden nodes. However, similar to other constructive ELMs, EM-ELM emphasizes convergence accuracy, which may yield a single-hidden-layer feedforward neural network (SLFN) with good convergence performance but poor conditioning. In this paper, an effective approach based on error-minimized ELM and particle swarm optimization (PSO) is proposed to adaptively determine the structure of the SLFN for regression problems. In the new method, to establish a compact and well-conditioned SLFN, hidden nodes optimized by PSO are added to the SLFN one by one. Moreover, not only the regression accuracy but also the condition value of the hidden output matrix of the network is considered in the optimization process. Experimental results on various regression problems verify that the proposed algorithm achieves better generalization performance with fewer hidden nodes than other constructive ELMs.
    Keywords: Extreme Learning Machine ; Particle Swarm Optimization ; Network Structure ; Generalization Performance ; Condition Value ; Computer Science
    ISSN: 0925-2312
    E-ISSN: 1872-8286
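The constructive loop described above can be sketched in its simplest form: grow the hidden layer one node at a time, refit the output weights, and stop once a training-error target is met, while tracking the conditioning of the hidden output matrix. This sketch uses purely random nodes and a full refit at each step; the paper additionally PSO-optimizes each added node and uses the condition value in the optimization, and EM-ELM proper updates the pseudoinverse incrementally rather than recomputing it. The error target and node cap below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

def grow_elm(X, y, target_rmse=0.1, max_nodes=60):
    # Add random tanh hidden nodes one at a time, refitting the output
    # weights with the pseudoinverse, until the training RMSE target is
    # met; also report the conditioning of the hidden output matrix H.
    n_in = X.shape[1]
    W, b = np.empty((n_in, 0)), np.empty(0)
    for n in range(1, max_nodes + 1):
        W = np.hstack([W, rng.normal(size=(n_in, 1))])
        b = np.append(b, rng.normal())
        H = np.tanh(X @ W + b)
        beta = np.linalg.pinv(H) @ y
        rmse = float(np.sqrt(np.mean((H @ beta - y) ** 2)))
        if rmse <= target_rmse:
            break
    return n, rmse, float(np.linalg.cond(H))

# Toy regression task: fit y = sin(x) on [-3, 3].
X = rng.uniform(-3, 3, (100, 1))
y = np.sin(X[:, 0])
n, rmse, cond = grow_elm(X, y)
print(f"{n} hidden nodes, train RMSE {rmse:.3f}, cond(H) {cond:.1f}")
```

Watching cond(H) as nodes are added makes the abstract's trade-off concrete: accuracy keeps improving with more nodes, but nearly collinear random features can inflate the condition value.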