  • 1
    Online Resource
    [Place of publication not identifiable] : Frontiers Media SA
    UID: b3kat_BV044407646
    Format: 1 online resource (244 pages), illustrations, diagrams
    ISBN: 9782889451142
    Note: Published in: Frontiers in psychology
    Language: English
    Subjects: Psychology
    Keywords: Face ; Visual perception ; Collection of articles
    URL: Full text (freely available)
    URL: Full text (Description of rights in Directory of Open Access Books (DOAB): Attribution (CC BY))
  • 2
    UID: edochu_18452_22894
    Format: 1 online resource (11 pages)
    Content: At the group level, women consistently perform better in face memory tasks than men and also show earlier and larger N170 components of event-related brain potentials (ERP), considered to indicate perceptual structural encoding of faces. Here we investigated sex differences in the relationship between the N170 and face memory performance in 152 men and 141 women, at the level of group means and of individual differences. ERPs and performance were measured in separate tasks, avoiding statistical dependency between the two. We confirmed previous findings of superior face memory in women and of a sex-independent negative relationship between N170 latency and face memory. However, whereas in men better face memory was related to larger N170 components, face memory in women was unrelated to the amplitude or latency of the N170. These data provide solid evidence that individual differences in face memory within men are at least partially related to more intense structural face encoding.
    Content: Peer Reviewed
    Note: This article was supported by the German Research Foundation (DFG) and the Open Access Publication Fund of Humboldt-Universität zu Berlin.
    In: Social cognitive and affective neuroscience, Oxford : Oxford Univ. Press, 15 (2020), 5, pages 587-597
    Language: English
    URL: Full text (freely available)
  • 3
    UID: b3kat_BV046703848
    Format: 1 online resource
    Language: English
    URL: Full text (freely available)
    Author information: Sommer, Werner
  • 4
    UID: edochu_18452_24002
    Format: 1 online resource (22 pages)
    Content: The study of socio-cognitive abilities emerged from intelligence research, and their specificity remains controversial to this day. In recent years, the psychometric structure of face cognition (FC), a basic facet of socio-cognitive abilities, has been studied extensively. In this review, we summarize and discuss the divergent psychometric structures of FC in easy and difficult tasks. While accuracy in difficult tasks was consistently shown to be face-specific, the evidence for easy tasks was inconsistent. The structure of response speed in easy tasks was mostly, but not always, unitary across object categories, including faces. Here, we compare studies to identify characteristics leading to face specificity in easy tasks. The following pattern emerges: in easy tasks, face specificity is found when modeling speed in a single task; however, when modeling speed across multiple, different easy tasks, only a unitary factor structure is reported. In difficult tasks, by contrast, face specificity occurs in both single-task approaches and task batteries. This suggests different cognitive mechanisms behind face specificity in easy and difficult tasks. In easy tasks, face specificity relies on isolated cognitive sub-processes such as face identity recognition. In difficult tasks, face-specific and task-independent cognitive processes are employed. We propose a descriptive model and argue for FC to be integrated into common taxonomies of intelligence.
    Content: Peer Reviewed
    In: Journal of Intelligence : open access journal, Basel : MDPI, 9 (2021), 2
    Language: English
    URL: Full text (freely available)
  • 5
    UID: b3kat_BV036757858
    Format: 1 online resource (38 pages)
    Note: Berlin, Humboldt-Univ., dissertation, 2010
    Language: English
    Subjects: Psychology
    Keywords: Face ; Recognition ; Structural model ; University thesis
    URL: Full text (freely available)
  • 6
    UID: edochu_18452_25190
    Format: 1 online resource (17 pages)
    Content: According to the shared signal hypothesis (SSH), the impact of facial expressions on emotion processing partially depends on whether the gaze is directed toward or away from the observer. In autism spectrum disorder (ASD), several aspects of face processing have been found to be atypical, including attention to eye gaze and the identification of emotional expressions. However, there is little research on how gaze direction affects emotional expression processing in typically developing (TD) individuals and in those with ASD. This question is investigated here in two multimodal experiments. Experiment 1 required processing eye gaze direction while faces differed in emotional expression. Forty-seven children (aged 9–12 years) participated. Their Autism Diagnostic Observation Schedule (ADOS) scores ranged from 0 to 6 in the experiment. Event-related potentials (ERPs) were sensitive to gaze direction and emotion, but emotion processing did not depend on gaze direction. However, for angry faces the gaze direction effect on the N170 amplitude, as typically observed in TD individuals, diminished with increasing ADOS score. For neutral expressions this correlation was not significant. Experiment 2 required explicit emotion classifications in a facial emotion composite task while eye gaze was manipulated incidentally. A group of 22 children with ASD was compared to a propensity score-matched group of TD children (mean age = 13 years). The same comparison was carried out for a subgroup of nine children with ASD who were less trained in social cognition, according to clinicians' reports. The ASD group performed worse overall in emotion recognition than the TD group, independently of emotion or gaze direction. However, for disgust expressions, eye tracking data revealed that TD children fixated relatively longer on the eyes of the stimulus face with a direct gaze as compared with averted gaze. In children with ASD, we observed no such modulation of fixation behavior as a function of gaze direction. Overall, the present findings from ERPs and eye tracking confirm the hypothesis of an impaired sensitivity to gaze direction in children with ASD or elevated autistic traits, at least for specific emotions. Therefore, we conclude that multimodal investigations of the interaction between emotional processing and stimulus gaze direction are promising for understanding the characteristics of individuals differing along the autism trait dimension.
    Content: Peer Reviewed
    In: Lausanne : Frontiers Research Foundation, 16
    Language: English
    URL: Full text (freely available)
  • 7
    UID: edochu_18452_21909
    Format: 1 online resource (23 pages)
    Content: Despite the importance of perceiving and recognizing facial expressions in everyday life, there is no comprehensive test battery for the multivariate assessment of these abilities. As a first step toward such a compilation, we present 16 tasks that measure the perception and recognition of facial emotion expressions, and data illustrating each task's difficulty and reliability. The scoring of these tasks focuses on either the speed or accuracy of performance. A sample of 269 healthy young adults completed all tasks. In general, accuracy and reaction time measures for emotion-general scores showed acceptable and high estimates of internal consistency and factor reliability. Emotion-specific scores yielded lower reliabilities, yet high enough to encourage further studies with such measures. Analyses of task difficulty revealed that all tasks are suitable for measuring emotion perception and emotion recognition related abilities in normal populations.
    Content: Peer Reviewed
    In: Lausanne : Frontiers Media S.A., 5
    Language: English
    URL: Full text (freely available)
  • 8
    Online Resource
    Berlin : Humboldt-Universität zu Berlin
    UID: edochu_18452_21895
    Format: 1 online resource (22 pages)
    Content: A latent variable study examined whether different classes of working-memory tasks measure the same general construct of working-memory capacity (WMC). Data from 270 subjects were used to examine the relationship between Binding, Updating, Recall-N-back, and Complex Span tasks, and the relations of WMC with secondary memory measures, indicators of cognitive control from two response-conflict paradigms (Simon task and Eriksen flanker task), and fluid intelligence. Confirmatory factor analyses support the concept of a general WMC factor. Results from structural-equation modeling show negligible relations of WMC with response-conflict resolution, and very strong relations of WMC with secondary memory and fluid intelligence. The findings support the hypothesis that individual differences in WMC reflect the ability to build, maintain and update arbitrary bindings.
    Content: Peer Reviewed
    In: Lausanne : Frontiers Media S.A., 4
    Language: English
    URL: Full text (freely available)
  • 9
    UID: edochu_18452_25540
    Format: 1 online resource (17 pages)
    Content: Recent empirical evidence reveals that creative idea generation builds upon an interplay of multiple neural networks. Measures of temporal complexity yield important information about the underlying mechanisms of these co-activated neural networks. A few neurophysiological studies have investigated brain signal complexity (BSC) during the production of creative verbal associations and resting states, aiming to relate it to creative task performance. However, it is unknown whether the complexity of brain signals can distinguish between productions of typical and original verbal associations. In the present study, we investigated verbal creativity with multiscale entropy (MSE) of electroencephalography (EEG) signals, which quantifies complexity over multiple timescales, capturing unique dynamic features of neural networks. MSE was measured during verbal divergent thinking (DT) states with an emphasis on producing either typical or original verbal associations. We hypothesized that MSE differentiates between brain states characterizing the production of typical and original associations and is a sensitive neural marker of individual differences in producing original associations. Results from a sample of N = 92 young adults revealed slightly higher average MSE for original as compared with typical association production in small and medium timescales at frontal electrodes, and slightly higher average MSE for typical association production in higher timescales at parietal electrodes. However, measurement models failed to uncover specificity of individual differences, as MSE in the typical and original association conditions was perfectly correlated. Hence, individuals with higher MSE in the original association condition also exhibited higher MSE during the production of typical associations. The difference between typical and original association MSE was not significantly associated with human-rated originality of the verbal associations. In sum, we conclude that MSE is a potential marker of creative verbal association states, but replications and extensions are needed, especially with respect to brain-behavior relationships.
    Content: Peer Reviewed
    In: Lausanne : Frontiers, 14
    Language: English
    URL: Full text (freely available)
  • 10
    Online Resource
    Berlin : Humboldt-Universität zu Berlin
    UID: edochu_18452_29246
    Format: 1 online resource (14 pages)
    ISSN: 0146-1672
    Content: People remember what they deem important. In line with research suggesting that lower-class (vs. higher-class) individuals spontaneously appraise other people as more relevant, we show that social class is associated with the habitual use of face memory. We find that lower-class (vs. higher-class) participants exhibit better incidental memory for faces (i.e., spontaneous memory for faces they had not been instructed to memorize; Studies 1 and 2). No social-class differences emerge for faces participants are instructed to learn (Study 2), suggesting that this pattern reflects class-based relevance appraisals rather than memory ability. Study 3 extends our findings to eyewitness identification. Lower-class (vs. higher-class) participants’ eyewitness accuracy is less impacted by the explicit relevance of a target (clearly relevant thief vs. incidental bystander). Integrative data analysis shows a robust negative association between social class and spontaneous face memory. Preregistration (Studies 1 and 3) and cross-cultural replication (Study 2) further strengthen the results.
    Content: Peer Reviewed
    In: Thousand Oaks, Calif. : Sage Publ., 50, 2, pages 285-298, ISSN 0146-1672
    Language: English
    URL: Full text (freely available)