  • 1
    In: Journal of Neurology, Springer Science and Business Media LLC, Vol. 270, No. 8 (2023-08), p. 4049-4059
    Abstract: Atrial fibrillation (AF) detection and treatment are key elements to reduce recurrence risk in cryptogenic stroke (CS) with underlying arrhythmia. The purpose of the present study was to assess the predictors of AF in CS and the utility of existing AF-predicting scores in the Nordic Atrial Fibrillation and Stroke (NOR-FIB) Study. Methods: The NOR-FIB study was an international prospective observational multicenter study designed to detect and quantify AF in CS and cryptogenic transient ischaemic attack (TIA) patients monitored by an insertable cardiac monitor (ICM), and to identify AF-predicting biomarkers. The utility of the following AF-predicting scores was tested: AS5F, Brown ESUS-AF, CHA₂DS₂-VASc, CHASE-LESS, HATCH, HAVOC, STAF and SURF. Results: In univariate analyses, increasing age, hypertension, left ventricular hypertrophy, dyslipidaemia, antiarrhythmic drug use, valvular heart disease, and neuroimaging findings of stroke due to intracranial vessel occlusion and previous ischaemic lesions were associated with a higher likelihood of detected AF. In multivariate analysis, age was the only independent predictor of AF. All the AF-predicting scores showed significantly higher score levels in AF than in non-AF patients. The STAF and SURF scores provided the highest sensitivity and negative predictive values, while the AS5F and SURF scores reached an area under the receiver operating characteristic curve (AUC) > 0.7. Conclusion: Clinical risk scores may guide a personalized evaluation approach in CS patients. Greater awareness of the available AF-predicting scores may optimize the arrhythmia detection pathway in stroke units.
    Type of Medium: Online Resource
    ISSN: 0340-5354 , 1432-1459
    Language: English
    Publisher: Springer Science and Business Media LLC
    Publication Date: 2023
    ZDB-ID: 1421299-7
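The abstract above evaluates AF-predicting scores by their sensitivity, negative predictive value, and area under the ROC curve (AUC). Below is a minimal sketch of how these three metrics can be computed for any clinical score against an AF outcome label, using only numpy; the score values, labels, and threshold are purely illustrative and are not data from the NOR-FIB study.

```python
import numpy as np

def rank_auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) formulation."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    # Count score pairs where the AF patient outranks the non-AF patient;
    # ties contribute one half.
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

def sensitivity_npv(scores, labels, threshold):
    """Sensitivity and negative predictive value for 'score >= threshold' = positive."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    predicted = scores >= threshold
    tp = np.sum(predicted & labels)
    fn = np.sum(~predicted & labels)
    tn = np.sum(~predicted & ~labels)
    return tp / (tp + fn), tn / (tn + fn)

# Illustrative data: hypothetical score values and AF outcomes, not study data.
scores = np.array([4, 2, 5, 1, 3, 6, 2, 4])
af = np.array([1, 0, 1, 0, 0, 1, 0, 1], dtype=bool)
print(rank_auc(scores, af), sensitivity_npv(scores, af, threshold=3))
```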
  • 2
    In: European Stroke Journal, SAGE Publications, Vol. 8, No. 1 (2023-03), p. 148-156
    Abstract: Secondary stroke prevention depends on proper identification of the underlying etiology and initiation of optimal treatment after the index event. The aim of the NOR-FIB study was to detect and quantify underlying atrial fibrillation (AF) in patients with cryptogenic stroke (CS) or transient ischaemic attack (TIA) using an insertable cardiac monitor (ICM), to optimise secondary prevention, and to test the feasibility of ICM usage for stroke physicians. Patients and methods: Prospective observational international multicenter real-life study of CS and TIA patients monitored for 12 months with ICM (Reveal LINQ) for AF detection. Results: ICM insertion was performed in 91.5% of cases by stroke physicians, within a median of 9 days after the index event. Paroxysmal AF was diagnosed in 74 of 259 patients (28.6%), detected early after ICM insertion (mean 48 ± 52 days) in 86.5% of patients. AF patients were older (72.6 vs 62.2 years; p < 0.001), had a higher pre-stroke CHA₂DS₂-VASc score (median 3 vs 2; p < 0.001) and admission NIHSS (median 2 vs 1; p = 0.001), and more often had hypertension (p = 0.045) and dyslipidaemia (p = 0.005) than non-AF patients. The arrhythmia was recurrent in 91.9% and asymptomatic in 93.2%. At 12-month follow-up, anticoagulant use was 97.3%. Discussion and conclusions: ICM was an effective tool for diagnosing underlying AF, capturing AF in 29% of the CS and TIA patients. AF was asymptomatic in most cases and would mainly have gone undiagnosed without ICM. The insertion and use of ICM were feasible for stroke physicians in stroke units.
    Type of Medium: Online Resource
    ISSN: 2396-9873 , 2396-9881
    Language: English
    Publisher: SAGE Publications
    Publication Date: 2023
    ZDB-ID: 2851287-X
  • 3
    In: The Lancet Neurology, Elsevier BV, Vol. 22, No. 2 (2023-02), p. 117-126
    Type of Medium: Online Resource
    ISSN: 1474-4422
    Language: English
    Publisher: Elsevier BV
    Publication Date: 2023
  • 4
    In: Astronomy & Astrophysics, EDP Sciences, Vol. 675 (2023-07), p. A4
    Abstract: End-to-end simulations play a key role in the analysis of any high-sensitivity cosmic microwave background (CMB) experiment, providing high-fidelity systematic error propagation capabilities that are unmatched by any other means. In this paper, we address an important issue regarding such simulations, namely, how to define the inputs in terms of sky model and instrument parameters. These may either be taken as a constrained realization derived from the data or as a random realization independent of the data. We refer to these as posterior and prior simulations, respectively. We show that the two options lead to significantly different correlation structures, as prior simulations (contrary to posterior simulations) effectively include cosmic variance, but they exclude realization-specific correlations from non-linear degeneracies. Consequently, they quantify fundamentally different types of uncertainties. We argue that as a result, they also have different and complementary scientific uses, even if this dichotomy is not absolute. In particular, posterior simulations are in general more convenient for parameter estimation studies, while prior simulations are generally more convenient for model testing. Before BeyondPlanck, most pipelines used a mix of constrained and random inputs and applied the same hybrid simulations for all applications, even though the statistical justification for this is not always evident. BeyondPlanck represents the first end-to-end CMB simulation framework that is able to generate both types of simulations, and these new capabilities have brought this topic to the forefront. The BeyondPlanck posterior simulations and their uses are described extensively in a suite of companion papers. In this work, we consider one important application of the corresponding prior simulations, namely, code validation. Specifically, we generated a set of one-year LFI 30 GHz prior simulations with known inputs and used these to validate the core low-level BeyondPlanck algorithms dealing with gain estimation, correlated noise estimation, and mapmaking.
    Type of Medium: Online Resource
    ISSN: 0004-6361 , 1432-0746
    Language: English
    Publisher: EDP Sciences
    Publication Date: 2023
    ZDB-ID: 1458466-9
    SSG: 16,12
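The distinction drawn in the abstract above between prior (random) and posterior (constrained) simulations can be illustrated with a one-dimensional Gaussian toy model: a posterior draw is conditioned on an observed data realization, while a prior draw is not. This is only a schematic sketch under an assumed diagonal signal-plus-noise model, not the BeyondPlanck pipeline; all symbols (S, N, d) are generic.

```python
import numpy as np

rng = np.random.default_rng(0)
npix = 1000
S, N = 4.0, 1.0           # signal and noise variances per pixel (toy values)

# One "observed" data realization: d = s_true + noise.
s_true = rng.normal(0.0, np.sqrt(S), npix)
d = s_true + rng.normal(0.0, np.sqrt(N), npix)

# Prior simulation: a random signal realization, independent of the data.
s_prior = rng.normal(0.0, np.sqrt(S), npix)

# Posterior (constrained) simulation: Wiener-filter mean plus a fluctuation
# drawn from the conditional variance, so it scatters around the data.
post_mean = (S / (S + N)) * d
post_var = 1.0 / (1.0 / S + 1.0 / N)
s_post = post_mean + rng.normal(0.0, np.sqrt(post_var), npix)

# The posterior draw is correlated with the data; the prior draw is not.
print(np.corrcoef(s_prior, d)[0, 1], np.corrcoef(s_post, d)[0, 1])
```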
  • 5
    In: Astronomy & Astrophysics, EDP Sciences, Vol. 675 (2023-07), p. A12
    Abstract: We present cosmological parameter constraints estimated using the Bayesian BeyondPlanck analysis framework. This method supports seamless end-to-end error propagation from raw time-ordered data onto final cosmological parameters. As a first demonstration of the method, we analyzed time-ordered Planck LFI observations, combined with selected external data (WMAP 33–61 GHz, Planck HFI DR4 353 and 857 GHz, and Haslam 408 MHz) in the form of pixelized maps that are used to break critical astrophysical degeneracies. Overall, all the results are generally in good agreement with previously reported values from Planck 2018 and WMAP, with the largest relative difference for any parameter amounting to about 1σ when considering only temperature multipoles in the range 30 ≤ ℓ ≤ 600. In cases where there are differences, we note that the BeyondPlanck results are generally slightly closer to the high-ℓ HFI-dominated Planck 2018 results than previous analyses, suggesting slightly less tension between low and high multipoles. Using low-ℓ polarization information from LFI and WMAP, we find a best-fit value of τ = 0.066 ± 0.013, which is higher than the low value of τ = 0.052 ± 0.008 derived from Planck 2018 and slightly lower than the value of 0.069 ± 0.011 derived from the joint analysis of official LFI and WMAP products. Most importantly, however, we find that the uncertainty derived in the BeyondPlanck processing is about 30% greater than when analyzing the official products, after taking into account the different sky coverage. We argue that this larger uncertainty is due to marginalization over a more complete model of instrumental and astrophysical parameters, which results in more reliable and more rigorously defined uncertainties. We find that about 2000 Monte Carlo samples are required to achieve robust convergence for a low-resolution cosmic microwave background (CMB) covariance matrix with 225 independent modes, and producing these samples takes about eight weeks on a modest computing cluster with 256 cores.
    Type of Medium: Online Resource
    ISSN: 0004-6361 , 1432-0746
    Language: English
    Publisher: EDP Sciences
    Publication Date: 2023
    ZDB-ID: 1458466-9
    SSG: 16,12
  • 6
    In: Astronomy & Astrophysics, EDP Sciences, Vol. 675 (2023-07), p. A10
    Abstract: We present Planck Low Frequency Instrument (LFI) frequency sky maps derived within the BeyondPlanck framework. This framework draws samples from a global posterior distribution that includes instrumental, astrophysical, and cosmological parameters, and the main product is an entire ensemble of frequency sky map samples, each of which corresponds to one possible realization of the various modeled instrumental systematic corrections, including correlated noise and time-variable gain, as well as far-sidelobe and bandpass corrections. This ensemble allows for computationally convenient end-to-end propagation of low-level instrumental uncertainties into higher-level science products, including astrophysical component maps, angular power spectra, and cosmological parameters. We show that the two dominant sources of LFI instrumental systematic uncertainty are correlated noise and gain fluctuations, and the products presented here support, for the first time, full Bayesian error propagation for these effects at full angular resolution. We compared our posterior mean maps with the traditional frequency maps delivered by the Planck Collaboration and find generally good agreement. The most important quality improvement is due to significantly lower calibration uncertainties in the new processing, as we find a fractional absolute calibration uncertainty at 70 GHz of Δg_0/g_0 = 5 × 10⁻⁵, which is nominally 40 times smaller than that reported by Planck 2018. However, we also note that the original Planck 2018 estimate has a nontrivial statistical interpretation, and this further illustrates the advantage of the new framework in terms of producing self-consistent and well-defined error estimates of all involved quantities without the need for ad hoc uncertainty contributions. We describe how low-resolution data products, including dense pixel-pixel covariance matrices, may be produced directly from the posterior samples, without the need for computationally expensive analytic calculations or simulations. We conclude that posterior-based frequency map sampling provides unique capabilities in terms of low-level systematics modeling and error propagation, and may play an important role for future cosmic microwave background (CMB) B-mode experiments aiming at nanokelvin precision.
    Type of Medium: Online Resource
    ISSN: 0004-6361 , 1432-0746
    Language: English
    Publisher: EDP Sciences
    Publication Date: 2023
    ZDB-ID: 1458466-9
    SSG: 16,12
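The abstract above notes that dense pixel-pixel covariance matrices can be produced directly from the ensemble of frequency map samples. Below is a minimal sketch of that estimator for samples stored as a (n_samples, n_pixels) array; the ensemble here is synthetic and serves only to illustrate the computation.

```python
import numpy as np

def pixel_covariance(map_samples):
    """Empirical pixel-pixel covariance from an ensemble of map samples.

    map_samples: array of shape (n_samples, n_pixels), one sky map per row.
    Returns an (n_pixels, n_pixels) covariance matrix.
    """
    samples = np.asarray(map_samples, dtype=float)
    mean_map = samples.mean(axis=0)
    residuals = samples - mean_map
    # Unbiased sample covariance over the ensemble.
    return residuals.T @ residuals / (samples.shape[0] - 1)

# Synthetic ensemble: 500 samples of a 64-pixel low-resolution map.
rng = np.random.default_rng(1)
samples = rng.normal(size=(500, 64))
cov = pixel_covariance(samples)
print(cov.shape, np.allclose(cov, cov.T))
```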
  • 7
    In: Astronomy & Astrophysics, EDP Sciences, Vol. 675 (2023-07), p. A5
    Abstract: We describe the correction procedure for Analog-to-Digital Converter (ADC) differential non-linearities (DNL) adopted in the Bayesian end-to-end BeyondPlanck analysis framework. This method is nearly identical to that developed for the official Planck Low Frequency Instrument (LFI) Data Processing Center (DPC) analysis, and relies on the binned rms noise profile of each detector data stream. However, rather than building the correction profile directly from the raw rms profile, we first fit a Gaussian to each significant ADC-induced rms decrement, and then derive the corresponding correction from this smooth model. The main advantage of this approach is that only samples which are significantly affected by ADC DNLs are corrected, as opposed to the DPC approach, in which the correction is applied to all samples, filtering out signals not associated with ADC DNLs. The new corrections are only applied to data for which there is a clear detection of the non-linearities, and for which they perform at least comparably to the DPC corrections. Out of a total of 88 LFI data streams (sky and reference load for each of the 44 detectors), we apply the new minimal ADC corrections in 25 cases and maintain the DPC corrections in 8 cases. All these corrections are applied to 44 or 70 GHz channels, while, as in previous analyses, none of the 30 GHz ADCs show significant evidence of non-linearity. By comparing the BeyondPlanck and DPC ADC correction methods, we estimate that the residual ADC uncertainty is about two orders of magnitude below the total noise of both the 44 and 70 GHz channels, and that its impact on current cosmological parameter estimation is small. However, we also show that non-idealities in the ADC corrections can generate sharp stripes in the final frequency maps, and these could be important for future joint analyses with the Planck High Frequency Instrument (HFI), Wilkinson Microwave Anisotropy Probe (WMAP), or other datasets. We therefore conclude that, although the existing corrections are adequate for LFI-based cosmological parameter analysis, further work on LFI ADC corrections is still warranted.
    Type of Medium: Online Resource
    ISSN: 0004-6361 , 1432-0746
    Language: English
    Publisher: EDP Sciences
    Publication Date: 2023
    ZDB-ID: 1458466-9
    SSG: 16,12
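The abstract above describes fitting a Gaussian to each significant ADC-induced decrement in the binned rms noise profile and deriving the correction from the smooth fit. The sketch below fits one such dip with scipy.optimize.curve_fit on a synthetic rms-versus-voltage profile; the profile shape and dip parameters are invented for illustration and are not LFI data.

```python
import numpy as np
from scipy.optimize import curve_fit

def rms_dip(v, baseline, amplitude, center, width):
    """Binned rms noise profile modeled as a flat baseline minus a Gaussian dip."""
    return baseline - amplitude * np.exp(-0.5 * ((v - center) / width) ** 2)

# Synthetic rms-vs-voltage profile with one ADC-induced decrement.
rng = np.random.default_rng(2)
volts = np.linspace(0.0, 1.0, 200)
true = rms_dip(volts, baseline=1.0, amplitude=0.15, center=0.4, width=0.02)
rms = true + rng.normal(0.0, 0.005, volts.size)

# Fit the smooth model; a correction would then be derived from this fit only
# where the dip is significant, leaving all other samples untouched.
popt, _ = curve_fit(rms_dip, volts, rms, p0=[1.0, 0.1, 0.4, 0.05])
print(dict(zip(["baseline", "amplitude", "center", "width"], popt)))
```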
  • 8
    In: Astronomy & Astrophysics, EDP Sciences, Vol. 675 (2023-07), p. A9
    Abstract: We discuss the treatment of bandpass and beam leakage corrections in the Bayesian BeyondPlanck cosmic microwave background (CMB) analysis pipeline as applied to the Planck LFI measurements. As a preparatory step, we first applied three corrections to the nominal LFI bandpass profiles, including the removal of a known systematic effect in the ground measuring equipment at 61 GHz, along with a smoothing of standing-wave ripples and edge regularization. The main net impact of these modifications is an overall shift of the 70 GHz bandpass by +0.6 GHz. We argue that any analysis of LFI data products, whether from Planck or BeyondPlanck, should use these new bandpasses. In addition, we fit a single free bandpass parameter for each radiometer of the form Δ_i = Δ_0 + δ_i, where Δ_0 represents an absolute frequency shift per frequency band and δ_i is a relative shift per detector. The absolute correction is only fitted at 30 GHz, with a full χ²-based likelihood, resulting in a correction of Δ_30 = 0.24 ± 0.03 GHz. The relative corrections were fitted using a spurious-map approach that is fundamentally similar to the method pioneered by the WMAP team, but without introducing many additional degrees of freedom. All the bandpass parameters were sampled using a standard Metropolis sampler within the main BeyondPlanck Gibbs chain, and the bandpass uncertainties were thus propagated to all other data products in the analysis. In summary, we find that our bandpass model significantly reduces leakage effects. For beam leakage corrections, we adopted the official Planck LFI beam estimates without any additional degrees of freedom and only marginalized over the underlying sky model. We note that this is the first time that leakage from beam mismatch has been included for Planck LFI maps.
    Type of Medium: Online Resource
    ISSN: 0004-6361 , 1432-0746
    Language: English
    Publisher: EDP Sciences
    Publication Date: 2023
    ZDB-ID: 1458466-9
    SSG: 16,12
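The abstract above samples the bandpass shift parameters Δ_i = Δ_0 + δ_i with a standard Metropolis sampler inside the Gibbs chain. The sketch below shows a random-walk Metropolis update for a single shift under a stand-in χ² function; the toy χ² is chosen only so that the chain recovers a shift of roughly 0.24 ± 0.03 GHz, mirroring the quoted value, and is not the actual BeyondPlanck likelihood.

```python
import numpy as np

rng = np.random.default_rng(3)

def chi2(shift):
    """Stand-in χ² as a function of the bandpass shift (GHz).
    A real pipeline would re-evaluate the sky model through the shifted
    bandpass and compare it to the data."""
    return ((shift - 0.24) / 0.03) ** 2   # toy: minimum at 0.24 GHz, width 0.03 GHz

def metropolis(chi2_fn, start=0.0, step=0.02, nsteps=5000):
    """Random-walk Metropolis sampler for a single shift parameter."""
    shift, chain = start, []
    current = chi2_fn(shift)
    for _ in range(nsteps):
        proposal = shift + rng.normal(0.0, step)
        proposed = chi2_fn(proposal)
        # Accept with probability exp(-Δχ²/2), i.e. the likelihood ratio.
        if np.log(rng.random()) < -(proposed - current) / 2.0:
            shift, current = proposal, proposed
        chain.append(shift)
    return np.array(chain)

chain = metropolis(chi2)
print(chain[1000:].mean(), chain[1000:].std())   # should recover ~0.24 ± 0.03
```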
  • 9
    In: Astronomy & Astrophysics, EDP Sciences, Vol. 675 (2023-07), p. A2
    Abstract: We present a Gibbs sampling solution to the mapmaking problem for cosmic microwave background (CMB) measurements that builds on existing destriping methodology. Gibbs sampling breaks the computationally heavy destriping problem into two separate steps: noise filtering and map binning. Considered as two separate steps, both are computationally much cheaper than solving the combined problem. This provides a huge performance benefit compared to traditional methods, and it allows us, for the first time, to bring the destriping baseline length down to a single sample. Here, we applied the Gibbs procedure to simulated Planck 30 GHz data. We find that gaps in the time-ordered data are handled efficiently by filling them in with simulated noise as part of the Gibbs process. The Gibbs procedure yields a chain of map samples, from which we are able to compute the posterior mean as a best-estimate map. The variation in the chain provides information on the correlated residual noise, without the need to construct a full noise covariance matrix. However, if only a single maximum-likelihood frequency map estimate is required, we find that traditional conjugate gradient solvers converge much faster than a Gibbs sampler in terms of the total number of iterations. The conceptual advantages of the Gibbs sampling approach lie in statistically well-defined error propagation and systematic error correction. This methodology thus forms the conceptual basis for the mapmaking algorithm employed in the BeyondPlanck framework, which implements the first end-to-end Bayesian analysis pipeline for CMB observations.
    Type of Medium: Online Resource
    ISSN: 0004-6361 , 1432-0746
    Language: English
    Publisher: EDP Sciences
    Publication Date: 2023
    ZDB-ID: 1458466-9
    SSG: 16,12
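The abstract above splits destriping into two Gibbs steps: drawing the correlated-noise baselines given the current map ("noise filtering") and drawing the map given the current baselines ("map binning"). The sketch below alternates these two conditional draws for a heavily simplified model with one constant offset per scan and white noise within a scan; it is a toy illustration of the alternation, not the algorithm applied to the Planck 30 GHz data.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy time-ordered data: d_t = m[pix_t] + offset[scan_t] + white noise.
npix, nscan, per_scan, sigma = 32, 20, 256, 0.1
ntod = nscan * per_scan
true_map = rng.normal(0.0, 1.0, npix)
true_off = rng.normal(0.0, 0.5, nscan)
pix = rng.integers(0, npix, ntod)
scan = np.repeat(np.arange(nscan), per_scan)
tod = true_map[pix] + true_off[scan] + rng.normal(0.0, sigma, ntod)

m = np.zeros(npix)
off = np.zeros(nscan)
for it in range(200):
    # "Noise filtering" step: draw one offset per scan given the current map.
    res = tod - m[pix]
    for s in range(nscan):
        sel = scan == s
        off[s] = rng.normal(res[sel].mean(), sigma / np.sqrt(sel.sum()))
    off -= off.mean()            # fix the overall-offset degeneracy
    # "Map binning" step: draw the map given the current offsets.
    res = tod - off[scan]
    for p in range(npix):
        sel = pix == p
        m[p] = rng.normal(res[sel].mean(), sigma / np.sqrt(sel.sum()))

diff = m - true_map
print(diff.std())   # small scatter around the truth, up to a common offset
```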
  • 10
    In: Astronomy & Astrophysics, EDP Sciences, Vol. 675 (2023-07), p. A14
    Abstract: Using the Planck Low Frequency Instrument (LFI) and WMAP data within the global Bayesian BeyondPlanck framework, we constrained the polarized foreground emission between 30 and 70 GHz. We combined, for the first time, full-resolution Planck LFI time-ordered data with low-resolution WMAP sky maps at 33, 40, and 61 GHz. The spectral parameters were fit with a likelihood defined at the native resolution of each frequency channel. This analysis represents the first implementation of true multi-resolution component separation applied to CMB observations for both amplitude and spectral energy distribution (SED) parameters. For the synchrotron emission, we approximated the SED as a power law in frequency, and we find that the low signal-to-noise ratio of the current data strongly limits the number of free parameters that can be robustly constrained. We partitioned the sky into four large disjoint regions (High Latitude, Galactic Spur, Galactic Plane, and Galactic Center), each associated with its own power-law index. We find that the High Latitude region is prior-dominated, while the Galactic Center region is contaminated by residual instrumental systematics. The two remaining regions appear to be signal-dominated, and for these we derive spectral indices of β_s = −3.17 ± 0.06 (Galactic Spur) and β_s = −3.03 ± 0.07 (Galactic Plane), in good agreement with previous results. For the thermal dust emission, we assumed a modified blackbody model and fit a single power-law index across the full sky. We find β_d = 1.64 ± 0.03, which is slightly steeper than the value derived from Planck HFI data, but still statistically consistent at the 2σ confidence level.
    Type of Medium: Online Resource
    ISSN: 0004-6361 , 1432-0746
    Language: English
    Publisher: EDP Sciences
    Publication Date: 2023
    ZDB-ID: 1458466-9
    SSG: 16,12
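The abstract above models synchrotron emission as a power law in frequency and thermal dust as a modified blackbody. The sketch below evaluates both SEDs in brightness (Rayleigh-Jeans) units using standard parametric forms; the reference frequencies, dust temperature, and amplitudes are assumed example values, not fitted BeyondPlanck parameters.

```python
import numpy as np

H_OVER_K = 4.799243e-11   # Planck constant over Boltzmann constant, h/k [K s]

def synchrotron_sed(nu_ghz, amp, beta_s, nu0_ghz=30.0):
    """Power-law synchrotron SED in brightness (Rayleigh-Jeans) units."""
    return amp * (nu_ghz / nu0_ghz) ** beta_s

def dust_sed(nu_ghz, amp, beta_d, t_dust=19.6, nu0_ghz=353.0):
    """Modified-blackbody thermal dust SED in brightness units."""
    gamma = H_OVER_K * 1e9 / t_dust          # h*nu/(k*T) per GHz of frequency
    return (amp * (nu_ghz / nu0_ghz) ** (beta_d + 1.0)
            * np.expm1(gamma * nu0_ghz) / np.expm1(gamma * nu_ghz))

# Example: evaluate both SEDs at the LFI frequencies with spectral indices
# similar to those quoted in the abstract (amplitudes are illustrative).
nu = np.array([30.0, 44.0, 70.0])
print(synchrotron_sed(nu, amp=10.0, beta_s=-3.1))
print(dust_sed(nu, amp=100.0, beta_d=1.64))
```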