  • 1
    In: Circulation, Ovid Technologies (Wolters Kluwer Health), Vol. 116, No. 1 (2007-07-03), p. 57-66
    Abstract: Background: Proton pump inhibitors are used extensively for acid-related gastrointestinal diseases. Their effect on cardiac contractility has not been assessed directly. Methods and Results: Under physiological conditions (37°C, pH 7.35, 1.25 mmol/L Ca2+), there was a dose-dependent decrease in contractile force in ventricular trabeculae isolated from end-stage failing human hearts superfused with pantoprazole. The concentration leading to 50% maximal response was 17.3±1.3 μg/mL. Similar observations were made in trabeculae from human atria, normal rabbit ventricles, and isolated rabbit ventricular myocytes. Real-time polymerase chain reaction demonstrated the expression of gastric H+/K+–adenosine triphosphatase in human and rabbit myocardium. However, measurements with BCECF-loaded rabbit trabeculae did not reveal any significant pantoprazole-dependent changes of pHi. Ca2+ transients recorded from field-stimulated fluo 3–loaded myocytes (F/F0) were significantly depressed, by 10.4±2.1% at 40 μg/mL. Intracellular Ca2+ fluxes were assessed in fura 2–loaded, voltage-clamped rabbit ventricular myocytes. Pantoprazole (40 μg/mL) caused an increase in diastolic [Ca2+]i by 33±12%, but peak systolic [Ca2+]i was unchanged, resulting in a decreased Ca2+ transient amplitude by 25±8%. The amplitude of the L-type Ca2+ current (ICa,L) was reduced by 35±5%, and sarcoplasmic reticulum Ca2+ content was reduced by 18±6%. Measurements of oxalate-supported sarcoplasmic reticulum Ca2+ uptake in permeabilized cardiomyocytes indicated that pantoprazole decreased the Ca2+ sensitivity (Kd) of sarcoplasmic reticulum Ca2+ adenosine triphosphatase: control, Kd=358±15 nmol/L; 40 μg/mL pantoprazole, Kd=395±12 nmol/L (P<0.05). Pantoprazole also acted on cardiac myofilaments to reduce Ca2+-activated force. Conclusions: Pantoprazole depresses cardiac contractility in vitro by depression of Ca2+ signaling and myofilament activity. In view of the extensive use of this agent, the effects should be evaluated in vivo.
    Type of Medium: Online Resource
    ISSN: 0009-7322 , 1524-4539
    Language: English
    Publisher: Ovid Technologies (Wolters Kluwer Health)
    Publication Date: 2007
    ZDB-ID: 1466401-X
  • 2
    In: Medical Physics, Wiley, Vol. 43, No. 12 (2016-12), p. 6455-6473
    Abstract: Recent reports indicate that model‐based iterative reconstruction methods may improve image quality in computed tomography (CT). One difficulty with these methods is the number of options available to implement them, including the selection of the forward projection model and the penalty term. Currently, the literature is fairly scarce in terms of guidance regarding this selection step, whereas these options impact image quality. Here, the authors investigate the merits of three forward projection models that rely on linear interpolation: the distance‐driven method, Joseph's method, and the bilinear method. The authors' selection is motivated by three factors: (1) in CT, linear interpolation is often seen as a suitable trade‐off between discretization errors and computational cost, (2) the first two methods are popular with manufacturers, and (3) the third method enables assessing the importance of a key assumption in the other methods. Methods: One approach to evaluate forward projection models is to inspect their effect on discretized images, as well as the effect of their transpose on data sets, but the significance of such studies is unclear since the matrix and its transpose are always jointly used in iterative reconstruction. Another approach is to investigate the models in the context in which they are used, i.e., together with statistical weights and a penalty term. Unfortunately, this approach requires the selection of a preferred objective function and does not provide clear information on features that are intrinsic to the model. The authors adopted the following two‐stage methodology. First, the authors analyze images that progressively include components of the singular value decomposition of the model in a reconstructed image without statistical weights and penalty term. Next, the authors examine the impact of weights and penalty on observed differences. Results: Image quality metrics were investigated for 16 different fan‐beam imaging scenarios that enabled probing various aspects of all models. The metrics include a surrogate for computational cost, as well as bias, noise, and an estimation task, all at matched resolution. The analysis revealed fundamental differences in terms of both bias and noise. Task‐based assessment appears to be required to appreciate the differences in noise; the estimation task the authors selected showed that these differences balance out to yield similar performance. Some scenarios highlighted merits for the distance‐driven method in terms of bias, but with an increase in computational cost. Three combinations of statistical weights and penalty term showed that the observed differences remain the same, but a strong edge‐preserving penalty can dramatically reduce the magnitude of these differences. Conclusions: In many scenarios, Joseph's method seems to offer an interesting compromise between image quality and computational cost. The distance‐driven method offers the possibility to reduce bias, but with an increase in computational cost. The bilinear method indicated that a key assumption in the other two methods is highly robust. Last, a strong edge‐preserving penalty can act as a compensator for insufficiencies in the forward projection model, bringing all models to similar levels in the most challenging imaging scenarios. Also, the authors find that their evaluation methodology helps appreciate how the model, statistical weights, and penalty term interact.
    Type of Medium: Online Resource
    ISSN: 0094-2405 , 2473-4209
    Language: English
    Publisher: Wiley
    Publication Date: 2016
    ZDB-ID: 1466421-5
  • 3
    In: Medical Physics, Wiley, Vol. 46, No. 12 (2019-12)
    Abstract: Model‐based iterative reconstruction is a promising approach to achieve dose reduction without affecting image quality in diagnostic x‐ray computed tomography (CT). In the problem formulation, it is common to enforce non‐negative values to accommodate the physical non‐negativity of x‐ray attenuation. Using this a priori information is believed to be beneficial in terms of image quality and convergence speed. However, enforcing non‐negativity imposes limitations on the problem formulation and the choice of optimization algorithm. For these reasons, it is critical to understand the value of the non‐negativity constraint. In this work, we present an investigation that sheds light on the impact of this constraint. Methods: We primarily focus our investigation on the examination of properties of the converged solution. To avoid any possibly confounding bias, the reconstructions are all performed using a provably converging algorithm started from a zero volume. To keep the computational cost manageable, an axial CT scanning geometry with narrow collimation is employed. The investigation is divided into five experimental studies that challenge the non‐negativity constraint in various ways, including noise, beam hardening, parametric choices, truncation, and photon starvation. These studies are complemented by a sixth one that examines the effect of using ordered subsets to obtain a satisfactory approximate result within 50 iterations. All studies are based on real data, which come from three phantom scans and one clinical patient scan. The reconstructions with and without the non‐negativity constraint are compared in terms of image similarity and convergence speed. In select cases, the image similarity evaluation is augmented with quantitative image quality metrics such as the noise power spectrum and closeness to a known ground truth. Results: For cases with moderate inconsistencies in the data, associated with noise and bone‐induced beam hardening, our results show that the non‐negativity constraint offers little benefit. By varying the regularization parameters in one of the studies, we observed that sufficient edge‐preserving regularization tends to dilute the value of the constraint. For cases with strong data inconsistencies, the results are mixed: the constraint can be both beneficial and deleterious; in either case, however, the difference between using the constraint or not is small relative to the overall level of error in the image. The results with ordered subsets are encouraging in that they show similar observations. In terms of convergence speed, we only observed one major effect, in the study with data truncation; this effect favored the use of the constraint, but had no impact on our ability to obtain the converged solution without constraint. Conclusions: Our results did not highlight the non‐negativity constraint as being strongly beneficial for diagnostic CT imaging. Altogether, we thus conclude that in some imaging scenarios, the non‐negativity constraint could be disregarded to simplify the optimization problem or to adopt other forward projection models that require complex optimization machinery to be used together with non‐negativity.
    Type of Medium: Online Resource
    ISSN: 0094-2405 , 2473-4209
    Language: English
    Publisher: Wiley
    Publication Date: 2019
    ZDB-ID: 1466421-5
  • 4
    In: European Journal of Radiology, Elsevier BV, Vol. 82, No. 2 (2013-2), p. 270-274
    Type of Medium: Online Resource
    ISSN: 0720-048X
    Language: English
    Publisher: Elsevier BV
    Publication Date: 2013
    ZDB-ID: 2005350-2
  • 5
    In: Circulation, Ovid Technologies (Wolters Kluwer Health), Vol. 104, No. 7 (2001-08-14)
    Type of Medium: Online Resource
    ISSN: 0009-7322 , 1524-4539
    Language: English
    Publisher: Ovid Technologies (Wolters Kluwer Health)
    Publication Date: 2001
    ZDB-ID: 1466401-X
  • 6
    In: Physics in Medicine and Biology, IOP Publishing, Vol. 54, No. 15 (2009-08-07), p. 4625-4644
    Type of Medium: Online Resource
    ISSN: 0031-9155 , 1361-6560
    Language: Unknown
    Publisher: IOP Publishing
    Publication Date: 2009
    ZDB-ID: 1473501-5
    SSG: 12
  • 7
    In: Medical Physics, Wiley, Vol. 40, No. 3 (2013-03)
    Abstract: The temporal resolution of a given image in cardiac computed tomography (CT) has so far mostly been determined from the amount of CT data employed for the reconstruction of that image. The purpose of this paper is to examine the applicability of such measures to the newly introduced modality of dual‐source CT as well as to methods aiming to provide improved temporal resolution by means of an advanced image reconstruction algorithm. Methods: To provide a solid base for the examinations described in this paper, an extensive review of temporal resolution in conventional single‐source CT is given first. Two different measures for assessing temporal resolution with respect to the amount of data involved are introduced, namely, either taking the full width at half maximum of the respective data weighting function (FWHM‐TR) or the total width of the weighting function (total TR) as a base of the assessment. Image reconstruction using both a direct fan‐beam filtered backprojection with Parker weighting as well as using a parallel‐beam rebinning step are considered. The theory of assessing temporal resolution by means of the data involved is then extended to dual‐source CT. Finally, three different advanced iterative reconstruction methods that all use the same input data are compared with respect to the resulting motion artifact level. For brevity and simplicity, the examinations are limited to two‐dimensional data acquisition and reconstruction. However, all results and conclusions presented in this paper are also directly applicable to both circular and helical cone‐beam CT. Results: While the concept of total TR can directly be applied to dual‐source CT, the definition of the FWHM of a weighting function needs to be slightly extended to be applicable to this modality. The three different advanced iterative reconstruction methods examined in this paper result in significantly different images with respect to their motion artifact level, despite exactly the same amount of data being used in the reconstruction process. Conclusions: The concept of assessing temporal resolution by means of the data employed for reconstruction can nicely be extended from single‐source to dual‐source CT. However, for advanced (possibly nonlinear iterative) reconstruction algorithms the examined approach fails to deliver accurate results. New methods and measures to assess the temporal resolution of CT images need to be developed to be able to accurately compare the performance of such algorithms.
    Type of Medium: Online Resource
    ISSN: 0094-2405 , 2473-4209
    Language: English
    Publisher: Wiley
    Publication Date: 2013
    ZDB-ID: 1466421-5
  • 8
    In: Medical Physics, Wiley, Vol. 49, No. 8 (2022-08), p. 5014-5037
    Abstract: Various clinical studies show the potential for a wider quantitative role of diagnostic X‐ray computed tomography (CT) beyond size measurements. Currently, the clinical use of attenuation values is, however, limited due to their lack of robustness. This issue can be observed even on the same scanner across patient size and positioning. There are different causes for the lack of robustness in the attenuation values; one possible source of error is beam hardening of the X‐ray source spectrum. The conventional and well‐established approach to address this issue is a calibration‐based single‐material beam hardening correction (BHC) using a water cylinder. Purpose: We investigate an alternative approach for single‐material BHC with the aim of producing a more robust result for the attenuation values. The underlying hypothesis of this investigation is that calibration‐based BHC automatically corrects for scattered radiation in a manner that is suboptimal in terms of bias as soon as the scanned object strongly deviates from the water cylinder used for calibration. Methods: The approach we propose performs BHC via an analytical energy response model that is embedded into a correction pipeline that efficiently estimates and subtracts scattered radiation in a patient‐specific manner prior to BHC. The estimation of scattered radiation is based on minimizing, on average, the squared difference between our corrected data and the vendor‐calibrated data. The energy response model used accounts for the spectral effects of the detector response and the prefiltration of the source spectrum, including a beam‐shaping bowtie filter. The performance of the correction pipeline is first characterized with computer‐simulated data. Afterward, it is tested using real 3‐D CT data sets of two different phantoms, with various kV settings and phantom positions, assuming a circular data acquisition. The results are compared in the image domain to those from the scanner. Results: For experiments with a water cylinder, the proposed correction pipeline leads to results similar to the vendor's. For reconstructions of a QRM liver phantom with extension ring, the proposed correction pipeline achieved a more uniform and stable outcome in the attenuation values of homogeneous materials within the phantom. For example, the root mean squared deviation between centered and off‐centered phantom positioning was reduced from 6.6 to 1.8 HU in one profile. Conclusions: We have introduced a patient‐specific approach for single‐material BHC in diagnostic CT via the use of an analytical energy response model. This approach shows promising improvements in terms of robustness of attenuation values for large patient sizes. Our results contribute toward improving CT images so as to make CT attenuation values more reliable for use in clinical practice.
    Type of Medium: Online Resource
    ISSN: 0094-2405 , 2473-4209
    Language: English
    Publisher: Wiley
    Publication Date: 2022
    ZDB-ID: 1466421-5
  • 9
    In: Journal of the American College of Cardiology, Elsevier BV, Vol. 31, No. 7 (1998-06), p. 1641-1649
    Type of Medium: Online Resource
    ISSN: 0735-1097
    Language: English
    Publisher: Elsevier BV
    Publication Date: 1998
    ZDB-ID: 1468327-1
  • 10
    In: The American Journal of Cardiology, Elsevier BV, Vol. 89, No. 4 (2002-02), p. 408-413
    Type of Medium: Online Resource
    ISSN: 0002-9149
    Language: English
    Publisher: Elsevier BV
    Publication Date: 2002
    ZDB-ID: 2019595-3