Similar Documents — 20 results
1.
Recent work has indicated that pion production and the associated electromagnetic (EM) cascade may be an important contribution to the total astronaut exposure in space. Recent extensions to the deterministic space radiation transport code HZETRN allow the production and transport of pions, muons, electrons, positrons, and photons. In this paper, the extended code is compared to the Monte Carlo codes Geant4, PHITS, and FLUKA in slab geometries exposed to galactic cosmic ray (GCR) boundary conditions. While improvements in the HZETRN transport formalism for the new particles are needed, it is shown that reasonable agreement on dose is found at the larger shielding thicknesses commonly found on the International Space Station (ISS). Finally, the extended code is compared to ISS data on a minute-by-minute basis over a seven-day period in 2001. The impact of pion/EM production on exposure estimates and validation results is clearly shown. The Badhwar–O’Neill (BO) 2004 and 2010 models are used to generate the GCR boundary condition at each time step, allowing the impact of environmental model improvements on validation results to be quantified as well. It is found that the updated BO2010 model noticeably reduces overall exposure estimates relative to the BO2004 model, and the additional production mechanisms in HZETRN provide some compensation. It is shown that the overestimates produced by the BO2004 GCR model in previous validation studies led to deflated uncertainty estimates for environmental, physics, and transport models, and allowed an important physical interaction (π/EM) to be overlooked in model development. Despite the additional π/EM production mechanisms in HZETRN, a systematic under-prediction of total dose is observed in comparison to Monte Carlo results and measured data.

2.
3.
Long-term human presence in space requires the inclusion of radiation constraints in mission planning and the design of shielding materials, structures and vehicles. It is necessary to expose the numerical tools commonly used in radiation analyses to extensive verification, validation and uncertainty quantification. In this paper, the numerical error associated with energy discretization in HZETRN is addressed. An inadequate numerical integration scheme in the transport algorithm is shown to produce large errors in the low energy portion of the neutron and light ion fluence spectra. It is further shown that the errors result from the narrow energy domain of the neutron elastic cross section spectral distributions and that an extremely fine energy grid is required to resolve the problem under the current formulation. Since adding a sufficient number of energy points would render the code computationally inefficient, we revisit the light ion and neutron transport theory developed for HZETRN and focus on neutron elastic interactions. Two numerical methods (average value and collocation) are developed to provide adequate resolution in the energy domain and more accurately resolve the neutron elastic interactions. An energy grid convergence study is conducted to demonstrate the improved stability of the new methods. Based on the results of the convergence study and the ease of implementation, the average value method with a 100-point energy grid is found to be suitable for future use in HZETRN.
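The core idea of the average value method described above can be illustrated with a toy sketch: replacing point samples of a narrow cross-section feature with its average over each energy bin preserves the feature's contribution even on a coarse grid. This is a hypothetical illustration, not the HZETRN implementation; the Gaussian "cross section" and grid are made up.

```python
import numpy as np

def pointwise(sigma, grid):
    """Sample sigma(E) only at the grid nodes (misses narrow features)."""
    return sigma(grid)

def bin_average(sigma, grid, sub=400):
    """Average-value idea: approximate the mean of sigma(E) over each
    energy bin with a fine uniform sub-sampling."""
    avg = np.empty(len(grid) - 1)
    for i in range(len(grid) - 1):
        e = np.linspace(grid[i], grid[i + 1], sub)
        avg[i] = sigma(e).mean()
    return avg

# A narrow elastic-like peak near 1 MeV on a coarse 10-node grid;
# no grid node falls on the peak, so point sampling loses it entirely.
sigma = lambda E: np.exp(-((E - 1.0) / 0.01) ** 2)
grid = np.linspace(0.0, 2.0, 10)
```

Point sampling returns essentially zero everywhere, while the bin average retains the integrated strength of the peak, which is why the bin-averaged formulation converges on far coarser grids.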

4.
The completion of the International Space Station (ISS) in 2011 has provided the space research community with an ideal proving ground for future long-duration human activities in space. Ionizing radiation measurements aboard the ISS form an ideal tool for the validation of radiation environment models, nuclear transport codes and nuclear reaction cross sections. Indeed, prior measurements on the Space Transportation System (STS; shuttle) provided vital information impacting both environmental models and nuclear transport code development by indicating the need for an improved dynamic model of the low Earth orbit (LEO) trapped environment. Additional studies using thermoluminescent detector (TLD) and tissue equivalent proportional counter (TEPC) area monitors, together with computer-aided design (CAD) models of earlier ISS configurations, confirmed the STS observations that, as input, computational dosimetry requires an environmental model with dynamic and directional (anisotropic) behavior, as well as an accurate six-degree-of-freedom (DOF) definition of the vehicle attitude and orientation along the orbit of the ISS.

5.
We have used several transport codes to calculate dose and dose equivalent values, as well as the particle spectra, behind a slab or inside a spherical shell shielding in typical space radiation environments. Two deterministic codes, HZETRN and UPROP, and two Monte Carlo codes, FLUKA and Geant4, are included. A soft solar particle event, a hard solar particle event, and a solar minimum galactic cosmic ray environment are considered, and the shielding material is either aluminum or polyethylene. We find that the dose values and particle spectra from HZETRN are in general rather consistent with Geant4, except for neutrons. The dose equivalent values from HZETRN and Geant4 are not far from each other, but the HZETRN values behind shielding are often lower than the Geant4 values. Results from FLUKA and Geant4 are mostly consistent for the considered cases. However, results from the legacy code UPROP are often quite different from those of the other transport codes, partly due to its neglect of neutrons. Comparisons for the spherical shell geometry exhibit the same qualitative features as for the slab geometry. In addition, results from both deterministic and Monte Carlo transport codes show that the dose equivalent inside the spherical shell decreases from the center to the inner surface, and this decrease is large for solar particle events, consistent with an earlier study based on deterministic radiation transport results. This study demonstrates both the consistency and inconsistency among these transport models in their typical space radiation predictions; further studies will be required to pinpoint the exact physics modules in these models that cause the differences and thus may be improved.
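The dose and dose equivalent quantities compared above are related by a quality factor applied to the LET spectrum. A generic sketch of that conversion, using the ICRP 60 quality factor Q(L) (the codes named in the abstract each do their own internal scoring; the fluence values below are made up):

```python
import numpy as np

def quality_factor(L):
    """ICRP 60 quality factor Q(L), with L in keV/um."""
    L = np.asarray(L, dtype=float)
    return np.where(L < 10.0, 1.0,
           np.where(L <= 100.0, 0.32 * L - 2.2, 300.0 / np.sqrt(L)))

# Gy per (particle/cm^2) per (keV/um), for unit-density water
KEV_UM_PER_CM2_TO_GY = 1.602e-9

def dose_quantities(fluence, let):
    """Return (dose in Gy, dose equivalent in Sv) from a binned
    fluence spectrum with one LET value per bin."""
    d_bins = (np.asarray(fluence, float) * np.asarray(let, float)
              * KEV_UM_PER_CM2_TO_GY)
    return float(d_bins.sum()), float((quality_factor(let) * d_bins).sum())
```

Because Q(L) ≥ 1, the dose equivalent always exceeds the dose whenever high-LET particles contribute, which is why heavy-ion and neutron differences between codes show up more strongly in dose equivalent than in dose.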

6.
Estimates of organ dose equivalents for the skin, eye lens, blood forming organs, central nervous system, and heart of female astronauts from exposures to the 1977 solar minimum galactic cosmic radiation spectrum for various shielding geometries involving simple spheres and locations within the Space Transportation System (space shuttle) and the International Space Station (ISS) are made using the HZETRN 2010 space radiation transport code. The dose equivalent contributions are broken down by charge groups in order to better understand the sources of the exposures to these organs. For thin shields, contributions from ions heavier than alpha particles comprise at least half of the organ dose equivalent. For thick shields, such as the ISS locations, heavy ions contribute less than 30% and in some cases less than 10% of the organ dose equivalent. Secondary neutron production contributions in thick shields also tend to be as large, or larger, than the heavy ion contributions to the organ dose equivalents.

7.
8.
Improved spacecraft shield design requires early entry of radiation constraints into the design process to maximize performance and minimize costs. As a result, we have been investigating high-speed computational procedures to allow shield analysis from preliminary design concepts through to the final design. In particular, we discuss progress towards a fully three-dimensional and computationally efficient deterministic code for which the current HZETRN evaluates the lowest-order asymptotic term. HZETRN is the first deterministic solution to the Boltzmann equation; it allows field mapping within the International Space Station (ISS) in tens of minutes using standard finite element method (FEM) geometry common to engineering design practice, enabling the development of integrated multidisciplinary design optimization methods. A single ray trace in the ISS FEM geometry requires 14 ms, which severely limits the application of Monte Carlo methods to such engineering models. A potential means of improving Monte Carlo efficiency in coupling to spacecraft geometry is given in terms of re-configurable computing, which could be utilized in the final design as verification of the design optimized with the deterministic method.

9.
10.
The HZETRN deterministic radiation code is one of several tools developed to analyze the effects of harmful galactic cosmic rays (GCR) and solar particle events on mission planning and shielding for astronauts and instrumentation. This paper is a comparison study involving two Monte Carlo transport codes, HETC–HEDS and FLUKA, and the deterministic transport code HZETRN. Each code is used to transport ions from the 1977 solar minimum GCR spectrum impinging upon a 20 g/cm2 aluminum slab followed by a 30 g/cm2 water slab. This research is part of a systematic effort of verification and validation to quantify the accuracy of HZETRN and determine areas where it can be improved. Comparisons of dose and dose equivalent values at various depths in the water slab are presented in this report, followed by a comparison of the proton flux and the forward, backward, and total neutron flux at various depths in the water slab. The secondary light-ion 2H, 3H, 3He, and 4He fluxes are also compared.

11.
12.
The number of Earth-orbiting objects is constantly growing, and some orbital regions are becoming risky environments for space assets, which are increasingly threatened by accidental collisions with other objects, especially in Low Earth Orbit (LEO). Collision risk assessment is performed by various methods, both covariance-based and non-covariance-based. The Cube algorithm is a non-covariance-based method used to estimate collision rates between space objects; its concept consists of dividing space into cubes of fixed dimension and, at each time instant, checking whether two or more objects share the same cube. Up to now its application has been limited to the long-term scenarios of orbital debris evolutionary models, where considering the uncertainties is neither necessary nor practical. Within operational contexts, by contrast, medium-term collision risk analysis may be an important task in which the propagation-related uncertainties play a prominent role, but the timescale poses challenges for the application of standard covariance-based conjunction analysis techniques. In this framework, this paper presents an approach for the evaluation of the medium-term collision frequency for objects in LEO, called the Uncertainty-aware Cube method. It is a modified version of the Cube algorithm, able to take possible errors in the space objects’ positions into account when detecting conjunctions. As an object’s orbit is propagated, the along-track position error grows, and each object could be in a different position from the one determined by numerical propagation and, thus, in a different cube. Accounting for the uncertainties, at each time instant the algorithm associates more than one cube with each object and checks whether any objects share at least one cube. If so, a conjunction is detected and a degree of confidence is evaluated. The performance of the method is assessed in different LEO scenarios and compared to the original Cube method.
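The Cube idea described above can be sketched in a few lines: positions map to integer cube indices, an object with a position error of ±sigma per axis is associated with every cube its error box overlaps, and a conjunction is flagged when two cube sets intersect. The cube size, the box-shaped error model, and the absence of the paper's confidence weighting are all simplifications for illustration.

```python
import numpy as np
from itertools import product

def cube_index(pos, d):
    """Integer index of the cube of side d containing a position."""
    return tuple(int(np.floor(c / d)) for c in pos)

def cube_set(pos, sigma, d):
    """All cubes overlapped by the error box pos +/- sigma (per axis)."""
    lo = [int(np.floor((c - s) / d)) for c, s in zip(pos, sigma)]
    hi = [int(np.floor((c + s) / d)) for c, s in zip(pos, sigma)]
    return set(product(*(range(l, h + 1) for l, h in zip(lo, hi))))

def conjunction(pos_a, sig_a, pos_b, sig_b, d=10.0):
    """Flag a conjunction when the two objects' cube sets intersect."""
    return bool(cube_set(pos_a, sig_a, d) & cube_set(pos_b, sig_b, d))
```

With zero uncertainty this reduces to the original Cube test (same single cube); growing along-track error enlarges the cube set, which is how the uncertainty-aware variant catches conjunctions that pure point propagation would miss.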

13.
The health risks associated with exposure to the various components of space radiation are of great concern when planning long-term manned interplanetary missions, such as future missions to Mars. Since it is not possible to measure the radiation environment inside human organs in deep space, simulations based on radiation transport/interaction codes coupled to phantoms of tissue-equivalent materials are used. However, the calculated results depend on the models used in the codes, and it is therefore necessary to verify their validity by comparison with measured data. The goal of this paper is to compare absorbed doses obtained in the MATROSHKA-R experiment performed at the International Space Station (ISS) with simulations performed with the three-dimensional Monte Carlo Particle and Heavy-Ion Transport code System (PHITS). The absorbed dose was measured using passive detectors (packages of thermoluminescent and plastic nuclear track detectors) placed on the surface of the spherical tissue-equivalent phantom MATROSHKA-R, which was exposed aboard the ISS in the Zvezda Service Module from December 2005 to September 2006. The data calculated by PHITS assuming ISS shielding of 3 g/cm2 and 5 g/cm2 aluminum mass thickness were in good agreement with the measurements. Using a simplified geometrical model of the ISS, the influence of variations in the altitude and wall mass thickness of the ISS on the calculated absorbed dose was estimated. The uncertainties of the calculated data are also discussed; the relative expanded uncertainty of the absorbed dose in the phantom was estimated to be 44% at a 95% confidence level.

14.
To estimate astronaut health risk due to space radiation, one must have the ability to calculate various exposure-related quantities that are averaged over specific organs and tissue types. Such calculations require computational models of the ambient space radiation environment, particle transport, nuclear and atomic physics, and the human body. While significant efforts have been made to verify, validate, and quantify the uncertainties associated with many of these models and tools, relatively little work has focused on the uncertainties associated with the representation and utilization of the human phantoms. In this study, we first examine the anatomical properties of the Computerized Anatomical Man (CAM), Computerized Anatomical Female (CAF), Male Adult voXel (MAX), and Female Adult voXel (FAX) models by comparing the masses of various model tissues used to calculate effective dose to the reference values specified by the International Commission on Radiological Protection (ICRP). The MAX and FAX tissue masses are found to be in good agreement with the reference data, while major discrepancies are found between the CAM and CAF tissue masses and the reference data for almost all of the effective dose tissues. We next examine the distribution of target points used with the deterministic transport code HZETRN (High charge (Z) and Energy TRaNsport) to compute mass averaged exposure quantities. A numerical algorithm is presented and used to generate multiple point distributions of varying fidelity for many of the effective dose tissues identified in CAM, CAF, MAX, and FAX. The point distributions are used to compute mass averaged dose equivalent values under both a galactic cosmic ray (GCR) and solar particle event (SPE) environment impinging isotropically on three spherical aluminum shells with areal densities of 0.4 g/cm2, 2.0 g/cm2, and 10.0 g/cm2. 
The dose equivalent values are examined to identify a recommended set of target points for each of the tissues and to further assess the differences between CAM, CAF, MAX, and FAX. It is concluded that the previously published CAM and CAF point distributions were significantly under-sampled and that the set of point distributions presented here should be adequate for future studies involving CAM, CAF, MAX, or FAX. It is also found that the errors associated with the mass and location of certain tissues in CAM and CAF have a significant impact on the mass averaged dose equivalent values, and it is concluded that MAX and FAX are more accurate than CAM and CAF for space radiation analyses.
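The mass-averaging step the abstract describes is straightforward: an exposure quantity is evaluated at a distribution of target points within a tissue and averaged, weighted by the mass each point represents. A minimal sketch (names and values are illustrative, not taken from the CAM/CAF/MAX/FAX models):

```python
import numpy as np

def mass_averaged(values, masses=None):
    """Mass-weighted average of a point-wise exposure quantity.

    With equal-mass sampling (the usual phantom point distribution),
    this reduces to the plain arithmetic mean.
    """
    v = np.asarray(values, dtype=float)
    if masses is None:  # equal-mass target points
        return float(v.mean())
    m = np.asarray(masses, dtype=float)
    return float((m * v).sum() / m.sum())
```

A simple convergence check is to double the number of target points and compare the two averages; persistent drift under refinement is exactly how the under-sampling of the earlier CAM/CAF point distributions would manifest.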

15.
An accurate understanding of the physical interactions and transport of space radiation is important for safe and efficient space operations. Secondary particles produced by primary particle interactions with intervening materials are an important contribution to radiation risk. Pions are copiously produced in the nuclear interactions typical of space radiations and can therefore be an important contribution to radiation exposure. Charged pions decay almost exclusively to muons. As a consequence, muons must also be considered in space radiation exposure studies. In this work, the NASA space radiation transport code HZETRN has been extended to include the transport of charged pions and muons. The relevant transport equation, solution method, and implemented cross sections are reviewed. Muon production in the Earth’s upper atmosphere is then investigated, and comparisons with recent balloon flight measurements of differential muon flux are presented. Muon production from the updated version of HZETRN is found to match the experimental data well.

16.
A rapid analytical procedure for the prediction of a micro-dosimeter response function in low Earth orbit (LEO), correlated with the Space Transportation System (STS, shuttle) Tissue Equivalent Proportional Counter (TEPC) measurements is presented. The analytical model takes into consideration the energy loss straggling and chord length distribution of the detector, and is capable of predicting energy deposition fluctuations in a cylindrical micro-volume of arbitrary aspect ratio (height/diameter) by incoming ions through both direct and indirect (δ ray) events. At any designated (ray traced) target point within the vehicle, the model accepts the differential flux spectrum of Galactic Cosmic Rays (GCRs) and/or trapped protons at LEO as input. On a desktop PC, the response function of TEPC for each ion in the GCR/trapped field is computed at the average rate of 30 s/ion. The ionizing radiation environment at LEO is represented by O’Neill’s GCR model (2004), covering charged particles in the 1 ≤ Z ≤ 28 range. O’Neill’s free space GCR model is coupled with the Langley Research Center (LaRC) angular dependent geomagnetic cutoff model to compute the transmission coefficient in LEO. The trapped proton environment is represented by a LaRC developed time dependent procedure which couples the AP8MIN/AP8MAX, Deep River Neutron Monitor (DRNM) and F10.7 solar radio frequency measurements. The albedo neutron environment is represented by the extrapolation of the Atmospheric Ionizing Radiation (AIR) measurements. The charged particle transport calculations correlated with STS 51 and 114 flights are accomplished by using the most recent version (2005) of the LaRC deterministic High charge (Z) and Energy TRaNsport (HZETRN) code.
We present the correlations between the TEPC model predictions (response function) and TEPC-measured differential/integral spectra in the lineal energy (y) domain for both GCR and trapped protons, with the conclusion that the model correctly accounts for the increase in flux at low y values, where energetic ions are the primary contributor. We further discuss that, even with the incorporation of angular dependence in the cutoffs, comparison of the GCR differential/integral flux between the STS 51 and 114 TEPC measured data and the current calculations indicates that there still exists an underestimation by the simulations at low to mid-range y values. This underestimation is partly related to the exclusion of secondary pion production from the current version of HZETRN.
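The lineal energy y used throughout the abstract is the energy imparted in the micro-volume divided by the mean chord length; for a convex body in an isotropic field, the Cauchy relation gives mean chord = 4V/S. A generic micro-dosimetry sketch for the cylindrical volume of arbitrary aspect ratio mentioned above (not the paper's full response-function model):

```python
import math

def mean_chord_cylinder(height, diameter):
    """Cauchy mean chord length 4V/S of a right circular cylinder
    in an isotropic radiation field."""
    r = diameter / 2.0
    volume = math.pi * r * r * height
    surface = 2.0 * math.pi * r * height + 2.0 * math.pi * r * r
    return 4.0 * volume / surface

def lineal_energy(energy_keV, height_um, diameter_um):
    """y = energy imparted / mean chord length, in keV/um."""
    return energy_keV / mean_chord_cylinder(height_um, diameter_um)
```

For the unit-aspect-ratio case (height = diameter = d), the mean chord reduces to 2d/3, a handy sanity check when binning TEPC pulse heights into the y domain.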

17.
18.
We have developed a dynamic geomagnetic vertical cutoff rigidity model that predicts the energetic charged particle transmission through the magnetosphere. Initially developed for space applications, we demonstrate the applicability of this library of cutoff rigidity models for computing aircraft radiation dose. The world grids of vertical cutoff rigidities were obtained by particle trajectory tracing in a magnetospheric model. This reference set of world grids of vertical cutoff rigidities calculated for satellite altitudes covers all magnetic activity levels from super quiet to extremely disturbed (i.e., Kp indices ranging from 0 to 9+) for every three hours in universal time. We utilize the McIlwain "L" parameter as the basis of the interpolation technique to reduce these initial satellite altitude vertical cutoff rigidities to cutoff rigidity values at aircraft altitudes.
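The L-based reduction described above rests on the dipole scaling of the vertical cutoff rigidity, commonly quoted as Rc ≈ 14.9/L² GV (Störmer approximation). A sketch of that scaling idea follows; the 14.9 GV constant and the pure 1/L² dependence are textbook dipole simplifications, not the paper's trajectory-traced grids or its actual interpolation scheme.

```python
def dipole_vertical_cutoff(L):
    """Approximate vertical cutoff rigidity in GV for McIlwain L
    (in Earth radii), using the Stormer dipole value ~14.9/L^2."""
    return 14.9 / (L * L)

def reduce_to_aircraft(rc_sat, L_sat, L_air):
    """Rescale a satellite-altitude cutoff to aircraft altitude by
    assuming the 1/L^2 form holds along the field line."""
    return rc_sat * (L_sat / L_air) ** 2
```

Because both points are tied to the same functional form, rescaling a satellite-altitude value recovers the direct dipole estimate at the aircraft L, which is the self-consistency the interpolation exploits.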

19.
20.
A key requirement for accurate trajectory prediction and space situational awareness is knowledge of how non-conservative forces affect space object motion. These forces vary temporally and spatially, and are driven by the underlying behavior of space weather, particularly in Low Earth Orbit (LEO). Existing trajectory prediction algorithms adjust space weather models based on calibration satellite observations. However, a lack of sufficient data and mismodeling of non-conservative forces cause inaccuracies in space object motion prediction, especially for uncontrolled debris objects. The uncontrolled nature of debris objects makes them particularly sensitive to variations in space weather. Our research takes advantage of this behavior by utilizing observations of debris objects to infer the space environment parameters influencing their motion. The hypothesis of this research is that it is possible to utilize debris objects as passive, indirect sensors of the space environment. We focus on estimating atmospheric density and its spatial variability to allow for more precise prediction of LEO object motion. The estimated density is parameterized as a grid of values, distributed by latitude and local sidereal time over a spherical shell encompassing Earth at a fixed altitude of 400 km. The position and velocity of each debris object are also estimated. A Partially Orthogonal Ensemble Kalman Filter (POEnKF) is used for the assimilation of space object measurements to estimate density.
For performance comparison, the scenario characteristics (number of objects, measurement cadence, etc.) are based on a sensor tasking campaign executed for the High Accuracy Satellite Drag Model project. The POEnKF analysis details spatial comparisons between the true and estimated density fields, and quantifies the improved accuracy in debris object motion predictions due to more accurate drag force models from the density estimates. It is shown that there is an advantage to utilizing multiple debris objects instead of just one. Although the work presented here explores POEnKF performance using information from only 16 debris objects, the research vision is to utilize information from all routinely observed debris objects. Overall, the filter demonstrates the ability to estimate density to within a threshold of accuracy dependent on measurement/sensor error. In the case of a geomagnetic storm, the filter is able to track the storm and provide more accurate density estimates than would be achieved using a simple exponential atmospheric density model or the MSIS atmospheric model (when calm conditions are assumed).
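The assimilation step at the heart of the approach above is an ensemble Kalman filter analysis. A generic stochastic EnKF update is sketched below to show the kind of correction the POEnKF applies to the gridded density state; the partial-orthogonality machinery, the orbit-propagation forecast, and the observation operator are all omitted or replaced by toy assumptions.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_op, obs_var, rng):
    """One stochastic EnKF analysis step.

    ensemble : (n_ens, n_state) array of state vectors (e.g. density grid)
    obs      : (n_obs,) observed values
    obs_op   : maps a state vector to predicted observations
    """
    n_ens = ensemble.shape[0]
    Hx = np.array([obs_op(x) for x in ensemble])        # (n_ens, n_obs)
    X = ensemble - ensemble.mean(axis=0)                # state anomalies
    Y = Hx - Hx.mean(axis=0)                            # obs-space anomalies
    P_xy = X.T @ Y / (n_ens - 1)                        # cross covariance
    P_yy = Y.T @ Y / (n_ens - 1) + obs_var * np.eye(Hx.shape[1])
    K = P_xy @ np.linalg.inv(P_yy)                      # Kalman gain
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), size=Hx.shape)
    return ensemble + (perturbed - Hx) @ K.T            # analysis ensemble
```

With many debris objects, each contributes rows to the observation vector, so the ensemble cross-covariances spread a single tracked object's drag signal across the whole latitude/local-sidereal-time density grid — the mechanism behind the multi-object advantage reported above.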

