1.
High-energy solar particles, produced in association with solar flares and coronal mass ejections, occasionally bombard the earth's atmosphere, resulting in radiation intensities additional to the background cosmic radiation. Access of these particles to the earth's vicinity during times of geomagnetic disturbances is not adequately described by static geomagnetic field models. These solar fluxes are also often distributed non-uniformly in space, so that fluxes measured by satellites at great distances from the earth, which sample large volumes of space around the earth, cannot be used to predict fluxes locally at the earth's surface. We present here a method which uses ground-level neutron monitor counting rates as adjoint sources of the flux in the atmosphere immediately above them to obtain solar-particle effective dose rates as a function of position over the earth's surface. We have applied this approach to the large September 29-30, 1989 ground-level event (designated GLE 42) to obtain the magnitude and distribution of the solar-particle effective dose rate from an atypically large event. The results of these calculations clearly show that the softer particle spectra associated with solar particle events, as compared with galactic cosmic rays, result in a greater sensitivity to the geomagnetic field and, unlike cosmic rays, in the near-absence of a "knee" near 60 degrees geomagnetic latitude.
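The mapping from monitor counting rates to local dose rates can be caricatured in a few lines. Everything below is invented for illustration: the station names, latitudes, excess-rate fractions, background dose rate, and "softness" factor are all assumptions, and the linear scaling is only a cartoon of the paper's actual adjoint transport calculation.

```python
# Each entry: station -> (geomagnetic latitude in degrees, excess counting-rate
# fraction above the galactic background during the event). All names and
# numbers are invented placeholders.
monitors = {
    "polar_station":    (86.0, 1.80),
    "subpolar_station": (65.0, 1.45),
    "midlat_station":   (48.0, 0.30),
    "equator_station":  (1.0,  0.02),
}

GALACTIC_BACKGROUND_USV_H = 0.05  # assumed quiet-time dose rate, microSv/h
SOFTNESS = 4.0  # assumed dose-per-count boost for the softer solar spectrum

def dose_rate(excess_fraction):
    """Toy effective dose rate: background plus a term scaled off the
    local excess neutron monitor counting rate."""
    return GALACTIC_BACKGROUND_USV_H * (1.0 + SOFTNESS * excess_fraction)

for name, (glat, excess) in monitors.items():
    print(f"{name:18s} lat {glat:5.1f}  ~{dose_rate(excess):.3f} microSv/h")
```

The steep fall-off of the excess fraction toward low geomagnetic latitude is the qualitative behavior the abstract describes: a soft solar spectrum is cut off far more strongly by the geomagnetic field than galactic cosmic rays are.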
2.
This paper surveys the application of the genetic algorithm, a stochastic search method, to the optimization of controller design parameters. Starting from the basic principles of genetic algorithms and drawing on engineering practice, it reviews results achieved in PID controller design, robust controller design, optimal control, system parameter identification, fuzzy-logic control systems, and neural-network control. Factors that affect the performance of genetic algorithms are discussed, and strategies for improvement are proposed.
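A minimal sketch of the kind of approach the survey describes: a genetic algorithm searching PID gains against a simulated plant. The plant model, ITAE cost function, bounds, and GA settings below are illustrative assumptions, not taken from any of the surveyed papers.

```python
import random

random.seed(0)  # reproducible runs

def plant_step(y, u, dt=0.05, tau=1.0, k=2.0):
    """One Euler step of an assumed first-order plant dy/dt = (k*u - y)/tau."""
    return y + dt * (k * u - y) / tau

def itae_cost(gains, setpoint=1.0, steps=200, dt=0.05):
    """Time-weighted absolute error (ITAE) of a PID loop around the toy plant."""
    kp, ki, kd = gains
    y = integ = cost = 0.0
    prev_err = setpoint
    for n in range(steps):
        err = setpoint - y
        cost += (n * dt) * abs(err) * dt
        integ += err * dt
        deriv = (err - prev_err) / dt
        y = plant_step(y, kp * err + ki * integ + kd * deriv, dt)
        prev_err = err
        if abs(y) > 1e6:  # penalize unstable gain sets
            return 1e9
    return cost

def evolve(pop_size=30, generations=40, lo=0.0, hi=10.0):
    pop = [[random.uniform(lo, hi) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(generations):
        elite = sorted(pop, key=itae_cost)[: pop_size // 3]  # selection
        pop = list(elite)
        while len(pop) < pop_size:
            a, b = random.sample(elite, 2)
            child = [random.choice(g) for g in zip(a, b)]     # uniform crossover
            if random.random() < 0.3:                         # Gaussian mutation
                i = random.randrange(3)
                child[i] = min(hi, max(lo, child[i] + random.gauss(0.0, 0.5)))
            pop.append(child)
    return min(pop, key=itae_cost)

best_gains = evolve()
print("best (kp, ki, kd):", best_gains, " ITAE:", itae_cost(best_gains))
```

Elitism keeps the best gain set across generations, so the search can only improve; the "factors affecting performance" the abstract mentions correspond here to population size, mutation rate, and the choice of cost function.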
3.
Galactic cosmic rays interact with the solar wind, the earth's magnetic field and its atmosphere to produce hadron, lepton and photon fields at aircraft altitudes. In addition to cosmic rays, energetic particles generated by solar activity bombard the earth from time to time. These particles, while less energetic than cosmic rays, also produce radiation fields at aircraft altitudes which have qualitatively the same properties as atmospheric cosmic rays. We have used a code based on transport theory to calculate atmospheric cosmic-ray quantities and compared them with experimental data. Agreement with these data is seen to be good. We have then used this code to calculate equivalent doses to aircraft crews. We have also used the code to calculate radiation doses from several large solar energetic particle events which took place in 1989, including the very large event that occurred on September 29th and 30th of that year. The spectra incident on the atmosphere were determined assuming diffusive shock theory.
4.
Conclusion: A satellite such as Neutral-1 should be instrumented with magnetometers, plasma detectors, and detectors of energetic particles, and flown to an altitude of some 26 R_E in a high-inclination orbit. It can thus probe regions of the magnetosphere of particular importance but as yet unexplored. It also is in an orbit that offers the optimum variety of phenomena to be explored, with the additional advantage that the characteristics of each phenomenon can be compared one with the other and the interrelation of these phenomena deduced. Such a satellite offers unique opportunities to investigate a multitude of unknown phenomena, such as the origin and energization of the particles that cause auroras and constitute Van Allen radiation. It can also potentially yield data to help solve long-lived problems, viz.: do the particles that cause auroras come from the sun, and how does a ripple in the solar corona ultimately feed energy into the magnetosphere at an average rate of 10^17-10^18 ergs/sec? Someone should fly such a satellite at the earliest opportunity and certainly by sunspot maximum (1969) since the existing satellite and instrumental technology is adequate.
5.
Stable carbon isotope ratios (δ13C) were determined for alanine, proline, phenylalanine, valine, leucine, isoleucine, aspartate (aspartic acid and asparagine), glutamate (glutamic acid and glutamine), lysine, serine, glycine, and threonine from metabolically diverse microorganisms. The microorganisms examined included fermenting bacteria; organotrophic, chemolithotrophic, phototrophic, methylotrophic, methanogenic, acetogenic, and acetotrophic microbes; and naturally occurring cryptoendolithic communities from the Dry Valleys of Antarctica. Here we demonstrated that reactions involved in amino acid biosynthesis can be used to distinguish amino acids formed by life from those formed by nonbiological processes. The unique patterns of δ13C imprinted by life on amino acids produced a biological bias. We also showed that, by applying discriminant function analysis to the δ13C values of a pool of amino acids formed by biological activity, it was possible to identify key aspects of intermediary carbon metabolism in the microbial world. In fact, microorganisms examined in this study could be placed within one of three metabolic groups: (1) heterotrophs that grow by oxidizing compounds containing three or more carbon-to-carbon bonds (fermenters and organotrophs), (2) autotrophs that grow by taking up carbon dioxide (chemolithotrophs and phototrophs), and (3) acetoclastic microbes that grow by assimilation of formaldehyde or acetate (methylotrophs, methanogens, acetogens, and acetotrophs). Furthermore, we demonstrated that cryptoendolithic communities from Antarctica grouped most closely with the autotrophs, which indicates that the dominant metabolic pathways in these communities are likely those utilized for CO2 fixation. We propose that this technique can be used to determine the dominant metabolic types in a community and reveal the overall flow of carbon in a complex ecosystem.
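The classification step can be illustrated with a toy nearest-centroid rule over δ13C patterns. Every number and group label below is an invented placeholder, not the study's measurements, and nearest-centroid is only a crude stand-in for the discriminant function analysis the paper actually used.

```python
import math

# Hypothetical per-organism δ13C offsets (per mil) for three amino acids;
# rows are individual organisms, keyed by assumed metabolic group.
training = {
    "heterotroph": [[-1.0, 2.0, 0.5], [-1.2, 2.3, 0.4]],
    "autotroph":   [[-4.0, -1.0, -2.0], [-3.8, -1.2, -2.2]],
    "acetoclast":  [[1.5, 0.0, 3.0], [1.7, 0.2, 2.8]],
}

def centroid(rows):
    """Component-wise mean of a list of equal-length vectors."""
    return [sum(col) / len(col) for col in zip(*rows)]

centroids = {group: centroid(rows) for group, rows in training.items()}

def classify(sample):
    """Assign the metabolic group whose mean δ13C pattern is nearest
    in Euclidean distance."""
    return min(centroids, key=lambda g: math.dist(sample, centroids[g]))

print(classify([-3.9, -1.1, -2.1]))
```

Under this toy rule, the Antarctic cryptoendolithic samples grouping "most closely with the autotrophs" corresponds to their δ13C vector falling nearest the autotroph centroid.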
6.
This paper considers the turbulent homogeneous mixing of two reactants undergoing a one-step, second-order, irreversible, exothermic chemical reaction with a rate constant of the Arrhenius type. A statistically stationary turbulent velocity field is assumed given and unaffected by mass or heat production due to the chemical reaction. Relative density fluctuations are neglected. A Hopf-like functional formalism is presented, with application to both statistically inhomogeneous and statistically homogeneous flows. Single- and double-point probability density function differential equations are derived from those functional equations. The limit of very large activation energies is considered; a low degree of statistical correlation between temperature and concentration fields during the ignition period is hypothesized. After making use of the homogeneity assumption, a closure problem is still present due to the nonlocal character of the molecular diffusion term. The problem is rendered closed by assuming a Gaussian conditional expected value for the temperature at a point given the temperature at a neighboring point. The closure is seen to preserve very important mathematical and physical properties. A linear first-order hyperbolic differential equation with variable coefficients for the probability density function of the temperature field is obtained. A second Damköhler number based on Taylor's microscale turns out to be an important controlling parameter. A numerical integration for different values of the second Damköhler number and the initial stochastic parameters is carried out. The mixture is seen to evolve towards an eventual thermal runaway, although the detailed behavior differs from system to system. Some peculiarities during the ignition-period evolution are uncovered.
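The controlling parameter can be written out explicitly. One conventional construction of such a number (the paper's exact definition may differ) compares the diffusion time over Taylor's microscale with the Arrhenius chemical time of the second-order reaction:

```latex
\tau_\lambda = \frac{\lambda^2}{D}, \qquad
\tau_c = \left( B\, C_B \, e^{-E_a/(R T)} \right)^{-1}, \qquad
Da_{II} = \frac{\tau_\lambda}{\tau_c}
= \frac{\lambda^2}{D}\, B\, C_B\, e^{-E_a/(R T)}
```

Here \(\lambda\) is Taylor's microscale, \(D\) a molecular diffusivity, \(B\) the Arrhenius pre-exponential factor, \(C_B\) a reference reactant concentration, and \(E_a\) the activation energy. Because \(E_a\) is large in the limit considered, \(Da_{II}\) is extremely sensitive to temperature, which is why the ignition-period analysis hinges on it.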
7.
Conclusions: Experimental studies of the radiation zones have been subject to the same imperfections as most other experimental physics, with some extra problems because of the features peculiar to rocket and satellite work. We consider that the theoretical physicists must review the existing knowledge and suggest specific experiments which would be of particular significance. The theoretical studies of radiation-zone phenomena are so new that the theoretician can start essentially afresh, without being fettered by theories based more on longevity than veracity. We trust that this critical review of experimental work may be of value in such efforts. The following quotation from Maritain (1959) is so apt to this paper that it will serve as a conclusion: "The intellect ... is no longer interested in anything but the invention of apparatus to capture phenomena — conceptual nets that give the mind a certain practical dominion over nature, coupled with a deceptive understanding of it; ... by advancing in this fashion, not by linking new truths to already acquired truths, but by substituting new apparatus for outmoded apparatus; by handling things without understanding them; by gaining ground against the real bit by bit, patiently, through victories that are always piecemeal and provisory — by acquiring a secret taste for the matter with which it conspires — thus has the modern intellect developed within this lower order of demiurgy a kind of manifold and marvellously specialized touch as well as wonderful instincts for the chase."
8.
杜庆华, 卢习林. Acta Aeronautica et Astronautica Sinica (《航空学报》), 1986, 7(6): 621-635
The boundary element method, i.e., the boundary integral equation method, can conveniently be applied to plate-bending problems. In this paper the formulation is established in local polar coordinates, a highly conforming function-interpolation scheme is adopted, and an exterior-point technique is used to avoid singular boundary integrals entirely. A series of numerical examples shows that the method requires little input data, saves computing time, and achieves high accuracy, making it an effective approach for strength analysis of plate structures.
9.
The Omnibus model: a new model of data fusion?
Over the last two decades, several process models have been proposed (and used) for data and information fusion. A common theme of these models is the existence of multiple levels of processing within the data fusion process. In the 1980s three models were adopted: the intelligence cycle, the JDL model, and the Boyd control loop. The 1990s saw the introduction of the Dasarathy model and the Waterfall model. However, each of these models has particular advantages and disadvantages. A new model for data and information fusion is proposed here: the Omnibus model, which draws together each of the previous models and their associated advantages whilst managing to overcome some of the disadvantages. Where possible, the terminology used within the Omnibus model is aimed at a general user of data fusion technology, to allow use by a broad audience.
10.
A greatly improved version of the computer program to calculate radiation dosage to air crew members is now available. Designated CARI-6, this program incorporates an updated geomagnetic cutoff rigidity model and a revision of the primary cosmic-ray spectrum based on recent work by Gaisser and Stanev (1998). We believe CARI-6 provides the most accurate available method for calculating the radiation dosage to air crew members. The program is now used by airline companies around the world and provides a unified basis for subsequent world-wide studies on the effects of natural radiation on aircrew members.