931.
Test requirements, which are generally collected in multiple disparate formats throughout the life cycle of an electronic product, could be used in various applications that reduce test and development cycle times and increase confidence in the final test program. Unfortunately, test requirements are seldom captured in a consistent format that can be processed by a computer, which eliminates the possibility of using such requirements in an engineering application. It also prevents test requirements captured in one segment of the product life cycle from being reused in subsequent life cycle stages. This paper describes a model-based methodology, the Test Requirements Model (TeRM), that facilitates the transfer of test-related product information between stages of the life cycle. This transportability, in conjunction with a computer-processable exchange format, permits test requirement information to support value-added applications in the engineering process throughout the life cycle of a product.
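The abstract does not reproduce the TeRM schema; purely as an illustration of what a consistent, computer-processable test requirement record might look like, here is a minimal sketch (all field names are hypothetical, not taken from the paper):

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class TestRequirement:
    """Hypothetical machine-readable test requirement record (illustrative fields only)."""
    requirement_id: str    # identifier reusable across life-cycle stages
    signal_name: str       # unit-under-test signal being verified
    stimulus: str          # stimulus to apply
    lower_limit: float     # pass/fail limits on the measured response
    upper_limit: float
    units: str
    life_cycle_stage: str  # e.g. "design", "production", "field"

# Capturing the requirement once and exchanging it in a neutral format (JSON here)
# is what lets later life-cycle stages reuse it instead of re-entering it.
req = TestRequirement("REQ-001", "VOUT", "5 V step on VIN", 4.75, 5.25, "V", "design")
print(json.dumps(asdict(req), indent=2))
```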
932.
Performance of 10- and 20-target MSE classifiers
MIT Lincoln Laboratory is responsible for developing the ATR (automatic target recognition) system for the DARPA-sponsored SAIP program. The baseline ATR system recognizes 10 GOB (ground order of battle) targets; the enhanced version of SAIP requires the ATR system to recognize 20 GOB targets. This paper presents ATR performance results for 10- and 20-target mean square error (MSE) classifiers using high-resolution SAR (synthetic aperture radar) imagery.
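The abstract names the classifier type but not its implementation; a minimal sketch of a generic MSE (template-matching) classifier of the kind described, under the assumption that each target class is represented by one or more reference templates:

```python
import numpy as np

def mse_classify(chip, templates):
    """Assign a SAR image chip to the class whose reference template
    gives the minimum mean square error against the chip.

    chip      : 2-D array, the image chip to classify
    templates : dict mapping class label -> list of 2-D reference arrays
    """
    best_label, best_mse = None, np.inf
    for label, refs in templates.items():
        for ref in refs:
            mse = np.mean((chip - ref) ** 2)
            if mse < best_mse:
                best_label, best_mse = label, mse
    return best_label, best_mse

# Toy usage with random 64x64 chips for a 3-class problem.
rng = np.random.default_rng(0)
templates = {f"target_{k}": [rng.normal(size=(64, 64))] for k in range(3)}
chip = templates["target_1"][0] + 0.1 * rng.normal(size=(64, 64))
print(mse_classify(chip, templates))
```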
933.
934.
935.
Ion composition data from the first 22 months of operation of the Polar/TIMAS instrument, covering the 15-eV/e to 33-keV/e energy range, have been surveyed to determine the typical abundance, at solar minimum, of N2+, NO+ and O2+ ions in the auroral ion outflow, as compared to that of the better-known O+ ions. The results indicate that molecular ions have roughly the same energy distribution as the O+ ions, with maximum differential flux occurring below 400 eV, but are far less abundant, by two orders of magnitude. The molecular ions also differ from the O+ ions in that they seem more specifically associated with enhanced geomagnetic activity.
936.
To estimate astronaut health risk due to space radiation, one must have the ability to calculate various exposure-related quantities that are averaged over specific organs and tissue types. Such calculations require computational models of the ambient space radiation environment, particle transport, nuclear and atomic physics, and the human body. While significant efforts have been made to verify, validate, and quantify the uncertainties associated with many of these models and tools, relatively little work has focused on the uncertainties associated with the representation and utilization of the human phantoms. In this study, we first examine the anatomical properties of the Computerized Anatomical Man (CAM), Computerized Anatomical Female (CAF), Male Adult voXel (MAX), and Female Adult voXel (FAX) models by comparing the masses of various model tissues used to calculate effective dose to the reference values specified by the International Commission on Radiological Protection (ICRP). The MAX and FAX tissue masses are found to be in good agreement with the reference data, while major discrepancies are found between the CAM and CAF tissue masses and the reference data for almost all of the effective dose tissues. We next examine the distribution of target points used with the deterministic transport code HZETRN (High charge (Z) and Energy TRaNsport) to compute mass averaged exposure quantities. A numerical algorithm is presented and used to generate multiple point distributions of varying fidelity for many of the effective dose tissues identified in CAM, CAF, MAX, and FAX. The point distributions are used to compute mass averaged dose equivalent values under both a galactic cosmic ray (GCR) and solar particle event (SPE) environment impinging isotropically on three spherical aluminum shells with areal densities of 0.4 g/cm², 2.0 g/cm², and 10.0 g/cm². The dose equivalent values are examined to identify a recommended set of target points for each of the tissues and to further assess the differences between CAM, CAF, MAX, and FAX. It is concluded that the previously published CAM and CAF point distributions were significantly under-sampled and that the set of point distributions presented here should be adequate for future studies involving CAM, CAF, MAX, or FAX. It is also found that the errors associated with the mass and location of certain tissues in CAM and CAF have a significant impact on the mass averaged dose equivalent values, and it is concluded that MAX and FAX are more accurate than CAM and CAF for space radiation analyses.
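The abstract describes computing mass-averaged exposure quantities from a distribution of target points inside each tissue; a minimal sketch of that averaging step is shown below (the point weights and the dose-equivalent field are placeholders, not HZETRN output):

```python
import numpy as np

def mass_averaged_dose_equivalent(points, masses, dose_equivalent_at):
    """Average a dose-equivalent field over a tissue represented by target points.

    points             : (N, 3) array of target-point coordinates inside the tissue
    masses             : (N,) array of the tissue mass assigned to each point
    dose_equivalent_at : callable returning the dose equivalent (Sv) at a point,
                         e.g. interpolated from a transport-code result
    """
    h = np.array([dose_equivalent_at(p) for p in points])
    return np.sum(masses * h) / np.sum(masses)

# Toy usage: 1000 points in a 5 cm cube of tissue, equal mass per point,
# and a placeholder dose field that falls off with depth.
rng = np.random.default_rng(1)
pts = rng.uniform(0.0, 5.0, size=(1000, 3))        # cm
m = np.full(len(pts), 1.0 / len(pts))
field = lambda p: 1.0e-3 * np.exp(-p[2] / 10.0)
print(mass_averaged_dose_equivalent(pts, m, field))
```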
937.
The gravity field model AIUB-CHAMP02S, which is based on six years of CHAMP GPS data, is presented here. The gravity field parameters were derived using a two-step procedure: in the first step, a kinematic trajectory of a low Earth orbiting (LEO) satellite is computed using the GPS data from the on-board receiver; in this step the orbits and clock corrections of the GPS satellites, as well as the Earth rotation parameters (ERPs), are introduced as known. In the second step, this kinematic orbit is represented by a gravitational force model and orbit parameters.
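The abstract states the two-step approach without the estimation formalism behind the second step; as a hedged sketch in notation of my own (not taken from the paper), that step can be viewed as a least-squares adjustment in which the kinematic positions from step one serve as pseudo-observations of a dynamic orbit driven by the gravity field being estimated:

```latex
\min_{\mathbf{p},\,\mathbf{q}}\;\sum_{i}
\bigl\|\,\mathbf{r}_{\mathrm{kin}}(t_i)-\mathbf{r}(t_i;\mathbf{p},\mathbf{q})\,\bigr\|^{2},
\qquad
\ddot{\mathbf{r}}=\nabla V(\mathbf{r};\mathbf{q})+\mathbf{a}_{\mathrm{other}},
```

where r_kin are the kinematic positions, p are arc-specific orbit parameters, q are the spherical-harmonic coefficients of the gravitational potential V, and a_other collects the remaining modelled accelerations.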
938.
The purpose of the present study is to provide algorithms for, and examples of, how to simulate star visibility and tracking by a telescope attached to the main truss of the International Space Station (ISS).
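The abstract does not reproduce the algorithms; a minimal sketch of the kind of geometric visibility test such a simulation needs (field-of-view plus Earth-occultation check, with all vectors in an Earth-centered inertial frame; the threshold values are illustrative):

```python
import numpy as np

R_EARTH = 6378.137  # km, equatorial radius

def star_visible(r_iss, boresight, star_dir, half_fov_deg):
    """Return True if a star is inside the telescope field of view and
    its line of sight is not occulted by the Earth.

    r_iss        : ISS position vector (km)
    boresight    : unit vector along the telescope boresight
    star_dir     : unit vector toward the star (stars are effectively at infinity)
    half_fov_deg : half-angle of the telescope field of view (deg)
    """
    # Field-of-view check: angle between boresight and star direction.
    angle = np.degrees(np.arccos(np.clip(np.dot(boresight, star_dir), -1.0, 1.0)))
    in_fov = angle <= half_fov_deg

    # Occultation check: the Earth blocks the star only if the star lies on the
    # Earth side of the ISS and the line of sight passes closer to the geocenter
    # than one Earth radius.
    toward_earth = np.dot(-r_iss, star_dir) > 0.0
    miss_distance = np.linalg.norm(np.cross(r_iss, star_dir))
    occulted = toward_earth and miss_distance < R_EARTH

    return in_fov and not occulted

# Toy usage: ISS at ~400 km altitude, boresight pointing away from the Earth.
r_iss = np.array([R_EARTH + 400.0, 0.0, 0.0])
boresight = np.array([1.0, 0.0, 0.0])
print(star_visible(r_iss, boresight, np.array([1.0, 0.0, 0.0]), 5.0))   # True
print(star_visible(r_iss, boresight, np.array([-1.0, 0.0, 0.0]), 5.0))  # False
```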
939.
The presence of small-amplitude oscillations in prominences has been known for a long time. These oscillations, whose exciters are still unknown, seem to be of local nature and are interpreted in terms of magnetohydrodynamic (MHD) waves. In recent years, observational evidence of the damping of these oscillations has grown, and several mechanisms able to damp them have been the subject of intense theoretical modelling. Among them, the most efficient appear to be radiative cooling and ion-neutral collisions. Radiative cooling can damp slow MHD waves efficiently, while ion-neutral collisions, in partially ionised plasmas like those of solar prominences, can also damp fast MHD waves. In this paper, we summarize our current knowledge about the time and spatial damping of small-amplitude oscillations in prominences.
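The abstract refers to time damping without giving the usual parameterisation; as a sketch of the standard description used in this field (not a result specific to this paper), temporally damped small-amplitude oscillations are commonly fitted with

```latex
v(t) = V_{0}\,\cos(\omega t + \phi)\,e^{-t/\tau_{d}},
\qquad
\frac{\tau_{d}}{P} = \frac{\omega\,\tau_{d}}{2\pi},
```

where τ_d is the damping time and P = 2π/ω the period; the dimensionless ratio τ_d/P is the quantity that proposed mechanisms such as radiative cooling or ion-neutral collisions must reproduce.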
940.
A critical need for NASA is the ability to accurately model the transport of heavy ions in the Galactic Cosmic Rays (GCR) through matter, including spacecraft walls, equipment racks, etc. Nuclear interactions are of great importance in the GCR transport problem, as they can cause fragmentation of the incoming ion into lighter ions. Since the radiation dose delivered by a particle is proportional to the square of (charge/velocity), fragmentation reduces the dose delivered by incident ions. The other mechanism by which dose can be reduced is ionization energy loss, which can lead to some particles stopping in the shielding. This is the conventional notion of shielding, but it is not applicable to human spaceflight since the particles in the GCR tend to be too energetic to be stopped in the relatively thin shielding that is possible within payload mass constraints. Our group has measured a large number of fragmentation cross sections, intended to be used as input to, or for validation of, NASA's radiation transport models. A database containing over 200 charge-changing cross sections and over 2000 fragment production cross sections has been compiled. In this report, we examine in detail the contrast between fragment measurements at large acceptance and small acceptance. We use output from the PHITS Monte Carlo code to test our assumptions using as an example ⁴⁰Ar data (and simulated data) at a beam energy of 650 MeV/nucleon. We also present preliminary analysis in which isotopic resolution was attained for beryllium fragments produced by beams of ¹⁰B and ¹¹B. Future work on the experimental data set will focus on extracting and interpreting production cross sections for light fragments.
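The abstract's point that dose scales with the square of (charge/velocity) can be made concrete with a small sketch (the Z²/β² proportionality of ionization energy loss is standard physics, not a result of this report; the fragment set and β value are illustrative):

```python
def relative_dose(z, beta):
    """Relative ionization dose per particle from the Z^2 / beta^2
    proportionality of stopping power (constant factors dropped)."""
    return z**2 / beta**2

# An Ar-40 ion (Z = 18) that fragments into lighter ions deposits less dose,
# because the fragments keep roughly the parent velocity but carry lower charges.
beta = 0.81  # approximate speed of a ~650 MeV/nucleon ion (fraction of c)
parent = relative_dose(18, beta)
fragments = sum(relative_dose(z, beta) for z in (8, 7, 2, 1))  # e.g. O + N + He + H
print(f"parent Ar: {parent:.0f}   fragment set: {fragments:.0f}")
```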