- API
Nano Dust Analyzer Project
data.nasa.gov | Last Updated 2020-01-29T04:54:41.000ZWe propose to develop a new highly sensitive instrument to confirm the existence of the so-called nano-dust particles, characterize their impact parameters, and measure their chemical composition. Simultaneous theoretical studies will be used to derive the expected mass and velocity ranges of these putative particles and to formulate science and measurement requirements for the future deployment of the proposed Nano-Dust Analyzer (NDA).

Early dust instruments onboard the Pioneer 8, Pioneer 9, and Helios spacecraft detected a flow of submicron-sized dust particles coming from the direction of the Sun. These particles originate in the inner solar system from mutual collisions among meteoroids and move on hyperbolic orbits that leave the Solar System under the prevailing radiation pressure force. Later dust instruments with higher sensitivity had to avoid looking toward the Sun because of interference from the solar wind and UV radiation, and thus contributed little to the characterization of the dust stream. The one exception is the Ulysses dust detector, which observed escaping dust particles high above the solar poles, confirming the suspicion that charged nanometer-sized dust grains are carried to high heliographic latitudes by electromagnetic interactions with the Interplanetary Magnetic Field (IMF). Recently, the STEREO WAVES instruments recorded a large number of intense electric field signals, which were interpreted as impacts from nanometer-sized particles striking the spacecraft at velocities of about the solar wind speed. This high flux, and the strong spatial and/or temporal variation of nanometer-sized dust grains at low latitude, appears to be uncorrelated with solar wind properties. This is a mystery, as it would require that the total collisional meteoroid debris inside 1 AU be cast into nanometer-sized fragments.
The observed fluxes of inner-source pickup ions also point to the existence of a much enhanced dust population in the nanometer size range.

This new heliospheric phenomenon of nano-dust streams may have consequences throughout the planetary system, but as of yet no dust instrument exists that could shed light on their properties. We propose to develop a dust analyzer capable of detecting and analyzing these mysterious dust particles coming from the solar direction, and to undertake complementary theoretical studies to understand their characteristics. The instrument is based on the Cassini Dust Analyzer (CDA), which analyzed the composition of nanometer-sized dust particles emanating from the Jovian and Saturnian systems but could not be pointed toward the Sun. By applying technologies implemented in solar wind instruments and coronagraphs, a highly sensitive dust analyzer will be developed and tested in the laboratory. The dust analyzer shall be able to characterize the impact properties (impact charge and energy distribution of ions, from which the mass and speed of the impacting grains may be derived) and the chemical composition of individual nanometer-sized particles while exposed to solar wind and UV radiation. The measurements will enable us to identify the source of the dust by comparing its elemental composition with that of larger micrometeoroid particles of cometary and asteroidal origin, and will reveal the interaction of nano-dust with the interplanetary medium by relating the dust flux to solar wind and IMF properties.

Complementary theoretical studies will be performed to understand the characteristics of nano-dust particles at 1 AU, answering the following questions:
- What is the speed range at which nanometer sized particles impact
- API
TRMM (TMPA-RT) Near Real-Time Precipitation L3 1 day 0.25 degree x 0.25 degree V7 (TRMM_3B42RT_Daily) at GES DISC
data.nasa.gov | Last Updated 2022-01-17T05:59:46.000ZThe TMPA (3B42RT_Daily) dataset has been discontinued as of Dec. 31, 2019, and users are strongly encouraged to shift to the successor IMERG dataset (doi: 10.5067/GPM/IMERGDE/DAY/06; 10.5067/GPM/IMERGDL/DAY/06). This daily accumulated precipitation product is generated from the Near Real-Time 3-hourly TRMM Multi-Satellite Precipitation Analysis TMPA (3B42RT). It is produced at the NASA GES DISC as a value-added product. Simple summation of valid retrievals in a grid cell is applied for the data day. The result is given in (mm). Although the grid extends from 60S to 60N, near real-time retrievals at high latitudes (beyond 50S/N) are considered very unreliable and thus are screened out from the daily accumulations. The beginning and ending time for every daily granule are listed in the file global attributes, and are taken correspondingly from the first and the last 3-hourly granules participating in the aggregation. Thus the time period covered by one daily granule amounts to 24 hours, which can be inspected in the file global attributes. Counts of valid retrievals for the day are provided for every variable, making it possible to compute conditional and unconditional mean precipitation for grid cells where fewer than 8 retrievals for the day are available. Efforts have been made to make the format of this derived product as similar as possible to the new Global Precipitation Measurement CF-compliant file format. The latency of this derived daily product is about 7 hours after the UTC day is closed. Users should be mindful that the price for the short latency of these data is reduced quality compared to the research-quality product. The information provided here on the TRMM mission, and on the original 3-hr 3B42 product, remains relevant for this derived product. Note, however, that this product is in netCDF-4 format. The following describes the derivation in more detail.
The daily accumulation is derived by summing *valid* retrievals in a grid cell for the data day. Since the 3-hourly source data are in mm/hr, a factor of 3 is applied to the sum. Thus, for every grid cell we have

Pdaily = 3 * SUM{Pi * 1[Pi valid]}, i=[1,Nf]
Pdaily_cnt = SUM{1[Pi valid]}

where:
Pdaily - Daily accumulation (mm)
Pi - 3-hourly input, in (mm/hr)
Nf - Number of 3-hourly files per day, Nf=8
1[.] - Indicator function; 1 when Pi is valid, 0 otherwise
Pdaily_cnt - Number of valid retrievals in a grid cell per day

Grid cells for which Pdaily_cnt=0 are set to the fill value in the daily files. Note that Pi=0 is a valid value. On occasion, the 3-hourly source data have fill values for Pi in a very few grid cells. The total accumulation for such grid cells is still issued, in spite of the likelihood that the resulting accumulation has a larger uncertainty in representing the "true" daily total. These events are easily detectable using the "counts" variables that contain Pdaily_cnt, whereby users can screen out any grid cells for which Pdaily_cnt is less than Nf. There are various ways the accumulated daily error could be estimated from the source 3-hourly error. In this release, the daily error provided in the data files is calculated as follows. First, the squared 3-hourly errors are summed, and then the square root of the sum is taken. As with the precipitation, a factor of 3 is finally applied:

Perr_daily = 3 * { SUM[ (Perr_i * 1[Perr_i valid])^2 ] }^0.5 , i=[1,Nf]
Ncnt_err = SUM( 1[Perr_i valid] )

where:
Perr_daily - Magnitude of the daily accumulated error power, (mm)
Ncnt_err - The counts for the error variable

The Perr_daily computed this way represents the worst-case scenario, which assumes that the error in the 3-hourly source data, given in mm/hr, accumulates within the 3-hourly period of the source data and then over the day. These values, however, can easily be converted to a root mean square error estimate of the rainfall rate: rms_err = { (Perr_daily/3) ^2 / Ncnt
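The per-grid-cell aggregation described above can be sketched as follows. This is a minimal illustration, not the product's code; the function name and the use of None as a fill value for invalid retrievals are assumptions made for the example.

```python
def daily_accumulation(p3hr, perr3hr):
    """Aggregate eight 3-hourly TMPA rain rates (mm/hr) for one grid cell
    into a daily accumulation (mm), following the formulas above.
    Invalid retrievals are passed as None; a zero rate is a valid value.
    """
    valid = [p for p in p3hr if p is not None]
    pdaily_cnt = len(valid)
    # Factor of 3 converts each mm/hr rate, held over a 3-hour window, to mm
    pdaily = 3.0 * sum(valid) if pdaily_cnt > 0 else None  # None = fill value
    err_valid = [e for e in perr3hr if e is not None]
    ncnt_err = len(err_valid)
    # Root-sum-square of the valid 3-hourly errors, then the same factor of 3
    perr_daily = 3.0 * sum(e * e for e in err_valid) ** 0.5
    return pdaily, pdaily_cnt, perr_daily, ncnt_err
```

Note how a cell with no valid retrievals returns the fill value, mirroring the Pdaily_cnt=0 rule above, and how cells with Pdaily_cnt less than Nf=8 can be screened out by the caller using the returned count.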
- API
Experimental and Analytical Development of a Health Management System for Electro-Mechanical Actuators
data.nasa.gov | Last Updated 2020-01-29T01:49:29.000ZExpanded deployment of Electro-Mechanical Actuators (EMAs) in critical applications has created much interest in EMA Prognostic Health Management (PHM), a key enabling technology of Condition Based Maintenance (CBM). As such, Impact Technologies, LLC is collaborating with the NASA Ames Research Center to perform a number of research efforts in support of NASA’s Integrated Vehicle Health Management (IVHM) initiatives. These efforts have combined experimental test stand development, laboratory seeded fault testing, and physical model-based health monitoring in a comprehensive PHM system development strategy. This paper discusses two closely related EMA research programs being conducted by Impact and NASA Ames. The first of these efforts resulted in the creation of an electro-mechanical actuator test stand for the Prognostics Center of Excellence at the NASA Ames Research Center. The second effort is ongoing and is utilizing physics-based modeling techniques to develop an algorithm and software package toolset for PHM of aircraft EMA systems using a hybrid (virtual sensor) approach.
- API
Optimal Alarm Systems
data.nasa.gov | Last Updated 2020-01-29T03:25:13.000ZAn optimal alarm system is simply an optimal level-crossing predictor that can be designed to elicit the fewest false alarms for a fixed detection probability. It currently uses Kalman filtering for dynamic systems to provide a layer of predictive capability for the forecasting of adverse events. Predicted Kalman filter future process values and a fixed critical threshold can be used to construct a candidate level-crossing event over a predetermined prediction window. Because the alarm regions for an optimal level-crossing predictor cannot be expressed in closed form, one of our aims has been to investigate approximations for the design of an optimal alarm system. Approximations to this sort of alarm region are required for the most computationally efficient generation of an ROC curve or other similar alarm system design metrics. Algorithms based upon the optimal alarm system concept also require models that appeal to a variety of data mining and machine learning techniques. As such, we have investigated a serial architecture used to preprocess a full feature space with SVR (Support Vector Regression), implicitly reducing it to a univariate signal while retaining salient dynamic characteristics (see AIAA attachment below). This step was required due to current technical constraints, and is performed by using the residual generated by SVR (or potentially any regression algorithm), which has properties that are favorable for use as training data to learn the parameters of a linear dynamical system. Future development will lift these restrictions so as to allow for exposure to a broader class of models, such as a switched multi-input/output linear dynamical system in isolation based upon heterogeneous (both discrete and continuous) data, obviating the need for a preprocessing regression algorithm in serial.
However, the use of a preprocessing multi-input/output nonlinear regression algorithm in serial with a multi-input/output linear dynamical system will also allow the characterization of underlying static nonlinearities to be investigated. We will also investigate the use of non-parametric methods such as Gaussian process regression and particle filtering in isolation, to lift the linear and Gaussian assumptions, which may be invalid for many applications. Future work will also involve improvement of the approximations inherent in the use of the optimal alarm system, or optimal level-crossing predictor. We will also perform more rigorous testing and validation of the alarm systems discussed by using standard machine learning techniques, and consider more complex, yet practically meaningful, critical level-crossing events. Finally, a more detailed investigation of model fidelity with respect to available data and metrics has been conducted (see attachment below). As such, future work on modeling will involve the investigation of necessary improvements in initialization techniques and data transformations for a more feasible fit to the assumed model structure. Additionally, we will explore the integration of physics-based and data-driven methods in a Bayesian context, by using a more informative prior.
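To make the level-crossing prediction idea in this entry concrete, the sketch below propagates a scalar linear-Gaussian (Kalman-style) prediction over a window and approximates the probability of a crossing by treating per-step exceedances as independent. That independence assumption is exactly the kind of approximation discussed above, since the exact alarm region has no closed form; the scalar model and all names are illustrative assumptions, not the project's actual implementation.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def crossing_probability(x0, p0, a, q, threshold, window):
    """Approximate P(process exceeds `threshold` at least once in the next
    `window` steps) for x[k+1] = a*x[k] + w, w ~ N(0, q), starting from the
    current filtered estimate x0 with variance p0.

    Per-step exceedance probabilities are combined under an independence
    approximation; the exact joint level-crossing probability has no
    closed form.
    """
    mean, var = x0, p0
    p_no_cross = 1.0
    for _ in range(window):
        mean = a * mean            # one-step-ahead predicted mean
        var = a * a * var + q      # one-step-ahead predicted variance
        p_exceed = 1.0 - phi((threshold - mean) / math.sqrt(var))
        p_no_cross *= 1.0 - p_exceed
    return 1.0 - p_no_cross
```

An alarm would then fire when this probability passes a design point chosen, for example, from an ROC curve trading false alarms against detection probability.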
- API
Metrics for Evaluating Performance of Prognostic Techniques
data.nasa.gov | Last Updated 2020-01-29T03:23:28.000ZPrognostics is an emerging concept in condition based maintenance (CBM) of critical systems. Along with developing the fundamentals of being able to confidently predict Remaining Useful Life (RUL), the technology calls for fielded applications as it inches towards maturation. This requires a stringent performance evaluation so that the significance of the concept can be fully exploited. Currently, prognostics concepts lack standard definitions and suffer from ambiguous and inconsistent interpretations. This lack of standards is in part due to the varied end-user requirements for different applications, time scales, available information, domain dynamics, and other issues. Instead, the research community has used a variety of metrics chosen largely for convenience with respect to their respective requirements. Very little attention has been focused on establishing a common ground to compare different efforts. This paper surveys the metrics that are already used for prognostics in a variety of domains including medicine, nuclear, automotive, aerospace, and electronics. It also considers other domains that involve prediction-related tasks, such as weather and finance. Differences and similarities between these domains and health maintenance have been analyzed to help understand what performance evaluation methods may or may not be borrowed. Further, these metrics have been categorized in several ways that may be useful in deciding upon a suitable subset for a specific application. Some important prognostic concepts have been defined using a notational framework that enables coherent interpretation of different metrics. Last but not least, a list of metrics has been suggested to assess critical aspects of RUL predictions before they are fielded in real applications.
- API
ORACLES Navigational and Meteorological Data
data.nasa.gov | Last Updated 2022-08-22T13:04:33.000ZORACLES_MetNav_AircraftInSitu_Data are in situ meteorological and navigational measurements collected onboard the P-3 Orion or ER-2 aircraft during the ObseRvations of Aerosols above CLouds and their intEractionS (ORACLES) campaign. These measurements were collected from August 19, 2016 – October 27, 2016, August 1, 2017 – September 4, 2017 and September 21, 2018 – October 27, 2018. ORACLES provides multi-year airborne observations over the complete vertical column of key parameters that drive aerosol-cloud interactions in the southeast Atlantic, an area with some of the largest inter-model differences in aerosol forcing assessments on the planet. The P-3 Orion aircraft was utilized as a low-flying platform for simultaneous in situ and remote sensing measurements of aerosols and clouds and was supplemented by ER-2 remote sensing during the 2016 campaign. Data collection for this product is complete. Southern Africa produces almost one-third of the Earth’s biomass burning aerosol particles. The ORACLES (ObseRvations of Aerosols above CLouds and their intEractionS) experiment was a five year investigation with three intensive observation periods (August 19, 2016 – October 27, 2016; August 1, 2017 – September 4, 2017; September 21, 2018 – October 27, 2018) and was designed to study key processes that determine the climate impacts of African biomass burning aerosols. ORACLES provided multi-year airborne observations over the complete vertical column of the key parameters that drive aerosol-cloud interactions in the southeast Atlantic, an area with some of the largest inter-model differences in aerosol forcing assessments. These inter-model differences in aerosol and cloud distributions, as well as their combined climatic effects in the SE Atlantic are partly due to the persistence of aerosols above clouds. 
The varying separation of cloud and aerosol layers sampled during ORACLES allows for a process-oriented understanding of how variations in radiative heating profiles impact cloud properties, which is expected to improve model simulations for other remote regions that experience long-range aerosol transport above clouds. ORACLES utilized two NASA aircraft, the P-3 and the ER-2. The P-3 was used as a low-flying platform for simultaneous in situ and remote sensing measurements of aerosols and clouds in all three campaigns, supplemented by ER-2 remote sensing in 2016. ER-2 observations will be used to enhance satellite-based remote sensing by resolving variability within a particular scene, and by guiding the development of new and improved remote sensing techniques.
- API
CORONA Satellite Photographs from the U.S. Geological Survey
data.nasa.gov | Last Updated 2022-01-17T05:16:00.000ZThe first generation of U.S. photo intelligence satellites collected more than 860,000 images of the Earth’s surface between 1960 and 1972. The classified military satellite systems code-named CORONA, ARGON, and LANYARD acquired photographic images from space and returned the film to Earth for processing and analysis. The images were originally used for reconnaissance and to produce maps for U.S. intelligence agencies. In 1992, an Environmental Task Force evaluated the application of early satellite data for environmental studies. Since the CORONA, ARGON, and LANYARD data were no longer critical to national security and could be of historical value for global change research, the images were declassified by Executive Order 12951 in 1995. The first successful CORONA mission was launched from Vandenberg Air Force Base in 1960. The satellite acquired photographs with a telescopic camera system and loaded the exposed film into recovery capsules. The capsules, or buckets, were de-orbited and retrieved in mid-air by aircraft as they parachuted toward Earth. The exposed film was developed and the images were analyzed for a range of military applications. The intelligence community used Keyhole (KH) designators to describe system characteristics and accomplishments. The CORONA systems were designated KH-1, KH-2, KH-3, KH-4, KH-4A, and KH-4B. The ARGON systems used the designator KH-5 and the LANYARD systems used KH-6. Mission numbers were a means for indexing the imagery and associated collateral data. A variety of camera systems were used with the satellites. Early systems carried a single panoramic camera (KH-1, KH-2, KH-3, and KH-6) or a single frame camera (KH-5). The later systems (KH-4, KH-4A, and KH-4B) carried two panoramic cameras with a separation angle of 30°, one camera looking forward and the other looking aft.
The original film and technical mission-related documents are maintained by the National Archives and Records Administration (NARA). Duplicate film sources held in the USGS EROS Center archive are used to produce digital copies of the imagery. Mathematical calculations based on camera operation and satellite path were used to approximate image coordinates. Since the accuracy of the coordinates varies according to the precision of information used for the derivation, users should inspect the preview image to verify that the area of interest is contained in the selected frame. Users should also note that the images have not been georeferenced.
- API
Probabilistic Model-Based Diagnosis for Electrical Power Systems
data.nasa.gov | Last Updated 2020-01-29T02:07:39.000ZWe present in this article a case study of the probabilistic approach to model-based diagnosis. Here, the diagnosed system is a real-world electrical power system, namely the Advanced Diagnostic and Prognostic Testbed (ADAPT) located at the NASA Ames Research Center. Our probabilistic approach is formally well-founded, and based on Bayesian networks and arithmetic circuits. We pay special attention to meeting two of the main challenges often associated with real-world application of model-based diagnosis technologies: model development and real-time reasoning. To address the challenge of model development, we develop a systematic approach to representing electrical power systems as Bayesian networks, supported by an easy-to-use specification language. To address the real-time reasoning challenge, we compile Bayesian networks into arithmetic circuits. Arithmetic circuit evaluation supports real-time diagnosis by being predictable and fast. In experiments with the ADAPT Bayesian network, which contains 503 discrete nodes and 579 edges, the approach produces accurate results, and the time taken to compute the most probable explanation using arithmetic circuits has a mean of 0.2625 milliseconds and a standard deviation of 0.2028 milliseconds. In comparative experiments, we found that while the variable elimination and join tree propagation algorithms also perform very well in the ADAPT setting, arithmetic circuit evaluation was an order of magnitude or more faster. **Reference:** O. J. Mengshoel, M. Chavira, K. Cascio, S. Poll, A. Darwiche, and S. Uckun. "Probabilistic Model-Based Diagnosis: An Electrical Power System Case Study". Accepted to IEEE Transactions on Systems, Man, and Cybernetics, Part A, 2009.
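To make the compilation idea concrete, here is a toy, hand-compiled example: the network polynomial of a two-node Bayesian network evaluated the way an arithmetic circuit evaluates it, by setting evidence indicators and taking a sum of products. The two-node network and all its numbers are invented for illustration; the actual ADAPT circuit is machine-compiled from a 503-node network.

```python
# Toy network A -> B with binary variables, compiled by hand into its network
# polynomial f = sum_{a,b} lambda_A(a) * theta(a) * lambda_B(b) * theta(b|a).
# Setting the evidence indicators and evaluating f yields P(evidence).

theta_a = {0: 0.6, 1: 0.4}                      # P(A)
theta_b_given_a = {(0, 0): 0.9, (0, 1): 0.1,    # P(B | A)
                   (1, 0): 0.2, (1, 1): 0.8}

def evaluate(evidence):
    """evidence maps 'A'/'B' to an observed value; unobserved vars are absent."""
    def lam(var, val):
        # Evidence indicator: 1 if consistent with the observation (or unobserved)
        return 1.0 if var not in evidence or evidence[var] == val else 0.0
    return sum(lam('A', a) * theta_a[a] * lam('B', b) * theta_b_given_a[(a, b)]
               for a in (0, 1) for b in (0, 1))
```

Evaluation is a fixed sequence of multiplications and additions regardless of the evidence, which is why its running time is predictable, the property the article exploits for real-time diagnosis.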
- API
MISR Level 3 FIRSTLOOK Global Land product in netCDF format covering a day V002
data.nasa.gov | Last Updated 2023-01-19T22:32:40.000ZThis file contains the MISR Level 3 FIRSTLOOK Component Global Land product in netCDF format covering a day.
- API
Rapid Electrochemical Detection and Identification of Microbiological and Chemical Contaminants for Manned Spaceflight Project
data.nasa.gov | Last Updated 2020-01-29T03:33:53.000ZA great deal of effort has gone into the development of point-of-use methods to meet the challenge of rapid bacterial identification for both environmental monitoring and clinical applications. Unfortunately, most of the methods developed rely on the Polymerase Chain Reaction (PCR) and face inherent limitations because of the requirement for enzymatic components and thermal control. Other methods based on surface plasmon resonance, quartz crystal microbalance, and fluorescence have been reported with good detection limits, but these methods are immunological and cannot provide genetic-level information. Further, they require labeled markers, complicated fluid handling systems, and sensitive optics that drive up cost and complexity and preclude their use outside the laboratory. Recent work by a group at the University of Toronto has focused on developing an electrochemical platform that combines ultrasensitive detection, straightforward sample processing, and inexpensive components to create a cost-effective, user-friendly device for detection and identification of microorganisms. The platform combines an electrical cell lysis chamber, an electrochemical reporter system, and nanostructured microelectrodes (NMEs) to detect specific nucleic acid sequences. The nucleic acid sequences are unique to a given type of microorganism and can be used to identify the microorganisms present in a sample.

From the perspective of the anticipated prototype device (Lam, et al. 2012. Polymerase Chain Reaction-Free, Sample-to-Answer Bacterial Detection in 30 Minutes with Integrated Cell Lysis. Anal. Chem. 84(1): 21-5), detection of microbial contaminants will begin with a lysis chamber designed to release DNA and RNA from microorganisms present in the sample using ultrasonic or electrochemical technology. The DNA and RNA mixture is then passed into an analysis chamber containing an array of nanostructured microelectrodes (NMEs). The surface of the NMEs will be functionalized with probe molecules for DNA or RNA sequences specific to the bacteria being targeted. Binding of the DNA or RNA to the appropriate detection probe on the NME surface in the presence of an electrochemical reporter system will change the electrochemical properties of the NMEs. A potentiostat is used to measure the current at each individual electrode before and after addition of the DNA and RNA mixture. The difference in current before and after addition of the mixture to the NMEs is compared against a pre-determined threshold to check for the presence of target bacteria in the sample. The process for detection of chemical contaminants is very similar. The lysis chamber would be bypassed and the sample would flow directly into the analysis chamber. The NMEs will be functionalized with molecules that selectively bind the desired targets (analytes), and the change in the electrochemical response of each NME can again be used to detect and quantify the contaminants. Depending on the analyte of interest, it may be possible to directly measure analyte binding on the surface of the NMEs without the use of an electrochemical reporter system.
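The before/after current comparison described above amounts to simple per-electrode thresholding; a minimal sketch, with all names, values, and units hypothetical:

```python
def detect_targets(current_before, current_after, thresholds):
    """Flag each probe-functionalized electrode whose current change after
    sample addition exceeds its pre-determined detection threshold.

    All three arguments are dicts keyed by probe name; currents and
    thresholds are in the same (arbitrary) units.
    """
    return {probe: (current_after[probe] - current_before[probe]) > threshold
            for probe, threshold in thresholds.items()}
```

For chemical contaminants the same comparison applies, with the electrodes functionalized for the target analytes instead of nucleic acid probes.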
The overall project will focus on optimization of the individual aspects of the detection platform in preparation for construction of a prototype for a flight experiment. The scope of the work in this proposal is limited to characterization and optimization of the lysis step/sample preparation, probe selection, and NME structure. Lysis conditions will be optimized by evaluating parameters associated with the oscillation frequency and lysis time for the ultrasonic techniques, and the applied voltage for the electrochemical techniques. Cell viability, as determined by fluorescent detection of DNA or R