Public Metadata

This form must be completed in one session unless you use the Save/Restore buttons below. If you have any questions about this form, please send email to grc-dl-strs-repository-manager@mail.nasa.gov. You need a JavaScript-enabled browser to use this web page; without JavaScript, the buttons will not work correctly.

The following metadata is to be stored in the STRS Application Repository. Except for the submitter's email address, it will be publicly available.

Name of Item Submitted
Enter the unique name. This name should be used in all other forms, paperwork, and email for identification.
NASA only: this name should be the same as the name used in the New Technology Report (NTR).
Non-NASA: use the name from your release process.
Enter the version, if known.
Enter your identification number (NTR#, LEW#, NPO#, etc.), if known.

Type of Item Submitted
Enter the item type (click one of the example buttons provided to avoid typing). If none of the examples applies, please enter a one-word or two-word summary of the type.

Description
Enter the usage (maximum of 80 characters).
Enter the functional description.
NASA only: most of the usage and functional description may be created from the NTR data's descriptive title (1), abstract (9), description of the problem (Section I), and purpose (Section II).

Capabilities
Enter the main capabilities of the item (click one of the example buttons provided to avoid typing). If none of the examples applies, please enter a one-word or two-word summary of the capabilities. The capabilities should include such items as: frequency range(s), modulation, spreading, functionality, forward error correction/encoding/decoding, user data rates, over-the-air symbol rates, scrambling, and data formatting.

Target Platform
Enter the target platform (name of target system, OS, FPGAs, tools, versions, etc.).
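As a rough illustration of the fields above, one repository record could be modeled as a plain dictionary. All field names and values here are hypothetical assumptions for illustration only; they are not the repository's actual schema.

```python
# Hypothetical sketch of one STRS public-metadata record.
# Field names and values are illustrative assumptions,
# not the repository's actual schema.
metadata = {
    "name": "ExampleWaveform",        # unique name, used in all other paperwork
    "version": "1.0",                 # if known
    "identification_number": None,    # NTR#, LEW#, NPO#, etc., if known
    "item_type": "waveform",
    "usage": "Example L-band telemetry waveform",  # maximum of 80 characters
    "functional_description": "Implements BPSK modulation with convolutional FEC.",
    "capabilities": ["BPSK modulation", "convolutional encoding"],
    "target_platform": "Example FPGA board, vendor toolchain v1.2",
}

# The form's 80-character limit on the usage field can be checked directly:
assert len(metadata["usage"]) <= 80
```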
Software Classification
Enter the software classification:
- Not determined or N/A
- Class A: Human Rated Space Software Systems
- Class B: Non-Human Space Rated Software Systems or Large Scale Aeronautics Vehicles
- Class C: Mission Support Software, Aeronautic Vehicles, or Major Engineering/Research Facility Software
- Class D: Basic Science/Engineering Design and Research and Technology Software
- Class E: Small Light Weight Design Concept and Research and Technology Software
- Class F: General Purpose Computing Software (Multi-Center or Multi-Program/Project)
- Class G: General Purpose Computing Software (Single Center or Project)
- Class H: General Purpose Desktop Software

Safety Critical
Is the application developed for a safety-critical system? (Choose: Yes, No, TBD.)
NASA only: safety-critical software is defined in detail in NASA-STD-8739.8, Appendix A, The Software Assurance Classification Assessment.

Deployment Date
Enter the deployment date (mm/dd/yyyy). Leave this field blank if the application has not been deployed. The deployment date is when the item is first used in the target environment rather than a test environment.

Architecture
Enter the software architecture and version:
- None
- STRS v1.02
- STRS NASA-STD-4009
- STRS NASA-STD-4009A
- OMG STI 1.0
- Other
If "Other" was selected, please name it so it can be added to the list.

Release Category
Enter the release category:
- Public Release (PUBLIC)
- Open Source Release (OPEN)
- U.S. and Foreign Release (FOREIGN)
- U.S. Release Only (US)
- U.S. Government Purpose Release (GPR)
- GPR/Beta Release (BETA)
- GPR/Project Release (PROJECT)
- GPR/Developmental Release (DEV)
- GPR/Interagency Release (INTERAGENCY)
- GPR/NASA Release (NASA)
- Not releasable (NONE)
- Vendor release process
- Not determined yet (TBD)
(NASA only: more detail in NPR 2210.1, Section A.2.)

Release Restrictions
Are there any export control restrictions? (Choose: None, ITAR, SBU, CUI, EAR, Unknown.) The export control restrictions may be International Traffic in Arms Regulations (ITAR), Sensitive But Unclassified (SBU), Controlled Unclassified Information (CUI), Export Administration Regulations (EAR), etc.
Enter any contract restrictions. (Choose: None, Agreement, Copyright, Patent.)

Models
Enter the availability of models (Choose: Yes, No). Models of interest are files that may be used by a tool to perform simulations or generate code. The description of these models and the associated tools is required elsewhere, but the availability of such models to speed up the reuse process is desirable for evaluating the reuse level of effort.

Technology Readiness Level (TRL)
Enter the Technology Readiness Level (TRL):
1. Basic principles observed and reported.
2. Technology concept and/or application formulated.
3. Analytical and experimental critical function and/or characteristic proof-of-concept.
4. Component/subsystem validation in laboratory environment.
5. System/subsystem/component validation in relevant environment.
6. System/subsystem model or prototyping demonstration in a relevant end-to-end environment.
7. System prototyping demonstration in an operational environment (ground or space).
8. Actual system completed and "mission qualified" through test and demonstration in an operational environment.
9. Actual system "mission proven" through successful mission operations (ground or space).
Not determined yet (TBD).
(NASA only: more detail in NPR 7123.1, Appendix E.)
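The deployment date field above expects the mm/dd/yyyy format, with an empty field meaning "not yet deployed." A submission script could validate an entry along these lines (a minimal sketch; the form's own validation logic is not specified here):

```python
from datetime import datetime

def is_valid_deployment_date(text: str) -> bool:
    """Return True if text is a valid mm/dd/yyyy date or empty.

    An empty field means the application has not been deployed,
    which the form also accepts.
    """
    if text == "":
        return True
    try:
        datetime.strptime(text, "%m/%d/%Y")
        return True
    except ValueError:
        return False

print(is_valid_deployment_date("07/04/2015"))  # True  (matches mm/dd/yyyy)
print(is_valid_deployment_date("2015-07-04"))  # False (wrong format)
```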
TRL Justification
Enter a short (publicly viewable) TRL justification. This is usually an indication of the mission and operational environment where the item was tested and used.

Target/Operational Environment
Enter whether the target/operational environment is Space or Ground (Choose: Space Only, Ground Only, Both Space and Ground). Note that this is to support the TRL above.

Reuse Readiness Levels (RRL)
The Reuse Readiness Level (RRL) Assessment will be stored in the STRS Application Repository. For background information, see NASA's Earth Observing System Data and Information System (EOSDIS) web site; specifically, NASA's Earth Science Data System Working Group (ESDSWG) has studied reuse. Reference: Reuse Readiness Levels as a Measure of Software Reusability.

Select one assessment level for each RRL category below.

(1) Documentation
1. Little or no internal or external documentation available
2. Partially to fully commented source code available
3. Basic external documentation for sophisticated users available
4. Reference manual available
5. User manual available
6. Tutorials available
7. Interface guide available
8. Extension guide and/or design/developers guide available
9. Documentation on design, customization, testing, use, and reuse is available

(2) Extensibility
1. No ability to extend or modify program behavior
2. Very difficult to extend the software system, even for application contexts similar to the original application domain
3. Extending the software is difficult, even for application contexts similar to the original application domain
4. Some extensibility is possible through configuration changes and/or moderate software modification
5. Consideration for future extensibility designed into the system for a moderate range of application contexts; extensibility approach defined and at least partially documented
6. Designed to allow extensibility across a moderate to broad range of application contexts, provides many points of extensibility, and a thorough and detailed extensibility plan exists
7. Demonstrated to be extensible by an external development team in a similar context
8. Demonstrated extensibility on an external program, clear approach for modifying and extending features across a broad range of application domains
9. Demonstrated extensibility in multiple scenarios, provides specific documentation and features to build extensions which are used across a range of domains by multiple user groups

(3) Intellectual Property Issues
1. Product developers have been identified, but no rights have been determined.
2. Developers are discussing rights that comply with their organizational policies.
3. Rights agreements have been proposed to developers.
4. Developers have negotiated on rights agreements.
5. Agreement on ownership, limited reuse rights, and recommended citation.
6. Developer list, recommended citation, and rights statements have been drafted.
7. Developer list and limited rights statement included in product prototype.
8. Recommended citation and intellectual property rights statement included in product.
9. Statements describing unrestricted rights, recommended citation, and developers embedded into product.

(4) Modularity
1. Not designed with modularity
2. (Between)
3. Modularity at major system or subsystem level only
4. (Between)
5. Partial segregation of generic and specific functionality
6. (Between)
7. Clear delineations of specific and reusable components
8. (Between)
9. All functions and data encapsulated into objects or accessible through web service interfaces

(5) Packaging
1. Software or executable available only, no packaging
2. (Between)
3. Detailed installation instructions available
4. (Between)
5. Software is easily configurable for different contexts
6. (Between)
7. OS detect and auto-build for supported platforms
8. (Between)
9. Installation user interface provided

(6) Portability
1. The software is not portable
2. Some parts of the software may be portable
3. The software is only portable with significant costs
4. The software may be portable at a reasonable cost
5. The software is moderately portable
6. The software is portable
7. The software is highly portable
8. (Between)
9. The software is completely portable

(7) Standards Compliance
1. No standards compliance
2. No standards compliance beyond best practices
3. Some compliance with local standards and best practices
4. Standards compliance, but incomplete and untested
5. Standards compliance with some testing
6. Verified standards compliance with proprietary standards
7. Verified standards compliance with open standards
8. Verified standards compliance with recognized standards
9. Independently verified standards compliance with recognized standards

(8) Support
1. No support available
2. Minimal support available
3. Some support available
4. Moderate systematic support is available
5. Support provided by an informal user community
6. Formal support available
7. Organized/defined support by developer available
8. Support available by the organization that developed the asset
9. Large user community with well-defined support available

(9) Verification and Testing
1. No testing performed
2. Software application formulated and unit testing performed
3. Testing includes testing for error conditions and proof of handling of unknown input
4. Software application demonstrated in a laboratory context
5. Software application tested and validated in a laboratory context
6. Software application demonstrated in a relevant context
7. Software application tested and validated in a relevant context
8. Software application "qualified" through test and demonstration (meets requirements) and successfully delivered
9. Actual software application tested and validated through successful use of application output

Click the button provided to compute the RRL Average; the computed value is not editable.

RRL Justification
Enter a short (publicly viewable) RRL justification. This is usually an indication of the level of effort for design, documentation, testing, configuration management, standards used, etc.

Contact
Enter your email address in case there are any questions relating to the metadata. Except for the submitter's email address, the data will be publicly available.
Would you like to receive a copy of your Public Metadata as submitted? (Choose: Yes, No.) If you checked Yes, please re-enter your email address for verification. Thank you.

Saving, Restoring, and Submitting
To restore previously saved data, click the button provided, using the same computer/browser as when the data was saved. To save data entered on the form for later restoration, to get more information, or to reset all entries to blank, click the corresponding buttons. To send the form by email, click the Submit button. If the contents of this form should be encrypted, or if the Submit button produces an error not described below, use the alternative submission button provided. If "Errors in STRS Form Data" is shown below, fix the errors and try again. If no errors are displayed, the RRL Average above is computed before submitting.

Thank you for your participation in creating the STRS Application Repository public metadata. The public metadata will be stored in the STRS Application Repository for the benefit of all potential users of the application.
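The form reports an RRL Average computed from the nine category selections above. Assuming it is a simple arithmetic mean of the nine scores (the form's exact formula is not stated in the text, so this is an illustrative sketch), the computation might look like:

```python
# The nine RRL categories from the assessment above. The key names here
# are illustrative identifiers, not the form's actual field names.
RRL_CATEGORIES = [
    "documentation", "extensibility", "intellectual_property",
    "modularity", "packaging", "portability",
    "standards_compliance", "support", "verification_and_testing",
]

def rrl_average(scores: dict) -> float:
    """Arithmetic mean of the nine RRL category scores (each 1-9).

    Assumes a simple mean; the STRS form's exact formula is not
    specified in the text above.
    """
    for name in RRL_CATEGORIES:
        if name not in scores:
            raise ValueError(f"missing RRL category: {name}")
        if not 1 <= scores[name] <= 9:
            raise ValueError(f"{name} score must be 1-9, got {scores[name]}")
    return sum(scores[name] for name in RRL_CATEGORIES) / len(RRL_CATEGORIES)

# Example: every category assessed at level 5 yields an average of 5.0.
print(rrl_average({name: 5 for name in RRL_CATEGORIES}))  # 5.0
```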