Technology Readiness Levels
What are Technology Readiness Levels or TRLs?
Technology Readiness Levels (TRLs) are a tool for assessing and communicating the maturity of a technology project. The TRL system has nine levels, split into three groups: TRLs 1-3 (least mature) cover the research stages of development, TRLs 4-6 cover the development stages, and TRLs 7-9 (most mature) cover the deployment stages.
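For readers who work with this grouping programmatically, the minimal Python sketch below (illustrative only; the function name and structure are our own and not part of any TRL standard or FrontierSI tooling) maps a TRL number to its phase:

```python
def trl_phase(trl: int) -> str:
    """Return the maturity phase for a TRL from 1 to 9 (hypothetical helper)."""
    if not 1 <= trl <= 9:
        raise ValueError("TRL must be between 1 and 9")
    if trl <= 3:
        return "Research"      # TRLs 1-3: least mature, research stages
    if trl <= 6:
        return "Development"   # TRLs 4-6: development stages
    return "Deployment"        # TRLs 7-9: most mature, deployment stages

print(trl_phase(5))  # Development
```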
Originally developed by NASA for space exploration technologies in the 1970s, TRLs have since been widely adopted globally by a range of organisations across government and industry, including Google, BP, the EU, the Australian Department of Defence and safety-critical industries such as mining, aerospace and space. TRLs give stakeholders clear expectations of one another, a shared language for discussing technology maturity and risk, and a systematic approach to the system development lifecycle with clear guideposts and milestones.
How does FrontierSI use TRLs?
Typically, TRLs 1-3 fall within the domain of research organisations such as universities, and TRLs 7-9 fall within the domain of industry. The middle levels, TRLs 4-6, take promising technologies through the journey to maturity, ready for production and deployment. This ‘middle space’ is sometimes referred to as the ‘valley of death’ because it is often neglected, and it is the space in which FrontierSI aims to create impact alongside its partners.
At FrontierSI, we use TRLs for a variety of purposes across many of our projects. They provide a framework for setting clear expectations at the project design stage, when commencing projects, and during project delivery. TRLs also help us assess progress over time and serve as a communication tool for exploring what’s possible in the future. This helps us get delivery to where it needs to be and brings project partners together into an ecosystem.
The TRLs we work with are:
Phase | TRL | Definition | Hardware Description | Software/Data Analysis Description
Research | 1 | Basic principles observed | Scientific knowledge generated underpinning hardware technology concepts/applications. | Scientific knowledge generated underpinning basic properties of software architecture and mathematical formulation.
Research | 2 | Technology concept formulated | Invention begins; a practical application is identified but is speculative, and no experimental proof or detailed analysis is available to support the conjecture. A tailored solution is defined based on definition of user requirements, specific application, and operating conditions. | A practical application is identified but is speculative, and no experimental proof or detailed analysis is available to support the conjecture. Basic properties of algorithms, representations and concepts defined. Basic principles coded. Experiments performed with synthetic or sample data.
Research | 3 | Experimental proof of concept | Analytical studies place the technology solution in an appropriate context, and laboratory demonstrations, controlled field tests, modelling and simulation validate analytical predictions and limitations. | Development of limited functionality to validate critical properties and predictions using non-integrated software components.
Development | 4 | Technology validated in lab/research/local environment | A low-fidelity system/component breadboard assembly is built and operated to demonstrate basic functionality in critical test environments. Associated performance predictions are defined relative to the final operating environment. | Key, functionally critical software components are integrated and functionally validated to establish interoperability and begin architecture development. Relevant environments defined and performance in these environments predicted.
Development | 5 | Technology validated in relevant environment | A medium-fidelity system/component brassboard assembly is built and operated to demonstrate overall performance in a simulated operational environment with realistic support elements, validating the performance of core components. Performance predictions are made for subsequent development phases. | End-to-end software elements implemented and interfaced with existing systems conforming to the target environment. End-to-end software system tested in a relevant environment, meeting predicted performance. Operational environment performance predicted. Prototype implementations developed.
Development | 6 | Technology demonstrated in relevant environment | A high-fidelity system/component prototype assembly that adequately addresses all core features and key performance metrics is built and operated in relevant environments to demonstrate operation under relevant environmental conditions. | Prototype implementations of the software demonstrated on full-scale, realistic problems. Partially integrated with existing hardware/software systems. Limited documentation available. Engineering feasibility fully demonstrated.
Deployment | 7 | System prototype demonstrated in operational environment | A high-fidelity system/engineering unit that adequately addresses all critical features and key performance metrics is built and operated to rigorously demonstrate performance in the actual operational environment and platform (ground, airborne, or space). | Prototype software exists with all key functionality available for demonstration and test. Well integrated with operational hardware/software systems, demonstrating operational feasibility. Most software bugs removed. Limited documentation available.
Deployment | 8 | System complete and qualified | The feature-complete system in its final configuration is successfully demonstrated through testing and performance analysis against key metrics for its intended operational environment and platform (ground, airborne, or space). | All software has been thoroughly debugged and fully integrated with all operational hardware and software systems. All user, training and maintenance documentation completed. All functionality successfully demonstrated in simulated operational scenarios. Verification and validation (V&V) completed.
Deployment | 9 | Actual system proven in operational environment | The finalised system is successfully deployed and operated in an actual mission and is proven to meet performance requirements in situ. | All software has been thoroughly debugged and fully integrated with all operational hardware/software systems. All documentation has been completed. Sustaining software engineering support is in place. The system has been successfully operated in the operational environment.
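As an illustrative aside, the table above can also be captured as data so that a project’s assessed TRL can be recorded and reviewed over time. The Python sketch below uses hypothetical names throughout and is not a FrontierSI system; it simply encodes the level definitions and a point-in-time assessment record:

```python
from dataclasses import dataclass
from datetime import date

# Level definitions taken from the table above.
TRL_DEFINITIONS = {
    1: "Basic principles observed",
    2: "Technology concept formulated",
    3: "Experimental proof of concept",
    4: "Technology validated in lab/research/local environment",
    5: "Technology validated in relevant environment",
    6: "Technology demonstrated in relevant environment",
    7: "System prototype demonstrated in operational environment",
    8: "System complete and qualified",
    9: "Actual system proven in operational environment",
}

@dataclass
class TRLAssessment:
    """One point-in-time TRL assessment for a project (hypothetical structure)."""
    project: str
    assessed_on: date
    trl: int

    @property
    def definition(self) -> str:
        return TRL_DEFINITIONS[self.trl]

# Example: recording progress through the 'valley of death' (TRLs 4-6).
history = [
    TRLAssessment("Example project", date(2023, 3, 1), 4),
    TRLAssessment("Example project", date(2024, 3, 1), 6),
]
for a in history:
    print(a.assessed_on, a.trl, a.definition)
```

A simple record like this is one way to keep assessments consistent between partners, though in practice the conversation around each assessment matters more than the data structure.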