SOCNET: The Special Operations Community Network > U.S. Air Force Special Operations > AFSOC Aviation

  #1  
Old 30 March 2012, 19:13
Rockville
Confirmed User
 
Join Date: Feb 2012
Location: Woodbridge, VA
Posts: 657
F-22

The scientists have figured out what's wrong with the F-22.

Some pilots blacked out from lack of oxygen.

DoD News Briefing on F-22 Scientific Advisory Board Findings
http://www.defense.gov/transcripts/t...nscriptid=5001
  #2  
Old 1 April 2012, 07:46
Maestro
Confirmed Abuser
 
Join Date: Oct 2011
Location: Skool in DC
Posts: 95
If there's a lesson here for GPF & SOF, it's that in highly complex systems, failure modes pop up in unanticipated areas. It may be better to have a "second-best" system and accept slightly lower performance in return for predictable, manageable failure modes than to insist on the absolute top-end tech and get caught off guard by unforeseen catastrophic failures.
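
To put rough numbers on the trade-off, here is a notional back-of-the-envelope sketch in Python; every component count and reliability figure below is invented for illustration, not drawn from any real aircraft:

[CODE]
# Toy illustration: a system of many components in series fails if any one of
# them fails, so piling on complexity erodes mission reliability even when
# every individual part is excellent. All numbers are made up.

def mission_reliability(num_components: int, component_reliability: float) -> float:
    """Probability a series system gets through a mission with no failure."""
    return component_reliability ** num_components

# "Second-best" system: fewer, well-understood subsystems
simple = mission_reliability(num_components=50, component_reliability=0.999)

# Top-end system: far more interacting subsystems, each still 99.9% reliable
complex_ = mission_reliability(num_components=300, component_reliability=0.999)

print(f"Simpler system: {simple:.3f}")   # ~0.951
print(f"Complex system: {complex_:.3f}") # ~0.741
[/CODE]

Same per-part quality, six times the part count, and the odds of a clean mission drop from about 95% to about 74%, and that is before counting the interaction failures nobody modeled.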
  #3  
Old 1 April 2012, 09:34
Tracy
Been There Done That
 
Join Date: Feb 1997
Location: West
Posts: 9,962
Quote:
Originally Posted by Maestro
If there's a lesson here for GPF & SOF, it's that in highly complex systems, failure modes pop up in unanticipated areas. It may be better to have a "second-best" system and accept slightly lower performance in return for predictable, manageable failure modes than to insist on the absolute top-end tech and get caught off guard by unforeseen catastrophic failures.
I strongly agree.
  #4  
Old 1 April 2012, 13:38
smp52
Confirmed User
 
Join Date: Feb 2004
Location: So. Cal
Posts: 1,311
Quote:
Originally Posted by Maestro
If there's a lesson here for GPF & SOF, it's that in highly complex systems, failure modes pop up in unanticipated areas. It may be better to have a "second-best" system and accept slightly lower performance in return for predictable, manageable failure modes than to insist on the absolute top-end tech and get caught off guard by unforeseen catastrophic failures.
+2

This dovetails with the systems engineering approach prescribed by Colonel John Boyd and his colleagues, and with the Black Swan concept described by Nassim Nicholas Taleb. Taleb focused on the financial industry, but the theory is applicable elsewhere.

From a systems engineering standpoint, I concur, and I hope more people see this, particularly the decision makers. There are limits to understanding: when you stack up high-complexity systems, multiple mission sets, and an inability to validate each key characteristic because of cost (so more modeling and simulation is used instead), you create gaps in knowledge. Add to that pushing immature systems through, and combining DT/OT in cases where they should not be combined, and we compound this effect severely.
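
A crude way to see the compounding, with purely notional numbers (the 2% residual-uncertainty figure is invented, not taken from any program):

[CODE]
# Notional sketch of how knowledge gaps compound. Assume each key
# characteristic that is "verified" only by modeling & simulation, rather than
# by test, carries a small chance of hiding a surprise the models missed.

p_surprise = 0.02   # notional 2% residual uncertainty per characteristic

def chance_of_at_least_one_surprise(n_unvalidated: int, p: float) -> float:
    """Probability that at least one of n unvalidated characteristics bites us."""
    return 1.0 - (1.0 - p) ** n_unvalidated

for n in (5, 20, 50, 100):
    print(f"{n:3d} characteristics on M&S alone -> "
          f"{chance_of_at_least_one_surprise(n, p_surprise):.0%} "
          f"chance of at least one surprise")
[/CODE]

Each gap looks acceptable on its own chart; stacked across a whole weapon system, at least one surprise becomes nearly certain, and compressing DT/OT just means we find it later in the game.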
  #5  
Old 3 April 2012, 12:42
OutsideTheB
On the Extract Bird
 
Join Date: Sep 2011
Location: Northern Virginia
Posts: 2,475
One of my brothers, whose degree is in electrical engineering, spent his entire Air Force career (10 years active duty, the remaining years in the Air National Guard) working on the electronics of all our cutting-edge aircraft, and if there was ever a theme I heard from him in all those years, it was the need for the KISS principle. He finished his final years at Langley AFB as the senior NCO overseeing maintenance and training for the F-22's electronic systems, and I loved hearing his endless stories about trying to fix problems the systems engineers at GD and Lockheed failed to anticipate because things were overly complex.

ETA: I should add that just yesterday I was talking with this brother, who is now a test engineer (contractor) for the Navy's cyber warfare program, and he confirmed what I found while working in Information Assurance in my past job: too many of our decision makers still fail to give enough attention to the quadrant of the risk management matrix where the probability of occurrence is minuscule but the consequence is catastrophic.
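
A simple expected-loss comparison shows why that quadrant deserves more respect than it gets; the probabilities and dollar figures below are hypothetical, chosen only to illustrate the point:

[CODE]
# Hypothetical expected-loss comparison between a frequent nuisance risk and a
# rare catastrophic one. Probabilities and consequences are invented purely to
# illustrate the risk-matrix point, not taken from any real program.

risks = {
    "frequent nuisance": {"p_per_year": 0.50,  "consequence": 1e5},  # minor fix
    "rare catastrophic": {"p_per_year": 0.001, "consequence": 2e9},  # aircraft + crew
}

for name, r in risks.items():
    expected_loss = r["p_per_year"] * r["consequence"]
    print(f"{name:18s} expected loss per year: ${expected_loss:,.0f}")

# frequent nuisance  expected loss per year: $50,000
# rare catastrophic  expected loss per year: $2,000,000
[/CODE]

The "minuscule probability" cell dominates the ledger, yet it is the one that gets waved off in reviews because nobody in the room has ever watched it happen.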

Last edited by OutsideTheB; 3 April 2012 at 13:12.
  #6  
Old 3 April 2012, 21:21
SN
Old and Broke
 
Join Date: Sep 2001
Location: The GREAT State of Texas
Posts: 4,133
Quote:
Originally Posted by Maestro
If there's a lesson here for GPF & SOF, it's that in highly complex systems, failure modes pop up in unanticipated areas. It may be better to have a "second-best" system and accept slightly lower performance in return for predictable, manageable failure modes than to insist on the absolute top-end tech and get caught off guard by unforeseen catastrophic failures.
Agree. This is one reason I cringe when we replace a warm body with a cold box.

Software guys/gals have to stop promising the moon, too. Most of our acquisition issues are software-related. We have to go back to incremental improvements and leave "revolutionary improvements" to the folks at DARPA.
__________________
FISHDO
  #7  
Old 10 April 2012, 22:19
Rockville
Confirmed User
 
Join Date: Feb 2012
Location: Woodbridge, VA
Posts: 657
Systems engineering and policy-making...

There is a study on systems engineering worth a look:
Pre-Milestone A and Early-Phase Systems Engineering: A Retrospective Review and Benefits for Future Air Force Acquisition, free at http://www.nap.edu/catalog.php?record_id=12065
conducted by the Air Force Studies Board: http://sites.nationalacademies.org/DEPS/AFSB/index.htm