My vision incorporates all of these ideas and a little more. There is this idea that computers will continue to advance, helping us solve more problems and aiding in the design of new technologies and machines. It is believed that this advancement is accelerating, and that computers will quickly reach the level of human intelligence. Soon after, they will create computer systems that are smarter than humans. This is called the Singularity, an event after which anything that happens will be hard to predict, as it is unknown how the world and humanity will be changed. It could be our bane or our boon. My thoughts on computer/machine intelligence have been formed by the ideas of cognitive scientist Douglas Hofstadter, author of Gödel, Escher, Bach: An Eternal Golden Braid and I Am a Strange Loop, whose writing and ideas I have interpreted to mean that "artificial" intelligence will be no different from natural intelligence. I place artificial in quotes because it is my belief that intelligence, or the sense of I, and emotions emerge from complex information-processing systems, whether those systems are made of brains or computers. As such, we can't create computer intelligence; we can only create the systems from which it can arise. Or perhaps it is closer to the truth that we won't even do that, and that advanced computers will.
My Seventh Generation fighter aircraft incorporate advanced, post-Singularity versions of the Gen 6 attributes (and I don't envision the Singularity as the destructive, genocidal "Rise of the Machines" central to many science-fiction plots), with no human pilots or remote operators. The "artificially" intelligent persons inhabit a module that can be physically loaded into a number of receptive bodies and vehicles, the idea of copying or down-/uploading being anathema to the concepts of Identity and Individuality. (This runs counter to the philosophical reasoning that since an exact copy is indistinguishable from the original, the two can be said to be the same. My question is: does entropy allow exact copies to be made? And if you transmit yourself to another body or system, which is an act of copying, and the original is deleted, is the transmitted identity a new being while the original is dead? Or if the original is not deleted, how do the two identical identities correlate? Can such be deemed a "budding", an asexual reproduction of a full adult conscious intelligence?) There are also no kinetic weapons loaded; all weapons are directed energy and mal-/killware. As I draw no distinction between machine and human intelligence (nor have a need for literary plot devices such as The Three Laws of Robotics--as the machines are people, not tools), I view machine intelligence as having the same sense of self-preservation: while such can be programmed at the operating level, like genetically motivated instincts, it can also arise from a conscious desire to remain extant, and even super-intelligent machines will engage in warfare to secure natural resources such as energy sources and rare-earth metals for their continued existence. But I don't think their warfare will be like ours.
The means to copy oneself makes attrition difficult, so cyber attacks that alter or damage the concept of identity, or proselytization, may be the primary means of containing a population dependent on finite and scarce resources.
The following designs are set around the 2030 timeframe, after a nuclear world war that seems highly likely given current global events, growing nationalism, rising food and energy prices, and polarizing attitudes. Since such wars change the world, the surviving nations are not the same as those of today, and given the unpredictability beyond the Singularity, this speculative fiction is a free-for-all for what might come.
The first idea I had was the FQi-2A, or Fighter Drone (intelligent), model 2A.