USAF Test Pilot School, DARPA announce aerospace machine learning breakthrough (af.mil)
100 points by rntn 7 months ago | 105 comments



> In total, the team made over 100,000 lines of flight-critical software changes across 21 test flights.

I'm sure they're working much more rigorously than typical "tech" company software developers, and maybe the writer used that sentence to commend the scale of effort that contributed to a success.

But, to a tech industry software engineer's ear, the sentence sounds different. Heavy reactive software changes would typically mean a lot of defects and tech debt introduced.

Here's hoping that autonomous fighter jets aren't programmed anything at all like cat-picture-sharing apps are.

("Commit #37f2ad30 - Fixed missing minus sign that had airplane flying upside-down.")


100,000 LOC modified - 21 tests. Ship it.

In all seriousness, I hope that includes simulation code / integration tests and the 21 test flights were essentially E2E tests.


Seriously, that line strikes fear into my heart, and I only handle REST APIs, not billion-dollar airplanes with actual people in them.


Don't go by the price tag. That's just MIL complex accounting called "Cost plus", which means price = cost + whatever they feel like, because the government can't go anywhere else to buy this stuff.


> That's just MIL complex accounting called “Cost plus”, which means price = cost + whatever they feel like.

Directionally correct, but technically incorrect.

The fudge factor is in the “cost”. The “plus” (essentially the profit) is set.


Could be many lines of removed code. I imagine they may have introduced a simpler interface offloading governance to configuration data.


$ cargo update

+8,967 −7,331


> I'm sure they're working much more rigorously than typical "tech" company software developers

I was sure about Boeing as well.


It's also likely that the line count is higher than normal because of how close to bare metal they have to be.


As the cost of failure rises, so does the value of the test suite.


The video mentions the risk of "over G" breaking the airframe.

I am wondering about the gap between the over G threshold for the pilot vs over G threshold for the airframe. Assuming the X-62 (or other AI aircraft) can stay below the airframe breaking point but above the human breaking point, does that mean that the AI has a practically insurmountable advantage in most dogfight situations vs human pilots (as well as other offensive/defensive engagements)?


Normally, the airframe and/or human g limits are well outside the aerodynamic limits. Pulling 10+g is just not physically possible in most combat conditions, regardless of aircraft.


Is that because of limits to air resistance vs. aircraft surface area?


Because turning at higher g would require a larger wing to grab more air, which is heavier, which in turn requires even more wing, which is heavier still. The limitation is the density of modern aircraft construction.
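
To put rough numbers on it, here's a back-of-envelope sketch; every figure below is an assumption I'm plugging in for illustration, not something from the article:

    # Max sustained load factor is limited by how much lift the wing can generate:
    #   n_max = L_max / W = 0.5 * rho * v^2 * S * CL_max / W
    rho = 0.9            # air density at ~10,000 ft, kg/m^3 (assumed)
    v = 200.0            # airspeed in m/s, roughly 390 knots (assumed)
    S = 28.0             # wing area in m^2, roughly fighter-sized (assumed)
    CL_max = 1.2         # usable lift coefficient in a sustained turn (assumed)
    W = 12000.0 * 9.81   # weight in N, ~12 t combat weight (assumed)

    n_max = 0.5 * rho * v**2 * S * CL_max / W
    print(f"aerodynamic limit: about {n_max:.1f} g")   # ~5.1 g with these guesses

Only at high speed and low altitude does the available g climb into the range where the airframe or the pilot becomes the binding limit.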


No. The ability to pull more G's is seldom the deciding factor in air combat. The most important thing is detecting the adversary aircraft first (possibly via data link) so that you can dictate the terms of the engagement. Then once the shooting starts missile quality, defensive systems, and energy state (speed + altitude) make a huge difference. Even if the AI aircraft can pull 10 G's or more for a few seconds that won't necessarily be sufficient to cause a miss.


Don't worry, no AI will ever dogfight with you. It will shoot you down from 40km away with a fire-and-forget weapon.


In a small skirmish in which the side with the AI has a lot of resources, sure.

In a large scale war, I’m not so sure. A fire-and-forget weapon with a 40km range is expensive and might be a scarce resource. Bullets are much cheaper, and if the AI can fly itself close enough to shoot you, it can do so inexpensively.

And this is not to mention that a 40km range weapon requires targeting that works at 40km.


Highly maneuverable reusable aircraft (even drones) are expensive, and a scarcer resource than missiles.

In modern great power military conflict, if you get to the point where you can fly F-16s or a hypermaneuverable AI dogfighting drone into visual range against an opponent [with inferior planes] who's out of missiles, haven't you already won?

If the opponent isn't inferior in dogfights, then instead of using ~$1mil munitions as expendables, you're using planes (and possibly pilots) as expendables. That's not a good or economical trade-off when missiles are as good as they are today.


40km isn't even the horizon. This costs a million and will kill you from four times further away.

https://en.m.wikipedia.org/wiki/AIM-120_AMRAAM

Do you believe a jet fighter drone with a gun can be manufactured with sensing, computing, landing, takeoff, and AI for a million, or even a million per kill vs fighter craft? I do not.

Maybe you'll want AI for loitering munitions; for air-to-air, a fighter drone is going to look more like a first stage for a multi-missile launch platform. It'll fly a while, loiter, volley against aircraft or missiles, then return to rearm and recover.


Bullets are cheap, but an AI airplane with a gun is far more expensive than almost any fire and forget missile since it is effectively a larger, more complex, more expensive fire and forget missile.


That’s the point though, AI is replicable. You don’t have to create the AI every time you fire, just once, and then copy it over forevermore.


The software is freely replicable, the hardware is not. And the hardware for just the “running the AI” part of the reusable AI fighter is going to be more expensive than the electronics of a single use fire-and-forget air-to-air missile, and so will most of the parts of the fighter compared to the missile.


You don’t need a 1.8T parameter LLM to control an AI guided missile. In all likelihood, the model to control these can be run on an iPhone-level computer, which is max $1000. The bottleneck is still the missile.


The idea would be that an AI airplane with a gun would destroy more targets, on expectation, in its lifetime than a missile would.
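
Back-of-envelope, with numbers I'm making up purely to illustrate the trade-off (none of them are real figures except the rough missile price cited upthread):

    # Expected cost per kill: reusable gun drone vs. one-shot missile (all guesses).
    drone_cost = 20e6            # assumed unit cost of a reusable AI gun drone, $
    sorties_before_loss = 50     # assumed sorties before attrition
    kills_per_sortie = 0.3       # assumed
    missile_cost = 1e6           # roughly AMRAAM-class, as cited upthread
    missile_pk = 0.5             # assumed probability of kill per shot

    print(drone_cost / (sorties_before_loss * kills_per_sortie))  # ~$1.3M per kill
    print(missile_cost / missile_pk)                              # ~$2.0M per kill

Flip any of those guesses and the comparison flips with it, which is basically the whole argument in this subthread.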


Depends greatly on the AA capabilities of your adversary. Slower moving, larger drones with guns are easier to detect and shoot down than missiles, and probably more expensive than the average missile as well.

The only way a gun might be more efficient is in a situation with total air supremacy like Iraq or Afghanistan, and the targets would ideally be in the open.


Couldn't the aeronautical capabilities of an unpiloted aircraft have a significantly higher ceiling if the design doesn't need to keep g-forces within human limits?


Not by much. Building airframes that can handle higher G forces than a human comes at a huge weight penalty which impacts cost, range, and payload. Plus, it becomes really tough to avoid compressor stall when the flight envelope gets into high AoA plus high speed. The engineers end up having to do funky things with the inlet that increase drag and radar signature.


Yes, but it is potentially reusable.


Yup, any protracted war will resemble WW2 or Ukraine within a year.

It's an ugly truth that you need advanced infrastructure to build these sophisticated weapons, and that level of infrastructure is easily destroyed.


Those weapons are sophisticated today because they were developed by the western military industrial complex under heavy oversight by politicians, neither of which had a need for a cheap, effective, high explosive drone.

Fundamentally, those cost a few thousand dollars in components at most. The reason the end products are expensive is that they are not optimized for mass production. Which, for better or worse, is starting to change. My 2c.


How close do you have to be to confirm the thing you want to kill isn't friendly?


That's kind of a political question. Politicians set the rules of engagement, which dictate the level of confirmation required before employing weapons. It's a spectrum. At one end of the spectrum a certain area can be designated as a free-fire zone in which any airborne contact is assumed to be hostile. In the middle of the spectrum it may be required to use data links and IFF to avoid firing on friendly or neutral aircraft. And at the other end of the spectrum, visual confirmation may be required.


> It will shoot you down from 40km away with a fire-and-forget weapon.

It's already happening in Gaza. https://www.theguardian.com/world/2024/apr/03/israel-gaza-ai...


Wow. So much wow. Lavender and The Gospel are “AI” systems used to identify individuals and buildings to bomb, with the goal of dropping a building on a suspected Hamas member using “dumb” bombs, even if tens to hundreds of civilians are present.

So much to dissect here, but I will say this: I’ve been waiting for this story for a while.

I think it’s debatable whether it’s wise to have any kind of online presence what with the NSA taking over everything from your router to your watch to your doctor’s office and smart toilet.

I don’t work in the tech field, but what would these systems look like? Basically, they ingest intelligence like phone calls and logs and whatever else they get, and spit out a list of people with associated last known addresses, estimated civilian casualties, and supporting evidence?

Is this just ML that’s been worked on for years, or is this some GPT-based advancement? Can their system analyze voice, photos, etc., and just predict like a GPT?

Ps I’m not pro or anti anybody. I have no dog in this fight and I’m not versed enough to make a judgement. My heart goes to everyone affected. What a disaster and a tragedy war is.


A missile is just a dogfighting AI that shoots every bullet all at once.


Look at the drone tactics deployed in Ukraine. It's shocking. Not in the future but right now, modern warfare is a swarm of <$100 jerry-rigged, recycled devices flying around delivering a few ounces of explosive to whatever head or body can be accessed.


I think it's more likely that an AI kill bot drone will be the way we die. You can build ten for the cost of one fire-and-forget missile, and they will be a lot more precise, killing only the humans and leaving the infrastructure intact.


What, exactly, is the difference between an “AI kill bot drone” and a missile?



While a compelling short film, those tiny quadcopters are handicapped missiles. They fly slowly enough that modern anti-drone systems could engage them; their use of visual sensors leaves them vulnerable to laser blinding. Self-destructing drones are missiles trading speed for manoeuvrability.


Tiny missiles which are practically cost-free compared to conventional weapons, difficult for your enemies to track, and not just difficult to take out of the air, but often only at a cost several orders of magnitude higher than the drone itself. Part of their effectiveness lies in the economic asymmetry involved.


Yes, in the recent Iran attack on Israel the interceptors cost about $3M each, while the drones range from $2k to $20k. That asymmetry leads to a calculus of attack through overwhelming mass so vast that it’s impractical, if not impossible, to resist.
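
Spelling out the ratio with those figures (a trivial sketch, rounded):

    # Cost-exchange ratio, interceptor vs. cheap drone, using the numbers above.
    interceptor_cost = 3_000_000                     # ~$3M per interceptor
    drone_cost_low, drone_cost_high = 2_000, 20_000

    print(interceptor_cost / drone_cost_high)        # 150x for a $20k drone
    print(interceptor_cost / drone_cost_low)         # 1500x for a $2k drone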

https://www.businessinsider.com/suicide-drones-much-cheaper-...

https://www.politico.com/news/2023/12/19/missile-drone-penta...

https://www.wsj.com/livecoverage/israel-iran-strikes-live-co...

In fact the US DoD is already proposing a “hellscape” capability that is built precisely around this idea as the primary method of countering China in the future.

https://www.defenseone.com/technology/2023/08/hellscape-dod-...


Same thing happening in Ukraine. I actually think this is one of the huge stories that almost nobody is talking about right now. Sudden paradigm shift means big changes can happen quickly.

You have a fantastic username btw.


> stories that almost nobody is talking about right now

It is being talked about non-stop in military and international relations news. The point isn’t that drones are being ignored. It is that self-destructing drones are best thought of as a new niche of missile, one that is cheaper and more manoeuvrable than a conventional missile at the expense of speed. (As opposed to drones' traditional framing as tiny planes.)


They aren't free. Autonomy is computationally and sensor hungry, and consumes a lot of power. The moment a platform approaches the capabilities you are thinking of, it stops being cheap and stops being small. All while those drones are stopped by doors, curtains, strong wind, any kind of mist or darkness (unless you use thermal vision, which is $$$), or rain.


Thermal vision isn’t particularly expensive (I can get a high-resolution compact IR camera for less than $20 on AliExpress now), and Turkey is already fielding AI loitering munitions that are fully autonomous. Drones with weapons aren’t stopped by curtains or doors, and modern drones aren’t stopped by winds. I can fly my consumer DJI on a day with strong winds just fine.

I think you’re underestimating the rate of change in the technology and cost space along every one of these dimensions. Your assessment was true 5 years ago, isn’t true now, and will seem anachronistic in another 5.


The context of my reply is that "slaughterbot" short video, not 100kg+ drones.

It goes both ways. Just half a century ago plenty of people were convinced we'd have flying cars by now, projecting the speed of progress in one particular area forward. It doesn't work like that; progress in any given area is S-shaped, not exponential.

It's even worse when processes are adversarial. Where are the TB2s in Ukraine? Are you aware that FPV drones seem to have a less than 10% success rate nowadays?


My DJI weighs about 1kg, and I could put explosives and enough compute on it in less than 100g to make it self-guided and lethal today, as well as cover every other dimension you mentioned.

The problem with flying cars is that the physics wasn't conceivable at the time (it is now; it's just impractically useless). The technology exists to do everything discussed here at the DIY level. It's not about some quantum leap in ability. It's about scale, cost reduction, and improvement of existing tooling, and tying it all together. Which militaries worldwide are -already- doing.

A 10% success rate when the munition costs one or two orders of magnitude less is a winning proposition. The key isn't each drone being successful; it's launching 100,000 drones at once and being unstoppable en masse.


Your DJI can't cope with doors. 100g of explosives doesn't do much; see Ukraine, where this was attempted multiple times with very limited effect. And most importantly, your DJI enjoys a massive subsidy because it is mass-produced for entertainment; any custom weapon system would be at least an order of magnitude more expensive. This is already the case in Ukraine btw, with FPVs being comparable in total cost to entry-level Mavics, despite being MUCH cruder.

It's not a winning proposition when you take into account that those FPV launches require teams of dozens of people to support, and that good old artillery is cheaper, scarier (= more suppressive), and more effective.

Launch 100,000 drones against what target? How do you find the target? How do you get to it without getting detected and destroyed? Where do you launch from, what's your launch platform, and is it survivable (if you just plan to bring trucks to the front line, they'll just get blown up pre launch)? How do you deal with high power directed microwave, lasers, or just basic AA shells filling the air with shrapnel for peanuts? How do you coordinate the attack, especially under EW, without saturating all bandwidth available way before you reach 100,000?

You're severely underestimating the challenges and how much it costs to solve them. There are multiple reasons why torpedo boats never worked against competent navies, despite being theoretically much cheaper than battleships or modern destroyers; many of those reasons still apply to those ideas about cheap drones.

In the end it might very much be cheaper and more effective to use a good old JASSM-ER.


> 100,000 drones at once

This is already theoretically possible with multiple teams using current technology. Intel has been doing these drone light shows for a while now, and here's a video from 2 years ago with only 1000 drones.

I wonder if the person cuing the launch at the beginning of the video understands what "drop[ping] the hammer" is a reference to. Maybe this is a case of unintentional Chekhov's gun?

https://en.wikipedia.org/wiki/Chekhov's_gun

There's a whole playlist of these on Intel's YouTube channel:

https://www.youtube.com/watch?v=bAlVweZ4dqQ&list=PLk2sjg_-F-...

Even more Intel drone videos:

https://www.youtube.com/playlist?list=PLk2sjg_-F-MftRXx0mAnA...

Oh yeah, they also have another video where they break their own Guinness world record with a show of 2,018 drones; that's both the number of drones and the year they did it in. That's already 5+ years ago. They let slip that their drone coordination software was literally designed to control a "limitless amount of drones", and it only uses a single PC for coordination.

I agree with everything you've said, in case it wasn't clear.

https://www.youtube.com/watch?v=xwQ31-vSgfs&list=PLk2sjg_-F-...

https://www.intc.com/news-events/press-releases/detail/142/i...


I’d also note that at $2,000 a drone, a 100,000-drone attack would only cost $200,000,000, or half of what the defense Israel mounted against 300 drones cost.


Going to be expensive to equip every area with anti-drone systems.


US export control frameworks managed to keep missile technologies from being reimplemented by bored teenagers, while actual guided munition technology was left stalling for decades (for better or worse; Ukraine is running out of SAMs daily), resulting in a media narrative disconnect between the two.


I can drop a cluster bomb filled with drones into a city, army base, or factory, and they kill everyone with precision and are collectively almost unstoppable.

Yes, flying a large drone a few hundred miles over enemy fortifications is quite stoppable compared to a hypersonic missile. So don’t do that.


> I can drop a cluster bomb filled with drones into a city, army base, or factory, and they kill everyone with precision and are collectively almost unstoppable.

Citation needed.


Defensive laser weapon


Laser rays? More like a ray of failure. Shift your spectrum: microwaves are in and lasers are out.

https://news.ycombinator.com/item?id=39873977

https://www.scientificamerican.com/article/new-microwave-wea...


Reusability, good maneuverability, range, loiter time.

Something to close the gap to dispense cheaper munitions than long-range guided missiles. And to do so with a dramatically faster OODA loop.

Flying sausages with rocket motors and stubby wings don't do very well by most metrics.


The AI bot/drone returns home

It is also a lot more maneuverable and has far more flexible mission profiles


I personally see the potential for drones being used to shoot down other drones or missiles, and the inevitable drones trying to shoot down those drones.


Semantics, dogfight with the missile then :)


I guess this was inevitable. It's a shame we have not advanced beyond our basic instinct in all these millennia. Every new technology always ends up as a weapon, one way or another. Even today, when we have more than enough resources and means to distribute them to everyone, blind greed and hunger for power prevails.


> we have more than enough resources and means to distribute them to everyone

Do we? To what threshold? Sustainably?

We don’t, for example, have the resources to provide every person on the planet with the quality of life of even a lower-quintile American. (Home, power, plumbing, car, petrol, varied industrially-produced foods, television, phone, internet, in-person education, security, et cetera.)


I bet we could do clean water, food, and battery powered lighting for night time though.


I'm not sure of that. I worry that unless we take over the national government, bureaucrats will prevent the aid from reaching every resident for ideological reasons (or cultural reasons, e.g., about the proper role of women) or simply because a bureaucrat wants to steal -- and of course taking over a government is difficult and expensive.

Also, you leave basic security out of your list: not having roving gangs, for example, always stealing your stuff and sexually harassing your daughter. That's more important than having lighting after dark.


Your concern isn't a resource issue, though. The resources exist, but so do the people willing to siphon resources to their benefit over the basic needs of others.


>Your concern isn't a resource issue

I don't like that way of framing it. I prefer to say that the US would need to spend additional resources (which seem prohibitive to me) to ensure that the nominal resources reach the intended recipients.


You're right in a sense. But I wonder, what does it mean to say "we have enough resources to provide for all humans" when human nature itself makes it impossible to provide for all humans with the resources we have now. Possibly with any amount of resources.

It's not a resource issue, but it's not an external factor either.


Exactly. Until human desires are saturated, there will always be humans who desire more and will take it if they have the force to do so.

And we have no evidence thus far that human desires can be saturated.


I'm sorry that you think owning a car and eating industrially-produced foods is a mark of a "good" standard of living. Both these are unfortunate byproducts of a runaway economic model that tied profits to consumption, while destroying the very prerequisites of its existence.

We more than definitely have enough resources to satisfy everyone's basic needs. The basic needs would be: clean air to breathe, clean water to drink, simple healthy food, a safe home and community, quality education and healthcare, and the freedom to do whatever it is you want to do as long as it doesn't impinge on these basic needs of anyone else.


> we have more than enough resources and means to distribute them to everyone

I read that as giving everyone weapons and was surprised at the tone shift.


Awesome! There isn't any real air-to-air combat nowadays though, so I wonder how they test the results?

Looks like the contractor behind the AI pilot software is https://en.wikipedia.org/wiki/Shield_AI


Ironically stealth may change that. Aircraft that can't see each other at long range due to stealth will tend to wind up in close proximity.


The Gundam Hypothesis of the future of warfare:

> The main use of the Minovsky particle was in combat and communication. When the Minovsky particle is spread in large numbers in the open air or in open space, the particles disrupt low-frequency electromagnetic radiation, such as microwaves and radio waves. The Minovsky particle also interferes with the operations of electronic circuitry and destroys unprotected circuits due to the particles' high electrical charge which act like a continuous electromagnetic pulse on metal objects. Because of the way Minovsky particles react with other types of radiation, radar systems and long-range wireless communication systems become useless, infra-red signals are diffracted and their accuracy decreases, and visible light is fogged. This became known as the "Minovsky Effect".

The disruption of electromagnetic radiation is due to the small lattice of the I-field creating fringes that long wavelengths cannot penetrate, and that diffract wavelengths that have similar distance with the fringes. This diffraction and polarization process disrupts the electromagnetic waves.[23]

The only counter measure to the "M" particle in the series was to install bulky and expensive shielding on all electronic equipment, but only to counteract the effect it had on electronic circuitry. While this could be done for space ships and naval ships, this ruled out the use of precision guided weapons, such as guided missiles. Due to this, the military use of Minovsky particles ushered in a new era of close-range combat. This is the primary reason for the birth of the Zeon close-combat weapon: the mobile suit.[24]


"There was a time when battles could be won and lost with the mere push of a button, and these mobile dolls are the absolute root of that detestable and hateful spirit. When war is dehumanized both victory and defeat become miserable, and God no longer lends a helping hand." - Treize Khushrenada

Not UC, I know, but it's the first thing that came to mind when I saw this thread.


This has a strong Macross Plus vibe. I'm not sure if it's good news, though.


I think it is good news. Wars of the future can be fought with machines and the country with the most resources will win.


>Wars of the future can be fought with machines and the country with the most resources will win.

These machines are built to do the most damage to the enemy, not just to their machines. Once the robots are down, the winning party doesn't hold theirs back.

Guns and bombs can hit farther than arrows, so enemies keep fighting farther from the front line, but the death toll keeps increasing.

>the country with the most resources will win.

With more and better machines, even more resources will be put into wars. More resources get wasted, more machines get destroyed, and more people die.

Total war is a product of the industrial revolution. It changed the scale of wars from thousands of victims to tens of millions. And these machines - however smart and accurate - always end up killing civilians en masse. Look at the sophistication of the Israeli weapons systems and what happened to Gaza in the past few weeks, and tell me how technology saves lives in war. It isn't even symmetric warfare!

I hope you get to watch this incredible visualization one day, if you haven't already: The Fallen of WWII – https://vimeo.com/128373915


> It changed the scale of wars from thousands of victims to tens of millions.

World population growth has more to do with this than anything else.

The area of China lost 40 million people, 70% of its population, during the Three Kingdoms wars - that’s on the order of 20% of the world’s population at the time.


How many lives were saved by almost all Iranian missiles and drones being shot down before they hit targets in population centers in Israel?


Except the machines will only fight each other as a defensive countermeasure. The goal, as always, will be extracting a political surrender from the government, and terrorizing the populace and inflicting mass misery is the most effective means of pressuring a government. The army's purpose is to prevent that from happening, and as armies disappear and it's just metal on metal, civilians become the ultimate target of the machines, and a much sharper focus for the war planners.

Also it becomes easier to do things like sneak a kill bot drone into a city and release it as the ultimate asymmetric warfare aka terrorism. Your terror attacks no longer scale with your ability to train and insert fighters willing to die.

The truth of the matter is we already have the technology at the DIY level to do this, and it’s just a matter of time before AI kill bots become the standard of war.


>The goal, as always, will be extracting a political surrender from the government . . . terrorizing the populace and inflicting mass misery is the most effective means of pressuring a government.

That didn't work on Germany or Japan in WWII. (Bombing had an effect, but the effect was mainly to degrade the enemy's ability to manufacture weapons, produce fuels and lubricants and move things around.)


Except for the two nuclear bombs dropped on cities?


OK, but if the nukes are what persuaded Tokyo to surrender, they did so only after Tokyo had lost almost all of its warships, stopped having enough fuel for its warplanes, and started having so much difficulty importing things by sea or air that there was no hope at all of continuing to run an industrial economy; they would have been lucky to keep feeding most of their population.


Yep. Advanced military technology tends to favor countries with high GDP per capita, which are mostly liberal democracies.


You can't be naive enough to believe wars of the future will only involve machines fighting other machines.


No, first the machines; then, once one country runs out of resources and machines, they will either surrender or more machines will start bombing them until they do. Yes, people will still die, but think of Japan and the nukes. It will be much the same. You will realize: oh damn, drones can come in and basically kill all day long, and we ran out of drones to defend against this. Guess we surrender or die. So yes, people still die, but first it will be a resource war followed by a short-lived, hopeless defence. Technological superiority will allow one country to lose very little life, as the war will be fought from thousands of miles away.


> non-deterministic artificial intelligence

Is it really non-deterministic, or is that just a misinterpretation of the opacity in deep learning algorithms? If the word choice isn’t a mistake, I wonder what the non-determinism adds.


Possibly both? Modern AI models often include a "temperature" parameter which produces a certain level of randomness in the output. But the inner models are also opaque in a way that makes it impossible for humans to reason about them, so the output may appear non-deterministic for all practical purposes even if it is technically deterministic in a purely mathematical sense.
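
For what it's worth, a minimal sketch of what that temperature knob does; this is generic softmax sampling, nothing specific to whatever the X-62's agent actually runs:

    import numpy as np

    def sample(logits, temperature=1.0, rng=np.random.default_rng(0)):
        # Higher temperature flattens the distribution (more random choices);
        # temperature near 0 approaches a deterministic argmax.
        scaled = np.asarray(logits, dtype=float) / max(temperature, 1e-8)
        probs = np.exp(scaled - scaled.max())
        probs /= probs.sum()
        return rng.choice(len(probs), p=probs)

    print(sample([2.0, 1.0, 0.1], temperature=0.1))   # almost always index 0
    print(sample([2.0, 1.0, 0.1], temperature=5.0))   # much closer to uniform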


Sigh. It means that in their neural network they have a random sampling step. Yay for Gaussians.


IDK, it could mean it's not if/else-structured computer code. The idea that you can't guarantee consistent results under small variation is a big deal to non-technical people who are buying ML applications.


The model knows to stay above the hard deck!


The language of this PR is mind-boggling hyperbole:

>We've fundamentally changed the conversation by showing this can be executed safely and responsibly,” said Col. James Valpiani

How did they show this?

The evidence is anecdotal: A team member with a status interest in "success" says his demo didn't screw up under a test condition.

>For decades machine learning has been historically prohibited due to high risk and lack of independent control.

Go on...

>The X-62A is flown with safety pilots onboard with the independent ability to disengage the AI.

...Sounds safe and effective...

>However, test pilots did not have to activate the safety switch at any point during the dogfights over Edwards.

Huzzah!

OK, well that proves it. Looks like theory met practice and a beautiful baby of a new contract has been born!

>In total, the team made over 100,000 lines of flight-critical software changes across 21 test flights.

Bragging about lines of code changed during testing? Great design!

I recall "expert" testimony on the viability of Reagan's Star Wars: "A human programmer can't do this, it's too complex, it will have to be written by AI."

Skate ahead 40 years—

Introducing "Co-pilot"!

Ridiculous PR


Good news everyone, we've solved the self-driving car problem by adding the z axis


I am highly interested in how they managed to get the code certified.

I once was told that flight-critical components of those aircraft have to be verified using formal methods. That is at odds with everything I know about modern machine learning.

But maybe the plane is operated by a kernel machine…


I suspect that military aviation has a different certification path. Especially for R+D such as this.


I guess that is one use for the thousands of fighter jets the US military has in storage.


This is the future: swarms of tiny drones with just enough explosive to kill a person.

https://www.youtube.com/watch?v=KqoGacUu07I


I thought the loyal wingman was supposed to arrive years ago? Is this really the first neural network attempt at an autopilot?


Sure, it's not. But now it is the NN's conscious decision to become a military pilot. Previously it tried policeman duty.

https://www.youtube.com/watch?v=wA-NRyWoYII


Hooray. More ways to end life.


It amazes me how people can make snarky comments like this given what's going on in Ukraine. Militaries need to exist precisely so some dictator like Putin can't wake up one day on a whim and be like "you don't deserve to exist; I'm going to take over your country." You can either live in this reality or bask in some imagined smug moral superiority. Your choice.


what could possibly go wrong?


“In three years Cyberdyne will become the largest supplier of military computer systems. All stealth bombers are upgraded with Cyberdyne computers, becoming fully unmanned. Afterward, they fly with a perfect operational record.”


Yikes



