
Media Roundtable on JCO Industry Day Demo #5

By U.S. Army Public Affairs | July 25, 2024

Moderator: Okay, thanks, everybody, for your patience there. Of course, you heard that we had a little incident there at the--at the Polk building, but we're ready to go ahead and get started. So good afternoon and welcome to the JCO industry demonstration number five media roundtable. Today's panel members are all from the JCO and include Colonel Mike Parent, chief of the JCO's Acquisition and Resources Division; Major Matthew Miller, JCO assistant product manager for technology planning; and Hi-Sing Salum, JCO project manager and test lead. You all should have received a copy of their bios along with the demonstration information sheet. If you didn't get them, please let me know, and I'll get those to you right after we're done here. Colonel Parent will provide some brief opening remarks, and then we'll get into your questions. But first, here are today's ground rules. All comments will be treated as on the record. Today we will only be discussing the fifth demonstration event, and I will take your unrelated questions for follow-up. Please identify yourself and your news organization prior to asking a question. Only one question and follow-up per person, to allow time for others to ask questions. If time permits, we'll start again from the top. If you have more questions than time allows, please email those to me for follow-up. Please ensure your phones are muted unless you are asking a question. We have 45 minutes for this engagement, and with that, I'll turn it over to Colonel Parent for his opening comments.

Colonel Mike Parent: Hey, good afternoon, everyone, and thank you all very much for participating in today's media event. Very excited today to talk about our demo five and the challenges and also the importance of this [*0:01:46.4] event. That's really all I have to start it out. And I'm again, excited about your questions. And back to you, Jason.

Moderator: Okay, thank you for that, sir. And we'll go ahead and we'll start with Doug Cameron, and then we'll go to Sean Carberry. So go ahead, Doug, kick us off.

Doug Cameron: I'm sorry. I'm in a really noisy place. I'm going to get quiet, so I'll circle back.

Moderator: Okay. We'll go on to Sean Carberry and then to Dan Schere.

Sean Carberry: Thanks. Yeah, I mean, I guess we'll sort of do the tee-up question for all our benefit then, is if you can do a little bit more of a specific rundown of this demonstration, what some of the key things were that you were looking to see out of it and what the, you know, initial findings have been coming out of it so far.

CMP: Sure. So demo five this past June really was our most challenging JCO demonstration to date. We had nearly 60 proposals. We started out with a request for white papers about six to nine months ago and whittled it down to eight vendors and nine solutions, plus 70 government-sponsored excursions. It really was the most challenging demonstration to date, with up to 50 UAS targets converging on a defended area. Flight profiles included single-axis, multi-axis, and advanced threat-representative targets: rotary-wing and fixed-wing aircraft, including fast-moving jets and slow-moving propeller-driven UASs, with a mixture of both Group 1 and Group 3 systems attacking in mass, in waves. So, although I can't discuss the actual results of the demo at this time, what I can say is that the selected vendors did show increased maturity and awareness of the threat environment that the US and allies are facing. So it was a very successful demonstration in informing the US and our allies what capabilities exist out there for this very challenging profile.

SC: Okay. And are you able to provide any more details on the types of responses? The information sheet talks about kinetic and non-kinetic categories of defeat, but are you able to provide a little bit more on what types of things were used, and which were more or less promising?

CMP: Well, what I can say is the capabilities and technologies we looked at were multinational radars, electro-optical/infrared cameras, radio frequency scanners and jammers, [*0:04:51.6] rockets, and drone-on-drone interceptors. High-power microwave and machine guns were used as well. The challenge of the profile really meant that no one characteristic, no one capability, whether kinetic or non-kinetic, could in itself defeat this kind of a profile. So what we saw was that you really do need a full system-of-systems approach, a layered approach, because we're talking about a very large profile, 50 or more in a swarm, coming at you from all angles, really, with different speeds and different sizes.

Moderator: Okay, sir, thank you for that. Next, we'll go to Dan, and then we'll go to Lydia Antonio Vila.

Dan Schere: Hi, thanks so much for doing this. I was wondering, could you be a little more specific about how many, you know, how many different swarms there were during the demo? And were they all around 50, or were some closer to 20, or can you be sort of a little more specific about some of those numbers?

CMP: Do we have--Hi-Sing, are you on the line to help answer this one?

Hi-Sing Salum: Yes, sir. Hi-Sing from the JCO team, test lead. So, yeah, on average, not every test case or scenario reached the full 50, for multiple reasons, from environmental conditions to the targets themselves being finicky and not able to launch on time. So the scenarios were, on average, above 40, and that's due to multiple reasons, but the intent in each test case was to launch 50.

DS: So, just so I understand, the average number was about 40 for the swarms?

HS: That's correct, sir.

DS: Okay. Also, I was wondering if AI and ML was incorporated into this at all, and if so, in what ways?

HS: So, sir, these different vendors, on their C2, advertised having AI or ML or some way of filtering these targets to prioritize which target they needed to defeat first. I cannot speak to the specific effectiveness of their proposals just because we're still going through the data analysis and their performance. But all the vendors had some level of success, and they were able to collect data to advance their systems based on the number of targets and the overwhelming profile that we were throwing out there. So all of them gave us verbal acknowledgments, and they were happy to be part of this data collection event.

DS: Okay, thanks.

Moderator: Yeah, thank you for that. Next up will be Lydia, and then we'll go to Sydney Freedberg.

Lydia Antonio-Vila: Hi. Thank you so much. Sorry, I lost my audio for a second. I know you mentioned we were going to be focusing on this last--

Moderator: Lydia, I think we lost her.

LAV: Sorry. I'm back. I apologize. I know you mentioned we were going to be focusing on this last demonstration. Were you able to share any comparisons to how this demonstration went compared to the previous ones?

CMP: Well, thanks, Lydia. So I guess my response would be that there is definitely a maturing of the vendors, and the capabilities that are coming out to the demonstrations are showing that greater level of maturity. And like I mentioned earlier, the overall awareness of the threat--the speed of change is very quick. What we're seeing overseas with adversaries is that they're evolving very quickly, with command and control capabilities and also with evading. And therefore, what we're finding with the vendors that we down-selected--like I said earlier, there were nearly 60 that originally proposed attending this demo. So when we did whittle it down to those eight vendors, and also the ones that came for excursions, they're showing a lot greater level of maturity and awareness that the speed of change is very quick.

Moderator: Did you have a follow-up?

LAV: Thank you.

Moderator: Did you have a follow-up, Lydia?

LAV: No. No, thank you.

Moderator: Okay, then we'll move on to Sydney, and then we'll go to Meredith Roaten.

Sydney Freedberg: Hi, Sydney Freedberg from Breaking Defense. Thanks very much for doing this. Let me ask--and I haven't covered the previous demo, so I may be missing something. It seems like in many ways, with up to 50 incoming of different types, different altitudes, speeds, signatures, near-simultaneously, this is much less a test of does-this-effector-produce-adequate-effect and much more a stress test of those command and control, sensing, and target prioritization algorithms that you mentioned. Is that a fair characterization of the test, or am I focusing on only one aspect of it?

CMP: You know, I'd actually say it is both. That's a fair characterization, but I would say it's both, because even beyond the command and control, when you have nearly 50 targets coming at you that are of varying groups--one, two, three--at varying speeds, there's also the effector that you can put downrange on target. So you have to detect, track, ID, and also send a kinetic or non-kinetic effect onto those targets. So it really is a combination of everything. And that is what we mean when we say it's really a system-of-systems approach. But you're right, it does challenge everything, to include the command and control. You obviously don't want to engage a target multiple times if you've got so many coming at you. You have to be able to differentiate and go after the most pressing threat first--the most challenging one, the one that's going to come after you and actually have an effect on you. So it definitely does challenge the whole spectrum, to include command and control.

SF: And to follow up on that, you know, when you talked about eight solutions, was each of those an integrated system of systems with multiple effectors and sensors, or was it actually eight point solutions? And in that case, how were the C2 integration and the sensor integration tested?

CMP: Many of the vendors came with multiple solutions integrated in. They had multiple different types of effectors, whether EW or kinetic, a combination of many. And they had their own command and control of some kind that helped place those effectors on target. And I'll stop there and allow maybe Hi-Sing or Major Miller to chime in if they have anything official to add.

HS: Yes, sir. So you're correct. There are limitations of the test and the test design. The way we designed the demonstration was to allow each vendor an opportunity to showcase their capability, and you're spot on about the integration levels of all this: the system components, how those work together, how they're interacting with the operator, and the [*0:13:51.8] that the vendor was coming with--or their homework that they used to prepare for the demonstration. And during the demonstration, they changed that in order to tailor their approach given the threat. That all was observational data collected, and it will inform the report on how we are scoring, or at least assessing, how to address the [*0:14:28.1]. But the layered defense was constrained as part of the limitations of the test, so as to be able to compare vendors with other vendors. We did not give them the freedom to fully tailor all those components to have a better operational flavor of emplacement of the system.

Moderator: Okay, thank you. All right, thank you for that. Next we'll go to Meredith and then to Matt Beinart.

Meredith Roaten: Hi. Thanks. I wanted to ask if you can give an average number of drones that were taken out by a single effector, and if it's possible to say, you know, whether it was common for the majority of the drones to be taken out during each vendor's assessment?

CMP: Well, I cannot discuss exact results at this time about how the actual systems performed. Once we complete our evaluation and do our actual report, each of the vendors will get their performance data. But what I can say is that the vendors did show very good effectiveness in some cases against different types of drones. But it was challenging. Not every vendor with their solution was able to have the optimal results that they would have wanted. But I can't get into exact detail on the number of drones that were felled or destroyed or turned back or affected.

MR: Could you say if it was like more than--if there was any that reached more than half gone?

CMP: No, I really can't at this time. The results will be coming out at the end of this month and then provided to the vendors. But what I can say is that the systems are definitely maturing. Everyone should keep in mind, though, that this is a very challenging profile for all vendors. We're talking about almost 50 drones coming at a defended area from all directions, with varying types and speeds of drones, rotary wing all the way to fixed wing and jet-powered. So it was very challenging, and it would be too early to say which [*0:17:17.8] performed best, because each one performed in a certain characteristic way based on their capabilities. But this was a very challenging profile for any vendor and any capability to detect, track, ID, and defeat.

MR: Just a quick follow-up. I didn't see directed energy systems called out specifically in the capabilities that were used here. Were any directed energy systems used?

CMP: There was, for the excursions, some directed energy, but they were collecting data for different purposes--just collecting data for their sponsors. So, Hi-Sing or Matt, do you have anything you want to add to that?

Major Miller: Roger that. So, yeah, this is Major Miller. Of the systems officially under demonstration through the source selection process, none of the vendors that participated had directed energy capabilities. As Colonel Parent indicated, one of our side excursions that was sponsored by one of the services did utilize high-power microwaves, and they were collecting data specifically to support their program of record. But as far as the demonstration results that the JCO is putting together, none of the selected systems utilized directed energy.

MR: Okay, thank you.

Moderator: Okay, thank you. We're going to go to Matt next and then to Brandy Vincent. Go ahead, Matt.

Matt Beinart: Hi. Thank you for doing this. I had one specific question, just to double-check one item, and then I've got a separate follow-up. The fact sheet that we received noted that there were eight vendors participating, but nine total systems. So I was wondering, which of the eight vendors brought multiple systems and what were those two specific systems that they brought?

CMP: For that one, I will pass it off to Hi-Sing and Matt.

HS: Hi-Sing. So, yeah, ELTA North America submitted multiple proposals in the white paper response, and they proposed two interesting approaches to the counter-swarm. One was an on-the-move, vehicle-mounted solution, and the other one was a Match K [ph], which is a platoon-carry type of solution. Both of them had individual characteristics that made them worthy of being invited. And that's part of the reason why we had those nine systems and eight vendors.

MB: Okay, got it. Appreciate that. And then the separate follow-up is, you know, what are the next steps following this demo? It was mentioned in a previous answer about providing results to the vendors by the end of the month. Are there plans to maybe select a smaller group within these vendors for further work? What are the next steps? Thank you.

CMP: So, thank you for that. The next step is, once we do finish our final report and performance data, we'll provide that information to the individual vendors, but also to the services and combatant commands. Then we'll work with the combatant commands and the services on what their interest is within those vendors that performed, based on the results. And then following that, in the next fiscal year, we're looking at potentially--depending on the results and also the needs of the services and the combatant commands--prototyping from the pool of vendors. Again, it's always based on what the combatant commands and the services need, what they see as being promising for their geographical areas. But we'll work with them, make a determination if there is something that is worth prototyping for them, and then we'll go ahead and select vendors down from that list.

Moderator: Okay, sir. Thank you for that. Next up is Brandy, and then we'll go to Sam Skove.

Brandy Vincent: Thank you so much for doing this. Kind of following up on that previous question, can you speak a little bit more about what [*0:22:08.5] maybe you're targeting and the specific types of drone swarms you all were encountering in the demo? So maybe where we'll see this fielded. We know there have been reports of incidents with Iranian drone swarms and Russian drone swarms, for instance. So, what can you tell us about the real-world implications of what you all tested?

CMP: That's a great question. And going back to what I said earlier about the speed of change being so quick and the threat evolving so quickly--we're seeing that overseas in multiple different locations, with our own services and with our allies as well--the change is happening so quickly. We'll work with the services to determine whether kinetic, EW, or DE are potentially the best solutions for the threats that they're seeing in their area. Obviously, overseas right now we're seeing, and you can see this in the news, where one-way attack and also map [ph] is coming more and more into play. And we'll continue to work with the services and combatant commands so we can fully identify that, and then work with them on which solution coming out of the demo, if it does show promise, we can prototype and send forward. We may not necessarily always prototype something, however--I do want to bring that point up. It really depends on the performance coming out of the demo and whether or not the services and the combatant commands see it as something that can get after the threat, which has evolved, is evolving even today, and is going to evolve again tomorrow and next month. So it's going to continue to evolve. And I would even say, when it comes to drones, one of the bottom-line things that we can see out of it is that really all soldiers are responsible for protection. A lot of the old characteristics that we used to see with concealment and cover and deception are coming more and more into play. I just want to make sure I emphasize training as well--training of soldiers to use different systems, and also the discipline of concealment, cover, and deception--is more important now than it probably has been for really the longest time.

BV: Thank you for that. And then did someone want to add anything else?

CMP: Nope.

BV: Okay. One other follow-up I had--I may be mistaken, but I remember when y'all had first mentioned this demo five, you said it would be at White Sands Missile Range in New Mexico. So why was it in Yuma instead?

CMP: So what we did is--we originally did plan on White Sands, and we looked at a number of different ranges as backups. But YPG, or Yuma Proving Ground, proved to be the one that had the best capabilities for what we needed. Once we did have the request for white papers--so we submitted a request for white papers, we got responses back, like I mentioned, nearly 60, and then we started down-selecting from those 60 down to the vendors that did come out to our demo and perform. So based off of the vendors down-selected and the key points that we were looking for, YPG showed it had the capabilities that we needed most to support that demonstration. That's not to say that White Sands Missile Range doesn't have great capability, just that for the vendors that we down-selected, Yuma Proving Ground was more than sufficient to perform those activities.

Moderator: Okay, sir.

BV: Thank you.

Moderator: Thanks, sir. Next up will be Sam, and then we'll go to Jen Judson. Go ahead, Sam.

Sam Skove: Hey, I was just wondering if you guys can explain a little bit how you define success. I mean, it sounds like it was challenging, but did you eventually say that within a certain range is the limit of success--that, you know, you have to take it down within 500 meters of the actual test site? Does it have to be 90% of the drones, 100% of the drones? Any sort of thing to bookend how you're defining what you're counting as a full success, given the challenge?

CMP: So for a demonstration of this magnitude--once the vendors are on the range and we launch the targets and they can start performing the activities, detect, track, ID, and some level of defeat--you really have to give the vendors some level of credit for being able to perform at that level, because it really is an extremely challenging environment. We are talking about over 100 degrees out in the desert performing these tasks. So just by being able to perform those activities, the vendors were doing an outstanding job. Not to say that I'm lowering the bar; I'm just saying that it really is a challenging environment. Each vendor, though, did come out with a certain level of capability, and each was selected based on their [*0:27:29.5] responses. So not all vendors came out with the same items. They did not come out with the same kinetic or the same EW or the same flavor, even in how to detect, track, and ID. So each one is unique in its own way in what it came out with. So you can't really say you brought this many down with your capability, or you defeated at that certain range, because we weren't coming out with a generic "everyone must have APKWS," for example. Each one had their own level of capability, their own kinetic and non-kinetic. And therefore, it really did vary by what they came out with. What it did show, though, was that having a system-of-systems approach was indeed the best approach, just because of the challenge of the threats and the degree of the threats coming after them, the mass and the wave-type attacks. But I'll go ahead and ask if Hi-Sing or Matt wanted to add anything to that comment.

MM: Yes, sir. Hi, Sam. This is Major Miller again. So I think part of answering this question is that we need to frame our understanding around the different categories of events that occur within the test and evaluation umbrella. The JCO conducts demonstrations. What we do is identify solutions that exist throughout industry, and then vendors come to us and demonstrate what their capability is. That's somewhat different from what the services do with developmental or operational testing. As part of those testing activities, they develop technical performance measures that they're testing against, separately from what we're doing as a demonstration, where we're trying to see the capabilities that industry has to bring to bear against this problem set. And so when we conduct a demonstration, we don't necessarily go into it with a preconceived notion of what success for each vendor means, particularly when we're looking at this type of problem, where we haven't really seen systems go against this profile in large quantities. So, for instance, we can't come in here and say, hey, if they destroy 75% or more of the targets, that's considered success, because we just don't have that general understanding. What we do instead--and I think you hit on it a little bit--is we identified a certain range ring around the system-under-demonstration location, which was 500 meters. Up to 50 drones were launched at that location, and the vendor had the opportunity to defeat up to that quantity of drones. We try to compare vendor systems against other vendor systems, because that gives the JCO an opportunity to recommend to the services and the combatant commands which systems we propose they continue to pursue for development or procurement activities. And so in order to do that comparison, we are identifying how many drones are able to breach that 500-meter mark out from the systems under demonstration. And that's how we're doing a quantitative comparison between the vendors. So, yes, something along the lines, I think, of what you were thinking of when you stated, you know, 500 meters out from the system. Over.

SS: Thanks.

Moderator: Okay, thank you. Next up is Jen, and then we'll go to Jake Epstein.

Jen Judson: Hi, Colonel Parent. I don't know if you're on a cell phone or something, but you were breaking up a little bit during that last answer--just wanted you to be aware. But I wanted to ask, when you say that maturity has grown a lot, can you put some more flavor on that in terms of what this maturity looks like? How far have you come here? What capabilities existed, you know, a year ago, and what have you been able to achieve that you couldn't have in recent years, if you could do that? When you talk about maturity, is it specific to defeating, you know, drone swarms, or is it more broadly speaking about counter-UAS generally?

CMP: Yes. So thank you. Yes, I happen to be on a cell, so I apologize for that. When I think of maturity, I think of it more as a broader approach. Just from what we have seen since we first went out to our demonstrations three or four years ago, the vendors that are being selected and coming out to our demonstrations have a better understanding of the threats that they must defeat, their fire control is better, and they have a better understanding of what is required to operate in the intended environment for countering drones. Overall, we're seeing a much better capability from the vendors in their actual solutions themselves as well. We did see a lot of kinetic in our previous two demonstrations. For example, you may recall that we had one-way attack demonstrations where we used a lot of--really a lot of the same OP systems, but the command and control may have been different and the fire control vendor was different. They may mature their capability, being able to better acquire and track, and therefore--I can't talk about results here on this call, but the expectation is that the results are maturing as well. So it's very promising. Technology is maturing; we're getting better capability, better solutions, better track, ID, and defeat capabilities. And then we'll take that information, provide it to the warfighters, and let them make a decision on whether or not it's what they would like to have prototyped, and then we'll get it out to an operational assessment for them to buy, try, and decide.

JJ: And following up on that, is there anything that, based on what you saw--and I know you can't discuss results--but anything you foresee being able to ingest into the current counter-UAS architecture, into any of the services, rapidly? Or, from what you've seen, is there still work to be done before being able to, you know, buy, try, decide, whatever, before ingesting the elements or tech demonstrated here?

CMP: Well, that's a little bit of a difficult question, but I would say that I think there is capability that was demonstrated that can be used--buy, try, decide quickly, prototype quickly, and put out to the warfighter, put out to the combatant command. With that being said, though, really it's always a learning environment. So, coming out of demonstrations, we really have three outputs. One, the JCO can look at the solutions that participated in the demonstration and determine that more investment is required--perhaps more risk reduction before it's prototyped. We don't want to prototype something and send it out to the warfighter when it isn't ready, because obviously, that would do more harm than good. So we'll look at it, and if there's more investment required, more risk reduction needed, we may invest in that risk reduction, or we may inform the vendor where they were a little bit lacking and where they need to mature the system more to reduce that risk. The second output is that the services themselves, the observers of the demonstration, may look at something and say, you know, we think we could use that now, or we think that we would want to invest in it to do risk reduction and mature it. We did see this previously with Leonidas, with HPM, and the Army purchasing that system for continued maturation. And then the last one is the services who are observing the demo say, we want that now. They'll look at the performance results as well, and they'll say, we want that now; we think it's at the maturity level required to defeat the threat that we're seeing today or in the near future, and we want that prototyped now. So we'll look at all three of those outcomes and the performance data and inputs from the services and combatant commands, and that's how we'll come up with what we'll do following the demo.

Moderator: Okay.

JJ: Thank you.

Moderator: Thank you, sir. Next up will be Jake Epstein, and then I'll check to see if there's anybody that I've not called on yet. So go ahead, Jake.

Jake Epstein: Hey, thanks for having this. I'm wondering if somebody could just go into a little bit more detail on the kinetic interceptor drones and how this kind of evolved into what it is now. I mean, is it--is that kind of pulling on lessons from theater where you have this sort of drone-on-drone combat? I guess, yeah, how did we get to that being a preferred defeat method? Thanks.

CMP: So you came in a little bit broken on that one. Are you referring to drone-on-drone? Are you referring to kinetic, such as rockets? Could you elaborate one more time, please?

JE: Sorry about that. The kinetic interceptor drones. I'm just kind of wondering if we can get a bit of like a rundown on how we got to that being a go-to method of defeat in this demonstration.

CMP: Well, for drone-on-drone, when you think of being in the continental United States versus outside the United States--really, within the US, looking at homeland defense, drone-on-drone is a safer approach for the environment within the United States, or within the homeland, as really any country would consider their homeland. Kinetic or EW or DE becomes a lot more challenging to use within a homeland. So drone-on-drone definitely has its sweet spot within homeland defense versus other types of kinetic or non-kinetic approaches. You really have to look at what the threat environment is and where you're operating, and then, what is the best effector for that environment? Obviously, when you're overseas, you're in an operation where you have to defend the defended area; you have a mass attack, or your adversary is close to you, or fairly close, so they can bring effectors on you very quickly. You want to have every different type of effector possible that can support the defense of your defended area. But when you're in the homeland, you do have different considerations to take into account on the different effectors that you may select. Now, overseas, a layered approach definitely works best. Of course, that works best in the United States as well, but there are different considerations that have to be weighed in the US about which effectors you can actually use. Does that help? Does that help at all? Because the participants, our allies and partners, that came out to our demo all have their own different needs. So, therefore, we want to have a broad smattering of effectors that they can then look at and compare and get performance data on. So it's important to have that drone-on-drone to give that full picture of those capabilities.

Moderator: Jake, did you have a follow-up?

JE: No, no, that was helpful. Thank you.

Moderator: Okay. Was there, is there anybody else on the line that I missed?

DC: This is Doug Cameron. I’m back in a quiet place. If I could throw one in?

Moderator: You bet. Go ahead, Doug.

DC: Appreciate it. Thanks. I just want to circle back a little bit to the comment about the learning environment. Is any number of vendors over the past few months, including vendors in this trial, who will certainly swear blind to myself, but I'm sure they've done the same to my colleagues that they could field systems in CENTCOM or wherever if only there was a path to acquisition or procurement. I mean, is there an element of getting over their skis by industry sometimes, which things like this demonstration, and you talked about how challenging it was, kind of bears out.

CMP: I apologize. You came in very broken. I wasn't able to hear the full question. Was anybody else on the line that could hear the question?

DC: I'll come back to my phone as opposed to the speaker.

CMP: Oh, thank you.

DC: Okay. I'll try and say it quickly. Yeah, there's any number of vendors, including someone in this demonstration, who told reporters, including me, that they could field systems [*0:41:00] if only there was a path to procurement. You talked about how challenging this particular demonstration was. I mean, is it the case that sometimes vendors are basically, you know, getting ahead of themselves, frankly, overpromising, and, you know, that's almost part of the demonstration, to provide a reality check?

CMP: Oh, yeah, you're definitely correct. We have obviously seen in the past where vendors at different expos or events have made claims that they can perform to a certain level against certain UAS capabilities. So really, our demonstrations validate whether or not that is true. In our request for white paper process, we'll look at what they claim they can do, then we'll down-select, and we'll also have oral presentations before we make determinations about just how accurate they are. We do [*0:42:02.0] also visit their facilities if necessary, and then once we do that, we down-select. You're correct: the demonstrations help validate that the capabilities the vendors claim they can conduct, the effects they claim they can put forward in countering UAS, are indeed what they say they are. So that definitely provides the warfighter and our allies and partners validated performance data that they can then use to inform their procurement decisions.

DC: Gotcha. That's very useful. Quick last one from me. I was interested in what you were saying with regard to the specifics on homeland defense and what best suits homeland defense. Excuse my ignorance here, but have there been any specific demonstrations focused on homeland defense, suitable solutions, or any plans?

CMP: So, there have been demonstrations that were more geared towards homeland defense by some of our partners. There are demonstrations throughout the year by the services, even by the combatant commands, now by Homeland Security, that look at different aspects of countering small UAS. The JCO in the past has looked at low collateral effects interceptors, which is geared more towards homeland defense. That was one of our very first demonstrations a few years ago. So we have looked at that, as well as drone hunter. But just so you're aware, there are demonstrations beyond just the JCO throughout the year, and they do focus in on those areas.

DC: Perfect. Okay, thanks so much.

Moderator: Okay. Thank you for that, sir. Was there anybody else that I did not call on? Okay, that's going to take us right to about the 45-minute mark, so that's going to conclude this media roundtable. Again, if there are any additional follow-up questions, go ahead and send those to me by email. I'd like to thank our panel members for their time today, and I'd like to thank everybody for their attendance. That concludes today's MRT, and thank you for everybody's participation. Have a good day.