CISA’s acting head uploaded sensitive files into public version of ChatGPT
(politico.com)
198 points by rurp 5 days ago
I had a C-level guy who installed a Firefox extension from the wrong domain on his fresh notebook, and it contained malware. He missed the official link in Google and downloaded the extension from whatever scammer site came up :-X
Have worked in places where juniors had to lock devices when on prem; only authorized hardware in the rooms. Yet the danger was from sloppy O6+ types, not the O1/GS6 who would (ready & able) carry the water.
There is a serious problem with folks who have power and authority and somehow no responsibility.
That's across government, service and corporate.
> There is a serious problem with folks who have power and authority and somehow no responsibility.
Or perhaps the fundamental problem is with people in general - perhaps people without power and authority follow rules only because they don't have the power and authority to ignore them.
In the 00's, DIA had episodes of career researchers watching porn from secured and monitored systems and then losing their jobs and clearances. One can only conclude they wanted to be fired or were really, really stupid.
It’s absolutely necessary to have ChatGPT.com blocked from ITAR/EAR regulated organizations, such as aerospace, defense, etc. I’m really shocked this wasn’t already the case.
"The report says Gottumukkala requested a special exemption to access ChatGPT, which is blocked for other Department of Homeland Security staff."
Surely that must have been approved by the Secretary of Homeland Security Kristi Noem, his former boss back in SD.
ITAR, yes, but there's no such thing as a person or organization that's not EAR-regulated. Everything exported from the US that's not covered by ITAR (State Department) is covered by EAR (Department of Commerce), even if only EAR99.
Sure. That doesn't mean denying access to ChatGPT though - the way I see it, the entire value proposition of Microsoft offering OpenAI models through Azure is to enable access to ChatGPT under contractual terms that make it appropriate for use in government and enterprise organizations, including those dealing with sensitive technology work.
I mean, they are all using O365 to run their day-to-day businesses anyway.
I used to work in a large technology multinational - not "tech industry", but proper industrial technology; the kind of corp that does everything, from dishwashers to oil rigs. It took nearly a year from OpenAI releasing GPT-4 to us having some form of access to this model for general work (coding and otherwise) internally, and from what I understand[0], that's just how long it took for the company to evaluate risks and iron out appropriate contractual agreements with Microsoft wrt. using generative models hosted on Azure. But they did it, which proves to me it's entirely possible, even in places where people are more worried about accidentally falling afoul of technology export controls than insider trading.
--
[0] - Purely observational, I had no access to any insider/sensitive information regarding this process.
I agree... but ITAR and EAR can be super vague, especially in higher education.
It's something I have been talking about. Going to be needed for everyone.
I really enjoyed unchecking all those cookie controls. Of the 1668 partner companies who are so interested in me, a good third have a "legitimate interest". With each wanting to drop several cookies, it seems odd that Privacy Badger only thinks there are 19 cookies to block. Could some of them be fakes - flooding the zone?
Damn. I forgot to read the article.
The same cookie can be shared with several partners or collected data can be passed to the partners.
It's not a cookie law — it's a privacy law about sharing personal data. When I know your SSN and email address, I might want to sell that pairing to 1668 companies and I have to get your "consent" for each.
I for one, after doing a bit of research, was shocked to find out the person in question is apparently completely unqualified for the job (if his pasting sensitive information into public ChatGPT didn't already make that abundantly clear). But the highlight from his Wikipedia page is this one:
> In December 2025, Politico reported that Gottumukkala had requested access to a controlled access program—an act that would require taking a polygraph—in June. Gottumukkala failed the polygraph in the final weeks of July. The Department of Homeland Security began investigating the circumstances surrounding the polygraph test the following month and suspended six career staffers, telling them that the polygraph did not need to be administered.[12]
So the guy failed a polygraph to access a highly controlled system full of confidential information, and the solution to that problem was to suspend the people in charge of ensuring the system was secure.
We're speed running America into the ground and half the country is willfully ignorant to it happening.
Polygraphs have to be one of the most awkward / bizarre requirements for accessing a program. They are not scientifically reliable.
The US uses them more pervasively it seems, but there's still remnants of it elsewhere.
The UK uses them for post-conviction monitoring in certain offenses: https://www.gov.uk/government/publications/police-crime-sent... ...and there's more than one British polygraph group: BPA and BPS (https://www.britishpolygraphassociation.org/, https://polygraph.org.uk/)
Australia did indeed reject the polygraph for security clearance: https://antipolygraph.org/blog/2006/10/19/australian-securit...
Canada however does seem to use it as part of their intelligence screening: https://www.canada.ca/en/security-intelligence-service/corpo...
> Do I have to go through the polygraph test to join CSIS?
> Yes. All CSIS employees must obtain a Top Secret security clearance and the polygraph is a mandatory part of the process.
Seems to be the same for CSE and to get "Enhanced Top Secret" clearance.
Back to the US, the Department of Labor says that private employers can't force people to undergo a polygraph test: https://www.dol.gov/agencies/whd/polygraph But of course this does not apply to public sector jobs, where it's used more pervasively.
They're somewhat effective at stopping people applying if those people know they will have to lie
Not defending the guy, but "completely" might be inaccurate. He has a master's in comp sci eng. https://en.wikipedia.org/wiki/Madhu_Gottumukkala
I do realize this scholastic achievement is no indication he knows what he is doing.
And an MBA. He seems like a lot of people I know who skim through their technical degrees just to get the credentials. And in my experience, a master's is often easier to get than a bachelor's.
Anyway what he did makes it abundantly clear that this person should not be head of security for anything.
They aren't willfully ignorant, they're cheering it on.
"pessimistic... but can't see it being positive"
why "but"?
It's bizarre that someone would choose to use the public, 4o bot over the ChatGPT Pro level bot available in the properly siloed and compliant Azure hosted ChatGPT already available to them at that time. The government can use segregated secure systems set up specifically for government use and sensitive documents.
It looks like he requested and got permission to work with "For Unofficial Use Only" documents on ChatGPT 4o - the bureaucracy allowed it - and nobody bothered to intervene. The incompetence and ignorance both are ridiculous.
Fortunately, nothing important was involved - it was "classified because everything gets classified" bureaucratic type classification, but if you're CISA leadership, you've gotta be on the ball, you can't do newbie bullshit like this.
The vast majority of government staff are career professionals who know what they are doing, not political appointees who showed up in the past year.
This is a "Cybersecurity chief" causing an intern-level IT incident.
In many industries this would be a major company-level incident and also an immediate fireable offense, and in some governments this would be a complete massive scandal plus a press conference broadcast across the country.
The CTO created the update? Otherwise it's not the same situation
No, but they could have easily created the culture that massively increased the probability of such mishaps... we have all seen how a not-OK work environment negatively affects deliveries, right? Or read about the Boeing fiasco(s).
Not an insider, just to be clear, so maybe it was just really bad luck. But no benefit of the doubt for the third strike.
Yay, on-premise LLMs are what's recommended for serious use, at least the US gov thinks so :) But the rest of us need to pay subscriptions to 3rd-party businesses passing back and forth our... everything?
In the old days people used to say "I have no secrets", and now we've evolved into "I know how to not upload important docs" ;)
There have to be GovCloud only LLMs just for this case.
I swear this government is headed by appointed nephews of appointed nephews.
I keep thinking back about that Chernobyl miniseries; head of the science department used to run a shoe factory. No one needs to be competent at their job anymore
The article says
> [ChatGPT] is blocked for other Department of Homeland Security staff. Gottumukkala “was granted permission to use ChatGPT with DHS controls in place,” adding that the use was “short-term and limited.”
He had a special exemption to use it as head of Cyber and still got flagged by cybersecurity checks. So obviously they don't think it's safe to use broadly.
They already have a deal with OpenAI to build a government focused one https://openai.com/global-affairs/introducing-chatgpt-gov/
> So obviously they don't think it's safe to use broadly.
More likely, everything gets added to the list because even when a flag turns out to be a false positive, it's worth investigating to make sure there isn't an adjacent gap in the security systems.
You are uploading information to the chat system every time you use it. Doubly true if you’re having it analyze or work with documents.
I presume pulling this data out is simple if you’re, say, China.
There's really no security to investigate. Without a private instance, it's an absolute non-starter for anything classified.
Somehow I think that the weak link in our government security is at the top - the President, his cabinet, and various heads of agencies. Because nobody questions what they're allowed to do, and so they're exempt from various common-sense security protocols. We already saw some pretty egregious security breaches from Pete Hegseth.
That's also the case in businesses. No one denies the CEO a security exemption.
Whether or not he is personally and directly responsible for this specific incident, his leadership absolutely sets the tone for the rest of the federal government.
> I swear this government is headed by appointed nephews of appointed nephews.
Don't forget the Large Adult Sons!
https://www.newyorker.com/culture/cultural-comment/the-land-...
> No one needs to be competent at their job anymore
That's actually the whole point. Placing incompetents in positions of authority means they know absolutely to whom they owe their loyalty. Because they know they would never have that job on merit. And since they don't really know how to do the job, they have no moral qualms about doing a poor job, or strong opinions on what they should be doing -- other than whatever mission their patron has given them. It's a tool used by weak leaders and it's unfortunately very effective.
> I swear this government is headed by appointed nephews of appointed nephews.
No joke, the previous head of the State Department task force tasked with fighting corruption and nepotism in international contracting was named Rich Nephew. (He's a very talented career civil servant and I mean no shade I just find that hilarious.)
DEI at its worst is exactly what you say. (At its best, it's "we hire for abilities, but we also look for abilities in non-traditional people".)
But even though that's what DEI can be, not all "someone got a gig not because of ability" is DEI. Cronyism, racism, and sexism all do that, too.
In the case of this administration, I think the traditional term is "yes men" - people who are hired not for ability, but because they will not say no to the boss.
They say that most fascist governments fall apart because they actively despise competence, which it turns out you need if you are trying to run a country.
Competence gives way to ideology.
I once read an interesting book on the economy of Nazi Germany. There were a lot of smart CEOs and high ranking civil servants who perfectly predicted US industrial might.
They say it, but they're wrong. Historically speaking there have been basically about 2 fascist governments, and they fell because they lost wars. And Germany, for one, did run things with high competence, to the extent that it took years for many countries to do anything about it.
If we loosen "fascist" to just mean any authoritarian government, there are many that ran for a very long time.
WWII started in 1939 and was done in early 1945, so it didn't take that long.
More importantly, maybe the Nazis were competent at first, but they absolutely fell apart internally due to mistrust, backstabbing, and demands for loyalty above all else. Hitler famously made many poor military decisions.
> There have to be GovCloud only LLMs just for this case.
I hear Los Alamos labs has an LLM that makes ChatGPT look like a toy. And then there's Sentinel, which may be the same thing I'm not sure.
And we all heard they reverse engineered alien anti gravity technology in the 80s.
All I've heard was that they were aware it's anti gravity. Nothing about reverse engineering.
Care to say more about that?
Bob Lazar claims he was assigned to a project where not only did they find a working device capable of emitting gravitational waves out of phase with Earth's gravitational waves, but they achieved the same effect by bombarding a mysterious element, unknown at the time. He called the material element 115 (the logical guess, since element 114's properties were known/it had been synthesized); the effect was achieved while it emitted one proton and decayed back to element 115.
Apparently he was in fact assigned to a top secret project at Los Alamos and his expertise was alternative propulsion; everything else is folklore, but it is deep folklore if you're interested in conspiracy theories.
I wonder how far removed the interim director of the CISA is from any real world security. I bet they have not seen or solved any real security problems and merely are an executive looking over cybersec. This probably is another example of why you need rank and file security peeps into security leadership roles rather than some random exec.
I would like to be able to say that it is uncommon, but based on what I am seeing in my neck of the woods, all sorts of, one would think, private information is ingested by various online LLMs. I would have been less annoyed had those been local deployments, but, uhhh, to say that's not the first choice is being over-the-top charitable about current corporates. And it's not even a question of money! Some of those corps throw crazy money at it.
edit: Just in case: in the company I currently work at, compliance apparently signed off on this with only a rather slim category of data verboten from upload.
I adore that this guy had security clearance and I doubt I'd clear that bar. Last time I looked at the interview there was a question:
> have you ever misused drugs?
and I doubt I'd be able to resist the response:
> of course not, I only use drugs properly.
Also, I wouldn't lie, because that would undermine the purpose. Still sad I can't apply for SC jobs, because I'm extremely patriotic and improving my nation is something that appeals to me.
FWIW I have held a security clearance during my career, and telling them I smoked weed was not a dealbreaker. What they are ultimately looking for is reasons why you could be coerced into divulging classified information. If you owe money due to drugs/gambling, etc, that's where it becomes a dealbreaker.
Yeah, this is true. They are looking for vulnerabilities that can be exploited by others - the fact you smoke a blunt once a week is not a problem in that regard.
You can see an archived list of industrial security clearance decisions here [0] which is interesting, and occasionally entertaining, reading. "Drug involvement security concerns" usually involve either actively using drugs or, worse, lying to cover up drug use, both of which are viewed as security concerns and grounds for rejection.
[0] https://web.archive.org/web/20170218040331/http://www.dod.mi...
wait, so I can apply and be honest? Sick! I just wrongly assumed they had classically archaic interpretations of drugs.
Current use is still a problem AFAIK (not sure on weed).
That said I can confirm that a few years back a friend who had previously used/experimented with a wide variety of substances (EDM scene, psychs), had no trouble getting a clearance.
They disclosed all of it, said they weren't currently using it and wouldn't for as long as they were in the job role, passed the drug test, and that was fine.
That said, to add to the "lying is a bad idea" point: I believe some of their references were asked about if they'd ever known that friend to have a dependency + if they were aware of any current/very recent use.
OC had a point. If you take drugs in the way they are intended to be used, you can say no with a clear conscience. Whether the interviewer will accept that if they later find out you took drugs, I couldn't tell you.
You would not get a security clearance, and the admin would make a note on your IQ. The correct answer is simply
> no
and keep the rest of it in your head.
How is it low IQ to be honest? People have to make decisions, and if the decision is "no", I can handle that. Empowering the person making the decision to the fullest extent is something I'd still be interested in, even if it is to my detriment. It's like when middle management asks me to lie or withhold information from the COO or CEO: it's just a no. If they're shit then it's on the organisation to sort that out. Second-guessing everything leads to even worse dysfunction.
We're not talking about sneaking into a concert or something low-stakes, the security of our nation is the foundation of our very civilization. I have dual citizenship of a nation that borders Russia and was once the USSR, so I appreciate the stakes of worst case scenarios because one of my nations was under that boot rather recently.
A smart person seeking a security clearance would not volunteer information that wasn't asked for, that causes him to be denied the clearance.
smart is not necessarily the same as deceitful. Also the question:
> do you misuse drugs
is very much asking for information about my drug use. So it was asked for.
The Dept of Homeland Security has had its own internal gen-AI chatbot since before Trump took office [0]. That this guy couldn't make do with that, and didn't think through the repercussions of uploading non-public documents to a public chatbot, doesn't bode well for his ability to manage CISA.
[0] https://www.dhs.gov/archive/news/2024/12/17/dhss-responsible...
From wikipedia:
He graduated from Andhra University with a bachelor of engineering in electronics and communication engineering, the University of Texas at Arlington with a master's degree in computer science engineering, the University of Dallas with a Master of Business Administration in engineering and technology management, and Dakota State University with a doctorate in information systems.
And he still manages to make a rookie mistake. Time to investigate Mr. Gottumukkala's credentials. I wouldn't be surprised if he's a fraud.
And in all manner of regulated industries. People simply cannot resist throwing anything and everything at the magic text machine. A company can control its IT assets, but if the content is displayable on a screen, rest assured users will just take photos and upload to their personal LLM accounts to get the generative answers they endlessly desire.
I’m actually shocked that security teams aren’t up in arms over this exfiltration of company secrets. I know some companies that are running their own models and agents but the vast majority are copilot/claude/codex’ing away sending all that sweet sweet IP to 3rd parties
You can get agreements with all of the providers around data sharing etc and host the models themselves through AWS or another cloud provider. That's what clueful companies are doing, as expecting people not to use this stuff is doomed to fail.
This administration's op-sec has been consistently at "Barney Fife" levels of incompetence.
And probably also been asked to draw a clock at a certain time, too.
If it wasn't meant to be eaten, it shouldn't have tasted so good!
Personally I believe this but it gets into conspiracy theory real quick. There are far simpler explanations.
Same, I want to believe that this is all a ruse and that they are smart and just really good at playing dumb, but there are just too MANY of them.
It's sycophancy plain and simple. Surround yourself with only yes-men, and the organization becomes less and less competent as the ones who stand up and say no are replaced.
Even if they know better, they can't do better because they know there is no loyalty to nay-sayers.
The simpler explanation is that all the competent people saw what happened the first go around and want nothing to do with it. That leaves a detritus of sociopathic wannabes to select from for staff, all vying to mirror the behavioral profile of dear leader.
Unfortunately for Maduro, that operation was run by military professionals rather than directly by Trump's lackeys. But give Hegseth enough time and he'll bring them around to the new standard.
When I saw mention that it was in the context of a "contracting" type set of info/documents, I actually chuckled. I spent a decade in procurement and sales for high-stakes contracts. An incompetent person has no idea how to manage a procurement and goes online. Basically this is a 2026 version of an inept executive bashing "what is an RFP" into a search engine in 2007.
The trick is how to weaponize the incompetence against them.
And when the CCP compromised the law enforcement portal for every American ISP, stealing info on 80% of Americans, including both the Kamala and Trump campaigns, under the previous admin it was rock solid op-sec, presumably.
Or when the previous admin leaked classified Iran attack plans from the Pentagon, so bad that they didn't even know whether they were hacked or not.
You can at least pretend to make a technical argument over a political one.
It was a previous admin who mandated a backdoor. Predictably, enemies of the state got access to the backdoor.
Source? I cannot find anything suggesting that law enforcement agencies operate the portals. They are mandated by law and used by law enforcement, but operated by the telecom providers.
From [0]: “Last year almost a dozen major U.S. ISPs were the victim”, “the intruders spent much of the last year rooting around the ISP networks”, “telecom administrators failing to change default passwords”, “Biden FCC officials did try to implement some very basic cybersecurity safeguards, requiring that telecoms try to do a better job securing their networks”. Per the original topic, the article goes on to explain how the Trump admin destroyed those little security steps.
I’m okay with some both-sidesing of bad opsec, but I think you’re incorrect on the blame in this story, and to the extent it is the government’s responsibility, the Trump II response was worse than the Biden’s.
[0] https://www.techdirt.com/2025/11/07/trump-cybersecurity-poli...
You're the one making a political argument by doing a whataboutism that attempts to negate the failings of this administration. Which you're not even doing correctly because by every measure the previous administration was drastically more competent by looking at the qualifications of the people who filled their posts.
It's been the same with every administration, unfortunately. It's just a side effect of such an unnecessarily big goverment.
If they are so leaky then why were they able to capture Maduro without a single American casualty? On one hand you claim incompetence and yet no one was tipped off. So maybe the Signal group chat wasn't as important as it was made out to be?
You have to actively maintain a state of ignorance to say this isn’t different. Go look at all of the public reporting starting in January about the way appointees in the Pentagon, DOGE, etc. blew through the normal policies and procedures controlling access, clearing people, or restricting sharing.
For example, this wasn’t just “oops, I used the wrong number” but Hegseth getting a custom line run into a secure facility so he could use a personal computer of unknown provenance and security:
https://www.nytimes.com/2025/04/24/us/politics/hegseth-signa...
That’s one of the reasons why one of the first moves they made was to fire CISOs and the inspectors general who would normally be investigating serious policy violations.
This isn’t “big government”, it’s the attitude that the law is a tool used to hurt their opponents and help themselves but never the reverse.
Which polygraph, the "lie detector" polygraph?
https://www.apa.org/topics/cognitive-neuroscience/polygraph
> Reviews of decades of scientific research suggest that polygraph tests are not reliable or accurate enough to be used in most forensic, legal or employment settings.
> Although lying can cause the physiological responses measured by polygraph machines—such as sweating and increased heart rate—those same changes can occur even when people are not lying, for example when they are nervous.
https://en.wikipedia.org/wiki/Madhu_Gottumukkala
He was the 'CTO' of South Dakota and later the CIO/Commissioner of the South Dakota Bureau of Information and Telecommunications under governor Kristi Noem.
Edit: (From a European perspective) it seems like the southern states really took over the US establishment. I hadn't really grasped the level of it, before.
> Edit: (From a European perspective) it seems like the southern states really took over the US establishment. I hadn't really grasped the level of it, before.
It's good to know the Americans aren't the only ones who never look at maps outside their own country
I am so happy that my embarrassing lack of geographical knowledge of the US states' internal geographies amused you. A good laugh is great for your health, I've heard.
At least I know where your country is located.
Now, let me quiz you on the geographical locations of French regions? Or perhaps Finnish regions, if that's something you work closer with, day-to-day?
;)
Sounds about on par with what I would expect competence wise.
Hand-picked by Noem, so yeah.
https://en.wikipedia.org/wiki/Madhu_Gottumukkala
> In April 2025, secretary of homeland security Kristi Noem named Gottumukkala as the deputy director of the Cybersecurity and Infrastructure Security Agency; he began serving in the position on May 16. That month, Gottumukkala told personnel at the agency that much of its leadership was resigning and that he would serve as its acting director beginning on May 30.
These days I think that thing's main purpose is to bounce people who would otherwise request access they don't really need. If it isn't worth sitting down for the machine, you don't really need it.
> Gottumukkala failed the polygraph in the final weeks of July. The Department of Homeland Security began investigating the circumstances surrounding the polygraph test the following month and suspended six career staffers, telling them that the polygraph did not need to be administered.
This is pretty insane though.
More context is that he was promoted under Noem in her old job too, just before the Presidential election.
> On Tuesday, Gov. Kristi Noem announced Gottumukkala's appointment as CIO. In a statement, she said he will prioritize the state’s citizens, their data and government service delivery.
https://www.govtech.com/workforce/south-dakota-governor-appo...
> Cybersecurity monitoring systems then reportedly flagged the uploads in early August. That triggered a DHS-led damage assessment to determine whether the information had been exposed.
So that means a DLP solution, with browsers trusting its CA so it can silently intercept HTTPS and inspect the traffic in clear text, right?
I’m a little surprised by the takes in the comments. Obviously, heads of departments or agencies, CEOs, or similar personnel are generally not in the same league as normal employees when it comes to compliance.
Productivity and efficiency are key for their work. I am sure there are lots of Sysadmins here, that had to disable security controls for a manager or had to configure something in a way to circumvent security controls from actually working. I have been in many situations where I have been asked by IT colleagues if doing something like that was fine, because an executive had to read a PowerPoint file NOW.
Sysadmins are afforded special leniency because of their demonstrated competence. Their leeway is earned. In this case, the "cyber security chief" has no proven skill other than absolute loyalty to his boss, which justified his skipping the usual vetting procedure.
Obviously those kinds of stories are common, but you can’t seriously be suggesting that it is a good or acceptable thing?
Execs are just as stupid as your average person and bypassing security controls for them puts an organization at an even greater risk due to the kinds of information they have access to. They just get away with it because they’re in charge.
It touched a nerve because no one in the trump admin is qualified to do their job. There's a lot of corruption and a lot of people getting access to things they shouldn't due to their relationship and loyalty, not merit. There's a big difference from a sys admin having super user access and some random politically connected hack abusing their privilege.
DOGE/Musk, noem, Kash, hegseth, etc.
Where does this "cybersecurity monitoring" take place? On OpenAI's side? Or some kind of monitoring tools on the devices themselves?
In any enterprise, normal would be to have monitoring on all ingress and egress points from the network and on devices themselves. You can't only have monitoring on managed devices because someone might BYOD and plug in an unmanaged device/connect it to internal wifi etc.
You bring in vendors and they need guest wifi to give you a demo, you need to be able to give them something to connect to but you don't want that pipe to be unmonitored.
What I'm really asking/wondering is how (and who or which party) figured out that this was leaked, and secondly how that propagated to the public. I don't really expect to find that answer. But if I had to guess OpenAI found out first, because employees there are more likely to leak the fact that the leak happened.
But also, how was it caught in the first place? Was it automatically flagged because content scanners automatically identified this as a concern, or was his account specially flagged for extra monitoring because of who he is?
It says "according to four Department of Homeland Security officials with knowledge of the incident" and "according to the four officials, each of whom was granted anonymity for fear of retribution"... so it seems to be an internal lead.
As the post above says, on managed devices there can be an enforced VPN that monitors all traffic coming and going and, while it's at it, strips out the encryption and looks inside the packets, applying heuristics like: what is the host domain, is it a known LLM site, is it a POST message sending data, and does the text of that data contain a string matching "INTERNAL USE ONLY". I assume something like this.
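The heuristic described above might boil down to something like this minimal sketch. To be clear, this is purely illustrative: the domain list, marking strings, and `flag_upload` function are made up here, not from any real DLP product, and real systems are vastly more elaborate (OCR on images, fuzzy matching, per-user policy, etc.):

```python
# Illustrative-only egress heuristic: flag POSTs of sensitively-marked
# text to known LLM domains. All names and lists here are hypothetical.

LLM_DOMAINS = {"chatgpt.com", "chat.openai.com", "claude.ai", "gemini.google.com"}
SENSITIVE_MARKINGS = ("FOR OFFICIAL USE ONLY", "INTERNAL USE ONLY", "FOUO", "CUI")

def flag_upload(host: str, method: str, body: str) -> bool:
    """Return True if an outbound request looks like a marked document
    being sent to a chat/LLM service."""
    if method.upper() != "POST":          # only data-bearing requests
        return False
    if host.lower() not in LLM_DOMAINS:   # only known LLM endpoints
        return False
    upper = body.upper()                  # markings are matched case-insensitively
    return any(marking in upper for marking in SENSITIVE_MARKINGS)

# A contracting doc marked FOUO pasted into a chat request body gets flagged:
flag_upload("chatgpt.com", "POST",
            "Contract draft - FOR OFFICIAL USE ONLY - award terms")  # True
```

Anything the flag fires on would then be queued for a human analyst rather than auto-blocked, which matches the article's timeline of a flag in early August followed by a damage assessment.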
Once again, if you or I did this, it's federal crime and federal time.
But when the chief does it, it's an oopsie poopsie "special exemption".
> Once again, if you or I did this, it's federal crime and federal time.
For a single incident? I doubt it. And you need to show (criminal) intent. We still have no idea if this was accidental. To be clear, before this incident he looked like just another senior IT admin. I still see it that way. My assumption is that it goes the other direction on a permanent basis.
I Googled for "cisa employment nationality requirements". I got a bunch of pages from CISA itself about how to apply for various jobs: recent grad, experienced specialist, and military vet. All have a bold statement under eligibility that says: "US Citizenship is required." I think it is safe to assume that Dr. Madhu Gottumukkala is a naturalised US citizen.
This is a very good question. Seems like it would negatively affect our security posture.
> None of the files Gottumukkala plugged into ChatGPT were classified, according to the four officials, each of whom was granted anonymity for fear of retribution. But the material included CISA contracting documents marked “for official use only,” a government designation for information that is considered sensitive and not for public release.
Guys... we're talking about FOUO. Not even low-level classified. This is a nothingburger. The toilet paper you wipe with is FOUO, there is essentially no document in the government that isn't at least FOUO.
Leaked is not the correct word here. Generally as it's used, it implies some intent to disclose the information for one's own purposes. You would call a disclosure to the War Thunder forums a leak, because the intent was to use that information to win an argument. You wouldn't call leaving boxes of classified information in a warehouse where you'd normally read them a leak (at least not as a verb). Likewise you wouldn't call it a leak if you mistakenly abandoned them in a park.
That said, IIRC For Official Use Only is the lowest level of classification (note: not classified); it's not even NOFORN. It's even multiple levels below Sensitive But Unclassified.
So, who cares?
Much more significant is he failed the SCI/full poly... that means you lied about something. Yes I know polys don't work, but the point of the poly is to try to ensure you've disclosed everything that could be used against you, which ideally means no one could flip you or manipulate you. The functional part is to determine if you have anxiety about things you might try to hide, because that fear can be used against you. No fear/anxiety, or nothing you're trying to hide means you're harder to manipulate.
That feels bad even ignoring the whole hostile spys kinda thing.
It's so often the guys that are at the top who are the exception to the rules that are the problem.
I knew some folks who worked military communications, and they broke rules regularly because senior officers just didn't want to walk across the street to do something securely...