<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/"><channel><title>AI on DAMNSEC</title><link>https://damnsec.com/tags/ai/</link><description>Recent content in AI on DAMNSEC</description><generator>Hugo -- 0.151.0</generator><language>en-us</language><lastBuildDate>Wed, 22 Apr 2026 00:00:00 +0000</lastBuildDate><atom:link href="https://damnsec.com/tags/ai/index.xml" rel="self" type="application/rss+xml"/><item><title>All New and Improved Cybersecurity...Attack Surfaces?</title><link>https://damnsec.com/blog/all-new-and-improved-cybersecurity-attack-surfaces/</link><pubDate>Wed, 22 Apr 2026 00:00:00 +0000</pubDate><guid>https://damnsec.com/blog/all-new-and-improved-cybersecurity-attack-surfaces/</guid><description>&lt;p&gt;Since the generative AI boom, everyone has found a use case for it. Reviewing their writing, planning a study guide, sharing hard-coded API keys/company secrets critical to their organisation&amp;rsquo;s infrastructure, and even using it as a financial adviser.
What? The whole sharing-company-secrets thing? Pfft. Oh yeah, people do it all the time.&lt;/p&gt;
&lt;p&gt;&lt;a href="https://securityaffairs.com/144597/security/samsung-data-leak-chatgpt.html#:~:text=%E2%80%9CIn%20another%20case%2C%20an%20employee,Affairs%20as%20your%20favorite%20blog."&gt;Samsung employees unwittingly leaked confidential company data by using ChatGPT&lt;/a&gt;. They shared material like internal meeting notes and source code with ChatGPT, leading to three separate data leaks via the chatbot in under a month.
How about the &lt;a href="https://thehackernews.com/2025/02/12000-api-keys-and-passwords-found-in.html"&gt;12,000+ API keys and passwords found in public datasets used for LLM training&lt;/a&gt;? Researchers identified 219 distinct secret types: AWS root keys, Slack webhooks, Mailchimp API keys, you name it. Many of these credentials were still live, too.&lt;/p&gt;</description></item></channel></rss>