- SSO protects only the apps your IdP (identity provider) knows about. The AI tools your staff accesses through direct browser logins are completely invisible to your IdP.
- The HSCC AI Third-Party Risk Guide (April 2026) now formally requires healthcare organizations to maintain an active, continuously updated inventory of all AI tools employees use — including those IT never approved.
- 97% of organizations that experienced AI-related security breaches lacked proper access controls.
- Shadow AI incidents carry an average of $670,000 in additional breach costs.
- Policies against unapproved AI tools don't work on their own: 52% of workers say they'll break their company's AI policy if it makes their job easier, and 25% already have.
- LastPass closes the SSO coverage gap by surfacing every app your staff logs into through the browser, giving you the AI tool inventory HIPAA auditors and cyber insurers require.
Disclaimer: While the information in this article can aid in HIPAA compliance, its use does not constitute legal or audit advice. To cover your specific use case, be sure to consult with a security professional with expertise in HIPAA compliance.
You probably can't name every SaaS or AI tool your staff logged into last week, but you know your core clinical systems sit behind SSO. By every measure, your access management is on solid ground.
But 31.4% of all vendor interactions now occur via direct browser access, which bypasses your IdP (identity provider) entirely.
Based on that, how critical is it to know whether your team uses tools outside SSO?
Here's a stat worth pondering: According to a 2025 Wolters Kluwer Health survey, 17% of healthcare professionals admit to using unauthorized AI tools, primarily to speed up their workflow.
But handling patient data with these unapproved tools raises privacy concerns, as inputs like PHI (protected health information) can be leaked via prompt injection attacks.
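One common defensive pattern, independent of any particular product, is redacting obvious PHI patterns before text ever reaches an external AI tool. The sketch below is a minimal illustration only; the regex patterns are assumptions and cover a tiny fraction of the identifiers HIPAA actually defines as PHI:

```python
import re

# Illustrative patterns only -- real PHI detection needs far broader coverage
# (names, dates of birth, addresses, the full list of HIPAA identifiers).
PHI_PATTERNS = {
    "SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "MRN": re.compile(r"\bMRN[:\s]*\d{6,10}\b", re.IGNORECASE),
    "PHONE": re.compile(r"\b\d{3}[-.]\d{3}[-.]\d{4}\b"),
}

def redact_phi(text: str) -> str:
    """Replace matches of known PHI patterns with typed placeholders."""
    for label, pattern in PHI_PATTERNS.items():
        text = pattern.sub(f"[{label} REDACTED]", text)
    return text

prompt = "Summarize visit for MRN: 12345678, SSN 123-45-6789, call 555-123-4567."
print(redact_phi(prompt))
# -> Summarize visit for [MRN REDACTED], [SSN REDACTED], call [PHONE REDACTED].
```

Redaction of this kind reduces, but does not eliminate, exposure; it does nothing against prompt injection itself, which is why governance and access controls still matter.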
SSO and MFA protect the systems your identity provider knows about. But they can't discover or govern AI tools your staff adopts without IT approval. Under both HIPAA's updated Security Rule and the HSCC AI Third-Party Risk Guide (April 2026), that unmonitored layer is now a formal compliance requirement to address, not a "best practice" to get to eventually.
Which brings us to an important question.
Why does SSO still leave your healthcare organization exposed to Shadow AI risk?
Federated SSO works by linking your IdP to the apps you've configured. When a clinician authenticates through Okta, Entra ID, or Active Directory, the IdP verifies their identity and passes an access token to the connected app. This is how authorized access works.
But every SaaS or AI tool a nurse, physician, or administrator accesses without going through this process is completely invisible to your IdP.
These direct browser logins, made with a business or personal email address, are untracked and unprotected, leaving you with a visibility problem.
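That visibility boundary can be illustrated with a toy federation model: the IdP only issues tokens, and logs activity, for apps registered with it, so logins to anything else never generate an IdP-side record. A stdlib-only sketch; the app names, secret handling, and log structure are simplified assumptions, not how any real IdP is implemented:

```python
import hashlib
import hmac

IDP_SECRET = b"demo-only-secret"                   # stands in for the IdP's signing key
REGISTERED_APPS = {"ehr", "email", "scheduling"}   # apps configured in the IdP
idp_audit_log = []                                 # what the IdP dashboard can show

def issue_token(user: str, app: str):
    """Toy IdP: issues a signed token only for apps it knows about."""
    if app not in REGISTERED_APPS:
        return None                                # direct login: no token, no log entry
    idp_audit_log.append((user, app))
    return hmac.new(IDP_SECRET, f"{user}:{app}".encode(), hashlib.sha256).hexdigest()

# SSO login to a registered app appears in the audit log...
issue_token("nurse.kim", "ehr")
# ...but a direct signup to an unapproved AI tool leaves no IdP trace.
issue_token("nurse.kim", "ai-notetaker")
print(idp_audit_log)   # only the 'ehr' login is visible
```

The point of the sketch: the dashboard isn't lying, it's just structurally blind to anything that never asked the IdP for a token.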
SSO coverage gap: The SaaS and AI apps your employees access outside your identity provider's reach, where no access policies apply and no visibility exists. See the types of AI tools LastPass surfaces.
"LastPass's quickness is what makes it so effective. It's fast and easy to use, which is why our employees love it."
Brad Sweet, Network Systems and Security Manager at HealtheConnections, trusted health information exchange for 1,500+ organizations
The illusion of control: What your dashboard actually reveals
Your IdP dashboard shows you what passes through SSO. It grades your security posture based on the vendors you've formally procured and the logins you can see. On the surface, the dashboard gives you a feeling of control, but it's only showing you the portion of activity it can measure.
According to HIMSS field reporting, the healthcare orgs tracking their AI usage are mainly large academic health systems like Mass General Brigham, which runs one of the largest hospital-system-based research enterprises in the U.S.
Mass General Brigham currently has 3,700 ongoing clinical trials and a research budget of nearly $2 billion.
This means most small to mid-sized healthcare enterprises are operating with an AI inventory that exists only on the official procurement list, not in the reality of daily usage.
That gap is exactly what the HSCC AI Third-Party Risk Guide calls out: your official vendor list records intent, while actual day-to-day behavior is where the risk lives.
Illusion of control: The false confidence that comes from monitoring only the access your identity provider can see — while the majority of SaaS and AI activity in your organization goes untracked.
What does Shadow AI look like in a clinical setting, and what are the risks?
Your healthcare team adopts AI tools to help with dosing, information retrieval, medical searches, and clinical summaries.
But if any of these actions bypass centralized identity verification, RBAC controls, and the monitoring essential for HIPAA compliance, your organization faces amplified risks.
- 64% of healthcare orgs use unverified apps in daily workflows, increasing their risk of exposure (Varonis)
- 81% of data policy violations involve PHI (The HIPAA Journal)
- 90% of healthcare orgs have sensitive files (about 48,000 on average) exposed via M365 Copilot (24X7: The Voice of HTM)
Why is Shadow AI in healthcare now a HIPAA compliance problem?
Because HHS proposed a major overhaul of the HIPAA Security Rule (published in the Federal Register, January 2025) to address modern AI data flows.
Key proposed changes include:
- Mandatory MFA for accounts accessing systems containing ePHI
- Documented risk analysis for any AI tool interacting with ePHI
- Maintaining an accurate inventory of all tech assets that may affect the CIA (confidentiality, integrity, or availability) of ePHI
Organizations that can't demonstrate the above for AI tools in use — including those IT didn't approve — face exposure under both the current Security Rule and incoming requirements.
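The inventory and risk-analysis requirements above can be modeled as structured records rather than a spreadsheet of vendor names. A minimal illustrative schema; the field names and gap check are assumptions for this sketch, not taken from the HIPAA text:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class AIToolRecord:
    """One entry in a tech-asset inventory covering AI tools (illustrative fields)."""
    name: str
    vendor: str
    it_approved: bool
    touches_ephi: bool                       # does the tool interact with ePHI?
    risk_analysis_date: Optional[date]       # None = no documented risk analysis yet
    access_method: str                       # e.g. "sso" or "direct-browser"

def audit_gaps(inventory):
    """Flag tools that touch ePHI but lack a documented risk analysis."""
    return [t.name for t in inventory
            if t.touches_ephi and t.risk_analysis_date is None]

inventory = [
    AIToolRecord("ClinicalSummarizer", "ExampleAI", True, True, date(2026, 3, 1), "sso"),
    AIToolRecord("ChatHelper", "OtherAI", False, True, None, "direct-browser"),
]
print(audit_gaps(inventory))   # ['ChatHelper']
```

An inventory shaped like this is auditable: you can answer "which ePHI-touching tools lack a risk analysis?" with a query instead of a meeting.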
In April 2026, the Health Sector Coordinating Council, a coalition of 400+ healthcare providers, pharma & medtech companies, and health IT entities, published the AI Third-Party Risk and Supply Chain Transparency Guide.
The guide translates the Jan 2025 HIPAA Security Rule proposals into actionable controls, which positions you to comply before the Rule is finalized.
Here's why doing this is critical:
- The average healthcare data breach cost $7.42 million in 2025, the most expensive among all industries.
- Breaches involving Shadow AI carry an additional $670,000 in costs and take the longest to identify and contain, at 279 days, five weeks longer than the global average.
If you're a community hospital, a mid-size health system, or healthcare-adjacent business managing ePHI, Shadow AI is an operational risk that must be addressed before your next audit, not after.
"Shadow AI leaks more than people realize by quietly teaching outsiders how your organization works without having to hack systems. If employees paste real internal content into AI tools, that can reveal naming conventions for systems and documents, folder structures, and more that can be leveraged in digital squatting and other social engineering attacks."
Stephanie Schneider, LastPass Cyber Threat Intelligence Analyst
Where does LastPass fit in your healthcare organization's access and compliance framework?
LastPass doesn't replace your EHR security controls, IdP, or HIPAA compliance program. But it can help close the specific gap SSO can't: the layer where your AI tools and non-SSO apps live.
First, LastPass connects to your existing IdP, whether that's Microsoft Entra ID, Active Directory, Okta, or Google Workspace.
Your staff will access LastPass with their existing corporate identities; there's no separate system to learn or manage.
From that foundation, LastPass covers the access layer your IdP can't reach, such as the AI tools, direct logins, and non-SSO apps that make up a growing share of how your staff works:
- SaaS Monitoring surfaces every SaaS and AI/chatbot app your staff logs into. This gives you the foundation, a working SaaS & AI inventory baseline.
- SaaS Protect allows your team to act on what SaaS Monitoring surfaces. You can warn clinicians the moment they try to log into an unapproved AI tool, block access to high-risk tools that present clear ePHI exposure risk, or require approved AI apps to be accessed with secure, vault-stored credentials rather than personal email accounts.
- MFA and access policies extend consistent authentication controls for non-SSO apps like legacy clinical tools that can't integrate with your IdP.
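The warn/block/allow pattern described above is a general policy model, not LastPass-specific code. A simplified sketch of how such a classification might be evaluated; the domains, actions, and defaulting-to-warn behavior are assumptions for illustration:

```python
# Illustrative policy: map login domains to an action. Real products apply
# richer signals (app category, risk score, credential source) than a flat list.
POLICY = {
    "approved-ai.example.com": "allow",   # sanctioned tool, vault-stored credentials
    "risky-ai.example.com": "block",      # clear ePHI exposure risk
}

def evaluate_login(domain: str) -> str:
    """Return the action for a login attempt; unknown tools get a warning."""
    return POLICY.get(domain, "warn")     # default: warn on unapproved tools

assert evaluate_login("approved-ai.example.com") == "allow"
assert evaluate_login("risky-ai.example.com") == "block"
assert evaluate_login("new-ai.example.com") == "warn"
```

Defaulting unknown tools to "warn" rather than "block" matters in clinical settings: hard blocks on unclassified tools tend to push staff toward personal devices, which makes the visibility problem worse.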
The practical outcome: Your IdP provides compliance documentation for access that flows through it. LastPass can provide the visibility, governance, and audit trail for the rest, including the AI tool inventory your next HIPAA audit will ask about.
See what AI tools your staff are using right now. Start a free LastPass trial and get your full SaaS inventory in minutes.
How does LastPass compare to other identity security vendors?
The right question for healthcare IT teams isn't "which tool has the most features." It's "which tool gives your team the coverage and compliance reporting you need, at a cost and complexity level you can actually sustain."
The comparison table below focuses on how each platform discovers SaaS and AI usage, not the total breadth of enterprise IAM features.
| | LastPass Business Max | 1Password XAM | Bitwarden Enterprise |
|---|---|---|---|
| Who is this actually built for? | IT teams of 1–3 people at small to mid-sized orgs managing SaaS and AI sprawl | Orgs with dedicated IT/security staff and capacity to configure and run a multi-module system | Technical teams prioritizing open-source control and hands-on configuration |
| How is SaaS and AI usage discovered? | Browser-based SaaS & AI discovery surfaces direct logins as users authenticate | SaaS discovery requires configuring the SaaS Manager module, a separate product layer within XAM | Limited to vault-managed credentials; AI tools accessed via direct browser login aren't surfaced |
| Does it capture access outside SSO? | Yes, detects direct browser logins regardless of IdP | No: SaaS Manager relies on IdP signals; direct browser logins outside SSO aren't captured by default | No: Access Intelligence surfaces apps with at-risk credentials within the vault; direct browser logins to SaaS and AI tools aren't captured |
| What does day-to-day administration look like at scale? | Single console for SaaS inventory, credential reports, vault policies, and MFA enforcement | Multi-module architecture (password manager, SaaS Manager, Device Trust) adds configuration layers and run-time effort | Self-hosted deployment requires ongoing server maintenance; cloud option is simpler but reduces flexibility |
| Does pricing stay predictable as my team grows? | Flat $9/user/month, with all Business Max capabilities included | SaaS Manager and Device Trust features are part of the extended platform and add additional cost | $6/user/month, but SaaS discovery doesn't include real-time app-level control with the ability to block, warn, or allow apps |
| Does it help meet AI governance requirements? | Provides automatic AI tool inventory, access logs, and policy enforcement for human-driven AI usage | Supports governance but designed for orgs with teams to configure and operationalize | Agent Access SDK focuses on just-in-time credential access for AI agents, not workforce AI governance; it doesn't provide AI tool discovery or Shadow SaaS and AI visibility |
| What does it not replace? | Doesn't replace full enterprise IAM, CASB, or SSPM; designed to close the non-SSO visibility gap | Capable platform but requires staff and budget to fully leverage | Open-source architecture is a plus for technical teams but a barrier for orgs without the engineering resources to deploy, customize, and maintain |
How do you know if your healthcare organization has a shadow AI governance gap?
A practical list of questions can help identify a Shadow AI governance gap:
- Can you list every AI tool your staff logged into last week, including the ones they signed up for with personal email addresses?
- Do you have access logs for the non-SSO apps your billing team, admin staff, and clinical support roles use daily?
- If a staff member used an AI transcription tool in a patient care meeting last month, did you know about it?
- Could you produce an AI tool inventory within 24 hours if your HIPAA compliance officer asked for one?
If you answered no to any of these, you're at the same starting point as most healthcare IT teams evaluating this right now.
The difference between orgs that manage this well and those that don't isn't budget; it's visibility.
LastPass gives your team SaaS visibility in hours, without adding another platform to manage or agent to deploy. If your current stack can't answer the above questions today, that's a gap you can close before your next audit cycle.
Start your free LastPass trial now to see your full AI tool inventory and close your access gaps before your next HIPAA audit.
Sources
UpGuard: The Shadow Supply Chain Report (2026)
UpGuard: The Shadow Supply Chain: The SSO & AI Visibility Gap (2026)
Healthcare Dive: Shadow AI use is widespread in healthcare
Wolters Kluwer: Shadow AI: Providers are using unapproved tools to improve workflow (2026)
Wolters Kluwer: Shadow AI, a hidden risk to healthcare (2026)
Forbes: The Trust Factor: Navigating Shadow AI in Healthcare



