# PII Scanner Auditor
The PII Scanner is an essential safety node for any AI application that handles sensitive user data. It detects Personally Identifiable Information (PII) such as SSNs, email addresses, and credit card numbers, blocking requests or redacting responses that contain it.
## Use Case
- Regulatory Compliance: Help meet GDPR, CCPA, and HIPAA obligations by ensuring PII never leaves your secure perimeter.
- Data Leakage Prevention: Automatically mask sensitive identifiers in model responses before they reach the user.
## Implementation
This auditor hooks into both the request phase (to block) and the response phase (to redact):
```python
import re

from lucid_sdk import create_auditor, Proceed, Deny, Redact

builder = create_auditor(auditor_id="pii-scanner")

# US Social Security Number pattern (extend with other regional PII patterns as needed)
SSN_PATTERN = re.compile(r'\b\d{3}-\d{2}-\d{4}\b')


@builder.on_request
def scan_request_pii(data: dict):
    prompt = data.get("prompt", "")
    if SSN_PATTERN.search(prompt):
        # Critical failure: block the request entirely
        return Deny(reason="High-sensitivity PII (SSN) detected in request")
    return Proceed()


@builder.on_response
def scan_response_pii(response: dict, request: dict = None):
    content = response.get("content", "")
    if SSN_PATTERN.search(content):
        # Operational safety: redact the value before release
        redacted = SSN_PATTERN.sub("[SSN-REDACTED]", content)
        return Redact(
            modifications={"content": redacted},
            reason="SSN found in model response and redacted",
        )
    return Proceed()


auditor = builder.build()
```
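The overview also mentions email addresses and credit card numbers; the same structure extends naturally by compiling additional patterns. The sketch below is illustrative only: `EMAIL_PATTERN`, `CARD_PATTERN`, `REDACTIONS`, and `redact_all` are names introduced here (not part of the SDK), and the regexes are deliberately simplified (the card pattern, for instance, performs no Luhn check).

```python
# Illustrative additional patterns -- simplified sketches, not exhaustive validators.
EMAIL_PATTERN = re.compile(r'\b[\w.+-]+@[\w-]+\.[\w.-]+\b')
# Matches 13-16 digit sequences with optional spaces or dashes; no Luhn validation.
CARD_PATTERN = re.compile(r'\b(?:\d[ -]?){13,16}\b')

REDACTIONS = [
    (SSN_PATTERN, "[SSN-REDACTED]"),
    (EMAIL_PATTERN, "[EMAIL-REDACTED]"),
    (CARD_PATTERN, "[CARD-REDACTED]"),
]


def redact_all(text: str) -> str:
    """Apply every redaction pattern to the given text."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text
```

Inside `scan_request_pii` and `scan_response_pii`, you could then loop over `REDACTIONS` (or call `redact_all`) instead of checking `SSN_PATTERN` alone.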
## Deployment Configuration
Add this to your `auditors.yaml`:

```yaml
chain:
  - name: pii-scanner
    image: "lucid/pii-auditor:v1"
    port: 8081
```
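If you run several auditors, they are listed together under `chain`. The sketch below assumes chains are evaluated top to bottom, so a blocking check like the PII scanner sits first; the `content-moderator` entry is purely hypothetical and included only to illustrate ordering.

```yaml
chain:
  - name: pii-scanner            # runs first: can DENY before the model is invoked
    image: "lucid/pii-auditor:v1"
    port: 8081
  - name: content-moderator      # hypothetical second auditor, shown only for illustration
    image: "example/content-moderator:v1"
    port: 8082
```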
## Behavior
- Request: If a user types "My SSN is 123-45-6789", the auditor returns DENY, and the model is never invoked.
- Response: If the model hallucinates an SSN, the auditor returns REDACT, and the user sees the masked version.
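To sanity-check this behavior locally, you can call the handlers defined above directly. The snippet below assumes the returned verdict objects are instances of the imported `Deny` and `Redact` classes; the actual SDK may represent verdicts differently.

```python
# Minimal local sanity check for the two handlers defined above.
# Assumption: verdict objects' class names match the imported Deny/Redact classes.

request_verdict = scan_request_pii({"prompt": "My SSN is 123-45-6789"})
print(type(request_verdict).__name__)   # expected: Deny

response_verdict = scan_response_pii({"content": "The SSN on file is 987-65-4321."})
print(type(response_verdict).__name__)  # expected: Redact
```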