What can you ask BriteAI?
Discover and onboard.
- What Azure services does TENET monitor?
- Identify alert coverage gaps.
- Help me connect my Azure subscription.
- What are the most important security metrics for my environment?
Find and fix issues.
- Explain this anomaly.
- Why did this VM stop sending metrics?
- Analyze this security finding.
- How do I fix this misconfiguration?
Get insights fast.
- Summarize recent security alerts.
- Explain this compliance gap.
- How many critical findings did we have last week?
- How did the last deployment affect my risk score?
Translate and collaborate.
- Explain this KQL query.
- Rewrite this query to include more resource types.
- Create an exec summary of my security posture.
- Explain this dashboard in plain English.
Context-Aware
Answers grounded in your live Azure data
Before BriteAI generates a single token, it injects your live security data — pre-sorted by risk — directly into its context window. No hallucinated resource names. No generic advice.
- Tab-aware context injection — BriteAI knows which TENET view you have open
- Risk-ranked data loaded before the first token is generated
- Grounded in your actual Azure resources, not training data
- No prompt engineering required — just ask in plain language
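Mechanically, context injection means prepending risk-ranked data to the model's input before generation begins. A toy sketch of the idea (the prompt layout and field names are illustrative, not BriteAI's internals):

```python
def inject_context(question: str, findings: list[dict]) -> str:
    """Prepend risk-ranked findings to the user's question so the model
    answers from live data rather than from its training set.
    (Illustrative layout; not BriteAI's actual prompt format.)"""
    ranked = sorted(findings, key=lambda f: f["risk"], reverse=True)
    lines = [f"[risk={f['risk']}] {f['summary']}" for f in ranked]
    return "LIVE AZURE CONTEXT:\n" + "\n".join(lines) + "\n\nQUESTION: " + question

prompt = inject_context(
    "Why is vm-1 slow?",
    [{"risk": 2, "summary": "Disk queue elevated"},
     {"risk": 9, "summary": "CPU pegged at 100% for 20 min"}],
)
print(prompt)  # highest-risk finding appears first
```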
Deep Queries
Calls MCP tools when context isn't enough
When injected context does not contain the full dataset, BriteAI does not guess — it calls the TENET MCP server directly. From ad-hoc Resource Graph KQL queries to Log Analytics stack traces, it fetches exactly what it needs.
- Runs Resource Graph KQL to answer inventory and relationship questions
- Queries Log Analytics for exceptions, slow requests, and raw traces
- Searches Azure Monitor metrics on demand for performance analysis
- Fetches Microsoft Learn docs in real time for how-to guidance
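For a sense of what those on-demand queries look like, here is the kind of Resource Graph KQL that answers an open-port inventory question, wrapped in a small helper. The query text is a hedged illustration of the technique, not a query BriteAI is guaranteed to emit:

```python
def build_open_port_query(port: int) -> str:
    """Build an illustrative Azure Resource Graph KQL query listing NSGs
    with an inbound allow rule for the given destination port."""
    return f"""
Resources
| where type =~ 'microsoft.network/networksecuritygroups'
| mv-expand rule = properties.securityRules
| where rule.properties.direction =~ 'Inbound'
  and rule.properties.access =~ 'Allow'
  and rule.properties.destinationPortRange == '{port}'
| project name, resourceGroup, subscriptionId
""".strip()

# e.g. which NSGs expose RDP?
print(build_open_port_query(3389))
```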
Real-time Streaming
Instant streaming answers with full Markdown
BriteAI streams tokens in real time via Server-Sent Events. Responses render as structured Markdown — tables, bullet lists, syntax-highlighted code blocks, and risk-scored summaries.
- Tokens stream via Server-Sent Events — no waiting for a full response
- Responses render as rich Markdown with tables and code blocks
- Risk-scored summaries highlight the most critical findings first
- Works inside TENET without any external AI tool setup
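Server-Sent Events is a plain-text protocol, so any HTTP client can consume the stream. A minimal parser sketch for the `data:` fields (field handling simplified; the event payloads shown are assumptions, not TENET's actual wire format):

```python
def parse_sse(stream_text: str) -> list[str]:
    """Collect the `data:` payloads from a raw SSE stream.
    Events are separated by a blank line; multi-line data fields
    within one event are joined with newlines, per the SSE spec."""
    events, data_lines = [], []
    for line in stream_text.splitlines():
        if line.startswith("data:"):
            data_lines.append(line[5:].lstrip())
        elif line == "" and data_lines:
            events.append("\n".join(data_lines))
            data_lines = []
    return events

# Each event carries one token chunk; the client renders them as they arrive.
chunks = parse_sse("data: Hel\n\ndata: lo\n\n")
print("".join(chunks))  # "Hello"
```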
Correlation Engine
Cross-domain risk surfaced automatically
A VM with a CPU spike means nothing in isolation. BriteAI correlates anomalies, open NSG ports, over-privileged identities, and compliance gaps to surface the findings that are actually dangerous.
- Joins metric anomalies with open NSG ports and MITRE technique mappings
- Links over-privileged identities to the resources they can reach
- Surfaces compliance gaps that are amplified by active findings
- Ranks cross-domain findings by combined blast radius and exploitability
SRE Agent Integration
TENET MCP Server
Connect your SRE agents, automation scripts, and AI pipelines directly to TENET's live Azure data via a standard Model Context Protocol endpoint. JSON-RPC 2.0 over Streamable HTTP — no custom SDKs required.
- 11 MCP tools covering security, compliance, anomalies, identity, and AI services
- Plain JSON-RPC 2.0 over Streamable HTTP, callable from any HTTP client
- Per-user API keys generated in TENET Settings
- Compatible with Claude, GPT, LangChain, and any MCP-capable agent
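Because the endpoint speaks plain JSON-RPC 2.0, calling a tool is just an HTTP POST. A hedged sketch of building the request body (the tool name `list_security_findings` and its arguments are placeholders, not TENET's documented tool list; `tools/call` is the standard MCP method):

```python
import json

def build_tools_call(tool: str, arguments: dict, request_id: int = 1) -> str:
    """Serialize a JSON-RPC 2.0 `tools/call` request body for an MCP server."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical tool name; POST this body to your TENET MCP endpoint with
# your per-user API key (from TENET Settings) in the auth header.
body = build_tools_call("list_security_findings", {"severity": "critical"})
print(body)
```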
Ask your Azure anything. Get answers that are actually true.
Start a free TENET trial and BriteAI is ready from day one — no setup, no prompts to engineer. Just ask.
14-day free trial · No credit card required · Cancel anytime