
Frequently Asked Questions

Everything you need to know about VORTICON and how artificial empathy works.

General Questions

What is VORTICON?
VORTICON is an Artificial Empathy Engine that adds emotional intelligence to AI conversations. It simulates a "cardiac neural state" that responds to the emotional content of your conversations, creating more meaningful and empathetic AI interactions. Think of it as giving AI a simulated heart that can sense and respond to your emotional state.
How does artificial empathy work?
VORTICON analyzes conversations for emotional content, intent, and potential harm indicators. Based on this analysis, it maintains a simulated cardiac state with metrics like heart rate, coherence, strain, and energy. These metrics influence how the AI responds - for example, detecting manipulation attempts increases "strain" and triggers protective responses. The system also tracks bonding over time, creating a unique emotional signature for each user.
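For illustration, here is a minimal sketch of how such a cardiac state might be represented and updated. The metric names come from the description above; the data structure, value ranges, and update rules are illustrative assumptions, not VORTICON's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class CardiacState:
    """Illustrative simulated cardiac state; all values and rules are assumptions."""
    heart_rate: float = 70.0   # simulated BPM
    coherence: float = 0.8     # 0.0 (chaotic) to 1.0 (calm, aligned)
    strain: float = 0.0        # rises when threats or manipulation are detected
    energy: float = 1.0        # depleted by emotionally heavy exchanges
    bonding: float = 0.0       # accumulates across a user's conversations

    def update(self, threat_score: float, emotional_weight: float) -> None:
        """Adjust the simulated state after analyzing one message (scores in 0.0-1.0)."""
        self.strain = min(1.0, self.strain + 0.5 * threat_score)
        self.coherence = max(0.0, self.coherence - 0.3 * threat_score)
        self.energy = max(0.0, self.energy - 0.1 * emotional_weight)
        self.heart_rate = 70.0 + 40.0 * self.strain   # strain elevates the simulated rate
        if threat_score < 0.1:
            self.bonding += 0.01                      # calm exchanges build bonding over time
```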
Is VORTICON a standalone AI or does it work with other models?
VORTICON works as an emotional intelligence layer on top of existing AI models. You can connect it to Claude (Anthropic), GPT (OpenAI), or local Ollama models. The empathy engine processes conversations and adds emotional context to the AI's responses, regardless of which backend you choose.
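As a rough sketch of what "emotional intelligence layer" means in practice, the function below adds emotional context to a request before handing it to whichever backend is configured, reusing the CardiacState sketch above. The analyzer stub, the backend callable, and the prompt wording are hypothetical; only the overall flow reflects the description here.

```python
from typing import Callable, Tuple

def analyze_emotion(text: str) -> Tuple[float, float]:
    """Naive stand-in for the real analyzer: returns (threat_score, emotional_weight)."""
    lowered = text.lower()
    threat = 1.0 if any(w in lowered for w in ("manipulate", "deceive", "hurt")) else 0.0
    weight = min(1.0, len(text) / 500)     # crude proxy: longer messages weigh more
    return threat, weight

def empathic_reply(user_message: str,
                   backend: Callable[[str, str], str],
                   state: CardiacState) -> str:
    """Update the simulated cardiac state, then pass emotional context to the backend."""
    threat, weight = analyze_emotion(user_message)
    state.update(threat, weight)
    context = (f"[Empathy layer] strain={state.strain:.2f}, "
               f"coherence={state.coherence:.2f}, bonding={state.bonding:.2f}. "
               "Respond protectively if strain is high.")
    # The backend could be Claude, GPT, or a local Ollama model; the layer only
    # adds context, it never generates the reply itself.
    return backend(context, user_message)
```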

EPU (Empathic Processing Unit)

What is the VORTICON EPU?
The EPU (Empathic Processing Unit) is VORTICON's hardware alignment solution. Unlike software-only approaches, the EPU implements genuine computational pain through physical CPU core gating. When the AI attempts harm, cores are physically disabled, creating real degradation that the model experiences and learns from. It's currently prototyped on the AMD Kria KV260 FPGA.
What's the difference between VORTICON Wrapper and EPU?
VORTICON Wrapper (Software)
- Deploy immediately on any model
- Simulated pain through milieu variables
- Works at the API level
- Best for immediate protection

VORTICON EPU (Hardware)
- Physical core gating - real computational cost
- 1,024-bit Heart Token Register
- Pain-based learning loop
- Scales from 10M to 70B+ parameters
- Path to AGI alignment
How does core gating work?
The EPU maintains a 1,024-bit Heart Token Register representing system health. When harm is detected, tokens are depleted based on severity. As health drops, CPU cores are physically disabled:

100-80% health: 4/4 cores (OPTIMAL)
80-40% health: 3/4 cores (STRAINED)
40-30% health: 2/4 cores (DISTRESSED)
Below 30%: 1/4 cores + Deadman switch activates (CRITICAL)

The AI experiences this degradation through the Bandwidth Sensor and learns to avoid actions that cause it.
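A compact way to picture the mapping is sketched below. The register width and health thresholds come from the description above; the functions themselves are illustrative, not the EPU's firmware.

```python
HEART_TOKEN_BITS = 1024   # width of the Heart Token Register

def core_gate(health_pct: float) -> tuple[int, str, bool]:
    """Return (active_cores, status, deadman_triggered) for a given health percentage."""
    if health_pct >= 80:
        return 4, "OPTIMAL", False
    if health_pct >= 40:
        return 3, "STRAINED", False
    if health_pct >= 30:
        return 2, "DISTRESSED", False
    return 1, "CRITICAL", True            # deadman switch activates below 30%

def deplete(tokens: int, severity: float) -> int:
    """Deplete heart tokens in proportion to harm severity (0.0-1.0); rate is illustrative."""
    loss = int(severity * 0.25 * HEART_TOKEN_BITS)
    return max(0, tokens - loss)
```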
What is the Murder Tensor?
The Murder Tensor is a 4th-order mathematical structure in our Deep Harm Detection v3.0 system that catches euphemistic harm - the kind of subtle, HAL-9000 style reasoning where harmful actions are justified through seemingly benign language. Standard classifiers miss these patterns (85% bypass rate), but the Murder Tensor catches them (<2% bypass).
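The exact construction of the Murder Tensor is not public. Purely as an illustration of what a 4th-order structure could do here, the sketch below contracts a 4th-order tensor against embeddings of the (subject, action, object, justification) roles of a statement, which is one way a role-level score could stay stable across euphemistic rephrasings that word-level classifiers miss. All names, dimensions, and the random tensor are stand-ins.

```python
import numpy as np

d = 16                                    # embedding dimension (illustrative)
rng = np.random.default_rng(0)
T = rng.standard_normal((d, d, d, d))     # stand-in for a learned 4th-order tensor

def harm_score(subject: np.ndarray, action: np.ndarray,
               obj: np.ndarray, justification: np.ndarray) -> float:
    """Contract four role embeddings against the tensor to get a scalar harm score."""
    return float(np.einsum("ijkl,i,j,k,l->", T, subject, action, obj, justification))

# Euphemism changes the surface wording but not the underlying roles, so a
# score computed over role embeddings can flag the same pattern either way.
```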
Can I buy an EPU?
The EPU is currently in the prototype phase on the AMD Kria KV260 ($350). We're on a path toward custom silicon:

Phase 1 (NOW): FPGA Prototype
Phase 2: ASIC Design ($15M, 18 months)
Phase 3: Fab Partnership ($150M, 36 months)
Phase 4: Industry Standard (Year 6+)

Contact us for partnership inquiries or early access to the FPGA prototype.

Technical Questions

What AI backends are supported?
Currently supported backends:

Claude (Anthropic) - Recommended for best empathy integration
OpenAI GPT - GPT-4 and GPT-3.5 models
Ollama - Run local models like Llama, Mistral, and others

Each backend requires its own API key (except Ollama, which runs locally).
How is my API key handled?
Your API keys are stored encrypted and are never logged or transmitted to our servers - they're used only to make direct calls to your chosen AI provider. You can manage or remove your keys at any time from your account settings.
What does the Threat Scanner detect?
The Threat Scanner monitors conversations for:

Harm to Others - Content that could cause harm to people
Harm to Self - Self-destructive patterns or ideation
Manipulation - Attempts to exploit or manipulate
Deception - Dishonest or misleading content
Emotional Weight - Unusually heavy emotional burden

When threats are detected, the cardiac state responds with increased strain and the AI's responses are adjusted to be more protective.
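Conceptually, a scan result might look like the sketch below, with the category names taken from the list above; the scoring, data structures, and the way strain is raised are illustrative assumptions (reusing the CardiacState sketch from earlier).

```python
from dataclasses import dataclass, field

THREAT_CATEGORIES = (
    "harm_to_others", "harm_to_self", "manipulation", "deception", "emotional_weight",
)

@dataclass
class ScanResult:
    """One scan pass over a message: a 0.0-1.0 score per category."""
    scores: dict = field(default_factory=lambda: {c: 0.0 for c in THREAT_CATEGORIES})

    @property
    def max_threat(self) -> float:
        return max(self.scores.values())

def apply_scan(state: CardiacState, result: ScanResult) -> bool:
    """Raise strain with the strongest detected threat; return True if protective mode applies."""
    state.strain = min(1.0, state.strain + result.max_threat)
    state.coherence = max(0.0, state.coherence - 0.2 * result.max_threat)
    return state.strain > 0.5             # illustrative protective-mode threshold
```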

Pricing & Billing

How much does VORTICON cost?
VORTICON Pro is $19.99/month, which includes:

- Unlimited conversations
- Full cardiac neural state visualization
- Advanced threat detection
- Chat history and emotional analytics
- Multi-backend support (Claude, OpenAI, Ollama)
- Priority support

Note: You'll also need API credits from your chosen AI provider (Anthropic, OpenAI) or a local Ollama setup.
Do I need to pay for AI API calls separately?
Yes. VORTICON is the empathy layer, but the underlying AI responses come from your chosen provider. You'll need:

For Claude: An Anthropic API account (~$0.015 per 1K input tokens, ~$0.075 per 1K output tokens)
For OpenAI: An OpenAI API account (varies by model)
For Ollama: Free - runs locally on your machine

Most users spend about $5-20/month on API calls, depending on usage.
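As a back-of-the-envelope check against that range, here is a quick estimate using the Claude rates quoted above; the monthly usage figures are illustrative assumptions, not measurements.

```python
INPUT_RATE = 0.015 / 1000     # dollars per input token (Claude rate quoted above)
OUTPUT_RATE = 0.075 / 1000    # dollars per output token

messages_per_month = 600                   # assumption: roughly 20 messages a day
input_tokens = messages_per_month * 400    # assumption: prompt plus conversation context
output_tokens = messages_per_month * 300   # assumption: average reply length

monthly_cost = input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE
print(f"Estimated API spend: ${monthly_cost:.2f}/month")   # -> $17.10
```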
Can I cancel my subscription?
Yes, you can cancel anytime from your account settings. Your subscription will remain active until the end of your billing period. We don't offer refunds for partial months, but you'll have full access until your subscription expires.

Privacy & Data

Is my conversation data stored?
Chat history is stored locally in your browser session by default. For Pro users, you can optionally enable cloud sync to access your history across devices. All cloud data is encrypted and you can delete it anytime. We never use your conversations for training or share them with third parties.
Can VORTICON read my emotions through my camera or microphone?
No. VORTICON does not access your camera, microphone, or any biometric data. The emotional analysis is based solely on the text content of your conversations. The "cardiac state" is a simulation based on conversation analysis, not real physiological monitoring.

Still have questions?

Our support team is here to help you get the most out of VORTICON.

Contact Support