A few days ago, we noticed something strange.
Some products show up consistently in AI answers.
Others… don’t exist at all.
Even when they’re good.
The problem we kept seeing
We tried asking different tools — ChatGPT, Perplexity, Gemini — questions like:
“best tools for X”
“alternatives to Y”
“how to do Z”
And the results weren’t random.
The same competitors kept appearing.
The same sources kept getting referenced.
And many products were completely invisible.
That’s when it clicked
This isn’t just SEO anymore.
It’s something closer to:
AI visibility
Not:
“do you rank on Google?”
But:
does AI even know your product exists?
does it recommend you?
or your competitors instead?
So we built Spotaq
We wanted a simple way to answer one question:
How does AI “see” your product?
You enter your product or website, and Spotaq:
checks how AI-assisted search surfaces it
shows which competitors appear instead
identifies gaps in visibility
suggests what to do next
No signup. No setup. Just a quick check.
What we learned building this
A few things became clear very quickly:
1. Single prompts are misleading
One query might show you. Another might not.
Patterns matter more than snapshots.
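To make this concrete, here's a minimal sketch of what "patterns over snapshots" means in practice: sample many answers, then look at mention *rates* rather than any single response. The product names and answer texts are made up for illustration; in practice the answers would come from repeatedly querying tools like ChatGPT or Perplexity with varied phrasings.

```python
from collections import Counter

# Hypothetical sample of AI answers to "best tools for X"-style prompts.
answers = [
    "Top picks: AcmeTool, WidgetCo, and YourProduct.",
    "Most people recommend AcmeTool or WidgetCo.",
    "Try AcmeTool; YourProduct is a newer alternative.",
    "AcmeTool and WidgetCo dominate this space.",
]

products = ["AcmeTool", "WidgetCo", "YourProduct"]

# Count how often each product is mentioned across all sampled answers.
mentions = Counter()
for answer in answers:
    for product in products:
        if product.lower() in answer.lower():
            mentions[product] += 1

# Report a mention *rate*, not a single-query yes/no.
for product in products:
    rate = mentions[product] / len(answers)
    print(f"{product}: mentioned in {rate:.0%} of answers")
```

A single query here would tell you almost nothing: YourProduct appears in two of the four answers, so any one snapshot is a coin flip.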
2. Competitors are the real signal
Seeing who shows up instead of you is way more useful than a “score”.
3. Sources matter more than models
AI tools often pull from:
Reddit threads
comparison blogs
review platforms
documentation
If you’re not present there, you’re unlikely to show up.
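The same sampling idea applies to sources: if AI answers cite URLs, tallying the cited domains shows which platforms actually feed the answers. A minimal sketch, with hypothetical citation URLs standing in for ones scraped from real answers:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical citation URLs collected from AI answers.
cited_urls = [
    "https://www.reddit.com/r/SaaS/comments/abc123",
    "https://www.g2.com/products/acmetool/reviews",
    "https://example-blog.com/acmetool-vs-widgetco",
    "https://www.reddit.com/r/startups/comments/def456",
    "https://docs.acmetool.com/getting-started",
]

# Tally cited domains, normalizing away the "www." prefix.
domains = Counter(
    urlparse(url).netloc.removeprefix("www.") for url in cited_urls
)

# Most-cited domains first: these are the places worth being present.
for domain, count in domains.most_common():
    print(f"{domain}: {count}")
```

Even on a toy sample, the shape of the result is the point: a couple of platforms account for most citations, and absence from them means absence from the answers.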
This is still early
We don’t think this is fully figured out yet.
In many ways, this feels like:
SEO in the early days
Not everyone is paying attention.
But the ones who do will have an advantage.
What we’re doing next
We’re continuing to:
test how visibility changes over time
map which sources influence AI answers
turn this into more actionable insights
Try it
If you’re curious:
(We’d genuinely love feedback — still building this in public.)