Breaking Down Dhruv Rathee’s New AI Product — AI Fiesta

— Issue #24 of The Artificial Newsletter

Yes, I built a thumbnail with Dhruv Rathee and no, he doesn’t look like himself. That’s what happens when you let an AI do the makeup and lighting. 😄

If you’ve been following the Indian creator space, you probably saw the launch of AI Fiesta by Dhruv Rathee. At first glance, it looks like a neat aggregator of all the top AI models under one subscription — ChatGPT-5 Plus, Gemini 2.5 Pro, Claude 4 Sonnet, Perplexity Sonar, Grok 4, and even DeepSeek.

Instead of juggling multiple accounts and paying for each separately, users get a single dashboard, a library of prompts, and even a learning community. Sounds great — but how does something like this actually work under the hood? And what challenges might it face?

How Platforms Like AI Fiesta Are Built

While Rathee hasn’t shared the technical blueprint, here’s the most likely setup:

  • API Aggregation Layer
    The platform calls the APIs of different AI providers (OpenAI, Anthropic, Google, etc.) and routes user requests through a backend service. Think of it as a “hub” that decides which AI engine to query; a minimal sketch of this hub, including a middleware hook, follows this list.

  • Middleware for Features
    Extra features like side-by-side comparisons or auto-prompt correction are handled by middleware — basically custom code that processes your input/output before and after the API call.

  • Frontend Experience
    A web and mobile UI that abstracts away the complexity. You don’t see tokens, API keys, or rate limits — just a clean chat window.

  • Cost Layering
    Since each model charges per token or per request, AI Fiesta likely buys usage at scale and resells access via subscription. That’s how they can offer ₹999/month pricing instead of you paying for each model separately; a rough back-of-envelope calculation also follows this list.
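
To make the first two bullets concrete, here is a minimal sketch of what such an aggregation layer could look like. This is an assumption-driven illustration, not AI Fiesta’s actual code: the provider registry, the model names, and the clean_prompt middleware hook are hypothetical stand-ins, though the two endpoints shown are the real public chat APIs from OpenAI and Anthropic.

```python
import os
import requests

# Illustrative provider registry. Each entry knows how to build a request for
# one upstream API and how to pull the answer text out of its response.
# Model names are placeholders, not necessarily what AI Fiesta uses.
PROVIDERS = {
    "openai": {
        "url": "https://api.openai.com/v1/chat/completions",
        "headers": lambda: {"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        "body": lambda prompt: {
            "model": "gpt-4o-mini",
            "messages": [{"role": "user", "content": prompt}],
        },
        "extract": lambda data: data["choices"][0]["message"]["content"],
    },
    "anthropic": {
        "url": "https://api.anthropic.com/v1/messages",
        "headers": lambda: {
            "x-api-key": os.environ["ANTHROPIC_API_KEY"],
            "anthropic-version": "2023-06-01",
        },
        "body": lambda prompt: {
            "model": "claude-3-5-sonnet-latest",
            "max_tokens": 1024,
            "messages": [{"role": "user", "content": prompt}],
        },
        "extract": lambda data: data["content"][0]["text"],
    },
}


def clean_prompt(prompt: str) -> str:
    """Middleware hook: stand-in for 'auto prompt correction' (here it only trims)."""
    return prompt.strip()


def ask(provider: str, prompt: str, timeout: float = 30.0) -> str:
    """Route one user prompt to the chosen upstream model and return its text."""
    spec = PROVIDERS[provider]
    resp = requests.post(
        spec["url"],
        headers=spec["headers"](),
        json=spec["body"](clean_prompt(prompt)),
        timeout=timeout,
    )
    resp.raise_for_status()
    return spec["extract"](resp.json())


def ask_all(prompt: str) -> dict:
    """Side-by-side comparison: fan the same prompt out to every provider."""
    return {name: ask(name, prompt) for name in PROVIDERS}
```

In a real product this would sit behind a web backend with per-user authentication, quota tracking, and caching, but the core idea holds: the “hub” is a thin routing layer over other companies’ APIs.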
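
The cost-layering bullet is easier to feel with numbers. Here is a rough back-of-envelope calculation; every figure in it (blended token price, monthly usage, exchange rate) is an assumption for illustration only, not AI Fiesta’s real economics.

```python
# Every number here is a made-up assumption for illustration, not AI Fiesta's
# real pricing, usage, or costs.
subscription_inr = 999            # monthly subscription per user
usd_to_inr = 84                   # assumed exchange rate

# Assume a blended upstream price of $5 per million tokens across models,
# and that an active user burns 2 million tokens a month.
blended_usd_per_million_tokens = 5.0
tokens_per_user_per_month = 2_000_000

upstream_cost_usd = blended_usd_per_million_tokens * tokens_per_user_per_month / 1_000_000
upstream_cost_inr = upstream_cost_usd * usd_to_inr
gross_margin_inr = subscription_inr - upstream_cost_inr

print(f"Upstream API cost per user: ₹{upstream_cost_inr:,.0f}")   # ₹840
print(f"Gross margin per user:      ₹{gross_margin_inr:,.0f}")    # ₹159
# ...and that's before infrastructure, payment fees, and heavy users.
```

Swap in heavier usage or a price hike from a provider and the margin goes negative fast, which is exactly the tension the challenges section below describes.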

⚡ Why This Matters

For a typical user:

  • One subscription replaces 5–6 costly ones

  • UPI/local payments make it far more accessible

  • Community + curated prompts lower the learning curve

It’s less about “new AI” and more about better packaging and distribution of existing AI power.

🚨 The Hidden Challenges

Products like AI Fiesta sound great, but they come with serious operational risks:

  1. Rising API Costs
    Providers like OpenAI or Anthropic can change pricing anytime. If token rates rise, platforms reselling AI access either have to eat the cost or raise subscription fees.

  2. Rate Limits & Licensing
    Some AI vendors place terms-of-service restrictions on reselling or white-labelling their APIs. If a contract violation is alleged, access could be throttled or revoked.

  3. Latency & Reliability
    Every request makes an extra hop through AI Fiesta’s servers before reaching the upstream model, which adds latency. And if one provider has downtime, users blame AI Fiesta, not the original provider (see the fallback sketch after this list).

  4. Scaling Issues
    Early adopters may love the low price. But if too many people join and usage spikes, the company’s margins could collapse unless they renegotiate bulk deals.

  5. Security & Data Privacy
    Since requests pass through their servers, AI Fiesta has to handle data securely. Any breach could damage trust quickly.
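
Point 3 is the one an aggregator can partly engineer around. Below is a minimal sketch of timeout-plus-fallback routing, assuming an ask_fn callable shaped like the hypothetical ask() helper from the earlier routing sketch; the provider names and timeout values are illustrative.

```python
import requests


def ask_with_fallback(ask_fn, prompt: str,
                      providers=("openai", "anthropic"),
                      timeout: float = 15.0):
    """Try each provider in order until one answers.

    ask_fn is any callable shaped like the hypothetical ask() helper from the
    routing sketch: ask_fn(provider_name, prompt, timeout=...) -> answer text.
    Returns (provider_that_answered, answer_text).
    """
    last_error = None
    for name in providers:
        try:
            return name, ask_fn(name, prompt, timeout=timeout)
        except (requests.Timeout, requests.ConnectionError, requests.HTTPError) as exc:
            last_error = exc  # in production: log it, emit metrics, maybe retry once
    raise RuntimeError(f"All providers failed; last error: {last_error!r}")
```

Even with fallbacks, the extra hop and the retry budget add latency that the providers’ own apps don’t pay, which is the trade-off point 3 describes.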

🧭 What To Watch For

  • How transparent is the company about which models you’re actually hitting?

  • Do they eventually build their own proprietary layer (fine-tuned models, custom embeddings) to reduce dependency?

  • Can they maintain pricing once initial buzz dies down and infra bills pile up?

💡 Takeaway

Dhruv Rathee has tapped into something important: people don’t want 10 subscriptions; they want one tool that “just works.”

But sustainability will hinge on business economics, not just hype. If API costs rise, or licensing restrictions tighten, AI Fiesta will need more than aggregation — it will need innovation.

👉 Subscribe here if you’d like to keep learning about how AI products are built, the economics behind them, and how you can build your own.