Inflectiv AI Roadmap

Phase 1 — Kepler (Alpha) — LIVE

Focus: Dataset creation, ingestion, and AI agent workflows

Data ingestion & knowledge creation

  • Upload and process files (PDF, DOCX, TXT, MD, CSV, JSON, XLS, and other spreadsheet formats)

  • Import data from URLs and web sources

  • Generate synthetic datasets using AI

  • Automatic chunking and semantic indexing
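
Automatic chunking of uploaded files can be pictured as a sliding window over the text, where each chunk overlaps the previous one so that no sentence is cut off from its context before semantic indexing. This is a minimal illustration only; the chunk size and overlap below are placeholder values, not Inflectiv's actual ingestion parameters.

```python
def chunk_text(text: str, chunk_size: int = 500, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks for semantic indexing.

    chunk_size and overlap are illustrative defaults, not the
    platform's real ingestion settings.
    """
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    chunks = []
    step = chunk_size - overlap  # advance less than chunk_size to keep overlap
    for start in range(0, len(text), step):
        chunk = text[start:start + chunk_size]
        if chunk:
            chunks.append(chunk)
    return chunks
```

Each chunk would then be embedded and stored in a vector index so agents can retrieve relevant passages at query time.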

Dataset management

  • Create and update datasets using natural language

  • Public datasets and community browsing

  • Dataset visibility controls (public/private)

  • Encryption with Seal for secure datasets

  • Decentralized storage via Walrus

Agent creation & usage

  • Create AI agents using natural language

  • Update agent behavior via prompts

  • Public agents for testing before purchase

  • Private sandbox agents for development

  • Query datasets through agents

Blockchain & authentication

  • SUI zkLogin authentication

  • Wallet connection (Suiet and others)

  • Native encryption and decentralized storage integration

Platform foundations

  • Marketplace as primary entry point

  • Dataset and agent dashboard structure

  • Improved ingestion stability and validation

  • Migration of legacy users, datasets, and chats

  • Blog and content publishing system


Phase 2 — Cassini (Beta) — LIVE

Focus: Marketplace economy, subscriptions, API access, and platform growth

Marketplace & monetization

  • Paid datasets and premium access

  • Marketplace filters and discovery improvements

  • Dataset editing and listing management

  • Platform fee system

  • Referral and affiliate program with tracking

  • Creator commissions and withdrawal flow

Credits, billing & subscriptions

  • Tiered subscription plans

  • Payments in crypto or fiat

  • Credit usage tracking and limits

  • Credit rewards and top-ups

Developer ecosystem

  • Public API access for datasets and agents

  • SDK support for builders

  • Model Context Protocol access for integrations

  • Foundations for connecting Inflectiv to external AI systems

AI intelligence & dataset quality

  • Automatic dataset quality scoring

  • Auto-detected language, domain, and data type

  • Suggested dataset use cases

  • Mobula integration for token metadata enrichment
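
Automatic quality scoring typically combines several signals, such as dataset size, completeness, and duplication, into a single number. The weights and features below are hypothetical; Inflectiv's actual scoring model is not published.

```python
def quality_score(rows: int, null_ratio: float, dup_ratio: float) -> float:
    """Toy dataset quality score in [0, 100].

    The weights (0.3 / 0.4 / 0.3) and the 10,000-row size cap are
    illustrative assumptions, not the platform's real model.
    """
    size_score = min(rows / 10_000, 1.0)   # reward larger datasets, capped
    completeness = 1.0 - null_ratio        # penalize missing values
    uniqueness = 1.0 - dup_ratio           # penalize duplicate rows
    return round(100 * (0.3 * size_score + 0.4 * completeness + 0.3 * uniqueness), 1)
```

A score like this, surfaced alongside auto-detected language and domain, helps buyers compare listings at a glance.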

User onboarding & UX

  • Guided onboarding flows for new users

  • Marketplace tutorial rewards

  • Improved dataset creation calls-to-action

  • Search improvements and navigation upgrades

  • Profile and account management updates

Platform expansion

  • New Inflectiv app UI

  • New marketing landing page

  • Expanded file format support

  • Scalable storage tiers based on subscription


Phase 2.x — Ongoing Updates

Currently in progress or planned next

Support & operations

  • AI helpdesk trained on Inflectiv documentation

  • Ticketing fallback support system

  • Promo code management and upgrades

Admin & analytics

  • Admin panel for platform management

  • Manual user and credit controls

  • PostHog analytics integration

Trust & identity

  • Creator badge system (profile completion, subscriber tiers, verified providers)

  • Future KYB verification for licensed or IP datasets

Marketplace enhancements

  • Creator profile pages with public info and listings

  • Featured and promoted datasets

  • Improved dataset tagging and categorization

Data tools

  • Dataset splitter and automated dataset generation pipelines

  • Improved token search and historical data enrichment via Mobula

  • ElizaOS integration for advanced chatbot deployment


Phase 3 — Voyager (Future)

Focus: Tokenized datasets, protocol economy, and large-scale infrastructure

Tokenization infrastructure

  • INAI token deployment across Base and SUI

  • Dataset token factory for launching tokenized datasets

  • Creator token allocation and vesting logic

  • Token metadata and mapping to datasets

  • Tokenization UI and backend trigger flows

  • Tokenization fees and activation mechanics

Bonding curve launch model

  • INAI-denominated bonding curves for dataset tokens

  • Curve parameters and pricing configuration

  • Investor token purchases through curves

  • Automatic curve completion detection
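
The bullets above can be sketched with a simple linear bonding curve: the spot price rises as more of a dataset token is sold, the cost of a purchase is the area under the curve, and the curve "completes" once a sale target is hit. The pricing function, base price, and slope are assumptions for illustration; the actual curve parameters would be part of the Voyager design.

```python
def curve_price(supply_sold: float, base_price: float = 0.01,
                slope: float = 0.0001) -> float:
    """Spot price (in INAI) on a linear bonding curve.

    base_price and slope are hypothetical configuration values.
    """
    return base_price + slope * supply_sold


def buy_cost(supply_sold: float, amount: float, base_price: float = 0.01,
             slope: float = 0.0001) -> float:
    """INAI cost to buy `amount` tokens: the integral of the linear curve
    from supply_sold to supply_sold + amount."""
    start, end = supply_sold, supply_sold + amount
    return base_price * amount + slope * (end**2 - start**2) / 2


def curve_complete(supply_sold: float, target_supply: float) -> bool:
    """The curve graduates to LP trading once the sale target is reached."""
    return supply_sold >= target_supply
```

"Automatic curve completion detection" then reduces to checking `curve_complete` after each purchase and triggering liquidity pool creation when it returns true.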

Liquidity & graduation

  • Automatic liquidity pool creation

  • Supply allocation between LP and investors

  • Automatic transition from curve to LP trading

Revenue routing & buybacks

  • Dataset credit metering and usage tracking

  • Revenue splitter (buyback, treasury, operations)

  • Buyback thresholds and execution logic

  • Token burn mechanics

  • Buyback logging and transparency
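
The revenue-routing flow above can be sketched as a splitter that divides metered dataset revenue into buyback, treasury, and operations buckets, with buybacks executing only once an accrued threshold is crossed. The split percentages and threshold logic here are placeholders; the real parameters are not published.

```python
def split_revenue(revenue: float, buyback_pct: float = 0.5,
                  treasury_pct: float = 0.3, ops_pct: float = 0.2) -> dict[str, float]:
    """Route dataset revenue into buyback, treasury, and operations buckets.

    The 50/30/20 split is a hypothetical example, not Inflectiv's
    actual allocation.
    """
    if abs(buyback_pct + treasury_pct + ops_pct - 1.0) > 1e-9:
        raise ValueError("splits must sum to 1")
    return {
        "buyback": revenue * buyback_pct,
        "treasury": revenue * treasury_pct,
        "operations": revenue * ops_pct,
    }


def should_execute_buyback(accrued_buyback: float, threshold: float) -> bool:
    """Execute a buyback (and subsequent burn) only once the accrued
    buyback balance crosses the configured threshold."""
    return accrued_buyback >= threshold
```

Batching buybacks behind a threshold keeps on-chain execution costs low, and logging each execution provides the transparency called out above.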

Marketplace token integration

  • Tokenized dataset labels in marketplace

  • Optional token-priced dataset access

  • Token volatility warnings for users

Investor & analytics dashboards

  • Dataset token dashboard with price, liquidity, and buyback history

  • Investor discovery views

  • Monitoring for pools, meters, and buybacks

  • Anti-farming protection rules


Phase 3.1+ — Platform Scale (Planned)

Organization & enterprise features

  • Multi-user organizations with roles and permissions

  • Team dataset collaboration and governance

Advanced ingestion

  • Sensor and IoT data ingestion pipelines

  • Real-time streaming data support

Agent ecosystem

  • Native ElizaOS agent support

  • Integration with external agent ecosystems (e.g., Virtuals)

  • Agent marketplace with discovery and ratings

Last updated