Provides a collection of Vue 3 UI components for building enterprise‑level AI chat applications, with built‑in streaming, theming, and flexible storage support.
TinyRobot offers a rich set of ready‑to‑use Vue 3 components such as chat bubbles, message input, conversation containers, and attachment handlers. It follows the OpenTiny design system, enabling developers to quickly assemble interactive AI assistants and integrate them with any AI model backend.
pnpm add @opentiny/tiny-robot
Optional packages:
@opentiny/tiny-robot-kit for AI model request utilities.
@opentiny/tiny-robot-svgs for standalone SVG icons.
Import the bundled stylesheet before using the components:
import '@opentiny/tiny-robot/dist/style.css';
<template>
<tr-bubble role="ai" content="Hello!" placement="start" />
<tr-bubble role="user" content="How do I start?" placement="end" />
</template>
<script setup>
import { TrBubble } from '@opentiny/tiny-robot';
</script>
The kit package also provides Vue composables such as useMessage and useConversation for managing chat state.
Q: Which Vue version is required? A: Vue 3.2 or higher.
Q: Do I need the kit package to display chat components? A: No, the core package handles UI. The kit is only needed for model request utilities.
Q: Can I customize the component theme? A: Yes, TinyRobot provides a theme configuration API and supports custom CSS variables.
Q: How does streaming work? A: The components expose a streaming slot that updates as partial responses arrive from the backend.
Q: Is the library tree‑shakable? A: Yes, you can import individual components to keep bundle size minimal.
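The streaming answer above can be illustrated with a small, library-independent sketch: assuming the backend delivers partial text chunks (for example over SSE or fetch streaming), a consumer accumulates them and updates the UI on each chunk. The `fakeStream` source and `onUpdate` callback here are hypothetical stand-ins, not part of TinyRobot's API.

```javascript
// Simulated partial responses from an AI backend.
async function* fakeStream() {
  for (const piece of ['Hel', 'lo, ', 'world', '!']) {
    yield piece;
  }
}

// Accumulate streamed chunks; onUpdate lets the UI show the text so far,
// which is what a streaming chat bubble would re-render on each call.
async function consumeStream(stream, onUpdate) {
  let text = '';
  for await (const chunk of stream) {
    text += chunk;
    onUpdate(text);
  }
  return text;
}

consumeStream(fakeStream(), (partial) => console.log(partial)).then((full) => {
  console.log('final:', full); // final: Hello, world!
});
```

In a real app the update callback would assign the partial text to the reactive state bound to the bubble's content.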
TinyRobot is an AI component library built for Vue 3, following the OpenTiny Design system. It provides rich AI interaction components to help developers quickly build enterprise-level AI applications.
English | 简体中文
TinyRobot is a monorepo containing the following packages:
Core package — @opentiny/tiny-robot, the main component library.
# Using pnpm (recommended)
pnpm add @opentiny/tiny-robot
# Using npm
npm install @opentiny/tiny-robot
# Using yarn
yarn add @opentiny/tiny-robot
Optional packages:
@opentiny/tiny-robot-kit — Only needed if you use AI model request or data-processing features. Add it when required:
pnpm add @opentiny/tiny-robot-kit
@opentiny/tiny-robot-svgs — Optional. Install separately only if you need to use the SVG icon library standalone or with custom icons:
pnpm add @opentiny/tiny-robot-svgs
In your main.js or main.ts:
import { createApp } from 'vue'
import App from './App.vue'
import '@opentiny/tiny-robot/dist/style.css'
const app = createApp(App)
app.mount('#app')
<template>
<div class="chat-container">
<tr-bubble role="ai" content="Hello! I'm TinyRobot, an AI component library for Vue 3." placement="start" />
<tr-bubble role="user" content="That's great! How can I get started?" placement="end" />
</div>
</template>
<script setup>
import { TrBubble } from '@opentiny/tiny-robot'
</script>
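Each tr-bubble in the snippet above is driven by a role, a content string, and a placement. As a plain-data sketch of the message list behind such a view (the helper names below are illustrative, not TinyRobot's API), AI messages render at the start (left) and user messages at the end (right):

```javascript
// Illustrative message model: each entry maps onto one chat bubble.
function createMessageList() {
  const messages = [];
  return {
    addUser(content) {
      messages.push({ role: 'user', content, placement: 'end' });
    },
    addAi(content) {
      messages.push({ role: 'ai', content, placement: 'start' });
    },
    all() {
      return messages.slice(); // defensive copy for rendering
    },
  };
}

const list = createMessageList();
list.addAi("Hello! I'm TinyRobot.");
list.addUser('How can I get started?');
console.log(list.all().length); // 2
```

Rendering is then a `v-for` over this array, passing each entry's fields to a bubble component.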
tiny-robot/
├── packages/
│ ├── components/ # Core component library
│ │ ├── src/
│ │ │ ├── bubble/ # Chat bubble components
│ │ │ ├── sender/ # Message input component
│ │ │ ├── container/ # Container component
│ │ │ ├── history/ # Conversation history
│ │ │ ├── attachments/ # File attachments
│ │ │ └── ... # Other components
│ │ └── package.json
│ ├── kit/ # Utility functions and AI tools
│ │ ├── src/
│ │ │ ├── providers/ # AI provider implementations
│ │ │ ├── vue/ # Vue composables
│ │ │ │ ├── message/ # useMessage composable
│ │ │ │ └── conversation/ # useConversation composable
│ │ │ └── storage/ # Storage utilities
│ │ └── package.json
│ ├── svgs/ # SVG icon library
│ ├── playground/ # Development playground
│ └── test/ # Test suite
├── docs/ # Documentation site
│ ├── src/ # Documentation source
│ └── demos/ # Component demos
├── scripts/ # Build and utility scripts
└── package.json
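The kit's storage/ directory suggests utilities for persisting conversations. As a rough, hypothetical sketch of what saving and restoring a conversation could look like (the key scheme and function names below are assumptions, not the kit's actual API):

```javascript
// Hypothetical conversation persistence over a Storage-like backend
// (localStorage in the browser; a Map-backed shim works in Node).
function createConversationStore(storage) {
  const key = (id) => `tiny-robot:conversation:${id}`; // assumed key scheme
  return {
    save(id, messages) {
      storage.setItem(key(id), JSON.stringify(messages));
    },
    load(id) {
      const raw = storage.getItem(key(id));
      return raw ? JSON.parse(raw) : [];
    },
  };
}

// Minimal in-memory stand-in for localStorage.
const memoryStorage = {
  data: new Map(),
  setItem(k, v) { this.data.set(k, v); },
  getItem(k) { return this.data.has(k) ? this.data.get(k) : null; },
};

const store = createConversationStore(memoryStorage);
store.save('demo', [{ role: 'user', content: 'Hi' }]);
console.log(store.load('demo')[0].content); // Hi
```

Swapping `memoryStorage` for `window.localStorage` gives browser persistence with the same interface, which is the kind of flexibility the package's "flexible storage support" implies.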
# Install dependencies
pnpm install
# Start development server (playground + docs)
pnpm dev
Start Development Server:
Run pnpm dev in the project root directory. Edit component sources under packages/components/src/, and changes will be automatically reflected in the documentation page.
Documentation:
Documentation sources live in docs/src/, and component demos live in docs/demos/.
Testing:
Run pnpm test to execute the test suite.
License:
MIT License - see LICENSE file for details.
Contributions are welcome! Please read our Contributing Guide to understand the recommended workflow, commit message conventions, and how to submit Issues and Pull Requests.
Built with ❤️ by the OpenTiny team.
Note: This project is part of the OpenTiny ecosystem.