docs(readme): add AI Cloud introduction to README (#24635)

Add AI Cloud section to both English and Chinese README files,
including core capabilities (inference, app management, model library,
templates, GPU ops) and supported applications (Ollama, OpenClaw,
Dify, ComfyUI).
Commit 1079017a92 (parent 3ae9d58b91)
Author: Zexi Li
Date: 2026-04-08 14:59:20 +08:00
Committed by: GitHub
2 changed files with 31 additions and 1 deletion


@@ -9,7 +9,9 @@
<img src="https://v1.cloudpods.org/images/cloudpods_logo_green.png" alt="Cloudpods" height="100">
Cloudpods is an open-source, cloud-native multi-cloud/hybrid-cloud platform implemented in Golang, i.e. a "cloud on clouds". Cloudpods can manage not only local virtual machine and baremetal resources, but also multiple cloud platforms and cloud accounts. Cloudpods hides the differences in the data models and APIs of these heterogeneous infrastructure resources and exposes a single unified set of APIs, allowing users to access many clouds as if they were one, which greatly reduces the complexity of accessing multiple clouds and improves the efficiency of managing them.
Cloudpods also provides an **AI Cloud**, a unified management platform for large language model (LLM) inference and AI container applications, helping enterprises deploy, schedule, and operate AI workloads on a single platform, seamlessly integrated with the Cloudpods private-cloud/multi-cloud resource ecosystem.
## Who needs Cloudpods?
@@ -19,11 +21,24 @@ Cloudpods is an open-source, cloud-native multi-cloud/hybrid-cloud platform
* In hybrid cloud scenarios, those who need to access private and public clouds from a single interface
* Those who need a centralized portal to access multiple accounts spread across multiple public cloud platforms
* Those who currently use only a single public cloud account but plan to adopt multicloud in the future
* Those who need to deploy and manage LLM inference services and AI container applications with GPU scheduling support
## Features
See the [Introduction](https://www.cloudpods.org/docs/introduction/) page for details.
### AI Cloud
* **AI Inference Services**: Deploy and manage LLM inference instances, with support for GPU scheduling, model mounting, and inference service address allocation.
* **AI Application Management**: One-stop deployment of LLM application orchestration, agent assistants, image generation, and other AI container applications.
* **Model Library**: Unified management of model sources, versions, and caches; supports multi-instance reuse and offline distribution, avoiding repeated downloads.
* **Templates and Images**: Define resource specifications (CPU/memory/GPU) through templates and manage container runtime environments through images, for standardized delivery.
* **GPU Operations**: Automatic detection and registration of GPU devices, with unified configuration and management of NVIDIA/CUDA environments.
Supported AI applications:
* AI Inference: Ollama
* AI Applications: OpenClaw, Dify, ComfyUI
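The GPU scheduling mentioned above can be illustrated with a minimal sketch. This is not Cloudpods' actual scheduler code; the `Host` and `pickHost` names and the most-free-GPUs placement policy are illustrative assumptions only:

```go
package main

import "fmt"

// Host is a simplified view of a compute node's GPU inventory.
type Host struct {
	Name     string
	FreeGPUs int
}

// pickHost returns the host with the most free GPUs that can still
// satisfy the request, or "" if no host fits. A real scheduler also
// considers device models, NUMA topology, memory, etc.; this only
// shows the basic idea of GPU-aware placement.
func pickHost(hosts []Host, wantGPUs int) string {
	best := ""
	bestFree := -1
	for _, h := range hosts {
		if h.FreeGPUs >= wantGPUs && h.FreeGPUs > bestFree {
			best = h.Name
			bestFree = h.FreeGPUs
		}
	}
	return best
}

func main() {
	hosts := []Host{{"node1", 1}, {"node2", 4}, {"node3", 2}}
	fmt.Println(pickHost(hosts, 2)) // node2: most free GPUs among the fits
}
```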
### Supported cloud providers
* Public Clouds:


@@ -11,6 +11,8 @@
Cloudpods is a cloud-native open source unified multi/hybrid-cloud platform developed with Golang, i.e. Cloudpods is *a cloud on clouds*. Cloudpods is able to manage not only on-premise KVM/baremetals, but also resources from many cloud accounts across many cloud providers. It hides the differences of the underlying cloud providers and exposes one set of APIs that allows programmatically interacting with these many clouds.
Cloudpods also provides **AI Cloud**, a unified management platform for large language model (LLM) inference and AI container applications, helping enterprises deploy, schedule, and operate AI workloads on a single platform, seamlessly integrated with the Cloudpods private cloud / multi-cloud resource ecosystem.
## Who needs Cloudpods?
* Those who need a simple solution to virtualize a few physical servers into a private cloud
@@ -19,11 +21,24 @@ Cloudpods is a cloud-native open source unified multi/hybrid-cloud platform deve
* Those who need a cohesive view of both public and private cloud in a hybrid cloud setup
* Those who need a central portal to access multiple accounts from multiple public clouds
* Those who are currently using a single cloud account but want to keep open the possibility of adopting a multicloud strategy
* Those who need to deploy and manage LLM inference services and AI container applications with GPU support
## Features
See [Introduction](https://www.cloudpods.org/docs/introduction/) for details.
### AI Cloud
* **AI Inference Services**: Deploy and manage LLM inference instances with GPU scheduling, model mounting, and inference service address allocation.
* **AI Application Management**: One-stop deployment of LLM application orchestration, agent assistants, image generation, and other AI container applications.
* **Model Library**: Unified management of model sources, versions, and caches, supporting multi-instance reuse and offline distribution while avoiding repeated downloads.
* **Templates and Images**: Define resource specifications (CPU/memory/GPU) through templates and manage container runtime environments through images for standardized delivery.
* **GPU Operations**: Automatic GPU device detection and registration, unified NVIDIA/CUDA environment configuration and management.
Supported AI applications:
* AI Inference: Ollama
* AI Applications: OpenClaw, Dify, ComfyUI
### Supported cloud providers
* Public Clouds: