AI Server Evolution and Parallelism

Driven by changing scenario requirements, servers have evolved through four models: general-purpose servers, cloud servers, edge servers, and AI servers. AI servers adopt GPUs to strengthen their parallel computing capabilities and better support the needs of AI applications.
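
To make the parallelism point concrete, the short PyTorch sketch below (illustrative only, not part of DGPT's stack) runs the same matrix multiplication on the CPU and, if one is present, on a GPU, where thousands of cores execute the independent multiply-adds in parallel:

```python
import torch

# The same large matrix multiplication, first on the CPU, then on a GPU if available.
a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

cpu_result = a @ b  # executed on CPU cores

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    gpu_result = a_gpu @ b_gpu   # many CUDA cores work on the multiply-adds in parallel
    torch.cuda.synchronize()     # wait for the asynchronous GPU kernel to finish
    print(torch.allclose(cpu_result, gpu_result.cpu(), atol=1e-3))
```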

By application scenario, AI servers can be divided into two types: training and inference. Training places high demands on chip computing power, while, according to IDC, the share of demand coming from inference is expected to rise to 60.8% by 2025 as large models are widely deployed.
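
As a rough illustration of why the two workloads differ (a hypothetical PyTorch sketch, not DGPT code): a training step runs a forward pass, a backward pass, and a weight update, while an inference step is a single forward pass with gradient tracking disabled.

```python
import torch
import torch.nn as nn

model = nn.Linear(512, 10)
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(32, 512)         # a batch of 32 feature vectors
y = torch.randint(0, 10, (32,))  # their class labels

# Training step: forward + backward + weight update, which needs extra compute
# and memory for activations and gradients.
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

# Inference step: a single forward pass with gradients disabled,
# so far less compute and memory per request.
with torch.no_grad():
    predictions = model(x).argmax(dim=1)
```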

By chip type, AI servers come in CPU+GPU, CPU+FPGA, CPU+ASIC, and other combinations. In China, the CPU+GPU combination is currently the mainstream choice, accounting for 91.9% of the market.

The cost of an AI server comes mainly from chips such as CPUs and GPUs, which account for roughly 25% to 70% of the total. For training servers, CPUs and GPUs make up more than 80% of the cost.
