Explore

Find agent skills by outcome

4,234 skills indexed with the new KISS metadata standard.

Showing 24 of 4,234 · Category: General
General · Prompt · Beginner · 5 min · markdown

Tokenization

Qwen-7B applies BPE tokenization to UTF-8 bytes via the `tiktoken` package.
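The byte-level BPE idea can be sketched in plain Python. The merge table below is a made-up toy for illustration; the real tokenizer goes through `tiktoken` with its learned vocabulary of thousands of merges.

```python
# Minimal byte-level BPE sketch with a toy merge table -- NOT Qwen's real
# vocabulary; the actual tokenizer applies thousands of learned merges.

def bpe_encode(text: str, merges: dict) -> list:
    """Greedily apply merge rules (lowest rank first) to the UTF-8 bytes."""
    tokens = list(text.encode("utf-8"))  # start from raw UTF-8 byte values
    while True:
        # find the adjacent pair with the lowest (earliest-learned) merge rank
        best = None
        for i in range(len(tokens) - 1):
            pair = (tokens[i], tokens[i + 1])
            if pair in merges and (best is None or merges[pair] < merges[best]):
                best = pair
        if best is None:
            return tokens  # no applicable merges left
        merged_id = 256 + merges[best]  # new ids start above the 256 byte values
        out, i = [], 0
        while i < len(tokens):
            if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == best:
                out.append(merged_id)
                i += 2
            else:
                out.append(tokens[i])
                i += 1
        tokens = out

# toy merge table: rank 0 merges the bytes of "th", rank 1 merges "he"
merges = {(ord("t"), ord("h")): 0, (ord("h"), ord("e")): 1}
print(bpe_encode("the", merges))  # [256, 101]: "th" -> id 256, then "e" stays
```

Because merging is greedy by rank, "th" wins over "he" here even though both pairs appear in the input.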

General · Prompt · Beginner · 5 min · markdown

Tokenization

Qwen-7B applies BPE tokenization to UTF-8 bytes using the `tiktoken` package.

General · Prompt · Beginner · 5 min · markdown

Tokenization

> Note: the term "tokenization" has no agreed-upon Chinese equivalent, so this document keeps the English term for clarity.

General · Prompt · Beginner · 5 min · markdown

<p align="left">

<a href="README_CN.md">中文</a>&nbsp; | &nbsp;<a href="README.md">English</a>&nbsp; | &nbsp;<a href="README_JA.md">日本語</a> | &nbsp;<a href="README_FR.md">Français</a> | &nbsp;<a href="README_ES.md">Español</a>

</p>

General · Prompt · Beginner · 5 min · markdown

<p align="left">

<a href="README_CN.md">中文</a>&nbsp; | &nbsp;<a href="README.md">English</a>&nbsp; | &nbsp;<a href="README_JA.md">日本語</a> | &nbsp;<a href="README_FR.md">Français</a> | &nbsp;<a href="README_ES.md">Español</a>

</p>

General · Prompt · Beginner · 5 min · markdown

<p align="left">

<a href="README_CN.md">中文</a>&nbsp; | &nbsp;<a href="README.md">English</a>&nbsp; | &nbsp;<a href="README_JA.md">日本語</a> | &nbsp;<a href="README_FR.md">Français</a> | &nbsp;<a href="README_ES.md">Español</a>

</p>

General · Prompt · Beginner · 5 min · markdown

FAQ

Flash attention is an optional package for accelerating training and inference. Only NVIDIA GPUs of the Turing, Ampere, Ada, and Hopper architectures (e.g., H100, A100, RTX 3090, T4, RTX 2080) support flash attention. **You can use our models without installing it.**
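A common way to honor this in loading code is to probe for the flash-attn package and fall back when it is absent. The `attn_implementation` argument shown in the comment assumes a recent `transformers` release, so treat this as a sketch rather than the project's official loader.

```python
# Sketch: pick an attention backend based on whether flash-attn is installed.
import importlib.util

def flash_attn_available() -> bool:
    """Return True if the flash-attn package can be imported."""
    return importlib.util.find_spec("flash_attn") is not None

attn_impl = "flash_attention_2" if flash_attn_available() else "eager"

# With a recent transformers release this could then be passed as, e.g.:
# model = AutoModelForCausalLM.from_pretrained(
#     "Qwen/Qwen-7B", attn_implementation=attn_impl, trust_remote_code=True)
```

Probing with `importlib.util.find_spec` avoids actually importing flash-attn, which can fail loudly on machines without a supported GPU.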

General · Prompt · Beginner · 5 min · markdown

FAQ

Flash attention is an optional package for accelerating training and inference. Only NVIDIA GPUs of the Turing, Ampere, Ada, and Hopper architectures, such as the H100, A100, RTX 3090, T4, and RTX 2080, support flash attention. You can use our models without installing it.

General · Prompt · Beginner · 5 min · markdown

FAQ

Flash attention is an optional component for accelerating model training and inference, and it is only available for NVIDIA GPUs of the Turing, Ampere, Ada, and Hopper architectures (e.g., H100, A100, RTX 3090, T4, RTX 2080). You can run model inference normally without installing flash attention.

General · Prompt · Beginner · 5 min · markdown

<p align="left">

<a href="README_CN.md">中文</a>&nbsp; | &nbsp;<a href="README.md">English</a>&nbsp; | &nbsp;<a href="README_JA.md">日本語</a> | &nbsp;<a href="README_FR.md">Français</a> | &nbsp;<a href="README_ES.md">Español</a>

</p>

General · Prompt · Beginner · 5 min · markdown

__pycache__

*.so

General · Prompt · Beginner · 5 min · markdown

Project Specification

**/test

General · Prompt · Beginner · 5 min · markdown

<!-- markdownlint-disable first-line-h1 -->

<!-- markdownlint-disable html -->

General · Prompt · Beginner · 5 min · markdown

*.tmp

*.swp

General · Prompt · Beginner · 5 min · markdown

Welcome to the BELLE project! We appreciate your interest in contributing.

To make the contribution process as smooth as possible, we have established some

General · Prompt · Beginner · 5 min · markdown

<img src="assets/belle_logo.png" style="vertical-align: middle; width: 35px;"> BELLE: Be Everyone's Large Language model Engine

*Read this in [English](README_en.md).*

General · Prompt · Beginner · 5 min · markdown

<img src="assets/belle_logo.png" style="vertical-align: middle; width: 35px;"> BELLE: Be Everyone's Large Language model Engine

*[Chinese README](README.md).*

General · Prompt · Beginner · 5 min · markdown

theme: jekyll-theme-cayman

generic skill

General · Prompt · Beginner · 5 min · markdown

Byte-compiled / optimized / DLL files

__pycache__/

General · Prompt · Beginner · 5 min · markdown

Contributing

We are happy to accept your contributions to make this repo better and more awesome! To avoid unnecessary work on either

General · Prompt · Beginner · 5 min · markdown

[**🇨🇳中文**](https://github.com/shibing624/MedicalGPT/blob/main/README.md) | [**🌐English**](https://github.com/shibing624/MedicalGPT/blob/main/README_EN.md) | [**📖文档/Docs**](https://github.com/shibi

<div align="center">
