Lobe i18n is a CLI workflow tool that uses ChatGPT for automated i18n.
- 🤖 Utilize ChatGPT for automated i18n translation
- ✂️ Support automatic splitting of large files, so you don't have to worry about ChatGPT token limits
- ♻️ Support incremental i18n updates, automatically extracting new content from the entry files
- 🗂️ Support single-file mode `en_US.json` and folder mode `en_US/common.json`, working perfectly with `i18next`
- 🌲 Support `flat` and `tree` structures for locale files
- 🛠️ Support customizing OpenAI models, API proxies, temperature, and topP
- 📝 Support automated i18n translation of Markdown files
To install Lobe i18n, run the following command:
```bash
npm install -g @lobehub/i18n-cli
```
> **Important:** Please make sure your Node.js version is >= 18.
To initialize the Lobe i18n configuration, run the following command:
```bash
$ lobe-i18n -o # or use the full flag --option
```
> **Important:** To use AI auto-generation, you need to fill in the OpenAI token in the settings.
```bash
# Translate locale files
$ lobe-i18n # or $ lobe-i18n locale

# Translate Markdown files
$ lobe-i18n md

# Run locale and Markdown translation simultaneously
$ lobe-i18n --with-md

# Specify a configuration file
$ lobe-i18n -c './custom-config.js' # or use the full flag --config
```
You can use any configuration method supported by the cosmiconfig format:

- an `i18n` property in `package.json`
- a `.i18nrc` file in JSON or YAML format
- a `.i18nrc.json`, `.i18nrc.yaml`, `.i18nrc.yml`, `.i18nrc.js`, or `.i18nrc.cjs` file

> **Tip:** This project provides a `defineConfig` helper for type-safe configuration, which can be imported from `@lobehub/i18n-cli`.
This project provides some additional configuration items set with environment variables:
| Environment Variable | Required | Description | Example |
| --- | --- | --- | --- |
| `OPENAI_API_KEY` | Yes | The API key you apply for on the OpenAI account page | `sk-xxxxxx...xxxxxx` |
| `OPENAI_PROXY_URL` | No | If you manually configure an OpenAI API proxy, this overrides the default OpenAI API request base URL | `https://api.chatanywhere.cn/v1`, default: `https://api.openai.com/v1` |
| Property Name | Required | Type | Default Value | Description |
| --- | --- | --- | --- | --- |
| `entry` | `*` | `string` | - | Entry file or folder |
| `entryLocale` | `*` | `string` | - | Language to use as the translation reference |
| `modelName` | | `string` | `gpt-3.5-turbo` | Model to use |
| `output` | `*` | `string` | - | Location to store the localized files |
| `outputLocales` | `*` | `string[]` | `[]` | All the languages to be translated |
| `reference` | | `string` | - | Provide some context for more accurate translations |
| `splitToken` | | `number` | - | Split the localized JSON file by tokens; calculated automatically by default |
| `temperature` | | `number` | `0` | Sampling temperature to use |
| `topP` | | `number` | `1` | Nucleus sampling threshold, controls the diversity of the generated text |
| `concurrency` | | `number` | `5` | Number of concurrently pending promises |
| `experimental` | | `experimental` | `{}` | Experimental features, see below |
| `markdown` | | `markdown` | `{}` | See the markdown configuration below |
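As a sketch of how the optional tuning options combine with the required fields, a `.i18nrc.js` might look like the following (the values shown are illustrative, not recommendations):

```js
const { defineConfig } = require('@lobehub/i18n-cli');

module.exports = defineConfig({
  // Required fields
  entry: 'locales/en_US.json',
  entryLocale: 'en_US',
  output: 'locales',
  outputLocales: ['zh_CN', 'ja_JP'],
  // Optional tuning (illustrative values)
  modelName: 'gpt-3.5-turbo',
  temperature: 0,
  topP: 1,
  concurrency: 5,
  reference: 'Keep product names and technical terms in English.',
});
```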
Experimental features are configured under the `experimental` key:

| Property Name | Required | Type | Default Value | Description |
| --- | --- | --- | --- | --- |
| `jsonMode` | | `boolean` | `false` | Force the model to output JSON for better stability (only supported by newer models released after November 2023) |
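If you want to try forced JSON output, it is enabled under the `experimental` key; a minimal sketch, assuming your chosen model supports JSON mode:

```js
const { defineConfig } = require('@lobehub/i18n-cli');

module.exports = defineConfig({
  entry: 'locales/en_US.json',
  entryLocale: 'en_US',
  output: 'locales',
  outputLocales: ['zh_CN', 'ja_JP'],
  experimental: {
    // Force JSON output; only newer models (released after November 2023) support this
    jsonMode: true,
  },
});
```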
A minimal `.i18nrc.js` with only the required fields:

```js
const { defineConfig } = require('@lobehub/i18n-cli');

module.exports = defineConfig({
  entry: 'locales/en_US.json',
  entryLocale: 'en_US',
  output: 'locales',
  outputLocales: ['zh_CN', 'ja_JP'],
});
```

The same configuration as `.i18nrc.json`:

```json
{
  "entry": "locales/en_US.json",
  "entryLocale": "en_US",
  "output": "locales",
  "outputLocales": ["zh_CN", "ja_JP"]
}
```

Or as an `i18n` property in `package.json`:

```json
{
  "...": "...",
  "i18n": {
    "entry": "locales/en_US.json",
    "entryLocale": "en_US",
    "output": "locales",
    "outputLocales": ["zh_CN", "ja_JP"]
  }
}
```
Two types of file structure are supported: `flat` and `tree`.
A flat structure means that all translations for different languages are stored in a single file, as shown below:
- locales
  - en_US.json
  - ja_JP.json
  - zh_CN.json
  - ...
> **Tip:** The `flat` structure requires the `entry` property in the configuration file to point to the corresponding JSON file.

Example:

```json
{
  "entry": "locales/en.json",
  "entryLocale": "en_US",
  "output": "locales",
  "outputLocales": ["zh_CN", "ja_JP"]
}
```
A tree structure means that the translations for each language are stored in separate language folders, as shown below:
- locales
  - en_US
    - common.json
    - header.json
    - subfolder
      - ...
  - ja_JP
    - common.json
    - header.json
    - subfolder
      - ...
  - zh_CN
    - common.json
    - header.json
    - subfolder
      - ...
> **Tip:** The `tree` structure requires the `entry` property in the configuration file to point to the corresponding folder.

Example:

```json
{
  "entry": "locales/en_US",
  "entryLocale": "en_US",
  "output": "locales",
  "outputLocales": ["zh_CN", "ja_JP"]
}
```
Use the `lobe-i18n` command to generate i18n files automatically:

```bash
$ lobe-i18n
```
The `markdown` configuration accepts the following properties:

| Property Name | Required | Type | Default | Description |
| --- | --- | --- | --- | --- |
| `entry` | `*` | `string[]` | `[]` | Entry files or folders, supports glob patterns |
| `entryLocale` | | `string` | Inherits the parent `entryLocale` | Reference language for translation |
| `entryExtension` | | `string` | `.md` | Entry file extension |
| `exclude` | | `string[]` | `[]` | Files to be filtered out, supports glob patterns |
| `outputLocales` | | `string[]` | Inherits the parent `outputLocales` | All languages to be translated |
| `outputExtensions` | | `function` | `(locale) => '.{locale}.md'` | Generates the output file extension |
| `mode` | | `string`, `mdast`, `function` | `string` | Translation mode selection, explained below |
| `translateCode` | | `boolean` | `false` | Whether to translate code blocks; only effective in `mdast` mode |
| `includeMatter` | | `boolean` | `false` | Whether to include front matter in the translation |
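As a sketch, a `markdown` block that translates Chinese docs while skipping some files might look like this (the globs and the `includeMatter` choice are illustrative, not part of any recommended setup):

```js
const { defineConfig } = require('@lobehub/i18n-cli');

module.exports = defineConfig({
  markdown: {
    entry: ['./docs/**/*.zh-CN.md'],
    entryLocale: 'zh-CN',
    entryExtension: '.zh-CN.md',
    // Illustrative glob: skip auto-generated changelog files
    exclude: ['./docs/changelog/**'],
    outputLocales: ['en-US', 'ja-JP'],
    // Include front matter in the translation (defaults to false)
    includeMatter: true,
  },
});
```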
By default, the translated file names are generated as `.{locale}.md`. You can customize the output file extensions with `outputExtensions`.

> **Note:** In the example below, the entry file extension is `.zh-CN.md`, but we want the output file extension for the `en-US` translation to be `.md`, while the other languages keep the default extension.
```js
module.exports = {
  markdown: {
    entry: ['./README.zh-CN.md', './docs/**/*.zh-CN.md'],
    entryLocale: 'zh-CN',
    entryExtension: '.zh-CN.md',
    outputLocales: ['en-US', 'ja-JP'],
    outputExtensions: (locale, { getDefaultExtension }) => {
      if (locale === 'en-US') return '.md';
      return getDefaultExtension(locale);
    },
  },
};
```
`outputExtensions` supports the following props:
```ts
interface OutputExtensionsProps {
  /**
   * @description The locale of the translated file to output
   */
  locale: string;
  config: {
    /**
     * @description The content of the translated file to input
     */
    fileContent: string;
    /**
     * @description The path of the translated file to input
     */
    filePath: string;
    /**
     * @description The default method for generating extensions
     */
    getDefaultExtension: (locale: string) => string;
  };
}
```
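Since the second argument also exposes `filePath` and `fileContent`, the extension can be varied per file; a hypothetical sketch (the `blog` directory rule is made up for illustration):

```js
module.exports = {
  markdown: {
    entry: ['./blog/**/*.zh-CN.md', './docs/**/*.zh-CN.md'],
    entryLocale: 'zh-CN',
    entryExtension: '.zh-CN.md',
    outputLocales: ['en-US'],
    outputExtensions: (locale, { filePath, getDefaultExtension }) => {
      // Hypothetical rule: files under the blog folder get a shorter suffix
      if (locale === 'en-US' && filePath.includes('/blog/')) return '.en.md';
      return getDefaultExtension(locale);
    },
  },
};
```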
`mode` is used to specify the translation mode, which supports two built-in modes as well as custom generation modes:

- `string` - Translates the complete Markdown content.
- `mdast` - Parses the text into an `mdast` structure and translates only the text values. To translate code blocks as well, you need to enable `translateCode`.
> **Warning:** In `mdast` mode, the content to be translated is reduced to a minimum, removing most Markdown syntax structures and links. This mode can greatly reduce token consumption, but it may result in less accurate translations.
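If the token savings are worth that trade-off for your content, `mdast` mode can be enabled like this (a minimal sketch; the entry glob is illustrative):

```js
const { defineConfig } = require('@lobehub/i18n-cli');

module.exports = defineConfig({
  markdown: {
    entry: ['./docs/**/*.zh-CN.md'],
    entryLocale: 'zh-CN',
    entryExtension: '.zh-CN.md',
    outputLocales: ['en-US', 'ja-JP'],
    // Translate only the text nodes of the parsed Markdown tree
    mode: 'mdast',
    // Code blocks are left untouched unless this is enabled
    translateCode: false,
  },
});
```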
The translated files will be generated in the same directory as the entry file, with the corresponding language suffix added to the extension:
- README.md
- README.zh-CN.md
- docs
  - usage.md
  - usage.zh-CN.md
  - subfolder
    - ...
> **Tip:** Use the `lobe-i18n md` command to automate the generation of i18n files:

```bash
$ lobe-i18n md
```
You can use GitHub Codespaces for online development.
Alternatively, you can clone the repository and run the following command for local development:
```bash
$ git clone https://github.com/lobehub/lobe-cli-toolbox.git
$ cd lobe-cli-toolbox
$ bun install
$ cd packages/lobe-i18n
$ bun dev
```
We welcome contributions in all forms. If you're interested in contributing code, you can check our GitHub Issues, show off your skills, and demonstrate your ideas.
- 🤖 Lobe Chat - An open-source, extensible (Function Calling), high-performance chatbot framework. It supports one-click free deployment of your private ChatGPT/LLM web application.
- 🤯 Lobe Theme - The modern theme for stable diffusion webui, exquisite interface design, highly customizable UI, and efficiency boosting features.
- langchainjs - https://github.com/hwchase17/langchainjs
- ink - https://github.com/vadimdemedes/ink
- transmart - https://github.com/Quilljou/transmart
Copyright © 2023 LobeHub.
This project is licensed under the MIT license.