This utility allows you to quickly and easily process text using any of OpenAI's chat models.
You can configure and save presets for easy reuse of prompts, and the utility will chunk long input text into multiple requests.
The utility is highly configurable, so you can dial in the settings that work best for you.
Happy text processing!
Click here to open the utility.
- 📖 Process Text With AI
- Quickly and easily process text using any of OpenAI's chat models.
- 📒 Presets
- Speed up your workflow by saving presets that let you reuse your prompts. Includes support for prompt variables.
- ✂️ Chunking
- Automatically split long inputs into chunks that leave token headroom for your desired output.
- ⚙️ Configurable
- Lots of control! Configure AI models, prompts, token length, and chunking logic.
- 🧑‍💻 Supports Many Use Cases
- Supports summarization, analysis, data extraction, translation, formatting—the possibilities are endless!
If this project helped you, please consider buying me a coffee or sponsoring me. Your support is much appreciated!
- Overview
- Donate
- Table of Contents
- Quick Start
- Use Cases
- npm Package
- TypeScript
- Icon Attribution
- Contributing
- ⭐ Found It Helpful? Star It!
- License
This utility is a static webapp hosted on GitHub Pages.
Click here to open the utility.
This utility supports a variety of use cases, including (but certainly not limited to) the following:
- Summarization - Summarize large amounts of text.
- Analysis - Analyze text using specified criteria. Possibilities are endless, from sentiment analysis to finding bugs in software.
- Data Extraction - Extract key data from large amounts of text.
- Language Translation - Translate text from one language to another.
- Text Formatting - Format output text using specified criteria.
One of the main features of this utility is the ability to save and reuse presets. This can significantly speed up your workflow.
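As an illustration, a preset with prompt variables could be modeled roughly like the sketch below. The `Preset` shape, the `renderPreset` helper, and the `{{variable}}` placeholder syntax are assumptions for demonstration, not the utility's actual implementation.

```typescript
// Hypothetical sketch of a reusable preset with prompt variables.
// The {{variable}} syntax and renderPreset helper are illustrative
// assumptions, not this utility's actual internals.
interface Preset {
  systemPrompt: string;
  userPrompt: string; // may contain {{variable}} placeholders
}

function renderPreset(template: string, vars: Record<string, string>): string {
  // Replace each {{name}} with its value; leave unknown placeholders intact.
  return template.replace(/\{\{(\w+)\}\}/g, (match, name) => vars[name] ?? match);
}

const summarize: Preset = {
  systemPrompt: "You are a concise technical summarizer.",
  userPrompt: "Summarize the following {{docType}} in {{sentences}} sentences:\n\n{{text}}",
};

const prompt = renderPreset(summarize.userPrompt, {
  docType: "meeting transcript",
  sentences: "3",
  text: "Alice opened the meeting...",
});
```

Saving the filled-in prompt as a preset means you only write the instructions once and swap in new variable values per run.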
This package is available on npm, should you want to use its text processing utilities in your own app.
```sh
npm i ai-text-processor
```

```ts
import { TextUtils } from 'ai-text-processor';
```
Utility Functions:
- `TextUtils.shrinkText` - Condense whitespace and remove timestamps (#:#)
- `TextUtils.getEstimatedTokenCount` - Estimate the number of tokens in text
- `TextUtils.getChunks` - Split text into chunks based on token limits
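To show what token-aware chunking looks like conceptually, here is a minimal sketch. The ~4 characters-per-token heuristic and the paragraph-based splitting are assumptions for illustration; they are not the package's actual algorithm, so use the real `TextUtils` functions in practice.

```typescript
// Rough sketch of token-estimate-based chunking, illustrating the idea
// behind getEstimatedTokenCount/getChunks. The 4-chars-per-token rule of
// thumb and paragraph splitting are assumptions, not the package's logic.
function estimateTokens(text: string): number {
  // Common heuristic: English text averages roughly 4 characters per token.
  return Math.ceil(text.length / 4);
}

function getChunks(text: string, maxTokensPerChunk: number): string[] {
  const paragraphs = text.split(/\n\n+/);
  const chunks: string[] = [];
  let current = "";
  for (const p of paragraphs) {
    const candidate = current ? current + "\n\n" + p : p;
    if (estimateTokens(candidate) > maxTokensPerChunk && current) {
      // Adding this paragraph would exceed the budget; close out the chunk.
      chunks.push(current);
      current = p;
    } else {
      current = candidate;
    }
  }
  if (current) chunks.push(current);
  return chunks;
}
```

Keeping each chunk under a token budget leaves headroom in the model's context window for the prompt and the desired output length.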
Type definitions have been included for TypeScript support.
Favicon by Twemoji.
Open source software is awesome and so are you. 😎
Feel free to submit a pull request for bugs or additions, and make sure to update tests as appropriate. If you find a mistake in the docs, send a PR! Even the smallest changes help.
For major changes, open an issue first to discuss what you'd like to change.
⭐ Found It Helpful? Star It!
If you found this project helpful, let the community know by giving it a star: 👉⭐
See LICENSE.md.