# typechat-go

typechat-go is a library that makes it easy to build natural language interfaces using types.

This project is a Go implementation of the original [TypeChat](https://github.com/microsoft/TypeChat) project by Microsoft.

Visit https://microsoft.github.io/TypeChat for more information on what it enables you to do.

This implementation loosely follows version 0.10 of TypeChat, with slightly different ergonomics that are more appropriate for Go.

One of the key differences is that this library is less opinionated about how you communicate with the LLM: as long as you provide a valid client, you can use it with any model.
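
The examples below only assume a client with a single `Do` method. As a rough sketch of that contract (the interface name `Model` is illustrative, not part of typechat-go):

```go
// A minimal sketch of the client contract used in the examples below.
// The interface name "Model" is illustrative; any value with a Do method
// of this shape should work.
type Model interface {
	Do(ctx context.Context, prompt string) (response []byte, err error)
}
```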

## How to use

### Prompt + Return Type

This functionality allows you to pass in a natural language prompt along with the result type you want the LLM to use when replying. For example:
```go
type Classifier struct {
	Sentiment string
}

ctx := context.Background()

// Provide a model client that implements the required interface,
// i.e. Do(ctx context.Context, prompt string) (response []byte, err error).
// This client can call OpenAI, Azure or any other LLM. You control the transport.
model := ...

prompt := typechat.NewPrompt[Classifier](model, "Today is a good day!")
result, err := prompt.Execute(ctx)
if err != nil {
	...
}

fmt.Println(result.Sentiment) // provided by the LLM
```
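
For illustration, a model client might look something like this rough sketch, which satisfies the `Do` contract by calling OpenAI's chat completions endpoint over plain HTTP. The `OpenAIClient` name and the request/response types are assumptions made for the example, not part of typechat-go:

```go
// OpenAIClient is a hypothetical client that satisfies the Do contract above
// by calling OpenAI's chat completions API directly. Any transport works, as
// long as Do returns the model's reply as raw bytes.
// Uses bytes, context, encoding/json, errors and net/http from the standard library.
type OpenAIClient struct {
	APIKey string
	Model  string
}

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

type chatResponse struct {
	Choices []struct {
		Message chatMessage `json:"message"`
	} `json:"choices"`
}

func (c *OpenAIClient) Do(ctx context.Context, prompt string) ([]byte, error) {
	body, err := json.Marshal(chatRequest{
		Model:    c.Model,
		Messages: []chatMessage{{Role: "user", Content: prompt}},
	})
	if err != nil {
		return nil, err
	}

	req, err := http.NewRequestWithContext(ctx, http.MethodPost,
		"https://api.openai.com/v1/chat/completions", bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "Bearer "+c.APIKey)
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	var parsed chatResponse
	if err := json.NewDecoder(resp.Body).Decode(&parsed); err != nil {
		return nil, err
	}
	if len(parsed.Choices) == 0 {
		return nil, errors.New("no completion choices returned")
	}
	return []byte(parsed.Choices[0].Message.Content), nil
}
```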

You'll notice that `typechat.NewPrompt` uses generics, so the result you get from the LLM is fully typed and can be used by the rest of your application.
### Prompt + Program

This functionality allows you to pass in a natural language prompt along with an interface describing the behavior your application supports. The library will have the LLM generate the sequence of steps it deems necessary to accomplish the given task.
```go
type API interface {
	CreateTweet(message string)
	CreateLinkedInMessage(message string)
}

ctx := context.Background()

// Provide a model client that implements the required interface,
// i.e. Do(ctx context.Context, prompt string) (response []byte, err error).
// This client can call OpenAI, Azure or any other LLM. You control the transport.
model := ...

prompt := typechat.NewPrompt[API](model, "I really need to tweet and post on my LinkedIn that I've been promoted!")
program, err := prompt.CreateProgram(ctx)
if err != nil {
	...
}

// program contains the invocations your application has to make with the
// provided API to accomplish the task as identified by the LLM.
program.Steps[0].Name == "CreateTweet"
program.Steps[0].Args == []any{"I have been promoted!"}

program.Steps[1].Name == "CreateLinkedInMessage"
program.Steps[1].Args == []any{"I have been promoted!"}

// You can build a program executor on top of this structure.
```
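
A minimal executor might look like this sketch, assuming the `Steps`/`Name`/`Args` shape shown above. `SocialAPI` is a hypothetical implementation of the `API` interface used only to show the dispatch:

```go
// SocialAPI is a hypothetical implementation of the API interface above.
type SocialAPI struct{}

func (SocialAPI) CreateTweet(message string)           { fmt.Println("tweet:", message) }
func (SocialAPI) CreateLinkedInMessage(message string) { fmt.Println("linkedin:", message) }

api := SocialAPI{}

// Dispatch each step planned by the LLM to the concrete implementation.
for _, step := range program.Steps {
	switch step.Name {
	case "CreateTweet":
		api.CreateTweet(step.Args[0].(string))
	case "CreateLinkedInMessage":
		api.CreateLinkedInMessage(step.Args[0].(string))
	default:
		log.Printf("unknown step: %s", step.Name)
	}
}
```

A production executor would also validate step names and argument types before calling into your application.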

## Contributing

This library is under development and still requires more work to solidify the provided APIs, so use it with caution. A release will be made at some point in the near future.
### TODOs

- More tests
- Provide an adapter package with out-of-the-box clients for OpenAI and Azure
- Other use cases such as conversations
- Figure out the best way to stay in sync with the original TypeChat project
