Outpainting is essentially inpainting within so-called 'generation frames'. This repo demonstrates how you can leverage the Dall-E API for outpainting.
Our starting image may look like this:
We want to extend 256px downward, so we shift the image upward and introduce 256px of 'filler' pixels (these will be replaced anyway). By passing the resulting image to the Dall-E API both as a source and a mask, we get the following result back:
We do the same for the top part, and stitch the results together using a canvas element.
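In code, the canvas work might look roughly like the sketch below (a simplified illustration rather than the repo's exact implementation; `buildGenerationFrame` and `stitchResult` are illustrative names):

```js
const EXTEND_PX = 256; // how far we want to outpaint downward

// Build the 'generation frame': the original image shifted up, with a
// transparent 256px strip at the bottom that Dall-E will fill in.
function buildGenerationFrame(image) {
  const canvas = document.createElement('canvas');
  canvas.width = image.width;
  canvas.height = image.height;
  const ctx = canvas.getContext('2d');
  // Drawing at a negative y-offset shifts the image upward; the bottom
  // EXTEND_PX rows stay fully transparent (the 'filler' pixels).
  ctx.drawImage(image, 0, -EXTEND_PX);
  return canvas; // export with canvas.toBlob(..., 'image/png') before sending it off
}

// Stitch the generated frame back under the original image.
function stitchResult(original, generated) {
  const canvas = document.createElement('canvas');
  canvas.width = original.width;
  canvas.height = original.height + EXTEND_PX;
  const ctx = canvas.getContext('2d');
  // The generated frame is the shifted version of the original, so drawing it
  // EXTEND_PX lower lines its content back up with the original image.
  ctx.drawImage(generated, 0, EXTEND_PX);
  // Draw the untouched original on top so its pixels stay pristine.
  ctx.drawImage(original, 0, 0);
  return canvas;
}
```

Note that the image edits endpoint expects square PNGs (256×256, 512×512 or 1024×1024, under 4MB), so the generation frame has to match one of those sizes.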
Clone the repo:
```bash
git clone https://github.com/SabatinoMasala/dalle-api-outpainting-sample.git
```
The sample has 2 parts:
- Server
- Client app
The server will handle the communication with the Dall-E API. Start by generating an API key on the OpenAI platform.
Copy `./server/.env.example` to `./server/.env` and fill in your API key there.
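The file only needs to hold your key. Assuming the variable is named `OPENAI_API_KEY` (check `.env.example` for the exact name the project uses), it would look like this:

```bash
# ./server/.env — variable name assumed; see .env.example for the real one
OPENAI_API_KEY=sk-your-key-here
```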
In the `server` directory, run `yarn install` and then `node index.js`. This will start the server.
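For reference, a stripped-down server doing this job might look like the sketch below. Assumptions on my part: an Express app, Node 18+ (for the built-in `fetch`, `FormData` and `Blob`), a `/outpaint` route that receives the frame as a base64 PNG, and `OPENAI_API_KEY` as the variable name; the repo's `index.js` may be structured differently.

```js
require('dotenv').config();
const express = require('express');

const app = express();
app.use(express.json({ limit: '10mb' })); // the PNG arrives as a base64 string

app.post('/outpaint', async (req, res) => {
  const { image, prompt } = req.body; // base64-encoded PNG with transparent filler
  const buffer = Buffer.from(image, 'base64');
  const png = new Blob([buffer], { type: 'image/png' });

  // The same PNG is sent as both source and mask: its transparent pixels mark
  // the area Dall-E should generate.
  const form = new FormData();
  form.append('image', png, 'image.png');
  form.append('mask', png, 'mask.png');
  form.append('prompt', prompt);
  form.append('n', '1');
  form.append('size', '1024x1024');

  const response = await fetch('https://api.openai.com/v1/images/edits', {
    method: 'POST',
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
    body: form,
  });
  res.json(await response.json());
});

app.listen(3000, () => console.log('Server listening on http://localhost:3000'));
```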
The client app is a simple Vue application. Go into the `client` directory, run `yarn install` and then `yarn dev`. Open the URL in your browser and you'll be presented with the sample application.