This plugin is based on the B-Tocs ABAP SDK and enables an SAP ABAP system to use Large Language Models provided by an Ollama service. Most Large Language Models available for Ollama can be tested directly from the ABAP server. For more information about the models and their licenses, see the Ollama library or the referenced Hugging Face model page.
The plugin is free to use at your own risk (MIT License).
flowchart LR
  subgraph sap["SAP ABAP System"]
    sap_bf["SAP Business Functions"]
    subgraph sdk["B-Tocs SDK"]
      sdkcore["B-Tocs SDK"]
      plugin["Plugin OLLAMA"]
      sdkcore-->plugin
    end
    sap_bf-->sdkcore
  end
  subgraph cloud-native-world["Cloud Native World"]
    subgraph onpremise["Data Center On-Prem"]
      service1
    end
    subgraph datacenter["Data Center"]
      service2
    end
    subgraph hyperscaler["HyperScaler"]
      service3
    end
    subgraph sapbtp["SAP BTP"]
      service4
    end
    subgraph saas["SaaS"]
      service5
    end
  end
  plugin-->service1
  plugin-->service2
  plugin-->service3
  plugin-->service4
  plugin-->service5
- Configure the Ollama Docker container
- Load the models you need, e.g. "llama2", "mistral", "mixtral", ...
- Test the models from the command line within the container
- Check that the web service port 11434 is reachable (a quick check from ABAP is sketched below)
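To verify from the ABAP side that the service answers, a plain HTTP check against /api/tags can look like the following sketch. This is only an illustration and not part of the plugin: it assumes direct HTTP access from the application server and uses a placeholder host name; the plugin itself talks to the service through the SM59 destination configured below.

" Minimal sketch (not part of the plugin): check that the Ollama
" web service answers on port 11434. The host name is a placeholder.
DATA lo_client TYPE REF TO if_http_client.

cl_http_client=>create_by_url(
  EXPORTING
    url    = 'http://my-ollama-host:11434/api/tags'   " assumed host, default Ollama port
  IMPORTING
    client = lo_client
  EXCEPTIONS
    OTHERS = 1 ).
IF sy-subrc = 0.
  lo_client->send( EXCEPTIONS OTHERS = 1 ).
  IF sy-subrc = 0.
    lo_client->receive( EXCEPTIONS OTHERS = 1 ).
  ENDIF.
  IF sy-subrc = 0.
    " Show the raw JSON answer (list of installed models)
    cl_demo_output=>display( lo_client->response->get_cdata( ) ).
  ENDIF.
  lo_client->close( ).
ENDIF.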
- An installed B-Tocs ABAP SDK is required. Check for updates.
- Install this plugin with abapGit.
- Use package ZBTOCS_OLLAMA or $BTOCS_OLLAMA (local objects only)
- Configure an SM59 RFC destination pointing to your Ollama service
- Test the connection
- Execute program ZBTOCS_OLLAMA_GUI_RWS_DEMO for a demo
- Try model 'llama2' with a simple prompt
- Try the system prompt feature
- Try other models with the same prompts
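The core of the demo report is a single connector call, shown below with a German prompt (see the comments for a translation):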
" Create the connector and point it to the configured RFC destination/profile
DATA lo_response TYPE REF TO zif_btocs_rws_response.
DATA(ls_result)    = VALUE zbtocs_ollama_s_generate_res( ).
DATA(lo_connector) = zcl_btocs_ollama_connector=>create( ).

IF lo_connector->set_endpoint(
     iv_rfc     = p_rfc        " SM59 RFC destination
     iv_profile = p_prf        " endpoint profile
   ) EQ abap_true.

  " Call /api/generate with prompt and system prompt
  " (prompt: "How did SAP become a world market leader for ERP systems?",
  "  system prompt: "Answer in German and in max. 5 bullet points.")
  lo_response = lo_connector->api_generate(
    EXPORTING
      is_params = VALUE zbtocs_ollama_s_generate_par(
                    model      = 'llama2'
                    role       = 'user'
                    prompt     = 'Wie ist SAP zu einem Weltmarktführer für ERP Systeme geworden?'
                    sys_prompt = 'Antworte in Deutsch und in max. 5 Stichpunkten.'
                    template   = ||
                    context    = ||
                  )
      iv_parse  = abap_true
    IMPORTING
      es_result = ls_result
  ).

  " Display the parsed model response
  IF ls_result IS NOT INITIAL.
    cl_demo_output=>write_text( text = |Response: { ls_result-response }| ).
    cl_demo_output=>display( ).
  ENDIF.
ENDIF.

lo_connector->destroy( ).
Check report ZBTOCS_OLLAMA_GUI_RWS_DEMO for more.
The Ollama API is documented here. Not all of its endpoints are implemented in ABAP yet.
| Program | Description | API endpoint |
|---|---|---|
| ZBTOCS_OLLAMA_GUI_RWS_DEMO | Default usage: text generation | /api/generate |
| ZBTOCS_OLLAMA_GUI_API_TAGS | List installed models | /api/tags |
| ZBTOCS_OLLAMA_GUI_API_SHOW | Show model information | /api/show |
| ZBTOCS_OLLAMA_GUI_API_EMBED | Generate embeddings | /api/embeddings |
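The other endpoints follow the same connector pattern. As a rough, hypothetical sketch (the method name API_TAGS and the result handling are assumed by analogy to API_GENERATE and may differ), listing the installed models could look like this; check report ZBTOCS_OLLAMA_GUI_API_TAGS for the shipped implementation:

" Hypothetical sketch - API_TAGS is assumed by analogy to API_GENERATE;
" see report ZBTOCS_OLLAMA_GUI_API_TAGS for the actual call.
DATA(lo_connector) = zcl_btocs_ollama_connector=>create( ).
IF lo_connector->set_endpoint( iv_rfc = p_rfc iv_profile = p_prf ) EQ abap_true.
  DATA(lo_response) = lo_connector->api_tags( ).   " assumed method name
  " inspect lo_response (zif_btocs_rws_response) for the returned model list
ENDIF.
lo_connector->destroy( ).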
- 0.2.0 2024/02/20
  - API endpoint /api/tags
  - API endpoint /api/show
  - API endpoint /api/embeddings
  - Added support for images in /api/generate
  - Additional information in the result popup for /api/generate
  - Connector refactoring: parse* methods
- 0.1.0 2024/01/22
  - Initial release
  - Connector
  - API endpoint /api/generate
- Some Ollama API features are not implemented yet
- The performance of the web service depends on your AI backend; in a local environment, response times can be poor
- There is currently no data center experience regarding optimal conditions for the container and parallel requests
- There are known issues with backward compatibility; fixes are planned
This plugin is free to use at your own risk (MIT License).
It is free for you as well, but sharing ideas and code is expected: contribute code, documentation, ideas, ... to the project via pull requests, blog posts, ...
Support for individual issues is not available; ask an experienced developer, a partner, or the community.
Fork this repository and contribute through pull requests. More extensive participation in this repository may follow after positive experiences.