@@ -32,6 +32,8 @@ What would you like to do?

🔄 [**Build workflows**](#workflows). Easily integrate calls to an LLM into larger workflows via Ansible Playbooks (alpha)

+ 𝑓(𝑥) [**Execute functions**](#functions). Support for [OpenAI functions](https://platform.openai.com/docs/guides/gpt/function-calling) (alpha)
+
🐳 **Docker image**. The ChatGPT Wrapper is also available as a docker image. (experimental)

:test_tube: **Flask API**. You can use the ChatGPT Wrapper as an API. (experimental)
@@ -510,6 +512,101 @@ It is also possible to execute workflows directly with `ansible-playbook`, by si
ansible-playbook playbooks/hello-world.yaml
```

+ ### Functions
+
+ **NOTE: Alpha, subject to change**
+
+ The wrapper supports [OpenAI functions](https://platform.openai.com/docs/guides/gpt/function-calling) for all models that support them.
+
+ Multiple functions may be attached, and the LLM can choose to call any or all of the attached functions.
+
+ The example configuration below assumes you want to add a new function called `test_function`.
+
+ #### Creating functions
+
+ Functions are created as callable Python classes that inherit from the base `Function` class.
+
+ The class name must be the camel-cased version of the snake-cased function name, so `test_function` becomes `TestFunction`.
+
+ There is one required method to implement, `__call__`, and its return value must be a dictionary -- this is what will be
+ returned to the LLM as the result of the function call.
+
+ ```python
+ from lwe.core.function import Function
+
+ class TestFunction(Function):
+     def __call__(self, word: str, repeats: int, enclose_with: str = '') -> dict:
+         """
+         Repeat the provided word a number of times.
+
+         :param word: The word to repeat.
+         :type word: str
+         :param repeats: The number of times to repeat the word.
+         :type repeats: int
+         :param enclose_with: Optional string to enclose the final content.
+         :type enclose_with: str, optional
+         :return: A dictionary containing the repeated content.
+         :rtype: dict
+         """
+         try:
+             repeated_content = " ".join([word] * repeats)
+             enclosed_content = f"{enclose_with}{repeated_content}{enclose_with}"
+             output = {
+                 'result': enclosed_content,
+                 'message': f'Repeated the word {word} {repeats} times.',
+             }
+         except Exception as e:
+             output = {
+                 'error': str(e),
+             }
+         return output
+ ```
+
+ The file should be named `[function_name].py`, e.g. `test_function.py`, and be placed in the `functions` directory
+ in either the base config directory, or in the profile config directory. (These directories are listed in the output
+ of the `/config` command.)
+
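+ As an illustration (the directory name below is hypothetical -- use the real paths reported by `/config`), the resulting layout would look like:
+
+ ```
+ /path/to/profile/config/
+ └── functions/
+     └── test_function.py
+ ```
+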
+ #### Providing the function definition
+
+ In the example above, notice both the [type hints](https://docs.python.org/3/library/typing.html) in the function signature (e.g. `word: str`),
+ and the [reStructuredText](https://www.sphinx-doc.org/en/master/usage/restructuredtext/basics.html) documentation of the method arguments.
+ This is the default method for providing the function definition to the OpenAI API.
+
+ Alternatively, you may provide the function definition by creating a `[function_name].config.yaml` file in the same location as the
+ `[function_name].py` file, e.g. `test_function.config.yaml` -- if provided, its contents will be used instead of the default
+ method.
+
+ Finally, for full control, you may override the `get_config()` method of the base `Function` class, and return
+ a dictionary of the function definition.
+
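+ As a rough sketch (continuing the `TestFunction` example above), an overridden `get_config()` might return a definition in the OpenAI function-calling schema. This assumes the dictionary is passed to the API as-is -- check the base `Function` class for the exact structure it expects; the same data, expressed as YAML, is what a `test_function.config.yaml` file would contain.
+
+ ```python
+ class TestFunction(Function):
+     # __call__ implementation as shown earlier...
+
+     def get_config(self) -> dict:
+         # Hand-written function definition, replacing the one otherwise
+         # derived from the type hints and docstring.
+         return {
+             "name": "test_function",
+             "description": "Repeat the provided word a number of times.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "word": {"type": "string", "description": "The word to repeat."},
+                     "repeats": {"type": "integer", "description": "The number of times to repeat the word."},
+                     "enclose_with": {"type": "string", "description": "Optional string to enclose the final content."},
+                 },
+                 "required": ["word", "repeats"],
+             },
+         }
+ ```
+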
+ #### Attaching functions
+
+ For now, the function list must be attached to a [preset](#presets), as a list of function names, like so:
+
+ ```yaml
+ metadata:
+   name: gpt-4-function-test
+   provider: chat_openai
+   # Add this to have the FIRST function CALL response from the LLM returned directly.
+   # return_on_function_call: true
+   # Add this to have the LAST function RESPONSE from the LLM returned directly.
+   # return_on_function_response: true
+ model_customizations:
+   # Other attributes.
+   model_name: gpt-4-0613
+   model_kwargs:
+     # Functions are added under this key, as a list of function names.
+     # Multiple functions can be added.
+     functions:
+       - test_function
+ ```
+
+ A preset can be edited by using the `/preset-edit` command.
+
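+ For example (assuming `/preset-edit` takes the preset name as its argument):
+
+ ```
+ /preset-edit gpt-4-function-test
+ ```
+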
+ Note the special `return_on_function_call` and `return_on_function_response` metadata attributes, which can be used to
+ control the return value; this is useful when using the `ApiBackend` module, or via [workflows](#workflows).
+
+
### Flask API (experimental)

- Run `python lwe/gpt_api.py --port 5000` (default port is 5000) to start the server