---
tags:
- deepseek
- commercial use
- function-calling
- function calling
extra_gated_prompt: "Purchase access to this repo [HERE](https://buy.stripe.com/14k3cY4pL4PFbQY3dq)"
---

# Function Calling Fine-tuned DeepSeek Coder 33B

Purchase access to this model [here](https://buy.stripe.com/14k3cY4pL4PFbQY3dq). Performance demo video [here](https://share.descript.com/view/0uQraTWCbkp).

This model is fine-tuned for function calling.

- The function metadata format is the same as that used by OpenAI.
- The model is suitable for commercial use.
- AWQ and GGUF versions are available on request after purchase.

Check out other fine-tuned function calling models [here](https://mart.trelis.com).

## Quick Server Setup

Runpod one-click templates (you must add a HuggingFace Hub access token, `HUGGING_FACE_HUB_TOKEN`, to the environment variables, as this is a gated model):

- [4bit AWQ](https://runpod.io/gsc?template=lvxofymu84&ref=jmfkcdio)
- [8bit with EETQ](https://runpod.io/gsc?template=pb0wx07yam&ref=jmfkcdio)

Runpod affiliate [link](https://runpod.io?ref=jmfkcdio) (helps support the Trelis channel).

## Inference Scripts

See below for the sample prompt format. Complete inference scripts are available for purchase [here](https://trelis.com/enterprise-server-api-and-inference-guide/):

- Easily format prompts using `tokenizer.apply_chat_template` (starting from OpenAI-formatted functions and a list of messages).
- Automate catching, handling and chaining of function calls.

## Prompt Format

```
B_FUNC, E_FUNC = "You have access to the following functions. Use them if required:\n\n", "\n\n"
B_INST, E_INST = "\n### Instruction:\n", "\n### Response:\n"  # DeepSeek Coder style

prompt = f"{B_INST}{B_FUNC}{functionList.strip()}{E_FUNC}{user_prompt.strip()}{E_INST}\n\n"
```

### Using tokenizer.apply_chat_template

For easier application of the prompt, you can set up as follows.

Set up `messages`:

```
[
    {
        "role": "function_metadata",
        "content": "FUNCTION_METADATA"
    },
    {
        "role": "user",
        "content": "What is the current weather in London?"
    },
    {
        "role": "function_call",
        "content": "{\n    \"name\": \"get_current_weather\",\n    \"arguments\": {\n        \"city\": \"London\"\n    }\n}"
    },
    {
        "role": "function_response",
        "content": "{\n    \"temperature\": \"15 C\",\n    \"condition\": \"Cloudy\"\n}"
    },
    {
        "role": "assistant",
        "content": "The current weather in London is Cloudy with a temperature of 15 Celsius"
    }
]
```

with `FUNCTION_METADATA` as:

```
[
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "This function gets the current weather in a given city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "The city, e.g., San Francisco"
                    },
                    "format": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use."
                    }
                },
                "required": ["city"]
            }
        }
    },
    {
        "type": "function",
        "function": {
            "name": "get_clothes",
            "description": "This function provides a suggestion of clothes to wear based on the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "temperature": {
                        "type": "string",
                        "description": "The temperature, e.g., 15 C or 59 F"
                    },
                    "condition": {
                        "type": "string",
                        "description": "The weather condition, e.g., 'Cloudy', 'Sunny', 'Rainy'"
                    }
                },
                "required": ["temperature", "condition"]
            }
        }
    }
]
```

and then apply the chat template to get a formatted prompt:

```
tokenizer = AutoTokenizer.from_pretrained('Trelis/deepseek-coder-33b-instruct-function-calling-v3', trust_remote_code=True)

prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
```

If you are using a gated model, you need to first run:

```
pip install huggingface_hub

huggingface-cli login
```

### Manual Prompt:

```
Human: You have access to the following functions. Use them if required:

[
    {
        "type": "function",
        "function": {
            "name": "get_stock_price",
            "description": "Get the stock price of an array of stocks",
            "parameters": {
                "type": "object",
                "properties": {
                    "names": {
                        "type": "array",
                        "items": {
                            "type": "string"
                        },
                        "description": "An array of stocks"
                    }
                },
                "required": [
                    "names"
                ]
            }
        }
    },
    {
        "type": "function",
        "function": {
            "name": "get_big_stocks",
            "description": "Get the names of the largest N stocks by market cap",
            "parameters": {
                "type": "object",
                "properties": {
                    "number": {
                        "type": "integer",
                        "description": "The number of largest stocks to get the names of, e.g. 25"
                    },
                    "region": {
                        "type": "string",
                        "description": "The region to consider, can be \"US\" or \"World\"."
                    }
                },
                "required": [
                    "number"
                ]
            }
        }
    }
]

Get the names of the five largest stocks by market cap

Assistant:

{
    "name": "get_big_stocks",
    "arguments": {
        "number": 5
    }
}<|EOT|>
```

# Dataset

See [Trelis/function_calling_v3](https://huggingface.co./datasets/Trelis/function_calling_v3).
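As a worked example of the prompt format described above, here is a minimal sketch that assembles the DeepSeek-style prompt by hand from OpenAI-formatted function metadata. The `build_prompt` helper is hypothetical (not part of the model repo or inference scripts); the metadata and question are taken from the examples above.

```python
import json

# DeepSeek Coder style delimiters, as in the Prompt Format section above
B_FUNC, E_FUNC = "You have access to the following functions. Use them if required:\n\n", "\n\n"
B_INST, E_INST = "\n### Instruction:\n", "\n### Response:\n"

def build_prompt(function_metadata, user_prompt):
    """Hypothetical helper: serialise the function metadata and wrap the user prompt."""
    function_list = json.dumps(function_metadata, indent=4)
    return f"{B_INST}{B_FUNC}{function_list.strip()}{E_FUNC}{user_prompt.strip()}{E_INST}\n\n"

function_metadata = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "This function gets the current weather in a given city",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "The city, e.g., San Francisco"}
                },
                "required": ["city"],
            },
        },
    }
]

prompt = build_prompt(function_metadata, "What is the current weather in London?")
print(prompt)
```

Using `tokenizer.apply_chat_template` as shown above is the recommended route, since it keeps the prompt in sync with the tokenizer's chat template; this manual version is only meant to make the structure explicit.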
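On the handling side, a response like the one in the Manual Prompt example can be caught and dispatched with plain JSON parsing. This is a minimal sketch under the assumptions that `<|EOT|>` terminates the generation (as in the example above) and that the handlers are your own local functions; `get_big_stocks` below is a stub, not a real market-data API.

```python
import json

def get_big_stocks(number, region="US"):
    """Stub handler standing in for a real market-cap lookup."""
    return ["AAPL", "MSFT", "GOOG", "AMZN", "NVDA"][:number]

# Map function names from the metadata to local implementations
HANDLERS = {"get_big_stocks": get_big_stocks}

def handle_model_output(raw):
    """Strip the end-of-text token, parse a function call if present, and dispatch it."""
    text = raw.replace("<|EOT|>", "").strip()
    try:
        call = json.loads(text)
    except json.JSONDecodeError:
        return text  # plain-text answer, no function call
    return HANDLERS[call["name"]](**call["arguments"])

# The assistant response from the Manual Prompt example above:
raw = '{\n    "name": "get_big_stocks",\n    "arguments": {\n        "number": 5\n    }\n}<|EOT|>'
result = handle_model_output(raw)  # ['AAPL', 'MSFT', 'GOOG', 'AMZN', 'NVDA']
```

In a chained setup, `result` would be serialised back into a `function_response` message and appended to `messages` before re-prompting the model.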
# License

This model may be used commercially for inference under the terms of the DeepSeek license, or for further fine-tuning and inference. Users may not re-publish or re-sell this model in the same or a derivative form (including fine-tunes).

**The SFT chat fine-tuned model's repo card follows below.**
[🏠Homepage] | [🤖 Chat with DeepSeek Coder] | [Discord] | [Wechat(微信)]