
YottaAnswers - Public API

The YottaAnswers Public API is a collection of publicly accessible routes for generating a set of direct answers to a question, drawn from billions of possible answers. It is intended for non-commercial use cases.

Currently, the API uses the equivalent of the Giga model available on the YottaAnswers site.

Getting Started

Before users can use the API to its full capabilities, they must sign up at https://yottaanswers.com/api-registration . Upon registration they will be emailed a JWT key, which they can use to access the unthrottled route.

If users want to test the API before signing up, they can do so via the public throttled route, as in the sketch below.
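
As a minimal sketch of the two access patterns (the route URLs below are placeholders, not the actual API routes), the same question can be sent to the public throttled route without a key, or to the unthrottled route with the JWT key received after registration:

python
import requests

# Placeholder route URLs -- substitute the actual throttled and unthrottled
# routes from this documentation; these names are illustrative only.
THROTTLED_ROUTE_URL = "THROTTLED_ROUTE_URL"
UNTHROTTLED_ROUTE_URL = "UNTHROTTLED_ROUTE_URL"

payload = {"question": "Your question here."}

# Public throttled route: no key needed, suitable for trying the API out.
throttled_response = requests.post(THROTTLED_ROUTE_URL, json=payload)
print(throttled_response.json())

# Unthrottled route: authenticate with the JWT key emailed after registration.
headers = {
    "Authorization": "YOUR_API_KEY",  # the JWT key from api-registration
    "Content-Type": "application/json",
}
unthrottled_response = requests.post(UNTHROTTLED_ROUTE_URL, json=payload, headers=headers)
print(unthrottled_response.json())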

Example of use

Boost LLM results

Using the API results to augment your LLM's input can significantly boost its accuracy. This can be seen in the table below, which reports an exact question answering experiment.

| Model | Base Accuracy | Accuracy with API Results | Difference |
| --- | --- | --- | --- |
| Flan T5 XL | 15% | 52% | +242% |
| Phi 2 | 34% | 56% | +65% |
| Mistral 7b | 37% | 55% | +47% |
| Llama 7b | 43% | 55% | +28% |
| Llama 13b | 43% | 60% | +39% |
| GPT-3.5 | 56% | 62% | +10% |
| GPT-4 | 61% | 61% | +0% |

The API results are provided as additional input to the LLM. The API results used here come from the Giga model.

Below is a code example of how to incorporate our API results into LLM generation. The yotta_api endpoint, the YOUR_API_KEY value, and the model object in the snippet are placeholders for your own endpoint, API key, and LLM:

python
from typing import Any, Dict, List, Tuple

import requests

# Placeholder: set yotta_api to the API route you are calling.
yotta_api = "YOUR_API_ENDPOINT"


def create_additional_information_yotta(
    answers_list: List[Dict[str, Any]],
    gather_answers: int = 0,
) -> Tuple[str, List[str]]:
    """Build a highlighted context string from the API answers and collect
    the corresponding source links."""
    answers = []
    links = []
    for a in answers_list:
        answer = a['answer'].strip().strip('.').strip()
        sentence = a['sentence'].strip().strip('.').strip()
        links.append(a['link'])
        # Highlight the answer inside its source sentence, if it appears there.
        start = sentence.find(answer)
        if start == -1:
            sentence_highlight = sentence
        else:
            end = start + len(answer)
            sentence_highlight = sentence[:start] + '**' + answer + '**' + sentence[end:]
        answers.append(sentence_highlight)
    # Join the first gather_answers highlighted sentences into a single context string.
    context_highlight = '. '.join(answers[:gather_answers]) + "."
    return context_highlight, links[:gather_answers]


def call_model_generation(text_input: str) -> str:
    # model is a placeholder for your own LLM wrapper exposing generate(text) -> str.
    return model.generate(text_input)


def call_yotta_api(question: str) -> List[Dict[str, Any]]:
    payload = {"question": question}
    headers = {
        "Authorization": "YOUR_API_KEY",
        "Content-Type": "application/json"
    }
    response = requests.post(yotta_api, json=payload, headers=headers)
    return response.json()


if __name__ == "__main__":
    question = "Your question here."
    answers = call_yotta_api(question)
    # Use the top 5 answers as additional context for the LLM.
    context, links = create_additional_information_yotta(answers, 5)
    print(context)
    print(links)
    llm_input = f"{question} {context}"
    llm_output = call_model_generation(llm_input)
    print(llm_output)
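
The code above assumes the API responds with a JSON list of answer objects containing at least answer, sentence, and link fields. A hypothetical response, with illustrative values only, might look like this:

python
# Hypothetical shape of the response consumed by create_additional_information_yotta();
# the field values below are illustrative, not real API output.
example_answers = [
    {
        "answer": "Paris",
        "sentence": "Paris is the capital of France.",
        "link": "https://example.com/source-page",
    },
    # ...additional answer objects, one per direct answer returned
]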