HugNLP is a novel development and application library built on Hugging Face, designed to make NLP research more convenient and effective. The founder and main developer is Jianing Wang, with Nuo Chen and Qiushi Sun as collaborating programmers.
Link
Join me as I build streaming inference into the Hugging Face text generation server, going through CUDA, Python, Rust, gRPC, WebSockets, server-sent events, and more…
Link
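To give a rough idea of the server-sent-events piece of that write-up, here is a minimal sketch of consuming a streamed token response from Python. The `/generate_stream` route, request payload, and event schema are assumptions modeled on how Hugging Face's text-generation server exposes streaming, not code from the post itself.

```python
import json
import requests

# Assumed endpoint and payload shape for a text-generation server that
# streams tokens via server-sent events (SSE); adjust to your deployment.
URL = "http://localhost:8080/generate_stream"
payload = {
    "inputs": "Explain server-sent events in one sentence.",
    "parameters": {"max_new_tokens": 64},
}

# stream=True keeps the HTTP connection open so we can read events as the
# server emits them, instead of waiting for the full completion.
with requests.post(URL, json=payload, stream=True) as resp:
    resp.raise_for_status()
    for raw_line in resp.iter_lines():
        if not raw_line:
            continue  # SSE separates events with blank lines
        line = raw_line.decode("utf-8")
        if not line.startswith("data:"):
            continue  # skip SSE comments and other fields
        event = json.loads(line[len("data:"):].strip())
        # Assumed event schema: {"token": {"text": ...}, ...}
        print(event.get("token", {}).get("text", ""), end="", flush=True)
print()
```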
Prompt engineering is a relatively new discipline for developing and optimizing prompts to efficiently use language models (LMs) for a wide variety of applications and research topics. Prompt engineering skills help to better understand the capabilities and limitations of large language models (LLMs). Researchers use prompt engineering to improve the capacity of LLMs on a wide range of common and complex tasks such as question answering and arithmetic reasoning. Developers use prompt engineering to design robust and effective prompting techniques that interface with LLMs and other tools.
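As a concrete illustration of the idea, the snippet below assembles a small few-shot prompt for arithmetic word problems; the template wording and the worked examples are my own, not taken from the guide.

```python
# A toy few-shot prompt for arithmetic reasoning: the model sees worked
# examples before the real question, one of the basic prompt-engineering
# techniques discussed in the guide.
EXAMPLES = [
    ("I had 5 apples and bought 3 more. How many do I have?",
     "5 + 3 = 8. The answer is 8."),
    ("A book costs 12 dollars and I pay with a 20 dollar bill. What is my change?",
     "20 - 12 = 8. The answer is 8."),
]

def build_prompt(question: str) -> str:
    """Assemble a few-shot prompt string from the worked examples above."""
    shots = "\n\n".join(f"Q: {q}\nA: {a}" for q, a in EXAMPLES)
    return f"{shots}\n\nQ: {question}\nA:"

print(build_prompt("There are 4 boxes with 6 pens each. How many pens in total?"))
```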
Dust apps rely on model providers to interact with large language models. You can set up your first model provider by clicking on the Providers pane and configuring the OpenAI provider. You’ll need to create an OpenAI account and retrieve your API key.
Link
This repository contains the notebooks and source code used to build the OpenAI x Pinecone Q&A app. You can find more information in our webinar here.
Link
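For a rough picture of how such a Q&A app fits together, here is a minimal retrieval-then-answer sketch using the pre-1.0 `openai` and 2.x `pinecone-client` Python packages. The index name, environment, embedding model, and metadata field are assumptions for illustration, not the repository's actual code.

```python
import openai
import pinecone

# Assumed credentials and names; replace with your own.
openai.api_key = "YOUR_OPENAI_API_KEY"
pinecone.init(api_key="YOUR_PINECONE_API_KEY", environment="us-west1-gcp")
index = pinecone.Index("qa-demo")  # hypothetical index of embedded documents

def answer(question: str) -> str:
    # 1) Embed the question with the same model used to index the documents.
    emb = openai.Embedding.create(
        input=[question], model="text-embedding-ada-002"
    )["data"][0]["embedding"]

    # 2) Retrieve the most similar passages from Pinecone.
    hits = index.query(vector=emb, top_k=3, include_metadata=True)
    # Assumes each vector was upserted with a "text" metadata field.
    context = "\n".join(m["metadata"]["text"] for m in hits["matches"])

    # 3) Ask the chat model to answer using only the retrieved context.
    chat = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return chat["choices"][0]["message"]["content"]

print(answer("What does the webinar cover?"))
```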
We’ll walk you through some examples of how Sagify can simplify and speed up your ML pipelines. You can train, tune, and deploy a machine learning model on the same day using Sagify!
Link
Sous-chef.ai lets users take photos of the ingredients in their fridge or pantry and enter their food preferences. Using this information, the app suggests customized recipes.
Link