Deepchecks is a leading tool for validating your machine learning models and data with minimal effort. It accompanies you through various validation needs, such as verifying your data's integrity, inspecting its distributions, validating data splits, evaluating your model, and comparing different models.
Link
In this blog, we will build a Flask web app that takes any long piece of text, such as a blog post or news article, and summarizes it into just five lines!
Text summarization is an NLP (Natural Language Processing) task; here, SBERT (Sentence-BERT) is used to perform it.
By the end of the article, you will also learn how to integrate AI models, specifically pre-trained BERT models, with the Flask web framework!
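To give a feel for extractive five-line summarization before diving into the article, here is a rough, dependency-free sketch that ranks sentences by word frequency and keeps the top five in their original order. The article itself ranks sentences with SBERT embeddings instead; this baseline and its function name are illustrative assumptions, not the article's code.

```python
import re
from collections import Counter

def summarize(text: str, n_lines: int = 5) -> str:
    """Extractive baseline: score each sentence by the average corpus
    frequency of its words, keep the top n_lines in original order.
    (The article uses SBERT sentence embeddings for ranking instead.)"""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    if len(sentences) <= n_lines:
        return " ".join(sentences)
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    def score(sentence: str) -> float:
        tokens = re.findall(r"[a-z']+", sentence.lower())
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)
    top = set(sorted(sentences, key=score, reverse=True)[:n_lines])
    # Re-emit the selected sentences in their original order.
    return " ".join(s for s in sentences if s in top)
```

A Flask route would then simply accept the posted article text, call a function like this (or the SBERT-based ranker), and render the five-line result.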
In this tutorial, we will build a job recommendation and skill discovery script that will take unstructured text as input, and will then output job recommendations and skill suggestions based on entities such as skills, years of experience, diploma, and major.
We will extract entities and relations from job descriptions using the BERT model and we will attempt to build a knowledge graph from skills and years of experience.
Link
How to build a knowledge graph from job descriptions using a fine-tuned transformer-based Named Entity Recognition (NER) model and spaCy's relation extraction model. The method described here can be applied to many other fields, such as biomedicine, finance, and healthcare.
Below are the steps we are going to take:
Load our fine-tuned transformer NER and spaCy relation extraction models in Google Colab
Create a Neo4j Sandbox and add our entities and relations
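Once entities and relations are extracted, loading them into Neo4j amounts to emitting Cypher statements. As a minimal sketch of that last step, the helper below turns (head, relation, tail) triples into `MERGE` statements; the generic `Entity` label and the sample triples are assumptions for illustration, not the article's schema.

```python
def triples_to_cypher(triples):
    """Turn (head, relation, tail) triples -- e.g. skills, years of
    experience, diplomas extracted by the NER/relation models -- into
    Cypher MERGE statements ready to run against a Neo4j Sandbox.
    The Entity label and relation naming are illustrative assumptions."""
    statements = []
    for head, rel, tail in triples:
        rel_type = rel.upper().replace(" ", "_")
        statements.append(
            f'MERGE (a:Entity {{name: "{head}"}}) '
            f'MERGE (b:Entity {{name: "{tail}"}}) '
            f"MERGE (a)-[:{rel_type}]->(b)"
        )
    return statements

queries = triples_to_cypher([
    ("Python", "experience in", "5+ years"),
    ("Data Scientist", "degree in", "Computer Science"),
])
```

Each string can then be executed through the Neo4j driver's `session.run(...)`; `MERGE` keeps the graph free of duplicate nodes when many job descriptions mention the same skill.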
FastAPI might be able to help. FastAPI is a web framework for building APIs with Python. In this article, we will use it to build a REST API that serves an NLP model: the model can be queried via GET requests and returns responses to those queries.
For this example, we will skip building our own model and instead leverage the Pipeline class of the Hugging Face Transformers library.
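The core request/response flow is easy to see without installing FastAPI or Transformers. In the sketch below, a stub stands in for `transformers.pipeline("sentiment-analysis")` (which really does return a list of `{"label", "score"}` dicts), and a plain function plays the role of the GET endpoint; all function names here are invented for illustration.

```python
import json
from urllib.parse import parse_qs

def fake_sentiment_pipeline(text: str) -> list:
    """Stand-in for the Transformers Pipeline: a real sentiment-analysis
    pipeline returns a list of {"label": ..., "score": ...} dicts."""
    label = "POSITIVE" if "good" in text.lower() else "NEGATIVE"
    return [{"label": label, "score": 0.99}]

def handle_get(query_string: str) -> str:
    """What a FastAPI GET route would do: read the ?text=... query
    parameter, run the model, and return the result as JSON."""
    params = parse_qs(query_string)
    text = params.get("text", [""])[0]
    return json.dumps({"input": text,
                       "result": fake_sentiment_pipeline(text)})

response = handle_get("text=this+is+good")
```

In the actual article, FastAPI's decorator syntax (`@app.get(...)`) and type-annotated parameters replace the manual query-string parsing, and the Pipeline object replaces the stub.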
This post is a simple tutorial for how to use a variant of BERT to classify sentences.
This is an example that is basic enough as a first intro, yet advanced enough to showcase some of the key concepts involved.
Link
Slack chats can become messy over time, making it difficult to extract meaningful information.
In this article, I want to present a quick codeless way of fine-tuning and deploying the commonly used BERT classifier to do conversational analysis.
We will use that system to extract tasks, facts, and other valuable information from our Slack conversations.
It could easily be extended to categorize any other textual data, such as support requests, emails, etc.