Paper: Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks (arXiv:1908.10084)
This is a custom Sentence Transformer model fine-tuned from sentence-transformers/all-MiniLM-L6-v2. Designed as part of the HelpingAI ecosystem, it enhances semantic similarity and contextual understanding, with an emphasis on emotionally intelligent responses.
SentenceTransformer(
  (0): Transformer({'max_seq_length': 256, 'do_lower_case': False})
  (1): Pooling({'pooling_mode_mean_tokens': True})
  (2): Normalize()
)
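Because the pipeline ends with a Normalize module, the sentence embeddings are unit-length, so cosine similarity reduces to a dot product. For illustration only, an equivalent three-stage stack could be assembled by hand from sentence-transformers modules; this is a sketch, since the published checkpoint already ships with this configuration:
from sentence_transformers import SentenceTransformer, models
# Transformer encoder producing token embeddings, truncating inputs at 256 tokens
word_embedding = models.Transformer(
    "sentence-transformers/all-MiniLM-L6-v2",
    max_seq_length=256,
    do_lower_case=False,
)
# Mean pooling over token embeddings -> one 384-dimensional vector per sentence
pooling = models.Pooling(
    word_embedding.get_word_embedding_dimension(),
    pooling_mode_mean_tokens=True,
)
# L2-normalize the pooled sentence embeddings
normalize = models.Normalize()
model = SentenceTransformer(modules=[word_embedding, pooling, normalize])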
The model was fine-tuned on pairs of the form <sentence_0, sentence_1, similarity_score> using a learning rate of 5e-5; a minimal fine-tuning sketch is given after the usage example below.
Ensure you have the sentence-transformers library installed:
pip install -U sentence-transformers
Load and use the model in your Python environment:
from sentence_transformers import SentenceTransformer
# Load the HelpingAI semantic similarity model
model = SentenceTransformer("HelpingAI/HAI")
# Encode sentences
sentences = [
    "A woman is slicing a pepper.",
    "A girl is styling her hair.",
    "The sun is shining brightly today."
]
embeddings = model.encode(sentences)
print(embeddings.shape) # Output: (3, 384)
# Calculate similarity
from sklearn.metrics.pairwise import cosine_similarity
similarity_scores = cosine_similarity([embeddings[0]], embeddings[1:])
print(similarity_scores)
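As referenced above, the following is a minimal fine-tuning sketch using the sentence-transformers fit API with CosineSimilarityLoss and the 5e-5 learning rate; the training pairs shown are illustrative placeholders, not the actual HAI training data:
from sentence_transformers import SentenceTransformer, InputExample, losses
from torch.utils.data import DataLoader
# Illustrative <sentence_0, sentence_1, similarity_score> pairs (placeholders)
train_examples = [
    InputExample(texts=["A woman is slicing a pepper.", "Someone is cutting a vegetable."], label=0.9),
    InputExample(texts=["A woman is slicing a pepper.", "A girl is styling her hair."], label=0.1),
]
model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=16)
train_loss = losses.CosineSimilarityLoss(model)  # regresses cosine similarity toward the labels
model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    optimizer_params={"lr": 5e-5},  # learning rate noted above
)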
The model achieves high accuracy in sentiment-informed response tests.
If you use the HAI model, please cite the original Sentence-BERT paper:
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
Base model: sentence-transformers/all-MiniLM-L6-v2