---
language:
- ar
- en
base_model:
- ALLaM-AI/ALLaM-7B-Instruct-preview
pipeline_tag: text-generation
library_name: transformers
license: apache-2.0
extra_gated_prompt: >
  By requesting access, you confirm that you will use this model responsibly
  and in accordance with Navid's licensing and usage policies. You also agree
  not to use this model to conduct experiments that cause harm to human
  subjects.
extra_gated_fields:
  Full Name: text
  Phone Number (Optional - enter "N/A" if you prefer not to share): text
  Company/Institution: text
  Job Title: text
  Country: country
  City: text
  Industry:
    type: select
    options:
    - Government / Public Sector
    - Legal / Compliance
    - Healthcare
    - Finance / Banking
    - Insurance
    - Education / Research
    - Technology / Software
    - Telecommunications
    - Energy / Utilities
    - Oil & Gas
    - Retail / E-commerce
    - Manufacturing / Industrial
    - Transportation / Logistics
    - Media / Marketing
    - Non-profit / NGO
    - Consulting
    - label: Other
      value: other
  Purpose for Download: text
  I acknowledge and agree that I will not use this model for any commercial purpose without obtaining prior written permission from Navid:
    type: checkbox
  I have read and agree to the Navid License Terms associated with this model:
    type: checkbox
  Privacy Notice:
    type: select
    options:
    - "I understand that these details will not be shared with third parties."
---

# Yehia: A Simple (nice to talk to) Arabic Model
## 🛠️ How was Yehia made?
Yehia is trained using **Group Relative Policy Optimization (GRPO)**, a method that refines the model's answers by sampling several candidate responses, comparing them, and reinforcing the best ones. Its development follows the **3C3H** metric, prioritizing:
- **Correctness ✅:** Accurate information to build trust.
- **Completeness 📚:** Full, well-rounded answers.
- **Conciseness ✂️:** Clear, to-the-point responses.
- **Helpfulness 🤝:** Always aiming to support and uplift.
- **Honesty 💬:** Transparent, straightforward communication.
- **Harmlessness ❤️:** Promoting kindness and safety.
And the judge model scoring our answers was none other than `claude-sonnet-3.5`.
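To make the group-relative idea concrete, here is a minimal sketch of the advantage-normalization step at the heart of GRPO. This is an illustration, not Yehia's actual training code; the group size and judge scores below are made up:

```python
# Hedged sketch of GRPO's group-relative advantage computation.
# The "rewards" stand in for judge scores (e.g., a 3C3H rubric score per answer).
import statistics

def group_relative_advantages(rewards):
    """Normalize each sampled answer's reward against its own group.

    GRPO samples a group of answers to the same prompt, scores them all,
    and reinforces each one in proportion to how far it sits above the
    group mean; no separate value network is needed."""
    mean = statistics.mean(rewards)
    std = statistics.pstdev(rewards) or 1.0  # guard against a zero-variance group
    return [(r - mean) / std for r in rewards]

# Four hypothetical answers to one prompt, as scored by the judge:
scores = [0.9, 0.6, 0.6, 0.3]
advantages = group_relative_advantages(scores)
# Answers above the group mean get positive advantages and are reinforced;
# answers below it get negative advantages and are discouraged.
```

Because each answer is judged only relative to its own group, the method is robust to the judge's absolute score scale, which matters when the judge is another LLM.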
## 🚀 Getting Started
To start using Yehia, you can easily load the model with the `transformers` library:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch
model_name = "Navid-AI/Yehia-7B-preview"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
    device_map="auto",
)
messages = [
    # System prompt (Arabic): "You are Yehia, an AI developed by the company
    # 'Navid', specialized in logical thinking and precise analysis. Your
    # mission is to inspire users and support them on their journey toward
    # learning, growth, and achieving their goals by offering smart,
    # well-considered solutions."
    {"role": "system", "content": "أنت يحيى، ذكاء اصطناعي طورته شركة 'نفيد'، متخصص في التفكير المنطقي والتحليل الدقيق. مهمتك إلهام المستخدمين ودعمهم في رحلتهم نحو التعلم، النمو، وتحقيق أهدافهم من خلال تقديم حلول ذكية ومدروسة."},
    # User message (Arabic): "Hello Yehia! How are you today?"
    {"role": "user", "content": "مرحباً يا يحيى! كيف حالك اليوم؟"},
]
inputs = tokenizer.apply_chat_template(messages, add_generation_prompt=True, return_tensors="pt", return_dict=True).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
**Note:** If `flash_attention_2` gives you any problems (for example, the `flash-attn` package is not installed), simply remove the `attn_implementation` argument and the default attention implementation will be used.
## 🌟 What Can Yehia Do?
- **Explain Concepts 💡:** Break down educational topics in Arabic to help learners understand easily.
- **Engage in Conversations 🗣️:** Offer friendly and supportive chats that uplift users.
- **Promote Learning 📖:** Encourage curiosity and provide knowledge in an accessible way.
Yehia shines in conversations that feel personal and uplifting, always striving to improve.
## 💭 Remember
Yehia's name means *"God is gracious"* in Arabic, reflecting its mission to bring grace and connection to every interaction. Whether you're a student, creator, or just curious, Yehia is here to brighten your day.
## 📜 Citation
If you would like to cite Yehia in your work, please use the following BibTeX entry:
```bibtex
@misc{yehia2025,
title={Yehia 7B Preview},
author={Navid-AI},
year={2025},
howpublished={\url{https://huggingface.co/Navid-AI/Yehia-7B-preview}}
}
```