---
title: EastSync AI
emoji: π
colorFrom: pink
colorTo: blue
sdk: docker
app_file: app.py
pinned: false
license: mit
short_description: HR skill-gap analysis and training planner
tags:
- mcp
- mcp-in-action-track-enterprise
- fastmcp
- gradio
- context-engineering
- multi-agent
- smolagents
- supabase
- enterprise
- gemini
- openai
- anthropic
- elevenlabs
- huggingface
---
## Hackathon submission details
### Data Sources & External APIs
Our platform relies on a variety of external services to process data, reason, and generate the final output.
| API / Tool | Purpose | Integration Details |
|---|---|---|
| LLM Providers (OpenAI, Gemini, Anthropic, HF) | Core Reasoning Engine | Provides the intelligence for skill-gap assessment, parsing project requirements, and synthesizing final training plans. Accessed via the SmolAgents Model Context Protocol (MCP). |
| Supabase | Structured Data Repository | Acts as the system's database for internal HR data: storing project details, employee profiles, and persisting the final analysis results and training plans. |
| Web Search (via DuckDuckGo tool) | Training Resource Discovery | Used by the Web Search Agent to query for the most relevant and up-to-date online courses and certification programs (e.g., on Coursera, EdX, etc.) based on missing skills. |
| ElevenLabs | User Experience | Provides high-quality, streamed text-to-speech narration to walk the user through the live analysis and planning process. |
### Video Demo

https://res.cloudinary.com/dnvqeopvd/video/upload/v1764442409/EastSync_AI_demo_lgt5nm.mp4
All narration in the demo is produced with ElevenLabs. The first 13 seconds of the intro are pre-recorded audio, while the rest is streamed in real time as the app runs. This is done by using Gemini 2.5 Flash to convert live logs into narration text, which is fed to ElevenLabs to produce an audio stream played by a Gradio Audio component.
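The log-to-narration step can be illustrated with a minimal sketch. This is a hypothetical helper, not the app's actual code: the function name, prompt wording, and `max_lines` parameter are all illustrative. In the real pipeline, the returned prompt would be sent to Gemini 2.5 Flash and the resulting text streamed through ElevenLabs into a Gradio Audio component.

```python
def narration_prompt(log_lines: list[str], max_lines: int = 5) -> str:
    """Build a prompt asking an LLM to turn the most recent agent log
    lines into one or two sentences of spoken narration."""
    recent = "\n".join(log_lines[-max_lines:])
    return (
        "Summarize the following agent activity as one or two short, "
        "friendly sentences of spoken narration:\n" + recent
    )
```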
CV matching demo: https://res.cloudinary.com/dhi2fghp5/video/upload/v1764522431/screen_1764522130412_qxi7gx.mp4
Social Media Post:
### Tracks we would like to compete in:
- Track 2: MCP in Action (Enterprise)
- ElevenLabs Track (We think our use of ElevenLabs is creative and innovative)
- Google Gemini Track (We think our use of Google Gemini Flash is creative and interesting)
- OpenAI Track
# EastSync-AI: HR Skill Gap Analysis and Training Planner
Welcome to EastSync-AI, an intelligent platform built with SmolAgents for the Hugging Face GenAI Agents & MCP Hackathon! This project is a Multi-Agent System designed to solve a critical HR challenge: automatically identifying skill deficiencies across an organization and generating personalized, actionable training roadmaps to close those gaps.
The system expertly orchestrates calls between multiple LLM providers (via the Model Context Protocol), a knowledge base (Supabase), and web search tools to deliver comprehensive readiness reports.
## Project Overview
EastSync-AI operates by comparing required project skills against the current skill sets of assigned team members. It uses a dynamic, multi-agent workflow to perform complex data analysis and then synthesizes the findings into a clear, visual capability report and training plan.
### Core Features
- Skill Gap Analysis: Automatically quantifies and identifies skill gaps between employee profiles and project technical requirements.
- Training Plan Generation: Creates personalized, time-bound training plans with recommended online courses and materials found via real-time web search.
- Multi-Provider LLM Support: Supports flexible configuration across OpenAI, Anthropic, Google Gemini, Hugging Face, and OpenRouter models.
- Database Integration: Supabase is used for securely storing and retrieving employee, project, and analysis data.
- Dynamic Narration: Integrates ElevenLabs text-to-speech for an engaging, narrated application experience.
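The skill-gap analysis at the heart of these features can be sketched in a few lines. The function below is illustrative only, assuming skills are scored on a simple proficiency scale; the repo's actual models and scoring may differ.

```python
def skill_gaps(required: dict[str, int], employee: dict[str, int]) -> dict[str, int]:
    """Return the skills where the employee falls short of the required level.

    Both inputs map skill name -> proficiency level (e.g. 1-5).
    Skills the employee lacks entirely count as level 0.
    """
    gaps = {}
    for skill, needed in required.items():
        have = employee.get(skill, 0)
        if have < needed:
            gaps[skill] = needed - have  # levels left to close
    return gaps

# Example: project needs SQL at 4, Python at 3, Docker at 2
required = {"SQL": 4, "Python": 3, "Docker": 2}
employee = {"SQL": 2, "Python": 3}
print(skill_gaps(required, employee))  # {'SQL': 2, 'Docker': 2}
```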
## Agent Logic and Responsibilities
The system is powered by a coordinated team of specialized SmolAgents, each responsible for a distinct part of the analysis and planning pipeline.
| Agent Name | Role | Responsibilities |
|---|---|---|
| Orchestrator Agent (`orchestrator_agent.py`) | The Manager | Coordinates the entire workflow, manages the sequence of operations, and compiles the final comprehensive report and training plan from all agent inputs. |
| Supabase Agent | The Data Handler | Interacts directly with the Supabase database to retrieve project requirements and fetch current employee skill profiles to initiate the gap analysis. |
| Web Search Agent | The Researcher | Executes real-time web searches for high-quality, relevant training courses (e.g., Data Engineering, SQL, JavaScript) to fill the identified skill gaps. |
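The division of labor above can be sketched as a plain-Python pipeline. The stub callables stand in for the real SmolAgents agents, and all names here are illustrative, not the repo's actual API.

```python
def run_analysis(project_id, fetch_project, fetch_employees, search_courses):
    """Coordinate the three roles: fetch data, compute gaps, find training.

    fetch_project(project_id)   -> dict with a "required_skills" list  (Supabase Agent)
    fetch_employees(project_id) -> dict of name -> list of skills      (Supabase Agent)
    search_courses(skill)       -> list of course titles               (Web Search Agent)
    """
    project = fetch_project(project_id)
    employees = fetch_employees(project_id)
    report = {}
    for name, skills in employees.items():
        missing = [s for s in project["required_skills"] if s not in skills]
        # For each missing skill, look up training resources
        report[name] = {s: search_courses(s) for s in missing}
    return report  # the Orchestrator compiles this into the final plan

# Stubbed example run
demo = run_analysis(
    2,
    fetch_project=lambda pid: {"required_skills": ["SQL", "Python"]},
    fetch_employees=lambda pid: {"John Doe": ["Python"]},
    search_courses=lambda s: [f"Intro to {s}"],
)
print(demo)  # {'John Doe': {'SQL': ['Intro to SQL']}}
```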
## Tech Stack

### Core Technologies
- Python 3.11.6+: Programming language
- SmolAgents: Multi-agent framework for orchestration
- Gradio: Web UI framework
- Pydantic 2.0+: Data validation and models
- Supabase: Database and backend services
- ElevenLabs: Text-to-speech service
### LLM Providers Supported
- OpenAI (GPT models)
- Anthropic Claude
- Google Gemini
- Hugging Face Inference API
- OpenRouter
### Package Management
- uv: Fast Python package installer and resolver
## Prerequisites and Installation

### Prerequisites
- Python 3.11.6 or higher
- uv package manager
- API keys for your chosen LLM provider(s)
- Supabase API Key
- (Optional) ElevenLabs API key
### Installation Steps
1. Clone the repository:

   ```bash
   git clone <repository-url>
   cd EastSync-AI
   ```

2. Install dependencies using uv:

   ```bash
   uv sync
   ```

   Then activate the environment:

   ```bash
   source .venv/bin/activate
   ```

3. Set up environment variables. Create a `.env` file or export the following environment variables based on your chosen services:

   ```bash
   # For LLM Providers (choose at least one)
   export OPENAI_API_KEY="your-openai-api-key"
   export CLAUDE_API_KEY="your-claude-api-key"
   export GEMINI_API_KEY="your-gemini-api-key"
   export HF_API_KEY="your-huggingface-api-key"
   export OPENROUTER_API_KEY="your-openrouter-api-key"

   # For Database and Narration (optional)
   export SUPABASE_API_KEY="your-supabase-api-key"
   export ELEVEN_LABS_API_KEY="your-elevenlabs-api-key"
   ```
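Since at least one LLM provider key is required, a small startup check can fail fast with a clear message. This is a hypothetical helper, not part of the repo; the key names match the environment variables listed above.

```python
import os

# Environment variables for the supported LLM providers (from the table above)
LLM_KEYS = [
    "OPENAI_API_KEY", "CLAUDE_API_KEY", "GEMINI_API_KEY",
    "HF_API_KEY", "OPENROUTER_API_KEY",
]

def configured_providers(env=None) -> list[str]:
    """Return which LLM provider keys are set to a non-empty value."""
    env = os.environ if env is None else env
    return [k for k in LLM_KEYS if env.get(k)]

if __name__ == "__main__":
    if not configured_providers():
        raise SystemExit("Set at least one LLM provider API key before launching.")
```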
## Usage

### Starting the Web Application
Run the Gradio web interface:

```bash
python -m app
```
The application will start a local web server (typically at http://127.0.0.1:7860). Open this URL in your browser to access the interface.
### Using the Application
Enter your query in the text box, for example:
- "Analyze project 2 and suggest training plans for John Doe and Daniel Tatar."
- "What skills are missing for the AI project team?"
Click "Analyze and Plan" to generate the skill-gap analysis and training recommendations.
### Configuring the LLM Provider
Edit `app.py` to change the LLM provider and model:

```python
from llm_provider import LLMProvider, LLMProviderType, LLMModelType

# Example: Using OpenAI
llm_provider = LLMProvider(LLMProviderType.OPENAI, LLMModelType.openai.gpt_5_1).get_model()

# Example: Using Hugging Face
llm_provider = LLMProvider(LLMProviderType.HF, LLMModelType.hf.meta_llama_3_3_70b_instruct).get_model()

# Example: Using Gemini
llm_provider = LLMProvider(LLMProviderType.GEMINI, LLMModelType.gemini.gemini_2_5_pro).get_model()
```
## Code Structure

- `llm_provider.py`: Handles LLM provider abstraction, supporting multiple providers through a unified interface
- `models.py`: Defines Pydantic models for employees, projects, training plans, and analysis results
- `agents/orchestrator_agent.py`: Main orchestrator agent that coordinates the analysis workflow
- `app.py`: Gradio web interface entry point
## Environment Variables

| Variable | Description | Required For |
|---|---|---|
| `OPENAI_API_KEY` | OpenAI API key | OpenAI provider |
| `CLAUDE_API_KEY` | Anthropic API key | Claude provider |
| `GEMINI_API_KEY` | Google Gemini API key | Gemini provider |
| `HF_API_KEY` | Hugging Face API token | Hugging Face provider |
| `OPENROUTER_API_KEY` | OpenRouter API key | OpenRouter provider |
| `SUPABASE_API_KEY` | Supabase API key | Database features |
| `ELEVEN_LABS_API_KEY` | ElevenLabs API key | Text-to-speech |
## License
See LICENSE file for details.
## Team
Team Name: EastSync
Team Members:
- Stanislav Sava - @StanSava - Machine Learning & Frontend Engineer
- Dragos Dit - @pixel-pirat3 - Mobile Engineer
- Daniel Tatar - @apxutektor - Java Engineer