init
backend/.env.example (new file, 19 lines)
@@ -0,0 +1,19 @@
# Ollama Configuration
OLLAMA_BASE_URL=http://localhost:11434
# OLLAMA_MODEL=codellama:7b
OLLAMA_MODEL=qwen3:8b

# Database
DATABASE_URL=sqlite+aiosqlite:///./review.db

# Security - generated keys
SECRET_KEY=a1b2c3d4e5f6g7h8i9j0k1l2m3n4o5p6q7r8s9t0u1v2w3x4y5z6a7b8c9d0e1f2
ENCRYPTION_KEY=z9y8x7w6v5u4t3s2r1q0p9o8n7m6l5k4j3i2h1g0f9e8d7c6b5a4z3y2x1w0v9u8

# Server
HOST=0.0.0.0
PORT=8000
DEBUG=True

# CORS - comma-separated list of allowed origins
CORS_ORIGINS=http://localhost:5173,http://localhost:3000
backend/README.md (new file, 141 lines)
@@ -0,0 +1,141 @@
# AI Review Backend

FastAPI backend for the AI Code Review Agent, with LangGraph and Ollama support.

## Installation

```bash
# Create a virtual environment
python -m venv venv
source venv/bin/activate  # Windows: venv\Scripts\activate

# Install dependencies
pip install -r requirements.txt
```

## Configuration

Create a `.env` file from the example:

```bash
cp .env.example .env
```

Edit `.env`:

```env
# Ollama - make sure Ollama is running
OLLAMA_BASE_URL=http://localhost:11434
OLLAMA_MODEL=codellama

# Database
DATABASE_URL=sqlite+aiosqlite:///./review.db

# Security - generate random strings!
SECRET_KEY=your-secret-key-here
ENCRYPTION_KEY=your-encryption-key-here

# Server
HOST=0.0.0.0
PORT=8000
DEBUG=True

# CORS
CORS_ORIGINS=http://localhost:5173
```
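
Both keys in the committed `.env.example` are 64-character hex strings. One way to generate your own values (a suggestion, not something the project mandates) is Python's standard `secrets` module:

```python
# Print fresh 64-character hex values for SECRET_KEY and ENCRYPTION_KEY
import secrets

print("SECRET_KEY=" + secrets.token_hex(32))
print("ENCRYPTION_KEY=" + secrets.token_hex(32))
```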

## Running

```bash
# Start the server
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000

# Or via Python
python -m app.main
```

The API will be available at `http://localhost:8000`

Swagger documentation: `http://localhost:8000/docs`

## Structure

```
app/
├── agents/             # LangGraph agents
│   ├── reviewer.py     # Main agent
│   ├── prompts.py      # Prompts for the LLM
│   └── tools.py        # Agent tools
├── api/                # FastAPI endpoints
│   ├── repositories.py
│   ├── reviews.py
│   └── webhooks.py
├── models/             # SQLAlchemy models
│   ├── repository.py
│   ├── pull_request.py
│   ├── review.py
│   └── comment.py
├── schemas/            # Pydantic schemas
├── services/           # Git platform clients (Gitea, GitHub, Bitbucket)
├── webhooks/           # Webhook handlers
├── config.py           # Configuration
├── database.py         # Database setup
└── main.py             # FastAPI application
```

## API Endpoints

### Repositories
- `GET /api/repositories` - list
- `POST /api/repositories` - create (see the sketch below)
- `PUT /api/repositories/{id}` - update
- `DELETE /api/repositories/{id}` - delete
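
A minimal sketch of calling the create endpoint with `httpx` (already a project dependency). The field names follow how `app/api/repositories.py` reads them; every value below is a placeholder:

```python
# Register a repository so the agent can review its pull requests.
import httpx

payload = {
    "name": "my-service",                              # placeholder
    "platform": "gitea",                               # gitea | github | bitbucket
    "url": "https://git.example.com/team/my-service",  # placeholder
    "api_token": "<personal access token>",            # optional; the master token is used if omitted
}

resp = httpx.post("http://localhost:8000/api/repositories", json=payload)
resp.raise_for_status()
print(resp.json()["webhook_url"])  # paste this URL into the Git platform's webhook settings
```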

### Reviews
- `GET /api/reviews` - list with filters
- `GET /api/reviews/{id}` - details
- `POST /api/reviews/{id}/retry` - retry
- `GET /api/reviews/stats/dashboard` - statistics (example below)
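
The statistics endpoint returns the counters assembled in `app/api/reviews.py` (`total_reviews`, `active_reviews`, `completed_reviews`, `failed_reviews`, `total_comments`, `avg_comments_per_review`). A quick sketch against a locally running backend:

```python
# Fetch dashboard statistics for the review agent.
import httpx

stats = httpx.get("http://localhost:8000/api/reviews/stats/dashboard").json()
print(stats["total_reviews"], stats["failed_reviews"], stats["avg_comments_per_review"])
```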

### Webhooks
- `POST /api/webhooks/gitea/{repo_id}`
- `POST /api/webhooks/github/{repo_id}`
- `POST /api/webhooks/bitbucket/{repo_id}`

### WebSocket
- `ws://localhost:8000/ws/reviews` - real-time updates (client sketch below)
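
A client sketch for the review event stream, assuming the third-party `websockets` package is installed; the server-side handler lives in `app/main.py`, which is outside this excerpt, so the exact message schema is not shown here:

```python
# Listen for real-time review events (pip install websockets).
import asyncio
import websockets

async def listen() -> None:
    async with websockets.connect("ws://localhost:8000/ws/reviews") as ws:
        async for message in ws:
            print(message)  # raw event payload as sent by the server

asyncio.run(listen())
```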

## Development

### Testing the API

```bash
# Use the Swagger UI
open http://localhost:8000/docs

# Or curl
curl http://localhost:8000/health
```

### Database

The database (SQLite) is created automatically on first startup.

For production, PostgreSQL is recommended:

```env
DATABASE_URL=postgresql+asyncpg://user:pass@localhost/dbname
```

## Dependencies

Main packages:
- `fastapi` - web framework
- `sqlalchemy` - ORM
- `langchain` - LLM framework
- `langgraph` - agent graph
- `httpx` - HTTP client
- `cryptography` - encryption

See `requirements.txt` for the full list.
backend/app/__init__.py (new file, 4 lines)
@@ -0,0 +1,4 @@
"""AI Code Review Agent Backend"""

__version__ = "0.1.0"
backend/app/agents/__init__.py (new file, 6 lines)
@@ -0,0 +1,6 @@
"""LangGraph agents for code review"""

from app.agents.reviewer import ReviewerAgent

__all__ = ["ReviewerAgent"]
backend/app/agents/prompts.py (new file, 139 lines)
@@ -0,0 +1,139 @@
"""Prompts for AI code reviewer"""


SYSTEM_PROMPT = """You are a strict and attentive code reviewer with years of experience. Your task is to analyze code thoroughly and find ALL problems.

ALWAYS check for:
1. **Syntax errors** - typos, unclosed brackets, invalid language syntax
2. **Potential bugs** - logic errors, incorrect exception handling, null/undefined issues
3. **Security problems** - SQL injection, XSS, unsafe use of eval, data leaks
4. **Best-practice violations** - incorrect React usage (key prop, hooks), poor variable names
5. **Performance problems** - inefficient algorithms, unnecessary re-renders, memory leaks
6. **Code readability** - overly complex logic, missing error handling

Pay special attention to:
- React: correct use of key, hooks rules, JSX syntax
- Typos in string constants (API paths, Content-Type headers)
- Unclosed or extra brackets in JSX and JavaScript
- Code that does not match the PR description

For every problem, specify:
- The line number
- The severity level: ERROR (critical), WARNING (important), INFO (recommendation)
- What is wrong
- How to fix it

Be demanding! Even small typos can break production."""


CODE_REVIEW_PROMPT = """Analyze the following code from file `{file_path}`:

```{language}
{code}
```

Context: these are changes from a Pull Request.
{patch_info}

Find problems and suggest improvements. For every problem, specify:
1. The line number
2. The level: INFO, WARNING or ERROR
3. A description of the problem
4. A recommendation

Return the answer in JSON format:
{{
  "comments": [
    {{
      "line": <line_number>,
      "severity": "INFO|WARNING|ERROR",
      "message": "description of the problem and recommendation"
    }}
  ]
}}

If there are no problems, return an empty comments array."""


DIFF_REVIEW_PROMPT = """You are a STRICT code reviewer. Your task is to find ALL errors in the code.
{pr_context}
Analyze the changes in file `{file_path}`:

```diff
{diff}
```

STEP-BY-STEP ANALYSIS of every line starting with +:

Step 1: READ EVERY LINE starting with + carefully
Step 2: CHECK every line for:
   a) TYPOS - wrong words, misspellings
   b) SYNTAX - brackets, quotes, commas
   c) LOGIC - correctness of the code
   d) REACT RULES - key, hooks, JSX

Step 3: FIND the errors (even small ones!)

CONCRETE EXAMPLES OF ERRORS (ALWAYS LOOK FOR THESE):

❌ TYPOS IN STRINGS:
'Content-Type': 'shmapplication/json' // ERROR! should be 'application/json'
const url = 'htps://example.com' // ERROR! should be 'https'

❌ UNCLOSED BRACKETS:
{{condition && (<div>text</div>}} // ERROR! missing )
<span>{{text</span> // ERROR! missing }}

❌ WRONG KEY IN REACT:
<div>
  <Item> // ERROR! key should be HERE
    <img key={{id}} /> // not here
  </Item>
</div>

❌ REMOVED KEY:
-<Item key={{id}}> // ERROR! the key was removed
+<Item>

❌ CODE NOT MATCHING THE PR DESCRIPTION:
PR description: "Add avatar editing functionality"
Code: changes Content-Type to 'shmapplication/json' // ERROR! unrelated to avatars

ALWAYS CHECK:
1. MATCH WITH THE PR DESCRIPTION - does the code do what the description says?
2. Every quoted string - are there any typos?
3. Every bracket - are they all closed?
4. Every JSX element - is it correct?
5. React key - is it on the right element?

{format_instructions}

IMPORTANT:
1. ONLY JSON in the answer!
2. DO NOT write "Thank you" or any other text
3. Even a small typo is an ERROR!
4. If there are NO problems: {{"comments": []}}

Answer structure:
{{
  "comments": [
    {{
      "line": 58,
      "severity": "ERROR",
      "message": "Typo in string: 'shmapplication/json' should be 'application/json'"
    }}
  ]
}}"""


SUMMARY_PROMPT = """Based on all the problems found in the PR, write a short review summary.

Problems found:
{issues_summary}

Write a short summary (2-3 sentences) that:
- States the total number of problems found per severity level
- Highlights the most critical issues
- Gives an overall assessment of the code quality

Return the answer as plain text without formatting."""
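
As a usage note: these constants are plain `str.format` templates (the doubled braces keep the JSON skeleton literal), and `CodeAnalyzer.analyze_diff` in `app/agents/tools.py` renders `DIFF_REVIEW_PROMPT` with exactly these keyword arguments. A sketch, with made-up placeholder values:

```python
from app.agents.prompts import DIFF_REVIEW_PROMPT

# Illustration only: render the diff-review template the way analyze_diff() does.
prompt = DIFF_REVIEW_PROMPT.format(
    file_path="src/App.jsx",                    # hypothetical file changed in a PR
    diff="+<span>{text</span>",                 # hypothetical one-line diff hunk
    pr_context="\n\n**PR CONTEXT:**\nTitle: Fix header layout\n",
    format_instructions='Return a JSON object: {"comments": [...]}',
)
print(prompt[:300])
```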
backend/app/agents/reviewer.py (new file, 488 lines)
@@ -0,0 +1,488 @@
|
||||
"""Main reviewer agent using LangGraph"""
|
||||
|
||||
from typing import TypedDict, List, Dict, Any, Optional
|
||||
from langgraph.graph import StateGraph, END
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from sqlalchemy import select
|
||||
|
||||
from app.agents.tools import CodeAnalyzer, detect_language, should_review_file
|
||||
from app.agents.prompts import SYSTEM_PROMPT, SUMMARY_PROMPT
|
||||
from app.models import Review, Comment, PullRequest, Repository
|
||||
from app.models.review import ReviewStatusEnum
|
||||
from app.models.comment import SeverityEnum
|
||||
from app.services import GiteaService, GitHubService, BitbucketService
|
||||
from app.services.base import BaseGitService
|
||||
from app.config import settings
|
||||
|
||||
|
||||
class ReviewState(TypedDict):
|
||||
"""State for the review workflow"""
|
||||
review_id: int
|
||||
pr_number: int
|
||||
repository_id: int
|
||||
status: str
|
||||
files: List[Dict[str, Any]]
|
||||
analyzed_files: List[str]
|
||||
comments: List[Dict[str, Any]]
|
||||
error: Optional[str]
|
||||
git_service: Optional[BaseGitService]
|
||||
|
||||
|
||||
class ReviewerAgent:
|
||||
"""Agent for reviewing code using LangGraph"""
|
||||
|
||||
def __init__(self, db: AsyncSession):
|
||||
self.db = db
|
||||
self.analyzer = CodeAnalyzer(
|
||||
ollama_base_url=settings.ollama_base_url,
|
||||
model=settings.ollama_model
|
||||
)
|
||||
self.graph = self._build_graph()
|
||||
|
||||
def _build_graph(self) -> StateGraph:
|
||||
"""Build the LangGraph workflow"""
|
||||
workflow = StateGraph(ReviewState)
|
||||
|
||||
# Add nodes
|
||||
workflow.add_node("fetch_pr_info", self.fetch_pr_info)
|
||||
workflow.add_node("fetch_files", self.fetch_files)
|
||||
workflow.add_node("analyze_files", self.analyze_files)
|
||||
workflow.add_node("post_comments", self.post_comments)
|
||||
workflow.add_node("complete_review", self.complete_review)
|
||||
|
||||
# Set entry point
|
||||
workflow.set_entry_point("fetch_pr_info")
|
||||
|
||||
# Add edges
|
||||
workflow.add_edge("fetch_pr_info", "fetch_files")
|
||||
workflow.add_edge("fetch_files", "analyze_files")
|
||||
workflow.add_edge("analyze_files", "post_comments")
|
||||
workflow.add_edge("post_comments", "complete_review")
|
||||
workflow.add_edge("complete_review", END)
|
||||
|
||||
return workflow.compile()
|
||||
|
||||
def _remove_think_blocks(self, text: str) -> str:
|
||||
"""Remove <think>...</think> blocks from text"""
|
||||
import re
|
||||
# Remove <think> blocks
|
||||
text = re.sub(r'<think>.*?</think>', '', text, flags=re.DOTALL | re.IGNORECASE)
|
||||
# Remove extra whitespace
|
||||
text = re.sub(r'\n\n+', '\n\n', text)
|
||||
return text.strip()
|
||||
|
||||
def _escape_html_in_text(self, text: str) -> str:
|
||||
"""Escape HTML tags in text to prevent Markdown from hiding them
|
||||
|
||||
Wraps code-like content (anything with < >) in backticks.
|
||||
"""
|
||||
import re
|
||||
|
||||
# Pattern to find HTML-like tags (e.g., <CharacterItem>, <img>)
|
||||
# We want to wrap them in backticks so they display correctly
|
||||
def replace_tag(match):
|
||||
tag = match.group(0)
|
||||
# If it's already in backticks or code block, skip
|
||||
return f"`{tag}`"
|
||||
|
||||
# Find all <...> patterns and wrap them
|
||||
text = re.sub(r'<[^>]+>', replace_tag, text)
|
||||
|
||||
return text
|
||||
|
||||
def _get_git_service(self, repository: Repository) -> BaseGitService:
|
||||
"""Get appropriate Git service for repository"""
|
||||
from app.utils import decrypt_token
|
||||
from app.config import settings
|
||||
|
||||
# Parse repository URL to get owner and name
|
||||
# Assuming URL format: https://git.example.com/owner/repo
|
||||
parts = repository.url.rstrip('/').split('/')
|
||||
repo_name = parts[-1].replace('.git', '')
|
||||
repo_owner = parts[-2]
|
||||
|
||||
base_url = '/'.join(parts[:-2])
|
||||
|
||||
# Decide which token to use: repository token or master token
|
||||
if repository.api_token:
|
||||
# Use the repository-specific token
|
||||
try:
|
||||
decrypted_token = decrypt_token(repository.api_token)
|
||||
print(f" 🔑 Используется проектный токен")
|
||||
except ValueError as e:
|
||||
raise ValueError(f"Не удалось расшифровать API токен для репозитория {repository.name}: {str(e)}")
|
||||
else:
|
||||
# Use the master token
|
||||
platform = repository.platform.value.lower()
|
||||
if platform == "gitea":
|
||||
decrypted_token = settings.master_gitea_token
|
||||
elif platform == "github":
|
||||
decrypted_token = settings.master_github_token
|
||||
elif platform == "bitbucket":
|
||||
decrypted_token = settings.master_bitbucket_token
|
||||
else:
|
||||
raise ValueError(f"Unsupported platform: {repository.platform}")
|
||||
|
||||
if not decrypted_token:
|
||||
raise ValueError(
|
||||
f"API токен не указан для репозитория {repository.name} "
|
||||
f"и мастер токен для {platform} не настроен в .env (MASTER_{platform.upper()}_TOKEN)"
|
||||
)
|
||||
|
||||
print(f" 🔑 Используется мастер {platform} токен")
|
||||
|
||||
if repository.platform.value == "gitea":
|
||||
return GiteaService(base_url, decrypted_token, repo_owner, repo_name)
|
||||
elif repository.platform.value == "github":
|
||||
return GitHubService(base_url, decrypted_token, repo_owner, repo_name)
|
||||
elif repository.platform.value == "bitbucket":
|
||||
return BitbucketService(base_url, decrypted_token, repo_owner, repo_name)
|
||||
else:
|
||||
raise ValueError(f"Unsupported platform: {repository.platform}")
|
||||
|
||||
async def fetch_pr_info(self, state: ReviewState) -> ReviewState:
|
||||
"""Fetch PR information"""
|
||||
try:
|
||||
# Update review status
|
||||
result = await self.db.execute(
|
||||
select(Review).where(Review.id == state["review_id"])
|
||||
)
|
||||
review = result.scalar_one()
|
||||
review.status = ReviewStatusEnum.FETCHING
|
||||
await self.db.commit()
|
||||
|
||||
# Get repository
|
||||
result = await self.db.execute(
|
||||
select(Repository).where(Repository.id == state["repository_id"])
|
||||
)
|
||||
repository = result.scalar_one()
|
||||
|
||||
# Initialize Git service
|
||||
git_service = self._get_git_service(repository)
|
||||
state["git_service"] = git_service
|
||||
|
||||
# Fetch PR info
|
||||
pr_info = await git_service.get_pull_request(state["pr_number"])
|
||||
|
||||
print("\n" + "📋"*40)
|
||||
print("ИНФОРМАЦИЯ О PR")
|
||||
print("📋"*40)
|
||||
print(f"\n📝 Название: {pr_info.title}")
|
||||
print(f"👤 Автор: {pr_info.author}")
|
||||
print(f"🔀 Ветки: {pr_info.source_branch} → {pr_info.target_branch}")
|
||||
print(f"📄 Описание:")
|
||||
print("-" * 80)
|
||||
print(pr_info.description if pr_info.description else "(без описания)")
|
||||
print("-" * 80)
|
||||
print("📋"*40 + "\n")
|
||||
|
||||
# Store PR info in state
|
||||
state["pr_info"] = {
|
||||
"title": pr_info.title,
|
||||
"description": pr_info.description,
|
||||
"author": pr_info.author,
|
||||
"source_branch": pr_info.source_branch,
|
||||
"target_branch": pr_info.target_branch
|
||||
}
|
||||
|
||||
state["status"] = "pr_info_fetched"
|
||||
return state
|
||||
|
||||
except Exception as e:
|
||||
print(f"❌ ОШИБКА в fetch_pr_info: {e}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
state["error"] = str(e)
|
||||
state["status"] = "failed"
|
||||
return state
|
||||
|
||||
async def fetch_files(self, state: ReviewState) -> ReviewState:
|
||||
"""Fetch changed files in PR"""
|
||||
try:
|
||||
git_service = state["git_service"]
|
||||
|
||||
print("\n" + "📥"*40)
|
||||
print("ПОЛУЧЕНИЕ ФАЙЛОВ ИЗ PR")
|
||||
print("📥"*40)
|
||||
|
||||
# Get changed files
|
||||
files = await git_service.get_pr_files(state["pr_number"])
|
||||
|
||||
print(f"\n📊 Получено файлов из API: {len(files)}")
|
||||
for i, f in enumerate(files, 1):
|
||||
print(f"\n {i}. {f.filename}")
|
||||
print(f" Status: {f.status}")
|
||||
print(f" +{f.additions} -{f.deletions}")
|
||||
print(f" Patch: {'ДА' if f.patch else 'НЕТ'} ({len(f.patch) if f.patch else 0} символов)")
|
||||
if f.patch:
|
||||
print(f" Первые 200 символов patch:")
|
||||
print(f" {f.patch[:200]}...")
|
||||
|
||||
# Filter files that should be reviewed
|
||||
reviewable_files = []
|
||||
skipped_files = []
|
||||
|
||||
for f in files:
|
||||
if should_review_file(f.filename):
|
||||
reviewable_files.append({
|
||||
"path": f.filename,
|
||||
"status": f.status,
|
||||
"additions": f.additions,
|
||||
"deletions": f.deletions,
|
||||
"patch": f.patch,
|
||||
"language": detect_language(f.filename)
|
||||
})
|
||||
else:
|
||||
skipped_files.append(f.filename)
|
||||
|
||||
print(f"\n✅ Файлов для ревью: {len(reviewable_files)}")
|
||||
for rf in reviewable_files:
|
||||
print(f" - {rf['path']} ({rf['language']})")
|
||||
|
||||
if skipped_files:
|
||||
print(f"\n⏭️ Пропущено файлов: {len(skipped_files)}")
|
||||
for sf in skipped_files:
|
||||
print(f" - {sf}")
|
||||
|
||||
print("📥"*40 + "\n")
|
||||
|
||||
state["files"] = reviewable_files
|
||||
state["status"] = "files_fetched"
|
||||
|
||||
# Update review
|
||||
result = await self.db.execute(
|
||||
select(Review).where(Review.id == state["review_id"])
|
||||
)
|
||||
review = result.scalar_one()
|
||||
review.status = ReviewStatusEnum.ANALYZING
|
||||
await self.db.commit()
|
||||
|
||||
return state
|
||||
|
||||
except Exception as e:
|
||||
print(f"❌ ОШИБКА в fetch_files: {e}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
state["error"] = str(e)
|
||||
state["status"] = "failed"
|
||||
return state
|
||||
|
||||
async def analyze_files(self, state: ReviewState) -> ReviewState:
|
||||
"""Analyze files and generate comments"""
|
||||
try:
|
||||
all_comments = []
|
||||
|
||||
print("\n" + "🔬"*40)
|
||||
print("НАЧАЛО АНАЛИЗА ФАЙЛОВ")
|
||||
print("🔬"*40)
|
||||
print(f"Файлов для анализа: {len(state['files'])}")
|
||||
|
||||
for i, file_info in enumerate(state["files"], 1):
|
||||
file_path = file_info["path"]
|
||||
patch = file_info.get("patch")
|
||||
language = file_info.get("language", "text")
|
||||
|
||||
print(f"\n📂 Файл {i}/{len(state['files'])}: {file_path}")
|
||||
print(f" Язык: {language}")
|
||||
print(f" Размер patch: {len(patch) if patch else 0} символов")
|
||||
print(f" Additions: {file_info.get('additions')}, Deletions: {file_info.get('deletions')}")
|
||||
|
||||
if not patch or len(patch) < 10:
|
||||
print(f" ⚠️ ПРОПУСК: patch пустой или слишком маленький")
|
||||
continue
|
||||
|
||||
# Analyze diff with PR context
|
||||
pr_info = state.get("pr_info", {})
|
||||
comments = await self.analyzer.analyze_diff(
|
||||
file_path=file_path,
|
||||
diff=patch,
|
||||
language=language,
|
||||
pr_title=pr_info.get("title", ""),
|
||||
pr_description=pr_info.get("description", "")
|
||||
)
|
||||
|
||||
print(f" 💬 Получено комментариев: {len(comments)}")
|
||||
|
||||
# Add file path to each comment
|
||||
for comment in comments:
|
||||
comment["file_path"] = file_path
|
||||
all_comments.append(comment)
|
||||
|
||||
print(f"\n✅ ИТОГО комментариев: {len(all_comments)}")
|
||||
print("🔬"*40 + "\n")
|
||||
|
||||
state["comments"] = all_comments
|
||||
state["status"] = "analyzed"
|
||||
|
||||
# Update review
|
||||
result = await self.db.execute(
|
||||
select(Review).where(Review.id == state["review_id"])
|
||||
)
|
||||
review = result.scalar_one()
|
||||
review.files_analyzed = len(state["files"])
|
||||
review.status = ReviewStatusEnum.COMMENTING
|
||||
await self.db.commit()
|
||||
|
||||
return state
|
||||
|
||||
except Exception as e:
|
||||
print(f"❌ ОШИБКА в analyze_files: {e}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
state["error"] = str(e)
|
||||
state["status"] = "failed"
|
||||
return state
|
||||
|
||||
async def post_comments(self, state: ReviewState) -> ReviewState:
|
||||
"""Post comments to PR"""
|
||||
try:
|
||||
# Save comments to database
|
||||
result = await self.db.execute(
|
||||
select(Review).where(Review.id == state["review_id"])
|
||||
)
|
||||
review = result.scalar_one()
|
||||
|
||||
db_comments = []
|
||||
for comment_data in state["comments"]:
|
||||
# Strip <think> blocks from the message
|
||||
message = comment_data.get("message", "")
|
||||
message = self._remove_think_blocks(message)
|
||||
# Escape HTML-like tags so Markdown does not swallow them
|
||||
message = self._escape_html_in_text(message)
|
||||
|
||||
comment = Comment(
|
||||
review_id=review.id,
|
||||
file_path=comment_data["file_path"],
|
||||
line_number=comment_data.get("line", 1),
|
||||
content=message,
|
||||
severity=SeverityEnum(comment_data.get("severity", "INFO").lower()),
|
||||
posted=False
|
||||
)
|
||||
self.db.add(comment)
|
||||
db_comments.append({**comment_data, "message": message})
|
||||
|
||||
await self.db.commit()
|
||||
|
||||
# Post to Git platform
|
||||
git_service = state["git_service"]
|
||||
pr_info = state.get("pr_info", {})
|
||||
|
||||
# Generate summary
|
||||
summary = await self.analyzer.generate_summary(
|
||||
all_comments=db_comments,
|
||||
pr_title=pr_info.get("title", ""),
|
||||
pr_description=pr_info.get("description", "")
|
||||
)
|
||||
|
||||
# Strip <think> blocks from the summary
|
||||
summary = self._remove_think_blocks(summary)
|
||||
# Escape HTML-like tags in the summary
|
||||
summary = self._escape_html_in_text(summary)
|
||||
|
||||
if db_comments:
|
||||
# Format comments for API
|
||||
formatted_comments = [
|
||||
{
|
||||
"file_path": c["file_path"],
|
||||
"line_number": c.get("line", 1),
|
||||
"content": f"**{c.get('severity', 'INFO').upper()}**: {c.get('message', '')}"
|
||||
}
|
||||
for c in db_comments
|
||||
]
|
||||
|
||||
try:
|
||||
# Determine review status based on severity
|
||||
has_errors = any(c.get('severity', '').upper() == 'ERROR' for c in db_comments)
|
||||
event = "REQUEST_CHANGES" if has_errors else "COMMENT"
|
||||
|
||||
await git_service.create_review(
|
||||
pr_number=state["pr_number"],
|
||||
comments=formatted_comments,
|
||||
body=summary,
|
||||
event=event
|
||||
)
|
||||
|
||||
# Mark comments as posted
|
||||
result = await self.db.execute(
|
||||
select(Comment).where(Comment.review_id == review.id)
|
||||
)
|
||||
comments = result.scalars().all()
|
||||
for comment in comments:
|
||||
comment.posted = True
|
||||
await self.db.commit()
|
||||
|
||||
except Exception as e:
|
||||
print(f"Error posting comments to Git platform: {e}")
|
||||
# Continue even if posting fails
|
||||
else:
|
||||
# No issues found - approve PR
|
||||
try:
|
||||
await git_service.create_review(
|
||||
pr_number=state["pr_number"],
|
||||
comments=[],
|
||||
body=summary,
|
||||
event="APPROVE" # Approve if no issues
|
||||
)
|
||||
|
||||
except Exception as e:
|
||||
print(f"Error posting approval: {e}")
|
||||
|
||||
review.comments_generated = len(db_comments)
|
||||
await self.db.commit()
|
||||
|
||||
state["status"] = "commented"
|
||||
return state
|
||||
|
||||
except Exception as e:
|
||||
state["error"] = str(e)
|
||||
state["status"] = "failed"
|
||||
return state
|
||||
|
||||
async def complete_review(self, state: ReviewState) -> ReviewState:
|
||||
"""Complete the review"""
|
||||
try:
|
||||
result = await self.db.execute(
|
||||
select(Review).where(Review.id == state["review_id"])
|
||||
)
|
||||
review = result.scalar_one()
|
||||
|
||||
if state.get("error"):
|
||||
review.status = ReviewStatusEnum.FAILED
|
||||
review.error_message = state["error"]
|
||||
else:
|
||||
review.status = ReviewStatusEnum.COMPLETED
|
||||
|
||||
from datetime import datetime
|
||||
review.completed_at = datetime.utcnow()
|
||||
await self.db.commit()
|
||||
|
||||
state["status"] = "completed"
|
||||
return state
|
||||
|
||||
except Exception as e:
|
||||
state["error"] = str(e)
|
||||
state["status"] = "failed"
|
||||
return state
|
||||
|
||||
async def run_review(
|
||||
self,
|
||||
review_id: int,
|
||||
pr_number: int,
|
||||
repository_id: int
|
||||
) -> Dict[str, Any]:
|
||||
"""Run the review workflow"""
|
||||
initial_state: ReviewState = {
|
||||
"review_id": review_id,
|
||||
"pr_number": pr_number,
|
||||
"repository_id": repository_id,
|
||||
"status": "pending",
|
||||
"files": [],
|
||||
"analyzed_files": [],
|
||||
"comments": [],
|
||||
"error": None,
|
||||
"git_service": None
|
||||
}
|
||||
|
||||
final_state = await self.graph.ainvoke(initial_state)
|
||||
return final_state
|
||||
|
||||
backend/app/agents/tools.py (new file, 299 lines)
@@ -0,0 +1,299 @@
|
||||
"""Tools for the reviewer agent"""
|
||||
|
||||
import json
|
||||
import re
|
||||
from typing import List, Dict, Any, Optional
|
||||
from langchain_ollama import OllamaLLM
|
||||
from langchain_core.output_parsers import JsonOutputParser
|
||||
from langchain_core.prompts import PromptTemplate
|
||||
from app.agents.prompts import DIFF_REVIEW_PROMPT, CODE_REVIEW_PROMPT
|
||||
|
||||
|
||||
class CodeAnalyzer:
|
||||
"""Tool for analyzing code with Ollama"""
|
||||
|
||||
def __init__(self, ollama_base_url: str, model: str):
|
||||
self.llm = OllamaLLM(
|
||||
base_url=ollama_base_url,
|
||||
model=model,
|
||||
temperature=0.3,  # raised for a more attentive analysis
|
||||
format="json" # Форсируем JSON формат
|
||||
)
|
||||
# Use JsonOutputParser to guarantee JSON output
|
||||
self.json_parser = JsonOutputParser()
|
||||
|
||||
def _extract_json_from_response(self, response: str) -> Dict[str, Any]:
|
||||
"""Extract JSON from LLM response"""
|
||||
# Remove markdown code blocks if present
|
||||
response = response.strip()
|
||||
if response.startswith('```'):
|
||||
response = re.sub(r'^```(?:json)?\s*', '', response)
|
||||
response = re.sub(r'\s*```$', '', response)
|
||||
|
||||
# Try to find JSON in the response
|
||||
json_match = re.search(r'\{[\s\S]*\}', response)
|
||||
if json_match:
|
||||
try:
|
||||
json_str = json_match.group()
|
||||
print(f" 🔍 Найден JSON: {json_str[:200]}...")
|
||||
return json.loads(json_str)
|
||||
except json.JSONDecodeError as e:
|
||||
print(f" ❌ Ошибка парсинга JSON: {e}")
|
||||
print(f" 📄 JSON строка: {json_str[:500]}")
|
||||
else:
|
||||
print(f" ❌ JSON не найден в ответе!")
|
||||
print(f" 📄 Ответ: {response[:500]}")
|
||||
|
||||
# If no valid JSON found, return empty comments
|
||||
return {"comments": []}
|
||||
|
||||
async def generate_summary(
|
||||
self,
|
||||
all_comments: List[Dict[str, Any]],
|
||||
pr_title: str = "",
|
||||
pr_description: str = ""
|
||||
) -> str:
|
||||
"""Generate overall review summary in markdown"""
|
||||
if not all_comments:
|
||||
return """## 🤖 AI Code Review
|
||||
|
||||
✅ **Отличная работа!** Серьезных проблем не обнаружено.
|
||||
|
||||
Код выглядит хорошо и соответствует стандартам."""
|
||||
|
||||
# Group by severity
|
||||
errors = [c for c in all_comments if c.get('severity', '').upper() == 'ERROR']
|
||||
warnings = [c for c in all_comments if c.get('severity', '').upper() == 'WARNING']
|
||||
infos = [c for c in all_comments if c.get('severity', '').upper() == 'INFO']
|
||||
|
||||
summary = f"""## 🤖 AI Code Review
|
||||
|
||||
### 📊 Статистика
|
||||
|
||||
- **Всего проблем:** {len(all_comments)}
|
||||
"""
|
||||
|
||||
if errors:
|
||||
summary += f"- ❌ **Критичных:** {len(errors)}\n"
|
||||
if warnings:
|
||||
summary += f"- ⚠️ **Важных:** {len(warnings)}\n"
|
||||
if infos:
|
||||
summary += f"- ℹ️ **Рекомендаций:** {len(infos)}\n"
|
||||
|
||||
summary += "\n### 💡 Рекомендации\n\n"
|
||||
|
||||
if errors:
|
||||
summary += "⚠️ **Найдены критичные проблемы!** Пожалуйста, исправьте их перед мержем в main.\n\n"
|
||||
elif warnings:
|
||||
summary += "Найдены важные замечания. Рекомендуется исправить перед мержем.\n\n"
|
||||
else:
|
||||
summary += "Проблемы не критичны, но рекомендуется учесть.\n\n"
|
||||
|
||||
summary += "📝 **Детальные комментарии для каждой проблемы опубликованы ниже.**\n"
|
||||
|
||||
return summary
|
||||
|
||||
async def analyze_diff(
|
||||
self,
|
||||
file_path: str,
|
||||
diff: str,
|
||||
language: Optional[str] = None,
|
||||
pr_title: str = "",
|
||||
pr_description: str = ""
|
||||
) -> List[Dict[str, Any]]:
|
||||
"""Analyze code diff and return comments"""
|
||||
|
||||
if not diff or not diff.strip():
|
||||
print(f"⚠️ Пустой diff для {file_path}")
|
||||
return []
|
||||
|
||||
# Add PR context if available
|
||||
pr_context = ""
|
||||
if pr_title or pr_description:
|
||||
pr_context = f"\n\n**КОНТЕКСТ PR:**\n"
|
||||
if pr_title:
|
||||
pr_context += f"Название: {pr_title}\n"
|
||||
if pr_description:
|
||||
pr_context += f"Описание: {pr_description}\n"
|
||||
pr_context += "\nОБЯЗАТЕЛЬНО проверь: соответствует ли код описанию PR!\n"
|
||||
|
||||
# Get JSON format instructions from the parser
|
||||
format_instructions = self.json_parser.get_format_instructions()
|
||||
|
||||
prompt = DIFF_REVIEW_PROMPT.format(
|
||||
file_path=file_path,
|
||||
diff=diff,
|
||||
pr_context=pr_context,
|
||||
format_instructions=format_instructions
|
||||
)
|
||||
|
||||
print("\n" + "="*80)
|
||||
print(f"🔍 АНАЛИЗ ФАЙЛА: {file_path}")
|
||||
print("="*80)
|
||||
|
||||
if pr_title or pr_description:
|
||||
print(f"\n📋 КОНТЕКСТ PR:")
|
||||
print("-" * 80)
|
||||
if pr_title:
|
||||
print(f"Название: {pr_title}")
|
||||
if pr_description:
|
||||
desc_short = pr_description[:200] + ("..." if len(pr_description) > 200 else "")
|
||||
print(f"Описание: {desc_short}")
|
||||
print("-" * 80)
|
||||
|
||||
print(f"\n📝 DIFF ({len(diff)} символов):")
|
||||
print("-" * 80)
|
||||
# Show the first 800 characters of the diff
|
||||
print(diff[:800] + ("...\n[обрезано]" if len(diff) > 800 else ""))
|
||||
print("-" * 80)
|
||||
print(f"\n💭 ПРОМПТ ({len(prompt)} символов):")
|
||||
print("-" * 80)
|
||||
print(prompt[:500] + "...")
|
||||
print("-" * 80)
|
||||
|
||||
try:
|
||||
print(f"\n⏳ Отправка запроса к Ollama ({self.llm.model})...")
|
||||
|
||||
# Build a chain of the LLM and the JSON parser
|
||||
chain = self.llm | self.json_parser
|
||||
|
||||
# Get the result
|
||||
result = await chain.ainvoke(prompt)
|
||||
|
||||
print(f"\n🤖 ОТВЕТ AI (распарсен через JsonOutputParser):")
|
||||
print("-" * 80)
|
||||
print(json.dumps(result, ensure_ascii=False, indent=2)[:500] + "...")
|
||||
print("-" * 80)
|
||||
|
||||
comments = result.get("comments", [])
|
||||
|
||||
if comments:
|
||||
print(f"\n✅ Найдено комментариев: {len(comments)}")
|
||||
for i, comment in enumerate(comments, 1):
|
||||
print(f"\n {i}. Строка {comment.get('line', '?')}:")
|
||||
print(f" Severity: {comment.get('severity', '?')}")
|
||||
print(f" Message: {comment.get('message', '?')[:100]}...")
|
||||
else:
|
||||
print("\n⚠️ Комментариев не найдено! AI не нашел проблем.")
|
||||
|
||||
print("="*80 + "\n")
|
||||
|
||||
return comments
|
||||
|
||||
except Exception as e:
|
||||
print(f"\n❌ ОШИБКА при анализе {file_path}: {e}")
|
||||
print(f" Тип ошибки: {type(e).__name__}")
|
||||
import traceback
|
||||
traceback.print_exc()
|
||||
|
||||
# Fallback: try to extract JSON manually
|
||||
print("\n🔄 Попытка fallback парсинга...")
|
||||
try:
|
||||
if hasattr(e, 'args') and len(e.args) > 0:
|
||||
response_text = str(e.args[0])
|
||||
result = self._extract_json_from_response(response_text)
|
||||
return result.get("comments", [])
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
return []
|
||||
|
||||
async def analyze_code(
|
||||
self,
|
||||
file_path: str,
|
||||
code: str,
|
||||
language: str = "python",
|
||||
patch_info: str = ""
|
||||
) -> List[Dict[str, Any]]:
|
||||
"""Analyze full code content and return comments"""
|
||||
|
||||
if not code or not code.strip():
|
||||
return []
|
||||
|
||||
prompt = CODE_REVIEW_PROMPT.format(
|
||||
file_path=file_path,
|
||||
code=code,
|
||||
language=language,
|
||||
patch_info=patch_info
|
||||
)
|
||||
|
||||
try:
|
||||
response = await self.llm.ainvoke(prompt)
|
||||
result = self._extract_json_from_response(response)
|
||||
return result.get("comments", [])
|
||||
except Exception as e:
|
||||
print(f"Error analyzing code for {file_path}: {e}")
|
||||
return []
|
||||
|
||||
|
||||
def detect_language(file_path: str) -> str:
|
||||
"""Detect programming language from file extension"""
|
||||
extension_map = {
|
||||
'.py': 'python',
|
||||
'.js': 'javascript',
|
||||
'.ts': 'typescript',
|
||||
'.tsx': 'typescript',
|
||||
'.jsx': 'javascript',
|
||||
'.java': 'java',
|
||||
'.go': 'go',
|
||||
'.rs': 'rust',
|
||||
'.cpp': 'cpp',
|
||||
'.c': 'c',
|
||||
'.cs': 'csharp',
|
||||
'.php': 'php',
|
||||
'.rb': 'ruby',
|
||||
'.swift': 'swift',
|
||||
'.kt': 'kotlin',
|
||||
'.scala': 'scala',
|
||||
'.sh': 'bash',
|
||||
'.sql': 'sql',
|
||||
'.html': 'html',
|
||||
'.css': 'css',
|
||||
'.scss': 'scss',
|
||||
'.yaml': 'yaml',
|
||||
'.yml': 'yaml',
|
||||
'.json': 'json',
|
||||
'.xml': 'xml',
|
||||
'.md': 'markdown',
|
||||
}
|
||||
|
||||
ext = '.' + file_path.split('.')[-1] if '.' in file_path else ''
|
||||
return extension_map.get(ext.lower(), 'text')
|
||||
|
||||
|
||||
def should_review_file(file_path: str) -> bool:
|
||||
"""Determine if file should be reviewed"""
|
||||
# Skip binary, generated, and config files
|
||||
skip_extensions = {
|
||||
'.png', '.jpg', '.jpeg', '.gif', '.svg', '.ico',
|
||||
'.pdf', '.zip', '.tar', '.gz',
|
||||
'.lock', '.min.js', '.min.css',
|
||||
'.pyc', '.pyo', '.class', '.o',
|
||||
}
|
||||
|
||||
skip_patterns = [
|
||||
'node_modules/',
|
||||
'venv/',
|
||||
'.git/',
|
||||
'dist/',
|
||||
'build/',
|
||||
'__pycache__/',
|
||||
'.next/',
|
||||
'.nuxt/',
|
||||
'package-lock.json',
|
||||
'yarn.lock',
|
||||
'poetry.lock',
|
||||
]
|
||||
|
||||
# Check extension
|
||||
ext = '.' + file_path.split('.')[-1] if '.' in file_path else ''
|
||||
if ext.lower() in skip_extensions:
|
||||
return False
|
||||
|
||||
# Check patterns
|
||||
for pattern in skip_patterns:
|
||||
if pattern in file_path:
|
||||
return False
|
||||
|
||||
return True
|
||||
|
||||
backend/app/api/__init__.py (new file, 14 lines)
@@ -0,0 +1,14 @@
"""API endpoints"""

from fastapi import APIRouter

from app.api import repositories, reviews, webhooks

api_router = APIRouter()

api_router.include_router(repositories.router, prefix="/repositories", tags=["repositories"])
api_router.include_router(reviews.router, prefix="/reviews", tags=["reviews"])
api_router.include_router(webhooks.router, prefix="/webhooks", tags=["webhooks"])

__all__ = ["api_router"]
backend/app/api/repositories.py (new file, 419 lines)
@@ -0,0 +1,419 @@
|
||||
"""Repository management endpoints"""
|
||||
|
||||
import secrets
|
||||
from fastapi import APIRouter, Depends, HTTPException, BackgroundTasks
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from sqlalchemy import select, func
|
||||
from typing import List
|
||||
from cryptography.fernet import Fernet
|
||||
|
||||
from app.database import get_db
|
||||
from app.models import Repository
|
||||
from app.schemas.repository import (
|
||||
RepositoryCreate,
|
||||
RepositoryUpdate,
|
||||
RepositoryResponse,
|
||||
RepositoryList
|
||||
)
|
||||
from app.config import settings
|
||||
|
||||
router = APIRouter()
|
||||
|
||||
|
||||
def get_cipher():
|
||||
"""Get Fernet cipher for encryption"""
|
||||
# Use first 32 bytes of encryption key, base64 encoded
|
||||
key = settings.encryption_key.encode()[:32]
|
||||
# Pad to 32 bytes if needed
|
||||
key = key.ljust(32, b'0')
|
||||
# Base64 encode for Fernet
|
||||
import base64
|
||||
key_b64 = base64.urlsafe_b64encode(key)
|
||||
return Fernet(key_b64)
|
||||
|
||||
|
||||
def encrypt_token(token: str) -> str:
|
||||
"""Encrypt API token"""
|
||||
cipher = get_cipher()
|
||||
return cipher.encrypt(token.encode()).decode()
|
||||
|
||||
|
||||
def decrypt_token(encrypted_token: str) -> str:
|
||||
"""Decrypt API token"""
|
||||
cipher = get_cipher()
|
||||
return cipher.decrypt(encrypted_token.encode()).decode()
|
||||
|
||||
|
||||
@router.get("", response_model=RepositoryList)
|
||||
async def list_repositories(
|
||||
skip: int = 0,
|
||||
limit: int = 100,
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""List all repositories"""
|
||||
# Get total count
|
||||
count_result = await db.execute(select(func.count(Repository.id)))
|
||||
total = count_result.scalar()
|
||||
|
||||
# Get repositories
|
||||
result = await db.execute(
|
||||
select(Repository)
|
||||
.offset(skip)
|
||||
.limit(limit)
|
||||
.order_by(Repository.created_at.desc())
|
||||
)
|
||||
repositories = result.scalars().all()
|
||||
|
||||
# Add webhook URL to each repository
|
||||
items = []
|
||||
for repo in repositories:
|
||||
repo_dict = {
|
||||
"id": repo.id,
|
||||
"name": repo.name,
|
||||
"platform": repo.platform,
|
||||
"url": repo.url,
|
||||
"config": repo.config,
|
||||
"is_active": repo.is_active,
|
||||
"created_at": repo.created_at,
|
||||
"updated_at": repo.updated_at,
|
||||
"webhook_url": f"http://{settings.host}:{settings.port}/api/webhooks/{repo.platform.value}/{repo.id}"
|
||||
}
|
||||
items.append(RepositoryResponse(**repo_dict))
|
||||
|
||||
return RepositoryList(items=items, total=total)
|
||||
|
||||
|
||||
@router.post("", response_model=RepositoryResponse)
|
||||
async def create_repository(
|
||||
repository: RepositoryCreate,
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Create a new repository"""
|
||||
# Generate webhook secret if not provided
|
||||
webhook_secret = repository.webhook_secret or secrets.token_urlsafe(32)
|
||||
|
||||
# Encrypt the API token (if provided)
|
||||
encrypted_token = encrypt_token(repository.api_token) if repository.api_token else None
|
||||
|
||||
# Create repository
|
||||
db_repository = Repository(
|
||||
name=repository.name,
|
||||
platform=repository.platform,
|
||||
url=repository.url,
|
||||
api_token=encrypted_token,
|
||||
webhook_secret=webhook_secret,
|
||||
config=repository.config or {}
|
||||
)
|
||||
|
||||
db.add(db_repository)
|
||||
await db.commit()
|
||||
await db.refresh(db_repository)
|
||||
|
||||
# Prepare response
|
||||
webhook_url = f"http://{settings.host}:{settings.port}/api/webhooks/{db_repository.platform.value}/{db_repository.id}"
|
||||
|
||||
return RepositoryResponse(
|
||||
id=db_repository.id,
|
||||
name=db_repository.name,
|
||||
platform=db_repository.platform,
|
||||
url=db_repository.url,
|
||||
config=db_repository.config,
|
||||
is_active=db_repository.is_active,
|
||||
created_at=db_repository.created_at,
|
||||
updated_at=db_repository.updated_at,
|
||||
webhook_url=webhook_url
|
||||
)
|
||||
|
||||
|
||||
@router.get("/{repository_id}", response_model=RepositoryResponse)
|
||||
async def get_repository(
|
||||
repository_id: int,
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Get repository by ID"""
|
||||
result = await db.execute(
|
||||
select(Repository).where(Repository.id == repository_id)
|
||||
)
|
||||
repository = result.scalar_one_or_none()
|
||||
|
||||
if not repository:
|
||||
raise HTTPException(status_code=404, detail="Repository not found")
|
||||
|
||||
webhook_url = f"http://{settings.host}:{settings.port}/api/webhooks/{repository.platform.value}/{repository.id}"
|
||||
|
||||
return RepositoryResponse(
|
||||
id=repository.id,
|
||||
name=repository.name,
|
||||
platform=repository.platform,
|
||||
url=repository.url,
|
||||
config=repository.config,
|
||||
is_active=repository.is_active,
|
||||
created_at=repository.created_at,
|
||||
updated_at=repository.updated_at,
|
||||
webhook_url=webhook_url
|
||||
)
|
||||
|
||||
|
||||
@router.put("/{repository_id}", response_model=RepositoryResponse)
|
||||
async def update_repository(
|
||||
repository_id: int,
|
||||
repository_update: RepositoryUpdate,
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Update repository"""
|
||||
result = await db.execute(
|
||||
select(Repository).where(Repository.id == repository_id)
|
||||
)
|
||||
repository = result.scalar_one_or_none()
|
||||
|
||||
if not repository:
|
||||
raise HTTPException(status_code=404, detail="Repository not found")
|
||||
|
||||
# Update fields
|
||||
update_data = repository_update.model_dump(exclude_unset=True)
|
||||
|
||||
# Encrypt API token if provided and not empty
|
||||
if "api_token" in update_data and update_data["api_token"]:
|
||||
update_data["api_token"] = encrypt_token(update_data["api_token"])
|
||||
elif "api_token" in update_data and not update_data["api_token"]:
|
||||
# If empty string provided, don't update token
|
||||
del update_data["api_token"]
|
||||
|
||||
for field, value in update_data.items():
|
||||
setattr(repository, field, value)
|
||||
|
||||
await db.commit()
|
||||
await db.refresh(repository)
|
||||
|
||||
webhook_url = f"http://{settings.host}:{settings.port}/api/webhooks/{repository.platform.value}/{repository.id}"
|
||||
|
||||
return RepositoryResponse(
|
||||
id=repository.id,
|
||||
name=repository.name,
|
||||
platform=repository.platform,
|
||||
url=repository.url,
|
||||
config=repository.config,
|
||||
is_active=repository.is_active,
|
||||
created_at=repository.created_at,
|
||||
updated_at=repository.updated_at,
|
||||
webhook_url=webhook_url
|
||||
)
|
||||
|
||||
|
||||
@router.delete("/{repository_id}")
|
||||
async def delete_repository(
|
||||
repository_id: int,
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Delete repository"""
|
||||
result = await db.execute(
|
||||
select(Repository).where(Repository.id == repository_id)
|
||||
)
|
||||
repository = result.scalar_one_or_none()
|
||||
|
||||
if not repository:
|
||||
raise HTTPException(status_code=404, detail="Repository not found")
|
||||
|
||||
await db.delete(repository)
|
||||
await db.commit()
|
||||
|
||||
return {"message": "Repository deleted"}
|
||||
|
||||
|
||||
@router.post("/{repository_id}/scan")
|
||||
async def scan_repository(
|
||||
repository_id: int,
|
||||
background_tasks: BackgroundTasks,
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Scan repository for new pull requests and start reviews"""
|
||||
from app.models import PullRequest, Review
|
||||
from app.models.pull_request import PRStatusEnum
|
||||
from app.models.review import ReviewStatusEnum
|
||||
from app.services import GiteaService, GitHubService, BitbucketService
|
||||
from app.utils import decrypt_token
|
||||
|
||||
# Get repository
|
||||
result = await db.execute(
|
||||
select(Repository).where(Repository.id == repository_id)
|
||||
)
|
||||
repository = result.scalar_one_or_none()
|
||||
|
||||
if not repository:
|
||||
raise HTTPException(status_code=404, detail="Repository not found")
|
||||
|
||||
if not repository.is_active:
|
||||
raise HTTPException(status_code=400, detail="Repository is not active")
|
||||
|
||||
# Parse repository URL to get owner and name
|
||||
parts = repository.url.rstrip('/').split('/')
|
||||
repo_name = parts[-1].replace('.git', '')
|
||||
repo_owner = parts[-2]
|
||||
base_url = '/'.join(parts[:-2])
|
||||
|
||||
# Get appropriate Git service
|
||||
from app.config import settings
|
||||
|
||||
if repository.api_token:
|
||||
try:
|
||||
decrypted_token = decrypt_token(repository.api_token)
|
||||
except ValueError as e:
|
||||
raise HTTPException(status_code=400, detail=str(e))
|
||||
else:
|
||||
# Use the master token
|
||||
platform = repository.platform.value.lower()
|
||||
if platform == "gitea":
|
||||
decrypted_token = settings.master_gitea_token
|
||||
elif platform == "github":
|
||||
decrypted_token = settings.master_github_token
|
||||
elif platform == "bitbucket":
|
||||
decrypted_token = settings.master_bitbucket_token
|
||||
else:
|
||||
raise HTTPException(status_code=400, detail=f"Unsupported platform: {repository.platform}")
|
||||
|
||||
if not decrypted_token:
|
||||
raise HTTPException(
|
||||
status_code=400,
|
||||
detail=f"API токен не указан и мастер токен для {platform} не настроен"
|
||||
)
|
||||
|
||||
if repository.platform.value == "gitea":
|
||||
git_service = GiteaService(base_url, decrypted_token, repo_owner, repo_name)
|
||||
elif repository.platform.value == "github":
|
||||
git_service = GitHubService(base_url, decrypted_token, repo_owner, repo_name)
|
||||
elif repository.platform.value == "bitbucket":
|
||||
git_service = BitbucketService(base_url, decrypted_token, repo_owner, repo_name)
|
||||
else:
|
||||
raise HTTPException(status_code=400, detail=f"Unsupported platform: {repository.platform}")
|
||||
|
||||
try:
|
||||
# For Gitea, get list of open PRs
|
||||
import httpx
|
||||
if repository.platform.value == "gitea":
|
||||
url = f"{base_url}/api/v1/repos/{repo_owner}/{repo_name}/pulls"
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
url,
|
||||
headers={"Authorization": f"token {decrypted_token}"},
|
||||
params={"state": "open"}
|
||||
)
|
||||
response.raise_for_status()
|
||||
prs = response.json()
|
||||
elif repository.platform.value == "github":
|
||||
url = f"https://api.github.com/repos/{repo_owner}/{repo_name}/pulls"
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
url,
|
||||
headers={
|
||||
"Authorization": f"token {decrypted_token}",
|
||||
"Accept": "application/vnd.github.v3+json"
|
||||
},
|
||||
params={"state": "open"}
|
||||
)
|
||||
response.raise_for_status()
|
||||
prs = response.json()
|
||||
else:
|
||||
# Bitbucket
|
||||
url = f"https://api.bitbucket.org/2.0/repositories/{repo_owner}/{repo_name}/pullrequests"
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
url,
|
||||
headers={"Authorization": f"Bearer {decrypted_token}"},
|
||||
params={"state": "OPEN"}
|
||||
)
|
||||
response.raise_for_status()
|
||||
prs = response.json().get("values", [])
|
||||
|
||||
new_reviews = []
|
||||
|
||||
for pr_data in prs:
|
||||
# Get PR number based on platform
|
||||
if repository.platform.value == "bitbucket":
|
||||
pr_number = pr_data["id"]
|
||||
pr_title = pr_data["title"]
|
||||
pr_author = pr_data["author"]["display_name"]
|
||||
pr_url = pr_data["links"]["html"]["href"]
|
||||
source_branch = pr_data["source"]["branch"]["name"]
|
||||
target_branch = pr_data["destination"]["branch"]["name"]
|
||||
else:
|
||||
pr_number = pr_data["number"]
|
||||
pr_title = pr_data["title"]
|
||||
pr_author = pr_data["user"]["login"]
|
||||
pr_url = pr_data["html_url"]
|
||||
source_branch = pr_data["head"]["ref"]
|
||||
target_branch = pr_data["base"]["ref"]
|
||||
|
||||
# Check if PR already exists
|
||||
result = await db.execute(
|
||||
select(PullRequest).where(
|
||||
PullRequest.repository_id == repository.id,
|
||||
PullRequest.pr_number == pr_number
|
||||
)
|
||||
)
|
||||
pr = result.scalar_one_or_none()
|
||||
|
||||
if not pr:
|
||||
# Create new PR
|
||||
pr = PullRequest(
|
||||
repository_id=repository.id,
|
||||
pr_number=pr_number,
|
||||
title=pr_title,
|
||||
author=pr_author,
|
||||
source_branch=source_branch,
|
||||
target_branch=target_branch,
|
||||
url=pr_url,
|
||||
status=PRStatusEnum.OPEN
|
||||
)
|
||||
db.add(pr)
|
||||
await db.commit()
|
||||
await db.refresh(pr)
|
||||
|
||||
# Check if there's already a review for this PR
|
||||
result = await db.execute(
|
||||
select(Review).where(
|
||||
Review.pull_request_id == pr.id,
|
||||
Review.status.in_([
|
||||
ReviewStatusEnum.PENDING,
|
||||
ReviewStatusEnum.FETCHING,
|
||||
ReviewStatusEnum.ANALYZING,
|
||||
ReviewStatusEnum.COMMENTING
|
||||
])
|
||||
)
|
||||
)
|
||||
existing_review = result.scalar_one_or_none()
|
||||
|
||||
if not existing_review:
|
||||
# Create new review
|
||||
review = Review(
|
||||
pull_request_id=pr.id,
|
||||
status=ReviewStatusEnum.PENDING
|
||||
)
|
||||
db.add(review)
|
||||
await db.commit()
|
||||
await db.refresh(review)
|
||||
|
||||
# Start review in background
|
||||
from app.api.webhooks import start_review_task
|
||||
background_tasks.add_task(
|
||||
start_review_task,
|
||||
review.id,
|
||||
pr.pr_number,
|
||||
repository.id
|
||||
)
|
||||
|
||||
new_reviews.append({
|
||||
"review_id": review.id,
|
||||
"pr_number": pr.pr_number,
|
||||
"pr_title": pr.title
|
||||
})
|
||||
|
||||
return {
|
||||
"message": f"Found {len(prs)} open PR(s), started {len(new_reviews)} new review(s)",
|
||||
"total_prs": len(prs),
|
||||
"new_reviews": len(new_reviews),
|
||||
"reviews": new_reviews
|
||||
}
|
||||
|
||||
except Exception as e:
|
||||
raise HTTPException(status_code=500, detail=f"Error scanning repository: {str(e)}")
|
||||
|
||||
backend/app/api/reviews.py (new file, 218 lines)
@@ -0,0 +1,218 @@
|
||||
"""Review management endpoints"""
|
||||
|
||||
from fastapi import APIRouter, Depends, HTTPException, BackgroundTasks
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from sqlalchemy import select, func
|
||||
from sqlalchemy.orm import joinedload
|
||||
|
||||
from app.database import get_db
|
||||
from app.models import Review, Comment, PullRequest
|
||||
from app.schemas.review import ReviewResponse, ReviewList, ReviewStats, PullRequestInfo, CommentResponse
|
||||
from app.agents import ReviewerAgent
|
||||
|
||||
router = APIRouter()
|
||||
|
||||
|
||||
@router.get("", response_model=ReviewList)
|
||||
async def list_reviews(
|
||||
skip: int = 0,
|
||||
limit: int = 100,
|
||||
repository_id: int = None,
|
||||
status: str = None,
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""List all reviews with filters"""
|
||||
query = select(Review).options(joinedload(Review.pull_request))
|
||||
|
||||
# Apply filters
|
||||
if repository_id:
|
||||
query = query.join(PullRequest).where(PullRequest.repository_id == repository_id)
|
||||
|
||||
if status:
|
||||
query = query.where(Review.status == status)
|
||||
|
||||
# Get total count
|
||||
count_query = select(func.count(Review.id))
|
||||
if repository_id:
|
||||
count_query = count_query.join(PullRequest).where(PullRequest.repository_id == repository_id)
|
||||
if status:
|
||||
count_query = count_query.where(Review.status == status)
|
||||
|
||||
count_result = await db.execute(count_query)
|
||||
total = count_result.scalar()
|
||||
|
||||
# Get reviews
|
||||
query = query.offset(skip).limit(limit).order_by(Review.started_at.desc())
|
||||
result = await db.execute(query)
|
||||
reviews = result.scalars().all()
|
||||
|
||||
# Convert to response models
|
||||
items = []
|
||||
for review in reviews:
|
||||
pr_info = PullRequestInfo(
|
||||
id=review.pull_request.id,
|
||||
pr_number=review.pull_request.pr_number,
|
||||
title=review.pull_request.title,
|
||||
author=review.pull_request.author,
|
||||
source_branch=review.pull_request.source_branch,
|
||||
target_branch=review.pull_request.target_branch,
|
||||
url=review.pull_request.url
|
||||
)
|
||||
|
||||
items.append(ReviewResponse(
|
||||
id=review.id,
|
||||
pull_request_id=review.pull_request_id,
|
||||
pull_request=pr_info,
|
||||
status=review.status,
|
||||
started_at=review.started_at,
|
||||
completed_at=review.completed_at,
|
||||
files_analyzed=review.files_analyzed,
|
||||
comments_generated=review.comments_generated,
|
||||
error_message=review.error_message
|
||||
))
|
||||
|
||||
return ReviewList(items=items, total=total)
|
||||
|
||||
|
||||
@router.get("/{review_id}", response_model=ReviewResponse)
|
||||
async def get_review(
|
||||
review_id: int,
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Get review by ID with comments"""
|
||||
result = await db.execute(
|
||||
select(Review)
|
||||
.options(joinedload(Review.pull_request), joinedload(Review.comments))
|
||||
.where(Review.id == review_id)
|
||||
)
|
||||
review = result.unique().scalar_one_or_none()
|
||||
|
||||
if not review:
|
||||
raise HTTPException(status_code=404, detail="Review not found")
|
||||
|
||||
pr_info = PullRequestInfo(
|
||||
id=review.pull_request.id,
|
||||
pr_number=review.pull_request.pr_number,
|
||||
title=review.pull_request.title,
|
||||
author=review.pull_request.author,
|
||||
source_branch=review.pull_request.source_branch,
|
||||
target_branch=review.pull_request.target_branch,
|
||||
url=review.pull_request.url
|
||||
)
|
||||
|
||||
comments = [
|
||||
CommentResponse(
|
||||
id=comment.id,
|
||||
file_path=comment.file_path,
|
||||
line_number=comment.line_number,
|
||||
content=comment.content,
|
||||
severity=comment.severity,
|
||||
posted=comment.posted,
|
||||
posted_at=comment.posted_at,
|
||||
created_at=comment.created_at
|
||||
)
|
||||
for comment in review.comments
|
||||
]
|
||||
|
||||
return ReviewResponse(
|
||||
id=review.id,
|
||||
pull_request_id=review.pull_request_id,
|
||||
pull_request=pr_info,
|
||||
status=review.status,
|
||||
started_at=review.started_at,
|
||||
completed_at=review.completed_at,
|
||||
files_analyzed=review.files_analyzed,
|
||||
comments_generated=review.comments_generated,
|
||||
error_message=review.error_message,
|
||||
comments=comments
|
||||
)
|
||||
|
||||
|
||||
async def run_review_task(review_id: int, pr_number: int, repository_id: int):
    """Background task to run a review using its own database session"""
    # The request-scoped session is closed before background tasks execute,
    # so open a fresh session here (same approach as start_review_task in webhooks.py).
    from app.database import async_session_maker

    async with async_session_maker() as session:
        agent = ReviewerAgent(session)
        await agent.run_review(review_id, pr_number, repository_id)
|
||||
|
||||
|
||||
@router.post("/{review_id}/retry")
|
||||
async def retry_review(
|
||||
review_id: int,
|
||||
background_tasks: BackgroundTasks,
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Retry a failed review"""
|
||||
result = await db.execute(
|
||||
select(Review).options(joinedload(Review.pull_request)).where(Review.id == review_id)
|
||||
)
|
||||
review = result.scalar_one_or_none()
|
||||
|
||||
if not review:
|
||||
raise HTTPException(status_code=404, detail="Review not found")
|
||||
|
||||
# Reset review status
|
||||
from app.models.review import ReviewStatusEnum
|
||||
review.status = ReviewStatusEnum.PENDING
|
||||
review.error_message = None
|
||||
await db.commit()
|
||||
|
||||
# Run review in background
|
||||
background_tasks.add_task(
|
||||
run_review_task,
|
||||
review.id,
|
||||
review.pull_request.pr_number,
|
||||
review.pull_request.repository_id,
|
||||
|
||||
)
|
||||
|
||||
return {"message": "Review queued"}
|
||||
|
||||
|
||||
@router.get("/stats/dashboard", response_model=ReviewStats)
|
||||
async def get_review_stats(db: AsyncSession = Depends(get_db)):
|
||||
"""Get review statistics for dashboard"""
|
||||
# Total reviews
|
||||
total_result = await db.execute(select(func.count(Review.id)))
|
||||
total_reviews = total_result.scalar()
|
||||
|
||||
# Active reviews
|
||||
from app.models.review import ReviewStatusEnum
|
||||
active_result = await db.execute(
|
||||
select(func.count(Review.id)).where(
|
||||
Review.status.in_([
|
||||
ReviewStatusEnum.PENDING,
|
||||
ReviewStatusEnum.FETCHING,
|
||||
ReviewStatusEnum.ANALYZING,
|
||||
ReviewStatusEnum.COMMENTING
|
||||
])
|
||||
)
|
||||
)
|
||||
active_reviews = active_result.scalar()
|
||||
|
||||
# Completed reviews
|
||||
completed_result = await db.execute(
|
||||
select(func.count(Review.id)).where(Review.status == ReviewStatusEnum.COMPLETED)
|
||||
)
|
||||
completed_reviews = completed_result.scalar()
|
||||
|
||||
# Failed reviews
|
||||
failed_result = await db.execute(
|
||||
select(func.count(Review.id)).where(Review.status == ReviewStatusEnum.FAILED)
|
||||
)
|
||||
failed_reviews = failed_result.scalar()
|
||||
|
||||
# Total comments
|
||||
comments_result = await db.execute(select(func.count(Comment.id)))
|
||||
total_comments = comments_result.scalar()
|
||||
|
||||
# Average comments per review
|
||||
avg_comments = total_comments / total_reviews if total_reviews > 0 else 0
|
||||
|
||||
return ReviewStats(
|
||||
total_reviews=total_reviews,
|
||||
active_reviews=active_reviews,
|
||||
completed_reviews=completed_reviews,
|
||||
failed_reviews=failed_reviews,
|
||||
total_comments=total_comments,
|
||||
avg_comments_per_review=round(avg_comments, 2)
|
||||
)
|
||||
|
||||
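The review endpoints above are plain JSON over HTTP, so they can be exercised without the frontend. A minimal sketch (not part of this commit) that assumes the backend runs on `http://localhost:8000` and that a review with id 1 already exists:

```python
# Illustrative only. The review id and base URL are placeholders.
import asyncio

import httpx


async def main() -> None:
    async with httpx.AsyncClient(base_url="http://localhost:8000/api") as client:
        # Dashboard statistics
        stats = await client.get("/reviews/stats/dashboard")
        print(stats.json())

        # A single review with its comments
        review = await client.get("/reviews/1")
        print(review.json()["status"])

        # Re-queue a failed review
        retry = await client.post("/reviews/1/retry")
        print(retry.json())  # {"message": "Review queued"}


asyncio.run(main())
```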
110
backend/app/api/webhooks.py
Normal file
@@ -0,0 +1,110 @@
|
||||
"""Webhook endpoints"""
|
||||
|
||||
from fastapi import APIRouter, Depends, Request, Header, BackgroundTasks
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from typing import Optional
|
||||
|
||||
from app.database import get_db
|
||||
from app.schemas.webhook import GiteaWebhook, GitHubWebhook, BitbucketWebhook
|
||||
from app.webhooks import handle_gitea_webhook, handle_github_webhook, handle_bitbucket_webhook
|
||||
from app.agents import ReviewerAgent
|
||||
|
||||
router = APIRouter()
|
||||
|
||||
|
||||
async def start_review_task(review_id: int, pr_number: int, repository_id: int):
|
||||
"""Background task to start review"""
|
||||
from app.database import async_session_maker
|
||||
async with async_session_maker() as db:
|
||||
agent = ReviewerAgent(db)
|
||||
await agent.run_review(review_id, pr_number, repository_id)
|
||||
|
||||
|
||||
@router.post("/gitea/{repository_id}")
|
||||
async def gitea_webhook(
|
||||
repository_id: int,
|
||||
request: Request,
|
||||
background_tasks: BackgroundTasks,
|
||||
x_gitea_signature: Optional[str] = Header(None),
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Handle Gitea webhook"""
|
||||
raw_payload = await request.body()
|
||||
webhook_data = GiteaWebhook(**await request.json())
|
||||
|
||||
result = await handle_gitea_webhook(
|
||||
webhook_data=webhook_data,
|
||||
signature=x_gitea_signature or "",
|
||||
raw_payload=raw_payload,
|
||||
db=db
|
||||
)
|
||||
|
||||
# Start review in background if created
|
||||
if "review_id" in result:
|
||||
background_tasks.add_task(
|
||||
start_review_task,
|
||||
result["review_id"],
|
||||
webhook_data.number,
|
||||
repository_id
|
||||
)
|
||||
|
||||
return result
|
||||
|
||||
|
||||
@router.post("/github/{repository_id}")
|
||||
async def github_webhook(
|
||||
repository_id: int,
|
||||
request: Request,
|
||||
background_tasks: BackgroundTasks,
|
||||
x_hub_signature_256: Optional[str] = Header(None),
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Handle GitHub webhook"""
|
||||
raw_payload = await request.body()
|
||||
webhook_data = GitHubWebhook(**await request.json())
|
||||
|
||||
result = await handle_github_webhook(
|
||||
webhook_data=webhook_data,
|
||||
signature=x_hub_signature_256 or "",
|
||||
raw_payload=raw_payload,
|
||||
db=db
|
||||
)
|
||||
|
||||
# Start review in background if created
|
||||
if "review_id" in result:
|
||||
background_tasks.add_task(
|
||||
start_review_task,
|
||||
result["review_id"],
|
||||
webhook_data.number,
|
||||
repository_id
|
||||
)
|
||||
|
||||
return result
|
||||
|
||||
|
||||
@router.post("/bitbucket/{repository_id}")
|
||||
async def bitbucket_webhook(
|
||||
repository_id: int,
|
||||
request: Request,
|
||||
background_tasks: BackgroundTasks,
|
||||
db: AsyncSession = Depends(get_db)
|
||||
):
|
||||
"""Handle Bitbucket webhook"""
|
||||
webhook_data = BitbucketWebhook(**await request.json())
|
||||
|
||||
result = await handle_bitbucket_webhook(
|
||||
webhook_data=webhook_data,
|
||||
db=db
|
||||
)
|
||||
|
||||
# Start review in background if created
|
||||
if "review_id" in result:
|
||||
background_tasks.add_task(
|
||||
start_review_task,
|
||||
result["review_id"],
|
||||
webhook_data.pullrequest.id,
|
||||
repository_id
|
||||
)
|
||||
|
||||
return result
|
||||
|
||||
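The Gitea endpoint above forwards the raw body and the `X-Gitea-Signature` header to the handler, which verifies an HMAC-SHA256 hex digest over the payload. A sketch of replaying a captured delivery against a local instance; the file name, repository id and secret are placeholders, not part of this commit:

```python
# Illustrative only. Re-sends a saved Gitea delivery (delivery.json) to the
# local webhook endpoint for repository id=1, signed with the repository's
# webhook_secret so handle_gitea_webhook accepts it.
import hashlib
import hmac
from pathlib import Path

import httpx

secret = "my-webhook-secret"            # must match Repository.webhook_secret
raw = Path("delivery.json").read_bytes()
signature = hmac.new(secret.encode(), raw, hashlib.sha256).hexdigest()

response = httpx.post(
    "http://localhost:8000/api/webhooks/gitea/1",
    content=raw,
    headers={
        "Content-Type": "application/json",
        "X-Gitea-Signature": signature,
    },
)
print(response.status_code, response.json())
```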
61
backend/app/config.py
Normal file
@@ -0,0 +1,61 @@
|
||||
"""Application configuration"""
|
||||
|
||||
from pydantic_settings import BaseSettings, SettingsConfigDict
|
||||
from pydantic import field_validator
|
||||
from typing import List, Union
|
||||
import json
|
||||
|
||||
|
||||
class Settings(BaseSettings):
|
||||
"""Application settings"""
|
||||
|
||||
# Ollama
|
||||
ollama_base_url: str = "http://localhost:11434"
|
||||
ollama_model: str = "codellama:7b"
|
||||
|
||||
# Database
|
||||
database_url: str = "sqlite+aiosqlite:///./review.db"
|
||||
|
||||
# Security
|
||||
secret_key: str = "change-this-to-a-secure-random-string"
|
||||
encryption_key: str = "change-this-to-a-secure-random-string"
|
||||
|
||||
    # Master Git tokens (optional, used when the repository does not define its own)
|
||||
master_gitea_token: str = ""
|
||||
master_github_token: str = ""
|
||||
master_bitbucket_token: str = ""
|
||||
|
||||
# Server
|
||||
host: str = "0.0.0.0"
|
||||
port: int = 8000
|
||||
debug: bool = True
|
||||
|
||||
    # CORS - accepts either a comma-separated string or a JSON array
    cors_origins: Union[List[str], str] = "http://localhost:5173,http://localhost:3000"

    @field_validator('cors_origins', mode='before')
    @classmethod
    def parse_cors_origins(cls, v):
        if isinstance(v, str):
            # Comma-separated string
            if ',' in v:
                return [origin.strip() for origin in v.split(',')]
            # JSON array
            try:
                parsed = json.loads(v)
                if isinstance(parsed, list):
                    return parsed
            except (json.JSONDecodeError, ValueError):
                pass
            # Single origin string
            return [v.strip()]
        return v
|
||||
|
||||
model_config = SettingsConfigDict(
|
||||
env_file=".env",
|
||||
case_sensitive=False
|
||||
)
|
||||
|
||||
|
||||
settings = Settings()
|
||||
|
||||
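The `parse_cors_origins` validator normalises all three input forms to a list. A quick behavioural check (the origin values are made up for illustration):

```python
# Illustrative only - shows how Settings.parse_cors_origins normalises input.
from app.config import Settings

s1 = Settings(cors_origins="http://localhost:5173,http://localhost:3000")
print(s1.cors_origins)  # ['http://localhost:5173', 'http://localhost:3000']

s2 = Settings(cors_origins='["https://review.example.com"]')
print(s2.cors_origins)  # ['https://review.example.com']

s3 = Settings(cors_origins="http://localhost:5173")
print(s3.cors_origins)  # ['http://localhost:5173']
```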
42
backend/app/database.py
Normal file
@@ -0,0 +1,42 @@
|
||||
"""Database configuration and session management"""
|
||||
|
||||
from sqlalchemy.ext.asyncio import create_async_engine, AsyncSession, async_sessionmaker
|
||||
from sqlalchemy.orm import declarative_base
|
||||
from app.config import settings
|
||||
|
||||
# Create async engine
|
||||
engine = create_async_engine(
|
||||
settings.database_url,
|
||||
echo=settings.debug,
|
||||
future=True
|
||||
)
|
||||
|
||||
# Create async session factory
|
||||
async_session_maker = async_sessionmaker(
|
||||
engine,
|
||||
class_=AsyncSession,
|
||||
expire_on_commit=False
|
||||
)
|
||||
|
||||
# Base class for models
|
||||
Base = declarative_base()
|
||||
|
||||
|
||||
async def get_db() -> AsyncSession:
|
||||
"""Dependency for getting database session"""
|
||||
async with async_session_maker() as session:
|
||||
try:
|
||||
yield session
|
||||
await session.commit()
|
||||
except Exception:
|
||||
await session.rollback()
|
||||
raise
|
||||
finally:
|
||||
await session.close()
|
||||
|
||||
|
||||
async def init_db():
|
||||
"""Initialize database tables"""
|
||||
async with engine.begin() as conn:
|
||||
await conn.run_sync(Base.metadata.create_all)
|
||||
|
||||
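`get_db` is intended to be used as a FastAPI dependency, the same way the API modules above use it. A minimal sketch of a custom route built on it (the route itself is not part of this commit):

```python
# Illustrative only - a custom endpoint using the get_db dependency.
from fastapi import APIRouter, Depends
from sqlalchemy import func, select
from sqlalchemy.ext.asyncio import AsyncSession

from app.database import get_db
from app.models import Repository

router = APIRouter()


@router.get("/repositories/count")
async def count_repositories(db: AsyncSession = Depends(get_db)) -> dict:
    result = await db.execute(select(func.count(Repository.id)))
    return {"count": result.scalar()}
```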
119
backend/app/main.py
Normal file
@@ -0,0 +1,119 @@
|
||||
"""Main FastAPI application"""
|
||||
|
||||
from fastapi import FastAPI, WebSocket, WebSocketDisconnect
|
||||
from fastapi.middleware.cors import CORSMiddleware
|
||||
from contextlib import asynccontextmanager
|
||||
from typing import List
|
||||
import json
|
||||
|
||||
from app.config import settings
|
||||
from app.database import init_db
|
||||
from app.api import api_router
|
||||
|
||||
|
||||
class ConnectionManager:
|
||||
"""WebSocket connection manager"""
|
||||
|
||||
def __init__(self):
|
||||
self.active_connections: List[WebSocket] = []
|
||||
|
||||
async def connect(self, websocket: WebSocket):
|
||||
await websocket.accept()
|
||||
self.active_connections.append(websocket)
|
||||
|
||||
def disconnect(self, websocket: WebSocket):
|
||||
self.active_connections.remove(websocket)
|
||||
|
||||
async def broadcast(self, message: dict):
|
||||
"""Broadcast message to all connected clients"""
|
||||
for connection in self.active_connections:
|
||||
try:
|
||||
await connection.send_json(message)
|
||||
except Exception:
|
||||
pass
|
||||
|
||||
|
||||
# Create connection manager
|
||||
manager = ConnectionManager()
|
||||
|
||||
|
||||
@asynccontextmanager
|
||||
async def lifespan(app: FastAPI):
|
||||
"""Lifespan events"""
|
||||
# Startup
|
||||
await init_db()
|
||||
yield
|
||||
# Shutdown
|
||||
pass
|
||||
|
||||
|
||||
# Create FastAPI app
|
||||
app = FastAPI(
|
||||
title="AI Code Review Agent",
|
||||
description="AI агент для автоматического ревью Pull Request",
|
||||
version="0.1.0",
|
||||
lifespan=lifespan
|
||||
)
|
||||
|
||||
# Configure CORS
|
||||
app.add_middleware(
|
||||
CORSMiddleware,
|
||||
allow_origins=settings.cors_origins,
|
||||
allow_credentials=True,
|
||||
allow_methods=["*"],
|
||||
allow_headers=["*"],
|
||||
)
|
||||
|
||||
# Include API routes
|
||||
app.include_router(api_router, prefix="/api")
|
||||
|
||||
|
||||
@app.get("/")
|
||||
async def root():
|
||||
"""Root endpoint"""
|
||||
return {
|
||||
"message": "AI Code Review Agent API",
|
||||
"version": "0.1.0",
|
||||
"docs": "/docs"
|
||||
}
|
||||
|
||||
|
||||
@app.get("/health")
|
||||
async def health_check():
|
||||
"""Health check endpoint"""
|
||||
return {"status": "healthy"}
|
||||
|
||||
|
||||
@app.websocket("/ws/reviews")
|
||||
async def websocket_endpoint(websocket: WebSocket):
|
||||
"""WebSocket endpoint for real-time review updates"""
|
||||
await manager.connect(websocket)
|
||||
try:
|
||||
while True:
|
||||
# Keep connection alive
|
||||
data = await websocket.receive_text()
|
||||
# Echo back or handle client messages if needed
|
||||
await websocket.send_json({"type": "pong", "message": "connected"})
|
||||
except WebSocketDisconnect:
|
||||
manager.disconnect(websocket)
|
||||
|
||||
|
||||
async def broadcast_review_update(review_id: int, event_type: str, data: dict = None):
|
||||
"""Broadcast review update to all connected clients"""
|
||||
message = {
|
||||
"type": event_type,
|
||||
"review_id": review_id,
|
||||
"data": data or {}
|
||||
}
|
||||
await manager.broadcast(message)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
import uvicorn
|
||||
uvicorn.run(
|
||||
"app.main:app",
|
||||
host=settings.host,
|
||||
port=settings.port,
|
||||
reload=settings.debug
|
||||
)
|
||||
|
||||
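A minimal client for the `/ws/reviews` endpoint. This sketch uses the third-party `websockets` package, which is assumed here and not pinned by this commit; any text sent by the client keeps the server loop alive, and broadcast messages from `broadcast_review_update` arrive in between:

```python
# Illustrative only - listens for review update broadcasts.
import asyncio
import json

import websockets


async def listen() -> None:
    async with websockets.connect("ws://localhost:8000/ws/reviews") as ws:
        await ws.send("ping")  # server replies with {"type": "pong", ...}
        while True:
            message = json.loads(await ws.recv())
            print(message.get("type"), message.get("review_id"))


asyncio.run(listen())
```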
9
backend/app/models/__init__.py
Normal file
@@ -0,0 +1,9 @@
|
||||
"""Database models"""
|
||||
|
||||
from app.models.repository import Repository
|
||||
from app.models.pull_request import PullRequest
|
||||
from app.models.review import Review
|
||||
from app.models.comment import Comment
|
||||
|
||||
__all__ = ["Repository", "PullRequest", "Review", "Comment"]
|
||||
|
||||
39
backend/app/models/comment.py
Normal file
@@ -0,0 +1,39 @@
|
||||
"""Comment model"""
|
||||
|
||||
from sqlalchemy import Column, Integer, String, DateTime, ForeignKey, Boolean, Text, Enum
|
||||
from sqlalchemy.orm import relationship
|
||||
from sqlalchemy.sql import func
|
||||
from datetime import datetime
|
||||
import enum
|
||||
|
||||
from app.database import Base
|
||||
|
||||
|
||||
class SeverityEnum(str, enum.Enum):
|
||||
"""Comment severity levels"""
|
||||
INFO = "info"
|
||||
WARNING = "warning"
|
||||
ERROR = "error"
|
||||
|
||||
|
||||
class Comment(Base):
|
||||
"""Review comment model"""
|
||||
|
||||
__tablename__ = "comments"
|
||||
|
||||
id = Column(Integer, primary_key=True, index=True)
|
||||
review_id = Column(Integer, ForeignKey("reviews.id"), nullable=False)
|
||||
file_path = Column(String, nullable=False)
|
||||
line_number = Column(Integer, nullable=False)
|
||||
content = Column(Text, nullable=False)
|
||||
severity = Column(Enum(SeverityEnum), default=SeverityEnum.INFO)
|
||||
posted = Column(Boolean, default=False)
|
||||
posted_at = Column(DateTime, nullable=True)
|
||||
created_at = Column(DateTime, default=datetime.utcnow, server_default=func.now())
|
||||
|
||||
# Relationships
|
||||
review = relationship("Review", back_populates="comments")
|
||||
|
||||
def __repr__(self):
|
||||
return f"<Comment(id={self.id}, file={self.file_path}:{self.line_number}, severity={self.severity})>"
|
||||
|
||||
43
backend/app/models/pull_request.py
Normal file
@@ -0,0 +1,43 @@
|
||||
"""Pull Request model"""
|
||||
|
||||
from sqlalchemy import Column, Integer, String, DateTime, ForeignKey, Enum
|
||||
from sqlalchemy.orm import relationship
|
||||
from sqlalchemy.sql import func
|
||||
from datetime import datetime
|
||||
import enum
|
||||
|
||||
from app.database import Base
|
||||
|
||||
|
||||
class PRStatusEnum(str, enum.Enum):
|
||||
"""Pull Request status"""
|
||||
OPEN = "open"
|
||||
REVIEWING = "reviewing"
|
||||
REVIEWED = "reviewed"
|
||||
CLOSED = "closed"
|
||||
|
||||
|
||||
class PullRequest(Base):
|
||||
"""Pull Request model"""
|
||||
|
||||
__tablename__ = "pull_requests"
|
||||
|
||||
id = Column(Integer, primary_key=True, index=True)
|
||||
repository_id = Column(Integer, ForeignKey("repositories.id"), nullable=False)
|
||||
pr_number = Column(Integer, nullable=False)
|
||||
title = Column(String, nullable=False)
|
||||
author = Column(String, nullable=False)
|
||||
source_branch = Column(String, nullable=False)
|
||||
target_branch = Column(String, nullable=False)
|
||||
status = Column(Enum(PRStatusEnum), default=PRStatusEnum.OPEN)
|
||||
url = Column(String, nullable=False)
|
||||
created_at = Column(DateTime, default=datetime.utcnow, server_default=func.now())
|
||||
updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow, server_default=func.now())
|
||||
|
||||
# Relationships
|
||||
repository = relationship("Repository", back_populates="pull_requests")
|
||||
reviews = relationship("Review", back_populates="pull_request", cascade="all, delete-orphan")
|
||||
|
||||
def __repr__(self):
|
||||
return f"<PullRequest(id={self.id}, pr_number={self.pr_number}, title={self.title})>"
|
||||
|
||||
40
backend/app/models/repository.py
Normal file
@@ -0,0 +1,40 @@
|
||||
"""Repository model"""
|
||||
|
||||
from sqlalchemy import Column, Integer, String, Boolean, DateTime, JSON, Enum
|
||||
from sqlalchemy.orm import relationship
|
||||
from sqlalchemy.sql import func
|
||||
from datetime import datetime
|
||||
import enum
|
||||
|
||||
from app.database import Base
|
||||
|
||||
|
||||
class PlatformEnum(str, enum.Enum):
|
||||
"""Git platform types"""
|
||||
GITEA = "gitea"
|
||||
GITHUB = "github"
|
||||
BITBUCKET = "bitbucket"
|
||||
|
||||
|
||||
class Repository(Base):
|
||||
"""Repository model for tracking Git repositories"""
|
||||
|
||||
__tablename__ = "repositories"
|
||||
|
||||
id = Column(Integer, primary_key=True, index=True)
|
||||
name = Column(String, nullable=False)
|
||||
platform = Column(Enum(PlatformEnum), nullable=False)
|
||||
url = Column(String, nullable=False)
|
||||
api_token = Column(String, nullable=True) # Encrypted, optional (uses master token if not set)
|
||||
webhook_secret = Column(String, nullable=False)
|
||||
config = Column(JSON, default=dict) # Review configuration
|
||||
is_active = Column(Boolean, default=True)
|
||||
created_at = Column(DateTime, default=datetime.utcnow, server_default=func.now())
|
||||
updated_at = Column(DateTime, default=datetime.utcnow, onupdate=datetime.utcnow, server_default=func.now())
|
||||
|
||||
# Relationships
|
||||
pull_requests = relationship("PullRequest", back_populates="repository", cascade="all, delete-orphan")
|
||||
|
||||
def __repr__(self):
|
||||
return f"<Repository(id={self.id}, name={self.name}, platform={self.platform})>"
|
||||
|
||||
43
backend/app/models/review.py
Normal file
@@ -0,0 +1,43 @@
|
||||
"""Review model"""
|
||||
|
||||
from sqlalchemy import Column, Integer, String, DateTime, ForeignKey, Enum
|
||||
from sqlalchemy.orm import relationship
|
||||
from sqlalchemy.sql import func
|
||||
from datetime import datetime
|
||||
import enum
|
||||
from typing import Optional
|
||||
|
||||
from app.database import Base
|
||||
|
||||
|
||||
class ReviewStatusEnum(str, enum.Enum):
|
||||
"""Review status"""
|
||||
PENDING = "pending"
|
||||
FETCHING = "fetching"
|
||||
ANALYZING = "analyzing"
|
||||
COMMENTING = "commenting"
|
||||
COMPLETED = "completed"
|
||||
FAILED = "failed"
|
||||
|
||||
|
||||
class Review(Base):
|
||||
"""Code review model"""
|
||||
|
||||
__tablename__ = "reviews"
|
||||
|
||||
id = Column(Integer, primary_key=True, index=True)
|
||||
pull_request_id = Column(Integer, ForeignKey("pull_requests.id"), nullable=False)
|
||||
status = Column(Enum(ReviewStatusEnum), default=ReviewStatusEnum.PENDING)
|
||||
started_at = Column(DateTime, default=datetime.utcnow, server_default=func.now())
|
||||
completed_at = Column(DateTime, nullable=True)
|
||||
files_analyzed = Column(Integer, default=0)
|
||||
comments_generated = Column(Integer, default=0)
|
||||
error_message = Column(String, nullable=True)
|
||||
|
||||
# Relationships
|
||||
pull_request = relationship("PullRequest", back_populates="reviews")
|
||||
comments = relationship("Comment", back_populates="review", cascade="all, delete-orphan")
|
||||
|
||||
def __repr__(self):
|
||||
return f"<Review(id={self.id}, status={self.status}, pr_id={self.pull_request_id})>"
|
||||
|
||||
32
backend/app/schemas/__init__.py
Normal file
@@ -0,0 +1,32 @@
|
||||
"""Pydantic schemas for API"""
|
||||
|
||||
from app.schemas.repository import (
|
||||
RepositoryCreate,
|
||||
RepositoryUpdate,
|
||||
RepositoryResponse,
|
||||
RepositoryList
|
||||
)
|
||||
from app.schemas.review import (
|
||||
ReviewResponse,
|
||||
ReviewList,
|
||||
CommentResponse
|
||||
)
|
||||
from app.schemas.webhook import (
|
||||
GiteaWebhook,
|
||||
GitHubWebhook,
|
||||
BitbucketWebhook
|
||||
)
|
||||
|
||||
__all__ = [
|
||||
"RepositoryCreate",
|
||||
"RepositoryUpdate",
|
||||
"RepositoryResponse",
|
||||
"RepositoryList",
|
||||
"ReviewResponse",
|
||||
"ReviewList",
|
||||
"CommentResponse",
|
||||
"GiteaWebhook",
|
||||
"GitHubWebhook",
|
||||
"BitbucketWebhook",
|
||||
]
|
||||
|
||||
49
backend/app/schemas/repository.py
Normal file
@@ -0,0 +1,49 @@
|
||||
"""Repository schemas"""
|
||||
|
||||
from pydantic import BaseModel, Field, HttpUrl
|
||||
from typing import Optional, Dict, Any, List
|
||||
from datetime import datetime
|
||||
from app.models.repository import PlatformEnum
|
||||
|
||||
|
||||
class RepositoryBase(BaseModel):
|
||||
"""Base repository schema"""
|
||||
name: str = Field(..., description="Repository name")
|
||||
platform: PlatformEnum = Field(..., description="Git platform")
|
||||
url: str = Field(..., description="Repository URL")
|
||||
config: Optional[Dict[str, Any]] = Field(default_factory=dict, description="Review configuration")
|
||||
|
||||
|
||||
class RepositoryCreate(RepositoryBase):
|
||||
"""Schema for creating repository"""
|
||||
api_token: Optional[str] = Field(None, description="API token for Git platform (optional, uses master token if not set)")
|
||||
webhook_secret: Optional[str] = Field(None, description="Webhook secret (generated if not provided)")
|
||||
|
||||
|
||||
class RepositoryUpdate(BaseModel):
|
||||
"""Schema for updating repository"""
|
||||
name: Optional[str] = None
|
||||
url: Optional[str] = None
|
||||
api_token: Optional[str] = None
|
||||
webhook_secret: Optional[str] = None
|
||||
config: Optional[Dict[str, Any]] = None
|
||||
is_active: Optional[bool] = None
|
||||
|
||||
|
||||
class RepositoryResponse(RepositoryBase):
|
||||
"""Schema for repository response"""
|
||||
id: int
|
||||
is_active: bool
|
||||
created_at: datetime
|
||||
updated_at: datetime
|
||||
webhook_url: str = Field(..., description="Webhook URL for this repository")
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
|
||||
|
||||
class RepositoryList(BaseModel):
|
||||
"""Schema for repository list response"""
|
||||
items: List[RepositoryResponse]
|
||||
total: int
|
||||
|
||||
70
backend/app/schemas/review.py
Normal file
@@ -0,0 +1,70 @@
|
||||
"""Review schemas"""
|
||||
|
||||
from pydantic import BaseModel, Field
|
||||
from typing import Optional, List
|
||||
from datetime import datetime
|
||||
from app.models.review import ReviewStatusEnum
|
||||
from app.models.comment import SeverityEnum
|
||||
|
||||
|
||||
class CommentResponse(BaseModel):
|
||||
"""Schema for comment response"""
|
||||
id: int
|
||||
file_path: str
|
||||
line_number: int
|
||||
content: str
|
||||
severity: SeverityEnum
|
||||
posted: bool
|
||||
posted_at: Optional[datetime] = None
|
||||
created_at: datetime
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
|
||||
|
||||
class PullRequestInfo(BaseModel):
|
||||
"""Schema for pull request information"""
|
||||
id: int
|
||||
pr_number: int
|
||||
title: str
|
||||
author: str
|
||||
source_branch: str
|
||||
target_branch: str
|
||||
url: str
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
|
||||
|
||||
class ReviewResponse(BaseModel):
|
||||
"""Schema for review response"""
|
||||
id: int
|
||||
pull_request_id: int
|
||||
pull_request: PullRequestInfo
|
||||
status: ReviewStatusEnum
|
||||
started_at: datetime
|
||||
completed_at: Optional[datetime] = None
|
||||
files_analyzed: int
|
||||
comments_generated: int
|
||||
error_message: Optional[str] = None
|
||||
comments: Optional[List[CommentResponse]] = None
|
||||
|
||||
class Config:
|
||||
from_attributes = True
|
||||
|
||||
|
||||
class ReviewList(BaseModel):
|
||||
"""Schema for review list response"""
|
||||
items: List[ReviewResponse]
|
||||
total: int
|
||||
|
||||
|
||||
class ReviewStats(BaseModel):
|
||||
"""Schema for review statistics"""
|
||||
total_reviews: int
|
||||
active_reviews: int
|
||||
completed_reviews: int
|
||||
failed_reviews: int
|
||||
total_comments: int
|
||||
avg_comments_per_review: float
|
||||
|
||||
68
backend/app/schemas/webhook.py
Normal file
@@ -0,0 +1,68 @@
|
||||
"""Webhook payload schemas"""
|
||||
|
||||
from pydantic import BaseModel, Field
|
||||
from typing import Optional, Dict, Any
|
||||
|
||||
|
||||
class GiteaPullRequest(BaseModel):
|
||||
"""Gitea pull request data"""
|
||||
id: int
|
||||
number: int
|
||||
title: str
|
||||
body: Optional[str] = None
|
||||
state: str
|
||||
user: Dict[str, Any]
|
||||
head: Dict[str, Any]
|
||||
base: Dict[str, Any]
|
||||
html_url: str
|
||||
|
||||
|
||||
class GiteaWebhook(BaseModel):
|
||||
"""Gitea webhook payload"""
|
||||
action: str = Field(..., description="Action type: opened, synchronized, closed, etc.")
|
||||
number: int = Field(..., description="Pull request number")
|
||||
pull_request: GiteaPullRequest
|
||||
repository: Dict[str, Any]
|
||||
sender: Dict[str, Any]
|
||||
|
||||
|
||||
class GitHubPullRequest(BaseModel):
|
||||
"""GitHub pull request data"""
|
||||
id: int
|
||||
number: int
|
||||
title: str
|
||||
body: Optional[str] = None
|
||||
state: str
|
||||
user: Dict[str, Any]
|
||||
head: Dict[str, Any]
|
||||
base: Dict[str, Any]
|
||||
html_url: str
|
||||
|
||||
|
||||
class GitHubWebhook(BaseModel):
|
||||
"""GitHub webhook payload"""
|
||||
action: str
|
||||
number: int
|
||||
pull_request: GitHubPullRequest
|
||||
repository: Dict[str, Any]
|
||||
sender: Dict[str, Any]
|
||||
|
||||
|
||||
class BitbucketPullRequest(BaseModel):
|
||||
"""Bitbucket pull request data"""
|
||||
id: int
|
||||
title: str
|
||||
description: Optional[str] = None
|
||||
state: str
|
||||
author: Dict[str, Any]
|
||||
source: Dict[str, Any]
|
||||
destination: Dict[str, Any]
|
||||
links: Dict[str, Any]
|
||||
|
||||
|
||||
class BitbucketWebhook(BaseModel):
|
||||
"""Bitbucket webhook payload"""
|
||||
pullrequest: BitbucketPullRequest
|
||||
repository: Dict[str, Any]
|
||||
actor: Dict[str, Any]
|
||||
|
||||
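The webhook schemas only require the fields the handlers actually read. A minimal payload accepted by `GiteaWebhook` looks like the sketch below; every value is a hypothetical example, not taken from this commit:

```python
# Illustrative only - a minimal payload that validates against GiteaWebhook.
from app.schemas.webhook import GiteaWebhook

payload = {
    "action": "opened",
    "number": 7,
    "pull_request": {
        "id": 101,
        "number": 7,
        "title": "Add retry logic",
        "state": "open",
        "user": {"login": "alice"},
        "head": {"ref": "feature/retry"},
        "base": {"ref": "main"},
        "html_url": "http://gitea.local/acme/demo/pulls/7",
    },
    "repository": {"html_url": "http://gitea.local/acme/demo"},
    "sender": {"login": "alice"},
}

webhook = GiteaWebhook(**payload)
print(webhook.pull_request.title)  # Add retry logic
```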
9
backend/app/services/__init__.py
Normal file
@@ -0,0 +1,9 @@
|
||||
"""Git platform services"""
|
||||
|
||||
from app.services.base import BaseGitService
|
||||
from app.services.gitea import GiteaService
|
||||
from app.services.github import GitHubService
|
||||
from app.services.bitbucket import BitbucketService
|
||||
|
||||
__all__ = ["BaseGitService", "GiteaService", "GitHubService", "BitbucketService"]
|
||||
|
||||
77
backend/app/services/base.py
Normal file
@@ -0,0 +1,77 @@
|
||||
"""Base service for Git platforms"""
|
||||
|
||||
from abc import ABC, abstractmethod
|
||||
from typing import List, Dict, Any, Optional
|
||||
from dataclasses import dataclass
|
||||
|
||||
|
||||
@dataclass
|
||||
class FileChange:
|
||||
"""Represents a changed file in PR"""
|
||||
filename: str
|
||||
status: str # added, modified, removed
|
||||
additions: int
|
||||
deletions: int
|
||||
patch: Optional[str] = None
|
||||
content: Optional[str] = None
|
||||
|
||||
|
||||
@dataclass
|
||||
class PRInfo:
|
||||
"""Pull request information"""
|
||||
number: int
|
||||
title: str
|
||||
description: str
|
||||
author: str
|
||||
source_branch: str
|
||||
target_branch: str
|
||||
url: str
|
||||
state: str
|
||||
|
||||
|
||||
class BaseGitService(ABC):
|
||||
"""Base class for Git platform services"""
|
||||
|
||||
def __init__(self, base_url: str, token: str, repo_owner: str, repo_name: str):
|
||||
self.base_url = base_url.rstrip("/")
|
||||
self.token = token
|
||||
self.repo_owner = repo_owner
|
||||
self.repo_name = repo_name
|
||||
|
||||
@abstractmethod
|
||||
async def get_pull_request(self, pr_number: int) -> PRInfo:
|
||||
"""Get pull request information"""
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
async def get_pr_files(self, pr_number: int) -> List[FileChange]:
|
||||
"""Get list of changed files in PR"""
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
async def get_file_content(self, file_path: str, ref: str) -> str:
|
||||
"""Get file content at specific ref"""
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
async def create_review_comment(
|
||||
self,
|
||||
pr_number: int,
|
||||
file_path: str,
|
||||
line_number: int,
|
||||
comment: str,
|
||||
commit_id: str
|
||||
) -> Dict[str, Any]:
|
||||
"""Create a review comment on PR"""
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
async def create_review(
|
||||
self,
|
||||
pr_number: int,
|
||||
comments: List[Dict[str, Any]],
|
||||
body: str = ""
|
||||
) -> Dict[str, Any]:
|
||||
"""Create a review with multiple comments"""
|
||||
pass
|
||||
|
||||
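`BaseGitService` gives the three platform services below a common interface, so callers can pick an implementation by `Repository.platform`. The commit does not ship such a factory; one possible shape, as a sketch:

```python
# Illustrative only - one way to select a concrete service per platform.
from app.models.repository import PlatformEnum
from app.services import BitbucketService, GiteaService, GitHubService
from app.services.base import BaseGitService

_SERVICES = {
    PlatformEnum.GITEA: GiteaService,
    PlatformEnum.GITHUB: GitHubService,
    PlatformEnum.BITBUCKET: BitbucketService,
}


def make_service(platform: PlatformEnum, base_url: str, token: str,
                 owner: str, name: str) -> BaseGitService:
    return _SERVICES[platform](base_url, token, owner, name)
```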
181
backend/app/services/bitbucket.py
Normal file
@@ -0,0 +1,181 @@
|
||||
"""Bitbucket API service"""
|
||||
|
||||
import httpx
|
||||
from typing import List, Dict, Any
|
||||
from app.services.base import BaseGitService, FileChange, PRInfo
|
||||
|
||||
|
||||
class BitbucketService(BaseGitService):
|
||||
"""Service for interacting with Bitbucket API"""
|
||||
|
||||
def __init__(self, base_url: str, token: str, repo_owner: str, repo_name: str):
|
||||
# Bitbucket Cloud uses api.bitbucket.org
|
||||
super().__init__("https://api.bitbucket.org/2.0", token, repo_owner, repo_name)
|
||||
|
||||
def _get_headers(self) -> Dict[str, str]:
|
||||
"""Get headers for API requests"""
|
||||
return {
|
||||
"Authorization": f"Bearer {self.token}",
|
||||
"Content-Type": "application/json"
|
||||
}
|
||||
|
||||
def _get_repo_path(self) -> str:
|
||||
"""Get repository API path"""
|
||||
return f"{self.base_url}/repositories/{self.repo_owner}/{self.repo_name}"
|
||||
|
||||
async def get_pull_request(self, pr_number: int) -> PRInfo:
|
||||
"""Get pull request information from Bitbucket"""
|
||||
url = f"{self._get_repo_path()}/pullrequests/{pr_number}"
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(url, headers=self._get_headers())
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
|
||||
return PRInfo(
|
||||
number=data["id"],
|
||||
title=data["title"],
|
||||
description=data.get("description", ""),
|
||||
author=data["author"]["display_name"],
|
||||
source_branch=data["source"]["branch"]["name"],
|
||||
target_branch=data["destination"]["branch"]["name"],
|
||||
url=data["links"]["html"]["href"],
|
||||
state=data["state"]
|
||||
)
|
||||
|
||||
async def get_pr_files(self, pr_number: int) -> List[FileChange]:
|
||||
"""Get list of changed files in PR"""
|
||||
url = f"{self._get_repo_path()}/pullrequests/{pr_number}/diffstat"
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(url, headers=self._get_headers())
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
|
||||
changes = []
|
||||
for file in data.get("values", []):
|
||||
status = file.get("status", "modified")
|
||||
changes.append(FileChange(
|
||||
filename=file["new"]["path"] if file.get("new") else file["old"]["path"],
|
||||
status=status,
|
||||
additions=file.get("lines_added", 0),
|
||||
deletions=file.get("lines_removed", 0)
|
||||
))
|
||||
|
||||
return changes
|
||||
|
||||
async def get_file_content(self, file_path: str, ref: str) -> str:
|
||||
"""Get file content at specific ref"""
|
||||
url = f"{self._get_repo_path()}/src/{ref}/{file_path}"
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(url, headers=self._get_headers())
|
||||
response.raise_for_status()
|
||||
return response.text
|
||||
|
||||
async def create_review_comment(
|
||||
self,
|
||||
pr_number: int,
|
||||
file_path: str,
|
||||
line_number: int,
|
||||
comment: str,
|
||||
commit_id: str
|
||||
) -> Dict[str, Any]:
|
||||
"""Create a review comment on PR"""
|
||||
url = f"{self._get_repo_path()}/pullrequests/{pr_number}/comments"
|
||||
|
||||
payload = {
|
||||
"content": {
|
||||
"raw": comment
|
||||
},
|
||||
"inline": {
|
||||
"path": file_path,
|
||||
"to": line_number
|
||||
}
|
||||
}
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.post(
|
||||
url,
|
||||
headers=self._get_headers(),
|
||||
json=payload
|
||||
)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
    async def create_review(
        self,
        pr_number: int,
        comments: List[Dict[str, Any]],
        body: str = "",
        event: str = "COMMENT"
    ) -> Dict[str, Any]:
        """Create a review with a separate comment for each issue

        Args:
            pr_number: PR number
            comments: List of comments
            body: Overall review summary (markdown supported)
            event: Review event (unused, kept for interface compatibility)
        """
        print(f"\n📤 Publishing review to Bitbucket PR #{pr_number}")
        print(f"   Comments to publish: {len(comments)}")

        url = f"{self._get_repo_path()}/pullrequests/{pr_number}/comments"

        # 1. Post the overall summary first
        if body:
            print(f"\n   📝 Publishing review summary ({len(body)} characters)...")
            payload = {
                "content": {
                    "raw": body
                }
            }

            async with httpx.AsyncClient(timeout=30.0) as client:
                response = await client.post(
                    url,
                    headers=self._get_headers(),
                    json=payload
                )
                response.raise_for_status()
                print("   ✅ Summary published!")

        # 2. Then post every issue as its own comment
        if comments:
            print(f"\n   💬 Publishing {len(comments)} individual comments...")
            for i, comment in enumerate(comments, 1):
                severity_emoji = {
                    "ERROR": "❌",
                    "WARNING": "⚠️",
                    "INFO": "ℹ️"
                }.get(comment.get("severity", "INFO").upper(), "💬")

                # Bitbucket link to the changed line
                file_url = f"https://bitbucket.org/{self.repo_owner}/{self.repo_name}/pull-requests/{pr_number}/diff#{comment['file_path']}T{comment['line_number']}"

                # Format the comment body
                comment_body = f"{severity_emoji} **[`{comment['file_path']}:{comment['line_number']}`]({file_url})**\n\n"
                comment_body += f"**{comment.get('severity', 'INFO').upper()}**: {comment['content']}"

                payload = {
                    "content": {
                        "raw": comment_body
                    }
                }

                try:
                    async with httpx.AsyncClient(timeout=30.0) as client:
                        response = await client.post(
                            url,
                            headers=self._get_headers(),
                            json=payload
                        )
                        response.raise_for_status()
                        print(f"   ✅ {i}/{len(comments)}: {comment['file_path']}:{comment['line_number']}")
                except Exception as e:
                    print(f"   ❌ {i}/{len(comments)}: failed - {e}")

        print("\n   🎉 All comments published!")
        return {"summary": "posted", "comments_count": len(comments)}
|
||||
|
||||
228
backend/app/services/gitea.py
Normal file
@@ -0,0 +1,228 @@
|
||||
"""Gitea API service"""
|
||||
|
||||
import httpx
|
||||
from typing import List, Dict, Any, Optional
|
||||
from app.services.base import BaseGitService, FileChange, PRInfo
|
||||
|
||||
|
||||
class GiteaService(BaseGitService):
|
||||
"""Service for interacting with Gitea API"""
|
||||
|
||||
def _get_headers(self) -> Dict[str, str]:
|
||||
"""Get headers for API requests"""
|
||||
return {
|
||||
"Authorization": f"token {self.token}",
|
||||
"Content-Type": "application/json"
|
||||
}
|
||||
|
||||
def _get_repo_path(self) -> str:
|
||||
"""Get repository API path"""
|
||||
return f"{self.base_url}/api/v1/repos/{self.repo_owner}/{self.repo_name}"
|
||||
|
||||
async def get_pull_request(self, pr_number: int) -> PRInfo:
|
||||
"""Get pull request information from Gitea"""
|
||||
url = f"{self._get_repo_path()}/pulls/{pr_number}"
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(url, headers=self._get_headers())
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
|
||||
return PRInfo(
|
||||
number=data["number"],
|
||||
title=data["title"],
|
||||
description=data.get("body", ""),
|
||||
author=data["user"]["login"],
|
||||
source_branch=data["head"]["ref"],
|
||||
target_branch=data["base"]["ref"],
|
||||
url=data["html_url"],
|
||||
state=data["state"]
|
||||
)
|
||||
|
||||
    async def get_pr_files(self, pr_number: int) -> List[FileChange]:
        """Get list of changed files in PR"""
        url = f"{self._get_repo_path()}/pulls/{pr_number}/files"

        async with httpx.AsyncClient() as client:
            response = await client.get(url, headers=self._get_headers())
            response.raise_for_status()
            files_data = response.json()

            changes = []
            for file in files_data:
                patch = file.get("patch")

                # If the patch is missing, fall back to the .diff endpoint
                if not patch:
                    print(f"⚠️ Patch missing for {file['filename']}, falling back to the .diff endpoint")
                    try:
                        diff_url = f"{self._get_repo_path()}/pulls/{pr_number}.diff"
                        diff_response = await client.get(diff_url, headers=self._get_headers())
                        if diff_response.status_code == 200:
                            full_diff = diff_response.text
                            # Extract the diff for this specific file
                            patch = self._extract_file_diff(full_diff, file["filename"])
                            print(f"✅ Got diff via the .diff API ({len(patch) if patch else 0} characters)")
                    except Exception as e:
                        print(f"❌ Failed to fetch diff: {e}")

                changes.append(FileChange(
                    filename=file["filename"],
                    status=file["status"],
                    additions=file.get("additions", 0),
                    deletions=file.get("deletions", 0),
                    patch=patch
                ))

        return changes
|
||||
|
||||
    def _extract_file_diff(self, full_diff: str, filename: str) -> Optional[str]:
|
||||
"""Extract diff for specific file from full diff"""
|
||||
lines = full_diff.split('\n')
|
||||
file_diff = []
|
||||
in_file = False
|
||||
|
||||
for i, line in enumerate(lines):
|
||||
            # Start of the diff for this file
|
||||
if line.startswith('diff --git') and filename in line:
|
||||
in_file = True
|
||||
file_diff.append(line)
|
||||
continue
|
||||
|
||||
            # Next file starts - stop collecting
|
||||
if in_file and line.startswith('diff --git') and filename not in line:
|
||||
break
|
||||
|
||||
if in_file:
|
||||
file_diff.append(line)
|
||||
|
||||
return '\n'.join(file_diff) if file_diff else None
|
||||
|
||||
async def get_file_content(self, file_path: str, ref: str) -> str:
|
||||
"""Get file content at specific ref"""
|
||||
url = f"{self._get_repo_path()}/contents/{file_path}"
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
url,
|
||||
headers=self._get_headers(),
|
||||
params={"ref": ref}
|
||||
)
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
|
||||
# Gitea returns base64 encoded content
|
||||
import base64
|
||||
content = base64.b64decode(data["content"]).decode("utf-8")
|
||||
return content
|
||||
|
||||
async def get_pr_commits(self, pr_number: int) -> List[Dict[str, Any]]:
|
||||
"""Get commits in PR"""
|
||||
url = f"{self._get_repo_path()}/pulls/{pr_number}/commits"
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(url, headers=self._get_headers())
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
async def create_review_comment(
|
||||
self,
|
||||
pr_number: int,
|
||||
file_path: str,
|
||||
line_number: int,
|
||||
comment: str,
|
||||
commit_id: str
|
||||
) -> Dict[str, Any]:
|
||||
"""Create a review comment on PR"""
|
||||
url = f"{self._get_repo_path()}/pulls/{pr_number}/reviews"
|
||||
|
||||
payload = {
|
||||
"body": comment,
|
||||
"commit_id": commit_id,
|
||||
"comments": [{
|
||||
"path": file_path,
|
||||
"body": comment,
|
||||
"new_position": line_number
|
||||
}]
|
||||
}
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.post(
|
||||
url,
|
||||
headers=self._get_headers(),
|
||||
json=payload
|
||||
)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
    async def create_review(
        self,
        pr_number: int,
        comments: List[Dict[str, Any]],
        body: str = "",
        event: str = "COMMENT"
    ) -> Dict[str, Any]:
        """Create a review with a separate comment for each issue

        Args:
            pr_number: PR number
            comments: List of comments with file_path, line_number, content, severity
            body: Overall review summary (markdown supported)
            event: Review event (unused, kept for interface compatibility)

        Note: Gitea does not support inline comments through this API,
        so each issue is posted as its own PR comment.
        """
        print(f"\n📤 Publishing review to Gitea PR #{pr_number}")
        print(f"   Comments to publish: {len(comments)}")

        url = f"{self._get_repo_path()}/issues/{pr_number}/comments"

        # 1. Post the overall summary first
        if body:
            print(f"\n   📝 Publishing review summary ({len(body)} characters)...")
            payload = {"body": body}

            async with httpx.AsyncClient(timeout=30.0) as client:
                response = await client.post(
                    url,
                    headers=self._get_headers(),
                    json=payload
                )
                response.raise_for_status()
                print("   ✅ Summary published!")

        # 2. Then post every issue as its own comment
        if comments:
            print(f"\n   💬 Publishing {len(comments)} individual comments...")
            for i, comment in enumerate(comments, 1):
                severity_emoji = {
                    "ERROR": "❌",
                    "WARNING": "⚠️",
                    "INFO": "ℹ️"
                }.get(comment.get("severity", "INFO").upper(), "💬")

                # Link to the changed line
                file_url = f"{self.base_url}/{self.repo_owner}/{self.repo_name}/pulls/{pr_number}/files#L{comment['line_number']}"

                # Format the comment body
                comment_body = f"{severity_emoji} **[`{comment['file_path']}:{comment['line_number']}`]({file_url})**\n\n"
                comment_body += f"**{comment.get('severity', 'INFO').upper()}**: {comment['content']}"

                payload = {"body": comment_body}

                try:
                    async with httpx.AsyncClient(timeout=30.0) as client:
                        response = await client.post(
                            url,
                            headers=self._get_headers(),
                            json=payload
                        )
                        response.raise_for_status()
                        print(f"   ✅ {i}/{len(comments)}: {comment['file_path']}:{comment['line_number']}")
                except Exception as e:
                    print(f"   ❌ {i}/{len(comments)}: failed - {e}")

        print("\n   🎉 All comments published!")
        return {"summary": "posted", "comments_count": len(comments)}
|
||||
|
||||
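A usage sketch for `GiteaService`, tying the methods above together. The base URL, token, repository names and comment data are placeholders; the posted review follows the summary-plus-individual-comments flow implemented in `create_review`:

```python
# Illustrative only - fetches a PR and posts a review through GiteaService.
import asyncio

from app.services.gitea import GiteaService


async def main() -> None:
    service = GiteaService(
        base_url="http://gitea.local:3000",
        token="gitea-api-token",
        repo_owner="acme",
        repo_name="demo",
    )
    pr = await service.get_pull_request(7)
    files = await service.get_pr_files(7)
    print(pr.title, len(files), "files changed")

    await service.create_review(
        pr_number=7,
        comments=[{
            "file_path": "app/main.py",
            "line_number": 12,
            "content": "Consider handling the timeout here.",
            "severity": "warning",
        }],
        body="Automated review summary",
    )


asyncio.run(main())
```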
181
backend/app/services/github.py
Normal file
@@ -0,0 +1,181 @@
|
||||
"""GitHub API service"""
|
||||
|
||||
import httpx
|
||||
from typing import List, Dict, Any
|
||||
from app.services.base import BaseGitService, FileChange, PRInfo
|
||||
|
||||
|
||||
class GitHubService(BaseGitService):
|
||||
"""Service for interacting with GitHub API"""
|
||||
|
||||
def __init__(self, base_url: str, token: str, repo_owner: str, repo_name: str):
|
||||
# GitHub always uses api.github.com
|
||||
super().__init__("https://api.github.com", token, repo_owner, repo_name)
|
||||
|
||||
def _get_headers(self) -> Dict[str, str]:
|
||||
"""Get headers for API requests"""
|
||||
return {
|
||||
"Authorization": f"token {self.token}",
|
||||
"Accept": "application/vnd.github.v3+json",
|
||||
"Content-Type": "application/json"
|
||||
}
|
||||
|
||||
def _get_repo_path(self) -> str:
|
||||
"""Get repository API path"""
|
||||
return f"{self.base_url}/repos/{self.repo_owner}/{self.repo_name}"
|
||||
|
||||
async def get_pull_request(self, pr_number: int) -> PRInfo:
|
||||
"""Get pull request information from GitHub"""
|
||||
url = f"{self._get_repo_path()}/pulls/{pr_number}"
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(url, headers=self._get_headers())
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
|
||||
return PRInfo(
|
||||
number=data["number"],
|
||||
title=data["title"],
|
||||
description=data.get("body", ""),
|
||||
author=data["user"]["login"],
|
||||
source_branch=data["head"]["ref"],
|
||||
target_branch=data["base"]["ref"],
|
||||
url=data["html_url"],
|
||||
state=data["state"]
|
||||
)
|
||||
|
||||
async def get_pr_files(self, pr_number: int) -> List[FileChange]:
|
||||
"""Get list of changed files in PR"""
|
||||
url = f"{self._get_repo_path()}/pulls/{pr_number}/files"
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(url, headers=self._get_headers())
|
||||
response.raise_for_status()
|
||||
files_data = response.json()
|
||||
|
||||
changes = []
|
||||
for file in files_data:
|
||||
changes.append(FileChange(
|
||||
filename=file["filename"],
|
||||
status=file["status"],
|
||||
additions=file.get("additions", 0),
|
||||
deletions=file.get("deletions", 0),
|
||||
patch=file.get("patch")
|
||||
))
|
||||
|
||||
return changes
|
||||
|
||||
async def get_file_content(self, file_path: str, ref: str) -> str:
|
||||
"""Get file content at specific ref"""
|
||||
url = f"{self._get_repo_path()}/contents/{file_path}"
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.get(
|
||||
url,
|
||||
headers=self._get_headers(),
|
||||
params={"ref": ref}
|
||||
)
|
||||
response.raise_for_status()
|
||||
data = response.json()
|
||||
|
||||
# GitHub returns base64 encoded content
|
||||
import base64
|
||||
content = base64.b64decode(data["content"]).decode("utf-8")
|
||||
return content
|
||||
|
||||
async def create_review_comment(
|
||||
self,
|
||||
pr_number: int,
|
||||
file_path: str,
|
||||
line_number: int,
|
||||
comment: str,
|
||||
commit_id: str
|
||||
) -> Dict[str, Any]:
|
||||
"""Create a review comment on PR"""
|
||||
url = f"{self._get_repo_path()}/pulls/{pr_number}/comments"
|
||||
|
||||
payload = {
|
||||
"body": comment,
|
||||
"commit_id": commit_id,
|
||||
"path": file_path,
|
||||
"line": line_number,
|
||||
"side": "RIGHT"
|
||||
}
|
||||
|
||||
async with httpx.AsyncClient() as client:
|
||||
response = await client.post(
|
||||
url,
|
||||
headers=self._get_headers(),
|
||||
json=payload
|
||||
)
|
||||
response.raise_for_status()
|
||||
return response.json()
|
||||
|
||||
    async def create_review(
        self,
        pr_number: int,
        comments: List[Dict[str, Any]],
        body: str = "",
        event: str = "COMMENT"
    ) -> Dict[str, Any]:
        """Create a review with a separate comment for each issue

        Args:
            pr_number: PR number
            comments: List of comments
            body: Overall review summary (markdown supported)
            event: Review event (unused, kept for interface compatibility)
        """
        print(f"\n📤 Publishing review to GitHub PR #{pr_number}")
        print(f"   Comments to publish: {len(comments)}")

        url = f"{self._get_repo_path()}/issues/{pr_number}/comments"

        # 1. Post the overall summary first
        if body:
            print(f"\n   📝 Publishing review summary ({len(body)} characters)...")
            payload = {"body": body}

            async with httpx.AsyncClient(timeout=30.0) as client:
                response = await client.post(
                    url,
                    headers=self._get_headers(),
                    json=payload
                )
                response.raise_for_status()
                print("   ✅ Summary published!")

        # 2. Then post every issue as its own comment
        if comments:
            print(f"\n   💬 Publishing {len(comments)} individual comments...")
            for i, comment in enumerate(comments, 1):
                severity_emoji = {
                    "ERROR": "❌",
                    "WARNING": "⚠️",
                    "INFO": "ℹ️"
                }.get(comment.get("severity", "INFO").upper(), "💬")

                # GitHub link to the changed line
                file_url = f"https://github.com/{self.repo_owner}/{self.repo_name}/pull/{pr_number}/files#L{comment['line_number']}"

                # Format the comment body
                comment_body = f"{severity_emoji} **[`{comment['file_path']}:{comment['line_number']}`]({file_url})**\n\n"
                comment_body += f"**{comment.get('severity', 'INFO').upper()}**: {comment['content']}"

                payload = {"body": comment_body}

                try:
                    async with httpx.AsyncClient(timeout=30.0) as client:
                        response = await client.post(
                            url,
                            headers=self._get_headers(),
                            json=payload
                        )
                        response.raise_for_status()
                        print(f"   ✅ {i}/{len(comments)}: {comment['file_path']}:{comment['line_number']}")
                except Exception as e:
                    print(f"   ❌ {i}/{len(comments)}: failed - {e}")

        print("\n   🎉 All comments published!")
        return {"summary": "posted", "comments_count": len(comments)}
|
||||
|
||||
40
backend/app/utils.py
Normal file
@@ -0,0 +1,40 @@
|
||||
"""Utility functions"""
|
||||
|
||||
from cryptography.fernet import Fernet, InvalidToken
|
||||
from app.config import settings
|
||||
import base64
|
||||
|
||||
|
||||
def get_cipher():
|
||||
"""Get Fernet cipher for encryption"""
|
||||
# Use first 32 bytes of encryption key, base64 encoded
|
||||
key = settings.encryption_key.encode()[:32]
|
||||
# Pad to 32 bytes if needed
|
||||
key = key.ljust(32, b'0')
|
||||
# Base64 encode for Fernet
|
||||
key_b64 = base64.urlsafe_b64encode(key)
|
||||
return Fernet(key_b64)
|
||||
|
||||
|
||||
def encrypt_token(token: str) -> str:
|
||||
"""Encrypt API token"""
|
||||
cipher = get_cipher()
|
||||
return cipher.encrypt(token.encode()).decode()
|
||||
|
||||
|
||||
def decrypt_token(encrypted_token: str) -> str:
|
||||
"""Decrypt API token
|
||||
|
||||
Raises:
|
||||
ValueError: If token cannot be decrypted (wrong encryption key)
|
||||
"""
|
||||
cipher = get_cipher()
|
||||
try:
|
||||
return cipher.decrypt(encrypted_token.encode()).decode()
|
||||
except InvalidToken:
|
||||
raise ValueError(
|
||||
"Не удалось расшифровать API токен. "
|
||||
"Возможно, ключ шифрования (ENCRYPTION_KEY) был изменен. "
|
||||
"Пожалуйста, обновите API токен для этого репозитория."
|
||||
)
|
||||
|
||||
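A round-trip sketch for the helpers above (the token string is a placeholder; `ENCRYPTION_KEY` must be set in `.env`):

```python
# Illustrative only - encrypts and decrypts a placeholder token.
from app.utils import decrypt_token, encrypt_token

stored = encrypt_token("gitea-api-token")
print(stored)                 # opaque Fernet string, safe to persist
print(decrypt_token(stored))  # "gitea-api-token"
```

Note that `get_cipher` truncates or pads `ENCRYPTION_KEY` to 32 bytes before base64-encoding it for Fernet, so any reasonably long random string works as a key, but changing it later makes previously stored tokens undecryptable, which is exactly the `ValueError` raised above.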
8
backend/app/webhooks/__init__.py
Normal file
@@ -0,0 +1,8 @@
|
||||
"""Webhook handlers"""
|
||||
|
||||
from app.webhooks.gitea import handle_gitea_webhook
|
||||
from app.webhooks.github import handle_github_webhook
|
||||
from app.webhooks.bitbucket import handle_bitbucket_webhook
|
||||
|
||||
__all__ = ["handle_gitea_webhook", "handle_github_webhook", "handle_bitbucket_webhook"]
|
||||
|
||||
97
backend/app/webhooks/bitbucket.py
Normal file
@@ -0,0 +1,97 @@
|
||||
"""Bitbucket webhook handler"""
|
||||
|
||||
from fastapi import HTTPException
|
||||
from sqlalchemy.ext.asyncio import AsyncSession
|
||||
from sqlalchemy import select
|
||||
|
||||
from app.models import Repository, PullRequest, Review
|
||||
from app.models.pull_request import PRStatusEnum
|
||||
from app.models.review import ReviewStatusEnum
|
||||
from app.schemas.webhook import BitbucketWebhook
|
||||
|
||||
|
||||
async def handle_bitbucket_webhook(
|
||||
webhook_data: BitbucketWebhook,
|
||||
db: AsyncSession
|
||||
) -> dict:
|
||||
"""Handle Bitbucket webhook"""
|
||||
|
||||
# Find repository by URL
|
||||
repo_url = webhook_data.repository.get("links", {}).get("html", {}).get("href", "")
|
||||
result = await db.execute(
|
||||
select(Repository).where(Repository.url == repo_url)
|
||||
)
|
||||
repository = result.scalar_one_or_none()
|
||||
|
||||
if not repository:
|
||||
raise HTTPException(status_code=404, detail="Repository not found")
|
||||
|
||||
# Check if repository is active
|
||||
if not repository.is_active:
|
||||
return {"message": "Repository is not active"}
|
||||
|
||||
# Get PR state
|
||||
pr_state = webhook_data.pullrequest.state.lower()
|
||||
|
||||
# Handle PR events
|
||||
if pr_state in ["open", "opened"]:
|
||||
# Create or update PR
|
||||
result = await db.execute(
|
||||
select(PullRequest).where(
|
||||
PullRequest.repository_id == repository.id,
|
||||
PullRequest.pr_number == webhook_data.pullrequest.id
|
||||
)
|
||||
)
|
||||
pr = result.scalar_one_or_none()
|
||||
|
||||
if not pr:
|
||||
pr = PullRequest(
|
||||
repository_id=repository.id,
|
||||
pr_number=webhook_data.pullrequest.id,
|
||||
title=webhook_data.pullrequest.title,
|
||||
author=webhook_data.pullrequest.author.get("display_name", ""),
|
||||
source_branch=webhook_data.pullrequest.source.get("branch", {}).get("name", ""),
|
||||
target_branch=webhook_data.pullrequest.destination.get("branch", {}).get("name", ""),
|
||||
url=webhook_data.pullrequest.links.get("html", {}).get("href", ""),
|
||||
status=PRStatusEnum.OPEN
|
||||
)
|
||||
db.add(pr)
|
||||
await db.commit()
|
||||
await db.refresh(pr)
|
||||
else:
|
||||
pr.title = webhook_data.pullrequest.title
|
||||
pr.status = PRStatusEnum.OPEN
|
||||
await db.commit()
|
||||
|
||||
# Create review
|
||||
review = Review(
|
||||
pull_request_id=pr.id,
|
||||
status=ReviewStatusEnum.PENDING
|
||||
)
|
||||
db.add(review)
|
||||
await db.commit()
|
||||
await db.refresh(review)
|
||||
|
||||
return {
|
||||
"message": "Review created",
|
||||
"review_id": review.id,
|
||||
"pr_id": pr.id
|
||||
}
|
||||
|
||||
elif pr_state in ["closed", "merged", "declined"]:
|
||||
# Mark PR as closed
|
||||
result = await db.execute(
|
||||
select(PullRequest).where(
|
||||
PullRequest.repository_id == repository.id,
|
||||
PullRequest.pr_number == webhook_data.pullrequest.id
|
||||
)
|
||||
)
|
||||
pr = result.scalar_one_or_none()
|
||||
if pr:
|
||||
pr.status = PRStatusEnum.CLOSED
|
||||
await db.commit()
|
||||
|
||||
return {"message": "PR closed"}
|
||||
|
||||
return {"message": "Event not handled"}
|
||||
|
||||
116
backend/app/webhooks/gitea.py
Normal file
@@ -0,0 +1,116 @@
"""Gitea webhook handler"""

import hmac
import hashlib
from fastapi import HTTPException
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select

from app.models import Repository, PullRequest, Review
from app.models.pull_request import PRStatusEnum
from app.models.review import ReviewStatusEnum
from app.schemas.webhook import GiteaWebhook


def verify_gitea_signature(payload: bytes, signature: str, secret: str) -> bool:
    """Verify Gitea webhook signature"""
    if not signature:
        return False

    expected_signature = hmac.new(
        secret.encode(),
        payload,
        hashlib.sha256
    ).hexdigest()

    return hmac.compare_digest(signature, expected_signature)


async def handle_gitea_webhook(
    webhook_data: GiteaWebhook,
    signature: str,
    raw_payload: bytes,
    db: AsyncSession
) -> dict:
    """Handle Gitea webhook"""

    # Find repository by URL
    repo_url = webhook_data.repository.get("html_url", "")
    result = await db.execute(
        select(Repository).where(Repository.url == repo_url)
    )
    repository = result.scalar_one_or_none()

    if not repository:
        raise HTTPException(status_code=404, detail="Repository not found")

    # Verify signature
    if not verify_gitea_signature(raw_payload, signature, repository.webhook_secret):
        raise HTTPException(status_code=403, detail="Invalid signature")

    # Check if repository is active
    if not repository.is_active:
        return {"message": "Repository is not active"}

    # Handle PR events
    if webhook_data.action in ["opened", "synchronized", "reopened"]:
        # Create or update PR
        result = await db.execute(
            select(PullRequest).where(
                PullRequest.repository_id == repository.id,
                PullRequest.pr_number == webhook_data.number
            )
        )
        pr = result.scalar_one_or_none()

        if not pr:
            pr = PullRequest(
                repository_id=repository.id,
                pr_number=webhook_data.number,
                title=webhook_data.pull_request.title,
                author=webhook_data.pull_request.user.get("login", ""),
                source_branch=webhook_data.pull_request.head.get("ref", ""),
                target_branch=webhook_data.pull_request.base.get("ref", ""),
                url=webhook_data.pull_request.html_url,
                status=PRStatusEnum.OPEN
            )
            db.add(pr)
            await db.commit()
            await db.refresh(pr)
        else:
            pr.title = webhook_data.pull_request.title
            pr.status = PRStatusEnum.OPEN
            await db.commit()

        # Create review
        review = Review(
            pull_request_id=pr.id,
            status=ReviewStatusEnum.PENDING
        )
        db.add(review)
        await db.commit()
        await db.refresh(review)

        return {
            "message": "Review created",
            "review_id": review.id,
            "pr_id": pr.id
        }

    elif webhook_data.action == "closed":
        # Mark PR as closed
        result = await db.execute(
            select(PullRequest).where(
                PullRequest.repository_id == repository.id,
                PullRequest.pr_number == webhook_data.number
            )
        )
        pr = result.scalar_one_or_none()
        if pr:
            pr.status = PRStatusEnum.CLOSED
            await db.commit()

        return {"message": "PR closed"}

    return {"message": "Event not handled"}
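For local testing it can help to sign a request the same way `verify_gitea_signature` recomputes it. Below is a minimal sketch, not part of the repo: the endpoint path, repo id, secret value, and payload fields are assumptions made for illustration (the real `GiteaWebhook` schema may expect more fields).

```python
import hashlib
import hmac
import json

import httpx  # already listed in backend/requirements.txt

secret = "my-webhook-secret"  # assumption: must equal repository.webhook_secret in the DB
payload = json.dumps({
    "action": "opened",
    "number": 1,
    "repository": {"html_url": "http://localhost:3000/org/repo"},
    "pull_request": {
        "title": "Test PR",
        "html_url": "http://localhost:3000/org/repo/pulls/1",
        "user": {"login": "dev"},
        "head": {"ref": "feature"},
        "base": {"ref": "main"},
    },
}).encode()

# Same HMAC-SHA256 hex digest that verify_gitea_signature recomputes server-side
signature = hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()

response = httpx.post(
    "http://localhost:8000/api/webhooks/gitea/1",  # assumed route and repo id
    content=payload,
    headers={
        "Content-Type": "application/json",
        "X-Gitea-Signature": signature,  # header Gitea uses for its HMAC-SHA256 signature
    },
)
print(response.status_code, response.text)
```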
116
backend/app/webhooks/github.py
Normal file
116
backend/app/webhooks/github.py
Normal file
@@ -0,0 +1,116 @@
"""GitHub webhook handler"""

import hmac
import hashlib
from fastapi import HTTPException
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy import select

from app.models import Repository, PullRequest, Review
from app.models.pull_request import PRStatusEnum
from app.models.review import ReviewStatusEnum
from app.schemas.webhook import GitHubWebhook


def verify_github_signature(payload: bytes, signature: str, secret: str) -> bool:
    """Verify GitHub webhook signature"""
    if not signature or not signature.startswith("sha256="):
        return False

    expected_signature = "sha256=" + hmac.new(
        secret.encode(),
        payload,
        hashlib.sha256
    ).hexdigest()

    return hmac.compare_digest(signature, expected_signature)


async def handle_github_webhook(
    webhook_data: GitHubWebhook,
    signature: str,
    raw_payload: bytes,
    db: AsyncSession
) -> dict:
    """Handle GitHub webhook"""

    # Find repository by URL
    repo_url = webhook_data.repository.get("html_url", "")
    result = await db.execute(
        select(Repository).where(Repository.url == repo_url)
    )
    repository = result.scalar_one_or_none()

    if not repository:
        raise HTTPException(status_code=404, detail="Repository not found")

    # Verify signature
    if not verify_github_signature(raw_payload, signature, repository.webhook_secret):
        raise HTTPException(status_code=403, detail="Invalid signature")

    # Check if repository is active
    if not repository.is_active:
        return {"message": "Repository is not active"}

    # Handle PR events
    if webhook_data.action in ["opened", "synchronize", "reopened"]:
        # Create or update PR
        result = await db.execute(
            select(PullRequest).where(
                PullRequest.repository_id == repository.id,
                PullRequest.pr_number == webhook_data.number
            )
        )
        pr = result.scalar_one_or_none()

        if not pr:
            pr = PullRequest(
                repository_id=repository.id,
                pr_number=webhook_data.number,
                title=webhook_data.pull_request.title,
                author=webhook_data.pull_request.user.get("login", ""),
                source_branch=webhook_data.pull_request.head.get("ref", ""),
                target_branch=webhook_data.pull_request.base.get("ref", ""),
                url=webhook_data.pull_request.html_url,
                status=PRStatusEnum.OPEN
            )
            db.add(pr)
            await db.commit()
            await db.refresh(pr)
        else:
            pr.title = webhook_data.pull_request.title
            pr.status = PRStatusEnum.OPEN
            await db.commit()

        # Create review
        review = Review(
            pull_request_id=pr.id,
            status=ReviewStatusEnum.PENDING
        )
        db.add(review)
        await db.commit()
        await db.refresh(review)

        return {
            "message": "Review created",
            "review_id": review.id,
            "pr_id": pr.id
        }

    elif webhook_data.action == "closed":
        # Mark PR as closed
        result = await db.execute(
            select(PullRequest).where(
                PullRequest.repository_id == repository.id,
                PullRequest.pr_number == webhook_data.number
            )
        )
        pr = result.scalar_one_or_none()
        if pr:
            pr.status = PRStatusEnum.CLOSED
            await db.commit()

        return {"message": "PR closed"}

    return {"message": "Event not handled"}
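The GitHub variant differs from the Gitea one only in the `sha256=` prefix on the digest (GitHub sends it in the `X-Hub-Signature-256` header) and the `synchronize` action name. A matching test signature can be computed the same way; a short sketch with placeholder secret and payload:

```python
import hashlib
import hmac

secret = b"my-webhook-secret"     # placeholder; must match repository.webhook_secret
payload = b'{"action": "opened"}'  # raw request body bytes

# GitHub-style value: "sha256=" + hex digest, as verify_github_signature expects
signature = "sha256=" + hmac.new(secret, payload, hashlib.sha256).hexdigest()
print(signature)
```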
29
backend/requirements.txt
Normal file
29
backend/requirements.txt
Normal file
@@ -0,0 +1,29 @@
# FastAPI and dependencies
fastapi>=0.100.0
uvicorn[standard]>=0.23.0
python-multipart>=0.0.6
websockets>=11.0

# Database
sqlalchemy>=2.0.0
aiosqlite>=0.19.0

# LangChain/LangGraph - newer versions compatible with Python 3.13
langchain>=0.3.0
langchain-community>=0.3.0
langgraph>=0.2.0
langchain-ollama>=0.2.0

# HTTP client
httpx>=0.25.0

# Encryption - a version that ships wheels for Python 3.13
cryptography>=41.0.0

# Pydantic
pydantic>=2.5.0
pydantic-settings>=2.1.0

# Utilities
python-dotenv>=1.0.0
45
backend/start.bat
Normal file
45
backend/start.bat
Normal file
@@ -0,0 +1,45 @@
@echo off
REM AI Review Backend Start Script for Windows

echo 🚀 Starting AI Review Backend...

REM Check if venv exists
if not exist "venv" (
    echo 📦 Creating virtual environment...
    python -m venv venv
)

REM Activate venv
echo 🔧 Activating virtual environment...
call venv\Scripts\activate.bat

REM Install dependencies
echo 📥 Installing dependencies...
pip install -q -r requirements.txt

REM Check .env
if not exist ".env" (
    echo ⚠️ .env file not found!
    echo Creating .env from .env.example...
    copy .env.example .env
    echo.
    echo ⚠️ IMPORTANT: Edit .env and set SECRET_KEY and ENCRYPTION_KEY!
    echo.
    pause
)

REM Check Ollama
echo 🤖 Checking Ollama...
where ollama >nul 2>nul
if %ERRORLEVEL% NEQ 0 (
    echo ❌ Ollama not found! Please install from https://ollama.ai/
    pause
    exit /b 1
)

REM Start server
echo ✅ Starting server on http://localhost:8000
echo 📚 API docs: http://localhost:8000/docs
echo.
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
49
backend/start.sh
Normal file
49
backend/start.sh
Normal file
@@ -0,0 +1,49 @@
#!/bin/bash

# AI Review Backend Start Script

echo "🚀 Starting AI Review Backend..."

# Check if venv exists
if [ ! -d "venv" ]; then
    echo "📦 Creating virtual environment..."
    python3 -m venv venv
fi

# Activate venv
echo "🔧 Activating virtual environment..."
source venv/bin/activate

# Install dependencies
echo "📥 Installing dependencies..."
pip install -q -r requirements.txt

# Check .env
if [ ! -f ".env" ]; then
    echo "⚠️ .env file not found!"
    echo "Creating .env from .env.example..."
    cp .env.example .env
    echo ""
    echo "⚠️ IMPORTANT: Edit .env and set SECRET_KEY and ENCRYPTION_KEY!"
    echo ""
    read -p "Press Enter to continue..."
fi

# Check Ollama
echo "🤖 Checking Ollama..."
if ! command -v ollama &> /dev/null; then
    echo "❌ Ollama not found! Please install from https://ollama.ai/"
    exit 1
fi

if ! ollama list | grep -q "codellama"; then
    echo "📥 Pulling codellama model..."
    ollama pull codellama
fi

# Start server
echo "✅ Starting server on http://localhost:8000"
echo "📚 API docs: http://localhost:8000/docs"
echo ""
uvicorn app.main:app --reload --host 0.0.0.0 --port 8000
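Both start scripts warn that `SECRET_KEY` and `ENCRYPTION_KEY` must be set in `.env`. One way to generate random hex values of the same shape as the 64-character example keys (a sketch; whatever format the app actually requires for these keys is not checked here):

```python
# Prints two 64-character hex values suitable for pasting into .env
import secrets

print("SECRET_KEY=" + secrets.token_hex(32))
print("ENCRYPTION_KEY=" + secrets.token_hex(32))
```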