# Testing Strategy
This document outlines the testing approach, frameworks, and best practices for ensuring software quality across the platform.
## Overview
Testing is an integral part of the development process. Tests should be written alongside feature development to ensure comprehensive coverage and early defect detection.
## Frontend Testing

### Framework
| Tool | Purpose |
|---|---|
| Vitest | Unit, component, and integration testing |
### Testing Focus Areas

- **Unit Tests**: Business logic, utilities, and helper functions
- **Component Tests**: UI interactions, rendering, and user events
- **Integration Tests**: API communication, state management, and user flows
### Best Practices
- Write tests alongside feature development
- Test component behavior, not implementation details
- Mock external dependencies and API calls
- Maintain high coverage for critical business logic
### Example Test

```tsx
import { describe, it, expect } from 'vitest';
import { render, screen } from '@testing-library/react';
import { UserProfile } from './user-profile';

describe('UserProfile', () => {
  it('displays user name correctly', async () => {
    // Assumes the user fetch is mocked in test setup to resolve with "John Doe"
    render(<UserProfile userId="123" />);
    expect(await screen.findByText('John Doe')).toBeInTheDocument();
  });

  it('shows a loading indicator while fetching', () => {
    render(<UserProfile userId="123" />);
    expect(screen.getByRole('progressbar')).toBeInTheDocument();
  });
});
```
## Backend Testing

### Framework
| Tool | Purpose |
|---|---|
| pytest | Unit and integration testing |
### Testing Focus Areas

- **Unit Tests**: Service-layer logic, utility functions, and validators
- **Integration Tests**: API endpoints, database operations, and external service integrations
### Best Practices
- Write tests alongside feature development
- Use fixtures for test data setup
- Test both success and error paths
- Isolate tests from external dependencies using mocks
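The last practice above — isolating external dependencies — can be sketched with the standard library's `unittest.mock`. The service function and client below are illustrative stand-ins, not part of the actual codebase:

```python
from unittest.mock import Mock

# Hypothetical service-layer function; the name and the client interface
# are illustrative, not taken from the real application.
def summarize_user(api_client, user_id: str) -> str:
    """Build a display string from an external user-service response."""
    data = api_client.get_user(user_id)  # the external call we want to isolate
    return f"{data['name']} <{data['email']}>"

def test_summarize_user_uses_mocked_client():
    # Replace the external dependency with a mock so the test is fast,
    # deterministic, and runs without network access.
    client = Mock()
    client.get_user.return_value = {"name": "Ada", "email": "ada@example.com"}

    assert summarize_user(client, "user_123") == "Ada <ada@example.com>"
    client.get_user.assert_called_once_with("user_123")
```

Because the mock records every call, the test can also verify *how* the dependency was used, not just the final return value.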
### Example Test

```python
import pytest
import pytest_asyncio
from httpx import ASGITransport, AsyncClient

from app.main import app


@pytest_asyncio.fixture
async def client():
    # httpx >= 0.27 routes in-process ASGI requests through ASGITransport
    transport = ASGITransport(app=app)
    async with AsyncClient(transport=transport, base_url="http://test") as ac:
        yield ac


@pytest.mark.asyncio
async def test_generate_message(client):
    response = await client.post(
        "/api/v1/generate",
        json={
            "conversation_id": "conv_123",
            "user_id": "user_456",
            "content": "Hello, AI",
        },
    )
    assert response.status_code == 200
    assert "content" in response.json()
```
## Data Access Layer Tests

The data access layer requires thorough testing to ensure data integrity and security.

### Test Scenarios

Data-layer tests run as part of the frontend and backend unit suites, with an emphasis on data validation and access control.
| Scenario | Description |
|---|---|
| Conversation Creation | Verify conversations are created with correct metadata |
| Message Access | Validate messages are correctly stored and retrieved |
| User Association | Ensure messages are properly linked to conversations and users |
| Unauthorized Access | Confirm users cannot access others’ conversations |
| Data Persistence | Verify data survives application restarts |
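The "Data Persistence" scenario above can be sketched against a throwaway SQLite file standing in for the real database — the table and column names here are illustrative, not the actual schema. The key assertion is that a fresh connection opened after a simulated "restart" still sees previously committed rows:

```python
import os
import sqlite3
import tempfile

def test_data_survives_restart():
    path = os.path.join(tempfile.mkdtemp(), "test.db")

    # First "application run": create the schema and commit a row.
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE conversations (id TEXT PRIMARY KEY, title TEXT)")
    conn.execute("INSERT INTO conversations VALUES ('conv_123', 'Persisted')")
    conn.commit()
    conn.close()  # simulate the application shutting down

    # Second "run": a brand-new connection must still see the data.
    conn = sqlite3.connect(path)
    row = conn.execute(
        "SELECT title FROM conversations WHERE id = 'conv_123'"
    ).fetchone()
    conn.close()
    assert row == ("Persisted",)
```

The same shape applies to the real database: write through the data-access layer, tear down every connection, then reconnect and assert the data is still there.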
### Example Test Cases

```python
import pytest

# create_conversation, get_conversation, ForbiddenError, and the db_session
# fixture come from the application's data-access layer and conftest.py.

@pytest.mark.asyncio
async def test_create_conversation(db_session):
    """Test creating a new conversation."""
    conversation = await create_conversation(
        db_session,
        user_id="user_123",
        title="Test Conversation",
    )
    assert conversation.id is not None
    assert conversation.user_id == "user_123"


@pytest.mark.asyncio
async def test_unauthorized_access_blocked(db_session):
    """Test that users cannot access other users' conversations."""
    conversation = await create_conversation(
        db_session,
        user_id="user_123",
    )
    with pytest.raises(ForbiddenError):
        await get_conversation(
            db_session,
            conversation_id=conversation.id,
            user_id="user_456",  # Different user
        )
```
## Integration Testing

### End-to-End Test Scenarios

Integration tests validate complete user workflows across the system. Browser-driven end-to-end flows are orchestrated with Playwright, while API-level workflows are exercised with pytest, as in the example below.
| Scenario | Steps |
|---|---|
| User Login | Authenticate via Neon Auth, receive JWT token |
| Conversation Creation | Create new conversation, verify persistence |
| Message Exchange | Send message, receive AI response, validate storage |
### Example Workflow Test

```python
import pytest

# The client and auth_token fixtures come from the test suite's conftest.py.

@pytest.mark.asyncio
async def test_complete_conversation_flow(client, auth_token):
    """Test the complete conversation workflow."""
    headers = {"Authorization": f"Bearer {auth_token}"}

    # Create conversation
    create_response = await client.post(
        "/api/v1/conversations",
        headers=headers,
        json={"title": "Test Flow"},
    )
    assert create_response.status_code == 201
    conversation_id = create_response.json()["id"]

    # Send message
    message_response = await client.post(
        "/api/v1/generate",
        headers=headers,
        json={
            "conversation_id": conversation_id,
            "user_id": "test_user",
            "content": "Hello",
        },
    )
    assert message_response.status_code == 200

    # Verify history: at least the user message and the AI response
    history_response = await client.get(
        f"/api/v1/conversations/{conversation_id}/messages",
        headers=headers,
    )
    assert len(history_response.json()) >= 2
```
## Acceptance Criteria

Acceptance testing validates that each functional requirement is met from an end-user perspective.

### Test Coverage Requirements
| Component | Minimum Coverage |
|---|---|
| Frontend Business Logic | 75% |
| Backend Services | 100% |
| API Endpoints | 100% |
| Data Access Layer | 90% |
## Running Tests

### Frontend

```bash
npm run test          # Run all tests
npm run test:watch    # Watch mode
npm run coverage      # Generate coverage report
```

### Backend

```bash
pytest                # Run all tests
pytest --cov=app      # Run with coverage
pytest -v tests/api/  # Run a specific test directory
```
## Next Steps
| Goal | Documentation |
|---|---|
| Submit your changes | Contributing |
| Understand coding standards | Development Workflows |
| Review API contracts | API Documentation |
| Troubleshoot test failures | Troubleshooting |