Transform any content into engaging quizzes with the power of AI
TestMe is a production-ready, full-stack web application that generates intelligent quizzes from multiple content sources using advanced AI. Built with modern technologies and designed for scalability, it supports YouTube videos, PDFs, images, and custom text to create engaging multiple-choice, true/false, and mathematical questions.
Perfect for: Educators, content creators, training organizations, and anyone looking to create interactive learning experiences.
Live Demo: https://test-me-beta.vercel.app
- Quick Start
- Features
- Demo
- Technology Stack
- Getting Started
- AI Provider Setup
- API Keys Setup Guide
- Enhanced Large Quiz Generation
- Development
- Deployment
- Configuration Management
- Architecture
- Contributing
- License
- Developer
Want to get TestMe running quickly? Follow these essential steps:
# 1. Clone and install
git clone https://github.com/nosisky/testme.git
cd testme
npm install
# 2. Setup environment (copy and fill with your API keys)
cp env.example .env.local
# 3. Start development server
npm run dev
Essential environment variables to configure:

- `OPENAI_API_KEY` - For AI quiz generation
- `GOOGLE_CLIENT_ID` & `GOOGLE_CLIENT_SECRET` - For authentication
- `MONGODB_URI` - For database storage
- `YOUTUBE_API_KEY` - For YouTube video processing
Detailed setup instructions are available in the Getting Started section below.
- Multi-Source Content: Generate quizzes from YouTube videos, PDFs (up to 5 pages), images, and custom text
- AI-Powered Knowledge Gap Analysis: Automatically identifies learning gaps and key topics
- Smart Question Generation: Creates multiple-choice, true/false, and mathematical questions
- Adaptive Difficulty: Customizable difficulty levels (easy, medium, hard, expert)
- LaTeX Math Support: Full mathematical equation rendering with MathJax
- Interactive Quiz Interface: Animated quiz-taking experience with real-time feedback
- Responsive Design: Optimized for desktop, tablet, and mobile devices
- Professional UI: Modern design with consistent theming
- Quiz Sharing: Share quizzes via secure links
- Performance Analytics: Detailed quiz statistics and user performance tracking
- Google OAuth Integration: Secure authentication with Google accounts
- User Management: Role-based access control (admin/user)
- Data Privacy: Secure storage of user data and quiz results
- Anonymous Quiz Taking: Option for users to take quizzes without registration
- Multiple AI Providers: Support for OpenAI, Anthropic Claude, DeepSeek, and AWS Bedrock
- Environment-Based Configuration: Secure API key management
- Enhanced Large Quiz Generation: Dynamic token management for 15+ questions
- Fallback Systems: Robust error handling with mock service support
- Cost Optimization: Choose providers based on budget and performance needs
Visit our live demo: https://test-me-beta.vercel.app
Test Account: Use Google OAuth to create your account or try the demo content.
- Framework: Next.js 15 with App Router
- Language: TypeScript
- Styling: SCSS Modules with CSS Variables
- UI Components: Custom React components with responsive design
- Math Rendering: MathJax for LaTeX equation support
- Runtime: Node.js with Next.js API Routes
- Database: MongoDB with Mongoose ODM
- Authentication: NextAuth.js with Google OAuth
- File Processing: PDF parsing, image analysis, YouTube transcript extraction
- AI SDK: Vercel AI SDK for unified provider interface
- Providers: OpenAI, Anthropic Claude, DeepSeek, AWS Bedrock
- Vision: OpenAI GPT-4 Vision for image text extraction
- Platform: Vercel (recommended) or Docker
- Monitoring: Built-in analytics and error tracking
- CI/CD: GitHub Actions ready
- Node.js: Version 18 or higher
- npm or yarn: Package manager
- MongoDB: Local instance or MongoDB Atlas
- Git: Version control
1. Clone the repository

   git clone https://github.com/nosisky/testme.git
   cd testme

2. Install dependencies

   npm install

3. Environment setup

   cp env.example .env.local

Create a `.env.local` file in the root directory with the following configuration:
# =================================================================
# APPLICATION CONFIGURATION
# =================================================================
NEXTAUTH_URL=http://localhost:3000
NEXTAUTH_SECRET=your_nextauth_secret_here
# =================================================================
# AUTHENTICATION (Required)
# =================================================================
GOOGLE_CLIENT_ID=your_google_client_id
GOOGLE_CLIENT_SECRET=your_google_client_secret
# =================================================================
# DATABASE (Required)
# =================================================================
MONGODB_URI=mongodb://localhost:27017/testme
# Or for MongoDB Atlas:
# MONGODB_URI=mongodb+srv://username:[email protected]/testme
# =================================================================
# EXTERNAL APIs (Required)
# =================================================================
YOUTUBE_API_KEY=your_youtube_api_key
# =================================================================
# AI PROVIDER CONFIGURATION
# =================================================================
# Choose your primary AI provider (openai, claude, deepseek, bedrock)
DEFAULT_AI_PROVIDER=openai
# Configure API keys for your chosen provider(s)
# OpenAI (Recommended - reliable, fast, good quality)
OPENAI_API_KEY=your_openai_api_key
# Anthropic Claude (Best for detailed explanations)
CLAUDE_API_KEY=your_claude_api_key
# DeepSeek (Most cost-effective option)
DEEPSEEK_API_KEY=your_deepseek_api_key
# AWS Bedrock (Enterprise-grade, requires AWS setup)
AWS_ACCESS_KEY_ID=your_aws_access_key_id
AWS_SECRET_ACCESS_KEY=your_aws_secret_access_key
AWS_REGION=us-east-1
# =================================================================
# DEVELOPMENT & TESTING
# =================================================================
# Set to 'true' to use mock AI service for testing (no API costs)
USE_MOCK_AI=false
# Set to 'development' for detailed logging
NODE_ENV=development
Best for: General use, reliable performance, fastest responses
- Models: GPT-4o-mini (default), GPT-4, GPT-3.5-turbo
- Pricing: ~$0.0001 per 1K tokens (very affordable)
- Setup: Get API key
DEFAULT_AI_PROVIDER=openai
OPENAI_API_KEY=sk-your-openai-key-here
Best for: Detailed explanations, complex reasoning, educational content
- Models: Claude-3-haiku (default), Claude-3-sonnet
- Pricing: ~$0.00025 per 1K tokens
- Setup: Get API key
DEFAULT_AI_PROVIDER=claude
CLAUDE_API_KEY=sk-ant-your-claude-key-here
Best for: Cost-effective alternative, budget-conscious deployments
- Models: deepseek-chat (default)
- Pricing: ~$0.00014 per 1K tokens (most affordable)
- Setup: Get API key
DEFAULT_AI_PROVIDER=deepseek
DEEPSEEK_API_KEY=sk-your-deepseek-key-here
Best for: Enterprise use, enhanced security, compliance requirements
- Models: Claude-3.7-Sonnet (via inference profile)
- Pricing: Variable based on AWS pricing
- Setup: Requires AWS account and Bedrock access
DEFAULT_AI_PROVIDER=bedrock
AWS_ACCESS_KEY_ID=your-aws-access-key
AWS_SECRET_ACCESS_KEY=your-aws-secret-key
AWS_REGION=us-east-1
Important: Bedrock uses inference profiles for newer Claude models. The system automatically uses `us.anthropic.claude-3-7-sonnet-20250219-v1:0`, the cross-region inference profile for Claude 3.7 Sonnet.
JSON Output: Since Bedrock Claude models don't have native JSON mode, TestMe uses clear delimiters (`<JSON_START>` and `<JSON_END>`) to ensure reliable JSON extraction from responses.
Supported regions: us-east-1, us-east-2, us-west-2 (automatically routed based on availability)
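The delimiter-based extraction described above can be sketched as a small helper. This is an illustrative sketch, not TestMe's actual code; the function name and error handling are assumptions.

```typescript
// Extract a JSON payload wrapped in <JSON_START> ... <JSON_END> delimiters,
// as used for Bedrock Claude responses (illustrative sketch).
function extractDelimitedJson<T = unknown>(raw: string): T {
  const start = raw.indexOf("<JSON_START>");
  const end = raw.indexOf("<JSON_END>");
  if (start === -1 || end === -1 || end < start) {
    throw new Error("No delimited JSON found in model response");
  }
  // Slice out only the text between the delimiters, then parse it.
  const payload = raw.slice(start + "<JSON_START>".length, end).trim();
  return JSON.parse(payload) as T;
}

// Example: a model response with surrounding prose is still parsed cleanly.
const response =
  'Here is your quiz:\n<JSON_START>{"questions":[{"type":"multiple-choice"}]}<JSON_END>\nDone.';
const quiz = extractDelimitedJson<{ questions: { type: string }[] }>(response);
```

The delimiters make the extraction robust even when the model adds explanatory prose before or after the JSON body.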
| Provider | Cost | Speed | Quality | Best Use Case |
|----------|------|-------|---------|---------------|
| OpenAI   | $$   | Fast  | Good    | General purpose, production |
| Claude   | $$$  | Moderate | Excellent | Educational content, detailed explanations |
| DeepSeek | $    | Moderate | Good | Budget deployments, development |
| Bedrock  | $$$  | Moderate | Excellent | Enterprise, compliance-critical |
- Visit Google Cloud Console
- Create a new project or select existing one
- Enable Google+ API and People API
- Go to APIs & Services → Credentials
- Create OAuth client ID with these settings:
  - Application type: Web application
  - Authorized JavaScript origins: `http://localhost:3000` (development), `https://your-domain.com` (production)
  - Authorized redirect URIs: `http://localhost:3000/api/auth/callback/google` (development), `https://your-domain.com/api/auth/callback/google` (production)
- In the same Google Cloud project
- Enable YouTube Data API v3
- Create API Key and restrict it to YouTube Data API
- Add domains that will use the API
Option A: MongoDB Atlas (Recommended for production)
- Create account at MongoDB Atlas
- Create a cluster
- Create database user with read/write permissions
- Whitelist your IP address (or 0.0.0.0/0 for development)
- Get connection string
Option B: Local MongoDB
# Install MongoDB locally
brew install mongodb-community # macOS
# or follow official installation guide for your OS
# Start MongoDB service
brew services start mongodb-community
# Generate a secure random string
openssl rand -base64 32
This application now supports Dynamic Token Management + Chunked Generation for generating 15+ questions reliably:
- Automatic Chunking: Large quiz requests (15+ questions) are automatically split into smaller chunks
- Dynamic Token Limits: Token limits are automatically adjusted based on request size:
- 10-14 questions: 32,000 tokens
- 15-19 questions: 64,000 tokens
- 20+ questions: 128,000 tokens (with beta headers)
- Enhanced Bedrock Support: Optimized for AWS Bedrock Claude 3.7 Sonnet
- Intelligent Fallback: If a chunk fails, returns successfully generated questions
- Rate Limiting Protection: Built-in delays between chunks to avoid throttling
Set your AI provider in `.env`:
DEFAULT_AI_PROVIDER=bedrock
AWS_REGION=us-east-1
- Small Requests (< 15 questions): Standard generation
- Large Requests (≥ 15 questions):
- Split into chunks of 7 questions each
- Each chunk generated with 1-second delay
- Results combined automatically
- Enhanced token limits applied
This ensures reliable generation of large quizzes without hitting token limits or rate restrictions.
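The chunking and token-sizing rules above could look roughly like the following sketch. The function names, the default token limit for small requests, and the exact fallback mechanics are illustrative assumptions, not TestMe's actual implementation.

```typescript
// Dynamic token limits, mirroring the tiers listed above.
function maxTokensFor(questionCount: number): number {
  if (questionCount >= 20) return 128_000; // largest tier (beta headers on some providers)
  if (questionCount >= 15) return 64_000;
  if (questionCount >= 10) return 32_000;
  return 16_000; // assumed default for small requests
}

// Split a large request into chunks of at most 7 questions;
// small requests (< 15) use standard single-shot generation.
function chunkSizes(total: number, chunkSize = 7): number[] {
  if (total < 15) return [total];
  const chunks: number[] = [];
  for (let remaining = total; remaining > 0; remaining -= chunkSize) {
    chunks.push(Math.min(chunkSize, remaining));
  }
  return chunks;
}

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

// Generate a large quiz chunk by chunk, with a 1-second delay between
// chunks to avoid provider rate limits. `generateChunk` is a stand-in
// for the real provider call.
async function generateLargeQuiz(
  total: number,
  generateChunk: (count: number, maxTokens: number) => Promise<object[]>
): Promise<object[]> {
  const questions: object[] = [];
  for (const size of chunkSizes(total)) {
    try {
      questions.push(...(await generateChunk(size, maxTokensFor(total))));
    } catch {
      break; // intelligent fallback: return whatever was generated so far
    }
    await sleep(1000); // rate-limiting protection between chunks
  }
  return questions;
}
```

For example, a 20-question request would be generated as three chunks of 7, 7, and 6 questions, each with the 128K token limit applied.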
npm run dev
The application will be available at http://localhost:3000
# Development server with Turbopack
npm run dev
# Production build
npm run build
# Start production server
npm start
# Run linting
npm run lint
# Run type checking
npm run type-check
- Hot Reload: Instant updates during development
- TypeScript: Full type safety and IntelliSense
- ESLint: Code quality and consistency
- SCSS Modules: Scoped styling with CSS variables
- Mock AI Service: Test without API costs
1. Connect your repository

   npm i -g vercel
   vercel login
   vercel

2. Environment Variables

   - Copy all variables from `.env.local` to the Vercel dashboard
   - Set `NEXTAUTH_URL` to your production domain
   - Ensure all API keys are properly configured

3. Domain Configuration

   - Update Google OAuth redirect URIs with the production domain
   - Update CORS settings if needed
# Build Docker image
docker build -t testme .
# Run container
docker run -p 3000:3000 --env-file .env.local testme
# Build for production
npm run build
# Start production server
npm start
Change AI provider without code changes:
# Switch to Claude
DEFAULT_AI_PROVIDER=claude
# Switch to DeepSeek
DEFAULT_AI_PROVIDER=deepseek
# Restart application
npm run dev
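Internally, a provider switch like this typically maps the environment variable to a default model and credential. The sketch below uses the default models documented earlier for each provider, but the function itself and the config shape are illustrative assumptions, not TestMe's source.

```typescript
type AIProvider = "openai" | "claude" | "deepseek" | "bedrock";

interface ProviderConfig {
  provider: AIProvider;
  model: string;     // default model documented for this provider
  apiKeyEnv: string; // which env var supplies the credential
}

// Map DEFAULT_AI_PROVIDER to a provider configuration.
function resolveProvider(name: string | undefined): ProviderConfig {
  switch ((name ?? "openai").toLowerCase()) {
    case "claude":
      return { provider: "claude", model: "claude-3-haiku", apiKeyEnv: "CLAUDE_API_KEY" };
    case "deepseek":
      return { provider: "deepseek", model: "deepseek-chat", apiKeyEnv: "DEEPSEEK_API_KEY" };
    case "bedrock":
      return {
        provider: "bedrock",
        model: "us.anthropic.claude-3-7-sonnet-20250219-v1:0",
        apiKeyEnv: "AWS_ACCESS_KEY_ID",
      };
    default:
      return { provider: "openai", model: "gpt-4o-mini", apiKeyEnv: "OPENAI_API_KEY" };
  }
}

// Usage: const cfg = resolveProvider(process.env.DEFAULT_AI_PROVIDER);
```

Because the mapping is driven entirely by the env var, switching providers never requires a code change, only a restart.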
- Development: Use mock services and local databases
- Staging: Test with real APIs and staging databases
- Production: Full configuration with monitoring
- Store API keys in environment variables only
- Use different API keys for different environments
- Regularly rotate API keys
- Monitor API usage and costs
- Implement rate limiting
┌─────────────────┐    ┌─────────────────┐    ┌─────────────────┐
│    Frontend     │    │     Backend     │    │    External     │
│    (Next.js)    │◄──►│  (API Routes)   │◄──►│    Services     │
│                 │    │                 │    │                 │
│ • React UI      │    │ • Authentication│    │ • AI Providers  │
│ • SCSS Modules  │    │ • Quiz Logic    │    │ • YouTube API   │
│ • TypeScript    │    │• File Processing│    │ • MongoDB       │
└─────────────────┘    └─────────────────┘    └─────────────────┘
- Content Input: User uploads/inputs content
- Analysis: AI analyzes content for key topics and gaps
- Generation: AI creates targeted quiz questions
- Storage: Quiz saved to MongoDB
- Delivery: Interactive quiz served to users
- Analytics: Performance tracking and insights
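The questions flowing through this pipeline might be modeled with a shape like the following. This is a hypothetical schema for illustration; TestMe's actual Mongoose models may differ.

```typescript
// Hypothetical shape of a generated question (illustrative only).
interface QuizQuestion {
  type: "multiple-choice" | "true-false" | "math";
  question: string; // may contain LaTeX for math questions
  options?: string[]; // present for multiple-choice questions
  correctAnswer: string;
  difficulty: "easy" | "medium" | "hard" | "expert";
}

// Basic validation before saving a generated question to MongoDB:
// the answer must exist, and for multiple-choice it must be one of
// the options; for true/false it must literally be "true" or "false".
function isValidQuestion(q: QuizQuestion): boolean {
  if (!q.question || !q.correctAnswer) return false;
  if (q.type === "multiple-choice") {
    return Array.isArray(q.options) && q.options.includes(q.correctAnswer);
  }
  if (q.type === "true-false") {
    return q.correctAnswer === "true" || q.correctAnswer === "false";
  }
  return true;
}
```

Validating at the storage step catches malformed AI output before it reaches quiz takers.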
- Authentication: Google OAuth with NextAuth.js
- Authorization: Role-based access control
- Data Protection: Encrypted storage and transmission
- API Security: Rate limiting and input validation
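The kind of rate limiting mentioned above can be illustrated with a minimal fixed-window limiter. This in-memory sketch is not TestMe's implementation; production deployments typically back this with a shared store such as Redis so limits hold across instances.

```typescript
// Fixed-window rate limiter keyed by client identifier (e.g. IP).
// Returns a function that reports whether a request is allowed.
function createRateLimiter(limit: number, windowMs: number) {
  const hits = new Map<string, { count: number; windowStart: number }>();
  return (key: string, now = Date.now()): boolean => {
    const entry = hits.get(key);
    // Start a fresh window if none exists or the current one expired.
    if (!entry || now - entry.windowStart >= windowMs) {
      hits.set(key, { count: 1, windowStart: now });
      return true;
    }
    entry.count += 1;
    return entry.count <= limit; // deny once the window's budget is spent
  };
}

// Usage: allow up to 60 requests per minute per client.
const allow = createRateLimiter(60, 60_000);
```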
We welcome contributions! Please see our Contributing Guide for details.
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Commit your changes (`git commit -m 'Add amazing feature'`)
4. Push to the branch (`git push origin feature/amazing-feature`)
5. Open a Pull Request
- TypeScript: Strict mode enabled
- ESLint: Follow configured rules
- Prettier: Code formatting
- SCSS: Use modules and CSS variables
- Testing: Write tests for new features
This project is licensed under the MIT License - see the LICENSE file for details.
Dealwap
- Email: [email protected]
- GitHub: @nosisky
- Portfolio: dealwap.dev
- Twitter: @nosisky
- Version: 1.0.0
- Last Updated: December 2024
- Status: Production Ready
- Maintenance: Actively maintained
- License: MIT License
For support, feature requests, or bug reports:
- GitHub Issues: Create an issue - Best for bug reports and feature requests
- Email: [email protected] - For direct communication and business inquiries
- Documentation: Check this README and inline code comments
- Community: Star the repo and follow for updates
Special thanks to:
- Vercel AI SDK team for the excellent AI integration tools
- Next.js team for the amazing full-stack framework
- OpenAI, Anthropic, DeepSeek, AWS for providing powerful AI models
- The open-source community for inspiration and support