
PromptCraft

Generate, refine, and enhance AI prompts in seconds, powered by Llama 3.3 on Groq, with real-time streaming, URL context scraping, and session history.

License: MIT Next.js TypeScript

Features

  • Generate prompts by topic, category, and tone
  • Enhance existing prompts with AI
  • Scrape web URLs to use as context
  • Session-based prompt history (up to 20 entries)
  • Real-time streaming output
  • Fully responsive across mobile, tablet, and desktop

Tech Stack

  • Framework: Next.js 16 (App Router)
  • Language: TypeScript
  • Styling: Tailwind CSS v4
  • AI: Llama 3.3 70B via Groq API (OpenAI-compatible)
  • Fonts: Geist Sans / Geist Mono
  • Analytics: Vercel Analytics

Architecture

app/
  api/
    generate/route.ts   # Prompt generation endpoint
    enhance/route.ts    # Prompt enhancement endpoint
    scrape/route.ts     # URL scraping endpoint
  layout.tsx            # Root layout with metadata
  page.tsx              # Main application page
  globals.css           # Global styles and CSS variables
components/
  prompt-form.tsx       # Input form (topic, tone, category, URL scraper)
  prompt-output.tsx     # Streaming output display
  prompt-history.tsx    # Session history panel
lib/
  utils.ts              # Shared utilities (cn)

All API routes call the Groq OpenAI-compatible endpoint with stream: true and forward the SSE response to the client using the Vercel AI data stream protocol, in which each text chunk is framed with a 0: prefix. The frontend reads these streams incrementally and updates the UI in real time.
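The 0:-prefixed framing can be sketched as a pair of pure helpers. The function names (formatDataStreamPart, parseDataStreamPart) are illustrative, not taken from the repository; they show only how a text chunk is encoded on the wire and decoded on the client.

```typescript
// Encode one text chunk as a data stream part: 0:"<JSON-escaped text>"\n
// JSON.stringify handles quoting and escaping of the chunk.
function formatDataStreamPart(text: string): string {
  return `0:${JSON.stringify(text)}\n`;
}

// Decode a single part back into text, as the frontend reader would.
// Returns null for lines that are not text parts.
function parseDataStreamPart(line: string): string | null {
  if (!line.startsWith("0:")) return null;
  return JSON.parse(line.slice(2)) as string;
}

const framed = formatDataStreamPart('Hello, "world"');
console.log(framed);                                 // 0:"Hello, \"world\""
console.log(parseDataStreamPart(framed.trimEnd())); // Hello, "world"
```

Because each part is a single newline-terminated line, the client can split the incoming stream on newlines and append decoded chunks to the UI as they arrive.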

Environment Variables

Create a .env.local file in the project root:

GROQ_API_KEY=your_groq_api_key_here

Get your API key from Groq Console.
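A route can fail fast when the key is missing instead of sending unauthenticated requests to Groq. The helper below is an illustrative sketch, not code from the repository; it just shows the guard pattern.

```typescript
// Illustrative guard (not from the repository): look up GROQ_API_KEY in a
// given environment object and throw a clear error when it is absent.
function getGroqKey(env: Record<string, string | undefined>): string {
  const key = env.GROQ_API_KEY;
  if (!key) {
    throw new Error("GROQ_API_KEY is not set; add it to .env.local");
  }
  return key;
}

// In a route handler this would typically be called as getGroqKey(process.env).
```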

Setup

# Clone the repository
git clone https://github.com/MuhammadTanveerAbbas/PromptCraft.git
cd PromptCraft

# Install dependencies
pnpm install

# Add your environment variable
cp .env.example .env.local
# Edit .env.local and add your GROQ_API_KEY

Local Development

pnpm dev

Open http://localhost:3000 in your browser.

Deployment

Vercel (recommended)

  1. Push your repository to GitHub
  2. Import the project in Vercel
  3. Add GROQ_API_KEY in the Vercel environment variables settings
  4. Deploy

The maxDuration = 30 export on each API route ensures streaming works correctly within Vercel's function timeout limits.
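In a Next.js App Router route, that export is a route segment config value. A minimal sketch of what the top of such a route file might contain (the file path is illustrative):

```typescript
// app/api/generate/route.ts (sketch): Next.js route segment config.
// maxDuration extends the Vercel function timeout to 30 seconds so that
// long streaming responses can complete before the function is terminated.
export const maxDuration = 30;
```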

Other Platforms

Any platform that supports Next.js on the Node.js runtime will work. Ensure the environment variable is set before starting the server.

pnpm build
pnpm start

Production Considerations

  • The scrape endpoint blocks private IP ranges and localhost to prevent SSRF attacks
  • All API inputs are validated and length-capped before being sent to Groq
  • Streaming responses include Cache-Control: no-store to prevent caching
  • TypeScript strict mode is enabled
  • History is session-only (in-memory React state); no database is required
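The private-range check behind the first bullet can be sketched as a pure function. This is a simplified illustration covering only localhost and IPv4 literals; the actual endpoint's logic may differ, and a complete defense would also cover DNS resolution and IPv6 ranges.

```typescript
// Simplified SSRF guard sketch: reject localhost and private/reserved IPv4
// literals (RFC 1918 ranges, loopback, link-local) before fetching a URL.
function isPrivateHost(hostname: string): boolean {
  if (hostname === "localhost" || hostname === "::1") return true;
  const m = hostname.match(/^(\d{1,3})\.(\d{1,3})\.(\d{1,3})\.(\d{1,3})$/);
  if (!m) return false; // not an IPv4 literal; DNS resolution not covered here
  const a = Number(m[1]);
  const b = Number(m[2]);
  if (a === 127 || a === 10) return true;           // loopback, 10.0.0.0/8
  if (a === 192 && b === 168) return true;          // 192.168.0.0/16
  if (a === 172 && b >= 16 && b <= 31) return true; // 172.16.0.0/12
  if (a === 169 && b === 254) return true;          // link-local 169.254.0.0/16
  return false;
}

console.log(isPrivateHost("10.0.0.5"));    // true
console.log(isPrivateHost("example.com")); // false
```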

Author

Made by Muhammad Tanveer Abbas

License

MIT
