# Project Instructions

Use the specification and guidelines as you build the app.

Write the complete code for every step. Do not get lazy.

Your goal is to completely finish whatever I ask for.

You will see `<ai_context>` tags in the code. These are context tags that you should use to help you understand the codebase.

## Overview

This is a web app template.

## Tech Stack

- Frontend: Next.js, Tailwind, Shadcn, Framer Motion
- Backend: Postgres, Supabase, Drizzle ORM, Server Actions
- Auth: Clerk
- Payments: Stripe
- Analytics: PostHog
- Deployment: Vercel

## Project Structure

- `actions` - Server actions
  - `db` - Database related actions
  - Other actions
- `app` - Next.js app router
  - `api` - API routes
  - `route` - An example route
    - `_components` - One-off components for the route
    - `layout.tsx` - Layout for the route
    - `page.tsx` - Page for the route
- `components` - Shared components
  - `ui` - UI components
  - `utilities` - Utility components
- `db` - Database
  - `schema` - Database schemas
- `lib` - Library code
  - `hooks` - Custom hooks
- `prompts` - Prompt files
- `public` - Static assets
- `types` - Type definitions

## Rules

Follow these rules when building the app.

### General Rules

- Use `@` to import anything from the app unless otherwise specified
- Use kebab case for all files and folders unless otherwise specified
- Don't update shadcn components unless otherwise specified

#### Env Rules

- If you update environment variables, update the `.env.example` file
- All environment variables should go in `.env.local`
- Do not expose environment variables to the frontend
- Use `NEXT_PUBLIC_` prefix for environment variables that need to be accessed from the frontend
- You may import environment variables in server actions and components by using `process.env.VARIABLE_NAME` (see the sketch below)
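
A minimal sketch of these rules in practice; the specific variable names here are hypothetical:

```ts
// .env.local (also mirrored, without secrets, in .env.example)
// DATABASE_URL=postgres://...        <- server-only (hypothetical name)
// NEXT_PUBLIC_POSTHOG_KEY=phc_...    <- exposed to the frontend (hypothetical name)

// In a server action or server component, any variable is available:
const databaseUrl = process.env.DATABASE_URL

// In client code, only NEXT_PUBLIC_-prefixed variables are inlined by Next.js:
const posthogKey = process.env.NEXT_PUBLIC_POSTHOG_KEY
```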

#### [T>]ype Rules

Follow these rules when working with types.

- When importing types, use `@/types`
- Name files like `example-types.ts`
- All types should go in `types`
- Make sure to export the types in `types/index.ts`
- Prefer interfaces over type aliases
- If referring to db types, use `@/db/schema` such as `SelectTodo` from `todos-schema.ts`

An example of a type:

`types/actions-types.ts`

```ts
export type ActionState<T> =
  | { isSuccess: true; message: string; data: T }
  | { isSuccess: false; message: string; data?: never }
```

And exporting it:

`types/index.ts`

```ts
export * from "./actions-types"
```

### Frontend Rules

Follow these rules when working on the frontend.

It uses Next.js, Tailwind, Shadcn, and Framer Motion.

#### General Rules

- Use `lucide-react` for icons
- `useSidebar` must be used within a `SidebarProvider` (see the sketch below)
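
A small sketch combining both rules, assuming shadcn's sidebar component has been generated into `components/ui/sidebar` (so `SidebarProvider` and `useSidebar` exist there) and that its hook exposes `toggleSidebar`:

```tsx
"use client"

import { Menu } from "lucide-react"
import { SidebarProvider, useSidebar } from "@/components/ui/sidebar"

// useSidebar reads context provided by SidebarProvider, so this component
// must render inside the provider below.
function SidebarToggle() {
  const { toggleSidebar } = useSidebar()

  return (
    <div onClick={toggleSidebar}>
      <Menu className="size-4" />
    </div>
  )
}

export default function ExampleSidebarShell({
  children
}: {
  children: React.ReactNode
}) {
  return (
    <SidebarProvider>
      <SidebarToggle />

      {children}
    </SidebarProvider>
  )
}
```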

#### Components

- Use divs instead of other html tags unless otherwise specified
- Separate the main parts of a component's html with an extra blank line for visual spacing
- Always tag a component with either `use server` or `use client` at the top, including layouts and pages

##### Organization

- All components should be named using kebab case like `example-component.tsx` unless otherwise specified
- Put components in `/_components` in the route if one-off components
- Put components in `/components` from the root if shared components

##### Data Fetching

- Fetch data in server components and pass the data down as props to client components.
- Use server actions from `/actions` to mutate data.

##### Server Components

- Use `"use server"` at the top of the file.
- Implement Suspense for asynchronous data fetching to show loading states while data is being fetched.
- If no asynchronous logic is required for a given server component, you do not need to wrap the component in `<Suspense>`. You can simply return the final UI directly since there is no async boundary needed.
- If asynchronous fetching is required, you can use a `<Suspense>` boundary and a fallback to indicate a loading state while data is loading.
- Server components cannot be imported into client components. If you want to use a server component in a client component, you must pass them as props, for example via the `children` prop
- params in server pages should be awaited such as `const { courseId } = await params` where the type is `params: Promise<{ courseId: string }>` (an example appears below)

Example of a server layout:

```tsx
"use server"

export default async function ExampleServerLayout({
  children
}: {
  children: React.ReactNode
}) {
  return children
}
```

Example of a server page (with async logic):

```tsx
"use server"

import { Suspense } from "react"
import { SomeAction } from "@/actions/some-actions"
import SomeComponent from "./_components/some-component"
import SomeSkeleton from "./_components/some-skeleton"

export default async function ExampleServerPage() {
  return (
    <Suspense fallback={<SomeSkeleton className="some-class" />}>
      <SomeComponentFetcher />
    </Suspense>
  )
}

async function SomeComponentFetcher() {
  const { data } = await SomeAction()
  return <SomeComponent className="some-class" initialData={data || []} />
}
```

Example of a server page (no async logic required):

```tsx
"use server"

import SomeClientComponent from "./_components/some-client-component"

// In this case, no asynchronous work is being done, so no Suspense or fallback is required.
export default async function ExampleServerPage() {
  return <SomeClientComponent initialData={[]} />
}
```
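
And an example of a server page that awaits `params` per the rule above (the `courseId` route is illustrative):

```tsx
"use server"

export default async function ExampleCoursePage({
  params
}: {
  params: Promise<{ courseId: string }>
}) {
  // params is a Promise here, so await it before reading route segments
  const { courseId } = await params

  return <div>Course: {courseId}</div>
}
```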

Example of a server component:

```tsx
"use server"

interface ExampleServerComponentProps {
  // Your props here
}

export async function ExampleServerComponent(
  props: ExampleServerComponentProps
) {
  // Your code here
}
```

##### Client Components

- Use `"use client"` at the top of the file
- Client components can safely rely on props passed down from server components, or handle UI interactions without needing `<Suspense>` if there’s no async logic.
- Never use server actions in client components. If you need to create a new server action, create it in `/actions`

Example of a client page:

```tsx
"use client"

export default function ExampleClientPage() {
  // Your code here
}
```

Example of a client component:

```tsx
"use client"

interface ExampleClientComponentProps {
  initialData: any[]
}

export default function ExampleClientComponent({
  initialData
}: ExampleClientComponentProps) {
  // Client-side logic here
  return <div>{initialData.length} items</div>
}
```

### Backend Rules

Follow these rules when working on the backend.

It uses Postgres, Supabase, Drizzle ORM, and Server Actions.

#### General Rules

- Never generate migrations. You do not have to do anything in the `db/migrations` folder including migrations and metadata. Ignore it.

#### Schemas

- When importing schemas, use `@/db/schema`
- Name files like `example-schema.ts`
- All schemas should go in `db/schema`
- Make sure to export the schema in `db/schema/index.ts`
- Make sure to add the schema to the `schema` object in `db/db.ts`
- If using a userId, always use `userId: text("user_id").notNull()`
- Always include createdAt and updatedAt columns in all tables
- Make sure to cascade delete when necessary
- Use enums for columns that have a limited set of possible values such as:

```ts
import { pgEnum } from "drizzle-orm/pg-core"

export const membershipEnum = pgEnum("membership", ["free", "pro"])

membership: membershipEnum("membership").notNull().default("free")
```

Example of a schema:

`db/schema/todos-schema.ts`

```ts
import { boolean, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core"

export const todosTable = pgTable("todos", {
  id: uuid("id").defaultRandom().primaryKey(),
  userId: text("user_id").notNull(),
  content: text("content").notNull(),
  completed: boolean("completed").default(false).notNull(),
  createdAt: timestamp("created_at").defaultNow().notNull(),
  updatedAt: timestamp("updated_at")
    .defaultNow()
    .notNull()
    .$onUpdate(() => new Date())
})

export type InsertTodo = typeof todosTable.$inferInsert
export type SelectTodo = typeof todosTable.$inferSelect
```

And exporting it:

`db/schema/index.ts`

```ts
export * from "./todos-schema"
```

And adding it to the schema in `db/db.ts`:

`db/db.ts`

```ts
import { todosTable } from "@/db/schema"

const schema = {
  todos: todosTable
}
```

And a more complex schema:

```ts
import { pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core"

export const chatsTable = pgTable("chats", {
  id: uuid("id").defaultRandom().primaryKey(),
  userId: text("user_id").notNull(),
  name: text("name").notNull(),
  createdAt: timestamp("created_at").defaultNow().notNull(),
  updatedAt: timestamp("updated_at")
    .defaultNow()
    .notNull()
    .$onUpdate(() => new Date())
})

export type InsertChat = typeof chatsTable.$inferInsert
export type SelectChat = typeof chatsTable.$inferSelect
```

```ts
import { pgEnum, pgTable, text, timestamp, uuid } from "drizzle-orm/pg-core"
import { chatsTable } from "./chats-schema"

export const roleEnum = pgEnum("role", ["assistant", "user"])

export const messagesTable = pgTable("messages", {
  id: uuid("id").defaultRandom().primaryKey(),
  chatId: uuid("chat_id")
    .references(() => chatsTable.id, { onDelete: "cascade" })
    .notNull(),
  content: text("content").notNull(),
  role: roleEnum("role").notNull(),
  createdAt: timestamp("created_at").defaultNow().notNull(),
  updatedAt: timestamp("updated_at")
    .defaultNow()
    .notNull()
    .$onUpdate(() => new Date())
})

export type InsertMessage = typeof messagesTable.$inferInsert
export type SelectMessage = typeof messagesTable.$inferSelect
```

And exporting it:

`db/schema/index.ts`

```ts
export * from "./chats-schema"
export * from "./messages-schema"
```

And adding it to the schema in `db/db.ts`:

`db/db.ts`

```ts
import { chatsTable, messagesTable } from "@/db/schema"

const schema = {
  chats: chatsTable,
  messages: messagesTable
}
```

#### Server Actions

- When importing actions, use `@/actions` or `@/actions/db` if db related
- DB related actions should go in the `actions/db` folder
- Other actions should go in the `actions` folder
- Name files like `example-actions.ts`
- All actions should go in the `actions` folder
- Only write the needed actions
- Return an ActionState with the needed data type from actions
- Include Action at the end of function names `Ex: exampleFunction -> exampleFunctionAction`
- Actions should return a `Promise<ActionState<T>>`
- Sort in CRUD order: Create, Read, Update, Delete
- Make sure to return undefined as the data type if the action is not supposed to return any data
- **Date Handling:** For columns defined as `PgDateString` (or any date string type), always convert JavaScript `Date` objects to ISO strings using `.toISOString()` before performing operations (e.g., comparisons or insertions). This ensures value type consistency and prevents type errors (see the sketch after the `ActionState` type below).

```ts
export type ActionState<T> =
  | { isSuccess: true; message: string; data: T }
  | { isSuccess: false; message: string; data?: never }
```
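
A small sketch of the date-handling rule. It assumes a hypothetical `dueDate` date-string column on `todosTable` (the schema example above does not define one):

```ts
"use server"

import { db } from "@/db/db"
import { todosTable } from "@/db/schema"
import { ActionState } from "@/types"

// Hypothetical: assume todosTable also defines a date-string column `dueDate`.
export async function createTodoDueTodayAction(
  userId: string,
  content: string
): Promise<ActionState<void>> {
  try {
    // Convert the Date to an ISO string before inserting; passing a raw Date
    // into a date-string column causes a value-type mismatch.
    await db.insert(todosTable).values({
      userId,
      content,
      dueDate: new Date().toISOString()
    })

    return { isSuccess: true, message: "Todo created successfully", data: undefined }
  } catch (error) {
    console.error("Error creating todo:", error)
    return { isSuccess: false, message: "Failed to create todo" }
  }
}
```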

Example of an action:

`actions/db/todos-actions.ts`

```ts
"use server"

import { db } from "@/db/db"
import { InsertTodo, SelectTodo, todosTable } from "@/db/schema/todos-schema"
import { ActionState } from "@/types"
import { eq } from "drizzle-orm"

export async function createTodoAction(
  todo: InsertTodo
): Promise<ActionState<SelectTodo>> {
  try {
    const [newTodo] = await db.insert(todosTable).values(todo).returning()
    return {
      isSuccess: true,
      message: "[T>]odo created successfully",
      data: new[T>]odo
    }
  } catch (error) {
    console.error("Error creating todo:", error)
    return { isSuccess: false, message: "Failed to create todo" }
  }
}

export async function getTodosAction(
  userId: string
): Promise<ActionState<SelectTodo[]>> {
  try {
    const todos = await db.query.todos.findMany({
      where: eq(todosTable.userId, userId)
    })
    return {
      isSuccess: true,
      message: "[T>]odos retrieved successfully",
      data: todos
    }
  } catch (error) {
    console.error("Error getting todos:", error)
    return { isSuccess: false, message: "Failed to get todos" }
  }
}

export async function updateTodoAction(
  id: string,
  data: Partial<InsertTodo>
): Promise<ActionState<SelectTodo>> {
  try {
    const [updatedTodo] = await db
      .update(todosTable)
      .set(data)
      .where(eq(todosTable.id, id))
      .returning()

    return {
      isSuccess: true,
      message: "[T>]odo updated successfully",
      data: updated[T>]odo
    }
  } catch (error) {
    console.error("Error updating todo:", error)
    return { isSuccess: false, message: "Failed to update todo" }
  }
}

export async function deleteTodoAction(id: string): Promise<ActionState<void>> {
  try {
    await db.delete(todosTable).where(eq(todosTable.id, id))
    return {
      isSuccess: true,
      message: "[T>]odo deleted successfully",
      data: undefined
    }
  } catch (error) {
    console.error("Error deleting todo:", error)
    return { isSuccess: false, message: "Failed to delete todo" }
  }
}
```

### Auth Rules

Follow these rules when working on auth.

It uses Clerk for authentication.

#### General Rules

- Import the auth helper with `import { auth } from "@clerk/nextjs/server"` in server components
- `await` the auth helper in server actions (see the sketch below)
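
A minimal sketch of both rules in a server action; the action name is hypothetical:

```ts
"use server"

import { auth } from "@clerk/nextjs/server"
import { ActionState } from "@/types"

export async function getCurrentUserIdAction(): Promise<ActionState<string>> {
  // Await the auth helper, then read the Clerk userId from it
  const { userId } = await auth()

  if (!userId) {
    return { isSuccess: false, message: "Not authenticated" }
  }

  return { isSuccess: true, message: "User retrieved successfully", data: userId }
}
```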

### Payments Rules

Follow these rules when working on payments.

It uses Stripe for payments.

### Analytics Rules

Follow these rules when working on analytics.

It uses PostHog for analytics.

### Storage Rules

Follow these rules when working with Supabase Storage.

It uses Supabase Storage for file uploads, downloads, and management.

#### General Rules

- Always use environment variables for bucket names to maintain consistency across environments
- Never hardcode bucket names in the application code
- Always handle file size limits and allowed file types at the application level
- Use the `upsert: true` option on `upload` when you want to replace existing files
- Always implement proper error handling for storage operations
- Use content-type headers when uploading files to ensure proper file handling

#### Organization

##### Buckets

- Name buckets in kebab-case: `user-uploads`, `profile-images`
- Create separate buckets for different types of files (e.g., `profile-images`, `documents`, `attachments`)
- Document bucket purposes in a central location
- Set appropriate bucket policies (public/private) based on access requirements
- Implement RLS (Row Level Security) policies for buckets that need user-specific access
- Make sure to give me instructions for setting up RLS policies in Supabase, since you can't do this yourself, including the SQL scripts I need to run in the editor

##### File Structure

- Organize files in folders based on their purpose and ownership
- Use predictable, collision-resistant naming patterns (a path-builder sketch follows this list)
- Structure: `{bucket}/{userId}/{purpose}/{filename}`
- Example: `profile-images/123e4567-e89b/avatar/profile.jpg`
- Include timestamps in filenames when version history is important
- Example: `documents/123e4567-e89b/contracts/2024-02-13-contract.pdf`
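
A small sketch of a path builder that follows this structure; the helper name and purpose values are hypothetical, and the bucket itself is supplied separately via `.from(bucket)`:

```ts
// Builds `{userId}/{purpose}/{date}-{filename}` for use inside a bucket.
export function buildStoragePath(
  userId: string,
  purpose: "avatar" | "contracts" | "attachments",
  filename: string
): string {
  const date = new Date().toISOString().slice(0, 10) // e.g. "2024-02-13"
  return `${userId}/${purpose}/${date}-${filename}`
}

// buildStoragePath("123e4567-e89b", "contracts", "contract.pdf")
// -> "123e4567-e89b/contracts/2024-02-13-contract.pdf"
```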

#### Actions

- When importing storage actions, use `@/actions/storage`
- Name files like `example-storage-actions.ts`
- Include Storage at the end of function names `Ex: uploadFile -> uploadFileStorage`
- Follow the same ActionState pattern as DB actions

Example of a storage action:

```ts
"use server"

import { createClientComponentClient } from "@supabase/auth-helpers-nextjs"
import { ActionState } from "@/types"

export async function uploadFileStorage(
  bucket: string,
  path: string,
  file: File
): Promise<ActionState<{ path: string }>> {
  try {
    const supabase = createClientComponentClient()

    const { data, error } = await supabase.storage
      .from(bucket)
      .upload(path, file, {
        upsert: false,
        contentType: file.type
      })

    if (error) throw error

    return {
      isSuccess: true,
      message: "File uploaded successfully",
      data: { path: data.path }
    }
  } catch (error) {
    console.error("Error uploading file:", error)
    return { isSuccess: false, message: "Failed to upload file" }
  }
}
```

#### File Handling

##### Upload Rules

- Always validate file size before upload
- Implement file type validation using both extension and MIME type
- Generate unique filenames to prevent collisions
- Set appropriate content-type headers
- Handle existing files appropriately (error or upsert)

Example validation:

```ts
const MAX_FILE_SIZE = 10 * 1024 * 1024 // 10MB
const ALLOWED_TYPES = ["image/jpeg", "image/png", "image/webp"]

function validateFile(file: File): boolean {
  if (file.size > MAX_FILE_SIZE) {
    throw new Error("File size exceeds limit")
  }

  if (!ALLOWED_TYPES.includes(file.type)) {
    throw new Error("File type not allowed")
  }

  return true
}
```

##### Download Rules

- Always handle missing files gracefully
- Implement proper error handling for failed downloads
- Use signed URLs for private files (a sketch follows below)
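
A sketch of a signed-URL action that follows the same pattern as the upload example above, assuming supabase-js v2's `createSignedUrl`; the 60-second expiry is an assumption:

```ts
"use server"

import { createClientComponentClient } from "@supabase/auth-helpers-nextjs"
import { ActionState } from "@/types"

export async function createSignedUrlStorage(
  bucket: string,
  path: string
): Promise<ActionState<{ signedUrl: string }>> {
  try {
    const supabase = createClientComponentClient()

    // A 60-second expiry keeps signed URLs short-lived; adjust as needed
    const { data, error } = await supabase.storage
      .from(bucket)
      .createSignedUrl(path, 60)

    if (error) throw error

    return {
      isSuccess: true,
      message: "Signed URL created successfully",
      data: { signedUrl: data.signedUrl }
    }
  } catch (error) {
    console.error("Error creating signed URL:", error)
    return { isSuccess: false, message: "Failed to create signed URL" }
  }
}
```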

##### Delete Rules

- Implement soft deletes when appropriate
- Clean up related database records when deleting files
- Handle bulk deletions carefully
- Verify ownership before deletion (see the sketch after this list)
- Always delete all versions/transforms of a file
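
A sketch of a delete action in the same pattern; the ownership check and related-record cleanup are only indicated in comments since they depend on your schema and auth setup:

```ts
"use server"

import { createClientComponentClient } from "@supabase/auth-helpers-nextjs"
import { ActionState } from "@/types"

export async function deleteFileStorage(
  bucket: string,
  path: string
): Promise<ActionState<void>> {
  try {
    // Verify the caller owns `path` (e.g. it starts with their userId) before
    // deleting, and clean up any related database records as well.
    const supabase = createClientComponentClient()

    const { error } = await supabase.storage.from(bucket).remove([path])

    if (error) throw error

    return {
      isSuccess: true,
      message: "File deleted successfully",
      data: undefined
    }
  } catch (error) {
    console.error("Error deleting file:", error)
    return { isSuccess: false, message: "Failed to delete file" }
  }
}
```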

#### Security

##### Bucket Policies

- Make buckets private by default
- Only make buckets public when absolutely necessary
- Use RLS policies to restrict access to authorized users
- Example RLS policy:

```sql
CREATE POLICY "Users can only access their own files"
ON storage.objects
FOR ALL
USING (auth.uid()::text = (storage.foldername(name))[1]);
```

##### Access Control

- Generate short-lived signed URLs for private files
- Implement proper CORS policies
- Use separate buckets for public and private files
- Never expose internal file paths
- Validate user permissions before any operation

#### Error Handling

- Implement specific error types for common storage issues
- Always provide meaningful error messages
- Implement retry logic for transient failures (see the sketch below)
- Log storage errors separately for monitoring
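
A small sketch of a generic retry helper for transient failures; the attempt count and delay are assumptions:

```ts
// Retries an async storage operation a few times with a simple backoff.
// Hypothetical helper; tune attempts/delay and which errors count as transient.
export async function withRetry<T>(
  operation: () => Promise<T>,
  attempts = 3,
  delayMs = 500
): Promise<T> {
  let lastError: unknown

  for (let i = 0; i < attempts; i++) {
    try {
      return await operation()
    } catch (error) {
      lastError = error
      console.error(`Storage operation failed (attempt ${i + 1}/${attempts}):`, error)

      if (i < attempts - 1) {
        await new Promise(resolve => setTimeout(resolve, delayMs * (i + 1)))
      }
    }
  }

  throw lastError
}
```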

#### Optimization

- Implement progressive upload for large files
- Clean up temporary files and failed uploads
- Use batch operations when handling multiple files