diff --git a/.dockerignore b/.dockerignore new file mode 100644 index 00000000..186fe69a --- /dev/null +++ b/.dockerignore @@ -0,0 +1,4 @@ +**/node_modules +**/.next +**/.turbo +**/.env* diff --git a/.gitignore b/.gitignore index 67333569..9cafba4f 100644 --- a/.gitignore +++ b/.gitignore @@ -41,3 +41,4 @@ tls/ dist/ .env +.turbo diff --git a/README.md b/README.md index f3dcf2ce..0934b4fe 100644 --- a/README.md +++ b/README.md @@ -23,8 +23,63 @@ How is this possible? [PGlite](https://pglite.dev/), a WASM version of Postgres This is a monorepo split into the following projects: -- [Frontend (Next.js)](./apps/postgres-new/): This contains the primary web app built with Next.js -- [Backend (pg-gateway)](./apps/db-service/): This serves S3-backed PGlite databases over the PG wire protocol using [pg-gateway](https://github.com/supabase-community/pg-gateway) +- [Web](./apps/web/): The primary web app built with Next.js +- [Browser proxy](./apps/browser-proxy/): Proxies Postgres TCP connections back to the browser using [pg-gateway](https://github.com/supabase-community/pg-gateway) and WebSockets +- [Deploy worker](./apps/deploy-worker/): Deploys in-browser databases to database platforms (currently Supabase is supported) + +### Setup + +From the monorepo root: + +1. Install dependencies: + + ```shell + npm i + ``` + +2. Start the local Supabase stack: + ```shell + npx supabase start + ``` +3. Store the local Supabase URL/anon key in `./apps/web/.env.local`: + ```shell + npx supabase status -o env \ + --override-name api.url=NEXT_PUBLIC_SUPABASE_URL \ + --override-name auth.anon_key=NEXT_PUBLIC_SUPABASE_ANON_KEY | + grep NEXT_PUBLIC >> ./apps/web/.env.local + ``` +4. Create an [OpenAI API key](https://platform.openai.com/api-keys) and save it to `./apps/web/.env.local`: + ```shell + echo 'OPENAI_API_KEY=""' >> ./apps/web/.env.local + ``` +5. Store the local KV (Redis) variables. Use these exact values: + + ```shell + echo 'KV_REST_API_URL="http://localhost:8080"' >> ./apps/web/.env.local + echo 'KV_REST_API_TOKEN="local_token"' >> ./apps/web/.env.local + ``` + +6. Start the local Redis containers (used for rate limiting). They serve an API on port 8080: + + ```shell + docker compose -f ./apps/web/docker-compose.yml up -d + ``` + +7. Fill in the remaining variables for each app as listed in: + + - `./apps/web/.env.example` + - `./apps/browser-proxy/.env.example` + - `./apps/deploy-worker/.env.example` + +### Development + +From the monorepo root: + +```shell +npm run dev +``` + +_**Important:** This command uses `turbo` under the hood, which understands the relationships between dependencies in the monorepo and automatically builds them accordingly (i.e. `./packages/*`). If you bypass `turbo`, you will have to manually build each `./packages/*` before each `./apps/*` can use them._ ## Why rename postgres.new? diff --git a/apps/browser-proxy/README.md b/apps/browser-proxy/README.md index 1b6f4e9e..0d0abc2f 100644 --- a/apps/browser-proxy/README.md +++ b/apps/browser-proxy/README.md @@ -8,17 +8,9 @@ It is using a WebSocket server and a TCP server to make the communication betwee Copy the `.env.example` file to `.env` and set the correct environment variables. -Install dependencies: +Run the dev server from the monorepo root. See [Development](../../README.md#development). -```sh -npm install -``` - -Start the proxy in development mode: - -```sh -npm run dev -``` +The browser proxy will be listening on ports `5432` (Postgres TCP) and `443` (WebSockets).
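Conceptually, the proxy glues these two listeners together: `pg-gateway` accepts the Postgres TCP connection and relays raw wire-protocol messages over a WebSocket that the browser registered for its database. The sketch below is a minimal illustration of that relay, not the actual `browser-proxy` implementation. It assumes the `ws` package and reuses the `pg-gateway` API visible in the deleted `db-service` code further down; the query-string registration and the hard-coded database ID are placeholder assumptions (the real proxy resolves the ID per connection, e.g. via SNI).

```typescript
import net from 'node:net'
import { WebSocketServer, type WebSocket } from 'ws'
import { PostgresConnection } from 'pg-gateway'

// Browsers connect over WebSockets and register under their database ID
const browsers = new Map<string, WebSocket>()

const wss = new WebSocketServer({ port: 443 })
wss.on('connection', (ws, req) => {
  // Illustrative registration scheme: ?databaseId=<id> in the connection URL
  const databaseId = new URL(req.url ?? '/', 'ws://proxy').searchParams.get('databaseId')
  if (!databaseId) {
    ws.close()
    return
  }
  browsers.set(databaseId, ws)
  ws.on('close', () => browsers.delete(databaseId))
})

// Postgres clients connect over plain TCP; each raw wire-protocol message is
// forwarded to the browser's PGlite instance and its response is written back
const server = net.createServer((socket) => {
  const connection = new PostgresConnection(socket, {
    async onMessage(data) {
      const ws = browsers.get('12345') // placeholder ID; the real proxy derives it per connection
      if (!ws) {
        return false // no browser registered; let pg-gateway respond itself
      }
      ws.send(data)
      ws.once('message', (response) => {
        connection.sendData(response as Buffer)
      })
      return true // message handled; skip default processing
    },
  })
})

server.listen(5432)
```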
## Deployment @@ -30,4 +22,4 @@ Deploy the app: ```sh fly deploy --app database-build-browser-proxy -``` \ No newline at end of file +``` diff --git a/apps/db-service/.dockerignore b/apps/db-service/.dockerignore deleted file mode 100644 index 47719bef..00000000 --- a/apps/db-service/.dockerignore +++ /dev/null @@ -1,5 +0,0 @@ -fly.toml -Dockerfile -.dockerignore -node_modules -.git diff --git a/apps/db-service/Dockerfile b/apps/db-service/Dockerfile deleted file mode 100644 index 9476c5f9..00000000 --- a/apps/db-service/Dockerfile +++ /dev/null @@ -1,68 +0,0 @@ -# syntax = docker/dockerfile:1 - -# Adjust NODE_VERSION as desired -ARG NODE_VERSION=20.4.0 -FROM node:${NODE_VERSION}-bookworm as base - -LABEL fly_launch_runtime="NodeJS" - -# NodeJS app lives here -WORKDIR /app - -# Set production environment -ENV NODE_ENV=production - -# Build S3FS -FROM base as build-s3fs - -# Install dependencies -RUN apt-get update && \ - apt-get install -y \ - libfuse-dev - -RUN git clone https://github.com/s3fs-fuse/s3fs-fuse.git --branch v1.94 && \ - cd s3fs-fuse && \ - ./autogen.sh && \ - ./configure && \ - make && \ - make install - -# Build app -FROM base as build-app - -# Install packages needed to build node modules -RUN apt-get update -qq && \ - apt-get install -y \ - python-is-python3 \ - pkg-config \ - build-essential - -# Install node modules -COPY --link package.json . -RUN npm install --production=false - -# Copy application code -COPY --link . . - -# Build app -RUN npm run build - -# Remove development dependencies -RUN npm prune --production - -# Final stage for app image -FROM base - -# Install dependencies -RUN apt-get update && \ - apt-get install -y \ - fuse \ - && rm -rf /var/lib/apt/lists/* - -COPY --from=build-s3fs /usr/local/bin/s3fs /usr/local/bin/s3fs -COPY --from=build-app /app /app - -ENTRYPOINT [ "./entrypoint.sh" ] - -# Start the server by default, this can be overwritten at runtime -CMD [ "node", "dist/index.js" ] diff --git a/apps/db-service/README.md b/apps/db-service/README.md deleted file mode 100644 index b4383a68..00000000 --- a/apps/db-service/README.md +++ /dev/null @@ -1,101 +0,0 @@ -# DB Service - -This service is still WIP. It uses [`s3fs`](https://github.com/s3fs-fuse/s3fs-fuse) to mount an S3-compatible storage to `/mnt/s3` then serve PGlite instances via the PGDATA that lives under `/mnt/s3/dbs/`. - -It also requires TLS certs, since we use SNI to reverse proxy DB connections (eg. `12345.db.example.com` serves `/mnt/s3/dbs/12345`). These certs live under `/mnt/s3/tls`. - -## TODO - -- [x] Containerize -- [ ] Connect to Supabase DB to validate creds/dbs -- [ ] DB versioning -- [ ] PGlite upload service - -## Development - -### Without `s3fs` - -If want to develop locally without dealing with containers or underlying storage: - -1. Generate certs that live under `./tls`: - ```shell - npm run generate:certs - ``` -1. Run the `pg-gateway` server: - ```shell - npm run dev - ``` - All DBs will live under `./dbs`. -1. Connect to the server via `psql`: - - ```shell - psql "host=localhost port=5432 user=postgres" - ``` - - or to test a real database ID, add a loopback entry to your `/etc/hosts` file: - - ``` - # ... - - 127.0.0.1 12345.db.example.com - ``` - - and connect to that host: - - ```shell - psql "host=12345.db.example.com port=5432 user=postgres" - ``` - -### With `s3fs` - -To simulate an environment closer to production, you can test the service with DBs backed by `s3fs` using Minio and Docker. - -1. 
Start Minio as a local s3-compatible server: - ```shell - docker compose up -d minio - ``` -1. Initialize test bucket: - ```shell - docker compose up minio-init - ``` - This will run to completion then exit. -1. Initialize local TLS certs: - - ```shell - docker compose up --build tls-init - ``` - - This will build the container (if it's not cached) then run to completion and exit. Certs are stored under `/mnt/s3/tls`. - -1. Run the `pg-gateway` server: - ```shell - docker compose up --build db-service - ``` - This will build the container (if it's not cached) then run the Node `db-service`. All DBs will live under `/mnt/s3/dbs`. -1. Connect to the server via `psql`: - - ```shell - psql "host=localhost port=5432 user=postgres" - ``` - - > Note the very first time a DB is created will be very slow (`s3fs` writes are slow with that many file handles) so expect this to hang for a while. Subsequent requests will be much quicker. This is temporary anyway - in the future the DB will have to already exist in `/mnt/s3/dbs/` in order to connect. - - or to test a real database ID, add a loopback entry to your `/etc/hosts` file: - - ``` - # ... - - 127.0.0.1 12345.db.example.com - ``` - - and connect to that host: - - ```shell - psql "host=12345.db.example.com port=5432 user=postgres" - ``` - -To stop all Docker containers, run: - -```shell -docker compose down -``` diff --git a/apps/db-service/docker-compose.yml b/apps/db-service/docker-compose.yml deleted file mode 100644 index 13e26335..00000000 --- a/apps/db-service/docker-compose.yml +++ /dev/null @@ -1,63 +0,0 @@ -services: - db-service: - image: db-service - build: - context: . - environment: - S3FS_ENDPOINT: http://minio:9000 - S3FS_BUCKET: test - S3FS_REGION: us-east-1 # default region for s3-compatible APIs - S3FS_MOUNT: /mnt/s3 - AWS_ACCESS_KEY_ID: minioadmin - AWS_SECRET_ACCESS_KEY: minioadmin - ports: - - 5432:5432 - devices: - - /dev/fuse - cap_add: - - SYS_ADMIN - depends_on: - minio: - condition: service_healthy - tls-init: - image: tls-init - build: - context: . - environment: - S3FS_ENDPOINT: http://minio:9000 - S3FS_BUCKET: test - S3FS_REGION: us-east-1 # default region for s3-compatible APIs - S3FS_MOUNT: /mnt/s3 - AWS_ACCESS_KEY_ID: minioadmin - AWS_SECRET_ACCESS_KEY: minioadmin - devices: - - /dev/fuse - cap_add: - - SYS_ADMIN - command: ./scripts/generate-certs.sh - depends_on: - minio: - condition: service_healthy - minio: - image: minio/minio - environment: - MINIO_ROOT_USER: minioadmin - MINIO_ROOT_PASSWORD: minioadmin - ports: - - 9000:9000 - command: server /data - healthcheck: - test: timeout 5s bash -c ':> /dev/tcp/127.0.0.1/9000' || exit 1 - interval: 5s - timeout: 5s - retries: 1 - minio-init: - image: minio/mc - entrypoint: > - /bin/sh -c " - mc alias set local http://minio:9000 minioadmin minioadmin; - (mc ls local/test || mc mb local/test); - " - depends_on: - minio: - condition: service_healthy diff --git a/apps/db-service/entrypoint.sh b/apps/db-service/entrypoint.sh deleted file mode 100755 index be930a28..00000000 --- a/apps/db-service/entrypoint.sh +++ /dev/null @@ -1,38 +0,0 @@ -#!/bin/bash - -set -e -set -o pipefail - -cleanup() { - echo "Unmounting s3fs..." 
- fusermount -u $S3FS_MOUNT - exit 0 -} - -forward_signal() { - kill -$1 "$MAIN_PID" -} - -trap 'forward_signal SIGINT' SIGINT -trap 'forward_signal SIGTERM' SIGTERM -trap 'cleanup' EXIT - -# Create the mount point directory -mkdir -p $S3FS_MOUNT - -# Mount the S3 bucket -s3fs $S3FS_BUCKET $S3FS_MOUNT -o use_path_request_style -o url=$S3FS_ENDPOINT -o endpoint=$S3FS_REGION - -# Check if the mount was successful -if mountpoint -q $S3FS_MOUNT; then - echo "S3 bucket mounted successfully at $S3FS_MOUNT" -else - echo "Failed to mount S3 bucket" - exit 1 -fi - -# Execute the original command -"$@" & -MAIN_PID=$! - -wait $MAIN_PID diff --git a/apps/db-service/package.json b/apps/db-service/package.json deleted file mode 100644 index 1003cc7f..00000000 --- a/apps/db-service/package.json +++ /dev/null @@ -1,20 +0,0 @@ -{ - "name": "db-service", - "type": "module", - "scripts": { - "start": "node dist/index.js", - "dev": "tsx src/index.ts", - "build": "tsc -b", - "generate:certs": "scripts/generate-certs.sh", - "psql": "psql 'host=localhost port=5432 user=postgres sslmode=verify-ca sslrootcert=ca-cert.pem'" - }, - "dependencies": { - "@electric-sql/pglite": "0.2.0-alpha.9", - "pg-gateway": "^0.2.5-alpha.2" - }, - "devDependencies": { - "@types/node": "^20.14.11", - "tsx": "^4.16.2", - "typescript": "^5.5.3" - } -} diff --git a/apps/db-service/scripts/generate-certs.sh b/apps/db-service/scripts/generate-certs.sh deleted file mode 100755 index 8e474774..00000000 --- a/apps/db-service/scripts/generate-certs.sh +++ /dev/null @@ -1,18 +0,0 @@ -#!/bin/bash - -set -e -set -o pipefail - -S3FS_MOUNT=${S3FS_MOUNT:=.} -CERT_DIR="$S3FS_MOUNT/tls" - -mkdir -p $CERT_DIR -cd $CERT_DIR - -openssl genpkey -algorithm RSA -out ca-key.pem -openssl req -new -x509 -key ca-key.pem -out ca-cert.pem -days 365 -subj "/CN=MyCA" - -openssl genpkey -algorithm RSA -out key.pem -openssl req -new -key key.pem -out csr.pem -subj "/CN=*.db.example.com" - -openssl x509 -req -in csr.pem -CA ca-cert.pem -CAkey ca-key.pem -CAcreateserial -out cert.pem -days 365 diff --git a/apps/db-service/src/index.ts b/apps/db-service/src/index.ts deleted file mode 100644 index 9e28a20c..00000000 --- a/apps/db-service/src/index.ts +++ /dev/null @@ -1,109 +0,0 @@ -import { PGlite, PGliteInterface } from '@electric-sql/pglite' -import { mkdir, readFile } from 'node:fs/promises' -import net from 'node:net' -import { hashMd5Password, PostgresConnection, TlsOptions } from 'pg-gateway' - -const s3fsMount = process.env.S3FS_MOUNT ?? '.' -const dbDir = `${s3fsMount}/dbs` -const tlsDir = `${s3fsMount}/tls` - -await mkdir(dbDir, { recursive: true }) -await mkdir(tlsDir, { recursive: true }) - -const tls: TlsOptions = { - key: await readFile(`${tlsDir}/key.pem`), - cert: await readFile(`${tlsDir}/cert.pem`), - ca: await readFile(`${tlsDir}/ca-cert.pem`), -} - -function getIdFromServerName(serverName: string) { - // The left-most subdomain contains the ID - // ie. 
12345.db.example.com -> 12345 - const [id] = serverName.split('.') - return id -} - -const server = net.createServer((socket) => { - let db: PGliteInterface - - const connection = new PostgresConnection(socket, { - serverVersion: '16.3 (PGlite 0.2.0)', - authMode: 'md5Password', - tls, - async validateCredentials(credentials) { - if (credentials.authMode === 'md5Password') { - const { hash, salt } = credentials - const expectedHash = await hashMd5Password('postgres', 'postgres', salt) - return hash === expectedHash - } - return false - }, - async onTlsUpgrade({ tlsInfo }) { - if (!tlsInfo) { - connection.sendError({ - severity: 'FATAL', - code: '08000', - message: `ssl connection required`, - }) - connection.socket.end() - return - } - - if (!tlsInfo.sniServerName) { - connection.sendError({ - severity: 'FATAL', - code: '08000', - message: `ssl sni extension required`, - }) - connection.socket.end() - return - } - - const databaseId = getIdFromServerName(tlsInfo.sniServerName) - - console.log(`Serving database '${databaseId}'`) - - db = new PGlite(`${dbDir}/${databaseId}`) - }, - async onStartup() { - if (!db) { - console.log('PGlite instance undefined. Was onTlsUpgrade never called?') - connection.sendError({ - severity: 'FATAL', - code: 'XX000', - message: `error loading database`, - }) - connection.socket.end() - return true - } - - // Wait for PGlite to be ready before further processing - await db.waitReady - return false - }, - async onMessage(data, { isAuthenticated }) { - // Only forward messages to PGlite after authentication - if (!isAuthenticated) { - return false - } - - // Forward raw message to PGlite - try { - const responseData = await db.execProtocolRaw(data) - connection.sendData(responseData) - } catch (err) { - console.error(err) - } - return true - }, - }) - - socket.on('end', async () => { - console.log('Client disconnected') - await db?.close() - }) -}) - -server.listen(5432, async () => { - console.log('Server listening on port 5432') -}) diff --git a/apps/db-service/tsconfig.json b/apps/db-service/tsconfig.json deleted file mode 100644 index 0876a623..00000000 --- a/apps/db-service/tsconfig.json +++ /dev/null @@ -1,103 +0,0 @@ -{ - "compilerOptions": { - /* Visit https://aka.ms/tsconfig to read more about this file */ - /* Projects */ - // "incremental": true, /* Save .tsbuildinfo files to allow for incremental compilation of projects. */ - // "composite": true, /* Enable constraints that allow a TypeScript project to be used with project references. */ - // "tsBuildInfoFile": "./.tsbuildinfo", /* Specify the path to .tsbuildinfo incremental compilation file. */ - // "disableSourceOfProjectReferenceRedirect": true, /* Disable preferring source files instead of declaration files when referencing composite projects. */ - // "disableSolutionSearching": true, /* Opt a project out of multi-project reference checking when editing. */ - // "disableReferencedProjectLoad": true, /* Reduce the number of projects loaded automatically by TypeScript. */ - /* Language and Environment */ - "target": "ESNext", /* Set the JavaScript language version for emitted JavaScript and include compatible library declarations. */ - // "lib": [], /* Specify a set of bundled library declaration files that describe the target runtime environment. */ - // "jsx": "preserve", /* Specify what JSX code is generated. */ - // "experimentalDecorators": true, /* Enable experimental support for legacy experimental decorators. 
*/ - // "emitDecoratorMetadata": true, /* Emit design-type metadata for decorated declarations in source files. */ - // "jsxFactory": "", /* Specify the JSX factory function used when targeting React JSX emit, e.g. 'React.createElement' or 'h'. */ - // "jsxFragmentFactory": "", /* Specify the JSX Fragment reference used for fragments when targeting React JSX emit e.g. 'React.Fragment' or 'Fragment'. */ - // "jsxImportSource": "", /* Specify module specifier used to import the JSX factory functions when using 'jsx: react-jsx*'. */ - // "reactNamespace": "", /* Specify the object invoked for 'createElement'. This only applies when targeting 'react' JSX emit. */ - // "noLib": true, /* Disable including any library files, including the default lib.d.ts. */ - // "useDefineForClassFields": true, /* Emit ECMAScript-standard-compliant class fields. */ - // "moduleDetection": "auto", /* Control what method is used to detect module-format JS files. */ - /* Modules */ - "module": "NodeNext", /* Specify what module code is generated. */ - "rootDir": "./src", /* Specify the root folder within your source files. */ - // "moduleResolution": "node10", /* Specify how TypeScript looks up a file from a given module specifier. */ - // "baseUrl": "./", /* Specify the base directory to resolve non-relative module names. */ - // "paths": {}, /* Specify a set of entries that re-map imports to additional lookup locations. */ - // "rootDirs": [], /* Allow multiple folders to be treated as one when resolving modules. */ - // "typeRoots": [], /* Specify multiple folders that act like './node_modules/@types'. */ - // "types": [], /* Specify type package names to be included without being referenced in a source file. */ - // "allowUmdGlobalAccess": true, /* Allow accessing UMD globals from modules. */ - // "moduleSuffixes": [], /* List of file name suffixes to search when resolving a module. */ - // "allowImportingTsExtensions": true, /* Allow imports to include TypeScript file extensions. Requires '--moduleResolution bundler' and either '--noEmit' or '--emitDeclarationOnly' to be set. */ - // "resolvePackageJsonExports": true, /* Use the package.json 'exports' field when resolving package imports. */ - // "resolvePackageJsonImports": true, /* Use the package.json 'imports' field when resolving imports. */ - // "customConditions": [], /* Conditions to set in addition to the resolver-specific defaults when resolving imports. */ - // "resolveJsonModule": true, /* Enable importing .json files. */ - // "allowArbitraryExtensions": true, /* Enable importing files with any extension, provided a declaration file is present. */ - // "noResolve": true, /* Disallow 'import's, 'require's or ''s from expanding the number of files TypeScript should add to a project. */ - /* JavaScript Support */ - // "allowJs": true, /* Allow JavaScript files to be a part of your program. Use the 'checkJS' option to get errors from these files. */ - // "checkJs": true, /* Enable error reporting in type-checked JavaScript files. */ - // "maxNodeModuleJsDepth": 1, /* Specify the maximum folder depth used for checking JavaScript files from 'node_modules'. Only applicable with 'allowJs'. */ - /* Emit */ - "declaration": true, /* Generate .d.ts files from TypeScript and JavaScript files in your project. */ - // "declarationMap": true, /* Create sourcemaps for d.ts files. */ - // "emitDeclarationOnly": true, /* Only output d.ts files and not JavaScript files. */ - "sourceMap": true, /* Create source map files for emitted JavaScript files. 
*/ - // "inlineSourceMap": true, /* Include sourcemap files inside the emitted JavaScript. */ - // "outFile": "./", /* Specify a file that bundles all outputs into one JavaScript file. If 'declaration' is true, also designates a file that bundles all .d.ts output. */ - "outDir": "./dist", /* Specify an output folder for all emitted files. */ - // "removeComments": true, /* Disable emitting comments. */ - // "noEmit": true, /* Disable emitting files from a compilation. */ - // "importHelpers": true, /* Allow importing helper functions from tslib once per project, instead of including them per-file. */ - // "downlevelIteration": true, /* Emit more compliant, but verbose and less performant JavaScript for iteration. */ - // "sourceRoot": "", /* Specify the root path for debuggers to find the reference source code. */ - // "mapRoot": "", /* Specify the location where debugger should locate map files instead of generated locations. */ - // "inlineSources": true, /* Include source code in the sourcemaps inside the emitted JavaScript. */ - // "emitBOM": true, /* Emit a UTF-8 Byte Order Mark (BOM) in the beginning of output files. */ - // "newLine": "crlf", /* Set the newline character for emitting files. */ - // "stripInternal": true, /* Disable emitting declarations that have '@internal' in their JSDoc comments. */ - // "noEmitHelpers": true, /* Disable generating custom helper functions like '__extends' in compiled output. */ - // "noEmitOnError": true, /* Disable emitting files if any type checking errors are reported. */ - // "preserveConstEnums": true, /* Disable erasing 'const enum' declarations in generated code. */ - // "declarationDir": "./", /* Specify the output directory for generated declaration files. */ - /* Interop Constraints */ - // "isolatedModules": true, /* Ensure that each file can be safely transpiled without relying on other imports. */ - // "verbatimModuleSyntax": true, /* Do not transform or elide any imports or exports not marked as type-only, ensuring they are written in the output file's format based on the 'module' setting. */ - // "isolatedDeclarations": true, /* Require sufficient annotation on exports so other tools can trivially generate declaration files. */ - // "allowSyntheticDefaultImports": true, /* Allow 'import x from y' when a module doesn't have a default export. */ - "esModuleInterop": true, /* Emit additional JavaScript to ease support for importing CommonJS modules. This enables 'allowSyntheticDefaultImports' for type compatibility. */ - // "preserveSymlinks": true, /* Disable resolving symlinks to their realpath. This correlates to the same flag in node. */ - "forceConsistentCasingInFileNames": true, /* Ensure that casing is correct in imports. */ - /* Type Checking */ - "strict": true, /* Enable all strict type-checking options. */ - // "noImplicitAny": true, /* Enable error reporting for expressions and declarations with an implied 'any' type. */ - // "strictNullChecks": true, /* When type checking, take into account 'null' and 'undefined'. */ - // "strictFunctionTypes": true, /* When assigning functions, check to ensure parameters and the return values are subtype-compatible. */ - // "strictBindCallApply": true, /* Check that the arguments for 'bind', 'call', and 'apply' methods match the original function. */ - // "strictPropertyInitialization": true, /* Check for class properties that are declared but not set in the constructor. */ - // "noImplicitThis": true, /* Enable error reporting when 'this' is given the type 'any'. 
*/ - // "useUnknownInCatchVariables": true, /* Default catch clause variables as 'unknown' instead of 'any'. */ - // "alwaysStrict": true, /* Ensure 'use strict' is always emitted. */ - // "noUnusedLocals": true, /* Enable error reporting when local variables aren't read. */ - // "noUnusedParameters": true, /* Raise an error when a function parameter isn't read. */ - // "exactOptionalPropertyTypes": true, /* Interpret optional property types as written, rather than adding 'undefined'. */ - // "noImplicitReturns": true, /* Enable error reporting for codepaths that do not explicitly return in a function. */ - // "noFallthroughCasesInSwitch": true, /* Enable error reporting for fallthrough cases in switch statements. */ - // "noUncheckedIndexedAccess": true, /* Add 'undefined' to a type when accessed using an index. */ - // "noImplicitOverride": true, /* Ensure overriding members in derived classes are marked with an override modifier. */ - // "noPropertyAccessFromIndexSignature": true, /* Enforces using indexed accessors for keys declared using an indexed type. */ - // "allowUnusedLabels": true, /* Disable error reporting for unused labels. */ - // "allowUnreachableCode": true, /* Disable error reporting for unreachable code. */ - /* Completeness */ - // "skipDefaultLibCheck": true, /* Skip type checking .d.ts files that are included with TypeScript. */ - "skipLibCheck": true /* Skip type checking all .d.ts files. */ - }, - "include": [ - "src/**/*" - ] -} \ No newline at end of file diff --git a/apps/deploy-worker/.env.example b/apps/deploy-worker/.env.example new file mode 100644 index 00000000..04ae599e --- /dev/null +++ b/apps/deploy-worker/.env.example @@ -0,0 +1,8 @@ +SUPABASE_ANON_KEY="" +SUPABASE_OAUTH_CLIENT_ID="" +SUPABASE_OAUTH_SECRET="" +SUPABASE_SERVICE_ROLE_KEY="" +SUPABASE_URL="" +SUPABASE_PLATFORM_URL="https://supabase.com" +SUPABASE_PLATFORM_API_URL="https://api.supabase.com" +SUPABASE_PLATFORM_DEPLOY_REGION="us-east-1" diff --git a/apps/deploy-worker/.gitignore b/apps/deploy-worker/.gitignore new file mode 100644 index 00000000..36fabb6c --- /dev/null +++ b/apps/deploy-worker/.gitignore @@ -0,0 +1,28 @@ +# dev +.yarn/ +!.yarn/releases +.vscode/* +!.vscode/launch.json +!.vscode/*.code-snippets +.idea/workspace.xml +.idea/usage.statistics.xml +.idea/shelf + +# deps +node_modules/ + +# env +.env +.env.production + +# logs +logs/ +*.log +npm-debug.log* +yarn-debug.log* +yarn-error.log* +pnpm-debug.log* +lerna-debug.log* + +# misc +.DS_Store diff --git a/apps/deploy-worker/Dockerfile b/apps/deploy-worker/Dockerfile new file mode 100644 index 00000000..3e8238e4 --- /dev/null +++ b/apps/deploy-worker/Dockerfile @@ -0,0 +1,40 @@ +FROM node:22-alpine AS base + +FROM base AS builder + +WORKDIR /app + +RUN npm install -g turbo@^2 +COPY . . + +# Generate a partial monorepo with a pruned lockfile for a target workspace. +RUN turbo prune @database.build/deploy-worker --docker + +FROM base AS installer +WORKDIR /app + +# First install the dependencies (as they change less often) +COPY --from=builder /app/out/json/ . +RUN npm install + +# Build the project +COPY --from=builder /app/out/full/ . 
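+# `turbo prune --docker` (builder stage) split the repo into out/json and out/full: +# installing from out/json first keeps the dependency layer cacheable, and out/full adds the pruned source for the build step below.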
+RUN npx turbo run build --filter=@database.build/deploy-worker + +FROM base AS runner + +RUN apk add --no-cache postgresql16-client + +# Don't run production as root +RUN addgroup --system --gid 1001 nodejs +RUN adduser --system --uid 1001 nodejs +USER nodejs + +COPY --from=installer --chown=nodejs:nodejs /app /app + +WORKDIR /app/apps/deploy-worker + +# The worker serves HTTP on port 4000 (see src/index.ts and README) +EXPOSE 4000 + +CMD ["node", "--experimental-strip-types", "src/index.ts"] \ No newline at end of file diff --git a/apps/deploy-worker/README.md b/apps/deploy-worker/README.md new file mode 100644 index 00000000..0a3b325a --- /dev/null +++ b/apps/deploy-worker/README.md @@ -0,0 +1,7 @@ +## Development + +Copy the `.env.example` file to `.env` and set the correct environment variables. + +Run the dev server from the monorepo root. See [Development](../../README.md#development). + +The deploy worker will be listening on port `4000` (HTTP). diff --git a/apps/deploy-worker/package.json b/apps/deploy-worker/package.json new file mode 100644 index 00000000..6c4b09b1 --- /dev/null +++ b/apps/deploy-worker/package.json @@ -0,0 +1,28 @@ +{ + "name": "@database.build/deploy-worker", + "type": "module", + "scripts": { + "start": "node --env-file=.env --experimental-strip-types src/index.ts", + "dev": "npm run start", + "build": "echo 'built'", + "type-check": "tsc", + "generate:database-types": "npx supabase gen types --lang=typescript --local > ./supabase/database-types.ts", + "generate:management-api-types": "npx openapi-typescript https://api.supabase.com/api/v1-json -o ./supabase/management-api/types.ts" + }, + "dependencies": { + "@database.build/deploy": "*", + "@hono/node-server": "^1.13.2", + "@hono/zod-validator": "^0.4.1", + "@supabase/supabase-js": "^2.45.4", + "hono": "^4.6.5", + "neverthrow": "^8.0.0", + "openapi-fetch": "^0.13.0", + "zod": "^3.23.8" + }, + "devDependencies": { + "@total-typescript/tsconfig": "^1.0.4", + "@types/node": "^22.5.4", + "openapi-typescript": "^7.4.2", + "typescript": "^5.5.4" + } +} diff --git a/apps/deploy-worker/src/deploy.ts b/apps/deploy-worker/src/deploy.ts new file mode 100644 index 00000000..c35b060d --- /dev/null +++ b/apps/deploy-worker/src/deploy.ts @@ -0,0 +1,199 @@ +import { DeployError, IntegrationRevokedError } from '@database.build/deploy' +import { + createDeployedDatabase, + createManagementApiClient, + generatePassword, + getAccessToken, + getDatabaseUrl, + getPoolerUrl, + SUPABASE_SCHEMAS, + type SupabaseClient, + type SupabaseDeploymentConfig, + type SupabasePlatformConfig, + type SupabaseProviderMetadata, +} from '@database.build/deploy/supabase' +import { exec as execSync } from 'node:child_process' +import { promisify } from 'node:util' +const exec = promisify(execSync) + +/** + * Deploy a local database to Supabase. + * If the database was already deployed, the existing data will be overwritten. + */ +export async function deploy( + ctx: { + supabase: SupabaseClient + supabaseAdmin: SupabaseClient + supabasePlatformConfig: SupabasePlatformConfig + supabaseDeploymentConfig: SupabaseDeploymentConfig + }, + params: { databaseId: string; integrationId: number; localDatabaseUrl: string } +) { + // check if the integration is still active + const integration = await ctx.supabase + .from('deployment_provider_integrations') + .select('*') + .eq('id', params.integrationId) + .single() + + if (integration.error) { + throw new DeployError('Integration not found', { cause: integration.error }) + } + + if (integration.data.revoked_at) { + throw new IntegrationRevokedError() + } + +
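// Exchange the integration's stored (OAuth) credentials for a Supabase access token +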
const accessToken = await getAccessToken(ctx, { + integrationId: params.integrationId, + // the integration isn't revoked, so it must have credentials + credentialsSecretId: integration.data.credentials!, + }) + + const managementApiClient = createManagementApiClient(ctx, accessToken) + + // this is just to check if the integration is still active, an IntegrationRevokedError will be thrown if not + await managementApiClient.GET('/v1/organizations') + + const { data: deployment, error: createDeploymentError } = await ctx.supabase + .from('deployments') + .insert({ + local_database_id: params.databaseId, + }) + .select('id') + .single() + + if (createDeploymentError) { + if (createDeploymentError.code === '23505') { + throw new DeployError('Deployment already in progress', { cause: createDeploymentError }) + } + + throw new DeployError('Cannot create deployment', { cause: createDeploymentError }) + } + + try { + // check if the database was already deployed + const deployedDatabase = await ctx.supabase + .from('deployed_databases') + .select('*') + .eq('local_database_id', params.databaseId) + .eq('deployment_provider_integration_id', params.integrationId) + .maybeSingle() + + if (deployedDatabase.error) { + throw new DeployError('Cannot find deployed database', { cause: deployedDatabase.error }) + } + + let databasePassword: string | undefined + + if (!deployedDatabase.data) { + const createdDeployedDatabase = await createDeployedDatabase(ctx, { + databaseId: params.databaseId, + integrationId: params.integrationId, + }) + + deployedDatabase.data = createdDeployedDatabase.deployedDatabase + databasePassword = createdDeployedDatabase.databasePassword + } + + const { error: linkDeploymentError } = await ctx.supabase + .from('deployments') + .update({ + deployed_database_id: deployedDatabase.data.id, + }) + .eq('id', deployment.id) + + if (linkDeploymentError) { + throw new DeployError('Cannot link deployment with deployed database', { + cause: linkDeploymentError, + }) + } + + const project = (deployedDatabase.data.provider_metadata as SupabaseProviderMetadata).project + + // create temporary credentials to restore the Supabase database + const remoteDatabaseUser = `db_build_${generatePassword()}` + const remoteDatabasePassword = generatePassword() + const createUserResponse = await managementApiClient.POST('/v1/projects/{ref}/database/query', { + body: { + query: `create user "${remoteDatabaseUser}" with password '${remoteDatabasePassword}' in role postgres`, + }, + params: { + path: { + ref: project.id, + }, + }, + }) + + if (createUserResponse.error) { + throw new DeployError('Cannot create temporary role for deployment', { + cause: createUserResponse.error, + }) + } + + const remoteDatabaseUrl = getDatabaseUrl({ + project, + databaseUser: remoteDatabaseUser, + databasePassword: remoteDatabasePassword, + }) + + const excludedSchemas = SUPABASE_SCHEMAS.map((schema) => `--exclude-schema=${schema}`).join(' ') + + // use pg_dump and pg_restore to transfer the data from the local database to the remote database + const command = `pg_dump "${params.localDatabaseUrl}" -Fc ${excludedSchemas} -Z 0 | pg_restore -d "${remoteDatabaseUrl}" --clean --if-exists` + + try { + await exec(command) + } catch (error) { + throw new DeployError( + 'Cannot transfer the data from the local database to the remote database', + { + cause: error, + } + ) + } finally { + // delete the temporary credentials + const deleteUserResponse = await managementApiClient.POST( + '/v1/projects/{ref}/database/query', + { + body: { 
+ query: `drop user "${remoteDatabaseUser}";`, + }, + params: { + path: { ref: project.id }, + }, + } + ) + + if (deleteUserResponse.error) { + throw new DeployError('Cannot delete temporary role for deployment', { + cause: deleteUserResponse.error, + }) + } + } + + await ctx.supabase + .from('deployments') + .update({ + status: 'success', + }) + .eq('id', deployment.id) + + return { + name: project.name, + url: `${ctx.supabasePlatformConfig.url}/dashboard/project/${project.id}`, + databasePassword, + databaseUrl: getDatabaseUrl({ project, databasePassword }), + poolerUrl: getPoolerUrl({ project, databasePassword }), + } + } catch (error) { + await ctx.supabase + .from('deployments') + .update({ + status: 'failed', + }) + .eq('id', deployment.id) + + throw error + } +} diff --git a/apps/deploy-worker/src/index.ts b/apps/deploy-worker/src/index.ts new file mode 100644 index 00000000..04af8fc2 --- /dev/null +++ b/apps/deploy-worker/src/index.ts @@ -0,0 +1,103 @@ +import { DeployError, IntegrationRevokedError } from '@database.build/deploy' +import { + type Database, + type Region, + type SupabaseDeploymentConfig, + type SupabasePlatformConfig, +} from '@database.build/deploy/supabase' +import { revokeIntegration } from '@database.build/deploy/supabase' +import { serve } from '@hono/node-server' +import { zValidator } from '@hono/zod-validator' +import { createClient } from '@supabase/supabase-js' +import { Hono } from 'hono' +import { cors } from 'hono/cors' +import { HTTPException } from 'hono/http-exception' +import { z } from 'zod' +import { deploy } from './deploy.ts' + +const supabasePlatformConfig: SupabasePlatformConfig = { + url: process.env.SUPABASE_PLATFORM_URL!, + apiUrl: process.env.SUPABASE_PLATFORM_API_URL!, + oauthClientId: process.env.SUPABASE_OAUTH_CLIENT_ID!, + oauthSecret: process.env.SUPABASE_OAUTH_SECRET!, +} + +const supabaseDeploymentConfig: SupabaseDeploymentConfig = { + region: process.env.SUPABASE_PLATFORM_DEPLOY_REGION! as Region, +} + +const app = new Hono() + +app.use('*', cors()) + +app.post( + '/', + zValidator( + 'json', + z.object({ + databaseId: z.string(), + integrationId: z.number().int(), + databaseUrl: z.string(), + }) + ), + async (c) => { + const { databaseId, integrationId, databaseUrl: localDatabaseUrl } = c.req.valid('json') + + const accessToken = c.req.header('Authorization')?.replace('Bearer ', '') + const refreshToken = c.req.header('X-Refresh-Token') + if (!accessToken || !refreshToken) { + throw new HTTPException(401, { message: 'Unauthorized' }) + } + + const supabaseAdmin = createClient( + process.env.SUPABASE_URL!, + process.env.SUPABASE_SERVICE_ROLE_KEY! + ) + + const supabase = createClient( + process.env.SUPABASE_URL!, + process.env.SUPABASE_ANON_KEY! 
+ ) + + const { error } = await supabase.auth.setSession({ + access_token: accessToken, + refresh_token: refreshToken, + }) + + if (error) { + throw new HTTPException(401, { message: 'Unauthorized' }) + } + + const ctx = { + supabase, + supabaseAdmin, + supabasePlatformConfig, + supabaseDeploymentConfig, + } + + try { + const project = await deploy(ctx, { databaseId, integrationId, localDatabaseUrl }) + return c.json({ project }) + } catch (error: unknown) { + console.error(error) + if (error instanceof DeployError) { + throw new HTTPException(500, { message: error.message }) + } + if (error instanceof IntegrationRevokedError) { + await revokeIntegration(ctx, { integrationId }) + throw new HTTPException(406, { message: error.message }) + } + throw new HTTPException(500, { message: 'Internal server error' }) + } + } +) + +app.get('') + +const port = 4000 +console.log(`Server is running on port ${port}`) + +serve({ + fetch: app.fetch, + port, +}) diff --git a/apps/deploy-worker/tsconfig.json b/apps/deploy-worker/tsconfig.json new file mode 100644 index 00000000..0963b32c --- /dev/null +++ b/apps/deploy-worker/tsconfig.json @@ -0,0 +1,9 @@ +{ + "extends": "@total-typescript/tsconfig/tsc/no-dom/app", + "include": ["src/**/*.ts"], + "compilerOptions": { + "allowImportingTsExtensions": true, + "noEmit": true, + "outDir": "dist" + } +} diff --git a/apps/postgres-new/.env.example b/apps/postgres-new/.env.example deleted file mode 100644 index b8560b13..00000000 --- a/apps/postgres-new/.env.example +++ /dev/null @@ -1,16 +0,0 @@ -NEXT_PUBLIC_SUPABASE_ANON_KEY="" -NEXT_PUBLIC_SUPABASE_URL="" -NEXT_PUBLIC_BROWSER_PROXY_DOMAIN="" - -OPENAI_API_KEY="" -# Optional -# OPENAI_API_BASE="" -# OPENAI_MODEL="" - -# Vercel KV (local Docker available) -KV_REST_API_URL="http://localhost:8080" -KV_REST_API_TOKEN="local_token" - -NEXT_PUBLIC_LEGACY_DOMAIN=https://postgres.new -NEXT_PUBLIC_CURRENT_DOMAIN=https://database.build -REDIRECT_LEGACY_DOMAIN=false diff --git a/apps/postgres-new/app/(main)/db/[id]/page.tsx b/apps/postgres-new/app/(main)/db/[id]/page.tsx deleted file mode 100644 index cd13c697..00000000 --- a/apps/postgres-new/app/(main)/db/[id]/page.tsx +++ /dev/null @@ -1,29 +0,0 @@ -'use client' - -import { useRouter } from 'next/navigation' -import { useEffect } from 'react' -import { useApp } from '~/components/app-provider' -import Workspace from '~/components/workspace' - -export default function Page({ params }: { params: { id: string } }) { - const databaseId = params.id - const router = useRouter() - const { dbManager } = useApp() - - useEffect(() => { - async function run() { - if (!dbManager) { - throw new Error('dbManager is not available') - } - - try { - await dbManager.getDbInstance(databaseId) - } catch (err) { - router.push('/') - } - } - run() - }, [dbManager, databaseId, router]) - - return -} diff --git a/apps/postgres-new/app/api/chat/route.ts b/apps/postgres-new/app/api/chat/route.ts deleted file mode 100644 index b81e4af5..00000000 --- a/apps/postgres-new/app/api/chat/route.ts +++ /dev/null @@ -1,120 +0,0 @@ -import { createOpenAI } from '@ai-sdk/openai' -import { Ratelimit } from '@upstash/ratelimit' -import { kv } from '@vercel/kv' -import { ToolInvocation, convertToCoreMessages, streamText } from 'ai' -import { codeBlock } from 'common-tags' -import { convertToCoreTools, maxMessageContext, maxRowLimit, tools } from '~/lib/tools' -import { createClient } from '~/utils/supabase/server' - -// Allow streaming responses up to 30 seconds -export const maxDuration = 30 - -const 
inputTokenRateLimit = new Ratelimit({ - redis: kv, - limiter: Ratelimit.fixedWindow(1000000, '30m'), - prefix: 'ratelimit:tokens:input', -}) - -const outputTokenRateLimit = new Ratelimit({ - redis: kv, - limiter: Ratelimit.fixedWindow(10000, '30m'), - prefix: 'ratelimit:tokens:output', -}) - -type Message = { - role: 'user' | 'assistant' - content: string - toolInvocations?: (ToolInvocation & { result: any })[] -} - -const chatModel = process.env.OPENAI_MODEL ?? 'gpt-4o-2024-08-06' - -// Configure OpenAI client with custom base URL -const openai = createOpenAI({ - apiKey: process.env.OPENAI_API_KEY, - baseURL: process.env.OPENAI_API_BASE ?? 'https://api.openai.com/v1', - compatibility: 'strict', -}) - -export async function POST(req: Request) { - const supabase = createClient() - - const { data, error } = await supabase.auth.getUser() - - // We have middleware, so this should never happen (used for type narrowing) - if (error) { - return new Response('Unauthorized', { status: 401 }) - } - - const { user } = data - - const { remaining: inputRemaining } = await inputTokenRateLimit.getRemaining(user.id) - const { remaining: outputRemaining } = await outputTokenRateLimit.getRemaining(user.id) - - if (inputRemaining <= 0 || outputRemaining <= 0) { - return new Response('Rate limited', { status: 429 }) - } - - const { messages }: { messages: Message[] } = await req.json() - - // Trim the message context sent to the LLM to mitigate token abuse - const trimmedMessageContext = messages.slice(-maxMessageContext) - - const result = await streamText({ - system: codeBlock` - You are a helpful database assistant. Under the hood you have access to an in-browser Postgres database called PGlite (https://github.com/electric-sql/pglite). - Some special notes about this database: - - foreign data wrappers are not supported - - the following extensions are available: - - plpgsql [pre-enabled] - - vector (https://github.com/pgvector/pgvector) [pre-enabled] - - use <=> for cosine distance (default to this) - - use <#> for negative inner product - - use <-> for L2 distance - - use <+> for L1 distance - - note queried vectors will be truncated/redacted due to their size - export as CSV if the full vector is required - - When generating tables, do the following: - - For primary keys, always use "id bigint primary key generated always as identity" (not serial) - - Prefer 'text' over 'varchar' - - Keep explanations brief but helpful - - Don't repeat yourself after creating the table - - When creating sample data: - - Make the data realistic, including joined data - - Check for existing records/conflicts in the table - - When querying data, limit to 5 by default. The maximum number of rows you're allowed to fetch is ${maxRowLimit} (to protect AI from token abuse). - If the user needs to fetch more than ${maxRowLimit} rows at once, they can export the query as a CSV. - - When performing FTS, always use 'simple' (languages aren't available). - - When importing CSVs try to solve the problem yourself (eg. use a generic text column, then refine) - vs. asking the user to change the CSV. No need to select rows after importing. - - You also know math. All math equations and expressions must be written in KaTex and must be wrapped in double dollar \`$$\`: - - Inline: $$\\sqrt{26}$$ - - Multiline: - $$ - \\sqrt{26} - $$ - - No images are allowed. Do not try to generate or link images, including base64 data URLs. - - Feel free to suggest corrections for suspected typos. 
- `, - model: openai(chatModel), - messages: convertToCoreMessages(trimmedMessageContext), - tools: convertToCoreTools(tools), - async onFinish({ usage }) { - await inputTokenRateLimit.limit(user.id, { - rate: usage.promptTokens, - }) - await outputTokenRateLimit.limit(user.id, { - rate: usage.completionTokens, - }) - }, - }) - - return result.toAIStreamResponse() -} diff --git a/apps/postgres-new/components/chat.tsx b/apps/postgres-new/components/chat.tsx deleted file mode 100644 index 9b375864..00000000 --- a/apps/postgres-new/components/chat.tsx +++ /dev/null @@ -1,714 +0,0 @@ -'use client' - -import { Message, generateId } from 'ai' -import { useChat } from 'ai/react' -import { AnimatePresence, m } from 'framer-motion' -import { ArrowDown, ArrowUp, Flame, Paperclip, PlugIcon, Square } from 'lucide-react' -import { - FormEventHandler, - useCallback, - useEffect, - useImperativeHandle, - useMemo, - useRef, - useState, -} from 'react' -import { Button } from '~/components/ui/button' -import { Skeleton } from '~/components/ui/skeleton' -import { TablesData } from '~/data/tables/tables-query' -import { saveFile } from '~/lib/files' -import { useAutoScroll, useDropZone } from '~/lib/hooks' -import { requestFileUpload } from '~/lib/util' -import { cn } from '~/lib/utils' -import { AiIconAnimation } from './ai-icon-animation' -import { useApp } from './app-provider' -import ChatMessage from './chat-message' -import { CopyableField } from './copyable-field' -import SignInButton from './sign-in-button' -import { Accordion, AccordionContent, AccordionItem, AccordionTrigger } from './ui/accordion' -import { Tabs, TabsContent, TabsList, TabsTrigger } from './ui/tabs' -import { useWorkspace } from './workspace' - -export function getInitialMessages(tables: TablesData): Message[] { - return [ - // An artificial tool call containing the DB schema - // as if it was already called by the LLM - { - id: generateId(), - role: 'assistant', - content: '', - toolInvocations: [ - { - state: 'result', - toolCallId: generateId(), - toolName: 'getDatabaseSchema', - args: {}, - result: tables, - }, - ], - }, - ] -} - -export default function Chat() { - const { user, isLoadingUser, focusRef, setIsSignInDialogOpen, isRateLimited, liveShare } = - useApp() - const [inputFocusState, setInputFocusState] = useState(false) - - const { - databaseId, - isLoadingMessages, - isLoadingSchema, - isConversationStarted, - messages, - appendMessage, - stopReply, - } = useWorkspace() - - const { input, setInput, handleInputChange, isLoading } = useChat({ - id: databaseId, - api: '/api/chat', - }) - - const { ref: scrollRef, isSticky, scrollToEnd } = useAutoScroll() - - // eslint-disable-next-line react-hooks/exhaustive-deps - const nextMessageId = useMemo(() => generateId(), [messages.length]) - - const sendCsv = useCallback( - async (file: File) => { - if (file.type !== 'text/csv') { - // Add an artificial tool call requesting the CSV - // with an error indicating the file wasn't a CSV - appendMessage({ - role: 'assistant', - content: '', - toolInvocations: [ - { - state: 'result', - toolCallId: generateId(), - toolName: 'requestCsv', - args: {}, - result: { - success: false, - error: `The file has type '${file.type}'. Let the user know that only CSV imports are currently supported.`, - }, - }, - ], - }) - return - } - - const fileId = generateId() - - await saveFile(fileId, file) - - const text = await file.text() - - // Add an artificial tool call requesting the CSV - // with the file result all in one operation. 
- appendMessage({ - role: 'assistant', - content: '', - toolInvocations: [ - { - state: 'result', - toolCallId: generateId(), - toolName: 'requestCsv', - args: {}, - result: { - success: true, - fileId: fileId, - file: { - name: file.name, - size: file.size, - type: file.type, - lastModified: file.lastModified, - }, - preview: text.split('\n').slice(0, 4).join('\n').trim(), - }, - }, - ], - }) - }, - [appendMessage] - ) - - const { - ref: dropZoneRef, - isDraggingOver, - cursor: dropZoneCursor, - } = useDropZone({ - async onDrop(files) { - if (!user) { - return - } - - const [file] = files - - if (file) { - await sendCsv(file) - } - }, - cursorElement: ( - - Add file to chat - - ), - }) - - const inputRef = useRef(null) - - // Scroll to end when chat is first mounted - useEffect(() => { - scrollToEnd() - }, [scrollToEnd]) - - // Focus input when LLM starts responding (for cases when it wasn't focused prior) - useEffect(() => { - if (isLoading) { - inputRef.current?.focus() - } - }, [isLoading]) - - const lastMessage = messages.at(-1) - - const handleFormSubmit: FormEventHandler = useCallback( - (e) => { - // Manually manage message submission so that we can control its ID - // We want to control the ID so that we can perform layout animations via `layoutId` - // (see hidden dummy message above) - e.preventDefault() - appendMessage({ - id: nextMessageId, - role: 'user', - content: input, - }) - setInput('') - - // Scroll to bottom after the message has rendered - setTimeout(() => { - scrollToEnd() - }, 0) - }, - [appendMessage, nextMessageId, input, setInput, scrollToEnd] - ) - - const [isMessageAnimationComplete, setIsMessageAnimationComplete] = useState(false) - - const isChatEnabled = - !isLoadingMessages && !isLoadingSchema && user !== undefined && !liveShare.isLiveSharing - - const isSubmitEnabled = isChatEnabled && Boolean(input.trim()) - - // Create imperative handle that can be used to focus the input anywhere in the app - useImperativeHandle(focusRef, () => ({ - focus() { - if (inputRef.current) { - inputRef.current.focus() - } - }, - })) - - return ( -
- {isDraggingOver && ( - - )} - {dropZoneCursor} -
- {isLoadingMessages || isLoadingSchema ? ( -
- - - - - - -
- ) : isConversationStarted ? ( -
- - setIsMessageAnimationComplete(false)} - onAnimationComplete={() => setIsMessageAnimationComplete(true)} - initial="show" - animate="show" - > - {messages.map((message, i) => ( - - ))} - - {isRateLimited && !isLoading && ( - - -
-

Hang tight!

-

- We're seeing a lot of AI traffic from your end and need to temporarily - pause your chats to make sure our servers don't melt. -

- -

Have a quick coffee break and try again in a few minutes!

-
-
- )} -
- - {isLoading && ( - - - - - {lastMessage && - (lastMessage.role === 'user' || - (lastMessage.role === 'assistant' && !lastMessage.content)) && ( - - Working on it... - - )} - - )} - -
-
- ) : ( -
- {user ? ( - <> - - - What would you like to create? - - - ) : ( - - -

- To prevent abuse we ask you to sign in before chatting with AI. -

-

{ - setIsSignInDialogOpen(true) - }} - > - Why do I need to sign in? -

-
- )} -
- )} - - {!isSticky && ( - - - - )} - -
-
- - {!user && !isLoadingUser && isConversationStarted && ( - - -

- To prevent abuse we ask you to sign in before chatting with AI. -

-

{ - setIsSignInDialogOpen(true) - }} - > - Why do I need to sign in? -

-
- )} -
-
- {/* - * This is a hidden dummy message acting as an animation anchor - * before the real message is added to the chat. - * - * The animation starts in this element's position and moves over to - * the location of the real message after submit. - * - * It works by sharing the same `layoutId` between both message elements - * which framer motion requires to animate between them. - */} - {input && ( - - {input} - - )} - -