Linearlite example using PGlite as the client-side store
Showing 131 changed files with 5,801 additions and 0 deletions.
@@ -0,0 +1,3 @@
dist
.env.local
db/data/
@@ -0,0 +1,6 @@
{
  "trailingComma": "es5",
  "semi": false,
  "tabWidth": 2,
  "singleQuote": true
}
@@ -0,0 +1,76 @@
# Linearlite + PGlite + ElectricSQL

This is a demo app that shows how to build a local-first app using PGlite and the ElectricSQL sync engine.

It's an example of a team collaboration app, such as Linear, built using ElectricSQL - a sync engine that synchronises little subsets of your Postgres data into local apps and services, so you can have the data you need, in sync, wherever you need it.

It's built on top of the excellent clone of the Linear UI built by [Tuan Nguyen](https://github.com/tuan3w).

## Setup

This example is part of the [ElectricSQL monorepo](../..) and is designed to be built and run as part of the [pnpm workspace](https://pnpm.io/workspaces) defined in [`../../pnpm-workspace.yaml`](../../pnpm-workspace.yaml).

Navigate to the root directory of the monorepo, e.g.:

```shell
cd ../../
```

Install and build all of the workspace packages and examples:

```shell
pnpm install
pnpm run -r build
```

Navigate back to this directory:

```shell
cd examples/linearlite
```

Start the example backend services using [Docker Compose](https://docs.docker.com/compose/):

```shell
pnpm backend:up
```

> Note that this always stops and deletes the volumes mounted by any other example backend containers that are running or have been run before. This ensures that the example always starts with a clean database and clean disk.

Start the write path server:

```shell
pnpm run write-server
```

Now start the dev server:

```shell
pnpm dev
```

When you're done, stop the backend services using:

```shell
pnpm backend:down
```

## How it works

Linearlite demonstrates a local-first architecture using ElectricSQL and PGlite. Here's how the different pieces fit together:

### Backend Components

1. **Postgres Database**: The source of truth, containing the complete dataset.

2. **Electric Sync Service**: Runs in front of Postgres, managing data synchronization from it to the clients. It produces replication streams for subsets of the database called "shapes".

3. **Write Server**: A simple HTTP server that handles write operations, applying them to the Postgres database (a rough sketch follows below).
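
The write server code itself isn't shown in this excerpt, but the idea can be sketched. This is an illustrative sketch only, assuming an Express-style handler and the same `postgres` client used by the data-loading script; the route and columns are hypothetical, not the example's actual API:

```js
// Illustrative sketch of a write path: accept a mutation over HTTP and apply
// it to the Postgres source of truth. Route and payload shape are assumptions.
import express from 'express'
import postgres from 'postgres'

const sql = postgres(process.env.DATABASE_URL)
const app = express()
app.use(express.json())

app.post('/apply-changes', async (req, res) => {
  const { id, title, status, modified } = req.body
  await sql`
    UPDATE issue
    SET title = ${title}, status = ${status}, modified = ${modified}
    WHERE id = ${id}
  `
  res.json({ ok: true })
})

app.listen(3001, () => console.log('write server listening on port 3001'))
```

Electric then picks the change up from Postgres and streams it back out to every client subscribed to the affected shape.
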
### Frontend Components

1. **PGlite**: An in-browser database that stores a local copy of the data, enabling offline functionality and fast queries.

2. **PGlite + Electric Sync Plugin**: Connects PGlite to the Electric sync service and loads the data into the local database (see the sketch after this list).

3. **React Frontend**: A Linear-inspired UI that interacts directly with the local database.
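
A rough sketch of how these frontend pieces fit together, assuming the `@electric-sql/pglite` and `@electric-sql/pglite-sync` packages; exact option names vary between versions, so treat this as illustrative rather than the example's actual code:

```js
// Illustrative sketch: create a local PGlite database with the Electric sync
// extension and stream the `issue` shape into a local table. Option names are
// assumptions and may differ from the versions this example uses.
import { PGlite } from '@electric-sql/pglite'
import { electricSync } from '@electric-sql/pglite-sync'

const pg = await PGlite.create({
  dataDir: 'idb://linearlite', // persist the local copy in IndexedDB
  extensions: { electric: electricSync() },
})

// Local table that the synced shape is written into.
await pg.exec(`
  CREATE TABLE IF NOT EXISTS issue (
    id uuid PRIMARY KEY,
    title text,
    status text,
    modified timestamptz
  );
`)

// Sync the `issue` shape from the Electric service (port 3000 in docker-compose);
// the React UI then reads and writes against PGlite directly.
await pg.electric.syncShapeToTable({
  shape: { url: 'http://localhost:3000/v1/shape', table: 'issue' },
  table: 'issue',
  primaryKey: ['id'],
})
```

Because the UI reads from the local database, queries stay fast and the app keeps working offline; writes go through the write server and flow back in via the sync stream.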
@@ -0,0 +1,30 @@
version: "3.3"
name: "pglite-linearlite"

services:
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: linearlite
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
    ports:
      - 54321:5432
    volumes:
      - ./postgres.conf:/etc/postgresql/postgresql.conf:ro
    tmpfs:
      - /var/lib/postgresql/data
      - /tmp
    command:
      - postgres
      - -c
      - config_file=/etc/postgresql/postgresql.conf

  backend:
    image: electricsql/electric
    environment:
      DATABASE_URL: postgresql://postgres:password@postgres:5432/linearlite?sslmode=disable
    ports:
      - 3000:3000
    depends_on:
      - postgres
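
The compose file publishes Postgres on host port 54321 and the Electric sync service on port 3000, and mounts `postgres.conf` (the next file) so that logical replication is enabled, which Electric requires. As a quick sanity check that the database side is up, a sketch using the same `postgres` package the loading script uses:

```js
// Illustrative check: connect to the Dockerised Postgres using the credentials
// and host port from docker-compose, and confirm logical replication is on.
import postgres from 'postgres'

const sql = postgres('postgresql://postgres:password@localhost:54321/linearlite')

const [{ wal_level }] = await sql`SHOW wal_level`
console.log(wal_level) // expected: 'logical' (set in postgres.conf)

await sql.end()
```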
@@ -0,0 +1,2 @@
listen_addresses = '*'
wal_level = logical
@@ -0,0 +1,53 @@
import { faker } from '@faker-js/faker'
import { generateNKeysBetween } from 'fractional-indexing'
import { v4 as uuidv4 } from 'uuid'

export function generateIssues(numIssues) {
  // generate properly spaced kanban keys and shuffle them
  const kanbanKeys = faker.helpers.shuffle(
    generateNKeysBetween(null, null, numIssues)
  )
  return Array.from({ length: numIssues }, (_, idx) =>
    generateIssue(kanbanKeys[idx])
  )
}

function generateIssue(kanbanKey) {
  const issueId = uuidv4()
  const createdAt = faker.date.past()
  return {
    id: issueId,
    title: faker.lorem.sentence({ min: 3, max: 8 }),
    description: faker.lorem.sentences({ min: 2, max: 6 }, `\n`),
    priority: faker.helpers.arrayElement([`none`, `low`, `medium`, `high`]),
    status: faker.helpers.arrayElement([
      `backlog`,
      `todo`,
      `in_progress`,
      `done`,
      `canceled`,
    ]),
    created: createdAt.toISOString(),
    modified: faker.date
      .between({ from: createdAt, to: new Date() })
      .toISOString(),
    kanbanorder: kanbanKey,
    username: faker.internet.userName(),
    comments: faker.helpers.multiple(
      () => generateComment(issueId, createdAt),
      { count: faker.number.int({ min: 0, max: 1 }) }
    ),
  }
}

function generateComment(issueId, issueCreatedAt) {
  const createdAt = faker.date.between({ from: issueCreatedAt, to: new Date() })
  return {
    id: uuidv4(),
    body: faker.lorem.text(),
    username: faker.internet.userName(),
    issue_id: issueId,
    created: createdAt.toISOString(),
    modified: createdAt.toISOString(), // comments are never modified
  }
}
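
A small usage sketch (not part of the commit) showing what `generateIssues` produces; the nested `comments` are split out and inserted separately by the loading script that follows:

```js
// Illustrative usage: each issue carries a shuffled fractional-indexing key in
// `kanbanorder` plus zero or one nested comments.
import { generateIssues } from './generate_data.js'

const issues = generateIssues(3)
for (const issue of issues) {
  console.log(issue.id, issue.kanbanorder, issue.status, issue.comments.length)
}
```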
@@ -0,0 +1,72 @@
import postgres from 'postgres'
import { generateIssues } from './generate_data.js'

if (!process.env.DATABASE_URL) {
  throw new Error(`DATABASE_URL is not set`)
}

const DATABASE_URL = process.env.DATABASE_URL
const ISSUES_TO_LOAD = process.env.ISSUES_TO_LOAD || 512
const BATCH_SIZE = 1000
const issues = generateIssues(ISSUES_TO_LOAD)

console.info(`Connecting to Postgres at ${DATABASE_URL}`)
const sql = postgres(DATABASE_URL)

async function batchInsert(sql, table, columns, dataArray, batchSize = 1000) {
  for (let i = 0; i < dataArray.length; i += batchSize) {
    const batch = dataArray.slice(i, i + batchSize)

    await sql`
      INSERT INTO ${sql(table)} ${sql(batch, columns)}
    `

    process.stdout.write(
      `Loaded ${Math.min(i + batchSize, dataArray.length)} of ${dataArray.length} ${table}s\r`
    )
  }
}

const issueCount = issues.length
let commentCount = 0

try {
  // Process data in batches
  for (let i = 0; i < issues.length; i += BATCH_SIZE) {
    const issueBatch = issues.slice(i, i + BATCH_SIZE)

    await sql.begin(async (sql) => {
      // Disable FK checks
      await sql`SET CONSTRAINTS ALL DEFERRED`

      // Insert issues
      const issuesData = issueBatch.map(({ comments: _, ...rest }) => rest)
      const issueColumns = Object.keys(issuesData[0])
      await batchInsert(sql, 'issue', issueColumns, issuesData, BATCH_SIZE)

      // Insert related comments
      const batchComments = issueBatch.flatMap((issue) => issue.comments)
      const commentColumns = Object.keys(batchComments[0])
      await batchInsert(
        sql,
        'comment',
        commentColumns,
        batchComments,
        BATCH_SIZE
      )

      commentCount += batchComments.length
    })

    process.stdout.write(
      `\nProcessed batch ${Math.floor(i / BATCH_SIZE) + 1}: ${Math.min(i + BATCH_SIZE, issues.length)} of ${issues.length} issues\n`
    )
  }

  console.info(`Loaded ${issueCount} issues with ${commentCount} comments.`)
} catch (error) {
  console.error('Error loading data:', error)
  throw error
} finally {
  await sql.end()
}