chore: Linearlite - Electric sync + PGlite demo app #393

Open
wants to merge 35 commits into base: main
Commits (35)
a3a20f6
Initial commit of Linearlite demo, read/write to local PGlite working
samwillis Oct 3, 2024
2d3ab99
Use new liveQuery that can load a live query in a route loader
samwillis Oct 9, 2024
e84d9ec
Working syncToTable for PGlite Linearlite
samwillis Oct 9, 2024
4960fc5
Configure eslint and fix failures
samwillis Oct 10, 2024
8482d37
Local tentative state via shadow tables (no sync back yet)
samwillis Oct 10, 2024
08c3503
Synced status icons
samwillis Oct 10, 2024
bc7270c
Missing SQL update on board
samwillis Oct 10, 2024
8b80112
Add PGLite REPL modal
samwillis Oct 10, 2024
304bbff
Working two way sync!
samwillis Oct 10, 2024
220d828
Update readme and change name of write server
samwillis Oct 10, 2024
c7760f7
Adjust ordering of issue to fall back to id
samwillis Oct 11, 2024
d4aa0cd
Use windowed live query
samwillis Oct 13, 2024
59ffafb
WIP refactor to local edits in synced table
samwillis Oct 14, 2024
997a569
Working two way sync
samwillis Oct 15, 2024
3b26515
Improve render performance
samwillis Oct 15, 2024
5b29bd2
Fix search
samwillis Oct 15, 2024
1e66ad9
Fix debounce on issue page
samwillis Oct 15, 2024
48ff4bb
WIP Kanban board with windowed live queries
samwillis Oct 16, 2024
a080ce7
Debounce search
samwillis Oct 16, 2024
cc9f4c4
Use Postgres FTS for search
samwillis Oct 16, 2024
cc4b7ab
Tweaks
samwillis Oct 16, 2024
da74956
Fix type issues
samwillis Nov 20, 2024
ae73344
"Fix" eslint
samwillis Nov 20, 2024
0d1e75c
Upgrade electric and convert the write server to Hono
samwillis Nov 28, 2024
d01effc
Swap to postgres.js so it's compatible with CF Workers
samwillis Nov 28, 2024
827a23e
pglite-sync: wip
samwillis Nov 28, 2024
a28c961
split out initial sync
samwillis Nov 28, 2024
fb18325
Load data in batches
samwillis Dec 2, 2024
44bf9b3
Loading faster, and a loading screen
samwillis Dec 3, 2024
d2e5ec4
Added table schema checks to linearlite demo triggers. (#449)
FindAPattern Dec 3, 2024
a88cd82
Deploy with supabase
samwillis Dec 3, 2024
64d7ea0
Fix write server
samwillis Dec 5, 2024
32d9e71
pglite-live: remove console.log
samwillis Dec 8, 2024
c62c3d4
WIP Readme
samwillis Dec 8, 2024
ba95a72
Fix sync for new electric client
samwillis Dec 9, 2024
3 changes: 3 additions & 0 deletions demos/linearlite/.gitignore
@@ -0,0 +1,3 @@
dist
.env.local
db/data/
6 changes: 6 additions & 0 deletions demos/linearlite/.prettierrc
@@ -0,0 +1,6 @@
{
  "trailingComma": "es5",
  "semi": false,
  "tabWidth": 2,
  "singleQuote": true
}
60 changes: 60 additions & 0 deletions demos/linearlite/README.md
@@ -0,0 +1,60 @@
# Linearlite + PGlite + ElectricSQL

This is a demo app that shows how to build a local-first app using PGlite and the ElectricSQL sync engine.

It's an example of a team collaboration app, in the style of Linear, built using ElectricSQL - a sync engine that synchronises small subsets of your Postgres data into local apps and services, so you can have the data you need, in sync, wherever you need it.

It's built on top of the excellent Linear UI clone by [Tuan Nguyen](https://github.com/tuan3w).

## Setup

1. Make sure you've installed all dependencies for the monorepo and built all packages.

From the root directory:

- `pnpm i`
- `pnpm run -r build`

2. Add a `.env` file in this directory with the following (or similar):

```
DATABASE_URL=postgresql://postgres:password@localhost:54321/linearlite
VITE_ELECTRIC_URL=http://localhost:3000
VITE_WRITE_SERVER_URL=http://localhost:3001
```

3. Start the docker containers:

`pnpm run backend:up`

4. Start the write path server:

`pnpm run write-server`

5. Start the dev server:

`pnpm run dev`

6. When done, tear down the backend containers:

`pnpm run backend:down`

## How it works

LinearLite demonstrates a local-first architecture using ElectricSQL and PGlite. Here's how the different pieces fit together:

### Backend Components

1. **Postgres Database**: The source of truth, containing the complete dataset.

2. **Electric Sync Service**: Runs in front of Postgres, managing data synchronization from it to the clients. It produces replication streams for subsets of the database, called “shapes”.

3. **Write Server**: A simple HTTP server that handles write operations, applying them to the Postgres database (a sketch of one such route is shown after this list).
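
As a rough illustration only (not the actual code in this PR), a write-path route built with Hono and postgres.js might look something like the sketch below; the `/update-issue` endpoint and its payload are hypothetical:

```ts
// Hypothetical write-server route, assuming Hono + postgres.js.
// The real write server in this demo may differ.
import { Hono } from 'hono'
import postgres from 'postgres'

const sql = postgres(process.env.DATABASE_URL!)
const app = new Hono()

// Apply a locally-made edit to the canonical Postgres database.
// Electric then streams the change back out to every subscribed client.
app.post('/update-issue', async (c) => {
  const { id, title, description } = await c.req.json()
  await sql`
    UPDATE issue
    SET title = ${title}, description = ${description}, modified = now()
    WHERE id = ${id}
  `
  return c.json({ ok: true })
})

export default app
```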

### Frontend Components

1. **PGlite**: An in-browser database that stores a local copy of the data, enabling offline functionality and fast queries.

2. **PGlite + Electric Sync Plugin**: Connects PGlite to the Electric sync service and loads the data into the local database (see the sketch after this list).

3. **React Frontend**: A Linear-inspired UI that interacts directly with the local database.
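
To make the client-side flow concrete, here is a minimal sketch assuming the `@electric-sql/pglite-sync` extension's `syncShapeToTable` API and an Electric service on `http://localhost:3000`; the exact option names may differ from the versions used in this PR:

```ts
// Minimal sketch: sync an Electric "shape" into a local PGlite table.
// Assumes @electric-sql/pglite and @electric-sql/pglite-sync; the shape
// options shown here are illustrative and may differ between versions.
import { PGlite } from '@electric-sql/pglite'
import { live } from '@electric-sql/pglite/live'
import { electricSync } from '@electric-sql/pglite-sync'

const pg = await PGlite.create({
  dataDir: 'idb://linearlite',
  extensions: { live, electric: electricSync() },
})

// Stream the `issue` table from the Electric sync service into PGlite.
await pg.electric.syncShapeToTable({
  shape: { url: 'http://localhost:3000/v1/shape', params: { table: 'issue' } },
  table: 'issue',
  primaryKey: ['id'],
})

// Queries now run against the local, synced copy of the data.
const { rows } = await pg.query('SELECT count(*) FROM issue')
console.log(rows)
```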
30 changes: 30 additions & 0 deletions demos/linearlite/backend/docker-compose.yml
@@ -0,0 +1,30 @@
version: "3.3"
name: "pglite-linearlite"

services:
  postgres:
    image: postgres:16-alpine
    environment:
      POSTGRES_DB: linearlite
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: password
    ports:
      - 54321:5432
    volumes:
      - ./postgres.conf:/etc/postgresql/postgresql.conf:ro
    tmpfs:
      - /var/lib/postgresql/data
      - /tmp
    command:
      - postgres
      - -c
      - config_file=/etc/postgresql/postgresql.conf

  backend:
    image: electricsql/electric
    environment:
      DATABASE_URL: postgresql://postgres:password@postgres:5432/linearlite?sslmode=disable
    ports:
      - 3000:3000
    depends_on:
      - postgres
2 changes: 2 additions & 0 deletions demos/linearlite/backend/postgres.conf
@@ -0,0 +1,2 @@
listen_addresses = '*'
wal_level = logical
53 changes: 53 additions & 0 deletions demos/linearlite/db/generate_data.js
@@ -0,0 +1,53 @@
import { faker } from '@faker-js/faker'
import { generateNKeysBetween } from 'fractional-indexing'
import { v4 as uuidv4 } from 'uuid'

export function generateIssues(numIssues) {
  // generate properly spaced kanban keys and shuffle them
  const kanbanKeys = faker.helpers.shuffle(
    generateNKeysBetween(null, null, numIssues)
  )
  return Array.from({ length: numIssues }, (_, idx) =>
    generateIssue(kanbanKeys[idx])
  )
}

function generateIssue(kanbanKey) {
  const issueId = uuidv4()
  const createdAt = faker.date.past()
  return {
    id: issueId,
    title: faker.lorem.sentence({ min: 3, max: 8 }),
    description: faker.lorem.sentences({ min: 2, max: 6 }, `\n`),
    priority: faker.helpers.arrayElement([`none`, `low`, `medium`, `high`]),
    status: faker.helpers.arrayElement([
      `backlog`,
      `todo`,
      `in_progress`,
      `done`,
      `canceled`,
    ]),
    created: createdAt.toISOString(),
    modified: faker.date
      .between({ from: createdAt, to: new Date() })
      .toISOString(),
    kanbanorder: kanbanKey,
    username: faker.internet.userName(),
    comments: faker.helpers.multiple(
      () => generateComment(issueId, createdAt),
      { count: faker.number.int({ min: 0, max: 1 }) }
    ),
  }
}

function generateComment(issueId, issueCreatedAt) {
  const createdAt = faker.date.between({ from: issueCreatedAt, to: new Date() })
  return {
    id: uuidv4(),
    body: faker.lorem.text(),
    username: faker.internet.userName(),
    issue_id: issueId,
    created: createdAt.toISOString(),
    modified: createdAt.toISOString(), // comments are never modified
  }
}
72 changes: 72 additions & 0 deletions demos/linearlite/db/load_data.js
@@ -0,0 +1,72 @@
import postgres from 'postgres'
import { generateIssues } from './generate_data.js'

if (!process.env.DATABASE_URL) {
  throw new Error(`DATABASE_URL is not set`)
}

const DATABASE_URL = process.env.DATABASE_URL
const ISSUES_TO_LOAD = process.env.ISSUES_TO_LOAD || 512
const BATCH_SIZE = 1000
const issues = generateIssues(ISSUES_TO_LOAD)

console.info(`Connecting to Postgres at ${DATABASE_URL}`)
const sql = postgres(DATABASE_URL)

async function batchInsert(sql, table, columns, dataArray, batchSize = 1000) {
  for (let i = 0; i < dataArray.length; i += batchSize) {
    const batch = dataArray.slice(i, i + batchSize)

    await sql`
      INSERT INTO ${sql(table)} ${sql(batch, columns)}
    `

    process.stdout.write(
      `Loaded ${Math.min(i + batchSize, dataArray.length)} of ${dataArray.length} ${table}s\r`
    )
  }
}

const issueCount = issues.length
let commentCount = 0

try {
  // Process data in batches
  for (let i = 0; i < issues.length; i += BATCH_SIZE) {
    const issueBatch = issues.slice(i, i + BATCH_SIZE)

    await sql.begin(async (sql) => {
      // Disable FK checks
      await sql`SET CONSTRAINTS ALL DEFERRED`

      // Insert issues
      const issuesData = issueBatch.map(({ comments: _, ...rest }) => rest)
      const issueColumns = Object.keys(issuesData[0])
      await batchInsert(sql, 'issue', issueColumns, issuesData, BATCH_SIZE)

      // Insert related comments
      const batchComments = issueBatch.flatMap((issue) => issue.comments)
      const commentColumns = Object.keys(batchComments[0])
      await batchInsert(
        sql,
        'comment',
        commentColumns,
        batchComments,
        BATCH_SIZE
      )

      commentCount += batchComments.length
    })

    process.stdout.write(
      `\nProcessed batch ${Math.floor(i / BATCH_SIZE) + 1}: ${Math.min(i + BATCH_SIZE, issues.length)} of ${issues.length} issues\n`
    )
  }

  console.info(`Loaded ${issueCount} issues with ${commentCount} comments.`)
} catch (error) {
  console.error('Error loading data:', error)
  throw error
} finally {
  await sql.end()
}