## What are you changing in this pull request and why?

This adds all of the adjustments needed to handle the Netlify-to-Vercel migration.

- Adjusts serverless functions to support Vercel syntax
- Adds a `vercel.json` file to handle redirects
- Adds a function that appends a query string param to image sources only if the site is hosted through Vercel, to prevent broken cached images from being shown post-migration

⚠️ Note that this is set up to load on both Netlify and Vercel so that we can merge this PR into `current` and then update the DNS when ready to start on Vercel. ⚠️

## Preview Links

- Vercel: https://docs-getdbt-com-git-vercel-migration-dbt-labs.vercel.app
- Netlify: https://deploy-preview-3968--docs-getdbt-com.netlify.app

## Testing

In addition to our internal testing documentation, the following steps should be performed and confirmed before merging into `current`.

1. Confirm the `get-discourse-comments` function works on both Netlify and Vercel previews. It should render the comment section with a test comment.
   - Vercel: https://docs-getdbt-com-git-vercel-migration-dbt-labs.vercel.app/blog/create-dbt-documentation-10x-faster-with-ChatGPT
   - Netlify: https://deploy-preview-3968--docs-getdbt-com.netlify.app/blog/create-dbt-documentation-10x-faster-with-chatgpt
2. Confirm the `get-discourse-topics` DiscourseFeed component works on both Netlify and Vercel previews.
   - Vercel forum page: https://docs-getdbt-com-git-vercel-migration-dbt-labs.vercel.app/community/forum
   - Netlify forum page: https://deploy-preview-3968--docs-getdbt-com.netlify.app/community/forum
3. Confirm the `get-discourse-topics` DiscourseHelpFeed component renders under "Questions from the Community" on the following pages:
   - Vercel: https://docs-getdbt-com-git-vercel-migration-dbt-labs.vercel.app/docs/build/incremental-models
   - Netlify: https://deploy-preview-3968--docs-getdbt-com.netlify.app/docs/build/incremental-models
4. Confirm redirects work.

Once we are on Vercel, all redirects will need to be added in a [vercel.json](https://github.com/dbt-labs/docs.getdbt.com/blob/836e16373a4d20c80f83db1a895c454341210af0/website/vercel.json) file. Until then, if redirects are added, they will need to be added in both `_redirects` and `vercel.json`.

## Additional Comments

@JKarlavige, in an effort to prevent cached images once the migration is complete, I created a [new function](https://github.com/dbt-labs/docs.getdbt.com/blob/836e16373a4d20c80f83db1a895c454341210af0/website/functions/image-cache-wrapper.js) that appends a query param only if the site is served from Vercel, so that we don't bust the cache prior to migration. Right now it only applies to image sources used in functional components, not markdown files. I originally attempted to build a plugin that applied a param to all image sources regardless of where they were set, but was having trouble getting it to work. Any thoughts on how we can improve this so we don't have to manually add a param to markdown image sources after migration?

## Checklist

- [x] Confirm Discourse comments and Discourse topics load correctly on both Vercel and Netlify deploy previews
- [x] Confirm redirects work
- [x] Confirm Algolia search works
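As a reference for the dual-redirect period described above, a minimal `vercel.json` redirect entry might look like the following. The paths here are placeholder examples, not actual redirects from this PR:

```json
{
  "redirects": [
    {
      "source": "/docs/old-page",
      "destination": "/docs/new-page",
      "permanent": true
    }
  ]
}
```

Each entry added to `_redirects` during the transition would need an equivalent entry in this shape.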
Showing 11 changed files with 4,345 additions and 13 deletions.
### `get-discourse-comments`

```js
const axios = require('axios')
require('dotenv').config()

const { DISCOURSE_DEVBLOG_API_KEY, DISCOURSE_USER_SYSTEM } = process.env
const DEVBLOG_PROD_URL = 'https://docs.getdbt.com/blog/'
const DEV_ENV = 'dev-'
const PREVIEW_ENV = 'deploy-preview-'

// Set API endpoint and headers
const discourse_endpoint = `https://discourse.getdbt.com`
const headers = {
  'Accept': 'application/json',
  'Api-Key': DISCOURSE_DEVBLOG_API_KEY,
  'Api-Username': DISCOURSE_USER_SYSTEM,
}

async function getDiscourseComments(request, response) {
  let topicId, comments, DISCOURSE_TOPIC_ID

  const blogUrl = await getBlogUrl(request)

  if (blogUrl === DEVBLOG_PROD_URL) {
    DISCOURSE_TOPIC_ID = 21
  } else {
    DISCOURSE_TOPIC_ID = 2
  }

  try {
    const env =
      blogUrl === DEVBLOG_PROD_URL
        ? ''
        : blogUrl.includes('localhost')
          ? DEV_ENV
          : PREVIEW_ENV
    const postTitle = `${env}${request.query.title}`
    const postSlug = request.query.slug
    const cleanSlug = cleanUrl(request.query.slug)
    const externalId = truncateString(`${env}${cleanSlug}`)

    console.table({
      blogUrl,
      postTitle,
      postSlug,
      cleanSlug,
      externalId,
    })

    if (!postSlug) throw new Error('Unable to query Discourse API. Error reading slug.')

    topicId = await searchDiscourseExternalId(externalId)

    // First check if the dev blog post exists in Discourse
    // and get the comments if it does
    if (typeof topicId === 'number') {
      comments = await getDiscourseTopicbyID(topicId)
    } else {
      // If the dev blog post does not exist in Discourse,
      // create a new topic and get the comments
      topicId = await createDiscourseTopic(postTitle, externalId, cleanSlug, blogUrl, DISCOURSE_TOPIC_ID)
      if (typeof topicId === 'number') {
        comments = await getDiscourseTopicbyID(topicId)
        comments.shift()
        comments = { topicId, comments }

        return response.status(200).json(comments)
      } else {
        console.log('Unable to create Discourse topic: topicId is not a number.')
        return response.status(500).json({ error: 'Unable to create Discourse topic: topicId is not a number.' })
      }
    }

    // Drop the first post (the auto-generated companion post) before returning
    comments.shift()
    comments = { topicId, comments }

    return response.status(200).json(comments)
  } catch (err) {
    console.log('err on getDiscourseComments', err)
    return response.status(500).json({ error: 'Unable to get topics from Discourse.' })
  }
}

async function createDiscourseTopic(title, externalId, slug, blogUrl, DISCOURSE_TOPIC_ID) {
  console.log(`Creating a new topic in Discourse - ${title}`)
  try {
    const response = await axios.post(`${discourse_endpoint}/posts`, {
      title: title,
      raw: `This is a companion discussion topic for the original entry at ${blogUrl}${slug}`,
      category: DISCOURSE_TOPIC_ID,
      embed_url: `${blogUrl}${slug}`,
      external_id: externalId,
      tags: ['devblog'],
      visible: false
    }, { headers })

    const topicId = response.data.topic_id

    console.log('Topic successfully created with topic_id', topicId)

    return topicId
  } catch (err) {
    console.log('err on createDiscourseTopic', err)
    return err
  }
}

async function getDiscourseTopicbyID(topicId) {
  console.log(`Topic found - setting topic id ${topicId}`)
  try {
    const { data } = await axios.get(`${discourse_endpoint}/t/${topicId}.json`, { headers })
    const post_stream = data.post_stream
    const post_count = data.posts_count

    // If there is more than one comment, make the topic visible in Discourse
    if (post_count > 1 && data.visible === false) {
      console.log(`Topic has more than one comment. Changing visibility to visible.`)
      await axios.put(`${discourse_endpoint}/t/${topicId}`, {
        visible: true
      }, { headers })
    }

    // Keep only 'regular' posts in Discourse
    // (e.g. not moderator actions, small_actions, whispers)
    post_stream.posts = post_stream.posts.filter(post => post.post_type === 1)

    return post_stream.posts
  } catch (err) {
    console.log('err on getDiscourseTopicbyID', err)
    return err
  }
}

async function searchDiscourseExternalId(externalId) {
  console.log(`Searching for external_id in Discourse - ${externalId}`)
  try {
    const { data } = await axios.get(`${discourse_endpoint}/t/external_id/${externalId}.json`, { headers })
    return data.id
  } catch (err) {
    if (err.response?.status === 404) {
      console.log('No topics found in Discourse.')
      return null
    }
    console.log('Unable to search Discourse for external_id.', err)
    return err
  }
}

// Truncate external_id to 50 characters per Discourse API requirements
function truncateString(str) {
  if (str.length <= 50) {
    return str
  }
  return str.slice(0, 50)
}

// Remove query params and hash from the URL to prevent duplicate topics
function cleanUrl(url) {
  return url.split('?')[0].split('#')[0]
}

// Get the host name from the request and append /blog/
async function getBlogUrl(req) {
  const host = req.headers.host
  return `https://${host}/blog/`
}

module.exports = getDiscourseComments
```
### `get-discourse-topics`

```js
const axios = require('axios')

async function getDiscourseTopics(request, response) {
  const { DISCOURSE_API_KEY, DISCOURSE_USER } = process.env

  const body = request.body

  try {
    // Set API endpoint and headers
    const discourse_endpoint = `https://discourse.getdbt.com`
    const headers = {
      'Accept': 'application/json',
      'Api-Key': DISCOURSE_API_KEY,
      'Api-Username': DISCOURSE_USER,
    }

    const query = buildQueryString(body)
    if (!query) throw new Error('Unable to build query string.')

    // Get topics from Discourse
    const { data: { posts, topics } } = await axios.get(`${discourse_endpoint}/search?q=${query}`, { headers })

    // Return an empty array if no topics are found for the search query.
    // A 200 status is used to avoid triggering Datadog alerts.
    if (!topics || topics.length === 0) {
      // Log a message with the encoded query and end the function
      console.log('Unable to get results from api request.')
      console.log(`Search query: ${query}`)
      return response.status(200).json([])
    }

    // Set author, like_count, and blurb from the first post of each topic
    // when not querying by a specific term
    let allTopics = topics
    if (!body?.term) {
      allTopics = topics.map(topic => {
        // Get the first post in the topic
        const firstTopicPost = posts?.find(post =>
          post?.post_number === 1 &&
          post?.topic_id === topic?.id
        )
        // If the post is found, copy its metadata onto the topic
        if (firstTopicPost?.username) {
          topic.author = firstTopicPost.username
        }
        if (firstTopicPost?.like_count) {
          topic.like_count = firstTopicPost.like_count
        }
        if (firstTopicPost?.blurb) {
          topic.blurb = firstTopicPost.blurb
        }
        return topic
      })
    }

    // Return topics
    return response.status(200).json(allTopics)
  } catch (err) {
    // Log and return the error
    console.log('err', err)
    return response.status(500).json({ error: 'Unable to get topics from Discourse.' })
  }
}

function buildQueryString(body) {
  if (!body) return null

  // Start with an empty query string
  let query = ''

  // Check each param and add it to the query string if valid
  for (const [key, value] of Object.entries(body)) {
    if (validateItem({ key, value })) {
      if (key === 'category') {
        query += `#${value} `
      } else if (key === 'inString') {
        query += `in:${value} ` // trailing space keeps subsequent params separated
      } else if (key === 'status' && Array.isArray(value)) {
        value.forEach(item => {
          query += `${key}:${item} `
        })
      } else {
        query += `${key}:${value} `
      }
    }
  }

  return query ? encodeURIComponent(query) : null
}

function validateItem({ key, value }) {
  // Predefined Discourse values
  // https://docs.discourse.org/#tag/Search/operation/search
  const inStringValues = ['title', 'first', 'pinned', 'wiki']
  const orderValues = ['latest', 'likes', 'views', 'latest_topic']
  const statusValues = ['open', 'closed', 'public', 'archived', 'noreplies', 'single_user', 'solved', 'unsolved']

  // Validate the value against the allowed list for each key
  if (key === 'inString') {
    return inStringValues.includes(value)
  } else if (key === 'order') {
    return orderValues.includes(value)
  } else if (key === 'status') {
    if (Array.isArray(value)) {
      return value.every(item => statusValues.includes(item))
    }
    return statusValues.includes(value)
  }
  return true
}

module.exports = getDiscourseTopics
```
### `image-cache-wrapper.js`

```js
// This function is used to break the cache on images,
// preventing stale or broken images from being served

import useDocusaurusContext from '@docusaurus/useDocusaurusContext';

const CACHE_VERSION = '2'

export default function imageCacheWrapper(src) {
  const { siteConfig: { customFields } } = useDocusaurusContext();

  // Only append the cache-busting param when the site is served from Vercel
  const cacheParam = customFields?.isVercel === '1'
    ? `?v=${CACHE_VERSION}`
    : ''

  return src + cacheParam
}
```