A Design Tokens Workflow (part 14)
Using Notion With Style Dictionary for Design Tokens Management
- Getting Started With Style Dictionary
- Outputting to Different Formats with Style Dictionary
- Beyond JSON: Exploring File Formats for Design Tokens
- Converting Tokens with Style Dictionary
- Organising Outputs with Style Dictionary
- Layers, referencing tokens in Style Dictionary
- Implementing Light and Dark Mode with Style Dictionary
- Implementing Light and Dark Mode with Style Dictionary (part 2)
- Implementing Multi-Brand Theming with Style Dictionary
- Creating Multiple Themes with Style Dictionary
- Creating Sass-backed CSS Custom Properties With Style Dictionary
- Creating a Penpot Design Tokens Format with Style Dictionary
- Generating Utility Classes from Design Tokens using Style Dictionary
- Using Notion With Style Dictionary for Design Tokens Management ← You are here
- Managing Microcopy with Design Tokens
Design tokens are great for developers, but what about everyone else on your team? The designers, product managers, and stakeholders who need to contribute without touching code?
JSON files work well for machines, but they're not exactly inviting for non-technical folks. That's where Notion comes in. It's the perfect bridge between your design system and the rest of your organisation.
In this tutorial, I'll show you how to set up a collaborative design token workflow using Notion as a user-friendly interface and Style Dictionary to generate platform-specific output. Changes made in Notion automatically sync to your codebase, and you can even push edits back for team review.
All the code and scripts mentioned in this tutorial are available in a GitHub repository for easy download and use.
Benefits of This Approach
Using Notion for design token management offers several advantages:
- Accessibility: Non-technical team members can view, comment on, and propose token changes without touching code
- Collaboration: Built-in comments, version history, and real-time updates keep the team synchronised
- Organization: Database views allow filtering, sorting, and organizing tokens by category, status, or type
- Documentation: Rich content can be added alongside token values for context and usage guidelines
- Approval Workflow: Status fields manage which tokens are draft, in review, approved, or deprecated
Setting Up Your Notion Database
Create a Notion database with the following properties:
- Name (Title): The token name (e.g., "Primary Blue")
- Token Path (Text): The path Style Dictionary will use (e.g., "color.brand.primary")
- Value (Text): The actual value (e.g., "#007bff")
- Type (Select): The token type (color, dimension, fontFamily, fontWeight, etc.)
- Category (Select): How the token is grouped (brand, semantic, component, foundation, etc.)
- Description (Text): Notes on what the token is for and when to use it
- Status (Select): Draft, Review, Approved, Deprecated
Once your database is set up with these properties, you can start adding your design tokens. The Status field controls which tokens get synced. By default, only "Approved" tokens are pulled into your build, creating a natural approval workflow before changes reach production.
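For example, a row whose Token Path is color.brand.primary, Value is #007bff, and Type is color ends up as this DTCG-style token in the JSON that the sync script (built later in this tutorial) generates:

```json
{
  "color": {
    "brand": {
      "primary": {
        "$value": "#007bff",
        "$type": "color"
      }
    }
  }
}
```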
Token Path Format
The Token Path determines the structure of your generated CSS variables.
- Use dot notation: color.brand.primary becomes --color-brand-primary
- Each level creates a nested structure: typography.size.base becomes --typography-size-base
- Nested paths organise your token tree in the JSON output
The Style Dictionary build will succeed even with incorrect paths, but won't generate what you expect.
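As a quick illustration (not part of the build), for simple paths the mapping is just dots to hyphens. This sketch ignores the casing work Style Dictionary's real name transforms also do:

```javascript
// Sketch: how a dot-notation token path maps to the generated
// CSS custom property name (simple lowercase paths only).
function pathToCssVariable(tokenPath) {
  return `--${tokenPath.split('.').join('-')}`;
}

console.log(pathToCssVariable('color.brand.primary')); // --color-brand-primary
```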
Setting Up the Notion Integration
Connecting Notion to your build process requires several steps. Complete them in order.
Step 1: Create the Integration
- Go to https://www.notion.com/my-integrations
- Click + New integration
- Give it a name (e.g., "Design Tokens Sync")
- Select your workspace
- Under Capabilities, make sure "Read content" is checked; if you plan to push changes back to Notion later in this tutorial, also enable "Update content" and "Insert content"
- Click Save
- Copy your Internal Integration Token (the long string starting with ntn_)
- Store this securely; it's your API key for accessing Notion
After creating the integration, you'll see the integration details page where you can copy your Internal Integration Token.
Step 2: Get Your Database ID
You need the actual database ID, not the parent page ID.
- Open your Notion database (the table view)
- Copy the URL from your browser
- It looks like: https://www.notion.so/abc123def456ghi789jkl?v=xyz&t=123
- The database ID is the part before the ?v= (the long hex string)
- Save this ID
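If you'd rather not eyeball the URL, a small helper can pull the ID out. This is a convenience sketch that assumes the URL shape shown above; it also handles URLs where Notion prefixes the ID with the page title and a hyphen:

```javascript
// Extract the database ID from a Notion database URL.
// Handles both ".../abc123..." and ".../My-Tokens-abc123..." shapes.
function extractDatabaseId(notionUrl) {
  const { pathname } = new URL(notionUrl);
  const lastSegment = pathname.split('/').pop();
  return lastSegment.split('-').pop();
}

console.log(
  extractDatabaseId('https://www.notion.so/abc123def456ghi789jkl?v=xyz&t=123')
);
// abc123def456ghi789jkl
```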
Step 3: Connect the Integration to Your Database
This step is critical: skipping it causes "database not found" errors.
- Open your Notion database
- Click the ••• menu in the top right corner
- Scroll to Connections
- Click + Add connections (or Confirm access if shown)
- Select your integration name
- Confirm access
From the menu, select + Add connections to grant your integration access to this specific database.
The integration now has permission to read from that database. Without this, the API will reject all requests with a 404 error.
Step 4: Set Up Your Development Environment
Now that your Notion integration is configured, let's set up your local development environment.
First, install the required npm packages:
npm install @notionhq/client style-dictionary dotenv nodemon
Create a .env file in your project root with your Notion credentials:
NOTION_TOKEN=ntn_your_long_token_here
NOTION_DATABASE_ID=abc123def456ghi789jkl
Make sure your package.json has "type": "module" so you can use ES modules:
{
"type": "module",
"name": "design-tokens",
"version": "1.0.0",
...
}
That's it! You're ready to sync.
Fetching Tokens from Notion
Create a file called fetch-tokens.js that pulls approved tokens from Notion and generates a JSON file for Style Dictionary.
First, let's set up the imports and initialise the Notion client:
import { Client } from '@notionhq/client';
import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import dotenv from 'dotenv';
dotenv.config();
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
// Initialise Notion client
const notion = new Client({ auth: process.env.NOTION_TOKEN });
const databaseId = process.env.NOTION_DATABASE_ID;
Next, create a function to fetch all approved tokens from your Notion database. This handles pagination since Notion returns results in batches:
/**
* Fetch all approved tokens from Notion database
*/
async function fetchTokensFromNotion() {
try {
let allResults = [];
let hasMore = true;
let startCursor = undefined;
// Handle pagination
while (hasMore) {
const response = await notion.databases.query({
database_id: databaseId,
start_cursor: startCursor,
filter: {
property: 'Status',
select: {
equals: 'Approved',
},
},
});
allResults = [...allResults, ...response.results];
hasMore = response.has_more;
startCursor = response.next_cursor;
}
return allResults;
} catch (error) {
console.error('Error fetching from Notion:', error.message);
throw error;
}
}
Now we need a function to extract the token data from Notion's page properties format into a simpler structure:
/**
* Extract token data from Notion page properties
*/
function extractTokenData(page) {
const properties = page.properties;
return {
id: page.id,
name: properties.Name?.title[0]?.plain_text || '',
path: properties['Token Path']?.rich_text[0]?.plain_text || '',
value: properties.Value?.rich_text[0]?.plain_text || '',
type: properties.Type?.select?.name || '',
category: properties.Category?.select?.name || '',
status: properties.Status?.select?.name || 'Draft',
description: properties.Description?.rich_text[0]?.plain_text || '',
};
}
The tokens need to be converted from a flat list into a nested JSON structure that Style Dictionary can understand. This function builds that tree:
/**
* Convert flat token list to nested JSON structure
*/
function buildTokenTree(tokens) {
const tree = {};
tokens.forEach((token) => {
const pathParts = token.path.split('.');
let current = tree;
pathParts.forEach((part, index) => {
if (index === pathParts.length - 1) {
// Last part - add the token
current[part] = {
$value: token.value,
$type: token.type,
};
if (token.description) {
current[part].$description = token.description;
}
// Add Notion metadata as extensions
current[part].$extensions = {
'com.notion': {
status: token.status,
category: token.category,
notionId: token.id,
lastSynced: new Date().toISOString(),
},
};
} else {
// Create nested object if it doesn't exist
if (!current[part]) {
current[part] = {};
}
current = current[part];
}
});
});
return tree;
}
To detect conflicts between local changes and Notion updates, we need to flatten the nested token structure back into an array for comparison:
/**
* Flatten nested token object back into array (for comparison/conflict detection)
*/
function flattenTokens(obj, prefix = '') {
const tokens = [];
Object.keys(obj).forEach((key) => {
const fullPath = prefix ? `${prefix}.${key}` : key;
const value = obj[key];
if (value && value.$value) {
// This is a token
tokens.push({
path: fullPath,
value: value.$value,
type: value.$type || '',
description: value.$description || '',
category: value.$extensions?.['com.notion']?.category || '',
status: value.$extensions?.['com.notion']?.status || 'Draft',
notionId: value.$extensions?.['com.notion']?.notionId || null,
});
} else if (typeof value === 'object' && value !== null) {
// Recurse for nested objects
tokens.push(...flattenTokens(value, fullPath));
}
});
return tokens;
}
Finally, the main sync function that orchestrates everything:
/**
* Main function to sync tokens from Notion
*/
async function syncTokens() {
console.log('🔄 Fetching tokens from Notion...');
const pages = await fetchTokensFromNotion();
console.log(`✅ Found ${pages.length} approved tokens\n`);
const notionTokens = pages.map(extractTokenData);
const newTokenTree = buildTokenTree(notionTokens);
// Create tokens directory if it doesn't exist
const tokensDir = path.join(__dirname, 'tokens');
if (!fs.existsSync(tokensDir)) {
fs.mkdirSync(tokensDir, { recursive: true });
}
// Check for conflicts with existing tokens.json
const outputPath = path.join(tokensDir, 'tokens.json');
let existingTokens = [];
let conflicts = [];
if (fs.existsSync(outputPath)) {
try {
const existingData = JSON.parse(fs.readFileSync(outputPath, 'utf-8'));
existingTokens = flattenTokens(existingData);
} catch (error) {
console.warn(
'⚠️ Could not read existing tokens.json, treating as new file\n'
);
}
}
// Flatten new tokens for comparison
const flatNewTokens = flattenTokens(newTokenTree);
// Check for conflicts (token exists in both but with different values)
flatNewTokens.forEach((newToken) => {
const existing = existingTokens.find((t) => t.path === newToken.path);
if (existing && existing.value !== newToken.value) {
conflicts.push({
path: newToken.path,
local: existing.value,
notion: newToken.value,
});
}
});
// Report what we found
if (conflicts.length > 0) {
console.log(
`⚠️ Found ${conflicts.length} conflicts (local changes != Notion):`
);
conflicts.forEach((conflict) => {
console.log(` ${conflict.path}:`);
console.log(` Local: ${conflict.local}`);
console.log(` Notion: ${conflict.notion}`);
});
console.log(`\nSyncing from Notion (Notion values take precedence)\n`);
}
// Write tokens to JSON file (Notion is source of truth)
fs.writeFileSync(outputPath, JSON.stringify(newTokenTree, null, 2), 'utf-8');
console.log(`💾 Tokens saved to ${outputPath}`);
console.log('✨ Sync complete!');
return newTokenTree;
}
// Run if called directly
if (import.meta.url === `file://${process.argv[1]}`) {
syncTokens().catch(console.error);
}
export { syncTokens };
What the sync script does:
- Fetches approved tokens from your Notion database
- Compares with existing tokens.json to detect conflicts
- Reports any differences: If you've edited a token locally and it changed in Notion, you'll see a warning
- Syncs from Notion: Notion is always the source of truth, so the Notion values win
- Preserves metadata: Keeps the $extensions data and Notion IDs for future syncs and pushes
This way you know exactly what changed and won't be surprised by overwritten values.
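To make the flat-to-nested conversion the sync script performs concrete, here is a minimal, self-contained version of the tree-building step (the $extensions metadata is omitted for brevity):

```javascript
// Minimal sketch of buildTokenTree: dot paths become nested objects,
// with the leaf node holding $value and $type.
function toTree(tokens) {
  const tree = {};
  for (const { path, value, type } of tokens) {
    const parts = path.split('.');
    let node = tree;
    parts.forEach((part, index) => {
      if (index === parts.length - 1) {
        node[part] = { $value: value, $type: type };
      } else {
        // Create the branch if it doesn't exist, then descend into it
        node = node[part] = node[part] || {};
      }
    });
  }
  return tree;
}

const tree = toTree([
  { path: 'color.brand.primary', value: '#007bff', type: 'color' },
  { path: 'color.brand.secondary', value: '#6c757d', type: 'color' },
]);
console.log(JSON.stringify(tree, null, 2));
```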
Integrating With Style Dictionary
Once tokens are syncing from Notion, create a build.js file that runs Style Dictionary to generate platform-specific files.
First, set up the imports and environment:
import StyleDictionary from 'style-dictionary';
import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import dotenv from 'dotenv';
dotenv.config();
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const notionDatabaseId = process.env.NOTION_DATABASE_ID;
This function adds a header comment to generated files with Notion metadata:
/**
* Add Notion metadata comment to generated files
*/
function addNotionHeader(filePath) {
const notionUrl = `https://www.notion.so/${notionDatabaseId}`;
const header = `/**
* Do not edit directly, this file was auto-generated.
*
* Generated from Notion Design Tokens
* Database: ${notionUrl}
* Last updated: ${new Date().toISOString()}
*/\n\n`;
const content = fs.readFileSync(filePath, 'utf-8');
const lines = content.split('\n');
// Remove the existing comment block at the top of the file
let contentStartIndex = 0;
let inComment = false;
for (let i = 0; i < lines.length; i++) {
if (lines[i].includes('/**')) {
inComment = true;
}
if (inComment && lines[i].includes('*/')) {
contentStartIndex = i + 1;
// Skip empty lines after comment
while (
contentStartIndex < lines.length &&
lines[contentStartIndex].trim() === ''
) {
contentStartIndex++;
}
break;
}
}
const newContent =
header + lines.slice(contentStartIndex).join('\n').trimStart();
fs.writeFileSync(filePath, newContent, 'utf-8');
}
The main build function runs Style Dictionary to generate platform files:
async function build() {
console.log('🏗️ Building with Style Dictionary...\n');
// Configure Style Dictionary
const sd = new StyleDictionary({
source: ['tokens/**/*.json'],
platforms: {
css: {
transformGroup: 'css',
buildPath: 'build/css/',
files: [
{
destination: 'variables.css',
format: 'css/variables',
},
],
},
scss: {
transformGroup: 'scss',
buildPath: 'build/scss/',
files: [
{
destination: '_variables.scss',
format: 'scss/variables',
},
],
},
js: {
transformGroup: 'js',
buildPath: 'build/js/',
files: [
{
destination: 'tokens.js',
format: 'javascript/es6',
},
],
},
},
});
// Build all platforms
await sd.buildAllPlatforms();
// Add Notion metadata to generated files
addNotionHeader(path.join(__dirname, 'build/css/variables.css'));
addNotionHeader(path.join(__dirname, 'build/scss/_variables.scss'));
console.log('\n✅ Build complete!');
}
build().catch(console.error);
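Assuming the sample tokens from earlier, the generated build/css/variables.css will look roughly like this (exact values and timestamp depend on your database):

```css
/**
 * Do not edit directly, this file was auto-generated.
 *
 * Generated from Notion Design Tokens
 * Database: https://www.notion.so/abc123def456ghi789jkl
 * Last updated: 2026-02-09T10:30:00.000Z
 */

:root {
  --color-brand-primary: #007bff;
  --typography-size-base: 1rem;
}
```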
Adding Advanced Features
Once the basic workflow is in place, consider these optional enhancements:
Organizing by Category
If you want to keep tokens organised by category (which can be useful with large token sets):
async function syncTokensByCategory() {
const pages = await fetchTokensFromNotion();
const tokens = pages.map(extractTokenData);
// Group by category
const categories = {};
tokens.forEach((token) => {
const category = token.category || 'general';
if (!categories[category]) {
categories[category] = [];
}
categories[category].push(token);
});
// Write separate files per category
const tokensDir = path.join(__dirname, 'tokens');
if (!fs.existsSync(tokensDir)) {
fs.mkdirSync(tokensDir, { recursive: true });
}
Object.keys(categories).forEach((category) => {
const tree = buildTokenTree(categories[category]);
const outputPath = path.join(tokensDir, `${category}.json`);
fs.writeFileSync(outputPath, JSON.stringify(tree, null, 2), 'utf-8');
console.log(`š¾ ${category} tokens saved to ${outputPath}`);
});
}
Validating Token Data
Add basic validation to catch issues such as invalid color values, malformed dimensions, and other data problems early:
function validateToken(token) {
const errors = [];
if (!token.path) {
errors.push('Missing token path');
}
if (!token.value) {
errors.push('Missing token value');
}
if (!token.type) {
errors.push('Missing token type');
}
// Validate color format
if (token.type === 'color') {
const hexPattern = /^#([A-Fa-f0-9]{6}|[A-Fa-f0-9]{3})$/;
if (!hexPattern.test(token.value)) {
errors.push(`Invalid color format: ${token.value}`);
}
}
// Validate dimension format
if (token.type === 'dimension') {
const dimensionPattern = /^\d+(\.\d+)?(px|rem|em|%)$/;
if (!dimensionPattern.test(token.value)) {
errors.push(`Invalid dimension format: ${token.value}`);
}
}
return errors;
}
// Use in extractTokenData
function extractTokenData(page) {
const properties = page.properties;
const token = {
id: page.id,
name: properties.Name?.title[0]?.plain_text || '',
path: properties['Token Path']?.rich_text[0]?.plain_text || '',
value: properties.Value?.rich_text[0]?.plain_text || '',
type: properties.Type?.select?.name || '',
category: properties.Category?.select?.name || '',
status: properties.Status?.select?.name || 'Draft',
description: properties.Description?.rich_text[0]?.plain_text || '',
};
const errors = validateToken(token);
if (errors.length > 0) {
console.warn(`⚠️ Validation errors for "${token.name}":`, errors);
}
return token;
}
Using Token References
One feature I find useful is the ability to reference other tokens, so one token can use the value of another:
function buildTokenTree(tokens) {
const tree = {};
tokens.forEach((token) => {
const pathParts = token.path.split('.');
let current = tree;
pathParts.forEach((part, index) => {
if (index === pathParts.length - 1) {
// References (values wrapped in { }) are stored as-is;
// Style Dictionary resolves them at build time
current[part] = {
$value: token.value,
$type: token.type,
};
if (token.description) {
current[part].$description = token.description;
}
} else {
if (!current[part]) {
current[part] = {};
}
current = current[part];
}
});
});
return tree;
}
In your Notion database, you can now use references like:
- Value: {color.brand.primary}
Style Dictionary will automatically resolve these references during the build process.
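For instance, given a secondary token that references the primary (a hypothetical pair, not from the database above):

```json
{
  "color": {
    "brand": {
      "primary": { "$value": "#007bff", "$type": "color" },
      "secondary": { "$value": "{color.brand.primary}", "$type": "color" }
    }
  }
}
```

the CSS output contains --color-brand-secondary: #007bff; with the reference already resolved (with Style Dictionary's outputReferences option enabled, it would instead emit var(--color-brand-primary)).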
Preserving Notion Metadata With $extensions
Keep Notion metadata in your generated JSON files using the Design Tokens $extensions field, which is designed specifically for custom metadata.
The sync script includes this by default; each token gets an $extensions object that tracks:
- status: Whether the token is Draft, Review, Approved, or Deprecated
- category: How the token is grouped
- notionId: The unique ID of the Notion page (useful for linking back)
- lastSynced: When the token was last pulled from Notion
This means your generated JSON looks like this:
{
"color": {
"primary": {
"$value": "#007bff",
"$type": "color",
"$description": "Primary brand color",
"$extensions": {
"com.notion": {
"status": "Approved",
"category": "brand",
"notionId": "a1b2c3d4e5f6g7h8",
"lastSynced": "2026-02-09T10:30:00.000Z"
}
}
}
}
}
This metadata is useful for:
- Audit trails: Track exactly when tokens changed and what their approval status was
- Debugging: Identify which Notion page a token came from without confusion
- Automation: Use the extension data in custom Style Dictionary transforms or other tools
- Documentation: Generated files become self-documenting with full context
The $extensions field is optional: Style Dictionary and other tools treat it as extra metadata and work fine without it. For complex workflows or full audit trail tracking in version control, it's worth preserving.
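As one concrete use, here is a sketch of a filter that reads the Notion status to keep deprecated tokens out of a given output file. The predicate itself is plain JavaScript; wiring it up is assumed to follow Style Dictionary's registerFilter API:

```javascript
// Filter predicate: keep a token only if its Notion status
// (preserved under $extensions) is not "Deprecated".
const noDeprecated = (token) =>
  token.$extensions?.['com.notion']?.status !== 'Deprecated';

// Hypothetical registration (in build.js, before new StyleDictionary(...)):
// StyleDictionary.registerFilter({ name: 'no-deprecated', filter: noDeprecated });
// Then reference it in a file entry with: filter: 'no-deprecated'

console.log(noDeprecated({ $extensions: { 'com.notion': { status: 'Approved' } } })); // true
```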
If you want to dive deeper into the $extensions specification and see real-world examples, check out my post Understanding $extensions in the Design Tokens Specification and the Design Tokens Extensions Explorer tool.
Updating Your package.json
Add helpful scripts to your package.json for managing the workflow:
{
"name": "design-tokens",
"version": "1.0.0",
"type": "module",
"scripts": {
"sync": "node fetch-tokens.js",
"push": "node push-tokens.js",
"build": "node build.js",
"sync:build": "npm run sync && npm run build",
"watch": "nodemon --watch tokens -e json --exec npm run build"
},
"devDependencies": {
"@notionhq/client": "^2.2.15",
"style-dictionary": "^4.1.4",
"dotenv": "^16.3.1",
"nodemon": "^3.0.0"
}
}
This enables the following commands:
- npm run sync - Fetch tokens from Notion
- npm run push - Push changes from JSON back to Notion
- npm run build - Build with Style Dictionary (without syncing)
- npm run sync:build - Sync tokens AND build (combined operation)
- npm run watch - Watch for token changes and rebuild
Two-Way Sync: Going the Other Direction
This workflow supports bidirectional syncing: pull tokens from Notion into your JSON, or push changes from your JSON back to Notion for team review.
Create a push-tokens.js script to enable pushing.
First, set up the imports and initialise the Notion client:
import { Client } from '@notionhq/client';
import fs from 'fs';
import path from 'path';
import { fileURLToPath } from 'url';
import dotenv from 'dotenv';
dotenv.config();
const __filename = fileURLToPath(import.meta.url);
const __dirname = path.dirname(__filename);
const notion = new Client({ auth: process.env.NOTION_TOKEN });
const databaseId = process.env.NOTION_DATABASE_ID;
This function flattens the nested token structure back into an array for processing. It's the inverse of the buildTokenTree function:
/**
* Flatten nested token object into array for processing
* This converts the hierarchical JSON structure back to a flat array
*/
function flattenTokens(obj, prefix = '') {
const tokens = [];
Object.keys(obj).forEach((key) => {
const fullPath = prefix ? `${prefix}.${key}` : key;
const value = obj[key];
if (value && value.$value) {
// This is a token leaf node
tokens.push({
path: fullPath,
value: value.$value,
type: value.$type || '',
description: value.$description || '',
category: value.$extensions?.['com.notion']?.category || '',
status: value.$extensions?.['com.notion']?.status || 'Draft',
notionId: value.$extensions?.['com.notion']?.notionId || null,
});
} else if (typeof value === 'object' && value !== null) {
// Recurse for nested objects (branches)
tokens.push(...flattenTokens(value, fullPath));
}
});
return tokens;
}
This function updates an existing Notion page with new token values:
/**
* Update an existing Notion page with new token data
* Only updates the fields that can be modified (value, type, category, description)
*/
async function updateNotionPage(pageId, token) {
try {
await notion.pages.update({
page_id: pageId,
properties: {
Value: {
rich_text: [
{
type: 'text',
text: { content: token.value },
},
],
},
Type: {
select: { name: token.type },
},
Category: {
select: { name: token.category },
},
Description: {
rich_text: [
{
type: 'text',
text: { content: token.description },
},
],
},
},
});
return true;
} catch (error) {
console.error(`  ❌ Failed to update "${token.path}":`, error.message);
return false;
}
}
This function creates a new Notion page for tokens that don't exist yet:
/**
* Create a new Notion page in the database for a new token
* New tokens are created with "Draft" status for review
*/
async function createNotionPage(token) {
try {
await notion.pages.create({
parent: { database_id: databaseId },
properties: {
Name: {
title: [
{
type: 'text',
text: { content: token.path },
},
],
},
'Token Path': {
rich_text: [
{
type: 'text',
text: { content: token.path },
},
],
},
Value: {
rich_text: [
{
type: 'text',
text: { content: token.value },
},
],
},
Type: {
select: { name: token.type },
},
Category: {
select: { name: token.category },
},
Description: {
rich_text: [
{
type: 'text',
text: { content: token.description },
},
],
},
Status: {
select: { name: 'Draft' }, // New tokens start as drafts
},
},
});
return true;
} catch (error) {
console.error(`  ❌ Failed to create "${token.path}":`, error.message);
return false;
}
}
This function fetches all existing tokens from Notion to compare against local changes:
/**
* Fetch all tokens currently in Notion database
* Returns a map of token path -> Notion data for quick lookup
* This is used to determine what exists in Notion vs. what we want to push
*/
async function fetchExistingTokensFromNotion() {
try {
let allPages = [];
let hasMore = true;
let startCursor = undefined;
// Fetch all pages from database (not only approved ones)
while (hasMore) {
const response = await notion.databases.query({
database_id: databaseId,
start_cursor: startCursor,
});
allPages = [...allPages, ...response.results];
hasMore = response.has_more;
startCursor = response.next_cursor;
}
// Build lookup map of token path -> Notion data for fast comparison
const notionTokens = {};
allPages.forEach((page) => {
const props = page.properties;
const tokenPath = props['Token Path']?.rich_text[0]?.plain_text;
if (tokenPath) {
notionTokens[tokenPath] = {
id: page.id,
value: props.Value?.rich_text[0]?.plain_text || '',
type: props.Type?.select?.name || '',
description: props.Description?.rich_text[0]?.plain_text || '',
category: props.Category?.select?.name || '',
};
}
});
return notionTokens;
} catch (error) {
console.error('Error fetching from Notion:', error.message);
throw error;
}
}
This function compares local tokens with Notion versions to detect changes:
/**
* Compare local token with Notion version to detect changes
* Returns true if the token needs to be updated in Notion
*/
function hasTokenChanged(local, notion) {
if (!notion) return true; // New token (doesn't exist in Notion yet)
// Check if any of the key properties differ
return (
local.value !== notion.value ||
local.type !== notion.type ||
local.description !== notion.description ||
local.category !== notion.category
);
}
Finally, the main push function that orchestrates the entire process:
/**
* Main function to push tokens to Notion
* This implements a "smart sync" that only updates what has changed
*/
async function pushTokens() {
console.log('📤 Pushing tokens to Notion...\n');
// Read current tokens from JSON
const tokensPath = path.join(__dirname, 'tokens', 'tokens.json');
if (!fs.existsSync(tokensPath)) {
console.error('❌ tokens/tokens.json not found. Run "npm run sync" first.');
process.exit(1);
}
const tokensData = JSON.parse(fs.readFileSync(tokensPath, 'utf-8'));
const localTokens = flattenTokens(tokensData);
// Fetch current state from Notion for comparison
console.log('🔍 Checking Notion database for existing tokens...\n');
const notionTokens = await fetchExistingTokensFromNotion();
// Categorise tokens: new, changed, or unchanged
const changedTokens = [];
const newTokens = [];
const unchangedTokens = [];
localTokens.forEach((token) => {
const notionToken = notionTokens[token.path];
if (!notionToken) {
// Token doesn't exist in Notion yet
newTokens.push(token);
} else if (hasTokenChanged(token, notionToken)) {
// Token exists but has different values - needs update
token.notionId = notionToken.id; // Store ID for updating
changedTokens.push(token);
} else {
// Token exists and is identical
unchangedTokens.push(token);
}
});
// Report what we found
console.log(`Found ${localTokens.length} tokens total:`);
console.log(`  • ${unchangedTokens.length} unchanged (skipping)`);
console.log(`  • ${changedTokens.length} changed`);
console.log(`  • ${newTokens.length} new\n`);
// Exit early if nothing to do
if (changedTokens.length === 0 && newTokens.length === 0) {
console.log('Everything is already synced! Nothing to do.\n');
return;
}
// Push changes to Notion
let updated = 0;
let created = 0;
// Update existing tokens
for (const token of changedTokens) {
process.stdout.write(`Updating "${token.path}"... `);
if (await updateNotionPage(token.notionId, token)) {
console.log('✅');
updated++;
}
}
// Create new tokens
for (const token of newTokens) {
process.stdout.write(`Creating "${token.path}"... `);
if (await createNotionPage(token)) {
console.log('✅');
created++;
}
}
console.log(`\n✅ Push complete!`);
console.log(` Updated: ${updated} tokens`);
console.log(` Created: ${created} new tokens`);
}
// Run if called directly
if (import.meta.url === `file://${process.argv[1]}`) {
  pushTokens().catch(console.error);
}
export { pushTokens };
How this works:
- Fetches from Notion: Queries the entire database to get current token state
- Builds a lookup map: Creates a fast dictionary of token path → current values
- Compares local vs. Notion: For each local token, checks if value, type, category, or description differs
- Only updates what changed: Only makes API calls for tokens that actually differ
- No hidden files: Everything is tracked by comparing against the live Notion database
This approach is better because:
- No hidden .lastPushed.json files to manage
- Single source of truth is always Notion
- Works even if you skip a push or run the script multiple times
- Transparent: you can see exactly what's different
- Safe: only touches tokens that actually changed
How it works:
- Reads your JSON: Parses tokens/tokens.json and flattens the nested structure
- Matches tokens: Looks up each local token by its Token Path in the live database to find which Notion page to update
- Updates or creates: For existing tokens, updates their values; for new ones, creates new Notion pages (marked as "Draft")
- Preserves metadata: Keeps category, status, and description synced
This is especially useful when:
- A developer tweaks a token value and wants that change reflected in Notion
- You're bulk-importing tokens from another source
- You want to automate token creation in Notion via script
Run npm run push after editing your JSON, and the Notion database updates automatically.
There you have it!
We've created a collaborative design token workflow that helps bring everyone together. Your team can now manage tokens in Notion's friendly interface, while developers get clean JSON for their builds. Everything stays in sync from one source of truth.
This foundation is pretty solid and should be ready for expansion. You could add validation to catch errors early, use token references to avoid repetition, or organise by categories for larger sets. Going further, you could add functionality to import the tokens into design tools like Figma or Penpot.
For the complete working example with all scripts, check out the GitHub repository.