CI/CD Quality Assurance for Public Site
Pipeline overview:
graph LR
A[Content Update] --> B[Spell Check]
B --> C[Link Validation]
C --> D[SEO Analysis]
D --> E{Quality Gates}
E -->|Pass| F[Deploy to Staging]
E -->|Fail| G[Block Deployment]
F --> H[Production Deploy]
H --> I[Post-Deploy Validation]
1 Purpose
Establish automated quality assurance pipelines for the ethicic-public website to ensure content accuracy, link integrity, SEO optimization, and professional presentation before publication.
2 Current State Assessment
Missing CI/CD components:
- ❌ Automated spell checking
- ❌ Link validation and broken link detection
- ❌ Content quality scoring
- ❌ SEO meta tag validation
- ❌ Image optimization verification
- ❌ Accessibility compliance testing
- ❌ Performance regression testing
3 Recommended CI/CD Pipeline Architecture
3.1 GitHub Actions Workflow Structure
# .github/workflows/content-quality.yml
name: Content Quality Assurance
on:
push:
branches: [main, staging]
pull_request:
branches: [main]
schedule:
- cron: '0 6 * * *' # Daily link validation
jobs:
content-quality:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v3
- name: Spell Check
uses: streetsidesoftware/cspell-action@v2
- name: Link Validation
uses: lycheeverse/lychee-action@v1
- name: SEO Analysis
run: npm run seo-audit
- name: Performance Testing
run: npm run lighthouse-ci
3.2 Content Quality Gates
| Gate | Tool | Pass Criteria | Failure Action |
|---|---|---|---|
| Spelling | cspell | Zero misspellings | Block deployment |
| Links | lychee | <5% broken links | Block deployment |
| SEO | lighthouse | Score >85 | Warning only |
| Performance | lighthouse | Score >90 | Block deployment |
| Accessibility | axe-core | Zero violations | Block deployment |
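The workflow in 3.1 assumes repo-level npm scripts such as `seo-audit` and `lighthouse-ci`, which are not defined in this document. The same gates can be approximated locally before pushing; a minimal sketch, assuming cspell, lychee, `@lhci/cli`, and `@axe-core/cli` are available and with an illustrative script path:

```bash
#!/bin/bash
# scripts/run-quality-gates.sh (illustrative) — approximate the CI quality gates locally
set -e  # stop at the first failing gate

echo "🔤 Spelling gate: zero misspellings"
npx cspell --no-progress "content/**/*.{md,qmd}"

echo "🔗 Link gate: broken links fail the run"
lychee --no-progress 'content/**/*.md' 'content/**/*.qmd'

echo "🚦 Performance/SEO gate: thresholds from lighthouserc.json"
npx lhci autorun --config=lighthouserc.json

echo "♿ Accessibility gate: zero axe-core violations (--exit returns non-zero on violations)"
npx axe https://ethicic.com --exit

echo "✅ All quality gates passed"
```

In CI, these commands would back the `npm run` targets referenced in the workflows below.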
4 Spell Check Implementation
4.1 CSpell Configuration
// .cspell.json
{
"version": "0.2",
"language": "en",
"words": [
"ECIC",
"ethicic",
"ESG",
"fiduciary",
"stewardship",
"sustainability",
"diversification",
"portfolio",
"alpha",
"beta",
"sharpe",
"drawdown"
],
"dictionaries": [
"en_US",
"financial-terms",
"investment-vocabulary"
],
"dictionaryDefinitions": [
  { "name": "financial-terms", "path": "./dictionaries/financial-terms.txt" }
],
"files": [
"**/*.{md,qmd,html,txt}",
"!node_modules/**",
"!.git/**"
],
"ignoreWords": [
"githubusercontent",
"kinsta",
"ubicloud",
"deepinfra"
]
}
4.2 Financial Dictionary Extension
// dictionaries/financial-terms.txt
401k
403b
fiduciary
stewardship
rebalancing
diversification
capitalization
volatility
correlation
sharpe
sortino
calmar
drawdown
4.3 Automated Spell Check Action
# Manual spell check execution
npm install -g cspell
cspell "content/**/*.{md,qmd}" --config .cspell.json
# Add words to dictionary
echo "newfinancialterm" >> dictionaries/financial-terms.txt5 Link Validation System
5 Link Validation System
5.1 Lychee Configuration
# .lycheeignore
# Ignore patterns for link checker
localhost
127.0.0.1
*.local
example.com
mailto:*
# Internal staging URLs
staging.ethicic.com
dev.ethicic.com
# Social media that blocks crawlers
twitter.com/*/status/*
linkedin.com/in/*
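Lychee reads `.lycheeignore` from the working directory by default, so the same exclusions apply to local runs. A quick manual check, assuming lychee is installed:

```bash
# Validate links locally; the .lycheeignore patterns above are picked up automatically
lychee --verbose --no-progress 'content/**/*.md' 'content/**/*.qmd'
```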
5.2 Comprehensive Link Validation
# .github/workflows/link-check.yml
name: Link Validation
on:
schedule:
- cron: '0 6 * * 1' # Weekly on Monday
workflow_dispatch:
jobs:
linkcheck:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Lychee Link Checker
uses: lycheeverse/lychee-action@v1
with:
args: --verbose --no-progress --exclude-mail '**/*.md' '**/*.qmd'
fail: true
env:
GITHUB_TOKEN: ${{secrets.GITHUB_TOKEN}}
- name: Create Issue on Failure
if: failure()
uses: actions/github-script@v6
with:
script: |
github.rest.issues.create({
owner: context.repo.owner,
repo: context.repo.repo,
title: 'Broken Links Detected',
body: 'Link validation failed. Check the action logs for details.',
labels: ['bug', 'content-quality']
})
5.3 Link Health Monitoring
#!/bin/bash
# scripts/link-health-check.sh
# Daily link health check script
echo "🔍 Checking link health for ethicic-public..."
# Check internal links
lychee --format detailed "content/**/*.md" \
--exclude "localhost" \
--exclude "*.local" \
> link-report.txt
# Check external links with retry
lychee --format detailed "content/**/*.md" \
--include "https://.*" \
--retry-wait-time 2 \
--max-redirects 5 \
>> link-report.txt
# Generate summary
echo "📊 Link Health Summary:"
grep -c "✅" link-report.txt && echo " working links"
grep -c "❌" link-report.txt && echo " broken links"6 SEO and Content Quality Validation
6.1 Lighthouse CI Configuration
// lighthouserc.json
{
"ci": {
"collect": {
"url": [
"https://ethicic.com",
"https://ethicic.com/strategies/growth",
"https://ethicic.com/strategies/income",
"https://ethicic.com/blog"
],
"numberOfRuns": 3
},
"assert": {
"assertions": {
"categories:performance": ["error", {"minScore": 0.9}],
"categories:accessibility": ["error", {"minScore": 0.95}],
"categories:seo": ["warn", {"minScore": 0.85}],
"categories:best-practices": ["error", {"minScore": 0.9}]
}
}
}
}
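The workflows above call `npm run lighthouse-ci`, which is not defined elsewhere in this document. One way to wire that script to this configuration, assuming `@lhci/cli` is used as the runner:

```bash
# Install the Lighthouse CI runner and expose it as the npm script the workflows expect
npm install --save-dev @lhci/cli
npm pkg set scripts.lighthouse-ci="lhci autorun --config=lighthouserc.json"

# Run the collection and assertions defined in lighthouserc.json
npm run lighthouse-ci
```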
6.2 Meta Tag Validation
// scripts/validate-meta-tags.js
const fs = require('fs');
const path = require('path');
const matter = require('gray-matter');
function validateMetaTags(filePath) {
const content = fs.readFileSync(filePath, 'utf8');
const { data } = matter(content);
const required = ['title', 'description'];
const missing = required.filter(tag => !data[tag]);
if (missing.length > 0) {
console.error(`❌ ${filePath}: Missing ${missing.join(', ')}`);
return false;
}
// Title length validation
if (data.title && data.title.length > 60) {
console.warn(`⚠️ ${filePath}: Title too long (${data.title.length} chars)`);
}
// Description length validation
if (data.description && data.description.length > 160) {
console.warn(`⚠️ ${filePath}: Description too long (${data.description.length} chars)`);
}
console.log(`✅ ${filePath}: Meta tags valid`);
return true;
}
// Process all content files
const contentDir = 'content';
const files = fs.readdirSync(contentDir, { recursive: true })
.filter(file => file.endsWith('.md') || file.endsWith('.qmd'));
let allValid = true;
files.forEach(file => {
if (!validateMetaTags(path.join(contentDir, file))) {
allValid = false;
}
});
process.exit(allValid ? 0 : 1);
7 Content Quality Scoring
7.1 Readability Analysis
// scripts/readability-check.js
const fs = require('fs');
const path = require('path');
const matter = require('gray-matter');
function analyzeReadability(filePath) {
  const content = fs.readFileSync(filePath, 'utf8');
  const { content: body } = matter(content);
  const sentences = body.split(/[.!?]+/).filter(Boolean).length;
  const words = body.split(/\s+/).filter(Boolean).length;
  const characters = body.replace(/\s/g, '').length;
  // Automated Readability Index (ARI): grade level from character, word, and sentence counts
  const gradeLevel = 4.71 * (characters / words) + 0.5 * (words / sentences) - 21.43;
  console.log(`📊 ${filePath}:`);
  console.log(`  Grade Level (ARI): ${gradeLevel.toFixed(1)}`);
  // Flag content that's too complex for a general audience
  if (gradeLevel > 12) {
    console.warn(`⚠️ Content may be too complex for general audience`);
  }
  return gradeLevel;
}
// Analyze all content files
fs.readdirSync('content', { recursive: true })
  .filter(file => file.endsWith('.md') || file.endsWith('.qmd'))
  .forEach(file => analyzeReadability(path.join('content', file)));
7.2 Content Completeness Validation
#!/bin/bash
# scripts/content-validation.sh
echo "🔍 Validating content completeness..."
# Check for required front matter
find content -name "*.qmd" -o -name "*.md" | while read file; do
if ! grep -q "^title:" "$file"; then
echo "❌ Missing title: $file"
fi
if ! grep -q "^description:" "$file"; then
echo "❌ Missing description: $file"
fi
if ! grep -q "^date:" "$file"; then
echo "❌ Missing date: $file"
fi
done
# Check for broken internal references
find content -name "*.qmd" -o -name "*.md" | xargs grep -l "\[.*\](" | while read file; do
grep -o "\[.*\]([^)]*)" "$file" | grep "^.*\](\." | while read link; do
target=$(echo "$link" | sed 's/.*](\.\?\(.*\))/\1/')
if [ ! -f "content/$target" ] && [ ! -f "$target" ]; then
echo "❌ Broken internal link in $file: $target"
fi
done
done
8 Image and Media Optimization
8.1 Image Validation Pipeline
# .github/workflows/media-optimization.yml
name: Media Optimization
on:
push:
paths: ['assets/images/**', 'content/**/images/**']
jobs:
optimize-images:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Optimize Images
uses: calibreapp/image-actions@main
with:
githubToken: ${{ secrets.GITHUB_TOKEN }}
compressOnly: true
- name: Validate Image Alt Text
run: |
# Check for missing alt text in markdown
find content -name "*.md" -o -name "*.qmd" | \
xargs grep -l "!\[" | \
xargs grep "!\[\](" && echo "❌ Images without alt text found" || echo "✅ All images have alt text"8.2 Media File Standards
// .image-optimization.json
{
"jpeg": {
"quality": 85,
"progressive": true
},
"png": {
"quality": 90,
"compressionLevel": 6
},
"webp": {
"quality": 85,
"lossless": false
},
"maxWidth": 1200,
"maxFileSize": "500KB"
}
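The optimization action compresses images but does not enforce the `maxFileSize` limit above. A simple repository check can flag violations; a minimal sketch (script name and image paths are illustrative, the 500KB threshold mirrors the config):

```bash
#!/bin/bash
# scripts/check-image-sizes.sh (illustrative) — flag images larger than 500KB
oversized=$(find assets/images content -type f \
  \( -name '*.jpg' -o -name '*.jpeg' -o -name '*.png' -o -name '*.webp' \) \
  -size +500k 2>/dev/null)
if [ -n "$oversized" ]; then
  echo "❌ Images exceeding the 500KB limit:"
  echo "$oversized"
  exit 1
fi
echo "✅ All images are within the 500KB limit"
```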
9 Deployment Pipeline Integration
9.1 Pre-Deployment Quality Gates
# .github/workflows/deploy.yml
name: Deploy to Production
on:
push:
branches: [main]
jobs:
quality-gates:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v3
- name: Content Quality Check
run: |
npm run spell-check
npm run link-check
npm run meta-validation
npm run readability-check
- name: Performance Baseline
run: npm run lighthouse-ci
- name: Security Scan
uses: securecodewarrior/github-action-add-sarif@v1
with:
sarif-file: security-scan.sarif
deploy:
needs: quality-gates
runs-on: ubuntu-latest
if: success()
steps:
- name: Deploy to Kinsta
run: |
# Kinsta deployment via API or Git integration
curl -X POST "$KINSTA_DEPLOY_WEBHOOK" \
-H "Authorization: Bearer $KINSTA_API_KEY"9.2 Post-Deployment Validation
# scripts/post-deploy-validation.sh
#!/bin/bash
SITE_URL="https://ethicic.com"
echo "🚀 Post-deployment validation for $SITE_URL"
# Basic health check
if curl -f -s "$SITE_URL" > /dev/null; then
echo "✅ Site is accessible"
else
echo "❌ Site is not accessible"
exit 1
fi
# Check critical pages
critical_pages=("/" "/strategies/growth" "/strategies/income" "/about" "/contact")
for page in "${critical_pages[@]}"; do
if curl -f -s "$SITE_URL$page" > /dev/null; then
echo "✅ $page is accessible"
else
echo "❌ $page is not accessible"
exit 1
fi
done
# Performance validation
lighthouse_score=$(lighthouse "$SITE_URL" --only-categories=performance --output=json --output-path=stdout | jq '(.categories.performance.score * 100) | floor')
if [ "$lighthouse_score" -gt 90 ]; then
echo "✅ Performance score: $lighthouse_score"
else
echo "⚠️ Performance score below threshold: $lighthouse_score"
fi
echo "✅ Post-deployment validation complete"10 Monitoring and Alerting
10.1 Content Quality Monitoring
# Daily content health report
#!/bin/bash
# scripts/daily-quality-report.sh
DATE=$(date +%Y-%m-%d)
REPORT_FILE="reports/quality-report-$DATE.md"
echo "# Content Quality Report - $DATE" > "$REPORT_FILE"
echo "" >> "$REPORT_FILE"
# Spell check summary
echo "## Spell Check Results" >> "$REPORT_FILE"
cspell "content/**/*.{md,qmd}" --reporter @cspell/reporter-json | \
jq -r '.issues | length as $count | "Found \($count) spelling issues"' >> "$REPORT_FILE"
# Link validation summary
echo "## Link Validation Results" >> "$REPORT_FILE"
lychee --format json "content/**/*.md" | \
jq -r '"Total links: \(.total) | Successful: \(.successful) | Broken: \(.errors)"' >> "$REPORT_FILE"
# Performance metrics
echo "## Performance Metrics" >> "$REPORT_FILE"
lighthouse "https://ethicic.com" --output=json | \
jq -r '.categories | to_entries[] | "\(.key): \(.value.score * 100 | round)%"' >> "$REPORT_FILE"
echo "📊 Quality report generated: $REPORT_FILE"10.2 Automated Issue Creation
// scripts/create-quality-issues.js
const { Octokit } = require('@octokit/rest');
async function createQualityIssues(qualityResults) {
const octokit = new Octokit({
auth: process.env.GITHUB_TOKEN
});
if (qualityResults.brokenLinks.length > 0) {
await octokit.rest.issues.create({
owner: 'ethicalcapital',
repo: 'ethicic-public',
title: `Broken Links Detected - ${new Date().toISOString().split('T')[0]}`,
body: `The following links are broken:\n\n${qualityResults.brokenLinks.map(link => `- ${link}`).join('\n')}`,
labels: ['bug', 'content-quality', 'broken-links']
});
}
if (qualityResults.spellingErrors.length > 0) {
await octokit.rest.issues.create({
owner: 'ethicalcapital',
repo: 'ethicic-public',
title: `Spelling Errors Found - ${new Date().toISOString().split('T')[0]}`,
body: `The following spelling errors were found:\n\n${qualityResults.spellingErrors.map(error => `- ${error}`).join('\n')}`,
labels: ['content-quality', 'spelling']
});
}
}
This comprehensive CI/CD quality assurance system enforces professional content standards before publication and provides automated monitoring for ongoing quality.