Merge pull request #1251 from veggiemonk/feat/go-rewrite

Cleaning up
This commit is contained in:
Julien Bisconti
2026-02-28 10:46:56 +01:00
committed by GitHub
47 changed files with 5866 additions and 2111 deletions

.github/CODEOWNERS (2 changes)

@@ -1 +1 @@
*.md @veggiemonk @agebhar1 @dmitrytokarev @gesellix @mashb1t @moshloop @vegasbrianc @noteed
* @veggiemonk @agebhar1 @dmitrytokarev @gesellix @mashb1t @moshloop @vegasbrianc @noteed


@@ -1,94 +1,55 @@
# Contributing to awesome-docker
First: if you're unsure or afraid of anything, just ask, or submit the issue or pull request anyway. You won't be yelled at for giving your best effort. The worst that can happen is that you'll be politely asked to change something. We appreciate any sort of contribution and don't want a wall of rules to get in the way of that.
Thanks for taking the time to contribute.
However, for those who want a bit more guidance on the best way to contribute, read on. This document covers what we're looking for; by addressing all of these points, you raise the chances that we can quickly merge or address your contribution.
This repository is a curated list of Docker/container resources plus a Go-based maintenance CLI used by CI. Contributions are welcome for both content and tooling.
We appreciate and recognize [all contributors](https://github.com/veggiemonk/awesome-docker/graphs/contributors).
Please read and follow the [Code of Conduct](./CODE_OF_CONDUCT.md).
Please note that this project is released with a [Contributor Code of Conduct](https://github.com/veggiemonk/awesome-docker/blob/master/.github/CODE_OF_CONDUCT.md). By participating in this project you agree to abide by its terms.
## What We Accept
# Table of Contents
- New high-quality Docker/container-related projects
- Fixes to descriptions, ordering, or categorization
- Removal of broken, archived, deprecated, or duplicate entries
- Improvements to the Go CLI and GitHub workflows
- [Mission Statement](#mission-statement)
- [Quality Standards](#quality-standards)
- [Contribution Guidelines](#contribution-guidelines)
- [New Collaborators](#new-collaborators)
## README Entry Rules
# Mission Statement
- Use one link per entry.
- Prefer GitHub project/repository URLs over marketing pages.
- Keep entries alphabetically sorted within their section.
- Keep descriptions concise and concrete.
- Use `:heavy_dollar_sign:` for paid/commercial services.
- Do not use `:skull:`; archived/deprecated projects should be removed.
- Avoid duplicate links and redirect variants.
`awesome-docker` is a hand-crafted list for high-quality information about Docker and its resources. It should be related or compatible with Docker or containers. If it's just an image built on top of Docker, the project possibly belongs to other [awesome lists](https://github.com/sindresorhus/awesome). You can check the [awesome-selfhosted list](https://github.com/Kickball/awesome-selfhosted) or the [awesome-sysadmin list](https://github.com/n1trux/awesome-sysadmin) as well.
**Tutorials and blog posts** get outdated really quickly, so we generally don't put them on the list; however, if one covers a very advanced and/or specific topic, we will consider it!
If something is awesome, share it (pull request or [issue](https://github.com/veggiemonk/awesome-docker/issues/new) or [chat](https://gitter.im/veggiemonk/awesome-docker)), let us know why and we will help you!
## Local Validation
# Quality Standards
```bash
# Build CLI
make build
Note that we can help you achieve those standards; just try your best and be brave.
We'll guide you to the best of our abilities.
# Validate README formatting and content
make lint
To be on the list, it would be **nice** if entries adhere to these quality standards:
# Run code tests (when touching Go code)
make test
- It should take less than 20 seconds to find out what the project is, how to install it, and how to use it.
- Generally useful to the community.
- A project on GitHub with a well-documented `README.md` file and plenty of examples is considered high quality.
- Clearly state if an entry is related to (Linux) containers rather than Docker. There is an [awesome list](https://github.com/Friz-zy/awesome-linux-containers) for that.
- Clearly state "what it is", i.e. which category it belongs to.
- Clearly state "what it is for", i.e. mention a real problem it solves (even a small one). Make it clear for the next person.
- If it is a **WIP** (work in progress, not safe for production), please mention it. (Remember the time before Docker 1.0? ;-))
- Always link to the GitHub project instead of the website!
# Optional: full external checks (requires GITHUB_TOKEN)
./awesome-docker check
./awesome-docker validate
```
To be on the list, the project **must** have:
## Pull Request Expectations
- How to setup/install the project
- How to use the project (examples)
- Keep the PR focused to one logical change.
- Explain what changed and why.
- If adding entries, include the target category.
- If removing entries, explain why (archived, broken, duplicate, etc.).
- Fill in the PR template checklist.
If your PR is not merged, we will tell you why so you can improve it.
But usually we are pretty relaxed people, so just come and say hi and we'll figure something out together.
## Maintainer Notes
# Contribution Guidelines
## I want to share a project, what should I do?
- **Adding to the list:** Submit a pull request or open an [issue](https://github.com/veggiemonk/awesome-docker/issues/new)
- **Removing from the list:** Submit a pull request or open an [issue](https://github.com/veggiemonk/awesome-docker/issues/new)
- Changing something else: Submit a pull request or open an [issue](https://github.com/veggiemonk/awesome-docker/issues/new)
- Don't know what to do: Open an [issue](https://github.com/veggiemonk/awesome-docker/issues/new) or join our [chat](https://gitter.im/veggiemonk/awesome-docker), let us know what's going on.
**join the chat:**
[![Join the chat at https://gitter.im/veggiemonk/awesome-docker](https://badges.gitter.im/Join%20Chat.svg)](https://gitter.im/veggiemonk/awesome-docker)
or you can
**ping us on Twitter:**
* [veggiemonk](https://twitter.com/veggiemonk)
* [idomyowntricks](https://twitter.com/idomyowntricks)
* [gesellix](https://twitter.com/gesellix)
* [dmitrytokarev](https://twitter.com/dmitrytok)
### Rules for Pull Request
- Each item should be limited to one link, no duplicates, no redirection (careful with `http` vs `https`!)
- The link should be the name of the package or project or website
- Description should be clear and concise (read it out loud to be sure)
- Description should follow the link, on the same line
- Entries are listed alphabetically, please respect the order
- If you want to add more than one link, please don't make every change on the exact same line; that usually results in conflicts, and your PR cannot be automatically merged.
Please contribute links to packages/projects you have used or are familiar with. This will help ensure high-quality entries.
#### Your commit message will be a [tweet](https://twitter.com/awesome_docker) so write a [good commit message](https://chris.beams.io/posts/git-commit/), keep that in mind :)
# New Collaborators
If you just joined the team of maintainers for this repo, first of all: WELCOME!
If it is your first time maintaining an open source project, read the [best practice guides for maintainers](https://opensource.guide/best-practices/).
Here are the few things you need to know:
* We don't push directly to the master branch. Every entry **MUST** be reviewed!
* Each entry should be in accordance with the quality standards and contribution guidelines.
* To ask a contributor to make a change, just copy and paste this message [here](https://github.com/veggiemonk/awesome-docker/pull/289#issuecomment-285608004) and change a few details like names. **The main idea is to help people make great projects.**
* If something seems weird, i.e. you don't understand what a project does or the documentation is poor, don't hesitate to (nicely) ask for more explanation (see the previous point).
* Say thank you to people who contribute to this project! It may not seem like much, but respect and gratitude are important :D
- Changes should be reviewed before merge.
- Prefer helping contributors improve a PR over silently rejecting it.
- Keep `.github` documentation and workflows aligned with current tooling.


@@ -1,18 +1,21 @@
---
name: Add a project
about: Add a new project to the list
title: add [PROJECT_NAME]
title: "add: [PROJECT_NAME]"
labels: pending-evaluation
assignees: ''
---
Category:
Repository link:
Description:
Description (one sentence):
Author:
Why this should be in the list:
Notes (`:heavy_dollar_sign:` if relevant):
Or directly write it:
```markdown
[REPO](https://github.com/AUTHOR/REPO) - DESCRIPTION. By [@AUTHOR](https://github.com/AUTHOR)
[REPO](https://github.com/AUTHOR/REPO) - DESCRIPTION. By [AUTHOR](https://github.com/AUTHOR)
```

.github/MAINTENANCE.md (137 changes)

@@ -1,116 +1,81 @@
# 🔧 Maintenance Guide for Awesome Docker
# Maintenance Guide
This guide helps maintainers keep the awesome-docker list up-to-date and high-quality.
This guide describes how maintainers keep the list and automation healthy.
## 🤖 Automated Systems
## Automated Workflows
### Weekly Health Reports
- **What**: Checks all GitHub repositories for activity, archived status, and maintenance
- **When**: Every Monday at 9 AM UTC
- **Where**: Creates/updates a GitHub issue with label `health-report`
- **Action**: Review the report and mark abandoned projects with `:skull:`
### Pull Requests / Weekly QA (`pull_request.yml`)
### Broken Links Detection
- **What**: Tests all links in README.md for availability
- **When**: Every Saturday at 2 AM UTC + on every PR
- **Where**: Creates/updates a GitHub issue with label `broken-links`
- **Action**: Fix or remove broken links, or add to exclusion list
- Runs on pull requests and weekly on Saturday.
- Builds the Go CLI and runs `./awesome-docker validate`.
### PR Validation
- **What**: Checks for duplicate links and basic validation
- **When**: On every pull request
- **Action**: Automated - contributors see results immediately
### Broken Links Report (`broken_links.yml`)
## 📋 Manual Maintenance Tasks
- Runs weekly on Saturday and on manual trigger.
- Executes `./awesome-docker check`.
- Opens/updates a `broken-links` issue when problems are found.
### Monthly Review (First Monday of the month)
1. Check health report issue for archived/stale projects
2. Mark archived projects with `:skull:` in README.md
3. Review projects with 2+ years of inactivity
4. Remove projects that are truly abandoned/broken
### Weekly Health Report (`health_report.yml`)
### Quarterly Deep Dive (Every 3 months)
1. Run: `npm run health-check` for detailed report
2. Review project categories - are they still relevant?
3. Check for popular new Docker tools to add
4. Update documentation links if newer versions exist
- Runs weekly on Monday and on manual trigger.
- Executes `./awesome-docker health` then `./awesome-docker report`.
- Opens/updates a `health-report` issue.
### Annual Cleanup (January)
1. Remove all `:skull:` projects older than 1 year
2. Review CONTRIBUTING.md guidelines
3. Update year references in documentation
4. Check Node.js version requirements
### Deploy to GitHub Pages (`deploy-pages.yml`)
## 🛠️ Maintenance Commands
- Runs on pushes to `master` and manual trigger.
- Builds website with `./awesome-docker build` and publishes `website/`.
## Day-to-Day Commands
```bash
# Test all links (requires GITHUB_TOKEN)
npm test
# Build CLI
make build
# Test PR changes only
npm run test-pr
# README lint/validation
make lint
# Generate health report (requires GITHUB_TOKEN)
npm run health-check
# Auto-fix formatting issues
./awesome-docker lint --fix
# Build the website
npm run build
# Update dependencies
npm update
# Link checks and health checks (requires GITHUB_TOKEN)
make check
make health
make report
```
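The duplicate-link rule that validation enforces can be illustrated with a small Go sketch; the sample URLs are hypothetical, and the real check lives in the CLI's checker package:

```go
package main

import "fmt"

func main() {
	// Hypothetical sample links; the real input is every URL in README.md.
	links := []string{
		"https://github.com/a/one",
		"https://github.com/b/two",
		"https://github.com/a/one",
	}
	// A set of already-seen URLs flags any repeat in one pass.
	seen := make(map[string]bool)
	for _, l := range links {
		if seen[l] {
			fmt.Println("duplicate:", l)
		}
		seen[l] = true
	}
	// prints: duplicate: https://github.com/a/one
}
```

Normalizing URLs first (scheme, trailing slashes) would also catch the redirect variants mentioned in the content policy.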
## 📊 Quality Standards
## Content Maintenance Policy
### Adding New Projects
- Must have clear documentation (README with install/usage)
- Should have activity within last 18 months
- GitHub project preferred over website links
- Must be Docker/container-related
- Remove archived/deprecated projects instead of tagging them.
- Remove broken links that cannot be fixed.
- Keep sections alphabetically sorted.
- Keep descriptions short and actionable.
### Marking Projects as Abandoned
Use `:skull:` emoji when:
- Repository is archived on GitHub
- No commits for 2+ years
- Project explicitly states it's deprecated
- Maintainer confirms abandonment
## Suggested Review Cadence
### Removing Projects
Only remove (don't just mark `:skull:`):
- Broken/404 links that can't be fixed
- Duplicate entries
- Spam or malicious projects
- Projects that never met quality standards
### Weekly
## 🚨 Emergency Procedures
- Triage open `broken-links` and `health-report` issues.
- Merge straightforward quality PRs.
### Critical Broken Links
If important resources are down:
1. Check if they moved (update URL)
2. Search for alternatives
3. Check Internet Archive for mirrors
4. Temporarily comment out until resolved
### Monthly
### Spam Pull Requests
1. Close immediately
2. Mark as spam
3. Block user if repeated offense
4. Don't engage in comments
- Review sections for stale/duplicate entries.
- Re-run `check` and `health` manually if needed.
## 📈 Metrics to Track
### Quarterly
- Total projects: ~731 GitHub repos
- Health status: aim for <5% archived
- Link availability: aim for >98% working
- PR merge time: aim for <7 days
- Weekly contributor engagement
- Review `.github` docs and templates for drift.
- Confirm workflows still match repository tooling and policies.
## 🤝 Getting Help
## Contributor Support
- Open a discussion in GitHub Discussions
- Check AGENTS.md for AI assistant guidelines
- Review CONTRIBUTING.md for contributor info
When requesting PR changes, be explicit and actionable:
- point to section/order problems,
- explain why a link should be removed,
- suggest exact wording when description quality is the issue.
---
*Last updated: 2025-10-01*
Last updated: 2026-02-27


@@ -1,48 +1,28 @@
<!-- Congrats on creating an Awesome Docker entry! 🎉 -->
# Summary
<!-- **Remember that entries are ordered alphabetically** -->
Describe what changed and why.
# TLDR
* all entries sorted alphabetically (from A to Z),
* If paying service add :heavy_dollar_sign:
* If WIP add :construction:
* clear and short description of the project
* project MUST have: How to setup/install
* project MUST have: How to use (examples)
* we can help you get there :)
## Scope
## Quality Standards
- [ ] README entries/content
- [ ] Go CLI/tooling
- [ ] GitHub workflows or `.github` docs
Note that we can help you achieve those standards; just try your best and be brave.
We'll guide you to the best of our abilities.
## If This PR Adds/Edits README Entries
To be on the list, it would be **nice** if entries adhere to these quality standards:
- Category/section touched:
- New or updated project links:
- It should take less than 20 seconds to find out what the project is, how to install it, and how to use it.
- Generally useful to the community.
- A project on GitHub with a well-documented `README.md` file and plenty of examples is considered high quality.
- Clearly state if an entry is related to (Linux) containers rather than Docker. There is an [awesome list](https://github.com/Friz-zy/awesome-linux-containers) for that.
- Clearly state "what it is", i.e. which category it belongs to.
- Clearly state "what it is for", i.e. mention a real problem it solves (even a small one). Make it clear for the next person.
- If it is a **WIP** (work in progress, not safe for production), please mention it. (Remember the time before Docker 1.0? ;-))
- Always link to the GitHub project instead of the website!
## Validation
To be on the list, the project **must** have:
- [ ] `make lint`
- [ ] `make test` (if Go code changed)
- [ ] `./awesome-docker check` (if `GITHUB_TOKEN` available)
- How to setup/install the project
- How to use the project (examples)
If your PR is not merged, we will tell you why so you can improve it.
But usually we are pretty relaxed people, so just come and say hi and we'll figure something out together.
# Rules for Pull Request
- Each item should be limited to one link, no duplicates, no redirection (careful with `http` vs `https`!)
- The link should be the name of the package or project or website
- Description should be clear and concise (read it out loud to be sure)
- Description should follow the link, on the same line
- Entries are listed alphabetically, please respect the order
- If you want to add more than one link, please don't make every change on the exact same line; that usually results in conflicts, and your PR cannot be automatically merged.
Please contribute links to packages/projects you have used or are familiar with. This will help ensure high-quality entries.
## Contributor Checklist
- [ ] Entries are alphabetically ordered in their section
- [ ] Links point to project repositories (no duplicates or redirects)
- [ ] Descriptions are concise and specific
- [ ] Archived/deprecated projects were removed instead of tagged
- [ ] Used `:heavy_dollar_sign:` only when applicable


@@ -1,17 +1,13 @@
version: 2
updates:
# Enable version updates for npm
- package-ecosystem: "npm"
# Look for `package.json` and `lock` files in the `root` directory
# Enable version updates for Go modules
- package-ecosystem: "gomod"
directory: "/"
# Check the npm registry for updates every day (weekdays)
schedule:
interval: "weekly"
# Enable version updates for GitHub Actions
- package-ecosystem: "github-actions"
# Workflow files stored in the default location of `.github/workflows`
# You don't need to specify `/.github/workflows` for `directory`. You can use `directory: "/"`.
directory: "/"
schedule:
interval: "weekly"


@@ -2,43 +2,34 @@ name: Broken Links Report
on:
schedule:
# Run every Saturday at 2 AM UTC
- cron: "0 2 * * 6"
workflow_dispatch:
concurrency:
group: broken-links-${{ github.ref }}
cancel-in-progress: false
jobs:
check-links:
runs-on: ubuntu-latest
timeout-minutes: 30
permissions:
contents: read
issues: write
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6.0.2
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # ratchet:actions/setup-node@v6.2.0
- uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # ratchet:actions/setup-go@v6
with:
node-version: lts/*
go-version-file: go.mod
- uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # ratchet:actions/cache@v5.0.3
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Install Dependencies
run: npm ci --ignore-scripts --no-audit --no-progress --prefer-offline
- name: Build
run: go build -o awesome-docker ./cmd/awesome-docker
- name: Run Link Check
id: link_check
run: |
npm test > link_check_output.txt 2>&1 || true
if grep -q "❌ ERROR" link_check_output.txt; then
echo "has_errors=true" >> $GITHUB_OUTPUT
else
echo "has_errors=false" >> $GITHUB_OUTPUT
fi
run: ./awesome-docker ci broken-links --issue-file broken_links_issue.md --github-output "$GITHUB_OUTPUT"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
@@ -48,34 +39,8 @@ jobs:
with:
script: |
const fs = require('fs');
const output = fs.readFileSync('link_check_output.txt', 'utf8');
const issueBody = fs.readFileSync('broken_links_issue.md', 'utf8');
// Extract error information
const errorMatch = output.match(/❌ ERROR[\s\S]*$/);
const errorInfo = errorMatch ? errorMatch[0] : 'Link check failed - see workflow logs';
const issueBody = `# 🔗 Broken Links Detected
The weekly link check has found broken or inaccessible links in the repository.
## Error Details
\`\`\`
${errorInfo}
\`\`\`
## Action Required
Please review and fix the broken links above. Options:
- Update the URL if the resource moved
- Remove the entry if it's permanently unavailable
- Add to \`tests/exclude_in_test.json\` if it's a known false positive
---
*Auto-generated by [broken_links.yml](https://github.com/veggiemonk/awesome-docker/blob/master/.github/workflows/broken_links.yml)*
`;
// Check for existing issue
const issues = await github.rest.issues.listForRepo({
owner: context.repo.owner,
repo: context.repo.repo,
@@ -91,16 +56,14 @@ jobs:
issue_number: issues.data[0].number,
body: issueBody
});
console.log(`Updated issue #${issues.data[0].number}`);
} else {
const issue = await github.rest.issues.create({
await github.rest.issues.create({
owner: context.repo.owner,
repo: context.repo.repo,
title: '🔗 Broken Links Detected - Action Required',
title: 'Broken Links Detected',
body: issueBody,
labels: ['broken-links', 'bug']
});
console.log(`Created issue #${issue.data.number}`);
}
- name: Close Issue if No Errors
@@ -115,7 +78,6 @@ jobs:
labels: 'broken-links',
per_page: 1
});
if (issues.data.length > 0) {
await github.rest.issues.update({
owner: context.repo.owner,
@@ -124,11 +86,4 @@ jobs:
state: 'closed',
state_reason: 'completed'
});
await github.rest.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issues.data[0].number,
body: '✅ All links are now working! Closing this issue.'
});
console.log(`Closed issue #${issues.data[0].number}`);
}


@@ -20,19 +20,17 @@ jobs:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6.0.2
uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
- name: Setup Node.js
uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # ratchet:actions/setup-node@v6.2.0
- uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # ratchet:actions/setup-go@v6
with:
node-version-file: '.nvmrc'
cache: 'npm'
go-version-file: go.mod
- name: Install dependencies
run: npm ci
- name: Build CLI
run: go build -o awesome-docker ./cmd/awesome-docker
- name: Build website
run: npm run build
run: ./awesome-docker build
- name: Upload artifact
uses: actions/upload-pages-artifact@7b1f4a764d45c48632c6b24a0339c27f5614fb0b # ratchet:actions/upload-pages-artifact@v4


@@ -2,56 +2,46 @@ name: Weekly Health Report
on:
schedule:
# Run every Monday at 9 AM UTC
- cron: "0 9 * * 1"
workflow_dispatch: # Allow manual trigger
workflow_dispatch:
concurrency:
group: health-report-${{ github.ref }}
cancel-in-progress: false
jobs:
health-check:
runs-on: ubuntu-latest
timeout-minutes: 30
permissions:
contents: write
contents: read
issues: write
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6.0.2
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # ratchet:actions/setup-node@v6.2.0
- uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # ratchet:actions/setup-go@v6
with:
node-version: lts/*
go-version-file: go.mod
- uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # ratchet:actions/cache@v5.0.3
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
- name: Build
run: go build -o awesome-docker ./cmd/awesome-docker
- name: Install Dependencies
run: npm ci --ignore-scripts --no-audit --no-progress --prefer-offline
- name: Run Health Check
run: node tests/health_check.mjs
continue-on-error: true
- name: Run Health + Report
id: report
run: ./awesome-docker ci health-report --issue-file health_report.txt --github-output "$GITHUB_OUTPUT"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Upload Health Report
uses: actions/upload-artifact@b7c566a772e6b6bfb58ed0dc250532a479d7789f # ratchet:actions/upload-artifact@v5
with:
name: health-report
path: HEALTH_REPORT.md
- name: Create Issue with Health Report
- name: Create/Update Issue with Health Report
if: steps.report.outputs.has_report == 'true'
uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # ratchet:actions/github-script@v8
with:
script: |
const fs = require('fs');
const report = fs.readFileSync('health_report.txt', 'utf8');
const issueBody = report;
// Read the health report
const report = fs.readFileSync('HEALTH_REPORT.md', 'utf8');
// Check if there's already an open issue
const issues = await github.rest.issues.listForRepo({
owner: context.repo.owner,
repo: context.repo.repo,
@@ -60,25 +50,19 @@ jobs:
per_page: 1
});
const issueBody = report + '\n\n---\n*This report is auto-generated weekly. See [health_check.mjs](https://github.com/veggiemonk/awesome-docker/blob/master/tests/health_check.mjs) for details.*';
if (issues.data.length > 0) {
// Update existing issue
await github.rest.issues.update({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issues.data[0].number,
body: issueBody
});
console.log(`Updated issue #${issues.data[0].number}`);
} else {
// Create new issue
const issue = await github.rest.issues.create({
await github.rest.issues.create({
owner: context.repo.owner,
repo: context.repo.repo,
title: '🏥 Weekly Health Report - Repository Maintenance Needed',
title: 'Weekly Health Report - Repository Maintenance Needed',
body: issueBody,
labels: ['health-report', 'maintenance']
});
console.log(`Created issue #${issue.data.number}`);
}


@@ -11,22 +11,16 @@ jobs:
test:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6.0.2
- uses: actions/setup-node@6044e13b5dc448c55e2357c09f80417699197238 # ratchet:actions/setup-node@v6.2.0
with:
node-version: lts/*
- uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
- uses: actions/cache@cdf6c1fa76f9f475f3d7449005a359c84ca0f306 # ratchet:actions/cache@v5.0.3
id: cache
- uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # ratchet:actions/setup-go@v6
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
restore-keys: |
${{ runner.os }}-node-
go-version-file: go.mod
- name: Install Dependencies
# if: steps.cache.outputs.cache-hit != 'true'
run: npm ci --ignore-scripts --no-audit --no-progress --prefer-offline
- run: npm run test-pr
- name: Build
run: go build -o awesome-docker ./cmd/awesome-docker
- name: Validate
run: ./awesome-docker validate
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

.gitignore (4 changes)

@@ -10,3 +10,7 @@ website/table.html
.idea
**/.DS_Store
.worktrees
# Go
/awesome-docker

.nvmrc (1 change)

@@ -1 +0,0 @@
lts/*


@@ -1,28 +1,78 @@
# Agent Guidelines for awesome-docker
## Commands
- Build website: `npm run build` (converts README.md to website/index.html)
- Test all links: `npm test` (runs tests/test_all.mjs, requires GITHUB_TOKEN)
- Test PR changes: `npm run test-pr` (runs tests/pull_request.mjs, checks duplicates)
- Health check: `npm run health-check` (generates HEALTH_REPORT.md, requires GITHUB_TOKEN)
- Build CLI: `make build` (or `go build -o awesome-docker ./cmd/awesome-docker`)
- Rebuild from scratch: `make rebuild`
- Show local workflows: `make help`
- Format Go code: `make fmt`
- Run tests: `make test` (runs `go test ./internal/... -v`)
- Race tests: `make test-race`
- Lint README rules: `make lint` (runs `./awesome-docker lint`)
- Auto-fix lint issues: `make lint-fix`
- Check links: `make check` (runs `./awesome-docker check`; `GITHUB_TOKEN` enables GitHub repo checks)
- PR-safe link checks: `make check-pr`
- PR validation: `make validate` (lint + external link checks in PR mode)
- Build website: `make website` (generates `website/index.html` from `README.md`)
- Health scoring: `make health` (requires `GITHUB_TOKEN`, refreshes `config/health_cache.yaml`)
- Print health report (Markdown): `make report`
- Print health report (JSON): `make report-json` or `./awesome-docker report --json`
- Generate report files: `make report-file` (`HEALTH_REPORT.md`) and `make report-json-file` (`HEALTH_REPORT.json`)
- Maintenance shortcut: `make workflow-maint` (health + JSON report file)
## Architecture
- **Main content**: README.md - curated list of Docker resources (markdown format)
- **Build script**: build.js - converts README.md to HTML using showdown & cheerio
- **Tests**: tests/*.mjs - link validation, duplicate detection, URL checking
- **Website**: website/ - static site deployment folder
- **Main content**: `README.md` (curated Docker/container resources)
- **CLI entrypoint**: `cmd/awesome-docker/main.go` (Cobra commands)
- **Core packages**:
- `internal/parser` - parse README sections and entries
- `internal/linter` - alphabetical/order/format validation + autofix
- `internal/checker` - HTTP and GitHub link checks
- `internal/scorer` - repository health scoring and report generation
- `internal/cache` - exclude list and health cache read/write
- `internal/builder` - render README to website HTML from template
- **Config**:
- `config/exclude.yaml` - known link-check exclusions
- `config/website.tmpl.html` - HTML template for site generation
- `config/health_cache.yaml` - persisted health scoring cache
- **Generated outputs**:
- `awesome-docker` - compiled CLI binary
- `website/index.html` - generated website
- `HEALTH_REPORT.md` - generated markdown report
- `HEALTH_REPORT.json` - generated JSON report
## Code Style
- **Language**: Node.js with ES modules (.mjs) for tests, CommonJS for build.js
- **Imports**: Use ES6 imports in .mjs files, require() in .js files
- **Error handling**: Use try/catch with LOG.error() and process.exit(1) for failures
- **Logging**: Use LOG object with error/debug methods (see build.js for pattern)
- **Async**: Prefer async/await over callbacks
- **Language**: Go
- **Formatting**: Keep code `gofmt`-clean
- **Testing**: Add/adjust table-driven tests in `internal/*_test.go` for behavior changes
- **Error handling**: Return wrapped errors (`fmt.Errorf("context: %w", err)`) from command handlers
- **CLI conventions**: Keep command behavior consistent with existing Cobra commands (`lint`, `check`, `health`, `build`, `report`, `validate`)
## CI/Automation
- **PR + weekly validation**: `.github/workflows/pull_request.yml`
- Triggers on pull requests to `master` and weekly schedule
- Builds Go CLI and runs `./awesome-docker validate`
- **Weekly broken links issue**: `.github/workflows/broken_links.yml`
- Runs `./awesome-docker check`
- Opens/updates `broken-links` issue when failures are found
- **Weekly health report issue**: `.github/workflows/health_report.yml`
- Runs `./awesome-docker health` then `./awesome-docker report`
- Opens/updates `health-report` issue
- **GitHub Pages deploy**: `.github/workflows/deploy-pages.yml`
- On push to `master`, builds CLI, runs `./awesome-docker build`, deploys `website/`
## Makefile Workflow
- The `Makefile` models file dependencies for generated artifacts (`awesome-docker`, `website/index.html`, `config/health_cache.yaml`, `HEALTH_REPORT.md`, `HEALTH_REPORT.json`).
- Prefer `make` targets over ad-hoc command sequences so dependency and regeneration behavior stays consistent.
- Use:
- `make workflow-dev` for local iteration
- `make workflow-pr` before opening/updating a PR
- `make workflow-maint` for health/report maintenance
- `make workflow-ci` for CI-equivalent local checks
## Content Guidelines (from CONTRIBUTING.md)
- Link to GitHub projects, not websites
- Entries are listed alphabetically (from A to Z)
- Entries must be Docker/container-related with clear documentation
- Include project description, installation, and usage examples
- Mark WIP projects explicitly
- Avoid outdated tutorials and blog posts unless they cover advanced or very specific topics
- Use one link per entry
- Prefer project/repository URLs over marketing pages
- Keep entries alphabetically ordered within each section
- Keep descriptions concise and concrete
- Use `:heavy_dollar_sign:` only for paid/commercial services
- Remove archived/deprecated projects instead of tagging them
- Avoid duplicate links and redirect variants
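A minimal entry that satisfies these rules might look like this (the project name and URL are placeholders, and the format is inferred from the lint rules above):

```markdown
- [example-tool](https://github.com/example/example-tool) - Concise, concrete description of what the tool does with containers.
```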

Makefile Normal file

@@ -0,0 +1,142 @@
SHELL := /bin/bash
BINARY ?= awesome-docker
GO ?= go
CMD_PACKAGE := ./cmd/awesome-docker
INTERNAL_PACKAGES := ./internal/...
WEBSITE_OUTPUT := website/index.html
HEALTH_CACHE := config/health_cache.yaml
HEALTH_REPORT_MD := HEALTH_REPORT.md
HEALTH_REPORT_JSON := HEALTH_REPORT.json
GO_SOURCES := $(shell find cmd internal -type f -name '*.go')
BUILD_INPUTS := $(GO_SOURCES) go.mod go.sum
WEBSITE_INPUTS := README.md config/website.tmpl.html
HEALTH_INPUTS := README.md config/exclude.yaml
.DEFAULT_GOAL := help
.PHONY: help \
build rebuild clean \
fmt test test-race \
lint lint-fix check check-pr validate website \
guard-github-token health health-cache \
report report-json report-file report-json-file health-report \
workflow-dev workflow-pr workflow-maint workflow-ci
help: ## Show the full local workflow and available targets
@echo "awesome-docker Makefile"
@echo
@echo "Workflows:"
@echo " make workflow-dev # local iteration (fmt + test + lint + check-pr + website)"
@echo " make workflow-pr # recommended before opening/updating a PR"
@echo " make workflow-maint # repository maintenance (health + JSON report)"
@echo " make workflow-ci # CI-equivalent checks"
@echo
@echo "Core targets:"
@echo " make build # build CLI binary"
@echo " make test # run internal Go tests"
@echo " make lint # validate README formatting/content rules"
@echo " make check # check links (uses GITHUB_TOKEN when set)"
@echo " make validate # run PR validation (lint + check --pr)"
@echo " make website # generate website/index.html"
@echo " make report-file # generate HEALTH_REPORT.md"
@echo " make report-json-file # generate HEALTH_REPORT.json"
@echo " make health # refresh health cache (requires GITHUB_TOKEN)"
@echo " make report # print markdown health report"
@echo " make report-json # print full JSON health report"
@echo
@echo "Generated artifacts:"
@echo " $(BINARY)"
@echo " $(WEBSITE_OUTPUT)"
@echo " $(HEALTH_CACHE)"
@echo " $(HEALTH_REPORT_MD)"
@echo " $(HEALTH_REPORT_JSON)"
$(BINARY): $(BUILD_INPUTS)
$(GO) build -o $(BINARY) $(CMD_PACKAGE)
build: $(BINARY) ## Build CLI binary
rebuild: clean build ## Rebuild from scratch
clean: ## Remove generated binary and report files
rm -f $(BINARY) $(HEALTH_REPORT_MD) $(HEALTH_REPORT_JSON)
fmt: ## Format Go code
$(GO) fmt ./...
test: ## Run internal unit tests
$(GO) test $(INTERNAL_PACKAGES) -v
test-race: ## Run internal tests with race detector
$(GO) test $(INTERNAL_PACKAGES) -race
lint: build ## Validate README formatting/content rules
./$(BINARY) lint
lint-fix: build ## Auto-fix lint issues when possible
./$(BINARY) lint --fix
check: build ## Check links (GitHub checks enabled when GITHUB_TOKEN is set)
./$(BINARY) check
check-pr: build ## Check links in PR mode (external links only)
./$(BINARY) check --pr
validate: build ## Run PR validation (lint + check --pr)
./$(BINARY) validate
$(WEBSITE_OUTPUT): $(BINARY) $(WEBSITE_INPUTS)
./$(BINARY) build
website: $(WEBSITE_OUTPUT) ## Generate website from README
guard-github-token:
@if [ -z "$$GITHUB_TOKEN" ]; then \
echo "GITHUB_TOKEN is required for this target."; \
echo "Set it with: export GITHUB_TOKEN=<token>"; \
exit 1; \
fi
$(HEALTH_CACHE): guard-github-token $(BINARY) $(HEALTH_INPUTS)
./$(BINARY) health
health-cache: $(HEALTH_CACHE) ## Update config/health_cache.yaml
health: ## Refresh health cache from GitHub metadata
@$(MAKE) --no-print-directory -B health-cache
report: build ## Print markdown health report from cache
./$(BINARY) report
report-json: build ## Print full health report as JSON
./$(BINARY) report --json
$(HEALTH_REPORT_MD): $(BINARY) $(HEALTH_CACHE)
./$(BINARY) report > $(HEALTH_REPORT_MD)
report-file: $(HEALTH_REPORT_MD) ## Generate HEALTH_REPORT.md from cache
$(HEALTH_REPORT_JSON): $(BINARY) $(HEALTH_CACHE)
./$(BINARY) report --json > $(HEALTH_REPORT_JSON)
report-json-file: $(HEALTH_REPORT_JSON) ## Generate HEALTH_REPORT.json from cache
health-report: health report-file ## Refresh health cache then generate HEALTH_REPORT.md
workflow-dev: fmt test lint check-pr website ## Full local development workflow
workflow-pr: fmt test validate ## Recommended workflow before opening a PR
workflow-maint: health report-json-file ## Weekly maintenance workflow
workflow-ci: test validate ## CI-equivalent validation workflow
update-ga:
ratchet upgrade .github/workflows/*
update-go:
go get -u go@latest
go get -u ./...
go mod tidy

README.md

File diff suppressed because it is too large


@@ -1,51 +0,0 @@
const fs = require('fs-extra');
const cheerio = require('cheerio');
const showdown = require('showdown');
process.env.NODE_ENV = 'production';
const LOG = {
error: (...args) => console.error('❌ ERROR', { ...args }),
debug: (...args) => {
if (process.env.DEBUG) console.log('💡 DEBUG: ', { ...args });
},
};
const handleFailure = (err) => {
LOG.error(err);
process.exit(1);
};
process.on('unhandledRejection', handleFailure);
// --- FILES
const README = 'README.md';
const WEBSITE_FOLDER = 'website';
const indexTemplate = `${WEBSITE_FOLDER}/index.tmpl.html`;
const indexDestination = `${WEBSITE_FOLDER}/index.html`;
async function processIndex() {
const converter = new showdown.Converter();
converter.setFlavor('github');
try {
LOG.debug('Loading files...', { indexTemplate, README });
const template = await fs.readFile(indexTemplate, 'utf8');
const markdown = await fs.readFile(README, 'utf8');
LOG.debug('Merging files...');
const $ = cheerio.load(template);
$('#md').append(converter.makeHtml(markdown));
LOG.debug('Writing index.html');
await fs.outputFile(indexDestination, $.html(), 'utf8');
LOG.debug('DONE 👍');
} catch (err) {
handleFailure(err);
}
}
async function main() {
await processIndex();
}
main();

cmd/awesome-docker/main.go Normal file

@@ -0,0 +1,630 @@
package main
import (
"context"
"fmt"
"os"
"strconv"
"strings"
"github.com/spf13/cobra"
"github.com/veggiemonk/awesome-docker/internal/builder"
"github.com/veggiemonk/awesome-docker/internal/cache"
"github.com/veggiemonk/awesome-docker/internal/checker"
"github.com/veggiemonk/awesome-docker/internal/linter"
"github.com/veggiemonk/awesome-docker/internal/parser"
"github.com/veggiemonk/awesome-docker/internal/scorer"
)
const (
readmePath = "README.md"
excludePath = "config/exclude.yaml"
templatePath = "config/website.tmpl.html"
healthCachePath = "config/health_cache.yaml"
websiteOutput = "website/index.html"
version = "0.1.0"
)
type checkSummary struct {
ExternalTotal int
GitHubTotal int
Broken []checker.LinkResult
Redirected []checker.LinkResult
GitHubErrors []error
GitHubSkipped bool
}
func main() {
root := &cobra.Command{
Use: "awesome-docker",
Short: "Quality tooling for the awesome-docker curated list",
}
root.AddCommand(
versionCmd(),
lintCmd(),
checkCmd(),
healthCmd(),
buildCmd(),
reportCmd(),
validateCmd(),
ciCmd(),
)
if err := root.Execute(); err != nil {
os.Exit(1)
}
}
func versionCmd() *cobra.Command {
return &cobra.Command{
Use: "version",
Short: "Print version",
Run: func(cmd *cobra.Command, args []string) { fmt.Printf("awesome-docker v%s\n", version) },
}
}
func parseReadme() (parser.Document, error) {
f, err := os.Open(readmePath)
if err != nil {
return parser.Document{}, err
}
defer f.Close()
return parser.Parse(f)
}
func collectURLs(sections []parser.Section, urls *[]string) {
for _, s := range sections {
for _, e := range s.Entries {
*urls = append(*urls, e.URL)
}
collectURLs(s.Children, urls)
}
}
func runLinkChecks(prMode bool) (checkSummary, error) {
doc, err := parseReadme()
if err != nil {
return checkSummary{}, fmt.Errorf("parse: %w", err)
}
var urls []string
collectURLs(doc.Sections, &urls)
exclude, err := cache.LoadExcludeList(excludePath)
if err != nil {
return checkSummary{}, fmt.Errorf("load exclude list: %w", err)
}
ghURLs, extURLs := checker.PartitionLinks(urls)
summary := checkSummary{
ExternalTotal: len(extURLs),
GitHubTotal: len(ghURLs),
}
results := checker.CheckLinks(extURLs, 10, exclude)
for _, r := range results {
if !r.OK {
summary.Broken = append(summary.Broken, r)
}
if r.Redirected {
summary.Redirected = append(summary.Redirected, r)
}
}
if prMode {
summary.GitHubSkipped = true
return summary, nil
}
token := os.Getenv("GITHUB_TOKEN")
if token == "" {
summary.GitHubSkipped = true
return summary, nil
}
gc := checker.NewGitHubChecker(token)
_, errs := gc.CheckRepos(context.Background(), ghURLs, 50)
summary.GitHubErrors = errs
return summary, nil
}
func runHealth(ctx context.Context) error {
token := os.Getenv("GITHUB_TOKEN")
if token == "" {
return fmt.Errorf("GITHUB_TOKEN environment variable is required")
}
doc, err := parseReadme()
if err != nil {
return fmt.Errorf("parse: %w", err)
}
var urls []string
collectURLs(doc.Sections, &urls)
ghURLs, _ := checker.PartitionLinks(urls)
fmt.Printf("Scoring %d GitHub repositories...\n", len(ghURLs))
gc := checker.NewGitHubChecker(token)
infos, errs := gc.CheckRepos(ctx, ghURLs, 50)
for _, e := range errs {
fmt.Printf(" error: %v\n", e)
}
if len(infos) == 0 {
if len(errs) > 0 {
return fmt.Errorf("failed to fetch GitHub metadata for all repositories (%d errors); check network/DNS and GITHUB_TOKEN", len(errs))
}
return fmt.Errorf("no GitHub repositories found in README")
}
scored := scorer.ScoreAll(infos)
cacheEntries := scorer.ToCacheEntries(scored)
hc, err := cache.LoadHealthCache(healthCachePath)
if err != nil {
return fmt.Errorf("load cache: %w", err)
}
hc.Merge(cacheEntries)
if err := cache.SaveHealthCache(healthCachePath, hc); err != nil {
return fmt.Errorf("save cache: %w", err)
}
fmt.Printf("Cache updated: %d entries in %s\n", len(hc.Entries), healthCachePath)
return nil
}
func scoredFromCache() ([]scorer.ScoredEntry, error) {
hc, err := cache.LoadHealthCache(healthCachePath)
if err != nil {
return nil, fmt.Errorf("load cache: %w", err)
}
if len(hc.Entries) == 0 {
return nil, fmt.Errorf("no cache data, run 'health' first")
}
scored := make([]scorer.ScoredEntry, 0, len(hc.Entries))
for _, e := range hc.Entries {
scored = append(scored, scorer.ScoredEntry{
URL: e.URL,
Name: e.Name,
Status: scorer.Status(e.Status),
Stars: e.Stars,
HasLicense: e.HasLicense,
LastPush: e.LastPush,
})
}
return scored, nil
}
func markdownReportFromCache() (string, error) {
scored, err := scoredFromCache()
if err != nil {
return "", err
}
return scorer.GenerateReport(scored), nil
}
func writeGitHubOutput(path, key, value string) error {
if path == "" {
return nil
}
f, err := os.OpenFile(path, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0o644)
if err != nil {
return fmt.Errorf("open github output file: %w", err)
}
defer f.Close()
if _, err := fmt.Fprintf(f, "%s=%s\n", key, value); err != nil {
return fmt.Errorf("write github output: %w", err)
}
return nil
}
func sanitizeOutputValue(v string) string {
v = strings.ReplaceAll(v, "\n", " ")
v = strings.ReplaceAll(v, "\r", " ")
return strings.TrimSpace(v)
}
func buildBrokenLinksIssueBody(summary checkSummary, runErr error) string {
var b strings.Builder
b.WriteString("# Broken Links Detected\n\n")
if runErr != nil {
b.WriteString("The link checker failed to execute cleanly.\n\n")
b.WriteString("## Failure\n\n")
fmt.Fprintf(&b, "- %s\n\n", runErr)
} else {
fmt.Fprintf(&b, "- Broken links: %d\n", len(summary.Broken))
fmt.Fprintf(&b, "- Redirected links: %d\n", len(summary.Redirected))
fmt.Fprintf(&b, "- GitHub API errors: %d\n\n", len(summary.GitHubErrors))
if len(summary.Broken) > 0 {
b.WriteString("## Broken Links\n\n")
for _, r := range summary.Broken {
fmt.Fprintf(&b, "- `%s` -> `%d %s`\n", r.URL, r.StatusCode, strings.TrimSpace(r.Error))
}
b.WriteString("\n")
}
if len(summary.GitHubErrors) > 0 {
b.WriteString("## GitHub API Errors\n\n")
for _, e := range summary.GitHubErrors {
fmt.Fprintf(&b, "- `%s`\n", e)
}
b.WriteString("\n")
}
}
b.WriteString("## Action Required\n\n")
b.WriteString("- Update the URL if the resource moved\n")
b.WriteString("- Remove the entry if permanently unavailable\n")
b.WriteString("- Add to `config/exclude.yaml` if a known false positive\n")
b.WriteString("- Investigate GitHub API/auth failures when present\n\n")
b.WriteString("---\n")
b.WriteString("*Auto-generated by awesome-docker ci broken-links*\n")
return b.String()
}
func buildHealthReportIssueBody(report string, healthErr error) string {
var b strings.Builder
if healthErr != nil {
b.WriteString("WARNING: health refresh failed in this run; showing latest cached report.\n\n")
fmt.Fprintf(&b, "Error: `%s`\n\n", healthErr)
}
b.WriteString(report)
if !strings.HasSuffix(report, "\n") {
b.WriteString("\n")
}
b.WriteString("\n---\n")
b.WriteString("*Auto-generated weekly by awesome-docker ci health-report*\n")
return b.String()
}
func lintCmd() *cobra.Command {
var fix bool
cmd := &cobra.Command{
Use: "lint",
Short: "Validate README formatting",
RunE: func(cmd *cobra.Command, args []string) error {
doc, err := parseReadme()
if err != nil {
return fmt.Errorf("parse: %w", err)
}
result := linter.Lint(doc)
for _, issue := range result.Issues {
fmt.Println(issue)
}
if result.Errors > 0 {
fmt.Printf("\n%d errors, %d warnings\n", result.Errors, result.Warnings)
if !fix {
return fmt.Errorf("lint failed with %d errors", result.Errors)
}
count, err := linter.FixFile(readmePath)
if err != nil {
return fmt.Errorf("fix: %w", err)
}
fmt.Printf("Fixed %d lines in %s\n", count, readmePath)
} else {
fmt.Printf("OK: %d warnings\n", result.Warnings)
}
return nil
},
}
cmd.Flags().BoolVar(&fix, "fix", false, "Auto-fix formatting issues")
return cmd
}
func checkCmd() *cobra.Command {
var prMode bool
cmd := &cobra.Command{
Use: "check",
Short: "Check links for reachability",
RunE: func(cmd *cobra.Command, args []string) error {
summary, err := runLinkChecks(prMode)
if err != nil {
return err
}
fmt.Printf("Checking %d external links...\n", summary.ExternalTotal)
if !prMode {
if summary.GitHubSkipped {
fmt.Println("GITHUB_TOKEN not set, skipping GitHub repo checks")
} else {
fmt.Printf("Checking %d GitHub repositories...\n", summary.GitHubTotal)
}
}
for _, e := range summary.GitHubErrors {
fmt.Printf(" GitHub error: %v\n", e)
}
if len(summary.Redirected) > 0 {
fmt.Printf("\n%d redirected links (consider updating):\n", len(summary.Redirected))
for _, r := range summary.Redirected {
fmt.Printf(" %s -> %s\n", r.URL, r.RedirectURL)
}
}
if len(summary.Broken) > 0 {
fmt.Printf("\n%d broken links:\n", len(summary.Broken))
for _, r := range summary.Broken {
fmt.Printf(" %s -> %d %s\n", r.URL, r.StatusCode, r.Error)
}
}
if len(summary.Broken) > 0 && len(summary.GitHubErrors) > 0 {
return fmt.Errorf("found %d broken links and %d GitHub API errors", len(summary.Broken), len(summary.GitHubErrors))
}
if len(summary.Broken) > 0 {
return fmt.Errorf("found %d broken links", len(summary.Broken))
}
if len(summary.GitHubErrors) > 0 {
return fmt.Errorf("github checks failed with %d errors", len(summary.GitHubErrors))
}
fmt.Println("All links OK")
return nil
},
}
cmd.Flags().BoolVar(&prMode, "pr", false, "PR mode: skip GitHub API checks")
return cmd
}
func healthCmd() *cobra.Command {
return &cobra.Command{
Use: "health",
Short: "Score repository health and update cache",
RunE: func(cmd *cobra.Command, args []string) error {
return runHealth(context.Background())
},
}
}
func buildCmd() *cobra.Command {
return &cobra.Command{
Use: "build",
Short: "Generate website from README",
RunE: func(cmd *cobra.Command, args []string) error {
if err := builder.Build(readmePath, templatePath, websiteOutput); err != nil {
return err
}
fmt.Printf("Website built: %s\n", websiteOutput)
return nil
},
}
}
func reportCmd() *cobra.Command {
var jsonOutput bool
cmd := &cobra.Command{
Use: "report",
Short: "Generate health report from cache",
RunE: func(cmd *cobra.Command, args []string) error {
scored, err := scoredFromCache()
if err != nil {
return err
}
if jsonOutput {
payload, err := scorer.GenerateJSONReport(scored)
if err != nil {
return fmt.Errorf("json report: %w", err)
}
fmt.Println(string(payload))
return nil
}
report := scorer.GenerateReport(scored)
fmt.Print(report)
return nil
},
}
cmd.Flags().BoolVar(&jsonOutput, "json", false, "Output full health report as JSON")
return cmd
}
func validateCmd() *cobra.Command {
return &cobra.Command{
Use: "validate",
Short: "PR validation: lint + check --pr",
RunE: func(cmd *cobra.Command, args []string) error {
fmt.Println("=== Linting ===")
doc, err := parseReadme()
if err != nil {
return fmt.Errorf("parse: %w", err)
}
result := linter.Lint(doc)
for _, issue := range result.Issues {
fmt.Println(issue)
}
if result.Errors > 0 {
fmt.Printf("\n%d errors, %d warnings\n", result.Errors, result.Warnings)
return fmt.Errorf("lint failed with %d errors", result.Errors)
}
fmt.Printf("Lint OK: %d warnings\n", result.Warnings)
fmt.Println("\n=== Checking links (PR mode) ===")
summary, err := runLinkChecks(true)
if err != nil {
return err
}
fmt.Printf("Checking %d external links...\n", summary.ExternalTotal)
if len(summary.Broken) > 0 {
fmt.Printf("\n%d broken links:\n", len(summary.Broken))
for _, r := range summary.Broken {
fmt.Printf(" %s -> %d %s\n", r.URL, r.StatusCode, r.Error)
}
return fmt.Errorf("found %d broken links", len(summary.Broken))
}
fmt.Println("\nValidation passed")
return nil
},
}
}
func ciCmd() *cobra.Command {
cmd := &cobra.Command{
Use: "ci",
Short: "CI-oriented helper commands",
}
cmd.AddCommand(
ciBrokenLinksCmd(),
ciHealthReportCmd(),
)
return cmd
}
func ciBrokenLinksCmd() *cobra.Command {
var issueFile string
var githubOutput string
var strict bool
cmd := &cobra.Command{
Use: "broken-links",
Short: "Run link checks and emit CI outputs/artifacts",
RunE: func(cmd *cobra.Command, args []string) error {
summary, runErr := runLinkChecks(false)
hasErrors := runErr != nil || len(summary.Broken) > 0 || len(summary.GitHubErrors) > 0
exitCode := 0
if hasErrors {
exitCode = 1
}
if runErr != nil {
exitCode = 2
}
if issueFile != "" && hasErrors {
body := buildBrokenLinksIssueBody(summary, runErr)
if err := os.WriteFile(issueFile, []byte(body), 0o644); err != nil {
return fmt.Errorf("write issue file: %w", err)
}
}
if err := writeGitHubOutput(githubOutput, "has_errors", strconv.FormatBool(hasErrors)); err != nil {
return err
}
if err := writeGitHubOutput(githubOutput, "check_exit_code", strconv.Itoa(exitCode)); err != nil {
return err
}
if err := writeGitHubOutput(githubOutput, "broken_count", strconv.Itoa(len(summary.Broken))); err != nil {
return err
}
if err := writeGitHubOutput(githubOutput, "github_error_count", strconv.Itoa(len(summary.GitHubErrors))); err != nil {
return err
}
if runErr != nil {
if err := writeGitHubOutput(githubOutput, "run_error", sanitizeOutputValue(runErr.Error())); err != nil {
return err
}
}
if runErr != nil {
fmt.Printf("CI broken-links run error: %v\n", runErr)
}
if hasErrors {
fmt.Printf("CI broken-links found %d broken links and %d GitHub errors\n", len(summary.Broken), len(summary.GitHubErrors))
} else {
fmt.Println("CI broken-links found no errors")
}
if strict {
if runErr != nil {
return runErr
}
if hasErrors {
return fmt.Errorf("found %d broken links and %d GitHub API errors", len(summary.Broken), len(summary.GitHubErrors))
}
}
return nil
},
}
cmd.Flags().StringVar(&issueFile, "issue-file", "broken_links_issue.md", "Path to write issue markdown body")
cmd.Flags().StringVar(&githubOutput, "github-output", "", "Path to GitHub output file (typically $GITHUB_OUTPUT)")
cmd.Flags().BoolVar(&strict, "strict", false, "Return non-zero when errors are found")
return cmd
}
func ciHealthReportCmd() *cobra.Command {
var issueFile string
var githubOutput string
var strict bool
cmd := &cobra.Command{
Use: "health-report",
Short: "Refresh health cache, render report, and emit CI outputs/artifacts",
RunE: func(cmd *cobra.Command, args []string) error {
healthErr := runHealth(context.Background())
report, reportErr := markdownReportFromCache()
healthOK := healthErr == nil
reportOK := reportErr == nil
hasReport := reportOK && strings.TrimSpace(report) != ""
hasErrors := !healthOK || !reportOK
if hasReport && issueFile != "" {
body := buildHealthReportIssueBody(report, healthErr)
if err := os.WriteFile(issueFile, []byte(body), 0o644); err != nil {
return fmt.Errorf("write issue file: %w", err)
}
}
if err := writeGitHubOutput(githubOutput, "has_report", strconv.FormatBool(hasReport)); err != nil {
return err
}
if err := writeGitHubOutput(githubOutput, "health_ok", strconv.FormatBool(healthOK)); err != nil {
return err
}
if err := writeGitHubOutput(githubOutput, "report_ok", strconv.FormatBool(reportOK)); err != nil {
return err
}
if err := writeGitHubOutput(githubOutput, "has_errors", strconv.FormatBool(hasErrors)); err != nil {
return err
}
if healthErr != nil {
if err := writeGitHubOutput(githubOutput, "health_error", sanitizeOutputValue(healthErr.Error())); err != nil {
return err
}
}
if reportErr != nil {
if err := writeGitHubOutput(githubOutput, "report_error", sanitizeOutputValue(reportErr.Error())); err != nil {
return err
}
}
if healthErr != nil {
fmt.Printf("CI health-report health error: %v\n", healthErr)
}
if reportErr != nil {
fmt.Printf("CI health-report report error: %v\n", reportErr)
}
if hasReport {
fmt.Println("CI health-report generated report artifact")
} else {
fmt.Println("CI health-report has no report artifact")
}
if strict {
if healthErr != nil {
return healthErr
}
if reportErr != nil {
return reportErr
}
}
return nil
},
}
cmd.Flags().StringVar(&issueFile, "issue-file", "health_report.txt", "Path to write health issue markdown body")
cmd.Flags().StringVar(&githubOutput, "github-output", "", "Path to GitHub output file (typically $GITHUB_OUTPUT)")
cmd.Flags().BoolVar(&strict, "strict", false, "Return non-zero when health/report fails")
return cmd
}

config/exclude.yaml Normal file

@@ -0,0 +1,18 @@
# URLs or URL prefixes to skip during link checking.
# These are known false positives or rate-limited domains.
domains:
- https://vimeo.com
- https://travis-ci.org/veggiemonk/awesome-docker.svg
- https://github.com/apps/
- https://twitter.com
- https://www.meetup.com/
- https://cycle.io/
- https://www.manning.com/
- https://deepfence.io
- https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg
- https://www.se-radio.net/2017/05/se-radio-episode-290-diogo-monica-on-docker-security
- https://www.reddit.com/r/docker/
- https://www.udacity.com/course/scalable-microservices-with-kubernetes--ud615
- https://www.youtube.com/playlist
- https://www.aquasec.com
- https://cloudsmith.com

config/health_cache.yaml Normal file

File diff suppressed because it is too large

go.mod Normal file

@@ -0,0 +1,17 @@
module github.com/veggiemonk/awesome-docker
go 1.26.0
require (
github.com/shurcooL/githubv4 v0.0.0-20260209031235-2402fdf4a9ed
github.com/spf13/cobra v1.10.2
github.com/yuin/goldmark v1.7.16
golang.org/x/oauth2 v0.35.0
gopkg.in/yaml.v3 v3.0.1
)
require (
github.com/inconshreveable/mousetrap v1.1.0 // indirect
github.com/shurcooL/graphql v0.0.0-20240915155400-7ee5256398cf // indirect
github.com/spf13/pflag v1.0.10 // indirect
)

go.sum Normal file

@@ -0,0 +1,22 @@
github.com/cpuguy83/go-md2man/v2 v2.0.6/go.mod h1:oOW0eioCTA6cOiMLiUPZOpcVxMig6NIQQ7OS05n1F4g=
github.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2s0bqwp9tc8=
github.com/inconshreveable/mousetrap v1.1.0/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw=
github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
github.com/shurcooL/githubv4 v0.0.0-20260209031235-2402fdf4a9ed h1:KT7hI8vYXgU0s2qaMkrfq9tCA1w/iEPgfredVP+4Tzw=
github.com/shurcooL/githubv4 v0.0.0-20260209031235-2402fdf4a9ed/go.mod h1:zqMwyHmnN/eDOZOdiTohqIUKUrTFX62PNlu7IJdu0q8=
github.com/shurcooL/graphql v0.0.0-20240915155400-7ee5256398cf h1:o1uxfymjZ7jZ4MsgCErcwWGtVKSiNAXtS59Lhs6uI/g=
github.com/shurcooL/graphql v0.0.0-20240915155400-7ee5256398cf/go.mod h1:9dIRpgIY7hVhoqfe0/FcYp0bpInZaT7dc3BYOprrIUE=
github.com/spf13/cobra v1.10.2 h1:DMTTonx5m65Ic0GOoRY2c16WCbHxOOw6xxezuLaBpcU=
github.com/spf13/cobra v1.10.2/go.mod h1:7C1pvHqHw5A4vrJfjNwvOdzYu0Gml16OCs2GRiTUUS4=
github.com/spf13/pflag v1.0.9/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
github.com/spf13/pflag v1.0.10 h1:4EBh2KAYBwaONj6b2Ye1GiHfwjqyROoF4RwYO+vPwFk=
github.com/spf13/pflag v1.0.10/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
github.com/yuin/goldmark v1.7.16 h1:n+CJdUxaFMiDUNnWC3dMWCIQJSkxH4uz3ZwQBkAlVNE=
github.com/yuin/goldmark v1.7.16/go.mod h1:ip/1k0VRfGynBgxOz0yCqHrbZXhcjxyuS66Brc7iBKg=
go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
golang.org/x/oauth2 v0.35.0 h1:Mv2mzuHuZuY2+bkyWXIHMfhNdJAdwW3FuWeCPYN5GVQ=
golang.org/x/oauth2 v0.35.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=


@@ -0,0 +1,69 @@
package builder
import (
"bytes"
"fmt"
"os"
"strings"
"github.com/yuin/goldmark"
"github.com/yuin/goldmark/extension"
"github.com/yuin/goldmark/renderer/html"
)
// Build converts a Markdown file to HTML using a template.
// The template must contain a placeholder element that will be replaced with the rendered content.
func Build(markdownPath, templatePath, outputPath string) error {
md, err := os.ReadFile(markdownPath)
if err != nil {
return fmt.Errorf("read markdown: %w", err)
}
tmpl, err := os.ReadFile(templatePath)
if err != nil {
return fmt.Errorf("read template: %w", err)
}
// Convert markdown to HTML
gm := goldmark.New(
goldmark.WithExtensions(extension.GFM),
goldmark.WithRendererOptions(html.WithUnsafe()),
)
var buf bytes.Buffer
if err := gm.Convert(md, &buf); err != nil {
return fmt.Errorf("convert markdown: %w", err)
}
// Inject into template — support both placeholder formats
output := string(tmpl)
replacements := []struct {
old string
new string
}{
{
old: `<div id="md"></div>`,
new: `<div id="md">` + buf.String() + `</div>`,
},
{
old: `<section id="md" class="main-content"></section>`,
new: `<section id="md" class="main-content">` + buf.String() + `</section>`,
},
}
replaced := false
for _, r := range replacements {
if strings.Contains(output, r.old) {
output = strings.Replace(output, r.old, r.new, 1)
replaced = true
break
}
}
if !replaced {
return fmt.Errorf("template missing supported markdown placeholder")
}
if err := os.WriteFile(outputPath, []byte(output), 0o644); err != nil {
return fmt.Errorf("write output: %w", err)
}
return nil
}


@@ -0,0 +1,133 @@
package builder
import (
"os"
"path/filepath"
"strings"
"testing"
)
func TestBuild(t *testing.T) {
dir := t.TempDir()
md := "# Test List\n\n- [Example](https://example.com) - A test entry.\n"
mdPath := filepath.Join(dir, "README.md")
if err := os.WriteFile(mdPath, []byte(md), 0o644); err != nil {
t.Fatal(err)
}
tmpl := `<!DOCTYPE html>
<html>
<body>
<div id="md"></div>
</body>
</html>`
tmplPath := filepath.Join(dir, "template.html")
if err := os.WriteFile(tmplPath, []byte(tmpl), 0o644); err != nil {
t.Fatal(err)
}
outPath := filepath.Join(dir, "index.html")
if err := Build(mdPath, tmplPath, outPath); err != nil {
t.Fatalf("Build failed: %v", err)
}
content, err := os.ReadFile(outPath)
if err != nil {
t.Fatal(err)
}
html := string(content)
if !strings.Contains(html, "Test List") {
t.Error("expected 'Test List' in output")
}
if !strings.Contains(html, "https://example.com") {
t.Error("expected link in output")
}
}
func TestBuildWithSectionPlaceholder(t *testing.T) {
dir := t.TempDir()
md := "# Hello\n\nWorld.\n"
mdPath := filepath.Join(dir, "README.md")
if err := os.WriteFile(mdPath, []byte(md), 0o644); err != nil {
t.Fatal(err)
}
// This matches the actual template format
tmpl := `<!DOCTYPE html>
<html>
<body>
<section id="md" class="main-content"></section>
</body>
</html>`
tmplPath := filepath.Join(dir, "template.html")
if err := os.WriteFile(tmplPath, []byte(tmpl), 0o644); err != nil {
t.Fatal(err)
}
outPath := filepath.Join(dir, "index.html")
if err := Build(mdPath, tmplPath, outPath); err != nil {
t.Fatalf("Build failed: %v", err)
}
content, err := os.ReadFile(outPath)
if err != nil {
t.Fatal(err)
}
if !strings.Contains(string(content), "Hello") {
t.Error("expected 'Hello' in output")
}
if !strings.Contains(string(content), `class="main-content"`) {
t.Error("expected section class preserved")
}
}
func TestBuildRealREADME(t *testing.T) {
mdPath := "../../README.md"
tmplPath := "../../config/website.tmpl.html"
if _, err := os.Stat(mdPath); err != nil {
t.Skip("README.md not found")
}
if _, err := os.Stat(tmplPath); err != nil {
t.Skip("website template not found")
}
dir := t.TempDir()
outPath := filepath.Join(dir, "index.html")
if err := Build(mdPath, tmplPath, outPath); err != nil {
t.Fatalf("Build failed: %v", err)
}
info, err := os.Stat(outPath)
if err != nil {
t.Fatal(err)
}
if info.Size() < 10000 {
t.Errorf("output too small: %d bytes", info.Size())
}
t.Logf("Generated %d bytes", info.Size())
}
func TestBuildFailsWithoutPlaceholder(t *testing.T) {
dir := t.TempDir()
mdPath := filepath.Join(dir, "README.md")
if err := os.WriteFile(mdPath, []byte("# Title\n"), 0o644); err != nil {
t.Fatal(err)
}
tmplPath := filepath.Join(dir, "template.html")
if err := os.WriteFile(tmplPath, []byte("<html><body><main></main></body></html>"), 0o644); err != nil {
t.Fatal(err)
}
outPath := filepath.Join(dir, "index.html")
err := Build(mdPath, tmplPath, outPath)
if err == nil {
t.Fatal("expected Build to fail when template has no supported placeholder")
}
}

internal/cache/cache.go vendored Normal file

@@ -0,0 +1,96 @@
package cache
import (
"os"
"strings"
"time"
"gopkg.in/yaml.v3"
)
// ExcludeList holds URL prefixes to skip during checking.
type ExcludeList struct {
Domains []string `yaml:"domains"`
}
// IsExcluded returns true if the URL starts with any excluded prefix.
func (e *ExcludeList) IsExcluded(url string) bool {
for _, d := range e.Domains {
if strings.HasPrefix(url, d) {
return true
}
}
return false
}
// LoadExcludeList reads an exclude.yaml file.
func LoadExcludeList(path string) (*ExcludeList, error) {
data, err := os.ReadFile(path)
if err != nil {
return nil, err
}
var excl ExcludeList
if err := yaml.Unmarshal(data, &excl); err != nil {
return nil, err
}
return &excl, nil
}
// HealthEntry stores metadata about a single entry.
type HealthEntry struct {
URL string `yaml:"url"`
Name string `yaml:"name"`
Status string `yaml:"status"` // healthy, inactive, stale, archived, dead
Stars int `yaml:"stars,omitempty"`
Forks int `yaml:"forks,omitempty"`
LastPush time.Time `yaml:"last_push,omitempty"`
HasLicense bool `yaml:"has_license,omitempty"`
HasReadme bool `yaml:"has_readme,omitempty"`
CheckedAt time.Time `yaml:"checked_at"`
}
// HealthCache is the full YAML cache file.
type HealthCache struct {
Entries []HealthEntry `yaml:"entries"`
}
// LoadHealthCache reads a health_cache.yaml file. Returns empty cache if file doesn't exist.
func LoadHealthCache(path string) (*HealthCache, error) {
data, err := os.ReadFile(path)
if err != nil {
if os.IsNotExist(err) {
return &HealthCache{}, nil
}
return nil, err
}
var hc HealthCache
if err := yaml.Unmarshal(data, &hc); err != nil {
return nil, err
}
return &hc, nil
}
// SaveHealthCache writes the cache to a YAML file.
func SaveHealthCache(path string, hc *HealthCache) error {
data, err := yaml.Marshal(hc)
if err != nil {
return err
}
return os.WriteFile(path, data, 0o644)
}
// Merge updates the cache with new entries, replacing existing ones by URL.
func (hc *HealthCache) Merge(entries []HealthEntry) {
index := make(map[string]int)
for i, e := range hc.Entries {
index[e.URL] = i
}
for _, e := range entries {
if i, exists := index[e.URL]; exists {
hc.Entries[i] = e
} else {
index[e.URL] = len(hc.Entries)
hc.Entries = append(hc.Entries, e)
}
}
}
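The replace-or-append pattern in `Merge` above generalizes to any keyed slice; a minimal standalone sketch (hypothetical `entry` type for illustration, not the package's own `HealthEntry`):

```go
package main

import "fmt"

type entry struct {
	URL   string
	Stars int
}

// mergeByURL replaces existing entries by URL and appends unseen ones.
// Because new URLs are recorded in the index as they are appended, a later
// duplicate within the incoming batch overwrites the earlier one.
func mergeByURL(existing, incoming []entry) []entry {
	index := make(map[string]int, len(existing))
	for i, e := range existing {
		index[e.URL] = i
	}
	for _, e := range incoming {
		if i, ok := index[e.URL]; ok {
			existing[i] = e
		} else {
			index[e.URL] = len(existing)
			existing = append(existing, e)
		}
	}
	return existing
}

func main() {
	out := mergeByURL(
		[]entry{{"a", 10}, {"b", 20}},
		[]entry{{"b", 25}, {"c", 1}, {"c", 2}},
	)
	fmt.Println(out)
}
```

The index update inside the append branch is what makes the batch self-deduplicating, which is exactly what `TestMergeDeduplicatesIncomingBatch` below relies on.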

internal/cache/cache_test.go vendored Normal file

@@ -0,0 +1,137 @@
package cache
import (
"os"
"path/filepath"
"testing"
"time"
)
func TestLoadExcludeList(t *testing.T) {
dir := t.TempDir()
path := filepath.Join(dir, "exclude.yaml")
content := `domains:
- https://example.com
- https://test.org
`
if err := os.WriteFile(path, []byte(content), 0o644); err != nil {
t.Fatal(err)
}
excl, err := LoadExcludeList(path)
if err != nil {
t.Fatal(err)
}
if len(excl.Domains) != 2 {
t.Errorf("domains count = %d, want 2", len(excl.Domains))
}
if !excl.IsExcluded("https://example.com/foo") {
t.Error("expected https://example.com/foo to be excluded")
}
if excl.IsExcluded("https://other.com") {
t.Error("expected https://other.com to NOT be excluded")
}
}
func TestHealthCacheRoundTrip(t *testing.T) {
dir := t.TempDir()
path := filepath.Join(dir, "health.yaml")
original := &HealthCache{
Entries: []HealthEntry{
{
URL: "https://github.com/example/repo",
Name: "Example",
Status: "healthy",
Stars: 42,
LastPush: time.Date(2026, 1, 15, 0, 0, 0, 0, time.UTC),
HasLicense: true,
HasReadme: true,
CheckedAt: time.Date(2026, 2, 27, 9, 0, 0, 0, time.UTC),
},
},
}
if err := SaveHealthCache(path, original); err != nil {
t.Fatal(err)
}
loaded, err := LoadHealthCache(path)
if err != nil {
t.Fatal(err)
}
if len(loaded.Entries) != 1 {
t.Fatalf("entries = %d, want 1", len(loaded.Entries))
}
if loaded.Entries[0].Stars != 42 {
t.Errorf("stars = %d, want 42", loaded.Entries[0].Stars)
}
}
func TestLoadHealthCacheMissing(t *testing.T) {
hc, err := LoadHealthCache("/nonexistent/path.yaml")
if err != nil {
t.Fatal(err)
}
if len(hc.Entries) != 0 {
t.Errorf("entries = %d, want 0 for missing file", len(hc.Entries))
}
}
func TestLoadHealthCacheInvalidYAML(t *testing.T) {
dir := t.TempDir()
path := filepath.Join(dir, "health.yaml")
if err := os.WriteFile(path, []byte("entries:\n - url: [not yaml"), 0o644); err != nil {
t.Fatal(err)
}
hc, err := LoadHealthCache(path)
if err == nil {
t.Fatal("expected error for invalid YAML")
}
if hc != nil {
t.Fatal("expected nil cache on invalid YAML")
}
}
func TestMerge(t *testing.T) {
hc := &HealthCache{
Entries: []HealthEntry{
{URL: "https://github.com/a/a", Name: "A", Stars: 10},
{URL: "https://github.com/b/b", Name: "B", Stars: 20},
},
}
hc.Merge([]HealthEntry{
{URL: "https://github.com/b/b", Name: "B", Stars: 25}, // update
{URL: "https://github.com/c/c", Name: "C", Stars: 30}, // new
})
if len(hc.Entries) != 3 {
t.Fatalf("entries = %d, want 3", len(hc.Entries))
}
// B should be updated
if hc.Entries[1].Stars != 25 {
t.Errorf("B stars = %d, want 25", hc.Entries[1].Stars)
}
// C should be appended
if hc.Entries[2].Name != "C" {
t.Errorf("last entry = %q, want C", hc.Entries[2].Name)
}
}
func TestMergeDeduplicatesIncomingBatch(t *testing.T) {
hc := &HealthCache{}
hc.Merge([]HealthEntry{
{URL: "https://github.com/c/c", Name: "C", Stars: 1},
{URL: "https://github.com/c/c", Name: "C", Stars: 2},
})
if len(hc.Entries) != 1 {
t.Fatalf("entries = %d, want 1", len(hc.Entries))
}
if hc.Entries[0].Stars != 2 {
t.Fatalf("stars = %d, want last value 2", hc.Entries[0].Stars)
}
}

internal/checker/github.go Normal file

@@ -0,0 +1,174 @@
package checker
import (
"context"
"fmt"
"net/url"
"strings"
"time"
"github.com/shurcooL/githubv4"
"golang.org/x/oauth2"
)
// RepoInfo holds metadata about a GitHub repository.
type RepoInfo struct {
Owner string
Name string
URL string
IsArchived bool
IsDisabled bool
IsPrivate bool
PushedAt time.Time
Stars int
Forks int
HasLicense bool
}
// ExtractGitHubRepo extracts owner/name from a GitHub URL.
// Returns false for non-repo URLs (issues, wiki, apps, etc.).
func ExtractGitHubRepo(rawURL string) (owner, name string, ok bool) {
u, err := url.Parse(rawURL)
if err != nil {
return "", "", false
}
host := strings.ToLower(u.Hostname())
if host != "github.com" && host != "www.github.com" {
return "", "", false
}
path := strings.Trim(u.Path, "/")
parts := strings.Split(path, "/")
if len(parts) != 2 || parts[0] == "" || parts[1] == "" {
return "", "", false
}
// Skip known non-repository top-level routes.
switch parts[0] {
case "apps", "features", "topics":
return "", "", false
}
name = strings.TrimSuffix(parts[1], ".git")
if name == "" {
return "", "", false
}
return parts[0], name, true
}
func isHTTPURL(raw string) bool {
u, err := url.Parse(raw)
if err != nil {
return false
}
return u.Scheme == "http" || u.Scheme == "https"
}
func isGitHubAuthError(err error) bool {
if err == nil {
return false
}
s := strings.ToLower(err.Error())
return strings.Contains(s, "401 unauthorized") ||
strings.Contains(s, "bad credentials") ||
strings.Contains(s, "resource not accessible by integration")
}
// PartitionLinks separates URLs into GitHub repos and external HTTP(S) links.
func PartitionLinks(urls []string) (github, external []string) {
for _, u := range urls {
if _, _, ok := ExtractGitHubRepo(u); ok {
github = append(github, u)
} else if isHTTPURL(u) {
external = append(external, u)
}
}
return
}
// GitHubChecker uses the GitHub GraphQL API.
type GitHubChecker struct {
client *githubv4.Client
}
// NewGitHubChecker creates a checker with the given OAuth token.
func NewGitHubChecker(token string) *GitHubChecker {
src := oauth2.StaticTokenSource(&oauth2.Token{AccessToken: token})
httpClient := oauth2.NewClient(context.Background(), src)
return &GitHubChecker{client: githubv4.NewClient(httpClient)}
}
// CheckRepo queries a single GitHub repository.
func (gc *GitHubChecker) CheckRepo(ctx context.Context, owner, name string) (RepoInfo, error) {
var query struct {
Repository struct {
IsArchived bool
IsDisabled bool
IsPrivate bool
PushedAt time.Time
StargazerCount int
ForkCount int
LicenseInfo *struct {
Name string
}
} `graphql:"repository(owner: $owner, name: $name)"`
}
vars := map[string]interface{}{
"owner": githubv4.String(owner),
"name": githubv4.String(name),
}
if err := gc.client.Query(ctx, &query, vars); err != nil {
return RepoInfo{}, fmt.Errorf("github query %s/%s: %w", owner, name, err)
}
r := query.Repository
return RepoInfo{
Owner: owner,
Name: name,
URL: fmt.Sprintf("https://github.com/%s/%s", owner, name),
IsArchived: r.IsArchived,
IsDisabled: r.IsDisabled,
IsPrivate: r.IsPrivate,
PushedAt: r.PushedAt,
Stars: r.StargazerCount,
Forks: r.ForkCount,
HasLicense: r.LicenseInfo != nil,
}, nil
}
// CheckRepos queries multiple repos in sequence with rate limiting.
func (gc *GitHubChecker) CheckRepos(ctx context.Context, urls []string, batchSize int) ([]RepoInfo, []error) {
if batchSize <= 0 {
batchSize = 50
}
var results []RepoInfo
var errs []error
for i, u := range urls {
owner, name, ok := ExtractGitHubRepo(u)
if !ok {
continue
}
info, err := gc.CheckRepo(ctx, owner, name)
if err != nil {
errs = append(errs, err)
if isGitHubAuthError(err) {
break
}
continue
}
results = append(results, info)
if (i+1)%batchSize == 0 {
time.Sleep(1 * time.Second)
}
}
return results, errs
}
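The owner/name extraction can be exercised outside the package; a self-contained sketch of the same `net/url`-based parsing (a reimplementation for illustration, omitting the reserved-route filtering for `apps`/`features`/`topics` above):

```go
package main

import (
	"fmt"
	"net/url"
	"strings"
)

// repoFromURL returns owner and name for plain github.com/<owner>/<repo>
// URLs and ok=false for deeper paths (issues, wiki) or non-GitHub hosts.
// Query strings and fragments are ignored because only u.Path is inspected.
func repoFromURL(raw string) (owner, name string, ok bool) {
	u, err := url.Parse(raw)
	if err != nil {
		return "", "", false
	}
	host := strings.ToLower(u.Hostname())
	if host != "github.com" && host != "www.github.com" {
		return "", "", false
	}
	parts := strings.Split(strings.Trim(u.Path, "/"), "/")
	if len(parts) != 2 || parts[0] == "" || parts[1] == "" {
		return "", "", false
	}
	name = strings.TrimSuffix(parts[1], ".git")
	if name == "" {
		return "", "", false
	}
	return parts[0], name, true
}

func main() {
	o, n, ok := repoFromURL("https://github.com/docker/compose?tab=readme-ov-file")
	fmt.Println(o, n, ok)
}
```

Splitting the trimmed path and requiring exactly two segments is what rejects `/user/repo/issues` without any special-casing.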


@@ -0,0 +1,78 @@
package checker
import (
"errors"
"testing"
)
func TestExtractGitHubRepo(t *testing.T) {
tests := []struct {
url string
owner string
name string
ok bool
}{
{"https://github.com/docker/compose", "docker", "compose", true},
{"https://github.com/moby/moby", "moby", "moby", true},
{"https://github.com/user/repo/", "user", "repo", true},
{"https://github.com/user/repo?tab=readme-ov-file", "user", "repo", true},
{"https://github.com/user/repo#readme", "user", "repo", true},
{"https://github.com/user/repo.git", "user", "repo", true},
{"https://www.github.com/user/repo", "user", "repo", true},
{"https://github.com/user/repo/issues", "", "", false},
{"https://github.com/user/repo/wiki", "", "", false},
{"https://github.com/apps/dependabot", "", "", false},
{"https://example.com/not-github", "", "", false},
{"https://github.com/user", "", "", false},
}
for _, tt := range tests {
owner, name, ok := ExtractGitHubRepo(tt.url)
if ok != tt.ok {
t.Errorf("ExtractGitHubRepo(%q): ok = %v, want %v", tt.url, ok, tt.ok)
continue
}
if ok {
if owner != tt.owner || name != tt.name {
t.Errorf("ExtractGitHubRepo(%q) = (%q, %q), want (%q, %q)", tt.url, owner, name, tt.owner, tt.name)
}
}
}
}
func TestPartitionLinks(t *testing.T) {
urls := []string{
"https://github.com/docker/compose",
"https://example.com/tool",
"https://github.com/moby/moby",
"https://github.com/user/repo/issues",
"dozzle",
"#projects",
}
gh, ext := PartitionLinks(urls)
if len(gh) != 2 {
t.Errorf("github links = %d, want 2", len(gh))
}
if len(ext) != 2 {
t.Errorf("external links = %d, want 2", len(ext))
}
}
func TestIsGitHubAuthError(t *testing.T) {
tests := []struct {
err error
want bool
}{
{errors.New("non-200 OK status code: 401 Unauthorized body: \"Bad credentials\""), true},
{errors.New("Resource not accessible by integration"), true},
{errors.New("dial tcp: lookup api.github.com: no such host"), false},
{errors.New("context deadline exceeded"), false},
}
for _, tt := range tests {
got := isGitHubAuthError(tt.err)
if got != tt.want {
t.Errorf("isGitHubAuthError(%q) = %v, want %v", tt.err, got, tt.want)
}
}
}

internal/checker/http.go Normal file

@@ -0,0 +1,121 @@
package checker
import (
"context"
"net/http"
"sync"
"time"
"github.com/veggiemonk/awesome-docker/internal/cache"
)
const (
defaultTimeout = 30 * time.Second
defaultConcurrency = 10
userAgent = "awesome-docker-checker/1.0"
)
// LinkResult holds the result of checking a single URL.
type LinkResult struct {
URL string
OK bool
StatusCode int
Redirected bool
RedirectURL string
Error string
}
func shouldFallbackToGET(statusCode int) bool {
switch statusCode {
case http.StatusBadRequest, http.StatusForbidden, http.StatusMethodNotAllowed, http.StatusNotImplemented:
return true
default:
return false
}
}
// CheckLink checks a single URL. Uses HEAD first, falls back to GET.
func CheckLink(url string, client *http.Client) LinkResult {
result := LinkResult{URL: url}
ctx, cancel := context.WithTimeout(context.Background(), defaultTimeout)
defer cancel()
// Track redirects
var finalURL string
origCheckRedirect := client.CheckRedirect
client.CheckRedirect = func(req *http.Request, via []*http.Request) error {
finalURL = req.URL.String()
if len(via) >= 10 {
// Cap redirect chains: surface the most recent response instead of erroring.
return http.ErrUseLastResponse
}
return nil
}
defer func() { client.CheckRedirect = origCheckRedirect }()
doRequest := func(method string) (*http.Response, error) {
req, err := http.NewRequestWithContext(ctx, method, url, nil)
if err != nil {
return nil, err
}
req.Header.Set("User-Agent", userAgent)
return client.Do(req)
}
resp, err := doRequest(http.MethodHead)
if err != nil {
resp, err = doRequest(http.MethodGet)
if err != nil {
result.Error = err.Error()
return result
}
} else if shouldFallbackToGET(resp.StatusCode) {
resp.Body.Close()
resp, err = doRequest(http.MethodGet)
if err != nil {
result.Error = err.Error()
return result
}
}
defer resp.Body.Close()
result.StatusCode = resp.StatusCode
result.OK = resp.StatusCode >= 200 && resp.StatusCode < 400
if finalURL != "" && finalURL != url {
result.Redirected = true
result.RedirectURL = finalURL
}
return result
}
// CheckLinks checks multiple URLs concurrently.
func CheckLinks(urls []string, concurrency int, exclude *cache.ExcludeList) []LinkResult {
if concurrency <= 0 {
concurrency = defaultConcurrency
}
results := make([]LinkResult, len(urls))
sem := make(chan struct{}, concurrency)
var wg sync.WaitGroup
for i, url := range urls {
if exclude != nil && exclude.IsExcluded(url) {
results[i] = LinkResult{URL: url, OK: true}
continue
}
wg.Add(1)
go func(idx int, u string) {
defer wg.Done()
sem <- struct{}{}
defer func() { <-sem }()
client := &http.Client{Timeout: defaultTimeout}
results[idx] = CheckLink(u, client)
}(i, url)
}
wg.Wait()
return results
}


@@ -0,0 +1,118 @@
package checker
import (
"net/http"
"net/http/httptest"
"testing"
)
func TestCheckLinkOK(t *testing.T) {
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.WriteHeader(http.StatusOK)
}))
defer server.Close()
result := CheckLink(server.URL, &http.Client{})
if !result.OK {
t.Errorf("expected OK, got status %d, error: %s", result.StatusCode, result.Error)
}
}
func TestCheckLink404(t *testing.T) {
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.WriteHeader(http.StatusNotFound)
}))
defer server.Close()
result := CheckLink(server.URL, &http.Client{})
if result.OK {
t.Error("expected not OK for 404")
}
if result.StatusCode != 404 {
t.Errorf("status = %d, want 404", result.StatusCode)
}
}
func TestCheckLinkRedirect(t *testing.T) {
final := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
w.WriteHeader(http.StatusOK)
}))
defer final.Close()
redir := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
http.Redirect(w, r, final.URL, http.StatusMovedPermanently)
}))
defer redir.Close()
result := CheckLink(redir.URL, &http.Client{})
if !result.OK {
t.Errorf("expected OK after following redirect, error: %s", result.Error)
}
if !result.Redirected {
t.Error("expected Redirected = true")
}
}
func TestCheckLinks(t *testing.T) {
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
if r.URL.Path == "/bad" {
w.WriteHeader(http.StatusNotFound)
return
}
w.WriteHeader(http.StatusOK)
}))
defer server.Close()
urls := []string{server.URL + "/good", server.URL + "/bad", server.URL + "/also-good"}
results := CheckLinks(urls, 2, nil)
if len(results) != 3 {
t.Fatalf("results = %d, want 3", len(results))
}
for _, r := range results {
if r.URL == server.URL+"/bad" && r.OK {
t.Error("expected /bad to not be OK")
}
if r.URL == server.URL+"/good" && !r.OK {
t.Error("expected /good to be OK")
}
}
}
func TestCheckLinkFallbackToGETOnMethodNotAllowed(t *testing.T) {
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
if r.Method == http.MethodHead {
w.WriteHeader(http.StatusMethodNotAllowed)
return
}
w.WriteHeader(http.StatusOK)
}))
defer server.Close()
result := CheckLink(server.URL, &http.Client{})
if !result.OK {
t.Errorf("expected OK after GET fallback, got status %d, error: %s", result.StatusCode, result.Error)
}
if result.StatusCode != http.StatusOK {
t.Errorf("status = %d, want 200", result.StatusCode)
}
}
func TestCheckLinkFallbackToGETOnForbiddenHead(t *testing.T) {
server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
if r.Method == http.MethodHead {
w.WriteHeader(http.StatusForbidden)
return
}
w.WriteHeader(http.StatusOK)
}))
defer server.Close()
result := CheckLink(server.URL, &http.Client{})
if !result.OK {
t.Errorf("expected OK after GET fallback, got status %d, error: %s", result.StatusCode, result.Error)
}
if result.StatusCode != http.StatusOK {
t.Errorf("status = %d, want 200", result.StatusCode)
}
}

internal/linter/fixer.go Normal file

@@ -0,0 +1,145 @@
package linter
import (
"bufio"
"fmt"
"os"
"regexp"
"strings"
"github.com/veggiemonk/awesome-docker/internal/parser"
)
// attributionRe matches trailing author attributions like:
//
// by [@author](url), by [@author][ref], by @author
//
// Also handles "Created by", "Maintained by" etc.
var attributionRe = regexp.MustCompile(`\s+(?:(?:[Cc]reated|[Mm]aintained|[Bb]uilt)\s+)?by\s+\[@[^\]]+\](?:\([^)]*\)|\[[^\]]*\])\.?$`)
// bareAttributionRe matches: by @author at end of line (no link).
var bareAttributionRe = regexp.MustCompile(`\s+by\s+@\w+\.?$`)
// sectionHeadingRe matches markdown headings.
var sectionHeadingRe = regexp.MustCompile(`^(#{1,6})\s+(.+?)(?:\s*<!--.*-->)?$`)
// RemoveAttribution strips author attribution from a description string.
func RemoveAttribution(desc string) string {
desc = attributionRe.ReplaceAllString(desc, "")
desc = bareAttributionRe.ReplaceAllString(desc, "")
return strings.TrimSpace(desc)
}
// FormatEntry reconstructs a markdown list line from a parsed Entry.
func FormatEntry(e parser.Entry) string {
desc := e.Description
var markers []string
for _, m := range e.Markers {
switch m {
case parser.MarkerAbandoned:
markers = append(markers, ":skull:")
case parser.MarkerPaid:
markers = append(markers, ":heavy_dollar_sign:")
case parser.MarkerWIP:
markers = append(markers, ":construction:")
}
}
if len(markers) > 0 {
desc = strings.Join(markers, " ") + " " + desc
}
return fmt.Sprintf("- [%s](%s) - %s", e.Name, e.URL, desc)
}
// FixFile reads the README, fixes entries (capitalize, period, remove attribution,
// sort), and writes the result back.
func FixFile(path string) (int, error) {
f, err := os.Open(path)
if err != nil {
return 0, err
}
defer f.Close()
var lines []string
scanner := bufio.NewScanner(f)
for scanner.Scan() {
lines = append(lines, scanner.Text())
}
if err := scanner.Err(); err != nil {
return 0, err
}
fixCount := 0
var headingLines []int
for i, line := range lines {
if sectionHeadingRe.MatchString(line) {
headingLines = append(headingLines, i)
}
}
// Process each heading block independently to match linter sort scope.
for i, headingIdx := range headingLines {
start := headingIdx + 1
end := len(lines)
if i+1 < len(headingLines) {
end = headingLines[i+1]
}
var entryPositions []int
var entries []parser.Entry
for lineIdx := start; lineIdx < end; lineIdx++ {
entry, err := parser.ParseEntry(lines[lineIdx], lineIdx+1)
if err != nil {
continue
}
entryPositions = append(entryPositions, lineIdx)
entries = append(entries, entry)
}
if len(entries) == 0 {
continue
}
var fixed []parser.Entry
for _, e := range entries {
f := FixEntry(e)
f.Description = RemoveAttribution(f.Description)
// Re-apply period after removing attribution (it may have been stripped)
if len(f.Description) > 0 && !strings.HasSuffix(f.Description, ".") {
f.Description += "."
}
fixed = append(fixed, f)
}
sorted := SortEntries(fixed)
for j, e := range sorted {
newLine := FormatEntry(e)
lineIdx := entryPositions[j]
if lines[lineIdx] != newLine {
fixCount++
lines[lineIdx] = newLine
}
}
}
if fixCount == 0 {
return 0, nil
}
// Write back
out, err := os.Create(path)
if err != nil {
return 0, err
}
defer out.Close()
w := bufio.NewWriter(out)
// Terminate every line, including the last, with a newline.
for _, line := range lines {
w.WriteString(line)
w.WriteString("\n")
}
return fixCount, w.Flush()
}
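The trailing-attribution regex can be tried in isolation; a standalone sketch using the same pattern (linked-attribution form only, without the bare `@author` variant handled above):

```go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// Matches a trailing attribution: an optional Created/Maintained/Built
// prefix, then "by [@name](link)" or "by [@name][ref]", with an optional
// final period. Anchored at $ so mid-sentence "by" is never touched.
var attribution = regexp.MustCompile(`\s+(?:(?:[Cc]reated|[Mm]aintained|[Bb]uilt)\s+)?by\s+\[@[^\]]+\](?:\([^)]*\)|\[[^\]]*\])\.?$`)

func stripAttribution(desc string) string {
	return strings.TrimSpace(attribution.ReplaceAllString(desc, ""))
}

func main() {
	fmt.Println(stripAttribution("Management UI by [@author](https://example.com)."))
	fmt.Println(stripAttribution("step-by-step guide"))
}
```

Because the match can swallow the closing period, callers re-append one afterwards, which is why `FixFile` re-checks the trailing period after stripping.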


@@ -0,0 +1,193 @@
package linter
import (
"os"
"strings"
"testing"
"github.com/veggiemonk/awesome-docker/internal/parser"
)
func TestRemoveAttribution(t *testing.T) {
tests := []struct {
input string
want string
}{
{
"Tool for managing containers by [@author](https://github.com/author)",
"Tool for managing containers",
},
{
"Tool for managing containers by [@author][author]",
"Tool for managing containers",
},
{
"Tool for managing containers by @author",
"Tool for managing containers",
},
{
"Analyzes resource usage. Created by [@Google][google]",
"Analyzes resource usage.",
},
{
"A tool by [@someone](https://example.com).",
"A tool",
},
{
"step-by-step tutorial and more resources",
"step-by-step tutorial and more resources",
},
{
"No attribution here",
"No attribution here",
},
}
for _, tt := range tests {
got := RemoveAttribution(tt.input)
if got != tt.want {
t.Errorf("RemoveAttribution(%q) = %q, want %q", tt.input, got, tt.want)
}
}
}
func TestFormatEntry(t *testing.T) {
e := parser.Entry{
Name: "Portainer",
URL: "https://github.com/portainer/portainer",
Description: "Management UI for Docker.",
}
got := FormatEntry(e)
want := "- [Portainer](https://github.com/portainer/portainer) - Management UI for Docker."
if got != want {
t.Errorf("FormatEntry = %q, want %q", got, want)
}
}
func TestFormatEntryWithMarkers(t *testing.T) {
e := parser.Entry{
Name: "OldTool",
URL: "https://github.com/old/tool",
Description: "A deprecated tool.",
Markers: []parser.Marker{parser.MarkerAbandoned},
}
got := FormatEntry(e)
want := "- [OldTool](https://github.com/old/tool) - :skull: A deprecated tool."
if got != want {
t.Errorf("FormatEntry = %q, want %q", got, want)
}
}
func TestFixFile(t *testing.T) {
content := `# Awesome Docker
## Tools
- [Zebra](https://example.com/zebra) - a tool by [@author](https://github.com/author)
- [Alpha](https://example.com/alpha) - another tool
## Other
Some text here.
`
tmp, err := os.CreateTemp("", "readme-*.md")
if err != nil {
t.Fatal(err)
}
defer os.Remove(tmp.Name())
if _, err := tmp.WriteString(content); err != nil {
t.Fatal(err)
}
tmp.Close()
count, err := FixFile(tmp.Name())
if err != nil {
t.Fatal(err)
}
if count == 0 {
t.Fatal("expected fixes, got 0")
}
data, err := os.ReadFile(tmp.Name())
if err != nil {
t.Fatal(err)
}
result := string(data)
// Check sorting: Alpha should come before Zebra
alphaIdx := strings.Index(result, "[Alpha]")
zebraIdx := strings.Index(result, "[Zebra]")
if alphaIdx > zebraIdx {
t.Error("expected Alpha before Zebra after sort")
}
// Check capitalization
if !strings.Contains(result, "- A tool.") {
t.Errorf("expected capitalized description, got:\n%s", result)
}
// Check attribution removed
if strings.Contains(result, "@author") {
t.Errorf("expected attribution removed, got:\n%s", result)
}
// Check period added
if !strings.Contains(result, "Another tool.") {
t.Errorf("expected period added, got:\n%s", result)
}
}
func TestFixFileSortsAcrossBlankLinesAndIsIdempotent(t *testing.T) {
content := `# Awesome Docker
## Tools
- [Zulu](https://example.com/zulu) - z tool
- [Alpha](https://example.com/alpha) - a tool
`
tmp, err := os.CreateTemp("", "readme-*.md")
if err != nil {
t.Fatal(err)
}
defer os.Remove(tmp.Name())
if _, err := tmp.WriteString(content); err != nil {
t.Fatal(err)
}
tmp.Close()
firstCount, err := FixFile(tmp.Name())
if err != nil {
t.Fatal(err)
}
if firstCount == 0 {
t.Fatal("expected first run to apply fixes")
}
firstData, err := os.ReadFile(tmp.Name())
if err != nil {
t.Fatal(err)
}
firstResult := string(firstData)
alphaIdx := strings.Index(firstResult, "[Alpha]")
zuluIdx := strings.Index(firstResult, "[Zulu]")
if alphaIdx == -1 || zuluIdx == -1 {
t.Fatalf("expected both Alpha and Zulu in result:\n%s", firstResult)
}
if alphaIdx > zuluIdx {
t.Fatalf("expected Alpha before Zulu after fix:\n%s", firstResult)
}
secondCount, err := FixFile(tmp.Name())
if err != nil {
t.Fatal(err)
}
if secondCount != 0 {
t.Fatalf("expected second run to be idempotent, got %d changes", secondCount)
}
}

internal/linter/linter.go Normal file

@@ -0,0 +1,60 @@
package linter
import (
"github.com/veggiemonk/awesome-docker/internal/parser"
)
// Result holds all lint issues found.
type Result struct {
Issues []Issue
Errors int
Warnings int
}
// Lint checks an entire parsed document for issues.
func Lint(doc parser.Document) Result {
var result Result
// Collect all entries for duplicate checking
allEntries := collectEntries(doc.Sections)
for _, issue := range CheckDuplicates(allEntries) {
addIssue(&result, issue)
}
// Check each section
lintSections(doc.Sections, &result)
return result
}
func lintSections(sections []parser.Section, result *Result) {
for _, s := range sections {
for _, e := range s.Entries {
for _, issue := range CheckEntry(e) {
addIssue(result, issue)
}
}
for _, issue := range CheckSorted(s.Entries) {
addIssue(result, issue)
}
lintSections(s.Children, result)
}
}
func collectEntries(sections []parser.Section) []parser.Entry {
var all []parser.Entry
for _, s := range sections {
all = append(all, s.Entries...)
all = append(all, collectEntries(s.Children)...)
}
return all
}
func addIssue(result *Result, issue Issue) {
result.Issues = append(result.Issues, issue)
if issue.Severity == SeverityError {
result.Errors++
} else {
result.Warnings++
}
}


@@ -0,0 +1,111 @@
package linter
import (
"testing"
"github.com/veggiemonk/awesome-docker/internal/parser"
)
func TestRuleDescriptionCapital(t *testing.T) {
entry := parser.Entry{Name: "Test", URL: "https://example.com", Description: "lowercase start.", Line: 10}
issues := CheckEntry(entry)
found := false
for _, issue := range issues {
if issue.Rule == RuleDescriptionCapital {
found = true
}
}
if !found {
t.Error("expected RuleDescriptionCapital issue for lowercase description")
}
}
func TestRuleDescriptionPeriod(t *testing.T) {
entry := parser.Entry{Name: "Test", URL: "https://example.com", Description: "No period at end", Line: 10}
issues := CheckEntry(entry)
found := false
for _, issue := range issues {
if issue.Rule == RuleDescriptionPeriod {
found = true
}
}
if !found {
t.Error("expected RuleDescriptionPeriod issue")
}
}
func TestRuleSorted(t *testing.T) {
entries := []parser.Entry{
{Name: "Zebra", URL: "https://z.com", Description: "Z.", Line: 1},
{Name: "Alpha", URL: "https://a.com", Description: "A.", Line: 2},
}
issues := CheckSorted(entries)
if len(issues) == 0 {
t.Error("expected sorting issue")
}
}
func TestRuleSortedOK(t *testing.T) {
entries := []parser.Entry{
{Name: "Alpha", URL: "https://a.com", Description: "A.", Line: 1},
{Name: "Zebra", URL: "https://z.com", Description: "Z.", Line: 2},
}
issues := CheckSorted(entries)
if len(issues) != 0 {
t.Errorf("expected no sorting issues, got %d", len(issues))
}
}
func TestRuleDuplicateURL(t *testing.T) {
entries := []parser.Entry{
{Name: "A", URL: "https://example.com/a", Description: "A.", Line: 1},
{Name: "B", URL: "https://example.com/a", Description: "B.", Line: 5},
}
issues := CheckDuplicates(entries)
if len(issues) == 0 {
t.Error("expected duplicate URL issue")
}
}
func TestValidEntry(t *testing.T) {
entry := parser.Entry{Name: "Good", URL: "https://example.com", Description: "A good project.", Line: 10}
issues := CheckEntry(entry)
if len(issues) != 0 {
t.Errorf("expected no issues, got %v", issues)
}
}
func TestFixDescriptionCapital(t *testing.T) {
entry := parser.Entry{Name: "Test", URL: "https://example.com", Description: "lowercase.", Line: 10}
fixed := FixEntry(entry)
if fixed.Description != "Lowercase." {
t.Errorf("description = %q, want %q", fixed.Description, "Lowercase.")
}
}
func TestFixDescriptionPeriod(t *testing.T) {
entry := parser.Entry{Name: "Test", URL: "https://example.com", Description: "No period", Line: 10}
fixed := FixEntry(entry)
if fixed.Description != "No period." {
t.Errorf("description = %q, want %q", fixed.Description, "No period.")
}
}
func TestLintDocument(t *testing.T) {
doc := parser.Document{
Sections: []parser.Section{
{
Title: "Tools",
Level: 2,
Entries: []parser.Entry{
{Name: "Zebra", URL: "https://z.com", Description: "Z tool.", Line: 1},
{Name: "Alpha", URL: "https://a.com", Description: "a tool", Line: 2},
},
},
},
}
result := Lint(doc)
if result.Errors == 0 {
t.Error("expected errors (unsorted, lowercase, no period)")
}
}

internal/linter/rules.go Normal file

@@ -0,0 +1,149 @@
package linter
import (
"fmt"
"sort"
"strings"
"unicode"
"github.com/veggiemonk/awesome-docker/internal/parser"
)
// Rule identifies a linting rule.
type Rule string
const (
RuleDescriptionCapital Rule = "description-capital"
RuleDescriptionPeriod Rule = "description-period"
RuleSorted Rule = "sorted"
RuleDuplicateURL Rule = "duplicate-url"
)
// Severity of a lint issue.
type Severity int
const (
SeverityError Severity = iota
SeverityWarning
)
// Issue is a single lint problem found.
type Issue struct {
Rule Rule
Severity Severity
Line int
Message string
}
func (i Issue) String() string {
sev := "ERROR"
if i.Severity == SeverityWarning {
sev = "WARN"
}
return fmt.Sprintf("[%s] line %d: %s (%s)", sev, i.Line, i.Message, i.Rule)
}
// CheckEntry validates a single entry against formatting rules.
func CheckEntry(e parser.Entry) []Issue {
var issues []Issue
if first, ok := firstLetter(e.Description); ok && !unicode.IsUpper(first) {
issues = append(issues, Issue{
Rule: RuleDescriptionCapital,
Severity: SeverityError,
Line: e.Line,
Message: fmt.Sprintf("%q: description should start with a capital letter", e.Name),
})
}
if len(e.Description) > 0 && !strings.HasSuffix(e.Description, ".") {
issues = append(issues, Issue{
Rule: RuleDescriptionPeriod,
Severity: SeverityError,
Line: e.Line,
Message: fmt.Sprintf("%q: description should end with a period", e.Name),
})
}
return issues
}
// CheckSorted verifies entries are in alphabetical order (case-insensitive).
func CheckSorted(entries []parser.Entry) []Issue {
var issues []Issue
for i := 1; i < len(entries); i++ {
prev := strings.ToLower(entries[i-1].Name)
curr := strings.ToLower(entries[i].Name)
if prev > curr {
issues = append(issues, Issue{
Rule: RuleSorted,
Severity: SeverityError,
Line: entries[i].Line,
Message: fmt.Sprintf("%q should come before %q (alphabetical order)", entries[i].Name, entries[i-1].Name),
})
}
}
return issues
}
// CheckDuplicates finds entries with the same URL across the entire document.
func CheckDuplicates(entries []parser.Entry) []Issue {
var issues []Issue
seen := make(map[string]int) // URL -> first line number
for _, e := range entries {
url := strings.TrimRight(e.URL, "/")
if firstLine, exists := seen[url]; exists {
issues = append(issues, Issue{
Rule: RuleDuplicateURL,
Severity: SeverityError,
Line: e.Line,
Message: fmt.Sprintf("duplicate URL %q (first seen at line %d)", e.URL, firstLine),
})
} else {
seen[url] = e.Line
}
}
return issues
}
// firstLetter returns the first unicode letter in s and true, or zero and false if none.
func firstLetter(s string) (rune, bool) {
for _, r := range s {
if unicode.IsLetter(r) {
return r, true
}
}
return 0, false
}
// FixEntry returns a copy of the entry with auto-fixable issues corrected.
func FixEntry(e parser.Entry) parser.Entry {
fixed := e
if len(fixed.Description) > 0 {
// Capitalize first letter (find it, may not be at index 0)
runes := []rune(fixed.Description)
for i, r := range runes {
if unicode.IsLetter(r) {
runes[i] = unicode.ToUpper(r)
break
}
}
fixed.Description = string(runes)
// Ensure period at end
if !strings.HasSuffix(fixed.Description, ".") {
fixed.Description += "."
}
}
return fixed
}
// SortEntries returns a sorted copy of entries (case-insensitive by Name).
func SortEntries(entries []parser.Entry) []parser.Entry {
sorted := make([]parser.Entry, len(entries))
copy(sorted, entries)
sort.Slice(sorted, func(i, j int) bool {
return strings.ToLower(sorted[i].Name) < strings.ToLower(sorted[j].Name)
})
return sorted
}
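CheckSorted's adjacent, case-insensitive comparison can be sketched on plain strings (hypothetical helper, not the package API):

```go
package main

import (
	"fmt"
	"strings"
)

// firstOutOfOrder returns the index of the first name that breaks
// case-insensitive alphabetical order, or -1 when the list is sorted.
// Comparing only adjacent pairs is enough: a list is sorted iff every
// adjacent pair is in order.
func firstOutOfOrder(names []string) int {
	for i := 1; i < len(names); i++ {
		if strings.ToLower(names[i-1]) > strings.ToLower(names[i]) {
			return i
		}
	}
	return -1
}

func main() {
	fmt.Println(firstOutOfOrder([]string{"Alpha", "beta", "Zebra"}))
	fmt.Println(firstOutOfOrder([]string{"Zebra", "Alpha"}))
}
```

Lower-casing both sides keeps "beta" and "Beta" equivalent, matching the sort order `SortEntries` produces.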

internal/parser/parser.go Normal file

@@ -0,0 +1,139 @@
package parser
import (
"bufio"
"fmt"
"io"
"regexp"
"strings"
)
// entryRe matches: - [Name](URL) - Description
// Also handles optional markers/text between URL and " - " separator, e.g.:
//
// - [Name](URL) :skull: - Description
// - [Name](URL) (2) :skull: - Description
var entryRe = regexp.MustCompile(`^[-*]\s+\[([^\]]+)\]\(([^)]+)\)(.*?)\s+-\s+(.+)$`)
// headingRe matches markdown headings: # Title, ## Title, etc.
var headingRe = regexp.MustCompile(`^(#{1,6})\s+(.+?)(?:\s*<!--.*-->)?$`)
var markerDefs = []struct {
text string
marker Marker
}{
{text: ":skull:", marker: MarkerAbandoned},
{text: ":heavy_dollar_sign:", marker: MarkerPaid},
{text: ":construction:", marker: MarkerWIP},
}
// ParseEntry parses a single markdown list line into an Entry.
func ParseEntry(line string, lineNum int) (Entry, error) {
m := entryRe.FindStringSubmatch(strings.TrimSpace(line))
if m == nil {
return Entry{}, fmt.Errorf("line %d: not a valid entry: %q", lineNum, line)
}
middle := m[3] // text between URL closing paren and " - "
desc := m[4]
var markers []Marker
// Extract markers from both the middle section and the description
for _, def := range markerDefs {
if strings.Contains(middle, def.text) || strings.Contains(desc, def.text) {
markers = append(markers, def.marker)
middle = strings.ReplaceAll(middle, def.text, "")
desc = strings.ReplaceAll(desc, def.text, "")
}
}
desc = strings.TrimSpace(desc)
return Entry{
Name: m[1],
URL: m[2],
Description: desc,
Markers: markers,
Line: lineNum,
Raw: line,
}, nil
}
// Parse reads a full README and returns a Document.
func Parse(r io.Reader) (Document, error) {
scanner := bufio.NewScanner(r)
var doc Document
var allSections []struct {
section Section
level int
}
lineNum := 0
for scanner.Scan() {
lineNum++
line := scanner.Text()
// Check for heading
if hm := headingRe.FindStringSubmatch(line); hm != nil {
level := len(hm[1])
title := strings.TrimSpace(hm[2])
allSections = append(allSections, struct {
section Section
level int
}{
section: Section{Title: title, Level: level, Line: lineNum},
level: level,
})
continue
}
// Check for entry (list item with link)
if entry, err := ParseEntry(line, lineNum); err == nil {
	if len(allSections) > 0 {
		allSections[len(allSections)-1].section.Entries = append(
			allSections[len(allSections)-1].section.Entries, entry)
	} else {
		// Entry before any heading: keep the raw line in the preamble
		// rather than dropping it silently.
		doc.Preamble = append(doc.Preamble, line)
	}
	continue
}
// Everything else: preamble if no sections yet
if len(allSections) == 0 {
doc.Preamble = append(doc.Preamble, line)
}
}
if err := scanner.Err(); err != nil {
return doc, err
}
// Build section tree by nesting based on heading level
doc.Sections = buildTree(allSections)
return doc, nil
}
func buildTree(flat []struct {
section Section
level int
},
) []Section {
if len(flat) == 0 {
return nil
}
var result []Section
for i := 0; i < len(flat); i++ {
current := flat[i].section
currentLevel := flat[i].level
// Collect children: everything after this heading at a deeper level
j := i + 1
for j < len(flat) && flat[j].level > currentLevel {
j++
}
if j > i+1 {
current.Children = buildTree(flat[i+1 : j])
}
result = append(result, current)
i = j - 1
}
return result
}
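The entry regex can be exercised in isolation. A minimal sketch using the same pattern (the sample line is hypothetical):

```go
package main

import (
	"fmt"
	"regexp"
)

// Same pattern as entryRe above: bullet, [Name](URL), optional middle
// text (markers), " - " separator, then the description.
var entryRe = regexp.MustCompile(`^[-*]\s+\[([^\]]+)\]\(([^)]+)\)(.*?)\s+-\s+(.+)$`)

func main() {
	line := `- [SomeTool](https://example.com/tool) :skull: - Does something.`
	m := entryRe.FindStringSubmatch(line)
	if m == nil {
		fmt.Println("no match")
		return
	}
	// m[3] keeps its leading whitespace; ParseEntry trims it after
	// marker extraction.
	fmt.Printf("name=%q url=%q middle=%q desc=%q\n", m[1], m[2], m[3], m[4])
}
```

The lazy `(.*?)` group is what lets markers sit between the URL and the ` - ` separator without being swallowed into the description.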


@@ -0,0 +1,161 @@
package parser
import (
"os"
"strings"
"testing"
)
func TestParseEntry(t *testing.T) {
line := `- [Docker Desktop](https://www.docker.com/products/docker-desktop/) - Official native app. Only for Windows and MacOS.`
entry, err := ParseEntry(line, 1)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if entry.Name != "Docker Desktop" {
t.Errorf("name = %q, want %q", entry.Name, "Docker Desktop")
}
if entry.URL != "https://www.docker.com/products/docker-desktop/" {
t.Errorf("url = %q, want %q", entry.URL, "https://www.docker.com/products/docker-desktop/")
}
if entry.Description != "Official native app. Only for Windows and MacOS." {
t.Errorf("description = %q, want %q", entry.Description, "Official native app. Only for Windows and MacOS.")
}
if len(entry.Markers) != 0 {
t.Errorf("markers = %v, want empty", entry.Markers)
}
}
func TestParseEntryWithMarkers(t *testing.T) {
line := `- [Docker Swarm](https://github.com/docker/swarm) - Swarm clustering system. :skull:`
entry, err := ParseEntry(line, 1)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if entry.Name != "Docker Swarm" {
t.Errorf("name = %q, want %q", entry.Name, "Docker Swarm")
}
if len(entry.Markers) != 1 || entry.Markers[0] != MarkerAbandoned {
t.Errorf("markers = %v, want [MarkerAbandoned]", entry.Markers)
}
if strings.Contains(entry.Description, ":skull:") {
t.Errorf("description should not contain marker text, got %q", entry.Description)
}
}
func TestParseEntryMultipleMarkers(t *testing.T) {
line := `- [SomeProject](https://example.com) - A project. :heavy_dollar_sign: :construction:`
entry, err := ParseEntry(line, 1)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if len(entry.Markers) != 2 {
t.Fatalf("markers count = %d, want 2", len(entry.Markers))
}
}
func TestParseEntryMarkersCanonicalOrder(t *testing.T) {
line := `- [SomeProject](https://example.com) - :construction: A project. :skull:`
entry, err := ParseEntry(line, 1)
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if len(entry.Markers) != 2 {
t.Fatalf("markers count = %d, want 2", len(entry.Markers))
}
if entry.Markers[0] != MarkerAbandoned || entry.Markers[1] != MarkerWIP {
t.Fatalf("marker order = %v, want [MarkerAbandoned MarkerWIP]", entry.Markers)
}
}
func TestParseDocument(t *testing.T) {
input := `# Awesome Docker
> A curated list
# Contents
- [Projects](#projects)
# Legend
- Abandoned :skull:
# Projects
## Tools
- [ToolA](https://github.com/a/a) - Does A.
- [ToolB](https://github.com/b/b) - Does B. :skull:
## Services
- [ServiceC](https://example.com/c) - Does C. :heavy_dollar_sign:
`
doc, err := Parse(strings.NewReader(input))
if err != nil {
t.Fatalf("unexpected error: %v", err)
}
if len(doc.Sections) == 0 {
t.Fatal("expected at least one section")
}
// Find the "Projects" section
var projects *Section
for i := range doc.Sections {
if doc.Sections[i].Title == "Projects" {
projects = &doc.Sections[i]
break
}
}
if projects == nil {
t.Fatal("expected a Projects section")
}
if len(projects.Children) != 2 {
t.Errorf("projects children = %d, want 2", len(projects.Children))
}
if projects.Children[0].Title != "Tools" {
t.Errorf("first child = %q, want %q", projects.Children[0].Title, "Tools")
}
if len(projects.Children[0].Entries) != 2 {
t.Errorf("Tools entries = %d, want 2", len(projects.Children[0].Entries))
}
}
func TestParseNotAnEntry(t *testing.T) {
_, err := ParseEntry("- Abandoned :skull:", 1)
if err == nil {
t.Error("expected error for non-entry list item")
}
}
func TestParseRealREADME(t *testing.T) {
f, err := os.Open("../../README.md")
if err != nil {
t.Skip("README.md not found, skipping integration test")
}
defer f.Close()
doc, err := Parse(f)
if err != nil {
t.Fatalf("failed to parse README: %v", err)
}
if len(doc.Sections) == 0 {
t.Error("expected sections")
}
total := countEntries(doc.Sections)
if total < 100 {
t.Errorf("expected at least 100 entries, got %d", total)
}
t.Logf("Parsed %d sections, %d total entries", len(doc.Sections), total)
}
func countEntries(sections []Section) int {
n := 0
for _, s := range sections {
n += len(s.Entries)
n += countEntries(s.Children)
}
return n
}

internal/parser/types.go Normal file

@@ -0,0 +1,35 @@
package parser
// Marker represents a status emoji on an entry.
type Marker int
const (
MarkerAbandoned Marker = iota // :skull:
MarkerPaid // :heavy_dollar_sign:
MarkerWIP // :construction:
)
// Entry is a single link entry in the README.
type Entry struct {
Name string
URL string
Description string
Markers []Marker
Line int // 1-based line number in source
Raw string // original line text
}
// Section is a heading with optional entries and child sections.
type Section struct {
Title string
Level int // heading level: 1 = #, 2 = ##, etc.
Entries []Entry
Children []Section
Line int
}
// Document is the parsed representation of the full README.
type Document struct {
Preamble []string // lines before the first section
Sections []Section
}

internal/scorer/scorer.go Normal file

@@ -0,0 +1,174 @@
package scorer
import (
"encoding/json"
"fmt"
"strings"
"time"
"github.com/veggiemonk/awesome-docker/internal/cache"
"github.com/veggiemonk/awesome-docker/internal/checker"
)
// Status represents the health status of an entry.
type Status string
const (
StatusHealthy Status = "healthy"
StatusInactive Status = "inactive" // 1-2 years since last push
StatusStale Status = "stale" // 2+ years since last push
StatusArchived Status = "archived"
StatusDead Status = "dead" // disabled or 404
)
// ScoredEntry is a repo with its computed health status.
type ScoredEntry struct {
	URL        string    `json:"url"`
	Name       string    `json:"name"`
	Status     Status    `json:"status"`
	Stars      int       `json:"stars"`
	Forks      int       `json:"forks"`
	HasLicense bool      `json:"has_license"`
	LastPush   time.Time `json:"last_push"`
}
// ReportSummary contains grouped status counts.
type ReportSummary struct {
Healthy int `json:"healthy"`
Inactive int `json:"inactive"`
Stale int `json:"stale"`
Archived int `json:"archived"`
Dead int `json:"dead"`
}
// ReportData is the full machine-readable report model.
type ReportData struct {
GeneratedAt time.Time `json:"generated_at"`
Total int `json:"total"`
Summary ReportSummary `json:"summary"`
Entries []ScoredEntry `json:"entries"`
ByStatus map[Status][]ScoredEntry `json:"by_status"`
}
// Score computes the health status of a GitHub repo.
func Score(info checker.RepoInfo) Status {
if info.IsDisabled {
return StatusDead
}
if info.IsArchived {
return StatusArchived
}
twoYearsAgo := time.Now().AddDate(-2, 0, 0)
oneYearAgo := time.Now().AddDate(-1, 0, 0)
if info.PushedAt.Before(twoYearsAgo) {
return StatusStale
}
if info.PushedAt.Before(oneYearAgo) {
return StatusInactive
}
return StatusHealthy
}
// ScoreAll scores a batch of repo infos.
func ScoreAll(infos []checker.RepoInfo) []ScoredEntry {
results := make([]ScoredEntry, len(infos))
for i, info := range infos {
results[i] = ScoredEntry{
URL: info.URL,
Name: fmt.Sprintf("%s/%s", info.Owner, info.Name),
Status: Score(info),
Stars: info.Stars,
Forks: info.Forks,
HasLicense: info.HasLicense,
LastPush: info.PushedAt,
}
}
return results
}
// ToCacheEntries converts scored entries to cache format.
func ToCacheEntries(scored []ScoredEntry) []cache.HealthEntry {
entries := make([]cache.HealthEntry, len(scored))
now := time.Now().UTC()
for i, s := range scored {
entries[i] = cache.HealthEntry{
URL: s.URL,
Name: s.Name,
Status: string(s.Status),
Stars: s.Stars,
Forks: s.Forks,
HasLicense: s.HasLicense,
LastPush: s.LastPush,
CheckedAt: now,
}
}
return entries
}
// GenerateReport produces a Markdown health report.
func GenerateReport(scored []ScoredEntry) string {
var b strings.Builder
data := BuildReportData(scored)
groups := data.ByStatus
fmt.Fprintf(&b, "# Health Report\n\n")
fmt.Fprintf(&b, "**Generated:** %s\n\n", data.GeneratedAt.Format(time.RFC3339))
fmt.Fprintf(&b, "**Total:** %d repositories\n\n", data.Total)
fmt.Fprintf(&b, "## Summary\n\n")
fmt.Fprintf(&b, "- Healthy: %d\n", data.Summary.Healthy)
fmt.Fprintf(&b, "- Inactive (1-2 years): %d\n", data.Summary.Inactive)
fmt.Fprintf(&b, "- Stale (2+ years): %d\n", data.Summary.Stale)
fmt.Fprintf(&b, "- Archived: %d\n", data.Summary.Archived)
fmt.Fprintf(&b, "- Dead: %d\n\n", data.Summary.Dead)
writeSection := func(title string, status Status) {
entries := groups[status]
if len(entries) == 0 {
return
}
fmt.Fprintf(&b, "## %s\n\n", title)
for _, e := range entries {
fmt.Fprintf(&b, "- [%s](%s) - Stars: %d - Last push: %s\n",
e.Name, e.URL, e.Stars, e.LastPush.Format("2006-01-02"))
}
b.WriteString("\n")
}
writeSection("Archived (should mark :skull:)", StatusArchived)
writeSection("Stale (2+ years inactive)", StatusStale)
writeSection("Inactive (1-2 years)", StatusInactive)
return b.String()
}
// BuildReportData returns full report data for machine-readable and markdown rendering.
func BuildReportData(scored []ScoredEntry) ReportData {
groups := map[Status][]ScoredEntry{}
for _, s := range scored {
groups[s.Status] = append(groups[s.Status], s)
}
return ReportData{
GeneratedAt: time.Now().UTC(),
Total: len(scored),
Summary: ReportSummary{
Healthy: len(groups[StatusHealthy]),
Inactive: len(groups[StatusInactive]),
Stale: len(groups[StatusStale]),
Archived: len(groups[StatusArchived]),
Dead: len(groups[StatusDead]),
},
Entries: scored,
ByStatus: groups,
}
}
// GenerateJSONReport returns the full report as pretty-printed JSON.
func GenerateJSONReport(scored []ScoredEntry) ([]byte, error) {
data := BuildReportData(scored)
return json.MarshalIndent(data, "", " ")
}


@@ -0,0 +1,164 @@
package scorer
import (
"encoding/json"
"fmt"
"strings"
"testing"
"time"
"github.com/veggiemonk/awesome-docker/internal/checker"
)
func TestScoreHealthy(t *testing.T) {
info := checker.RepoInfo{
PushedAt: time.Now().AddDate(0, -3, 0),
IsArchived: false,
Stars: 100,
HasLicense: true,
}
status := Score(info)
if status != StatusHealthy {
t.Errorf("status = %q, want %q", status, StatusHealthy)
}
}
func TestScoreInactive(t *testing.T) {
info := checker.RepoInfo{
PushedAt: time.Now().AddDate(-1, -6, 0),
IsArchived: false,
}
status := Score(info)
if status != StatusInactive {
t.Errorf("status = %q, want %q", status, StatusInactive)
}
}
func TestScoreStale(t *testing.T) {
info := checker.RepoInfo{
PushedAt: time.Now().AddDate(-3, 0, 0),
IsArchived: false,
}
status := Score(info)
if status != StatusStale {
t.Errorf("status = %q, want %q", status, StatusStale)
}
}
func TestScoreArchived(t *testing.T) {
info := checker.RepoInfo{
PushedAt: time.Now(),
IsArchived: true,
}
status := Score(info)
if status != StatusArchived {
t.Errorf("status = %q, want %q", status, StatusArchived)
}
}
func TestScoreDisabled(t *testing.T) {
info := checker.RepoInfo{
IsDisabled: true,
}
status := Score(info)
if status != StatusDead {
t.Errorf("status = %q, want %q", status, StatusDead)
}
}
func TestGenerateReport(t *testing.T) {
results := []ScoredEntry{
{URL: "https://github.com/a/a", Name: "a/a", Status: StatusHealthy, Stars: 100, LastPush: time.Now()},
{URL: "https://github.com/b/b", Name: "b/b", Status: StatusArchived, Stars: 50, LastPush: time.Now()},
{URL: "https://github.com/c/c", Name: "c/c", Status: StatusStale, Stars: 10, LastPush: time.Now().AddDate(-3, 0, 0)},
}
report := GenerateReport(results)
if !strings.Contains(report, "Healthy: 1") {
t.Error("report should contain 'Healthy: 1'")
}
if !strings.Contains(report, "Archived: 1") {
t.Error("report should contain 'Archived: 1'")
}
if !strings.Contains(report, "Stale") {
t.Error("report should contain 'Stale'")
}
}
func TestGenerateReportShowsAllEntries(t *testing.T) {
var results []ScoredEntry
for i := 0; i < 55; i++ {
results = append(results, ScoredEntry{
URL: fmt.Sprintf("https://github.com/stale/%d", i),
Name: fmt.Sprintf("stale/%d", i),
Status: StatusStale,
Stars: i,
LastPush: time.Now().AddDate(-3, 0, 0),
})
}
report := GenerateReport(results)
if strings.Contains(report, "... and") {
t.Fatal("report should not be truncated")
}
if !strings.Contains(report, "stale/54") {
t.Fatal("report should contain all entries")
}
}
func TestGenerateJSONReport(t *testing.T) {
results := []ScoredEntry{
{
URL: "https://github.com/a/a",
Name: "a/a",
Status: StatusHealthy,
Stars: 100,
LastPush: time.Now(),
},
{
URL: "https://github.com/b/b",
Name: "b/b",
Status: StatusStale,
Stars: 50,
LastPush: time.Now().AddDate(-3, 0, 0),
},
}
data, err := GenerateJSONReport(results)
if err != nil {
t.Fatalf("GenerateJSONReport() error = %v", err)
}
var report ReportData
if err := json.Unmarshal(data, &report); err != nil {
t.Fatalf("json.Unmarshal() error = %v", err)
}
if report.Total != 2 {
t.Fatalf("report.Total = %d, want 2", report.Total)
}
if report.Summary.Healthy != 1 || report.Summary.Stale != 1 {
t.Fatalf("summary = %+v, want healthy=1 stale=1", report.Summary)
}
if len(report.Entries) != 2 {
t.Fatalf("len(report.Entries) = %d, want 2", len(report.Entries))
}
if len(report.ByStatus[StatusStale]) != 1 {
t.Fatalf("len(report.ByStatus[stale]) = %d, want 1", len(report.ByStatus[StatusStale]))
}
}
func TestScoreAll(t *testing.T) {
infos := []checker.RepoInfo{
{Owner: "a", Name: "a", PushedAt: time.Now(), Stars: 10},
{Owner: "b", Name: "b", PushedAt: time.Now().AddDate(-3, 0, 0), Stars: 5},
}
scored := ScoreAll(infos)
if len(scored) != 2 {
t.Fatalf("scored = %d, want 2", len(scored))
}
if scored[0].Status != StatusHealthy {
t.Errorf("first = %q, want healthy", scored[0].Status)
}
if scored[1].Status != StatusStale {
t.Errorf("second = %q, want stale", scored[1].Status)
}
}

package-lock.json generated

@@ -1,619 +0,0 @@
{
"name": "awesome-docker-website",
"version": "1.0.0",
"lockfileVersion": 3,
"requires": true,
"packages": {
"": {
"name": "awesome-docker-website",
"version": "1.0.0",
"license": "Apache-2.0",
"dependencies": {
"cheerio": "1.2.0",
"draftlog": "1.0.13",
"fs-extra": "11.3.3",
"node-fetch": "3.3.2",
"rimraf": "6.1.3",
"showdown": "^2.1.0"
}
},
"node_modules/@isaacs/cliui": {
"version": "9.0.0",
"resolved": "https://registry.npmjs.org/@isaacs/cliui/-/cliui-9.0.0.tgz",
"integrity": "sha512-AokJm4tuBHillT+FpMtxQ60n8ObyXBatq7jD2/JA9dxbDDokKQm8KMht5ibGzLVU9IJDIKK4TPKgMHEYMn3lMg==",
"license": "BlueOak-1.0.0",
"engines": {
"node": ">=18"
}
},
"node_modules/balanced-match": {
"version": "4.0.2",
"resolved": "https://registry.npmjs.org/balanced-match/-/balanced-match-4.0.2.tgz",
"integrity": "sha512-x0K50QvKQ97fdEz2kPehIerj+YTeptKF9hyYkKf6egnwmMWAkADiO0QCzSp0R5xN8FTZgYaBfSaue46Ej62nMg==",
"license": "MIT",
"dependencies": {
"jackspeak": "^4.2.3"
},
"engines": {
"node": "20 || >=22"
}
},
"node_modules/boolbase": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/boolbase/-/boolbase-1.0.0.tgz",
"integrity": "sha512-JZOSA7Mo9sNGB8+UjSgzdLtokWAky1zbztM3WRLCbZ70/3cTANmQmOdR7y2g+J0e2WXywy1yS468tY+IruqEww==",
"license": "ISC"
},
"node_modules/brace-expansion": {
"version": "5.0.2",
"resolved": "https://registry.npmjs.org/brace-expansion/-/brace-expansion-5.0.2.tgz",
"integrity": "sha512-Pdk8c9poy+YhOgVWw1JNN22/HcivgKWwpxKq04M/jTmHyCZn12WPJebZxdjSa5TmBqISrUSgNYU3eRORljfCCw==",
"license": "MIT",
"dependencies": {
"balanced-match": "^4.0.2"
},
"engines": {
"node": "20 || >=22"
}
},
"node_modules/cheerio": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/cheerio/-/cheerio-1.2.0.tgz",
"integrity": "sha512-WDrybc/gKFpTYQutKIK6UvfcuxijIZfMfXaYm8NMsPQxSYvf+13fXUJ4rztGGbJcBQ/GF55gvrZ0Bc0bj/mqvg==",
"license": "MIT",
"dependencies": {
"cheerio-select": "^2.1.0",
"dom-serializer": "^2.0.0",
"domhandler": "^5.0.3",
"domutils": "^3.2.2",
"encoding-sniffer": "^0.2.1",
"htmlparser2": "^10.1.0",
"parse5": "^7.3.0",
"parse5-htmlparser2-tree-adapter": "^7.1.0",
"parse5-parser-stream": "^7.1.2",
"undici": "^7.19.0",
"whatwg-mimetype": "^4.0.0"
},
"engines": {
"node": ">=20.18.1"
},
"funding": {
"url": "https://github.com/cheeriojs/cheerio?sponsor=1"
}
},
"node_modules/cheerio-select": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/cheerio-select/-/cheerio-select-2.1.0.tgz",
"integrity": "sha512-9v9kG0LvzrlcungtnJtpGNxY+fzECQKhK4EGJX2vByejiMX84MFNQw4UxPJl3bFbTMw+Dfs37XaIkCwTZfLh4g==",
"license": "BSD-2-Clause",
"dependencies": {
"boolbase": "^1.0.0",
"css-select": "^5.1.0",
"css-what": "^6.1.0",
"domelementtype": "^2.3.0",
"domhandler": "^5.0.3",
"domutils": "^3.0.1"
},
"funding": {
"url": "https://github.com/sponsors/fb55"
}
},
"node_modules/commander": {
"version": "9.5.0",
"resolved": "https://registry.npmjs.org/commander/-/commander-9.5.0.tgz",
"integrity": "sha512-KRs7WVDKg86PWiuAqhDrAQnTXZKraVcCc6vFdL14qrZ/DcWwuRo7VoiYXalXO7S5GKpqYiVEwCbgFDfxNHKJBQ==",
"license": "MIT",
"engines": {
"node": "^12.20.0 || >=14"
}
},
"node_modules/css-select": {
"version": "5.2.2",
"resolved": "https://registry.npmjs.org/css-select/-/css-select-5.2.2.tgz",
"integrity": "sha512-TizTzUddG/xYLA3NXodFM0fSbNizXjOKhqiQQwvhlspadZokn1KDy0NZFS0wuEubIYAV5/c1/lAr0TaaFXEXzw==",
"license": "BSD-2-Clause",
"dependencies": {
"boolbase": "^1.0.0",
"css-what": "^6.1.0",
"domhandler": "^5.0.2",
"domutils": "^3.0.1",
"nth-check": "^2.0.1"
},
"funding": {
"url": "https://github.com/sponsors/fb55"
}
},
"node_modules/css-what": {
"version": "6.2.2",
"resolved": "https://registry.npmjs.org/css-what/-/css-what-6.2.2.tgz",
"integrity": "sha512-u/O3vwbptzhMs3L1fQE82ZSLHQQfto5gyZzwteVIEyeaY5Fc7R4dapF/BvRoSYFeqfBk4m0V1Vafq5Pjv25wvA==",
"license": "BSD-2-Clause",
"engines": {
"node": ">= 6"
},
"funding": {
"url": "https://github.com/sponsors/fb55"
}
},
"node_modules/data-uri-to-buffer": {
"version": "4.0.1",
"resolved": "https://registry.npmjs.org/data-uri-to-buffer/-/data-uri-to-buffer-4.0.1.tgz",
"integrity": "sha512-0R9ikRb668HB7QDxT1vkpuUBtqc53YyAwMwGeUFKRojY/NWKvdZ+9UYtRfGmhqNbRkTSVpMbmyhXipFFv2cb/A==",
"license": "MIT",
"engines": {
"node": ">= 12"
}
},
"node_modules/dom-serializer": {
"version": "2.0.0",
"resolved": "https://registry.npmjs.org/dom-serializer/-/dom-serializer-2.0.0.tgz",
"integrity": "sha512-wIkAryiqt/nV5EQKqQpo3SToSOV9J0DnbJqwK7Wv/Trc92zIAYZ4FlMu+JPFW1DfGFt81ZTCGgDEabffXeLyJg==",
"license": "MIT",
"dependencies": {
"domelementtype": "^2.3.0",
"domhandler": "^5.0.2",
"entities": "^4.2.0"
},
"funding": {
"url": "https://github.com/cheeriojs/dom-serializer?sponsor=1"
}
},
"node_modules/domelementtype": {
"version": "2.3.0",
"resolved": "https://registry.npmjs.org/domelementtype/-/domelementtype-2.3.0.tgz",
"integrity": "sha512-OLETBj6w0OsagBwdXnPdN0cnMfF9opN69co+7ZrbfPGrdpPVNBUj02spi6B1N7wChLQiPn4CSH/zJvXw56gmHw==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/fb55"
}
],
"license": "BSD-2-Clause"
},
"node_modules/domhandler": {
"version": "5.0.3",
"resolved": "https://registry.npmjs.org/domhandler/-/domhandler-5.0.3.tgz",
"integrity": "sha512-cgwlv/1iFQiFnU96XXgROh8xTeetsnJiDsTc7TYCLFd9+/WNkIqPTxiM/8pSd8VIrhXGTf1Ny1q1hquVqDJB5w==",
"license": "BSD-2-Clause",
"dependencies": {
"domelementtype": "^2.3.0"
},
"engines": {
"node": ">= 4"
},
"funding": {
"url": "https://github.com/fb55/domhandler?sponsor=1"
}
},
"node_modules/domutils": {
"version": "3.2.2",
"resolved": "https://registry.npmjs.org/domutils/-/domutils-3.2.2.tgz",
"integrity": "sha512-6kZKyUajlDuqlHKVX1w7gyslj9MPIXzIFiz/rGu35uC1wMi+kMhQwGhl4lt9unC9Vb9INnY9Z3/ZA3+FhASLaw==",
"license": "BSD-2-Clause",
"dependencies": {
"dom-serializer": "^2.0.0",
"domelementtype": "^2.3.0",
"domhandler": "^5.0.3"
},
"funding": {
"url": "https://github.com/fb55/domutils?sponsor=1"
}
},
"node_modules/draftlog": {
"version": "1.0.13",
"resolved": "https://registry.npmjs.org/draftlog/-/draftlog-1.0.13.tgz",
"integrity": "sha512-GeMWOpXERBpfVDK6v7m0x1hPg8+g8ZsZWqJl2T17wHqrm4h8fnjiZmXcnCrmwogAc6R3YTxFXax15wezfuyCUw==",
"license": "MIT"
},
"node_modules/encoding-sniffer": {
"version": "0.2.1",
"resolved": "https://registry.npmjs.org/encoding-sniffer/-/encoding-sniffer-0.2.1.tgz",
"integrity": "sha512-5gvq20T6vfpekVtqrYQsSCFZ1wEg5+wW0/QaZMWkFr6BqD3NfKs0rLCx4rrVlSWJeZb5NBJgVLswK/w2MWU+Gw==",
"license": "MIT",
"dependencies": {
"iconv-lite": "^0.6.3",
"whatwg-encoding": "^3.1.1"
},
"funding": {
"url": "https://github.com/fb55/encoding-sniffer?sponsor=1"
}
},
"node_modules/entities": {
"version": "4.5.0",
"resolved": "https://registry.npmjs.org/entities/-/entities-4.5.0.tgz",
"integrity": "sha512-V0hjH4dGPh9Ao5p0MoRY6BVqtwCjhz6vI5LT8AJ55H+4g9/4vbHx1I54fS0XuclLhDHArPQCiMjDxjaL8fPxhw==",
"license": "BSD-2-Clause",
"engines": {
"node": ">=0.12"
},
"funding": {
"url": "https://github.com/fb55/entities?sponsor=1"
}
},
"node_modules/fetch-blob": {
"version": "3.2.0",
"resolved": "https://registry.npmjs.org/fetch-blob/-/fetch-blob-3.2.0.tgz",
"integrity": "sha512-7yAQpD2UMJzLi1Dqv7qFYnPbaPx7ZfFK6PiIxQ4PfkGPyNyl2Ugx+a/umUonmKqjhM4DnfbMvdX6otXq83soQQ==",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/jimmywarting"
},
{
"type": "paypal",
"url": "https://paypal.me/jimmywarting"
}
],
"license": "MIT",
"dependencies": {
"node-domexception": "^1.0.0",
"web-streams-polyfill": "^3.0.3"
},
"engines": {
"node": "^12.20 || >= 14.13"
}
},
"node_modules/formdata-polyfill": {
"version": "4.0.10",
"resolved": "https://registry.npmjs.org/formdata-polyfill/-/formdata-polyfill-4.0.10.tgz",
"integrity": "sha512-buewHzMvYL29jdeQTVILecSaZKnt/RJWjoZCF5OW60Z67/GmSLBkOFM7qh1PI3zFNtJbaZL5eQu1vLfazOwj4g==",
"license": "MIT",
"dependencies": {
"fetch-blob": "^3.1.2"
},
"engines": {
"node": ">=12.20.0"
}
},
"node_modules/fs-extra": {
"version": "11.3.3",
"resolved": "https://registry.npmjs.org/fs-extra/-/fs-extra-11.3.3.tgz",
"integrity": "sha512-VWSRii4t0AFm6ixFFmLLx1t7wS1gh+ckoa84aOeapGum0h+EZd1EhEumSB+ZdDLnEPuucsVB9oB7cxJHap6Afg==",
"license": "MIT",
"dependencies": {
"graceful-fs": "^4.2.0",
"jsonfile": "^6.0.1",
"universalify": "^2.0.0"
},
"engines": {
"node": ">=14.14"
}
},
"node_modules/glob": {
"version": "13.0.3",
"resolved": "https://registry.npmjs.org/glob/-/glob-13.0.3.tgz",
"integrity": "sha512-/g3B0mC+4x724v1TgtBlBtt2hPi/EWptsIAmXUx9Z2rvBYleQcsrmaOzd5LyL50jf/Soi83ZDJmw2+XqvH/EeA==",
"license": "BlueOak-1.0.0",
"dependencies": {
"minimatch": "^10.2.0",
"minipass": "^7.1.2",
"path-scurry": "^2.0.0"
},
"engines": {
"node": "20 || >=22"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/graceful-fs": {
"version": "4.2.11",
"resolved": "https://registry.npmjs.org/graceful-fs/-/graceful-fs-4.2.11.tgz",
"integrity": "sha512-RbJ5/jmFcNNCcDV5o9eTnBLJ/HszWV0P73bc+Ff4nS/rJj+YaS6IGyiOL0VoBYX+l1Wrl3k63h/KrH+nhJ0XvQ==",
"license": "ISC"
},
"node_modules/htmlparser2": {
"version": "10.1.0",
"resolved": "https://registry.npmjs.org/htmlparser2/-/htmlparser2-10.1.0.tgz",
"integrity": "sha512-VTZkM9GWRAtEpveh7MSF6SjjrpNVNNVJfFup7xTY3UpFtm67foy9HDVXneLtFVt4pMz5kZtgNcvCniNFb1hlEQ==",
"funding": [
"https://github.com/fb55/htmlparser2?sponsor=1",
{
"type": "github",
"url": "https://github.com/sponsors/fb55"
}
],
"license": "MIT",
"dependencies": {
"domelementtype": "^2.3.0",
"domhandler": "^5.0.3",
"domutils": "^3.2.2",
"entities": "^7.0.1"
}
},
"node_modules/htmlparser2/node_modules/entities": {
"version": "7.0.1",
"resolved": "https://registry.npmjs.org/entities/-/entities-7.0.1.tgz",
"integrity": "sha512-TWrgLOFUQTH994YUyl1yT4uyavY5nNB5muff+RtWaqNVCAK408b5ZnnbNAUEWLTCpum9w6arT70i1XdQ4UeOPA==",
"license": "BSD-2-Clause",
"engines": {
"node": ">=0.12"
},
"funding": {
"url": "https://github.com/fb55/entities?sponsor=1"
}
},
"node_modules/iconv-lite": {
"version": "0.6.3",
"resolved": "https://registry.npmjs.org/iconv-lite/-/iconv-lite-0.6.3.tgz",
"integrity": "sha512-4fCk79wshMdzMp2rH06qWrJE4iolqLhCUH+OiuIgU++RB0+94NlDL81atO7GX55uUKueo0txHNtvEyI6D7WdMw==",
"license": "MIT",
"dependencies": {
"safer-buffer": ">= 2.1.2 < 3.0.0"
},
"engines": {
"node": ">=0.10.0"
}
},
"node_modules/jackspeak": {
"version": "4.2.3",
"resolved": "https://registry.npmjs.org/jackspeak/-/jackspeak-4.2.3.tgz",
"integrity": "sha512-ykkVRwrYvFm1nb2AJfKKYPr0emF6IiXDYUaFx4Zn9ZuIH7MrzEZ3sD5RlqGXNRpHtvUHJyOnCEFxOlNDtGo7wg==",
"license": "BlueOak-1.0.0",
"dependencies": {
"@isaacs/cliui": "^9.0.0"
},
"engines": {
"node": "20 || >=22"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/jsonfile": {
"version": "6.2.0",
"resolved": "https://registry.npmjs.org/jsonfile/-/jsonfile-6.2.0.tgz",
"integrity": "sha512-FGuPw30AdOIUTRMC2OMRtQV+jkVj2cfPqSeWXv1NEAJ1qZ5zb1X6z1mFhbfOB/iy3ssJCD+3KuZ8r8C3uVFlAg==",
"license": "MIT",
"dependencies": {
"universalify": "^2.0.0"
},
"optionalDependencies": {
"graceful-fs": "^4.1.6"
}
},
"node_modules/lru-cache": {
"version": "11.2.6",
"resolved": "https://registry.npmjs.org/lru-cache/-/lru-cache-11.2.6.tgz",
"integrity": "sha512-ESL2CrkS/2wTPfuend7Zhkzo2u0daGJ/A2VucJOgQ/C48S/zB8MMeMHSGKYpXhIjbPxfuezITkaBH1wqv00DDQ==",
"license": "BlueOak-1.0.0",
"engines": {
"node": "20 || >=22"
}
},
"node_modules/minimatch": {
"version": "10.2.3",
"resolved": "https://registry.npmjs.org/minimatch/-/minimatch-10.2.3.tgz",
"integrity": "sha512-Rwi3pnapEqirPSbWbrZaa6N3nmqq4Xer/2XooiOKyV3q12ML06f7MOuc5DVH8ONZIFhwIYQ3yzPH4nt7iWHaTg==",
"license": "BlueOak-1.0.0",
"dependencies": {
"brace-expansion": "^5.0.2"
},
"engines": {
"node": "18 || 20 || >=22"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/minipass": {
"version": "7.1.2",
"resolved": "https://registry.npmjs.org/minipass/-/minipass-7.1.2.tgz",
"integrity": "sha512-qOOzS1cBTWYF4BH8fVePDBOO9iptMnGUEZwNc/cMWnTV2nVLZ7VoNWEPHkYczZA0pdoA7dl6e7FL659nX9S2aw==",
"license": "ISC",
"engines": {
"node": ">=16 || 14 >=14.17"
}
},
"node_modules/node-domexception": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/node-domexception/-/node-domexception-1.0.0.tgz",
"integrity": "sha512-/jKZoMpw0F8GRwl4/eLROPA3cfcXtLApP0QzLmUT/HuPCZWyB7IY9ZrMeKw2O/nFIqPQB3PVM9aYm0F312AXDQ==",
"deprecated": "Use your platform's native DOMException instead",
"funding": [
{
"type": "github",
"url": "https://github.com/sponsors/jimmywarting"
},
{
"type": "github",
"url": "https://paypal.me/jimmywarting"
}
],
"license": "MIT",
"engines": {
"node": ">=10.5.0"
}
},
"node_modules/node-fetch": {
"version": "3.3.2",
"resolved": "https://registry.npmjs.org/node-fetch/-/node-fetch-3.3.2.tgz",
"integrity": "sha512-dRB78srN/l6gqWulah9SrxeYnxeddIG30+GOqK/9OlLVyLg3HPnr6SqOWTWOXKRwC2eGYCkZ59NNuSgvSrpgOA==",
"license": "MIT",
"dependencies": {
"data-uri-to-buffer": "^4.0.0",
"fetch-blob": "^3.1.4",
"formdata-polyfill": "^4.0.10"
},
"engines": {
"node": "^12.20.0 || ^14.13.1 || >=16.0.0"
},
"funding": {
"type": "opencollective",
"url": "https://opencollective.com/node-fetch"
}
},
"node_modules/nth-check": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/nth-check/-/nth-check-2.1.1.tgz",
"integrity": "sha512-lqjrjmaOoAnWfMmBPL+XNnynZh2+swxiX3WUE0s4yEHI6m+AwrK2UZOimIRl3X/4QctVqS8AiZjFqyOGrMXb/w==",
"license": "BSD-2-Clause",
"dependencies": {
"boolbase": "^1.0.0"
},
"funding": {
"url": "https://github.com/fb55/nth-check?sponsor=1"
}
},
"node_modules/package-json-from-dist": {
"version": "1.0.1",
"resolved": "https://registry.npmjs.org/package-json-from-dist/-/package-json-from-dist-1.0.1.tgz",
"integrity": "sha512-UEZIS3/by4OC8vL3P2dTXRETpebLI2NiI5vIrjaD/5UtrkFX/tNbwjTSRAGC/+7CAo2pIcBaRgWmcBBHcsaCIw==",
"license": "BlueOak-1.0.0"
},
"node_modules/parse5": {
"version": "7.3.0",
"resolved": "https://registry.npmjs.org/parse5/-/parse5-7.3.0.tgz",
"integrity": "sha512-IInvU7fabl34qmi9gY8XOVxhYyMyuH2xUNpb2q8/Y+7552KlejkRvqvD19nMoUW/uQGGbqNpA6Tufu5FL5BZgw==",
"license": "MIT",
"dependencies": {
"entities": "^6.0.0"
},
"funding": {
"url": "https://github.com/inikulin/parse5?sponsor=1"
}
},
"node_modules/parse5-htmlparser2-tree-adapter": {
"version": "7.1.0",
"resolved": "https://registry.npmjs.org/parse5-htmlparser2-tree-adapter/-/parse5-htmlparser2-tree-adapter-7.1.0.tgz",
"integrity": "sha512-ruw5xyKs6lrpo9x9rCZqZZnIUntICjQAd0Wsmp396Ul9lN/h+ifgVV1x1gZHi8euej6wTfpqX8j+BFQxF0NS/g==",
"license": "MIT",
"dependencies": {
"domhandler": "^5.0.3",
"parse5": "^7.0.0"
},
"funding": {
"url": "https://github.com/inikulin/parse5?sponsor=1"
}
},
"node_modules/parse5-parser-stream": {
"version": "7.1.2",
"resolved": "https://registry.npmjs.org/parse5-parser-stream/-/parse5-parser-stream-7.1.2.tgz",
"integrity": "sha512-JyeQc9iwFLn5TbvvqACIF/VXG6abODeB3Fwmv/TGdLk2LfbWkaySGY72at4+Ty7EkPZj854u4CrICqNk2qIbow==",
"license": "MIT",
"dependencies": {
"parse5": "^7.0.0"
},
"funding": {
"url": "https://github.com/inikulin/parse5?sponsor=1"
}
},
"node_modules/parse5/node_modules/entities": {
"version": "6.0.1",
"resolved": "https://registry.npmjs.org/entities/-/entities-6.0.1.tgz",
"integrity": "sha512-aN97NXWF6AWBTahfVOIrB/NShkzi5H7F9r1s9mD3cDj4Ko5f2qhhVoYMibXF7GlLveb/D2ioWay8lxI97Ven3g==",
"license": "BSD-2-Clause",
"engines": {
"node": ">=0.12"
},
"funding": {
"url": "https://github.com/fb55/entities?sponsor=1"
}
},
"node_modules/path-scurry": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/path-scurry/-/path-scurry-2.0.1.tgz",
"integrity": "sha512-oWyT4gICAu+kaA7QWk/jvCHWarMKNs6pXOGWKDTr7cw4IGcUbW+PeTfbaQiLGheFRpjo6O9J0PmyMfQPjH71oA==",
"license": "BlueOak-1.0.0",
"dependencies": {
"lru-cache": "^11.0.0",
"minipass": "^7.1.2"
},
"engines": {
"node": "20 || >=22"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/rimraf": {
"version": "6.1.3",
"resolved": "https://registry.npmjs.org/rimraf/-/rimraf-6.1.3.tgz",
"integrity": "sha512-LKg+Cr2ZF61fkcaK1UdkH2yEBBKnYjTyWzTJT6KNPcSPaiT7HSdhtMXQuN5wkTX0Xu72KQ1l8S42rlmexS2hSA==",
"license": "BlueOak-1.0.0",
"dependencies": {
"glob": "^13.0.3",
"package-json-from-dist": "^1.0.1"
},
"bin": {
"rimraf": "dist/esm/bin.mjs"
},
"engines": {
"node": "20 || >=22"
},
"funding": {
"url": "https://github.com/sponsors/isaacs"
}
},
"node_modules/safer-buffer": {
"version": "2.1.2",
"resolved": "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz",
"integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==",
"license": "MIT"
},
"node_modules/showdown": {
"version": "2.1.0",
"resolved": "https://registry.npmjs.org/showdown/-/showdown-2.1.0.tgz",
"integrity": "sha512-/6NVYu4U819R2pUIk79n67SYgJHWCce0a5xTP979WbNp0FL9MN1I1QK662IDU1b6JzKTvmhgI7T7JYIxBi3kMQ==",
"license": "MIT",
"dependencies": {
"commander": "^9.0.0"
},
"bin": {
"showdown": "bin/showdown.js"
},
"funding": {
"type": "individual",
"url": "https://www.paypal.me/tiviesantos"
}
},
"node_modules/undici": {
"version": "7.19.1",
"resolved": "https://registry.npmjs.org/undici/-/undici-7.19.1.tgz",
"integrity": "sha512-Gpq0iNm5M6cQWlyHQv9MV+uOj1jWk7LpkoE5vSp/7zjb4zMdAcUD+VL5y0nH4p9EbUklq00eVIIX/XcDHzu5xg==",
"license": "MIT",
"engines": {
"node": ">=20.18.1"
}
},
"node_modules/universalify": {
"version": "2.0.1",
"resolved": "https://registry.npmjs.org/universalify/-/universalify-2.0.1.tgz",
"integrity": "sha512-gptHNQghINnc/vTGIk0SOFGFNXw7JVrlRUtConJRlvaw6DuX0wO5Jeko9sWrMBhh+PsYAZ7oXAiOnf/UKogyiw==",
"license": "MIT",
"engines": {
"node": ">= 10.0.0"
}
},
"node_modules/web-streams-polyfill": {
"version": "3.3.3",
"resolved": "https://registry.npmjs.org/web-streams-polyfill/-/web-streams-polyfill-3.3.3.tgz",
"integrity": "sha512-d2JWLCivmZYTSIoge9MsgFCZrt571BikcWGYkjC1khllbTeDlGqZ2D8vD8E/lJa8WGWbb7Plm8/XJYV7IJHZZw==",
"license": "MIT",
"engines": {
"node": ">= 8"
}
},
"node_modules/whatwg-encoding": {
"version": "3.1.1",
"resolved": "https://registry.npmjs.org/whatwg-encoding/-/whatwg-encoding-3.1.1.tgz",
"integrity": "sha512-6qN4hJdMwfYBtE3YBTTHhoeuUrDBPZmbQaxWAqSALV/MeEnR5z1xd8UKud2RAkFoPkmB+hli1TZSnyi84xz1vQ==",
"license": "MIT",
"dependencies": {
"iconv-lite": "0.6.3"
},
"engines": {
"node": ">=18"
}
},
"node_modules/whatwg-mimetype": {
"version": "4.0.0",
"resolved": "https://registry.npmjs.org/whatwg-mimetype/-/whatwg-mimetype-4.0.0.tgz",
"integrity": "sha512-QaKxh0eNIi2mE9p2vEdzfagOKHCcj1pJ56EEHGQOVxp8r9/iszLUUV7v89x9O1p/T+NlTM5W7jW6+cz4Fq1YVg==",
"license": "MIT",
"engines": {
"node": ">=18"
}
}
}
}


@@ -1,30 +0,0 @@
{
"name": "awesome-docker-website",
"version": "1.0.0",
"description": "A curated list of Docker resources and projects. Inspired by @sindresorhus and improved by amazing contributors",
"main": "build.js",
"scripts": {
"build": "rimraf ./dist/ && node build.js",
"test-pr": "node tests/pull_request.mjs",
"test": "node tests/test_all.mjs",
"health-check": "node tests/health_check.mjs"
},
"repository": {
"type": "git",
"url": "git+https://github.com/veggiemonk/awesome-docker.git"
},
"author": "Julien Bisconti <julien.bisconti at hotmail dot com>",
"license": "Apache-2.0",
"bugs": {
"url": "https://github.com/veggiemonk/awesome-docker/issues"
},
"homepage": "https://github.com/veggiemonk/awesome-docker#readme",
"dependencies": {
"cheerio": "1.2.0",
"draftlog": "1.0.13",
"fs-extra": "11.3.3",
"node-fetch": "3.3.2",
"rimraf": "6.1.3",
"showdown": "^2.1.0"
}
}


@@ -1,108 +0,0 @@
import fetch from 'node-fetch';
import { isRedirect } from 'node-fetch';
import {readFileSync} from 'fs';
const LINKS_OPTIONS = {
redirect: 'manual',
headers: {
'Content-Type': 'application/json',
'user-agent':
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.149 Safari/537.36',
},
// node-fetch v3 dropped the old `timeout` option; abort via signal instead
signal: AbortSignal.timeout(60000), // 1m
};
const LOG = {
error: (...args) => console.error('❌ ERROR', args),
error_string: (...args) =>
console.error('❌ ERROR', JSON.stringify({ ...args }, null, ' ')),
debug: (...args) => {
if (process.env.DEBUG) console.log('>>> DEBUG: ', { ...args });
},
debug_string: (...args) => {
if (process.env.DEBUG)
console.log('>>> DEBUG: ', JSON.stringify({ ...args }, null, ' '));
},
};
const handleFailure = (error) => {
console.error(`${error.message}: ${error.stack}`, { error });
process.exit(1);
};
process.on('unhandledRejection', handleFailure);
const extract_all_links = (markdown) => {
// if you have a problem and you try to solve it with a regex,
// now you have two problems
// TODO: replace this mess with a markdown parser?
const re = /(((https:(?:\/\/)?)(?:[-;:&=+$,\w]+@)?[A-Za-z0-9.-]+|(?:www\.|[-;:&=+$,\w]+@)[A-Za-z0-9.-]+)((?:\/[+~%/.\w\-_]*)?\??(?:[-+=&;%@.\w_]*)#?(?:[.!/@\-\\\w]*))?)/g;
return markdown.match(re);
};
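As the TODO notes, the regex approach is fragile. A minimal standalone sketch, with the pattern copied verbatim and a made-up markdown input, shows what it actually matches:

```javascript
// Standalone demo of the link-extraction regex above (input is hypothetical).
const re = /(((https:(?:\/\/)?)(?:[-;:&=+$,\w]+@)?[A-Za-z0-9.-]+|(?:www\.|[-;:&=+$,\w]+@)[A-Za-z0-9.-]+)((?:\/[+~%/.\w\-_]*)?\??(?:[-+=&;%@.\w_]*)#?(?:[.!/@\-\\\w]*))?)/g;
const md = '- [Docs](https://docs.docker.com/get-started/) and [repo](https://github.com/veggiemonk/awesome-docker)';
// Only the URLs are matched, not the link text.
const links = md.match(re);
console.log(links);
```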
const find_duplicates = (arr) => {
const hm = {};
const dup = [];
arr.forEach((e) => {
if (hm[e]) dup.push(e);
else hm[e] = true;
});
return dup;
};
const partition = (arr, func) => {
const ap = [[], []];
arr.forEach((e) => (func(e) ? ap[0].push(e) : ap[1].push(e)));
return ap;
};
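A quick standalone sketch of the two helpers above (both copied verbatim so the snippet runs on its own, with toy inputs):

```javascript
// find_duplicates: first occurrence is kept, repeats are collected.
const find_duplicates = (arr) => {
  const hm = {};
  const dup = [];
  arr.forEach((e) => {
    if (hm[e]) dup.push(e);
    else hm[e] = true;
  });
  return dup;
};
// partition: split an array into [matching, non-matching] by a predicate.
const partition = (arr, func) => {
  const ap = [[], []];
  arr.forEach((e) => (func(e) ? ap[0].push(e) : ap[1].push(e)));
  return ap;
};
const dups = find_duplicates(['a', 'b', 'a']);
const [evens, odds] = partition([1, 2, 3, 4], (n) => n % 2 === 0);
console.log(dups, evens, odds); // [ 'a' ] [ 2, 4 ] [ 1, 3 ]
```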
async function fetch_link(url) {
try {
const { headers, ok, status, statusText } = await fetch(url, LINKS_OPTIONS);
const redirect = isRedirect(status) ? { redirect: { src: url, dst: headers.get("location") } } : {};
return [url, { ok, status: statusText, ...redirect }];
} catch (error) {
return [url, { ok: false, status: error.message }];
}
}
async function batch_fetch({ arr, get, post_filter_func, BATCH_SIZE = 8 }) {
const result = [];
/* eslint-disable no-await-in-loop */
for (let i = 0; i < arr.length; i += BATCH_SIZE) {
const batch = arr.slice(i, i + BATCH_SIZE);
LOG.debug_string({ batch });
let res = await Promise.all(batch.map(get));
console.log(`batch fetched...${Math.min(i + BATCH_SIZE, arr.length)} of ${arr.length}`);
res = post_filter_func ? res.filter(post_filter_func) : res;
LOG.debug_string({ res });
result.push(...res);
}
return result;
}
const data = readFileSync('./tests/exclude_in_test.json')
const exclude = JSON.parse(data)
const exclude_from_list = (link) => exclude.some((prefix) => link.startsWith(prefix));
export default {
LOG,
handleFailure,
extract_all_links,
find_duplicates,
partition,
fetch_link,
batch_fetch,
exclude_from_list,
};


@@ -1,17 +0,0 @@
[
"https://vimeo.com",
"https://travis-ci.org/veggiemonk/awesome-docker.svg",
"https://github.com/apps/",
"https://twitter.com",
"https://www.meetup.com/",
"https://cycle.io/",
"https://www.manning.com/",
"https://deepfence.io",
"https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg",
"https://www.se-radio.net/2017/05/se-radio-episode-290-diogo-monica-on-docker-security",
"https://www.reddit.com/r/docker/",
"https://www.udacity.com/course/scalable-microservices-with-kubernetes--ud615",
"https://www.youtube.com/playlist",
"https://www.aquasec.com",
"https://cloudsmith.com"
]


@@ -1,206 +0,0 @@
import fs from 'fs-extra';
import fetch from 'node-fetch';
import helper from './common.mjs';
const README = 'README.md';
const GITHUB_GQL_API = 'https://api.github.com/graphql';
const TOKEN = process.env.GITHUB_TOKEN || '';
if (!TOKEN) {
console.error('GITHUB_TOKEN environment variable is required');
process.exit(1);
}
const Authorization = `token ${TOKEN}`;
const LOG = {
info: (...args) => console.log(' ', ...args),
warn: (...args) => console.warn('⚠️ ', ...args),
error: (...args) => console.error('❌', ...args),
};
// Extract GitHub repos from links
const extract_repos = (arr) =>
arr
.map((e) => e.slice('https://github.com/'.length).split('/'))
.filter((r) => r.length === 2 && r[1] !== '');
// Generate GraphQL query to check repo health
const generate_health_query = (repos) => {
const repoQueries = repos.map(([owner, name]) => {
const safeName = `repo_${owner.replace(/(-|\.)/g, '_')}_${name.replace(/(-|\.)/g, '_')}`;
return `${safeName}: repository(owner: "${owner}", name:"${name}"){
nameWithOwner
isArchived
pushedAt
createdAt
stargazerCount
forkCount
isDisabled
isFork
isLocked
isPrivate
}`;
}).join('\n');
return `query REPO_HEALTH { ${repoQueries} }`;
};
// Batch repos into smaller chunks for GraphQL
function* batchRepos(repos, size = 50) {
for (let i = 0; i < repos.length; i += size) {
yield repos.slice(i, i + size);
}
}
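The generator above yields fixed-size slices so each GraphQL request stays small; a standalone sketch with a toy batch size:

```javascript
// Same generator as above, exercised with a small hypothetical batch size.
function* batchRepos(repos, size = 50) {
  for (let i = 0; i < repos.length; i += size) {
    yield repos.slice(i, i + size);
  }
}
// The last batch is simply shorter when the input doesn't divide evenly.
const batches = [...batchRepos([1, 2, 3, 4, 5], 2)];
console.log(batches); // [ [ 1, 2 ], [ 3, 4 ], [ 5 ] ]
```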
async function checkRepoHealth(repos) {
const results = {
archived: [],
stale: [], // No commits in 2+ years
inactive: [], // No commits in 1-2 years
healthy: [],
disabled: [],
total: repos.length,
};
const twoYearsAgo = new Date();
twoYearsAgo.setFullYear(twoYearsAgo.getFullYear() - 2);
const oneYearAgo = new Date();
oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);
LOG.info(`Checking health of ${repos.length} repositories...`);
for (const batch of batchRepos(repos)) {
const query = generate_health_query(batch);
const options = {
method: 'POST',
headers: {
Authorization,
'Content-Type': 'application/json',
},
body: JSON.stringify({ query }),
};
try {
const response = await fetch(GITHUB_GQL_API, options);
const data = await response.json();
if (data.errors) {
LOG.error('GraphQL errors:', data.errors);
continue;
}
for (const repo of Object.values(data.data)) {
if (!repo) continue;
const pushedAt = new Date(repo.pushedAt);
const repoInfo = {
name: repo.nameWithOwner,
pushedAt: repo.pushedAt,
stars: repo.stargazerCount,
url: `https://github.com/${repo.nameWithOwner}`,
};
if (repo.isArchived) {
results.archived.push(repoInfo);
} else if (repo.isDisabled) {
results.disabled.push(repoInfo);
} else if (pushedAt < twoYearsAgo) {
results.stale.push(repoInfo);
} else if (pushedAt < oneYearAgo) {
results.inactive.push(repoInfo);
} else {
results.healthy.push(repoInfo);
}
}
} catch (error) {
LOG.error('Batch fetch error:', error.message);
}
// Rate limiting - wait a bit between batches
await new Promise(resolve => setTimeout(resolve, 1000));
}
return results;
}
function generateReport(results) {
const report = [];
report.push('# 🏥 Awesome Docker - Health Check Report\n');
report.push(`**Generated:** ${new Date().toISOString()}\n`);
report.push(`**Total Repositories:** ${results.total}\n`);
report.push('\n## 📊 Summary\n');
report.push(`- ✅ Healthy (updated in last year): ${results.healthy.length}`);
report.push(`- ⚠️ Inactive (1-2 years): ${results.inactive.length}`);
report.push(`- 🪦 Stale (2+ years): ${results.stale.length}`);
report.push(`- 📦 Archived: ${results.archived.length}`);
report.push(`- 🚫 Disabled: ${results.disabled.length}\n`);
if (results.archived.length > 0) {
report.push('\n## 📦 Archived Repositories (Should mark as :skull:)\n');
results.archived.forEach(repo => {
report.push(`- [${repo.name}](${repo.url}) - ⭐ ${repo.stars} - Last push: ${repo.pushedAt}`);
});
}
if (results.stale.length > 0) {
report.push('\n## 🪦 Stale Repositories (No activity in 2+ years)\n');
results.stale.slice(0, 50).forEach(repo => {
report.push(`- [${repo.name}](${repo.url}) - ⭐ ${repo.stars} - Last push: ${repo.pushedAt}`);
});
if (results.stale.length > 50) {
report.push(`\n... and ${results.stale.length - 50} more`);
}
}
if (results.inactive.length > 0) {
report.push('\n## ⚠️ Inactive Repositories (No activity in 1-2 years)\n');
report.push('_These may still be stable/complete projects - review individually_\n');
results.inactive.slice(0, 30).forEach(repo => {
report.push(`- [${repo.name}](${repo.url}) - ⭐ ${repo.stars} - Last push: ${repo.pushedAt}`);
});
if (results.inactive.length > 30) {
report.push(`\n... and ${results.inactive.length - 30} more`);
}
}
return report.join('\n');
}
async function main() {
const markdown = await fs.readFile(README, 'utf8');
let links = helper.extract_all_links(markdown);
const github_links = links.filter(link =>
link.startsWith('https://github.com') &&
!helper.exclude_from_list(link) &&
!link.includes('/issues') &&
!link.includes('/pull') &&
!link.includes('/wiki') &&
!link.includes('#')
);
const repos = extract_repos(github_links);
const results = await checkRepoHealth(repos);
const report = generateReport(results);
// Save report
await fs.writeFile('HEALTH_REPORT.md', report);
LOG.info('Health report saved to HEALTH_REPORT.md');
// Also print summary to console
console.log('\n' + report);
// Exit with error if there are actionable items
if (results.archived.length > 0 || results.stale.length > 10) {
LOG.warn(`Found ${results.archived.length} archived and ${results.stale.length} stale repos`);
process.exit(1);
}
}
console.log('Starting health check...');
main();


@@ -1,69 +0,0 @@
import fs from 'fs-extra';
import helper from './common.mjs';
console.log({
DEBUG: process.env.DEBUG || false,
});
const README = 'README.md';
async function main() {
const has_error = {
show: false,
duplicates: '',
other_links_error: '',
};
const markdown = await fs.readFile(README, 'utf8');
let links = helper.extract_all_links(markdown);
links = links.filter((l) => !helper.exclude_from_list(l)); // exclude websites
helper.LOG.debug_string({ links });
console.log(`total links to check ${links.length}`);
console.log('checking for duplicate links...');
const duplicates = helper.find_duplicates(links);
if (duplicates.length > 0) {
has_error.show = true;
has_error.duplicates = duplicates;
}
helper.LOG.debug_string({ duplicates });
const [github_links, external_links] = helper.partition(links, (link) =>
link.startsWith('https://github.com'),
);
console.log(`checking ${external_links.length} external links...`);
const external_links_error = await helper.batch_fetch({
arr: external_links,
get: helper.fetch_link,
post_filter_func: (x) => !x[1].ok,
BATCH_SIZE: 8,
});
if (external_links_error.length > 0) {
has_error.show = true;
has_error.other_links_error = external_links_error;
}
console.log(`checking ${github_links.length} GitHub repositories...`);
console.log(
`skipping GitHub repository check. Run "npm run test" to execute them manually.`,
);
console.log({
TEST_PASSED: !has_error.show,
EXTERNAL_LINKS: external_links.length,
});
if (has_error.show) {
helper.LOG.error_string(has_error);
process.exit(1);
}
}
console.log('starting...');
main().catch((error) => {
console.error('Fatal error:', error);
process.exit(1);
});


@@ -1,127 +0,0 @@
import fs from 'fs-extra';
import fetch from 'node-fetch';
import helper from './common.mjs';
function envvar_undefined(variable_name) {
throw new Error(`${variable_name} must be defined`);
}
console.log({
DEBUG: process.env.DEBUG || false,
});
const README = 'README.md';
const GITHUB_GQL_API = 'https://api.github.com/graphql';
const TOKEN = process.env.GITHUB_TOKEN || envvar_undefined('GITHUB_TOKEN');
const Authorization = `token ${TOKEN}`;
const make_GQL_options = (query) => ({
method: 'POST',
headers: {
Authorization,
'Content-Type': 'application/json',
'user-agent':
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.149 Safari/537.36',
},
body: JSON.stringify({ query }),
});
const extract_repos = (arr) =>
arr
.map((e) => e.slice('https://github.com/'.length).split('/'))
.filter((r) => r.length === 2 && r[1] !== '');
const generate_GQL_query = (arr) =>
`query AWESOME_REPOS{ ${arr
.map(
([owner, name]) =>
`repo_${owner.replace(/(-|\.)/g, '_')}_${name.replace(
/(-|\.)/g,
'_',
)}: repository(owner: "${owner}", name:"${name}"){ nameWithOwner isArchived } `,
)
.join('')} }`;
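The alias sanitization used in both query builders rewrites `-` and `.` to `_`, since GraphQL aliases may only contain `[_A-Za-z0-9]`; a minimal sketch with hypothetical owner/name values:

```javascript
// GraphQL names must match [_A-Za-z][_0-9A-Za-z]*, so "-" and "." are rewritten.
const safe = (s) => s.replace(/(-|\.)/g, '_');
const alias = `repo_${safe('my-org')}_${safe('repo.name')}`;
console.log(alias); // repo_my_org_repo_name
```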
async function main() {
const has_error = {
show: false,
duplicates: '',
other_links_error: '',
github_repos: '',
};
const markdown = await fs.readFile(README, 'utf8');
let links = helper.extract_all_links(markdown);
links = links.filter((l) => !helper.exclude_from_list(l)); // exclude websites
helper.LOG.debug_string({ links });
console.log(`total links to check ${links.length}`);
console.log('checking for duplicate links...');
const duplicates = helper.find_duplicates(links);
if (duplicates.length > 0) {
has_error.show = true;
has_error.duplicates = duplicates;
}
helper.LOG.debug_string({ duplicates });
const [github_links, external_links] = helper.partition(links, (link) =>
link.startsWith('https://github.com'),
);
console.log(`checking ${external_links.length} external links...`);
const external_links_error = await helper.batch_fetch({
arr: external_links,
get: helper.fetch_link,
post_filter_func: (x) => !x[1].ok,
BATCH_SIZE: 8,
});
if (external_links_error.length > 0) {
has_error.show = true;
has_error.other_links_error = external_links_error;
}
console.log(`checking ${github_links.length} GitHub repositories...`);
const repos = extract_repos(github_links);
const query = generate_GQL_query(repos);
const options = make_GQL_options(query);
const gql_response = await fetch(GITHUB_GQL_API, options).then((r) =>
r.json(),
);
if (gql_response.errors) {
has_error.show = true;
has_error.github_repos = gql_response.errors;
}
// Check for archived repositories
console.log('checking for archived repositories...');
const archived_repos = [];
if (gql_response.data) {
for (const repo of Object.values(gql_response.data)) {
if (repo && repo.isArchived) {
archived_repos.push(repo.nameWithOwner);
}
}
}
if (archived_repos.length > 0) {
console.warn(`⚠️ Found ${archived_repos.length} archived repositories that should be marked with :skull:`);
console.warn('Archived repos:', archived_repos);
// Don't fail the build, just warn
}
console.log({
TEST_PASSED: !has_error.show,
GITHUB_REPOSITORY: github_links.length,
EXTERNAL_LINKS: external_links.length,
});
if (has_error.show) {
helper.LOG.error_string(has_error);
process.exit(1);
}
}
console.log('starting...');
main();