Compare commits: purge-arch...master (85 commits)

Commits:

bba85d0549 af1c5695c6 a4224eaa56 264f6bd0f6 b1f074cc5d
b30025b291 270bcc2d58 5e90acd5b3 a68d6f826d 05266bd8ac
26e2d664ec 9a5331398d c14a071c8d e6e23bf1cf 682cffc308
89d625229b 927c3124b8 8dc187abc3 0089da7ecb ae81c12fc5
ca2246667c 67cc5caaa5 a5e2b831b5 1151490ae8 dc01ff137e
0f00303939 0bea35576a 8bb3147e4e 8956f1d292 0816049273
e5d5594775 83211a4923 ed04ff4017 804da83d7b bc46effe08
ddc32f45d0 a49077803c f2680c6221 08da394e71 bae26ccdb1
d0538ddf8b 5a83ccefa9 5f34822b78 60c1df78f8 3159928d85
24cd357bb7 26e89f6467 6406f329ad 843c618e01 914d188dcf
7eddf73cf0 6ceee1e063 0b51a5e7c3 bf75f67ac5 5ab9f7d46a
c39e924cf9 ee4634e435 ecf0a3a94f 2cd6c04450 f9b727041d
f55163208a 3cf210a0dc f9d4f888eb 301e019185 4df86b48d6
f421d82354 a301665db5 cbf1df2432 9527d7e290 a4c8146fd0
24248c4aad 88da43ac4d 1736a09b8c 2edc4efb02 ce01549b76
4365494c73 2cf19bc7ac fd885ff12f 3e73b8444a 8c0fd0b960
f6188cc028 0d32406949 2860a20ad6 ed3af6676b 05737bcde8
.claude/settings.local.json (new file, 14 lines)

```diff
@@ -0,0 +1,14 @@
+{
+  "permissions": {
+    "allow": [
+      "mcp__plugin_context-mode_context-mode__ctx_batch_execute",
+      "mcp__plugin_context7_context7__resolve-library-id",
+      "mcp__plugin_context7_context7__query-docs",
+      "Bash(go:*)",
+      "Bash(./awesome-docker:*)",
+      "Bash(tmux send-keys:*)",
+      "Bash(tmux capture-pane:*)",
+      "Bash(tmux:*)"
+    ]
+  }
+}
```
.github/CODEOWNERS (2 changed lines)

```diff
@@ -1 +1 @@
-*.md @veggiemonk @agebhar1 @dmitrytokarev @gesellix @mashb1t @moshloop @vegasbrianc @noteed
+* @veggiemonk @agebhar1 @dmitrytokarev @gesellix @mashb1t @moshloop @vegasbrianc @noteed
```
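The CODEOWNERS change widens review ownership from Markdown files to every path: CODEOWNERS entries use gitignore-style glob patterns, and `*` matches all files where `*.md` matched only Markdown. The `matches` helper below is a hypothetical illustration of that scope difference using shell case-globbing, not part of the repository.

```shell
# Hypothetical helper: does a gitignore-style glob match a filename?
# Shell `case` patterns behave the same way for these simple globs.
matches() {
  pattern=$1
  path=$2
  case "$path" in
    $pattern) echo "yes" ;;
    *)        echo "no" ;;
  esac
}

matches '*.md' 'README.md'   # -> yes  (old pattern covered Markdown)
matches '*.md' 'main.go'     # -> no   (but not Go sources)
matches '*'    'main.go'     # -> yes  (new pattern covers everything)
```

So with the new `*` rule the listed owners are requested for review on any change, not only documentation edits.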
.github/CONTRIBUTING.md (116 changed lines)

````diff
@@ -1,94 +1,56 @@
 # Contributing to awesome-docker
 
-First: if you're unsure or afraid of anything, just ask or submit the issue or pull request anyways. You won't be yelled at for giving your best effort. The worst that can happen is that you'll be politely asked to change something. We appreciate any sort of contributions, and don't want a wall of rules to get in the way of that.
+Thanks for taking the time to contribute.
 
-However, for those individuals who want a bit more guidance on the best way to contribute to the project, read on. This document will cover what we're looking for. By addressing all the points we're looking for, it raises the chances we can quickly merge or address your contributions.
+This repository is a curated list of Docker/container resources plus a Go-based maintenance CLI used by CI. Contributions are welcome for both content and tooling.
 
-We appreciate and recognize [all contributors](https://github.com/veggiemonk/awesome-docker/graphs/contributors).
+Please read and follow the [Code of Conduct](./CODE_OF_CONDUCT.md).
 
-Please note that this project is released with a [Contributor Code of Conduct](https://github.com/veggiemonk/awesome-docker/blob/master/.github/CODE_OF_CONDUCT.md). By participating in this project you agree to abide by its terms.
+## What We Accept
 
-# Table of Contents
+- New high-quality Docker/container-related projects
+- Fixes to descriptions, ordering, or categorization
+- Removal of broken, archived, deprecated, or duplicate entries
+- Improvements to the Go CLI and GitHub workflows
 
-- [Mission Statement](#mission-statement)
-- [Quality Standards](#quality-standards)
-- [Contribution Guidelines](#contribution-guidelines)
-- [New Collaborators](#new-collaborators)
+## README Entry Rules
 
-# Mission Statement
+- Use one link per entry.
+- Prefer GitHub project/repository URLs over marketing pages.
+- Keep entries alphabetically sorted within their section.
+- Keep descriptions concise and concrete.
+- Use `:yen:` for paid/commercial services.
+- Use `:ice_cube:` for stale projects (2+ years inactive).
+- Do not use `:skull:`; archived/deprecated projects should be removed.
+- Avoid duplicate links and redirect variants.
 
-`awesome-docker` is a hand-crafted list for high-quality information about Docker and its resources. It should be related or compatible with Docker or containers. If it's just an image built on top of Docker, the project possibly belongs to other [awesome lists](https://github.com/sindresorhus/awesome). You can check the [awesome-selfhosted list](https://github.com/Kickball/awesome-selfhosted) or the [awesome-sysadmin list](https://github.com/n1trux/awesome-sysadmin) as well.
-If it's a **tutorial or a blog post**, they get outdated really quickly so we don't really put them on the list but if it is on a very advanced and/or specific topic, we will consider it!
-If something is awesome, share it (pull request or [issue](https://github.com/veggiemonk/awesome-docker/issues/new) or [chat](https://gitter.im/veggiemonk/awesome-docker)), let us know why and we will help you!
+## Local Validation
 
-# Quality Standards
+```bash
+# Build CLI
+make build
 
-Note that we can help you achieve those standards, just try your best and be brave.
-We'll guide you to the best of our abilities.
+# Validate README formatting and content
+make lint
 
-To be on the list, it would be **nice** if entries adhere to these quality standards:
+# Run code tests (when touching Go code)
+make test
 
-- It should take less than 20 sec to find what is the project, how to install it and how to use it.
-- Generally useful to the community.
-- A project on GitHub with a well documented `README.md` file and plenty of examples is considered high quality.
-- Clearly stating if an entry is related to (Linux) containers and not to Docker. There is an [awesome list](https://github.com/Friz-zy/awesome-linux-containers) for that.
-- Clearly stating "what is it" i.e. which category it belongs to.
-- Clearly stating "what is it for" i.e. mention a real problem it solves (even a small one). Make it clear for the next person.
-- If it is a **WIP** (work in progress, not safe for production), please mention it. (Remember the time before Docker 1.0 ? ;-) )
-- Always put the link to the GitHub project instead of the website!
+# Optional: full external checks (requires GITHUB_TOKEN)
+./awesome-docker check
+./awesome-docker validate
+```
 
-To be on the list, the project **must** have:
+## Pull Request Expectations
 
-- How to setup/install the project
-- How to use the project (examples)
+- Keep the PR focused to one logical change.
+- Explain what changed and why.
+- If adding entries, include the target category.
+- If removing entries, explain why (archived, broken, duplicate, etc.).
+- Fill in the PR template checklist.
 
-If your PR is not merged, we will tell you why so that you may be able to improve it.
-But usually, we are pretty relaxed people, so just come and say hi, we'll figure something out together.
+## Maintainer Notes
 
-# Contribution Guidelines
-
-## I want to share a project, what should I do?
-
-- **Adding to the list:** Submit a pull request or open an [issue](https://github.com/veggiemonk/awesome-docker/issues/new)
-- **Removing from the list:** Submit a pull request or open an [issue](https://github.com/veggiemonk/awesome-docker/issues/new)
-- Changing something else: Submit a pull request or open an [issue](https://github.com/veggiemonk/awesome-docker/issues/new)
-- Don't know what to do: Open an [issue](https://github.com/veggiemonk/awesome-docker/issues/new) or join our [chat](https://gitter.im/veggiemonk/awesome-docker), let us know what's going on.
-
-**join the chat:**
-
-[](https://gitter.im/veggiemonk/awesome-docker)
-
-or you can
-
-**ping us on Twitter:**
-
-* [veggiemonk](https://twitter.com/veggiemonk)
-* [idomyowntricks](https://twitter.com/idomyowntricks)
-* [gesellix](https://twitter.com/gesellix)
-* [dmitrytokarev](https://twitter.com/dmitrytok)
-
-### Rules for Pull Request
-
-- Each item should be limited to one link, no duplicates, no redirection (careful with `http` vs `https`!)
-- The link should be the name of the package or project or website
-- Description should be clear and concise (read it out loud to be sure)
-- Description should follow the link, on the same line
-- Entries are listed alphabetically, please respect the order
-- If you want to add more than one link, please don't do all PR on the exact same line, it usually results in conflicts and your PR cannot be automatically merged...
-
-Please contribute links to packages/projects you have used or are familiar with. This will help ensure high-quality entries.
-
-#### Your commit message will be a [tweet](https://twitter.com/awesome_docker) so write a [good commit message](https://chris.beams.io/posts/git-commit/), keep that in mind :)
-
-# New Collaborators
-
-If you just joined the team of maintainers for this repo, first of all: WELCOME!
-
-If it is your first time maintaining an open source project, read the [best practice guides for maintainers](https://opensource.guide/best-practices/).
-
-Here are the few things you need to know:
-* We don't push directly to the master branch. Every entry **MUST** be reviewed!
-* Each entry should be in accordance to this quality standards and contribution guidelines.
-* To ask a contributor to make a change, just copy paste this message [here](https://github.com/veggiemonk/awesome-docker/pull/289#issuecomment-285608004) and change few things like names and stuff. **The main idea is to help people making great projects.**
-* If something seems weird, i.e. if you don't understand what a project does or the documentation is poor, don't hesitate to (nicely) ask for more explanation (see previous point).
-* Say thank you to people who contribute to this project! It may not seems like much but respect and gratitude are important :D
+- Changes should be reviewed before merge.
+- Prefer helping contributors improve a PR over silently rejecting it.
+- Keep `.github` documentation and workflows aligned with current tooling.
````
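The new Local Validation steps in CONTRIBUTING.md compose into a single pre-PR script. The sketch below only echoes each command through a `run` wrapper so it is self-contained; in a real checkout you would drop the wrapper and execute the commands directly. The token check mirrors the note that the external checks require `GITHUB_TOKEN`.

```shell
# Dry-run sketch of the local validation sequence described above.
# `run` only prints the command, so this sketch works outside the repo.
run() { echo "would run: $*"; }

run make build              # compile the Go CLI to ./awesome-docker
run make lint               # validate README formatting and content
run make test               # when touching Go code

# External checks need a GitHub token; skip them when it is absent.
if [ -n "${GITHUB_TOKEN:-}" ]; then
  run ./awesome-docker check
  run ./awesome-docker validate
fi
```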
.github/ISSUE_TEMPLATE.md (deleted, 15 lines)

````diff
@@ -1,15 +0,0 @@
-Hi,
-
-I would like to add a link.
-
-**REPO**:
-
-**DESCRIPTION**:
-
-**AUTHOR**:
-
-Or directly write it:
-
-```markdown
-[REPO](https://github.com/AUTHOR/REPO) - DESCRIPTION. By [@AUTHOR](https://github.com/AUTHOR)
-```
````
.github/ISSUE_TEMPLATE/add-a-project.md (new file, 21 lines)

````diff
@@ -0,0 +1,21 @@
+---
+name: Add a project
+about: Add a new project to the list
+title: "add: [PROJECT_NAME] in [SECTION_NAME]"
+labels: pending-evaluation
+assignees: ''
+---
+
+Category:
+Repository link:
+Description (one sentence):
+Author:
+Why this should be in the list:
+Notes (`:yen:` if relevant):
+
+Or directly write it:
+
+```markdown
+[REPO](https://github.com/AUTHOR/REPO) - DESCRIPTION.
+```
````
.github/MAINTENANCE.md (137 changed lines)

````diff
@@ -1,116 +1,81 @@
-# 🔧 Maintenance Guide for Awesome Docker
+# Maintenance Guide
 
-This guide helps maintainers keep the awesome-docker list up-to-date and high-quality.
+This guide describes how maintainers keep the list and automation healthy.
 
-## 🤖 Automated Systems
+## Automated Workflows
 
-### Weekly Health Reports
-- **What**: Checks all GitHub repositories for activity, archived status, and maintenance
-- **When**: Every Monday at 9 AM UTC
-- **Where**: Creates/updates a GitHub issue with label `health-report`
-- **Action**: Review the report and mark abandoned projects with `:skull:`
+### Pull Requests / Weekly QA (`pull_request.yml`)
 
-### Broken Links Detection
-- **What**: Tests all links in README.md for availability
-- **When**: Every Saturday at 2 AM UTC + on every PR
-- **Where**: Creates/updates a GitHub issue with label `broken-links`
-- **Action**: Fix or remove broken links, or add to exclusion list
+- Runs on pull requests and weekly on Saturday.
+- Builds the Go CLI and runs `./awesome-docker validate`.
 
-### PR Validation
-- **What**: Checks for duplicate links and basic validation
-- **When**: On every pull request
-- **Action**: Automated - contributors see results immediately
+### Broken Links Report (`broken_links.yml`)
 
-## 📋 Manual Maintenance Tasks
+- Runs weekly on Saturday and on manual trigger.
+- Executes `./awesome-docker check`.
+- Opens/updates a `broken-links` issue when problems are found.
 
-### Monthly Review (First Monday of the month)
-1. Check health report issue for archived/stale projects
-2. Mark archived projects with `:skull:` in README.md
-3. Review projects with 2+ years of inactivity
-4. Remove projects that are truly abandoned/broken
+### Weekly Health Report (`health_report.yml`)
 
-### Quarterly Deep Dive (Every 3 months)
-1. Run: `npm run health-check` for detailed report
-2. Review project categories - are they still relevant?
-3. Check for popular new Docker tools to add
-4. Update documentation links if newer versions exist
+- Runs weekly on Monday and on manual trigger.
+- Executes `./awesome-docker health` then `./awesome-docker report`.
+- Opens/updates a `health-report` issue.
 
-### Annual Cleanup (January)
-1. Remove all `:skull:` projects older than 1 year
-2. Review CONTRIBUTING.md guidelines
-3. Update year references in documentation
-4. Check Node.js version requirements
+### Deploy to GitHub Pages (`deploy-pages.yml`)
 
-## 🛠️ Maintenance Commands
+- Runs on pushes to `master` and manual trigger.
+- Builds website with `./awesome-docker build` and publishes `website/`.
+
+## Day-to-Day Commands
 
 ```bash
-# Test all links (requires GITHUB_TOKEN)
-npm test
+# Build CLI
+make build
 
-# Test PR changes only
-npm run test-pr
+# README lint/validation
+make lint
 
-# Generate health report (requires GITHUB_TOKEN)
-npm run health-check
+# Auto-fix formatting issues
+./awesome-docker lint --fix
 
-# Build the website
-npm run build
-
-# Update dependencies
-npm update
+# Link checks and health checks (requires GITHUB_TOKEN)
+make check
+make health
+make report
 ```
 
-## 📊 Quality Standards
+## Content Maintenance Policy
 
-### Adding New Projects
-- Must have clear documentation (README with install/usage)
-- Should have activity within last 18 months
-- GitHub project preferred over website links
-- Must be Docker/container-related
+- Remove archived/deprecated projects instead of tagging them.
+- Remove broken links that cannot be fixed.
+- Keep sections alphabetically sorted.
+- Keep descriptions short and actionable.
 
-### Marking Projects as Abandoned
-Use `:skull:` emoji when:
-- Repository is archived on GitHub
-- No commits for 2+ years
-- Project explicitly states it's deprecated
-- Maintainer confirms abandonment
+## Suggested Review Cadence
 
-### Removing Projects
-Only remove (don't just mark `:skull:`):
-- Broken/404 links that can't be fixed
-- Duplicate entries
-- Spam or malicious projects
-- Projects that never met quality standards
+### Weekly
 
-## 🚨 Emergency Procedures
+- Triage open `broken-links` and `health-report` issues.
+- Merge straightforward quality PRs.
 
-### Critical Broken Links
-If important resources are down:
-1. Check if they moved (update URL)
-2. Search for alternatives
-3. Check Internet Archive for mirrors
-4. Temporarily comment out until resolved
+### Monthly
 
-### Spam Pull Requests
-1. Close immediately
-2. Mark as spam
-3. Block user if repeated offense
-4. Don't engage in comments
+- Review sections for stale/duplicate entries.
+- Re-run `check` and `health` manually if needed.
 
-## 📈 Metrics to Track
+### Quarterly
 
-- Total projects: ~731 GitHub repos
-- Health status: aim for <5% archived
-- Link availability: aim for >98% working
-- PR merge time: aim for <7 days
-- Weekly contributor engagement
+- Review `.github` docs and templates for drift.
+- Confirm workflows still match repository tooling and policies.
 
-## 🤝 Getting Help
+## Contributor Support
 
-- Open a discussion in GitHub Discussions
-- Check AGENTS.md for AI assistant guidelines
-- Review CONTRIBUTING.md for contributor info
+When requesting PR changes, be explicit and actionable:
 
+- point to section/order problems,
+- explain why a link should be removed,
+- suggest exact wording when description quality is the issue.
 
 ---
 
-*Last updated: 2025-10-01*
+Last updated: 2026-02-27
````
.github/PULL_REQUEST_TEMPLATE.md (58 changed lines)

```diff
@@ -1,48 +1,28 @@
-<!-- Congrats on creating an Awesome Docker entry! 🎉 -->
+# Summary
 
-<!-- **Remember that entries are ordered alphabetically** -->
+Describe what changed and why.
 
-# TLDR
-* all entries sorted alphabetically (from A to Z),
-* If paying service add :heavy_dollar_sign:
-* If WIP add :construction:
-* clear and short description of the project
-* project MUST have: How to setup/install
-* project MUST have: How to use (examples)
-* we can help you get there :)
+## Scope
 
-## Quality Standards
+- [ ] README entries/content
+- [ ] Go CLI/tooling
+- [ ] GitHub workflows or `.github` docs
 
-Note that we can help you achieve those standards, just try your best and be brave.
-We'll guide you to the best of our abilities.
+## If This PR Adds/Edits README Entries
 
-To be on the list, it would be **nice** if entries adhere to these quality standards:
+- Category/section touched:
+- New or updated project links:
 
-- It should take less than 20 sec to find what is the project, how to install it and how to use it.
-- Generally useful to the community.
-- A project on GitHub with a well documented `README.md` file and plenty of examples is considered high quality.
-- Clearly stating if an entry is related to (Linux) containers and not to Docker. There is an [awesome list](https://github.com/Friz-zy/awesome-linux-containers) for that.
-- Clearly stating "what is it" i.e. which category it belongs to.
-- Clearly stating "what is it for" i.e. mention a real problem it solves (even a small one). Make it clear for the next person.
-- If it is a **WIP** (work in progress, not safe for production), please mention it. (Remember the time before Docker 1.0 ? ;-) )
-- Always put the link to the GitHub project instead of the website!
+## Validation
 
-To be on the list, the project **must** have:
+- [ ] `make lint`
+- [ ] `make test` (if Go code changed)
+- [ ] `./awesome-docker check` (if `GITHUB_TOKEN` available)
 
-- How to setup/install the project
-- How to use the project (examples)
-
-If your PR is not merged, we will tell you why so that you may be able to improve it.
-But usually, we are pretty relaxed people, so just come and say hi, we'll figure something out together.
-
-# Rules for Pull Request
-
-- Each item should be limited to one link, no duplicates, no redirection (careful with `http` vs `https`!)
-- The link should be the name of the package or project or website
-- Description should be clear and concise (read it out loud to be sure)
-- Description should follow the link, on the same line
-- Entries are listed alphabetically, please respect the order
-- If you want to add more than one link, please don't do all PR on the exact same line, it usually results in conflicts and your PR cannot be automatically merged...
-
-Please contribute links to packages/projects you have used or are familiar with. This will help ensure high-quality entries.
+## Contributor Checklist
 
+- [ ] Entries are alphabetically ordered in their section
+- [ ] Links point to project repositories (no duplicates or redirects)
+- [ ] Descriptions are concise and specific
+- [ ] Archived/deprecated projects were removed instead of tagged
+- [ ] Used `:yen:` only when applicable
```
.github/config.yml (deleted, 21 lines)

```diff
@@ -1,21 +0,0 @@
-# Configuration for welcome - https://github.com/behaviorbot/welcome
-
-# Configuration for new-issue-welcome - https://github.com/behaviorbot/new-issue-welcome
-
-# Comment to be posted to on first time issues
-newIssueWelcomeComment: >
-  Thanks for opening your first issue here!
-
-# Configuration for new-pr-welcome - https://github.com/behaviorbot/new-pr-welcome
-
-# Comment to be posted to on PRs from first time contributors in your repository
-newPRWelcomeComment: >
-  Thank you for contributing. Please check out our contributing guidelines and welcome!
-
-# Configuration for first-pr-merge - https://github.com/behaviorbot/first-pr-merge
-
-# Comment to be posted to on pull requests merged by a first time user
-firstPRMergeComment: >
-  Congrats on merging your first pull request!
-
-# It is recommend to include as many gifs and emojis as possible
```
.github/dependabot.yml (18 changed lines)

```diff
@@ -1,11 +1,13 @@
 # To get started with Dependabot version updates, you'll need to specify which
 # package ecosystems to update and where the package manifests are located.
 # Please see the documentation for all configuration options:
 # https://help.github.com/github/administering-a-repository/configuration-options-for-dependency-updates
 
 version: 2
 updates:
-  - package-ecosystem: "npm" # See documentation for possible values
-    directory: "/" # Location of package manifests
+  # Enable version updates for Go modules
+  - package-ecosystem: "gomod"
+    directory: "/"
     schedule:
-      interval: "daily"
+      interval: "weekly"
+
   # Enable version updates for GitHub Actions
   - package-ecosystem: "github-actions"
     directory: "/"
     schedule:
       interval: "weekly"
```
.github/weekly-digest.yml (deleted, 7 lines)

```diff
@@ -1,7 +0,0 @@
-# Configuration for weekly-digest - https://github.com/apps/weekly-digest
-publishDay: sun
-canPublishIssues: true
-canPublishPullRequests: true
-canPublishContributors: true
-canPublishStargazers: true
-canPublishCommits: true
```
.github/workflows/broken_links.yml (73 changed lines)

```diff
@@ -2,43 +2,34 @@ name: Broken Links Report
 
 on:
   schedule:
     # Run every Saturday at 2 AM UTC
     - cron: "0 2 * * 6"
   workflow_dispatch:
 
 concurrency:
   group: broken-links-${{ github.ref }}
   cancel-in-progress: false
 
 jobs:
   check-links:
     runs-on: ubuntu-latest
     timeout-minutes: 30
     permissions:
       contents: read
       issues: write
 
     steps:
-      - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # ratchet:actions/checkout@v5.0.0
+      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
 
-      - uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # ratchet:actions/setup-node@v5.0.0
+      - uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # ratchet:actions/setup-go@v6
         with:
-          node-version: lts/*
+          go-version-file: go.mod
 
-      - uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # ratchet:actions/cache@v4.3.0
-        with:
-          path: ~/.npm
-          key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
-          restore-keys: |
-            ${{ runner.os }}-node-
-
-      - name: Install Dependencies
-        run: npm ci --ignore-scripts --no-audit --no-progress --prefer-offline
+      - name: Build
+        run: go build -o awesome-docker ./cmd/awesome-docker
 
       - name: Run Link Check
         id: link_check
-        run: |
-          npm test > link_check_output.txt 2>&1 || true
-          if grep -q "❌ ERROR" link_check_output.txt; then
-            echo "has_errors=true" >> $GITHUB_OUTPUT
-          else
-            echo "has_errors=false" >> $GITHUB_OUTPUT
-          fi
+        run: ./awesome-docker ci broken-links --issue-file broken_links_issue.md --github-output "$GITHUB_OUTPUT"
         env:
           GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
 
@@ -48,34 +39,8 @@ jobs:
         with:
          script: |
            const fs = require('fs');
-           const output = fs.readFileSync('link_check_output.txt', 'utf8');
+           const issueBody = fs.readFileSync('broken_links_issue.md', 'utf8');
 
-           // Extract error information
-           const errorMatch = output.match(/❌ ERROR[\s\S]*$/);
-           const errorInfo = errorMatch ? errorMatch[0] : 'Link check failed - see workflow logs';
-
-           const issueBody = `# 🔗 Broken Links Detected
-
-           The weekly link check has found broken or inaccessible links in the repository.
-
-           ## Error Details
-
-           \`\`\`
-           ${errorInfo}
-           \`\`\`
-
-           ## Action Required
-
-           Please review and fix the broken links above. Options:
-           - Update the URL if the resource moved
-           - Remove the entry if it's permanently unavailable
-           - Add to \`tests/exclude_in_test.json\` if it's a known false positive
-
-           ---
-           *Auto-generated by [broken_links.yml](https://github.com/veggiemonk/awesome-docker/blob/master/.github/workflows/broken_links.yml)*
-           `;
-
           // Check for existing issue
           const issues = await github.rest.issues.listForRepo({
             owner: context.repo.owner,
             repo: context.repo.repo,
@@ -91,16 +56,14 @@ jobs:
             issue_number: issues.data[0].number,
             body: issueBody
           });
-          console.log(`Updated issue #${issues.data[0].number}`);
         } else {
-          const issue = await github.rest.issues.create({
+          await github.rest.issues.create({
             owner: context.repo.owner,
             repo: context.repo.repo,
-            title: '🔗 Broken Links Detected - Action Required',
+            title: 'Broken Links Detected',
             body: issueBody,
             labels: ['broken-links', 'bug']
           });
-          console.log(`Created issue #${issue.data.number}`);
         }
 
      - name: Close Issue if No Errors
@@ -115,7 +78,6 @@ jobs:
           labels: 'broken-links',
           per_page: 1
         });
-
         if (issues.data.length > 0) {
           await github.rest.issues.update({
             owner: context.repo.owner,
@@ -124,11 +86,4 @@ jobs:
             state: 'closed',
             state_reason: 'completed'
           });
-          await github.rest.issues.createComment({
-            owner: context.repo.owner,
-            repo: context.repo.repo,
-            issue_number: issues.data[0].number,
-            body: '✅ All links are now working! Closing this issue.'
-          });
-          console.log(`Closed issue #${issues.data[0].number}`);
         }
```
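The rewritten workflow replaces the grep-based `has_errors` detection with a `--github-output` flag on the Go CLI, but both versions rely on the same GitHub Actions convention: append `key=value` lines to the file named by `$GITHUB_OUTPUT`, then read them back as `steps.<id>.outputs.<key>`. A minimal sketch of that convention, with a temp file standing in for the runner-provided output file and a hypothetical `report_errors` helper:

```shell
# Simulate the step-output convention used by the "Run Link Check" step:
# key=value lines appended to the file named in $GITHUB_OUTPUT.
GITHUB_OUTPUT="$(mktemp)"

report_errors() {
  # $1 = number of broken links found (hypothetical helper)
  if [ "$1" -gt 0 ]; then
    echo "has_errors=true" >> "$GITHUB_OUTPUT"
  else
    echo "has_errors=false" >> "$GITHUB_OUTPUT"
  fi
}

report_errors 0
cat "$GITHUB_OUTPUT"   # -> has_errors=false
```

Later steps in the same job can then gate on the recorded value, which is how the issue-creation step only runs when errors were found.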
.github/workflows/deploy-pages.yml (14 changed lines)

```diff
@@ -20,19 +20,17 @@ jobs:
     runs-on: ubuntu-latest
     steps:
       - name: Checkout
-        uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # ratchet:actions/checkout@v5
+        uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6
 
-      - name: Setup Node.js
-        uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # ratchet:actions/setup-node@v5
+      - uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # ratchet:actions/setup-go@v6
         with:
-          node-version-file: '.nvmrc'
-          cache: 'npm'
+          go-version-file: go.mod
 
-      - name: Install dependencies
-        run: npm ci
+      - name: Build CLI
+        run: go build -o awesome-docker ./cmd/awesome-docker
 
       - name: Build website
-        run: npm run build
+        run: make website
 
       - name: Upload artifact
         uses: actions/upload-pages-artifact@7b1f4a764d45c48632c6b24a0339c27f5614fb0b # ratchet:actions/upload-pages-artifact@v4
```
58
.github/workflows/health_report.yml
vendored
58
.github/workflows/health_report.yml
vendored
@@ -2,56 +2,46 @@ name: Weekly Health Report

on:
  schedule:
    # Run every Monday at 9 AM UTC
    - cron: "0 9 * * 1"
  workflow_dispatch: # Allow manual trigger
  workflow_dispatch:

concurrency:
  group: health-report-${{ github.ref }}
  cancel-in-progress: false

jobs:
  health-check:
    runs-on: ubuntu-latest
    timeout-minutes: 30
    permissions:
      contents: write
      contents: read
      issues: write

    steps:
      - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # ratchet:actions/checkout@v5.0.0
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6

      - uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # ratchet:actions/setup-node@v5.0.0
      - uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # ratchet:actions/setup-go@v6
        with:
          node-version: lts/*
          go-version-file: go.mod

      - uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # ratchet:actions/cache@v4.3.0
        with:
          path: ~/.npm
          key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
          restore-keys: |
            ${{ runner.os }}-node-
      - name: Build
        run: go build -o awesome-docker ./cmd/awesome-docker

      - name: Install Dependencies
        run: npm ci --ignore-scripts --no-audit --no-progress --prefer-offline

      - name: Run Health Check
        run: node tests/health_check.mjs
        continue-on-error: true
      - name: Run Health + Report
        id: report
        run: ./awesome-docker ci health-report --issue-file health_report.txt --github-output "$GITHUB_OUTPUT"
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

      - name: Upload Health Report
        uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # ratchet:actions/upload-artifact@v4
        with:
          name: health-report
          path: HEALTH_REPORT.md

      - name: Create Issue with Health Report
      - name: Create/Update Issue with Health Report
        if: steps.report.outputs.has_report == 'true'
        uses: actions/github-script@ed597411d8f924073f98dfc5c65a23a2325f34cd # ratchet:actions/github-script@v8
        with:
          script: |
            const fs = require('fs');
            const report = fs.readFileSync('health_report.txt', 'utf8');
            const issueBody = report;

            // Read the health report
            const report = fs.readFileSync('HEALTH_REPORT.md', 'utf8');

            // Check if there's already an open issue
            const issues = await github.rest.issues.listForRepo({
              owner: context.repo.owner,
              repo: context.repo.repo,
@@ -60,25 +50,19 @@ jobs:
              per_page: 1
            });

            const issueBody = report + '\n\n---\n*This report is auto-generated weekly. See [health_check.mjs](https://github.com/veggiemonk/awesome-docker/blob/master/tests/health_check.mjs) for details.*';

            if (issues.data.length > 0) {
              // Update existing issue
              await github.rest.issues.update({
                owner: context.repo.owner,
                repo: context.repo.repo,
                issue_number: issues.data[0].number,
                body: issueBody
              });
              console.log(`Updated issue #${issues.data[0].number}`);
            } else {
              // Create new issue
              const issue = await github.rest.issues.create({
              await github.rest.issues.create({
                owner: context.repo.owner,
                repo: context.repo.repo,
                title: '🏥 Weekly Health Report - Repository Maintenance Needed',
                title: 'Weekly Health Report - Repository Maintenance Needed',
                body: issueBody,
                labels: ['health-report', 'maintenance']
              });
              console.log(`Created issue #${issue.data.number}`);
            }
25 .github/workflows/pull_request.yml vendored
@@ -11,22 +11,19 @@ jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # ratchet:actions/checkout@v5.0.0
      - uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # ratchet:actions/setup-node@v5.0.0
        with:
          node-version: lts/*
      - uses: actions/checkout@de0fac2e4500dabe0009e67214ff5f5447ce83dd # ratchet:actions/checkout@v6

      - uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # ratchet:actions/cache@v4.3.0
        id: cache
      - uses: actions/setup-go@4b73464bb391d4059bd26b0524d20df3927bd417 # ratchet:actions/setup-go@v6
        with:
          path: ~/.npm
          key: ${{ runner.os }}-node-${{ hashFiles('**/package-lock.json') }}
          restore-keys: |
            ${{ runner.os }}-node-
          go-version-file: go.mod

      - name: Install Dependencies
        # if: steps.cache.outputs.cache-hit != 'true'
        run: npm ci --ignore-scripts --no-audit --no-progress --prefer-offline
      - run: npm run test-pr
      - name: Build
        run: go build -o awesome-docker ./cmd/awesome-docker

      - name: Build website
        run: ./awesome-docker build

      - name: Validate
        run: ./awesome-docker validate
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
4 .gitignore vendored
@@ -10,3 +10,7 @@ website/table.html

.idea
**/.DS_Store
.worktrees

# Go
/awesome-docker
89 AGENTS.md
@@ -1,28 +1,79 @@
# Agent Guidelines for awesome-docker

## Commands
- Build website: `npm run build` (converts README.md to website/index.html)
- Test all links: `npm test` (runs tests/test_all.mjs, requires GITHUB_TOKEN)
- Test PR changes: `npm run test-pr` (runs tests/pull_request.mjs, checks duplicates)
- Health check: `npm run health-check` (generates HEALTH_REPORT.md, requires GITHUB_TOKEN)
- Build CLI: `make build` (or `go build -o awesome-docker ./cmd/awesome-docker`)
- Rebuild from scratch: `make rebuild`
- Show local workflows: `make help`
- Format Go code: `make fmt`
- Run tests: `make test` (runs `go test ./internal/... -v`)
- Race tests: `make test-race`
- Lint README rules: `make lint` (runs `./awesome-docker lint`)
- Auto-fix lint issues: `make lint-fix`
- Check links: `make check` (runs `./awesome-docker check`; `GITHUB_TOKEN` enables GitHub repo checks)
- PR-safe link checks: `make check-pr`
- PR validation: `make validate` (lint + external link checks in PR mode)
- Build website: `make website` (generates `website/index.html` from `README.md`)
- Health scoring: `make health` (requires `GITHUB_TOKEN`, refreshes `config/health_cache.yaml`)
- Print health report (Markdown): `make report`
- Print health report (JSON): `make report-json` or `./awesome-docker report --json`
- Generate report files: `make report-file` (`HEALTH_REPORT.md`) and `make report-json-file` (`HEALTH_REPORT.json`)
- Maintenance shortcut: `make workflow-maint` (health + JSON report file)

## Architecture
- **Main content**: README.md - curated list of Docker resources (markdown format)
- **Build script**: build.js - converts README.md to HTML using showdown & cheerio
- **Tests**: tests/*.mjs - link validation, duplicate detection, URL checking
- **Website**: website/ - static site deployment folder
- **Main content**: `README.md` (curated Docker/container resources)
- **CLI entrypoint**: `cmd/awesome-docker/main.go` (Cobra commands)
- **Core packages**:
  - `internal/parser` - parse README sections and entries
  - `internal/linter` - alphabetical/order/format validation + autofix
  - `internal/checker` - HTTP and GitHub link checks
  - `internal/scorer` - repository health scoring and report generation
  - `internal/cache` - exclude list and health cache read/write
  - `internal/builder` - render README to website HTML from template
- **Config**:
  - `config/exclude.yaml` - known link-check exclusions
  - `config/website.tmpl.html` - HTML template for site generation
  - `config/health_cache.yaml` - persisted health scoring cache
- **Generated outputs**:
  - `awesome-docker` - compiled CLI binary
  - `website/index.html` - generated website
  - `HEALTH_REPORT.md` - generated markdown report
  - `HEALTH_REPORT.json` - generated JSON report

## Code Style
- **Language**: Node.js with ES modules (.mjs) for tests, CommonJS for build.js
- **Imports**: Use ES6 imports in .mjs files, require() in .js files
- **Error handling**: Use try/catch with LOG.error() and process.exit(1) for failures
- **Logging**: Use LOG object with error/debug methods (see build.js for pattern)
- **Async**: Prefer async/await over callbacks
- **Language**: Go
- **Formatting**: Keep code `gofmt`-clean
- **Testing**: Add/adjust table-driven tests in `internal/*_test.go` for behavior changes
- **Error handling**: Return wrapped errors (`fmt.Errorf("context: %w", err)`) from command handlers
- **CLI conventions**: Keep command behavior consistent with existing Cobra commands (`lint`, `check`, `health`, `build`, `report`, `validate`)

## CI/Automation
- **PR + weekly validation**: `.github/workflows/pull_request.yml`
  - Triggers on pull requests to `master` and weekly schedule
  - Builds Go CLI and runs `./awesome-docker validate`
- **Weekly broken links issue**: `.github/workflows/broken_links.yml`
  - Runs `./awesome-docker check`
  - Opens/updates `broken-links` issue when failures are found
- **Weekly health report issue**: `.github/workflows/health_report.yml`
  - Runs `./awesome-docker health` then `./awesome-docker report`
  - Opens/updates `health-report` issue
- **GitHub Pages deploy**: `.github/workflows/deploy-pages.yml`
  - On push to `master`, builds CLI, runs `./awesome-docker build`, deploys `website/`

## Makefile Workflow
- The `Makefile` models file dependencies for generated artifacts (`awesome-docker`, `website/index.html`, `config/health_cache.yaml`, `HEALTH_REPORT.md`, `HEALTH_REPORT.json`).
- Prefer `make` targets over ad-hoc command sequences so dependency and regeneration behavior stays consistent.
- Use:
  - `make workflow-dev` for local iteration
  - `make workflow-pr` before opening/updating a PR
  - `make workflow-maint` for health/report maintenance
  - `make workflow-ci` for CI-equivalent local checks

## Content Guidelines (from CONTRIBUTING.md)
- Link to GitHub projects, not websites
- Entries are listed alphabetically (from A to Z)
- Entries must be Docker/container-related with clear documentation
- Include project description, installation, and usage examples
- Mark WIP projects explicitly
- Avoid outdated tutorials/blog posts unless advanced/specific
- Use one link per entry
- Prefer project/repository URLs over marketing pages
- Keep entries alphabetically ordered within each section
- Keep descriptions concise and concrete
- Use `:yen:` only for paid/commercial services
- Use `:ice_cube:` for stale projects (2+ years inactive)
- Remove archived/deprecated projects instead of tagging them
- Avoid duplicate links and redirect variants
145 Makefile Normal file
@@ -0,0 +1,145 @@
SHELL := /bin/bash

BINARY ?= awesome-docker
GO ?= go
CMD_PACKAGE := ./cmd/awesome-docker
INTERNAL_PACKAGES := ./internal/...
WEBSITE_OUTPUT := website/index.html
HEALTH_CACHE := config/health_cache.yaml
HEALTH_REPORT_MD := HEALTH_REPORT.md
HEALTH_REPORT_JSON := HEALTH_REPORT.json

GO_SOURCES := $(shell find cmd internal -type f -name '*.go')
BUILD_INPUTS := $(GO_SOURCES) go.mod go.sum
WEBSITE_INPUTS := README.md config/website.tmpl.html
HEALTH_INPUTS := README.md config/exclude.yaml

.DEFAULT_GOAL := help

.PHONY: help \
    build rebuild clean \
    fmt test test-race \
    lint lint-fix check check-pr validate website \
    guard-github-token health health-cache \
    report report-json report-file report-json-file health-report \
    workflow-dev workflow-pr workflow-maint workflow-ci

help: ## Show the full local workflow and available targets
    @echo "awesome-docker Makefile"
    @echo
    @echo "Workflows:"
    @echo "  make workflow-dev    # local iteration (fmt + test + lint + check-pr + website)"
    @echo "  make workflow-pr     # recommended before opening/updating a PR"
    @echo "  make workflow-maint  # repository maintenance (health + JSON report)"
    @echo "  make workflow-ci     # CI-equivalent checks"
    @echo
    @echo "Core targets:"
    @echo "  make build           # build CLI binary"
    @echo "  make test            # run internal Go tests"
    @echo "  make lint            # validate README formatting/content rules"
    @echo "  make check           # check links (uses GITHUB_TOKEN when set)"
    @echo "  make validate        # run PR validation (lint + check --pr)"
    @echo "  make website         # generate website/index.html"
    @echo "  make report-file     # generate HEALTH_REPORT.md"
    @echo "  make report-json-file # generate HEALTH_REPORT.json"
    @echo "  make health          # refresh health cache (requires GITHUB_TOKEN)"
    @echo "  make report          # print markdown health report"
    @echo "  make report-json     # print full JSON health report"
    @echo
    @echo "Generated artifacts:"
    @echo "  $(BINARY)"
    @echo "  $(WEBSITE_OUTPUT)"
    @echo "  $(HEALTH_CACHE)"
    @echo "  $(HEALTH_REPORT_MD)"
    @echo "  $(HEALTH_REPORT_JSON)"

$(BINARY): $(BUILD_INPUTS)
    $(GO) build -o $(BINARY) $(CMD_PACKAGE)

build: $(BINARY) ## Build CLI binary

rebuild: clean build ## Rebuild from scratch

clean: ## Remove generated binary
    rm -f $(BINARY) $(HEALTH_REPORT_MD) $(HEALTH_REPORT_JSON)

fmt: ## Format Go code
    $(GO) fmt ./...

test: ## Run internal unit tests
    $(GO) test $(INTERNAL_PACKAGES) -v

test-race: ## Run internal tests with race detector
    $(GO) test $(INTERNAL_PACKAGES) -race

lint: build ## Validate README formatting/content rules
    ./$(BINARY) lint

lint-fix: build ## Auto-fix lint issues when possible
    ./$(BINARY) lint --fix

check: build ## Check links (GitHub checks enabled when GITHUB_TOKEN is set)
    ./$(BINARY) check

check-pr: build ## Check links in PR mode (external links only)
    ./$(BINARY) check --pr

validate: build ## Run PR validation (lint + check --pr)
    ./$(BINARY) validate

$(WEBSITE_OUTPUT): $(BINARY) $(WEBSITE_INPUTS)
    ./$(BINARY) build

website: $(WEBSITE_OUTPUT) ## Generate website from README

guard-github-token:
    @if [ -z "$$GITHUB_TOKEN" ]; then \
        echo "GITHUB_TOKEN is required for this target."; \
        echo "Set it with: export GITHUB_TOKEN=<token>"; \
        exit 1; \
    fi

$(HEALTH_CACHE): guard-github-token $(BINARY) $(HEALTH_INPUTS)
    ./$(BINARY) health

health-cache: $(HEALTH_CACHE) ## Update config/health_cache.yaml

health: ## Refresh health cache from GitHub metadata
    @$(MAKE) --no-print-directory -B health-cache

report: build ## Print markdown health report from cache
    ./$(BINARY) report

report-json: build ## Print full health report as JSON
    ./$(BINARY) report --json

$(HEALTH_REPORT_MD): $(BINARY) $(HEALTH_CACHE)
    ./$(BINARY) report > $(HEALTH_REPORT_MD)

report-file: $(HEALTH_REPORT_MD) ## Generate HEALTH_REPORT.md from cache

$(HEALTH_REPORT_JSON): $(BINARY) $(HEALTH_CACHE)
    ./$(BINARY) report --json > $(HEALTH_REPORT_JSON)

report-json-file: $(HEALTH_REPORT_JSON) ## Generate HEALTH_REPORT.json from cache

health-report: health report-file ## Refresh health cache then generate HEALTH_REPORT.md

browse: build ## Launch interactive TUI browser
    ./$(BINARY) browse

workflow-dev: fmt test lint check-pr website ## Full local development workflow

workflow-pr: fmt test validate ## Recommended workflow before opening a PR

workflow-maint: health report-json-file ## Weekly maintenance workflow

workflow-ci: test validate ## CI-equivalent validation workflow

update-ga:
    ratchet upgrade .github/workflows/*

update-go:
    go get -u go@latest
    go get -u ./...
    go mod tidy
51 build.js
@@ -1,51 +0,0 @@
const fs = require('fs-extra');
const cheerio = require('cheerio');
const showdown = require('showdown');

process.env.NODE_ENV = 'production';

const LOG = {
  error: (...args) => console.error('❌ ERROR', { ...args }),
  debug: (...args) => {
    if (process.env.DEBUG) console.log('💡 DEBUG: ', { ...args });
  },
};
const handleFailure = (err) => {
  LOG.error(err);
  process.exit(1);
};

process.on('unhandledRejection', handleFailure);

// --- FILES
const README = 'README.md';
const WEBSITE_FOLDER = 'website';
const indexTemplate = `${WEBSITE_FOLDER}/index.tmpl.html`;
const indexDestination = `${WEBSITE_FOLDER}/index.html`;

async function processIndex() {
  const converter = new showdown.Converter();
  converter.setFlavor('github');

  try {
    LOG.debug('Loading files...', { indexTemplate, README });
    const template = await fs.readFile(indexTemplate, 'utf8');
    const markdown = await fs.readFile(README, 'utf8');

    LOG.debug('Merging files...');
    const $ = cheerio.load(template);
    $('#md').append(converter.makeHtml(markdown));

    LOG.debug('Writing index.html');
    await fs.outputFile(indexDestination, $.html(), 'utf8');
    LOG.debug('DONE 👍');
  } catch (err) {
    handleFailure(err);
  }
}

async function main() {
  await processIndex();
}

main();
683 cmd/awesome-docker/main.go Normal file
@@ -0,0 +1,683 @@
package main

import (
    "context"
    "fmt"
    "os"
    "strconv"
    "strings"

    "github.com/spf13/cobra"
    "github.com/veggiemonk/awesome-docker/internal/builder"
    "github.com/veggiemonk/awesome-docker/internal/cache"
    "github.com/veggiemonk/awesome-docker/internal/checker"
    "github.com/veggiemonk/awesome-docker/internal/linter"
    "github.com/veggiemonk/awesome-docker/internal/parser"
    "github.com/veggiemonk/awesome-docker/internal/scorer"
    "github.com/veggiemonk/awesome-docker/internal/tui"
)

const (
    readmePath      = "README.md"
    excludePath     = "config/exclude.yaml"
    templatePath    = "config/website.tmpl.html"
    healthCachePath = "config/health_cache.yaml"
    websiteOutput   = "website/index.html"
    version         = "0.1.0"
)

type checkSummary struct {
    ExternalTotal int
    GitHubTotal   int
    Broken        []checker.LinkResult
    Redirected    []checker.LinkResult
    GitHubErrors  []error
    GitHubSkipped bool
}

func main() {
    root := &cobra.Command{
        Use:   "awesome-docker",
        Short: "Quality tooling for the awesome-docker curated list",
    }

    root.AddCommand(
        versionCmd(),
        lintCmd(),
        checkCmd(),
        healthCmd(),
        buildCmd(),
        reportCmd(),
        validateCmd(),
        ciCmd(),
        browseCmd(),
    )

    if err := root.Execute(); err != nil {
        os.Exit(1)
    }
}

func versionCmd() *cobra.Command {
    return &cobra.Command{
        Use:   "version",
        Short: "Print version",
        Run:   func(cmd *cobra.Command, args []string) { fmt.Printf("awesome-docker v%s\n", version) },
    }
}

func parseReadme() (parser.Document, error) {
    f, err := os.Open(readmePath)
    if err != nil {
        return parser.Document{}, err
    }
    defer f.Close()
    return parser.Parse(f)
}

func collectURLs(sections []parser.Section, urls *[]string) {
    for _, s := range sections {
        for _, e := range s.Entries {
            *urls = append(*urls, e.URL)
        }
        collectURLs(s.Children, urls)
    }
}

type entryMeta struct {
    Category    string
    Description string
}

func collectEntriesWithCategory(sections []parser.Section, parentPath string, out map[string]entryMeta) {
    for _, s := range sections {
        path := s.Title
        if parentPath != "" {
            path = parentPath + " > " + s.Title
        }
        for _, e := range s.Entries {
            out[e.URL] = entryMeta{Category: path, Description: e.Description}
        }
        collectEntriesWithCategory(s.Children, path, out)
    }
}
func runLinkChecks(prMode bool) (checkSummary, error) {
    doc, err := parseReadme()
    if err != nil {
        return checkSummary{}, fmt.Errorf("parse: %w", err)
    }

    var urls []string
    collectURLs(doc.Sections, &urls)

    exclude, err := cache.LoadExcludeList(excludePath)
    if err != nil {
        return checkSummary{}, fmt.Errorf("load exclude list: %w", err)
    }

    ghURLs, extURLs := checker.PartitionLinks(urls)

    summary := checkSummary{
        ExternalTotal: len(extURLs),
        GitHubTotal:   len(ghURLs),
    }

    results := checker.CheckLinks(extURLs, 10, exclude)
    for _, r := range results {
        if !r.OK {
            summary.Broken = append(summary.Broken, r)
        }
        if r.Redirected {
            summary.Redirected = append(summary.Redirected, r)
        }
    }

    if prMode {
        summary.GitHubSkipped = true
        return summary, nil
    }

    token := os.Getenv("GITHUB_TOKEN")
    if token == "" {
        summary.GitHubSkipped = true
        return summary, nil
    }

    gc := checker.NewGitHubChecker(token)
    _, errs := gc.CheckRepos(context.Background(), ghURLs, 50)
    summary.GitHubErrors = errs
    return summary, nil
}

func runHealth(ctx context.Context) error {
    token := os.Getenv("GITHUB_TOKEN")
    if token == "" {
        return fmt.Errorf("GITHUB_TOKEN environment variable is required")
    }

    doc, err := parseReadme()
    if err != nil {
        return fmt.Errorf("parse: %w", err)
    }

    var urls []string
    collectURLs(doc.Sections, &urls)
    ghURLs, _ := checker.PartitionLinks(urls)

    fmt.Printf("Scoring %d GitHub repositories...\n", len(ghURLs))
    gc := checker.NewGitHubChecker(token)
    infos, errs := gc.CheckRepos(ctx, ghURLs, 50)
    for _, e := range errs {
        fmt.Printf("  error: %v\n", e)
    }
    if len(infos) == 0 {
        if len(errs) > 0 {
            return fmt.Errorf("failed to fetch GitHub metadata for all repositories (%d errors); check network/DNS and GITHUB_TOKEN", len(errs))
        }
        return fmt.Errorf("no GitHub repositories found in README")
    }

    scored := scorer.ScoreAll(infos)

    meta := make(map[string]entryMeta)
    collectEntriesWithCategory(doc.Sections, "", meta)
    for i := range scored {
        if m, ok := meta[scored[i].URL]; ok {
            scored[i].Category = m.Category
            scored[i].Description = m.Description
        }
    }

    cacheEntries := scorer.ToCacheEntries(scored)

    hc, err := cache.LoadHealthCache(healthCachePath)
    if err != nil {
        return fmt.Errorf("load cache: %w", err)
    }
    hc.Merge(cacheEntries)
    if err := cache.SaveHealthCache(healthCachePath, hc); err != nil {
        return fmt.Errorf("save cache: %w", err)
    }

    fmt.Printf("Cache updated: %d entries in %s\n", len(hc.Entries), healthCachePath)
    return nil
}
func scoredFromCache() ([]scorer.ScoredEntry, error) {
    hc, err := cache.LoadHealthCache(healthCachePath)
    if err != nil {
        return nil, fmt.Errorf("load cache: %w", err)
    }
    if len(hc.Entries) == 0 {
        return nil, fmt.Errorf("no cache data, run 'health' first")
    }

    scored := make([]scorer.ScoredEntry, 0, len(hc.Entries))
    for _, e := range hc.Entries {
        scored = append(scored, scorer.ScoredEntry{
            URL:         e.URL,
            Name:        e.Name,
            Status:      scorer.Status(e.Status),
            Stars:       e.Stars,
            Forks:       e.Forks,
            HasLicense:  e.HasLicense,
            LastPush:    e.LastPush,
            Category:    e.Category,
            Description: e.Description,
        })
    }
    return scored, nil
}

func markdownReportFromCache() (string, error) {
    scored, err := scoredFromCache()
    if err != nil {
        return "", err
    }
    return scorer.GenerateReport(scored), nil
}

func writeGitHubOutput(path, key, value string) error {
    if path == "" {
        return nil
    }
    f, err := os.OpenFile(path, os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0o644)
    if err != nil {
        return fmt.Errorf("open github output file: %w", err)
    }
    defer f.Close()
    if _, err := fmt.Fprintf(f, "%s=%s\n", key, value); err != nil {
        return fmt.Errorf("write github output: %w", err)
    }
    return nil
}

func sanitizeOutputValue(v string) string {
    v = strings.ReplaceAll(v, "\n", " ")
    v = strings.ReplaceAll(v, "\r", " ")
    return strings.TrimSpace(v)
}

func buildBrokenLinksIssueBody(summary checkSummary, runErr error) string {
    var b strings.Builder
    b.WriteString("# Broken Links Detected\n\n")

    if runErr != nil {
        b.WriteString("The link checker failed to execute cleanly.\n\n")
        b.WriteString("## Failure\n\n")
        fmt.Fprintf(&b, "- %s\n\n", runErr)
    } else {
        fmt.Fprintf(&b, "- Broken links: %d\n", len(summary.Broken))
        fmt.Fprintf(&b, "- Redirected links: %d\n", len(summary.Redirected))
        fmt.Fprintf(&b, "- GitHub API errors: %d\n\n", len(summary.GitHubErrors))

        if len(summary.Broken) > 0 {
            b.WriteString("## Broken Links\n\n")
            for _, r := range summary.Broken {
                fmt.Fprintf(&b, "- `%s` -> `%d %s`\n", r.URL, r.StatusCode, strings.TrimSpace(r.Error))
            }
            b.WriteString("\n")
        }

        if len(summary.GitHubErrors) > 0 {
            b.WriteString("## GitHub API Errors\n\n")
            for _, e := range summary.GitHubErrors {
                fmt.Fprintf(&b, "- `%s`\n", e)
            }
            b.WriteString("\n")
        }
    }

    b.WriteString("## Action Required\n\n")
    b.WriteString("- Update the URL if the resource moved\n")
    b.WriteString("- Remove the entry if permanently unavailable\n")
    b.WriteString("- Add to `config/exclude.yaml` if a known false positive\n")
    b.WriteString("- Investigate GitHub API/auth failures when present\n\n")
    b.WriteString("---\n")
    b.WriteString("*Auto-generated by awesome-docker ci broken-links*\n")
    return b.String()
}

func buildHealthReportIssueBody(report string, healthErr error) string {
    var b strings.Builder
    if healthErr != nil {
        b.WriteString("WARNING: health refresh failed in this run; showing latest cached report.\n\n")
        fmt.Fprintf(&b, "Error: `%s`\n\n", healthErr)
    }
    b.WriteString(report)
    if !strings.HasSuffix(report, "\n") {
        b.WriteString("\n")
    }
    b.WriteString("\n---\n")
    b.WriteString("*Auto-generated weekly by awesome-docker ci health-report*\n")
    return b.String()
}
func lintCmd() *cobra.Command {
    var fix bool
    cmd := &cobra.Command{
        Use:   "lint",
        Short: "Validate README formatting",
        RunE: func(cmd *cobra.Command, args []string) error {
            doc, err := parseReadme()
            if err != nil {
                return fmt.Errorf("parse: %w", err)
            }

            result := linter.Lint(doc)
            for _, issue := range result.Issues {
                fmt.Println(issue)
            }

            if result.Errors > 0 {
                fmt.Printf("\n%d errors, %d warnings\n", result.Errors, result.Warnings)
                if !fix {
                    return fmt.Errorf("lint failed with %d errors", result.Errors)
                }
                count, err := linter.FixFile(readmePath)
                if err != nil {
                    return fmt.Errorf("fix: %w", err)
                }
                fmt.Printf("Fixed %d lines in %s\n", count, readmePath)
            } else {
                fmt.Printf("OK: %d warnings\n", result.Warnings)
            }

            return nil
        },
    }
    cmd.Flags().BoolVar(&fix, "fix", false, "Auto-fix formatting issues")
    return cmd
}

func checkCmd() *cobra.Command {
    var prMode bool
    cmd := &cobra.Command{
        Use:   "check",
        Short: "Check links for reachability",
        RunE: func(cmd *cobra.Command, args []string) error {
            summary, err := runLinkChecks(prMode)
            if err != nil {
                return err
            }

            fmt.Printf("Checking %d external links...\n", summary.ExternalTotal)
            if !prMode {
                if summary.GitHubSkipped {
                    fmt.Println("GITHUB_TOKEN not set, skipping GitHub repo checks")
                } else {
                    fmt.Printf("Checking %d GitHub repositories...\n", summary.GitHubTotal)
                }
            }

            for _, e := range summary.GitHubErrors {
                fmt.Printf("  GitHub error: %v\n", e)
            }

            if len(summary.Redirected) > 0 {
                fmt.Printf("\n%d redirected links (consider updating):\n", len(summary.Redirected))
                for _, r := range summary.Redirected {
                    fmt.Printf("  %s -> %s\n", r.URL, r.RedirectURL)
                }
            }

            if len(summary.Broken) > 0 {
                fmt.Printf("\n%d broken links:\n", len(summary.Broken))
                for _, r := range summary.Broken {
                    fmt.Printf("  %s -> %d %s\n", r.URL, r.StatusCode, r.Error)
                }
            }
            if len(summary.Broken) > 0 && len(summary.GitHubErrors) > 0 {
                return fmt.Errorf("found %d broken links and %d GitHub API errors", len(summary.Broken), len(summary.GitHubErrors))
            }
            if len(summary.Broken) > 0 {
                return fmt.Errorf("found %d broken links", len(summary.Broken))
            }
            if len(summary.GitHubErrors) > 0 {
                return fmt.Errorf("github checks failed with %d errors", len(summary.GitHubErrors))
            }

            fmt.Println("All links OK")
            return nil
        },
    }
    cmd.Flags().BoolVar(&prMode, "pr", false, "PR mode: skip GitHub API checks")
    return cmd
}

func healthCmd() *cobra.Command {
    return &cobra.Command{
        Use:   "health",
        Short: "Score repository health and update cache",
        RunE: func(cmd *cobra.Command, args []string) error {
            return runHealth(context.Background())
        },
    }
}
func buildCmd() *cobra.Command {
    return &cobra.Command{
        Use:   "build",
        Short: "Generate website from README",
        RunE: func(cmd *cobra.Command, args []string) error {
            if err := builder.Build(readmePath, templatePath, websiteOutput); err != nil {
                return err
            }
            fmt.Printf("Website built: %s\n", websiteOutput)
            return nil
        },
    }
}

func reportCmd() *cobra.Command {
    var jsonOutput bool
    cmd := &cobra.Command{
        Use:   "report",
        Short: "Generate health report from cache",
        RunE: func(cmd *cobra.Command, args []string) error {
            scored, err := scoredFromCache()
            if err != nil {
                return err
            }

            if jsonOutput {
                payload, err := scorer.GenerateJSONReport(scored)
                if err != nil {
                    return fmt.Errorf("json report: %w", err)
                }
                fmt.Println(string(payload))
                return nil
            }

            report := scorer.GenerateReport(scored)
            fmt.Print(report)
            return nil
        },
    }

    cmd.Flags().BoolVar(&jsonOutput, "json", false, "Output full health report as JSON")
    return cmd
}

func validateCmd() *cobra.Command {
    return &cobra.Command{
        Use:   "validate",
        Short: "PR validation: lint + check --pr",
        RunE: func(cmd *cobra.Command, args []string) error {
            fmt.Println("=== Linting ===")
            doc, err := parseReadme()
            if err != nil {
                return fmt.Errorf("parse: %w", err)
            }

            result := linter.Lint(doc)
            for _, issue := range result.Issues {
                fmt.Println(issue)
            }
            if result.Errors > 0 {
                fmt.Printf("\n%d errors, %d warnings\n", result.Errors, result.Warnings)
                return fmt.Errorf("lint failed with %d errors", result.Errors)
            }
            fmt.Printf("Lint OK: %d warnings\n", result.Warnings)

            fmt.Println("\n=== Checking links (PR mode) ===")
            summary, err := runLinkChecks(true)
            if err != nil {
                return err
            }
            fmt.Printf("Checking %d external links...\n", summary.ExternalTotal)
            if len(summary.Broken) > 0 {
                fmt.Printf("\n%d broken links:\n", len(summary.Broken))
                for _, r := range summary.Broken {
                    fmt.Printf("  %s -> %d %s\n", r.URL, r.StatusCode, r.Error)
                }
                return fmt.Errorf("found %d broken links", len(summary.Broken))
            }

            fmt.Println("\nValidation passed")
            return nil
        },
    }
}

func ciCmd() *cobra.Command {
    cmd := &cobra.Command{
        Use:   "ci",
        Short: "CI-oriented helper commands",
    }
    cmd.AddCommand(
        ciBrokenLinksCmd(),
        ciHealthReportCmd(),
    )
    return cmd
}

func ciBrokenLinksCmd() *cobra.Command {
    var issueFile string
|
||||
var githubOutput string
|
||||
var strict bool
|
||||
|
||||
cmd := &cobra.Command{
|
||||
Use: "broken-links",
|
||||
Short: "Run link checks and emit CI outputs/artifacts",
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
summary, runErr := runLinkChecks(false)
|
||||
|
||||
hasErrors := runErr != nil || len(summary.Broken) > 0 || len(summary.GitHubErrors) > 0
|
||||
exitCode := 0
|
||||
if hasErrors {
|
||||
exitCode = 1
|
||||
}
|
||||
if runErr != nil {
|
||||
exitCode = 2
|
||||
}
|
||||
|
||||
if issueFile != "" && hasErrors {
|
||||
body := buildBrokenLinksIssueBody(summary, runErr)
|
||||
if err := os.WriteFile(issueFile, []byte(body), 0o644); err != nil {
|
||||
return fmt.Errorf("write issue file: %w", err)
|
||||
}
|
||||
}
|
||||
|
||||
if err := writeGitHubOutput(githubOutput, "has_errors", strconv.FormatBool(hasErrors)); err != nil {
|
||||
return err
|
||||
}
|
||||
if err := writeGitHubOutput(githubOutput, "check_exit_code", strconv.Itoa(exitCode)); err != nil {
|
||||
return err
|
||||
}
|
||||
if err := writeGitHubOutput(githubOutput, "broken_count", strconv.Itoa(len(summary.Broken))); err != nil {
|
||||
return err
|
||||
}
|
||||
if err := writeGitHubOutput(githubOutput, "github_error_count", strconv.Itoa(len(summary.GitHubErrors))); err != nil {
|
||||
return err
|
||||
}
|
||||
if runErr != nil {
|
||||
if err := writeGitHubOutput(githubOutput, "run_error", sanitizeOutputValue(runErr.Error())); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
if runErr != nil {
|
||||
fmt.Printf("CI broken-links run error: %v\n", runErr)
|
||||
}
|
||||
if hasErrors {
|
||||
fmt.Printf("CI broken-links found %d broken links and %d GitHub errors\n", len(summary.Broken), len(summary.GitHubErrors))
|
||||
} else {
|
||||
fmt.Println("CI broken-links found no errors")
|
||||
}
|
||||
|
||||
if strict {
|
||||
if runErr != nil {
|
||||
return runErr
|
||||
}
|
||||
if hasErrors {
|
||||
return fmt.Errorf("found %d broken links and %d GitHub API errors", len(summary.Broken), len(summary.GitHubErrors))
|
||||
}
|
||||
}
|
||||
return nil
|
||||
},
|
||||
}
|
||||
|
||||
cmd.Flags().StringVar(&issueFile, "issue-file", "broken_links_issue.md", "Path to write issue markdown body")
|
||||
cmd.Flags().StringVar(&githubOutput, "github-output", "", "Path to GitHub output file (typically $GITHUB_OUTPUT)")
|
||||
cmd.Flags().BoolVar(&strict, "strict", false, "Return non-zero when errors are found")
|
||||
return cmd
|
||||
}
|
||||
|
||||
func ciHealthReportCmd() *cobra.Command {
|
||||
var issueFile string
|
||||
var githubOutput string
|
||||
var strict bool
|
||||
|
||||
cmd := &cobra.Command{
|
||||
Use: "health-report",
|
||||
Short: "Refresh health cache, render report, and emit CI outputs/artifacts",
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
healthErr := runHealth(context.Background())
|
||||
report, reportErr := markdownReportFromCache()
|
||||
|
||||
healthOK := healthErr == nil
|
||||
reportOK := reportErr == nil
|
||||
hasReport := reportOK && strings.TrimSpace(report) != ""
|
||||
hasErrors := !healthOK || !reportOK
|
||||
|
||||
if hasReport && issueFile != "" {
|
||||
body := buildHealthReportIssueBody(report, healthErr)
|
||||
if err := os.WriteFile(issueFile, []byte(body), 0o644); err != nil {
|
||||
return fmt.Errorf("write issue file: %w", err)
|
||||
}
|
||||
}
|
||||
|
||||
if err := writeGitHubOutput(githubOutput, "has_report", strconv.FormatBool(hasReport)); err != nil {
|
||||
return err
|
||||
}
|
||||
if err := writeGitHubOutput(githubOutput, "health_ok", strconv.FormatBool(healthOK)); err != nil {
|
||||
return err
|
||||
}
|
||||
if err := writeGitHubOutput(githubOutput, "report_ok", strconv.FormatBool(reportOK)); err != nil {
|
||||
return err
|
||||
}
|
||||
if err := writeGitHubOutput(githubOutput, "has_errors", strconv.FormatBool(hasErrors)); err != nil {
|
||||
return err
|
||||
}
|
||||
if healthErr != nil {
|
||||
if err := writeGitHubOutput(githubOutput, "health_error", sanitizeOutputValue(healthErr.Error())); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
if reportErr != nil {
|
||||
if err := writeGitHubOutput(githubOutput, "report_error", sanitizeOutputValue(reportErr.Error())); err != nil {
|
||||
return err
|
||||
}
|
||||
}
|
||||
|
||||
if healthErr != nil {
|
||||
fmt.Printf("CI health-report health error: %v\n", healthErr)
|
||||
}
|
||||
if reportErr != nil {
|
||||
fmt.Printf("CI health-report report error: %v\n", reportErr)
|
||||
}
|
||||
if hasReport {
|
||||
fmt.Println("CI health-report generated report artifact")
|
||||
} else {
|
||||
fmt.Println("CI health-report has no report artifact")
|
||||
}
|
||||
|
||||
if strict {
|
||||
if healthErr != nil {
|
||||
return healthErr
|
||||
}
|
||||
if reportErr != nil {
|
||||
return reportErr
|
||||
}
|
||||
}
|
||||
return nil
|
||||
},
|
||||
}
|
||||
|
||||
cmd.Flags().StringVar(&issueFile, "issue-file", "health_report.txt", "Path to write health issue markdown body")
|
||||
cmd.Flags().StringVar(&githubOutput, "github-output", "", "Path to GitHub output file (typically $GITHUB_OUTPUT)")
|
||||
cmd.Flags().BoolVar(&strict, "strict", false, "Return non-zero when health/report fails")
|
||||
return cmd
|
||||
}
|
||||
|
||||
func browseCmd() *cobra.Command {
|
||||
var cachePath string
|
||||
cmd := &cobra.Command{
|
||||
Use: "browse",
|
||||
Short: "Interactive TUI browser for awesome-docker resources",
|
||||
RunE: func(cmd *cobra.Command, args []string) error {
|
||||
hc, err := cache.LoadHealthCache(cachePath)
|
||||
if err != nil {
|
||||
return fmt.Errorf("load cache: %w", err)
|
||||
}
|
||||
if len(hc.Entries) == 0 {
|
||||
return fmt.Errorf("no cache data; run 'awesome-docker health' first")
|
||||
}
|
||||
return tui.Run(hc.Entries)
|
||||
},
|
||||
}
|
||||
cmd.Flags().StringVar(&cachePath, "cache", healthCachePath, "Path to health cache YAML")
|
||||
return cmd
|
||||
}
|
||||
18  config/exclude.yaml  Normal file
@@ -0,0 +1,18 @@
# URLs or URL prefixes to skip during link checking.
# These are known false positives or rate-limited domains.
domains:
  - https://vimeo.com
  - https://travis-ci.org/veggiemonk/awesome-docker.svg
  - https://github.com/apps/
  - https://twitter.com
  - https://www.meetup.com/
  - https://cycle.io/
  - https://www.manning.com/
  - https://deepfence.io
  - https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg
  - https://www.se-radio.net/2017/05/se-radio-episode-290-diogo-monica-on-docker-security
  - https://www.reddit.com/r/docker/
  - https://www.udacity.com/course/scalable-microservices-with-kubernetes--ud615
  - https://www.youtube.com/playlist
  - https://www.aquasec.com
  - https://cloudsmith.com
3010  config/health_cache.yaml  Normal file
File diff suppressed because it is too large
725  config/website.tmpl.html  Normal file
@@ -0,0 +1,725 @@
<!DOCTYPE html>
<html class="no-js" lang="en">
<head>
  <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
  <meta http-equiv="Cache-control" content="public" />
  <meta charset="UTF-8" />
  <title>Awesome-docker</title>
  <meta name="viewport" content="width=device-width, initial-scale=1" />
  <meta name="theme-color" media="(prefers-color-scheme: light)" content="#0B77B7" />
  <meta name="theme-color" media="(prefers-color-scheme: dark)" content="#13344C" />
  <meta name="color-scheme" content="light dark" />
  <meta
    name="description"
    content="A curated list of Docker resources and projects."
  />
  <meta
    name="keywords"
    content="free and open-source open source projects for docker moby kubernetes linux awesome awesome-list container tools dockerfile list moby docker-container docker-image docker-environment docker-deployment docker-swarm docker-api docker-monitoring docker-machine docker-security docker-registry"
  />
  <meta
    name="google-site-verification"
    content="_yiugvz0gCtfsBLyLl1LnkALXb6D4ofiwCyV1XOlYBM"
  />
  <link rel="icon" type="image/png" href="favicon.png" />
  <link rel="preconnect" href="https://fonts.googleapis.com" />
  <link rel="preconnect" href="https://fonts.gstatic.com" crossorigin />
  <link
    href="https://fonts.googleapis.com/css2?family=Manrope:wght@400;500;700;800&family=Sora:wght@600;700;800&display=swap"
    rel="stylesheet"
  />
  <style>
    :root {
      --bg: #f3f8fb;
      --bg-top: #eaf5fb;
      --bg-bottom: #f6fbff;
      --bg-spot-1: rgba(200, 232, 248, 1);
      --bg-spot-2: rgba(213, 240, 255, 1);
      --surface: #ffffff;
      --surface-soft: #f7fbfe;
      --text: #1f2d3d;
      --muted: #4e6279;
      --heading: #103a5c;
      --link: #065f95;
      --link-hover: #044971;
      --border: #dbe7f0;
      --marker: #2f77a8;
      --hr: #d3e4f0;
      --focus-ring: #0a67a5;
      --focus-halo: rgba(10, 103, 165, 0.28);
      --header-grad-start: #0d4d78;
      --header-grad-mid: #0b77b7;
      --header-grad-end: #43a8d8;
      --header-orb-1: rgba(255, 255, 255, 0.3);
      --header-orb-2: rgba(84, 195, 245, 0.5);
      --code-bg: #edf4fa;
      --code-border: #dce7f0;
      --code-text: #1f2d3d;
      --pre-bg: #0e2334;
      --pre-border: #d8e3ed;
      --pre-text: #e2edf5;
      --table-bg: #ffffff;
      --table-header-bg: #f0f7fc;
      --toc-bg: linear-gradient(180deg, #f8fcff 0%, #f3f9fd 100%);
      --toc-title: #214f72;
      --toc-child-border: #c8deed;
      --toc-link: #175886;
      --toc-link-hover: #0f3f61;
      --toc-link-hover-bg: #e6f2fa;
      --toc-link-active: #0d3e61;
      --toc-link-active-bg: #d9ebf8;
      --toc-link-active-ring: #bcd8ec;
      --shadow: 0 22px 50px -34px rgba(17, 57, 88, 0.42);
    }

    @media (prefers-color-scheme: dark) {
      :root {
        --bg: #0e1620;
        --bg-top: #101a25;
        --bg-bottom: #0d141d;
        --bg-spot-1: rgba(23, 47, 68, 0.85);
        --bg-spot-2: rgba(19, 61, 89, 0.62);
        --surface: #121e2a;
        --surface-soft: #162634;
        --text: #d5e4f2;
        --muted: #9fb7ce;
        --heading: #f0f7ff;
        --link: #7cc7f5;
        --link-hover: #a1d8fb;
        --border: #2b4257;
        --marker: #78b6dd;
        --hr: #284154;
        --focus-ring: #84cbf8;
        --focus-halo: rgba(132, 203, 248, 0.32);
        --header-grad-start: #12344c;
        --header-grad-mid: #185c86;
        --header-grad-end: #23759f;
        --header-orb-1: rgba(133, 198, 242, 0.24);
        --header-orb-2: rgba(61, 141, 189, 0.36);
        --code-bg: #1b2b3a;
        --code-border: #2b4257;
        --code-text: #d8e8f7;
        --pre-bg: #0b1622;
        --pre-border: #2b4257;
        --pre-text: #dceaf7;
        --table-bg: #121e2a;
        --table-header-bg: #1a2b3a;
        --toc-bg: linear-gradient(180deg, #162736 0%, #132432 100%);
        --toc-title: #b6d6ee;
        --toc-child-border: #335067;
        --toc-link: #a9d4f2;
        --toc-link-hover: #d7ecfc;
        --toc-link-hover-bg: #20384b;
        --toc-link-active: #e6f5ff;
        --toc-link-active-bg: #294a62;
        --toc-link-active-ring: #3e6482;
        --shadow: 0 28px 60px -36px rgba(0, 0, 0, 0.78);
      }
    }

    * {
      box-sizing: border-box;
    }

    html {
      -ms-text-size-adjust: 100%;
      -webkit-text-size-adjust: 100%;
      scroll-behavior: smooth;
    }

    body {
      margin: 0;
      min-height: 100vh;
      font-family: "Manrope", "Segoe UI", "Helvetica Neue", Arial, sans-serif;
      font-size: 16px;
      line-height: 1.65;
      color: var(--text);
      background:
        radial-gradient(circle at 12% 12%, var(--bg-spot-1) 0, transparent 40%),
        radial-gradient(circle at 85% 2%, var(--bg-spot-2) 0, transparent 32%),
        linear-gradient(180deg, var(--bg-top) 0%, var(--bg) 34%, var(--bg-bottom) 100%);
    }

    a {
      color: var(--link);
      text-decoration: none;
      text-underline-offset: 0.16em;
      transition: color 140ms ease, text-decoration-color 140ms ease;
    }

    a:hover {
      color: var(--link-hover);
      text-decoration: underline;
    }

    :where(a, button, input, select, textarea, summary, [tabindex]):focus-visible {
      outline: 3px solid var(--focus-ring);
      outline-offset: 3px;
      box-shadow: 0 0 0 4px var(--focus-halo);
      border-radius: 7px;
    }

    strong {
      font-weight: 800;
    }

    p {
      margin: 0;
    }

    img {
      border: 0;
      max-width: 100%;
    }

    svg:not(:root) {
      overflow: hidden;
    }

    .btn {
      display: inline-block;
      padding: 0.72rem 1.15rem;
      color: #ffffff;
      font-weight: 700;
      letter-spacing: 0.01em;
      background: rgba(255, 255, 255, 0.18);
      border: 1px solid rgba(255, 255, 255, 0.42);
      border-radius: 10px;
      box-shadow: inset 0 1px 0 rgba(255, 255, 255, 0.28);
    }

    .btn:hover {
      color: #ffffff;
      text-decoration: none;
      background: rgba(255, 255, 255, 0.3);
    }

    .page-header {
      position: relative;
      overflow: hidden;
      text-align: center;
      color: #ffffff;
      background-image: linear-gradient(128deg, var(--header-grad-start) 5%, var(--header-grad-mid) 57%, var(--header-grad-end) 100%);
    }

    .page-header::before,
    .page-header::after {
      content: "";
      position: absolute;
      border-radius: 999px;
      pointer-events: none;
    }

    .page-header::before {
      width: 30rem;
      height: 30rem;
      right: -7rem;
      top: -19rem;
      background: radial-gradient(circle, var(--header-orb-1) 0%, transparent 68%);
    }

    .page-header::after {
      width: 28rem;
      height: 28rem;
      left: -10rem;
      bottom: -20rem;
      background: radial-gradient(circle, var(--header-orb-2) 0%, transparent 70%);
    }

    .page-header > * {
      position: relative;
      z-index: 1;
    }

    .project-name {
      margin: 0 0 0.55rem;
      font-family: "Sora", "Avenir Next", "Segoe UI", Arial, sans-serif;
      font-size: clamp(2rem, 4.4vw, 3.4rem);
      line-height: 1.05;
      letter-spacing: -0.028em;
    }

    .project-tagline {
      margin: 0 auto;
      max-width: 46rem;
      color: rgba(255, 255, 255, 0.92);
      font-size: clamp(1.02rem, 1.8vw, 1.22rem);
      line-height: 1.45;
    }

    .header-actions {
      margin-top: 1.5rem;
      display: flex;
      align-items: center;
      justify-content: center;
      gap: 0.8rem;
      flex-wrap: wrap;
    }

    .site-shell {
      max-width: 76rem;
      margin: -2.2rem auto 0;
      padding: 0 1rem 2.5rem;
    }

    .main-content {
      background: var(--surface);
      border: 1px solid var(--border);
      border-radius: 20px;
      box-shadow: var(--shadow);
      word-wrap: break-word;
      overflow-wrap: anywhere;
      max-width: 72rem;
      margin: 0 auto;
      padding: clamp(1.35rem, 1rem + 1.45vw, 2.6rem) clamp(1rem, 0.55rem + 2.15vw, 2.8rem);
    }

    .main-content > * {
      max-width: 70ch;
    }

    .main-content > :first-child {
      margin-top: 0;
    }

    .main-content h1,
    .main-content h2,
    .main-content h3,
    .main-content h4,
    .main-content h5 {
      font-family: "Sora", "Avenir Next", "Segoe UI", Arial, sans-serif;
      line-height: 1.2;
      letter-spacing: -0.015em;
      color: var(--heading);
      scroll-margin-top: 1.1rem;
      margin: 2rem 0 0.8rem;
    }

    .main-content h1 {
      font-size: clamp(1.6rem, 2.5vw, 2.2rem);
    }

    .main-content h2 {
      font-size: clamp(1.45rem, 2.2vw, 1.92rem);
    }

    .main-content h3 {
      font-size: clamp(1.3rem, 2vw, 1.62rem);
    }

    .main-content h4,
    .main-content h5 {
      font-size: clamp(1.15rem, 1.85vw, 1.34rem);
    }

    .main-content p,
    .main-content li {
      color: var(--text);
      line-height: 1.76;
    }

    .main-content p + p {
      margin-top: 0.95rem;
    }

    .main-content ul,
    .main-content ol {
      padding-left: 1.4rem;
      margin: 0.62rem 0 1.2rem;
    }

    .main-content li + li {
      margin-top: 0.34rem;
    }

    .main-content ul li::marker {
      color: var(--marker);
    }

    .main-content hr {
      max-width: 100%;
      border: 0;
      height: 1px;
      margin: 2.2rem 0;
      background: linear-gradient(90deg, transparent, var(--hr) 18%, var(--hr) 82%, transparent);
    }

    .main-content blockquote {
      margin: 1.3rem 0;
      padding: 0.9rem 1.15rem;
      border-left: 4px solid #68b0da;
      border-radius: 0 12px 12px 0;
      background: var(--surface-soft);
      color: var(--muted);
    }

    .main-content blockquote > :first-child {
      margin-top: 0;
    }

    .main-content blockquote > :last-child {
      margin-bottom: 0;
    }

    .main-content code {
      font-family: "SFMono-Regular", Menlo, Consolas, "Liberation Mono", monospace;
      font-size: 0.88em;
      background: var(--code-bg);
      border: 1px solid var(--code-border);
      border-radius: 6px;
      padding: 0.08em 0.38em;
      color: var(--code-text);
    }

    .main-content pre {
      max-width: 100%;
      overflow: auto;
      border: 1px solid var(--pre-border);
      border-radius: 12px;
      background: var(--pre-bg);
      box-shadow: inset 0 1px 0 rgba(255, 255, 255, 0.06);
      padding: 0.9rem 1rem;
    }

    .main-content pre code {
      padding: 0;
      border: 0;
      background: transparent;
      color: var(--pre-text);
    }

    .main-content table {
      width: 100%;
      max-width: 100%;
      border-collapse: collapse;
      border: 1px solid var(--border);
      border-radius: 12px;
      overflow: hidden;
      background: var(--table-bg);
      margin: 1.2rem 0 1.6rem;
    }

    .main-content th,
    .main-content td {
      border: 1px solid var(--border);
      padding: 0.52rem 0.68rem;
      text-align: left;
    }

    .main-content th {
      background: var(--table-header-bg);
      color: var(--heading);
      font-weight: 700;
    }

    .main-content > ul:first-of-type {
      list-style: none;
      max-width: 100%;
      padding: 0.95rem 1.05rem 1rem;
      margin: 1rem 0 1.8rem;
      border: 1px solid var(--border);
      border-radius: 14px;
      background: var(--toc-bg);
    }

    .main-content > ul:first-of-type::before {
      content: "Contents";
      display: block;
      margin-bottom: 0.65rem;
      font-family: "Sora", "Avenir Next", "Segoe UI", Arial, sans-serif;
      font-size: 0.9rem;
      font-weight: 700;
      letter-spacing: 0.01em;
      text-transform: uppercase;
      color: var(--toc-title);
    }

    .main-content > ul:first-of-type li {
      margin: 0.22rem 0;
    }

    .main-content > ul:first-of-type ul {
      margin: 0.25rem 0 0.55rem 0.48rem;
      padding-left: 0.58rem;
      border-left: 1px solid var(--toc-child-border);
    }

    .main-content > ul:first-of-type a {
      display: block;
      padding: 0.2rem 0.46rem;
      border-radius: 8px;
      font-weight: 700;
      color: var(--toc-link);
      transition: color 170ms ease, background-color 180ms ease, box-shadow 180ms ease, transform 220ms cubic-bezier(0.2, 0.8, 0.2, 1);
    }

    .main-content > ul:first-of-type a:hover {
      color: var(--toc-link-hover);
      text-decoration: none;
      background: var(--toc-link-hover-bg);
      transform: translateX(2px);
    }

    .main-content > ul:first-of-type a.is-active {
      color: var(--toc-link-active);
      background: var(--toc-link-active-bg);
      box-shadow: inset 0 0 0 1px var(--toc-link-active-ring);
      transform: translateX(3px);
    }

    .main-content > ul:first-of-type li ul a {
      font-weight: 600;
      font-size: 0.92rem;
    }

    .main-content img {
      word-wrap: break-word;
      height: auto;
    }

    @keyframes hero-fade-in {
      from {
        opacity: 0;
        transform: translateY(12px);
      }
      to {
        opacity: 1;
        transform: translateY(0);
      }
    }

    @keyframes panel-rise-in {
      from {
        opacity: 0;
        transform: translateY(14px);
      }
      to {
        opacity: 1;
        transform: translateY(0);
      }
    }

    .page-header > * {
      animation: hero-fade-in 540ms cubic-bezier(0.21, 0.76, 0.26, 1) both;
    }

    .project-tagline {
      animation-delay: 90ms;
    }

    .header-actions {
      animation-delay: 170ms;
    }

    .main-content {
      animation: panel-rise-in 620ms cubic-bezier(0.18, 0.74, 0.24, 1) 130ms both;
    }

    @media screen and (min-width: 54em) {
      .page-header {
        padding: 4.65rem 2rem 5.05rem;
      }

      .main-content {
        display: grid;
        grid-template-columns: minmax(15rem, 19rem) minmax(0, 1fr);
        column-gap: 2rem;
        row-gap: 0;
        align-items: start;
      }

      .main-content > * {
        grid-column: 2;
        width: min(100%, 70ch);
        max-width: 100%;
      }

      .main-content > ul:first-of-type {
        grid-column: 1;
        grid-row: 1 / span 999;
        width: 100%;
        margin: 0;
        position: sticky;
        top: 1rem;
        max-height: calc(100vh - 2rem);
        overflow: auto;
      }

      .main-content > ul:first-of-type + * {
        margin-top: 0;
      }

      .main-content > table,
      .main-content > pre {
        width: 100%;
        max-width: 100%;
      }
    }

    @media screen and (max-width: 54em) {
      .page-header {
        padding: 3.1rem 1rem 4.2rem;
      }

      .site-shell {
        margin-top: -1.7rem;
      }

      .main-content > ul:first-of-type {
        max-height: none;
        position: static;
      }

      .header-actions {
        gap: 0.65rem;
      }
    }

    @media screen and (max-width: 34em) {
      .btn {
        width: 100%;
      }

      .main-content {
        border-radius: 14px;
        padding-top: 1.1rem;
      }
    }

    @media (prefers-reduced-motion: reduce) {
      * {
        scroll-behavior: auto;
        transition: none !important;
        animation: none !important;
      }
    }
  </style>
</head>

<body>
  <section class="page-header">
    <h1 class="project-name">Awesome-docker</h1>
    <p class="project-tagline">
      A curated list of Docker resources and projects
    </p>
    <div class="header-actions">
      <a href="https://github.com/veggiemonk/awesome-docker" class="btn"
        >View on GitHub</a
      >
      <a
        class="github-button"
        href="https://github.com/veggiemonk/awesome-docker#readme"
        data-icon="octicon-star"
        data-size="large"
        data-count-href="/veggiemonk/awesome-docker/stargazers"
        data-show-count="true"
        data-count-aria-label="# stargazers on GitHub"
        aria-label="Star veggiemonk/awesome-docker on GitHub"
        >Star</a
      >
    </div>
  </section>
  <main class="site-shell">
    <section id="md" class="main-content"></section>
  </main>
  <script async defer src="https://buttons.github.io/buttons.js"></script>
  <script>
    (function () {
      var toc = document.querySelector("#md > ul:first-of-type");
      if (!toc) {
        return;
      }

      var links = Array.prototype.slice.call(toc.querySelectorAll('a[href^="#"]'));
      if (links.length === 0) {
        return;
      }

      var linkById = {};
      links.forEach(function (link) {
        var href = link.getAttribute("href");
        if (!href || href.length < 2) {
          return;
        }
        var id = href.slice(1);
        try {
          id = decodeURIComponent(id);
        } catch (_) {}
        if (id) {
          linkById[id] = link;
        }
      });

      var headings = Array.prototype.filter.call(
        document.querySelectorAll("#md h1[id], #md h2[id], #md h3[id], #md h4[id], #md h5[id]"),
        function (heading) {
          return Boolean(linkById[heading.id]);
        }
      );
      if (headings.length === 0) {
        return;
      }

      function setActive(id) {
        if (!id || !linkById[id]) {
          return;
        }
        links.forEach(function (link) {
          link.classList.remove("is-active");
          link.removeAttribute("aria-current");
        });
        linkById[id].classList.add("is-active");
        linkById[id].setAttribute("aria-current", "true");
      }

      function setActiveFromHash() {
        if (!window.location.hash || window.location.hash.length < 2) {
          return false;
        }
        var id = window.location.hash.slice(1);
        try {
          id = decodeURIComponent(id);
        } catch (_) {}
        if (linkById[id]) {
          setActive(id);
          return true;
        }
        return false;
      }

      function setActiveFromViewport() {
        var current = headings[0];
        for (var i = 0; i < headings.length; i += 1) {
          if (headings[i].getBoundingClientRect().top <= 150) {
            current = headings[i];
          } else {
            break;
          }
        }
        setActive(current.id);
      }

      var ticking = false;
      window.addEventListener(
        "scroll",
        function () {
          if (ticking) {
            return;
          }
          ticking = true;
          window.requestAnimationFrame(function () {
            setActiveFromViewport();
            ticking = false;
          });
        },
        { passive: true }
      );

      window.addEventListener("hashchange", setActiveFromHash);
      if (!setActiveFromHash()) {
        setActiveFromViewport();
      }
    })();
  </script>
</body>
</html>
34  go.mod  Normal file
@@ -0,0 +1,34 @@
module github.com/veggiemonk/awesome-docker

go 1.26.0

require (
    charm.land/bubbletea/v2 v2.0.2
    charm.land/lipgloss/v2 v2.0.2
    github.com/shurcooL/githubv4 v0.0.0-20260209031235-2402fdf4a9ed
    github.com/spf13/cobra v1.10.2
    github.com/yuin/goldmark v1.7.16
    golang.org/x/oauth2 v0.36.0
    gopkg.in/yaml.v3 v3.0.1
)

require (
    github.com/charmbracelet/colorprofile v0.4.3 // indirect
    github.com/charmbracelet/ultraviolet v0.0.0-20260309091805-903bfd0cf188 // indirect
    github.com/charmbracelet/x/ansi v0.11.6 // indirect
    github.com/charmbracelet/x/term v0.2.2 // indirect
    github.com/charmbracelet/x/termios v0.1.1 // indirect
    github.com/charmbracelet/x/windows v0.2.2 // indirect
    github.com/clipperhouse/displaywidth v0.11.0 // indirect
    github.com/clipperhouse/uax29/v2 v2.7.0 // indirect
    github.com/inconshreveable/mousetrap v1.1.0 // indirect
    github.com/lucasb-eyer/go-colorful v1.3.0 // indirect
    github.com/mattn/go-runewidth v0.0.21 // indirect
    github.com/muesli/cancelreader v0.2.2 // indirect
    github.com/rivo/uniseg v0.4.7 // indirect
    github.com/shurcooL/graphql v0.0.0-20240915155400-7ee5256398cf // indirect
    github.com/spf13/pflag v1.0.10 // indirect
    github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e // indirect
    golang.org/x/sync v0.20.0 // indirect
    golang.org/x/sys v0.42.0 // indirect
)
79  go.sum  Normal file
@@ -0,0 +1,79 @@
charm.land/bubbletea/v2 v2.0.1 h1:B8e9zzK7x9JJ+XvHGF4xnYu9Xa0E0y0MyggY6dbaCfQ=
charm.land/bubbletea/v2 v2.0.1/go.mod h1:3LRff2U4WIYXy7MTxfbAQ+AdfM3D8Xuvz2wbsOD9OHQ=
charm.land/bubbletea/v2 v2.0.2 h1:4CRtRnuZOdFDTWSff9r8QFt/9+z6Emubz3aDMnf/dx0=
charm.land/bubbletea/v2 v2.0.2/go.mod h1:3LRff2U4WIYXy7MTxfbAQ+AdfM3D8Xuvz2wbsOD9OHQ=
charm.land/lipgloss/v2 v2.0.0 h1:sd8N/B3x892oiOjFfBQdXBQp3cAkvjGaU5TvVZC3ivo=
charm.land/lipgloss/v2 v2.0.0/go.mod h1:w6SnmsBFBmEFBodiEDurGS/sdUY/u1+v72DqUzc6J14=
charm.land/lipgloss/v2 v2.0.2 h1:xFolbF8JdpNkM2cEPTfXEcW1p6NRzOWTSamRfYEw8cs=
charm.land/lipgloss/v2 v2.0.2/go.mod h1:KjPle2Qd3YmvP1KL5OMHiHysGcNwq6u83MUjYkFvEkM=
github.com/aymanbagabas/go-udiff v0.4.0 h1:TKnLPh7IbnizJIBKFWa9mKayRUBQ9Kh1BPCk6w2PnYM=
github.com/aymanbagabas/go-udiff v0.4.0/go.mod h1:0L9PGwj20lrtmEMeyw4WKJ/TMyDtvAoK9bf2u/mNo3w=
github.com/aymanbagabas/go-udiff v0.4.1 h1:OEIrQ8maEeDBXQDoGCbbTTXYJMYRCRO1fnodZ12Gv5o=
github.com/charmbracelet/colorprofile v0.4.2 h1:BdSNuMjRbotnxHSfxy+PCSa4xAmz7szw70ktAtWRYrY=
github.com/charmbracelet/colorprofile v0.4.2/go.mod h1:0rTi81QpwDElInthtrQ6Ni7cG0sDtwAd4C4le060fT8=
github.com/charmbracelet/colorprofile v0.4.3 h1:QPa1IWkYI+AOB+fE+mg/5/4HRMZcaXex9t5KX76i20Q=
github.com/charmbracelet/colorprofile v0.4.3/go.mod h1:/zT4BhpD5aGFpqQQqw7a+VtHCzu+zrQtt1zhMt9mR4Q=
github.com/charmbracelet/ultraviolet v0.0.0-20260205113103-524a6607adb8 h1:eyFRbAmexyt43hVfeyBofiGSEmJ7krjLOYt/9CF5NKA=
github.com/charmbracelet/ultraviolet v0.0.0-20260205113103-524a6607adb8/go.mod h1:SQpCTRNBtzJkwku5ye4S3HEuthAlGy2n9VXZnWkEW98=
github.com/charmbracelet/ultraviolet v0.0.0-20260309091805-903bfd0cf188 h1:J8v4kWJYCaxv1SLhLunN74S+jMteZ1f7Dae99ioq4Bo=
github.com/charmbracelet/ultraviolet v0.0.0-20260309091805-903bfd0cf188/go.mod h1:FzWNAbe1jEmI+GZljSnlaSA8wJjnNIZhWBLkTsAl6eg=
github.com/charmbracelet/x/ansi v0.11.6 h1:GhV21SiDz/45W9AnV2R61xZMRri5NlLnl6CVF7ihZW8=
github.com/charmbracelet/x/ansi v0.11.6/go.mod h1:2JNYLgQUsyqaiLovhU2Rv/pb8r6ydXKS3NIttu3VGZQ=
github.com/charmbracelet/x/exp/golden v0.0.0-20250806222409-83e3a29d542f h1:pk6gmGpCE7F3FcjaOEKYriCvpmIN4+6OS/RD0vm4uIA=
github.com/charmbracelet/x/exp/golden v0.0.0-20250806222409-83e3a29d542f/go.mod h1:IfZAMTHB6XkZSeXUqriemErjAWCCzT0LwjKFYCZyw0I=
github.com/charmbracelet/x/term v0.2.2 h1:xVRT/S2ZcKdhhOuSP4t5cLi5o+JxklsoEObBSgfgZRk=
github.com/charmbracelet/x/term v0.2.2/go.mod h1:kF8CY5RddLWrsgVwpw4kAa6TESp6EB5y3uxGLeCqzAI=
github.com/charmbracelet/x/termios v0.1.1 h1:o3Q2bT8eqzGnGPOYheoYS8eEleT5ZVNYNy8JawjaNZY=
github.com/charmbracelet/x/termios v0.1.1/go.mod h1:rB7fnv1TgOPOyyKRJ9o+AsTU/vK5WHJ2ivHeut/Pcwo=
github.com/charmbracelet/x/windows v0.2.2 h1:IofanmuvaxnKHuV04sC0eBy/smG6kIKrWG2/jYn2GuM=
github.com/charmbracelet/x/windows v0.2.2/go.mod h1:/8XtdKZzedat74NQFn0NGlGL4soHB0YQZrETF96h75k=
github.com/clipperhouse/displaywidth v0.11.0 h1:lBc6kY44VFw+TDx4I8opi/EtL9m20WSEFgwIwO+UVM8=
github.com/clipperhouse/displaywidth v0.11.0/go.mod h1:bkrFNkf81G8HyVqmKGxsPufD3JhNl3dSqnGhOoSD/o0=
github.com/clipperhouse/uax29/v2 v2.7.0 h1:+gs4oBZ2gPfVrKPthwbMzWZDaAFPGYK72F0NJv2v7Vk=
github.com/clipperhouse/uax29/v2 v2.7.0/go.mod h1:EFJ2TJMRUaplDxHKj1qAEhCtQPW2tJSwu5BF98AuoVM=
github.com/cpuguy83/go-md2man/v2 v2.0.6/go.mod h1:oOW0eioCTA6cOiMLiUPZOpcVxMig6NIQQ7OS05n1F4g=
|
||||
github.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2s0bqwp9tc8=
|
||||
github.com/inconshreveable/mousetrap v1.1.0/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw=
|
||||
github.com/lucasb-eyer/go-colorful v1.3.0 h1:2/yBRLdWBZKrf7gB40FoiKfAWYQ0lqNcbuQwVHXptag=
|
||||
github.com/lucasb-eyer/go-colorful v1.3.0/go.mod h1:R4dSotOR9KMtayYi1e77YzuveK+i7ruzyGqttikkLy0=
|
||||
github.com/mattn/go-runewidth v0.0.19 h1:v++JhqYnZuu5jSKrk9RbgF5v4CGUjqRfBm05byFGLdw=
|
||||
github.com/mattn/go-runewidth v0.0.19/go.mod h1:XBkDxAl56ILZc9knddidhrOlY5R/pDhgLpndooCuJAs=
|
||||
github.com/mattn/go-runewidth v0.0.21 h1:jJKAZiQH+2mIinzCJIaIG9Be1+0NR+5sz/lYEEjdM8w=
|
||||
github.com/mattn/go-runewidth v0.0.21/go.mod h1:XBkDxAl56ILZc9knddidhrOlY5R/pDhgLpndooCuJAs=
|
||||
github.com/muesli/cancelreader v0.2.2 h1:3I4Kt4BQjOR54NavqnDogx/MIoWBFa0StPA8ELUXHmA=
|
||||
github.com/muesli/cancelreader v0.2.2/go.mod h1:3XuTXfFS2VjM+HTLZY9Ak0l6eUKfijIfMUZ4EgX0QYo=
|
||||
github.com/rivo/uniseg v0.4.7 h1:WUdvkW8uEhrYfLC4ZzdpI2ztxP1I582+49Oc5Mq64VQ=
|
||||
github.com/rivo/uniseg v0.4.7/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=
|
||||
github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
|
||||
github.com/shurcooL/githubv4 v0.0.0-20260209031235-2402fdf4a9ed h1:KT7hI8vYXgU0s2qaMkrfq9tCA1w/iEPgfredVP+4Tzw=
|
||||
github.com/shurcooL/githubv4 v0.0.0-20260209031235-2402fdf4a9ed/go.mod h1:zqMwyHmnN/eDOZOdiTohqIUKUrTFX62PNlu7IJdu0q8=
|
||||
github.com/shurcooL/graphql v0.0.0-20240915155400-7ee5256398cf h1:o1uxfymjZ7jZ4MsgCErcwWGtVKSiNAXtS59Lhs6uI/g=
|
||||
github.com/shurcooL/graphql v0.0.0-20240915155400-7ee5256398cf/go.mod h1:9dIRpgIY7hVhoqfe0/FcYp0bpInZaT7dc3BYOprrIUE=
|
||||
github.com/spf13/cobra v1.10.2 h1:DMTTonx5m65Ic0GOoRY2c16WCbHxOOw6xxezuLaBpcU=
|
||||
github.com/spf13/cobra v1.10.2/go.mod h1:7C1pvHqHw5A4vrJfjNwvOdzYu0Gml16OCs2GRiTUUS4=
|
||||
github.com/spf13/pflag v1.0.9/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
|
||||
github.com/spf13/pflag v1.0.10 h1:4EBh2KAYBwaONj6b2Ye1GiHfwjqyROoF4RwYO+vPwFk=
|
||||
github.com/spf13/pflag v1.0.10/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
|
||||
github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e h1:JVG44RsyaB9T2KIHavMF/ppJZNG9ZpyihvCd0w101no=
|
||||
github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e/go.mod h1:RbqR21r5mrJuqunuUZ/Dhy/avygyECGrLceyNeo4LiM=
|
||||
github.com/yuin/goldmark v1.7.16 h1:n+CJdUxaFMiDUNnWC3dMWCIQJSkxH4uz3ZwQBkAlVNE=
|
||||
github.com/yuin/goldmark v1.7.16/go.mod h1:ip/1k0VRfGynBgxOz0yCqHrbZXhcjxyuS66Brc7iBKg=
|
||||
go.yaml.in/yaml/v3 v3.0.4/go.mod h1:DhzuOOF2ATzADvBadXxruRBLzYTpT36CKvDb3+aBEFg=
|
||||
golang.org/x/exp v0.0.0-20231006140011-7918f672742d h1:jtJma62tbqLibJ5sFQz8bKtEM8rJBtfilJ2qTU199MI=
|
||||
golang.org/x/exp v0.0.0-20231006140011-7918f672742d/go.mod h1:ldy0pHrwJyGW56pPQzzkH36rKxoZW1tw7ZJpeKx+hdo=
|
||||
golang.org/x/oauth2 v0.35.0 h1:Mv2mzuHuZuY2+bkyWXIHMfhNdJAdwW3FuWeCPYN5GVQ=
|
||||
golang.org/x/oauth2 v0.35.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
|
||||
golang.org/x/oauth2 v0.36.0 h1:peZ/1z27fi9hUOFCAZaHyrpWG5lwe0RJEEEeH0ThlIs=
|
||||
golang.org/x/oauth2 v0.36.0/go.mod h1:YDBUJMTkDnJS+A4BP4eZBjCqtokkg1hODuPjwiGPO7Q=
|
||||
golang.org/x/sync v0.19.0 h1:vV+1eWNmZ5geRlYjzm2adRgW2/mcpevXNg50YZtPCE4=
|
||||
golang.org/x/sync v0.19.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
|
||||
golang.org/x/sync v0.20.0 h1:e0PTpb7pjO8GAtTs2dQ6jYa5BWYlMuX047Dco/pItO4=
|
||||
golang.org/x/sync v0.20.0/go.mod h1:9xrNwdLfx4jkKbNva9FpL6vEN7evnE43NNNJQ2LF3+0=
|
||||
golang.org/x/sys v0.41.0 h1:Ivj+2Cp/ylzLiEU89QhWblYnOE9zerudt9Ftecq2C6k=
|
||||
golang.org/x/sys v0.41.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
|
||||
golang.org/x/sys v0.42.0 h1:omrd2nAlyT5ESRdCLYdm3+fMfNFE/+Rf4bDIQImRJeo=
|
||||
golang.org/x/sys v0.42.0/go.mod h1:4GL1E5IUh+htKOUEOaiffhrAeqysfVGipDYzABqnCmw=
|
||||
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405 h1:yhCVgyC4o1eVCa2tZl7eS0r+SDo693bJlVdllGtEeKM=
|
||||
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
|
||||
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
|
||||
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
|
||||
@@ -4,10 +4,10 @@
 <meta charset="UTF-8">
 <meta name="viewport" content="width=device-width, initial-scale=1.0">
 <meta http-equiv="X-UA-Compatible" content="ie=edge">
-<meta HTTP-EQUIV="REFRESH" content="0; url=https://awesome-docker.netlify.com">
+<meta http-equiv="refresh" content="0; url=website/">
 <title>Awesome-docker</title>
 </head>
 <body>
-<p> <a href="https://awesome-docker.netlify.com/">We moved to a new place, click here to be redirected.</a></p>
+<p><a href="website/">Redirecting to the generated site.</a></p>
 </body>
 </html>
71
internal/builder/builder.go
Normal file
@@ -0,0 +1,71 @@
package builder

import (
	"bytes"
	"fmt"
	"os"
	"strings"

	"github.com/yuin/goldmark"
	"github.com/yuin/goldmark/extension"
	"github.com/yuin/goldmark/parser"
	"github.com/yuin/goldmark/renderer/html"
)

// Build converts a Markdown file to HTML using a template.
// The template must contain a placeholder element that will be replaced with the rendered content.
func Build(markdownPath, templatePath, outputPath string) error {
	md, err := os.ReadFile(markdownPath)
	if err != nil {
		return fmt.Errorf("read markdown: %w", err)
	}

	tmpl, err := os.ReadFile(templatePath)
	if err != nil {
		return fmt.Errorf("read template: %w", err)
	}

	// Convert markdown to HTML.
	gm := goldmark.New(
		goldmark.WithExtensions(extension.GFM),
		goldmark.WithParserOptions(parser.WithAutoHeadingID()),
		goldmark.WithRendererOptions(html.WithUnsafe()),
	)
	var buf bytes.Buffer
	if err := gm.Convert(md, &buf); err != nil {
		return fmt.Errorf("convert markdown: %w", err)
	}

	// Inject into template; support both placeholder formats.
	output := string(tmpl)
	replacements := []struct {
		old string
		new string
	}{
		{
			old: `<div id="md"></div>`,
			new: `<div id="md">` + buf.String() + `</div>`,
		},
		{
			old: `<section id="md" class="main-content"></section>`,
			new: `<section id="md" class="main-content">` + buf.String() + `</section>`,
		},
	}

	replaced := false
	for _, r := range replacements {
		if strings.Contains(output, r.old) {
			output = strings.Replace(output, r.old, r.new, 1)
			replaced = true
			break
		}
	}
	if !replaced {
		return fmt.Errorf("template missing supported markdown placeholder")
	}

	if err := os.WriteFile(outputPath, []byte(output), 0o644); err != nil {
		return fmt.Errorf("write output: %w", err)
	}
	return nil
}
172
internal/builder/builder_test.go
Normal file
@@ -0,0 +1,172 @@
package builder

import (
	"os"
	"path/filepath"
	"strings"
	"testing"
)

func TestBuild(t *testing.T) {
	dir := t.TempDir()

	md := "# Test List\n\n- [Example](https://example.com) - A test entry.\n"
	mdPath := filepath.Join(dir, "README.md")
	if err := os.WriteFile(mdPath, []byte(md), 0o644); err != nil {
		t.Fatal(err)
	}

	tmpl := `<!DOCTYPE html>
<html>
<body>
<div id="md"></div>
</body>
</html>`
	tmplPath := filepath.Join(dir, "template.html")
	if err := os.WriteFile(tmplPath, []byte(tmpl), 0o644); err != nil {
		t.Fatal(err)
	}

	outPath := filepath.Join(dir, "index.html")
	if err := Build(mdPath, tmplPath, outPath); err != nil {
		t.Fatalf("Build failed: %v", err)
	}

	content, err := os.ReadFile(outPath)
	if err != nil {
		t.Fatal(err)
	}

	html := string(content)
	if !strings.Contains(html, "Test List") {
		t.Error("expected 'Test List' in output")
	}
	if !strings.Contains(html, "https://example.com") {
		t.Error("expected link in output")
	}
}

func TestBuildWithSectionPlaceholder(t *testing.T) {
	dir := t.TempDir()

	md := "# Hello\n\nWorld.\n"
	mdPath := filepath.Join(dir, "README.md")
	if err := os.WriteFile(mdPath, []byte(md), 0o644); err != nil {
		t.Fatal(err)
	}

	// This matches the actual template format.
	tmpl := `<!DOCTYPE html>
<html>
<body>
<section id="md" class="main-content"></section>
</body>
</html>`
	tmplPath := filepath.Join(dir, "template.html")
	if err := os.WriteFile(tmplPath, []byte(tmpl), 0o644); err != nil {
		t.Fatal(err)
	}

	outPath := filepath.Join(dir, "index.html")
	if err := Build(mdPath, tmplPath, outPath); err != nil {
		t.Fatalf("Build failed: %v", err)
	}

	content, err := os.ReadFile(outPath)
	if err != nil {
		t.Fatal(err)
	}

	if !strings.Contains(string(content), "Hello") {
		t.Error("expected 'Hello' in output")
	}
	if !strings.Contains(string(content), `class="main-content"`) {
		t.Error("expected section class preserved")
	}
}

func TestBuildRealREADME(t *testing.T) {
	mdPath := "../../README.md"
	tmplPath := "../../config/website.tmpl.html"
	if _, err := os.Stat(mdPath); err != nil {
		t.Skip("README.md not found")
	}
	if _, err := os.Stat(tmplPath); err != nil {
		t.Skip("website template not found")
	}

	dir := t.TempDir()
	outPath := filepath.Join(dir, "index.html")

	if err := Build(mdPath, tmplPath, outPath); err != nil {
		t.Fatalf("Build failed: %v", err)
	}

	info, err := os.Stat(outPath)
	if err != nil {
		t.Fatal(err)
	}
	if info.Size() < 10000 {
		t.Errorf("output too small: %d bytes", info.Size())
	}
	t.Logf("Generated %d bytes", info.Size())
}

func TestBuildFailsWithoutPlaceholder(t *testing.T) {
	dir := t.TempDir()

	mdPath := filepath.Join(dir, "README.md")
	if err := os.WriteFile(mdPath, []byte("# Title\n"), 0o644); err != nil {
		t.Fatal(err)
	}

	tmplPath := filepath.Join(dir, "template.html")
	if err := os.WriteFile(tmplPath, []byte("<html><body><main></main></body></html>"), 0o644); err != nil {
		t.Fatal(err)
	}

	outPath := filepath.Join(dir, "index.html")
	err := Build(mdPath, tmplPath, outPath)
	if err == nil {
		t.Fatal("expected Build to fail when template has no supported placeholder")
	}
}

func TestBuildAddsHeadingIDs(t *testing.T) {
	dir := t.TempDir()

	md := "# Getting Started\n\n## Next Step\n"
	mdPath := filepath.Join(dir, "README.md")
	if err := os.WriteFile(mdPath, []byte(md), 0o644); err != nil {
		t.Fatal(err)
	}

	tmpl := `<!DOCTYPE html>
<html>
<body>
<section id="md" class="main-content"></section>
</body>
</html>`
	tmplPath := filepath.Join(dir, "template.html")
	if err := os.WriteFile(tmplPath, []byte(tmpl), 0o644); err != nil {
		t.Fatal(err)
	}

	outPath := filepath.Join(dir, "index.html")
	if err := Build(mdPath, tmplPath, outPath); err != nil {
		t.Fatalf("Build failed: %v", err)
	}

	content, err := os.ReadFile(outPath)
	if err != nil {
		t.Fatal(err)
	}

	html := string(content)
	if !strings.Contains(html, `id="getting-started"`) {
		t.Error("expected auto-generated heading id for h1")
	}
	if !strings.Contains(html, `id="next-step"`) {
		t.Error("expected auto-generated heading id for h2")
	}
}
98
internal/cache/cache.go
vendored
Normal file
@@ -0,0 +1,98 @@
package cache

import (
	"os"
	"strings"
	"time"

	"gopkg.in/yaml.v3"
)

// ExcludeList holds URL prefixes to skip during checking.
type ExcludeList struct {
	Domains []string `yaml:"domains"`
}

// IsExcluded returns true if the URL starts with any excluded prefix.
func (e *ExcludeList) IsExcluded(url string) bool {
	for _, d := range e.Domains {
		if strings.HasPrefix(url, d) {
			return true
		}
	}
	return false
}

// LoadExcludeList reads an exclude.yaml file.
func LoadExcludeList(path string) (*ExcludeList, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, err
	}
	var excl ExcludeList
	if err := yaml.Unmarshal(data, &excl); err != nil {
		return nil, err
	}
	return &excl, nil
}

// HealthEntry stores metadata about a single entry.
type HealthEntry struct {
	URL         string    `yaml:"url"`
	Name        string    `yaml:"name"`
	Status      string    `yaml:"status"` // healthy, inactive, stale, archived, dead
	Stars       int       `yaml:"stars,omitempty"`
	Forks       int       `yaml:"forks,omitempty"`
	LastPush    time.Time `yaml:"last_push,omitempty"`
	HasLicense  bool      `yaml:"has_license,omitempty"`
	HasReadme   bool      `yaml:"has_readme,omitempty"`
	CheckedAt   time.Time `yaml:"checked_at"`
	Category    string    `yaml:"category,omitempty"`
	Description string    `yaml:"description,omitempty"`
}

// HealthCache is the full YAML cache file.
type HealthCache struct {
	Entries []HealthEntry `yaml:"entries"`
}

// LoadHealthCache reads a health_cache.yaml file. Returns an empty cache if the file doesn't exist.
func LoadHealthCache(path string) (*HealthCache, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		if os.IsNotExist(err) {
			return &HealthCache{}, nil
		}
		return nil, err
	}
	var hc HealthCache
	if err := yaml.Unmarshal(data, &hc); err != nil {
		return nil, err
	}
	return &hc, nil
}

// SaveHealthCache writes the cache to a YAML file.
func SaveHealthCache(path string, hc *HealthCache) error {
	data, err := yaml.Marshal(hc)
	if err != nil {
		return err
	}
	return os.WriteFile(path, data, 0o644)
}

// Merge updates the cache with new entries, replacing existing ones by URL.
func (hc *HealthCache) Merge(entries []HealthEntry) {
	index := make(map[string]int)
	for i, e := range hc.Entries {
		index[e.URL] = i
	}
	for _, e := range entries {
		if i, exists := index[e.URL]; exists {
			hc.Entries[i] = e
		} else {
			index[e.URL] = len(hc.Entries)
			hc.Entries = append(hc.Entries, e)
		}
	}
}
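Merge's replace-by-URL policy (update in place, append new, keep order, and deduplicate within the incoming batch) can be sketched standalone. `entry` and `mergeByURL` are hypothetical names for illustration, assuming the same map-index approach as HealthCache.Merge:

```go
package main

import "fmt"

// entry is a stripped-down stand-in for HealthEntry (illustrative only).
type entry struct {
	URL   string
	Stars int
}

// mergeByURL updates existing entries in place by URL and appends new ones,
// preserving order. Registering appended URLs in the index also deduplicates
// repeats within the incoming batch (last one wins).
func mergeByURL(existing, incoming []entry) []entry {
	index := make(map[string]int, len(existing))
	for i, e := range existing {
		index[e.URL] = i
	}
	for _, e := range incoming {
		if i, ok := index[e.URL]; ok {
			existing[i] = e // replace in place, keep position
		} else {
			index[e.URL] = len(existing)
			existing = append(existing, e)
		}
	}
	return existing
}

func main() {
	merged := mergeByURL(
		[]entry{{"a", 10}, {"b", 20}},
		[]entry{{"b", 25}, {"c", 30}},
	)
	fmt.Println(merged)
}
```

Keeping a URL-to-index map rather than re-scanning the slice makes the merge O(n+m) and is what makes in-batch deduplication fall out for free.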
137
internal/cache/cache_test.go
vendored
Normal file
@@ -0,0 +1,137 @@
package cache

import (
	"os"
	"path/filepath"
	"testing"
	"time"
)

func TestLoadExcludeList(t *testing.T) {
	dir := t.TempDir()
	path := filepath.Join(dir, "exclude.yaml")
	content := `domains:
  - https://example.com
  - https://test.org
`
	if err := os.WriteFile(path, []byte(content), 0o644); err != nil {
		t.Fatal(err)
	}

	excl, err := LoadExcludeList(path)
	if err != nil {
		t.Fatal(err)
	}
	if len(excl.Domains) != 2 {
		t.Errorf("domains count = %d, want 2", len(excl.Domains))
	}
	if !excl.IsExcluded("https://example.com/foo") {
		t.Error("expected https://example.com/foo to be excluded")
	}
	if excl.IsExcluded("https://other.com") {
		t.Error("expected https://other.com to NOT be excluded")
	}
}

func TestHealthCacheRoundTrip(t *testing.T) {
	dir := t.TempDir()
	path := filepath.Join(dir, "health.yaml")

	original := &HealthCache{
		Entries: []HealthEntry{
			{
				URL:        "https://github.com/example/repo",
				Name:       "Example",
				Status:     "healthy",
				Stars:      42,
				LastPush:   time.Date(2026, 1, 15, 0, 0, 0, 0, time.UTC),
				HasLicense: true,
				HasReadme:  true,
				CheckedAt:  time.Date(2026, 2, 27, 9, 0, 0, 0, time.UTC),
			},
		},
	}

	if err := SaveHealthCache(path, original); err != nil {
		t.Fatal(err)
	}

	loaded, err := LoadHealthCache(path)
	if err != nil {
		t.Fatal(err)
	}
	if len(loaded.Entries) != 1 {
		t.Fatalf("entries = %d, want 1", len(loaded.Entries))
	}
	if loaded.Entries[0].Stars != 42 {
		t.Errorf("stars = %d, want 42", loaded.Entries[0].Stars)
	}
}

func TestLoadHealthCacheMissing(t *testing.T) {
	hc, err := LoadHealthCache("/nonexistent/path.yaml")
	if err != nil {
		t.Fatal(err)
	}
	if len(hc.Entries) != 0 {
		t.Errorf("entries = %d, want 0 for missing file", len(hc.Entries))
	}
}

func TestLoadHealthCacheInvalidYAML(t *testing.T) {
	dir := t.TempDir()
	path := filepath.Join(dir, "health.yaml")
	if err := os.WriteFile(path, []byte("entries:\n - url: [not yaml"), 0o644); err != nil {
		t.Fatal(err)
	}

	hc, err := LoadHealthCache(path)
	if err == nil {
		t.Fatal("expected error for invalid YAML")
	}
	if hc != nil {
		t.Fatal("expected nil cache on invalid YAML")
	}
}

func TestMerge(t *testing.T) {
	hc := &HealthCache{
		Entries: []HealthEntry{
			{URL: "https://github.com/a/a", Name: "A", Stars: 10},
			{URL: "https://github.com/b/b", Name: "B", Stars: 20},
		},
	}

	hc.Merge([]HealthEntry{
		{URL: "https://github.com/b/b", Name: "B", Stars: 25}, // update
		{URL: "https://github.com/c/c", Name: "C", Stars: 30}, // new
	})

	if len(hc.Entries) != 3 {
		t.Fatalf("entries = %d, want 3", len(hc.Entries))
	}
	// B should be updated.
	if hc.Entries[1].Stars != 25 {
		t.Errorf("B stars = %d, want 25", hc.Entries[1].Stars)
	}
	// C should be appended.
	if hc.Entries[2].Name != "C" {
		t.Errorf("last entry = %q, want C", hc.Entries[2].Name)
	}
}

func TestMergeDeduplicatesIncomingBatch(t *testing.T) {
	hc := &HealthCache{}

	hc.Merge([]HealthEntry{
		{URL: "https://github.com/c/c", Name: "C", Stars: 1},
		{URL: "https://github.com/c/c", Name: "C", Stars: 2},
	})

	if len(hc.Entries) != 1 {
		t.Fatalf("entries = %d, want 1", len(hc.Entries))
	}
	if hc.Entries[0].Stars != 2 {
		t.Fatalf("stars = %d, want last value 2", hc.Entries[0].Stars)
	}
}
174
internal/checker/github.go
Normal file
@@ -0,0 +1,174 @@
package checker

import (
	"context"
	"fmt"
	"net/url"
	"strings"
	"time"

	"github.com/shurcooL/githubv4"
	"golang.org/x/oauth2"
)

// RepoInfo holds metadata about a GitHub repository.
type RepoInfo struct {
	Owner      string
	Name       string
	URL        string
	IsArchived bool
	IsDisabled bool
	IsPrivate  bool
	PushedAt   time.Time
	Stars      int
	Forks      int
	HasLicense bool
}

// ExtractGitHubRepo extracts owner/name from a GitHub URL.
// Returns false for non-repo URLs (issues, wiki, apps, etc.).
func ExtractGitHubRepo(rawURL string) (owner, name string, ok bool) {
	u, err := url.Parse(rawURL)
	if err != nil {
		return "", "", false
	}

	host := strings.ToLower(u.Hostname())
	if host != "github.com" && host != "www.github.com" {
		return "", "", false
	}

	path := strings.Trim(u.Path, "/")
	parts := strings.Split(path, "/")
	if len(parts) != 2 || parts[0] == "" || parts[1] == "" {
		return "", "", false
	}

	// Skip known non-repository top-level routes.
	switch parts[0] {
	case "apps", "features", "topics":
		return "", "", false
	}

	name = strings.TrimSuffix(parts[1], ".git")
	if name == "" {
		return "", "", false
	}

	return parts[0], name, true
}

func isHTTPURL(raw string) bool {
	u, err := url.Parse(raw)
	if err != nil {
		return false
	}
	return u.Scheme == "http" || u.Scheme == "https"
}

func isGitHubAuthError(err error) bool {
	if err == nil {
		return false
	}
	s := strings.ToLower(err.Error())
	return strings.Contains(s, "401 unauthorized") ||
		strings.Contains(s, "bad credentials") ||
		strings.Contains(s, "resource not accessible by integration")
}

// PartitionLinks separates URLs into GitHub repos and external HTTP(S) links.
func PartitionLinks(urls []string) (github, external []string) {
	for _, u := range urls {
		if _, _, ok := ExtractGitHubRepo(u); ok {
			github = append(github, u)
		} else if isHTTPURL(u) {
			external = append(external, u)
		}
	}
	return
}

// GitHubChecker uses the GitHub GraphQL API.
type GitHubChecker struct {
	client *githubv4.Client
}

// NewGitHubChecker creates a checker with the given OAuth token.
func NewGitHubChecker(token string) *GitHubChecker {
	src := oauth2.StaticTokenSource(&oauth2.Token{AccessToken: token})
	httpClient := oauth2.NewClient(context.Background(), src)
	return &GitHubChecker{client: githubv4.NewClient(httpClient)}
}

// CheckRepo queries a single GitHub repository.
func (gc *GitHubChecker) CheckRepo(ctx context.Context, owner, name string) (RepoInfo, error) {
	var query struct {
		Repository struct {
			IsArchived     bool
			IsDisabled     bool
			IsPrivate      bool
			PushedAt       time.Time
			StargazerCount int
			ForkCount      int
			LicenseInfo    *struct {
				Name string
			}
		} `graphql:"repository(owner: $owner, name: $name)"`
	}

	vars := map[string]interface{}{
		"owner": githubv4.String(owner),
		"name":  githubv4.String(name),
	}

	if err := gc.client.Query(ctx, &query, vars); err != nil {
		return RepoInfo{}, fmt.Errorf("github query %s/%s: %w", owner, name, err)
	}

	r := query.Repository
	return RepoInfo{
		Owner:      owner,
		Name:       name,
		URL:        fmt.Sprintf("https://github.com/%s/%s", owner, name),
		IsArchived: r.IsArchived,
		IsDisabled: r.IsDisabled,
		IsPrivate:  r.IsPrivate,
		PushedAt:   r.PushedAt,
		Stars:      r.StargazerCount,
		Forks:      r.ForkCount,
		HasLicense: r.LicenseInfo != nil,
	}, nil
}

// CheckRepos queries multiple repos in sequence with rate limiting.
func (gc *GitHubChecker) CheckRepos(ctx context.Context, urls []string, batchSize int) ([]RepoInfo, []error) {
	if batchSize <= 0 {
		batchSize = 50
	}

	var results []RepoInfo
	var errs []error

	for i, u := range urls {
		owner, name, ok := ExtractGitHubRepo(u)
		if !ok {
			continue
		}

		info, err := gc.CheckRepo(ctx, owner, name)
		if err != nil {
			errs = append(errs, err)
			if isGitHubAuthError(err) {
				break
			}
			continue
		}
		results = append(results, info)

		if (i+1)%batchSize == 0 {
			time.Sleep(1 * time.Second)
		}
	}

	return results, errs
}
78
internal/checker/github_test.go
Normal file
@@ -0,0 +1,78 @@
package checker

import (
	"errors"
	"testing"
)

func TestExtractGitHubRepo(t *testing.T) {
	tests := []struct {
		url   string
		owner string
		name  string
		ok    bool
	}{
		{"https://github.com/docker/compose", "docker", "compose", true},
		{"https://github.com/moby/moby", "moby", "moby", true},
		{"https://github.com/user/repo/", "user", "repo", true},
		{"https://github.com/user/repo?tab=readme-ov-file", "user", "repo", true},
		{"https://github.com/user/repo#readme", "user", "repo", true},
		{"https://github.com/user/repo.git", "user", "repo", true},
		{"https://www.github.com/user/repo", "user", "repo", true},
		{"https://github.com/user/repo/issues", "", "", false},
		{"https://github.com/user/repo/wiki", "", "", false},
		{"https://github.com/apps/dependabot", "", "", false},
		{"https://example.com/not-github", "", "", false},
		{"https://github.com/user", "", "", false},
	}

	for _, tt := range tests {
		owner, name, ok := ExtractGitHubRepo(tt.url)
		if ok != tt.ok {
			t.Errorf("ExtractGitHubRepo(%q): ok = %v, want %v", tt.url, ok, tt.ok)
			continue
		}
		if ok {
			if owner != tt.owner || name != tt.name {
				t.Errorf("ExtractGitHubRepo(%q) = (%q, %q), want (%q, %q)", tt.url, owner, name, tt.owner, tt.name)
			}
		}
	}
}

func TestPartitionLinks(t *testing.T) {
	urls := []string{
		"https://github.com/docker/compose",
		"https://example.com/tool",
		"https://github.com/moby/moby",
		"https://github.com/user/repo/issues",
		"dozzle",
		"#projects",
	}
	gh, ext := PartitionLinks(urls)
	if len(gh) != 2 {
		t.Errorf("github links = %d, want 2", len(gh))
	}
	if len(ext) != 2 {
		t.Errorf("external links = %d, want 2", len(ext))
	}
}

func TestIsGitHubAuthError(t *testing.T) {
	tests := []struct {
		err  error
		want bool
	}{
		{errors.New("non-200 OK status code: 401 Unauthorized body: \"Bad credentials\""), true},
		{errors.New("Resource not accessible by integration"), true},
		{errors.New("dial tcp: lookup api.github.com: no such host"), false},
		{errors.New("context deadline exceeded"), false},
	}

	for _, tt := range tests {
		got := isGitHubAuthError(tt.err)
		if got != tt.want {
			t.Errorf("isGitHubAuthError(%q) = %v, want %v", tt.err, got, tt.want)
		}
	}
}
121
internal/checker/http.go
Normal file
@@ -0,0 +1,121 @@
package checker

import (
	"context"
	"net/http"
	"sync"
	"time"

	"github.com/veggiemonk/awesome-docker/internal/cache"
)

const (
	defaultTimeout     = 30 * time.Second
	defaultConcurrency = 10
	userAgent          = "awesome-docker-checker/1.0"
)

// LinkResult holds the result of checking a single URL.
type LinkResult struct {
	URL         string
	OK          bool
	StatusCode  int
	Redirected  bool
	RedirectURL string
	Error       string
}

func shouldFallbackToGET(statusCode int) bool {
	switch statusCode {
	case http.StatusBadRequest, http.StatusForbidden, http.StatusMethodNotAllowed, http.StatusNotImplemented:
		return true
	default:
		return false
	}
}

// CheckLink checks a single URL. Uses HEAD first, falls back to GET.
func CheckLink(url string, client *http.Client) LinkResult {
	result := LinkResult{URL: url}

	ctx, cancel := context.WithTimeout(context.Background(), defaultTimeout)
	defer cancel()

	// Track redirects.
	var finalURL string
	origCheckRedirect := client.CheckRedirect
	client.CheckRedirect = func(req *http.Request, via []*http.Request) error {
		finalURL = req.URL.String()
		if len(via) >= 10 {
			return http.ErrUseLastResponse
		}
		return nil
	}
	defer func() { client.CheckRedirect = origCheckRedirect }()

	doRequest := func(method string) (*http.Response, error) {
		req, err := http.NewRequestWithContext(ctx, method, url, nil)
		if err != nil {
			return nil, err
		}
		req.Header.Set("User-Agent", userAgent)
		return client.Do(req)
	}

	resp, err := doRequest(http.MethodHead)
	if err != nil {
		resp, err = doRequest(http.MethodGet)
		if err != nil {
			result.Error = err.Error()
			return result
		}
	} else if shouldFallbackToGET(resp.StatusCode) {
		resp.Body.Close()
		resp, err = doRequest(http.MethodGet)
		if err != nil {
			result.Error = err.Error()
			return result
		}
	}
	defer resp.Body.Close()

	result.StatusCode = resp.StatusCode
	result.OK = resp.StatusCode >= 200 && resp.StatusCode < 400

	if finalURL != "" && finalURL != url {
		result.Redirected = true
		result.RedirectURL = finalURL
	}

	return result
}

// CheckLinks checks multiple URLs concurrently.
func CheckLinks(urls []string, concurrency int, exclude *cache.ExcludeList) []LinkResult {
	if concurrency <= 0 {
		concurrency = defaultConcurrency
	}

	results := make([]LinkResult, len(urls))
	sem := make(chan struct{}, concurrency)
	var wg sync.WaitGroup

	for i, url := range urls {
		if exclude != nil && exclude.IsExcluded(url) {
			results[i] = LinkResult{URL: url, OK: true}
			continue
		}

		wg.Add(1)
		go func(idx int, u string) {
			defer wg.Done()
			sem <- struct{}{}
			defer func() { <-sem }()
			client := &http.Client{Timeout: defaultTimeout}
			results[idx] = CheckLink(u, client)
		}(i, url)
	}

	wg.Wait()
	return results
}
internal/checker/http_test.go (new file, 118 lines)
@@ -0,0 +1,118 @@
package checker

import (
	"net/http"
	"net/http/httptest"
	"testing"
)

func TestCheckLinkOK(t *testing.T) {
	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	}))
	defer server.Close()

	result := CheckLink(server.URL, &http.Client{})
	if !result.OK {
		t.Errorf("expected OK, got status %d, error: %s", result.StatusCode, result.Error)
	}
}

func TestCheckLink404(t *testing.T) {
	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusNotFound)
	}))
	defer server.Close()

	result := CheckLink(server.URL, &http.Client{})
	if result.OK {
		t.Error("expected not OK for 404")
	}
	if result.StatusCode != 404 {
		t.Errorf("status = %d, want 404", result.StatusCode)
	}
}

func TestCheckLinkRedirect(t *testing.T) {
	final := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.WriteHeader(http.StatusOK)
	}))
	defer final.Close()

	redir := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		http.Redirect(w, r, final.URL, http.StatusMovedPermanently)
	}))
	defer redir.Close()

	result := CheckLink(redir.URL, &http.Client{})
	if !result.OK {
		t.Errorf("expected OK after following redirect, error: %s", result.Error)
	}
	if !result.Redirected {
		t.Error("expected Redirected = true")
	}
}

func TestCheckLinks(t *testing.T) {
	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.URL.Path == "/bad" {
			w.WriteHeader(http.StatusNotFound)
			return
		}
		w.WriteHeader(http.StatusOK)
	}))
	defer server.Close()

	urls := []string{server.URL + "/good", server.URL + "/bad", server.URL + "/also-good"}
	results := CheckLinks(urls, 2, nil)
	if len(results) != 3 {
		t.Fatalf("results = %d, want 3", len(results))
	}

	for _, r := range results {
		if r.URL == server.URL+"/bad" && r.OK {
			t.Error("expected /bad to not be OK")
		}
		if r.URL == server.URL+"/good" && !r.OK {
			t.Error("expected /good to be OK")
		}
	}
}

func TestCheckLinkFallbackToGETOnMethodNotAllowed(t *testing.T) {
	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.Method == http.MethodHead {
			w.WriteHeader(http.StatusMethodNotAllowed)
			return
		}
		w.WriteHeader(http.StatusOK)
	}))
	defer server.Close()

	result := CheckLink(server.URL, &http.Client{})
	if !result.OK {
		t.Errorf("expected OK after GET fallback, got status %d, error: %s", result.StatusCode, result.Error)
	}
	if result.StatusCode != http.StatusOK {
		t.Errorf("status = %d, want 200", result.StatusCode)
	}
}

func TestCheckLinkFallbackToGETOnForbiddenHead(t *testing.T) {
	server := httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.Method == http.MethodHead {
			w.WriteHeader(http.StatusForbidden)
			return
		}
		w.WriteHeader(http.StatusOK)
	}))
	defer server.Close()

	result := CheckLink(server.URL, &http.Client{})
	if !result.OK {
		t.Errorf("expected OK after GET fallback, got status %d, error: %s", result.StatusCode, result.Error)
	}
	if result.StatusCode != http.StatusOK {
		t.Errorf("status = %d, want 200", result.StatusCode)
	}
}
internal/linter/fixer.go (new file, 147 lines)
@@ -0,0 +1,147 @@
package linter

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"

	"github.com/veggiemonk/awesome-docker/internal/parser"
)

// attributionRe matches trailing author attributions like:
//
//	by [@author](url), by [@author][ref]
//
// Also handles "Created by", "Maintained by" etc.
var attributionRe = regexp.MustCompile(`\s+(?:(?:[Cc]reated|[Mm]aintained|[Bb]uilt)\s+)?by\s+\[@[^\]]+\](?:\([^)]*\)|\[[^\]]*\])\.?$`)

// bareAttributionRe matches: by @author at end of line (no link).
var bareAttributionRe = regexp.MustCompile(`\s+by\s+@\w+\.?$`)

// sectionHeadingRe matches markdown headings.
var sectionHeadingRe = regexp.MustCompile(`^(#{1,6})\s+(.+?)(?:\s*<!--.*-->)?$`)

// RemoveAttribution strips author attribution from a description string.
func RemoveAttribution(desc string) string {
	desc = attributionRe.ReplaceAllString(desc, "")
	desc = bareAttributionRe.ReplaceAllString(desc, "")
	return strings.TrimSpace(desc)
}

// FormatEntry reconstructs a markdown list line from a parsed Entry.
func FormatEntry(e parser.Entry) string {
	desc := e.Description
	var markers []string
	for _, m := range e.Markers {
		switch m {
		case parser.MarkerAbandoned:
			markers = append(markers, ":skull:")
		case parser.MarkerPaid:
			markers = append(markers, ":yen:")
		case parser.MarkerWIP:
			markers = append(markers, ":construction:")
		case parser.MarkerStale:
			markers = append(markers, ":ice_cube:")
		}
	}
	if len(markers) > 0 {
		desc = strings.Join(markers, " ") + " " + desc
	}
	return fmt.Sprintf("- [%s](%s) - %s", e.Name, e.URL, desc)
}

// FixFile reads the README, fixes entries (capitalize, period, remove attribution,
// sort), and writes the result back.
func FixFile(path string) (int, error) {
	f, err := os.Open(path)
	if err != nil {
		return 0, err
	}
	defer f.Close()

	var lines []string
	scanner := bufio.NewScanner(f)
	for scanner.Scan() {
		lines = append(lines, scanner.Text())
	}
	if err := scanner.Err(); err != nil {
		return 0, err
	}

	fixCount := 0

	var headingLines []int
	for i, line := range lines {
		if sectionHeadingRe.MatchString(line) {
			headingLines = append(headingLines, i)
		}
	}

	// Process each heading block independently to match linter sort scope.
	for i, headingIdx := range headingLines {
		start := headingIdx + 1
		end := len(lines)
		if i+1 < len(headingLines) {
			end = headingLines[i+1]
		}

		var entryPositions []int
		var entries []parser.Entry
		for lineIdx := start; lineIdx < end; lineIdx++ {
			entry, err := parser.ParseEntry(lines[lineIdx], lineIdx+1)
			if err != nil {
				continue
			}
			entryPositions = append(entryPositions, lineIdx)
			entries = append(entries, entry)
		}
		if len(entries) == 0 {
			continue
		}

		var fixed []parser.Entry
		for _, e := range entries {
			f := FixEntry(e)
			f.Description = RemoveAttribution(f.Description)
			// Re-apply period after removing attribution (it may have been stripped)
			if len(f.Description) > 0 && !strings.HasSuffix(f.Description, ".") {
				f.Description += "."
			}
			fixed = append(fixed, f)
		}

		sorted := SortEntries(fixed)
		for j, e := range sorted {
			newLine := FormatEntry(e)
			lineIdx := entryPositions[j]
			if lines[lineIdx] != newLine {
				fixCount++
				lines[lineIdx] = newLine
			}
		}
	}

	if fixCount == 0 {
		return 0, nil
	}

	// Write back
	out, err := os.Create(path)
	if err != nil {
		return 0, err
	}
	defer out.Close()

	w := bufio.NewWriter(out)
	for i, line := range lines {
		w.WriteString(line)
		if i < len(lines)-1 {
			w.WriteString("\n")
		}
	}
	// Always terminate the file with a trailing newline.
	w.WriteString("\n")
	return fixCount, w.Flush()
}
internal/linter/fixer_test.go (new file, 193 lines)
@@ -0,0 +1,193 @@
package linter

import (
	"os"
	"strings"
	"testing"

	"github.com/veggiemonk/awesome-docker/internal/parser"
)

func TestRemoveAttribution(t *testing.T) {
	tests := []struct {
		input string
		want  string
	}{
		{
			"Tool for managing containers by [@author](https://github.com/author)",
			"Tool for managing containers",
		},
		{
			"Tool for managing containers by [@author][author]",
			"Tool for managing containers",
		},
		{
			"Tool for managing containers by @author",
			"Tool for managing containers",
		},
		{
			"Analyzes resource usage. Created by [@Google][google]",
			"Analyzes resource usage.",
		},
		{
			"A tool by [@someone](https://example.com).",
			"A tool",
		},
		{
			"step-by-step tutorial and more resources",
			"step-by-step tutorial and more resources",
		},
		{
			"No attribution here",
			"No attribution here",
		},
	}

	for _, tt := range tests {
		got := RemoveAttribution(tt.input)
		if got != tt.want {
			t.Errorf("RemoveAttribution(%q) = %q, want %q", tt.input, got, tt.want)
		}
	}
}

func TestFormatEntry(t *testing.T) {
	e := parser.Entry{
		Name:        "Portainer",
		URL:         "https://github.com/portainer/portainer",
		Description: "Management UI for Docker.",
	}
	got := FormatEntry(e)
	want := "- [Portainer](https://github.com/portainer/portainer) - Management UI for Docker."
	if got != want {
		t.Errorf("FormatEntry = %q, want %q", got, want)
	}
}

func TestFormatEntryWithMarkers(t *testing.T) {
	e := parser.Entry{
		Name:        "OldTool",
		URL:         "https://github.com/old/tool",
		Description: "A deprecated tool.",
		Markers:     []parser.Marker{parser.MarkerAbandoned},
	}
	got := FormatEntry(e)
	want := "- [OldTool](https://github.com/old/tool) - :skull: A deprecated tool."
	if got != want {
		t.Errorf("FormatEntry = %q, want %q", got, want)
	}
}

func TestFixFile(t *testing.T) {
	content := `# Awesome Docker

## Tools

- [Zebra](https://example.com/zebra) - a tool by [@author](https://github.com/author)
- [Alpha](https://example.com/alpha) - another tool

## Other

Some text here.
`
	tmp, err := os.CreateTemp("", "readme-*.md")
	if err != nil {
		t.Fatal(err)
	}
	defer os.Remove(tmp.Name())

	if _, err := tmp.WriteString(content); err != nil {
		t.Fatal(err)
	}
	tmp.Close()

	count, err := FixFile(tmp.Name())
	if err != nil {
		t.Fatal(err)
	}

	if count == 0 {
		t.Fatal("expected fixes, got 0")
	}

	data, err := os.ReadFile(tmp.Name())
	if err != nil {
		t.Fatal(err)
	}
	result := string(data)

	// Check sorting: Alpha should come before Zebra
	alphaIdx := strings.Index(result, "[Alpha]")
	zebraIdx := strings.Index(result, "[Zebra]")
	if alphaIdx > zebraIdx {
		t.Error("expected Alpha before Zebra after sort")
	}

	// Check capitalization
	if !strings.Contains(result, "- A tool.") {
		t.Errorf("expected capitalized description, got:\n%s", result)
	}

	// Check attribution removed
	if strings.Contains(result, "@author") {
		t.Errorf("expected attribution removed, got:\n%s", result)
	}

	// Check period added
	if !strings.Contains(result, "Another tool.") {
		t.Errorf("expected period added, got:\n%s", result)
	}
}

func TestFixFileSortsAcrossBlankLinesAndIsIdempotent(t *testing.T) {
	content := `# Awesome Docker

## Tools

- [Zulu](https://example.com/zulu) - z tool

- [Alpha](https://example.com/alpha) - a tool
`

	tmp, err := os.CreateTemp("", "readme-*.md")
	if err != nil {
		t.Fatal(err)
	}
	defer os.Remove(tmp.Name())

	if _, err := tmp.WriteString(content); err != nil {
		t.Fatal(err)
	}
	tmp.Close()

	firstCount, err := FixFile(tmp.Name())
	if err != nil {
		t.Fatal(err)
	}
	if firstCount == 0 {
		t.Fatal("expected first run to apply fixes")
	}

	firstData, err := os.ReadFile(tmp.Name())
	if err != nil {
		t.Fatal(err)
	}
	firstResult := string(firstData)

	alphaIdx := strings.Index(firstResult, "[Alpha]")
	zuluIdx := strings.Index(firstResult, "[Zulu]")
	if alphaIdx == -1 || zuluIdx == -1 {
		t.Fatalf("expected both Alpha and Zulu in result:\n%s", firstResult)
	}
	if alphaIdx > zuluIdx {
		t.Fatalf("expected Alpha before Zulu after fix:\n%s", firstResult)
	}

	secondCount, err := FixFile(tmp.Name())
	if err != nil {
		t.Fatal(err)
	}
	if secondCount != 0 {
		t.Fatalf("expected second run to be idempotent, got %d changes", secondCount)
	}
}
internal/linter/linter.go (new file, 60 lines)
@@ -0,0 +1,60 @@
package linter

import (
	"github.com/veggiemonk/awesome-docker/internal/parser"
)

// Result holds all lint issues found.
type Result struct {
	Issues   []Issue
	Errors   int
	Warnings int
}

// Lint checks an entire parsed document for issues.
func Lint(doc parser.Document) Result {
	var result Result

	// Collect all entries for duplicate checking
	allEntries := collectEntries(doc.Sections)
	for _, issue := range CheckDuplicates(allEntries) {
		addIssue(&result, issue)
	}

	// Check each section
	lintSections(doc.Sections, &result)

	return result
}

func lintSections(sections []parser.Section, result *Result) {
	for _, s := range sections {
		for _, e := range s.Entries {
			for _, issue := range CheckEntry(e) {
				addIssue(result, issue)
			}
		}
		for _, issue := range CheckSorted(s.Entries) {
			addIssue(result, issue)
		}
		lintSections(s.Children, result)
	}
}

func collectEntries(sections []parser.Section) []parser.Entry {
	var all []parser.Entry
	for _, s := range sections {
		all = append(all, s.Entries...)
		all = append(all, collectEntries(s.Children)...)
	}
	return all
}

func addIssue(result *Result, issue Issue) {
	result.Issues = append(result.Issues, issue)
	if issue.Severity == SeverityError {
		result.Errors++
	} else {
		result.Warnings++
	}
}
internal/linter/linter_test.go (new file, 111 lines)
@@ -0,0 +1,111 @@
package linter

import (
	"testing"

	"github.com/veggiemonk/awesome-docker/internal/parser"
)

func TestRuleDescriptionCapital(t *testing.T) {
	entry := parser.Entry{Name: "Test", URL: "https://example.com", Description: "lowercase start.", Line: 10}
	issues := CheckEntry(entry)
	found := false
	for _, issue := range issues {
		if issue.Rule == RuleDescriptionCapital {
			found = true
		}
	}
	if !found {
		t.Error("expected RuleDescriptionCapital issue for lowercase description")
	}
}

func TestRuleDescriptionPeriod(t *testing.T) {
	entry := parser.Entry{Name: "Test", URL: "https://example.com", Description: "No period at end", Line: 10}
	issues := CheckEntry(entry)
	found := false
	for _, issue := range issues {
		if issue.Rule == RuleDescriptionPeriod {
			found = true
		}
	}
	if !found {
		t.Error("expected RuleDescriptionPeriod issue")
	}
}

func TestRuleSorted(t *testing.T) {
	entries := []parser.Entry{
		{Name: "Zebra", URL: "https://z.com", Description: "Z.", Line: 1},
		{Name: "Alpha", URL: "https://a.com", Description: "A.", Line: 2},
	}
	issues := CheckSorted(entries)
	if len(issues) == 0 {
		t.Error("expected sorting issue")
	}
}

func TestRuleSortedOK(t *testing.T) {
	entries := []parser.Entry{
		{Name: "Alpha", URL: "https://a.com", Description: "A.", Line: 1},
		{Name: "Zebra", URL: "https://z.com", Description: "Z.", Line: 2},
	}
	issues := CheckSorted(entries)
	if len(issues) != 0 {
		t.Errorf("expected no sorting issues, got %d", len(issues))
	}
}

func TestRuleDuplicateURL(t *testing.T) {
	entries := []parser.Entry{
		{Name: "A", URL: "https://example.com/a", Description: "A.", Line: 1},
		{Name: "B", URL: "https://example.com/a", Description: "B.", Line: 5},
	}
	issues := CheckDuplicates(entries)
	if len(issues) == 0 {
		t.Error("expected duplicate URL issue")
	}
}

func TestValidEntry(t *testing.T) {
	entry := parser.Entry{Name: "Good", URL: "https://example.com", Description: "A good project.", Line: 10}
	issues := CheckEntry(entry)
	if len(issues) != 0 {
		t.Errorf("expected no issues, got %v", issues)
	}
}

func TestFixDescriptionCapital(t *testing.T) {
	entry := parser.Entry{Name: "Test", URL: "https://example.com", Description: "lowercase.", Line: 10}
	fixed := FixEntry(entry)
	if fixed.Description != "Lowercase." {
		t.Errorf("description = %q, want %q", fixed.Description, "Lowercase.")
	}
}

func TestFixDescriptionPeriod(t *testing.T) {
	entry := parser.Entry{Name: "Test", URL: "https://example.com", Description: "No period", Line: 10}
	fixed := FixEntry(entry)
	if fixed.Description != "No period." {
		t.Errorf("description = %q, want %q", fixed.Description, "No period.")
	}
}

func TestLintDocument(t *testing.T) {
	doc := parser.Document{
		Sections: []parser.Section{
			{
				Title: "Tools",
				Level: 2,
				Entries: []parser.Entry{
					{Name: "Zebra", URL: "https://z.com", Description: "Z tool.", Line: 1},
					{Name: "Alpha", URL: "https://a.com", Description: "a tool", Line: 2},
				},
			},
		},
	}
	result := Lint(doc)
	if result.Errors == 0 {
		t.Error("expected errors (unsorted, lowercase, no period)")
	}
}
internal/linter/rules.go (new file, 149 lines)
@@ -0,0 +1,149 @@
package linter

import (
	"fmt"
	"sort"
	"strings"
	"unicode"

	"github.com/veggiemonk/awesome-docker/internal/parser"
)

// Rule identifies a linting rule.
type Rule string

const (
	RuleDescriptionCapital Rule = "description-capital"
	RuleDescriptionPeriod  Rule = "description-period"
	RuleSorted             Rule = "sorted"
	RuleDuplicateURL       Rule = "duplicate-url"
)

// Severity of a lint issue.
type Severity int

const (
	SeverityError Severity = iota
	SeverityWarning
)

// Issue is a single lint problem found.
type Issue struct {
	Rule     Rule
	Severity Severity
	Line     int
	Message  string
}

func (i Issue) String() string {
	sev := "ERROR"
	if i.Severity == SeverityWarning {
		sev = "WARN"
	}
	return fmt.Sprintf("[%s] line %d: %s (%s)", sev, i.Line, i.Message, i.Rule)
}

// CheckEntry validates a single entry against formatting rules.
func CheckEntry(e parser.Entry) []Issue {
	var issues []Issue

	if first, ok := firstLetter(e.Description); ok && !unicode.IsUpper(first) {
		issues = append(issues, Issue{
			Rule:     RuleDescriptionCapital,
			Severity: SeverityError,
			Line:     e.Line,
			Message:  fmt.Sprintf("%q: description should start with a capital letter", e.Name),
		})
	}

	if len(e.Description) > 0 && !strings.HasSuffix(e.Description, ".") {
		issues = append(issues, Issue{
			Rule:     RuleDescriptionPeriod,
			Severity: SeverityError,
			Line:     e.Line,
			Message:  fmt.Sprintf("%q: description should end with a period", e.Name),
		})
	}

	return issues
}

// CheckSorted verifies entries are in alphabetical order (case-insensitive).
func CheckSorted(entries []parser.Entry) []Issue {
	var issues []Issue
	for i := 1; i < len(entries); i++ {
		prev := strings.ToLower(entries[i-1].Name)
		curr := strings.ToLower(entries[i].Name)
		if prev > curr {
			issues = append(issues, Issue{
				Rule:     RuleSorted,
				Severity: SeverityError,
				Line:     entries[i].Line,
				Message:  fmt.Sprintf("%q should come before %q (alphabetical order)", entries[i].Name, entries[i-1].Name),
			})
		}
	}
	return issues
}

// CheckDuplicates finds entries with the same URL across the entire document.
func CheckDuplicates(entries []parser.Entry) []Issue {
	var issues []Issue
	seen := make(map[string]int) // URL -> first line number
	for _, e := range entries {
		url := strings.TrimRight(e.URL, "/")
		if firstLine, exists := seen[url]; exists {
			issues = append(issues, Issue{
				Rule:     RuleDuplicateURL,
				Severity: SeverityError,
				Line:     e.Line,
				Message:  fmt.Sprintf("duplicate URL %q (first seen at line %d)", e.URL, firstLine),
			})
		} else {
			seen[url] = e.Line
		}
	}
	return issues
}

// firstLetter returns the first unicode letter in s and true, or zero and false if none.
func firstLetter(s string) (rune, bool) {
	for _, r := range s {
		if unicode.IsLetter(r) {
			return r, true
		}
	}
	return 0, false
}

// FixEntry returns a copy of the entry with auto-fixable issues corrected.
func FixEntry(e parser.Entry) parser.Entry {
	fixed := e
	if len(fixed.Description) > 0 {
		// Capitalize first letter (find it, may not be at index 0)
		runes := []rune(fixed.Description)
		for i, r := range runes {
			if unicode.IsLetter(r) {
				runes[i] = unicode.ToUpper(r)
				break
			}
		}
		fixed.Description = string(runes)

		// Ensure period at end
		if !strings.HasSuffix(fixed.Description, ".") {
			fixed.Description += "."
		}
	}
	return fixed
}

// SortEntries returns a sorted copy of entries (case-insensitive by Name).
func SortEntries(entries []parser.Entry) []parser.Entry {
	sorted := make([]parser.Entry, len(entries))
	copy(sorted, entries)
	sort.Slice(sorted, func(i, j int) bool {
		return strings.ToLower(sorted[i].Name) < strings.ToLower(sorted[j].Name)
	})
	return sorted
}
internal/parser/parser.go (new file, 140 lines)
@@ -0,0 +1,140 @@
package parser

import (
	"bufio"
	"fmt"
	"io"
	"regexp"
	"strings"
)

// entryRe matches: - [Name](URL) - Description
// Also handles optional markers/text between URL and " - " separator, e.g.:
//
//	- [Name](URL) :skull: - Description
//	- [Name](URL) (2) :skull: - Description
var entryRe = regexp.MustCompile(`^[-*]\s+\[([^\]]+)\]\(([^)]+)\)(.*?)\s+-\s+(.+)$`)

// headingRe matches markdown headings: # Title, ## Title, etc.
var headingRe = regexp.MustCompile(`^(#{1,6})\s+(.+?)(?:\s*<!--.*-->)?$`)

var markerDefs = []struct {
	text   string
	marker Marker
}{
	{text: ":skull:", marker: MarkerAbandoned},
	{text: ":yen:", marker: MarkerPaid},
	{text: ":construction:", marker: MarkerWIP},
	{text: ":ice_cube:", marker: MarkerStale},
}

// ParseEntry parses a single markdown list line into an Entry.
func ParseEntry(line string, lineNum int) (Entry, error) {
	m := entryRe.FindStringSubmatch(strings.TrimSpace(line))
	if m == nil {
		return Entry{}, fmt.Errorf("line %d: not a valid entry: %q", lineNum, line)
	}

	middle := m[3] // text between URL closing paren and " - "
	desc := m[4]
	var markers []Marker

	// Extract markers from both the middle section and the description
	for _, def := range markerDefs {
		if strings.Contains(middle, def.text) || strings.Contains(desc, def.text) {
			markers = append(markers, def.marker)
			middle = strings.ReplaceAll(middle, def.text, "")
			desc = strings.ReplaceAll(desc, def.text, "")
		}
	}
	desc = strings.TrimSpace(desc)

	return Entry{
		Name:        m[1],
		URL:         m[2],
		Description: desc,
		Markers:     markers,
		Line:        lineNum,
		Raw:         line,
	}, nil
}

// Parse reads a full README and returns a Document.
func Parse(r io.Reader) (Document, error) {
	scanner := bufio.NewScanner(r)
	var doc Document
	var allSections []struct {
		section Section
		level   int
	}

	lineNum := 0
	for scanner.Scan() {
		lineNum++
		line := scanner.Text()

		// Check for heading
		if hm := headingRe.FindStringSubmatch(line); hm != nil {
			level := len(hm[1])
			title := strings.TrimSpace(hm[2])
			allSections = append(allSections, struct {
				section Section
				level   int
			}{
				section: Section{Title: title, Level: level, Line: lineNum},
				level:   level,
			})
			continue
		}

		// Check for entry (list item with link)
		if entry, err := ParseEntry(line, lineNum); err == nil {
			if len(allSections) > 0 {
				allSections[len(allSections)-1].section.Entries = append(
					allSections[len(allSections)-1].section.Entries, entry)
			}
			continue
		}

		// Everything else: preamble if no sections yet
		if len(allSections) == 0 {
			doc.Preamble = append(doc.Preamble, line)
		}
	}

	if err := scanner.Err(); err != nil {
		return doc, err
	}

	// Build section tree by nesting based on heading level
	doc.Sections = buildTree(allSections)
	return doc, nil
}

func buildTree(flat []struct {
	section Section
	level   int
},
) []Section {
	if len(flat) == 0 {
		return nil
	}

	var result []Section
	for i := 0; i < len(flat); i++ {
		current := flat[i].section
		currentLevel := flat[i].level

		// Collect children: everything after this heading at a deeper level
		j := i + 1
		for j < len(flat) && flat[j].level > currentLevel {
			j++
		}
		if j > i+1 {
			current.Children = buildTree(flat[i+1 : j])
		}
		result = append(result, current)
		i = j - 1
	}
	return result
}
internal/parser/parser_test.go (new file, 161 lines)
@@ -0,0 +1,161 @@
package parser

import (
	"os"
	"strings"
	"testing"
)

func TestParseEntry(t *testing.T) {
	line := `- [Docker Desktop](https://www.docker.com/products/docker-desktop/) - Official native app. Only for Windows and MacOS.`
	entry, err := ParseEntry(line, 1)
	if err != nil {
		t.Fatalf("unexpected error: %v", err)
	}
	if entry.Name != "Docker Desktop" {
		t.Errorf("name = %q, want %q", entry.Name, "Docker Desktop")
	}
	if entry.URL != "https://www.docker.com/products/docker-desktop/" {
		t.Errorf("url = %q, want %q", entry.URL, "https://www.docker.com/products/docker-desktop/")
	}
	if entry.Description != "Official native app. Only for Windows and MacOS." {
		t.Errorf("description = %q, want %q", entry.Description, "Official native app. Only for Windows and MacOS.")
	}
	if len(entry.Markers) != 0 {
		t.Errorf("markers = %v, want empty", entry.Markers)
	}
}

func TestParseEntryWithMarkers(t *testing.T) {
	line := `- [Docker Swarm](https://github.com/docker/swarm) - Swarm clustering system. :skull:`
	entry, err := ParseEntry(line, 1)
	if err != nil {
		t.Fatalf("unexpected error: %v", err)
	}
	if entry.Name != "Docker Swarm" {
		t.Errorf("name = %q, want %q", entry.Name, "Docker Swarm")
	}
	if len(entry.Markers) != 1 || entry.Markers[0] != MarkerAbandoned {
		t.Errorf("markers = %v, want [MarkerAbandoned]", entry.Markers)
	}
	if strings.Contains(entry.Description, ":skull:") {
		t.Errorf("description should not contain marker text, got %q", entry.Description)
	}
}

func TestParseEntryMultipleMarkers(t *testing.T) {
	line := `- [SomeProject](https://example.com) - A project. :yen: :construction:`
	entry, err := ParseEntry(line, 1)
	if err != nil {
		t.Fatalf("unexpected error: %v", err)
	}
	if len(entry.Markers) != 2 {
		t.Fatalf("markers count = %d, want 2", len(entry.Markers))
	}
}

func TestParseEntryMarkersCanonicalOrder(t *testing.T) {
	line := `- [SomeProject](https://example.com) - :construction: A project. :skull:`
	entry, err := ParseEntry(line, 1)
	if err != nil {
		t.Fatalf("unexpected error: %v", err)
	}
	if len(entry.Markers) != 2 {
		t.Fatalf("markers count = %d, want 2", len(entry.Markers))
	}
	if entry.Markers[0] != MarkerAbandoned || entry.Markers[1] != MarkerWIP {
		t.Fatalf("marker order = %v, want [MarkerAbandoned MarkerWIP]", entry.Markers)
	}
}

func TestParseDocument(t *testing.T) {
	input := `# Awesome Docker

> A curated list

# Contents

- [Projects](#projects)

# Legend

- Abandoned :skull:

# Projects

## Tools

- [ToolA](https://github.com/a/a) - Does A.
- [ToolB](https://github.com/b/b) - Does B. :skull:

## Services

- [ServiceC](https://example.com/c) - Does C. :yen:
`
	doc, err := Parse(strings.NewReader(input))
	if err != nil {
		t.Fatalf("unexpected error: %v", err)
	}
	if len(doc.Sections) == 0 {
		t.Fatal("expected at least one section")
	}
	// Find the "Projects" section
	var projects *Section
	for i := range doc.Sections {
		if doc.Sections[i].Title == "Projects" {
			projects = &doc.Sections[i]
			break
		}
	}
	if projects == nil {
		t.Fatal("expected a Projects section")
	}
	if len(projects.Children) != 2 {
		t.Errorf("projects children = %d, want 2", len(projects.Children))
	}
	if projects.Children[0].Title != "Tools" {
		t.Errorf("first child = %q, want %q", projects.Children[0].Title, "Tools")
	}
	if len(projects.Children[0].Entries) != 2 {
		t.Errorf("Tools entries = %d, want 2", len(projects.Children[0].Entries))
	}
}

func TestParseNotAnEntry(t *testing.T) {
||||
_, err := ParseEntry("- Abandoned :skull:", 1)
|
||||
if err == nil {
|
||||
t.Error("expected error for non-entry list item")
|
||||
}
|
||||
}
|
||||
|
||||
func TestParseRealREADME(t *testing.T) {
|
||||
f, err := os.Open("../../README.md")
|
||||
if err != nil {
|
||||
t.Skip("README.md not found, skipping integration test")
|
||||
}
|
||||
defer f.Close()
|
||||
|
||||
doc, err := Parse(f)
|
||||
if err != nil {
|
||||
t.Fatalf("failed to parse README: %v", err)
|
||||
}
|
||||
|
||||
if len(doc.Sections) == 0 {
|
||||
t.Error("expected sections")
|
||||
}
|
||||
|
||||
total := countEntries(doc.Sections)
|
||||
if total < 100 {
|
||||
t.Errorf("expected at least 100 entries, got %d", total)
|
||||
}
|
||||
t.Logf("Parsed %d sections, %d total entries", len(doc.Sections), total)
|
||||
}
|
||||
|
||||
func countEntries(sections []Section) int {
|
||||
n := 0
|
||||
for _, s := range sections {
|
||||
n += len(s.Entries)
|
||||
n += countEntries(s.Children)
|
||||
}
|
||||
return n
|
||||
}
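ParseEntry's implementation is not part of this diff; only its tests are. As a rough sketch of the `- [Name](URL) - Description` shape those tests exercise, one could match it with a regular expression like the one below. The function name `parseEntryLine` and the regex are illustrative assumptions, not the repository's actual code.

```go
package main

import (
	"fmt"
	"regexp"
)

// entryRe matches the `- [Name](URL) - Description` entry shape.
// Illustrative only; the real ParseEntry is not shown in this diff.
var entryRe = regexp.MustCompile(`^-\s+\[([^\]]+)\]\(([^)]+)\)\s+-\s+(.*)$`)

// parseEntryLine returns the three captured fields, or ok=false when
// the line is a list item but not a link entry (e.g. "- Abandoned :skull:").
func parseEntryLine(line string) (name, url, desc string, ok bool) {
	m := entryRe.FindStringSubmatch(line)
	if m == nil {
		return "", "", "", false
	}
	return m[1], m[2], m[3], true
}

func main() {
	name, url, _, ok := parseEntryLine(`- [Docker Desktop](https://www.docker.com/products/docker-desktop/) - Official native app. Only for Windows and MacOS.`)
	fmt.Println(ok, name, url)

	_, _, _, ok = parseEntryLine("- Abandoned :skull:")
	fmt.Println(ok) // legend lines do not match, mirroring TestParseNotAnEntry
}
```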
internal/parser/types.go (new file, +36)
@@ -0,0 +1,36 @@
package parser

// Marker represents a status emoji on an entry.
type Marker int

const (
	MarkerAbandoned Marker = iota // :skull:
	MarkerPaid                    // :yen:
	MarkerWIP                     // :construction:
	MarkerStale                   // :ice_cube:
)

// Entry is a single link entry in the README.
type Entry struct {
	Name        string
	URL         string
	Description string
	Markers     []Marker
	Line        int    // 1-based line number in source
	Raw         string // original line text
}

// Section is a heading with optional entries and child sections.
type Section struct {
	Title    string
	Level    int // heading level: 1 = #, 2 = ##, etc.
	Entries  []Entry
	Children []Section
	Line     int
}

// Document is the parsed representation of the full README.
type Document struct {
	Preamble []string // lines before the first section
	Sections []Section
}
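The comments on the Marker constants document which emoji token each value corresponds to. How the parser performs that token-to-Marker lookup is not shown in this diff; a minimal sketch, assuming a simple map keyed by the emoji tokens from those comments:

```go
package main

import "fmt"

// Marker mirrors the type in internal/parser/types.go.
type Marker int

const (
	MarkerAbandoned Marker = iota // :skull:
	MarkerPaid                    // :yen:
	MarkerWIP                     // :construction:
	MarkerStale                   // :ice_cube:
)

// markerTokens is one plausible lookup table built from the emoji
// comments above; the repository's actual mechanism is an assumption here.
var markerTokens = map[string]Marker{
	":skull:":        MarkerAbandoned,
	":yen:":          MarkerPaid,
	":construction:": MarkerWIP,
	":ice_cube:":     MarkerStale,
}

func main() {
	m, ok := markerTokens[":skull:"]
	fmt.Println(ok, m == MarkerAbandoned) // → true true
}
```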
internal/scorer/scorer.go (new file, +178)
@@ -0,0 +1,178 @@
package scorer

import (
	"encoding/json"
	"fmt"
	"strings"
	"time"

	"github.com/veggiemonk/awesome-docker/internal/cache"
	"github.com/veggiemonk/awesome-docker/internal/checker"
)

// Status represents the health status of an entry.
type Status string

const (
	StatusHealthy  Status = "healthy"
	StatusInactive Status = "inactive" // 1-2 years since last push
	StatusStale    Status = "stale"    // 2+ years since last push
	StatusArchived Status = "archived"
	StatusDead     Status = "dead" // disabled or 404
)

// ScoredEntry is a repo with its computed health status.
type ScoredEntry struct {
	URL         string
	Name        string
	Status      Status
	Stars       int
	Forks       int
	HasLicense  bool
	LastPush    time.Time
	Category    string
	Description string
}

// ReportSummary contains grouped status counts.
type ReportSummary struct {
	Healthy  int `json:"healthy"`
	Inactive int `json:"inactive"`
	Stale    int `json:"stale"`
	Archived int `json:"archived"`
	Dead     int `json:"dead"`
}

// ReportData is the full machine-readable report model.
type ReportData struct {
	GeneratedAt time.Time                `json:"generated_at"`
	Total       int                      `json:"total"`
	Summary     ReportSummary            `json:"summary"`
	Entries     []ScoredEntry            `json:"entries"`
	ByStatus    map[Status][]ScoredEntry `json:"by_status"`
}

// Score computes the health status of a GitHub repo.
func Score(info checker.RepoInfo) Status {
	if info.IsDisabled {
		return StatusDead
	}
	if info.IsArchived {
		return StatusArchived
	}

	twoYearsAgo := time.Now().AddDate(-2, 0, 0)
	oneYearAgo := time.Now().AddDate(-1, 0, 0)

	if info.PushedAt.Before(twoYearsAgo) {
		return StatusStale
	}
	if info.PushedAt.Before(oneYearAgo) {
		return StatusInactive
	}
	return StatusHealthy
}

// ScoreAll scores a batch of repo infos.
func ScoreAll(infos []checker.RepoInfo) []ScoredEntry {
	results := make([]ScoredEntry, len(infos))
	for i, info := range infos {
		results[i] = ScoredEntry{
			URL:        info.URL,
			Name:       fmt.Sprintf("%s/%s", info.Owner, info.Name),
			Status:     Score(info),
			Stars:      info.Stars,
			Forks:      info.Forks,
			HasLicense: info.HasLicense,
			LastPush:   info.PushedAt,
		}
	}
	return results
}

// ToCacheEntries converts scored entries to cache format.
func ToCacheEntries(scored []ScoredEntry) []cache.HealthEntry {
	entries := make([]cache.HealthEntry, len(scored))
	now := time.Now().UTC()
	for i, s := range scored {
		entries[i] = cache.HealthEntry{
			URL:         s.URL,
			Name:        s.Name,
			Status:      string(s.Status),
			Stars:       s.Stars,
			Forks:       s.Forks,
			HasLicense:  s.HasLicense,
			LastPush:    s.LastPush,
			CheckedAt:   now,
			Category:    s.Category,
			Description: s.Description,
		}
	}
	return entries
}

// GenerateReport produces a Markdown health report.
func GenerateReport(scored []ScoredEntry) string {
	var b strings.Builder

	data := BuildReportData(scored)
	groups := data.ByStatus

	fmt.Fprintf(&b, "# Health Report\n\n")
	fmt.Fprintf(&b, "**Generated:** %s\n\n", data.GeneratedAt.Format(time.RFC3339))
	fmt.Fprintf(&b, "**Total:** %d repositories\n\n", data.Total)

	fmt.Fprintf(&b, "## Summary\n\n")
	fmt.Fprintf(&b, "- Healthy: %d\n", data.Summary.Healthy)
	fmt.Fprintf(&b, "- Inactive (1-2 years): %d\n", data.Summary.Inactive)
	fmt.Fprintf(&b, "- Stale (2+ years): %d\n", data.Summary.Stale)
	fmt.Fprintf(&b, "- Archived: %d\n", data.Summary.Archived)
	fmt.Fprintf(&b, "- Dead: %d\n\n", data.Summary.Dead)

	writeSection := func(title string, status Status) {
		entries := groups[status]
		if len(entries) == 0 {
			return
		}
		fmt.Fprintf(&b, "## %s\n\n", title)
		for _, e := range entries {
			fmt.Fprintf(&b, "- [%s](%s) - Stars: %d - Last push: %s\n",
				e.Name, e.URL, e.Stars, e.LastPush.Format("2006-01-02"))
		}
		b.WriteString("\n")
	}

	writeSection("Archived (should mark :skull:)", StatusArchived)
	writeSection("Stale (2+ years inactive)", StatusStale)
	writeSection("Inactive (1-2 years)", StatusInactive)

	return b.String()
}

// BuildReportData returns full report data for machine-readable and markdown rendering.
func BuildReportData(scored []ScoredEntry) ReportData {
	groups := map[Status][]ScoredEntry{}
	for _, s := range scored {
		groups[s.Status] = append(groups[s.Status], s)
	}

	return ReportData{
		GeneratedAt: time.Now().UTC(),
		Total:       len(scored),
		Summary: ReportSummary{
			Healthy:  len(groups[StatusHealthy]),
			Inactive: len(groups[StatusInactive]),
			Stale:    len(groups[StatusStale]),
			Archived: len(groups[StatusArchived]),
			Dead:     len(groups[StatusDead]),
		},
		Entries:  scored,
		ByStatus: groups,
	}
}

// GenerateJSONReport returns the full report as pretty-printed JSON.
func GenerateJSONReport(scored []ScoredEntry) ([]byte, error) {
	data := BuildReportData(scored)
	return json.MarshalIndent(data, "", " ")
}
internal/scorer/scorer_test.go (new file, +164)
@@ -0,0 +1,164 @@
package scorer

import (
	"encoding/json"
	"fmt"
	"strings"
	"testing"
	"time"

	"github.com/veggiemonk/awesome-docker/internal/checker"
)

func TestScoreHealthy(t *testing.T) {
	info := checker.RepoInfo{
		PushedAt:   time.Now().AddDate(0, -3, 0),
		IsArchived: false,
		Stars:      100,
		HasLicense: true,
	}
	status := Score(info)
	if status != StatusHealthy {
		t.Errorf("status = %q, want %q", status, StatusHealthy)
	}
}

func TestScoreInactive(t *testing.T) {
	info := checker.RepoInfo{
		PushedAt:   time.Now().AddDate(-1, -6, 0),
		IsArchived: false,
	}
	status := Score(info)
	if status != StatusInactive {
		t.Errorf("status = %q, want %q", status, StatusInactive)
	}
}

func TestScoreStale(t *testing.T) {
	info := checker.RepoInfo{
		PushedAt:   time.Now().AddDate(-3, 0, 0),
		IsArchived: false,
	}
	status := Score(info)
	if status != StatusStale {
		t.Errorf("status = %q, want %q", status, StatusStale)
	}
}

func TestScoreArchived(t *testing.T) {
	info := checker.RepoInfo{
		PushedAt:   time.Now(),
		IsArchived: true,
	}
	status := Score(info)
	if status != StatusArchived {
		t.Errorf("status = %q, want %q", status, StatusArchived)
	}
}

func TestScoreDisabled(t *testing.T) {
	info := checker.RepoInfo{
		IsDisabled: true,
	}
	status := Score(info)
	if status != StatusDead {
		t.Errorf("status = %q, want %q", status, StatusDead)
	}
}

func TestGenerateReport(t *testing.T) {
	results := []ScoredEntry{
		{URL: "https://github.com/a/a", Name: "a/a", Status: StatusHealthy, Stars: 100, LastPush: time.Now()},
		{URL: "https://github.com/b/b", Name: "b/b", Status: StatusArchived, Stars: 50, LastPush: time.Now()},
		{URL: "https://github.com/c/c", Name: "c/c", Status: StatusStale, Stars: 10, LastPush: time.Now().AddDate(-3, 0, 0)},
	}
	report := GenerateReport(results)
	if !strings.Contains(report, "Healthy: 1") {
		t.Error("report should contain 'Healthy: 1'")
	}
	if !strings.Contains(report, "Archived: 1") {
		t.Error("report should contain 'Archived: 1'")
	}
	if !strings.Contains(report, "Stale") {
		t.Error("report should contain 'Stale'")
	}
}

func TestGenerateReportShowsAllEntries(t *testing.T) {
	var results []ScoredEntry
	for i := 0; i < 55; i++ {
		results = append(results, ScoredEntry{
			URL:      fmt.Sprintf("https://github.com/stale/%d", i),
			Name:     fmt.Sprintf("stale/%d", i),
			Status:   StatusStale,
			Stars:    i,
			LastPush: time.Now().AddDate(-3, 0, 0),
		})
	}

	report := GenerateReport(results)
	if strings.Contains(report, "... and") {
		t.Fatal("report should not be truncated")
	}
	if !strings.Contains(report, "stale/54") {
		t.Fatal("report should contain all entries")
	}
}

func TestGenerateJSONReport(t *testing.T) {
	results := []ScoredEntry{
		{
			URL:      "https://github.com/a/a",
			Name:     "a/a",
			Status:   StatusHealthy,
			Stars:    100,
			LastPush: time.Now(),
		},
		{
			URL:      "https://github.com/b/b",
			Name:     "b/b",
			Status:   StatusStale,
			Stars:    50,
			LastPush: time.Now().AddDate(-3, 0, 0),
		},
	}

	data, err := GenerateJSONReport(results)
	if err != nil {
		t.Fatalf("GenerateJSONReport() error = %v", err)
	}

	var report ReportData
	if err := json.Unmarshal(data, &report); err != nil {
		t.Fatalf("json.Unmarshal() error = %v", err)
	}
	if report.Total != 2 {
		t.Fatalf("report.Total = %d, want 2", report.Total)
	}
	if report.Summary.Healthy != 1 || report.Summary.Stale != 1 {
		t.Fatalf("summary = %+v, want healthy=1 stale=1", report.Summary)
	}
	if len(report.Entries) != 2 {
		t.Fatalf("len(report.Entries) = %d, want 2", len(report.Entries))
	}
	if len(report.ByStatus[StatusStale]) != 1 {
		t.Fatalf("len(report.ByStatus[stale]) = %d, want 1", len(report.ByStatus[StatusStale]))
	}
}

func TestScoreAll(t *testing.T) {
	infos := []checker.RepoInfo{
		{Owner: "a", Name: "a", PushedAt: time.Now(), Stars: 10},
		{Owner: "b", Name: "b", PushedAt: time.Now().AddDate(-3, 0, 0), Stars: 5},
	}
	scored := ScoreAll(infos)
	if len(scored) != 2 {
		t.Fatalf("scored = %d, want 2", len(scored))
	}
	if scored[0].Status != StatusHealthy {
		t.Errorf("first = %q, want healthy", scored[0].Status)
	}
	if scored[1].Status != StatusStale {
		t.Errorf("second = %q, want stale", scored[1].Status)
	}
}
internal/tui/model.go (new file, +603)
@@ -0,0 +1,603 @@
package tui

import (
	"fmt"
	"os/exec"
	"runtime"
	"strings"
	"unicode/utf8"

	tea "charm.land/bubbletea/v2"
	"charm.land/lipgloss/v2"
	"github.com/veggiemonk/awesome-docker/internal/cache"
)

type panel int

const (
	panelTree panel = iota
	panelList
)

const entryHeight = 5 // lines rendered per entry in the list panel
const scrollOff = 4   // minimum lines/entries kept visible above and below cursor

// Model is the top-level Bubbletea model.
type Model struct {
	roots    []*TreeNode
	flatTree []FlatNode

	activePanel    panel
	treeCursor     int
	treeOffset     int
	listCursor     int
	listOffset     int
	currentEntries []cache.HealthEntry

	filtering  bool
	filterText string

	width, height int
}

// New creates a new Model from health cache entries.
func New(entries []cache.HealthEntry) Model {
	roots := BuildTree(entries)
	// Expand first root by default
	if len(roots) > 0 {
		roots[0].Expanded = true
	}
	flat := FlattenVisible(roots)

	m := Model{
		roots:    roots,
		flatTree: flat,
	}
	m.updateCurrentEntries()
	return m
}

// Init returns an initial command.
func (m Model) Init() tea.Cmd {
	return nil
}

// Update handles messages.
func (m Model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
	switch msg := msg.(type) {
	case tea.WindowSizeMsg:
		m.width = msg.Width
		m.height = msg.Height
		return m, nil

	case openURLMsg:
		return m, nil

	case tea.KeyPressMsg:
		// Filter mode input
		if m.filtering {
			return m.handleFilterKey(msg)
		}

		switch msg.String() {
		case "q", "ctrl+c":
			return m, tea.Quit
		case "tab":
			if m.activePanel == panelTree {
				m.activePanel = panelList
			} else {
				m.activePanel = panelTree
			}
		case "/":
			m.filtering = true
			m.filterText = ""
		default:
			if m.activePanel == panelTree {
				return m.handleTreeKey(msg)
			}
			return m.handleListKey(msg)
		}
	}
	return m, nil
}

func (m Model) handleFilterKey(msg tea.KeyPressMsg) (tea.Model, tea.Cmd) {
	switch msg.String() {
	case "esc":
		m.filtering = false
		m.filterText = ""
		m.flatTree = FlattenVisible(m.roots)
		m.updateCurrentEntries()
	case "enter":
		m.filtering = false
	case "backspace":
		if len(m.filterText) > 0 {
			m.filterText = m.filterText[:len(m.filterText)-1]
			m.applyFilter()
		}
	default:
		r := msg.String()
		if utf8.RuneCountInString(r) == 1 {
			m.filterText += r
			m.applyFilter()
		}
	}
	return m, nil
}

func (m *Model) applyFilter() {
	if m.filterText == "" {
		m.flatTree = FlattenVisible(m.roots)
		m.updateCurrentEntries()
		return
	}

	query := strings.ToLower(m.filterText)
	var filtered []cache.HealthEntry
	for _, root := range m.roots {
		for _, e := range root.AllEntries() {
			if strings.Contains(strings.ToLower(e.Name), query) ||
				strings.Contains(strings.ToLower(e.Description), query) ||
				strings.Contains(strings.ToLower(e.Category), query) {
				filtered = append(filtered, e)
			}
		}
	}
	m.currentEntries = filtered
	m.listCursor = 0
	m.listOffset = 0
}

func (m Model) handleTreeKey(msg tea.KeyPressMsg) (tea.Model, tea.Cmd) {
	switch msg.String() {
	case "up", "k":
		if m.treeCursor > 0 {
			m.treeCursor--
			m.adjustTreeScroll()
			m.updateCurrentEntries()
		}
	case "down", "j":
		if m.treeCursor < len(m.flatTree)-1 {
			m.treeCursor++
			m.adjustTreeScroll()
			m.updateCurrentEntries()
		}
	case "enter", " ":
		if m.treeCursor < len(m.flatTree) {
			node := m.flatTree[m.treeCursor].Node
			if node.HasChildren() {
				node.Expanded = !node.Expanded
				m.flatTree = FlattenVisible(m.roots)
				if m.treeCursor >= len(m.flatTree) {
					m.treeCursor = len(m.flatTree) - 1
				}
			}
			m.adjustTreeScroll()
			m.updateCurrentEntries()
		}
	case "ctrl+d", "pgdown":
		half := m.treePanelHeight() / 2
		if half < 1 {
			half = 1
		}
		m.treeCursor += half
		if m.treeCursor >= len(m.flatTree) {
			m.treeCursor = len(m.flatTree) - 1
		}
		m.adjustTreeScroll()
		m.updateCurrentEntries()
	case "ctrl+u", "pgup":
		half := m.treePanelHeight() / 2
		if half < 1 {
			half = 1
		}
		m.treeCursor -= half
		if m.treeCursor < 0 {
			m.treeCursor = 0
		}
		m.adjustTreeScroll()
		m.updateCurrentEntries()
	case "g", "home":
		m.treeCursor = 0
		m.adjustTreeScroll()
		m.updateCurrentEntries()
	case "G", "end":
		m.treeCursor = len(m.flatTree) - 1
		m.adjustTreeScroll()
		m.updateCurrentEntries()
	case "right", "l":
		if m.treeCursor < len(m.flatTree) {
			node := m.flatTree[m.treeCursor].Node
			if node.HasChildren() && !node.Expanded {
				node.Expanded = true
				m.flatTree = FlattenVisible(m.roots)
				m.adjustTreeScroll()
				m.updateCurrentEntries()
			} else {
				m.activePanel = panelList
			}
		}
	case "left", "h":
		if m.treeCursor < len(m.flatTree) {
			node := m.flatTree[m.treeCursor].Node
			if node.HasChildren() && node.Expanded {
				node.Expanded = false
				m.flatTree = FlattenVisible(m.roots)
				m.adjustTreeScroll()
				m.updateCurrentEntries()
			}
		}
	}
	return m, nil
}

func (m *Model) adjustTreeScroll() {
	visible := m.treePanelHeight()
	off := scrollOff
	if off > visible/2 {
		off = visible / 2
	}
	if m.treeCursor < m.treeOffset+off {
		m.treeOffset = m.treeCursor - off
	}
	if m.treeCursor >= m.treeOffset+visible-off {
		m.treeOffset = m.treeCursor - visible + off + 1
	}
	if m.treeOffset < 0 {
		m.treeOffset = 0
	}
}

func (m Model) treePanelHeight() int {
	h := m.height - 6 // header, footer, borders, title
	if h < 1 {
		h = 1
	}
	return h
}

func (m Model) handleListKey(msg tea.KeyPressMsg) (tea.Model, tea.Cmd) {
	switch msg.String() {
	case "up", "k":
		if m.listCursor > 0 {
			m.listCursor--
			m.adjustListScroll()
		}
	case "down", "j":
		if m.listCursor < len(m.currentEntries)-1 {
			m.listCursor++
			m.adjustListScroll()
		}
	case "ctrl+d", "pgdown":
		half := m.visibleListEntries() / 2
		if half < 1 {
			half = 1
		}
		m.listCursor += half
		if m.listCursor >= len(m.currentEntries) {
			m.listCursor = len(m.currentEntries) - 1
		}
		m.adjustListScroll()
	case "ctrl+u", "pgup":
		half := m.visibleListEntries() / 2
		if half < 1 {
			half = 1
		}
		m.listCursor -= half
		if m.listCursor < 0 {
			m.listCursor = 0
		}
		m.adjustListScroll()
	case "g", "home":
		m.listCursor = 0
		m.adjustListScroll()
	case "G", "end":
		m.listCursor = len(m.currentEntries) - 1
		m.adjustListScroll()
	case "enter":
		if m.listCursor < len(m.currentEntries) {
			return m, openURL(m.currentEntries[m.listCursor].URL)
		}
	case "left", "h":
		m.activePanel = panelTree
	}
	return m, nil
}

func (m *Model) updateCurrentEntries() {
	if len(m.flatTree) == 0 {
		m.currentEntries = nil
		return
	}
	if m.treeCursor >= len(m.flatTree) {
		m.treeCursor = len(m.flatTree) - 1
	}
	node := m.flatTree[m.treeCursor].Node
	m.currentEntries = node.AllEntries()
	m.listCursor = 0
	m.listOffset = 0
}

func (m Model) visibleListEntries() int {
	v := m.listPanelHeight() / entryHeight
	if v < 1 {
		return 1
	}
	return v
}

func (m *Model) adjustListScroll() {
	visible := m.visibleListEntries()
	off := scrollOff
	if off > visible/2 {
		off = visible / 2
	}
	if m.listCursor < m.listOffset+off {
		m.listOffset = m.listCursor - off
	}
	if m.listCursor >= m.listOffset+visible-off {
		m.listOffset = m.listCursor - visible + off + 1
	}
	if m.listOffset < 0 {
		m.listOffset = 0
	}
}

func (m Model) listPanelHeight() int {
	// height minus header, footer, borders
	h := m.height - 4
	if h < 1 {
		h = 1
	}
	return h
}

// View renders the UI.
func (m Model) View() tea.View {
	if m.width == 0 || m.height == 0 {
		return tea.NewView("Loading...")
	}

	treeWidth := m.width*3/10 - 2        // 30% minus borders
	listWidth := m.width - treeWidth - 6 // remaining minus borders/gaps
	contentHeight := m.height - 3        // minus footer

	if treeWidth < 10 {
		treeWidth = 10
	}
	if listWidth < 20 {
		listWidth = 20
	}
	if contentHeight < 3 {
		contentHeight = 3
	}

	tree := m.renderTree(treeWidth, contentHeight)
	list := m.renderList(listWidth, contentHeight)

	// Apply border styles
	treeBorder := inactiveBorderStyle
	listBorder := inactiveBorderStyle
	if m.activePanel == panelTree {
		treeBorder = activeBorderStyle
	} else {
		listBorder = activeBorderStyle
	}

	treePanel := treeBorder.Width(treeWidth).Height(contentHeight).Render(tree)
	listPanel := listBorder.Width(listWidth).Height(contentHeight).Render(list)

	body := lipgloss.JoinHorizontal(lipgloss.Top, treePanel, listPanel)

	footer := m.renderFooter()

	content := lipgloss.JoinVertical(lipgloss.Left, body, footer)

	v := tea.NewView(content)
	v.AltScreen = true
	return v
}

func (m Model) renderTree(width, height int) string {
	var b strings.Builder

	title := headerStyle.Render("Categories")
	b.WriteString(title)
	b.WriteString("\n\n")

	linesUsed := 2
	end := m.treeOffset + height - 2
	if end > len(m.flatTree) {
		end = len(m.flatTree)
	}
	for i := m.treeOffset; i < end; i++ {
		fn := m.flatTree[i]
		if linesUsed >= height {
			break
		}

		indent := strings.Repeat(" ", fn.Depth)
		icon := " "
		if fn.Node.HasChildren() {
			if fn.Node.Expanded {
				icon = "▼ "
			} else {
				icon = "▶ "
			}
		}

		count := fn.Node.TotalEntries()
		label := fmt.Sprintf("%s%s%s (%d)", indent, icon, fn.Node.Name, count)

		// Truncate to width
		if len(label) > width {
			label = label[:width-1] + "…"
		}

		if i == m.treeCursor {
			label = treeSelectedStyle.Render(label)
		} else {
			label = treeNormalStyle.Render(label)
		}

		b.WriteString(label)
		b.WriteString("\n")
		linesUsed++
	}

	return b.String()
}

func (m Model) renderList(width, height int) string {
	var b strings.Builder

	// Title
	title := "Resources"
	if m.filtering && m.filterText != "" {
		title = fmt.Sprintf("Resources (filter: %s)", m.filterText)
	}
	b.WriteString(headerStyle.Render(title))
	b.WriteString("\n\n")

	if len(m.currentEntries) == 0 {
		b.WriteString(entryDescStyle.Render(" No entries"))
		return b.String()
	}

	linesUsed := 2

	visible := (height - 2) / entryHeight
	if visible < 1 {
		visible = 1
	}

	start := m.listOffset
	end := start + visible
	if end > len(m.currentEntries) {
		end = len(m.currentEntries)
	}

	for idx := start; idx < end; idx++ {
		if linesUsed+entryHeight > height {
			break
		}

		e := m.currentEntries[idx]
		selected := idx == m.listCursor

		// Use a safe width that accounts for Unicode characters (★, ⑂)
		// that some terminals render as 2 columns but lipgloss counts as 1.
		safeWidth := width - 2

		// Line 1: name + stars + forks
		stats := fmt.Sprintf("★ %d", e.Stars)
		if e.Forks > 0 {
			stats += fmt.Sprintf(" ⑂ %d", e.Forks)
		}
		name := e.Name
		statsW := lipgloss.Width(stats)
		maxName := safeWidth - statsW - 2 // 2 for minimum gap
		if maxName < 4 {
			maxName = 4
		}
		if lipgloss.Width(name) > maxName {
			name = truncateToWidth(name, maxName-1) + "…"
		}
		nameStr := entryNameStyle.Render(name)
		statsStr := entryDescStyle.Render(stats)
		padding := safeWidth - lipgloss.Width(nameStr) - lipgloss.Width(statsStr)
		if padding < 1 {
			padding = 1
		}
		line1 := nameStr + strings.Repeat(" ", padding) + statsStr

		// Line 2: URL
		url := e.URL
		if lipgloss.Width(url) > safeWidth {
			url = truncateToWidth(url, safeWidth-1) + "…"
		}
		line2 := entryURLStyle.Render(url)

		// Line 3: description
		desc := e.Description
		if lipgloss.Width(desc) > safeWidth {
			desc = truncateToWidth(desc, safeWidth-3) + "..."
		}
		line3 := entryDescStyle.Render(desc)

		// Line 4: status + last push
		statusStr := statusStyle(e.Status).Render(e.Status)
		lastPush := ""
		if !e.LastPush.IsZero() {
			lastPush = fmt.Sprintf(" Last push: %s", e.LastPush.Format("2006-01-02"))
		}
		line4 := statusStr + entryDescStyle.Render(lastPush)

		// Line 5: separator
		sepWidth := safeWidth
		if sepWidth < 1 {
			sepWidth = 1
		}
		line5 := entryDescStyle.Render(strings.Repeat("─", sepWidth))

		entry := fmt.Sprintf("%s\n%s\n%s\n%s\n%s", line1, line2, line3, line4, line5)

		if selected && m.activePanel == panelList {
			entry = entrySelectedStyle.Render(entry)
		}

		b.WriteString(entry)
		b.WriteString("\n")
		linesUsed += entryHeight
	}

	// Scroll indicator
	if len(m.currentEntries) > visible {
		indicator := fmt.Sprintf(" %d-%d of %d", start+1, end, len(m.currentEntries))
		b.WriteString(footerStyle.Render(indicator))
	}

	return b.String()
}

func (m Model) renderFooter() string {
	if m.filtering {
		return filterPromptStyle.Render("/") + entryDescStyle.Render(m.filterText+"█")
	}
	help := " Tab:switch j/k:nav PgDn/PgUp:page g/G:top/bottom Enter:expand/open /:filter q:quit"
	return footerStyle.Render(help)
}

// openURLMsg is sent after attempting to open a URL.
type openURLMsg struct{ err error }

func openURL(url string) tea.Cmd {
	return func() tea.Msg {
		var cmd *exec.Cmd
		switch runtime.GOOS {
		case "darwin":
			cmd = exec.Command("open", url)
		case "windows":
			cmd = exec.Command("cmd", "/c", "start", url)
		default:
			cmd = exec.Command("xdg-open", url)
		}
		return openURLMsg{err: cmd.Run()}
	}
}

// truncateToWidth truncates s to at most maxWidth visible columns.
func truncateToWidth(s string, maxWidth int) string {
	if maxWidth <= 0 {
		return ""
	}
	w := 0
	for i, r := range s {
		rw := lipgloss.Width(string(r))
		if w+rw > maxWidth {
			return s[:i]
		}
		w += rw
	}
	return s
}
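adjustTreeScroll and adjustListScroll apply the same clamp: keep the cursor at least `scrollOff` rows away from both edges of the visible window, and never scroll past the top. The sketch below extracts that logic into a pure function (the name `adjustScroll` is illustrative; model.go keeps the two method variants):

```go
package main

import "fmt"

// adjustScroll returns the new scroll offset so that cursor stays at
// least `off` rows from both edges of a `visible`-row window, clamped
// so the window never starts before row 0. Mirrors the clamp used by
// adjustTreeScroll/adjustListScroll above.
func adjustScroll(cursor, offset, visible, off int) int {
	if off > visible/2 {
		off = visible / 2 // shrink the margin for very small windows
	}
	if cursor < offset+off {
		offset = cursor - off // cursor too close to the top edge
	}
	if cursor >= offset+visible-off {
		offset = cursor - visible + off + 1 // cursor too close to the bottom edge
	}
	if offset < 0 {
		offset = 0
	}
	return offset
}

func main() {
	fmt.Println(adjustScroll(0, 0, 10, 4))  // cursor at top → 0 (clamped)
	fmt.Println(adjustScroll(12, 0, 10, 4)) // cursor below window → 7
}
```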
|
||||
59
internal/tui/styles.go
Normal file
59
internal/tui/styles.go
Normal file
@@ -0,0 +1,59 @@
package tui

import "charm.land/lipgloss/v2"

var (
	// Panel borders
	activeBorderStyle = lipgloss.NewStyle().
				Border(lipgloss.RoundedBorder()).
				BorderForeground(lipgloss.Color("#7D56F4"))

	inactiveBorderStyle = lipgloss.NewStyle().
				Border(lipgloss.RoundedBorder()).
				BorderForeground(lipgloss.Color("#555555"))

	// Tree styles
	treeSelectedStyle = lipgloss.NewStyle().Bold(true).Foreground(lipgloss.Color("#FF79C6")).Background(lipgloss.Color("#3B2D50"))
	treeNormalStyle   = lipgloss.NewStyle().Foreground(lipgloss.Color("#CCCCCC"))

	// Entry styles
	entryNameStyle = lipgloss.NewStyle().Bold(true).Foreground(lipgloss.Color("#50FA7B"))
	entryURLStyle  = lipgloss.NewStyle().Foreground(lipgloss.Color("#888888")).Italic(true)
	entryDescStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#CCCCCC"))

	// Status badge styles
	statusHealthyStyle  = lipgloss.NewStyle().Foreground(lipgloss.Color("#50FA7B")).Bold(true)
	statusInactiveStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#FFB86C"))
	statusStaleStyle    = lipgloss.NewStyle().Foreground(lipgloss.Color("#F1FA8C"))
	statusArchivedStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#FF5555")).Bold(true)
	statusDeadStyle     = lipgloss.NewStyle().Foreground(lipgloss.Color("#666666")).Strikethrough(true)

	// Selected entry
	entrySelectedStyle = lipgloss.NewStyle().Background(lipgloss.Color("#44475A"))

	// Header
	headerStyle = lipgloss.NewStyle().Bold(true).Foreground(lipgloss.Color("#BD93F9")).Padding(0, 1)

	// Footer
	footerStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#666666"))

	// Filter
	filterPromptStyle = lipgloss.NewStyle().Foreground(lipgloss.Color("#FF79C6")).Bold(true)
)

func statusStyle(status string) lipgloss.Style {
	switch status {
	case "healthy":
		return statusHealthyStyle
	case "inactive":
		return statusInactiveStyle
	case "stale":
		return statusStaleStyle
	case "archived":
		return statusArchivedStyle
	case "dead":
		return statusDeadStyle
	default:
		return lipgloss.NewStyle()
	}
}
118	internal/tui/tree.go	Normal file
@@ -0,0 +1,118 @@
package tui

import (
	"sort"
	"strings"

	"github.com/veggiemonk/awesome-docker/internal/cache"
)

// TreeNode represents a node in the category tree.
type TreeNode struct {
	Name     string // display name (leaf segment, e.g. "Networking")
	Path     string // full path (e.g. "Container Operations > Networking")
	Children []*TreeNode
	Expanded bool
	Entries  []cache.HealthEntry
}

// FlatNode is a visible tree node with its indentation depth.
type FlatNode struct {
	Node  *TreeNode
	Depth int
}

// HasChildren reports whether this node has child categories.
func (n *TreeNode) HasChildren() bool {
	return len(n.Children) > 0
}

// TotalEntries returns the count of entries in this node and all descendants.
func (n *TreeNode) TotalEntries() int {
	count := len(n.Entries)
	for _, c := range n.Children {
		count += c.TotalEntries()
	}
	return count
}

// AllEntries returns entries from this node and all descendants.
func (n *TreeNode) AllEntries() []cache.HealthEntry {
	result := make([]cache.HealthEntry, 0, n.TotalEntries())
	result = append(result, n.Entries...)
	for _, c := range n.Children {
		result = append(result, c.AllEntries()...)
	}
	return result
}

// BuildTree constructs a tree from a flat HealthEntry slice, grouping by Category.
func BuildTree(entries []cache.HealthEntry) []*TreeNode {
	root := &TreeNode{Name: "root"}
	nodeMap := map[string]*TreeNode{}

	for _, e := range entries {
		cat := e.Category
		if cat == "" {
			cat = "Uncategorized"
		}

		node := ensureNode(root, nodeMap, cat)
		node.Entries = append(node.Entries, e)
	}

	// Sort children at every level
	sortTree(root)
	return root.Children
}

func ensureNode(root *TreeNode, nodeMap map[string]*TreeNode, path string) *TreeNode {
	if n, ok := nodeMap[path]; ok {
		return n
	}

	parts := strings.Split(path, " > ")
	current := root
	for i, part := range parts {
		subpath := strings.Join(parts[:i+1], " > ")
		if n, ok := nodeMap[subpath]; ok {
			current = n
			continue
		}
		child := &TreeNode{
			Name: part,
			Path: subpath,
		}
		current.Children = append(current.Children, child)
		nodeMap[subpath] = child
		current = child
	}
	return current
}

func sortTree(node *TreeNode) {
	sort.Slice(node.Children, func(i, j int) bool {
		return node.Children[i].Name < node.Children[j].Name
	})
	for _, c := range node.Children {
		sortTree(c)
	}
}

// FlattenVisible returns visible nodes in depth-first order for rendering.
func FlattenVisible(roots []*TreeNode) []FlatNode {
	var result []FlatNode
	for _, r := range roots {
		flattenNode(r, 0, &result)
	}
	return result
}

func flattenNode(node *TreeNode, depth int, result *[]FlatNode) {
	*result = append(*result, FlatNode{Node: node, Depth: depth})
	if node.Expanded {
		for _, c := range node.Children {
			flattenNode(c, depth+1, result)
		}
	}
}
109	internal/tui/tree_test.go	Normal file
@@ -0,0 +1,109 @@
package tui

import (
	"testing"

	"github.com/veggiemonk/awesome-docker/internal/cache"
)

func TestBuildTree(t *testing.T) {
	entries := []cache.HealthEntry{
		{URL: "https://github.com/a/b", Name: "a/b", Category: "Projects > Networking", Description: "desc1"},
		{URL: "https://github.com/c/d", Name: "c/d", Category: "Projects > Networking", Description: "desc2"},
		{URL: "https://github.com/e/f", Name: "e/f", Category: "Projects > Security", Description: "desc3"},
		{URL: "https://github.com/g/h", Name: "g/h", Category: "Docker Images > Base Tools", Description: "desc4"},
		{URL: "https://github.com/i/j", Name: "i/j", Category: "", Description: "no category"},
	}

	roots := BuildTree(entries)

	// Should have 3 roots: Docker Images, Projects, Uncategorized (sorted)
	if len(roots) != 3 {
		t.Fatalf("expected 3 roots, got %d", len(roots))
	}

	if roots[0].Name != "Docker Images" {
		t.Errorf("expected first root 'Docker Images', got %q", roots[0].Name)
	}
	if roots[1].Name != "Projects" {
		t.Errorf("expected second root 'Projects', got %q", roots[1].Name)
	}
	if roots[2].Name != "Uncategorized" {
		t.Errorf("expected third root 'Uncategorized', got %q", roots[2].Name)
	}

	// Projects > Networking should have 2 entries
	projects := roots[1]
	if len(projects.Children) != 2 {
		t.Fatalf("expected 2 children under Projects, got %d", len(projects.Children))
	}
	networking := projects.Children[0] // Networking < Security alphabetically
	if networking.Name != "Networking" {
		t.Errorf("expected 'Networking', got %q", networking.Name)
	}
	if len(networking.Entries) != 2 {
		t.Errorf("expected 2 entries in Networking, got %d", len(networking.Entries))
	}
}

func TestBuildTreeEmpty(t *testing.T) {
	roots := BuildTree(nil)
	if len(roots) != 0 {
		t.Errorf("expected 0 roots for nil input, got %d", len(roots))
	}
}

func TestTotalEntries(t *testing.T) {
	entries := []cache.HealthEntry{
		{URL: "https://a", Category: "A > B"},
		{URL: "https://b", Category: "A > B"},
		{URL: "https://c", Category: "A > C"},
		{URL: "https://d", Category: "A"},
	}
	roots := BuildTree(entries)
	if len(roots) != 1 {
		t.Fatalf("expected 1 root, got %d", len(roots))
	}
	if roots[0].TotalEntries() != 4 {
		t.Errorf("expected 4 total entries, got %d", roots[0].TotalEntries())
	}
}

func TestFlattenVisible(t *testing.T) {
	entries := []cache.HealthEntry{
		{URL: "https://a", Category: "A > B"},
		{URL: "https://b", Category: "A > C"},
	}
	roots := BuildTree(entries)

	// Initially not expanded, should see just the root
	flat := FlattenVisible(roots)
	if len(flat) != 1 {
		t.Fatalf("expected 1 visible node (collapsed), got %d", len(flat))
	}
	if flat[0].Depth != 0 {
		t.Errorf("expected depth 0, got %d", flat[0].Depth)
	}

	// Expand root
	roots[0].Expanded = true
	flat = FlattenVisible(roots)
	if len(flat) != 3 {
		t.Fatalf("expected 3 visible nodes (expanded), got %d", len(flat))
	}
	if flat[1].Depth != 1 {
		t.Errorf("expected depth 1 for child, got %d", flat[1].Depth)
	}
}

func TestAllEntries(t *testing.T) {
	entries := []cache.HealthEntry{
		{URL: "https://a", Category: "A > B"},
		{URL: "https://b", Category: "A"},
	}
	roots := BuildTree(entries)
	all := roots[0].AllEntries()
	if len(all) != 2 {
		t.Errorf("expected 2 entries from AllEntries, got %d", len(all))
	}
}
||||
14
internal/tui/tui.go
Normal file
14
internal/tui/tui.go
Normal file
@@ -0,0 +1,14 @@
package tui

import (
	tea "charm.land/bubbletea/v2"
	"github.com/veggiemonk/awesome-docker/internal/cache"
)

// Run launches the TUI browser. It blocks until the user quits.
func Run(entries []cache.HealthEntry) error {
	m := New(entries)
	p := tea.NewProgram(m)
	_, err := p.Run()
	return err
}
973	package-lock.json	generated
@@ -1,973 +0,0 @@
|
||||
"license": "BlueOak-1.0.0",
|
||||
"dependencies": {
|
||||
"lru-cache": "^11.0.0",
|
||||
"minipass": "^7.1.2"
|
||||
},
|
||||
"engines": {
|
||||
"node": "20 || >=22"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/isaacs"
|
||||
}
|
||||
},
|
||||
"node_modules/rimraf": {
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/rimraf/-/rimraf-6.0.1.tgz",
|
||||
"integrity": "sha512-9dkvaxAsk/xNXSJzMgFqqMCuFgt2+KsOFek3TMLfo8NCPfWpBmqwyNn5Y+NX56QUYfCtsyhF3ayiboEoUmJk/A==",
|
||||
"license": "ISC",
|
||||
"dependencies": {
|
||||
"glob": "^11.0.0",
|
||||
"package-json-from-dist": "^1.0.0"
|
||||
},
|
||||
"bin": {
|
||||
"rimraf": "dist/esm/bin.mjs"
|
||||
},
|
||||
"engines": {
|
||||
"node": "20 || >=22"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/isaacs"
|
||||
}
|
||||
},
|
||||
"node_modules/safer-buffer": {
|
||||
"version": "2.1.2",
|
||||
"resolved": "https://registry.npmjs.org/safer-buffer/-/safer-buffer-2.1.2.tgz",
|
||||
"integrity": "sha512-YZo3K82SD7Riyi0E1EQPojLz7kpepnSQI9IyPbHHg1XXXevb5dJI7tpyN2ADxGcQbHG7vcyRHk0cbwqcQriUtg==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/shebang-command": {
|
||||
"version": "2.0.0",
|
||||
"resolved": "https://registry.npmjs.org/shebang-command/-/shebang-command-2.0.0.tgz",
|
||||
"integrity": "sha512-kHxr2zZpYtdmrN1qDjrrX/Z1rR1kG8Dx+gkpK1G4eXmvXswmcE1hTWBWYUzlraYw1/yZp6YuDY77YtvbN0dmDA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"shebang-regex": "^3.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/shebang-regex": {
|
||||
"version": "3.0.0",
|
||||
"resolved": "https://registry.npmjs.org/shebang-regex/-/shebang-regex-3.0.0.tgz",
|
||||
"integrity": "sha512-7++dFhtcx3353uBaq8DDR4NuxBetBzC7ZQOhmTQInHEd6bSrXdiEyzCvG07Z44UYdLShWUyXt5M/yhz8ekcb1A==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/showdown": {
|
||||
"version": "2.1.0",
|
||||
"resolved": "https://registry.npmjs.org/showdown/-/showdown-2.1.0.tgz",
|
||||
"integrity": "sha512-/6NVYu4U819R2pUIk79n67SYgJHWCce0a5xTP979WbNp0FL9MN1I1QK662IDU1b6JzKTvmhgI7T7JYIxBi3kMQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"commander": "^9.0.0"
|
||||
},
|
||||
"bin": {
|
||||
"showdown": "bin/showdown.js"
|
||||
},
|
||||
"funding": {
|
||||
"type": "individual",
|
||||
"url": "https://www.paypal.me/tiviesantos"
|
||||
}
|
||||
},
|
||||
"node_modules/signal-exit": {
|
||||
"version": "4.1.0",
|
||||
"resolved": "https://registry.npmjs.org/signal-exit/-/signal-exit-4.1.0.tgz",
|
||||
"integrity": "sha512-bzyZ1e88w9O1iNJbKnOlvYTrWPDl46O1bG0D3XInv+9tkPrxrN8jUUTiFlDkkmKWgn1M6CfIA13SuGqOa9Korw==",
|
||||
"license": "ISC",
|
||||
"engines": {
|
||||
"node": ">=14"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/isaacs"
|
||||
}
|
||||
},
|
||||
"node_modules/string-width": {
|
||||
"version": "5.1.2",
|
||||
"resolved": "https://registry.npmjs.org/string-width/-/string-width-5.1.2.tgz",
|
||||
"integrity": "sha512-HnLOCR3vjcY8beoNLtcjZ5/nxn2afmME6lhrDrebokqMap+XbeW8n9TXpPDOqdGK5qcI3oT0GKTW6wC7EMiVqA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"eastasianwidth": "^0.2.0",
|
||||
"emoji-regex": "^9.2.2",
|
||||
"strip-ansi": "^7.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=12"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/sponsors/sindresorhus"
|
||||
}
|
||||
},
|
||||
"node_modules/string-width-cjs": {
|
||||
"name": "string-width",
|
||||
"version": "4.2.3",
|
||||
"resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
|
||||
"integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"emoji-regex": "^8.0.0",
|
||||
"is-fullwidth-code-point": "^3.0.0",
|
||||
"strip-ansi": "^6.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/string-width-cjs/node_modules/ansi-regex": {
|
||||
"version": "5.0.1",
|
||||
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
|
||||
"integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/string-width-cjs/node_modules/emoji-regex": {
|
||||
"version": "8.0.0",
|
||||
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
|
||||
"integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/string-width-cjs/node_modules/strip-ansi": {
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
|
||||
"integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"ansi-regex": "^5.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/strip-ansi": {
|
||||
"version": "7.1.2",
|
||||
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-7.1.2.tgz",
|
||||
"integrity": "sha512-gmBGslpoQJtgnMAvOVqGZpEz9dyoKTCzy2nfz/n8aIFhN/jCE/rCmcxabB6jOOHV+0WNnylOxaxBQPSvcWklhA==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"ansi-regex": "^6.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=12"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/chalk/strip-ansi?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/strip-ansi-cjs": {
|
||||
"name": "strip-ansi",
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
|
||||
"integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"ansi-regex": "^5.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/strip-ansi-cjs/node_modules/ansi-regex": {
|
||||
"version": "5.0.1",
|
||||
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
|
||||
"integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/undici": {
|
||||
"version": "7.16.0",
|
||||
"resolved": "https://registry.npmjs.org/undici/-/undici-7.16.0.tgz",
|
||||
"integrity": "sha512-QEg3HPMll0o3t2ourKwOeUAZ159Kn9mx5pnzHRQO8+Wixmh88YdZRiIwat0iNzNNXn0yoEtXJqFpyW7eM8BV7g==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=20.18.1"
|
||||
}
|
||||
},
|
||||
"node_modules/universalify": {
|
||||
"version": "2.0.1",
|
||||
"resolved": "https://registry.npmjs.org/universalify/-/universalify-2.0.1.tgz",
|
||||
"integrity": "sha512-gptHNQghINnc/vTGIk0SOFGFNXw7JVrlRUtConJRlvaw6DuX0wO5Jeko9sWrMBhh+PsYAZ7oXAiOnf/UKogyiw==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 10.0.0"
|
||||
}
|
||||
},
|
||||
"node_modules/web-streams-polyfill": {
|
||||
"version": "3.3.3",
|
||||
"resolved": "https://registry.npmjs.org/web-streams-polyfill/-/web-streams-polyfill-3.3.3.tgz",
|
||||
"integrity": "sha512-d2JWLCivmZYTSIoge9MsgFCZrt571BikcWGYkjC1khllbTeDlGqZ2D8vD8E/lJa8WGWbb7Plm8/XJYV7IJHZZw==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">= 8"
|
||||
}
|
||||
},
|
||||
"node_modules/whatwg-encoding": {
|
||||
"version": "3.1.1",
|
||||
"resolved": "https://registry.npmjs.org/whatwg-encoding/-/whatwg-encoding-3.1.1.tgz",
|
||||
"integrity": "sha512-6qN4hJdMwfYBtE3YBTTHhoeuUrDBPZmbQaxWAqSALV/MeEnR5z1xd8UKud2RAkFoPkmB+hli1TZSnyi84xz1vQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"iconv-lite": "0.6.3"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=18"
|
||||
}
|
||||
},
|
||||
"node_modules/whatwg-mimetype": {
|
||||
"version": "4.0.0",
|
||||
"resolved": "https://registry.npmjs.org/whatwg-mimetype/-/whatwg-mimetype-4.0.0.tgz",
|
||||
"integrity": "sha512-QaKxh0eNIi2mE9p2vEdzfagOKHCcj1pJ56EEHGQOVxp8r9/iszLUUV7v89x9O1p/T+NlTM5W7jW6+cz4Fq1YVg==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=18"
|
||||
}
|
||||
},
|
||||
"node_modules/which": {
|
||||
"version": "2.0.2",
|
||||
"resolved": "https://registry.npmjs.org/which/-/which-2.0.2.tgz",
|
||||
"integrity": "sha512-BLI3Tl1TW3Pvl70l3yq3Y64i+awpwXqsGBYWkkqMtnbXgrMD+yj7rhW0kuEDxzJaYXGjEW5ogapKNMEKNMjibA==",
|
||||
"license": "ISC",
|
||||
"dependencies": {
|
||||
"isexe": "^2.0.0"
|
||||
},
|
||||
"bin": {
|
||||
"node-which": "bin/node-which"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">= 8"
|
||||
}
|
||||
},
|
||||
"node_modules/wrap-ansi": {
|
||||
"version": "8.1.0",
|
||||
"resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-8.1.0.tgz",
|
||||
"integrity": "sha512-si7QWI6zUMq56bESFvagtmzMdGOtoxfR+Sez11Mobfc7tm+VkUckk9bW2UeffTGVUbOksxmSw0AA2gs8g71NCQ==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"ansi-styles": "^6.1.0",
|
||||
"string-width": "^5.0.1",
|
||||
"strip-ansi": "^7.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=12"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/chalk/wrap-ansi?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/wrap-ansi-cjs": {
|
||||
"name": "wrap-ansi",
|
||||
"version": "7.0.0",
|
||||
"resolved": "https://registry.npmjs.org/wrap-ansi/-/wrap-ansi-7.0.0.tgz",
|
||||
"integrity": "sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"ansi-styles": "^4.0.0",
|
||||
"string-width": "^4.1.0",
|
||||
"strip-ansi": "^6.0.0"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=10"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/chalk/wrap-ansi?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/wrap-ansi-cjs/node_modules/ansi-regex": {
|
||||
"version": "5.0.1",
|
||||
"resolved": "https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.1.tgz",
|
||||
"integrity": "sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==",
|
||||
"license": "MIT",
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/wrap-ansi-cjs/node_modules/ansi-styles": {
|
||||
"version": "4.3.0",
|
||||
"resolved": "https://registry.npmjs.org/ansi-styles/-/ansi-styles-4.3.0.tgz",
|
||||
"integrity": "sha512-zbB9rCJAT1rbjiVDb2hqKFHNYLxgtk8NURxZ3IZwD3F6NtxbXZQCnnSi1Lkx+IDohdPlFp222wVALIheZJQSEg==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"color-convert": "^2.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
},
|
||||
"funding": {
|
||||
"url": "https://github.com/chalk/ansi-styles?sponsor=1"
|
||||
}
|
||||
},
|
||||
"node_modules/wrap-ansi-cjs/node_modules/emoji-regex": {
|
||||
"version": "8.0.0",
|
||||
"resolved": "https://registry.npmjs.org/emoji-regex/-/emoji-regex-8.0.0.tgz",
|
||||
"integrity": "sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==",
|
||||
"license": "MIT"
|
||||
},
|
||||
"node_modules/wrap-ansi-cjs/node_modules/string-width": {
|
||||
"version": "4.2.3",
|
||||
"resolved": "https://registry.npmjs.org/string-width/-/string-width-4.2.3.tgz",
|
||||
"integrity": "sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"emoji-regex": "^8.0.0",
|
||||
"is-fullwidth-code-point": "^3.0.0",
|
||||
"strip-ansi": "^6.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
},
|
||||
"node_modules/wrap-ansi-cjs/node_modules/strip-ansi": {
|
||||
"version": "6.0.1",
|
||||
"resolved": "https://registry.npmjs.org/strip-ansi/-/strip-ansi-6.0.1.tgz",
|
||||
"integrity": "sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==",
|
||||
"license": "MIT",
|
||||
"dependencies": {
|
||||
"ansi-regex": "^5.0.1"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=8"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
30
package.json
@@ -1,30 +0,0 @@
{
  "name": "awesome-docker-website",
  "version": "1.0.0",
  "description": "A curated list of Docker resources and projects Inspired by @sindresorhus and improved by amazing contributors",
  "main": "build.js",
  "scripts": {
    "build": "rimraf ./dist/ && node build.js",
    "test-pr": "node tests/pull_request.mjs",
    "test": "node tests/test_all.mjs",
    "health-check": "node tests/health_check.mjs"
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/veggiemonk/awesome-docker.git"
  },
  "author": "Julien Bisconti <julien.bisconti at hotmail dot com>",
  "license": "Apache-2.0",
  "bugs": {
    "url": "https://github.com/veggiemonk/awesome-docker/issues"
  },
  "homepage": "https://github.com/veggiemonk/awesome-docker#readme",
  "dependencies": {
    "cheerio": "1.1.2",
    "draftlog": "1.0.13",
    "fs-extra": "11.3.2",
    "node-fetch": "3.3.2",
    "rimraf": "6.0.1",
    "showdown": "^2.1.0"
  }
}
108
tests/common.mjs
@@ -1,108 +0,0 @@
import fetch, { isRedirect } from 'node-fetch';
import { readFileSync } from 'fs';

const LINKS_OPTIONS = {
    redirect: 'manual',
    headers: {
        'Content-Type': 'application/json',
        'user-agent':
            'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.149 Safari/537.36',
    },
    timeout: 60000, // 1m (ignored by node-fetch v3; the AbortSignal below enforces it)
    signal: AbortSignal.timeout(60000),
};

const LOG = {
    error: (...args) => console.error('❌ ERROR', args),
    error_string: (...args) =>
        console.error('❌ ERROR', JSON.stringify({ ...args }, null, ' ')),
    debug: (...args) => {
        if (process.env.DEBUG) console.log('>>> DEBUG: ', { ...args });
    },
    debug_string: (...args) => {
        if (process.env.DEBUG)
            console.log('>>> DEBUG: ', JSON.stringify({ ...args }, null, ' '));
    },
};

const handleFailure = (error) => {
    console.error(`${error.message}: ${error.stack}`, { error });
    process.exit(1);
};

process.on('unhandledRejection', handleFailure);

const extract_all_links = (markdown) => {
    // if you have a problem and you try to solve it with a regex,
    // now you have two problems
    // TODO: replace this mess with a markdown parser?
    const re = /(((https:(?:\/\/)?)(?:[-;:&=+$,\w]+@)?[A-Za-z0-9.-]+|(?:www\.|[-;:&=+$,\w]+@)[A-Za-z0-9.-]+)((?:\/[+~%/.\w\-_]*)?\??(?:[-+=&;%@.\w_]*)#?(?:[.!/@\-\\\w]*))?)/g;
    return markdown.match(re);
};

const find_duplicates = (arr) => {
    const hm = {};
    const dup = [];
    arr.forEach((e) => {
        if (hm[e]) dup.push(e);
        else hm[e] = true;
    });
    return dup;
};

const partition = (arr, func) => {
    const ap = [[], []];
    arr.forEach((e) => (func(e) ? ap[0].push(e) : ap[1].push(e)));
    return ap;
};

async function fetch_link(url) {
    try {
        const { headers, ok, status, statusText } = await fetch(url, LINKS_OPTIONS);
        const redirect = isRedirect(status)
            ? { redirect: { src: url, dst: headers.get('location') } }
            : {};
        return [url, { ok, status: statusText, ...redirect }];
    } catch (error) {
        return [url, { ok: false, status: error.message }];
    }
}

async function batch_fetch({ arr, get, post_filter_func, BATCH_SIZE = 8 }) {
    const result = [];
    /* eslint-disable no-await-in-loop */
    for (let i = 0; i < arr.length; i += BATCH_SIZE) {
        const batch = arr.slice(i, i + BATCH_SIZE);
        LOG.debug_string({ batch });
        let res = await Promise.all(batch.map(get));
        console.log(`batch fetched...${i + BATCH_SIZE}`);
        res = post_filter_func ? res.filter(post_filter_func) : res;
        LOG.debug_string({ res });
        result.push(...res);
    }
    return result;
}

const data = readFileSync('./tests/exclude_in_test.json');
const exclude = JSON.parse(data);
const exclude_length = exclude.length;
const exclude_from_list = (link) => {
    let is_excluded = false;
    for (let i = 0; i < exclude_length; i += 1) {
        if (link.startsWith(exclude[i])) {
            is_excluded = true;
            break;
        }
    }
    return is_excluded;
};

export default {
    LOG,
    handleFailure,
    extract_all_links,
    find_duplicates,
    partition,
    fetch_link,
    batch_fetch,
    exclude_from_list,
};
@@ -1,17 +0,0 @@
[
  "https://vimeo.com",
  "https://travis-ci.org/veggiemonk/awesome-docker.svg",
  "https://github.com/apps/",
  "https://twitter.com",
  "https://www.meetup.com/",
  "https://cycle.io/",
  "https://www.manning.com/",
  "https://deepfence.io",
  "https://cdn.rawgit.com/sindresorhus/awesome/d7305f38d29fed78fa85652e3a63e154dd8e8829/media/badge.svg",
  "https://www.se-radio.net/2017/05/se-radio-episode-290-diogo-monica-on-docker-security",
  "https://www.reddit.com/r/docker/",
  "https://www.udacity.com/course/scalable-microservices-with-kubernetes--ud615",
  "https://www.youtube.com/playlist",
  "https://www.aquasec.com",
  "https://cloudsmith.com"
]
@@ -1,206 +0,0 @@
import fs from 'fs-extra';
import fetch from 'node-fetch';
import helper from './common.mjs';

const README = 'README.md';
const GITHUB_GQL_API = 'https://api.github.com/graphql';
const TOKEN = process.env.GITHUB_TOKEN || '';

if (!TOKEN) {
    console.error('GITHUB_TOKEN environment variable is required');
    process.exit(1);
}

const Authorization = `token ${TOKEN}`;

const LOG = {
    info: (...args) => console.log('ℹ️ ', ...args),
    warn: (...args) => console.warn('⚠️ ', ...args),
    error: (...args) => console.error('❌', ...args),
};

// Extract GitHub repos from links
const extract_repos = (arr) =>
    arr
        .map((e) => e.substr('https://github.com/'.length).split('/'))
        .filter((r) => r.length === 2 && r[1] !== '');

// Generate GraphQL query to check repo health
const generate_health_query = (repos) => {
    const repoQueries = repos.map(([owner, name]) => {
        const safeName = `repo_${owner.replace(/(-|\.)/g, '_')}_${name.replace(/(-|\.)/g, '_')}`;
        return `${safeName}: repository(owner: "${owner}", name:"${name}"){
            nameWithOwner
            isArchived
            pushedAt
            createdAt
            stargazerCount
            forkCount
            isDisabled
            isFork
            isLocked
            isPrivate
        }`;
    }).join('\n');

    return `query REPO_HEALTH { ${repoQueries} }`;
};

// Batch repos into smaller chunks for GraphQL
function* batchRepos(repos, size = 50) {
    for (let i = 0; i < repos.length; i += size) {
        yield repos.slice(i, i + size);
    }
}

async function checkRepoHealth(repos) {
    const results = {
        archived: [],
        stale: [], // No commits in 2+ years
        inactive: [], // No commits in 1-2 years
        healthy: [],
        disabled: [],
        total: repos.length,
    };

    const twoYearsAgo = new Date();
    twoYearsAgo.setFullYear(twoYearsAgo.getFullYear() - 2);

    const oneYearAgo = new Date();
    oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);

    LOG.info(`Checking health of ${repos.length} repositories...`);

    for (const batch of batchRepos(repos)) {
        const query = generate_health_query(batch);
        const options = {
            method: 'POST',
            headers: {
                Authorization,
                'Content-Type': 'application/json',
            },
            body: JSON.stringify({ query }),
        };

        try {
            const response = await fetch(GITHUB_GQL_API, options);
            const data = await response.json();

            if (data.errors) {
                LOG.error('GraphQL errors:', data.errors);
                continue;
            }

            for (const [key, repo] of Object.entries(data.data)) {
                if (!repo) continue;

                const pushedAt = new Date(repo.pushedAt);
                const repoInfo = {
                    name: repo.nameWithOwner,
                    pushedAt: repo.pushedAt,
                    stars: repo.stargazerCount,
                    url: `https://github.com/${repo.nameWithOwner}`,
                };

                if (repo.isArchived) {
                    results.archived.push(repoInfo);
                } else if (repo.isDisabled) {
                    results.disabled.push(repoInfo);
                } else if (pushedAt < twoYearsAgo) {
                    results.stale.push(repoInfo);
                } else if (pushedAt < oneYearAgo) {
                    results.inactive.push(repoInfo);
                } else {
                    results.healthy.push(repoInfo);
                }
            }
        } catch (error) {
            LOG.error('Batch fetch error:', error.message);
        }

        // Rate limiting - wait a bit between batches
        await new Promise(resolve => setTimeout(resolve, 1000));
    }

    return results;
}

function generateReport(results) {
    const report = [];

    report.push('# 🏥 Awesome Docker - Health Check Report\n');
    report.push(`**Generated:** ${new Date().toISOString()}\n`);
    report.push(`**Total Repositories:** ${results.total}\n`);

    report.push('\n## 📊 Summary\n');
    report.push(`- ✅ Healthy (updated in last year): ${results.healthy.length}`);
    report.push(`- ⚠️ Inactive (1-2 years): ${results.inactive.length}`);
    report.push(`- 🪦 Stale (2+ years): ${results.stale.length}`);
    report.push(`- 📦 Archived: ${results.archived.length}`);
    report.push(`- 🚫 Disabled: ${results.disabled.length}\n`);

    if (results.archived.length > 0) {
        report.push('\n## 📦 Archived Repositories (Should mark as :skull:)\n');
        results.archived.forEach(repo => {
            report.push(`- [${repo.name}](${repo.url}) - ⭐ ${repo.stars} - Last push: ${repo.pushedAt}`);
        });
    }

    if (results.stale.length > 0) {
        report.push('\n## 🪦 Stale Repositories (No activity in 2+ years)\n');
        results.stale.slice(0, 50).forEach(repo => {
            report.push(`- [${repo.name}](${repo.url}) - ⭐ ${repo.stars} - Last push: ${repo.pushedAt}`);
        });
        if (results.stale.length > 50) {
            report.push(`\n... and ${results.stale.length - 50} more`);
        }
    }

    if (results.inactive.length > 0) {
        report.push('\n## ⚠️ Inactive Repositories (No activity in 1-2 years)\n');
        report.push('_These may still be stable/complete projects - review individually_\n');
        results.inactive.slice(0, 30).forEach(repo => {
            report.push(`- [${repo.name}](${repo.url}) - ⭐ ${repo.stars} - Last push: ${repo.pushedAt}`);
        });
        if (results.inactive.length > 30) {
            report.push(`\n... and ${results.inactive.length - 30} more`);
        }
    }

    return report.join('\n');
}

async function main() {
    const markdown = await fs.readFile(README, 'utf8');
    let links = helper.extract_all_links(markdown);

    const github_links = links.filter(link =>
        link.startsWith('https://github.com') &&
        !helper.exclude_from_list(link) &&
        !link.includes('/issues') &&
        !link.includes('/pull') &&
        !link.includes('/wiki') &&
        !link.includes('#')
    );

    const repos = extract_repos(github_links);
    const results = await checkRepoHealth(repos);

    const report = generateReport(results);

    // Save report
    await fs.writeFile('HEALTH_REPORT.md', report);
    LOG.info('Health report saved to HEALTH_REPORT.md');

    // Also print summary to console
    console.log('\n' + report);

    // Exit with error if there are actionable items
    if (results.archived.length > 0 || results.stale.length > 10) {
        LOG.warn(`Found ${results.archived.length} archived and ${results.stale.length} stale repos`);
        process.exit(1);
    }
}

console.log('Starting health check...');
main();
@@ -1,69 +0,0 @@
import fs from 'fs-extra';
import helper from './common.mjs';

console.log({
    DEBUG: process.env.DEBUG || false,
});

const README = 'README.md';

async function main() {
    const has_error = {
        show: false,
        duplicates: '',
        other_links_error: '',
    };
    const markdown = await fs.readFile(README, 'utf8');
    let links = helper.extract_all_links(markdown);
    links = links.filter((l) => !helper.exclude_from_list(l)); // exclude websites
    helper.LOG.debug_string({ links });

    console.log(`total links to check ${links.length}`);

    console.log('checking for duplicate links...');

    const duplicates = helper.find_duplicates(links);
    if (duplicates.length > 0) {
        has_error.show = true;
        has_error.duplicates = duplicates;
    }
    helper.LOG.debug_string({ duplicates });
    const [github_links, external_links] = helper.partition(links, (link) =>
        link.startsWith('https://github.com'),
    );

    console.log(`checking ${external_links.length} external links...`);

    const external_links_error = await helper.batch_fetch({
        arr: external_links,
        get: helper.fetch_link,
        post_filter_func: (x) => !x[1].ok,
        BATCH_SIZE: 8,
    });
    if (external_links_error.length > 0) {
        has_error.show = true;
        has_error.other_links_error = external_links_error;
    }

    console.log(`checking ${github_links.length} GitHub repositories...`);

    console.log(
        `skipping GitHub repository check. Run "npm run test" to execute them manually.`,
    );

    console.log({
        TEST_PASSED: !has_error.show,
        EXTERNAL_LINKS: external_links.length,
    });

    if (has_error.show) {
        helper.LOG.error_string(has_error);
        process.exit(1);
    }
}

console.log('starting...');
main().catch((error) => {
    console.error('Fatal error:', error);
    process.exit(1);
});
@@ -1,127 +0,0 @@
|
||||
import fs from 'fs-extra';
|
||||
import fetch from 'node-fetch';
|
||||
import helper from './common.mjs';
|
||||
|
||||
function envvar_undefined(variable_name) {
|
||||
throw new Error(`${variable_name} must be defined`);
|
||||
}
|
||||
|
||||
console.log({
|
||||
DEBUG: process.env.DEBUG || false,
|
||||
});
|
||||
|
||||
const README = 'README.md';
|
||||
const GITHUB_GQL_API = 'https://api.github.com/graphql';
|
||||
const TOKEN = process.env.GITHUB_TOKEN || envvar_undefined('GITHUB_TOKEN');
|
||||
|
||||
const Authorization = `token ${TOKEN}`;
|
||||
|
||||
const make_GQL_options = (query) => ({
|
||||
method: 'POST',
|
||||
headers: {
|
||||
Authorization,
|
||||
'Content-Type': 'application/json',
|
||||
'user-agent':
|
||||
'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_4) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.149 Safari/537.36',
|
||||
},
|
||||
body: JSON.stringify({ query }),
|
||||
});
|
||||
|
||||
const extract_repos = (arr) =>
|
||||
arr
|
||||
.map((e) => e.substr('https://github.com/'.length).split('/'))
|
||||
.filter((r) => r.length === 2 && r[1] !== '');
|
||||
|
||||
const generate_GQL_query = (arr) =>
|
||||
`query AWESOME_REPOS{ ${arr
|
||||
.map(
|
||||
([owner, name]) =>
|
||||
`repo_${owner.replace(/(-|\.)/g, '_')}_${name.replace(
|
||||
/(-|\.)/g,
|
||||
'_',
|
||||
)}: repository(owner: "${owner}", name:"${name}"){ nameWithOwner isArchived } `,
|
||||
)
|
||||
.join('')} }`;
|
||||
|
||||
async function main() {
|
||||
const has_error = {
|
||||
show: false,
|
||||
duplicates: '',
|
||||
other_links_error: '',
|
||||
github_repos: '',
|
||||
};
|
||||
const markdown = await fs.readFile(README, 'utf8');
|
||||
let links = helper.extract_all_links(markdown);
|
||||
links = links.filter((l) => !helper.exclude_from_list(l)); // exclude websites
|
||||
helper.LOG.debug_string({ links });
|
||||
|
||||
console.log(`total links to check ${links.length}`);
|
||||
|
||||
console.log('checking for duplicates links...');
|
||||
|
||||
const duplicates = helper.find_duplicates(links);
|
||||
if (duplicates.length > 0) {
|
||||
has_error.show = true;
|
||||
has_error.duplicates = duplicates;
|
||||
}
|
||||
helper.LOG.debug_string({ duplicates });
|
||||
const [github_links, external_links] = helper.partition(links, (link) =>
|
||||
link.startsWith('https://github.com'),
|
||||
);
|
||||
|
||||
console.log(`checking ${external_links.length} external links...`);
|
||||
|
||||
const external_links_error = await helper.batch_fetch({
|
||||
arr: external_links,
|
||||
get: helper.fetch_link,
|
||||
post_filter_func: (x) => !x[1].ok,
|
||||
BATCH_SIZE: 8,
|
||||
});
|
||||
if (external_links_error.length > 0) {
|
||||
has_error.show = true;
|
||||
has_error.other_links_error = external_links_error;
|
||||
}
|
||||

    console.log(`checking ${github_links.length} GitHub repositories...`);

    const repos = extract_repos(github_links);
    const query = generate_GQL_query(repos);
    const options = make_GQL_options(query);
    const gql_response = await fetch(GITHUB_GQL_API, options).then((r) =>
        r.json(),
    );
    if (gql_response.errors) {
        has_error.show = true;
        has_error.github_repos = gql_response.errors;
    }
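`make_GQL_options` is not shown in this diff either; a plausible sketch of the request options it builds, assuming a `GITHUB_TOKEN` environment variable for authentication (both the shape and the token name are assumptions, not confirmed by the source):

```javascript
// Assumed shape of make_GQL_options: a JSON POST to the GraphQL endpoint
// with a bearer token (GITHUB_TOKEN is an assumed environment variable).
const make_GQL_options = (query) => ({
    method: 'POST',
    headers: {
        Authorization: `bearer ${process.env.GITHUB_TOKEN}`,
        'Content-Type': 'application/json',
        'user-agent': 'awesome-docker-link-checker',
    },
    body: JSON.stringify({ query }),
});

const options = make_GQL_options('query { viewer { login } }');
console.log(options.method, JSON.parse(options.body).query);
```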

    // Check for archived repositories
    console.log('checking for archived repositories...');
    const archived_repos = [];
    if (gql_response.data) {
        for (const repo of Object.values(gql_response.data)) {
            if (repo && repo.isArchived) {
                archived_repos.push(repo.nameWithOwner);
            }
        }
    }
    if (archived_repos.length > 0) {
        console.warn(
            `⚠️ Found ${archived_repos.length} archived repositories that should be marked with :skull:`,
        );
        console.warn('Archived repos:', archived_repos);
        // Don't fail the build, just warn
    }
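The archived-repository scan above walks the aliased GraphQL response; with fabricated data, the same logic looks like this:

```javascript
// Illustrative response shape: one aliased entry per repository.
// The data below is fabricated for the example.
const gql_response = {
    data: {
        repo_moby_moby: { nameWithOwner: 'moby/moby', isArchived: false },
        repo_dotcloud_docker: { nameWithOwner: 'dotcloud/docker', isArchived: true },
    },
};

// Aliases are only lookup keys; the values carry everything we need.
const archived = Object.values(gql_response.data)
    .filter((repo) => repo && repo.isArchived)
    .map((repo) => repo.nameWithOwner);

console.log(archived); // → [ 'dotcloud/docker' ]
```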

    console.log({
        TEST_PASSED: !has_error.show,
        GITHUB_REPOSITORY: github_links.length,
        EXTERNAL_LINKS: external_links.length,
    });

    if (has_error.show) {
        helper.LOG.error_string(has_error);
        process.exit(1);
    }
}

console.log('starting...');
main();
@@ -1,229 +0,0 @@
<!DOCTYPE html>
<html class="no-js" lang="en">
    <head>
        <meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
        <meta http-equiv="Cache-control" content="public" />
        <meta charset="UTF-8" />
        <title>Awesome-docker</title>
        <meta name="viewport" content="width=device-width, initial-scale=1" />
        <meta name="theme-color" content="#5DBCD2" />
        <meta
            name="description"
            content="A curated list of Docker resources and projects."
        />
        <meta
            name="keywords"
            content="free and open-source open source projects for docker moby kubernetes linux awesome awesome-list container tools dockerfile list moby docker-container docker-image docker-environment docker-deployment docker-swarm docker-api docker-monitoring docker-machine docker-security docker-registry"
        />
        <meta
            name="google-site-verification"
            content="_yiugvz0gCtfsBLyLl1LnkALXb6D4ofiwCyV1XOlYBM"
        />
        <link rel="icon" type="image/png" href="favicon.png" />
        <style>
            * {
                box-sizing: border-box;
            }

            html {
                font-family: sans-serif;
                -ms-text-size-adjust: 100%;
                -webkit-text-size-adjust: 100%;
            }

            body {
                padding: 0;
                margin: 0;
                font-family: Open Sans, Helvetica Neue, Helvetica, Arial, sans-serif;
                font-size: 16px;
                line-height: 1.5;
                color: #606c71;
            }

            section {
                display: block;
            }

            a {
                background-color: transparent;
                color: #5dbcd2;
                text-decoration: none;
            }

            strong {
                font-weight: 700;
            }

            h1 {
                font-size: 2em;
                margin: 0.67em 0;
            }

            img {
                border: 0;
            }

            svg:not(:root) {
                overflow: hidden;
            }

            .btn {
                display: inline-block;
                margin-bottom: 1rem;
                color: hsla(0, 0%, 100%, 0.7);
                background-color: hsla(0, 0%, 100%, 0.08);
                border: 1px solid hsla(0, 0%, 100%, 0.2);
                border-radius: 0.3rem;
            }

            .page-header {
                color: #fff;
                text-align: center;
                background-color: #5dbcd2;
                background-image: linear-gradient(120deg, #155799, #5dbcd2);
            }

            .project-name {
                margin-top: 0;
                margin-bottom: 0.1rem;
            }

            .project-tagline {
                margin-bottom: 2rem;
                font-weight: 400;
                opacity: 0.7;
            }

            .main-content {
                word-wrap: break-word;
            }

            .main-content :first-child {
                margin-top: 0;
            }

            .main-content h1,
            .main-content h4 {
                margin-top: 2rem;
                margin-bottom: 1rem;
                font-weight: 400;
                color: #5dbcd2;
            }

            .main-content p {
                margin-bottom: 1em;
            }

            .main-content blockquote {
                padding: 0 1rem;
                margin-left: 0;
                color: #819198;
                border-left: 0.3rem solid #dce6f0;
            }

            .main-content blockquote > :first-child {
                margin-top: 0;
            }

            .main-content blockquote > :last-child {
                margin-bottom: 0;
            }

            .main-content img {
                max-width: 100%;
            }

            @media screen and (min-width: 64em) {
                .btn {
                    padding: 0.75rem 1rem;
                }
                .page-header {
                    padding: 5rem 6rem;
                }
                .project-name {
                    font-size: 3.25rem;
                }
                .project-tagline {
                    font-size: 1.25rem;
                }
                .main-content {
                    max-width: 64rem;
                    padding: 2rem 6rem;
                    margin: 0 auto;
                    font-size: 1.1rem;
                }
            }

            @media screen and (min-width: 42em) and (max-width: 64em) {
                .btn {
                    padding: 0.6rem 0.9rem;
                    font-size: 0.9rem;
                }
                .page-header {
                    padding: 3rem 4rem;
                }
                .project-name {
                    font-size: 2.25rem;
                }
                .project-tagline {
                    font-size: 1.15rem;
                }
                .main-content {
                    padding: 2rem 4rem;
                    font-size: 1.1rem;
                }
            }

            @media screen and (max-width: 42em) {
                .btn {
                    display: block;
                    width: 100%;
                    padding: 0.75rem;
                    font-size: 0.9rem;
                }
                .page-header {
                    padding: 2rem 1rem;
                }
                .project-name {
                    font-size: 1.75rem;
                }
                .project-tagline {
                    font-size: 1rem;
                }
                .main-content {
                    padding: 2rem 1rem;
                    font-size: 1rem;
                }
            }
        </style>
    </head>

    <body>
        <section class="page-header">
            <h1 class="project-name">Awesome-docker</h1>
            <h2 class="project-tagline">
                A curated list of Docker resources and projects
            </h2>
            <a href="https://github.com/veggiemonk/awesome-docker" class="btn"
                >View on GitHub</a
            >
            <br />
            <!-- Place this tag where you want the button to render. -->
            <a
                class="github-button"
                href="https://github.com/veggiemonk/awesome-docker#readme"
                data-icon="octicon-star"
                data-size="large"
                data-count-href="/veggiemonk/awesome-docker/stargazers"
                data-show-count="true"
                data-count-aria-label="# stargazers on GitHub"
                aria-label="Star veggiemonk/awesome-docker on GitHub"
                >Star</a
            >
        </section>
        <section id="md" class="main-content"></section>
        <!--<script src="index.js"></script> -->
        <!--Place this tag in your head or just before your close body tag. -->
        <script async defer src="https://buttons.github.io/buttons.js"></script>
    </body>
</html>