feat: Add comprehensive Gitea CI/CD workflows for multi-arch container builds
- Add build-container.yml: Main build pipeline with multi-arch support
- Add pr-check.yml: Pull request validation with comprehensive testing
- Add release.yml: Automated release pipeline with security scanning
- Add nightly.yml: Daily builds with performance testing
- Add health_check.sh: Container health validation script
- Add setup-ci.sh: Local CI/CD environment setup script
- Add comprehensive CI/CD documentation

Features:
- Multi-architecture builds (linux/amd64, linux/arm64)
- Security scanning with Trivy
- Automated PyPI publishing for releases
- Container registry integration
- Performance testing and validation
- Artifact management and cleanup
- Build caching and optimization

Supports full development workflow from PR to production deployment.
373  .gitea/workflows/README.md  Normal file
@@ -0,0 +1,373 @@
# Gitea CI/CD Workflows

This directory contains the CI/CD workflows for UnitForge using Gitea Actions. These workflows automate testing, building, and deploying multi-architecture container images.

## Workflows Overview

### 1. `build-container.yml` - Main Build Pipeline

**Triggers:** Push to `main`/`develop`, tags starting with `v*`

**Features:**
- Runs comprehensive tests (linting, type checking, security)
- Builds multi-arch container images (linux/amd64, linux/arm64)
- Pushes to container registry
- Security scanning with Trivy
- Automatic deployment to staging/production

**Jobs:**
- `test` - Run linting, tests, and security checks
- `build-and-push` - Build and push multi-arch container images
- `security-scan` - Vulnerability scanning
- `deploy-staging` - Deploy to staging (develop branch)
- `deploy-production` - Deploy to production (tags)

### 2. `pr-check.yml` - Pull Request Validation

**Triggers:** Pull requests to `main`/`develop`

**Features:**
- Full test suite including coverage reporting
- Multi-arch build testing (no push)
- Container startup verification
- Configuration validation
- PR summary with build status

**Jobs:**
- `test` - Complete test suite with coverage
- `build-test` - Test multi-arch builds without pushing
- `validate-config` - Validate project configuration
- `pr-summary` - Generate build status summary

### 3. `release.yml` - Release Pipeline

**Triggers:** Tags matching the `v*` pattern

**Features:**
- Version validation and metadata extraction
- Full test suite across all Python versions
- Multi-arch container builds with release tags
- Security scanning with vulnerability blocking
- GitHub release creation with artifacts
- PyPI package publishing (stable releases only)
- Production deployment

**Jobs:**
- `validate-release` - Version format and metadata validation
- `test-and-build` - Comprehensive testing and Python package build
- `build-container` - Multi-arch container build with release tags
- `security-scan` - Security scanning with critical vulnerability blocking
- `create-release` - GitHub release with artifacts and changelog
- `publish-package` - PyPI publishing (stable releases only)
- `deploy-production` - Production deployment
- `notify-release` - Release completion notification

### 4. `nightly.yml` - Nightly Builds

**Triggers:** Daily at 2 AM UTC, manual dispatch

**Features:**
- Change detection (skips the build if there are no commits in the last 24 hours)
- Multi-Python version testing matrix
- Performance testing
- Comprehensive security scanning
- Old image cleanup
- Detailed reporting

**Jobs:**
- `check-changes` - Detect whether a build is needed
- `nightly-tests` - Test across Python versions (3.8-3.12)
- `build-nightly` - Build nightly images with date/commit tags
- `performance-test` - Basic performance validation
- `security-scan-nightly` - Comprehensive security analysis
- `cleanup-old-nightlies` - Remove old nightly images
- `notify-results` - Build status notification

## Configuration

### Required Secrets

Set these secrets in your Gitea repository settings:

```bash
# Container Registry
CONTAINER_REGISTRY_USERNAME=your-registry-username
CONTAINER_REGISTRY_PASSWORD=your-registry-password

# PyPI Publishing (for releases)
PYPI_API_TOKEN=your-pypi-token

# GitHub (if using GitHub releases)
GITHUB_TOKEN=your-github-token
```

### Environment Variables

The workflows use these environment variables:

```yaml
env:
  REGISTRY: gitea-http.taildb3494.ts.net
  IMAGE_NAME: will/unitforge
```

Update these in each workflow file to match your registry and image name.
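If you prefer not to edit every file by hand, a quick substitution can update all workflows at once. This is a minimal sketch; `registry.example.com` and `myorg/unitforge` are placeholder values, not part of this repository:

```bash
# Point every workflow at a different registry and image name (placeholders shown)
sed -i 's|gitea-http.taildb3494.ts.net|registry.example.com|g' .gitea/workflows/*.yml
sed -i 's|will/unitforge|myorg/unitforge|g' .gitea/workflows/*.yml
```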
### Multi-Architecture Support

All workflows build for multiple architectures:
- `linux/amd64` - Standard x86_64 architecture
- `linux/arm64` - ARM64 architecture (Apple Silicon, ARM servers)

This is configured using Docker Buildx with the platform specification:

```yaml
platforms: linux/amd64,linux/arm64
```
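To reproduce a multi-arch build on a local machine, something like the following should work. This is a sketch assuming Docker with the Buildx plugin is installed; the builder name `multiarch` and the tag `unitforge:multiarch-test` are arbitrary:

```bash
# Register QEMU emulators so non-native platforms can be built
docker run --privileged --rm tonistiigi/binfmt --install all

# Create and select a Buildx builder that supports multi-platform builds
docker buildx create --name multiarch --use

# Build (without pushing) for both target platforms
docker buildx build --platform linux/amd64,linux/arm64 -t unitforge:multiarch-test .
```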
## Container Registry Integration

### Image Tags

Different workflows create different image tags:

**Main builds (build-container.yml):**
- `main` - Latest from main branch
- `develop` - Latest from develop branch
- `latest` - Latest stable release

**Release builds (release.yml):**
- `v1.2.3` - Specific version
- `1.2` - Major.minor version
- `1` - Major version (stable releases only)
- `latest` - Latest stable release

**Nightly builds (nightly.yml):**
- `nightly-20240101-abc1234` - Date and commit SHA
- `nightly-latest` - Latest nightly build
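For example, pulling specific builds from the registry looks like this (assuming the default registry configuration above and credentials that can read the repository):

```bash
# Pull the latest stable release
docker pull gitea-http.taildb3494.ts.net/will/unitforge:latest

# Pull a specific released version
docker pull gitea-http.taildb3494.ts.net/will/unitforge:v1.2.3

# Pull the most recent nightly build
docker pull gitea-http.taildb3494.ts.net/will/unitforge:nightly-latest
```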
### Registry Configuration

The workflows are configured for a self-hosted registry. To use a different registry:

1. Update the `REGISTRY` environment variable
2. Ensure authentication secrets are set correctly
3. Verify that the registry supports multi-arch manifests

## Development Workflow

### Branch Strategy

- `main` - Production-ready code
- `develop` - Integration branch for features
- `feature/*` - Feature branches (create PRs to develop)
- `hotfix/*` - Critical fixes (create PRs to main)

### Release Process

1. **Prepare Release:**

   ```bash
   git checkout main
   git pull origin main
   git tag v1.2.3
   git push origin v1.2.3
   ```

2. **Automatic Process:**

   - Release workflow triggers
   - Tests run across all Python versions
   - Multi-arch container images built
   - Security scanning performed
   - GitHub release created
   - PyPI package published (if stable)
   - Production deployment triggered

3. **Manual Verification:**

   - Check workflow completion
   - Verify container images in registry
   - Test deployed application
   - Monitor for issues

### Local Development

Test builds locally using the Makefile:

```bash
# Setup development environment
make setup-dev

# Run tests and linting
make dev

# Build container image locally
make docker-buildx-local

# Test multi-arch build (requires buildx)
make docker-buildx-setup
docker buildx build --platform linux/amd64,linux/arm64 -t unitforge:test .
```

## Debugging Workflows

### Common Issues

1. **Missing Vendor Assets:**

   ```
   Error: Missing bootstrap CSS file
   ```

   Ensure all static assets are committed to the repository.

2. **Registry Authentication:**

   ```
   Error: unauthorized
   ```

   Verify the `CONTAINER_REGISTRY_USERNAME` and `CONTAINER_REGISTRY_PASSWORD` secrets.

3. **Build Platform Issues:**

   ```
   Error: multiple platforms feature is currently not supported
   ```

   Ensure Docker Buildx is properly set up in the runner; the commands below can help confirm the setup.
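A minimal sketch for checking Buildx on a runner or a local machine (output details vary by Docker version):

```bash
# Confirm the Buildx plugin is installed
docker buildx version

# List available builders and the platforms they support
docker buildx ls

# Start the current builder and print its supported platforms
docker buildx inspect --bootstrap
```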
### Workflow Debugging

1. **Enable Debug Logging:**

   Add to the workflow:

   ```yaml
   env:
     ACTIONS_STEP_DEBUG: true
     ACTIONS_RUNNER_DEBUG: true
   ```

2. **Test Locally:**

   Use `act` to test workflows locally:

   ```bash
   act -j test -s CONTAINER_REGISTRY_USERNAME=test -s CONTAINER_REGISTRY_PASSWORD=test
   ```

3. **Check Build Logs:**

   - View detailed logs in the Gitea Actions UI
   - Check the container registry for pushed images
   - Verify security scan results

## Security

### Image Scanning

All container images are scanned for vulnerabilities using Trivy:

- **PR builds:** Informational scanning
- **Main builds:** Upload results to the security dashboard
- **Release builds:** Block on critical vulnerabilities
- **Nightly builds:** Comprehensive analysis
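To reproduce a scan outside the workflows, the same image can be checked locally with the Trivy CLI. This is a sketch assuming Trivy is installed and the image has already been pushed:

```bash
# Scan a pushed image for high and critical findings, mirroring the main pipeline
trivy image --severity CRITICAL,HIGH gitea-http.taildb3494.ts.net/will/unitforge:latest

# Exit non-zero only on critical findings, as the release pipeline does
trivy image --exit-code 1 --severity CRITICAL gitea-http.taildb3494.ts.net/will/unitforge:latest
```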
### Secrets Management

- Use Gitea repository secrets for sensitive data
- Never commit credentials to the repository
- Rotate secrets regularly
- Use least-privilege access

### Build Security

- Multi-stage Dockerfile minimizes attack surface
- Non-root user in containers
- Dependency scanning included
- Static analysis with security tools

## Monitoring and Notifications

### Build Status

Monitor workflow status:
- Gitea Actions dashboard
- Email notifications (configure in Gitea)
- External monitoring (webhook integrations)

### Metrics

Track important metrics:
- Build success rate
- Build duration
- Image size trends
- Security vulnerability counts

### Alerts

Set up alerts for:
- Failed builds on main/develop
- Security vulnerabilities in releases
- Performance regression in nightly builds
- Registry storage usage

## Customization

### Adding New Platforms

To support additional architectures:

1. Update the platform list:

   ```yaml
   platforms: linux/amd64,linux/arm64,linux/arm/v7
   ```

2. Ensure base images support the platform
3. Test builds on the target architecture

### Custom Deployment

Modify the deployment jobs for your infrastructure:

```yaml
deploy-production:
  steps:
    - name: Deploy to Kubernetes
      run: |
        kubectl set image deployment/unitforge \
          unitforge=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ github.ref_name }}
```

### Integration with External Tools

Add steps for external integrations:

```yaml
- name: Update monitoring
  run: |
    curl -X POST "$MONITORING_WEBHOOK" \
      -d "version=${{ github.ref_name }}" \
      -d "image=${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ github.ref_name }}"
```

## Troubleshooting

### Performance Issues

If builds are slow:
- Enable build caching (already configured)
- Use faster runners if available
- Parallelize independent jobs
- Optimize Docker layer caching (see the sketch below)
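One way to sanity-check layer caching locally is to run the same Buildx build twice against a local cache and compare the timings. This is a rough sketch; the cache directory path is arbitrary:

```bash
# First build populates the local cache
time docker buildx build \
  --cache-to type=local,dest=/tmp/unitforge-cache \
  --cache-from type=local,src=/tmp/unitforge-cache \
  -t unitforge:cache-test .

# A second identical build should be noticeably faster if layers are cached effectively
time docker buildx build \
  --cache-to type=local,dest=/tmp/unitforge-cache \
  --cache-from type=local,src=/tmp/unitforge-cache \
  -t unitforge:cache-test .
```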
### Storage Issues

Manage registry storage:
- Implement cleanup policies (see the sketch below)
- Use image compression
- Remove unused layers
- Monitor storage usage
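As one possible cleanup approach, tags can be listed and deleted with skopeo. This is a hedged sketch: whether deletion is permitted depends on the registry's configuration, and the example tag mirrors the nightly naming scheme above:

```bash
# List all tags for the image repository
skopeo list-tags docker://gitea-http.taildb3494.ts.net/will/unitforge

# Delete a specific old nightly tag (requires delete support on the registry)
skopeo delete docker://gitea-http.taildb3494.ts.net/will/unitforge:nightly-20240101-abc1234
```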
### Network Issues

For registry connectivity problems:
- Check network policies
- Verify DNS resolution
- Test the registry endpoint manually
- Review firewall rules

## Contributing

When modifying workflows:

1. Test changes in a feature branch
2. Document any new requirements
3. Update this README if needed
4. Ensure backward compatibility
5. Test with actual builds before merging

For questions or issues with the CI/CD workflows, please create an issue in the repository.
173  .gitea/workflows/build-container.yml  Normal file
@@ -0,0 +1,173 @@
name: Build Multi-Arch Container Image

on:
  push:
    branches:
      - main
      - develop
    tags:
      - "v*"
  pull_request:
    branches:
      - main
      - develop

env:
  REGISTRY: gitea-http.taildb3494.ts.net
  IMAGE_NAME: will/unitforge

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v3
        with:
          version: "latest"

      - name: Set up Python
        run: uv python install 3.11

      - name: Install dependencies
        run: |
          uv venv
          uv pip install -e ".[dev]"

      - name: Run linting
        run: |
          source .venv/bin/activate
          make lint

      - name: Run tests
        run: |
          source .venv/bin/activate
          make test-cov

      - name: Security check
        run: |
          source .venv/bin/activate
          make security-check

  build-and-push:
    needs: test
    runs-on: ubuntu-latest
    if: github.event_name != 'pull_request'

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
        with:
          driver-opts: network=host

      - name: Log in to Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ secrets.CONTAINER_REGISTRY_USERNAME }}
          password: ${{ secrets.CONTAINER_REGISTRY_PASSWORD }}

      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          tags: |
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=semver,pattern={{major}}
            type=raw,value=latest,enable={{is_default_branch}}
            type=sha,prefix={{branch}}-

      - name: Verify vendor assets
        run: |
          if [ ! -f frontend/static/vendor/bootstrap/css/bootstrap.min.css ]; then
            echo "Error: Missing bootstrap CSS file"
            exit 1
          fi
          if [ ! -f frontend/static/vendor/bootstrap/js/bootstrap.bundle.min.js ]; then
            echo "Error: Missing bootstrap JS file"
            exit 1
          fi
          if [ ! -f frontend/static/vendor/fontawesome/css/all.min.css ]; then
            echo "Error: Missing FontAwesome CSS file"
            exit 1
          fi
          if [ ! -f frontend/static/vendor/fontawesome/webfonts/fa-solid-900.woff2 ]; then
            echo "Error: Missing FontAwesome font file"
            exit 1
          fi
          if [ ! -f frontend/static/img/osi-logo.svg ]; then
            echo "Error: Missing OSI logo"
            exit 1
          fi
          echo "All vendor assets verified"

      - name: Build and push multi-arch image
        id: build
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ./Dockerfile
          platforms: linux/amd64,linux/arm64
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
          build-args: |
            BUILDKIT_INLINE_CACHE=1

      - name: Image digest
        run: echo ${{ steps.build.outputs.digest }}

  security-scan:
    needs: build-and-push
    runs-on: ubuntu-latest
    if: github.event_name != 'pull_request'

    steps:
      - name: Run Trivy vulnerability scanner
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ github.sha }}
          format: "sarif"
          output: "trivy-results.sarif"

      - name: Upload Trivy scan results
        uses: github/codeql-action/upload-sarif@v2
        if: always()
        with:
          sarif_file: "trivy-results.sarif"

  deploy-staging:
    needs: [build-and-push, security-scan]
    runs-on: ubuntu-latest
    if: github.ref == 'refs/heads/develop'
    environment: staging

    steps:
      - name: Deploy to staging
        run: |
          echo "Deploying ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:develop to staging environment"
          # Add your staging deployment commands here
          # This could include updating k8s manifests, helm charts, etc.

  deploy-production:
    needs: [build-and-push, security-scan]
    runs-on: ubuntu-latest
    if: startsWith(github.ref, 'refs/tags/v')
    environment: production

    steps:
      - name: Deploy to production
        run: |
          echo "Deploying ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ github.ref_name }} to production environment"
          # Add your production deployment commands here
          # This could include updating k8s manifests, helm charts, etc.
296  .gitea/workflows/nightly.yml  Normal file
@@ -0,0 +1,296 @@
name: Nightly Build

on:
  schedule:
    # Run every night at 2 AM UTC
    - cron: "0 2 * * *"
  workflow_dispatch:
    inputs:
      force_build:
        description: "Force build even if no changes"
        required: false
        default: "false"
        type: boolean

env:
  REGISTRY: gitea-http.taildb3494.ts.net
  IMAGE_NAME: will/unitforge

jobs:
  check-changes:
    runs-on: ubuntu-latest
    outputs:
      should_build: ${{ steps.changes.outputs.should_build }}
      commit_sha: ${{ steps.changes.outputs.commit_sha }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
        with:
          fetch-depth: 2

      - name: Check for changes
        id: changes
        run: |
          # Count commits from the last 24 hours
          YESTERDAY=$(date -d "24 hours ago" --iso-8601)
          RECENT_COMMITS=$(git log --since="$YESTERDAY" --format="%H" | wc -l)

          FORCE_BUILD="${{ github.event.inputs.force_build }}"

          if [[ "$FORCE_BUILD" == "true" ]] || [[ $RECENT_COMMITS -gt 0 ]]; then
            echo "should_build=true" >> $GITHUB_OUTPUT
            echo "Found $RECENT_COMMITS commits in the last 24 hours or force build requested"
          else
            echo "should_build=false" >> $GITHUB_OUTPUT
            echo "No changes in the last 24 hours, skipping build"
          fi

          echo "commit_sha=$(git rev-parse HEAD)" >> $GITHUB_OUTPUT

  nightly-tests:
    needs: check-changes
    runs-on: ubuntu-latest
    if: needs.check-changes.outputs.should_build == 'true'

    strategy:
      matrix:
        python-version: ["3.8", "3.9", "3.10", "3.11", "3.12"]

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v3
        with:
          version: "latest"

      - name: Set up Python ${{ matrix.python-version }}
        run: uv python install ${{ matrix.python-version }}

      - name: Install dependencies
        run: |
          uv venv --python ${{ matrix.python-version }}
          uv pip install -e ".[dev]"

      - name: Run comprehensive tests
        run: |
          source .venv/bin/activate

          # Run all checks
          make lint
          make type-check
          make security-check
          make test-cov

          # Additional nightly-specific tests
          echo "Running extended test suite..."
          python -m pytest tests/ -v --durations=10 --tb=short

      - name: Upload coverage for Python ${{ matrix.python-version }}
        uses: codecov/codecov-action@v3
        if: matrix.python-version == '3.11'
        with:
          file: ./htmlcov/coverage.xml
          flags: nightly-${{ matrix.python-version }}

  build-nightly:
    needs: [check-changes, nightly-tests]
    runs-on: ubuntu-latest
    if: needs.check-changes.outputs.should_build == 'true'

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
        with:
          driver-opts: network=host

      - name: Log in to Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ secrets.CONTAINER_REGISTRY_USERNAME }}
          password: ${{ secrets.CONTAINER_REGISTRY_PASSWORD }}

      - name: Generate nightly tags
        id: tags
        run: |
          COMMIT_SHA="${{ needs.check-changes.outputs.commit_sha }}"
          DATE=$(date +%Y%m%d)
          SHORT_SHA=${COMMIT_SHA:0:7}

          echo "nightly_tag=nightly-${DATE}-${SHORT_SHA}" >> $GITHUB_OUTPUT
          echo "nightly_latest=nightly-latest" >> $GITHUB_OUTPUT

      - name: Verify vendor assets
        run: |
          assets=(
            "frontend/static/vendor/bootstrap/css/bootstrap.min.css"
            "frontend/static/vendor/bootstrap/js/bootstrap.bundle.min.js"
            "frontend/static/vendor/fontawesome/css/all.min.css"
            "frontend/static/vendor/fontawesome/webfonts/fa-solid-900.woff2"
            "frontend/static/img/osi-logo.svg"
          )

          for asset in "${assets[@]}"; do
            if [ ! -f "$asset" ]; then
              echo "Error: Missing required asset: $asset"
              exit 1
            fi
          done
          echo "All vendor assets verified"

      - name: Build and push nightly image
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ./Dockerfile
          platforms: linux/amd64,linux/arm64
          push: true
          tags: |
            ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ steps.tags.outputs.nightly_tag }}
            ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ steps.tags.outputs.nightly_latest }}
          labels: |
            org.opencontainers.image.title=UnitForge Nightly
            org.opencontainers.image.description=Nightly build of UnitForge
            org.opencontainers.image.version=nightly-${{ steps.tags.outputs.nightly_tag }}
            org.opencontainers.image.revision=${{ needs.check-changes.outputs.commit_sha }}
            org.opencontainers.image.created=${{ github.event.repository.pushed_at }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
          build-args: |
            BUILDKIT_INLINE_CACHE=1

  performance-test:
    needs: [check-changes, build-nightly]
    runs-on: ubuntu-latest
    if: needs.check-changes.outputs.should_build == 'true'

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Run performance tests
        run: |
          # Pull the nightly image
          docker pull ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:nightly-latest

          # Start the container
          docker run -d --name unitforge-perf \
            -p 8000:8000 \
            ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:nightly-latest

          # Wait for startup
          sleep 15

          # Basic performance test
          echo "Running basic performance test..."
          for i in {1..10}; do
            curl -s -o /dev/null -w "%{http_code} %{time_total}s\n" \
              http://localhost:8000/
          done

          # Memory usage check
          echo "Checking memory usage..."
          docker stats unitforge-perf --no-stream --format "table {{.Container}}\t{{.CPUPerc}}\t{{.MemUsage}}"

          # Cleanup
          docker stop unitforge-perf
          docker rm unitforge-perf

  security-scan-nightly:
    needs: [check-changes, build-nightly]
    runs-on: ubuntu-latest
    if: needs.check-changes.outputs.should_build == 'true'

    steps:
      - name: Run comprehensive security scan
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:nightly-latest
          format: "sarif"
          output: "trivy-nightly.sarif"

      - name: Upload security scan results
        uses: github/codeql-action/upload-sarif@v2
        if: always()
        with:
          sarif_file: "trivy-nightly.sarif"

      - name: Generate security report
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:nightly-latest
          format: "json"
          output: "security-report.json"

      - name: Upload security report
        uses: actions/upload-artifact@v3
        with:
          name: nightly-security-report
          path: security-report.json

  cleanup-old-nightlies:
    needs: [check-changes, build-nightly]
    runs-on: ubuntu-latest
    if: needs.check-changes.outputs.should_build == 'true'

    steps:
      - name: Clean up old nightly images
        run: |
          echo "Cleaning up nightly images older than 7 days..."

          # Note: This would require registry API access or container registry-specific tools
          # For now, we'll just log what would be cleaned

          CUTOFF_DATE=$(date -d "7 days ago" +%Y%m%d)
          echo "Would clean images tagged before: nightly-${CUTOFF_DATE}"

          # Add actual cleanup logic here based on your registry
          # Examples:
          # - Use registry API to list and delete old tags
          # - Use container registry CLI tools
          # - Use registry-specific cleanup policies

  notify-results:
    needs:
      [
        check-changes,
        nightly-tests,
        build-nightly,
        performance-test,
        security-scan-nightly,
      ]
    runs-on: ubuntu-latest
    if: always() && needs.check-changes.outputs.should_build == 'true'

    steps:
      - name: Generate build report
        run: |
          echo "## Nightly Build Report - $(date)" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Component | Status |" >> $GITHUB_STEP_SUMMARY
          echo "|-----------|--------|" >> $GITHUB_STEP_SUMMARY
          echo "| Tests | ${{ needs.nightly-tests.result == 'success' && '✅ Passed' || '❌ Failed' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Build | ${{ needs.build-nightly.result == 'success' && '✅ Passed' || '❌ Failed' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Performance | ${{ needs.performance-test.result == 'success' && '✅ Passed' || '❌ Failed' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Security | ${{ needs.security-scan-nightly.result == 'success' && '✅ Passed' || '❌ Failed' }} |" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY

          if [[ "${{ needs.nightly-tests.result }}" == "success" && "${{ needs.build-nightly.result }}" == "success" ]]; then
            echo "🌙 Nightly build completed successfully!" >> $GITHUB_STEP_SUMMARY
            echo "📦 Image: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:nightly-latest" >> $GITHUB_STEP_SUMMARY
          else
            echo "❌ Nightly build encountered issues. Check failed jobs above." >> $GITHUB_STEP_SUMMARY
          fi

      - name: Send notification
        if: failure()
        run: |
          echo "🚨 Nightly build failed!"
          echo "Check the workflow run for details: ${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}"

          # Add notification logic here (webhook, email, Slack, etc.)
156  .gitea/workflows/pr-check.yml  Normal file
@@ -0,0 +1,156 @@
name: Pull Request Checks

on:
  pull_request:
    branches:
      - main
      - develop

env:
  REGISTRY: gitea-http.taildb3494.ts.net
  IMAGE_NAME: will/unitforge

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v3
        with:
          version: "latest"

      - name: Set up Python
        run: uv python install 3.11

      - name: Install dependencies
        run: |
          uv venv
          uv pip install -e ".[dev]"

      - name: Run linting
        run: |
          source .venv/bin/activate
          make lint

      - name: Run type checking
        run: |
          source .venv/bin/activate
          make type-check

      - name: Run tests with coverage
        run: |
          source .venv/bin/activate
          make test-cov

      - name: Security check
        run: |
          source .venv/bin/activate
          make security-check

      - name: Upload coverage reports
        uses: codecov/codecov-action@v3
        if: always()
        with:
          file: ./htmlcov/coverage.xml
          flags: unittests
          name: codecov-umbrella

  build-test:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Verify vendor assets
        run: |
          if [ ! -f frontend/static/vendor/bootstrap/css/bootstrap.min.css ]; then
            echo "Error: Missing bootstrap CSS file"
            exit 1
          fi
          if [ ! -f frontend/static/vendor/bootstrap/js/bootstrap.bundle.min.js ]; then
            echo "Error: Missing bootstrap JS file"
            exit 1
          fi
          if [ ! -f frontend/static/vendor/fontawesome/css/all.min.css ]; then
            echo "Error: Missing FontAwesome CSS file"
            exit 1
          fi
          if [ ! -f frontend/static/vendor/fontawesome/webfonts/fa-solid-900.woff2 ]; then
            echo "Error: Missing FontAwesome font file"
            exit 1
          fi
          if [ ! -f frontend/static/img/osi-logo.svg ]; then
            echo "Error: Missing OSI logo"
            exit 1
          fi
          echo "All vendor assets verified"

      - name: Build multi-arch image (test only)
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ./Dockerfile
          platforms: linux/amd64,linux/arm64
          push: false
          tags: unitforge:pr-${{ github.event.number }}
          cache-from: type=gha
          cache-to: type=gha,mode=max

      - name: Test container startup
        run: |
          docker run --rm -d --name unitforge-test -p 8080:8000 unitforge:pr-${{ github.event.number }}
          sleep 10
          # Basic health check
          curl -f http://localhost:8080/health || exit 1
          docker stop unitforge-test

  validate-config:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v3
        with:
          version: "latest"

      - name: Set up Python
        run: uv python install 3.11

      - name: Install dependencies
        run: |
          uv venv
          uv pip install -e ".[dev]"

      - name: Validate configuration
        run: |
          source .venv/bin/activate
          make validate-config

  pr-summary:
    needs: [test, build-test, validate-config]
    runs-on: ubuntu-latest
    if: always()
    steps:
      - name: PR Summary
        run: |
          echo "## Pull Request Build Summary" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          echo "| Check | Status |" >> $GITHUB_STEP_SUMMARY
          echo "|-------|--------|" >> $GITHUB_STEP_SUMMARY
          echo "| Tests | ${{ needs.test.result == 'success' && '✅ Passed' || '❌ Failed' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Build | ${{ needs.build-test.result == 'success' && '✅ Passed' || '❌ Failed' }} |" >> $GITHUB_STEP_SUMMARY
          echo "| Config | ${{ needs.validate-config.result == 'success' && '✅ Passed' || '❌ Failed' }} |" >> $GITHUB_STEP_SUMMARY
          echo "" >> $GITHUB_STEP_SUMMARY
          if [[ "${{ needs.test.result }}" == "success" && "${{ needs.build-test.result }}" == "success" && "${{ needs.validate-config.result }}" == "success" ]]; then
            echo "🎉 All checks passed! This PR is ready for review." >> $GITHUB_STEP_SUMMARY
          else
            echo "❌ Some checks failed. Please review the failed jobs above." >> $GITHUB_STEP_SUMMARY
          fi
299  .gitea/workflows/release.yml  Normal file
@@ -0,0 +1,299 @@
name: Release

on:
  push:
    tags:
      - "v*"

env:
  REGISTRY: gitea-http.taildb3494.ts.net
  IMAGE_NAME: will/unitforge

jobs:
  validate-release:
    runs-on: ubuntu-latest
    outputs:
      version: ${{ steps.extract.outputs.version }}
      is_prerelease: ${{ steps.extract.outputs.is_prerelease }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Extract version info
        id: extract
        run: |
          VERSION=${GITHUB_REF#refs/tags/}
          echo "version=$VERSION" >> $GITHUB_OUTPUT

          # Check if this is a pre-release (contains alpha, beta, rc, etc.)
          if [[ $VERSION =~ (alpha|beta|rc|pre) ]]; then
            echo "is_prerelease=true" >> $GITHUB_OUTPUT
          else
            echo "is_prerelease=false" >> $GITHUB_OUTPUT
          fi

          echo "Releasing version: $VERSION"

      - name: Validate version format
        run: |
          if [[ ! "${{ steps.extract.outputs.version }}" =~ ^v[0-9]+\.[0-9]+\.[0-9]+.*$ ]]; then
            echo "Invalid version format: ${{ steps.extract.outputs.version }}"
            echo "Version must follow semantic versioning (e.g., v1.0.0, v1.0.0-alpha.1)"
            exit 1
          fi

  test-and-build:
    needs: validate-release
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v3
        with:
          version: "latest"

      - name: Set up Python
        run: uv python install 3.11

      - name: Install dependencies
        run: |
          uv venv
          uv pip install -e ".[dev]"

      - name: Run full test suite
        run: |
          source .venv/bin/activate
          make check-all

      - name: Build Python package
        run: |
          source .venv/bin/activate
          make build

      - name: Upload build artifacts
        uses: actions/upload-artifact@v3
        with:
          name: python-package
          path: dist/

  build-container:
    needs: [validate-release, test-and-build]
    runs-on: ubuntu-latest
    outputs:
      image-digest: ${{ steps.build.outputs.digest }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
        with:
          driver-opts: network=host

      - name: Log in to Container Registry
        uses: docker/login-action@v3
        with:
          registry: ${{ env.REGISTRY }}
          username: ${{ secrets.CONTAINER_REGISTRY_USERNAME }}
          password: ${{ secrets.CONTAINER_REGISTRY_PASSWORD }}

      - name: Extract metadata
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          tags: |
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=semver,pattern={{major}},enable=${{ needs.validate-release.outputs.is_prerelease == 'false' }}
            type=raw,value=latest,enable=${{ needs.validate-release.outputs.is_prerelease == 'false' }}

      - name: Verify vendor assets
        run: |
          assets=(
            "frontend/static/vendor/bootstrap/css/bootstrap.min.css"
            "frontend/static/vendor/bootstrap/js/bootstrap.bundle.min.js"
            "frontend/static/vendor/fontawesome/css/all.min.css"
            "frontend/static/vendor/fontawesome/webfonts/fa-solid-900.woff2"
            "frontend/static/img/osi-logo.svg"
          )

          for asset in "${assets[@]}"; do
            if [ ! -f "$asset" ]; then
              echo "Error: Missing required asset: $asset"
              exit 1
            fi
          done
          echo "All vendor assets verified"

      - name: Build and push release image
        id: build
        uses: docker/build-push-action@v5
        with:
          context: .
          file: ./Dockerfile
          platforms: linux/amd64,linux/arm64
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          cache-from: type=gha
          cache-to: type=gha,mode=max
          build-args: |
            BUILDKIT_INLINE_CACHE=1
            VERSION=${{ needs.validate-release.outputs.version }}

  security-scan:
    needs: [validate-release, build-container]
    runs-on: ubuntu-latest
    steps:
      - name: Run Trivy vulnerability scanner
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ needs.validate-release.outputs.version }}
          format: "sarif"
          output: "trivy-results.sarif"
          severity: "CRITICAL,HIGH"

      - name: Upload Trivy scan results
        uses: github/codeql-action/upload-sarif@v2
        if: always()
        with:
          sarif_file: "trivy-results.sarif"

      - name: Fail on critical vulnerabilities
        uses: aquasecurity/trivy-action@master
        with:
          image-ref: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ needs.validate-release.outputs.version }}
          format: "table"
          exit-code: 1
          severity: "CRITICAL"

  create-release:
    needs: [validate-release, test-and-build, build-container, security-scan]
    runs-on: ubuntu-latest
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Download build artifacts
        uses: actions/download-artifact@v3
        with:
          name: python-package
          path: dist/

      - name: Generate changelog
        id: changelog
        run: |
          # Extract changelog for this version
          VERSION=${{ needs.validate-release.outputs.version }}

          # Create release notes
          cat > release-notes.md << EOF
          ## UnitForge $VERSION

          ### Container Images
          - **Multi-arch support**: linux/amd64, linux/arm64
          - **Registry**: \`${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:$VERSION\`
          - **Digest**: \`${{ needs.build-container.outputs.image-digest }}\`

          ### Installation

          #### Docker
          \`\`\`bash
          docker pull ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:$VERSION
          docker run -p 8000:8000 ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:$VERSION
          \`\`\`

          #### Python Package
          \`\`\`bash
          pip install unitforge==$VERSION
          \`\`\`

          ### Verification
          All container images are scanned for security vulnerabilities and signed for authenticity.

          EOF

      - name: Create GitHub Release
        uses: softprops/action-gh-release@v1
        with:
          body_path: release-notes.md
          files: |
            dist/*
          prerelease: ${{ needs.validate-release.outputs.is_prerelease }}
          generate_release_notes: true
          tag_name: ${{ needs.validate-release.outputs.version }}
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}

  publish-package:
    needs: [validate-release, create-release]
    runs-on: ubuntu-latest
    if: needs.validate-release.outputs.is_prerelease == 'false'
    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Install uv
        uses: astral-sh/setup-uv@v3
        with:
          version: "latest"

      - name: Set up Python
        run: uv python install 3.11

      - name: Download build artifacts
        uses: actions/download-artifact@v3
        with:
          name: python-package
          path: dist/

      - name: Publish to PyPI
        run: |
          uv pip install twine
          twine upload dist/* --non-interactive
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.PYPI_API_TOKEN }}

  deploy-production:
    needs: [validate-release, create-release, security-scan]
    runs-on: ubuntu-latest
    if: needs.validate-release.outputs.is_prerelease == 'false'
    environment: production
    steps:
      - name: Deploy to production
        run: |
          echo "🚀 Deploying UnitForge ${{ needs.validate-release.outputs.version }} to production"
          echo "Image: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ needs.validate-release.outputs.version }}"

          # Add your production deployment commands here
          # Examples:
          # - Update Kubernetes manifests
          # - Update Helm charts
          # - Trigger deployment pipeline
          # - Update Docker Swarm services

          echo "✅ Production deployment completed"

  notify-release:
    needs: [validate-release, create-release, deploy-production]
    runs-on: ubuntu-latest
    if: always()
    steps:
      - name: Notify release completion
        run: |
          if [[ "${{ needs.deploy-production.result }}" == "success" ]]; then
            echo "🎉 UnitForge ${{ needs.validate-release.outputs.version }} has been successfully released!"
            echo "📦 Container image: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ needs.validate-release.outputs.version }}"
            echo "🌐 Production deployment: ✅ Complete"
          else
            echo "⚠️ UnitForge ${{ needs.validate-release.outputs.version }} release completed with issues"
            echo "📦 Container image: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:${{ needs.validate-release.outputs.version }}"
            echo "🌐 Production deployment: ❌ Failed or skipped"
          fi

          # Add notification logic here (Slack, Discord, email, etc.)
75  README.md
@@ -478,6 +478,81 @@ For comprehensive guides and references, see the [**Documentation Index**](docs/
- [systemd.socket(5)](https://www.freedesktop.org/software/systemd/man/systemd.socket.html) - Socket units
- [systemd documentation](https://systemd.io/) - Official documentation

## 🔄 CI/CD

UnitForge includes comprehensive CI/CD workflows for automated testing, building, and deployment using Gitea Actions.

### Workflow Overview

- **Pull Request Checks** (`pr-check.yml`) - Validates PRs with tests, builds, and configuration checks
- **Main Build Pipeline** (`build-container.yml`) - Builds and pushes multi-arch container images
- **Release Pipeline** (`release.yml`) - Automated releases with security scanning and PyPI publishing
- **Nightly Builds** (`nightly.yml`) - Daily builds with comprehensive testing and performance checks

### Multi-Architecture Support

All container images are built for multiple architectures:
- `linux/amd64` - Standard x86_64 architecture
- `linux/arm64` - ARM64 architecture (Apple Silicon, ARM servers)
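To confirm that a published tag really carries both architectures, the manifest can be inspected. This is a minimal sketch assuming Docker Buildx is available and the default registry configuration below:

```bash
# Show the per-platform entries behind a multi-arch tag
docker buildx imagetools inspect gitea-http.taildb3494.ts.net/will/unitforge:latest
```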
### Container Registry

Images are pushed to the configured container registry:

```bash
# Default registry configuration
REGISTRY: gitea-http.taildb3494.ts.net
IMAGE_NAME: will/unitforge

# Available tags
latest          # Latest stable release
develop         # Latest development build
v1.2.3          # Specific version
nightly-latest  # Latest nightly build
```

### Local CI/CD Testing

Set up your local environment for CI/CD development:

```bash
# Setup CI/CD environment
./scripts/setup-ci.sh

# Test local builds
./scripts/setup-ci.sh --test-build

# Test container health
./scripts/health_check.sh

# Build multi-arch locally
make docker-buildx-local

# Build and push to registry
make registry-push
```

### Required Secrets

Configure these secrets in your Gitea repository:

```bash
CONTAINER_REGISTRY_USERNAME=your-registry-username
CONTAINER_REGISTRY_PASSWORD=your-registry-password
PYPI_API_TOKEN=your-pypi-token
GITHUB_TOKEN=your-github-token
```

### Workflow Features

- **Automated Testing**: Comprehensive test suite across Python versions
- **Security Scanning**: Trivy vulnerability scanning with blocking on critical issues
- **Performance Testing**: Basic performance validation and memory usage checks
- **Multi-stage Deployment**: Staging and production environment support
- **Artifact Management**: Automatic cleanup of old nightly builds
- **Build Caching**: GitHub Actions cache for faster builds

For detailed CI/CD documentation, see [`.gitea/workflows/README.md`](.gitea/workflows/README.md).

## 🤝 Contributing

**New contributors**: Please see the [**Contributing Guide**](CONTRIBUTING.md) for complete development setup and workflow instructions.
306  scripts/health_check.sh  Executable file
@@ -0,0 +1,306 @@
|
|||||||
|
#!/bin/bash
|
||||||
|
# Health check script for UnitForge CI/CD workflows
|
||||||
|
# Tests basic functionality of the running application
|
||||||
|
|
||||||
|
set -e
|
||||||
|
|
||||||
|
# Configuration
|
||||||
|
HOST=${HOST:-localhost}
|
||||||
|
PORT=${PORT:-8000}
|
||||||
|
TIMEOUT=${TIMEOUT:-30}
|
||||||
|
MAX_RETRIES=${MAX_RETRIES:-5}
|
||||||
|
RETRY_DELAY=${RETRY_DELAY:-2}
|
||||||
|
|
||||||
|
# Colors for output
|
||||||
|
RED='\033[0;31m'
|
||||||
|
GREEN='\033[0;32m'
|
||||||
|
YELLOW='\033[1;33m'
|
||||||
|
NC='\033[0m' # No Color
|
||||||
|
|
||||||
|
# Helper functions
|
||||||
|
log_info() {
|
||||||
|
echo -e "${GREEN}[INFO]${NC} $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
log_warn() {
|
||||||
|
echo -e "${YELLOW}[WARN]${NC} $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
log_error() {
|
||||||
|
echo -e "${RED}[ERROR]${NC} $1"
|
||||||
|
}
|
||||||
|
|
||||||
|
# Check if application is responding
|
||||||
|
check_health() {
|
||||||
|
local url="http://${HOST}:${PORT}/health"
|
||||||
|
local retry_count=0
|
||||||
|
|
||||||
|
log_info "Checking health endpoint: $url"
|
||||||
|
|
||||||
|
while [ $retry_count -lt "$MAX_RETRIES" ]; do
|
||||||
|
if curl -s -f --max-time "$TIMEOUT" "$url" > /dev/null 2>&1; then
|
||||||
|
log_info "Health check passed"
|
||||||
|
return 0
|
||||||
|
fi
|
||||||
|
|
||||||
|
retry_count=$((retry_count + 1))
|
||||||
|
log_warn "Health check failed (attempt $retry_count/$MAX_RETRIES)"
|
||||||
|
|
||||||
|
if [ $retry_count -lt "$MAX_RETRIES" ]; then
|
||||||
|
log_info "Retrying in ${RETRY_DELAY} seconds..."
|
||||||
|
sleep "$RETRY_DELAY"
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
|
||||||
|
log_error "Health check failed after $MAX_RETRIES attempts"
|
||||||
|
return 1
|
||||||
|
}
|
||||||
|
|
||||||
|
# Check if main page loads
|
||||||
|
check_main_page() {
|
||||||
|
local url="http://${HOST}:${PORT}/"
|
||||||
|
log_info "Checking main page: $url"
|
||||||
|
|
||||||
|
local response
|
||||||
|
response=$(curl -s -w "%{http_code}" --max-time "$TIMEOUT" "$url")
|
||||||
|
local http_code="${response: -3}"
|
||||||
|
|
||||||
|
if [ "$http_code" = "200" ]; then
|
||||||
|
log_info "Main page check passed (HTTP $http_code)"
|
||||||
|
return 0
|
||||||
|
else
|
||||||
|
log_error "Main page check failed (HTTP $http_code)"
|
||||||
|
return 1
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# Check API endpoints
|
||||||
|
check_api() {
|
||||||
|
local base_url="http://${HOST}:${PORT}/api"
|
||||||
|
log_info "Checking API endpoints"
|
||||||
|
|
||||||
|
# Check API health
|
||||||
|
local api_health_url="${base_url}/health"
|
||||||
|
if curl -s -f --max-time "$TIMEOUT" "$api_health_url" > /dev/null 2>&1; then
|
||||||
|
log_info "API health endpoint passed"
|
||||||
|
else
|
||||||
|
log_warn "API health endpoint failed or not available"
|
||||||
|
fi
|
||||||
|
|
||||||
|
# Check API version
|
||||||
|
local api_version_url="${base_url}/version"
|
||||||
|
if curl -s -f --max-time "$TIMEOUT" "$api_version_url" > /dev/null 2>&1; then
|
||||||
|
log_info "API version endpoint passed"
|
||||||
|
else
|
||||||
|
log_warn "API version endpoint failed or not available"
|
||||||
|
fi
|
||||||
|
|
||||||
|
return 0
|
||||||
|
}
|
||||||
|
|
||||||
|
# Check static assets
|
||||||
|
check_static_assets() {
|
||||||
|
log_info "Checking static assets"
|
||||||
|
|
||||||
|
local assets=(
|
||||||
|
"/static/css/style.css"
|
||||||
|
"/static/js/app.js"
|
||||||
|
"/static/vendor/bootstrap/css/bootstrap.min.css"
|
||||||
|
"/static/vendor/fontawesome/css/all.min.css"
|
||||||
|
)
|
||||||
|
|
||||||
|
local failed_assets=0
|
||||||
|
|
||||||
|
for asset in "${assets[@]}"; do
|
||||||
|
local url="http://${HOST}:${PORT}${asset}"
|
||||||
|
if curl -s -f --max-time "$TIMEOUT" "$url" > /dev/null 2>&1; then
|
||||||
|
log_info "Asset check passed: $asset"
|
||||||
|
else
|
||||||
|
log_warn "Asset check failed: $asset"
|
||||||
|
failed_assets=$((failed_assets + 1))
|
||||||
|
fi
|
||||||
|
done
|
||||||
|
|
||||||
|
if [ $failed_assets -eq 0 ]; then
|
||||||
|
log_info "All static assets available"
|
||||||
|
return 0
|
||||||
|
else
|
||||||
|
log_warn "$failed_assets static assets failed to load"
|
||||||
|
return 0 # Don't fail health check for missing assets
|
||||||
|
fi
|
||||||
|
}
|
||||||
|
|
||||||
|
# Performance test
check_performance() {
    log_info "Running basic performance test"

    local url="http://${HOST}:${PORT}/"
    local response_time

    # Test response time with a single request; curl's exit status tells us
    # whether the application responded at all
    if response_time=$(curl -s -w "%{time_total}" --max-time "$TIMEOUT" -o /dev/null "$url"); then
        log_info "Response time: ${response_time}s"

        # Check if response time is reasonable (< 5 seconds)
        if (( $(echo "$response_time < 5.0" | bc -l) )); then
            log_info "Performance check passed"
            return 0
        else
            log_warn "Performance check warning: slow response time (${response_time}s)"
            return 0  # Don't fail health check for slow response
        fi
    else
        log_error "Performance check failed: no response"
        return 1
    fi
}

# Memory usage check (if running in container)
check_memory() {
    if command -v docker > /dev/null 2>&1 && [ -n "$CONTAINER_NAME" ]; then
        log_info "Checking container memory usage"

        local memory_usage
        memory_usage=$(docker stats "$CONTAINER_NAME" --no-stream --format "{{.MemUsage}}" | cut -d'/' -f1)

        if [ -n "$memory_usage" ]; then
            log_info "Memory usage: $memory_usage"
        else
            log_warn "Could not determine memory usage"
        fi
    fi
}

# Wait for application to start
wait_for_startup() {
    log_info "Waiting for application to start..."
    local startup_timeout=60
    local elapsed=0

    while [ $elapsed -lt $startup_timeout ]; do
        if curl -s --max-time 5 "http://${HOST}:${PORT}/" > /dev/null 2>&1; then
            log_info "Application is responding"
            return 0
        fi

        sleep 5
        elapsed=$((elapsed + 5))
        log_info "Waiting... (${elapsed}s/${startup_timeout}s)"
    done

    log_error "Application failed to start within ${startup_timeout} seconds"
    return 1
}

# Main health check function
run_health_check() {
    log_info "Starting UnitForge health check"
    log_info "Target: http://${HOST}:${PORT}"

    local failed_checks=0

    # Wait for startup if needed
    if ! curl -s --max-time 5 "http://${HOST}:${PORT}/" > /dev/null 2>&1; then
        wait_for_startup || return 1
    fi

    # Run all checks
    check_health || failed_checks=$((failed_checks + 1))
    check_main_page || failed_checks=$((failed_checks + 1))
    check_api || failed_checks=$((failed_checks + 1))
    check_static_assets || true  # Don't count static asset failures
    check_performance || true    # Don't count performance warnings
    check_memory || true         # Don't count memory check failures

    # Summary
    if [ $failed_checks -eq 0 ]; then
        log_info "✅ All health checks passed"
        return 0
    else
        log_error "❌ $failed_checks health checks failed"
        return 1
    fi
}

# Usage information
usage() {
    echo "Usage: $0 [OPTIONS]"
    echo ""
    echo "Options:"
    echo "  -h, --host HOST          Target host (default: localhost)"
    echo "  -p, --port PORT          Target port (default: 8000)"
    echo "  -t, --timeout TIMEOUT    Request timeout in seconds (default: 30)"
    echo "  -r, --retries RETRIES    Maximum retry attempts (default: 5)"
    echo "  -d, --delay DELAY        Retry delay in seconds (default: 2)"
    echo "  -c, --container NAME     Container name for memory checks"
    echo "  --help                   Show this help message"
    echo ""
    echo "Environment variables:"
    echo "  HOST              Same as --host"
    echo "  PORT              Same as --port"
    echo "  TIMEOUT           Same as --timeout"
    echo "  MAX_RETRIES       Same as --retries"
    echo "  RETRY_DELAY       Same as --delay"
    echo "  CONTAINER_NAME    Same as --container"
    echo ""
    echo "Examples:"
    echo "  $0                                    # Check localhost:8000"
    echo "  $0 -h production.example.com -p 80    # Check production server"
    echo "  $0 -c unitforge-container             # Include container memory check"
}

# Parse command line arguments
while [[ $# -gt 0 ]]; do
    case $1 in
        -h|--host)
            HOST="$2"
            shift 2
            ;;
        -p|--port)
            PORT="$2"
            shift 2
            ;;
        -t|--timeout)
            TIMEOUT="$2"
            shift 2
            ;;
        -r|--retries)
            MAX_RETRIES="$2"
            shift 2
            ;;
        -d|--delay)
            RETRY_DELAY="$2"
            shift 2
            ;;
        -c|--container)
            CONTAINER_NAME="$2"
            shift 2
            ;;
        --help)
            usage
            exit 0
            ;;
        *)
            echo "Unknown option: $1"
            usage
            exit 1
            ;;
    esac
done

# Check dependencies
if ! command -v curl > /dev/null 2>&1; then
    log_error "curl is required but not installed"
    exit 1
fi

if ! command -v bc > /dev/null 2>&1; then
    log_warn "bc is not installed, performance timing may not work properly"
fi

# Run the health check
run_health_check
exit $?

445
scripts/setup-ci.sh
Executable file
@@ -0,0 +1,445 @@
#!/bin/bash
# CI/CD Setup Script for UnitForge
# Sets up local environment for testing CI/CD workflows

set -e

# Colors for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
BLUE='\033[0;34m'
NC='\033[0m' # No Color

# Helper functions
log_info() {
    echo -e "${GREEN}[INFO]${NC} $1"
}

log_warn() {
    echo -e "${YELLOW}[WARN]${NC} $1"
}

log_error() {
    echo -e "${RED}[ERROR]${NC} $1"
}

log_step() {
    echo -e "${BLUE}[STEP]${NC} $1"
}

# Check if command exists
command_exists() {
    command -v "$1" >/dev/null 2>&1
}

# Check Docker and Docker Buildx
check_docker() {
    log_step "Checking Docker installation..."

    if ! command_exists docker; then
        log_error "Docker is not installed"
        echo "Please install Docker from: https://docs.docker.com/get-docker/"
        return 1
    fi

    log_info "Docker found: $(docker --version)"

    # Check if Docker daemon is running
    if ! docker info >/dev/null 2>&1; then
        log_error "Docker daemon is not running"
        echo "Please start Docker daemon"
        return 1
    fi

    # Check Docker Buildx
    if ! docker buildx version >/dev/null 2>&1; then
        log_warn "Docker Buildx not found, installing..."
        docker buildx install 2>/dev/null || true
    fi

    log_info "Docker Buildx found: $(docker buildx version)"

    return 0
}

# Setup Docker Buildx for multi-arch builds
setup_buildx() {
    log_step "Setting up Docker Buildx for multi-arch builds..."

    # Create builder if it doesn't exist
    if ! docker buildx ls | grep -q "unitforge-builder"; then
        log_info "Creating unitforge-builder..."
        docker buildx create --name unitforge-builder --use
    else
        log_info "Using existing unitforge-builder"
        docker buildx use unitforge-builder
    fi

    # Bootstrap the builder
    log_info "Bootstrapping builder..."
    docker buildx inspect --bootstrap

    log_info "Builder setup complete"
    return 0
}

# Check container registry access
check_registry() {
    log_step "Checking container registry configuration..."

    if [ -f ".env" ]; then
        log_info "Found .env file"

        if grep -q "CONTAINER_REGISTRY_URL" .env; then
            local registry_url
            registry_url=$(grep '^CONTAINER_REGISTRY_URL=' .env | cut -d'=' -f2)
            log_info "Registry URL: $registry_url"
        else
            log_warn "CONTAINER_REGISTRY_URL not found in .env"
        fi

        if grep -q "CONTAINER_TAG" .env; then
            local container_tag
            container_tag=$(grep '^CONTAINER_TAG=' .env | cut -d'=' -f2)
            log_info "Container tag: $container_tag"
        else
            log_warn "CONTAINER_TAG not found in .env"
        fi
    else
        log_warn ".env file not found"
        log_info "Creating sample .env file..."

        cat > .env << EOF
# Container Registry Configuration
CONTAINER_REGISTRY_URL=gitea-http.taildb3494.ts.net/will/unitforge
CONTAINER_TAG=latest

# Development Configuration
DEBUG=true
LOG_LEVEL=debug
EOF

        log_info "Sample .env file created. Please update with your registry details."
    fi

    return 0
}

# Verify vendor assets
check_vendor_assets() {
    log_step "Checking vendor assets..."

    local assets=(
        "frontend/static/vendor/bootstrap/css/bootstrap.min.css"
        "frontend/static/vendor/bootstrap/js/bootstrap.bundle.min.js"
        "frontend/static/vendor/fontawesome/css/all.min.css"
        "frontend/static/vendor/fontawesome/webfonts/fa-solid-900.woff2"
        "frontend/static/img/osi-logo.svg"
    )

    local missing_assets=0

    for asset in "${assets[@]}"; do
        if [ ! -f "$asset" ]; then
            log_warn "Missing asset: $asset"
            missing_assets=$((missing_assets + 1))
        else
            log_info "Found asset: $asset"
        fi
    done

    if [ $missing_assets -gt 0 ]; then
        log_error "$missing_assets vendor assets are missing"
        log_info "Please ensure all vendor assets are downloaded and committed"
        log_info "Run: make setup-dev to download missing assets"
        return 1
    fi

    log_info "All vendor assets found"
    return 0
}

# Test local build
test_local_build() {
    log_step "Testing local Docker build..."

    if docker build -t unitforge:test . >/dev/null 2>&1; then
        log_info "Local Docker build successful"

        # Test container startup
        log_info "Testing container startup..."
        if docker run -d --name unitforge-test -p 8080:8000 unitforge:test >/dev/null 2>&1; then
            sleep 5

            if curl -s -f http://localhost:8080/ >/dev/null 2>&1; then
                log_info "Container startup test successful"
            else
                log_warn "Container started but not responding on port 8080"
            fi

            docker stop unitforge-test >/dev/null 2>&1
            docker rm unitforge-test >/dev/null 2>&1
        else
            log_warn "Container startup test failed"
        fi

        # Clean up test image
        docker rmi unitforge:test >/dev/null 2>&1 || true

        return 0
    else
        log_error "Local Docker build failed"
        return 1
    fi
}

# Test multi-arch build
test_multiarch_build() {
    log_step "Testing multi-architecture build..."

    if docker buildx build --platform linux/amd64,linux/arm64 -t unitforge:multiarch-test . >/dev/null 2>&1; then
        log_info "Multi-architecture build successful"

        # Clean up
        docker buildx rm --force >/dev/null 2>&1 || true

        return 0
    else
        log_error "Multi-architecture build failed"
        return 1
    fi
}

# Check development environment
check_dev_environment() {
    log_step "Checking development environment..."

    # Check uv
    if ! command_exists uv; then
        log_error "uv is not installed"
        echo "Install with: curl -LsSf https://astral.sh/uv/install.sh | sh"
        return 1
    fi

    log_info "uv found: $(uv --version)"

    # Check Python
    if ! command_exists python3; then
        log_error "Python 3 is not installed"
        return 1
    fi

    log_info "Python found: $(python3 --version)"

    # Check if virtual environment exists
    if [ -d ".venv" ]; then
        log_info "Virtual environment exists"
    else
        log_warn "Virtual environment not found"
        log_info "Run: make setup-dev to create it"
    fi

    return 0
}

# Test CI/CD workflow syntax
check_workflow_syntax() {
    log_step "Checking workflow syntax..."

    local workflows_dir=".gitea/workflows"

    if [ ! -d "$workflows_dir" ]; then
        log_error "Workflows directory not found: $workflows_dir"
        return 1
    fi

    local yaml_files=("$workflows_dir"/*.yml)

    if [ ${#yaml_files[@]} -eq 0 ]; then
        log_warn "No YAML workflow files found"
        return 0
    fi

    local syntax_errors=0

    for file in "${yaml_files[@]}"; do
        if [ -f "$file" ]; then
            log_info "Checking syntax: $(basename "$file")"

            # Basic YAML syntax check (if python3 is available)
            if command_exists python3; then
                if python3 -c "import yaml; yaml.safe_load(open('$file'))" 2>/dev/null; then
                    log_info "✓ $(basename "$file") syntax OK"
                else
                    log_error "✗ $(basename "$file") has syntax errors"
                    syntax_errors=$((syntax_errors + 1))
                fi
            else
                log_warn "Python3 not available, skipping YAML syntax check"
            fi
        fi
    done

    if [ $syntax_errors -eq 0 ]; then
        log_info "All workflow files have valid syntax"
        return 0
    else
        log_error "$syntax_errors workflow files have syntax errors"
        return 1
    fi
}

# Generate CI/CD documentation
generate_docs() {
    log_step "Generating CI/CD documentation..."

    local docs_dir="docs/ci-cd"
    mkdir -p "$docs_dir"

    cat > "$docs_dir/local-testing.md" << 'EOF'
# Local CI/CD Testing

This guide helps you test CI/CD workflows locally before pushing to the repository.

## Prerequisites

- Docker with Buildx support
- uv package manager
- Python 3.8+

## Local Testing Commands

```bash
# Test local build
make docker-build

# Test multi-arch build
make docker-buildx-local

# Test full development workflow
make dev

# Run health checks
./scripts/health_check.sh
```

## Workflow Testing

Use `act` to test GitHub/Gitea workflows locally:

```bash
# Install act
curl https://raw.githubusercontent.com/nektos/act/master/install.sh | sudo bash

# Test PR workflow
act pull_request -s CONTAINER_REGISTRY_USERNAME=test -s CONTAINER_REGISTRY_PASSWORD=test

# Test release workflow
act push -e tests/fixtures/release-event.json
```

## Troubleshooting

### Build Issues
- Ensure all vendor assets are committed
- Check Docker daemon is running
- Verify buildx is properly configured (see the check below)
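
A quick way to confirm the build environment (a minimal sketch; the builder name assumes the `unitforge-builder` created by `setup-ci.sh`):

```bash
# Confirm the daemon is reachable, then list builders and supported platforms
docker info --format '{{.ServerVersion}}'
docker buildx ls
docker buildx inspect unitforge-builder --bootstrap
```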

### Registry Issues
- Check .env file configuration
- Verify registry credentials
- Test registry connectivity (see the example below)
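
For example (a sketch only; substitute the registry host and credentials from your `.env`, and adjust the scheme if your registry is not served over HTTPS):

```bash
# Log in and confirm the registry API answers
docker login gitea-http.taildb3494.ts.net
curl -sf https://gitea-http.taildb3494.ts.net/v2/ && echo "registry reachable"
```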

### Performance Issues
- Use build cache: `--cache-from type=gha` (see the sketch below)
- Optimize Docker layers
- Use multi-stage builds
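
A minimal sketch of a cached multi-arch build. The `buildcache` tag is an assumption rather than an existing tag in this repository, and `type=gha` relies on the GitHub Actions cache service, so a registry-backed cache may be the safer choice on Gitea runners:

```bash
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  --cache-from type=registry,ref=gitea-http.taildb3494.ts.net/will/unitforge:buildcache \
  --cache-to type=registry,ref=gitea-http.taildb3494.ts.net/will/unitforge:buildcache,mode=max \
  -t gitea-http.taildb3494.ts.net/will/unitforge:latest \
  --push .
```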
EOF

    log_info "Local testing documentation generated: $docs_dir/local-testing.md"

    return 0
}

# Main setup function
main() {
    echo -e "${BLUE}========================================${NC}"
    echo -e "${BLUE} UnitForge CI/CD Setup${NC}"
    echo -e "${BLUE}========================================${NC}"
    echo ""

    local failed_checks=0

    # Run all checks
    check_docker || failed_checks=$((failed_checks + 1))
    setup_buildx || failed_checks=$((failed_checks + 1))
    check_registry || true  # Don't fail on registry issues
    check_vendor_assets || failed_checks=$((failed_checks + 1))
    check_dev_environment || failed_checks=$((failed_checks + 1))
    check_workflow_syntax || failed_checks=$((failed_checks + 1))

    # Optional tests
    if [ "$1" = "--test-build" ]; then
        test_local_build || failed_checks=$((failed_checks + 1))
        test_multiarch_build || failed_checks=$((failed_checks + 1))
    fi

    # Generate documentation
    generate_docs || true

    echo ""
    echo -e "${BLUE}========================================${NC}"

    if [ $failed_checks -eq 0 ]; then
        log_info "✅ CI/CD setup completed successfully!"
        echo ""
        echo "Next steps:"
        echo "1. Update .env with your registry details"
        echo "2. Test local build: make docker-buildx-local"
        echo "3. Run full test suite: make dev"
        echo "4. Check workflow syntax: ./scripts/setup-ci.sh"
        echo ""
        echo "For testing builds:"
        echo "  ./scripts/setup-ci.sh --test-build"
    else
        log_error "❌ CI/CD setup completed with $failed_checks issues"
        echo ""
        echo "Please fix the issues above before proceeding."
        exit 1
    fi

    echo -e "${BLUE}========================================${NC}"
}

# Usage information
usage() {
    echo "Usage: $0 [OPTIONS]"
    echo ""
    echo "Options:"
    echo "  --test-build    Run local and multi-arch build tests"
    echo "  --help          Show this help message"
    echo ""
    echo "This script sets up your local environment for CI/CD development."
    echo "It checks Docker, Buildx, dependencies, and workflow syntax."
}

# Parse command line arguments
case "${1:-}" in
    --help)
        usage
        exit 0
        ;;
    --test-build)
        main --test-build
        ;;
    "")
        main
        ;;
    *)
        echo "Unknown option: $1"
        usage
        exit 1
        ;;
esac