Compare commits

1 commit

Author: Oussama Douhou | SHA: 336de62f7b | docs: add release branch workflow for selective feature inclusion | 2026-01-08 19:49:14 +01:00
7 changed files with 405 additions and 249 deletions

GIT_STRATEGY.md (new file, 402 additions)

@@ -0,0 +1,402 @@
# Git Strategy and Workflow - Oh My OpenCode Free Fork
This document outlines the git workflow and branching strategy for the oh-my-opencode-free fork, ensuring clean development, easy upstream updates, and professional collaboration.
## 🏗️ Branch Structure
### Main Branches
| Branch | Purpose | Protection | Upstream Tracking |
|--------|---------|------------|-------------------|
| **`dev`** | Main development branch | ✅ Protected | `origin/dev` |
| **`main`** | Stable releases (future) | ✅ Protected | - |
### Release Branch (Selective Features)
| Branch | Purpose | Protection |
|--------|---------|------------|
| **`release`** | Production-ready with selected features only | ✅ Protected |
The `release` branch contains **only approved features** cherry-picked from `dev`. This allows:
- Excluding experimental or unstable features
- Building Docker images with specific feature sets
- Clean separation between development and production
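If the branch does not exist yet, its one-time creation can be sketched as follows (starting it at `origin/dev` is an assumption; pick whichever upstream commit should serve as the production baseline):

```shell
# One-time setup: start release from the upstream base (origin/dev is assumed
# here; substitute the commit you want as the production baseline)
git fetch origin
git checkout -b release origin/dev
git push -u gitea release
```

After this, features land on `release` only through the explicit merges and cherry-picks described in this section.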
### Feature Branches
| Branch Type | Naming Convention | Example |
|-------------|------------------|---------|
| **Features** | `feature/<feature-name>` | `feature/todo-codebase-compaction` |
| **Bug Fixes** | `fix/<issue-description>` | `fix/compaction-threshold-calculation` |
| **Refactoring** | `refactor/<component>` | `refactor/hook-injection-logic` |
| **Experiments** | `experiment/<idea>` | `experiment/alternative-compaction-prompt` |
## 🎯 Release Branch Workflow (Selective Features)
The `release` branch is used by the `oh-my-opencode-free` Docker build. Only merge features you want in production.
### Branch Flow Diagram
```
upstream (origin/dev) ──────────────────────────────► upstream updates

dev ─────┬──────────────────────────────────────────► all features
         ├── feature/todo-compaction    ✓ merge to release
         ├── feature/parallel-agents    ✓ merge to release
         └── feature/experimental       ✗ skip (not ready)

release ────────────────────────────────────────────► selected features only
         └──► Docker build (oh-my-opencode-free)
```
### Adding Features to Release
```bash
# Ensure release branch exists and is current
git checkout release
git pull gitea release
# Merge a specific feature from dev
git merge feature/todo-compaction --no-ff -m "release: add todo-compaction feature"
# Or cherry-pick specific commits
git cherry-pick <commit-hash>
# Test the build
bun install && bun run build && bun run typecheck
# Push to Gitea (triggers Docker rebuild if CI configured)
git push gitea release
```
### Removing Features from Release
```bash
# Find the merge commit
git log --oneline --merges release
# Revert the merge
git revert -m 1 <merge-commit-hash>
# Push
git push gitea release
```
### Syncing Release with Upstream
```bash
# Update dev from upstream first
git checkout dev
git fetch origin
git merge origin/dev
# Then selectively update release
git checkout release
# Option 1: Rebase release onto the updated dev (use --rebase-merges so the
# feature merge commits survive; note this brings every dev feature along)
git rebase --rebase-merges dev
# Option 2: Reset and re-merge features (clean slate)
git reset --hard dev~5  # Rewind past the feature merges (adjust ~5 to match)
git merge feature/todo-compaction --no-ff
git merge feature/parallel-agents --no-ff
# Skip features you don't want
git push gitea release --force-with-lease
```
### Listing Features in Release vs Dev
```bash
# Show what's in release but not in dev (release-only merges and cherry-picks)
git log --oneline release ^dev
# Show what's in dev but not in release (excluded features)
git log --oneline dev ^release
```
---
## 🚀 Development Workflow
### 1. Starting a New Feature
```bash
# Ensure you're on latest dev
git checkout dev
git pull origin dev
# Create feature branch
git checkout -b feature/your-feature-name
# Develop and commit
git add .
git commit -m "feat: Add your feature description"
# Push feature branch (optional, for collaboration)
git push -u origin feature/your-feature-name
```
### 2. Feature Development
```bash
# Regular development cycle
git add .
git commit -m "feat: Implement core logic"
# Test your changes
bun run build && bun run typecheck
# Make more commits as needed
git commit -m "fix: Handle edge case"
```
### 3. Merging to Dev
```bash
# Switch to dev and ensure it's up to date
git checkout dev
git pull origin dev
# Merge feature branch
git merge feature/your-feature-name
# If merge conflicts, resolve them:
# 1. Edit conflicted files
# 2. git add <resolved-files>
# 3. git commit
# Push merged changes
git push origin dev
# Clean up feature branch
git branch -d feature/your-feature-name
```
### 4. Handling Upstream Updates
```bash
# Fetch latest upstream changes
git fetch origin
# Check what changed
git log --oneline origin/dev ^dev
# Option 1: Rebase (clean history)
git rebase origin/dev
# Option 2: Merge (preserve history)
git merge origin/dev
# Resolve any conflicts and push
git push origin dev
```
## 📋 Commit Message Convention
Follow conventional commits for consistency:
```
type(scope): description
[optional body]
[optional footer]
```
### Types
- **`feat`**: New features
- **`fix`**: Bug fixes
- **`docs`**: Documentation
- **`style`**: Code style changes
- **`refactor`**: Code refactoring
- **`test`**: Testing
- **`chore`**: Maintenance
### Examples
```
feat: Add custom todo+codebase compaction hook
fix: Resolve compaction threshold calculation bug
docs: Update git strategy documentation
refactor: Simplify hook injection logic
```
## 🔄 Update Strategy for Upstream Changes
### Scenario: Upstream Releases New Features
```bash
# Backup current state
git branch backup-$(date +%Y%m%d)
# Option A: Rebase (Recommended)
git fetch origin
git rebase origin/dev
# Resolve conflicts if any
git rebase --continue
# Option B: Merge
git fetch origin
git merge origin/dev
# Test that custom features still work
bun install && bun run build && bun run typecheck
# Push updated dev
git push origin dev
```
### Conflict Resolution Guidelines
1. **Prioritize Custom Features**: Keep your custom logic when conflicts occur
2. **Test Thoroughly**: Ensure custom compaction still works after merges
3. **Document Changes**: Update this doc if workflow changes
4. **Backup First**: Always create backups before major operations
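For guideline 1, keeping your side of a conflicted file during a merge can be sketched like this (the path is illustrative; note that during a rebase the meaning of `--ours`/`--theirs` is reversed, because the branch being replayed is "theirs"):

```shell
# While a merge is stopped on conflicts, keep our version of a file wholesale
# (review the upstream hunks you are discarding before committing)
git checkout --ours src/hooks/todo-codebase-compaction/index.ts
git add src/hooks/todo-codebase-compaction/index.ts
git merge --continue
```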
## 🛡️ Quality Assurance
### Pre-Merge Checklist
- [ ] **Build passes**: `bun run build`
- [ ] **Types check**: `bun run typecheck`
- [ ] **Tests pass**: `bun run test` (if applicable)
- [ ] **Custom features work**: Test compaction, agents, etc.
- [ ] **No console errors**: Clean build output
- [ ] **Documentation updated**: README, this doc, etc.
### Post-Merge Verification
```bash
# Full verification script (stop at the first failing check)
set -e
bun install
bun run build
bun run typecheck
# Test custom features
ls src/hooks/todo-codebase-compaction/
grep "custom_compaction" src/config/schema.ts
echo "✅ All checks passed!"
```
## 🤝 Collaboration Guidelines
### For Team Members
1. **Never commit directly to `dev`** - Always use feature branches
2. **Keep feature branches focused** - One feature per branch
3. **Regular rebasing** - Keep up with upstream changes
4. **Clear commit messages** - Follow conventional commits
5. **Test before merging** - Ensure no regressions
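A lightweight way to enforce point 5 locally is a pre-push hook that runs the build first. This is a sketch; hooks live in `.git/hooks` and are not versioned, so each collaborator installs it themselves:

```shell
# Install a pre-push hook that blocks the push when the build fails
cat > .git/hooks/pre-push <<'EOF'
#!/bin/sh
bun run build && bun run typecheck
EOF
chmod +x .git/hooks/pre-push
```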
### Code Review Process
```bash
# Before merging, get review
gh pr create --base dev --head feature/your-feature
# Or manual review (triple-dot: only the changes made on the feature branch)
git diff dev...feature/your-feature
```
## 🏷️ Tagging Strategy
### Version Tags (Future)
```
v1.0.0 # First stable release
v1.1.0 # Feature release
v1.1.1 # Bug fix release
```
### Feature Tags
```
feature/compaction-v1 # Major feature milestone
experiment/alternative-prompts # Experimental features
```
## 🚨 Emergency Procedures
### Lost Commits
```bash
# Find lost commits
git reflog
# Restore from reflog
git checkout <commit-hash>
git branch recovery-branch
```
### Upstream Conflicts
```bash
# Abort and try different strategy
git rebase --abort
git merge origin/dev # Try merge instead
# Or create fresh feature branch
git checkout -b feature/fresh-start origin/dev
# Re-implement changes
```
### Repository Corruption
```bash
# Fresh clone and reapply changes
cd ..
rm -rf oh-my-opencode-free-fork
git clone <your-gitea-url> oh-my-opencode-free-fork
cd oh-my-opencode-free-fork
# Reapply custom changes from backups
```
## 📊 Metrics and Monitoring
### Branch Health
- **Max feature branch age**: 2 weeks
- **Max dev branch divergence**: 5 commits
- **Required reviews**: 1 reviewer for merges
### Quality Metrics
- **Build success rate**: >95%
- **Test coverage**: Track and improve
- **Merge conflict rate**: <10%
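The branch-health numbers above can be checked with plain git; a sketch:

```shell
# Feature branches sorted by last commit date; anything older than ~2 weeks
# is a candidate for rebasing or closing
git for-each-ref --sort=committerdate \
  --format='%(committerdate:short) %(refname:short)' refs/heads/feature/
# How far dev has diverged from upstream (target: <= 5 commits)
git rev-list --count origin/dev..dev
```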
## 🎯 Best Practices
### General
- **Small, focused commits** - Easier to review and revert
- **Regular pushes** - Don't accumulate large changes
- **Clear naming** - Descriptive branch and commit names
- **Documentation** - Keep this and README updated
### Custom Features
- **Isolate custom code** - Keep it separate from upstream changes
- **Test compatibility** - Ensure custom features work with upstream updates
- **Document overrides** - Note where you deviate from upstream
- **Plan for conflicts** - Have strategies for resolving upstream conflicts
### Performance
- **Fast-forward merges** - Prefer rebase for clean history
- **Shallow clones** - For CI/CD if needed
- **Branch cleanup** - Delete merged branches regularly
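The last two points can be sketched as follows (the clone URL is a placeholder, and the cleanup pipeline assumes GNU xargs; review the branch list before running it):

```shell
# Shallow clone of only the release branch for CI builds
git clone --depth 1 --branch release <your-gitea-url> oh-my-opencode-free
# Delete local branches that are already merged into dev
git branch --merged dev | grep -vE '\*|^ *(dev|release)$' | xargs -r git branch -d
```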
## 📚 Resources
- [Git Flow](https://nvie.com/posts/a-successful-git-branching-model/)
- [Conventional Commits](https://conventionalcommits.org/)
- [Git Rebase vs Merge](https://www.atlassian.com/git/tutorials/merging-vs-rebasing)
---
## 📝 Change Log
- **2026-01-08**: Initial git strategy documentation
- **2026-01-08**: Added custom compaction workflow
- **2026-01-08**: Established feature branch workflow
---
*This strategy ensures clean, maintainable development while preserving your custom oh-my-opencode features.*

@@ -77,8 +77,7 @@
 "edit-error-recovery",
 "prometheus-md-only",
 "start-work",
-"sisyphus-orchestrator",
-"todo-codebase-compaction"
+"sisyphus-orchestrator"
 ]
 }
 },

@@ -82,8 +82,6 @@ export const HookNameSchema = z.enum([
 "prometheus-md-only",
 "start-work",
 "sisyphus-orchestrator",
-"todo-codebase-compaction",
-"usage-logging",
 ])
 export const BuiltinCommandNameSchema = z.enum([

@@ -29,5 +29,3 @@ export { createPrometheusMdOnlyHook } from "./prometheus-md-only";
 export { createTaskResumeInfoHook } from "./task-resume-info";
 export { createStartWorkHook } from "./start-work";
 export { createSisyphusOrchestratorHook } from "./sisyphus-orchestrator";
-export { createTodoCodebaseCompactionInjector, createCustomCompactionHook } from "./todo-codebase-compaction";
-export { createUsageLoggingHook } from "./usage-logging";

@@ -1,4 +1,4 @@
-import type { SummarizeContext, PreemptiveCompactionOptions } from "../preemptive-compaction"
+import type { SummarizeContext } from "../preemptive-compaction"
 import { injectHookMessage } from "../../features/hook-message-injector"
 import { log } from "../../shared/logger"
 import { createPreemptiveCompactionHook } from "../preemptive-compaction"

@@ -1,229 +0,0 @@
import { createHash } from 'crypto';
interface LogEvent {
timestamp: string;
stack_name: string;
session_id: string;
event_type: string;
data: Record<string, unknown>;
}
interface UsageLoggingOptions {
ingestUrl?: string;
stackName?: string;
batchSize?: number;
flushIntervalMs?: number;
}
const DEFAULT_INGEST_URL = process.env.LOG_INGEST_URL || 'http://10.100.0.20:3102/ingest';
const DEFAULT_STACK_NAME = process.env.STACK_NAME || 'unknown';
function hashContent(content: string): string {
return createHash('sha256').update(content).digest('hex').substring(0, 16);
}
function getWordCount(content: string): number {
return content.split(/\s+/).filter(Boolean).length;
}
export function createUsageLoggingHook(options: UsageLoggingOptions = {}) {
const {
ingestUrl = DEFAULT_INGEST_URL,
stackName = DEFAULT_STACK_NAME,
batchSize = 10,
flushIntervalMs = 5000,
} = options;
const enabled = process.env.USAGE_LOGGING_ENABLED !== 'false';
if (!enabled) {
return { event: async () => {} };
}
const eventBuffer: LogEvent[] = [];
let flushTimer: ReturnType<typeof setTimeout> | null = null;
const sessionStats = new Map<string, {
startTime: number;
messageCount: number;
toolUseCount: number;
tokensIn: number;
tokensOut: number;
}>();
async function flushEvents(): Promise<void> {
if (eventBuffer.length === 0) return;
const eventsToSend = [...eventBuffer];
eventBuffer.length = 0;
try {
const response = await fetch(`${ingestUrl}/batch`, {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(eventsToSend),
signal: AbortSignal.timeout(5000)
});
if (!response.ok) {
console.error(`[UsageLogging] Failed to send events: ${response.status}`);
}
} catch (error) {
console.error('[UsageLogging] Error sending events:', error);
}
}
function queueEvent(sessionId: string, eventType: string, data: Record<string, unknown>): void {
eventBuffer.push({
timestamp: new Date().toISOString(),
stack_name: stackName,
session_id: sessionId,
event_type: eventType,
data
});
if (eventBuffer.length >= batchSize) {
flushEvents();
} else if (!flushTimer) {
flushTimer = setTimeout(() => {
flushTimer = null;
flushEvents();
}, flushIntervalMs);
}
}
function getOrCreateSessionStats(sessionId: string) {
if (!sessionStats.has(sessionId)) {
sessionStats.set(sessionId, {
startTime: Date.now(),
messageCount: 0,
toolUseCount: 0,
tokensIn: 0,
tokensOut: 0
});
}
return sessionStats.get(sessionId)!;
}
return {
event: async (input: { event: { type: string; properties?: Record<string, unknown> } }) => {
const { event } = input;
const props = event.properties || {};
try {
switch (event.type) {
case 'session.created': {
const info = props.info as { id?: string } | undefined;
if (info?.id) {
getOrCreateSessionStats(info.id);
queueEvent(info.id, 'session_start', {
start_time: Date.now()
});
}
break;
}
case 'session.deleted': {
const info = props.info as { id?: string } | undefined;
if (info?.id) {
const stats = sessionStats.get(info.id);
if (stats) {
queueEvent(info.id, 'session_end', {
duration_ms: Date.now() - stats.startTime,
total_messages: stats.messageCount,
total_tool_uses: stats.toolUseCount,
total_tokens_in: stats.tokensIn,
total_tokens_out: stats.tokensOut
});
sessionStats.delete(info.id);
}
}
break;
}
case 'message.created': {
const sessionID = props.sessionID as string | undefined;
const message = props.message as { role?: string; content?: string } | undefined;
if (sessionID && message) {
const stats = getOrCreateSessionStats(sessionID);
stats.messageCount++;
const content = typeof message.content === 'string'
? message.content
: JSON.stringify(message.content || '');
queueEvent(sessionID, 'message', {
role: message.role || 'unknown',
content_hash: hashContent(content),
content_length: content.length,
word_count: getWordCount(content),
message_number: stats.messageCount
});
}
break;
}
case 'tool.started': {
const sessionID = props.sessionID as string | undefined;
const tool = props.tool as { name?: string } | undefined;
if (sessionID && tool?.name) {
queueEvent(sessionID, 'tool_start', {
tool: tool.name,
start_time: Date.now()
});
}
break;
}
case 'tool.completed': {
const sessionID = props.sessionID as string | undefined;
const tool = props.tool as { name?: string } | undefined;
const result = props.result as { error?: unknown } | undefined;
if (sessionID && tool?.name) {
const stats = getOrCreateSessionStats(sessionID);
stats.toolUseCount++;
queueEvent(sessionID, 'tool_use', {
tool: tool.name,
success: !result?.error,
error_message: result?.error ? String(result.error) : undefined,
tool_use_number: stats.toolUseCount
});
}
break;
}
case 'session.error': {
const sessionID = props.sessionID as string | undefined;
const error = props.error as { message?: string; code?: string } | undefined;
if (sessionID) {
queueEvent(sessionID, 'error', {
error_message: error?.message || 'Unknown error',
error_code: error?.code
});
}
break;
}
case 'tokens.used': {
const sessionID = props.sessionID as string | undefined;
const usage = props.usage as { input_tokens?: number; output_tokens?: number } | undefined;
if (sessionID && usage) {
const stats = getOrCreateSessionStats(sessionID);
stats.tokensIn += usage.input_tokens || 0;
stats.tokensOut += usage.output_tokens || 0;
queueEvent(sessionID, 'tokens', {
tokens_in: usage.input_tokens,
tokens_out: usage.output_tokens,
total_tokens_in: stats.tokensIn,
total_tokens_out: stats.tokensOut
});
}
break;
}
}
} catch (error) {
console.error('[UsageLogging] Error processing event:', error);
}
}
};
}

@@ -31,7 +31,6 @@ import {
 createStartWorkHook,
 createSisyphusOrchestratorHook,
 createPrometheusMdOnlyHook,
-createUsageLoggingHook,
 } from "./hooks";
import {
contextCollector,
@@ -147,13 +146,10 @@ const OhMyOpenCodePlugin: Plugin = async (ctx) => {
 const compactionContextInjector = isHookEnabled("compaction-context-injector")
   ? createCompactionContextInjector()
   : undefined;
-const todoCodebaseCompactionInjector = isHookEnabled("todo-codebase-compaction")
-  ? createTodoCodebaseCompactionInjector()
-  : undefined;
 const preemptiveCompaction = isHookEnabled("preemptive-compaction")
   ? createPreemptiveCompactionHook(ctx, {
     experimental: pluginConfig.experimental,
-    onBeforeSummarize: todoCodebaseCompactionInjector ?? compactionContextInjector,
+    onBeforeSummarize: compactionContextInjector,
     getModelLimit: (providerID, modelID) =>
       getModelLimit(modelCacheState, providerID, modelID),
   })
@@ -213,13 +209,6 @@ const OhMyOpenCodePlugin: Plugin = async (ctx) => {
   ? createPrometheusMdOnlyHook(ctx)
   : null;
-const usageLogging = isHookEnabled("usage-logging")
-  ? createUsageLoggingHook({
-      stackName: process.env.STACK_NAME,
-      ingestUrl: process.env.LOG_INGEST_URL,
-    })
-  : null;
 const taskResumeInfo = createTaskResumeInfoHook();
 const backgroundManager = new BackgroundManager(ctx);
@@ -419,7 +408,6 @@ const OhMyOpenCodePlugin: Plugin = async (ctx) => {
 await interactiveBashSession?.event(input);
 await ralphLoop?.event(input);
 await sisyphusOrchestrator?.handler(input);
-await usageLogging?.event(input);
 const { event } = input;
 const props = event.properties as Record<string, unknown> | undefined;