Thank you for your interest in contributing to KaflowSQL! We welcome contributions from everyone.
- Go 1.24.1 or later
- Docker and Docker Compose
- Git
- **Fork and Clone**

  ```bash
  git clone https://github.com/your-username/KaflowSQL.git
  cd KaflowSQL
  ```
- **Set up Development Environment**

  ```bash
  make setup-dev
  ```
- **Build and Test**

  ```bash
  make build
  make test
  ```
- **Create a Branch**

  ```bash
  git checkout -b feature/your-feature-name
  ```
- **Make Your Changes**
  - Write your code
  - Add tests for new functionality
  - Update documentation as needed
- **Test Your Changes**

  ```bash
  make check
  make test-coverage
  ```
- **Commit Your Changes**

  ```bash
  git add .
  git commit -m "feat: add your feature description"
  ```
We follow the Conventional Commits specification:
- `feat:` new features
- `fix:` bug fixes
- `docs:` documentation changes
- `style:` formatting changes
- `refactor:` code refactoring
- `test:` adding or updating tests
- `chore:` maintenance tasks
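For instance, a feature commit might look like the following (the scope and wording are purely illustrative, not taken from the KaflowSQL history):

```
feat(engine): add per-key TTL eviction

Evict expired join state per key instead of scanning the full index.
```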
- Run `make fmt` to format your code
- Run `make lint` to check for style issues
- Follow Go best practices and idioms
- Write clear, self-documenting code
- Add comments for complex logic
- Write unit tests for new functionality
- Ensure all tests pass: `make test`
- Check test coverage: `make test-coverage`
- Run race detector: `make test-race`
- **Push Your Branch**

  ```bash
  git push origin feature/your-feature-name
  ```
- **Create Pull Request**
  - Use the provided PR template
  - Provide a clear description of changes
  - Link related issues
  - Add screenshots/examples if applicable
- **Code Review**
  - Address review feedback
  - Keep your branch up to date
  - Be responsive to comments
- Keep PRs focused and atomic
- Write clear PR titles and descriptions
- Include tests for new features
- Update documentation as needed
- Ensure CI checks pass
```
├── cmd/            # Application entry points
│   ├── engine/     # Main streaming engine
│   └── fakegen/    # Data generation tool
├── pkg/            # Shared packages
│   ├── avro/       # Avro schema handling
│   ├── config/     # Configuration management
│   ├── duck/       # DuckDB integration
│   ├── engine/     # Core processing engine
│   ├── kafka/      # Kafka client wrappers
│   ├── pipeline/   # Pipeline definition
│   ├── schema/     # Schema management
│   ├── state/      # State management
│   └── ttlindex/   # TTL indexing
├── pipelines/      # Pipeline definitions
└── .github/        # GitHub workflows
```
KaflowSQL is a streaming ETL framework with these key components:
- Engine: Processes events using stateful joins
- State Management: RocksDB for persistence
- Schema Registry: Avro schema management
- DuckDB: In-memory analytics engine
- Pipeline System: YAML-based configuration
```bash
# Run all tests
make test

# Run with coverage
make test-coverage

# Run with race detector
make test-race

# Run benchmarks
make benchmark
```

```bash
# Start full development environment
make docker-compose-dev

# Build Docker image
make docker-build

# Start KaflowSQL only
make docker-compose-up
```

- Update README.md for user-facing changes
- Update CLAUDE.md for development guidance
- Add godoc comments for public APIs
- Create examples for new features
- Check existing issues
- Create a new issue for bugs or feature requests
- Join discussions in GitHub Discussions
- Read the documentation
This project follows the Contributor Covenant Code of Conduct. Please read and follow it.
By contributing to KaflowSQL, you agree that your contributions will be licensed under the Apache License 2.0.
Thank you for contributing to KaflowSQL! 🚀