# Markdown Converter

Agent skill for markdown-converter.
## Guiding Principles

Our development is guided by a few key principles. Always consider these when making design decisions.
- Encapsulate functionality in small, reusable `components/` modules. Build complex features by composing these smaller, independent units. This improves testability and maintainability.
- Handle failures at system boundaries rather than burying `try/except` blocks deep within application logic.
## Coding Conventions

Consistency is key to a readable and maintainable codebase.

- Use kebab-case for all files and directories (e.g., `databricks-api.py`, `utils/`).
- Prefer `def` functions over `lambda`. `def` provides a name for debugging, supports docstrings, and allows for proper type annotation.
- Use modern type hints (e.g., `list[str]` instead of `List[str]`).
- Avoid `Any`. If you must use it, use `typing.cast` with a clear comment explaining why no stricter type is available.
- Prefer the built-in generics `list[T]`, `dict[K, V]`, and `set[T]`.
- Use `pydantic` models for validating the structure and types of external inputs (e.g., API requests, tool arguments).
- Give input models a `Schema` suffix (e.g., `CreateClusterSchema`).
- Avoid dynamic imports such as `importlib.import_module` or `await import(...)` unless they are a core requirement of a plug-in system.
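The sketch below pulls several of these conventions together. It assumes Pydantic v2; the `CreateClusterSchema` fields and the helper function are illustrative, not the project's actual models.

```python
from pydantic import BaseModel, Field


class CreateClusterSchema(BaseModel):
    """Validates external input (tool arguments, API payloads) before any business logic runs."""

    cluster_name: str = Field(min_length=1)
    num_workers: int = Field(ge=1)
    tags: list[str] = Field(default_factory=list)            # built-in generics: list[str], not List[str]
    spark_conf: dict[str, str] = Field(default_factory=dict)


def describe_request(payload: dict[str, object]) -> str:
    """Prefer `def` over `lambda`: named, documented, and fully type-annotated."""
    request = CreateClusterSchema.model_validate(payload)
    return f"{request.cluster_name}: {request.num_workers} workers"
```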
## Project Structure

A clear structure makes the system easier to navigate and understand.

- `databricks_mcp/`: The primary application package.
  - `core/`: Shared Pydantic models, custom exception classes, and authentication logic. This code should be generic and application-agnostic.
  - `api/`: Thin, stateless clients for external REST APIs (e.g., Databricks). This layer is responsible for HTTP requests, authentication, and translating API errors into well-defined exceptions. It should not contain any business logic.
  - `components/`: Reusable modules that encapsulate specific business logic (e.g., parsing a notebook, calculating cluster costs). Components should be independent and easily testable in isolation.
  - `server/` or `tools/`: The layer that composes components into agent-callable tools. It defines tool inputs (`SomethingSchema`), orchestrates calls to components, and formats the final `CallToolResult`.
  - `cli/`: Command-line entry points and interfaces.
- `tests/`: Contains all tests, mirroring the application structure.
- Logic shared across tools belongs in a reusable module under `components/` or a helper in a `utils/` module.
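To make the layering concrete, here is a signature-level sketch of how a single feature might flow through these packages. The module paths and function names (`api/clusters.py`, `estimate_monthly_cost`, `list_clusters_tool`) are illustrative assumptions, not actual project code.

```python
# api/clusters.py -- thin, stateless client: HTTP calls and error translation only.
async def list_clusters(host: str, token: str) -> list[dict]:
    """Call the Databricks REST API and translate failures into well-defined exceptions."""
    ...


# components/cluster_costs.py -- independent business logic, testable in isolation.
def estimate_monthly_cost(cluster: dict) -> float:
    """Pure computation over already-fetched cluster data."""
    ...


# server/tools.py -- composes the layers into an agent-callable tool.
async def list_clusters_tool(args: "ListClustersSchema") -> "CallToolResult":
    """Validate inputs, call api/ and components/, and format the final CallToolResult."""
    ...
```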
## Tool Design

Well-designed tools are the foundation of a capable agent.

- Keep each tool focused on a single action; for example, `list_clusters` should not also have an option to delete one.
- Define every tool's inputs with a `...Schema` model. This provides automatic validation and clear documentation.
- Always return a `CallToolResult` object.
- The `TextContent` should be a concise, human-readable summary for the LLM.
- Put structured output in `_meta['data']`. This is what downstream tools or programmatic clients will use.
- Put references to related resources in `_meta['resources']`.
- Tool names use `snake_case` and clearly describe their action (e.g., `get_cluster_details`, `submit_spark_job`).
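A minimal sketch of a tool following these rules, assuming the MCP Python SDK's `CallToolResult` and `TextContent` types accept a `_meta` payload at construction time; the `get_cluster_details` schema, the hard-coded data, and the `_meta` shape are illustrative only.

```python
from typing import Any

from mcp.types import CallToolResult, TextContent
from pydantic import BaseModel


class GetClusterDetailsSchema(BaseModel):
    """Illustrative `...Schema` input model for a get_cluster_details tool."""

    cluster_id: str


async def get_cluster_details(args: GetClusterDetailsSchema) -> CallToolResult:
    # In the real tool this would come from an api/ client; hard-coded here for the sketch.
    cluster: dict[str, Any] = {"cluster_id": args.cluster_id, "state": "RUNNING", "num_workers": 4}

    return CallToolResult(
        content=[
            # Concise, human-readable summary for the LLM.
            TextContent(type="text", text=f"Cluster {args.cluster_id} is RUNNING with 4 workers."),
        ],
        # Structured payload for downstream tools and programmatic clients.
        _meta={"data": cluster, "resources": []},
    )
```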
## Error Handling & Logging

A robust system anticipates and reports failures clearly.

- Define custom exception classes for expected failure modes (e.g., `ClusterNotFoundError`, `InvalidConfigurationError`).
- Use `try/except` blocks sparingly, primarily at system boundaries (e.g., in the `api/` layer to catch network errors, or in the main server loop to catch unhandled exceptions).
- Avoid broad `except Exception:` blocks that hide bugs. If you must catch a broad exception, re-raise it or log it with a full traceback.
- Use the standard `logging` module with these levels:
  - `INFO`: Key lifecycle events (e.g., "Server starting," "Tool 'list_clusters' invoked").
  - `DEBUG`: Verbose, detailed information useful for tracing execution flow (e.g., API request bodies, intermediate values).
  - `WARNING`: A potential issue was encountered but the operation succeeded (e.g., "API deprecated, using fallback").
  - `ERROR`: An operation failed due to a handled exception.
- Logs are written to `databricks_mcp.log`. Be mindful of logging sensitive information.
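A sketch of this pattern at the `api/` boundary. It assumes an `httpx`-based client and an illustrative Databricks endpoint; the exception and function names are examples, not the project's actual code.

```python
import logging

import httpx  # assumption: the api/ layer uses httpx for HTTP calls

logger = logging.getLogger("databricks_mcp")


class ClusterNotFoundError(Exception):
    """Raised when the Databricks API reports that a cluster does not exist."""


async def get_cluster(client: httpx.AsyncClient, cluster_id: str) -> dict:
    """api/-layer sketch: translate transport failures into well-defined exceptions at the boundary."""
    logger.debug("Fetching cluster %s", cluster_id)
    try:
        response = await client.get("/api/2.0/clusters/get", params={"cluster_id": cluster_id})
        response.raise_for_status()
    except httpx.HTTPStatusError as exc:
        if exc.response.status_code == 404:
            raise ClusterNotFoundError(cluster_id) from exc
        # Other failures are logged with a full traceback and re-raised, never swallowed.
        logger.error("Databricks API call failed for cluster %s", cluster_id, exc_info=True)
        raise
    return response.json()
```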
## Development Workflow

Follow these steps to ensure code quality and a smooth contribution process.

- Use `uv` to manage the environment and dependencies.
- Install: `uv pip install -e . && uv pip install -e ".[dev]"`
- Run tests: `uv run pytest`
- Format and lint: `uv run black .` and `uv run pylint databricks_mcp tests`
- Run the server locally: `uv run databricks-mcp -- --help`
- Write tests with `pytest`, using `pytest-asyncio` for asynchronous code.
- Mock external calls at the `api/` boundary. Tests must be fast and runnable offline.
- Tool tests should assert on the contents of `_meta['data']` and `_meta['resources']`.
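A sketch of such a test, reusing the illustrative `get_cluster_details` tool from the Tool Design section. The import paths and the patched `api/` function are hypothetical; the point is that the `api/` call is mocked so the test runs offline.

```python
from unittest.mock import AsyncMock, patch

import pytest

# Hypothetical import path for the illustrative tool sketched above.
from databricks_mcp.server.tools import GetClusterDetailsSchema, get_cluster_details


@pytest.mark.asyncio
async def test_get_cluster_details_reports_structured_data() -> None:
    fake_cluster = {"cluster_id": "abc-123", "state": "RUNNING"}

    # Mock at the api/ boundary so the test is fast and needs no network access.
    with patch(
        "databricks_mcp.api.clusters.get_cluster",  # hypothetical api/ client function
        new=AsyncMock(return_value=fake_cluster),
    ):
        result = await get_cluster_details(GetClusterDetailsSchema(cluster_id="abc-123"))

    # Assert on the human-readable summary and on the structured _meta payload.
    assert "abc-123" in result.content[0].text
    assert result.meta["data"]["cluster_id"] == "abc-123"
    assert "resources" in result.meta
```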
## Pull Requests & Configuration

Clear communication helps the team review and integrate your work efficiently.

- Link related issues with `Fixes #<id>` or `Refs #<id>`.
- Keep local configuration in a `.env` file (see `.env.example`). This file is git-ignored.
- Required variables: `DATABRICKS_HOST`, `DATABRICKS_TOKEN`.
- Ensure `databricks_mcp.log` has been rotated or cleared of sensitive information.
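A minimal sketch of consuming this configuration, assuming `python-dotenv` is used to load the git-ignored `.env` file (the project may wire this up differently):

```python
import os

from dotenv import load_dotenv  # assumption: python-dotenv is available

load_dotenv()  # reads the git-ignored .env file from the working directory

DATABRICKS_HOST = os.environ["DATABRICKS_HOST"]
DATABRICKS_TOKEN = os.environ["DATABRICKS_TOKEN"]
```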