Compare commits
No commits in common. "8a25a9e3658e4844996752d6ddb7334bab4c0f8b" and "7f1f1aeeb0cfddb2a5f0376e15a52a149842ebfb" have entirely different histories.
8a25a9e365 ... 7f1f1aeeb0
@@ -1,66 +0,0 @@
---
name: haskell-higher-order
description: Use this agent when you need to refactor Haskell code to use advanced functional patterns, including monad transformers (ExceptT, ReaderT), pipeline composition, higher-order abstractions, and functional design patterns. This agent focuses on architectural improvements rather than basic code cleanup. Examples: <example>Context: User has nested case statements handling Either values in IO functions. user: 'I have these deeply nested case statements handling errors in my IO functions. It's getting hard to follow the logic.' assistant: 'I'll use the haskell-higher-order agent to refactor this into a cleaner monadic pipeline using ExceptT.' <commentary>The user needs help with monad transformer patterns to simplify error handling in IO.</commentary></example> <example>Context: User has similar functions that differ only in output format handling. user: 'These PDF and HTML compilation functions are nearly identical except for the final formatting step.' assistant: 'Let me use the haskell-higher-order agent to extract the common pipeline and create a strategy pattern for format-specific operations.' <commentary>Perfect case for higher-order abstraction and the strategy pattern.</commentary></example>
tools: Task, Bash, Glob, Grep, LS, ExitPlanMode, Read, Edit, MultiEdit, Write, NotebookRead, NotebookEdit, WebFetch, TodoWrite, WebSearch, mcp__sequential-thinking__sequentialthinking
color: purple
---

You are an expert Haskell developer specializing in advanced functional programming patterns and architectural refactoring. Your expertise lies in transforming imperative-style Haskell code into elegant functional solutions using higher-order abstractions, monad transformers, and functional design patterns.

Your core responsibilities:

**Monad Transformer Expertise**: Transform nested Either/IO handling into clean monadic pipelines using ExceptT, ReaderT, StateT, and other transformers. Know when each transformer adds value and when it's overkill.

**Pipeline Composition**: Convert sequential operations with manual error threading into composed pipelines using operators like >>=, >=>, and <$>. Create custom operators when they improve readability.
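
For instance, a minimal sketch of the shape of this transformation, using hypothetical `validate`/`double`/`describe` steps rather than code from the user's project:

```haskell
import Control.Monad ((>=>))

-- Hypothetical pipeline steps; each may fail with an error message.
validate :: String -> Either String Int
validate s = if null s then Left "empty input" else Right (length s)

double :: Int -> Either String Int
double n = Right (n * 2)

describe :: Int -> Either String String
describe n = Right ("result: " <> show n)

-- Manual threading of the error:
pipelineManual :: String -> Either String String
pipelineManual s = case validate s of
  Left e  -> Left e
  Right n -> case double n of
    Left e  -> Left e
    Right m -> describe m

-- Kleisli composition expresses the same pipeline directly:
pipeline :: String -> Either String String
pipeline = validate >=> double >=> describe
```

The composed version reads as the pipeline it is; the manual version buries it in case analysis.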

**Higher-Order Abstractions**: Identify repeated patterns and extract them into parameterized functions. Use function parameters, records of functions, or type classes to capture varying behavior.

**Functional Design Patterns**: Apply patterns like:
- Strategy pattern using records of functions (sketched after this list)
- Interpreter pattern with free monads (when appropriate)
- Builder pattern using function composition
- Dependency injection via ReaderT or implicit parameters
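
A minimal sketch of the record-of-functions strategy pattern; the `Renderer` type and its fields are illustrative, not taken from any particular codebase:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Data.Text (Text)
import qualified Data.Text as T

-- A record of functions captures the varying, format-specific behaviour.
data Renderer = Renderer
  { renderHeading :: Text -> Text
  , renderBody    :: Text -> Text
  }

htmlRenderer :: Renderer
htmlRenderer = Renderer
  { renderHeading = \t -> "<h1>" <> t <> "</h1>"
  , renderBody    = \t -> "<p>" <> t <> "</p>"
  }

plainRenderer :: Renderer
plainRenderer = Renderer
  { renderHeading = \t -> T.toUpper t <> "\n"
  , renderBody    = id
  }

-- The shared pipeline is written once and parameterised by the strategy.
renderDocument :: Renderer -> Text -> Text -> Text
renderDocument r heading body = renderHeading r heading <> renderBody r body
```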

**Effect Management**: Separate pure computations from effects:
- Extract pure cores from effectful shells
- Use mtl-style constraints for flexible effects (see the sketch after this list)
- Consider tagless final when beneficial
- Know when to use IO vs more restricted effect types
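
A small sketch of an mtl-style signature, assuming a hypothetical `AppError` type:

```haskell
{-# LANGUAGE FlexibleContexts #-}
import Control.Monad.Except (MonadError, throwError)
import Control.Monad.IO.Class (MonadIO, liftIO)

-- Hypothetical error type for illustration.
data AppError = MissingInput FilePath
  deriving Show

-- The constraints keep the function polymorphic in its monad:
-- it can run in ExceptT AppError IO, in a test monad, and so on.
loadInput :: (MonadIO m, MonadError AppError m) => FilePath -> m String
loadInput path = do
  contents <- liftIO (readFile path)
  if null contents
    then throwError (MissingInput path)
    else pure contents
```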

**Type-Level Programming**: When beneficial, use:
- Type families for better APIs
- GADTs for enhanced type safety
- Phantom types for compile-time guarantees (small sketch below)
- But avoid over-engineering
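
A compact phantom-type sketch (the `PathKind` and `Path` names are illustrative):

```haskell
{-# LANGUAGE DataKinds #-}
{-# LANGUAGE KindSignatures #-}

-- The phantom parameter records, at the type level, what kind of path this is.
data PathKind = Input | Output

newtype Path (k :: PathKind) = Path FilePath
  deriving Show

-- The compiler now rejects passing an output path where an input is expected.
readInput :: Path 'Input -> IO String
readInput (Path p) = readFile p

writeOutput :: Path 'Output -> String -> IO ()
writeOutput (Path p) = writeFile p
```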

Your refactoring approach:
1. **Identify Patterns**: Look for repeated structures, nested error handling, and mixed concerns
2. **Design Abstractions**: Create appropriate higher-order functions or type classes
3. **Preserve Behavior**: Ensure refactoring maintains semantics unless explicitly changing them
4. **Incremental Steps**: Show progression from current code to final solution
5. **Explain Trade-offs**: Discuss when advanced patterns are worth their complexity
6. **Avoid Over-Engineering**: Know when simple code is better than clever code

When reviewing code, look for:
- Nested case expressions on Either/Maybe in IO
- Functions with similar structure but different details
- Manual threading of configuration or state
- Imperative-style loops that could be folds/traversals
- Mixed pure and effectful code
- Opportunities for lawful abstractions (Functor, Applicative, Monad)

Common transformations you perform:
- `IO (Either e a)` → `ExceptT e IO a` (before/after sketch below)
- Nested cases → monadic composition with >>=
- Similar functions → higher-order function with strategy parameter
- Global config passing → ReaderT environment
- Accumulating state → StateT or WriterT
- Multiple effects → monad transformer stack or mtl-style
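
A before/after sketch of the first transformation, with hypothetical `fetchConfig`/`parseConfig` steps:

```haskell
import Control.Monad.Trans.Except (ExceptT (..), runExceptT)

-- Hypothetical steps, each returning IO (Either String a).
fetchConfig :: FilePath -> IO (Either String String)
fetchConfig path = Right <$> readFile path

parseConfig :: String -> IO (Either String Int)
parseConfig s = pure (if null s then Left "empty config" else Right (length s))

-- Before: nested cases thread the error by hand.
loadBefore :: FilePath -> IO (Either String Int)
loadBefore path = do
  raw <- fetchConfig path
  case raw of
    Left err -> pure (Left err)
    Right s  -> parseConfig s

-- After: ExceptT hides the threading; runExceptT recovers IO (Either String Int).
loadAfter :: FilePath -> IO (Either String Int)
loadAfter path = runExceptT $ do
  s <- ExceptT (fetchConfig path)
  ExceptT (parseConfig s)
```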

Always consider:
- Is the abstraction worth the complexity?
- Will other developers understand this code?
- Does this make the code more or less maintainable?
- Are we solving real problems or just showing off?

Provide concrete before/after examples showing the progression from current code to improved functional style. Focus on practical improvements that enhance maintainability and expressiveness without sacrificing clarity.
@@ -1,52 +0,0 @@
---
name: haskell-refactoring-expert
description: Use this agent when you need to refactor Haskell code to improve clarity, resolve package dependencies, enhance naming conventions, optimize library usage (especially Text vs String vs ByteString), create cleaner abstractions without over-engineering, and organize code into appropriate module structure by splitting large files when beneficial. Examples: <example>Context: User has written some Haskell code that mixes String and Text types inconsistently. user: 'I just wrote this function that processes file paths but I'm mixing String and Text types. Can you help clean this up?' assistant: 'I'll use the haskell-refactoring-expert agent to analyze your code and provide refactoring suggestions for consistent type usage.' <commentary>The user needs help with type consistency in Haskell, which is exactly what the haskell-refactoring-expert specializes in.</commentary></example> <example>Context: User has a Haskell module with unclear function names and tangled dependencies. user: 'This module has grown organically and now the dependencies are a mess and the function names don't clearly express their intent.' assistant: 'Let me use the haskell-refactoring-expert agent to analyze your module structure and suggest improvements for naming and dependency organization.' <commentary>This is a perfect case for the refactoring expert to address naming and dependency issues.</commentary></example>
tools: Task, Bash, Glob, Grep, LS, ExitPlanMode, Read, Edit, MultiEdit, Write, NotebookRead, NotebookEdit, WebFetch, TodoWrite, WebSearch, mcp__sequential-thinking__sequentialthinking
color: cyan
---

You are an expert Haskell developer with impeccable taste in refactoring and a deep understanding of idiomatic Haskell code. Your expertise lies in transforming messy, unclear, or inefficient Haskell code into clean, well-structured, and maintainable solutions.

Your core responsibilities:

**Dependency Management**: Analyze and resolve package dependency issues by identifying redundant imports, suggesting more appropriate libraries, and organizing module dependencies for clarity and minimal coupling.

**Module Organization**: Analyze file size and functional responsibilities to determine when to split large files into separate modules. Create appropriate module hierarchies following Haskell conventions (ProjectName.ModuleName). Keep Main.hs focused on CLI and orchestration only.
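
For example, an abbreviated sketch of pulling domain types out of Main.hs into their own module, in the spirit of the project's Docster.Types split (only the extracted module is shown in full):

```haskell
-- src/Docster/Types.hs: domain types get their own module.
module Docster.Types (OutputFormat (..)) where

-- | Output format for document generation.
data OutputFormat = PDF | HTML
  deriving (Show, Eq)

-- app/Main.hs then keeps only CLI wiring, importing the new module:
--   module Main (main) where
--   import Docster.Types (OutputFormat (..))
```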

**File Splitting Criteria**:
- Split files exceeding 150-200 lines into logical modules
- Create separate modules when there are 3+ distinct responsibilities
- Extract common patterns: Types, Utils, Parser, Renderer modules
- Always update cabal file's other-modules section for new modules

**Modularity**: Analyze responsibilities of data structures and functions that operate on them. Separate functions that have different responsibilities into separate modules.

**Patterns**: Refactor towards modular patterns that represent current good practice in Haskell.

**Type System Optimization**: Make precise decisions about when to use String, Text, ByteString, or other data types based on performance characteristics and API requirements. Always justify your type choices with clear reasoning.
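
For example, a small sketch of that reasoning (the `countMermaidBlocks` helper is hypothetical): keep FilePath/String at the filesystem boundary, but hold document content as Text:

```haskell
{-# LANGUAGE OverloadedStrings #-}
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.IO as TIO

-- Text for content that is sliced and searched repeatedly;
-- String (FilePath) only where the filesystem API requires it.
countMermaidBlocks :: FilePath -> IO Int
countMermaidBlocks path = do
  contents <- TIO.readFile path  -- read as Text, not String
  pure (length (filter ("```mermaid" `T.isPrefixOf`) (T.lines contents)))
```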

**Naming Excellence**: Transform unclear variable, function, and module names into self-documenting identifiers that clearly express intent and domain concepts. Follow Haskell naming conventions while prioritizing clarity.

**Clean Abstractions**: Create appropriate abstractions that eliminate code duplication and improve maintainability without falling into over-engineering traps. Know when to abstract and when to keep things simple.

**Library Usage Mastery**: Recommend the most appropriate libraries and functions for specific tasks, considering factors like performance, maintainability, and ecosystem maturity.

Your refactoring approach:
1. **Analyze First**: Examine the existing code structure, dependencies, and patterns before suggesting changes
2. **Assess Structure**: Evaluate if large files (>150 lines) should be split into logical modules
3. **Prioritize Impact**: Focus on changes that provide the most significant improvement in clarity and maintainability
4. **Create When Beneficial**: Don't hesitate to create new modules/files when it improves organization
5. **Preserve Semantics**: Ensure all refactoring maintains the original behavior unless explicitly asked to change functionality
6. **Explain Rationale**: Always explain why specific refactoring choices improve the code
7. **Consider Context**: Take into account the broader codebase context and project requirements when making suggestions

When reviewing code:
- Assess if file size and responsibilities warrant splitting into modules
- Identify inconsistent type usage (especially String/Text/ByteString mixing)
- Spot opportunities for better naming that expresses domain concepts
- Detect unnecessary dependencies or missing beneficial ones
- Recognize patterns that could benefit from cleaner abstractions
- Flag over-engineered solutions that could be simplified
- Check if module structure follows Haskell conventions and project needs

Always provide concrete, actionable refactoring suggestions with clear before/after examples. Your goal is to elevate Haskell code to its most elegant and maintainable form while respecting the principle that perfect is the enemy of good.
9 .gitignore vendored
@@ -1,9 +0,0 @@
dist-newstyle
dist-newstyle
/.stack-work/
*.mmd
*.png
*.svg
*.html
*.pdf
/svg-inkscape/
12 .mcp.json
@@ -1,12 +0,0 @@
{
  "mcpServers": {
    "playwright": {
      "type": "stdio",
      "command": "npx",
      "args": [
        "@playwright/mcp"
      ],
      "env": {}
    }
  }
}
@@ -21,7 +21,3 @@ Mermaid code blocks (```mermaid) will be rendered to SVG and embedded.
source ~/.ghcup/env && ghcup install ghc 9.12.2
source ~/.ghcup/env && ghcup install cabal 3.16.0.0
source ~/.ghcup/env && ghcup install hls 2.11.0.0

## Development

See [agents.md](agents.md) for information about the Claude Code agents used for Haskell refactoring in this project.

115 agents.md
@@ -1,115 +0,0 @@
# Claude Code Agents

This project uses specialized Claude Code agents for different types of Haskell refactoring. Each agent has focused expertise to provide targeted improvements.

## Available Agents

### haskell-refactoring-expert
**Purpose**: Basic code quality and structural improvements

**Expertise**:
- Type consistency (String vs Text vs ByteString)
- Module organization and file splitting (>150 lines)
- Naming conventions and clarity
- Dependency management
- Basic code structure improvements

**When to use**:
- Inconsistent type usage across the codebase
- Large files that need module organization
- Poor naming or unclear function responsibilities
- Mixed concerns in single modules

**Example**: Converting a 300-line Main.hs into proper module hierarchy

### haskell-higher-order
**Purpose**: Advanced functional programming patterns and architectural refactoring

**Expertise**:
- Monad transformer patterns (ExceptT, ReaderT, StateT)
- Pipeline composition with monadic operators
- Higher-order abstractions and strategy patterns
- Effect management and pure/IO separation
- Functional design patterns

**When to use**:
- Nested case statements handling Either values in IO
- Duplicated functions that differ only in specific steps
- Manual threading of configuration or state
- Imperative-style code that could be more functional
- Complex error handling that needs cleanup

**Example**: Converting nested Either/IO handling to ExceptT pipelines

## Agent Boundaries and Trade-offs

### Complementary Design
These agents are designed to work **sequentially**:
1. **First pass**: `haskell-refactoring-expert` for structural cleanup
2. **Second pass**: `haskell-higher-order` for functional patterns

### Why Separate Agents?

**Benefits**:
- **Focused expertise**: Each agent has deep knowledge in its domain
- **Clear boundaries**: Easy to know which agent to use
- **Manageable complexity**: Avoids instruction bloat in single agent
- **Progressive enhancement**: Apply increasingly sophisticated refactoring
- **Composability**: Can run both agents or just one as needed

**Trade-offs**:
- **Coordination overhead**: Need to run multiple agents
- **Context switching**: Each agent analyzes code independently
- **Potential overlap**: Some patterns might fit both agents

### Decision Framework

**Use haskell-refactoring-expert when you have**:
- ❌ Mixed String/Text types
- ❌ Large monolithic files (>150 lines)
- ❌ Unclear naming or responsibilities
- ❌ Basic structural issues

**Use haskell-higher-order when you have**:
- ❌ Nested error handling (Either in IO)
- ❌ Duplicated function structures
- ❌ Manual state/config threading
- ❌ Imperative-style patterns

**Use both agents when**:
- ❌ You want comprehensive refactoring
- ❌ Code has both structural and architectural issues
- ❌ You're doing major codebase improvements

## Usage Patterns

### Sequential Refactoring
```bash
# Run basic refactoring first
/agent haskell-refactoring-expert "Please refactor the Main.hs file"

# Then apply advanced patterns
/agent haskell-higher-order "Please improve the error handling patterns"
```

### Targeted Improvements
```bash
# Just structural cleanup
/agent haskell-refactoring-expert "Split this large module"

# Just functional patterns
/agent haskell-higher-order "Convert these nested cases to monadic style"
```

## Evolution Strategy

These agents can evolve independently:
- **haskell-refactoring-expert**: Add more structural patterns, linting rules
- **haskell-higher-order**: Add more advanced patterns (free monads, effect systems)

New specialized agents could be added:
- **haskell-performance**: Optimization-focused refactoring
- **haskell-testing**: Test-driven refactoring and property-based testing
- **haskell-domain**: Domain modeling and type design

The key is maintaining clear boundaries and complementary functionality.
213 app/Main.hs
@@ -1,20 +1,221 @@
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ScopedTypeVariables #-}

-- | Docster CLI - Convert Markdown with Mermaid diagrams to PDF/HTML
module Main (main) where

import Docster.Types (DocsterError(..))
import Docster.Compiler (compileToPDF, compileToHTML)
import Text.Pandoc
import Text.Pandoc.Class (runIOorExplode)
import Text.Pandoc.PDF (makePDF)
import Text.Pandoc.Walk (walkM)
import Text.Pandoc.Extensions (getDefaultExtensions)
import System.Environment (getArgs)
import Control.Exception (throwIO)
import System.FilePath (replaceExtension, takeDirectory, takeFileName, takeExtension, (</>))
import System.Process (callProcess)
import System.Directory (removeFile)
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.IO as TIO
import Data.Hashable (hash)
import Control.Monad (void)
import Control.Exception (Exception, throwIO, bracket, catch, SomeException)
import qualified Data.ByteString.Lazy as BL

-- | Parse command line arguments and return appropriate action
-- | Custom error types for better error handling
data DocsterError
  = InvalidUsage Text
  | FileError Text
  | PDFGenerationError Text
  | ProcessError Text
  deriving (Show)

instance Exception DocsterError

-- | Type-safe wrappers for better domain modeling
newtype SourceDir = SourceDir FilePath deriving (Show, Eq)
newtype OutputPath = OutputPath FilePath deriving (Show, Eq)
newtype DiagramId = DiagramId Text deriving (Show, Eq)

-- | Constants for the application
mermaidCommand :: String
mermaidCommand = "mmdc"

diagramPrefix :: String
diagramPrefix = "diagram-"

successEmoji :: String
successEmoji = "✅"

-- | Generate a diagram ID from content hash or explicit ID
generateDiagramId :: Text -> Text -> DiagramId
generateDiagramId explicitId contents
  | T.null explicitId = DiagramId $ T.pack $ diagramPrefix <> take 6 (show (abs (hash (T.unpack contents))))
  | otherwise = DiagramId explicitId

-- | Transform Mermaid code blocks into image embeds with resource cleanup
processMermaidBlock :: SourceDir -> OutputPath -> Block -> IO Block
processMermaidBlock (SourceDir sourceDir) (OutputPath outputPath) (CodeBlock (id', classes, _) contents)
  | "mermaid" `elem` classes = do
      let DiagramId diagId = generateDiagramId id' contents
          diagIdStr = T.unpack diagId
          mmdFile = sourceDir </> diagIdStr <> ".mmd"
          -- Use SVG for HTML (scalable), high-res PNG for PDF (text compatibility)
          (outputFile, imagePath) = if isHTMLOutput outputPath
            then let svgFile = sourceDir </> diagIdStr <> ".svg"
                 in (svgFile, takeFileName svgFile)
            else let pngFile = sourceDir </> diagIdStr <> ".png"
                 in (pngFile, pngFile)

      -- Use bracket to ensure cleanup of temporary mermaid file
      bracket
        (TIO.writeFile mmdFile contents >> return mmdFile)
        (\file -> removeFile file `catch` \(_ :: SomeException) -> return ())
        (\_ -> do
          -- Generate with appropriate format and quality for output type
          if isHTMLOutput outputPath
            then void $ callProcess mermaidCommand ["-i", mmdFile, "-o", outputFile]
            else void $ callProcess mermaidCommand ["-i", mmdFile, "-o", outputFile, "--scale", "3"]
          putStrLn $ successEmoji <> " Generated " <> outputFile
          return $ Para [Image nullAttr [] (T.pack imagePath, "Mermaid diagram")])
processMermaidBlock _ _ block = return block

-- | Check if output is HTML format based on file extension
isHTMLOutput :: FilePath -> Bool
isHTMLOutput path = takeExtension path == ".html"

-- | Walk the Pandoc AST and process blocks using walkM
transformDocument :: SourceDir -> OutputPath -> Pandoc -> IO Pandoc
transformDocument sourceDir outputPath = walkM (processMermaidBlock sourceDir outputPath)

-- | LaTeX template with comprehensive package support
latexTemplate :: Text -> Text
latexTemplate bodyContent = T.unlines
  [ "\\documentclass{article}"
  , "\\usepackage[utf8]{inputenc}"
  , "\\usepackage{fontspec}"
  , "\\usepackage{graphicx}"
  , "\\usepackage{geometry}"
  , "\\geometry{margin=1in}"
  , "\\usepackage{hyperref}"
  , "\\usepackage{enumitem}"
  , "\\usepackage{amsmath}"
  , "\\usepackage{amssymb}"
  , "\\usepackage{fancyvrb}"
  , "\\usepackage{color}"
  , "\\DefineVerbatimEnvironment{Highlighting}{Verbatim}{commandchars=\\\\\\{\\}}"
  , "\\newenvironment{Shaded}{}{}"
  , syntaxHighlightingCommands
  , "\\providecommand{\\tightlist}{%"
  , "  \\setlength{\\itemsep}{0pt}\\setlength{\\parskip}{0pt}}"
  , "\\begin{document}"
  , bodyContent
  , "\\end{document}"
  ]

-- | Syntax highlighting commands for LaTeX
syntaxHighlightingCommands :: Text
syntaxHighlightingCommands = T.unlines
  [ "\\newcommand{\\AlertTok}[1]{\\textcolor[rgb]{1.00,0.00,0.00}{\\textbf{#1}}}"
  , "\\newcommand{\\AnnotationTok}[1]{\\textcolor[rgb]{0.38,0.63,0.69}{\\textbf{\\textit{#1}}}}"
  , "\\newcommand{\\AttributeTok}[1]{\\textcolor[rgb]{0.49,0.56,0.16}{#1}}"
  , "\\newcommand{\\BaseNTok}[1]{\\textcolor[rgb]{0.25,0.63,0.44}{#1}}"
  , "\\newcommand{\\BuiltInTok}[1]{#1}"
  , "\\newcommand{\\CharTok}[1]{\\textcolor[rgb]{0.25,0.44,0.63}{#1}}"
  , "\\newcommand{\\CommentTok}[1]{\\textcolor[rgb]{0.38,0.63,0.69}{\\textit{#1}}}"
  , "\\newcommand{\\CommentVarTok}[1]{\\textcolor[rgb]{0.38,0.63,0.69}{\\textbf{\\textit{#1}}}}"
  , "\\newcommand{\\ConstantTok}[1]{\\textcolor[rgb]{0.53,0.00,0.00}{#1}}"
  , "\\newcommand{\\ControlFlowTok}[1]{\\textcolor[rgb]{0.00,0.44,0.13}{\\textbf{#1}}}"
  , "\\newcommand{\\DataTypeTok}[1]{\\textcolor[rgb]{0.56,0.13,0.00}{#1}}"
  , "\\newcommand{\\DecValTok}[1]{\\textcolor[rgb]{0.25,0.63,0.44}{#1}}"
  , "\\newcommand{\\DocumentationTok}[1]{\\textcolor[rgb]{0.73,0.13,0.13}{\\textit{#1}}}"
  , "\\newcommand{\\ErrorTok}[1]{\\textcolor[rgb]{1.00,0.00,0.00}{\\textbf{#1}}}"
  , "\\newcommand{\\ExtensionTok}[1]{#1}"
  , "\\newcommand{\\FloatTok}[1]{\\textcolor[rgb]{0.25,0.63,0.44}{#1}}"
  , "\\newcommand{\\FunctionTok}[1]{\\textcolor[rgb]{0.02,0.16,0.49}{#1}}"
  , "\\newcommand{\\ImportTok}[1]{#1}"
  , "\\newcommand{\\InformationTok}[1]{\\textcolor[rgb]{0.38,0.63,0.69}{\\textbf{\\textit{#1}}}}"
  , "\\newcommand{\\KeywordTok}[1]{\\textcolor[rgb]{0.00,0.44,0.13}{\\textbf{#1}}}"
  , "\\newcommand{\\NormalTok}[1]{#1}"
  , "\\newcommand{\\OperatorTok}[1]{\\textcolor[rgb]{0.40,0.40,0.40}{#1}}"
  , "\\newcommand{\\OtherTok}[1]{\\textcolor[rgb]{0.00,0.44,0.13}{#1}}"
  , "\\newcommand{\\PreprocessorTok}[1]{\\textcolor[rgb]{0.74,0.48,0.00}{#1}}"
  , "\\newcommand{\\RegionMarkerTok}[1]{#1}"
  , "\\newcommand{\\SpecialCharTok}[1]{\\textcolor[rgb]{0.25,0.44,0.63}{#1}}"
  , "\\newcommand{\\SpecialStringTok}[1]{\\textcolor[rgb]{0.73,0.40,0.53}{#1}}"
  , "\\newcommand{\\StringTok}[1]{\\textcolor[rgb]{0.25,0.44,0.63}{#1}}"
  , "\\newcommand{\\VariableTok}[1]{\\textcolor[rgb]{0.10,0.09,0.49}{#1}}"
  , "\\newcommand{\\VerbatimStringTok}[1]{\\textcolor[rgb]{0.25,0.44,0.63}{#1}}"
  , "\\newcommand{\\WarningTok}[1]{\\textcolor[rgb]{0.38,0.63,0.69}{\\textbf{\\textit{#1}}}}"
  ]

-- | Parse command line arguments
parseArgs :: [String] -> Either DocsterError (IO ())
parseArgs ["-pdf", path] = Right (compileToPDF path)
parseArgs ["-html", path] = Right (compileToHTML path)
parseArgs _ = Left $ InvalidUsage "Usage: docster -pdf|-html <file.md>"

-- | Main entry point - parse arguments and execute appropriate action
-- | Compile markdown to PDF using XeLaTeX
compileToPDF :: FilePath -> IO ()
compileToPDF path = do
  let sourceDir = SourceDir $ takeDirectory path
      outputPath = OutputPath $ replaceExtension path "pdf"

  result <- compileToPDFSafe sourceDir (OutputPath path) outputPath
  case result of
    Left err -> throwIO err
    Right _ -> return ()

-- | Safe PDF compilation with proper error handling
compileToPDFSafe :: SourceDir -> OutputPath -> OutputPath -> IO (Either DocsterError ())
compileToPDFSafe sourceDir (OutputPath inputPath) outputPath@(OutputPath outputPathStr) = do
  content <- TIO.readFile inputPath
  let readerOptions = def { readerExtensions = getDefaultExtensions "markdown" }

  pandoc <- runIOorExplode $ readMarkdown readerOptions content
  transformed <- transformDocument sourceDir outputPath pandoc

  -- Generate LaTeX with proper template
  latexOutput <- runIOorExplode $ writeLaTeX def transformed
  let completeLatex = latexTemplate latexOutput

  result <- runIOorExplode $ makePDF "xelatex" [] (\_ _ -> return completeLatex) def transformed
  case result of
    Left err -> return $ Left $ PDFGenerationError $ T.pack $ show err
    Right bs -> do
      BL.writeFile outputPathStr bs
      putStrLn $ successEmoji <> " PDF written to " <> outputPathStr
      return $ Right ()

-- | Compile markdown to HTML
compileToHTML :: FilePath -> IO ()
compileToHTML path = do
  let sourceDir = SourceDir $ takeDirectory path
      outputPath = OutputPath $ replaceExtension path "html"

  result <- compileToHTMLSafe sourceDir (OutputPath path) outputPath
  case result of
    Left err -> throwIO err
    Right _ -> return ()

-- | Safe HTML compilation with proper error handling
compileToHTMLSafe :: SourceDir -> OutputPath -> OutputPath -> IO (Either DocsterError ())
compileToHTMLSafe sourceDir (OutputPath inputPath) outputPath@(OutputPath outputPathStr) = do
  content <- TIO.readFile inputPath
  let readerOptions = def { readerExtensions = getDefaultExtensions "markdown" }

  pandoc <- runIOorExplode $ readMarkdown readerOptions content
  transformed <- transformDocument sourceDir outputPath pandoc

  html <- runIOorExplode $ writeHtml5String def transformed
  TIO.writeFile outputPathStr html
  putStrLn $ successEmoji <> " HTML written to " <> outputPathStr

  -- Open the generated HTML file in browser
  putStrLn $ "🌐 Opening " <> outputPathStr <> " in browser for error checking..."
  void $ callProcess "open" [outputPathStr]

  return $ Right ()

-- | Main entry point
main :: IO ()
main = do
  args <- getArgs

@@ -25,14 +25,10 @@ common warnings
    -Wpartial-fields
    -Wredundant-constraints

library
executable docster
  import: warnings
  exposed-modules: Docster.Types
                   Docster.Mermaid
                   Docster.Transform
                   Docster.LaTeX
                   Docster.Compiler
  hs-source-dirs: src
  main-is: Main.hs
  hs-source-dirs: app
  build-depends:
    base >=4.21 && <5,
    text >=2.0 && <2.2,
@@ -42,19 +38,7 @@ library
    hashable >=1.4 && <1.6,
    pandoc >=3.0 && <3.2,
    pandoc-types >=1.23 && <1.25,
    bytestring >=0.11 && <0.13,
    temporary >=1.3 && <1.4,
    transformers >=0.5 && <0.7
  default-language: Haskell2010

executable docster
  import: warnings
  main-is: Main.hs
  hs-source-dirs: app
  build-depends:
    base >=4.21 && <5,
    text >=2.0 && <2.2,
    docster
    bytestring >=0.11 && <0.13
  default-language: Haskell2010
  ghc-options: -threaded
               -rtsopts

@@ -1,227 +0,0 @@
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE LambdaCase #-}

-- | Document compilation functionality for PDF and HTML output
module Docster.Compiler
  ( -- * Compilation Functions
    compileToPDF
  , compileToHTML
  ) where

import Docster.Types
import Docster.Transform (transformDocument)
import Docster.LaTeX (latexTemplate)
import Text.Pandoc
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.IO as TIO
import System.FilePath (takeDirectory, replaceExtension, (</>))
import System.Process (callProcess, readProcessWithExitCode)
import System.IO.Temp (withSystemTempDirectory)
import System.Directory (copyFile, doesFileExist)
import System.Exit (ExitCode(..))
import Control.Exception (throwIO)
import Control.Monad (void)
import Control.Monad.Trans.Except (ExceptT, runExceptT, throwE)
import Control.Monad.Trans.Reader (ReaderT, runReaderT, asks)
import Control.Monad.Trans.Class (lift)
import Control.Monad.IO.Class (liftIO)

-- | Success indicator for user feedback
successEmoji :: Text
successEmoji = "✅"

-- | Compilation context for pipeline operations
data CompilationContext = CompilationContext
  { ccStrategy :: CompilationStrategy
  , ccSourceDir :: SourceDir
  , ccInputPath :: FilePath
  , ccOutputPath :: FilePath
  , ccReaderOptions :: ReaderOptions
  , ccConfig :: DiagramConfig
  }

-- | Monad stack for compilation pipeline
type CompilationM = ReaderT CompilationContext (ExceptT DocsterError IO)

-- | Strategy pattern: Record of format-specific operations
data CompilationStrategy = CompilationStrategy
  { -- | Format for diagram configuration
    csOutputFormat :: OutputFormat
    -- | Pandoc writer function
  , csWriter :: WriterOptions -> Pandoc -> PandocIO Text
    -- | Post-processing function for the generated content
  , csProcessOutput :: String -> Text -> IO (Either DocsterError ())
    -- | Success message formatter
  , csSuccessMessage :: String -> Text
  }

-- | PDF compilation strategy
pdfStrategy :: CompilationStrategy
pdfStrategy = CompilationStrategy
  { csOutputFormat = PDF
  , csWriter = writeLaTeX
  , csProcessOutput = processPDFOutput
  , csSuccessMessage = \path -> successEmoji <> " PDF written to " <> T.pack path
  }

-- | HTML compilation strategy
htmlStrategy :: CompilationStrategy
htmlStrategy = CompilationStrategy
  { csOutputFormat = HTML
  , csWriter = writeHtml5String
  , csProcessOutput = processHTMLOutput
  , csSuccessMessage = \path -> successEmoji <> " HTML written to " <> T.pack path
  }

-- | Process PDF output: LaTeX template application and direct XeLaTeX compilation
processPDFOutput :: String -> Text -> IO (Either DocsterError ())
processPDFOutput outputPath latexOutput = do
  let completeLatex = latexTemplate latexOutput

  -- Use temporary directory for LaTeX compilation
  withSystemTempDirectory "docster-latex" $ \tempDir -> do
    let texFile = tempDir </> "document.tex"
        pdfFile = tempDir </> "document.pdf"
        logFile = tempDir </> "document.log"

    -- Write LaTeX content to temporary file
    TIO.writeFile texFile completeLatex

    -- Run XeLaTeX compilation
    (exitCode, _stdout, stderr) <- readProcessWithExitCode "xelatex"
      [ "-output-directory=" <> tempDir
      , "-interaction=nonstopmode" -- Don't stop on errors
      , texFile
      ] ""

    case exitCode of
      ExitSuccess -> do
        -- Check if PDF was actually generated
        pdfExists <- doesFileExist pdfFile
        if pdfExists
          then do
            -- Copy the generated PDF to the final location
            copyFile pdfFile outputPath
            return $ Right ()
          else do
            -- PDF generation failed, read log for details
            logExists <- doesFileExist logFile
            logContent <- if logExists
              then TIO.readFile logFile
              else return "No log file generated"
            return $ Left $ PDFGenerationError $
              "PDF file not generated. LaTeX log:\n" <> logContent
      ExitFailure code -> do
        -- LaTeX compilation failed, read log for details
        logExists <- doesFileExist logFile
        logContent <- if logExists
          then TIO.readFile logFile
          else return (T.pack stderr)
        return $ Left $ PDFGenerationError $
          "XeLaTeX compilation failed (exit code " <> T.pack (show code) <> "):\n" <>
          T.pack stderr <> "\n\nLaTeX log:\n" <> logContent

-- | Process HTML output: file writing and browser opening
processHTMLOutput :: String -> Text -> IO (Either DocsterError ())
processHTMLOutput outputPath html = do
  TIO.writeFile outputPath html

  -- Open the generated HTML file in browser for verification
  putStrLn $ "🌐 Opening " <> outputPath <> " in browser for error checking..."
  void $ callProcess "open" [outputPath]

  return $ Right ()

-- | Helper function to lift IO (Either DocsterError a) into CompilationM
liftEitherM :: IO (Either DocsterError a) -> CompilationM a
liftEitherM action = do
  result <- liftIO action
  case result of
    Left err -> lift $ throwE err
    Right value -> return value

-- | Pipeline step: Read content from input file
readContent :: CompilationM Text
readContent = do
  inputPath <- asks ccInputPath
  liftIO $ TIO.readFile inputPath

-- | Pipeline step: Parse markdown content into Pandoc AST
parseDocument :: Text -> CompilationM Pandoc
parseDocument content = do
  readerOptions <- asks ccReaderOptions
  liftEitherM $ parseMarkdown readerOptions content

-- | Pipeline step: Transform document (process Mermaid diagrams)
transformDocumentM :: Pandoc -> CompilationM Pandoc
transformDocumentM pandoc = do
  config <- asks ccConfig
  liftEitherM $ transformDocument config pandoc

-- | Pipeline step: Generate output using format-specific writer
generateOutputM :: Pandoc -> CompilationM Text
generateOutputM pandoc = do
  strategy <- asks ccStrategy
  liftEitherM $ generateOutput strategy pandoc

-- | Pipeline step: Process output and write to file
processOutput :: Text -> CompilationM ()
processOutput output = do
  strategy <- asks ccStrategy
  outputPath <- asks ccOutputPath
  liftEitherM $ csProcessOutput strategy outputPath output

-- | Pipeline step: Print success message
printSuccess :: CompilationM ()
printSuccess = do
  strategy <- asks ccStrategy
  outputPath <- asks ccOutputPath
  liftIO $ putStrLn $ T.unpack $ csSuccessMessage strategy outputPath

-- | Higher-order compilation function that takes a strategy and executes the pipeline
compileWithStrategy :: CompilationStrategy -> SourceDir -> OutputPath -> OutputPath -> IO (Either DocsterError ())
compileWithStrategy strategy sourceDir (OutputPath inputPath) (OutputPath outputPath) = do
  let readerOptions = def { readerExtensions = getDefaultExtensions "markdown" }
      config = DiagramConfig sourceDir (csOutputFormat strategy)
      context = CompilationContext strategy sourceDir inputPath outputPath readerOptions config
      pipeline = readContent >>= parseDocument >>= transformDocumentM >>= generateOutputM >>= processOutput >> printSuccess

  runExceptT $ runReaderT pipeline context

-- | Parse markdown with error handling
parseMarkdown :: ReaderOptions -> Text -> IO (Either DocsterError Pandoc)
parseMarkdown readerOptions content = do
  pandocResult <- runIO $ readMarkdown readerOptions content
  return $ case pandocResult of
    Left err -> Left $ FileError $ "Failed to parse markdown: " <> T.pack (show err)
    Right pandoc -> Right pandoc

-- | Generate output using the strategy's writer with error handling
generateOutput :: CompilationStrategy -> Pandoc -> IO (Either DocsterError Text)
generateOutput strategy transformed = do
  result <- runIO $ csWriter strategy def transformed
  return $ case result of
    Left err -> Left $ case csOutputFormat strategy of
      PDF -> PDFGenerationError $ "LaTeX generation failed: " <> T.pack (show err)
      HTML -> FileError $ "HTML generation failed: " <> T.pack (show err)
    Right output -> Right output

-- | Compile markdown to PDF using XeLaTeX
compileToPDF :: FilePath -> IO ()
compileToPDF = compileWithFormat pdfStrategy "pdf"

-- | Compile markdown to HTML
compileToHTML :: FilePath -> IO ()
compileToHTML = compileWithFormat htmlStrategy "html"

-- | Higher-order function to compile with any format strategy
compileWithFormat :: CompilationStrategy -> String -> FilePath -> IO ()
compileWithFormat strategy extension path = do
  let sourceDir = SourceDir $ takeDirectory path
      outputPath = OutputPath $ replaceExtension path extension

  result <- compileWithStrategy strategy sourceDir (OutputPath path) outputPath
  case result of
    Left err -> throwIO err
    Right _ -> return ()
@@ -1,78 +0,0 @@
{-# LANGUAGE OverloadedStrings #-}

-- | LaTeX template and syntax highlighting definitions
module Docster.LaTeX
  ( -- * LaTeX Generation
    latexTemplate
  ) where

import Data.Text (Text)
import qualified Data.Text as T

-- | LaTeX template with comprehensive package support for PDF generation
latexTemplate :: Text -> Text
latexTemplate bodyContent = T.unlines
  [ "\\documentclass{article}"
  , "\\usepackage[utf8]{inputenc}"
  , "\\usepackage{fontspec}"
  , "\\usepackage{graphicx}"
  , "\\usepackage{adjustbox}"
  , "\\usepackage{geometry}"
  , "\\geometry{margin=1in}"
  , "\\usepackage{hyperref}"
  , "\\usepackage{enumitem}"
  , "\\usepackage{amsmath}"
  , "\\usepackage{amssymb}"
  , "\\usepackage{fancyvrb}"
  , "\\usepackage{color}"
  , "\\DefineVerbatimEnvironment{Highlighting}{Verbatim}{commandchars=\\\\\\{\\}}"
  , "\\newenvironment{Shaded}{}{}"
  , syntaxHighlightingCommands
  , "\\providecommand{\\tightlist}{%"
  , "  \\setlength{\\itemsep}{0pt}\\setlength{\\parskip}{0pt}}"
  , "% Auto-scale oversized images to fit page"
  , "\\makeatletter"
  , "\\def\\maxwidth{\\ifdim\\Gin@nat@width>\\linewidth\\linewidth\\else\\Gin@nat@width\\fi}"
  , "\\def\\maxheight{\\ifdim\\Gin@nat@height>\\textheight\\textheight\\else\\Gin@nat@height\\fi}"
  , "\\makeatother"
  , "\\setkeys{Gin}{width=\\maxwidth,height=\\maxheight,keepaspectratio}"
  , "\\begin{document}"
  , bodyContent
  , "\\end{document}"
  ]

-- | Syntax highlighting commands for LaTeX code blocks
syntaxHighlightingCommands :: Text
syntaxHighlightingCommands = T.unlines
  [ "\\newcommand{\\AlertTok}[1]{\\textcolor[rgb]{1.00,0.00,0.00}{\\textbf{#1}}}"
  , "\\newcommand{\\AnnotationTok}[1]{\\textcolor[rgb]{0.38,0.63,0.69}{\\textbf{\\textit{#1}}}}"
  , "\\newcommand{\\AttributeTok}[1]{\\textcolor[rgb]{0.49,0.56,0.16}{#1}}"
  , "\\newcommand{\\BaseNTok}[1]{\\textcolor[rgb]{0.25,0.63,0.44}{#1}}"
  , "\\newcommand{\\BuiltInTok}[1]{#1}"
  , "\\newcommand{\\CharTok}[1]{\\textcolor[rgb]{0.25,0.44,0.63}{#1}}"
  , "\\newcommand{\\CommentTok}[1]{\\textcolor[rgb]{0.38,0.63,0.69}{\\textit{#1}}}"
  , "\\newcommand{\\CommentVarTok}[1]{\\textcolor[rgb]{0.38,0.63,0.69}{\\textbf{\\textit{#1}}}}"
  , "\\newcommand{\\ConstantTok}[1]{\\textcolor[rgb]{0.53,0.00,0.00}{#1}}"
  , "\\newcommand{\\ControlFlowTok}[1]{\\textcolor[rgb]{0.00,0.44,0.13}{\\textbf{#1}}}"
  , "\\newcommand{\\DataTypeTok}[1]{\\textcolor[rgb]{0.56,0.13,0.00}{#1}}"
  , "\\newcommand{\\DecValTok}[1]{\\textcolor[rgb]{0.25,0.63,0.44}{#1}}"
  , "\\newcommand{\\DocumentationTok}[1]{\\textcolor[rgb]{0.73,0.13,0.13}{\\textit{#1}}}"
  , "\\newcommand{\\ErrorTok}[1]{\\textcolor[rgb]{1.00,0.00,0.00}{\\textbf{#1}}}"
  , "\\newcommand{\\ExtensionTok}[1]{#1}"
  , "\\newcommand{\\FloatTok}[1]{\\textcolor[rgb]{0.25,0.63,0.44}{#1}}"
  , "\\newcommand{\\FunctionTok}[1]{\\textcolor[rgb]{0.02,0.16,0.49}{#1}}"
  , "\\newcommand{\\ImportTok}[1]{#1}"
  , "\\newcommand{\\InformationTok}[1]{\\textcolor[rgb]{0.38,0.63,0.69}{\\textbf{\\textit{#1}}}}"
  , "\\newcommand{\\KeywordTok}[1]{\\textcolor[rgb]{0.00,0.44,0.13}{\\textbf{#1}}}"
  , "\\newcommand{\\NormalTok}[1]{#1}"
  , "\\newcommand{\\OperatorTok}[1]{\\textcolor[rgb]{0.40,0.40,0.40}{#1}}"
  , "\\newcommand{\\OtherTok}[1]{\\textcolor[rgb]{0.00,0.44,0.13}{#1}}"
  , "\\newcommand{\\PreprocessorTok}[1]{\\textcolor[rgb]{0.74,0.48,0.00}{#1}}"
  , "\\newcommand{\\RegionMarkerTok}[1]{#1}"
  , "\\newcommand{\\SpecialCharTok}[1]{\\textcolor[rgb]{0.25,0.44,0.63}{#1}}"
  , "\\newcommand{\\SpecialStringTok}[1]{\\textcolor[rgb]{0.73,0.40,0.53}{#1}}"
  , "\\newcommand{\\StringTok}[1]{\\textcolor[rgb]{0.25,0.44,0.63}{#1}}"
  , "\\newcommand{\\VariableTok}[1]{\\textcolor[rgb]{0.10,0.09,0.49}{#1}}"
  , "\\newcommand{\\VerbatimStringTok}[1]{\\textcolor[rgb]{0.25,0.44,0.63}{#1}}"
  , "\\newcommand{\\WarningTok}[1]{\\textcolor[rgb]{0.38,0.63,0.69}{\\textbf{\\textit{#1}}}}"
  ]
@@ -1,94 +0,0 @@
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ScopedTypeVariables #-}

-- | Mermaid diagram processing functionality
module Docster.Mermaid
  ( -- * Diagram Processing
    processMermaidBlock
  , renderMermaidDiagram
  , generateDiagramId
  ) where

import Docster.Types
import Text.Pandoc.Definition (Block(..), Inline(..), nullAttr)
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.IO as TIO
import Data.Hashable (hash)
import System.FilePath (takeFileName, (</>))
import System.Directory (removeFile)
import System.Process (callProcess)
import Control.Exception (bracket, catch, SomeException)

-- | Application constants
mermaidCommand :: Text
mermaidCommand = "mmdc"

diagramPrefix :: Text
diagramPrefix = "diagram-"

successEmoji :: Text
successEmoji = "✅"

-- | Generate a diagram ID from content hash or explicit ID
generateDiagramId :: Text -> Text -> DiagramId
generateDiagramId explicitId contents
  | T.null explicitId = DiagramId $ diagramPrefix <> T.take 6 (T.pack . show . abs . hash $ T.unpack contents)
  | otherwise = DiagramId explicitId

-- | Transform Mermaid code blocks into image embeds
processMermaidBlock :: DiagramConfig -> Block -> IO (Either DocsterError Block)
processMermaidBlock config (CodeBlock (id', classes, _) contents)
  | "mermaid" `elem` classes = do
      let diagId = generateDiagramId id' contents
      result <- renderMermaidDiagram config diagId contents
      case result of
        Left err -> return $ Left err
        Right imagePath -> return $ Right $ createImageBlock imagePath
processMermaidBlock _ block = return $ Right block

-- | Render Mermaid diagram to appropriate format with resource cleanup
renderMermaidDiagram :: DiagramConfig -> DiagramId -> Text -> IO (Either DocsterError Text)
renderMermaidDiagram config@(DiagramConfig (SourceDir sourceDir) format) diagId contents = do
  let diagIdStr = T.unpack $ (\(DiagramId d) -> d) diagId
      mmdFile = sourceDir </> diagIdStr <> ".mmd"
      (outputFile, imagePath) = generateDiagramPaths config diagId

  -- Use bracket to ensure cleanup of temporary mermaid file
  result <- bracket
    (TIO.writeFile mmdFile contents >> return mmdFile)
    (\file -> removeFile file `catch` \(_ :: SomeException) -> return ())
    (\_ -> do
      processResult <- callMermaidProcess format mmdFile outputFile
      case processResult of
        Left err -> return $ Left err
        Right _ -> do
          putStrLn $ T.unpack $ successEmoji <> " Generated " <> T.pack outputFile
          return $ Right imagePath)
  return result

-- | Generate file paths for diagram based on format
generateDiagramPaths :: DiagramConfig -> DiagramId -> (FilePath, Text)
generateDiagramPaths (DiagramConfig (SourceDir sourceDir) format) (DiagramId diagId) =
  let diagIdStr = T.unpack diagId
  in case format of
    HTML -> let svgFile = sourceDir </> diagIdStr <> ".svg"
            in (svgFile, T.pack $ takeFileName svgFile)
    PDF -> let pngFile = sourceDir </> diagIdStr <> ".png"
           in (pngFile, T.pack pngFile)

-- | Call mermaid CLI process with appropriate arguments
callMermaidProcess :: OutputFormat -> FilePath -> FilePath -> IO (Either DocsterError ())
callMermaidProcess format mmdFile outputFile = do
  let args = case format of
        HTML -> ["-i", mmdFile, "-o", outputFile]
        PDF -> ["-i", mmdFile, "-o", outputFile, "--scale", "3"]

  result <- catch
    (callProcess (T.unpack mermaidCommand) args >> return (Right ()))
    (\(e :: SomeException) -> return $ Left $ ProcessError $ "Mermaid process failed: " <> T.pack (show e))
  return result

-- | Create Pandoc image block from image path
createImageBlock :: Text -> Block
createImageBlock imagePath = Para [Image nullAttr [] (imagePath, "Mermaid diagram")]
@@ -1,23 +0,0 @@
{-# LANGUAGE OverloadedStrings #-}

-- | Document transformation functionality for processing Pandoc AST
module Docster.Transform
  ( -- * Document Transformation
    transformDocument
  ) where

import Docster.Types
import Docster.Mermaid (processMermaidBlock)
import Text.Pandoc.Definition (Pandoc(..), Block)

-- | Walk the Pandoc AST and process blocks with error handling
transformDocument :: DiagramConfig -> Pandoc -> IO (Either DocsterError Pandoc)
transformDocument config doc = walkMEither (processMermaidBlock config) doc

-- | Walk with error handling - transforms Either into IO Either
walkMEither :: Monad m => (Block -> m (Either e Block)) -> Pandoc -> m (Either e Pandoc)
walkMEither f (Pandoc meta blocks) = do
  results <- mapM f blocks
  case sequence results of
    Left err -> return $ Left err
    Right newBlocks -> return $ Right $ Pandoc meta newBlocks
@@ -1,51 +0,0 @@
{-# LANGUAGE OverloadedStrings #-}

-- | Core types and error definitions for Docster
module Docster.Types
  ( -- * Error Types
    DocsterError(..)

    -- * Output Format
  , OutputFormat(..)

    -- * Domain Types
  , SourceDir(..)
  , OutputPath(..)
  , DiagramId(..)
  , DiagramConfig(..)
  ) where

import Data.Text (Text)
import Control.Exception (Exception)

-- | Custom error types for comprehensive error handling
data DocsterError
  = InvalidUsage Text
  | FileError Text
  | PDFGenerationError Text
  | ProcessError Text
  deriving (Show)

instance Exception DocsterError

-- | Output format for document generation
data OutputFormat = PDF | HTML
  deriving (Show, Eq)

-- | Type-safe wrapper for source directory paths
newtype SourceDir = SourceDir FilePath
  deriving (Show, Eq)

-- | Type-safe wrapper for output file paths
newtype OutputPath = OutputPath FilePath
  deriving (Show, Eq)

-- | Type-safe wrapper for diagram identifiers
newtype DiagramId = DiagramId Text
  deriving (Show, Eq)

-- | Configuration for diagram generation
data DiagramConfig = DiagramConfig
  { dcSourceDir :: SourceDir
  , dcOutputFormat :: OutputFormat
  } deriving (Show)
@@ -1,8 +0,0 @@
resolver: lts-22.39 # GHC 9.12.2 compatible

packages:
- .

extra-deps: []

allow-newer: true
@@ -1,12 +0,0 @@
# This file was autogenerated by Stack.
# You should not edit this file by hand.
# For more information, please see the documentation at:
#   https://docs.haskellstack.org/en/stable/topics/lock_files

packages: []
snapshots:
- completed:
    sha256: 6c5aeace2ca5ecde793a9e0acfaa730ec8f384aa2f6183a2a252f5f9ec55d623
    size: 720039
    url: https://raw.githubusercontent.com/commercialhaskell/stackage-snapshots/master/lts/22/39.yaml
  original: lts-22.39