Run Monstarillo
Now for the exciting part: running Monstarillo to generate your database documentation! This final step connects to your database and transforms its schema into Markdown files.
📋 Prerequisites Checklist
Before running Monstarillo, ensure you have:
- ✅ Monstarillo installed (see Install guide)
- ✅ Templates configured (see Configure guide)
- ✅ Database running (Chinook PostgreSQL sample database; a quick check follows this list)
- ✅ Database credentials ready
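If you set up the Chinook database with Docker in the earlier steps, a quick check before continuing can save a failed run. The container name below is only an assumption from the setup guide; substitute whatever name you used:

```bash
# Confirm the Chinook container is running (container name is an assumption;
# adjust it to match your own setup)
docker ps --filter "name=chinook-db"

# Confirm PostgreSQL is accepting connections on the default port
pg_isready -h localhost -p 5432
```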
🔗 Database Connection Information
You'll need the following information to connect:
| Parameter | Value | Description |
|---|---|---|
| Database Type | `postgres` | Database engine (postgres, mysql, oracle) |
| Templates File | `/path/to/templates.json` | Location of your configuration file |
| Username | `postgres` | Database username |
| Password | `your_password` | Database password |
| Database Name | `chinook-db` | Name of the database |
| Host | `localhost` | Database server (localhost for Docker) |
| Schema | `public` | Database schema name |
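Before running Monstarillo, you can sanity-check these values with the standard `psql` client (assuming it is installed). If the command below lists the Chinook tables, Monstarillo should be able to connect with the same settings:

```bash
# List the tables in the public schema using the same connection details
psql -h localhost -U postgres -d chinook-db -c '\dt public.*'
```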
🚀 Run Monstarillo
Basic Command Structure
monstarillo [database_type] \
--t [templates.json_path] \
--u [username] \
--p [password] \
--db [database_name] \
--host [host] \
--schema [schema_name]
Example with Chinook Database
monstarillo postgres \
--t /path/to/monstarillo-templates/tutorial-database-documenter/templates.json \
--u postgres \
--p your_password_here \
--db "chinook-db" \
--host "localhost" \
--schema "public"
Update Paths: Replace `/path/to/monstarillo-templates` with your actual template directory path from the configuration step.
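One note on the password flag: passing a plain-text password on the command line leaves it in your shell history. As a small precaution you can interpolate it from an environment variable instead. This is ordinary shell substitution, not a documented Monstarillo feature, and `CHINOOK_DB_PASSWORD` is just a hypothetical variable name:

```bash
# Set the variable once (ideally in a profile or secrets file, not inline)
export CHINOOK_DB_PASSWORD='your_password_here'

monstarillo postgres \
  --t /path/to/monstarillo-templates/tutorial-database-documenter/templates.json \
  --u postgres \
  --p "$CHINOOK_DB_PASSWORD" \
  --db "chinook-db" \
  --host "localhost" \
  --schema "public"
```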
Command Options Reference
| Option | Long Form | Description | Example |
|---|---|---|---|
| `postgres` | N/A | Database type | `postgres`, `mysql`, `oracle` |
| `--t` | `--templates` | Path to templates.json | `--t ./templates.json` |
| `--u` | `--username` | Database username | `--u postgres` |
| `--p` | `--password` | Database password | `--p mypassword` |
| `--db` | `--database` | Database name | `--db "chinook-db"` |
| `--host` | `--host` | Database host/IP | `--host localhost` |
| `--schema` | `--schema` | Database schema | `--schema public` |
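The long-form flags are interchangeable with the short ones, so the Chinook example above could also be written like this (assuming the long forms accept exactly the same values):

```bash
monstarillo postgres \
  --templates /path/to/monstarillo-templates/tutorial-database-documenter/templates.json \
  --username postgres \
  --password your_password_here \
  --database "chinook-db" \
  --host "localhost" \
  --schema "public"
```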
📤 Expected Output
When you run the command, you should see output similar to:
postgres is in the building
Processing table: Artist
Processing table: Album
Processing table: Employee
Processing table: Customer
Processing table: Invoice
Processing table: InvoiceLine
Processing table: Track
Processing table: Playlist
Processing table: PlaylistTrack
Processing table: Genre
Processing table: MediaType
Processing template: /path/to/templates/tutorial-database-documenter/table.tmpl
Generation complete!
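If you want to keep a record of the run (useful when something looks off), you can capture the output with plain shell redirection; `tee` is a standard utility, not a Monstarillo option:

```bash
# Print the output to the terminal and save a copy to a log file
monstarillo postgres \
  --t /path/to/monstarillo-templates/tutorial-database-documenter/templates.json \
  --u postgres \
  --p your_password_here \
  --db "chinook-db" --host "localhost" --schema "public" 2>&1 | tee monstarillo-run.log
```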
📁 View Generated Files
Check your output directory for the generated documentation:
# Navigate to your output directory
cd /path/to/your/output/directory
# List generated files
ls -la tables/
# Example output:
# Album.md
# Artist.md
# Customer.md
# Employee.md
# Genre.md
# Invoice.md
# InvoiceLine.md
# MediaType.md
# Playlist.md
# PlaylistTrack.md
# Track.md
Each `.md` file contains detailed documentation for that database table, including:
- Table name and description
- Column names and data types
- Primary keys and constraints
- Relationships to other tables
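To spot-check the result, count the generated files and preview one of them. The paths below assume you are in your output directory and that the file names match the listing above:

```bash
# Count the generated table docs, then preview the start of one of them
ls tables/*.md | wc -l
head -n 20 tables/Artist.md
```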
🎉 Success!
You've successfully generated database documentation using Monstarillo!
What You Accomplished
- Connected to a PostgreSQL database
- Extracted metadata from 11 database tables
- Generated 11 markdown documentation files
- Learned the basic Monstarillo workflow
🚀 Next Steps
Now that you understand the basics, explore more advanced use cases:
💡 Ideas for Your Next Project
- Generate REST APIs for Go, Java, or Node.js
- Create database models for your favorite ORM
- Build API documentation (OpenAPI/Swagger specs)
- Generate test files with realistic sample data
🆘 Troubleshooting
Connection refused?
- Ensure your database is running
- Check that the port is correct (PostgreSQL default: 5432)
- Verify your host and credentials
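A quick way to confirm whether anything is actually listening on the expected port (and, for Docker, whether the port is mapped); the container name here is an assumption:

```bash
# Check whether something is listening on the default PostgreSQL port
nc -zv localhost 5432

# If the database runs in Docker, confirm the port mapping
docker port chinook-db
```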
Template not found?
- Double-check that your `--t` path points to templates.json
- Ensure `TemplateRoot` in your config is correct
Permission denied?
- Verify the database user has read permissions
- Check that the output directory is writable
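Two checks that can narrow this down. The table name and output path below are assumptions, so substitute your own:

```bash
# Ask PostgreSQL whether the user can read one of the Chinook tables
psql -h localhost -U postgres -d chinook-db \
  -c "SELECT has_table_privilege('postgres', '\"Album\"', 'SELECT');"

# Confirm the output directory is writable by the current user
test -w /path/to/your/output/directory && echo "writable" || echo "not writable"
```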
Empty files generated?
- Ensure your database has tables with data
- Check the `minimumGeneratedFileLength` setting
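You can also confirm whether any of the output files actually came out empty; the `tables/` path is an assumption based on the listing earlier in this guide:

```bash
# List any generated Markdown files with zero bytes
find tables/ -name '*.md' -empty
```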