Caddy in Y minutes
Caddy is a powerful, extensible server platform to serve your sites, services, and apps. Most people use Caddy as a web server or proxy.
This interactive guide introduces the basics of Caddy, the Caddy API, and the Caddyfile. You can try all examples without leaving the browser or installing anything.
Getting started - Running Caddy · Your first config · Your first Caddyfile · JSON vs. Caddyfile · API vs. Config files · Start, stop, run
Caddy API - Reloading config · Basic config · Config traversal · Using @id tag
Caddyfile - First site · Adding functionality · Multiple sites · Matchers · Environment variables · Comments
✨ This is an open source guide. Feel free to improve it!
Getting started
Let's explore the basics of using Caddy and get familiar with it at a high level.
Running Caddy
Let's start by running it:
caddy
Caddy is an extensible server platform written in Go.
At its core, Caddy merely manages configuration. Modules are plugged
in statically at compile-time to provide useful functionality. Caddy's
standard distribution includes common modules to serve HTTP, TLS,
and PKI applications, including the automation of certificates.
Oops; without a subcommand, the caddy command only displays help text. You can use this any time you forget what to do.
To start Caddy as a daemon, use the run subcommand:
caddy run
This blocks forever, but what is it doing? At the moment... nothing. By default, Caddy's configuration ("config") is blank. We can verify this using the admin API in another terminal:
curl localhost:2019/config/
null
null is an actual response from Caddy. It's not very informative because of the empty config.
localhost:2019 is not your website: this administration endpoint is used for controlling Caddy and is restricted to localhost by default.
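If you ever need the admin endpoint to listen elsewhere, Caddy's JSON config (introduced in the next section) has a top-level admin object whose listen field sets that address. A minimal sketch, shown here with the default value:

{
  "admin": {
    "listen": "localhost:2019"
  }
}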
We can make Caddy useful by giving it a config. This can be done many ways, but we'll start by making a POST request to the /load endpoint using curl in the next section.
Your first config
To prepare our request, we need to make a config. At its core, Caddy's configuration is simply a JSON document.
Save this to a JSON file (e.g. caddy.json):
{
"apps": {
"http": {
"servers": {
"example": {
"listen": [":2015"],
"routes": [
{
"handle": [
{
"handler": "static_response",
"body": "Hello, world!"
}
]
}
]
}
}
}
}
}
Valid configuration
You do not have to use config files, but we are for this tutorial. Caddy's admin API is designed for use by other programs or scripts.
Then upload it:
curl localhost:2019/load \
-H "Content-Type: application/json" \
-w "status: %{response_code}" \
-d @caddy.json
status: 200
We can verify that Caddy applied our new config with another GET request:
curl localhost:2019/config/
{"apps":{"http":{"servers":{"example":{"listen":[":2015"],"routes":[{"handle":[{"body":"Hello, world!","handler":"static_response"}]}]}}}}}
Test that it works by going to localhost:2015 in your browser, or use curl:
curl localhost:2015
Hello, world!
If you see Hello, world!, then congrats — it's working! It's always a good idea to make sure your config works as you expect, especially before deploying into production.
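Before loading a config, you can also sanity-check it with the caddy validate subcommand, which loads and provisions the config without actually starting the server (a brief aside; point --config at wherever you saved the file):

caddy validate --config caddy.json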
Try changing the body in caddy.json above and note how the curl response changes.
Your first Caddyfile
That was kind of a lot of work just for Hello World.
Another way to configure Caddy is with the Caddyfile. The same config we wrote in JSON above can be expressed simply as:
:2015
respond "Hello, world!"
Valid configuration
Save that to a file named Caddyfile (no extension) in the current directory. Stop Caddy if it is already running (Ctrl+C), then run:
# or if your Caddyfile is somewhere else:
# caddy adapt --config /path/to/Caddyfile
caddy adapt
{"apps":{"http":{"servers":{"srv0":{"listen":[":2015"],"routes":[{"handle":[{"body":"Hello, world!","handler":"static_response"}]}]}}}}}
You will see JSON output! What happened here?
We just used a config adapter to convert our Caddyfile to Caddy's native JSON structure.
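If that single line of JSON is hard to read, caddy adapt also accepts a --pretty flag that indents the output (to the best of my recollection; caddy adapt --help will confirm on your version):

caddy adapt --pretty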
While we could take that output and make another API request, we can skip all those steps because the caddy command can do it for us. If there is a file called Caddyfile in the current directory and no other config is specified, Caddy will load the Caddyfile, adapt it for us, and run it right away.
Now that there is a Caddyfile in the current folder, let's do caddy run again:
# or if your Caddyfile is somewhere else:
# caddy run --config /path/to/Caddyfile
caddy run
(If your file is called something else that doesn't start with "Caddyfile", you will need to specify --adapter caddyfile.)
You can now try loading your site again and you will see that it is working:
curl localhost:2015
Hello, world!
Try changing the Caddyfile contents above and note how the curl response changes.
As you can see, there are several ways you can start Caddy with an initial config:
- A file named Caddyfile in the current directory
- The --config flag (optionally with the --adapter flag)
- The --resume flag (if a config was loaded previously)
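Sketched as commands (the file paths are placeholders), those three options look roughly like this:

# a Caddyfile in the current directory is picked up automatically
caddy run

# an explicit config file (add --adapter when the file is not JSON and not named Caddyfile)
caddy run --config caddy.json
caddy run --config /path/to/Caddyfile --adapter caddyfile

# resume the last config that was loaded via the API
caddy run --resume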
JSON vs. Caddyfile
Now you know that the Caddyfile is just converted to JSON for you.
The Caddyfile seems easier than JSON, but should you always use it? There are pros and cons to each approach. The answer depends on your requirements and use case. JSON is easy to generate and automate, so it's meant for programs. Caddyfile is easy to craft by hand, so it's meant for humans.
It is important to note that both JSON and the Caddyfile (and any other supported config adapter) can be used with Caddy's API. However, you get the full range of Caddy's functionality and API features if you use JSON. If using a config adapter, the only way to load or change the config with the API is the /load endpoint.
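For example, the /load endpoint will adapt a Caddyfile on the fly if you tell it the content type — a sketch using curl, where the Content-Type value text/caddyfile is what selects the adapter:

curl localhost:2019/load \
  -H "Content-Type: text/caddyfile" \
  --data-binary @Caddyfile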
API vs. Config files
You will also want to decide whether your workflow is API-based or CLI-based. (You can use both the API and config files on the same server, but we don't recommend it: best to have one source of truth.)
Under the hood, even config files go through Caddy's API endpoints; the caddy command just wraps up those API calls for you.
The choice of API or config file workflow is orthogonal to the use of config adapters: you can use JSON but store it in a file and use the command line interface; conversely, you can also use the Caddyfile with the API.
But most people will use JSON+API or Caddyfile+CLI combinations.
As you can see, Caddy is well-suited for a wide variety of use cases and deployments!
Start, stop, run
Since Caddy is a server, it runs indefinitely. That means your terminal won't unblock after you execute caddy run until the process is terminated (usually with Ctrl+C).
Although caddy run is the most common and is usually recommended (especially when making a system service!), you can alternatively use caddy start to start Caddy and have it run in the background:
caddy start
Successfully started Caddy (pid=42) - Caddy is running in the background
This will let you use your terminal again, which is convenient in some interactive headless environments.
You will then have to stop the process yourself, since Ctrl+C won't stop it for you:
caddy stop
Or use the /stop endpoint of the API.
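That endpoint expects a POST request with no body, so from another terminal it might look like this:

curl -X POST localhost:2019/stop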
Reloading config
Your server can perform zero-downtime config reloads/changes.
All API endpoints that load or change config are graceful with zero downtime.
When using the command line, however, it may be tempting to use Ctrl+C to stop your server and then restart it again to pick up the new configuration. Don't do this: stopping and starting the server is orthogonal to config changes, and it will cause downtime while the server is stopped.
Instead, use the caddy reload command for a graceful config change:
caddy reload
{"level":"info","ts":1709475992.0607605,"msg":"using adjacent Caddyfile"}
This actually just uses the API under the hood. It will load and, if necessary, adapt your config file to JSON, then gracefully replace the active configuration without downtime.
If there are any errors loading the new config, Caddy rolls back to the last working config.
Technically, the new config is started before the old config is stopped, so for a brief time, both configs are running! If the new config fails, it aborts with an error, while the old one is simply not stopped.
Caddy API
Let's explore Caddy's admin API, which makes it possible to automate in a programmable fashion.
Basic config
Let's get back to our JSON config in caddy.json:
{
"apps": {
"http": {
"servers": {
"example": {
"listen": [":2015"],
"routes": [
{
"handle": [
{
"handler": "static_response",
"body": "Hello, world!"
}
]
}
]
}
}
}
}
}
As you probably remember, we used the /load API endpoint to apply it:
curl localhost:2019/load \
-H "Content-Type: application/json" \
-d @caddy.json
Config traversal
Suppose we want to change the body from Hello, world! to some other phrase. Instead of uploading the entire config file for a small change, let's use a powerful feature of Caddy's API to make the change without ever touching our config file.
Making little changes to production servers by replacing the entire config can be dangerous; it's like having root access to a file system. Caddy's API lets you limit the scope of your changes to guarantee that other parts of your config don't get changed accidentally.
Using the request URI's path, we can traverse into the config structure and update only the message string (be sure to scroll right if clipped):
curl \
localhost:2019/config/apps/http/servers/example/routes/0/handle/0/body \
-H "Content-Type: application/json" \
-d '"Work smarter, not harder."'
Every time you change the config using the API, Caddy persists a copy of the new config so you can --resume it later!
You can verify that it worked with a similar GET request, for example:
curl localhost:2019/config/apps/http/servers/example/routes
[{"handle":[{"body":"Work smarter, not harder.","handler":"static_response"}]}]
You can use the jq command to prettify the JSON output:
curl localhost:2019/config/apps/http/servers/example/routes | jq
[
{
"handle": [
{
"body": "Work smarter, not harder.",
"handler": "static_response"
}
]
}
]
Important note: This should be obvious, but once you use the API to make a change that is not in your original config file, your config file becomes obsolete. There are a few ways to handle this:
- Use the --resume flag of the caddy run command to use the last active config.
- Don't mix the use of config files with changes via the API; have one source of truth.
- Export Caddy's new configuration with a subsequent GET request, as sketched below (less recommended than the first two options).
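The export mentioned in the last option is just a GET against the config root, redirected to a file:

# overwrite the local file with the currently running config
curl localhost:2019/config/ > caddy.json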
Using @id tag
Config traversal is certainly useful, but the paths are a little long, don't you think? We can give our handler object an @id tag to make it easier to access:
curl \
localhost:2019/config/apps/http/servers/example/routes/0/handle/0/@id \
-H "Content-Type: application/json" \
-d '"msg"'
This adds a property to our handler object: "@id": "msg", so it now looks like this:
{
"@id": "msg",
"body": "Work smarter, not harder.",
"handler": "static_response"
}
@id tags can go in any object and can have any primitive value (usually a string). Learn more in the API documentation.
We can then access it directly:
curl localhost:2019/id/msg
{"@id":"msg","body":"Work smarter, not harder.","handler":"static_response"}
And now we can change the message with a shorter path:
curl \
localhost:2019/id/msg/body \
-H "Content-Type: application/json" \
-d '"Some shortcuts are good."'
And check it again:
curl localhost:2019/id/msg/body
"Some shortcuts are good."
Caddyfile
Let's explore the basics of the HTTP Caddyfile so that you can quickly and easily produce good-looking, functional site configs.
First site
Create a new text file named Caddyfile (no extension).
The first thing you should type is your site's address:
localhost
If the HTTP and HTTPS ports (80 and 443, respectively) are privileged ports on your OS, you will either need to run with elevated privileges or use a higher port. To use a higher port, just change the address to something like localhost:2015 and change the HTTP port using the http_port Caddyfile option.
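For reference, the http_port option lives in a global options block at the top of the Caddyfile. A minimal sketch assuming port 2015 (double-check the exact behavior on your setup):

{
    http_port 2015
}

localhost:2015

respond "Hello, world!"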
Then hit enter and type what you want it to do. For this tutorial, make your Caddyfile look like this:
http://localhost
respond "Hello, world!"
Save that and run Caddy (since this is a training tutorial, we'll use the --watch flag so changes to our Caddyfile are applied automatically):
caddy run --watch
If you get permissions errors, try using a higher port in your address (like localhost:2015) and change the HTTP port, or run with elevated privileges.
Caddy serves all sites over HTTPS by default as long as a host or IP is part of the site's address. Automatic HTTPS can be disabled by prefixing the address with http:// explicitly (that's what I did in the Caddyfile above).
Open http://localhost in your browser and see your web server working! I'll use curl instead of the browser:
curl localhost
Hello, world!
That's not particularly exciting, so let's change our static response to a file server with directory listings enabled:
http://localhost
file_server browse
Save your Caddyfile, then refresh your browser tab. You should either see a list of files or an HTML page if there is an index file in the current directory:
curl localhost
Adding functionality
Let's do something interesting with our file server: serve a templated page. Create a new file and paste this into it:
<!DOCTYPE html>
<html>
<head>
<title>Caddy tutorial</title>
</head>
<body>
Page loaded at: {{now | date "Mon Jan 2 15:04:05 MST 2006"}}
</body>
</html>
Save this as caddy.html in the current directory and load it in your browser:
curl localhost/caddy.html
Wait a minute. We should see today's date. Why didn't it work? It's because the server hasn't yet been configured to evaluate templates! That's easy to fix: just add a line to the Caddyfile so it looks like this:
http://localhost
templates
file_server browse
Save that, then reload the browser tab:
curl localhost/caddy.html
With Caddy's templates module, you can do a lot of useful things with static files, such as including other HTML files, making sub-requests, setting response headers, working with data structures, and more!
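For instance, assuming a neighboring file named footer.html exists (a hypothetical file, not part of this tutorial), the include template function renders its contents in place. A sketch of how the body of caddy.html could use it:

<body>
  Page loaded at: {{now | date "Mon Jan 2 15:04:05 MST 2006"}}
  {{include "footer.html"}}
</body>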
Try changing the caddy.html template above, re-run the curl command, and see how the result changes.
It's good practice to compress responses with a quick and modern compression algorithm. Let's enable Gzip and Zstandard support using the encode directive:
http://localhost
encode zstd gzip
templates
file_server browse
Browsers don't support Zstandard encodings yet. Hopefully soon!
That's the basic process for getting a semi-advanced, production-ready site up and running!
When you're ready to turn on automatic HTTPS, just replace your site's address (http://localhost in our tutorial) with your domain name. See the HTTPS quick-start guide for more information.
Multiple sites
With our current Caddyfile, we can only have one site definition! Only the first line can be the address(es) of the site, and all the rest of the file has to be directives for that site.
But it is easy to make it so we can add more sites!
Our Caddyfile so far:
http://localhost
encode zstd gzip
templates
file_server browse
is equivalent to this one:
http://localhost {
encode zstd gzip
templates
file_server browse
}
except the second one allows us to add more sites.
By wrapping our site block in curly braces { } we are able to define multiple, different sites in the same Caddyfile.
For example:
:8080 {
respond "I am 8080"
}
:8081 {
respond "I am 8081"
}
Let's try the port 8080:
curl localhost:8080
I am 8080
Now try changing the port in the curl command to 8081 and see how the result changes.
When wrapping site blocks in curly braces, only addresses appear outside the curly braces and only directives appear inside them.
For multiple sites which share the same configuration, you can add more addresses, for example:
:8080, :8081 {
...
}
You can then define as many different sites as you want, as long as each address is unique.
Matchers
We may want to apply some directives only to certain requests. For example, let's suppose we want to have both a file server and a reverse proxy, but we obviously can't do both on every request! Either the file server will write a static file, or the reverse proxy will proxy the request to a backend.
This config will not work like we want:
http://localhost
file_server
reverse_proxy 127.0.0.1:9005
In practice, we may want to use the reverse proxy only for API requests, i.e. requests with a base path of /api/. This is easy to do by adding a matcher token:
http://localhost
file_server
reverse_proxy /api/* 127.0.0.1:9005
There; now the reverse proxy will handle all requests starting with /api/, while the file server handles the rest:
curl -v localhost/api/status
> GET /api/status HTTP/1.1
> Host: localhost
> User-Agent: curl/8.5.0
> Accept: */*
>
< HTTP/1.1 502 Bad Gateway
< Server: Caddy
< Date: Mon, 04 Mar 2024 19:42:42 GMT
< Content-Length: 0
(Caddy proxies the request to port 9005, where no one is listening, hence the 502 Bad Gateway response)
The /api/* token we just added is called a matcher token. You can tell it's a matcher token because it starts with a forward slash / and it appears right after the directive (but you can always look it up in the directive's docs to be sure).
Matchers are really powerful. You can name matchers and use them like @name to match on more than just the request path! Take a moment to learn more about matchers before continuing!
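As a quick taste, here is a sketch of the same proxy rule written with a named matcher (the name @api is arbitrary, and path is just one of many matcher options):

http://localhost

@api path /api/*
reverse_proxy @api 127.0.0.1:9005
file_server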
Environment variables
The Caddyfile adapter allows substituting environment variables before the Caddyfile is parsed.
First, set an environment variable (in the same shell that runs Caddy):
export SITE_ADDRESS=localhost:9055
Then you can use it like this in the Caddyfile:
{$SITE_ADDRESS}
respond "Hello from {$SITE_ADDRESS}"
Before the Caddyfile is parsed, it will be expanded to:
localhost:9055
respond "Hello from localhost:9055"
Let's check this:
export SITE_ADDRESS=localhost:9055
caddy start
curl localhost:2019/config/ | jq '.. | .body? // empty'
Successfully started Caddy (pid=42) - Caddy is running in the background
"Hello from localhost:9055"
Comments
One last thing that you will find most helpful: if you want to remark or note anything in your Caddyfile, you can use comments, starting with #:
# this starts a comment
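Comments can occupy a whole line or follow other tokens, as long as the # is preceded by whitespace. A small sketch:

# serve the current directory on port 2015
:2015 {
    file_server browse  # enable directory listings
}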
Further reading
For more information about Caddy, see the documentation.
Caddy contributors, Anton Zhiyanov · original · CC-BY-SA-4.0 · 2024-03-03