How to Create a Documentation Site With Next.js and Markdown
Twilio’s documentation is a key ingredient in ensuring an overall delightful developer experience. Without helpful, accurate, and thorough docs, how would you know how to schedule SMS and WhatsApp messages with Twilio, or create an AI assistant?
As you may have read earlier, we recently completely overhauled the technical platform we use to provide developers like you with documentation. Today, I’d like to share, engineer-to-engineer, what it took to pull that off and transition us to the Next.js powered goodness we enjoy today.
The previous solution: the full-blown CMS
Our previous platform was based on an open-source CMS (content management system) called Wagtail. Based on the Django framework for Python, Wagtail is a robust and capable CMS, but there were reasons why it wasn’t quite what we needed for hosting a technical docs site.
Wagtail is a system I’d heartily recommend if a capable and flexible CMS is what you need. However, we didn’t really need or prefer a full CMS for Twilio’s docs.
For starters, there was the operational overhead of managing a fleet of Docker containers running nginx and a Python app that then connected to a backend database (in our case, Postgres). This overhead is well worth it for a full-blown CMS, but, in our case, we had to solve other problems that led us down a simpler path.
The problems to solve
Poor page loading performance
The first problem we wanted to solve was the page load time of our docs pages. On its own, this didn’t warrant a platform overhaul, since full-page caching at the edge can speed up even server-side rendered pages like those served by Wagtail. It was just the first of several problems we wanted to solve.
Less infrastructure to manage
Additionally, we wanted to reduce the infrastructure we needed to manage. We wanted to move away from constantly tweaking nginx configs and managing database migrations and backups.
Better controls over documentation quality
Meanwhile, we needed to keep letting product teams across Twilio edit the docs for their products, while also maintaining a quality bar that ensures the documentation adequately serves you, our developers.
The wish list
We quickly decided that a static site generator would work best, as it would require less infrastructure to manage and would improve performance by pre-rendering static HTML during the build step.
To fulfill the desire for better quality controls, we realized we already used the tools we needed every day: git source control, GitHub for pull requests, and CI/CD pipelines. Adopting a “Docs-as-Code” approach, we decided to treat our docs as regular text files (Markdown, to be specific) and use that same suite of tools to hold our docs to the same high quality bar as our code.
The chosen tools
MDX
We looked at Markdown and considered whether it would be able to handle rich features such as OpenAPI-sourced reference docs and dynamically generated code samples in nine (nine!) programming languages. We quickly ran across MDX, a superset of Markdown that lets you embed React components using JSX-style syntax, and we were sold.
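As a tiny illustration, an MDX file mixes ordinary Markdown with embedded React components; the DynamicCodeSample tag below is just a placeholder for the kind of component we describe later in this post:

```mdx
## Send your first SMS

Twilio’s Messages API lets you send a text message with a single request:

<DynamicCodeSample path="/2010-04-01/Accounts/{AccountSid}/Messages.json" method="POST" />
```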
Next.js
We knew we wanted to use React as that was the UI framework we were familiar with and what our Paste design system was built for. We also knew that for maximum performance, we wanted to statically generate the site. However, we didn’t want to paint ourselves into a corner such that we couldn’t do any kind of server-side rendering for specific use cases. That led us to Next.js, which provides the best of all worlds when it comes to choice of rendering strategies.
Vercel for hosting
Finally, we needed to choose a hosting provider. Although we could use an AWS CloudFront distribution and an S3 bucket to host a static site (which we actually do for our backup site – let us know if you’d like to hear about our static backup failover solution), it would mean giving up the server-side rendering options previously discussed. Beyond this, we found more advantages by choosing Vercel:
- Preview deployments (an isolated website for every pull request)
- Enterprise support not just for the hosting platform, but Next.js too
- A Terraform provider so all our Vercel project configs can be managed as code
- OpenID Connect support for integrating with our AWS resources
- Log drains to a webhook so we can pipe stats to our monitoring systems
- SSO integration with project-level permission mapping
Rendering MDX in Next.js
We needed a good way to render an MDX file in a Next.js page. We quickly found next-mdx-remote for this purpose. It allows you to render MDX content into React that can then be handled by Next.js. Here’s an example:
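The snippet below is a simplified sketch of such a page (assuming next-mdx-remote with the Pages router); the hard-coded MDX string and the DynamicCodeSample placeholder stand in for the real content and components described next.

```tsx
// pages/docs-example.tsx: a simplified sketch, not our production page.
import type { GetStaticProps } from "next";
import { serialize } from "next-mdx-remote/serialize";
import { MDXRemote, type MDXRemoteSerializeResult } from "next-mdx-remote";

// A stand-in for one of our custom MDX components.
const DynamicCodeSample = ({ path, method }: { path: string; method: string }) => (
  <pre>{`Code samples for ${method} ${path} would render here`}</pre>
);

// The dictionary of custom components the MDX content is allowed to use.
const components = { DynamicCodeSample };

interface Props {
  source: MDXRemoteSerializeResult;
}

export const getStaticProps: GetStaticProps<Props> = async () => {
  // In production the MDX files live in a Git repository; here we hard-code
  // a heading, some text, and the DynamicCodeSample component.
  const mdx = `
# Sending messages

You can send an SMS with a single API request:

<DynamicCodeSample path="/2010-04-01/Accounts/{AccountSid}/Messages.json" method="POST" />
`;

  // serialize compiles the MDX so it can be rendered as React on the page.
  const source = await serialize(mdx);
  return { props: { source } };
};

export default function DocsExamplePage({ source }: Props) {
  // MDXRemote renders the compiled MDX, resolving <DynamicCodeSample /> from
  // the components dictionary.
  return <MDXRemote {...source} components={components} />;
}
```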
First, we have the getStaticProps function (yes, we’re still using the Next.js Pages router here). This function is responsible for getting the MDX content from wherever it may live. In our production implementation, the files live in a Git repository, but in this example, we’re just hard-coding some content that displays a heading, some text, and a custom component called DynamicCodeSample. This MDX is passed to the serialize function and the result is put in the source prop for the page.
Next, we define a constant named components that is a dictionary of all the custom components that we will support in the MDX. Again, we’re just passing a single component in this example, but in our codebase we have dozens of components that we allow for.
Finally, we have the page component itself. All we need to do is pass the source prop and our components dictionary into the MDXRemote component and it handles the rest!
We glossed over the DynamicCodeSample component. It could be any React component, really. In our particular example, we have a component that takes a Twilio API path and method and will generate a code sample in nine different languages (see an example here).
Other implementation challenges
Beyond just rendering MDX, we worked on many other fun technical challenges, including:
- Allowing custom Markdown syntax using custom remark plugins (a minimal sketch follows after this list)
- React components that render OpenAPI resource property and parameter tables
- Using Vale to provide style linting for the documentation prose
- Caching MDX serialization to speed up builds
- Using Incremental Static Regeneration (ISR) for our error codes database
- Creating a static backup site and failover procedure
Let us know if you are interested in hearing more about any of these.
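To give a flavor of the first item, here is a minimal sketch of what a remark plugin can look like (illustrative only, not our production code); it replaces a made-up {{version}} token in plain-text Markdown nodes before the MDX is compiled:

```ts
// remark-replace-version.ts: an illustrative remark plugin, not our production code.
import { visit } from "unist-util-visit";
import type { Plugin } from "unified";
import type { Root, Text } from "mdast";

// Replaces a hypothetical {{version}} token in every Markdown text node.
const remarkReplaceVersion: Plugin<[{ version: string }], Root> =
  ({ version }) =>
  (tree) => {
    visit(tree, "text", (node: Text) => {
      node.value = node.value.replaceAll("{{version}}", version);
    });
  };

export default remarkReplaceVersion;
```

Plugins like this can be passed to next-mdx-remote’s serialize function via its mdxOptions.remarkPlugins option.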
Wins
We’ve only scratched the surface of this enormous project, which covered migrating over 5,000 pages of docs with close to 20,000 code samples across nine languages. Now that this project is complete, we’ve enjoyed the following wins:
- Massive boost to page load speed and Lighthouse scores
- Git-based collaboration on documentation for all contributors
- Source-controlled infrastructure as code
- A modern look & feel thanks to Paste
We aren’t done, however. We have more plans to improve the docs experience both for our internal authors and for Twilio developers everywhere. Stay tuned for more.