
Welcome to static-site-express: install guide

Jun 10, 2024

static-site-express is a simple Node.js-based static site generator that uses EJS and Markdown. Deploy your static site to Netlify or to any platform of your liking. It is suited for landing pages, portfolios, blogs, documentation, and hobby projects.

Getting started

Install static-site-express

I created a "Barebone" theme (previously on the starter/barebone branch) without Tailwind CSS and Flowbite UI, with SASS support and some basic styling. It was a huge mistake to be dependent on any CSS frameworks. This theme became the default on the master branch. The old master branch is now available as deprecated-tailwind, and I discontinued its development.

  1. Click on the "Use this template" button to get an exact copy of the repo / site builder. Then use the master branch, which is the default. Or use the GitHub CLI:
gh repo create your-username/new-repo -p webandras/static-site-express
  2. For a basic e-commerce website, there is a Flowbite/Tailwind starter that incorporates the Snipcart e-commerce platform into static-site-express (Flowbite/Tailwind is going to be removed soon...):
<div id="snipcart" data-config-modal-style="side" data-api-key="YOUR_PUBLIC_TEST_API_KEY" hidden></div>

Note: This API key is public and can be committed to version control. There is also a private key, which should never be committed.

Snipcart is more than a simple cart: it offers a full back-office management dashboard to track abandoned carts, sales, orders, customers, and more. Disclaimer: I am not affiliated with Snipcart in any way.

Note: Netlify builds your site from the default branch (usually master) by default. You can use a branch other than the default one, but in that case Decap CMS (previously Netlify CMS) will not work properly. For example, images uploaded through the CMS will be pushed to the default branch, not the branch you set up in Netlify.

Test website: use the 'Deploy to Netlify' button on the project's website to create a test website.

Build your site locally

First, install or update npm packages.

Second, create a .env file (see .env.example), and set the variables.

If you want to use Algolia Search, you need to register and generate your API credentials. If you don't want to use Algolia, set enableSearch to false in config/site.config.js.
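A minimal .env sketch; the Algolia variable names are the ones read via process.env in the Algolia section below, while PORT is an assumption (check .env.example for the exact names):

# port of the local dev server (default: 4000)
PORT=4000

# Algolia credentials (only needed if enableSearch is true)
ALGOLIA_APP_ID=your-app-id
ALGOLIA_ADMIN_KEY=your-admin-api-key
ALGOLIA_INDEX=your-index-name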

Check out all the settings in site.config.js; the comments there provide additional information.
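The file is shaped roughly like this; a sketch based only on the options referenced in this guide (mode, enableSearch, and the site fields used for the search index), so the real file contains more settings and comments:

// config/site.config.js (sketch, not the full file)
module.exports = {
  // use "development" for local work, "production" for the live build
  mode: 'development',

  // set to false if you don't use Algolia Search
  enableSearch: false,

  site: {
    lang: 'en',
    author: 'Your Name',
    seoUrl: 'https://your-domain.example',
    // ...more site-wide settings
  },
};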

Use npm scripts defined in package.json

Note: On Windows, you can't use the bash scripts located in the bin folder; use the corresponding npm scripts in package.json instead.

1. Build site from ./content into the ./public folder (in watch mode):

bin/watch

Or:

npm run watch-chokidar

Or:

npm run watch-nodemon

The bin/watch bash script calls npm run watch-chokidar; alternatively, you can use npm run watch-nodemon.
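Conceptually, the chokidar-based watcher does something like the following; a simplified sketch, not the project's actual code (the build export from app/core/generator.js is an assumption):

// sketch: watch the content folder and rebuild the site into ./public on changes
const chokidar = require('chokidar');
const { build } = require('./app/core/generator'); // assumed export, see the real file

chokidar
  .watch('content', { ignoreInitial: true })
  .on('all', (event, filePath) => {
    console.log(`${event}: ${filePath}, rebuilding site into ./public`);
    build(); // regenerate the whole site
  });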

If you modify site.config.js, restart bin/watch (or the corresponding script in package.json) to apply your changes. For local development, make sure you set mode to "development"!

Generate the js and css bundles as well (in --watch mode): bin/webpack (npm run webpack-watch).

2. Serve the website on localhost:4000 (or the port you set in .env; the default is 4000) (legacy):

bin/serve

Or:

npm run serve
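The serve script is essentially a small Express static file server; a minimal sketch of the idea (not the project's exact code):

// sketch: serve ./public on PORT from .env (default 4000)
const express = require('express');

const app = express();
app.use(express.static('public'));

const port = process.env.PORT || 4000;
app.listen(port, () => console.log(`Serving ./public on http://localhost:${port}`));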

TODO: the Express dev server rarely crashes because it cannot find a file generated by the builder. The files and folders in the public folder are deleted and re-copied, so for a brief moment a .html file may not be available to be served by the Express server. However, the site builder regenerates everything within a few hundred milliseconds (generally less than 300 ms), so this error rarely happens.

It is recommended to switch to browser-sync to have live reloading in the browser when files change. The issue above will disappear if you use this:

bin/liveserver

Or:

npm run liveserver

Or run:

browser-sync start --server 'public' --files 'public'

3. Call the bin/webpack (npm run webpack) watcher script to make sure the js and css bundles are recreated after file changes.

If you don't see your changes:

Make sure to build the live bundle in production mode.

Check out the bin folder and the package.json file to see the available scripts.

Modify the application code

The JavaScript source is in the app/ folder. Generally, you only need to modify the core/generator.js and the core/methods.js files.

After the changes, restart the build/watch scripts. This process is suboptimal, but currently this is the workflow.

Website content (in the content/ folder)

Publish Website to Netlify

Register at Netlify.com and publish your website

[build]
  base    = "/"
  publish = "public"
  command = "npm run build"

These settings define the base path, the build command, and the publish directory. You can keep them unchanged.

You can also define post-processing actions here to be run as part of Netlify's CI/CD pipeline.

In the optional _headers file you can specify the HTTP headers and set Content Security Policy (CSP) rules for the Netlify server. Currently, CSP rules are commented out. You can also specify these in netlify.toml.

The _redirects file is currently empty. When you have a custom domain, you can add a redirect from your site's Netlify subdomain to your custom domain there.
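For example, a domain-level redirect could look like this; the domains are placeholders, see the Netlify redirects docs for the exact syntax:

# redirect the Netlify subdomain to the custom domain (placeholder domains)
https://your-site.netlify.app/* https://www.your-domain.example/:splat 301!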

robots.txt default settings:

# Disallow admin page
User-agent: *
Disallow: /admin/

# Disallow message-sent page
User-agent: *
Disallow: /message-sent/

# Rule 3
User-agent: *
Allow: /

For Google Search Console verification, you should have an HTML file from Google included in the root of your Netlify publish folder (in our case, public). The build script copies this file from ./content to ./public.

Add the filename to the filesToCopy array at line 100 in ./app/core/generator.js and restart the watch script!
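The array looks roughly like this (a sketch; the entries are examples, check generator.js for the actual list):

// app/core/generator.js (sketch): static files copied from ./content to ./public
const filesToCopy = [
  'robots.txt',
  '_headers',
  '_redirects',
  'google1234567890abcdef.html', // your Search Console verification file (placeholder name)
];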

Netlify builds your website with its buildbot. It starts a Docker container running the Netlify build image.

For folks unfamiliar with Docker

TL;DR: Netlify installs a lot of packages (copies files over) to be able to run your favorite tool to build your static website, and all of this is done in a Docker container. Read the overview section of the Docker docs: https://docs.docker.com/get-started/

A Docker container is basically a writable OverlayFS (FS = filesystem) layer created on top of the numerous read-only OverlayFS layers of the Docker image (files copied on top of each other: each layer represents a command in the Dockerfile). The container layer is destroyed after the build has completed; however, data can be made permanent using volumes, which are kept.

Images are based on base images (the FROM statement in the first line of a Dockerfile), which are special distributions that "think they are operating systems" but are more lightweight than a complete OS.

Alpine Linux is the most lightweight of them (around 5 MB). It is interesting to note that images can be built from scratch as well (scratch is a reserved image that is empty and thus does nothing); the base images are built this way ("FROM scratch").

Docker containers use the kernel and, obviously, the resources of the host (which are shared), and are meant for process isolation only. Containers are more lightweight and don't have the overhead virtual machines do.

VMs are used for full isolation, including resources (for example, to subdivide server resources for shared hosting: each hosting plan gets a certain number of CPUs of a certain type, a certain amount of memory, and a certain amount of storage space), and they have a separate, full OS installed alongside the host OS, so they do not share the kernel.

If you use Windows, you need to install Windows Subsystem for Linux 2 (WSL2) to have a (not fully featured) Linux-kernel-based distro installed (Ubuntu is used most of the time), which runs as a regular application. There are also container base images available for Windows, so Docker can even use the Windows kernel now (for specific images).

Lots of images are pre-built for us (like the netlify/build image) and stored in the Docker registry (not Docker Hub, since that is just a user interface). You don't need to build them from a Dockerfile; you just download them from the registry.

If you know the Docker basics, you can understand some things about Netlify as well. Check these shell scripts out:

When the Docker container fires up, this script runs: https://github.com/netlify/build-image/blob/focal/run-build.sh

This is the Dockerfile from which the Netlify image is built (currently based on ubuntu:20.04, older Ubuntu base images - like 16.04 - are deprecated now): https://github.com/netlify/build-image/blob/focal/Dockerfile

Netlify Forms

Netlify automatically discovers the contact form via custom netlify attributes added to the form. A bot field is present in the form to protect against spam bots. Netlify has a first-class spam filter.
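A minimal form that the build bot can pick up looks roughly like this; a sketch based on the Netlify Forms docs, not this project's exact markup (bot-field is the hidden honeypot input):

<form name="contact" method="POST" data-netlify="true" netlify-honeypot="bot-field">
  <p hidden>
    <label>Don't fill this out: <input name="bot-field" /></label>
  </p>
  <label>Email <input type="email" name="email" required /></label>
  <label>Message <textarea name="message" required></textarea></label>
  <button type="submit">Send</button>
</form>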

Netlify Forms Docs

Decap CMS

Decap CMS Docs

Algolia Search

These are the key parts in the code for Algolia:

const algoliasearch = require("algoliasearch");
const client = algoliasearch(process.env.ALGOLIA_APP_ID, process.env.ALGOLIA_ADMIN_KEY);
const index = client.initIndex(process.env.ALGOLIA_INDEX);

Here, I use the AlgoliaSearch client library to send requests that update and/or create records for the posts:

index.partialUpdateObjects(searchIndexData, {
    createIfNotExists: true,
});

This is currently the structure of the search index (as a default example):

searchIndexData.push({
    /**
     * The object's unique identifier
     */
    objectID: postData.attributes.date,

    /**
     * The URL where the Algolia Crawler found the record
     */
    url: canonicalUrl,

    /**
     * The lang of the page
     * - html[attr=lang]
     */
    lang: config.site.lang,

    /**
     * The title of the page
     * - og:title
     * - head > title
     */
    title: postData.attributes.title,

    /**
     * The description of the page
     * - meta[name=description]
     * - meta[property="og:description"]
     */
    description: postData.attributes.excerpt,

    /**
     * The image of the page
     * - meta[property="og:image"]
     */
    image: config.site.seoUrl + "/assets/images/uploads/" + postData.attributes.coverImage,

    /**
     * The authors of the page
     * - `author` field of JSON-LD Article object: https://schema.org/Article
     * - meta[property="article:author"]
     */
    authors: [config.site.author],

    /**
     * The publish date of the page
     * - `datePublished` field of JSON-LD Article object: https://schema.org/Article
     * - meta[property="article:published_time"]
     */
    datePublished: postData.attributes.date,

    /**
     * The category of the page
     * - meta[property="article:section"
     * - meta[property="product:category"]
     */
    category: postData.attributes.topic || "",

    /**
     * The content of your page
     */
    content: postContents,
});

Note: Currently the objectID is the post publish date (like "2022-08-17"). It might be better to change it to the full slug so it is completely unique; right now you can't have two posts on the same day.
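A hedged sketch of such a change, assuming the slug is derived from the title (slugify is a hypothetical helper, not part of the project):

// hypothetical helper: lowercase the title and replace non-alphanumerics with dashes
const slugify = (text) => text.toLowerCase().trim().replace(/[^a-z0-9]+/g, '-');

// e.g. "2022-08-17" + "Hello World!" -> "2022-08-17-hello-world"
const objectID = `${postData.attributes.date}-${slugify(postData.attributes.title)}`;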

Internationalisation (i18n)

Provided by the i18next package; see assets/js/main.js. Remove this feature if you don't need it. Some texts (like the ones coming from config) cannot be made translatable, so it is probably not a good solution; the code is left there mainly for reference.

The translations come from content/lang/translations.json.
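A stripped-down sketch of how such an i18next setup can look (not the project's exact code; the relative path, the resource shape, and the data-i18n convention are assumptions):

// sketch: initialise i18next with the translations file and translate marked elements
const i18next = require('i18next');
const translations = require('../../content/lang/translations.json'); // assumed path

i18next
  .init({
    lng: document.documentElement.lang || 'en',
    fallbackLng: 'en',
    resources: translations, // e.g. { en: { translation: { ... } }, hu: { translation: { ... } } }
  })
  .then(() => {
    document.querySelectorAll('[data-i18n]').forEach((el) => {
      el.textContent = i18next.t(el.dataset.i18n);
    });
  });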

i18next Docs

Open Hours library for displaying opening hours.

Credits: © Michael Lee (GitHub, his website/blog).

When I started my journey as a web developer, I used Jekyll for my simple websites after reading some articles about it by Michael Lee. He has a great starter for Jekyll, the Jekyll ⍺.

The data comes from content/data/opening-hours.yml. It can be edited from Decap CMS as well.

CHANGELOG

Release 2.2.1 (27 March 2023)

I created the "Barebone" theme (branch: starter/barebone) without Tailwind CSS, with SASS support and some basic styling (nothing has changed in the app folder) It was a mistake to be dependent on one CSS framework. Choose whatever you like for Barebone.

Release 2.2.0 (24 March 2023)

You can use bin/livereload instead of bin/serve. The old Express local server has not been removed.

Release 2.1.2 (23 March 2023)

Release 2.1.1 (30 December 2022)

The project is now in a mature state; there will be no more refactoring, only bug fixes and occasional feature improvements. No breaking changes. I don't want to be part of the "rewrite culture". I'm also not a fan of npm any more, and I use as few packages as possible. Having thousands of interdependent packages (with all the regular rewrites and security issues) is dependency hell.

Delete:

Update:

Release 2.1.0 (16 August 2022)

New feature:

Fix:

Security:

Update:

Release 2.0.0 (14 August 2022)

This is intended to be the last major version release.

New:

Update:

Delete:

New/Update/Delete:

Release 1.0.2 (28 April 2021)

Release 1.0.1 (27 April 2021)

Incorrect configuration in docker-compose.yml:

Release 1.0.0 (25 April 2021) ! breaking change from previous versions !

Correct EJS syntax error after EJS version update:

<%- include ('partial/element-name') %>

This is a breaking change, you should update your partials/templates!
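For reference, the legacy pre-3.x include style that has to be replaced looked like this (no longer supported by EJS):

<% include partial/element-name %>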

Update build and watch scripts (using chokidar):

In 2019, chokidar was not watching file changes properly, so the npm script was named "watch-exp"; the default watch script used nodemon.

Add flow types support and re-structure folders:

Refactor site generator, code improvements, config changes:

Dockerize project:

Useful resources

Known issues

1. Chokidar crashes on Ubuntu 20.04 LTS (12 August 2022)

Bug: Problem with an existing folder.

The build script should always delete the folders inside the public folder. However, the assets folder is sometimes not deleted, so an exception occurs: [Error: EEXIST: file already exists, mkdir './public/assets']
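A possible mitigation, sketched here and not part of the project's current code, is to create the folder with Node's recursive flag, which does not throw when the directory already exists:

const fs = require('fs');

// does not throw EEXIST if ./public/assets already exists
fs.mkdirSync('./public/assets', { recursive: true });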

2. Nodemon was not working properly on Ubuntu (2019)

If you have a problem or a question about static-site-express, open an issue here.

Credits

The idea of using a Node.js static site generator came from this good article by Douglas Matoso (not accessible any more): Build a static site generator in 40 lines with Node.js.

This package uses some modified code parts from doug2k1/nanogen (mainly from the legacy branch and some ideas from the master branch, MIT © Douglas Matoso 2018).

Licence

MIT licence - Copyright © 2018-2024 András Gulácsi.