
How to Create JavaScript Libraries (Part 2)

Mateusz Burzyński in Programming, on March 07, 2018

This is the second part of the How to Create JavaScript Libraries in 2018 series. The first one can be found here: How to Create JavaScript Libraries in 2018 (Part 1 - Basic).

In the previous post, we examined the configuration process and the basic use cases of Babel and Rollup. In this part, we’ll move on to more advanced use cases.

Proxy directories

What happens if we want our library to support several “entry points”? For example, we may have a library exposing a React component that should also allow importing an alternative version of that component (say, for Preact). We want the users to be able to do either this:

import FooComponent from 'foo'

or this:

import FooComponent from 'foo/preact'

Sounds easy: it would be enough to create a file or a directory called “preact” in the root of our library and let the resolving algorithms find it. The problem with this solution, however, is that it would give users only one module format, while we want to provide two (ESM and CJS).

What to do, then? Use proxy directories!

This simple yet surprisingly little-known technique requires creating a directory named after our alternative entry point (in this case it will be “preact”) with the following package.json file:

{
  "name": "foo/preact",
  "private": true,
  "main": "../dist/foo-preact.js",
  "module": "../dist/foo-preact.es.js"
}

And that’s it! You can see this method in action in redux-saga/effects:

{
  "name": "redux-saga/effects",
  "private": true,
  "main": "../lib/effects.js",
  "module": "../es/effects.js",
  "jsnext:main": "../es/effects.js"
}

or in styled-components/primitives:

{
  "name": "styled-components/primitives",
  "private": true,
  "main": "../dist/styled-components-primitives.cjs.js",
  "module": "../dist/styled-components-primitives.es.js",
  "jsnext:main": "../dist/styled-components-primitives.es.js"
}

You can even check out how to create multiple proxies programmatically with a script in funkia/list:

const fs = require('fs')
const path = require('path')
const { promisify } = require('util')
const pkg = require('../package.json')

const readDir = promisify(fs.readdir)
const mkDir = promisify(fs.mkdir)
const writeFile = promisify(fs.writeFile)

const removeExt = (ext, str) => path.basename(str, `.${ext}`)

// package.json contents for a single proxy directory
const fileProxy = file => `{
  "name": "${pkg.name}/${removeExt('js', file)}",
  "private": true,
  "main": "../dist/${file}",
  "module": "../dist/es/${file}"
}
`

// generate one proxy directory per bundled file (except index.js)
async function processDir(dir) {
  const files = (await readDir(dir)).filter(
    file => /\.js$/.test(file) && file !== 'index.js'
  )
  return await Promise.all(
    files.map(async file => {
      const proxyDir = removeExt('js', file)
      await mkDir(proxyDir).catch(() => {})
      await writeFile(`${proxyDir}/package.json`, fileProxy(file))
      return proxyDir
    })
  )
}

processDir('dist').then(proxies =>
  console.log(`Proxy directories (${proxies.join(', ')}) generated!`)
)
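
To keep the generated directories in sync with the build output, a script like this could be run after every build. A minimal sketch using npm lifecycle scripts (the script path is hypothetical):

{
  "scripts": {
    "build": "rollup -c",
    "postbuild": "node ./scripts/create-proxy-directories.js"
  }
}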

So far so good. The last thing to do is to build our files. Let’s add some extra config to rollup.config.js:

import babel from 'rollup-plugin-babel'
import pkg from './package.json'

const externals = [
  ...Object.keys(pkg.dependencies || {}),
  ...Object.keys(pkg.peerDependencies || {}),
]

// treat dependencies and peerDependencies (including deep imports like "pkg/sub") as external
const makeExternalPredicate = externalsArr => {
  if (externalsArr.length === 0) {
    return () => false
  }
  const externalPattern = new RegExp(`^(${externalsArr.join('|')})($|/)`)
  return id => externalPattern.test(id)
}

export default [
  {
    input: 'src/index.js',
    external: makeExternalPredicate(externals),
    plugins: [babel({ plugins: ['external-helpers'] })],
    output: [
      { file: pkg.main, format: 'cjs' },
      { file: pkg.module, format: 'es' },
    ],
  },
  {
    input: 'src/preact.js',
    external: makeExternalPredicate(externals),
    plugins: [babel({ plugins: ['external-helpers'] })],
    output: [
      { file: 'dist/foo-preact.js', format: 'cjs' },
      { file: 'dist/foo-preact.es.js', format: 'es' },
    ],
  },
]

Done! This method works fine when our “entry points” are completely independent. What if, however, it should be possible to use several entry points in one application at the same time? How do we make sure that the code is not duplicated? This is where experimentalCodeSplitting comes in handy.

experimentalCodeSplitting

experimentalCodeSplitting is a fresh Rollup feature which abstracts common dependencies into a single file and adds the appropriate imports and requires to the files that need those dependencies:

// ...

export default {
  experimentalCodeSplitting: true,
  input: ['src/index.js', 'src/preact.js', 'src/storage.js'],
  external: makeExternalPredicate(externals),
  plugins: [babel({ plugins: ['external-helpers'] })],
  output: [{ dir: 'lib', format: 'cjs' }, { dir: 'es', format: 'es' }],
}

For a full reference and complete manual, refer to the Experimental Code Splitting docs.

Note that if we need to generate more than one output version for the same set of input files (as we do in every example here), we should specify a different output directory for each format.

We should also make sure that each input file has a unique file name so output files can have stable and predictable names (if we bundled src/index.js and src/preact/index.js, we’d end up with index.js and index2.js files).
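
With such per-format directories, the proxy package.json files would simply point there instead of dist. A sketch for the preact entry point, assuming Rollup names the output chunks after the input files:

{
  "name": "foo/preact",
  "private": true,
  "main": "../lib/preact.js",
  "module": "../es/preact.js"
}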

Config fatigue

The more configuration needs our file has to cover, the heavier it gets. To avoid bloated files and code repetition, we can use a simple function that generates configurations based on the parameters we pass to it. A good example here is the react-textarea-autocomplete configuration file:

const createConfig = ({ umd = false, output } = {}) => ({
  input: 'src/index.js',
  output,
  external: [
    ...Object.keys(umd ? {} : pkg.dependencies || {}),
    ...Object.keys(pkg.peerDependencies || {}),
  ],
  plugins: [
    babel(),
    resolve(),
    commonjs({ extensions: ['.js', '.jsx'] }),
    umd && uglify(),
    license({
      banner: {
        file: path.join(__dirname, 'LICENSE'),
      },
    }),
  ].filter(Boolean),
})
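
Such a factory can then be used to export several configurations at once. A minimal sketch of how it might be called (the UMD file name and the global name Foo are assumptions):

export default [
  createConfig({ output: { file: pkg.main, format: 'cjs' } }),
  createConfig({ output: { file: pkg.module, format: 'es' } }),
  createConfig({
    umd: true,
    output: { file: 'dist/foo.umd.min.js', format: 'umd', name: 'Foo' },
  }),
]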

Browser

When our library targets both server and browser environments, it may need to know where exactly it is running. One way to do this is to use a couple of “if” statements that sniff the environment for global variables, global objects or certain behavior. This solution, however, is rather clunky, heavyweight (the code for every environment gets shipped to all of them) and unreliable. But fear not: there are two better options:

1. Build-time variables

Adding a magic variable like __SERVER__ or __BROWSER__ lets us create separate code versions for various use cases. styled-components does this nicely:

[Image: Variables in styled-components]

To use such variables, we have to install rollup-plugin-replace and add it to the plugins section of our configuration file:

replace({ __SERVER__: JSON.stringify(false) })

And, of course, we’ll need an extra configuration object for each variant, so that the variable is replaced with the correct value in each output file.
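
A minimal sketch of what that could look like (the file names are assumptions): each configuration replaces __SERVER__ with a different constant and writes to different files:

import babel from 'rollup-plugin-babel'
import replace from 'rollup-plugin-replace'

const createEnvConfig = isServer => ({
  input: 'src/index.js',
  plugins: [
    babel({ plugins: ['external-helpers'] }),
    // bake the target environment into the bundle at build time
    replace({ __SERVER__: JSON.stringify(isServer) }),
  ],
  output: isServer
    ? [
        { file: 'dist/foo.js', format: 'cjs' },
        { file: 'dist/foo.es.js', format: 'es' },
      ]
    : [
        { file: 'dist/foo.browser.js', format: 'cjs' },
        { file: 'dist/foo.browser.es.js', format: 'es' },
      ],
})

export default [createEnvConfig(true), createEnvConfig(false)]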

2. Alternative files

Another approach is using alternative files for the server and browser environments, like this rng.js and this rng-browser.js.

As long as we provide the same API for both versions, our users won’t even have to know that the code differs between environments. To make the bundling process easier, I suggest a slightly different naming convention: instead of using a hyphen (file-browser.js), use a dot (file.browser.js).

Let’s add rollup-plugin-node-resolve to our project and use it in an alternative configuration to output files for web application bundlers. Copy the existing config file and add this to the plugins:

nodeResolve({ extensions: ['.browser.js', '.js'] })
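
In the copied config, essentially only the plugins and the output file names change. A sketch, assuming the naming convention used throughout this post:

{
  input: 'src/index.js',
  external: makeExternalPredicate(externals),
  plugins: [
    // resolve file.browser.js instead of file.js whenever such a variant exists
    nodeResolve({ extensions: ['.browser.js', '.js'] }),
    babel({ plugins: ['external-helpers'] }),
  ],
  output: [
    { file: 'dist/foo.browser.js', format: 'cjs' },
    { file: 'dist/foo.browser.es.js', format: 'es' },
  ],
}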

Regardless of which method we select, we should end up with four output files:

dist/foo.js
dist/foo.es.js
dist/foo.browser.js
dist/foo.browser.es.js

The last step is adding a new field to our main package.json (even if we combine this technique with “proxy directories”, we need to add this only in the root package.json). Each file that can be imported from our library and has a different implementation per environment should have a corresponding mapping, so that the alternative file can be loaded instead of the original one:

"main": "dist/foo.js",
"module": "dist/foo.es.js",
"browser": {
  "./dist/foo.js": "./dist/foo.browser.js",
  "./dist/foo.es.js": "./dist/foo.browser.es.js",
  // optionally - more "rewrite rules" if you use "proxy directories"
  "./dist/foo-preact.js": "./dist/foo-preact.browser.js",
  "./dist/foo-preact.es.js": "./dist/foo-preact.browser.es.js"
}

React Native

If we want our library to support React Native, but the code for this platform differs from the code for node or for browsers, the best tactic is the second technique mentioned above (“Alternative files”). The Metro bundler used by React Native recognizes the .native.js extension (as well as .ios.js and .android.js, which take precedence depending on the platform), so it’s enough to create a new Rollup config with a plugin set up like this:

nodeResolve({ extensions: ['.native.js', '.js'] })

If we make sure that the output file has the same name as the file from “main”, only with a .native.js extension (so in our example it will be dist/foo.native.js), we won’t have to add anything to package.json, provided we leave “main” without an extension. Still, it might be a good idea to add an extra entry to package.json:

"main": "dist/foo.js",
"module": "dist/foo.es.js",
"react-native": "dist/foo.native.js",

Note: React Native doesn’t support ES modules, so it’s enough to build a single output file in CJS format.
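
Putting it together, the React Native configuration could look roughly like this (file names follow the convention above):

{
  input: 'src/index.js',
  external: makeExternalPredicate(externals),
  plugins: [
    // prefer .native.js files when they exist
    nodeResolve({ extensions: ['.native.js', '.js'] }),
    babel({ plugins: ['external-helpers'] }),
  ],
  output: [{ file: 'dist/foo.native.js', format: 'cjs' }],
}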

Finally, if we want to have separate sets of files for iOS and Android, we need two different configurations. Each should transpile the files to CJS format, but resolve different extensions:

nodeResolve({ extensions: ['.ios.js', '.native.js', '.js'] })

and:

nodeResolve({ extensions: ['.android.js', '.native.js', '.js'] })

Again, the output files should share their base name with the file from “main”: dist/foo.ios.js and dist/foo.android.js, respectively. In that case, you should consider leaving “main” without an extension, or using the “react-native” key without an extension, so the appropriate file can get loaded.

.mjs

It’s possible that in the future node.js will support an alternative extension (.mjs) for ESM modules (note: this is one of the proposals, and it is not yet set in stone that it will be the recommended way of authoring ESM packages). However, we should keep supporting older node versions (at least the LTS ones) for some time yet. If we want to use native ESM modules in node, it’s best to output an additional file just for node. It should have an .mjs extension, and node should prefer this file over the .js one. In this scenario, you definitely should leave your “main” without an extension, so each node.js version can choose which file to load.
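
A hedged sketch of the corresponding output options (file names assumed): the same ESM code is emitted twice, once for bundlers and once for node’s native ESM support:

output: [
  { file: 'dist/foo.js', format: 'cjs' },
  { file: 'dist/foo.es.js', format: 'es' },
  { file: 'dist/foo.mjs', format: 'es' },
]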

process.env.NODE_ENV

In the pursuit of keeping the file size small, we shouldn’t forget about usability. Don’t be afraid to add helpful warnings, as they won’t hurt your library’s performance. A good practice is wrapping development-only code as follows:

const warn = msg => console.warn(msg)

if (process.env.NODE_ENV !== 'production') {
  if (props.render && props.children) {
    warn(
      "It doesn't make sense to use both props.render and props.children at once, props.children will be used."
    )
  }
}

Most users have configured their bundlers to replace process.env.NODE_ENV with the value of the corresponding environment variable. Thanks to this, a minifier (e.g. UglifyJS) can detect “constant conditions” (in a production build the check becomes "production" !== "production", which evaluates to false) and remove the code that can never execute at runtime.
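
To illustrate, in a consumer’s production bundle the snippet above effectively becomes something like this before minification, and the whole block disappears after it:

const warn = msg => console.warn(msg)

if ("production" !== "production") {
  // constant, always-false condition - the minifier drops this entire block
  if (props.render && props.children) {
    warn(
      "It doesn't make sense to use both props.render and props.children at once, props.children will be used."
    )
  }
}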

Note: The aim here is to remove as much dead code as possible, so we should wrap the redundant pieces at the highest possible level. If, for example, we moved the check inside a helper function (e.g. warn), the development-only code would not be fully removed from the production build: the minifier would remove only the body of the warn function, turning it into a noop but leaving the function itself and all calls to it intact.
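
For comparison, here is the anti-pattern described above: with the check moved inside the helper, only its body can be stripped, while the function and every call to it survive minification:

// the NODE_ENV check lives inside the helper...
const warn = msg => {
  if (process.env.NODE_ENV !== 'production') {
    console.warn(msg)
  }
}

// ...so this call stays in the production bundle as a call to an empty function
warn(
  "It doesn't make sense to use both props.render and props.children at once, props.children will be used."
)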

Conclusion

Even though setting up the tooling for a library sounds trivial, we have just seen how complicated it really is. Hopefully, Microbundle (a wrapper around Rollup) will reduce the need for such extensive configuration by hiding most of it behind a couple of configuration options and naming conventions. However, it’s still good to know how the whole process works behind the scenes, because no automation tool can handle every use case we can think of.
