In my last post, we talked about how modules work in Deno, but we left out some important details, and there may be some misunderstandings (I had a few myself) related to caching. To understand caching, though, we first need to learn about Deno's standard library and third-party modules.
How does the Deno Standard Library work?
The idea of a standard library is inspired by languages like Go, and many languages like C and Python have one. Ryan Dahl, the creator of both Node and Deno, originally prototyped Deno in Go but ultimately decided Rust was the way to go.
The Deno standard library (https://deno.land/std) is officially approved by the Deno core team and has no external dependencies outside of the standard library itself. All of its dependencies are internally contained inside each "package".
In the last article we discussed this command further, but I'd like to demonstrate it here so we can see all the dependencies of any standard library module. If you were to open a terminal and run:
[14:15:21] ~ 🦖 deno info 'https://deno.land/std/async/mux_async_iterator.ts'
Download https://deno.land/std/async/mux_async_iterator.ts
Warning Implicitly using master branch https://deno.land/std/async/mux_async_iterator.ts
Download https://deno.land/std/async/deferred.ts
Warning Implicitly using master branch https://deno.land/std/async/deferred.ts
Check https://deno.land/std/async/mux_async_iterator.ts
local: /Users/USERNAME/Library/Caches/deno/deps/https/deno.land/8d6fb8f6ecd39f7dfcd59c4c06f6235b4713e13e4c29c4bf66f84c920dd69a96
type: TypeScript
compiled: /Users/USERNAME/Library/Caches/deno/gen/https/deno.land/std/async/mux_async_iterator.ts.js
deps:
https://deno.land/std/async/mux_async_iterator.ts
└── https://deno.land/std/async/deferred.ts
Deno will download the file, compile it, and then show its "deps".
In the deps output above you can see that it has a single dependency, https://deno.land/std/async/deferred.ts.
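The long hexadecimal file name in the local: path above is a cache entry derived from the module's URL. As a rough illustration, here's a hypothetical sketch of how a remote-module cache can map a URL to a local file name (cachePathFor is invented for this example, and Deno's real scheme differs in its details; the cache root shown is just the macOS example from the output):

```typescript
// Hypothetical sketch: map a remote module URL to a cache file path by
// hashing it. This is an illustration, not Deno's actual algorithm.
import { createHash } from "node:crypto";

function cachePathFor(url: string, cacheRoot: string): string {
  const { protocol, hostname, pathname } = new URL(url);
  const scheme = protocol.slice(0, -1); // "https:" -> "https"
  // Hash the path so any URL, however long, becomes a fixed-size file name.
  const hash = createHash("sha256").update(pathname).digest("hex");
  return `${cacheRoot}/deps/${scheme}/${hostname}/${hash}`;
}

console.log(
  cachePathFor(
    "https://deno.land/std/async/mux_async_iterator.ts",
    "/Users/USERNAME/Library/Caches/deno",
  ),
);
```

The useful property is that the same URL always maps to the same local file, so a second run can find the cached copy without touching the network.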
We can trust this because it has no third-party code; as you can see, its only dependency comes from deno.land/std itself.
This is maintained and sets a "standard" across Deno, meaning not only can we trust it, but other modules can use it to extend their capabilities or do something entirely different. This leads directly into the next topic: third-party modules.
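To see what "no external dependencies" means in practice, here's a small hypothetical check over the module graph printed above. The deps map is hand-written from the deno info output (in real use you could generate one with deno info --json); allInsideStd and the shape of the map are inventions for this illustration:

```typescript
// Hypothetical sketch: assert that every dependency in a module graph
// stays inside https://deno.land/std/ (the standard library's guarantee).
const deps: Record<string, string[]> = {
  "https://deno.land/std/async/mux_async_iterator.ts": [
    "https://deno.land/std/async/deferred.ts",
  ],
  "https://deno.land/std/async/deferred.ts": [], // leaf module, no deps
};

const allInsideStd = Object.values(deps)
  .flat()
  .every((url) => url.startsWith("https://deno.land/std/"));

console.log(allInsideStd); // true
```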
What if the standard library goes down? Wouldn't that break everything? Nope, but to learn more about this, you'll need to keep reading.
What are third party modules in Deno and why is there a distinction?
As we just discussed, the standard library is maintained by the Deno core team, but the third-party modules are not. These are developed by the Deno community.
Deno uses ES modules (ES6 import syntax) and doesn't understand CommonJS (require statements). It also requires a file extension on every import. Node now supports import statements, but it doesn't require the file extension, so using Node modules in Deno is possible but requires some configuration changes. In fact, many of the third-party modules in Deno are forks of Node modules.
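To make the extension rule concrete, here's a hypothetical sketch contrasting Deno's exact-specifier resolution with Node's classic require() lookup (candidatePaths is invented for this illustration and is not a real API in either runtime):

```typescript
// Hypothetical sketch of module resolution. Deno loads exactly the
// specifier you wrote; Node's classic require() tried several candidates.
function candidatePaths(specifier: string, nodeStyle: boolean): string[] {
  if (!nodeStyle) return [specifier]; // Deno: the extension is mandatory
  // Node require(): try the bare path, then append .js, then index.js.
  return [specifier, `${specifier}.js`, `${specifier}/index.js`];
}

console.log(candidatePaths("./logger.ts", false));
console.log(candidatePaths("./logger", true));
```

This is why a Deno import like import { x } from "./logger" fails while "./logger.ts" works: there is no guessing step.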
To that end, anyone can create a third-party module. If you'd like to submit your own, you can follow the instructions at the top of the page at https://deno.land/x, and there is a recommended style guide: https://deno.land/manual/contributing/style_guide.
Up to this point, we've been under the impression that Deno doesn't require downloading packages/modules to your computer and then requiring them like we did with Node. While that's somewhat true, there is a catch, and we'll get into it in the next section.
What is the Deno cache?
If you've been following along with this Deno series I've been writing, you have cached a few files already, but for demonstration purposes I'll use the standard library.
If you were to select a standard library module and run it for the first time, you'll see something like this:
It's downloading (also known as caching). Why though?
Do you see how the second time I run that command, deno run 'https://deno.land/std/log/logger.ts', it doesn't download? That's because it's running from a file stored on your local computer.
Where did it download (cache) to?
Deno caches remote imports in a special directory specified by the DENO_DIR environment variable. If DENO_DIR is not set, it defaults to the system's cache directory. The next time you run the program, no downloads will be made; if the program hasn't changed, it won't be recompiled either. The default directories are:
- On Linux/Redox: $XDG_CACHE_HOME/deno or $HOME/.cache/deno
- On Windows: %LOCALAPPDATA%/deno
- On macOS: $HOME/Library/Caches/deno
- If something fails, it falls back to $HOME/.deno
DENO_DIR defaults to $HOME/.cache/deno but can be set to any path to control where generated and cached source code is written and read.
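The lookup order described above can be sketched as follows (resolveDenoDir is hypothetical, invented for this example; the per-OS fallback paths follow the Deno manual):

```typescript
// Hypothetical sketch: honor $DENO_DIR if set, otherwise fall back to
// the per-OS cache directory described in the Deno manual.
import process from "node:process";

function resolveDenoDir(
  env: Record<string, string | undefined>,
  platform: string,
): string {
  if (env.DENO_DIR) return env.DENO_DIR; // explicit override wins
  const home = env.HOME ?? "";
  switch (platform) {
    case "darwin": // macOS
      return `${home}/Library/Caches/deno`;
    case "win32": // Windows
      return `${env.LOCALAPPDATA}\\deno`;
    default: // Linux/Redox
      return env.XDG_CACHE_HOME
        ? `${env.XDG_CACHE_HOME}/deno`
        : `${home}/.cache/deno`;
  }
}

console.log(resolveDenoDir(process.env, process.platform));
```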
If you delete the files located in that directory, the next time a module is run, Deno will download (cache) it again.
If a module has changed since it was cached, you can just pass the --reload flag to the deno command:
deno run --reload 'https://deno.land/std/log/logger.ts'
Alternatively, to force caching without running the module, you can use the cache subcommand in place of run:
deno cache 'https://deno.land/std/log/logger.ts'
This is great if you are going to be offline, and it has some really useful implications for server security, since you can ensure you are only pulling code you have already vetted.
There is a way to change where Deno caches these files.
How do I change where Deno caches?
Where your cache is located, which is specified by the $DENO_DIR environment variable, can be changed. First, run echo $DENO_DIR to see where it is currently set.
In your development directory, run mkdir DENO_CACHE. Then set your $DENO_DIR environment variable to that directory using the following command:
echo 'export DENO_DIR=~/Desktop/Development/DENO_CACHE' >> ~/.zshenv
This still won't take effect in the current shell, so we need to source our .zshenv with source ~/.zshenv.
Now, check that the directory has changed by running echo $DENO_DIR again. It should show the new directory you created!
But what if the host of the URL I'm downloading these from goes down? Won't the source be unavailable?
This is a problem any remotely sourced dependency system will run into. While using external servers is quick for development, it's terrible for production environments because downtime costs money. Any production environment should source its own dependencies.
In Node, we do this by checking node_modules into the repository. In Deno, we just point $DENO_DIR to a project-local directory at runtime and check that directory into the repository.
What does this mean? It means there is no free lunch: caching dependencies in Deno is not really all that different from downloading an NPM package into your node_modules directory, and it's still going to take up space on your hard drive.
Does this mean NPM is dead?
So after we have cached the files into a directory local to our project, we can add them to our git repository using git add -A DENO_CACHE && git commit -m "✅ deno modules", and that way you don't have to install them again. After we do this, we no longer have to download dependencies, and we also have no need for NPM. There is also no central registry you're required to publish to, even if you wanted one (although you can index your modules on deno.land).
With Microsoft's purchase of GitHub, and GitHub's more recent purchase of NPM, the long-term future of NPM is a little blurry for me. Especially with things like this:
While I seriously doubt NPM will go anywhere any time soon, the faster enterprises adopt Deno, the faster that outcome arrives. Maybe in the next 5-10 years? Who knows?
We learned about the standard library, third party modules, and about how caching works in Deno. In my next article we will look at module version control.
If you found this article helpful, give me a shout. I'd love to hear from you in the comments below or on Twitter @drewlearns2.
As always, if you found any errors, just highlight it and mash that “R” button on the right side of the screen and I’ll get those fixed right up!