File IO

Web First Gamedev - Part 4

File IO is one of the areas where the web diverges furthest from what I’m familiar with when developing native games. I’m used to having fopen or similar just give me straightforward filesystem access for arbitrary reading and writing. On the web there are a variety of options for reading data, and quite a spectrum of different ways to store it. I’m only going to look at local storage for writing; I’m gonna call server side storage out of scope for this post.

The storage options I’m aware of are web resources, cache, cookies, local storage, session storage, indexeddb, and direct filesystem access. It would be impossible to usefully cover all these mechanisms, so I’m going to try to tackle the primary use cases I want to have support for.

Loading Assets

The first file operation I would expect to need is being able to load some asset into game memory. This could be a config file, a file containing the description of a level, or any other asset.

I’m going to avoid explicitly discussing loading images and audio assets. The browser has constructs that provide special support for these types, so they will be covered in the upcoming graphics and audio entries in this series.


One of the simplest ways of embedding textual data is by just placing it in a script tag:


<script type="text/plain" id="text-asset">This is a simple text string!</script>
function loadAsset() {
    let display = document.getElementById('asset-display');
    let asset = document.getElementById('text-asset').innerText;
    display.innerText = asset;
}

The contents are accessible via the innerText property of the element. There are limitations on the contents of the tag, but it should work for most plaintext data.

For any data, binary or textual, base64 allows us to store arbitrary data in a script tag:


<script type="text/plain" id="base64-asset">AAECAwQF...</script>
function loadAsset() {
    let display = document.getElementById('asset-display');

    let asset = atob(document.getElementById('base64-asset').innerText);
    let assetBytes = Uint8Array.from(asset, c => c.charCodeAt(0));

    display.innerText = "";
    for (const b of assetBytes) {
        display.innerText += `[${b}]`;
    }
}

This example embeds a binary blob, containing 0x00 - 0xFF, as a base64 string. The file was generated with a short script and then encoded as base64 using openssl base64. The loadAsset function decodes the value of the script tag and prints every byte in order.

This approach is really straightforward, but there are pretty significant downsides. The main one is that any assets embedded this way have to be completely downloaded before the DOMContentLoaded event will fire, which makes it much harder to know when it’s safe to start executing the game logic. The other is that it’s not particularly efficient: the base64 version of the data is larger than the original, and decoding and converting to an ArrayBuffer character by character is less than ideal for large data sets.

I suspect this approach could be used to embed assets required to bootstrap a loading screen, offline indicator, or similar.

Fetch API

If the assets aren’t embedded in the webpage, then they’ll need to be fetched from the web server explicitly. This is done with the fetch API. This API is very flexible and supports a wide range of options. At its most basic it just takes a URL and returns a promise that resolves to the response of the request.

The response object supports a variety of methods for interpreting the result as a specific type. Here I fetch a text file and request the contents as a string using the text function:


async function loadAsset() {
    let response = await fetch('text_asset.txt');
    let text = await response.text();
    let display = document.getElementById('asset-display');
    display.innerText = text;
}

And similar to the embedding example above, in the next example I request a binary file and convert the result to an ArrayBuffer using the arrayBuffer function on the response:


async function loadAsset() {
    let response = await fetch('binary_asset.bin');
    let assetBuffer = await response.arrayBuffer();
    let assetBytes = new Uint8Array(assetBuffer);

    let display = document.getElementById('asset-display');
    display.innerText = "";
    for (const b of assetBytes) {
        display.innerText += `[${b}]`;
    }
}

Noteworthy is that fetch is asynchronous. This means I can’t rely on blocking asset access. With native gamedev I can fopen, read in a small asset, and process the data all at once. As long as the asset I’m loading is small it just results in a longer than normal frame, which is largely fine during a loading screen. Doing asset loading asynchronously is the better way to go about things anyway (and far better for loading larger assets), so this is a push in the right direction.

Save Games / Settings

Persisting settings changes and adding save game functionality would be the two main cases for writing to disk normally. I’m tackling them in the same section because for the most part they’d have similar life cycles and access patterns.


Cookies

Cookies are where my mind first jumps to when I think of the browser persisting some state on behalf of a website between visits. Unfortunately I don’t think they’re appropriate in this case. They primarily provide a way to add extra state to requests made against the server. As a consequence:

  • Cookies are limited in size - roughly 4KB per cookie and ~20 cookies per domain is the limit from what I can see
  • Cookies are sent to the web server with every request. We really don’t want to receive the save file contents with every request for an asset.

Local Storage

Local storage is a simple string key/value store. Values written to it will persist until the user empties out their browser state, or it’s forced out by disk pressure. Capacity numbers vary by browser and I haven’t been able to track down concrete numbers, but it looks like I can rely on being able to store 5-10 megs in total, and a couple of megs per key.


function saveString() {
    let input = document.getElementById('string-input');
    let stringToSave = input.value;
    localStorage.setItem("saved_text", stringToSave);
}

function loadString() {
    let input = document.getElementById('string-input');
    let loadedString = localStorage.getItem("saved_text");
    input.value = loadedString;
}

Local storage is really straightforward to use. Once I have the string I need to save, it’s a one-liner to persist it, and it’s similarly straightforward to delete it.

For binary data it’s not that much more involved; we just need to encode the data for storage as a string using base64. The following demo takes a number input, places it in an Int32Array, and persists the array to local storage.


function saveBinary() {
    let input = document.getElementById('number-input').value;
    let intArray = new Int32Array(1);
    intArray[0] = parseInt(input, 10);

    let byteArray = new Uint8Array(intArray.buffer);
    let stringToSave = '';
    for (let i = 0; i < byteArray.byteLength; i++) {
        stringToSave += String.fromCharCode(byteArray[i]);
    }
    localStorage.setItem('saved_binary', btoa(stringToSave));
}

function loadBinary() {
    let input = document.getElementById('number-input');
    let loadedString = localStorage.getItem('saved_binary');

    let byteArray = Uint8Array.from(atob(loadedString), c => c.charCodeAt(0));
    let intArray = new Int32Array(byteArray.buffer);

    input.value = intArray[0];
}

This works and isn’t too complicated, but it’s really not ideal. For one, combining the base64 overhead with the UTF-16 encoding of the stored value, 1KB of binary data becomes ~2.6KB of saved data. The other issue is that decoding from a base64 string to an array buffer requires us to process the data bytewise in javascript.

I suspect the natural use for local storage is purely textual data. JSON has good support by default, but any textual format would work. So for something like game settings persistence, or small to moderately sized save files, local storage with JSON seems like a really nice solution and probably what I’ll be using most often.
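As a concrete sketch of that settings use case - the key name and the fields of the settings object are just examples I made up - the save/load pair is little more than JSON.stringify and JSON.parse around the local storage calls. Merging the loaded value over a set of defaults means settings added in later versions of the game still get a sensible value:

```javascript
// Sketch of JSON settings persistence. 'game_settings' and the fields in
// defaultSettings are hypothetical. Falls back to an in-memory store when
// localStorage isn't available (e.g. outside the browser).
const settingsStore = (typeof localStorage !== 'undefined') ? localStorage : (() => {
    const backing = new Map();
    return {
        setItem: (key, value) => backing.set(key, String(value)),
        getItem: (key) => backing.has(key) ? backing.get(key) : null,
    };
})();

const defaultSettings = { musicVolume: 1.0, sfxVolume: 1.0, fullscreen: false };

function saveSettings(settings) {
    settingsStore.setItem('game_settings', JSON.stringify(settings));
}

function loadSettings() {
    let raw = settingsStore.getItem('game_settings');
    if (raw === null) { return { ...defaultSettings }; }
    // Spread the saved values over the defaults so missing fields fall back gracefully.
    return { ...defaultSettings, ...JSON.parse(raw) };
}
```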


IndexedDB

The other option for persistent storage is IndexedDB. It is far more involved to use than local storage, but it has two significant features that may be needed in some cases:

  • It can store a larger amount of data. Again I’m having a hard time getting concrete numbers on what the limits are, but it seems I can expect at least tens of megs of storage.
  • Support for storing most javascript values directly. This means no expensive base64 conversions - I can just store the binary data as-is.

Even more so than fetch above, covering IndexedDB fully is well beyond the scope of this article, but the demo below shows saving and loading a string and a binary blob:


let db;

window.addEventListener('DOMContentLoaded', (event) => {
    let dbRequest ='indexeddb_demo', 1);
    dbRequest.onsuccess = function (event) {
        db = event.target.result;
        console.log('IndexedDB Connected');
    };
    dbRequest.onupgradeneeded = function (event) {
        let db = event.target.result;
        let store = db.createObjectStore('saves', { keyPath: 'slot' });
        console.log('IndexedDB Initialized');
    };
});

The first step of using IndexedDB is getting the connection to the DB, and setting up the DB if needed. A DB contains stores. A store has a main key, and optionally additional indexes. Since the values stored in the db are all javascript objects, the key isn’t a column name like you’d have in an SQL db. Instead it’s the path to the value in the javascript object. In this case I called the key slot, imagining I’m writing a save system that supports multiple slots.

function saveState() {
    if (!db) { return; }

    let transaction = db.transaction('saves', 'readwrite');

    let stringInput = document.getElementById('string-input');
    let stringToSave = stringInput.value;

    let numberInput = document.getElementById('number-input');
    let intArray = new Int32Array(1);
    intArray[0] = parseInt(numberInput.value, 10);

    let store = transaction.objectStore('saves');
    let putRequest = store.put({ slot: 0, stringValue: stringToSave, binaryValue: intArray });
    putRequest.onsuccess = function (event) {
        console.log('State Saved');
    };
}

To save state I create a transaction against the store. I need to specify the store twice in this function: the first time to create the transaction, which could span a set of stores; the second to grab a specific store from the transaction to make requests against. These operations are async, so to confirm the state was written I attach a function to the onsuccess callback of the put request.

Both the text and binary data are just added directly to the object to be stored, no processing is needed to make these values storable.

function loadState() {
    if (!db) { return; }

    let transaction = db.transaction('saves', 'readonly');

    let store = transaction.objectStore('saves');
    let getRequest = store.get(0);
    getRequest.onsuccess = function (event) {
        let stringInput = document.getElementById('string-input');
        let numberInput = document.getElementById('number-input');
        stringInput.value = event.target.result.stringValue;
        numberInput.value = event.target.result.binaryValue[0];

        console.log('State Loaded', event);
    };
}

Loading is quite similar to writing. I start the transaction, grab a handle to the store, and create the get request. I attach a function to the onsuccess callback, and then I can use the returned data without any processing.

This is definitely more involved than the effective one-liner of local storage, but it also supports more involved data structures and workflows. If I need to store more complicated save state, especially if it’s structured so that I’d want to query for parts of it easily, I could easily imagine myself using IndexedDB. A use case might be a game with a large world, where the state for different regions is stored under different keys.
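One way to tame the callback style - this helper is my own addition, not part of the demos above - is to wrap each IDBRequest in a Promise so IndexedDB access composes with async/await:

```javascript
// Wrap an IDBRequest-style object (onsuccess/onerror handlers, result/error
// properties) in a Promise.
function promisifyRequest(request) {
    return new Promise((resolve, reject) => {
        request.onsuccess = () => resolve(request.result);
        request.onerror = () => reject(request.error);
    });
}

// Usage sketch against the 'saves' store from the demos:
async function loadSlot(db, slot) {
    let transaction = db.transaction('saves', 'readonly');
    let store = transaction.objectStore('saves');
    return promisifyRequest(store.get(slot));
}
```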

Browser Tooling Support

It’s worth noting that both Firefox and Chrome (and I assume others) have full support for inspecting and manipulating the data stored in both local storage and IndexedDB. I’ve found it extremely useful to have these tools open while doing these investigations.

For Chrome these tools are under the Application tab, and in Firefox they’re on the Storage tab.

User-Made Assets

The previous sections covered accessing data I’ve authored, either during development in the case of loading assets or during runtime in the case of settings and save files. This section is about getting access to data that the player may have available to them locally, maps they may have downloaded, scripts they’ve written themselves, etc.

File API

Since web browsers have APIs for nigh on everything, there’s an API that gives us access to local files - the File API. There is also the File System Access API, but that’s not supported in Firefox so I’m avoiding it.

Using the File API is fairly straightforward. By using an input element with type file, the player can either browse to or drag and drop in the file they want the game to read. The onchange event on the element gives access to a File object, which can be used with a FileReader to read in the contents of the file as a string, data URL, or ArrayBuffer.

In the following demo the contents of the file are read in as a string and inserted into the page:


<input id="file-input" type="file" onchange="loadFile(event)">
function loadFile(event) {
    let file =[0];

    let reader = new FileReader();
    reader.onload = function (e) {
        document.getElementById('file-contents').innerText =;
    };
    reader.readAsText(file);
}


This is fairly straightforward, however there are two issues:

  • It requires the use of non-canvas elements. There are a few ways I could tackle this. I could wrangle overlaying DOM elements on top of the canvas accurately, such that they feel integrated with the canvas content. Or I could overlay a DOM modal popup, which wouldn’t require the same accuracy. Or I could just place the upload UI elsewhere on the page. I suspect the modal would be the best middle ground. I also suspect answering this problem properly deserves a post of its own.
  • Permission to read the file doesn’t persist through reloads of the page. Storing the contents of these user files in IndexedDB would address this problem fairly straight forwardly, especially since it will happily store a Blob or ArrayBuffer transparently.


The approach above lets me read files from disk, but to allow the player to save content to disk I’m using URL.createObjectURL. This would be of use when a game has some kind of map editor and I want to let the player download and share content.

The createObjectURL function creates a URL which holds a reference to a Blob or File. By setting the target of a link to that URL the player can use that link to download the contents of the object:


let contentsURL;
function saveFile() {
    if (contentsURL) {
        URL.revokeObjectURL(contentsURL);
        contentsURL = null;
    }

    let contents = document.getElementById('file-contents').value;
    let contentsFile = new File([contents], 'filewrite.txt', { type: 'text/plain' });
    contentsURL = URL.createObjectURL(contentsFile);

    let link = document.createElement('a'); = 'filewrite.txt';
    link.href = contentsURL;
    link.innerText = 'Download';

    let linkContainer = document.getElementById('link-container');
    linkContainer.innerHTML = '';
    linkContainer.appendChild(link);
}

Setting the download attribute on the link causes the browser to download the file when the link is clicked, instead of trying to open the contents of the object. It would also be possible to click the link automatically - hide the element using display: none in its style, and invoke click after it’s been added to the document.

Of note as well is the need to call revokeObjectURL. The object URL holds a reference to the underlying object it links to; without revoking the URL, the contents file would never be cleaned up, resulting in a memory leak.