Reading List

The most recent articles from a list of feeds I subscribe to.

Free Intro to Web Development slides (with demos)

This semester I’m teaching 6.813 User Interface Design and Implementation at MIT as an instructor.

Many of the assignments in this course involve Web development, and the course has traditionally included two 2-hour labs to introduce students to these technologies. Since I’m involved this year, I decided to make new labs from scratch and increase their number from 2 to 3. Even so, deciding what to include from the entirety of web development in only 6 hours was really hard, and I still feel I failed to cover some important bits.

Since many people asked me for the slides on Twitter, I decided to share them. You will find the slides here, and an outline of what is covered here. These slides were also the supporting material the students had on their own laptops, and they often did exercises directly in them.

The audience for these slides is beginners in Web development who are otherwise technical: people who understand OOP, trees, and data structures, and have experience in at least one C-like programming language.

Some demos will not make sense as they were live coded, but I included notes (top right or bottom left corner) about what was explained in each part.

Use the arrow keys to navigate. The deck is also quite big, so do not open it on a phone or over a metered data plan.

If the “Open in new Tab” button opens a tab which then closes immediately, disable Adblock.

From some quick testing, the slides seem to work in Firefox and Safari, but in class we used an up-to-date version of Chrome (since we were covering developer tools, we all needed the same UI). That’s the browser I’d recommend, since the slides were tested much more thoroughly there.

I’m sharing them as-is in case someone else finds them useful. Please do not bug me if they don’t work in your setup, or if you do not find them useful. If they don’t tickle your fancy, move on. I cannot provide any support or fixes. If you want to help fix an issue, you can submit a pull request, but be warned: most of the code was written under extreme time pressure (I had to produce this material about 6 times as fast as I usually need for talks), so it’s not my finest moment.

If you want to use them to teach other people, that’s fine, as long as it’s a non-profit event.

Different remote and local resource URLs, with Service Workers!

I often run into this issue where I want a different URL remotely and a different one locally so I can test my local changes to a library. Sure, relative URLs work a lot of the time, but they are often not an option. Developing Mavo is yet another example of this: since Mavo is in a separate repo from mavo.io (its website) as well as test.mavo.io (the testsuite), I can’t just use relative URLs to it that also work remotely. I’ve been encountering this problem way too frequently, pretty much since I started in web development. In this post, I will describe all the solutions and workarounds I’ve used over time, including the one I’m currently using for Mavo: Service Workers!

The manual one

Probably the first solution everyone tries is doing it manually: every time you need to test, you just change the URL to a relative, local one and try to remember to change it back before committing. I still use this in some cases, since we developers are a lazy bunch. Usually I keep both and use my editor’s (un)commenting shortcut to enable one or the other:

<script src="https://get.mavo.io/mavo.js"></script>
<!--<script src="../mavo/dist/mavo.js"></script>-->

However, as you might imagine, this approach has several problems, the worst of which is that more than once I forgot and committed with the local script active, which totally broke the remote website. It’s also clunky, especially when there are two resources whose URLs you need to change.

The JS one

This idea uses a bit of JS to load the remote URL when the local one fails to load.

<script src="http://localhost:8000/mavo/dist/mavo.js" onerror="this.src='https://get.mavo.io/mavo.js'"></script>

This works and doesn’t introduce any cognitive overhead for the developer, but the obvious drawback is that a request needs to be sent and fail before the real resource can be loaded. Slowing down the local case might be acceptable, even though undesirable, but slowing down the remote website for the sake of debugging is completely unacceptable. Furthermore, this exposes the debugging URLs in the HTML source, which gives me a bit of a knee-jerk reaction.

A variation of this approach that doesn’t have the performance problem is:

<script>
{
 let host = location.hostname == "localhost"? 'http://localhost:8000/dist' : 'https://get.mavo.io';
 document.write(`<script src="${host}/mavo.js"></scr` + `ipt>`);
}
</script>

This works fine, but it’s very clunky, especially if you have to do this multiple times (e.g. on multiple testing files or demos).
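If many files need this, one way to reduce the clunkiness is to factor the conditional into a tiny shared helper, so each page repeats only one line. This is a sketch of my own, not part of my actual setup, and the mavoBase name is hypothetical:

```javascript
// Hypothetical helper: pick the base URL for Mavo depending on where we run.
// Mirrors the inline conditional above, but lives in one shared file.
function mavoBase(hostname) {
	return hostname === "localhost"
		? "http://localhost:8000/dist" // local checkout
		: "https://get.mavo.io";       // production URL
}

// Each page then only needs:
// document.write(`<script src="${mavoBase(location.hostname)}/mavo.js"><\/script>`);
console.log(mavoBase("localhost")); // http://localhost:8000/dist
console.log(mavoBase("mavo.io"));   // https://get.mavo.io
```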

The build tools one

The solution I used until a few months ago was to use gulp to copy over the needed files, and then link to my local copies via a relative URL. I also had a gulp.watch() that monitored changes to the original files and copied them over again:

gulp.task("copy", function() {
	gulp.src(["../mavo/dist/**/*"])
		.pipe(gulp.dest("mavo"));
});

gulp.task("watch", function() {
	gulp.watch(["../mavo/dist/*"], ["copy"]);
});

This worked, but I had to remember to run gulp watch every time I started working on each project. Often I forgot, which was a huge source of confusion as to why my changes had no effect. It also meant I had copies of Mavo lying around in every repo that uses it, which I had to manually update by running gulp. Suboptimal.

The Service Worker one

In April, after being fed up with having to deal with this problem for over a decade, I posted a tweet:

https://twitter.com/LeaVerou/status/857030863292436480?ref_src=twsrc^tfw

@MylesBorins replied (though his tweet seems to have disappeared) and suggested that perhaps Service Workers could help. In case you’ve been hiding under a rock for the past couple of years, Service Workers are a new(ish) API that allows you to intercept requests from your website to the network and do whatever you want with them. They are mostly promoted for creating good offline experiences, though they can do a lot more.

I had been looking for an excuse to dabble in Service Workers for a while, and this was a great one. Furthermore, browser support doesn’t really matter in this case, because the Service Worker is only used locally.

The code I ended up with looks like this in a small script called sitewide.js, which, as you may imagine, is used sitewide:

(function() {

if (location.hostname !== "localhost") {
	return;
}

if (!self.document) {
	// We're in a service worker! Oh man, we're living in the future!
	self.addEventListener("fetch", function(evt) {
		var url = evt.request.url;

		if (url.indexOf("get.mavo.io/mavo.") > -1 || url.indexOf("dev.mavo.io/dist/mavo.") > -1) {
			var newURL = url.replace(/.+?(get|dev)\.mavo\.io\/(dist\/)?/, "http://localhost:8000/dist/") + "?" + Date.now();

			var response = fetch(new Request(newURL), evt.request)
				.then(r => r.status < 400? r : Promise.reject())
				// if that fails, return original request
				.catch(err => fetch(evt.request));

			evt.respondWith(response);
		}
	});

	return;
}

if ("serviceWorker" in navigator) {
	// Register this script as a service worker
	addEventListener("load", function() {
		navigator.serviceWorker.register("sitewide.js");
	});
}

})();
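
The heart of the worker is that replace() call. In isolation (and without the cache-busting timestamp, so the result is deterministic), the rewrite behaves like this; toLocal is just an illustrative name:

```javascript
// Same regex as in sitewide.js: any mavo.* file served from get.mavo.io or
// dev.mavo.io gets redirected to the local dist folder.
function toLocal(url) {
	return url.replace(/.+?(get|dev)\.mavo\.io\/(dist\/)?/, "http://localhost:8000/dist/");
}

console.log(toLocal("https://get.mavo.io/mavo.js"));
// http://localhost:8000/dist/mavo.js
console.log(toLocal("https://dev.mavo.io/dist/mavo.css"));
// http://localhost:8000/dist/mavo.css
```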

So far, this has worked more nicely than any of the aforementioned solutions and allows me to just use the normal remote URLs in my HTML. However, it’s not without its own caveats:

  • A Service Worker only takes control of a page after the first load, so the very first pageload uses the remote URL. This is almost never a problem locally though, so I’m not too concerned about it.
  • The same-origin restriction that Service Workers have is fairly annoying: I have to copy the service worker script into every repo I want to use this on; I cannot just link to it.
  • It needs to be explained to new contributors, since most aren’t familiar with Service Workers at all.

Currently the URLs for both local and remote are baked into the code, but it’s easy to imagine a mini-library that takes care of it as long as you include the local URL as a parameter (e.g. https://get.mavo.io/mavo.js?local=http://localhost:8000/dist/mavo.js).
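The core of such a mini-library could be as small as reading that parameter off the request URL. The sketch below is hypothetical; only the local parameter name comes from the example above:

```javascript
// Given an intercepted request URL, return the local override URL if the
// page asked for one via ?local=..., or null to leave the request alone.
function localOverride(requestURL) {
	return new URL(requestURL).searchParams.get("local");
}

console.log(localOverride("https://get.mavo.io/mavo.js?local=http://localhost:8000/dist/mavo.js"));
// http://localhost:8000/dist/mavo.js
console.log(localOverride("https://get.mavo.io/mavo.js"));
// null
```

Inside the worker’s fetch handler, a non-null return value would then be fetched instead of the original request.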

Other solutions

Solutions I didn’t test (but you may want to) include:

  • .htaccess redirect based on domain, suggested by @codepo8. I don’t use Apache locally, so that’s of no use to me.
  • Symbolic links, suggested by @aleschmidx
  • User scripts (e.g. Greasemonkey), suggested by @WebManWlkg
  • Modifying the hosts file, suggested by @LukeBrowell (that works if you don’t need access to the remote URL at all)

Is there any other solution? What do you do?

Introducing Mavo: Create web apps entirely by writing HTML!

Today I finally released the project I’ve been working on for the last two years at MIT CSAIL: An HTML-based language for creating (many kinds of) web applications without programming or a server backend. It’s named Mavo after my late mother (Maria Verou), and is Open Source of course (yes, getting paid to work on open source is exactly as fun as it sounds).

It was the scariest release of my life, and I had been postponing it for months. I kept feeling Mavo was not quite there yet: maybe I should add this one feature first, oh, and this other one, oh, and we can’t release without this one, surely! Eventually I realized that what I was doing had more to do with postponing the anxiety and less to do with Mavo reaching a stage where it could be released. After all, “if you’re not at least a bit embarrassed by what you release, you waited too long”, right?

The infamous Ship It Squirrel

So, there it is, I hope you find it useful. Read the post on Smashing Magazine or just head straight to mavo.io, read the docs, and play with the demos!

And do let me know what you make with it, no matter how small and trivial you may think it is, I would love to see it!
