Failure is growth. Failure is learning. But sometimes, failure is just failure.
If you’ve been hanging in there with us since beta1, maybe you bumped into problems with Duotone icons? We’re sorry about that (read on to learn how we fixed the issue). Are you a font tech nerd? Interested in stories related to the iterative process of trying, failing, and trying again? Then this blog is for you.
TL;DR: For Font Awesome 6, we had a problem to solve: our method of deriving secondary unicode values from primary unicode values had reached its limit. We learned the hard way that Unicode Variation Selectors are not a general-purpose mechanism available for such uses. We would, indeed, need to use ligatures in our Webfonts. Now Duotone icons work. Yay!
Now, onto the drama of how we discovered this issue and licked the problem.
Problems with Unicode Values in Beta 1
In Font Awesome 5, the primary unicode for the “alarm-clock” icon was f34e, so in the Duotone style, the unicode for the secondary glyph was 10f34e. For the “coffee” icon, the primary was f0f4, and the secondary was 10f0f4. See the pattern there? “Put a 10 in front”.
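Expressed as code, that old derivation was just a fixed offset. Here's a tiny Python sketch of it (purely illustrative, not code we actually shipped):

primary = 0xF34E                        # alarm-clock, primary glyph
secondary = primary + 0x100000          # "put a 10 in front": 0x10F34E
print(f"{primary:x} -> {secondary:x}")  # prints: f34e -> 10f34e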
Have a look at this other blog for a more detailed explanation of why it became a problem for us to continue doing that with secondary unicodes and a summary of the solution that we shipped for beta2 (we think that one’s going to work).
This article is about one of the solutions that did not work, the one we shipped in v6 beta1: Unicode Variation Selectors.
How Do You Map to Multiple Glyphs with a Single Unicode?
In a font, there’s an internal identifier called a glyph ID. You put a unicode code point (a character) into the content property of a CSS pseudo-element, for example, and a little picture, called a glyph, shows up.
What’s happening under the hood is that the font’s cmap table maps that unicode character into a glyph id, and the computer uses that glyph id to look up all of the information across various other tables within the font to get its instructions for how to draw that little picture.
As a user of the font, it’s kinda none of your business what the glyph id is for a particular glyph. It has no necessary correspondence with unicode values you use in your documents. It’s just a way that the font system internally organizes things. So while the unicodes in the examples below are real, the glyph ids are totally arbitrary — which is realistic.
Unicode f34e mapping to glyph id 42.
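If you want to see that mapping for yourself, a library like fontTools can read a font's cmap table. A minimal Python sketch, assuming a local copy of one of the webfont files (the file name here is just a placeholder):

from fontTools.ttLib import TTFont

font = TTFont("fa-duotone-900.ttf")      # placeholder file name
cmap = font.getBestCmap()                # {unicode code point: glyph name}
glyph_name = cmap[0xF34E]                # the character you put in CSS content
glyph_id = font.getGlyphID(glyph_name)   # the font's internal, arbitrary id
print(f"U+F34E -> {glyph_name} (glyph id {glyph_id})")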
In Font Awesome 6, we needed to get back to the simplicity of a single icon corresponding to a single unicode value. But what about Duotone, where there are multiple glyphs associated with one icon? How do we map to multiple glyphs using a single unicode?
The Proposed Solution: Unicode Variation Selectors
A Unicode Variation Selector is a special unicode value that doesn’t represent a character; it represents a variation of some other character. It’s an adjective, not a noun.
“Ball” is a noun. You could have a Blue Ball, a Red Ball, or a Green Ball. Each color is a variation of “Ball,” like adjectives modifying a noun.
With Unicode Variation Selectors, you give the Ball a real unicode code point, and you assign variation selector values to Blue, Red, and Green. Now suppose you have another noun: Bike. Bike gets its own unique unicode code point. But you can reuse the same variation selectors to get a Blue, Red, or Green Bike.
There’s a block of Unicode Variation Selectors in the range U+FE00..U+FE0F. Those are the ones we used for Duotone in v6 beta 1. You could select a primary glyph for any Duotone icon with U+FE01, and you could select a secondary glyph for any Duotone icon with U+FE02.
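Inside a font file, those pairings live in a cmap subtable of format 14, the Unicode Variation Sequences subtable. A rough Python sketch of listing them with fontTools (placeholder file name again):

from fontTools.ttLib import TTFont

font = TTFont("fa-duotone-900.ttf")      # placeholder file name
for table in font["cmap"].tables:
    if table.format == 14:
        # uvsDict maps each variation selector to (base code point, glyph name)
        # pairs; a glyph name of None means "use the default glyph for that base".
        for selector, sequences in table.uvsDict.items():
            for base, glyph_name in sequences[:3]:  # just a few, for brevity
                print(f"U+{base:04X} U+{selector:04X} -> {glyph_name}")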
This is what the whole Duotone alarm-clock icon looks like, with the standard styling:
Most of it is the secondary glyph, with reduced opacity; the inner part is the primary glyph with full opacity. Now, what if you want to reference each glyph separately so that you can style them separately? Here’s how you’d do it in v6 beta1, using Unicode Variation Selectors.
To get the primary glyph, ask for unicode “f34e” with variation selector “fe01”:
And to get the secondary glyph, it’s the same unicode but with the “fe02” variation selector:
In CSS, it might have looked like this:
.fad.fa-alarm-clock:before,
.fa-duotone.fa-alarm-clock:before {
  content: "\f34e\fe01";
}

.fad.fa-alarm-clock:after,
.fa-duotone.fa-alarm-clock:after {
  content: "\f34e\fe02";
}
For the “mug-saucer” icon (aka “coffee” in FA5), it’s a different unicode: f0f4. But the variation selectors are the same. Coffee primary: f0f4 fe01. Coffee secondary: f0f4 fe02.
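In other words: one code point per icon, plus the same two selectors for every Duotone icon. Here's a throwaway Python sketch of generating those beta1-style content values (the function name is purely illustrative):

PRIMARY_SELECTOR, SECONDARY_SELECTOR = 0xFE01, 0xFE02

def duotone_content(codepoint: int) -> tuple[str, str]:
    # Returns the CSS content strings for the primary and secondary glyphs.
    primary = f'"\\{codepoint:x}\\{PRIMARY_SELECTOR:x}"'
    secondary = f'"\\{codepoint:x}\\{SECONDARY_SELECTOR:x}"'
    return primary, secondary

print(duotone_content(0xF0F4))  # ('"\\f0f4\\fe01"', '"\\f0f4\\fe02"')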
Where the Variation Selector Solution Went Wrong
I promise, I read the $!#% manual.
It sure seemed like these variation selectors were the perfect tool for the job, practically designed just for me and my use case. And all of my testing across multiple web browsers looked great, even in $!#% Internet Explorer 10.
But I must have overlooked the alarm-clock and baseball icons in the Chrome browser, along with about 200 other emoji icons. (Only emojis!) They all looked fine in Firefox. So I became convinced that this was a bug in Chrome, and I began building a test case to reproduce it so I could report it and fix the Internet to work more like Firefox.
As tends to happen when I put extra diligence into proving why something else is wrong, I started getting the sense that, actually, maybe I was wrong. #lifeskillz
I downloaded the Unicode Character Database (which includes lots of metadata about every Unicode character in the spec) and wrote some analysis code to find any possible correlations among the broken icons. Every broken icon was an emoji, using one of the Unicode standard emoji code points.
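The gist of that correlation check can be sketched in a few lines of Python, assuming a local copy of the UCD's emoji-data.txt; the "broken" code points below are illustrative emoji, not necessarily the exact values in our fonts:

import re

def load_emoji_codepoints(path="emoji-data.txt"):
    # Collect every code point with the "Emoji" property from the UCD data file.
    emoji = set()
    line_re = re.compile(r"^([0-9A-F]+)(?:\.\.([0-9A-F]+))?\s*;\s*Emoji\s")
    with open(path) as f:
        for line in f:
            m = line_re.match(line)
            if m:
                start = int(m.group(1), 16)
                end = int(m.group(2) or m.group(1), 16)
                emoji.update(range(start, end + 1))
    return emoji

emoji = load_emoji_codepoints()
broken = {0x23F0: "alarm clock", 0x26BE: "baseball"}  # illustrative examples
for codepoint, name in broken.items():
    print(f"U+{codepoint:04X} {name}: emoji = {codepoint in emoji}")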
The problem never showed up for icons that used our Private Use Area code points. And it never showed up for icons using non-emoji standard code points. But also, maddeningly, it didn't show up for all emojis either, just some of them. I still don't know what it is about that particular set of emojis that makes Chrome (and some other browsers) treat them differently.
Internet, if you know why, please just tell me already. I’ve asked nicely and in several other ways. Come on.
After I consulted with a Real Font Expert (thank you, Roel!), a needle emerged from the haystack: the FAQ on Variation Selectors. Now that I see it, yeah, it’s a pretty big needle. An entire FAQ on just this topic. How did I miss that?
Anyway, it turns out Variation Selectors are not a general-purpose mechanism. They are only for specified uses. In particular, there are a bunch of specified variations for emoji characters!
So they happen to work most of the time in all browsers tested. And all of the time in some browsers. But maybe Chrome was Not Wrong after all for disrespecting some of my variation selectors. Maybe it had a right to do its own thing. Maybe I was the one who didn’t have the right to use variation selectors for my own purposes like this.
Conclusion: Ligatures, Meet Webfonts. Webfonts, Meet Ligatures
All of this had been — in part — an effort to avoid encoding ligatures into the GSUB table in our Webfonts. (Ligatures are special sequences of glyphs that get replaced by something else when matched in a document; in our case, some icon or icon layer.)
Alas, we would, indeed, need to use ligatures in our Webfonts. And it turned out that even those ligatures would not work — for the exact same set of emoji icons — if the first character in the ligature were one of those emoji standard unicode values. But that’s another story.
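If you're curious what those ligatures look like from the outside, here's a rough Python sketch of dumping GSUB ligature substitutions with fontTools (placeholder file name, as before):

from fontTools.ttLib import TTFont

font = TTFont("fa-duotone-900.ttf")      # placeholder file name
gsub = font["GSUB"].table
for lookup in gsub.LookupList.Lookup:
    if lookup.LookupType != 4:           # type 4 = ligature substitution
        continue
    for subtable in lookup.SubTable:
        # subtable.ligatures: {first glyph name: [Ligature, ...]}
        for first_glyph, ligatures in subtable.ligatures.items():
            for lig in ligatures:
                sequence = [first_glyph] + lig.Component
                print(" + ".join(sequence), "->", lig.LigGlyph)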
Take a look at the Duotone “alarm-clock”, “baseball”, or “alien-8bit” icon in the Font Awesome 6 Icon Gallery now. If you’re web console savvy and curious, you could even inspect the content property of the ::before and ::after pseudo-elements on those icons as they’re displayed in the gallery. See how the ::after is different from the ::before?
So. We think it’s helpful to show our math with these sorts of challenges, so we can all keep learning. Do you have any discoveries about unicode variation selectors you’d like to share? Give us a holler on Twitter!