Viewing 5 posts - 1 through 5 (of 5 total)
- The forum ‘CSS’ is closed to new topics and replies.
The forums ran from 2008-2020 and are now closed and viewable here as an archive.
This has been bugging me for a few months now. Most font-face kits I use have the bulletproof or smiley method to avoid the possibility of the wrong local font with the right name being used. My question is, why use the local declaration at all? If you’re just going to declare a local font that can’t exist, why declare it in the first place? Obviously there may be a few details I’m missing, but it seems needlessly complicated when you could simply not declare a local font at all.
Anyone care to enlighten me on this? I’ve removed the declaration from my kits so they work on Android (something about it choking on the smiley) and everything still works fine by the looks of it.
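For anyone following along, the smiley variant being discussed looks roughly like this. This is just a sketch of the commonly published pattern, not any particular kit’s output; the family name and file paths are placeholders:

```css
@font-face {
  font-family: 'MyWebFont'; /* placeholder name */
  src: url('mywebfont.eot'); /* older IE */
  src: local('☺'), /* a name no installed font can match, so local lookup always fails */
       url('mywebfont.woff') format('woff'),
       url('mywebfont.ttf') format('truetype');
}
```

The idea is that `local('☺')` can never match a font installed on the visitor’s machine (font names can’t contain that character), so the browser always falls through to the `url()` sources instead of silently substituting a local font that happens to share the web font’s name.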
I *think* the main reason for a local declaration is to speed up page loading times
Yeah, by loading the font from the visitor’s computer if they have it, right?
If a font is named ‘♥helvetica!@’ and used via @font-face, I think the font is stored in a temp folder somewhere after downloading. Therefore it should load faster after the first time if the local declaration is set. I could be wrong though, it’s just a guess.
Well yeah, but if you don’t call the font ‘♥helvetica!@’ then there’s no point, unless I’m missing something here.