
Times New Roman versus Georgia font types

Note: This blog post is a rare example of an opinion/observation that is no longer relevant and/or no longer represents my current views, but I’m keeping it here for historical reasons :-)


It seems that, at the moment, Georgia is winning the font race on the internet. But even with fonts, one has to ask whether a given choice is good for cross-platform and cross-browser compatibility. I was making some small changes to my business website the other day and wanted to use Georgia, since it is a well-designed and pleasant-to-look-at font. But on Ubuntu, the font is apparently missing… or so it seems. In fact, Firefox and Chrome render it in different ways: Firefox properly shows Georgia (or at least some version of it), while Chrome falls back to Times New Roman for almost all serif fonts, as illustrated in the screenshot below.
Firefox and Chrome font comparison
Notice that Georgia renders quite a bit bigger than Times New Roman. I use Chrome for my everyday browsing, so a ton of websites appear to have a too-small font size because they use Georgia. I think this is unfortunate, and that is why I have chosen to go back to the roots and use Times New Roman for my business website. Maybe you should too.
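
For reference, a font stack with an explicit fallback looks roughly like this in CSS (the body selector is just an example, not my actual stylesheet); when Georgia is unavailable, the browser walks down the list:

    body {
      /* Prefer Georgia; if it is missing (as it seems to be for Chrome
         on my Ubuntu machine), fall back to Times New Roman, and
         finally to whatever serif font the system offers. */
      font-family: Georgia, "Times New Roman", Times, serif;
    }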

Here is a link to the document with the different fonts.

Update: I noticed that the screenshot itself is also rendered differently in Chrome and Firefox. In Firefox, it looks horrible on my screen until I click on it. Makes me happy that I’m using Chrome.

Update: Sometime during an upgrade of either Ubuntu or Chrome, I now seem to have the Georgia font, or at least some version of it. See below.


Is Lisp easy?

I try. I really do. I read articles, I write code, I frustrate myself and I force myself to spend countless hours making things work. Whatever I do, I keep bumping my head into yet another wall. I’m trying to grasp Lisp.

According to John McCarthy, Lisp represents a local optimum in the space of programming languages. What he and everyone else fail to tell people learning Lisp (at least they failed to tell me) is that it represents a global minimum in the space of programming-language learnability.

There are many good resources for understanding Lisp, ranging from almost philosophical pieces through pedagogical introductions to more technical articles. I have also managed to get things done in Lisp, like implementing FastICA for Independent Component Analysis in Clojure, a dialect of Lisp, and at the moment I’m trying to implement FP-growth, a well-known and fairly scalable Association Rule Mining algorithm. And yes, whenever something succeeds (after many hours of pondering), it is a pleasing experience to notice how few lines are sometimes needed to accomplish complex tasks.
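
To illustrate that brevity, here is a minimal Clojure sketch of just the first pass of FP-growth: counting item occurrences and pruning by a minimum support. The function name and parameters are my own invention, and this is a sketch of one step, not the full algorithm:

    ;; First pass of FP-growth: count how often each item occurs across
    ;; all transactions, keep only items meeting the minimum support,
    ;; and order them by descending frequency (the order later used
    ;; when building the FP-tree).
    (defn frequent-items [transactions min-support]
      (->> transactions
           (apply concat)                            ; one long seq of items
           frequencies                               ; item -> occurrence count
           (filter (fn [[_ n]] (>= n min-support)))
           (sort-by second >)))                      ; most frequent first

    ;; For example:
    ;; (frequent-items [[:bread :milk]
    ;;                  [:bread :diapers :beer]
    ;;                  [:milk :diapers :beer]
    ;;                  [:bread :milk :diapers]] 2)
    ;; => ([:bread 3] [:milk 3] [:diapers 3] [:beer 2])  ; tie order may vary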

But I’m not satisfied. I’m frustrated. When I was learning Java in the early days of my academic career, I was rarely frustrated when faced with new problems to solve. But learning Lisp is like having a very unstable nuclear power plant in my brain. Meltdowns are inevitable and occur quite often.

And the worst thing about Lisp is not Lisp itself. It is the feeling of incompetence that hits you in the face whenever you cannot figure something out. The feeling of mediocrity is not very pleasant, and the meltdowns are tough on your self-esteem. The thought “Maybe I am just an average programmer” pops up constantly.

“I want to believe” that Lisp is great. I hope I will see the light. I’m looking for the promised epiphany. And I most certainly will not settle for mediocrity… ever.


Technology should not have side effects

3D movies are a good example of how technology evolves. In earlier days, sci-fi-looking red-and-blue glasses were needed to create the 3D effect, but those did not survive the test of time. Now 3D technology has matured, and it seems that every new movie has to be in 3D.

I went to see the Avatar special edition yesterday, in 3D. Unfortunately, I am one of the unlucky ones who get eyestrain from watching 3D movies. During the movie, I figured out that looking only at the areas that are in focus helps reduce the strain (it seems obvious, but our eyes do not naturally do this). This was confirmed by a Shadowlocked blog post that mentions the same eyestrain problem and how to avoid it.

It is apparently a well-known problem in the industry, and recent research has shown that watching 3D movies might be bad for your brain. Take a look at the research presented here. Go figure… the body is amazing and will tell you when something is wrong, i.e. eyestrain and headaches are warning signs. One would not look directly at the sun either. It hurts. A lot.

The question is: What the hell is going on here? Why is 3D becoming so extremely popular when it has direct, harmful side effects? Sure, there might be long-term harmful consequences of, e.g., staring at a computer screen all day, but you certainly do not feel them after only 10 minutes in Windows (and if you do, it has something to do with Windows, not your screen :-)

One of the most important aspects of technology is that it should help, not harm, so I hope the current 3D technology is just a stepping stone to something better. A Tom’s Guide post mentions a possible upcoming alternative. I’m already getting excited about throwing away those glasses. Until then, I will just have to close one eye when things get bad. That’s how I endured Avatar, but it certainly does not add to the experience.