Mighty mighty world

I found these lyrics hidden away in my notes. They were generated earlier this year by my updated lyrics generator, after I retrained the neural network on a sentence embedding instead of a word embedding (lyrics generator code). I guess I was waiting to publish them as part of a larger post. No need. This masterpiece stands on its own :-)

hello world

a mighty mighty mighty mighty mighty mighty a world

world world
mighty

and mighty
mighty mighty
mighty

all mighty mom mighty

babies mom
the mom

mom

roman you roman mom

mother
plastic cat’s roman roman

the pagan pagan pagan pagan pagan pagan pagan pagan pagan i pagan oh
dear
and dear

life

Gene Lyrica Two

80/20 and data science

“20% of the task took 80% of the time”. If you have ever heard someone say that, you probably know about the so-called 80/20 rule, also known as the Pareto principle.

The principle is based on the observation that “for many events, roughly 80% of the effects come from 20% of the causes”. The Wikipedia article mentions, among other examples, how in some companies 80% of the income comes from 20% of the customers.

In contrast, most references to the 80/20 rule that I hear in the wild are variations of the (often sarcastic) statement at the beginning of this post, and that version is also the more fun one to talk about.

Estimating the finishing touch

In software development, the 80/20 rule often shows up when the “finishing touches” on a task are forgotten or their complexity is underestimated. An example could be a developer forgetting to factor in the time it takes to add integration tests for a new feature, or underestimating how hard it is to optimize a piece of code for high performance.

In this context, the 80/20 rule could thus be seen, to a certain extent, as the result of bad task management, but it is not always that simple. Things get in the way: the test suite refuses to run locally, or the optimized code cannot work without blocking the CPU and the programming language is single-threaded, forcing the developer to take a different approach to the problem (this is purely hypothetical, of course…).

Related to this, Erik Bernhardsson recently wrote an interesting treatise on why software projects “take longer than you think”, and I think it is worth sneaking in a reference. Here is his main claim:

I suspect devs are actually decent at estimating the median time to complete a task. Planning is hard because they suck at the average.

Erik Bernhardsson, Why software projects take longer than you think – a statistical model

The message resonated quite well with me (especially because of the use of graphs!). The author speaks of a “blowup factor” (actual time divided by estimated time) for projects, and if his claims are true, there could be some merit to the idea that the last 20% of a task could easily “blow up” and take 80% of the time.1
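
Bernhardsson models the blowup factor as (roughly) lognormally distributed. As a quick illustration of why that would make medians easy to estimate and averages hard, here is a small simulation sketch; the sigma value is an arbitrary assumption for illustration, not a number from his post:

```python
import numpy as np

# Sketch: draw "blowup factors" (actual time / estimated time) from a
# lognormal distribution, in the spirit of Bernhardsson's model.
# mean=0.0 and sigma=1.0 are arbitrary illustration values.
rng = np.random.default_rng(seed=0)
blowup = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

print(f"median blowup factor: {np.median(blowup):.2f}")  # ~1.0: typical estimates look fine
print(f"mean blowup factor:   {np.mean(blowup):.2f}")    # ~1.6: rare huge blowups drag the average up
```

In other words, most tasks finish close to the estimate, but the few that blow up dominate the total.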

Dirty data

Sometimes, the perception of data science is that most of the time “doing data science” is spent on creating models. For some data scientists this might be the reality, but for the majority of those I have spoken to, preparing data takes up a significant amount of time, and it is not the most glamorous work if one is not prepared for it.

I recently gave an internal talk at work where I jokingly referred to this as the 80/20 rule of data science: 80% of the time is spent on data cleaning (the “boring” part) and 20% on modeling (the “fun” part).

This is not really an 80/20 rule, unless we rephrase it as “80% of the fun takes up only 20% of the time” or something like that.2
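
To make the “boring” 80% a bit more concrete, here is a toy sketch of the kind of chores it usually involves; the data, column names, and values are all made up for illustration:

```python
import pandas as pd

# Hypothetical raw data; in practice this would come from a CSV dump,
# a database export, scraped pages, etc.
raw = pd.DataFrame({
    "customer": ["Alice", "alice ", "Bob", None],
    "signup_date": ["2019-01-03", "2019-01-03", "2019-02-28", "2019-02-28"],
    "revenue": ["100", "100", "n/a", "250"],
})

clean = (
    raw
    .dropna(subset=["customer"])  # drop rows missing a key field
    .assign(
        # normalize names so "Alice" and "alice " count as the same customer
        customer=lambda d: d["customer"].str.strip().str.lower(),
        # parse strings into proper dtypes; unparseable revenue becomes NaN
        signup_date=lambda d: pd.to_datetime(d["signup_date"]),
        revenue=lambda d: pd.to_numeric(d["revenue"], errors="coerce"),
    )
    .drop_duplicates()  # only now can duplicates actually be detected
)
print(clean)
```

None of this is modeling, and all of it has to happen before the modeling can start.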

When it comes to deploying models in production, the balance sometimes shifts even further. The total time spent on a project might be 1% on modeling and 99% on data cleaning and infrastructure setup, but it is the 1% (the model) that gets all the attention.

The future of data cleaning

In the last couple of years, loads of startups and tools have emerged that do automated machine learning (or “AutoML”), i.e. they automate the fun part of data science, sometimes while also providing convenient infrastructure for exploring data.

If we assume that the 80/20 rule of data science holds, these tools are thus helping out with the fun 20% of the work. However, the first company that solves the problem of automatically cleaning and curating dirty data is going to be a billion-dollar unicorn. Perhaps the reason we have not seen this yet is that dealing with data is actually really difficult.

For now, I suspect that the “80/20 rule of data science” will continue to be true in many settings, but that is not necessarily a bad thing. You just gotta love the data for the data itself :-)

Tax deductions are not free money

So you deducted your loan interest on your taxes. That means the interest was essentially free, right? I could not quite figure out the answer to this question in my head recently, so I thought I would work through a simple example and share it here, in case it is useful for someone else.

First, the conclusion: just because an expense is deductible does not make it free. It would have been better not to have the expense in the first place. However, if you cannot avoid the expense, then deductions are of course great!

Let’s say we pay $10 interest on a loan, our income is $100, and we pay 50% tax. The table below compares the scenario where the interest is not deductible with the scenario where it is 100% deductible1.

Description       Without deduction   With deduction
Income            $100                $100
Taxable income    $100                $90
Tax               -$50                -$45
Net income        $50                 $55
Interest          -$10                -$10
Final income      $40                 $45
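
For completeness, here is the same arithmetic as a few lines of Python, using the numbers from the example above:

```python
income, tax_rate, interest = 100, 0.50, 10

def final_income(interest_deductible: bool) -> float:
    # A deduction reduces taxable income before tax is computed;
    # the interest itself still has to be paid either way.
    taxable = income - interest if interest_deductible else income
    tax = taxable * tax_rate
    return income - tax - interest

print(final_income(False))  # 40.0, the "without deduction" column
print(final_income(True))   # 45.0, the "with deduction" column
```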

If we can deduct all of the interest, we end up with $5 of extra disposable income. Thus, half of the interest in this example was “free” ($5 out of $10). However, if we did not have to pay interest at all, we would of course have $50 in final income.

In other words, if we can completely avoid an expense, even if it is tax deductible, that is always the best financial outcome. In practice, this is not always possible, but I think it is a good principle to keep in mind.