# Evan A. Sultanik, Ph.D.

Evan's First Name @ Sultanik .com

Computer Security Researcher
Trail of Bits

## PoC‖GTFO Issue 0x00

### Ceci n'est pas une pirate.

A couple years ago I wrote about some thought experiments related to copyright law. Let's say you have a huge number, $n$, two documents, $D_1$ & $D_2$, and a cipher, $f$, that, given the number and the first document, yields the second document: $f(n, D_1) = D_2$. Now let's say $D_1$ is some book in the public domain, whereas $D_2$ is a copyrighted book. In my previous article I wondered,

What is the copyright status of $n$? Can a number be copyrighted?

The difficulty arises from the fact that we have reconstructed a copyrighted artifact solely from artifacts in the public domain. I subsequently posed this scenario to some lawyers, and their opinion was that either the cipher $f$ would be in violation of $D_2$'s copyright, or the copyright status would depend on the intent of the cipher's creator.

But what if, for the same number $n$, there is another pair of documents, $D_3$ & $D_4$, for which $f(n, D_3) = D_4$? What if $D_3$ and $D_4$ are both in the public domain? Is such an occurrence even possible? It turns out that it is, and it's more probable than you might think. Read on to find out.
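To make the construction concrete, here is a minimal sketch assuming the simplest possible cipher: bytewise XOR (an assumption for illustration; the original article does not specify $f$, and the documents here are hypothetical stand-ins). With XOR, $n$ is just $D_1 \oplus D_2$, and the very same $n$ automatically pairs *any* third document $D_3$ with some companion $D_4$:

```python
# Sketch: if f(n, D) = n XOR D (an assumed cipher), then n = D1 XOR D2
# reconstructs the copyrighted D2 from the public-domain D1 -- and the
# same n maps any other document D3 to some companion document D4.

def xor_bytes(a: bytes, b: bytes) -> bytes:
    """Bytewise XOR of two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# Hypothetical stand-ins for the documents in the thought experiment
# (equal lengths, purely illustrative):
d1 = b"public domain text A"   # public-domain book
d2 = b"copyrighted text Bxx"   # copyrighted book

n = xor_bytes(d1, d2)          # the "huge number" n

assert xor_bytes(n, d1) == d2  # f(n, D1) = D2

# The same n pairs any third document D3 with a companion D4:
d3 = b"public domain text C"
d4 = xor_bytes(n, d3)          # f(n, D3) = D4
assert xor_bytes(n, d4) == d3  # XOR is its own inverse
```

Under this (assumed) cipher, the existence of a second pair is not a coincidence at all: every document of the right length has a partner under the same $n$, so the interesting question becomes whether that partner is itself meaningful.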

## Streams Data Processing Workshop

### A Presentation Introducing Distributed Combinatorial Optimization

I will be presenting a talk introducing distributed combinatorial optimization at the Streams Data Processing Workshop tomorrow. Here are the slides:

Handouts for the talk are also available here.

## Sushi Elitism

### Three reasons why you've probably never had an authentic sushi experience.

There are a number of myths and misconceptions surrounding both the creation and consumption of sushi that I think are important enough to devote an entire blog entry to clarifying.

Proper sushi has a small amount of wasabi applied to the underside of the fish. Some people claim that "it is to prevent parasites" (via the natural antiseptic properties of the wasabi), but I find this explanation a bit dubious (I would think that the wasabi would have to be uniformly applied to the entire fish to have any measurable effect). The wasabi is really there to add flavor. In really high-end sushi restaurants in Japan, for example, it is relatively uncommon for the guest to be served a mound of grated wasabi; that's the sushi equivalent of serving an un-tossed Caesar salad with the dressing on the side. Instead, the chef applies the perfect amount of wasabi to each piece of sushi. Some chefs will not even serve soy sauce, instead brushing the sauce onto each piece individually. If wasabi is served separately from the fish, it is generally also considered bad form to mix it with the soy sauce (as is commonly done in the US).

So, why might you have never seen this before at your local sushi joint?

1. It takes more time/skill/effort to do it properly.
2. Many sushi restaurants in the US do not have properly trained sushi chefs. In fact, in most areas of the country with little or no Japanese population, don't be surprised if your sushi chef is a Salvadorian who learned from the Oaxacan who learned from the Korean owner of the restaurant. Not that there's anything wrong with that; one of my favorite sushi places is run by an Indonesian. Just keep in mind that the people making your sushi may have never experienced the "real thing" themselves.
3. "Real," fresh wasabi is very rare and expensive. Most of the stuff that is used in the US is basically horseradish paste plus food coloring. Restaurants that can both procure and afford to use the real stuff will want to use it sparingly; they wouldn't want to waste it by putting a mound of it on each plate. Therefore, they might be more inclined to use the "proper" technique.

Here is a video in which you can see famous chef Naomichi Yasuda using wasabi inside the sushi. It all happens very quickly, but you can clearly see him dip into the bin of wasabi, rub it on the underside of the fish, and then apply the fish to the rice. Here is another video in which Anthony Bourdain explains the "rules" of high-end sushi eating, while dining at the super-famous and super-expensive Sukiyabashi Jiro (the proprietor of which is actually designated as a "living treasure" by the Japanese government). That second video shows the chef brushing on the soy sauce, and the general lack of both soy and wasabi on the plates.

And on to the matter of chopsticks. Some purists claim that you're not supposed to use them, instead using your hands. I've been to Japan, and it really varies; my take is that it depends on the formality of the restaurant. In the more formal places, they won't serve any separate wasabi or soy (only gari) and they'll give you a small moist towel (in addition to the oshibori) that you are supposed to use to clean your fingers between pieces of sushi, under the expectation that you will use your hands. If they don't give you a towel to clean your fingers, then I think it is acceptable to use chopsticks.

When I say "high-end", I mean a place that specializes in just sushi. In Japan, these places are usually tiny, consisting of only a bar. Each sushi chef serves at most five customers at a time. Each chef is supported by perhaps one apprentice (who might prepare some of the seafood) and several other apprentices in the kitchen whose primary job is to prepare the rice. Sushi is all about the rice, after all; an apprentice will spend the better part of the first decade of his training just learning how to cook the rice, before even touching any seafood. That's why most casual restaurants in Japan (e.g., izakaya) will forgo even serving sushi, instead offering preparations that require less skill/training (e.g., sashimi).

## Will eating late at night make you fat?

### TL;DR: No.

Despite the awkwardness caused by his recent rants on the topics of modern cooking techniques and, more recently, fandom, Alton Brown is still one of my favorite sources of culinary quotes. One of these, related to nutrition, is:

There are no "bad foods", only bad food habits.

In keeping with my recent posts' unintentional gastronomic theme [1,2], I am going to dedicate this post to debunking a myth related to the above quote. I claim that there is no bad time to eat, only bad eating habits. Specifically, I'd like to dispel the common misconception that the time of one's meal alone can affect weight gain.

I recall hearing a story on NPR a year or two ago about a study which specifically tried to test the claim that eating late at night is unhealthy. The study concluded that there was absolutely no correlation between the proximity of mealtime to sleep and weight gain, other than the fact that people tend to choose to eat more unhealthy foods late at night. I can't seem to find a reference to that study, however, so I am going to have to do some research myself. Right. Let's get stuck in.

The majority of our food is actually digested during sleep, so the common argument that "eating late at night is bad because our metabolism [slows or shuts down] during sleep" is incorrect. With that said, there is a correlation between night eating, low self-esteem, reduced daytime hunger, and weight loss among people who are already obese or prone to obesity; however, this correlation does not necessarily imply causation (i.e., the act of eating a late meal does not necessarily provoke these conditions). It may simply be the case that the types of foods that people prefer to eat late at night are less healthy. There is still much debate on the subject; however, many scientists agree that meal frequency, as opposed to time, is one of the best predictors of weight gain. For example, the time between meals is highly correlated with one's waist size. This makes some intuitive sense, since eating more, smaller meals will help regulate insulin levels, and spikes in insulin levels (which can be caused by large meals and/or large gaps in time between meals) have been linked to weight gain.

A newer study followed the eating and sleeping patterns of 52 subjects over one week. They found a correlation between "late sleepers" (i.e., people who go to sleep late and wake up late) and high body mass index, and that eating after 8pm was associated with higher body mass index. A relatively recent New York Times article summarizing the results of the study makes the further claim that eating late at night leads to weight gain; however, I disagree with that claim on the grounds that correlation does not imply causation. In fact, the original study noted:

Late sleepers consumed more calories at dinner and after 8:00 PM, had higher fast food, full-calorie soda and lower fruit and vegetable consumption.

Therefore, I think the results of the study can be interpreted to mean that there is a correlation between eating/sleeping late and a poor diet.

Furthermore, back in 2006, the same research team conducted a study in which monkeys were fed a diet similar to the average (i.e., high-fat) diet common in the USA. The only variable was the time of day at which the monkeys were fed. With all else remaining constant, the researchers found no correlation between weight gain and time of feeding.

It was really interesting to see that the monkeys who ate most of their food at night were no more likely to gain weight than monkeys who rarely ate at night. This suggests that calories cause weight gain no matter when you eat them.

While I'm on a roll here, let me quickly dispel yet another myth. I know many people who adhere to the strict "calorie-in/calorie-out" nutritional theory. This seems particularly popular among mathy/engineering types. The idea is that if your body burns fewer calories than it takes in through food, then it will gain weight. This theory, in general, is fallacious. The body doesn't necessarily consume all of the calories in the food we stick down our throats. The time between meals will, in effect, affect the way one's body "decides" to digest the food. Furthermore, as this study and others like it suggest, not all types of calorie sources are equal when it comes to how the body processes them!

In conclusion, if you’re overweight, it’s not necessarily legitimate to blame your late-night eating schedule. You can’t even blame your high calorie diet (only the types of calories you choose to eat).

## The Economics of Eating Poorly

### Is it cheaper to eat fast food than to cook a meal from scratch?

Is it cheaper to eat fast food than to cook a meal from scratch? Answering such a question is difficult, given that costs will vary greatly from country to country. Why? Well, raw commodity prices are very volatile due to varying government subsidies, differences in climate, extreme climatic events, supply chains, &c.

The supermarket industry in the US is extremely competitive. SuperValu, for example, is a giant corporation that owns many supermarket chains. They are lucky to eke out a 1.5% profit margin. (In other words, at most 1.5% of their gross income is profit.) That means they could lower their prices by at most ~1.5% without taking a loss. Bulk sellers like Costco and even the giant Wal-Mart are lucky to reach 3%. Successful fast food restaurants like McDonalds, on the other hand, easily reach a profit margin of 20%. That means, in effect, McDonalds could reduce their prices by ~20% and still stay afloat.

Why is this? One major reason is that companies like McDonalds can and do have complete vertical integration of their supply chains: McDonalds raises their own cattle, grows their own potatoes, transports their own raw ingredients, and largely owns the real estate of their restaurants. In fact, McDonalds even makes a profit off of leasing their real estate to McDonalds franchises. That flexibility is one reason why a Big Mac will be worth the equivalent of US\$10 in Brazil while the same Big Mac will be worth US\$1.50 in Croatia. Supermarkets don't really have much opportunity for vertical integration unless they actually buy and operate their own farms and suppliers.

So, is eating fast food really cheaper? As I mentioned in my previous blog entry about food deserts, in the US there is a definitive link between poverty, obesity, and lack of proximity to a supermarket. There was also a study in the UK that discovered a correlation between the density of fast food restaurants and poverty:

Statistically significant positive associations were found between neighborhood deprivation [i.e., poverty] and the mean number of McDonald’s outlets per 1000 people for Scotland ($p < 0.001$), England ($p < 0.001$), and both countries combined ($p < 0.001$). These associations were broadly linear with greater mean numbers of outlets per 1000 people occurring as deprivation levels increased.

Let's have some fun now and look at the Consumer Price Index (CPI), which estimates average expenditures for various goods over a month. The current CPI in the US for food and beverages is \$227.50 (meaning that the average consumer spends \$227.50 per month on food and beverages). Now let's assume that the cost of non-fast-food is cheaper than eating fast food and proceed with a proof by contradiction. Under this assumption, the CPI of \$227.50 is obviously a lower bound on the amount one would spend if one only ate fast food (since the CPI includes all types of food-related expenditures). This equates to about \$8 per day. In 2008, the average price of a Big Mac in the US was \$3.57, and it is certainly possible for one to subsist off of two Big Macs per day. That's especially true since a Big Mac is one of the more expensive items on the McDonalds menu. A McDouble, for example, costs only \$1. This is a contradiction: we have shown that it is in fact possible to live off of less than \$8 a day of fast food, thus breaking the lower bound and invalidating our assumption that eating non-fast-food is cheaper than eating fast food. ∎ This suggests that it is at least plausible that eating fast food could be cheaper than grocery shopping in the US.

## Food Deserts

### In which I argue that not only desserts but also deserts are correlated with obesity.

What is a food desert? That's "desert" as in "Sahara," not the sweet thing you eat at the end of a meal. According to Wikipedia, it's any area in the industrialised world where healthy, affordable food is difficult to obtain, specifically associated with low income. There is a good amount of debate about whether such things even exist. The problem, as I see it, is that this definition is very subjective: One can always have access to high-quality or nutritional food if one is willing to spend enough time to travel to it.
If a person lives a couple of miles away from a grocery store but has "access" via expensive (to them) public transport, does that constitute "access"? Technically, of course, yes. But the question I think we should really be asking is: Is there any statistical correlation between proximity to full-service grocery stores, obesity, and poverty? I think the answer to that is "yes". Answering why is a much more difficult (and perhaps open) question. Read on to hear my reasoning.

## Pronunciation of foreign words in American vs. British English

### In which my etymological conjectures are repudiated by Peter Shor.

One of the differences between modern US English (hereafter referred to as "American English") and British English is the way in which we pronounce foreign words, particularly those of French origin and/or related to food. For example, Americans:

- drop the "h" on "herb" and "Beethoven";
- rhyme "fillet" and "valet" with "parlay" as opposed to "skillet"; and
- pronounce foods like paella as /paɪˈeɪə/ (approximating a Castilian or Mexican accent), whereas the British pronounce it as /pʌɪˈɛlə/.

In general, the British seem to pronounce foreign/loan words as they would be phonetically pronounced if they were English, whereas Americans tend to approximate the original pronunciation. I've heard some people claim that this trend is due to the melting-pot nature of America, and others claim that the French pronunciation, in particular, is due to America's very close relations with France during its infancy. This latter hypothesis, however, seems to be contradicted by the following:

Avoid the habit of employing French words in English conversation; it is in extremely bad taste to be always employing such expressions as ci-devant, soi-disant, en masse, couleur de rose, etc. Do not salute your acquaintances with bon jour, nor reply to every proposition, volontiers.
In speaking of French cities and towns, it is a mark of refinement in education to pronounce them rigidly according to English rules of speech. Mr. Fox, the best French scholar, and one of the best bred men in England, always sounded the x in Bourdeaux, and the s in Calais, and on all occasions pronounced such names just as they are written.

I wondered: At what point did the USA drop the apparent British convention of pronouncing foreign words as they are spelled? I asked this question on English.sx, to little fanfare; most people—including Peter Shor!!!1—had trouble accepting that this phenomenon even exists. Therefore, I extend my question to the Blogosphere!

I did some more digging, and it is interesting to note that there seems to be a trend in "upper-class" (U) English to substitute words that have an obvious counterpart in French with words that are either of Germanic origin or that do not have a direct equivalent in modern French. For example, in U English:

- scent is preferred over perfume;
- looking glass is preferable to mirror;
- false teeth is preferable to dentures;
- graveyard > cemetery;
- napkin > serviette;
- lavatory > toilet;
- drawing-room > lounge;
- bus > coach;
- stay > corset;
- jam > preserve;
- lunch > dinner (for a midday meal); and
- what? > pardon?

This is admittedly a stretch, but perhaps there is some connection between the US's lack (and some might say derision) of a noble class and its preference toward non-U/French pronunciation?

## Are no two snowflakes alike?

### A mathematical argument for the negative.

I think the claim that "no two snowflakes are alike" is fairly common. The idea is that there are so many possible configurations of snow crystals that it is improbable that any two flakes sitting next to each other would have the same configuration. I wanted to know: Is there any truth to this claim?
Kenneth Libbrecht, a physics professor at Caltech, thinks so, and makes the following argument:

Now when you look at a complex snow crystal, you can often pick out a hundred separate features if you look closely.

He goes on to explain that there are $10^{158}$ different configurations of those features. That's a 1 followed by 158 zeros, which is about $10^{70}$ times larger than the total number of atoms in the universe. Dr. Libbrecht concludes:

Thus the number of ways to make a complex snow crystal is absolutely huge. And thus it's unlikely that any two complex snow crystals, out of all those made over the entire history of the planet, have ever looked completely alike.

Being the skeptic that I am, I decided to rigorously investigate the true probability of two snowflakes having possessed the same configuration over the entire history of the Earth. Read on to find out.
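The question is really a birthday problem: given $N$ equally likely configurations and $k$ snowflakes, the probability of at least one repeat is approximately $1 - e^{-k(k-1)/2N}$. As a quick sketch of the kind of calculation involved (the flake count and the "simple crystal" configuration count below are assumed, illustrative orders of magnitude; only the $10^{158}$ figure comes from Libbrecht):

```python
import math

def collision_probability(k: float, n_configs: float) -> float:
    """Birthday-problem approximation: probability that at least two of
    k samples drawn uniformly from n_configs configurations coincide."""
    exponent = -k * (k - 1) / (2 * n_configs)
    # -expm1(x) computes 1 - exp(x) without losing precision when the
    # probability is astronomically small.
    return -math.expm1(exponent)

# Libbrecht's figure for complex crystals, taken at face value:
complex_configs = 1e158

# Assumed order-of-magnitude count of snowflakes over Earth's history
# (a hypothetical figure for illustration):
flakes = 1e34

p_complex = collision_probability(flakes, complex_configs)

# With far fewer distinguishable features, a repeat becomes near-certain;
# e.g., a hypothetical simple crystal with only ~1e20 configurations:
p_simple = collision_probability(flakes, 1e20)
```

The result is extraordinarily sensitive to the configuration count, which is exactly why the $10^{158}$ estimate deserves skeptical scrutiny.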

## Revisiting the Ballmer Peak

### Might the Ballmer Peak be an actual phenomenon?

Last year I made a rather esoteric joke about a supposed phenomenon called the "Ballmer Peak" that was popularized by a web comic. The idea is that alcohol, at low doses, can actually increase the productivity of a computer programmer. The original claim was obviously in jest since, among other reasons, Randall Munroe (the web comic's author) claimed that this peak occurs at exactly 0.1337% blood alcohol content. This got me thinking: Could there be any truth to this claim? Read on to find out; the results may surprise you.