Saturday, April 28, 2018

Science of fat storage (The Obesity Code)

[Dear J., you might find this interesting. -M.]
So... I'm normally pretty skeptical of nutritional fads and medical findings, because I often feel like they're not really grounded in good science. But I feel like I can recognize solid science when I see it.

My friend Eric Mueller recommended a book to me called The Obesity Code, in which the author (a kidney/diabetes specialist by the name of Dr. Jason Fung) advances the hypothesis that obesity is essentially a hormonal imbalance (high insulin levels leading to insulin resistance), and I find the claims impressive, consistent with my own experiences, and persuasive.

One key fact, supported by many, many case studies (Dr. Fung describes so many case studies that it starts getting tedious), is that your brain can adjust both appetite (calories in) and metabolism (calories out) and tries to balance them against each other. If you try to feed someone more food than they'd naturally eat, not only do they lose their appetite and resist eating that much, but even if you do get them to overeat, their metabolic rate goes up too. (I'm oversimplifying, because there are multiple ways the body consumes energy, from raising your temperature to motivating movement to depositing it in stool.) If you decrease food consumption, metabolism slows too--not perfectly, but pretty close, which is why obesity is something that generally takes years to develop instead of being something that can happen to you in a single month of really bad decisions in college. And it's also why trying to "Eat Less [calories], Move More" practically never results in long-lasting weight loss, even in studies where we measure the participants and know that they really are exercising more and eating less.

But the argument (as I understand it) goes that this metabolic homeostasis (keeping calorie intake and metabolism close to each other) is regulated by hormones (chemical signals sent to each cell of the body), and there's excellent evidence (detailed in the book) that the key hormone is insulin. High insulin levels tell cells to store energy; low insulin levels tell cells to release energy; insulin levels rise after a meal (especially carbohydrates without fiber), telling your body to store food. Normally that is fine, but here's the sticky point: when your insulin levels are high all the time from constant eating, your liver and muscles develop insulin resistance (they respond less and less strongly to a given dose of insulin), but your brain, which regulates your appetite and metabolism, does not. Over a period of years or decades, your insulin levels get higher and higher as the insulin resistance in your muscles and liver gets higher and higher (because more insulin is needed to do the job), while the parts of your body controlled by your brain get more and more inclined to store energy. Then if you ever manage to lose weight, your insulin resistance outside your brain is still as high as ever, so your body still keeps insulin levels high, so your non-insulin-resistant brain is still very willing to return your body to its previous weight.

I'm about halfway through the book so I haven't finished reading all the science yet, but it's obvious at this point what the conclusion is: in addition to eating fewer foods that boost insulin production and insulin resistance (i.e., eat fiber and whole grains, and avoid sucrose and fructose like the plague, especially soft drinks, because fructose boosts your liver's insulin resistance due to being metabolized only in the liver)--in addition to that, it's important to be hungry more often. 21st century humans spend waaaay too much time in a "fed" state (insulin excess), and often the only time we spend in an "unfed" state (insulin deficiency) is when we're asleep. For decades I've thought that you need at least 500 calories per day or so to prevent muscle loss--I'd been told that your brain can't metabolize fat, so your body winds up burning muscle to fuel your brain, but that turns out to be untrue. There is nothing wrong with occasional fasting, and the world record is a guy who (back in 1973) went 382 days without eating, under medical supervision, and came out of it much thinner but perfectly healthy and ambulatory. [ref: http://cristivlad.com/total-starvation-382-days-without-food-study/] So clearly being hungry isn't going to kill you or liquidate your muscles.

So that's basically it, apparently. Reduce insulin levels by eating fewer highly processed, sugary carbohydrates, and by being hungry more often and for longer periods of time. (Dr. Fung recommends skipping or delaying breakfast, which is something I like doing anyway.) These are things, I think, that I can actually be quite good at doing, now that I understand specifically what I'm trying to accomplish. I've always been better at absolutism than moderation, and "don't eat right now" is a much easier message for me to get my head around than "Eat right now, but don't eat very much." (How much is "very much"? If eating is sort of bad right now, why am I eating at all? Etc.)

--
If I esteem mankind to be in error, shall I bear them down? No. I will lift them up, and in their own way too, if I cannot persuade them my way is better; and I will not seek to compel any man to believe as I do, only by the force of reasoning, for truth will cut its own way.

"Thou shalt love thy wife with all thy heart, and shalt cleave unto her and none else."

Friday, April 27, 2018

Black Panther movie

I just realized one of the reasons I didn't enjoy Black Panther very much. Movies about fighting for its own sake bore me, and Black Panther was set up as a movie about who gains political control of a country whose system of government centers on who can beat whom in a fight. There were a lot of fights during the movie, most of which didn't accomplish anything, and then the dramatic question ("Who will retain control of the country?") was resolved in a fistfight, because of course it had to be, since that was how the country's rules were set up.

No wonder I didn't really connect with it emotionally as much as I was hoping to.

~B.C.


Saturday, April 21, 2018


[Here's one passage that caught my attention.

~Max]
http://journals.plos.org/ploscompbiol/article?id=10.1371/journal.pcbi.1005268

...What constitutes an understanding of a system? Lazebnik's original paper argued that understanding was achieved when one could "fix" a broken implementation. Understanding of a particular region or part of a system would occur when one could describe so accurately the inputs, the transformation, and the outputs that one brain region could be replaced with an entirely synthetic component. Indeed, some neuroengineers are following this path for sensory [26] and memory [27] systems. Alternatively, we could seek to understand a system at differing, complementary levels of analysis, as David Marr and Tomaso Poggio outlined in 1982 [28]. First, we can ask if we understand what the system does at the computational level: what is the problem it is seeking to solve via computation? We can ask how the system performs this task algorithmically: what processes does it employ to manipulate internal representations? Finally, we can seek to understand how the system implements the above algorithms at a physical level. What are the characteristics of the underlying implementation (in the case of neurons, ion channels, synaptic conductances, neural connectivity, and so on) that give rise to the execution of the algorithm? Ultimately, we want to understand the brain at all these levels.

In this paper, much as in systems neuroscience, we consider the quest to gain an understanding of how circuit elements give rise to computation. Computer architecture studies how small circuit elements, like registers and adders, give rise to a system capable of performing general-purpose computation. When it comes to the processor, we understand this level extremely well, as it is taught to most computer science undergraduates. Knowing what a satisfying answer to "how does a processor compute?" looks like makes it easy to evaluate how much we learn from an experiment or an analysis.

...Lesion studies allow us to study the causal effect of removing a part of the system. We thus chose a number of transistors and asked if they are necessary for each of the behaviors of the processor (Fig 4). In other words, we asked: if we removed each transistor, would the processor still boot the game? Indeed, we found a subset of transistors that makes one of the behaviors (games) impossible. We can thus conclude they are uniquely necessary for the game—perhaps there is a Donkey Kong transistor or a Space Invaders transistor. Even if we can lesion each individual transistor, we do not get much closer to an understanding of how the processor really works.

This finding of course is grossly misleading. The transistors are not specific to any one behavior or game but rather implement simple functions, like full adders. The finding that some of them are important while others are not for a given game is only indirectly indicative of the transistor's role and is unlikely to generalize to other games. Lazebnik [9] made similar observations about this approach in molecular biology, suggesting biologists would obtain a large number of identical radios and shoot them with metal particles at short range, attempting to identify which damaged components gave rise to which broken phenotype.
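[To make the lesion experiment concrete for myself, here's a toy sketch of it in Python--entirely my own illustration, not the authors' code. The "processor" is just an 8-bit ripple-carry adder, the lesionable elements are its eight full adders, and the two "games" are arithmetic checks that happen to exercise different bits. Every element participates in both games, yet single lesions look game-specific:]

```python
def full_adder(a, b, cin, lesioned):
    # A 1-bit full adder; a lesioned element just outputs zeros.
    if lesioned:
        return 0, 0
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def add8(x, y, lesions):
    # Ripple-carry 8-bit adder built from 8 full adders (the elements
    # we lesion); `lesions` is the set of adder indices knocked out.
    carry, out = 0, 0
    for i in range(8):
        s, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry, i in lesions)
        out |= s << i
    return out

def game_a(lesions):
    # "Donkey Kong": boots iff low-bit arithmetic still works.
    return add8(1, 1, lesions) == 2

def game_b(lesions):
    # "Space Invaders": boots iff high-bit arithmetic still works.
    return add8(128, 128, lesions) % 256 == 0 and add8(64, 64, lesions) == 128

# Lesion each element individually and record which "games" survive.
for i in range(8):
    print(i, game_a({i}), game_b({i}))
```

Knocking out adder 0 breaks only game A, and knocking out adder 7 breaks only game B, which would tempt a lesion-mapper to label them the "Donkey Kong transistor" and the "Space Invaders transistor"--even though both games run through the same shared circuitry.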

[Also, don't miss this comment on alpha waves]

In neuroscience there is a rich tradition of analyzing the rhythms in brain regions, the distribution of power across frequencies as a function of the task, and the relation of oscillatory activity across space and time. However, the example of the processor shows that the relation of such measures to underlying function can be extremely complicated. In fact, the authors of this paper would have expected far more peaked frequency distributions for the chip. Moreover, the distribution of frequencies in the brain is often seen as indicative about the underlying biophysics. In our case, there is only one element, the transistor, and not multiple neurotransmitters. And yet, we see a similarly rich distribution of power in the frequency domain. This shows that complex multi-frequency behavior can emerge from the combination of many simple elements. Analyzing the frequency spectra of artifacts thus leads us to be careful about the interpretation of those occurring in the brain. Modeling the processor as a bunch of coupled oscillators, as is common in neuroscience, would make little sense.
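[Same caveat--this is my own toy illustration with arbitrarily chosen periods, not the paper's analysis. Summing just three binary on/off elements already smears power across many frequency bins, even though nothing in the system is an oscillator tuned to those frequencies:]

```python
import cmath

N = 256
# Three binary elements toggling with different periods; the summed
# signal is the "recording" we analyze.
signal = [sum((t // p) % 2 for p in (3, 7, 16)) for t in range(N)]

def power(signal, k):
    # Magnitude-squared of the k-th DFT coefficient (naive O(N) DFT).
    n = len(signal)
    c = sum(x * cmath.exp(-2j * cmath.pi * k * t / n)
            for t, x in enumerate(signal))
    return abs(c) ** 2

# Count non-DC frequency bins carrying non-trivial power.
busy = sum(1 for k in range(1, N // 2) if power(signal, k) > 1.0)
print(busy, "of", N // 2 - 1, "bins carry power")
```

Instead of three clean peaks, power shows up in many bins (square waves carry harmonics, and periods that don't divide the window leak across neighboring frequencies), so the spectrum looks far richer than the underlying mechanism.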

[And this part about culture, funding, and goals moving forward is important I think.]

...Culturally, applying these methods to real data, and rewarding those who innovate methodologically, may become more important. We can look at the rise of bioinformatics as an independent field with its own funding streams. Neuroscience needs strong neuroinformatics to make sense of the emerging datasets and known artificial systems can serve as a sanity check and a way of understanding failure modes.

We also want to suggest that it may be an important intermediate step for neuroscience to develop methods that allow understanding a processor. Because they can be simulated in any computer and arbitrarily perturbed, they are a great testbed to ask how useful the methods are that we are using in neuroscience on a daily basis.


Friday, April 13, 2018

When Upon Life's Billows: Counting My Blessings

Money can't buy happiness, but a lack of money can sure buy insecurity, especially if you have no family and no friends with money. I've been really poor, but I'm not anymore, and I have to admit that I really enjoy not being poor.

I was talking to someone the other day about social capital and how nerve-wracking it must be to be the first one in your family to ever go to college, and have to learn the ropes on your own. (How do you get a job? How do you afford tuition? What happens if you're struggling in a class? Etc.)

Anyway, I feel very lucky to be as comfortable as I am. I'm grateful to Heavenly Father for giving me opportunities. I am waaaaay more comfortable living among human beings this way than I would be living on my own on a desert island with whatever furniture I could carve out of coconuts.

~B.C.


Monday, April 2, 2018

In Defense of Mainstream Science

[Hang onto this for me please. -Max]

http://quillette.com/2017/06/11/no-voice-vox-sense-nonsense-discussing-iq-race/ by Richard Haier

Long before the Trump Administration, alternative facts and misrepresentations have permeated attacks on the science of human intelligence. Even reasonable discussions are not immune. The recent piece posted on VOX (May 18, 2017) by Turkheimer, Harden & Nisbett (THN) excoriates Sam Harris about his recent podcast discussion with Charles Murray, author of the 1994 book The Bell Curve (co-author Richard Herrnstein died before the book was published).

Sam Harris is not an expert in intelligence research but I am. After hearing the podcast, I emailed congratulations to him and Murray for conducting an informative discussion of complex and controversial issues. Every point they enumerated as having broad support among intelligence researchers is correct. There is an overwhelming weight of evidence to support the ideas that intelligence is something real, it can be reliably and validly measured without bias, and the measures predict many real world variables that are important to most human beings. There also is broad agreement that one component of intelligence is a general ability (the g-factor) to reason and problem-solve across a wide range of situations. There also is overwhelming evidence that genes play a significant role in explaining differences in intelligence among individuals.

These points were reasonably well established when The Bell Curve was published, as evidenced by a task force of prominent researchers constituted by the American Psychological Association in 1995 (report published in 1996), hardly a right-wing group. And, as Murray noted in the podcast, all these findings have been validated even further by subsequent research with much larger samples and more powerful research designs.

*snip*
