Friday, November 26, 2021

Low Glycemic Bread

A change from my usual posts:

For some time I have been on a low glycemic diet. Bread made from white flour has a fairly high glycemic index and glycemic load. I found a number of recipes online that claimed to produce a tasty low glycemic bread and tried them. The best of them I would be willing to eat if I were hungry and had nothing better available, but none, despite the claims on their web pages, came close to the quality of ordinary bread. I also bought one variety of low glycemic bread online — better, but still not very good.

I decided to see if I could invent something better.

My standard bread recipe is a sourdough loosely based on a recipe in the King Arthur Flour cookbook. Sourdough bread has a lower glycemic index than yeast bread. Whole wheat flour has a lower glycemic index than white flour. Almond flour has a glycemic index of about zero. It also has no gluten, which means bread made with it (or coconut flour, or chickpea flour, or ...) won't rise.

The obvious solution is to add gluten. Wheat gluten has a glycemic index only a little lower than whole wheat flour but a much lower percentage of carbohydrates, hence a much lower glycemic load, which is what really matters. 

Here is the recipe for one loaf:

3/4 c whole wheat flour
3/4 c almond flour
1/2 c wheat gluten
1/3 c raisins
1 t salt
1/3 lb sourdough starter
1/2 c water

Mix together the flours and gluten.

Stir the sourdough starter into the water and add to the flours, stirring to mix. 

Let it sit for half an hour.

Add salt and raisins, knead smooth (this takes only a minute or two).

Let it sit for an hour.

Fold it.

Let it sit for an hour.

Form into a boule (look up how to do it; it is hard to describe, but you end up with the dough in a ball). The one tricky bit is that you want to try to get all the raisins into the interior, since if they are on the surface they may burn.

Cover and let it rise for two hours.

Put in a 450°F oven, bake until the internal temperature is 205°F.

Let it cool. Eat it.

It isn't the best bread I ever ate, but it is better than most store-bought bread. By my calculation the glycemic load from the flours and gluten is a little less than a third of what it is for the two cups of white flour in a loaf of my standard bread. That does not include the raisins, which are the same for either recipe, but you can leave them out if you want — I like raisin bread.
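
For anyone who wants to check that sort of comparison for himself, here is a rough sketch of the arithmetic in Python. The glycemic index and carbohydrate figures below are placeholder assumptions, not measured values for these particular ingredients; substitute numbers from whatever nutrition source you trust.

```python
# Rough glycemic-load comparison: the low-GI flour mix vs. two cups of white flour.
# Glycemic load = GI/100 * grams of available carbohydrate.
# All GI values and carbs-per-cup figures below are assumed placeholders.

INGREDIENTS = {  # name: (assumed glycemic index, assumed net carbs per cup, grams)
    "white flour":       (71, 95),
    "whole wheat flour": (69, 74),
    "almond flour":      (0, 10),
    "wheat gluten":      (55, 16),
}

def glycemic_load(name, cups):
    gi, carbs_per_cup = INGREDIENTS[name]
    return gi / 100 * carbs_per_cup * cups

low_gi_loaf = (glycemic_load("whole wheat flour", 0.75)
               + glycemic_load("almond flour", 0.75)
               + glycemic_load("wheat gluten", 0.5))
standard_loaf = glycemic_load("white flour", 2.0)

print(f"flours + gluten in the low-GI loaf: {low_gi_loaf:.0f}")
print(f"two cups of white flour:            {standard_loaf:.0f}")
print(f"ratio:                              {low_gi_loaf / standard_loaf:.2f}")
```

With these placeholder numbers the ratio comes out at roughly a third; the point is the method, not the particular figures.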

All the News that Fits We Print

According to the New York Post and some other conservative media, Darrell Brooks, the man who drove his car into a parade in Waukesha killing at least six people, had pro-Hitler, anti-Semitic and violently anti-white material on his social media account. Googling on [Darrell Brooks Hitler] I found no mention of it in any mainline source. The New York Times has a long story on Brooks with no mention of those facts — and the claim that "The suspect’s motivations are unclear." The Wall Street Journal has had multiple stories on Brooks but I cannot find any mentioning the social media.

It is possible that the conservative sources are lying but I think it unlikely, since if they were I would expect the NYT or other left of center media to call them on it. It looks very much as though the mainline media are deliberately hiding facts they don't want their readers to know.

Instead of Affirmative Action

Affirmative action in college admissions consists, at present, of applying lower standards to black applicants than to white, admitting black students who would be rejected on the basis of grades, SAT scores, and the like, if they were white. Putting aside the question of whether racial discrimination is good or bad, there are at least two serious problems with that policy, seen from the standpoint of the people it is supposed to benefit. 

The first was pointed out by Thomas Sowell in Choosing a College. A talented black student at the 90th percentile of the population in mathematical ability is admitted to MIT and finds himself at the bottom of a class where everyone else is at the 99th percentile. He would learn more at a slightly less selective school where classes were designed for people more like him. The same argument has been made more recently by Richard Sander in the context of law school admissions in Mismatch: How Affirmative Action Hurts Students It's Intended to Help, and Why Universities Won't Admit It. He offers evidence suggesting that if law schools abandoned affirmative action fewer black students would enroll but more would pass the bar, hence there would be more black lawyers. 

[A summary of some of the arguments and evidence on this issue.]

The second problem is that affirmative action helps the blacks least in need of it. Colleges and law schools are looking for the most qualified black students they can find. The child of a black surgeon in the suburbs is much more likely to meet that requirement than the child of a black janitor or a single mother in the inner city.

In the course of a recent discussion on Facebook, a commenter suggested a better way of doing what affirmative action is supposed to do: help disadvantaged kids. His proposal was that college admissions should favor applicants, black or white, from a low SES (socioeconomic status) background, those being the students who are most disadvantaged. In response to my pointing out the problems raised by letting students into schools that would be too difficult for them, he responded:

more likely you get kids that work harder and overcome obstacles better than their more privileged peers.

Another commenter proposed a way in which a school could decide whether favoring low SES applicants would get better students and if so by how much to favor them: 

there is simple quantitative solution to this problem. If two students apply to college with identical grades and test scores but one has faced and overcome serious obstacles and the other has not, I would guess that the student who overcame the obstacles would be a better student and do better in college. But the college does not have to guess. It has data. So colleges could easily measure this by developing a simple rubric and ranking applicants on "hardship" on a scale (say) of 1 - 10. Then once students are admitted and their college grades are in, run a regression of college grades on high school grades, the specific high school, test scores, and hardship. If you get statistically significant coefficients, that would determine how much weight to give hardship.
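
For concreteness, here is a minimal sketch of the regression that commenter describes, in Python with statsmodels. The data frame, column names, and hardship scale are all hypothetical, and a real analysis would have to deal with the fact that only admitted students have college grades.

```python
# Sketch of the proposed calibration: regress college GPA on conventional
# credentials plus a 1-10 hardship rubric score. All data here are made up;
# a real analysis would use the school's own admissions records.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "college_gpa": [3.2, 3.7, 3.5, 2.9, 3.8, 3.1, 3.6, 3.4, 3.0, 3.9],
    "hs_gpa":      [3.5, 3.9, 3.8, 3.2, 4.0, 3.4, 3.7, 3.6, 3.3, 3.9],
    "sat_total":   [1250, 1420, 1380, 1180, 1480, 1290, 1400, 1340, 1220, 1460],
    "hardship":    [7, 2, 5, 8, 1, 6, 3, 4, 9, 2],   # 1-10 rubric score
    "high_school": ["A", "B", "A", "C", "B", "C", "A", "B", "C", "A"],
})

# The specific high school enters as a categorical fixed effect via C();
# the hardship score enters linearly.
model = smf.ols(
    "college_gpa ~ hs_gpa + sat_total + hardship + C(high_school)",
    data=df,
).fit()
print(model.summary())

# If the hardship coefficient is positive and statistically significant, it
# estimates how much extra college GPA a one-point hardship score predicts
# with the conventional credentials held fixed -- which is the weight the
# commenter wants admissions to give it.
```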

Replacing the present version of affirmative action with that policy would also solve two other problems. The first is that since employers know that schools engage in affirmative action they know that the fact that a black student went to Harvard, undergraduate or law school, is weaker evidence of his ability than the fact that a white student did. The second is that the present policy very nearly guarantees that white students will observe their black fellow students to be mostly at the bottom of the class and draw the obvious conclusion, encouraging racial prejudice. There are black law school students just as smart as the white students at Santa Clara University but, with rare exceptions, they are not at SCU because they got admitted to Stanford. 

Blind discrimination in favor of lower SES students could have the same effects, but not discrimination based on and calibrated to evidence that they were better students than their academic credentials showed.

Monday, November 22, 2021

Making Life Harder for Home Schooled Students

The College Board, the organization that runs the SAT exams, has announced that it will no longer be giving SAT subject exams. That will be a problem for home schooled students who want a way of convincing colleges to admit them, especially for ones who hope to be accepted by an elite school. Home schooled students don't have high school grades or teachers' recommendations; unless they have some extraordinary accomplishment to show, such as publishing a novel or winning a national chess championship, they largely depend on objective tests to convince schools to accept them.

The replacement suggested by the College Board in its explanation of the change is the AP exam. Home schooled students take AP exams by finding a local school willing to let them do so. Assuming they can find one, there is still a problem. The AP exam, with a score range of 1-5, is not nearly as good a way for a student to prove his ability as an SAT exam with a score range of 200-800. One percent of students got an 800 on the Literature SAT, 9.3% a 5 on the English language and literature AP. 3% got an 800 on the American History SAT, 13% a 5 on the United States History AP. An 800 on either SAT subject exam is much stronger evidence of a student's knowledge than a 5 on the corresponding AP exam.

How much of a problem is that for a home schooled student hoping to get into a selective school? To answer that we need to know how high a score on the SAT subject exams elite schools expected in the past, and what score on an SAT subject exam a 5 on an AP exam corresponds to. The most recent year for which I could find figures on the range of scores that students at elite schools typically got was 2018. According to a report from that year, selective schools expected scores in the upper half of the 700's. For the Literature SAT, 750 was the 91st percentile; for American History, the 83rd percentile. So getting a 5 on either AP exam was evidence that you were within the range of what such schools expected but might be near its bottom, hence only weak evidence that the school should accept you.

That might not matter for an applicant who had lots of other ways of proving his ability — but home schooled students mostly don't. To persuade a school to accept them, the evidence they can present has to be very strong. The switch from the SAT subject exams to the AP exams makes that impossible. However able a student is, the highest score he can get is a five.

Sunday, November 14, 2021

Impossible Beliefs

I have had two past blog posts making claims that readers could check at first hand, in each case claims that someone on one side of an argument was saying something he knew was untrue. As best I could tell, very nearly nobody on the same side was prepared either to agree with the claim or to offer a rebuttal that I thought a neutral party would have taken seriously.

That people on one side or another of a politically loaded dispute are sometimes willing to lie in support of their side is not surprising. What I found surprising was how unwilling people were to concede that, in this particular case, someone supporting their side had done so. It was an issue not of consistent beliefs but of loyalty; conceding that someone on one's own side had lied would not have required any change in one's underlying beliefs.

It was the orthodox side of the climate argument that my claims offended. I am now looking for one or more similar cases from the other side, cases where someone arguing for the red tribe/Republican/right wing side of some issue closely linked to tribal identity, not necessarily climate change, said something that could be shown to be false with evidence directly observable by ordinary people and almost nobody on his side was willing to admit it.

The obvious candidate is the claim by Trump that the 2020 election was stolen. The problem with that case is that the evidence that it is false is second hand, primarily through the mass media, so someone sufficiently distrustful of the media can reject it. Are there better examples?

For the curious, here are my posts, including the comment threads:

A Climate Falsehood You Can Check for Yourself

Global Sea-ice, Deceptive Reporting, and Truthful Lies

There were multiple posts on each issue, but I think those two are sufficient to demonstrate the argument and the responses of those who did not wish to believe it.

I am not interested in rearguing those cases here; I already know there are people who reject my arguments. What I am looking for, for something I am currently writing, are new cases on the other side of the current political divide.

Wednesday, November 10, 2021

Infection Rates and Vaccination

Unvaccinated people are about 29 times more likely to be hospitalized with Covid-19 than those who are fully vaccinated, according to a study released Tuesday by the Centers for Disease Control and Prevention.

The new study, published in the CDC’s Morbidity and Mortality Weekly Report, also found that unvaccinated people were nearly five times more likely to be infected with Covid than people who got the shots. (CNBC)

The question is not whether it is true but how we could know. Hospitalization and death are observable, countable events, but infection is only observed when someone is tested. The U.K. has a program of randomly testing people in order to learn, among other things, how many of them were infected with Covid (REACT-1). But the CDC appears to be just counting people known to be infected.

Vaccination is much stronger protection against a serious case of Covid than against a mild case, as shown by the difference between the first paragraph quoted above and the second. One would therefore expect a much larger fraction of vaccinated infections to be asymptomatic. If an infection is sufficiently asymptomatic the victim does not notice it and so has no reason to get tested. Unless the CDC is somehow compensating for that problem, its count of the infection rate for vaccinated people is not only too low, it is too low by more than its count of the rate for the unvaccinated, hence it is overestimating the protection that vaccination provides against infection.
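
To see how large the resulting bias can be, here is a toy calculation in Python. Every number in it is invented; the point is only that if vaccinated infections are more often asymptomatic, and asymptomatic infections are rarely tested, the ratio of detected infection rates will overstate the true protection.

```python
# Toy illustration of how counting only detected infections can overstate
# the protection vaccination gives against infection. All numbers are invented.

pop_unvax = pop_vax = 100_000

true_rate_unvax, true_rate_vax = 0.10, 0.05    # assumed true infection rates
sympt_unvax, sympt_vax = 0.60, 0.30            # assumed share of infections with symptoms
detect_sympt, detect_asympt = 0.80, 0.10       # assumed chance an infection gets tested and counted

def detected_infections(pop, true_rate, sympt_share):
    infections = pop * true_rate
    return infections * (sympt_share * detect_sympt
                         + (1 - sympt_share) * detect_asympt)

d_unvax = detected_infections(pop_unvax, true_rate_unvax, sympt_unvax)
d_vax = detected_infections(pop_vax, true_rate_vax, sympt_vax)

print(f"true ratio of infection rates (unvax/vax):     "
      f"{true_rate_unvax / true_rate_vax:.1f}x")
print(f"ratio of detected infection rates (unvax/vax): "
      f"{(d_unvax / pop_unvax) / (d_vax / pop_vax):.1f}x")
```

With these invented numbers the vaccine cuts the true infection rate in half, but the detected counts make it look more than three times as protective.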

The effect of vaccination on hospitalization and death rates is what you need to persuade someone that it is in his private interest to be vaccinated — and, on those grounds, I am. The effect on the infection rate is what you need in order to show that vaccinating one person protects others, and hence to defend forcing people to get vaccinated. The CDC supports vaccination requirements, which is why I doubt that they have gone out of their way to correct the overestimate of vaccination's protection against infection that comes out of simply counting known infections.

What the real protection against infection is I do not know. The estimate from the UK study (ignore the headline) is 50 to 60%, much lower than the CDC claim, but that is a different population and a different mix of vaccines.

A second argument is the claim that someone vaccinated and infected is less contagious than someone unvaccinated and infected. I have seen various claims as to whether that is or is not true, based on measured viral levels in the infected, but we now have something better, a study based on observed infections of people who were close contacts of infected individuals:

Unfortunately, the vaccine’s beneficial effect on Delta transmission waned to almost negligible levels over time. In people infected 2 weeks after receiving the vaccine developed by the University of Oxford and AstraZeneca, both in the UK, the chance that an unvaccinated close contact would test positive was 57%, but 3 months later, that chance rose to 67%. The latter figure is on par with the likelihood that an unvaccinated person will spread the virus. (Nature)

All of which explains why the effect of vaccination on Covid rates has been much less than many of us expected. Death rates have fallen sharply but infection rates have gone up and down, even in well vaccinated populations such as the U.K. and Israel, much as they did before. The most one can say, at least from a casual look at the data, is that the reduced infection rate from vaccination has roughly balanced the increased infection rate from the Delta variant.

Tuesday, November 02, 2021

Suppose the Republicans Win in Virginia

I am writing this about three hours before the polls close. It seems pretty clear that the Republican candidate will either lose by a small margin or win, in either case signalling a sharp decline in the strength of the Democrats, at least in that state. One interesting question is what the consequences will be. 

The most obvious answer is that if the Democrats are at serious risk of losing the House and Senate because they have pushed too hard in a progressive direction, which is what the news stories on the election seem to imply, they should and will moderate their positions, both nationally and on the state level.

I can see two arguments in the other direction. One is that if they only have a year left, they need to spend that year getting as much as they can of what they want through Congress. The obvious problem is that they need fifty votes plus the VP and don't have them in their own party for anything very far in the direction the progressives want, as has been repeatedly demonstrated in recent weeks. That makes me wonder if there is any way they could pry loose two or three Republican senators, perhaps ones that do not plan to run for reelection. I don't know enough to guess how likely that is, or whether there is some other way of doing it.

The other argument, which is likely to convince progressives and could even be true, is that the problem is not being too progressive but not progressive enough, that moderating their position will lose them more votes on the left, with disappointed progressives staying home, than it will get them in the center.

I don't know enough about the politics, especially within the Democratic party, to offer more than speculation.

One related point occurs to me. The infrastructure bill — the one that is actually about infrastructure — is routinely described as a bipartisan bill, but I don't think that can be true. It has been stalled in the House because a handful of progressive Democrats won't vote for it until they are guaranteed getting their bill passed as well. That is only a problem if the Republicans are almost unanimously against it, in which case it isn't bipartisan. 

At least not in the House.

A Prediction We Will Get to Test

Climate change may affect the production of maize (corn) and wheat as early as 2030 under a high greenhouse gas emissions scenario, according to a new NASA study published in the journal, Nature Food. Maize crop yields are projected to decline 24%, while wheat could potentially see growth of about 17%. (Source)

Most projected effects of climate change are far enough in the future so that the people who made and trumpeted them will be dead, or at least retired, well before we see if they are true, but this one is only nine years in the future. If, as I expect, it turns out to be false, if world maize output continues to grow as it has been doing for a very long time, I plan to announce the fact here — and nobody will notice. 

As an example of how the wording of a news story reflects the biases of the author, note that the decline in maize is "projected" while wheat "could potentially see growth." Both are projections from the same source but the positive one is put in more uncertain terms than the negative.

Following the link to the abstract, I note that:

Mean end-of-century maize productivity is shifted from +5% to −6% (SSP126) and from +1% to −24% (SSP585)—explained by warmer climate projections and improved crop model sensitivities. In contrast, wheat shows stronger gains (+9% shifted to +18%, SSP585), linked to higher CO2 concentrations and expanded high-latitude gains. 

Can any reader point me to news articles or a NASA web page from the previous round of the research trumpeting the fact that both maize and wheat productivity were projected to increase due to climate change? 

I also noticed:

Higher levels of carbon dioxide in the atmosphere have a positive effect on photosynthesis and water retention, increasing crop yields, though often at a cost to nutrition.

Assuming their source is the same one I discussed in an old blog post, "cost to nutrition" means that CO2 fertilization increases yield in calories by more than it increases yield in some other nutrients — two out of ten minerals in wheat, for example — hence lowers the amount of those nutrients per calorie.
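
A worked example with invented numbers, just to make the arithmetic explicit: suppose CO2 fertilization raises caloric yield by 10% but raises the yield of some mineral by only 2%.

```python
# Invented numbers: CO2 fertilization raises calories more than it raises minerals,
# so the mineral content per calorie falls even though total mineral yield rises.
calorie_gain = 0.10   # assumed +10% caloric yield
mineral_gain = 0.02   # assumed +2% yield of the mineral

per_calorie_change = (1 + mineral_gain) / (1 + calorie_gain) - 1
print(f"mineral per calorie changes by {per_calorie_change:+.1%}")   # about -7%
```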