Author Archives: Robert Brewer

About Robert Brewer

Senior Software Engineer at Tableau Software, Inc. We help people see and understand their data.

Oreo Cheesecake Pudding Cup Recipe

I got this recipe from a co-worker at LavaNet a few years ago (Hi Tim!). It makes a lot of cups, and has proven to be a crowd-pleaser. Mahalo to Yuka for transcribing the original version from a printed fax to electrons. Enjoy.

Makes 36 cups

Ingredients:

  • 1 box of cook-and-serve chocolate pudding (5.9 oz)
  • milk for pudding
  • 1 package of Oreo cookies
  • 2 packages of cream cheese at room temperature (8 oz each)
  • 3/4 cup sugar
  • 2 eggs
  • 1 tsp vanilla extract
  • chocolate chips (optional)
  • Cool Whip/whipped cream (optional)

Equipment:

  • 3 muffin tins (if you have fewer than 3, you will have to bake in batches)
  • 36 paper muffin cups
  • Mixer (hand or stand)
  • Mixing bowl
  • Saucepan

Steps:

  1. Cream together sugar and cream cheese in bowl with mixer.
  2. Add eggs and vanilla extract.
  3. Line muffin tins with muffin cups.
  4. Place 1 Oreo cookie at the bottom of each cup. You can put a whole cookie in each cup, or twist them apart and put half a cookie in each. If you have fewer than 36 Oreos (but more than 18), splitting the Oreos is a good way to avoid buying another package.
  5. Spoon one heaping tablespoon of the cream cheese mixture into each cup on top of the cookie.
  6. Bake at 350° F for about 10-12 minutes, or until slightly golden brown.
  7. In the meantime, make the chocolate pudding as directed on package.
  8. Take the muffin tins out of the oven, and let them cool at room temperature.
  9. Also allow the chocolate pudding to cool slightly.
  10. Spoon the chocolate pudding into each cup, on top of the baked cheesecake layer.
  11. If using chocolate chips or Cool Whip, add them now. If using whipped cream, add just before serving.
  12. Refrigerate until set.

DVIator + DVI Mini DisplayPort adapter = No Go

I have one of the original Apple 22″ Cinema Displays. These monitors use the Apple Display Connector (ADC), which bundles together DVI video, USB, and monitor power into a single cable. It was an interesting idea (presumably derived from/inspired by the way NeXT cube displays worked), but since it wasn’t an industry standard, only Mac video cards had the connector, and only on desktop systems like the PowerMac G4.

If you wanted to connect an ADC monitor to systems that had DVI video output (like the PowerBook G4, and later MacBook Pro), you needed a special adapter that split the ADC into DVI, USB, and provided power. Initially, the DVIator from Dr. Bott was the only affordable solution, so I bought one. It’s served me very well over the last 7 years, and Dr. Bott even replaced mine once when it didn’t work with my aluminum PowerBook G4.

Apple has once again changed their preferred laptop display connector, and now uses the Mini DisplayPort format. MDP provides a teeny connector that can still drive 30″ monitors and the latest versions can also output audio (like HDMI).

Unfortunately, this means that unless you have a Mini DisplayPort monitor (Apple sells one), you need an adapter to connect your monitor to a MDP laptop. For those with DVI monitors, Apple sells a Mini DisplayPort to DVI adapter, which solves the problem. However, I found that this adapter does not work with the DVIator, as confirmed by this Apple Support forum post, another at MacRumors, and an email from Dr. Bott support. Based on the forum posts, apparently the Apple ADC to DVI adapter ($99!) does work with the MDP to DVI adapter, but I haven’t tested that myself yet.

So, bottom line: if you have a DVIator, an ADC display, and a computer with MDP output, you’ll need to either buy the Apple ADC to DVI adapter (in addition to the MDP to DVI adapter) or buy a new monitor. Bummer.

Cross-origin data access from JavaScript

WattDepot features a RESTful API that emits XML, and supports the Google Visualization API using the Google-provided Java library. There is a Java client library that makes it easy to write WattDepot clients in Java, and Google provides a library for JavaScript clients using the Google Visualization API. Several Java and JavaScript clients have been written.

Yichi Xu is working on a GeoMap visualization for WattDepot, and it needs to use the REST API rather than the Visualization API. This means he had to query the REST API directly, and parse the resulting XML. He quickly ran into the same-origin policy in JavaScript. In short, scripts running from a particular domain can generally only access resources from that same domain. This makes sense, because you don’t want any random web page you browse to have access to all your email just because you have a Gmail window up. However, from a developer’s perspective, this is a big obstacle when you want to use an HTTP API hosted on another domain. Yichi’s gadget is loaded either locally or from Google’s servers, so when it tries to perform an HTTP GET to the WattDepot servers, it fails due to the same-origin policy.

There are a few workarounds for this problem, and Nelson Minar provides a good introduction. You can use proxies, but that’s gross. The most common solution makes use of a loophole in the same-origin policy: you can load JavaScript code from anywhere (just not data). So if you wrap your data so that it looks like code, then you can violate the same-origin policy. If you are providing the data in the JSON format, once wrapped it becomes JSONP.
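
To make the wrapping concrete, here is a minimal sketch of the JSONP pattern. The URL, callback parameter, and response shape are all made up for illustration (as noted below, WattDepot doesn’t actually offer JSONP yet):

<script type="text/javascript">
  // Define the callback that the server will wrap the JSON data in.
  function handleSources(data) {
    alert("Received " + data.sources.length + " sources");
  }
</script>

<!-- Because this loads as a script (not data), the same-origin policy
     does not apply. The server would respond with something like:
     handleSources({"sources": [...]}); -->
<script type="text/javascript"
        src="http://wattdepot.example.com/sources?callback=handleSources"></script>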

Right now, WattDepot doesn’t support JSON representations (though it may in the future), so the JSONP trick is not available to us. Luckily, Yichi came across this draft from W3C on Cross-Origin Resource Sharing. CORS allows servers to emit HTTP headers that indicate how browsers should restrict access to the resource. The header can open things up so anyone can use the resource, or only certain other domains. However, this requires browsers to act on the header; luckily, Firefox 3.5 and the latest Safari both implement it (the Firefox page has an excellent discussion of CORS). For now, supporting Firefox 3.5 and Safari is acceptable, so WattDepot just emits “Access-Control-Allow-Origin: *” for all requests. In the future this will be cleaned up to only emit the header for public sources, and also to provide JSON and JSONP representations for easier use in JavaScript.
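
As a sketch of what this enables on the client side, a plain XMLHttpRequest to another domain just works in a browser that implements CORS. The URL here is a placeholder, not a real WattDepot endpoint:

// Ordinary GET to a different domain; the browser permits access to the
// response because it carries the Access-Control-Allow-Origin: * header.
var req = new XMLHttpRequest();
req.open("GET", "http://wattdepot.example.com/sources", true);
req.onreadystatechange = function () {
  if (req.readyState === 4 && req.status === 200) {
    // The REST API returns XML, so work with responseXML directly.
    var root = req.responseXML.documentElement;
    alert("Received <" + root.nodeName + "> from WattDepot");
  }
};
req.send(null);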

Nice LaTeX package: fixme

In writing my dissertation proposal, I came across this nice little LaTeX package called fixme. It provides new commands to insert notes on things that should be fixed in a document. A common way of putting this type of note in a LaTeX document is to use comments in the .tex file, but the downside there is that there is no trace of them in the output document, so they are easy to forget about and people reviewing your PDF will never see them.

fixme is part of TeX Live 2009, so most LaTeX users can just put \usepackage{fixme} in their preamble and start using it. You can add different levels of corrections (from note to fatal), and they are displayed in the document in a variety of formats like margin notes, footnotes, etc. The package also prepares a list of corrections which you can use to keep track of things you need to fix.
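
Here is a minimal sketch of that kind of usage. I’m assuming the command names from the more recent fixme releases (\fxnote, \fxwarning, \fxerror, \fxfatal, \listoffixmes) and that the package picks up draft/final from the document class options; older versions used a single \fixme command instead:

\documentclass[draft]{article} % switch draft to final for the final copy
\usepackage{fixme}

\begin{document}
\listoffixmes % the list of corrections

Here is a paragraph that still needs some work.
\fxnote{Tighten this paragraph in the next revision.}
\fxwarning{Double-check the citation in the previous sentence.}
%\fxfatal{In final mode this would stop compilation until fixed.}
\end{document}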

The nice thing is that when you switch your document from draft to final mode, fixme removes all the comments and the list of corrections, except for fatal ones, which cause compilation to fail. So all the notes about minor stuff that can wait until the next revision disappear from the final copy, but anything you marked as fatal will have to be fixed before you can generate a PDF.

Unlocking a protected PDF on Mac OS X

Recently I needed to demonstrate proof of purchasing something via my credit card statement. Easy enough: I downloaded my most recent statement as a PDF file from American Express. Then I wanted to use Adobe Acrobat Pro’s nifty redaction features to redact all the irrelevant information from the appropriate page of the bill. Except Amex has decided that the statement should be a protected PDF, which means you can view it but cannot change it. This is of course totally bogus DRM; it’s my statement after all! I suppose they hope to curb statement forgeries, but as anyone akamai knows: if I can view it, I can edit it. I think Preview.app on Mac OS X used to ignore DRM and let you edit protected PDFs, but it doesn’t seem to on Snow Leopard.

I hunted around for a tool to unlock the PDF. There are lots of tools for Windows, which didn’t interest me. One person suggested opening the PDF and “printing” it to a PDF, but Adobe has disabled those features of the Print dialog box on Mac OS X (presumably since it would allow trivial circumvention of the DRM).

PDFKey Pro looks like a reasonable option for Mac OS X, but it is $25 which seems kinda steep for a single use. They have a downloadable demo, but it will just create an unlocked version of the first page of the PDF, which wasn’t the page I wanted. And of course I can’t edit the source PDF because it is protected, so the demo wasn’t useful to me.

Then I came upon MuPDF, which is a “lightweight PDF viewer and toolkit written in portable C”. It has an X11 GUI component, as well as command line tools. One of the command line tools is “pdfclean”, which will remove the DRM from a PDF.

Unfortunately, MuPDF isn’t in MacPorts yet, so I had to compile it by hand. It uses the Perforce jam tool instead of make, and has three library dependencies: zlib, libjpeg, and freetype2. Luckily, all of these are available in MacPorts, so I was able to install them and then edit the Jamrules file to point at the MacPorts location. Here is the updated section of Jamrules:


if $(OS) = MACOSX
{
    Echo Building for MACOSX ;

    BUILD_X11APP = true ;

    CCFLAGS = -Wall -std=gnu99 -I/opt/local/include -I/opt/local/include/freetype2 ;
    LINKFLAGS = -L/usr/X11R6/lib -L/opt/local/lib ;
    LINKLIBS = -lfreetype -ljpeg -lz -lm ;
    APPLINKLIBS = -lX11 -lXext ;

    if $(BUILD) = debug   { OPTIM = -g -O0 -fno-inline ; }
    if $(BUILD) = release { OPTIM = -O3 ; }

    if $(HAVE_JBIG2DEC) { LINKLIBS += -ljbig2dec ; }
    if $(HAVE_OPENJPEG)    { LINKLIBS += -lopenjpeg ; }
}

pdfclean worked like a charm, removing the DRM from the statement. After that I was able to redact the statement without incident.
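
For anyone who wants to try the same thing, the rough sequence looks like this. This is only a sketch: I’m assuming the MacPorts port names zlib, jpeg, and freetype, and the basic pdfclean invocation, which may differ in other MuPDF releases:

sudo port install zlib jpeg freetype            # library dependencies from MacPorts
cd mupdf                                        # MuPDF source directory
jam                                             # build with the edited Jamrules
pdfclean statement.pdf statement-unlocked.pdf   # rewrite the PDF without the DRM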

Perhaps in my copious spare time I will make a MuPDF portfile for MacPorts, but until then perhaps this will help others who want an open source way to remove bogus PDF DRM.

The null ritual

Philip and I were discussing the design of my dissertation experiment, and he pointed me at an interesting book chapter titled “The Null Ritual: What You Always Wanted to Know About Significance Testing but Were Afraid to Ask”. It’s fascinating reading, as it walks through a lot of false beliefs about significance testing as used by psychologists in experiments. I found that my understanding of significance testing was definitely incorrect in the ways described in the chapter.

The “null ritual” from the title is described as:

  1. Set up a statistical null hypothesis of “no mean difference” or “zero correlation.” Don’t specify the predictions of your research hypothesis or of any alternative substantive hypotheses.
  2. Use 5% as a convention for rejecting the null. If significant, accept your research hypothesis.
  3. Always perform this procedure.

The problem is that the null hypothesis test gives p(D|H0), or the probability of obtaining the observed data given that the null hypothesis is true. When doing an experiment, any real-world scientist has a hypothesis they are testing, and usually hopes to show that it is true using the data from the experiment. What we really want is p(H1|D), or the probability of our hypothesis being true given the observed data. However, we need Bayes’ rule to draw a conclusion about the hypothesis, and that requires the prior probabilities of the hypotheses, which are often not available to us beforehand.
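
Concretely, for the simple case of just H0 and H1, Bayes’ rule reads (in LaTeX notation):

\[
p(H_1 \mid D) = \frac{p(D \mid H_1)\, p(H_1)}
                     {p(D \mid H_1)\, p(H_1) + p(D \mid H_0)\, p(H_0)}
\]

so a small p(D|H0) by itself says nothing about p(H1|D) until the priors p(H0) and p(H1) are supplied.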

The chapter also brings out the controversies between different statistical approaches and the goals of particular techniques, which are usually glossed over when statistics is taught.

I’m planning to follow the authors’ recommendation in my research: “In many (if not most) cases, descriptive statistics and exploratory data analysis are all one needs.”

Rebuild Hawaii Consortium March 2010 meeting

I attended the Rebuild Hawaii Consortium quarterly meeting last week. I had never attended any of their meetings before, and I was somewhat surprised at the sizable number of people in attendance (40? 50?). It was held in a large stadium-style conference room at the Hawaii Convention Center. I had checked the agenda in advance, and thought I could arrive at 10 AM and still see everything I wanted to, but apparently the agenda had changed since it was posted on the website.

The talk I missed that I wish I had seen was by Luis Vega on the Hawaii National Marine Renewable Energy Center. His slides look very interesting, lots of hard-nosed cost comparisons of wave and OTEC electricity generation.

Paul Norton gave a talk on Zero Energy Buildings, which was interesting. I attended his REIS seminar where he covered some of the same things, but this talk was focused on ZEB. Some points I found particularly interesting:

  • The introduction of air conditioning leads to a 70% increase in electricity use
  • The key conceptual shift is thinking about the monthly cost of a home being the mortgage + utility bill.
  • The efficiency / photovoltaic balance point is the point at which adding generation via PV is the same cost as additional efficiency measures
  • A cost-neutral design (monthly cost is the same as a home built to code) that uses efficiency and PV results in an 85% reduction in home electricity usage
  • Once major efficiency measures are in place (solar water heating, efficient lighting & air conditioning, insulation), the major remaining load is appliance plug loads
  • In one military housing complex on Oahu, there is a 4x difference in electricity usage between houses with identical efficiency measures. Presumably the differences are due to appliance purchases and behavior.
  • In a group of homes in Las Vegas, the difference was 5x
  • Further, the differences were fairly continuous: there is no nice average plateau
  • PV inverters on the neighbor islands have been causing problems because the utility frequency can sag during periods of high usage. By default, the inverters are set to disconnect from the grid when the frequency drops below 59.3 Hz, so inverters all over turn off, which puts additional strain on the utility, exacerbating the problem. Reducing that threshold frequency to 57 Hz can help. Thus there is a lot of research still to be done on renewable integration.

Another presentation was on HCEI and smart grid initiatives at PACOM. They are working on a project called SPIDERS that is trying to address the fact that access to electricity is a critical need for the military. One thing I was stunned to learn was that people living in military housing don’t pay for electricity! Thus they have no financial incentive at all to reduce their energy usage. Slide 8 shows an actual graph of HECO’s demand and generation for one particular day. Our work on OSCAR was all based on vague outlines of what the demand curve looks like, so it was great to see it “in the flesh”.

There was a lot of good information at the meeting, so I’m planning to attend in the future. Next meeting is June 2.