I’m speaking about some of the work I have done on randomly generating linear extensions of a poset, and on counting them.
Abstract: Suppose I try to shuffle a deck of n cards by randomly transposing adjacent cards. It turns out you need to do this O(n^3 ln(n)) times to really mix up the deck. Now suppose I add some extra rules that look like: “The King of Hearts must be above the Queen of Spades” or “The Queen of Spades must be above the 9 of Diamonds” and so on. I will shuffle the same way, but if an adjacent transposition would cause me to violate one of my rules, I do not execute the transposition. The surprising fact is that it still takes O(n^3 ln(n)) steps to mix up the deck! The orderings allowed by such rules are more formally known as linear extensions of a partial order. In this talk I will show where the O(n^3 ln(n)) comes from, and show how to draw exactly uniformly from the set of linear extensions. I will also discuss recent work on this problem and why embedding it in a continuous context can lead to better algorithms for counting linear extensions.
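The shuffle described in the abstract is easy to simulate directly. The sketch below is a minimal illustration, not the analysis from the talk: the card names and rules are the toy ones from the abstract, and each rule (a, b) means card a must stay above (before) card b.

```python
import random

def constrained_shuffle(deck, rules, steps, seed=None):
    """Shuffle by random adjacent transpositions, skipping any swap
    that would violate a rule.  rules is a set of pairs (a, b)
    meaning card a must stay above (before) card b."""
    rng = random.Random(seed)
    deck = list(deck)
    for _ in range(steps):
        i = rng.randrange(len(deck) - 1)      # pick an adjacent pair (i, i+1)
        a, b = deck[i], deck[i + 1]
        # Swapping would put b above a, which is forbidden if (a, b) is a rule.
        if (a, b) not in rules:
            deck[i], deck[i + 1] = b, a
    return deck

# Toy example: KH must stay above QS, and QS above 9D.
rules = {("KH", "QS"), ("QS", "9D")}
deck = constrained_shuffle(["KH", "QS", "9D", "AC"], rules, steps=1000, seed=1)
```

Since the starting deck respects the rules and no allowed swap ever breaks one, every state visited is a linear extension of the partial order the rules define.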
See here for more details.
It is summer in an even year, which means it is time for the Eleventh World Meeting of ISBA, this time held in Kyoto, Japan. For the first time I am organizing a session here, and we have some great speakers lined up: Galin Jones, James Flegal, Alex Beskos, and yours truly. My talk is: Fast approximation algorithms for partition functions of Gibbs distributions, and has an abstract very close to that of my longer talk at Ohio State last month.
Abstract: A new algorithm for estimating the partition function of a Gibbs distribution is presented. The method combines a well balanced cooling schedule created through TPA and a product importance sampler. One advantage of the algorithm over existing methods is that the standard deviation of the estimate can be bounded theoretically. The number of samples necessary to build a close estimate grows almost linearly in the logarithm of the partition function, making the approach suitable for high dimensional problems. The samples needed for the estimate can be generated rapidly by methods such as parallel tempering.
I am speaking in the statistics seminar at The Ohio State University next week. The title of my talk is “Fast approximation algorithms for partition functions of Gibbs distributions”. It will be held in the Eighteenth Avenue Building, Room 170 on Thursday, May 24th from 3:30 to 4:30 pm. More details can be found here.
Abstract: Finding the partition function of Gibbs distributions has applications in model selection and building estimators for parameters of spatial models. This talk will present a new algorithm for estimating these partition functions. The method combines a well balanced cooling schedule created through a technique called TPA and a product importance sampler. One advantage of the algorithm over existing methods is that the standard deviation of the estimate can be bounded theoretically. That is to say, unlike most algorithms where the standard deviation must itself be estimated, the standard deviation of this algorithm is fixed ahead of time by the user. The number of samples necessary to build a close estimate grows almost linearly in the logarithm of the partition function, making the approach suitable for high dimensional problems. The samples needed for the estimate can be generated rapidly by methods such as parallel tempering.
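The abstract only names the ingredients, so here is a rough sketch of the product-estimator half of the idea on a toy problem. The state space, energies, and the evenly spaced schedule below are invented for illustration (the point of TPA is precisely to build a well balanced schedule, which this sketch does not do), and exact Gibbs sampling is possible here only because the state space is tiny.

```python
import math
import random

# Toy Gibbs distribution on a tiny explicit state space.
states = list(range(6))
H = [0.0, 0.5, 1.0, 1.5, 2.0, 3.0]    # energy of each state (invented)

def Z(beta):
    """Exact partition function, available here only because the toy
    state space is small enough to enumerate."""
    return sum(math.exp(-beta * H[x]) for x in states)

def gibbs_sample(beta, rng):
    """Exact sample from the Gibbs distribution at inverse temperature beta."""
    w = [math.exp(-beta * H[x]) for x in states]
    r = rng.random() * sum(w)
    for x, wx in zip(states, w):
        r -= wx
        if r <= 0:
            return x
    return states[-1]

def product_estimate(betas, n_samples, seed=0):
    """Estimate Z(betas[-1]) / Z(betas[0]) as a telescoping product:
    Z(b1)/Z(b0) = E_{b0}[ exp(-(b1 - b0) * H(X)) ] for each schedule step."""
    rng = random.Random(seed)
    est = 1.0
    for b0, b1 in zip(betas, betas[1:]):
        s = sum(math.exp(-(b1 - b0) * H[gibbs_sample(b0, rng)])
                for _ in range(n_samples))
        est *= s / n_samples
    return est

betas = [0.0, 0.25, 0.5, 0.75, 1.0]   # naive evenly spaced schedule
ratio = product_estimate(betas, n_samples=2000)
# Z(0) is just the number of states, so Z(1) is roughly ratio * len(states).
```

Each factor in the product has small variance because adjacent temperatures are close; a well balanced schedule keeps that true with as few temperatures as possible.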
This Tuesday I’m starting off the talks for the Spring in the Algebra/Number Theory/Combinatorics Seminar here in Claremont. It will be at 12:15 on 24 Jan 2012 in Millikan 134. Complete details can be found here. The title is “A gadget for reducing the Ising model to matchings”, and it is joint work that I did with my first graduate student, Jenny Law.
Abstract: In my talk last semester in the seminar, I presented a classic result: the problem of counting the number of solutions to a logic formula can be turned into a problem of summing weighted perfect matchings in a graph. The key idea was the use of a combinatorial “gadget”. In this talk I’ll present a gadget developed with my first graduate student, Jenny Law, that allows for what is called a simulation reduction. The reduction works as follows: if you are able to sample randomly from a weighted distribution on perfect matchings in a graph, then you can also simulate from the Ising model, a classical model from statistical physics that has been heavily studied since its inception in the 1920’s. (http://arxiv.org/abs/0907.0477)
My talk at UCLA is now available on my website at http://www.cmc.edu/pages/faculty/MHuber/Research/talks/huber_talk_2011h.pdf. They have an interesting system there. The grad students go out for lunch with the speaker and can ask questions or just shoot the breeze. Then after the talk the students have half an hour to just ask whatever questions come to mind about the talk and its contents. It is much more relaxed than the usual “five minutes to answer all your questions” at the end of the talk, and allows for a lot more depth.
The talk that I gave at UMSL can now be downloaded here. This talk describes two different protocols for perfect simulation. The first, Partially Recursive Acceptance Rejection, gives an easy way to generate perfect samples from loosely coupled Markov random fields (such as the Ising model) in linear time without building a Markov chain at all for the problem.
The second part discusses Sequential Acceptance Rejection, a method that has proved useful for problems like approximating the permanent in dense problem instances.
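Neither protocol is spelled out in the post, but both build on plain acceptance/rejection sampling. As a reminder of that basic loop (this is not PRAR or SAR themselves), here is a sketch on an invented toy target: the density f(x) = 2x on [0, 1], dominated by c = 2 times the uniform density.

```python
import random

def acceptance_rejection(f, c, rng):
    """Basic acceptance/rejection with a Uniform(0,1) proposal:
    draw X ~ Uniform(0,1), accept with probability f(X) / c,
    where f <= c on [0, 1].  Accepted draws are exact samples from f."""
    while True:
        x = rng.random()
        if rng.random() <= f(x) / c:
            return x

# Toy target density f(x) = 2x on [0, 1], dominated by c = 2.
rng = random.Random(42)
samples = [acceptance_rejection(lambda x: 2 * x, 2.0, rng)
           for _ in range(5000)]
mean = sum(samples) / len(samples)    # true mean of this density is 2/3
```

The appeal of acceptance/rejection methods, and of the two protocols above, is that accepted draws are exact samples, with no Markov chain convergence question to settle.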
Today I am giving the Algebra/Number Theory/Combinatorics Seminar at the Claremont Colleges. This talk, “What is a #P complete problem”, is a prequel in a sense to an earlier talk I gave in the Seminar last year. In the earlier talk, I discussed new methods for approximating the permanent of a matrix. In this talk, I will argue that finding the permanent is a hard problem. One way to characterize hardness of a numerical problem is the notion of number P problems, or #P for short. I will define #P problems and give Ben-Dor and Halevi’s 1993 proof that finding the permanent of a matrix is a #P-hard problem, so if we could solve it quickly, any problem in #P could also be solved quickly.
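To make the object of the talk concrete, here is the permanent computed straight from its definition. This is only an illustration of why the problem is counting-hard in the obvious approach: the sum runs over all n! permutations, and unlike the determinant there are no signs to enable cancellation tricks.

```python
from itertools import permutations

def permanent(A):
    """Permanent of a square matrix by its definition: the same sum
    as the determinant, but with every sign equal to +1."""
    n = len(A)
    total = 0
    for perm in permutations(range(n)):
        prod = 1
        for i, j in enumerate(perm):
            prod *= A[i][j]
        total += prod
    return total

# For a 0/1 matrix this counts perfect matchings in a bipartite graph;
# the all-ones n-by-n matrix has permanent n!.
A = [[1, 1, 1],
     [1, 1, 1],
     [1, 1, 1]]
```

Ryser’s formula improves the naive n! · n cost to O(2^n n), but no polynomial-time algorithm is expected, which is exactly what the #P-hardness result formalizes.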