Why backtracking alone does not suffice to solve combinatorial problems: a CP perspective, evidence from empirical hardness models, and proof complexity results

Constraint Programming (CP) is a technique particularly strong at solving classes of combinatorial problems for which global constraints have been identified and exploited in the design of specialized algorithms. However, I have become convinced that CP is, above all, a general model for understanding combinatorial problems and their solution – using the term ‘model’ in a broader sense, as explained in a previous post and based on John Hooker’s article in CP Letters.

Regardless of the solving method used, the classic aim of CP is to tackle the Constraint Satisfaction Problem (CSP), which ultimately encodes any problem involving variables with discrete and finite domains as the intersection of the regular languages defined by each of its constraints. When a class of such languages has special properties that allow us to make strong and efficient inferences among those variables, we have found one of the so-called global constraints. Not surprisingly, the concept is in fact a formalization of the widespread practice of exploiting problem structure, which was also discussed in a previous post linking the implementation of local search operators to CP concepts.
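To make the point about strong inferences concrete, here is a minimal sketch of my own (the toy three-variable CSP and the function names are illustrative, not tied to any solver). Decomposing an alldifferent constraint into pairwise disequalities detects nothing here, while the simplest global counting argument proves infeasibility at once:

```python
from itertools import combinations

# Toy CSP: three variables, each with domain {1, 2}, all required to be different.
domains = {"x": {1, 2}, "y": {1, 2}, "z": {1, 2}}

def pairwise_consistent(domains):
    """Check each binary disequality in isolation (the pairwise decomposition):
    a pair is only inconsistent when both domains are the same singleton."""
    for a, b in combinations(domains, 2):
        if len(domains[a]) == 1 and domains[a] == domains[b]:
            return False
    return True

def alldifferent_consistent(domains):
    """Global reasoning: n variables need at least n distinct values available."""
    union = set().union(*domains.values())
    return len(union) >= len(domains)

print(pairwise_consistent(domains))      # True  -- the decomposition sees no problem
print(alldifferent_consistent(domains))  # False -- the global view proves infeasibility
```

The counting check is the simplest instance of the kind of inference a dedicated alldifferent propagator performs.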

In spite of such claimed ‘agnosticism’ regarding solving methods, CSP solving is often associated with backtracking search algorithms. The rationale is that, compared to other general solving techniques like dynamic programming, backtracking scales better in memory use with the size of the problem being solved. As a consequence, CP lectures and many lines of research are developed under the premise of backtracking-based search. In practice, however, that post on CP and local search also points out that the converse holds: local search methods like variable neighborhood search have been successfully used in commercial CP solvers.

One such line, in which backtracking has been extensively studied, is the adaptive selection of algorithms, which aims to select the most suitable algorithm for a given instance according to predictive models of performance. From the characterization of CSP instances by properties like constrainedness, the ratio of pairwise constraints to variables, empirical hardness models were developed that also documented easy-hard-easy phase transitions. In short, it is easy to solve an underconstrained problem or to infer that an overconstrained one is infeasible, but those at the ‘feasibility frontier’ are much harder to solve. A short reference on the topic is the 2005 Nature article by Carla Gomes and Bart Selman entitled ‘Can get satisfaction’, and a precise picture of such hardness can be seen in the graph on slide 6 of Kevin Leyton-Brown’s talk at the Workshop on Beyond Worst-Case Analysis.
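The transition is easy to reproduce on a small scale. The sketch below is my own toy experiment using random 3-SAT, where the clause-to-variable ratio plays the role of the constrainedness measure above: it generates random instances at increasing ratios and reports how many are satisfiable.

```python
import random
from itertools import product

def random_3sat(num_vars, num_clauses, rng):
    """A random 3-SAT instance: each clause picks 3 distinct variables and random signs."""
    return [
        [v if rng.random() < 0.5 else -v
         for v in rng.sample(range(1, num_vars + 1), 3)]
        for _ in range(num_clauses)
    ]

def satisfiable(clauses, num_vars):
    """Brute-force check, fine for the tiny sizes used here."""
    for bits in product([False, True], repeat=num_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in clause) for clause in clauses):
            return True
    return False

rng = random.Random(0)
n, samples = 10, 20
for ratio in (2.0, 3.0, 4.0, 4.3, 5.0, 6.0):
    sat = sum(satisfiable(random_3sat(n, round(ratio * n), rng), n) for _ in range(samples))
    print(f"clauses/variables = {ratio:.1f}: {sat}/{samples} instances satisfiable")
```

With only 10 variables the crossover is blurred and shifted relative to the asymptotic value of roughly 4.27 clauses per variable reported in the literature, but the qualitative picture is the same, and the crossover region is also where backtracking search effort peaks.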

Although it is a well-documented phenomenon, until very recently it was still unclear to me why this happens. I was only convinced of it this week, when I came across some lecture notes of KTH Professor Jakob Nordström on proof complexity that discussed resolution refutations of formulas encoding the pigeonhole principle. Proof complexity is a discipline concerned with understanding the hardness of theorem proving, and comprises the study of the characteristics and strengths of different proof systems as well as their equivalence. Among such systems is resolution, which is equivalent to solving through backtracking a CSP in which variables can only assume two values – true or false, or 0 or 1 – and constraints only use the logical operators NOT, OR, and AND in Conjunctive Normal Form (CNF): a set of clauses using NOT and OR, joined by AND. This is a special form of the satisfiability problem, or simply SAT, for which backtracking search with only local inference corresponds to the well-known DPLL algorithm.
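For readers who have not seen DPLL spelled out, here is a minimal sketch of my own: unit propagation plus chronological backtracking, with no clause learning or clever branching heuristics.

```python
def dpll(clauses, assignment=None):
    """A minimal DPLL sketch. Clauses are lists of nonzero integers; -v means NOT v.
    Returns a satisfying assignment (dict) or None if the formula is unsatisfiable."""
    assignment = dict(assignment or {})

    # Unit propagation: simplify clauses under the current assignment.
    while True:
        simplified, unit = [], None
        for clause in clauses:
            if any(assignment.get(abs(l)) == (l > 0) for l in clause):
                continue  # clause already satisfied
            remaining = [l for l in clause if abs(l) not in assignment]
            if not remaining:
                return None  # clause falsified: backtrack
            if len(remaining) == 1:
                unit = remaining[0]
            simplified.append(remaining)
        if unit is None:
            clauses = simplified
            break
        assignment[abs(unit)] = unit > 0

    if not clauses:
        return assignment  # every clause satisfied

    # Branch on the first unassigned variable appearing in a clause.
    var = abs(clauses[0][0])
    for value in (True, False):
        result = dpll(clauses, {**assignment, var: value})
        if result is not None:
            return result
    return None

# Example: (x1 OR x2) AND (NOT x1 OR x2) AND (NOT x2 OR x3)
print(dpll([[1, 2], [-1, 2], [-2, 3]]))  # {1: True, 2: True, 3: True}
```

A failed DPLL run can be read off as a tree-like resolution refutation, which is what connects this algorithm to the proof complexity results discussed next.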

The pigeonhole principle states that it is not possible to assign n pigeons to m pigeonholes without putting two of them in the same pigeonhole if n > m. It turns out that encoding an instance of this problem with n = m+1 as a satisfiability problem (SAT), i.e., a CSP with variable domains of size two and constraints built from logical operators in CNF, yields a formula of size polynomial in n and m that is unsatisfiable but whose resolution refutations are of exponential size – a proven lower bound. That happens because a valid refutation must explore a huge number of combinations of assignments to prove that none of them satisfies all the problem constraints. [check the erratum at the end of the post]
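The standard encoding is short enough to write down. In the sketch below (the variable numbering is mine), pigeon i sitting in hole j is a Boolean variable, and ‘every pigeon gets a hole’ plus ‘no hole gets two pigeons’ are the only clauses:

```python
from itertools import combinations

def pigeonhole_cnf(holes):
    """CNF encoding of placing holes+1 pigeons into `holes` pigeonholes.
    Variable var(i, j) is true iff pigeon i sits in hole j. The formula has
    polynomially many clauses but no polynomial-size resolution refutation
    (Haken, 1985)."""
    pigeons = holes + 1

    def var(i, j):
        return i * holes + j + 1  # 1-based, DIMACS-style variable index

    clauses = []
    # Each pigeon occupies at least one hole.
    for i in range(pigeons):
        clauses.append([var(i, j) for j in range(holes)])
    # No hole hosts two pigeons.
    for j in range(holes):
        for i1, i2 in combinations(range(pigeons), 2):
            clauses.append([-var(i1, j), -var(i2, j)])
    return clauses

print(len(pigeonhole_cnf(3)), "clauses")  # 22 clauses over 12 variables
```

Feeding this family to the DPLL sketch above, or to any solver whose reasoning does not go beyond resolution, makes the running time blow up quickly as the number of holes grows, whereas a single global counting argument refutes the instance immediately.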

In short, backtracking really needs global constraints!

Erratum

According to the first comment to this post, there is an encoding for which the mentioned refutation is efficiently found. The discussion is continued in the sequel of this post.

The problem with the problem

Operations research is a field with many problematic words, of which ‘problem’ is an interesting example. The word is generally used to describe a difficulty, but it has found an additional meaning in mathematics as a well-defined question that one may use an algorithm to answer. That would not be bad if it were not for a problem (of the first kind): both meanings are often used interchangeably.

The unavoidable consequence is that new developments targeting a given application sometimes present meaningless criticism of mathematical problems, which are abstract entities that exist regardless of their use. Such a concern may sound pedantic, because the mathematical problem usually only serves as a proxy for its interpretation as the solution to the problem (of the first kind) associated with a given application. However, this bothers me because it is the tip of an iceberg of misguided conceptualization. And, of course, that is not how one should treat guys like the poor traveling salesman and his problem, who never intended to do any harm to the problems of others.

Even though this is not a common mistake among reputable researchers in optimization, it happens pretty often among those in fields that need optimization as a means but do not delve further into theoretical concerns. This is bad because many of them go into industry and will eventually come back to hire an expert in operations research to solve their most challenging problems, which will also involve detaching their needs from fixed ideas on how to model them.

I have already written about the meaning of problem, model, and solution in this blog, but coming across yet another bunch of writings and sayings with similar issues made me do it again. Personally, I only realized the problem with that after talking about my work with people in other fields, to whom a problem has only one meaning (the full story is in that previous post). If I ever have the opportunity to teach an introductory course on OR, I will try to make this point clear in the funniest way I can conceive (the funny part takes longer to be forgotten). As a matter of fact, this issue reminds me of a poem by my favorite writer. It is entitled XXVIII and is from The Keeper of Flocks, a book written by the Portuguese poet Fernando Pessoa under the name Alberto Caeiro (translation from the blog Fernando Pessoa: Alberto Caeiro: Complete Poems):

 

Today I read almost two pages
In a book by a mystical poet
And I laughed like someone who’d cried a lot.

 

Mystical poets are sick philosophers
And philosophers are crazy.

 

Mystical poets say flowers feel
And they say stones have a soul
And they say rivers have ecstasies in the moonlight.

 

But flowers wouldn’t be flowers if they felt,
They’d be people;
And if stones had a soul, they’d be living things, they wouldn’t be stones;
And if rivers had ecstasies in the moonlight,
Rivers would be sick people.

 

You need to not know what flowers and stones and rivers are
To talk about their feelings.

Talking about the soul of stones, of flowers, of rivers,
Is talking about yourself and your false thoughts.
Thank God stones are only stones,
And rivers are nothing but rivers,
And flowers are just flowers.

Me, I write the prose of my poems
And I’m at peace,
Because I know I comprehend Nature on the outside;
And I don’t comprehend Nature on the inside
Because Nature doesn’t have an inside;
If she did she wouldn’t be Nature.


2012: The year I decided to go back to academia

I had been wondering about pursuing a doctoral degree for a long while. It is true that I never ruled out this option and did the best I could while pursuing my master’s in order to keep it open. However, there was much going on in parallel to that, and I went to ICAPS last June torn between applying for doctoral studies and buying an apartment (i.e., settling down for good in São Paulo or keeping my academic options open). Before ICAPS had finished, I had already decided to keep renting and to start studying for the TOEFL and GRE exams.

ICAPS represented a turning point on many different levels. It reminded me of what I really like about academia: crazy talks at late hours, passionate work, and the desire to go beyond. My roommates were old colleagues from my alma mater, with whom I had already had great times staying late at the lab, and who made me remember how much fun that was. I also met a number of graduate students with many interests similar to mine, and talked to many professors and post-docs who were really encouraging. It was the sense of belonging that I had been missing since I left academia.

I believe that I have learned a lot in industry that will help me succeed in a doctoral program. I am certain that 2013 will be a year of big changes.

On my way to Quebec City for CP 2012 and heading for INFORMS 2012 in one week!

I am writing this on a bus from Montreal to Quebec City. CP 2012 will officially start on Monday, but a gathering of those who arrived earlier was suggested for tomorrow in the late afternoon. The same happened last year at CPAIOR, and that was one of the reasons why I decided to come as soon as possible: to avoid missing the first gathering again.

I will be presenting a talk and a poster about the application that I worked on during my MSc, which is described in a full paper in the conference proceedings. This work helped me get accepted into the CP Doctoral Program, which will promote some events directed at the graduate students attending the conference and has assigned a mentor to each one. The CP DP organizers were very kind to help a number of us with registration costs and hosting. On top of that, I was told last week that my position paper about characterizing adaptive search methods (a sequel to my extended abstract from CPAIOR 2011) entitled me to 2 minutes to give my 2 cents in a panel about the future of CP, which is being organized by Professor Pascal Van Hentenryck. In short, I am really looking forward to this meeting!

After that, I will go to Phoenix for the INFORMS Annual Meeting. I will give two talks there, one in the sessions on OR applications in the energy sector of emerging markets, which I am organizing, and the other in the sessions on CP, organized by Professor van Hoeve. I will also contribute to the blog of the conference, which will have many of the usual OR bloggers as authors. As soon as the blog is officially released, I will post the link here. I have interacted with a number of those guys online but have seen very few of them in person so far. Therefore, my expectations for Phoenix are also very high!

CP Mini-Course by Prof. J. Hooker at Unicamp

Prof. John Hooker will stay for about a month at the State University of Campinas (Unicamp). During that period, he will teach a mini-course entitled “Constraint Programming and CP/OR Integration”. Those interested in enrolling in the course, which is limited to 40 students, should contact Prof. Cid de Souza according to the instructions that follow. The instructions were originally sent in Portuguese (translated below), and the course will be taught in English.

Dear colleagues,

I would like to ask you to announce, to anyone who might be interested, the mini-course on “Constraint Programming and CP/OR Integration” to be taught by Prof. John Hooker of the Tepper School of Business at Carnegie Mellon University here at IC/UNICAMP from 26/09 to 26/10/2012. Details about the course can be found in the announcement text at the end of this message. Please draw the attention of those interested to the class times and locations, since they vary according to the day of the week.

Please ask anyone who wants to take the mini-course to fill in the short form below and send it to my email (cid@ic.unicamp.br), with the following subject line: “Registration: CP Mini-Course by Prof. J. Hooker”.

Since we have only about 40 places available, requests will be handled on a first-come, first-served basis. Therefore, interested people should not leave it to the last minute.

Thank you for your collaboration and, of course, you are all invited to take part in the mini-course if you are interested and available.

Thank you.

Best regards,

Cid

(cut here)
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
Mini-Course Registration Form
———————————————————————-
Name:
Email:
Indicate your current occupation below:
[ ] undergraduate student; Program: ____; Institution: _____;
[ ] master’s student; Program: ____; Institution: _____; Advisor: ____;
[ ] doctoral student; Program: ____; Institution: _____; Advisor: ____;
[ ] postdoctoral researcher; Institution: _____; Supervisor: ____; Advisor: ____;
[ ] faculty member; Department: ___; Institution: ____;
[ ] other; Specify (position, company, etc.): ____;

Note: your registration will be confirmed by email by 16/9.
———————————————————————-
(cut here)

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Mini-Course Announcement

********************************************************
Mini-Course: Constraint Programming and CP/OR Integration

Instituto de Computação – UNICAMP

========================================================

>>>>>>>>>>>>>>>>>>>> SUPPORT: FAPESP <<<<<<<<<<<<<<<<<<<<<
********************************************************

Instructor:
**********

Prof. Dr. John Hooker
Tepper School of Business, Carnegie Mellon University

Short Bio:
———-

John Hooker is the T. Jerome Holleran Professor of Business Ethics and Social Responsibility and a professor of operations research at the Tepper School of Business. He joined the Tepper School in 1984 and has since held several visiting posts, most recently at the London School of Economics. He holds doctoral degrees in philosophy and operations research.

Professor Hooker has published over 130 articles, 6 books, and 5 edited volumes. In the area of operations research, he has produced pioneering work on the integration of optimization technologies and wrote the first book on the subject. He is also the author of Optimization Methods for Logical Inference and two editions of Integrated Methods for Optimization. He has served on several editorial boards, including two decades as an area editor for INFORMS Journal on Computing. He is an INFORMS Fellow and recipient of the INFORMS Computing Society Prize.

He also has interests in business ethics and cross-cultural management, as reflected in his books Business Ethics as Rational Choice and Working across Cultures. He is founding editor-in-chief of the Journal of Business Ethics Education and co-organized four conferences on international corporate responsibility. He has lived and worked in Australia, China, Denmark, India, Qatar, Turkey, the United Kingdom, the United States, and Zimbabwe, and has extensive experience in Germany and Mexico.

Professor Hooker headed the Tepper School’s undergraduate business administration program from 1996 to 2001 and received a Distinguished Academic Service Award for his contributions. In 2009, he was recognized with an Award for Sustained Teaching Excellence.

For more details, see: http://ba.gsia.cmu.edu/jnh/

Course Objective:
******************

The goal of this mini-course is to give researchers and students in the field of Combinatorial Optimization an opportunity to be brought up to date on the progress made by the CP community in recent years, both theoretically and computationally. The duration of the mini-course will be 16 to 20 hours, divided into 8 to 10 sessions (one per day) of 2 hours each. It will be open to any researcher interested in the topic and is not restricted to the IC/UNICAMP community. However, some limitations may apply due to the physical infrastructure available (size of classrooms, etc.).

Course Content:
******************

Constraint Programming
   Basic ideas of constraint programming (CP)
      Global constraints
      Constraint-based processing
      Constraint propagation
      Search
   CP modeling
      Employee scheduling
      Assembly line sequencing
      Resource-constrained scheduling
      Other models
   Consistency
      Strong k-consistency
      Domain consistency
      Bounds consistency
   The element constraint
      Variable subscripts
      Achieving domain consistency
   The alldiff constraint
      Review of network flow theory
      Achieving domain consistency
      Achieving bounds consistency
   The cardinality and nvalues constraints
   The among and sequence constraints
      Filtering based on cumulative sums
      Flow-based filtering
   The stretch constraint
      Dynamic programming model
      Domain filtering
   The regular constraint
      Deterministic finite automata
      Domain filtering
      Filtering by decomposition
   Disjunctive scheduling
      Edge finding rules
      Not-first/not-last rules
   Cumulative scheduling
      Edge finding rules
      Not-first/not-last rules
      Energy-based reasoning

CP/OR integration
   Basic ideas
      Branch, infer, and relax
      Example: freight transport
      Example: production planning
      Example: product configuration
      Example: Routing and frequency assignment
   Inference and relaxation duality
      Example: bounds propagation
      Example: Lagrangian relaxation
   CP-based branch and price
      Example: airline crew scheduling
   Logic-based Benders decomposition
      Example: planning and scheduling

Class Days and Times:
************************

Please note that the Wednesday and Friday classes are held at different times and in different rooms.

Wednesday: 26/09, from 18:00 to 20:00, room 85, IC-2
Friday: 28/09, from 10:00 to 12:00, room 316, IC-3
Wednesday: 03/10, from 18:00 to 20:00, room 85, IC-2
Friday: 05/10, from 10:00 to 12:00, room 316, IC-3
Wednesday: 10/10, from 18:00 to 20:00, room 85, IC-2
Friday: 12/10 *************** HOLIDAY ***************
Wednesday: 17/10, from 18:00 to 20:00, room 85, IC-2
Friday: 19/10, from 10:00 to 12:00, room 316, IC-3
Wednesday: 24/10, from 18:00 to 20:00, room 85, IC-2
Friday: 26/10, from 10:00 to 12:00, room 316, IC-3

Contact:
********

Prof. Cid de Souza (cid@ic.unicamp.br)

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
NOTE: the mini-course will be taught entirely in English.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

When CP becomes local search and local search becomes CP, and the same about CP and machine learning

As a Brazilian song entitled Sobradinho says, “o sertão vai virar mar, dá no coração o medo que algum dia o mar também vire sertão [the hinterland will become sea, and the heart fears that someday the sea will also turn into hinterland]”. However, such an occurrence in the academic field is often a reason for joy rather than worry – as well as for joint conferences where you can be delighted by funny flamewars.

In the case of Constraint Programming (CP), I have been observing the development of a strong relationship with local search as well as with machine learning. Coincidentally, while I was writing this post, a presentation of a local search solver at the 2012 ISMP (International Symposium on Mathematical Programming) seems to have caused some buzz on Twitter about local search and CP. To the argument that local search has not been exploited by major commercial solvers (as far as I could follow the controversy), some observed that there is much local search embedded inconspicuously in CP solvers, and so on. From there, the discussion extended to the boundaries between what is CP and what is not.

However, why do we suppose that such techniques must be mutually exclusive?

From CP to local search…

Local search has been increasingly used within CP solvers, as is the case with Comet and with many papers authored by researchers from IBM describing what lies behind CP Optimizer’s curtains. Many of them regard Large Neighborhood Search (LNS) as a must for solving cumulative scheduling problems, i.e., scheduling problems in which many activities can be simultaneously processed by the same resource. LNS is based on separate operators to generate modified solutions from a starting pool of solutions, and to adjust feasibility issues of those modified solutions. This latter type of operator addresses a common drawback of local search: the need to fix what the neighborhood operator unintentionally broke. [check the erratum at the end of the post]

For some, it might look strange to associate CP with local search and detach it from standard systematic techniques. While those techniques are important for tackling combinatorial problems in the general case, the size of many real-world problems is far beyond what systematic search can explore in a lifetime. Hence, who cares if the search space is not fully explored by the search procedure, if that is more of a theoretical matter than a practical one? In addition, CP is about exploiting problem constraints, from modeling with richer semantics, to the propagation that reduces the search space, to the search that attempts to obtain feasible solutions. That leaves plenty of room to define the search according to the constraints in the model.

… and back from local search to CP!

A few weeks ago, I had the chance to observe the opposite trend: I saw a presentation about another commercial local search framework, and this one was using the concept of global constraints. As far as I could grasp, it was not on purpose. In order to reduce the problem of generating invalid solutions with standard local search operators on a broad range of problems, specialized constraints were defined to model certain combinatorial problems whose solutions cannot be modified at random. In other words, they are using specialized operators according to the structure of the problem, which is quite similar to performing propagation with specialized constraints in the context of CP models. As stated previously, the use of global constraints in CP is not only about propagation, but rather about exploiting problem structure for the sake of better performance. And that is what they did in that local search solver. When I see such coincidences of design, I wonder how powerful, and yet how little known, the principle beneath CP is.

Now from CP to machine learning…

And since constraints can be leveraged not only to improve propagation but also to improve search, why not mix the latter with the feedback received during the solving process in order to improve the selection of algorithms? That is a topic in which I am very interested: the development of adaptive search methods for constraint solving. It has been studied in great depth over the past decade and a half, and has received even more attention in recent years. One of the venues dedicating space to this topic is the LION (Learning and Intelligent OptimizatioN) conference series, which will have its seventh edition in Catania at the beginning of 2013. That is one of the conferences I want to attend in the coming years (monetary and job vacation constraints prevent my prompt participation).

… and back again to CP!

And if local search and CP each show up in the context of the other, one should expect the same in this case. At least, that seems to be the thought of the proponents of the recent call for papers of the first COCOMILE workshop (COmbining Constraint solving with MIning and LEarning), which will be held next August in Montpellier during the 2012 ECAI (European Conference on Artificial Intelligence). The workshop CFP emphasizes the two-way relationship between both fields. While I know that there exists some research in the CP-to-machine-learning direction (even though I am not following it very closely), I think that a single, very specific event that looks in both directions is of great help to foster further development. And, of course, that is yet another event that I want to attend in the near future (hopefully publishing in at least one of these directions).

Erratum

The description of LNS as stated in the post is not correct. It would be more precise to say that the first type of operator fixes values for part of the variables, and that the second attempts to find a feasible solution by assigning values to the remaining variables. The prior definition that I gave somewhat resembles what I understand as standard local search operators: procedures that fix the values of all variables in each solution that they generate. In the case of LNS, however, only part of the values is fixed, and an iterative process aims to complete such a partial solution.
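To make the corrected description concrete, here is a toy sketch of the LNS loop on a small assignment problem (the problem, the greedy repair, and all parameter values are illustrative only, not taken from Comet or CP Optimizer): each iteration keeps most of the incumbent fixed, frees a few variables, and completes the partial solution.

```python
import random

def lns_assignment(cost, iterations=200, relax_size=3, seed=0):
    """Toy LNS for an assignment problem: task i is assigned to worker perm[i]."""
    rng = random.Random(seed)
    n = len(cost)

    def total(p):
        return sum(cost[i][p[i]] for i in range(n))

    best = list(range(n))
    best_cost = total(best)
    for _ in range(iterations):
        freed = rng.sample(range(n), relax_size)        # relax: free a few tasks
        partial = best[:]
        free_workers = [partial[i] for i in freed]
        for i in freed:
            partial[i] = None
        for i in freed:                                 # repair: greedy completion
            w = min(free_workers, key=lambda w: cost[i][w])
            partial[i] = w
            free_workers.remove(w)
        if total(partial) < best_cost:
            best, best_cost = partial[:], total(partial)
    return best, best_cost

rng = random.Random(42)
cost = [[rng.randint(1, 20) for _ in range(8)] for _ in range(8)]
print(lns_assignment(cost))
```

In a CP solver the repair step would typically be a short run of propagation and search over the freed variables rather than a greedy rule, which is exactly where the two paradigms meet.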

ORlimpics: Making an OR medal list from OR-Exchange scores and badges

After reading Michael Trick’s blog post on ways of scoring countries according to the medal list, I wondered how to come up with a similar list for the OR world. After all, we have OR-Exchange, an online community where users are scored for their questions and answers and recognized with badges for certain achievements (gold, silver, and bronze badges: quite similar to the Olympic Games). The results that I found provide an interesting picture of the community. In fact, I was glad that Brazil ranked better in this list than in that of the 2012 Summer Games!

I decided to summarize the results of all participants with a score of at least 200 (at first, I wanted to consider at least 100, but there were too many people without a declared country in the 100-200 range, so I kept a smaller but more reliable sample). I counted each badge as a medal, and I also considered two extra sets of medals to give due weight to the participants’ scores.

The first set is the equivalent of the marathon: who are the most diligent contributors? Gold for Paul Rubin, silver for DC Woods, and bronze for Bo Jensen. The second set accounts for teamwork: which countries had the highest combined scores? Gold for the USA (Rubin alone guaranteed that), silver for Germany (mostly due to Florian Bahr and Marco Lübbecke), and bronze for Denmark (Bo Jensen again).

After counting those, the final ranking was the following:

 #  Country      Participants w/ score > 200  Total score  Gold  Silver  Bronze
 1  USA          18                           25315        2     29      150
 2  Denmark      2                            3854         1     3       17
 3  Sweden       1                            211          1     1       5
 4  India        6                            3237         0     7       54
 5  Germany      5                            4340         0     7       32
 6  Australia    1                            3787         0     5       26
 7  Brazil       2                            2599         0     3       17
 8  Greece       2                            1078         0     3       13
 9  Ukraine      1                            253          0     3       6
10  Iran         1                            1925         0     2       12
11  Canada       1                            299          0     2       10
12  Iceland      1                            680          0     2       7
13  Belgium      2                            2445         0     1       18
14  Finland      1                            413          0     1       10
15  Singapore    1                            305          0     1       7
16  New Zealand  1                            231          0     1       7
17  France       1                            333          0     1       5
18  UK           1                            236          0     0       2

 

Kudos to the OR-Exchange maintainers

Finally, even if it is not part of the medal list ranking, we must account for something similar to the Pierre de Coubertin medal, which is offered to athletes who embody the true Olympic spirit. In the context of OR-Exchange, the first thing that comes to my mind is the effort of some individuals, and of INFORMS as well, to keep OR-Exchange up and running properly. Among the top scorers on the website, I know for sure that I can count Michael Trick, Mary Leszczynski, and Herman Strom as medalists for their effort on that. There are probably other “OR athletes” who deserve this medal, so I will consider this list open-ended and would appreciate any comments suggesting other contributors to include.

The unsustainable lack of analytical skills for a sustainable environment

Developing an analytical model to support decision-making is a really tricky task. I built many of them in order to decide whether it was a good idea to buy an apartment or to keep renting. What differed among them was the level of detail taken into account, which progressed slowly as I learned from my previous mistakes. While most of my models pointed out that it was better to keep the money in a savings account, without borrowing anything, than to buy an apartment for the rent it could earn (or the rent I would avoid paying, since I had to live somewhere), I often forgot to account for the possibility that apartments would appreciate considerably in the near future. Looking back, I see that my landlord made an excellent investment by renting to me and waiting for what was to come. Nevertheless, that moment is gone, and it does not make any sense to bet that real estate in São Paulo will keep rising in price as it did in recent years.
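For the curious, here is what such a model looks like at its crudest: a sketch with made-up numbers (no taxes, maintenance, financing, or rent inflation; every parameter is hypothetical). The appreciation rate is precisely the term that is easy to leave out, and the one that can flip the answer.

```python
def buy_vs_rent(price, monthly_rent, savings_rate, appreciation, years):
    """Crude comparison of final wealth after `years`:
    - renting: the purchase money stays invested, but rent is paid out of it;
    - buying: the money becomes an apartment that appreciates, and no rent is paid.
    Everything besides interest, rent, and appreciation is ignored."""
    renting_wealth = price
    for _ in range(years):
        renting_wealth = renting_wealth * (1 + savings_rate) - 12 * monthly_rent
    buying_wealth = price * (1 + appreciation) ** years
    return round(renting_wealth), round(buying_wealth)

# Same hypothetical apartment, two assumptions about appreciation.
print(buy_vs_rent(300_000, 1_500, savings_rate=0.08, appreciation=0.02, years=10))
print(buy_vs_rent(300_000, 1_500, savings_rate=0.08, appreciation=0.12, years=10))
```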

When it comes to a trending topic like sustainability, which is supposed to represent people’s concern with protecting the environment for future generations while exploiting it for their own survival, I have often been told of misconceptions and lost opportunities that made many initiatives in that regard produce the opposite effect. And what is worse: as conceived, sustainability is a great but sometimes overlooked application domain for leveraging analytical tools.

For instance, I was told of a workplace where someone had the idea of asking for shelves above the toilet washbasins. Since the washbasins were often wet, people had gotten used to placing a paper towel under their toiletry bags. As an experiment, such shelves were installed in half of the toilets on a floor. Some time later, people started to ask when those shelves would be installed in the other half, since the experiment seemed to be successful. However, they were informed that someone had to collect data to assess whether fewer paper towels were being used in the first half of the toilets. Since many people did not mind walking farther to use the toilets with shelves, it is very likely that the person running the experiment will be surprised to find that paper towel use actually increased in that half and decreased in the other. I hope the person remembers to sum the use over both halves to see what happened, instead of concluding that shelves increase the use of paper towels.
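With some entirely made-up numbers, the confounding is easy to see: per-visit use can drop while the shelf half still shows an increase, simply because visits migrated there.

```python
# Entirely hypothetical numbers, just to make the confounding concrete.
visits_before = {"with_shelves": 100, "without_shelves": 100}
visits_after  = {"with_shelves": 150, "without_shelves": 50}   # people migrate to the shelf toilets
towels_per_visit_before = 2.0
towels_per_visit_after  = {"with_shelves": 1.5, "without_shelves": 2.0}

before = {k: v * towels_per_visit_before for k, v in visits_before.items()}
after  = {k: visits_after[k] * towels_per_visit_after[k] for k in visits_after}

print(before, sum(before.values()))  # {'with_shelves': 200.0, 'without_shelves': 200.0} 400.0
print(after, sum(after.values()))    # {'with_shelves': 225.0, 'without_shelves': 100.0} 325.0
# Per half, shelves look bad (200 -> 225); summed, total use fell from 400 to 325.
```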

The following quote is attributed to Ronald Fisher: “To consult the statistician after an experiment is finished is often merely to ask him to conduct a post mortem examination. He can perhaps say what the experiment died of.” While it would be nice to count on professional help, I believe that we would be better off if people working with sustainability were aware of the importance of developing analytical skills, so as to avoid working against their own goals.

An advanced school on handling big data for academic purposes (with full grants for Brazilian and foreign students): São Paulo Advanced School on e-Science

The São Paulo state research council (FAPESP) is sponsoring an interesting school on e-science, which I would describe to this blog’s audience as analytics for academic purposes. They are said to offer full financial support for 25 Brazilian and 25 foreign students. The school will be held next October in the surroundings of my alma mater, Unicamp.

Further information can be found at: http://www.vision.ime.usp.br/~liu/escibioenergy/