These books are serious graduate-level math, but they are the primary sources one would need to understand to learn search theory at a deep level.
These articles are better places to start when trying to wrap your brain around search theory, and may be enough if you only want to use search theory in real searches.
If you are a search planner or hope to be, this is the class you really need to take. It is a week-long class on search theory and its practical application, and you get a lot of hands-on practice during the class actually using the material.
If you attended my class at ESCAPE 2015 and wanted a copy of the slides, you can download them. I have also written up some notes to accompany the slides for those who wanted to attend but could not. The slides are not a standalone substitute for the class, as they were created to be visual aids and props for me to point at, not a complete encapsulation of the class materials.
The actual materials I presented were created and presented using the open-source office suite LibreOffice, and the slides are best viewed in that format with that software. Here are the slides in LibreOffice format.
The slides are intended to be viewed in slideshow mode, and have many custom animations intended to reveal and hide different parts of the information as I speak.
During the preparation for the class I made extensive use of a simple spreadsheet with which I could test out allocations and see how they compared to the optimal allocation from the Charnes-Cooper algorithm. You may download it in LibreOffice ODS or Excel format.
The file has multiple sheets. The first sheet shows the results of the Charnes-Cooper algorithm for two phases of search of the toy problem, with 200 searcher-hours in each phase. It does NOT perform the Charnes-Cooper algorithm, so changing the areas or POCs will not cause recomputation of the optimal allocation. I used it for things like computing the colors to use in the drawings (the _R and _B columns are the red and blue values to use in the paint). The other sheets in the spreadsheet allow exploration of different allocations. The second sheet, "Inverse Effort from POD", lets you enter a desired POD for each area and see how much effort that requires. The numbers currently in that sheet show a particular allocation based on the usual "search the highest-POC areas first" guidance, and show how poor an overall POS that gives for this problem. It also shows how much effort would be required if all areas were searched according to the "Critical Separation" plan.
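The "Inverse Effort from POD" computation is easy to reproduce. Assuming the exponential (random-search) detection function POD = 1 − e^(−coverage) that search theory usually uses, a sketch might look like this (the function name and the unit sweep width and speed defaults are my own illustration, not taken from the spreadsheet):

```python
import math

def effort_for_pod(pod, area, sweep_width=1.0, speed=1.0):
    """Effort needed to reach a desired POD in one area, assuming the
    exponential detection function POD = 1 - exp(-coverage), where
    coverage = effort * speed * sweep_width / area.

    With sweep_width and speed left at 1, effort comes out in the same
    units as area (e.g. searcher-hours if area is in 'searcher-hour'
    units), matching a simple spreadsheet setup."""
    if not 0.0 <= pod < 1.0:
        raise ValueError("POD must be in [0, 1); POD = 1 needs infinite effort")
    coverage = -math.log(1.0 - pod)          # invert POD = 1 - e^(-coverage)
    return coverage * area / (sweep_width * speed)
```

Note the diminishing returns this captures: a 50% POD needs coverage ln 2 ≈ 0.693, but a 75% POD needs exactly twice that coverage, and the required effort climbs without bound as the desired POD approaches 100%.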
The third sheet is set up so that the user can type in any allocation and see how it would perform. Just type times into the "Allocation" column and the other columns will show the effect. The numbers in the final row of the Allocation and POS columns are the total allocated effort and the overall POS, respectively.
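What that sheet computes per row can be sketched in a few lines, again assuming the exponential detection function and measuring effort in the same units as area (so coverage is simply effort divided by area); the function name is mine, not the spreadsheet's:

```python
import math

def evaluate_allocation(pocs, areas, efforts):
    """Given per-area POCs, area sizes, and allocated efforts (same units
    as area, so coverage = effort / area), return the per-area PODs and
    the overall POS = sum of POC_i * POD_i, as the third sheet does."""
    pods = [1.0 - math.exp(-t / a) for t, a in zip(efforts, areas)]
    pos = sum(p * d for p, d in zip(pocs, pods))
    return pods, pos

# Example: two equal areas, equal POCs, each given coverage ln 2
# -> each POD is 0.5, so the overall POS is 0.5.
pods, pos = evaluate_allocation([0.5, 0.5], [1.0, 1.0],
                                [0.6931471805599453, 0.6931471805599453])
```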
The fourth sheet shows some other allocation schemes I looked at while developing the class.
The final sheet shows the "Operationally Viable" searches I presented in the class. For each phase, the optimum allocation was computed, and then effort from all areas with coverage less than 0.5 was reallocated to areas with coverage greater than 0.5, in a manner that gave each area an equal increase in coverage. The second phase is computed assuming the first phase has been done according to the viable plan. The spreadsheet does not do this allocation; it was done externally and the data imported.
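That reallocation rule is mechanical enough to sketch. The following is my reconstruction of the rule as described, not the actual code used for the class: areas below the coverage threshold give up all their effort, and the freed effort is spread over the remaining areas so that each gets the same increase in coverage (which means each receives extra effort in proportion to its size).

```python
def make_viable(efforts, areas, threshold=0.5):
    """Reallocate effort away from low-coverage areas: any area whose
    coverage (effort / area) falls below `threshold` gives up all its
    effort, and the freed effort is distributed so every remaining area
    gets the same increase in coverage. Total effort is preserved."""
    coverages = [t / a for t, a in zip(efforts, areas)]
    keep = [c >= threshold for c in coverages]
    freed = sum(t for t, k in zip(efforts, keep) if not k)
    kept_area = sum(a for a, k in zip(areas, keep) if k)
    if kept_area == 0:
        return list(efforts)  # nothing above threshold; leave plan alone
    bump = freed / kept_area  # equal coverage increase for each kept area
    return [a * (c + bump) if k else 0.0
            for a, c, k in zip(areas, coverages, keep)]
```

Because the coverage increase is equal across the kept areas, the extra effort each receives is proportional to its area, and the freed effort is used up exactly.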
The optimal allocation was computed using a hand-rolled computer program that used the SORAL ("Search Optimization and Resource Allocation Library") library. The official SORAL download site went down years ago, and the library is now much harder to find. For that reason, I've created a GitHub repository with a snapshot of SORAL from shortly before development on it stopped. You can download this SORAL library from my GitHub repository. Note that this site contains SOURCE CODE for SORAL, not any kind of ready-to-run programs. It should be considered geek fodder, not a ready-made tool.
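For readers who want a feel for what the optimal allocation looks like without compiling anything: under the exponential detection function, the optimum repeatedly directs effort to whichever area currently offers the highest marginal POS return, p_i/A_i · e^(−coverage_i). A small greedy discretization of that idea converges on the same answer the Charnes-Cooper algorithm produces in closed form. This is a numerical stand-in of my own, not the algorithm as SORAL implements it:

```python
import math

def allocate_effort(pocs, areas, total_effort, steps=20000):
    """Greedy incremental allocation: split total_effort into small slices
    and give each slice to the area with the highest marginal POS return
    (POC / area) * exp(-coverage). Because the return in each area is
    decreasing in coverage, this converges (within one step of effort) on
    the optimal allocation for the exponential detection function."""
    step = total_effort / steps
    efforts = [0.0] * len(areas)
    for _ in range(steps):
        rates = [p / a * math.exp(-t / a)
                 for p, a, t in zip(pocs, areas, efforts)]
        efforts[rates.index(max(rates))] += step
    return efforts
```

For two equal-sized areas with POCs 0.7 and 0.3 and two units of total effort, this puts roughly 1.42 units on the high-POC area and 0.58 on the other, illustrating that the optimum front-loads the likely area without abandoning the unlikely one.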
Once you've compiled the SORAL library, you can, if you so desire, try to compile the little hacks I created to generate the allocations discussed in the class. Note that these, too, are just C++ source code, not ready-to-use programs. If you don't know how to compile a C++ program and link it against the SORAL library from the previous paragraph, you probably shouldn't bother clicking the links below.
On another note, Don Ferguson has told me that he intends to recode the algorithms contained in SORAL in Python and add them to IGT4SAR, so calculations like these can be done right in the GIS where you're maintaining your search map.