Frequently searched information

An analysis of my sitemeter statistics shows that a large number of people come here via google searches looking for the answers to these simple questions. Since I have enough keywords on this page to catch these questions, I figured I might as well toss the answers up at the top.

Miscellaneous map tools

This had no other normal place on my web site, so I'm sticking it here. This is a PDF file containing 12 copies of a UTM interpolator for 1:24000 scale topo maps. Print it out on transparency sheets, cut 'em out and laminate them in ID card sized laminate, and you will have a useful tool for plotting UTM coordinates on paper maps.

If you have a PostScript printer (or a copy of Ghostscript and drivers for your non-PostScript printer), you might be interested in the original PostScript file from which the PDF was generated. It is a hand-written PostScript program that generates the grids, and can be edited with a plain-text editor to change the scale or the number produced. I produced the PDF version using Ghostscript's "write to PDF" capability.
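If you edit the PostScript and want to regenerate the PDF yourself, Ghostscript's ps2pdf wrapper is the simplest route; something like this (the file names are just placeholders for whatever you call your copy):

 ps2pdf utm_interpolator.ps utm_interpolator.pdf
           (ps2pdf drives Ghostscript's pdfwrite device)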

Shapefiles and tools for Xastir use

I have been using the Xastir APRS client for several years now, and I really recommend it for anyone who wants to play APRS. It runs on almost any Unix-based operating system, including the Cygwin Unix-like API for Microsoft Windows.

I always build Xastir with shapefile and "dbfawk" support. With these two options you can have access to lots of maps and can tune their appearance to your liking. If you build GDAL as well, you can get some tools that you can use to generate more maps.

I have created quite a few shapefiles for my own use, and this page is an attempt to start sharing them. All the files here are meant to be used along with a DBFAWK-enabled version of Xastir, although the shapefiles are perfectly general and can be used for any other purpose, too. I highly recommend the open source Geographic Information System GRASS for those other purposes.

I update the dbfawk files often as I fine-tune how they look. You might want to take a look now and then to see if I have changed them. If you feel I've made a particularly ugly map using my dbfawk files, please feel free to feed your changes back to me.

DBFAWK tutorial

In response to a query on the xastir mailing list, I wrote up a brief HOW-TO for DBFAWK. I hope it helps you.

Converting TIGER/Line data to shapefile format

It has just come to my attention that there is a commercial tool called TGR2SHP that can convert TIGER/Line files to shapefile format. On 13 December 2006 I learned that this tool is now freeware, and is available from http://tnatlas.geog.utk.edu/downloadfree.htm. THE INFORMATION BELOW HAS NOTHING TO DO WITH THIS SOFTWARE. My choice of "tgr2shp" in the name of the dbfawk file I created was purely descriptive of the process by which the shapefiles were created, has nothing whatsoever to do with this product, and was made before I ever knew of the existence of the TGR2SHP tool. Please visit the manufacturer's site for information concerning that product. On my page you will find only descriptions of how to cobble together shapefiles from TIGER/Line data using free tools from the GDAL distribution.

This section is obsolete. The entire set of TIGER/Line data has already been run through ogr2ogr and Xastir_tigerpoly.py and is available in shapefile format at ftp://aprs.tamu.edu/. I'm leaving the description up because it could be useful for folks contemplating other conversions. Furthermore, the file tgr2shp.dbfawk is now part of the Xastir distribution. The version here is outdated.

I have been able to download recent (2006 Second Edition) TIGER/Line data from The US Census Bureau's TIGER web site and convert them to shapefiles. It's a piece of cake. Download the county files you want. Then unzip each one into its own directory (call it "in_directory"). Then cd to in_directory and run the following command:

ogr2ogr -f "ESRI Shapefile" -t_srs EPSG:4326 foo .
This will create a subdirectory of in_directory called "foo" which will contain a number of shapefiles and auxiliary data. The files called "CompleteChain.*" are the ones you want; they are the polylines corresponding to roads and other linear features. Copy these to your xastir maps directory and rename them to the name of the county they represent. Then install tgr2shp.dbfawk in your share/xastir/config directory and reindex maps. Ta da. See above: tgr2shp.dbfawk is now part of the xastir distribution and the version on this web site is no longer being updated.
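If you have a whole stack of county files to do, a small shell loop saves typing. This is just a sketch (it assumes the downloaded county zip files are the only .zip files sitting in the current directory):

 for z in *.zip; do
    d=$(basename "$z" .zip)                  # one in_directory per county
    mkdir -p "$d" && unzip -q "$z" -d "$d"
    (cd "$d" && ogr2ogr -f "ESRI Shapefile" -t_srs EPSG:4326 foo .)
 done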

The option "-t_srs EPSG:4326" is not strictly necessary when used on the TIGER/Line files, but ensures that the output shapefile is projected to WGS84 Lat/Lon if necessary (the reason that it isn't necessary is that the input data is supposed to be in NAD83 Lat/Lon, which is equivalent to WGS84 Lat/Lon for xastir purposes). It will also ensure that the .prj files associated with the output shapes carry the correct projection info you need should you ultimately need to reproject them for a different purpose (e.g. for use in a GIS such as GRASS).

The polylines in CompleteChain are also used as boundaries for polygonal features, but the polygons are NOT represented as shapefile "polygon" features after the ogr2ogr conversion; no amount of dbfawk manipulation can get them to display as filled polygons in xastir. To get area features filled one would have to write a conversion tool that merges information from the PolyChainLink, Polygon, and CompleteChain tables and creates a new shapefile with polygons instead of polylines, and an associated dbf file with polygon attributes.

Some time ago I produced a Python script derived from the "tigerpoly.py" script that comes with gdal. This script is called Xastir_tigerpoly.py, and if you have gdal installed you can use it to generate polygon shapefiles from the same raw TIGER/Line files that you used with ogr2ogr to get polyline files. Look in the "scripts" directory of your CVS checkout version of xastir, or browse the Xastir CVS repository on sourceforge (follow links from the Xastir home page). The file "tgr2shppoly_2006.dbfawk" is also installed with xastir now. The complete set of 2006 Second Edition TIGER/Line data has been converted to both arc and polygon shapefiles, and is available from the TAMU FTP site. If you want to run Xastir_tigerpoly.py yourself, all you really need to do is give it the name of a directory that has a single county's TIGER/Line data unpacked in it and the name of a shapefile to create. If you specify the "-d" option it'll dissolve the common boundaries between small polygons with identical feature names; dissolving these boundaries is key to making prettier maps, but it destroys information that was in the original topological description.
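In other words, the invocation is roughly this (the argument order is just my reading of the description above --- check the script's own usage message):

 Xastir_tigerpoly.py -d in_directory county_polygons.shp
           (-d dissolves shared boundaries between identically-named polygons)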

Be warned: in some areas the TIGER data is woefully inaccurate. I happen to live in one of them (Bernalillo County, NM). There is some evidence of careless digitization of maps, and there is a strong possibility that the source maps might not have been properly rectified or might have been in the wrong datum. In Bernalillo County these errors were not corrected between the 2000 data and the 2003 data, but were corrected substantially in the 2006 Second Edition data.

According to the Census's own recommendations, TIGER/Line data is not meant for use in navigation --- it is specifically tailored to the needs of the US Census and does not meet accuracy standards of other US mapping data. It is, however, readily available and often useful. Please consult the Census web sites for more information before using this data for anything important.

Bernalillo County GIS Files!

Unfortunately, the information in the following paragraphs was found to be obsolete once again on 16 July 2007. Bernalillo county no longer provides road shapefiles on their GIS data download site. Only a few polygon shapefiles are available at this time. The only option for getting Bernalillo County road data will now be the Albuquerque GIS page. Fortunately, I have been able to create a suitable new DBFAWK file (actually a trivial modification of the old one) to go with those, so scroll down to the next section if you're looking for Albuquerque/Bernalillo County road data.

Bernalillo County has a fantastic web site where you can download all manner of current GIS data. It is, however, in HARN 1983 New Mexico State Plane coordinates, Central Zone, in US survey feet (they come with the .prj file necessary to define the input coordinate system). You can download 'em, convert them to lat/lon with ogr2ogr (with "ogr2ogr -t_srs EPSG:4326 newfile.shp origfile.shp"), and use 'em in xastir. They are really nice, and are naturally much, much more accurate than the TIGER/Line data for this county.

The Bernalillo County web site has this data in both "ESRI GeoDatabase" and Shapefile formats. You need the shapefiles, as the GeoDatabase format is unusable in Xastir.

I have, so far, only created a dbfawk file for the "RoadCenterlines" layer, the current transportation network. The labels look horrid, of course, because of the way they're displayed in xastir, but the streets are correct and look good. Here's what you do to use them.

  1. Get the data from the Bernalillo County web site. Click on "Interactive Maps", then on the right side click "Download County Data". Select the "Public Works Road Inventory" zipfile under "Download Shapefiles." This zip file contains all of the files needed to use the data in xastir. You might also want to look at the metadata web page for the data --- it is linked from the table of GeoDatabase files. The field names there won't match the shapefiles exactly (they have been modified slightly), but it is easy to see which shapefile field goes with which. Bernalillo County's GIS staff are meticulous with their documentation of their GIS data, and this metadata is very helpful in crafting usable dbfawk files.
  2. Convert the RoadCenterlines shapefile to lat/lon, using the .prj file as input and the target projection of EPSG:4326 (WGS84 lat/lon):
     ogr2ogr -t_srs EPSG:4326 RoadCenterlines_ll.shp RoadCenterlines.shp 
    
    The resulting shapefile RoadCenterlines_ll.shp and its associated .shx and .dbf file will be directly usable in xastir.
  3. Download my dbfawk file
  4. Move RoadCenterlines_ll.* to an appropriate map directory that xastir can use, and index new maps. Tune the dbfawk file to taste.

One could extend this process to the other shapefiles at the bernco.gov site. At one time I did so for the airports shapefile, but this is no longer available.

The much older "netcurr" shapefile contained enough information that one could use it to geocode addresses, but the new RoadCenterlines layer does not. The netcurr file on Albuquerque's GIS site does have this information, though, and represents the same road network. See the following section for details.

The good news is that the netcurr files are updated monthly, so they are generally up-to-date with the real roads. The bad news is that when they are updated they are not always regenerated with the same attribute table format, and so the dbfawk file needs to be redone. I discovered on 4 April 2006 that this had happened since I originally made a dbfawk file for them, and that the dbfawk file I had on this site had been outdated for some time. I regenerated it on that date to reflect the new format of the dbf attributes, but I can't guarantee that today's files from Bernalillo County will match. If you find that it no longer works, please let me know; I'll take a look to see what incompatible change has been made, and update the dbfawk file if I can.

City of Albuquerque data

The City of Albuquerque also has an on-line GIS depot. Their site is http://www.cabq.gov/gis/. The data on their site is quite good, although the road network shapefiles are not as nicely set up with attribute tables as the old Bernalillo County shapefiles that contained precisely the same roads. Also, the metadata that goes along with their shapefiles leaves a great deal to be desired. In some cases, what is desired is that there BE metadata. Where there is any metadata at all, some files have attribute columns that are not documented anywhere (for example, no explanation is given of the "FUNCTIONAL" attribute for road data, which is clearly intended to denote what type of road the feature is). Some of the data here is self-explanatory, though, and those files have been quite useful to me.

Note added 30 August: HOWEVER, the "FUNCTIONAL" attribute is described (briefly) in the metadata for netcurr as having come from "MRCOG", which refers to the Mid-Region Council of Governments --- and the very old version of "netcurr" that I had from the old Bernalillo County web site had a "MRCOG_FUNC" field that seems on cursory glance to relate the integer FUNCTIONAL value to an actual street type in text. In my netcurr.dbfawk file for that old file I had actually extracted that information by hand and put it in comments, and I have happily discovered that my old netcurr_ll.dbfawk file needed only some minor tweaking to be usable with the new Albuquerque data.

The data on the City of Albuquerque site is all in New Mexico State Plane Coordinate System, Central Zone, just as is the data on the county web site. They will all need conversion to lat/lon to be usable in xastir. This is done with the usual ogr2ogr command:

 ogr2ogr -s_srs netcurr.prj -t_srs EPSG:4326 netcurr_ll.shp netcurr.shp 

Once you have converted netcurr.shp to netcurr_ll.shp with ogr2ogr, you can just use the dbfawk file I have provided right here. This dbfawk file worked on 30 August 2007, and as long as Albuquerque doesn't change the dbf signature it should work with future versions of the shapefile, too. Using Albuquerque's data for roads should be very useful, and Albuquerque will undoubtedly keep this file updated and current. Update often, and please report to me if you update and find that the dbfawk file no longer works --- that'll mean I have to go back and fix up the dbf signature again.

While you're at it, you might find my dbfawk file for the arroyos data useful, too. This goes along with the "arroyos.zip" shapefile set (once converted to lat/lon, as always), and provides a somewhat useful picture of the arroyo flood-control/irrigation network. I have noticed some glaring omissions, though, especially in the Los Ranchos de Albuquerque area --- the one area where I've actually needed this information in recent months. The data just seems to have a hole there. Fortunately, the census data and road data seem to have a little information to fill in the gaps.

For further work
The netcurr file that the City of Albuquerque offers contains information that could be used to geocode addresses, but the geocoder in xastir is too closely tied to the TIGER/Line data for this to be done easily. With a little effort, it would be possible to use the address geocoder from the Postal Address Geocoder project in xastir. Back when Bernalillo County had their own version of netcurr, I verified that with the schema in this dbf file, it is possible to use the (no longer available) 4 April 2006 version of netcurr.{shp,shx,dbf} to generate a usable geocoding --- pagc takes a dbf file of address and other fields, and can generate a point shapefile to go with that dbf file. The way I did this was:
 ogr2ogr -s_srs netcurr.prj -t_srs EPSG:4326 netcurr_ll.shp netcurr.shp 
           (to create a lat/lon version instead of NMSPCS)
 pagc -bnetcurr_ll -smyschema.dbf 
           (to generate the geocoding database)
 dbfcreate my_addrs -s Name 25 -s Address 25
           (create a dbf file for addresses to look up)
 dbfadd my_addrs "Cuthbert Twillie" "300 Lomas Blvd NW"
 dbfadd my_addrs "Chauncey Hoffenagel" "1500 Menaul Blvd NE"
  [...etc....] (to add addresses to look up)
 pagc -rnetcurr_ll -mmy_addrs
   (to generate the shapefile to go with the address records)

This shapefile could be imported into xastir quite simply with a crude dbfawk file. Building the address lookup capability directly into xastir would be a different project for a different day.

I HAVE NOT TRIED THIS WITH ALBUQUERQUE'S NETCURR FILE and it will certainly need to be changed to work --- the field names in the old Bernalillo County file don't match those in the newer Albuquerque file. But one could study the schema file and tweak it to work properly if one cared enough. I don't care enough at the moment, but it is definitely a worthwhile thing to pursue.

USGS data layers

I have obtained a number of USGS SDTS DLG transfers from the USGS EROS Data Center. You can't read these into xastir directly (yet --- you will be able to when full GDAL/OGR support is added), but you can convert them to shapefiles using the utility "sdts2shp" that is available in the gdal source tree. It doesn't get built automatically; you have to go down to the "frmts/sdts" directory and build it yourself. It's worth the trouble.
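As best I recall, sdts2shp takes the catalog (CATD) module of the transfer and an output shapefile name, roughly like this --- but run it with no arguments to see its real usage message before trusting my memory:

 sdts2shp 1234CATD.DDF -o roads.shp
           (1234CATD.DDF stands in for whatever the CATD module in your transfer is called)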

The SDTS DLG transfers are line graphs of features that are on USGS 7.5 minute quadrangle topo maps. If you're lucky, the USGS has put recent SDTS line graphs on their web site. Unfortunately, some of the SDTS files are from previous revisions of topo maps, and you can only get updated data by buying it in "DLG-3 Optional Format." More on that later.

You'll probably want the sdtsdump utility to view the metadata for these files. Look at the "IDEN" and "XREF" files to be sure what year the thing was last updated and the map datum, respectively. "NAX" means "NAD83" and "NAS" means "NAD27".

I have had more and more trouble finding source code for the sdtsdump and other simple utilities over the years, but last time I did an hour-long google search, I finally found it at ftp://ftp.blm.gov/pub/gis/sdts/dlg/c_code.zip. Snag it now. Five years ago this code was all over the place.

NOTE: the SDTS files all have coordinates in UTM. You need to convert them to Lat/Lon. If sdtsdump on the XREF file shows "NAS" for the horizontal datum, you'll also need to do a datum shift. See below.

Metadata

There's a special place in hell for people who create purely arbitrary "geospatial" data and make it available to others with poor metadata. --- Gerry Creager
I am sure there's already a special place there for me, but I don't need another. On 21 Dec 2004 I went through all of my tar files here and made sure that each had sufficient metadata for them to be used outside of xastir.

All the shapefiles here are in WGS84 Lat/Lon unprojected coordinates. In the case of any files I have converted from some other coordinate system, I've provided a ".prj" file that can be used with OGR to guarantee proper georeferencing. This prj file is in the WKT format, and is unusable by "shpproj" (which expects the .prj file to be nothing more than its own command line arguments) or Global Mapper (which produces a human-readable .prj file). Knowing that all files are in a projection/datum of "lat/lon" and "WGS84" (EPSG number 4326) should be adequate.

The Wilderness areas shapefile contains the original metadata text file that came with the data when I downloaded it from the National Atlas.

For shapefiles derived from USGS data products, please consult the USGS EROS Data Center for details. All I have done with them is convert to shapefile format and change the coordinate system to lat/lon.

Yeah, yeah, gimme some free files

I have the following shape files generated from USGS SDTS data (they're all road data layers from around Albuquerque, New Mexico). All of these shapefiles have been converted to NAD83 lat/long from the original NAD83 UTMs in the SDTS files. To view them best, please put sdtsrd.dbfawk and sdts2rd.dbfawk into /usr/local/share/xastir/config. Two are necessary because when a file has NO named or numbered roads, the SDTS file has no column for the road number --- that means two otherwise-identical dbfawk files with two different DBFINFO signatures.

DLG optional format

The USGS's older format for digital line graphs is the "DLG Optional" format. There are no free converters that will work for you. I downloaded a 14-day free evaluation copy of FME Desktop Suite from Safe Software, but it's a very expensive package, so I can't afford to do this operation often. An alternative to FME is the much cheaper, but still pricey, Global Mapper. This is a commercial product originally based on the USGS DLG viewer dlgv32, but with export capability.

I have two dlg optional road data layers that I purchased directly from USGS along with lots of other data. I used FME Desktop to convert them to shapefiles, and generated a dbfawk file that makes them look just like the SDTS files above. If you buy your own dlg optional files and convert them, you might have to modify the dbfawk file to get the signature and field names right.

US Forest Service CFF (Cartographic Feature Files) format

By doing a web search, I was able to find a repository of GIS data files in ArcView "Export" (e00) format for all the Cibola National Forest Ranger Districts. I'm still working on the process, but I've converted one of them already. Use it in good health. The conversion from E00 format also needed FME Desktop Suite, and I had to play around with it a lot. The E00 format files have tons of info in them that is not relevant to all data layers, and FME Desktop Suite's default action on importing the E00 file isn't what I wanted. If you make a similar effort you are likely to produce shapefiles that won't work with my dbfawk file.

Note: since writing all this, I discovered an open-source route to converting e00 files to shapefiles. You need avcimport and possibly e00conv from http://avce00.maptools.org. I have not yet gone back and redone these conversions for the USFS data, because doing so will certainly involve redoing my dbfawk files, but I have checked that the process works. The USFS data I had was in compressed e00 files, so you need to use e00conv to uncompress them, then use avcimport to turn them into ArcView binary coverages. Then you use ogr2ogr to convert the coverages to shapefiles.
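The open-source route looks roughly like this. The exact arguments for e00conv and avcimport are from my memory of the avce00 tools, so check their usage messages, and the file names are just placeholders:

 e00conv district.e00 district_u.e00
           (uncompress the compressed e00 file)
 avcimport district_u.e00 district_cov
           (turn it into an ArcView binary coverage directory)
 ogr2ogr -f "ESRI Shapefile" district_shp district_cov
           (convert the coverage to shapefiles; add -t_srs EPSG:4326 if it also needs reprojecting to lat/lon, as described below)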

The USFS has a geodata clearing house, but the files there are much less useful than the ones Cibola National Forest has on their site. The CNF files have trail, stream and road names and numbers, and very few extraneous lines. This isn't the case for the geodata clearing house files.

Note how different the USGS and USFS versions of the trails are. I believe the USFS data are more current, and they certainly show more trails.

The labels can get ugly, but where the USGS topo maps don't show a trail at all, the USFS line graphs add the trail *and* its name to your display. I like it a lot, even though the labels are kinda hard to read.

US Forest Service --- Cibola National Forest

Today, 13 July 2006, I noticed that the Cibola National Forest web site has a new GIS data page. This was updated only in the last week or so, as last week I found that they had removed their older GIS page altogether. I have not yet started using their data, but there does seem to be some especially useful stuff, including US Forest Service modified topo quads in Arc/Info geotiff format (which means they're missing all the geotiff tags needed to make them directly useful in xastir, unfortunately). The page in question is on the Cibola National Forest Projects and Plans page. To get to the topo quads, you need to navigate to the US Forest Service's Geodata Clearinghouse. To use the quads, you'll need to add appropriate geotiff tags to carry the coordinate system. The specific coordinate system (projection and datum) information needed is contained in the metadata, so it's not that hard. Most are in UTM with NAD27 datum, so you'd just use gdal_translate with the "-a_srs EPSG:26713" option to note that they're UTM zone 13, NAD27.
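For such a quad that amounts to something like this (the file names are placeholders):

 gdal_translate -a_srs EPSG:26713 quad_raw.tif quad_georef.tif
           (writes a copy of the tiff with the UTM zone 13, NAD27 coordinate system tag attached)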

As I experiment more with this data (including some of the new vector data) I'll put more up here about it.

Miscellaneous

Tools

Shapefile splitter

These two programs can be used to split up shapefiles into manageable chunks. Both of these programs are trivially compiled. On my system, where libshp.a is in /usr/local/lib, and the associated header files are in /usr/local/include, I compile with:
cc -o split_shape_by_attribute -L/usr/local/lib -I/usr/local/include split_shape_by_attribute.c -lshp
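The other one, split_shape_by_bbox, compiles the same way (assuming its source file follows the same naming pattern):
split_shape_by_bbox.c -lshp
cc -o split_shape_by_bbox -L/usr/local/lib -I/usr/local/include split_shape_by_bbox.c -lshp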
Running them is just as simple. As always, if you have GDAL and OGR installed, the ogr2ogr utility offers far more power for splitting shapefiles by feature type and region. One uses an SQL-like "WHERE" clause and/or a bounding box specification:
 ogr2ogr -f "ESRI Shapefile" -where "Foobie='Blargh'" -spat 35.5 -105 36.7 -104 bar Foo.shp
will create a subdirectory bar that contains the features of Foo.shp for which the field Foobie has value Blargh and which lie in the same bounding box as for the split_shape_by_bbox example above. More complex where clauses are allowed, but you should consult with some SQL documentation for syntax. (Note: the order of the single and double quotes in the where clause are important. The entire expression needs to be in double quotes, and the value tested needs to be in single quites. It was wrong on this page until 25 June 06. My apologies to anyone who tried and couldn't get it to work this way.)
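For instance, combining tests works as you'd expect (the attribute names here are still just the made-up placeholders from above):

 ogr2ogr -f "ESRI Shapefile" -where "Foobie='Blargh' OR Foobie='Blorgh'" bar2 Foo.shp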

Projection conversion utility

I have looked over the source code for the "shpproj" utility that's included in the shapelib distribution's "contrib" directory. It does not do reprojections properly if the source and destination have different datums (it does them in two steps, source projection to "geographic" and then "geographic" to destination projection, which is just plain wrong if the datums don't match --- a datum shift needs to be applied). It appears that this utility was designed around an older version of the PROJ4 library.

The proj-4.4.5 projection library has a more general transform function, and comes with a program called "cs2cs" that you can use to transform almost any coordinate system in any datum to almost any other. The datum shifts work fine for NAD27, WGS84/NAD83 and GGRS87. But you have to have the coordinates in ascii text.

I hacked on cs2cs and merged its function with shpproj, and produced "shpcs2cs". Help yourself to the source. Very little of this code is my own. I started with cs2cs and adapted it to read and write shapefiles instead. It will do proper NAD27 to NAD83 conversions of shapefiles, so if you have an NAD27 UTM shapefile you can produce a WGS84 Lat/Lon file for use in Xastir.

Only the New Mexico State Police District files above needed that conversion from the original I received to the one I'm distributing here, which was done by saying:

shpcs2cs uspbound uspbound_ll83 +proj=utm +zone=13 +datum=NAD27 +to \
                                +proj=latlong +datum=NAD83

Using OGR2OGR for shapefile conversions

Note that the utility "ogr2ogr" that comes with GDAL can also do proper datum conversions and reprojections, and if you have GDAL installed it is a better choice because it is more flexible and more powerful.

To use ogr2ogr for this purpose you need to have a "Well Known Text" (WKT) description or an EPSG number for the input and output projections. Sometimes you'll luck out and have a .prj file along with your source shapefiles that ogr2ogr will use. So long as the .prj file actually has a valid WKT in it, ogr2ogr will know what projection and datum the source file is in. If the .prj file was created by shpproj, however, it will be useless except for use with shpproj.

You can get the complete list of WKTs and associated EPSG numbers from this zipfile at the geotiff projections list.

The EPSG numbers for NAD27 UTM are 26700 + UTM zone (north of the equator only, of course, and only for zones 3 to 22). EPSG numbers for NAD83 UTMs are 26900 + UTM zone.
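For example, NAD27 UTM zone 13 is EPSG:26713 and NAD83 UTM zone 13 is EPSG:26913.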

The EPSG number for WGS 84 lat/long is 4326.

Therefore to convert a shapefile that is not in WGS84 Lat/Lon and which has a correct .prj file into a shapefile usable by Xastir, you can do this:

  ogr2ogr -f "ESRI Shapefile" -t_srs EPSG:4326 OutputDirectory InputShapefile.shp 

This will use the .prj associated with the shapefile and the EPSG number for WGS84 lat/long to handle projection and datum shift in one fell swoop.

If your shapefile doesn't have a correct .prj file you'll need to find metadata that tells you what the source projection and datum are. Once you know that, you can force ogr2ogr to use the appropriate information as the input projection. Say, for example, you want to convert a file you know is in a UTM projection in UTM Zone 13 (North), and is in NAD27. You'd do this to get it usable by Xastir:

  ogr2ogr -f "ESRI Shapefile" -s_srs EPSG:26713 -t_srs EPSG:4326 OutputDirectory InputShapefile.shp

Your OutputDirectory will contain the new shapefile in WGS 84 lat/long, with an appropriate .prj file should you ever want to reproject/datum shift it again.

You can also create a new .prj file for an existing shapefile using the data in the "WKT" zip file I mentioned earlier. Just copy the entire line into a .prj file with the same base name as your shapefile, and ogr2ogr will be able to use it without your having to specify it on the command line every time.

An example of a .prj file for NAD27 UTM Zone 13 (North) would be:

PROJCS["NAD27 / UTM zone 13N",GEOGCS["NAD27",DATUM["North_American_Datum_1927",SPHEROID["Clarke 1866",6378206.4,294.978698213901]],PRIMEM["Greenwich",0],UNIT["degree",0.0174532925199433]],PROJECTION["Transverse_Mercator"],PARAMETER["latitude_of_origin",0],PARAMETER["central_meridian",-105],PARAMETER["scale_factor",0.9996],PARAMETER["false_easting",500000],PARAMETER["false_northing",0],UNIT["metre",1]]

For further data

If you need more GIS information from New Mexico, you might want to look at the University of New Mexico's Resource Geographic Information System project. Most of the USGS information is available here, including digital orthographic quads, digital line graphs, etc. Be aware, though, that the site is very slow (probably due to the large number of downloads it is subjected to), so don't think you can just go there and grab everything at once. Some of it needs conversion before there's any hope of using it in xastir; for example, digital orthographic quads are usually in MrSID format, a proprietary compression format that can only be converted to GeoTIFF using a product of LizardTech, Inc. I was able to decode the MrSID format topo maps from the UNM site using the Linux version of MrSID Decode to create GeoTIFF files, but the orthoquad photos caused a segfault. The Windows version, however, converts the orthoquads from MrSID to GeoTIFF just fine.

Despite the wealth of map data on the UNM RGIS site that is worth downloading, I don't recommend getting USGS topo maps from the UNM site. The few I looked at were significantly older than the ones from the LANL "New Mexico Search and Rescue Resources" site --- for example, on the UNM site the Sedillo, NM map is dated 1978, but at LANL they have one from 1997. Further, while the mrsiddecode utility was able to create a GeoTIFF file out of the MrSID compressed data, it contained NONE of the metadata necessary to get Xastir to use it --- nothing about the projection, nothing about the map datum. The orthoquads, however, are just fine once converted from MrSID to GeoTIFF and contain all projection and datum information.

As I mentioned in the earlier paragraph, some SAR volunteers at Los Alamos National Laboratory have created a fabulous repository of USGS topo maps of reasonably recent vintage, covering (almost?) all of New Mexico at three different map scales (1:24000, 1:100000, 1:250000). The GeoTIFF files at sar.lanl.gov are directly usable in xastir with no modification. One thing, though, is that you have to dig around a little to realize where the ".fgd" files are hidden --- xastir uses these to clip the map collar for seamless tiling of maps. As you click on links you'll see that the .tif files that contain the maps themselves are in a "data" directory underneath several other directory levels. The .fgd files are in a parallel "metadata" directory. That is to say, for the Tijeras, NM quad, which is map "o35106a4.tif", you'll find it at http://sar.lanl.gov/topo_maps/35106/data/o35106a4.tif, and the corresponding ".fgd" metadata file is at http://sar.lanl.gov/topo_maps/35106/metadata/o35106a4.fgd. Put these two files together in a directory that Xastir can find and you'll get the Tijeras quad with its collar stripped.
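For example, to grab that pair in one go (any downloader will do; wget is just what I'd reach for):

 wget http://sar.lanl.gov/topo_maps/35106/data/o35106a4.tif
 wget http://sar.lanl.gov/topo_maps/35106/metadata/o35106a4.fgd
           (drop both files into the same Xastir map directory and reindex)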

A note about USGS topo map naming

The file names for the USGS DRG ("Digital Raster Graphics") files appear at first blush to be somewhat arcane, but they tell you where the quad is located. Dissecting the Tijeras quad's name, "o35106a4.tif": the first letter, "o", tells you that it is a 1:24000 scale map ("f" is 1:100000, "c" is 1:250000). The next two digits tell you that the first two digits of the quad's latitude are "35" degrees, and the next three give the first three digits of longitude. The "a" tells which 7.5 minute chunk of latitude between 35 and 36 degrees North the quad occupies, and the "4" tells which 7.5 minute chunk of longitude between 106 and 107 it occupies. In a 1-degree rectangle, the A1 quad is in the southeast corner, and the H8 quad is in the northwest corner (for 7.5 minute quads, because there are 8 7.5 minute quads in a degree). Clear as mud?

Obtaining a list of quad names for a given state

The "where" clause in ogr2ogr can be used on the 24kgrid.shp file from gisdatadepot.comto get you a shapefile of quads for a given state. Here's how. Neat fact: the "MRC" column of the dbf database associated with the 24kgrid.shp file contains exactly the information you need to determine the filename that USGS uses. For example, the MRC column in the "NMQuads.shp" for the Tijeras, NM quad is "35106-A4". I've used this fact to construct one-liner "bash" scripts (that I haven't saved, so don't ask) to download whole batches of maps from various sources --- including DOQQs and DEMS from RGIS and DRGs from LANL.

Enjoy. Mail to me if you have comments on this stuff.