Posted 16 December 2008 - 09:10 PM
Dan Goldberg here from the USC GIS Research Lab. Just found this site recently and glad to join. I'm a phd student in computer science focusing on all things geocoding. The maps I make are currently hideous looking and I'm hoping to improve my cartographic skills with the info I can find on this site. I've seen lots of good stuff on here so far and I'm looking forward to reading more.
Posted 18 December 2008 - 10:38 AM
And remember, the fact that you recognize you have a problem means there's hope!
Posted 18 December 2008 - 01:59 PM
Welcome! What's your research subject? What software are you using to make maps?
I work on determining, predicting, and (hopefully) improving the overall error of the geocoding process. In particular, I'm working toward deriving quantitative spatial error metrics instead of the qualitative values one gets back from most geocoding software, e.g., "street-level", "city-level", "ZIP code-level". The former lets researchers attach a level of confidence to their geocoded data that can be accounted for in research studies, while it can be argued (by me) that the latter mean nothing. As the simplest example of why the current qualitative reporting practices are bad, consider that USC is its own ZIP code of about one square mile, while the ZIP code next to USC, 90007, is about 5 times as large, and the ZIP code covering most of Palos Verdes is about 20 times as large. So clearly just reporting the qualitative value "ZIP code-level accuracy" should not inspire much confidence or trust in one's data, because the in-class variation is so large. This is particularly troubling if these qualitative codes are being used to judge whether geocodes are suitable for small-area analysis, such as epidemiological research into pesticides that travel less than 1000 m in the air. The same "what does that accuracy value actually tell me?" concern arises for the many types of linear interpolation and other areal-unit interpolation outcomes: street-level, parcel centroid, etc. Riveting, I know....
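To make the in-class variation concrete, here's a minimal sketch of one way to turn a "ZIP code-level" match into a rough quantitative bound: the radius of a circle with the same area as the ZIP polygon. The areas below are the illustrative figures from my example above, not surveyed values, and the ZIP codes other than 90007 are my own stand-ins.

```python
import math

# Illustrative ZIP-code areas (square miles); exact figures are made up
# to match the rough ratios in the example above.
ZIP_AREA_SQ_MI = {
    "90089": 1.0,   # USC's own ZIP, ~1 square mile
    "90007": 5.0,   # the ZIP next to USC, ~5x larger
    "90274": 20.0,  # most of Palos Verdes, ~20x larger
}

def error_radius_mi(area_sq_mi):
    """Radius of the equal-area circle for a ZIP polygon: a crude bound
    on how far a centroid-assigned point may sit from the true location."""
    return math.sqrt(area_sq_mi / math.pi)

for zip_code, area in ZIP_AREA_SQ_MI.items():
    print(f"{zip_code}: 'ZIP-level' could be off by ~{error_radius_mi(area):.2f} mi")
```

So the same qualitative label "ZIP code-level" spans roughly a half-mile potential error in one case and over 2.5 miles in another, which is exactly why a single numeric metric is more useful than the class label.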
For mapping I tend to use the Google APIs whenever possible. When I can't, I use ArcMap, which is where my real trouble starts with what I'm told (by people who seem to make good-looking maps) are basic cartographic principles...
Posted 25 December 2008 - 02:08 AM
This is particularly troubling if these qualitative codes are being used to judge whether geocodes are suitable for small-area analysis, such as epidemiological research into pesticides that travel less than 1000 m in the air.
Interesting! My PhD research and other projects I've worked on focused on predicting household pesticide exposure by spatially modeling the movement of agricultural pesticides into house carpet dust for epi studies. For a study in Iowa we knew the geocoding would be off, so we took GPS readings at each sampled house, then used both plus aerial photography to derive a better spatial location. It didn't really matter for urban locations, but it was really important for rural addresses. There were only around 110 residences, so it was doable.
For a California study we found a different problem with rural addresses. Houses tended to cluster at intersections and the interpolated geocoded address could be WAY off.
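That failure mode falls out of the standard address-range interpolation scheme: the geocoder places a house proportionally along its street segment by house number, so a rural house that actually sits near the intersection gets pushed toward mid-block. A toy sketch (all coordinates, ranges, and the "true" location below are made up):

```python
def interpolate_address(house_number, range_low, range_high, seg_start, seg_end):
    """Classic address-range interpolation: place the house at the
    proportional position along the street segment's two endpoints."""
    frac = (house_number - range_low) / (range_high - range_low)
    x = seg_start[0] + frac * (seg_end[0] - seg_start[0])
    y = seg_start[1] + frac * (seg_end[1] - seg_start[1])
    return (x, y)

# A 1 km rural segment addressed 100-198, coordinates in meters.
est = interpolate_address(150, 100, 198, (0.0, 0.0), (1000.0, 0.0))
true_loc = (30.0, 0.0)  # the house actually clusters near the corner
print(est, "vs true", true_loc)
```

House 150 lands just past the middle of the segment, roughly 480 m from where the house really is, even though the geocode would be reported as "street-level" accuracy.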
In contrast, for another study I drove around San Antonio, Texas, comparing geocoded addresses to GPS readings in front of the address. Those urban locations were very well behaved.
I know some epi researchers who would be interested in the results of your research.