Thursday, October 15, 2009

SDI for Everyone

The topic of verbose metadata versus YouTube-style metadata (a title and a video) in the context of Spatial Data Infrastructures is not new. Even publishers of geospatial content struggle with the verbose metadata standards that have been created over the years. Those standards were not written with data discovery in mind; they resulted from the need of analysts to fully understand the data they were about to use, to ensure it fit their purpose.

With the advent of content-sharing sites like YouTube and Flickr, content sharing is no longer limited to the scientific and professional community. Anyone willing to share their content with someone else can now do so. With the increased ease of sharing comes a demand for increased ease of describing the thing you're sharing.

If someone decides to share something, they presumably do so with the intent that someone will find it. If you are trying to sell your car on eBay, you will want people to find your car there, and you'll try to describe it in a way that attracts buyers. A picture of your car may not be sufficient in that case. On the other hand, you don't need to refer to the specifications of every part of the car either.

Different uses, different metadata.

The tools used for discovery differ as well. In addition to going to a specific site (Flickr, YouTube), users expect to find the things they're interested in through their preferred search engine.

These two aspects apply to Spatial Data Infrastructures just as they do to the general audience searching for content on the Web.

With the ESRI Geoportal Extension we aim to address both aspects: support different types of metadata and support arbitrary search tools.

This has led us to support both the verbose FGDC/ISO metadata specifications and the ability for someone to register a geospatial service, extracting just enough information from the resource to support findability. This works for both ESRI ArcGIS Server services and the Open Geospatial Consortium service types.

The second aspect, supporting arbitrary search tools, is addressed through a set of interfaces to the geoportal. These support both the 'traditional' interfaces published by the Open Geospatial Consortium and ISO (CS-W 2.0.2 and ISO 23950) and OpenSearch and generic RESTful interfaces. In addition, we provide an indexable sitemap of the geoportal's content that follows the Sitemaps.org protocol, which is supported by the major search engines.

The Geoportal Extension REST interface supports alerting users to updates to the catalog through GeoRSS notifications that reflect a user's interests. The same interface supports output in KML, HTML, ATOM, and GeoJSON. This has opened the content of the Geoportal to many platforms and search tools. Geospatial users who have desktop GIS tools (ArcMap or the free ArcGIS Explorer) can plug in a simple search tool that leverages these interfaces to find geospatial resources and use them directly in their GIS environment.
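To make this concrete, here is a rough sketch of what a search against such a REST interface can look like. The base URL and the parameter names (searchText, f, max) are assumptions for the sake of illustration, not the documented API; check the geoportal documentation for the exact syntax.

```python
# Minimal sketch of a catalog search against a geoportal REST endpoint.
# The base URL and parameter names below are illustrative assumptions.
import urllib.parse
import urllib.request

BASE = "http://myserver.example.com/geoportal/rest/find/document"  # hypothetical

def search_catalog(text, fmt="georss", max_records=10):
    """Run a free-text search and return the raw response body."""
    params = urllib.parse.urlencode({
        "searchText": text,   # free-text query
        "f": fmt,             # output format: georss, kml, atom, json, html
        "max": max_records,   # maximum number of records to return
    })
    with urllib.request.urlopen(BASE + "?" + params) as resp:
        return resp.read().decode("utf-8")

# The same query can feed an RSS reader (georss), Google Earth (kml),
# or a JavaScript mashup (json), simply by switching the output format:
# print(search_catalog("hydrography", fmt="kml"))
```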

With the Geoportal Extension we are looking to bridge the gap between discovery and verbose metadata, and to support both traditional interoperability specifications and the interfaces currently in vogue.

SDI for Everyone!

The proof of the pudding is in the eating, right? So here we go:
Enjoy!

Monday, October 12, 2009

NSGIC 2.0

Some things were different at NSGIC 2009 in Cleveland, OH. Perhaps it was the proximity of the Rock & Roll Hall of Fame that stirred things up a bit... This time the crowd wasn't just listening to the presentations; they were actively engaged in online conversations on Twitter, following #NSGIC2009. People who weren't at the conference got to participate by responding to the posts and asking their own questions. The presentations are now available online at SlideShare, courtesy of NSGIC. While the conference is over, the conversations will continue. To me, those conversations are one key aspect of realizing the NSDI and Gov 2.0.

During my presentation, I claimed that sharing your current, authoritative geodata through web services with open APIs will help build a platform for transparent and accountable government. Simple as that. Open up those data silos and let others use your data creatively to build applications.

I was honored to co-present with Jerry Johnston, GIO of the US EPA. Jerry introduced the ChesapeakeStat program, which will provide an avenue for publicly tracking and reporting progress on accelerating nutrient and sediment reduction throughout the Chesapeake Bay watershed, including federal lands. It is a multi-state and federal partnership that will result in a set of web services and applications built on those web services.

ESRI supports such public/private partnerships to collaboratively build a Gov 2.0 platform through the provisioning of map services and tasks, a content sharing program, open web mapping APIs, and open-source sample applications. These and more are available at the ESRI resource center at http://resources.esri.com.

Saturday, September 19, 2009

Stability in a dynamic world

Technology is not unlike fashion. What was once considered hip and trendy falls out of fashion, only to return in a retro-improved way after a period of time. I never realized this until I hosted a group of Dutch Waterboards at the ESRI Headquarters in Redlands, CA...

Seeing some of my former clients after some five years was a pleasant distraction from the day-to-day business of projects and product development. It provided an opportunity to recollect some of the work we did ten years ago on creating a common data model for the water boards in the Netherlands, and to look at how the goals of that project were faring: shared application development to reduce redundant investment, less dependency on any particular application vendor, bridging the information gap between organizational units and business processes, and others.

Although the data model has changed its name, its contents are still very much alive. Over the years, the Waterboards have developed several client-server applications that connect to a single data store, thus integrating the various business processes within a Waterboard. All organizational units have the same definition of and information about the objects they manage, including but not limited to waterways, hydraulic structures, permits, and numerous other aspects of water management. This would lead you to think: mission accomplished!

But the group of visitors expressed new desires that have come up in the intervening period and that led to their study trip to the United States: further separating the layers of the application architecture, or using new features in supporting software components, to name a few. Over time, the client-server pattern has been followed by thin-client and other application architectures. More recently, service-oriented architectures and the enterprise service bus (or is it services-buzz?) are in vogue. The accepted styles for user interfaces have changed with those technologies as well. Not that the current application wasn't meeting the initial needs, but the needs have changed with the advent of new technologies and capabilities: how did we ever do without AJAX?

People trade in their cellphones, not because the old one is broken, but because the new ones have more cool features, and the availability of those features creates the need for them. This is the old supply-demand principle of a market economy.

This principle applies to software technologies as well. Where a couple of years ago the use of HTTP GET/POST protocols for the Open Geospatial Consortium (OGC) services specifications was well accepted, there is now a growing need to add support for SOAP/WSDL to these specifications. Not because HTTP GET/POST doesn't work, but because the context in which the OGC services are being used is shifting toward Service Oriented Architectures (SOA) that require the use of specific standards such as SOAP. Note that the data and the data model made accessible through these changing protocols are not affected by the particular choice of protocol.
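To illustrate, here is a sketch of the same OGC request in both styles. The GET request follows the familiar key-value-pair encoding; the SOAP envelope is illustrative only, since the actual bindings vary by service type and specification version, and both service addresses are made up.

```python
# The same WMS GetCapabilities request, wrapped two different ways.
import urllib.request

# 1. Classic HTTP GET with key-value-pair encoding:
get_url = ("http://example.com/wms"  # hypothetical service address
           "?SERVICE=WMS&REQUEST=GetCapabilities&VERSION=1.1.1")

# 2. The same request as a SOAP message for an SOA environment
#    (the envelope shape here is illustrative, not a normative binding):
soap_envelope = """<?xml version="1.0" encoding="UTF-8"?>
<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">
  <soap:Body>
    <GetCapabilities service="WMS" version="1.1.1"/>
  </soap:Body>
</soap:Envelope>"""

soap_request = urllib.request.Request(
    "http://example.com/wms/soap",  # hypothetical SOAP endpoint
    data=soap_envelope.encode("utf-8"),
    headers={"Content-Type": "text/xml"},
)

# Either way, the capabilities document that comes back describes the
# same data and data model; only the transport wrapper differs.
```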

Well, ever noticed what happens if you use SOAP (most noticeable in the shower)? It dissolves and you have to buy new soap! New technologies will come for messaging, information exchange, presentation, aggregation, central versus distributed data management, and what not. Acknowledging this is half the victory in the technology battle: do what makes sense for the foreseeable future, follow the general trends of IT, and assess what new technology fits your business needs, and when.

For the Dutch Waterboards, the constant factor in this ten-year period has been the data model, confirming one of the assumptions that led to the definition of the data model to begin with. The tremendous effort that was put into analyzing entity-relationship diagrams, class diagrams, data dictionaries, and other exciting materials proved its value in a world of ever-changing technologies.


Appeared in GeoInformatics Magazine (www.geoinformatics.com) in October 2006

The Eye of the Storm

We all know the cliché that in the United States vacations are limited to Christmas Eve, Presidents' Day, and the Fourth of July. Bosses get nervous when their staff asks for a full week off, and two weeks is generally called a sabbatical... You can imagine what happens after a four-week vacation that ends on the day that Hurricane Katrina makes landfall in Louisiana.

It is like hitting the wall of the storm again, right after being in the quiet eye of the storm for four weeks. My mind was set on completing the new version of the Geospatial One-Stop Portal (http://www.geodata.gov), for which I am project manager at ESRI, and on thinking about the next steps in this project.

Little did I know how my schedule would be affected by Hurricane Katrina, which cost many lives, cost business owners their life's work, and destroyed many local governments' databases, IT infrastructures, and buildings.

Apart from the human rescue operations, many GIS volunteers went to the affected areas and helped recreate the data and maps needed by first responders, using equipment and software donated by private industry from across the United States.

Within a week, the United States Geological Survey initiated a concerted and focused effort aimed at creating a comprehensive, seamless database of geographic information for the affected areas. This meant collecting data from many sources in many different formats, assessing which data was useful for the new database, mapping the individual data models to a common data model, loading the data into the new database, and making it available to the people who needed it in the first place. Apart from this effort, data is being collected in the field by first responders, aerial photography is being acquired, and satellite imagery of the area is becoming available.

The amount of data that has become available to date, after Hurricane Katrina made landfall, is enormous. With the initial response activities changing into a recovery operation, the amount of reports, data, and analysis results of the hurricane's effects and of the area's reconstruction that will become available within the next year is mind-blowing.

To find one's way in this sudden wealth of information, metadata catalogs and search capabilities on those catalogs will play a key role. Here is where portals such as the Geospatial One-Stop Portal can make a significant contribution. These portals act as the card catalog of a library, allowing you to browse through a description of the information rather than requiring you to walk through the entire library in search of that one book. The new generation of portals, such as the Geospatial One-Stop Portal, provides more than just a card catalog with a search interface. These new portals offer collaboration tools that allow users and producers of geospatial information to communicate with each other about a specific topic. This collaboration can take place through the sharing of working documents, the listing of and linking to relevant news feeds from a variety of agencies, or participation in chat rooms.

Directly after the storm made landfall, a community of interest was set up on the Geospatial One-Stop Portal and populated with information as it became available. This unstructured collection of resources was useful at the time (it was all there was!). However, as the stream of available data grew, and after the realization that this information channel would exist for many months, a more structured approach became necessary. This structure was provided by metadata about the datasets, clearinghouses, and mapping applications, published to the Geospatial One-Stop Portal. Existing resources already available on the Geospatial One-Stop Portal were updated to include a simple keyword 'Katrina' (and later on 'Rita') to identify resources as relevant to the response and recovery effort. The Geospatial One-Stop Portal has proven its tremendous value as a mechanism for the dissemination of key geospatial resources to the users of those resources.

The hurricanes also made it clear that although we all love the magic of web services, there are situations in which the good old floppy is unbeatable. The fact that the 1.4-megabyte floppy disk has been replaced with a 120 GB FireWire drive aside...


Appeared in GeoInformatics Magazine (www.geoinformatics.com) in October 2005

Traveling Salesmen

With the laptop, the problem of the traveling salesman has found a new dimension. Perhaps it is even time to discuss whether the original criteria of a solution to the traveling salesman problem require reconsideration.

I’m sitting in Terminal 6 at John F. Kennedy Airport in New York, waiting for the flight back to California. For a while now, a gentleman has been looking at me, and clearly something is bothering him. When I hear my boarding call and unplug my laptop from the power outlet, he straightens his back and I sense a definite mood swing. Walking to the gate, I turn around to double-check that I took all my belongings, and I see the man plugging in his laptop and settling into the seat I just occupied. He now surveys the surrounding area like a king overlooking the lands around his castle. Nothing can hurt him anymore.

In my trips to clients across the United States, I have learned the locations of power outlets and phone jacks at different airports. Keep your eyes out for flocks of laptops scattered around pillars or phone booths; these are the signs that power or Internet connectivity is not far away.

Since time immemorial, salesmen have tried to determine how to plan their visits to the clients of the day. The essence of the traveling salesman problem is improving efficiency. In the past, travel meant waiting for the transportation vehicle (horse carriage, train, airplane) to arrive or to reach the next stop.

Nowadays we have mobile offices and wireless Internet connectivity. So instead of looking at the pattern of the lights in the ceiling or the color of the carpet in the airline terminals, we can now spend our time preparing for a meeting, working through the list of e-mails that piled up during a meeting, or writing columns for a GIS magazine.

There are some limitations to the mobile office, though. One of those is battery power. The salesman of today has to bridge the flying time between airports and the time spent at an airport between two connecting flights. This translates into a basic GIS problem: select a route from A to B as a set of flight segments such that the flying time per segment is no more than the available battery life, and such that the battery can be recharged in the time spent between two segments. Account for the loss of battery power due to shutdown and startup when the airplane takes off or lands.
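Just for fun, the constraint can be sketched in a few lines of code. The flight network, battery capacity, layover times, and shutdown penalty below are all invented numbers, purely for illustration.

```python
# Toy search for routes on which the laptop never dies mid-flight.
FLIGHTS = {  # origin -> list of (destination, flight hours)
    "JFK": [("ORD", 2.5), ("DFW", 3.5)],
    "ORD": [("DEN", 2.0)],
    "DFW": [("LAX", 3.0)],
    "DEN": [("LAX", 2.0)],
}
LAYOVER = {"ORD": 2.0, "DFW": 1.5, "DEN": 2.0}  # hours at the gate outlet

FULL_BATTERY = 4.0  # battery capacity in working hours
CYCLE_COST = 0.25   # battery lost to each shutdown/startup cycle

def feasible_routes(origin, dest, battery=FULL_BATTERY, path=()):
    """Yield every route where each segment fits the remaining battery."""
    path = path + (origin,)
    if origin == dest:
        yield path
        return
    for nxt, fly in FLIGHTS.get(origin, []):
        usable = battery - CYCLE_COST   # pay the shutdown/startup penalty
        if usable >= fly:               # the battery survives this leg
            charged = min(FULL_BATTERY, usable - fly + LAYOVER.get(nxt, 0.0))
            yield from feasible_routes(nxt, dest, charged, path)

for route in feasible_routes("JFK", "LAX"):
    print(" -> ".join(route))  # prints: JFK -> ORD -> DEN -> LAX
```

Note that with these made-up numbers, the only surviving route is the one with more stops, which is exactly the point of the next paragraph.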

Note that shortest distance is no longer one of the traveling salesman’s problems. A longer flight might result in more effective working time and may therefore be considered a more optimal solution.

Extending efficiency to 100% leads to an interesting situation. At some point, the battery cannot be charged enough to last the next flight segment, forcing the segments to become shorter. This, however, also results in more shutdowns and startups, which has a negative effect on the remaining battery lifespan. Thus segments get shorter and shorter. The traveling salesman gets stuck halfway to his destination… Traveling further is not sensible, since it is more efficient to stay at the current location.

A solution to this paradox may come in time as technology progresses. Video conferencing or even virtual-reality meetings will become possible in the future, finally solving the traveling salesman problem once and for all: the salesman stays at the office! Until then, sit back, relax, and enjoy the flight.

Appeared in GeoInformatics Magazine (www.geoinformatics.com) in April/May 2003

Address Management

I assume you know your address. But do you realize how often you are asked for that piece of information? And how often have you received mail at your address that was meant for a previous resident? This time I want to discuss some of the issues of managing addresses and how they might be solved using the new buzz: web services.

We came to realize this fact once again upon moving to Redlands. We provided address information for Social Security, driver’s licenses, visas, utilities, taxes, and (to some of our family) more important matters such as cable television and Internet.

For us simple folks, the administrative part of moving to a new address may sometimes be irritating, but imagine that you are a utility company, a bank, or the Internal Revenue Service. Not knowing where to send your bills, or sending bills to the wrong address, can result in a considerable loss of revenue.

For a geographer, knowing where to send a bill is one thing, but knowing where your customers are is another. The well-known process of geocoding allows for geographic analysis that goes beyond the reach of address-based analysis.

One of the prerequisites for the results of this analysis to be valuable is that the addresses used are accurate: not only in the sense that they correspond to the information being analyzed, but also that they actually exist and can therefore be located on the earth’s surface.

Once captured by an organization, address information also propagates through other organizations as part of information exchange, aggregation, and reporting to (for example) higher government bodies. At every stage of this process, people verify the validity of address-related information before actually using it. There is definitely room for improvement here.

In my last column I discussed software development in relation to geographic information systems. One of the laws of software development states that solving errors early in the development process saves time and money compared to solving them at the end of the project. The same applies to having correct addresses. Instead of having a multitude of organizations check the validity of addresses and spend large amounts of time on the geocoding process, we could try to start out with correct address information at the source.

Those of you who have actually set up a geocoding function know that it uses a geographic dataset with street information and address ranges. And you also know that this dataset needs to be maintained, as new streets get created on an almost daily basis. The choice of the area for which the street dataset is acquired and maintained greatly affects the cost of the geocoding function.

With the introduction of web services such as ArcWeb from ESRI, it is now possible to offer organizations a high-quality geocoding function at low cost. The cost associated with maintaining the dataset is shared among a large number of users.

Applications pass an address to the geocoding web service, and the result, consisting of a geographic location, a standardized (!) address, and possibly a map of the surroundings of the address, is returned to the application. This is all done through agreed-upon web interfacing standards, including XML, SOAP, WSDL, and UDDI. If necessary, the requests and results transmitted over the web can be encrypted when address information is considered sensitive and privacy should be protected. Using an application-independent transaction identifier, privacy protection can be enhanced even further.
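As a sketch, the application side of such a call could look like the snippet below. The endpoint and the response fields are invented for illustration (ArcWeb itself spoke SOAP, described by a WSDL); the point is the shape of the exchange: an address goes in, and a location plus a standardized address come out.

```python
# Hypothetical client call to a geocoding web service.
import json
import urllib.parse
import urllib.request

GEOCODER = "https://example.com/geocode"  # invented endpoint for illustration

def geocode(address):
    """Send an address; get back a location and a standardized address."""
    url = GEOCODER + "?" + urllib.parse.urlencode({"address": address})
    with urllib.request.urlopen(url) as resp:
        result = json.load(resp)
    # Assumed response shape:
    # {"x": -117.19, "y": 34.05, "standardized": "380 New York St, Redlands, CA"}
    return (result["x"], result["y"]), result["standardized"]
```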

Equipped with this new web service, the utility companies and all the other organizations can stop guessing who is living in my house, resulting in a better cost-benefit balance for them and less unwanted mail for me. Sounds like one of those win-win situations we’re all waiting for.

Appeared in GeoInformatics Magazine (www.geoinformatics.com) in October/November 2002

GeoInformatics, Geography and Informatics

Over the past years, the work I have done in the field of GIS has been getting closer and closer to my roots in software development. Is it just my own experience, or is the world around us changing?

I started my professional career with a hydraulic engineering company. I worked in the software department and developed applications to support flood early warning and flood damage assessment. After some years I started creating these applications using ESRI’s Arc/INFO (as it was spelled at the time), ArcView, and ArcIMS.

I remember a Saturday evening when a colleague and I were enjoying lasagna while working through lengthy pieces of SQL, trying to figure out why the web application we had developed did not return the desired result for the selected location. One of us coded the thoughts the other came up with, combined with his own ideas and interpretation. The problem had been bugging our development team for a while, and as the deadline came closer we resorted to this extreme measure.

By midnight the problem was solved, and we sat down with a beer and reflected on how the approach we took had proved successful and whether or not it could be put to more practical use. It appeared that the concept of ‘pair programming’ we had practiced was one of the elements of a software development methodology called eXtreme Programming (XP, not to be confused with that recent addition to the existing family of operating systems).

That experience made me realize that successfully developing a GIS application depends on well-known software development principles just as much as any other non-GIS software project does. The fact that we started off with products that deliver a lot of functionality out of the box does not change this.

It may even complicate things. Whatever development methodology is applied (RAD, DSDM, XP, Waterfall, RUP), an application goes through the process of design, coding, and testing in cycles of different lengths. But the final step to success is always user acceptance. When starting from a commercial off-the-shelf product that is being customized, there will always be discussions, when anomalies are found, as to whether an anomaly is part of the customization and thus the responsibility of the developer, or part of the off-the-shelf product and thus the responsibility of the manufacturer.

The fact that this discussion takes place at the end of the project, when the budget is running out and both developer and client want the product to be finished, but working, adds to the complexity. An idea may be to introduce a second type of acceptance test, this time aimed at the developer! Whatever off-the-shelf product, data, or document is supplied to the developer is subjected to an intake test; anomalies are noted and either accepted to exist or lead to modification of the supplied item.

And guess what? Formal acceptance of items delivered to the software team at the starting point of the project or during the project, was one of the things we introduced at this hydraulic engineering company when we started thinking about Software Quality Assurance; nothing new, but still refreshing in a way.

So after some years, my interest in software development has been revitalized, and I now again read about subjects such as software process improvement and test process improvement and how these may contribute to the success of our GIS projects, and thus to the success of our clients. And the success of our clients is our primary concern.

Appeared in GeoInformatics Magazine (http://www.geoinformatics.com) in June 2002

Traffic congestion

Part of our move to Redlands is getting used to dollars, ounces, gallons, and miles. Just when we thought we had missed out on the change to the Euro, we find ourselves doing conversions of our own. Fortunately, one thing has not changed upon moving to the Los Angeles basin: driving a car often means following the red lights in front of you…

Another stable factor so far has been our use of the Internet. We still use the same free e-mail provider as we did when we lived in the Netherlands. We just use it more often and with more people than before.

There is a kind of analogy between the road network and the Internet. First of all, both are collections of connected networks that allow transport between two locations. The networks may differ in type or size. Through agreed-upon standards it has become possible to have objects pass from one computer network to another, thereby allowing an object to travel beyond the limits of its originating network. Similarly, when traveling through Europe, I never had to change tires when crossing the border between two countries.

Just as is the case with the Internet, bandwidth is important on the road network. Bandwidth on a sandy forest road is much smaller than on a 4-lane interstate. And people using both types of networks share a preference for high bandwidth.

The increase in network traffic is the source of the third analogy between roads and Internet: traffic congestion. No matter the size of your driveway or the horsepower of the engine of your car, you will get into a traffic jam just as you turn on the highway. And similarly it doesn’t seem to help to have a cable modem installed to surf the web when everybody else does the same and uses it at the same moment you do.

So, now that we have seen that the two types of networks are similar and share the same problems, can we benefit from this knowledge? Perhaps we can. Over the past years we have been building more roads, thereby increasing the bandwidth. This was not enough to meet the increasing demand caused by the need or desire to travel.

With Internet traffic a different development can be observed. In the early nineties the Internet was accessed through a 9600 bps modem, the happy few having access to a 14.4 kbps one. Since then data volumes have increased exponentially, now allowing us to serve GIS data and functionality over the Internet.

Apparently we have found some solutions that help to minimize traffic congestion on the Internet. Whether it is data compression (car-pooling), or the use of high bandwidth backbones in addition to low-speed home phone lines (treating local traffic differently from long-distance traffic), there should be something we can learn from Internet developments when designing new road plans or addressing traffic problems in our GIS consulting work. Indeed we already apply some form of packaging (as is done with information sent over the Internet) with the introduction of traffic information systems that guide us along a route that has the least chance of jams!

I do not pretend to have the solution to traffic congestion problems. But we tend to keep following a familiar line of thought in solving geographic problems, and that has not always helped us find a more permanent solution. Sometimes the solution to a problem is found in an unexpected location or at an unexpected moment. So be sure to always bring your notepad and pencil, wherever you go, just to make sure you don’t miss that one leap of mind.

Appeared in GeoInformatics Magazine (www.geoinformatics.com) in April/May 2002

Precision Gardening

After living in an apartment complex for three months, we recently moved to a house of our own. What the Bureau of Land Management does for the entire United States, I do for my 9,000 square feet of garden. Working in the field of GIS, I thought about applying some of the available techniques to help me manage my domain.

We thought that we paid a lot of attention to our gardens in the Netherlands; after all, we are the tulip country of the world. How surprised we were to see the amount of attention that is put into gardening here in Redlands. Our newly acquired garden has a built-in irrigation system, and this is not a rare luxury but can be found all over town. Of course, considering that we live in a cultivated piece of the desert, treating the garden properly is a bare necessity.

But what is properly? It is not uncommon to see small streams of water flowing down the road, indicating that some garden is being soaked instead of watered. The questions of life for the new homeowner are when to water the grass, how much it needs, and whether it needs feeding. And what about the roof? I realized that these questions are the urban echo of farmers across the globe wondering how to improve the yield of their land while controlling the cost.

One of the big differences is, of course, scale. To distinguish different parts of a garden, as opposed to multi-acre fields, the resolution of the information needs to be better than one meter, I would say. The high-resolution satellites that are becoming more and more available are rapidly solving this.

Another point is that in our garden we try to keep the grass short instead of having it grow like a field of corn. I am not aiming to optimize yield. My concern is to apply enough water to keep my grass from drying out, without drowning it. I would like a thermostat-like system that tells me when and how much to water the lawn.

Farmers who expect a large increase in yield from applying precision farming are prepared to invest. For us homeowners, the benefits mainly consist of saving money on water. This means that if the precision gardening business is to be viable, it should be a low-cost service. Luckily, we have one very strong point in our favor over farmers: homeowners come in large numbers. The same principles should apply as with cable television or utilities. Having the cable company dig a cable to a single house would make it very expensive. Sharing cost with other users is the key to success.

The system I envision accounts for the amount of rain that has fallen over the past period, the season, the type of soil in the area, and such. It should be possible to enter the type of grass used in my garden. Satellite images are combined with weather forecasts and soil types in the area. These data come from different sources, but with technologies such as those applied in the Geography Network, this should not present a problem. A future generation of the system may even support searching for more accurate data to feed into the model (local weather reports, for example) or for competing models that promise minimal water usage while giving a nice green lawn.
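A first, greatly simplified cut of such a model might look like the following. The grass factors and soil numbers are made up for the sake of illustration; a real service would derive them from satellite imagery, weather forecasts, and soil maps.

```python
# Toy water-balance model behind a "thermostat for the lawn".
GRASS_FACTOR = {"bermuda": 0.6, "fescue": 0.8}          # invented coefficients
SOIL_HOLDING = {"sand": 0.4, "loam": 1.0, "clay": 1.3}  # inches of rain retained

def watering_advice(grass, soil, evapotranspiration, rainfall):
    """Inches of irrigation needed this week (never negative)."""
    demand = evapotranspiration * GRASS_FACTOR[grass]  # what the lawn loses
    supply = min(rainfall, SOIL_HOLDING[soil])         # what the soil kept
    return max(0.0, demand - supply)

# Example: a fescue lawn on loam after a hot week with little rain.
print(f"Apply {watering_advice('fescue', 'loam', 1.5, 0.3):.2f} inches of water")
```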

Think of the size of the market if one could sell this type of geographic service to every garden-owning family in the world. Being able to log in to www.precisiongardening.com and see my garden on screen, with hints on where to apply water and how much, would not only be really cool but could also help save some on the water bill. And saving water is only a natural thing to do in the desert.

Appeared in GeoInformatics Magazine (www.geoinformatics.com) in April/May 2002

Relocation Based Services

No, the title for this column is not misspelled and you did not miss a new development in the field of GIS. The term reflects one of the reasons for this column and my own recent experiences in relocating and the role location-based services appeared to play.

Location-based services are sometimes defined as services based on the position of the user in space. My opinion is that location-based services need not be limited to mobile devices. Logging in to my office network from some desk and finding that my computer automatically configured a printer near me is one of the less obvious examples of a location-based service, although it is available in many office networks. The point here is that the field of application for location-based services might be much wider than mobile devices alone.

So what about relocation? Relocation has settled in my mind lately, since my family and I are about to move from the Netherlands to California. In the process of relocating, at least two distinct locations and a route between them play a part. First of all, I wanted some information on our destination and typed 'go Redlands' in my browser. Apart from a list of 53 sites with interesting information, I also got a great offer for a video camera system that can be used for surveillance purposes. Unfortunately, I'm not a US resident yet, nor am I logged on to a computer in the US, meaning that the offer is not for me...

These Internet marketing campaigns clearly lack a location-based service and did not read Alex van Leeuwen's article 'Geo-targeting on IP Address' in the July/August issue of GeoInformatics! But things are not as bad as they may seem. Part of relocation is finding a new place to live. Surfing the web taught me that realtors are clearly beginning to see the added value of GIS. Some time ago you were lucky to find a picture of the offered houses on the Internet. Nowadays, most real-estate websites offer a map showing the locations of the houses. Some even give additional information on the neighborhood, schools, demographics, and such. To me, these sites qualify as offering a location-based service. The nice thing about the Internet is that once you've followed one of these links, you enter a new 'world' of information and more links to follow…

Beware not to get lost in cyberspace. But then again, being able to get lost is one of the key properties of a world, isn't it? Fortunately the chance of my goods getting lost during the relocation is next to zero. I can actually track the relocation of my goods from my present to my future location. This means that even in the event that the container ship carrying my stuff across the ocean does sink, I will always know where they are. Isn't that comforting?

Location-based services are not a thing of the future anymore and we do not have to wait for UMTS or other high-speed mobile networks and Global Positioning System (GPS) enabled devices. Geo-targeting can be seen as the GPS of the Internet. The nice thing about this is that your IP address is always known when connected to the Internet, and that is independent of your physical location on the earth. That means that Internet services that make use of geo-targeting will travel with you, wherever you go!

We have seen that even in something as common as relocation, one can already experience the benefits of location-based services. All of these services were offered through the Internet. This leads me to conclude that the Internet, with its links and references, is perhaps the largest location-based service provider around. This column will be my own contribution to the expansion of location-based services.

Appeared in GeoInformatics Magazine (www.geoinformatics.com) in September 2001

Internet GIS: Dedicate or Integrate?

The Internet plays an increasing role in GIS, be it for publishing maps or for exchanging data. New GIS applications are Internet-enabled. The current websites containing geographic information are often dedicated to one subject, while the integration of different data sources opens up new possibilities.

My last project at ESRI Nederland consisted of creating a website where professionals can find statistical water-related monitoring data (http://www.waterstat.nl). The application allowed for geographical selection, and for this purpose a so-called map service was created. In a presentation I gave at our annual user conference in the Netherlands, I used this map service to demonstrate the ability of one of the ESRI (http://www.esri.com/) products to use Internet data sources in addition to local data.

During my presentation, I realized that the map service had not been created as a data source for a GIS conference; it was dedicated to serving an Internet application. This led me to visit other websites that serve maps, and there one can see a similar phenomenon. Interesting maps are being used within these applications, and the available Internet maps could be used in many different projects. But this is certainly not common practice, which is a bit sad. After all, how many of us have convinced our customers that GIS is the means of integrating different data sources!

The Internet is widely available to organizations, modern GIS products are able to access data sources over the Internet, and maps can be served through the Internet. These ingredients can be combined into applications that give us access to information from local and national governments, combined with commercial data and data created within a project. Multi-organization projects can use a common dataset where each party contributes by supplying a dataset dedicated to one aspect of the project.

Although the technical requirements may have been fulfilled to a large extent, there are still some hurdles to clear. Currently, organizations are used to preparing datasets that are only used by the organization itself. The possibility of others using their data requires not only the willingness to share information; it also requires the notion that a dedicated dataset might be used in a broader perspective than the project it resulted from.

For a dataset to be suitable for integration with other datasets, one has to know something about its content, its quality, its purpose, and so on. Initiatives such as the Geography Network (http://www.geographynetwork.com) implement metadata standards that allow for the search for that missing piece of data. However, being able to find different dedicated datasets and put them on the same map is only the beginning of integration. Content standardization is just as important.

A good example is the development of a nationwide data standard in the Netherlands, used by water authorities (http://www.idsw.nl/standaarden/model/entiteit_relatie/). The good thing about this standard is that it is the result of an initiative by the water authorities themselves. Different competing GIS vendors participate in this standard by building applications and extending the standard with new data models. These vendors are all members of the Open GIS Consortium. This means that standards are available and in use, both with respect to content and with respect to the technical side of things. Dutch water authorities are now implementing intranet applications based on this common data standard, and some of them are even publishing their maps on the Internet. The step towards being able to use each other’s data is not a big one.

We will have to dedicate ourselves to integrating data. So be a publisher and give us access to your information. It will help us make better decisions and that is beneficial for you too!

Appeared in GeoInformatics Magazine (www.geoinformatics.com) in November 2001

Thursday, March 12, 2009

Motto

Explorers are we intrepid and bold
Out in the world of wonders untold
Equipped with a wit, a map, and a snack
We're searching for fun and we're on the right track

It seems we are indeed on the right track after reading Adena's analysis of the new features on Geospatial One-Stop and its summary on Slashgeo.

ps: that was Calvin and Hobbes

Monday, March 9, 2009

New Geospatial One Stop (GOS) Features Add Content to GIS and Mapping Applications

Check out the What's New page on the Geospatial One-Stop Portal (GOS). It has some new features that focus on making the content more accessible for use outside of the website context.

What I specifically like is the new capability to perform searches in GOS from external web applications using a REST API. The new REST API allows you to issue a simple URL request to GOS and receive responses in any of three formats: GeoRSS, KML, or HTML.
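For example, a request could be as simple as the snippet below. The path and parameter names here are my shorthand, not the official documentation, so check the What's New page for the exact syntax.

```python
# Sketch: fetch GOS search results as GeoRSS with a plain URL request.
# The path and parameter names below are assumptions for illustration.
import urllib.parse
import urllib.request

query = urllib.parse.urlencode({
    "searchText": "watershed",  # free-text search term
    "f": "georss",              # requested format: georss, kml, or html
})
url = "http://www.geodata.gov/rest/document?" + query  # hypothetical path

with urllib.request.urlopen(url) as response:
    print(response.read()[:500])  # first bytes of the GeoRSS feed
```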

Saturday, March 7, 2009

Yikes! I'm blogging!

Hmmm. So this is a blog... I'll have to get used to talking to no one (or all of you). Will anyone actually read what I say or am I alone in the desert? What do you do in such cases? We'll see what inspiration brings. As I'm working on ESRI's GIS Portal Toolkit I run into things around metadata, OGC specs, and such. I'm also trying to think about how GIS Portals and catalogs can help people find geospatial resources that aren't discoverable through your regular search engines. Those will be the kinds of topics I'll talk about here.