Joris Schouteden is the Director of Research and Development for Hexagon Geospatial Luciad Portfolio, based in Belgium. He is a long-time software engineer with specific expertise in web development of 2D and 3D data.
You can learn more about this work by visiting Hexagon's site.
The web browser as a geospatial tool has become less of a novelty over the years, and for good reason. High-speed internet coverage has increased and spread across the globe, allowing more people than ever to access quality geospatial information and processing with minimal taxing of machine resources.
Web apps enhance accessibility for both the GIS professional and the public user. Both parties benefit from faster load times, and from a standardized environment that allows sharing of tailored views and widespread data interoperability.
The mobile nature of web applications is a major selling point. From anywhere (VPN not included), users can securely access their organization's full suite of streamable data. Removing the desktop element also removes the headaches associated with implementation, such as installations and licensing.
Removing the local element also works wonders for speed, as it creates the option to stream data. Rather than waiting for data to download and dealing with the overhead of file management, the user can get straight to work on their data.
This does not mean that your expensive hardware will collect dust. Web applications can be configured on the backend to take advantage of machine resources at varying scales, depending on the end use envisioned by the developer.
Note: It may be worth checking to see if you have hardware acceleration enabled in your browser of choice. This is a factor in allowing your browser's backend access to machine resources.
Streamable data is data that is loaded incrementally. There may be a massive amount of data that is hosted server-side, but through filtering and clever development, the client can rapidly access just the data that is most relevant to their request. This is achieved through data delivery methods such as tiling and multi-leveling.
Tiling is breaking up data into logical spatial chunks. Imagine you have some imagery printed out on a sheet of paper. If you draw a grid on the sheet, then take some scissors and cut along the lines, you are left with a pretty good representation of how images are tiled, and a nice new GIS-enthusiast puzzle.
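To make the scissors-and-paper picture concrete, here is a small sketch of the common "slippy map" tiling scheme used by many web map services. The formula assumes Web Mercator tiling with a doubling grid per zoom level; the coordinates are just an illustrative pick near Munich.

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Convert a WGS84 lon/lat to the XYZ tile indices that contain it,
    using the Web Mercator 'slippy map' scheme common to web tile services."""
    n = 2 ** zoom  # tiles per axis at this zoom level
    x = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    y = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    return x, y

# A point near Munich falls in exactly one tile at each zoom level:
print(lonlat_to_tile(11.575, 48.137, 10))  # → (544, 355)
```

A client only ever requests the handful of tile indices that intersect its current view, which is what keeps the transfer incremental.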
Multi-leveling takes advantage of wavelet compression by providing data at various levels of detail, depending on the context in which the data is viewed. It would not make sense to display every single road in Europe when you are viewing the Earth at a global scale; in fact, due to the cell and pixel sizes at play, it might not even be possible on your monitor. Multi-leveling does not waste resources rendering all of the roads at the global scale, but if you zoom in to Germany, it may show you the major roadways; go further in, say to Munich, and you unlock all of the available road classes.
This multi-leveling concept applies to imagery resolution as well. Think of loading and watching a YouTube video. If you choose to view the video at 360p, it will load quickly, but you may not get the best video quality. On the other hand, if you need to see that cat video at 1080p, you are going to have to wait a bit longer to be able to seamlessly stream that content.
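The road example above can be sketched as a simple level-of-detail filter. The road classes and zoom thresholds here are made up for illustration, not taken from any particular service:

```python
# Hypothetical zoom thresholds: the zoom at which each road class
# first becomes worth streaming to the client.
ROAD_CLASS_MIN_ZOOM = {
    "motorway": 5,
    "primary": 8,
    "secondary": 11,
    "residential": 14,
}

def visible_road_classes(zoom):
    """Return only the road classes worth requesting at this zoom level,
    so the client never streams detail it cannot usefully display."""
    return sorted(c for c, z in ROAD_CLASS_MIN_ZOOM.items() if zoom >= z)

print(visible_road_classes(4))   # global view: nothing yet
print(visible_road_classes(9))   # country view: major roadways only
print(visible_road_classes(15))  # city view: every class unlocked
```

The same thresholding idea applies to imagery, where each level holds a coarser, pre-resampled copy of the pixels.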
All of the protocols described below adhere to the Open Geospatial Consortium (OGC) standards, which provide widely adopted schemas and guidelines for standardized data creation and access.
Web Map Service (WMS) - A generalized format that provides georeferenced imagery or vector graphics from a map server to a client.
Web Map Tile Service (WMTS) - Provides cached and tiled imagery and attributes from a map server.
Web Feature Service (WFS) - Enables querying and retrieval of XML-based feature data from a server, including complex geometries.
Web Coverage Service (WCS) - Most commonly used for data concerning space and time, such as weather and multispectral data.
Indexed 3D Scene Layer (I3S) - A scalable, JSON-based format designed to efficiently stream large 3D datasets.
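As a rough illustration of how a client talks to one of these services, the sketch below assembles a WMS 1.3.0 GetMap request. The endpoint and layer name are placeholders, not a real service; note that WMS 1.3.0 with EPSG:4326 expects the bounding box in latitude/longitude axis order.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layer, bbox, width, height):
    """Build a WMS 1.3.0 GetMap request URL for one rendered map image."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # For WMS 1.3.0 + EPSG:4326: min lat, min lon, max lat, max lon.
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": "image/png",
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer, covering a patch of southern Germany:
url = wms_getmap_url("https://example.com/wms", "roads",
                     (47.0, 10.0, 49.0, 12.0), 512, 512)
print(url)
```

The server renders and returns a single PNG for that bounding box; WMTS works similarly but swaps the free-form BBOX for fixed tile indices so responses can be cached.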
A web application is generally designed to place as little of a burden as possible on the client side when delivering a service. Your machine, however, does not necessarily get a vacation just because you are working in a browser. If you investigate Task Manager while running a web app, you will see your browser application begin to eat away at your CPU, GPU, memory, etc., as your browser passes off heavier tasks to your machine for an assist. (Google Chrome is famous for being especially greedy; for a GIS-friendly browser, try Firefox.)
The trick is that not all data is created equal when it comes to streamability. Streaming prefers linear data, and geospatial data is generally not linear. Think of linear data as the alphabet, and geospatial data as a completed crossword puzzle: the context, location, and intersection of the letters carry uncompromisable meaning.
Streamable GIS data formats exist despite these challenges. The most important factors in a successful web application are the ability to fluently visualize data, and to be able to manipulate those visualizations (filtering, highlighting, etc.). The better the application handles the first task, the better the results of the second. Without a doubt, your GPU is going to be the most important tool for these visualizations, doing all the heavy lifting that comes with clean and rapid rendering.
If you don’t have the latest graphics card, do not fear. Geospatial web formats provide background information about the underlying data that can provide context for what you are looking at while the graphics catch up. For example, an imagery footprint loading before the imagery itself.
If they are well-structured, these background links to the data can also assist with delivering attribute information, helping to mediate the sacrifice of data richness we often associate with streaming.
Only a few years ago, 3D rendering was reserved for desktop applications. Since then, browsers have entered the arena.
A 3D mesh is a structure of vertices which come together to represent a modeled object, or objects in space (a scene). The point vertices are connected as a triangular network of edges and faces that make up the visualized surface. This can be combined with textures, such as imagery draped over the mesh, to create a more realistic rendering of the subject; meshes can even hold attribute data.
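A minimal sketch of that structure: a shared vertex list plus triangles stored as index triples, which is the same basic shape of data that streamable 3D formats such as I3S deliver. The geometry here is a made-up unit square, just to show the bookkeeping.

```python
# Shared vertex list: each vertex is an (x, y, z) point in space.
vertices = [
    (0.0, 0.0, 0.0),
    (1.0, 0.0, 0.0),
    (1.0, 1.0, 0.0),
    (0.0, 1.0, 0.0),
]
# Faces reference vertices by index; two triangles cover the unit square.
faces = [(0, 1, 2), (0, 2, 3)]

def face_area(tri):
    """Area of one triangular face via the cross product of two edge vectors."""
    (ax, ay, az), (bx, by, bz), (cx, cy, cz) = (vertices[i] for i in tri)
    ux, uy, uz = bx - ax, by - ay, bz - az
    vx, vy, vz = cx - ax, cy - ay, cz - az
    nx = uy * vz - uz * vy
    ny = uz * vx - ux * vz
    nz = ux * vy - uy * vx
    return 0.5 * (nx * nx + ny * ny + nz * nz) ** 0.5

total = sum(face_area(f) for f in faces)
print(total)  # → 1.0, the surface area of the unit square
```

Because faces only store indices, vertices shared by neighboring triangles are transmitted once, which is part of what makes meshes compact enough to stream.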
So, in order to optimize your data for visual streaming, it ultimately needs to be translated into this language. This is no problem for imagery, which has spoken this language since its inception. When it comes to vector, BIM, and LiDAR data, however, this translation can add a lot of backend work as the data moves from unstructured to structured, adding to visual latency.
The geospatial use cases for web applications and streamable data are infinite. Every business, government entity, and individual has some use for such an application.
In the instance of an emergency response event, streamable data is especially relevant. Quick retrieval of pertinent data is vital. A dispatcher may need to pull street data, hydrant data, building footprints, and even video livestreams all together in order to give the best possible instructions to a responding fire truck. If they were to use a desktop application, and needed to download, unzip, organize, project, and analyze all this data, it may be too late. A quality web application removes these latency concerns and allows the dispatcher to provide response as soon as it is needed.
Well, for now, there are still limitations on what a browser can do. A native desktop application is still better situated to communicate with other desktop applications (like with your local DBMS, or shared drives), and still more capable of unlocking your machine’s full power when you get into complex and intensive processing, like for deep learning or machine learning.
Considering, however, that five years ago the concept of a GPU working side-by-side with a web application would have been cutting edge, the future blurring of the line between browser and desktop capabilities is inevitable.