Dominant Limitations
The dominant designs that limit us and cost us exorbitant amounts of money!
North51
First, don’t forget about booking tickets for North51, January 21–23, 2026. If you like this Substack, you’ll love North51! This is not a traditional conference. It is curated, ideas-driven, and built for people who want real conversations about the future of geospatial. Even better, this year we’ll have sessions especially curated by Priscilla Cole and the Geospatial Risk team.
It's an executive-level opportunity to talk about the pressing issues in the geospatial community of practice. And this year's issues seem more pressing than most! See you in Canmore, Alberta.
New tools, old patterns
Sometimes, I get a sudden wave of realization. It could be while lying awake in the small hours of a long night. It could be two-thirds of the way into a long run when, coming to the crest of a rise, I just stop. It could be, like this time, when I was flitting between sleep and wakefulness on a transatlantic flight. It's usually at a moment like that when my mind is tricked into wandering. Tricked, out of the fury of "the now", into a different place. On reflection, there must be something in that liminal state between sleep or meditation (don't you lose yourself in a run?) and presence. Something in that transition between mental states.
I see patterns
I've frequently talked about dominant designs. Why does a car have four wheels? Why are most pens the length they are? Why do glasses look the way they do? Because, through a process of iterative industrial design, we have reached a consensus on the way some common things look and operate. Sometimes that consensus has come in the form of standards (shipping containers, for instance); sometimes we have simply agreed that a thing should look a particular way. But it's worth noting that dominant designs surround us, and while they provide comfort and often production efficiencies, they can also be a source of limitation. By "limitation" I mean that the comfortable obviousness of a particular design can be a source of hidden opportunity for the entrepreneurial.
Another critical point is that a dominant design need not be an article; it can also be a process. Legacy human and digital processes are a curious source of innovation. Very often, people do things because that's the way they do things, not because that's the best way to do things. So when we think about dominant designs, don't forget that these can be found in the way we do things as much as in the tools we use. If we are doing an old process with new tools, have we really innovated?
So, today we are discussing dominant designs; patterns of design and behaviour.
And the Cloud’s gone wild!
One dominant design I keep returning to is how we handle data storage and use. Traditionally, we have become used to acquiring data, perhaps through survey, purchase, or exchange. Then, we would move this data to a place of ownership and subsequently use it. This could be as simple as a work colleague sharing something via email or a thumb drive. Or it could be more complex, perhaps involving the purchase of large amounts of remote sensing imagery. While acquiring imagery via API is a step forward, in essence a download API still follows the traditional pattern of moving acquired data from location A to location B. Worse, we are not just moving the data, but copying it.
The main problem with moving data from A to B in this manner is that our tools have evolved but the process has not. This is a dominant design for a pre-cloud era. When we apply this simple method of exchange to the cloud, the practicalities start to break down. The design becomes wildly expensive, incurring storage costs per exchange, as well as egress and ingress fees and the cost of the bandwidth consumed.
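To make the cost asymmetry concrete, here is a back-of-the-envelope sketch in Python. The per-gigabyte rates and the dataset sizes are hypothetical placeholders for illustration, not any provider's published pricing:

```python
# Back-of-the-envelope cost model for the "copy everything" pattern.
# Both rates below are hypothetical placeholders, not real pricing.
EGRESS_PER_GB = 0.09          # USD per GB moved out of the cloud (assumed)
STORAGE_PER_GB_MONTH = 0.023  # USD per GB per month of storage (assumed)

def copy_pattern_cost(dataset_gb: float, copies: int, months: int) -> float:
    """Cost of shipping a full copy to each consumer and then storing
    every duplicate for `months` months."""
    egress = dataset_gb * copies * EGRESS_PER_GB
    duplicate_storage = dataset_gb * copies * STORAGE_PER_GB_MONTH * months
    return egress + duplicate_storage

def in_place_cost(result_gb: float, queries: int) -> float:
    """Cost when consumers query close to the data and only the
    (much smaller) results ever move."""
    return result_gb * queries * EGRESS_PER_GB

# A 10 TB imagery archive copied to 5 teams and kept for a year,
# versus 500 in-place queries each returning ~2 GB of results:
print(copy_pattern_cost(10_000, copies=5, months=12))  # roughly 18,300
print(in_place_cost(2, queries=500))                   # roughly 90
```

Even with made-up numbers, the shape of the result holds: duplication multiplies both egress and storage, while querying in place pays only for what actually leaves.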
Solutions might include allowing compute to run close to the stored data product, perhaps for running algorithms or querying the data in place. This workflow might also open the opportunity for location-oriented subscriptions. In this scenario, needless data movement would be replaced by increasingly complex feature-level authentication and authorization needs. But this is a better, more fit-for-purpose pattern.
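What might feature-level authorization look like in that subscription world? A minimal sketch, with entirely hypothetical names (`Subscription`, `can_read`) and a toy bounding-box rule standing in for whatever a real policy engine would enforce:

```python
from dataclasses import dataclass

@dataclass
class Subscription:
    """A hypothetical location-oriented subscription: the user is
    entitled to features inside one bounding box."""
    user: str
    bbox: tuple  # (min_lon, min_lat, max_lon, max_lat)

def can_read(sub: Subscription, lon: float, lat: float) -> bool:
    """Server-side check: may the feature at (lon, lat) be returned
    to this subscriber?"""
    min_lon, min_lat, max_lon, max_lat = sub.bbox
    return min_lon <= lon <= max_lon and min_lat <= lat <= max_lat

# A subscription covering (very roughly) Alberta:
sub = Subscription("analyst@example.com", (-120.0, 49.0, -110.0, 60.0))
print(can_read(sub, -115.35, 51.09))  # Canmore -> True
print(can_read(sub, -0.12, 51.50))    # London  -> False
```

A real system would add authentication, temporal limits, and per-attribute rules; the point is that the complexity moves from data logistics to access control.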
In the end, a new dominant design around networked data access will emerge. Indeed, my suggestion is just the one that springs to mind; you are welcome, and I encourage you, to innovate your own. My point is the accidental application of old patterns to new toolkits.
Unprojected augmentations
Another process worth future consideration is how we think specifically about spatial data storage and projection systems. We need projection systems because we don't live on a flat Earth (sorry folks, I'm firm on this one). We needed data captured during surveys of our planet (local or global) to be visualized on a flat surface, so we built mathematical projections to help us do that. There are thousands of these systems, and each has a different level of accuracy and appropriateness for different locations. Traditionally, that flat surface would have been a paper map. Within the last 30 years, that surface has more likely been a screen.
But now we have GPUs (Graphics Processing Units). These are a complementary asset developed primarily by the gaming industry that now sits at the centre of the AI revolution. GPUs mean we can use far higher-performance user interface technology. So, yes, our screens might stay flat for a while, but they need not simply render a flat map anymore. Further, Meta's glasses have not been as thoroughly rejected as Google's initial attempt at augmented-reality glasses. Perhaps the time for this kind of technology has come.
“Entrepreneurism is the precise application of time against effort and capital”
With glasses, the resulting data will clearly not be flat, and it certainly won't be viewed from a planimetric perspective. So where does that leave the core GIS/geospatial activity of "reprojecting"? Indeed, if we have semi-accurate representations of our globe, do we need to reproject at all? Even if we do, can higher-performance computing negate the need to store multiple versions of the same data?
And note the use of the phrase "planimetric perspective". When will maps start to make more sense projected from the user's perspective?
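For the curious, the "reprojecting" in question is, at its core, just arithmetic. Here is the standard forward formula for one familiar projection, spherical Web Mercator (EPSG:3857), the flat surface behind most web maps today:

```python
import math

R = 6378137.0  # WGS 84 semi-major axis, in metres

def to_web_mercator(lon_deg: float, lat_deg: float) -> tuple:
    """Forward spherical Web Mercator (EPSG:3857): project geographic
    coordinates onto the flat plane used by most web maps."""
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

x, y = to_web_mercator(-115.35, 51.09)  # roughly Canmore, Alberta
```

Every flat map you pan around runs some version of this transform; a renderer that works directly on a globe, or from the user's own perspective, could skip it entirely.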
Again, this note is less a series of answers than a trigger for reflection on the processes we perform every day. Which have we taken for granted that might be completely unnecessary or limiting?
The development of complementary assets around us and their subsequent new capabilities might allow our community to move in a series of new directions or leapfrog cumbersome and aging human processes. We should let them!