We've been spending time mapping parts of Hull; today it was the area around Argyle Street. The map tiles for the west of the city are filling up with data, and that is now becoming a problem. The data is turned into tiles by different means for different maps.
The Mapnik map renders once a week. I like the way it looks, but when I enter data I may make mistakes, and I want to see a rendered map to spot them while the route is still fresh in my mind. So I always request an Osmarender version first.
Osmarender used to render a tile in a few minutes, but recently it always takes at least an hour and often more. As we fill the space of a tile, it becomes more complex and so takes longer to render. The tile data is downloaded to a PC belonging to someone who has kindly set their machine up to render tiles (tiles@home). When the tile has been drawn it is uploaded to the server, replacing the tile displayed on the map. In fact it is more complex than that, because all of the tiles at higher zoom levels are rendered too, then zipped and uploaded, which adds up to a lot of tiles. If the rendered tiles have not been uploaded after a period of time, maybe one and a half to two hours, the renderer (or maybe the server) gives up and the process starts again. Sometimes it has taken all evening just to find that I've made errors which need correcting, and then the render process starts all over again.
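The give-up-and-retry behaviour described above can be sketched roughly as follows. This is not the actual tiles@home code, just a minimal illustration of the loop: `render` stands in for the whole render-zip-upload job and is assumed to report how long it took, and the two-hour timeout and attempt limit are placeholders.

```python
RENDER_TIMEOUT = 2 * 60 * 60  # seconds; the "1.5 to 2 hours" window described above


def process_request(render, timeout=RENDER_TIMEOUT, max_attempts=3):
    """Re-run a render job until its upload completes within the timeout.

    `render` is a hypothetical stand-in for the tiles@home client: it
    renders the tile plus all higher-zoom tiles, zips and uploads them,
    and returns the elapsed seconds. Returns the number of attempts
    taken, or None if every attempt exceeded the timeout.
    """
    for attempt in range(1, max_attempts + 1):
        elapsed = render()
        if elapsed <= timeout:
            return attempt  # tile-set uploaded in time and accepted
        # Took too long: the renderer (or server) gives up and the
        # request starts again from scratch.
    return None
```

For example, if the first two attempts take over two hours and the third finishes in one, the request only succeeds on the third pass, which is how a single tile can eat an entire evening.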
On the mailing list there have been messages about how this might be improved, but I think they miss the point. Most tile requests are generated automatically by various processes. If I request a tile render, the automatic render will still be requested some time later, even though the tile is already up to date. This massive overkill of automatic re-rendering is clogging the system.
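The fix being argued for amounts to a simple de-duplication check before queuing an automatic render: skip the tile if it has already been rendered since its data last changed. A minimal sketch, with hypothetical names and plain numeric timestamps:

```python
def should_render(tile_id, last_rendered, data_changed_at):
    """Decide whether an automatic render request is actually needed.

    `last_rendered` maps tile id -> timestamp of its most recent render;
    `data_changed_at` maps tile id -> timestamp of its last data edit.
    A tile only needs rendering if it has never been rendered, or its
    data changed after the last render. (Illustrative only; not the
    real tiles@home scheduler.)
    """
    rendered = last_rendered.get(tile_id)
    changed = data_changed_at.get(tile_id, 0)
    return rendered is None or rendered < changed
```

With a check like this, a manually requested tile that has just been rendered would no longer be re-queued by the automatic process a little later.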
While I have been typing this, the two tiles I requested have timed out and started again. If I had the resources to run tiles@home, I would be cross when my rendered tile-set was rejected.