
Issues when supplying large, complex map files to a press

Tags: RIP, offset press, large files


#1
jamierob

    Contributor

  • Validated Member
  • 19 posts
  • Gender:Male
  • Location:Missoula, MT
  • United States

Hello- I've been having a lot of trouble recently with various presses being able to run my map files through the pre-press/RIP process. The files are so large and complex that they crash most systems, or take so long to process that the press is unwilling to handle them through its standard workflow. So I'm looking for any tips to make my files smaller and/or easier to process. Another question for those cartographers who create large, complex maps: how do you package and send your files to press?

 

My current workflow:

-Create the main map file in Illustrator. The maps are ~28x40" with lots of contour lines, labels, 1:24k hydrography, pattern fills, and lots of vector roads, trails, and other feature data. There is also a linked 300dpi .tif that serves as the shaded relief. I use MAPublisher, so there is also attribute data.

-Simplify all lines to 99% accuracy to remove extraneous anchor points.

-I do have a spot color for contour lines/labels that is overprinted, and this spot color does interact with some other partially transparent objects (ownership and other boundary lines).

-Place the file in an InDesign document frame. I place it twice in InDesign: one frame holds the main map document, and another, slightly larger frame uses layer overrides to show only the grid labels (UTM and Lat/Long) around the edge of the map.

-Export the document from InDesign as a PDF with the normal Acrobat high-quality print settings.
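For reference, the node-removal idea behind that simplify step looks roughly like this. This is a generic Ramer-Douglas-Peucker sketch; I'm only assuming Illustrator's Simplify does something in this family, it isn't documented as exactly this algorithm:

```python
import math

def perp_dist(pt, a, b):
    """Perpendicular distance from pt to the chord a-b."""
    (px, py), (ax, ay), (bx, by) = pt, a, b
    dx, dy = bx - ax, by - ay
    seg_len = math.hypot(dx, dy)
    if seg_len == 0:
        return math.hypot(px - ax, py - ay)
    return abs(dx * (ay - py) - dy * (ax - px)) / seg_len

def simplify(points, tol):
    """Ramer-Douglas-Peucker: drop vertices that deviate from the
    chord between the endpoints by less than tol."""
    if len(points) < 3:
        return list(points)
    # find the vertex farthest from the endpoint-to-endpoint chord
    dmax, idx = 0.0, 0
    for i in range(1, len(points) - 1):
        d = perp_dist(points[i], points[0], points[-1])
        if d > dmax:
            dmax, idx = d, i
    if dmax <= tol:
        return [points[0], points[-1]]   # everything in between is noise
    # otherwise recurse on both halves and stitch them back together
    left = simplify(points[:idx + 1], tol)
    right = simplify(points[idx:], tol)
    return left[:-1] + right

# a wiggly contour-like line: tiny jitter around a straight ramp
line = [(x, x + (0.001 if x % 2 else -0.001)) for x in range(100)]
slim = simplify(line, tol=0.01)
print(len(line), "->", len(slim))        # prints: 100 -> 2
```

The takeaway for file size is that the anchor-point count, not the drawn length, is what the RIP has to chew on.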

 

An Acrobat 5 (PDF 1.4) document takes an incredibly long time to run through the press RIP (5 separations, with 1 spot color for the contour lines): ~13 hours per plate, which is unacceptable.

 

Does anyone have any ideas or tips on how to avoid these issues when going to press?

 

Thanks!

 

-Jamie Robertson



#2
Jacques Gélinas

    Master Contributor

  • Validated Member
  • 105 posts
  • Gender:Male
  • Location:Gatineau (Québec)
  • Canada

One small suggestion: I would save my Illustrator file without MAPublisher attributes before importing into InDesign. I believe there is a documented way to do this in the MAPublisher docs.

 

 


Jacques Gélinas
cartographer
www.cartesgeo.ca


#3
jamierob

    Contributor

  • Validated Member
  • 19 posts
  • Gender:Male
  • Location:Missoula, MT
  • United States

Jacques - Thanks for your suggestion. I've thought that the extra attribute data might be causing an issue as well. When I drop the attribute data (by dragging all of the MAP layers to the Non-MAP Layers section in the MAP Views palette) and save the .ai document, the file size decreases a little, but I'm still having the same problems with the press's RIP. It definitely can't hurt to get rid of that stuff, though!



#4
Adam Wilbert

    Legendary Contributor

  • Validated Member
  • 276 posts
  • Gender:Male
  • Location:Bellingham, WA, USA
  • United States

I wonder whether it's the file size or the file complexity that's causing the issue. If it's complexity, simplifying lines to 99% might still leave way more anchor points than you need; you might have to simplify on a feature-by-feature basis. If it's file size, I've always converted my background raster to a lossless PNG-24 (or occasionally a very high quality JPG) for print, which might reduce the file size considerably. (*Edit: looking at the High Quality Print preset, it appears the hillshade is already getting converted to JPG in the PDF export, so this probably won't help after all.)

 

My workflow never goes through InDesign; I export to PDF straight from Illustrator. I typically use the PDF/X-4 standard, with colorspace modifications dictated by the printer. I'm not sure how InDesign handles embedding the TIFF and AI files when using the "High Quality Print" preset, but I know that "High Quality Print" in Illustrator creates a PDF that includes both the PDF information and a full copy of the Illustrator file (i.e. Preserve Illustrator Editing Capabilities). My hunch is that the InDesign process does the same but doesn't tell you. This can explode your file size as well.

 

I might try exporting an X-4 PDF from Illustrator, placing that in your InDesign document instead, and seeing what happens.


Adam Wilbert

@awilbert
CartoGaia.com
Lynda.com author of "ArcGIS Essential Training"


#5
Matthew Hampton

    Hall of Fame

  • Moderator
  • 1,325 posts
  • Gender:Male
  • Location:Portland, Oregon
  • Interests:Playing in the mountains and rivers.
  • United States

I am wondering if the hang-up in the RIP is from placing two copies of the same file (one slightly larger) in InDesign before exporting to PDF. Does the RIP still hang up when only one copy is placed?

 

If you can't find a RIP with enough memory to handle the job, then one solution would be to export a mega-raster from InDesign. This creates baby food for RIPs, and they slurp it up with ease, though the trade-off is you can lose a little detail depending on your export resolution (not to mention dealing with a multi-gigabyte file and a potential issue with overprinting). We chose a big shop with a huge RIP for a moderately complex bike map a few years ago, and they still had to order more memory.
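As back-of-envelope math for why these rasters get huge (using Jamie's sheet size from the original post; the 2400dpi platesetter resolution and 1-bit screened output are my assumptions for illustration):

```python
# Rough uncompressed size of a full-sheet mega-raster.
width_in, height_in = 28, 40   # sheet size from the original post
dpi = 2400                     # assumed platesetter resolution
plates = 4                     # CMYK; a spot plate would add one more
bits_per_sample = 1            # 1-bit after screening; 8 if contone

pixels = (width_in * dpi) * (height_in * dpi)
size_bytes = pixels * plates * bits_per_sample // 8
print(f"{size_bytes / 1024**3:.1f} GiB uncompressed")   # prints: 3.0 GiB uncompressed
```

An 8-bit contone version would be 8x that, which is why the export resolution trade-off matters.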


co-cartographic creator of boringmaps.com


#6
Unit Seven

    Legendary Contributor

  • Moderator
  • 266 posts
  • Gender:Male
  • Location:New Zealand
  • New Zealand

Willing to bet transparencies are the culprit: non-PostScript elements that need to be flattened when RIPped. Especially if they are used on boundary lines etc.

 

We usually build these as overprints, which is a PostScript concept. E.g. 70% black set to overprint looks the same as 70% transparent black, unless it crosses an object with >70% black. Takes some thought, but it usually works out OK and makes for much simpler files. There are native PDF RIPs around which may handle these better, but while the RIP will take a PDF, most RIPs are still PostScript-based when it comes down to it (here, at least).
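To put the knockout-vs-overprint idea in toy-code terms (just a sketch of the concept, not any real RIP's behavior): a knockout object replaces every plate underneath it, while an overprinting black-only object writes its own black value and leaves the other plates alone.

```python
# Toy separations model: each area is a dict of ink percentages per plate.
# For illustration only; a real RIP is far more involved.

def knockout(bg, obj):
    """Object punches out all plates underneath it, then paints its own."""
    out = {plate: 0.0 for plate in bg}
    out.update(obj)
    return out

def overprint(bg, obj):
    """Object paints only the plates it uses; the rest show through."""
    out = dict(bg)
    out.update(obj)
    return out

terrain = {"C": 0.30, "M": 0.10, "Y": 0.40, "K": 0.05}   # shaded-relief area
boundary = {"K": 0.70}                                    # 70% black line

print(knockout(terrain, boundary))   # C/M/Y knocked to 0 under the line
print(overprint(terrain, boundary))  # C/M/Y untouched; K plate set to 0.70
```

Because the overprint result is computed plate-by-plate on press, nothing needs flattening, which is why it RIPs so much faster than live transparency.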

 

My suggestion: if you can export the file to PDF/X-1a (all the newer PDF features are flattened, so it's pretty similar to plain PostScript), they should be able to deal with it.

 

Cheers,

S.


S a m B r o w n

U N I T S E V E N
unit.seven@gmail.com

Miramar, Wellington
N E W Z E A L A N D

#7
Hans van der Maarel

    CartoTalk Editor-in-Chief

  • Admin
  • 3,898 posts
  • Gender:Male
  • Location:The Netherlands
  • Interests:Cartography, GIS, history, popular science, music.
  • Netherlands

I think I'm going to have to side with Matthew; I would try it with only one copy of the file.

 

One other option could be to open the PDF file in Acrobat and then save it as a reduced-size PDF. I've often achieved a gigantic reduction in file size, but do make sure to double-check your output settings, as you don't want this step to introduce quality loss.


Hans van der Maarel - Cartotalk Editor
Red Geographics
Email: hans@redgeographics.com / Twitter: @redgeographics

#8
jamierob

    Contributor

  • Validated Member
  • 19 posts
  • Gender:Male
  • Location:Missoula, MT
  • United States

Thanks for the input!

 

Adam - 

-I probably could simplify the lines a little more than 99%. It seems that there is an enormous removal of nodes going from the raw .shp import to 99%, and the gains decrease considerably after that. That being said, it's probably just my hyper-accuracy GIS brain speaking, and simplifying a little past 99% would have virtually no effect on a 1:80k map.

-The press that I'm working with does a PDF/X-4 workflow by default. My X-4 docs are the ones that take ~13 hours to RIP per plate. When they run them through as PDF 1.3/X-1a they only take 4 hours per plate (acceptable), but I had issues because then the black was knocked out. With the normal X-4 processing (which the press had used previously) the black was auto-overprinted, as it should be. I didn't have the black objects set to overprint in the .ai doc, as this hadn't been an issue in the past. My understanding is that PDF 1.3 uses a transparency flattener and makes some sort of rasterized interpretation of the transparency.
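My rough mental model of what the flattener does (an outsider's sketch, definitely not Adobe's actual algorithm): wherever a transparent object overlaps something, it carves out the intersection and bakes the composited color into a new opaque object, which would explain why the flattened file balloons.

```python
# Toy flattening sketch: composite one partially transparent stroke over
# each background it crosses, producing one opaque fragment per overlap.
# Simple per-plate normal blend for illustration; real flatteners also
# handle blend modes, spot plates, and rasterize the hard cases.

def composite(bg, fg, alpha):
    """Alpha-blend fg over bg, plate by plate."""
    return {p: round(alpha * fg.get(p, 0.0) + (1 - alpha) * bg.get(p, 0.0), 3)
            for p in set(bg) | set(fg)}

stroke = {"M": 0.80, "Y": 0.90}       # orange boundary line at 50% opacity
backgrounds = [                        # everything the line crosses
    {"C": 0.00, "M": 0.00, "Y": 0.00, "K": 0.00},   # bare paper
    {"C": 0.30, "M": 0.10, "Y": 0.40, "K": 0.05},   # shaded relief
    {"C": 0.50, "M": 0.00, "Y": 0.20, "K": 0.00},   # hydrography fill
]

# One live transparent stroke becomes N opaque fragments after flattening:
fragments = [composite(bg, stroke, alpha=0.5) for bg in backgrounds]
for frag in fragments:
    print(frag)
```

If that model is right, a stroke crossing thousands of contour lines and fills would explode into thousands of precomputed pieces, which fits the ~180MB-to-~550MB jump I'm seeing.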

 

I'll try placing a PDF version of the file instead of the .ai file in InDesign to see if that helps anything. One would think it wouldn't make a difference, as InDesign appears to only read the embedded PDF within the .ai doc anyway, but I don't know for sure.

 

Matthew - I wonder if the double placement is an issue as well. I haven't yet submitted files without the double PDF placement. My sheer lack of knowledge of how PDFs and RIPs interact is frustrating. (Aside: does anyone know how to render rasterized separations (plus spot channels) without owning full-on pre-press software?) If you have virtually the entire map rendered invisible via InDesign PDF-placement layer overrides, with only a single PDF layer (the labels around the edge) visible, does the RIP still try to process all of the PDF layers for that frame/object, even though it's just saying, "oh yeah, don't show this... or the next 100 layers and 2 billion nodes"?

 

I hadn't thought about producing a mega-raster from InDesign as a file to supply to the press. We're talking a 150lpi/2400dpi CMYK image, I'm thinking? I'm not sure how to do this and retain the spot color channel on the contour lines and keep the black overprinted. I really like the crispness that you get from spot color contour lines and from having all the black overprinted on a single plate. But I'd be interested in seeing some samples of maps created with this method. Does anyone use it?

 

Sam - I think you might be on to something with the PostScript objects. When I save the PDF out of InDesign as X-4/1.4, the file size is ~180MB; the PDF 1.3/X-1a doc is ~550MB. This additional info might be because, instead of just a partially transparent stroke, it's calculating a new appearance wherever that partially transparent stroke interacts with anything else (which is a lot) and writing that info to each of the separations? I typically render the PDF in Photoshop to do test prints on plotters, and my machine (a Mac with 18GB of RAM) easily renders the PDF 1.4 doc, but crashes whenever I try to render the PDF 1.3 version. Would there be a way to do the overprint-instead-of-opacity trick on, say, an orange line built with two process plates? Setting my orange line to 100% opacity and overprint definitely gives a similar effect, but it's quite a bit 'muddier', as it is multiplied over the shaded relief image. I don't think there is a way to keep it bright without knocking out the other plates?

 

Hans - I had not thought about gaining any size reduction by running it through Acrobat. I'll give that a go!



#9
David Medeiros

    Hall of Fame

  • Validated Member
  • 1,089 posts
  • Gender:Male
  • Location:Redwood City CA
  • Interests:Cartography, wood working, wooden boats, fishing, camping, overland travel, exploring.
  • United States

Just a note on "GIS accuracy": keep in mind that incredibly detailed, vertex-heavy lines from a GIS are not necessarily super accurate; they are just very precise and detailed. There is a large amount of inaccuracy in most GIS data as a result of the equipment used to collect it, human error in digitizing, moving small-scale source data onto large-scale GIS layers, etc.

 

When the goal is a representational map then representational accuracy trumps 'GIS' accuracy. Make it look right so it's understandable. Too much detail, even when it is accurate, can create a map that is less effective than one that is generalized to suit its purpose.

 

Edited to add: when saving the file, turn off the "Save as PDF compatible" option to further reduce the AI file size.


GIS Reference and Instruction Specialist, Stanford Geospatial Center.

 

www.mapbliss.com

 


#10
Dennis McClendon

    Hall of Fame

  • Validated Member
  • 1,084 posts
  • Gender:Male
  • Location:Chicago
  • Interests:map design, large-scale maps of cities
  • United States

Too much detail, even when it is accurate, can create a map that is less effective than one that is generalized to suit its purpose.

 

I've been asked to talk on this subject to a state GIS conference this fall.  Well, I've been asked to talk about my work, and this is one of the themes I want to explore.

 

Do you know of anyone who's written about this concept, in a textbook, or in lecture notes that I might find online?  


Dennis McClendon, Chicago CartoGraphics
chicagocarto.com

#11
David Medeiros

    Hall of Fame

  • Validated Member
  • 1,089 posts
  • Gender:Male
  • Location:Redwood City CA
  • Interests:Cartography, wood working, wooden boats, fishing, camping, overland travel, exploring.
  • United States

 

Too much detail, even when it is accurate, can create a map that is less effective than one that is generalized to suit its purpose.

 

I've been asked to talk on this subject to a state GIS conference this fall.  Well, I've been asked to talk about my work, and this is one of the themes I want to explore.

 

Do you know of anyone who's written about this concept, in a textbook, or in lecture notes that I might find online?  

 

 

Krygier and Wood talk about it a bit in Making Maps (Map Generalization and Classification). They discuss the idea that maps work by "strategically reducing detail and grouping phenomena". It's the reducing of detail we're mostly talking about here.

 

I'm certain Tufte talks about it as well and you might find it mentioned in Cartographic Relief Presentation or even Semiology of Graphics, but I'm not sure.

 

It is, to me, a bedrock concept within cartography, but one that most GIS map makers have a very hard time with, especially in science-based GIS work. The assumptions in GIS are a) that GIS and GIS data are inherently very accurate, and b) that simplifying the display of your data is somehow dishonest.

 

Maps intended for sharing and communication end up with a mess of raw data on them that clouds the actual message being communicated. In the end, there is a potentially misleading impression of accuracy that may get transmitted, when in reality GIS data can often be much less accurate than manual mapping, just a lot more precise.

 

For me, the use of generalization hinges on the map's communication goal. If transmitting raw data is the goal, then obviously don't generalize, but that's a poor use of a map in most cases, as only the analyst really needs all the data at hand for their work. Instead, the map should communicate your results or findings, and for that it's probably better to simplify.


GIS Reference and Instruction Specialist, Stanford Geospatial Center.

 

www.mapbliss.com

 


#12
jamierob

    Contributor

  • Validated Member
  • 19 posts
  • Gender:Male
  • Location:Missoula, MT
  • United States

Just thought I'd follow up on what I've figured out with the files that were crashing the press RIP. It turns out to be, surprise surprise... user error on my part! I have 100ft contour lines on my maps and place many, many contour labels via MAPublisher. I then use the MAPublisher knockout tool to knock out the contour lines behind the labels. It turns out the option Avenza put in the knockout tool to 'Use clipping masks if exporting to PDF or postscript' was put there for a reason. The default is to create the knockouts as transparency masks. I guess even though they are 100% transparent masks, the RIP still has to calculate each little mask for each separation, and on something as complex as this, it's just too much. Once I created the knockouts with clipping masks instead of transparency masks, the files went through the RIP without issue.

 

So, it was a silly mistake on my part. At least in my case, the following factors didn't seem to have any real significance for getting the files through the RIP:

 

-Placing the .ai map file multiple times with layer overrides in the .indd document

-Retaining MAPublisher attribute data

-Needing to simplify lines past 99%

 

Thanks all, 

Jamie





