Ruby: KML Library

Ruby: KML Library?

I solved my problem by using schleyfox-ruby_kml, which is a fork of
kmlr. It makes it easy to generate a KML file for a set of placemarks. See the following example from the README:

require 'kml'

kml = KMLFile.new
folder = KML::Folder.new(:name => 'Melbourne Stations')
[
  ["Flinders St", -37.818078, 144.966811],
  ["Southern Cross", -37.818358, 144.952417],
].each do |name, lat, lng|
  folder.features << KML::Placemark.new(
    :name => name,
    :geometry => KML::Point.new(:coordinates => {:lat => lat, :lng => lng})
  )
end
kml.objects << folder
puts kml.render

The gem also provides classes to generate polygons and so forth.
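
For instance, a polygon placemark could be added to the same folder along these lines. This is only a sketch: it assumes the gem mirrors the KML element names (Polygon, LinearRing, outerBoundaryIs) in its class and attribute names, which may differ between versions:

# Assumed API: KML::Polygon / KML::LinearRing mirroring the KML elements.
# Coordinates are a "lon,lat,alt" string, as in the KML reference.
folder.features << KML::Placemark.new(
  :name => 'Simple Polygon',
  :geometry => KML::Polygon.new(
    :outer_boundary_is => KML::LinearRing.new(
      :coordinates => '144.96,-37.82,0 144.97,-37.82,0 144.97,-37.81,0 144.96,-37.82,0'
    )
  )
)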

How to generate a KML file in Ruby?

KML is just another XML dialect, so you can easily write your own "writer" that walks whatever input it is given recursively and emits a string of valid KML/XML.

Recommended reading for the options and the XML layout:
https://developers.google.com/kml/documentation/kmlreference
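
As a minimal sketch of that approach, here is a hand-rolled writer built on Nokogiri::XML::Builder; the station data reuses the example above, and note that KML expects coordinates in lon,lat[,alt] order:

require 'nokogiri'

placemarks = [
  { :name => "Flinders St", :lat => -37.818078, :lng => 144.966811 },
  { :name => "Southern Cross", :lat => -37.818358, :lng => 144.952417 },
]

builder = Nokogiri::XML::Builder.new(:encoding => 'UTF-8') do |xml|
  xml.kml(:xmlns => 'http://www.opengis.net/kml/2.2') do
    xml.Document do
      placemarks.each do |pm|
        xml.Placemark do
          xml.name_ pm[:name]  # trailing underscore avoids clashing with Builder methods
          xml.Point do
            # KML coordinate order is lon,lat[,alt]
            xml.coordinates "#{pm[:lng]},#{pm[:lat]},0"
          end
        end
      end
    end
  end
end

puts builder.to_xml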

Parsing a huge (~100MB) KML (XML) file takes *hours* without any sign of actual parsing

For a huge XML file you should not use Nokogiri's default parser, because it builds a DOM of the whole document in memory. A much better strategy for large XML files is SAX. Luckily, Nokogiri supports SAX parsing as well.

The downside is that with a SAX parser all the logic is done through callbacks. The idea is simple: the SAX parser reads the file and notifies you whenever it finds something interesting, for example an opening tag, a closing tag, or text. You bind callbacks to these events and extract whatever you need.

Of course, you don't want to use a SAX parser to load the whole file into memory and work with it there; that is exactly what SAX is meant to avoid. You will need to do whatever you want with the file part by part.

So this basically means rewriting your parsing logic around callbacks. To learn more about XML DOM vs. SAX parsers, you might want to check this FAQ from cs.nmsu.edu
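
A minimal sketch of that callback style with Nokogiri's SAX parser, streaming the text of every <coordinates> element out of a large KML file (the file name and the element to extract are illustrative):

require 'nokogiri'

class CoordinatesHandler < Nokogiri::XML::SAX::Document
  # Called on every opening tag
  def start_element(name, attrs = [])
    @buffer = '' if name == 'coordinates'
  end

  # Called for text nodes; may fire more than once per element
  def characters(string)
    @buffer << string if @buffer
  end

  # Called on every closing tag
  def end_element(name)
    return unless name == 'coordinates' && @buffer
    handle_coordinates(@buffer.strip)
    @buffer = nil
  end

  # Process one coordinate string at a time instead of holding
  # the whole document in memory
  def handle_coordinates(coords)
    puts coords
  end
end

Nokogiri::XML::SAX::Parser.new(CoordinatesHandler.new).parse_file('huge.kml')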

Save a large text file without using too much memory

You can do this with S3 multipart uploads, since they don't require you to know the file size up front.

Parts have to be at least 5MB in size, so the easiest way to use this is to write your data to an in-memory buffer and upload a part to S3 every time you get past 5MB. There's a limit of 10,000 parts per upload, so if your file is going to be larger than 50GB (10,000 × 5MB) you'd need to know that ahead of time so you can make the parts bigger.

Using the fog library, that would look a little like this:

require 'fog'
require 'base64'
require 'digest/md5'

# Upload one part and return its ETag; S3 needs the ETags of all
# parts to assemble the object when the upload is completed.
def upload_chunk(connection, upload_id, chunk, index)
  md5 = Base64.encode64(Digest::MD5.digest(chunk)).strip
  response = connection.upload_part('bucket', 'a_key', upload_id, index, chunk,
                                    'Content-MD5' => md5)
  response.headers['ETag']
end

connection = Fog::Storage::AWS.new(
  :aws_access_key_id => '...',
  :aws_secret_access_key => '...',
  :region => '...'
)
upload_id = connection.initiate_multipart_upload('bucket', 'a_key').body['UploadId']
chunk_index = 1
etags = []

# header and coords3d come from the surrounding application
kml = String.new
kml << header
coords3d.each do |coords|
  # append to kml
  if kml.bytesize > 5 * 1024 * 1024
    etags << upload_chunk(connection, upload_id, kml, chunk_index)
    chunk_index += 1
    kml = ''
  end
end
etags << upload_chunk(connection, upload_id, kml, chunk_index)
# when you've uploaded all the chunks, pass the collected ETags so S3
# can assemble them into the final object
connection.complete_multipart_upload('bucket', 'a_key', upload_id, etags)

You could probably come up with something neater by creating an uploader class that wraps the buffer and keeps all the S3 logic in one place. Then your KML code wouldn't have to know whether it's writing to an actual string or to one that flushes to S3 periodically.
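
A hypothetical sketch of such a wrapper; the class name and interface are made up for illustration, and it reuses the same fog calls as above:

class S3ChunkedWriter
  CHUNK_SIZE = 5 * 1024 * 1024  # S3's minimum part size

  def initialize(connection, bucket, key)
    @connection, @bucket, @key = connection, bucket, key
    @upload_id = connection.initiate_multipart_upload(bucket, key).body['UploadId']
    @buffer = ''
    @etags = []
    @index = 1
  end

  # Buffer data, flushing a part to S3 whenever the buffer passes 5MB
  def <<(data)
    @buffer << data
    flush if @buffer.bytesize > CHUNK_SIZE
    self
  end

  # Upload any remaining data as the final (possibly small) part,
  # then complete the multipart upload
  def close
    flush unless @buffer.empty?
    @connection.complete_multipart_upload(@bucket, @key, @upload_id, @etags)
  end

  private

  def flush
    md5 = Base64.encode64(Digest::MD5.digest(@buffer)).strip
    response = @connection.upload_part(@bucket, @key, @upload_id, @index, @buffer,
                                       'Content-MD5' => md5)
    @etags << response.headers['ETag']
    @index += 1
    @buffer = ''
  end
end

writer = S3ChunkedWriter.new(connection, 'bucket', 'a_key')
writer << header
coords3d.each { |coords| writer << coords.to_s }  # or however you build the KML
writer.close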

Any Ruby library that produces a nice gauge graph image that doesn't require JavaScript?

Google Image Charts has a "gauge" graph.

http://code.google.com/apis/chart/image/docs/chart_wizard.html

It's not quite as slick as the JavaScript version, but it might do.

http://chart.apis.google.com/chart?chxl=0:|slow|faster|crazy&chxt=y&chs=300x150&cht=gm&chd=t:70&chl=Groovy
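
If you'd rather build that URL from Ruby than hard-code it, here is a small sketch; the parameter names come from the Image Charts documentation, and the values match the example URL above:

require 'uri'

params = {
  'chxl' => '0:|slow|faster|crazy',  # axis labels
  'chxt' => 'y',                     # show the y axis
  'chs'  => '300x150',               # chart size in pixels
  'cht'  => 'gm',                    # chart type: Google-o-meter (gauge)
  'chd'  => 't:70',                  # needle position
  'chl'  => 'Groovy'                 # needle label
}
url = 'http://chart.apis.google.com/chart?' + URI.encode_www_form(params)
puts url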


