Hi,
This could be a bug in the pickle implementation (not in igraph, but in Python itself):
The workaround is to pickle the object into a string, and then write that string in chunks less than 2^31 bytes into a file.
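A minimal sketch of that workaround; the function name and the 1 GiB chunk size below are my own choices, not part of any API:

```python
import pickle

def pickle_in_chunks(obj, path, chunk_size=2**30):
    # Serialize into a single in-memory bytes object first...
    data = pickle.dumps(obj, protocol=pickle.HIGHEST_PROTOCOL)
    with open(path, "wb") as fp:
        # ...then write it out in slices smaller than 2**31 bytes
        # so no single write() call has to handle more than that.
        for start in range(0, len(data), chunk_size):
            fp.write(data[start:start + chunk_size])
```

Loading the file back is just `pickle.loads(fp.read())` on the reassembled bytes.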
However, note that pickling is not a terribly efficient format. Since it needs to support serializing an arbitrary set of Python objects that may link to each other and form cycles in any conceivable configuration, it has to do a lot of extra bookkeeping so that object cycles and objects embedded within themselves do not trip up the implementation. That's why the memory usage rockets up to 35 GB during pickling. If you only have a name and one additional attribute for each vertex, you could potentially gain some speed (and cut down on the memory usage) if you brew your own custom format -- for instance, you could take the edge list and the two vertex attributes, stuff them into a Python dict, and then save the dict in JSON format:
import json

def graph_as_json(graph):
    # Collect the two vertex attributes and the edge list into a plain dict
    return {
        "vertices": {
            "name": graph.vs["name"],
            "pt": graph.vs["pt"]
        },
        "edges": graph.get_edgelist()
    }

with open("output.json", "w") as fp:
    json.dump(graph_as_json(graph), fp)
You could also use gzip.open() instead of open() to compress the saved data on-the-fly. You'll also need a json_as_graph() function to perform the conversion in the opposite direction.
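The compressed variant only changes the open call. A quick sketch of the round trip, using a toy dict in place of the real graph_as_json() output (the file name is arbitrary):

```python
import gzip
import json

# Stand-in for graph_as_json(graph)
data = {"vertices": {"name": ["a", "b"], "pt": [1.5, 2.5]}, "edges": [[0, 1]]}

# gzip.open() in text mode is a drop-in replacement for open() here;
# the JSON is compressed on-the-fly as it is written.
with gzip.open("output.json.gz", "wt") as fp:
    json.dump(data, fp)

# Reading goes through gzip.open() as well.
with gzip.open("output.json.gz", "rt") as fp:
    restored = json.load(fp)
```

One caveat for the reverse conversion: JSON has no tuple type, so the edge list comes back as lists of two integers rather than tuples, and json_as_graph() should be prepared for that.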