Crazy day! I indexed a 30GB file with 53 million lines of JSON data into Elasticsearch. Then I tried Kibana on it, and it was really enjoyable over a drink. The link to Kibana is shivalink.com:5601.

The link to Elasticsearch is shivalink.com:9200.

The toughest part was unzipping the 5GB bz2 file using all my cores. I tried pbzip2 first, but it didn't work in my case (pbzip2 only decompresses in parallel the files it compressed itself, so a plain bz2 archive likely fell back to a single core). Then I found lbzip2 -d (sketched below), which was really fast and used all my cores efficiently. The file turned out to be 30GB once unpacked. Next question: how to insert it into Elasticsearch? As I am very new to this, I found esbulk and started with that. It had inserted 45 million entries when it became too slow, so I had no option other than stopping it right there.
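For the record, the decompression step was roughly this; the filename is illustrative, and lbzip2 spreads the work over all cores by default:

    # Parallel bzip2 decompression across all available cores.
    # Add -k if you want to keep the original .bz2 around.
    lbzip2 -d myfile.json.bz2    # yields myfile.json, ~30GB here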
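And a sketch of the esbulk run, assuming a local Elasticsearch on the default port; the index name, worker count, and batch size here are illustrative values, not exact flags from that night:

    # Bulk-index newline-delimited JSON into Elasticsearch.
    # -index: target index (name is illustrative)
    # -w: parallel workers, -size: docs per bulk request
    esbulk -server http://localhost:9200 -index logs -w 4 -size 1000 myfile.json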

Then I came up with the idea of using tail -n with the number of remaining entries to pull out the rest and insert them back. I successfully did it. Now I can say I kind of know big data..... :) Feeling happy.
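A minimal sketch of that resume trick, using the counts from above (53 million lines total, 45 million already indexed) and assuming esbulk reads from stdin when no file is given:

    # lines left = total lines - lines already indexed
    remaining=$((53000000 - 45000000))   # 8,000,000
    tail -n "$remaining" myfile.json \
      | esbulk -server http://localhost:9200 -index logs -w 4 -size 1000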