I am looking to open and fetch data from a large text file in Python as fast as possible (the file has almost 62,603,143 lines and is about 550 MB). As I don't want to stress my computer, I am doing it the following way:
import time
start = time.time()
for line in open(filePath):
    # considering data as the last element in the file
    if data in line:
        do_something(data)
end = time.time()
print "processing time = %s" % (count, end-start)
But with the above method it takes almost 18 seconds to read the full file (my computer has an Intel i3 processor and 4 GB of RAM). Likewise, if the file is larger it takes even more time, which from a user's point of view is far too long. I have read a lot of opinions on forums and referred to multiple Stack Overflow questions, but I still haven't found a fast and efficient way to read and fetch data from large files. Is there really any way in Python to read large text files in a few seconds?
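
One variant I was thinking of trying is reading the file in bigger chunks instead of line by line, roughly like below (just a sketch; filePath, data and do_something are the same placeholders as above, and I am not sure this is actually the right approach):

import time

start = time.time()
with open(filePath) as f:
    while True:
        chunk = f.read(1024 * 1024)  # read 1 MB at a time instead of line by line
        if not chunk:
            break
        # note: a match that spans the boundary between two chunks would be missed
        if data in chunk:
            do_something(data)
end = time.time()
print "processing time = %s" % (end - start)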