Why does the measured time of the same code snippet differ in Python with OpenCV?

Question:

I wanted to find out the amount of time taken for a module or function to execute. I have used two methods mentioned below, but the time taken in each case is not the same for the same function. Why?

import cv2
from timeit import default_timer as timer

# gray is assumed to be a grayscale image loaded earlier,
# e.g. gray = cv2.imread("image.png", cv2.IMREAD_GRAYSCALE)

start = timer()
ret, thresh1 = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)
end = timer()
print("Time Taken by Binary Thresh1 = {}".format(end - start))

e1 = cv2.getTickCount()
ret, thresh1 = cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY)
e2 = cv2.getTickCount()
t = (e2 - e1) / cv2.getTickFrequency()
print("Time Taken by Binary Thresh2 = {}".format(t))

Output

Time Taken by Binary Thresh1 = 0.00017630465444318233
Time Taken by Binary Thresh2 = 3.005609493620777e-05

Kindly let me know what the reason is, or whether anything is wrong in the code.

Asked By: Shishira


Answers:

This is likely a problem of trying to measure an interval that is too short, so you are mostly seeing the effect of your PC doing other things. Note also that the two measurements in your snippet are not equivalent: the second `cv2.threshold` call benefits from the first one having already warmed up caches and any lazy initialisation, so it will tend to look faster.

It would be worth having a read through this SO post which may give you some background info.

To measure what you actually want to measure, I would suggest running the operation 100 or 1000 times and timing the total. You can then divide by the number of runs to get the time per attempt, or report an average.
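A minimal sketch of that approach using the standard-library `timeit` module. The workload here is a stand-in function rather than the OpenCV call, since the averaging idea is independent of what is being timed:

```python
import timeit

def workload():
    # Stand-in for cv2.threshold(gray, 127, 255, cv2.THRESH_BINARY);
    # substitute any short operation you want to time.
    return sum(i * i for i in range(1000))

runs = 1000
total = timeit.timeit(workload, number=runs)  # total seconds over all runs
per_call = total / runs                       # average seconds per call
print("total: {:.6f}s, per call: {:.9f}s".format(total, per_call))
```

Averaging over many runs smooths out one-off interruptions from the OS scheduler, which is exactly the noise that makes a single sub-millisecond measurement unreliable.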

Also see here for tips on profiling Python code.
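If the code path gets heavier, the standard library's `cProfile` module gives a per-function breakdown rather than a single wall-clock number. A minimal sketch, profiling a made-up function:

```python
import cProfile
import io
import pstats

def work():
    # Hypothetical workload to profile; replace with your own code.
    return sorted(range(10000), key=lambda x: -x)

pr = cProfile.Profile()
pr.enable()
work()
pr.disable()

# Print the five most expensive entries by cumulative time.
buf = io.StringIO()
pstats.Stats(pr, stream=buf).sort_stats("cumulative").print_stats(5)
print(buf.getvalue())
```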

Answered By: GPPK