Suppose we have an algorithm that carries out N^2 operations for an input of size N, and the computer takes 1 microsecond (1/1,000,000 of a second) to carry out one operation. How long does the algorithm run for an input of size 3000?

(A) 90 seconds
(B) 9 seconds
(C) 0.9 seconds
(D) 0.09 seconds
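The arithmetic can be checked directly: 3000^2 = 9,000,000 operations, and at 1 microsecond per operation that is 9,000,000 × 10^-6 = 9 seconds, i.e. choice (B). A minimal sketch of the calculation (the helper name `runtime_seconds` is illustrative, not from the question):

```python
def runtime_seconds(n: int, op_time_s: float = 1e-6) -> float:
    """Seconds needed to perform n**2 operations,
    assuming each operation takes op_time_s seconds."""
    return (n ** 2) * op_time_s

print(runtime_seconds(3000))  # 9.0 -> choice (B)
```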
