For some time, I've been interested in how fast cameras can autofocus and execute a shot. I've read various test reports and measurements, and then I got curious: how is it, precisely, that various test organizations measure autofocus times?
I was poking around the Internet when I saw a post by a gentleman who attempted to make his own measurements by simply photographing a digital stop watch on his computer screen. He would wait for the top of the minute and take a picture. The resultant image would show how much time had elapsed.
I tried the same thing, with poor results, so I tried triggering the shot and the on-screen digital stopwatch at the same time. This, of course, forced me to be within arm's length of the laptop. I found my Canon G12 would focus and trip the shutter in about 0.7 seconds.
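The on-screen stopwatch half of this setup is easy to reproduce. Here is a minimal, hypothetical sketch (not the actual tool mentioned above) that refreshes a millisecond readout in the terminal; start it at the same moment you trip the shutter, and the photographed frame shows the elapsed time.

```python
import time

def format_elapsed(seconds):
    """Format elapsed time as MM:SS.mmm, like a digital stopwatch readout."""
    minutes, rem = divmod(seconds, 60)
    return f"{int(minutes):02d}:{rem:06.3f}"

def run_stopwatch(duration=None):
    """Continuously refresh an elapsed-time readout on one terminal line.

    duration: optional number of seconds to run; None runs until Ctrl-C.
    """
    start = time.perf_counter()
    try:
        while duration is None or time.perf_counter() - start < duration:
            print(f"\r{format_elapsed(time.perf_counter() - start)}",
                  end="", flush=True)
            time.sleep(0.001)  # aim for ~1 ms refresh
    except KeyboardInterrupt:
        pass
    print()

if __name__ == "__main__":
    run_stopwatch()
```

One caveat on this approach: a typical LCD refreshes at around 60 Hz, so the photographed readout can only be trusted to roughly 17 ms, no matter how fast the counter itself updates.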
But then I tried the Quick Shot mode on the G12. This is a mode that forces the photographer to use the optical viewfinder and continually adjusts the focus and exposure, depending on where the camera is pointed. Now, granted, I was tripping the shutter with my right hand and triggering the watch with my left, so we have to allow for some error. Nevertheless, times were generally under 0.1 second, and one of the shots was recorded in just 0.052 seconds.
If you're willing to put up with the inaccuracy of the optical viewfinder (it sees only about 80% of what the sensor captures), the Quick Shot mode appears to be seriously fast.