Incorrect time reported by os/signpost APIs

I am trying to get the execution time of a function in the Instruments Time Profiler using the signpost APIs, but it reports an incorrect time. I have reduced the problem to the following code sample:

    #import <Foundation/Foundation.h>
    #import <os/log.h>
    #import <os/signpost.h>

    #include <chrono>
    #include <climits>
    #include <ctime>

    // ... inside an Objective-C++ (.mm) method:
    os_log_t _log = os_log_create("com.example.parser", OS_LOG_CATEGORY_POINTS_OF_INTEREST);
    os_signpost_id_t spid = os_signpost_id_generate(_log);

    os_signpost_interval_begin(_log, spid, "Parse", "Start");
    auto b = clock();
    auto begin = std::chrono::high_resolution_clock::now();

    // dummy loop instead of actual parse
    for (int i = 0; i < INT_MAX; ++i) {
    }

    auto e = clock();
    auto end = std::chrono::high_resolution_clock::now();

    auto duration = std::chrono::duration_cast<std::chrono::milliseconds>(end - begin).count();
    os_signpost_interval_end(_log, spid, "Parse", "End");
    NSLog(@"CPU Time: %lf s", (e - b) / (double)CLOCKS_PER_SEC);
    NSLog(@"Chrono: %lld ms", duration);

According to both chrono and CPU time, this code takes ~3 s to execute, but the Time Profiler in Instruments shows the interval duration as 145 microseconds.
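
For reference, the chrono measurement can also be attached to the signpost end event itself, since the os_signpost macros accept os_log-style format arguments; that way the value shows up next to the interval in Instruments for comparison. A minimal, self-contained sketch of that variant (the wrapper function name runParse is just for illustration):

    #import <Foundation/Foundation.h>
    #import <os/log.h>
    #import <os/signpost.h>

    #include <chrono>
    #include <climits>

    // Same interval as above, but the chrono result is passed into the
    // signpost end event so it appears alongside the interval in Instruments.
    static void runParse(void) {   // illustrative wrapper, not part of the original code
        os_log_t log = os_log_create("com.example.parser", OS_LOG_CATEGORY_POINTS_OF_INTEREST);
        os_signpost_id_t spid = os_signpost_id_generate(log);

        os_signpost_interval_begin(log, spid, "Parse", "Start");
        auto begin = std::chrono::high_resolution_clock::now();

        // dummy loop instead of actual parse
        for (int i = 0; i < INT_MAX; ++i) {
        }

        auto end = std::chrono::high_resolution_clock::now();
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(end - begin).count();

        // signpost format strings take os_log-style arguments
        os_signpost_interval_end(log, spid, "Parse", "Took %lld ms (chrono)", (long long)ms);
        NSLog(@"Chrono: %lld ms", (long long)ms);
    }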

Is this not the correct way to use the signpost API?

Xcode 11.3

macOS 10.15.7
