Author Topic: Asked: functions written in C/C++ or other language : Microseconds Delays Timers  (Read 7785 times)

Stefke35

  • Member
  • ***
  • Posts: 10
Hi all,

I'm wondering which timer APIs I can use to create delays measured in microseconds.

I have read the MSDN documentation and found several different approaches for writing a function that generates a microsecond delay.

Which one is the best solution if the user sets the delay in microseconds and the function then waits for that many microseconds?

I hope there are developers here who know the WinAPI and low-level programming. I work under Windows XP/7.

We have the choice of:

1. Multimedia Timer
2. Waitable Timer
3. Queue Timer

1) Which of these timers is the best choice?
2) How can I rewrite the code to work in microseconds?
3) I would like to change the existing functions (see below) so that they use a microsecond
    (delay) timer. I don't know how to change this in the code:

     DECLARE SUB TimeInit_us()
     DECLARE FUNCTION TimeRead_us() AS QUAD
     DECLARE SUB Delay_us(BYVAL wDelay AS WORD)

'------------------------------------------------------------------------------
' TIME functions
'------------------------------------------------------------------------------
' This set of functions is important for various time measurements. Just as
' with the DELAY function (in ms) and Delay_us (in µs), the TIME functions also
' operate on a millisecond or microsecond basis.
' TIMEINIT (in ms) and TIMEINIT_US (in µs) reset the timers to zero and restart them
' with ms or µs precision.
' TIMEREAD (in ms) and TIMEREAD_US (in µs) read the amount of time (in ms or µs)
' since the last TIMEINIT or TIMEINIT_US call. Both functions
' return a 64-bit integer.

4) How can I combine the two 32-bit parts into a quad word (64 bit)?

Code: [Select]

#Dim All
#Include "Win32API.inc"

Type LARGE_INTEGER
    low_part  As Long
    high_part As Long
End Type

Global g_TimeUnit               As Double
Global g_Start_Time_Low   As Long
Global g_Start_Time_High   As Long

'========================================================================='
'Milliseconds Delay Timer Functions                                       '
'========================================================================='

Sub TimeInit()
    '==============================================='
    'Reset the milliseconds Timer to Zero
    'will reset the timers to zero and start them again with ms precision
    '===============================================
    Local f As LARGE_INTEGER
    Local t As LARGE_INTEGER
    Local x As Dword

    x = QueryPerformanceFrequency(f)
    ' Use the full 64-bit frequency; the low part is unsigned, so
    ' correct it if it appears negative in the signed Long.
    Local freq As Double
    freq = f.high_part * 4294967296# + f.low_part
    If f.low_part < 0 Then freq = freq + 4294967296#
    g_TimeUnit = 1000 / freq
    x = QueryPerformanceCounter(t)

    g_Start_Time_Low  = t.low_part
    g_Start_Time_High = t.high_part
End Sub

Function TimeRead()  As QUAD
    '=======================================
    ' TIMEREAD functions will read the amount of time (in ms)
    ' since the last TIMEINIT function was executed. The function
    ' return a 64-bit integer
    '=======================================
    Local t As LARGE_INTEGER
    Local x As Dword

    ' Fallback if TimeInit was never called: 1000 / 1193182 Hz (the legacy timer frequency)
    If(g_TimeUnit = 0) Then g_TimeUnit = 0.000838096515
    x = QueryPerformanceCounter(t)
    ' Apply the time unit to the WHOLE tick difference, not only the last term
    Function = ((t.high_part - g_Start_Time_High) * 4294967296# + (t.low_part - g_Start_Time_Low)) * g_TimeUnit
End Function

Sub Delay(ByVal wDelay_time As Word)
    '=============================================
    'DELAY busy-waits until the requested number of milliseconds
    'has elapsed, as measured by TimeRead
    '=============================================
    Local time_start As Double

    time_start = TimeRead()
    While(TimeRead() < (time_start + wDelay_time)) : Wend
End Sub

I hope someone here can help me, because I haven't been able to write the microsecond versions of these functions myself.

Kind regards
Stephane



Jan Axelson

  • Administrator
  • Frequent Contributor
  • *****
  • Posts: 2616
    • Lakeview Research
This might be helpful:

http://www.codeproject.com/Articles/98346/Microsecond-and-Millisecond-NET-Timer

If you need a precise timer for events on external hardware, it's best to do the timing in the external hardware and then pass the data to the PC, removing the need for precise timing on the PC side.

Stefke35

  • Member
  • ***
  • Posts: 10
Quote from: Jan Axelson

This might be helpful:

http://www.codeproject.com/Articles/98346/Microsecond-and-Millisecond-NET-Timer

If you need a precise timer for events on external hardware, it's best to do the timing in the external hardware and then pass the data to the PC, removing the need for precise timing on the PC side.
I don't have experience in C#, only C++/C and MFC.