I am trying to write some production-grade benchmarks for `bfs` with early termination, something like

```shell
bfs -name pattern -quit
```

where `pattern` is the name of some file at a particular depth. However, I don't want to use the same file every time, because performance varies a lot depending on the order that files are visited in. A single file isn't representative.

This seems like a good case for `-L`, like this:

```shell
hyperfine -L name foo,bar,baz "bfs -name {name} -quit"
```

except that generates three different command lines and compares them against each other.

In real life I have a few dozen names, and I don't really care about the performance of each one separately. What I want is a mode that picks a random `{name}` every time it runs the command, so I could compare `bfs` to `find` over the whole set of names. Right now I'm hacking it with `shuf`.
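For reference, the `-L` invocation above expands the `{name}` placeholder once per list entry and benchmarks each resulting command separately. A minimal sketch of the commands it generates (using the three example names; `echo` stands in for actually running them):

```shell
# hyperfine -L name foo,bar,baz "bfs -name {name} -quit"
# substitutes {name} once per value, producing three separate
# benchmarks that are reported side by side:
for name in foo bar baz; do
    echo "bfs -name $name -quit"
done
```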
Thank you. Sounds like a useful feature. I would assume that most use cases for such an option would involve numeric parameters, not a finite list of options. Do you think we could come up with a good CLI for this without having to add too many options?