BTW - my simulation based on dyno torque curve data has actually proven to produce realistic results for several vehicles already. My favorite example is a 2006 WRX STi. My brother bought one last year and was interested in using my simulation to get some idea of how launching at different RPMs and shifting gears at different speeds might affect acceleration performance.

We found a stock dyno chart online, a measured coefficient of drag and frontal area, specs for transmission ratios, weight and tire size. Everything we needed.
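For what it's worth, the core of a simulation like this can be sketched in a few dozen lines: interpolate wheel torque from the dyno curve, convert it to drive force through the gear and final-drive ratios and tire radius, subtract aero drag, and integrate. Every number below (torque points, ratios, mass, CdA, driveline efficiency) is an illustrative placeholder, not the STi's actual data, and a real run would also need a traction limit and clutch-slip model for the launch:

```python
import math

DT = 0.001            # integration step (s)
MASS = 1520.0         # vehicle + driver mass (kg), placeholder
CD_A = 0.57           # drag coefficient * frontal area (m^2), placeholder
RHO = 1.225           # air density (kg/m^3)
TIRE_R = 0.326        # tire rolling radius (m), placeholder
FINAL_DRIVE = 3.9     # placeholder
GEARS = [3.64, 2.38, 1.76, 1.35, 0.97, 0.76]  # placeholder ratios
DRIVELINE_EFF = 0.85  # crude lumped drivetrain loss

# Placeholder torque curve as (rpm, Nm) points, linearly interpolated.
TORQUE_PTS = [(2000, 280), (4000, 400), (6000, 390), (7000, 330)]

def engine_torque(rpm):
    """Linearly interpolate the dyno torque curve."""
    pts = TORQUE_PTS
    if rpm <= pts[0][0]:
        return pts[0][1]
    for (r0, t0), (r1, t1) in zip(pts, pts[1:]):
        if rpm <= r1:
            return t0 + (t1 - t0) * (rpm - r0) / (r1 - r0)
    return pts[-1][1]

def simulate(shift_rpm=6800, shift_time=0.5, distance_limit=402.3):
    """Integrate a standing-start run out to the 1/4 mile (402.3 m).
    Ignores traction limits, rolling resistance, and clutch slip."""
    t = v = x = 0.0
    gear = 0
    shifting_until = 0.0
    while x < distance_limit:
        ratio = GEARS[gear] * FINAL_DRIVE
        rpm = max(v / TIRE_R * ratio * 60 / (2 * math.pi), 2000)
        if rpm >= shift_rpm and gear < len(GEARS) - 1 and t >= shifting_until:
            gear += 1
            shifting_until = t + shift_time  # drive force cut during the shift
        if t < shifting_until:
            f_wheel = 0.0
        else:
            f_wheel = engine_torque(rpm) * ratio * DRIVELINE_EFF / TIRE_R
        f_drag = 0.5 * RHO * CD_A * v * v
        v += (f_wheel - f_drag) / MASS * DT
        x += v * DT
        t += DT
    return t, v

quarter_t, trap_v = simulate()
```

Changing `shift_rpm` and `shift_time` is exactly the kind of what-if experiment described here; the returned pair is elapsed time (s) and trap speed (m/s).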

First simulated run: launch hard at peak-torque RPM, assume 0.5s shift times, and I got...

0-60mph: 4.9s
1/4 mile: 13.01s @ 104.4mph

One car website reported 4.9s 0-60, but no 1/4 mile time.

Another car website reported 4.5s 0-60, and 13.0s @ 103.5mph.

Almost a perfect match to the reported 1/4 mile time, and a perfect match to one of the reported 0-60 mph times. But that quicker 4.5s 0-60 time from the other website was really bothering me. Which one was correct, 4.9 or 4.5? How could there be such a big difference? And what concerned me the most was that the website whose 1/4 mile result agreed with my simulation was also the source of the 0-60 time that did NOT agree with my simulation. What had I done wrong?!

Then I remembered that some car testers use a 1-foot rollout (as in a 1/4 mile drag race) for all acceleration-from-a-stop tests, including 0-60 mph tests. This produces quicker 0-60 results because the car gets a 1-foot head start before the timer starts. So back to the simulation to test this hypothesis...
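The rollout correction itself is simple to bolt onto any simulator that logs a time history: start the clock when the car has covered one foot instead of at launch. A minimal sketch, assuming the simulator records `(time, position, speed)` samples (the function and trace format here are illustrative, not from the original simulation):

```python
FOOT_M = 0.3048    # 1 foot in meters
MPS_60 = 26.8224   # 60 mph in m/s

def zero_to_sixty(trace, rollout=True):
    """trace: time-ordered list of (t, x, v) samples from a standing start.
    With rollout=True the clock starts once the car has moved 1 foot,
    mimicking drag-strip timing; otherwise it starts at launch."""
    t_start = 0.0
    if rollout:
        # moment the car completes the 1-foot rollout
        t_start = next(t for t, x, v in trace if x >= FOOT_M)
    t_60 = next(t for t, x, v in trace if v >= MPS_60)
    return t_60 - t_start
```

Since the first foot is covered at very low speed, the rollout clips off a disproportionately large slice of time, which is why the correction is worth several tenths.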

Simulated 0-60 with 1-foot rollout: 4.57s

Much better!

I was then able to slightly tweak the launch RPM and shift times, still within a reasonable range, and get results that matched the websites' reported numbers even more closely.