There seems to be a bug with the way SetSyncRate() and SetVSync() work, or at least the documentation is wrong. This doesn't affect performance per se, but it can explain why some people see frame rates that don't match the monitor's refresh rate and why there may be tearing with vSync on.
According to the docs, when SetVSync is set to 0, the frame rate should match the value passed to SetSyncRate. When SetVSync is set to 1, SetSyncRate is overridden and the frame rate is locked to the monitor's vertical blank.
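For reference, this is a minimal Tier 1 sketch of how I read the documented behaviour (75 is just an example value, it isn't special):

// documented behaviour: with vSync off, the app should run at the SetSyncRate value
SetVSync( 0 )
SetSyncRate( 75, 0 )   // second parameter 0 = allow the CPU to sleep between frames

// documented behaviour: with vSync on, SetSyncRate should be overridden
// and the app should run at the monitor's refresh rate instead
SetVSync( 1 )

do
    Print( ScreenFPS() )
    Sync()
loop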
With the 7-12-2018 build, it seems to work that way. If I set the sync rate to 75 and vSync to 0, the frame rate is ~75fps. If I set vSync to 1, the frame rate drops to 60 (my monitor's refresh rate). If I set the sync rate to 30, I get 30fps and 60fps respectively. However, if I set vSync to 1 and then back to 0 while the program is running, it runs at full speed (uncapped) instead of at the value set in SetSyncRate. Apparently the sync rate isn't just overridden but completely disabled, and it must be set again when you change vSync from 1 back to 0.
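The workaround I use on 7-12-2018 when toggling vSync at runtime is simply to re-apply the sync rate after turning vSync back off (again, 75 is just an example value):

// on the 7-12-2018 build, turning vSync off again leaves the app uncapped,
// so the cap has to be re-applied afterwards
SetVSync( 0 )
SetSyncRate( 75, 0 )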
Now it seems that 6-11-2019, 8-1-2019 and Studio behave differently. If I set the sync rate to 75 and vSync to 0, the app runs at 75fps. Set vSync to 1 and it goes to 60fps. However, when I set it back to 0 within the program, it goes back to 75, which suggests the sync rate really is being overridden rather than disabled. That's a plus for the later versions of AppGameKit, but the real problem comes when I set the sync rate to 30. When I then set vSync to 1, the fps stays at 30! It is not being overridden. To match the monitor's refresh rate, I have to manually set the sync rate to 0. So if you have a 75Hz monitor but the app has a sync rate of 60 (the default if no sync rate is set), you will get tearing because the app's rate doesn't match your monitor.
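So on 6-11-2019, 8-1-2019 and Studio, the pattern that works for me is to clear the sync rate whenever vSync is turned on. A small sketch:

// newer builds: SetSyncRate still caps the frame rate even with vSync on,
// so clear the cap (rate 0 = uncapped) and let the monitor set the pace
SetSyncRate( 0, 0 )
SetVSync( 1 )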
It seems that, with vSync on, SetSyncRate acts as a maximum fps cap and is only disabled when the monitor's refresh rate is lower than or equal to the sync rate.
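Until this is cleared up, a small helper like the one below covers both behaviours for me. ApplyVSync is just a name I made up, and desiredFPS# is whatever cap you want with vSync off; this is only a sketch based on the observations above.

// toggle vSync in a way that behaves the same on the old and new builds
function ApplyVSync( enabled, desiredFPS# )
    if enabled = 1
        SetSyncRate( 0, 0 )            // remove the cap so newer builds can follow the monitor
        SetVSync( 1 )
    else
        SetVSync( 0 )
        SetSyncRate( desiredFPS#, 0 )  // re-apply the cap (needed on the 2018 build)
    endif
endfunction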