Disclaimer: I legitimately tried to search for the answers to these questions before posting! If I missed something somewhere, please post a link! Thanks!
I have an '04 CTS-V with an Edelbrock E-Force unit installed and have converted over to the 2 BAR OS. I ditched the Bosch T-MAP in favor of the 2 BAR Cobalt sensor, since it's a direct plug-in and, from what I've been able to find, doesn't require rescaling (it was also cheaper than the T-MAP connector harness from Ford).
Anyhow, I have everything converted over, loaded the correct fuel injector data, and adjusted the TB scaler for the 90mm unit in place of the 78mm. It starts and runs at idle, and I'm about to begin tuning. This is where I have a few questions:
1. The MAF fail frequency must be set to 0 Hz, and DTCs P0102 and P0103 must be set to fail on first error with the "CEL" box unchecked. Is this still true if the MAF is no longer in the system at all? Will the OS recognize there is no MAF input and simply run SD with the DTCs showing "No Error Reported"? This is half a tuning question and half an emissions-testing question.
2. Is using RTT as accurate as using a wideband? Is the recommended approach to use RTT to get "close" and then switch to the wideband error PID to dial things in exactly? I assume RTT maps the VE tables in much the same way it derives the ST/LT correction factors. Is this correct?
3. When using RTT, must you still complete all of the traditional tuning steps: forcing open-loop operation, disabling PE, resetting the O2 readiness monitors, etc.? Or does this process compensate for all of that?
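For what it's worth, here's my mental model of the correction both methods apply to a VE cell, scaling it by the observed fueling error. This is just an illustration with made-up numbers, not anything pulled from the actual tuning software:

```python
# Sketch of the VE-correction idea: scale each VE cell by the fueling
# error observed while the engine ran in that RPM/MAP cell.
# With narrowband trims the error is roughly (STFT + LTFT);
# with a wideband it's commanded vs. measured AFR.
# All values below are made up for illustration.

def corrected_ve_from_trims(ve, total_trim_pct):
    """If trims are adding fuel (+%), the VE table under-reports
    airflow, so raise the cell by the same percentage."""
    return ve * (1 + total_trim_pct / 100.0)

def corrected_ve_from_wideband(ve, commanded_afr, measured_afr):
    """Wideband version: measured leaner than commanded means the
    VE cell is low; scale by the AFR ratio."""
    return ve * (measured_afr / commanded_afr)

# A cell at VE 0.85 where ST+LT trims total +6% fuel:
print(round(corrected_ve_from_trims(0.85, 6.0), 3))            # 0.901
# Same cell via wideband: commanded 12.5:1, measured 13.2:1:
print(round(corrected_ve_from_wideband(0.85, 12.5, 13.2), 3))  # 0.898
```

If that's roughly how RTT works under the hood, I'd expect the trims-based pass to get close and the wideband error PID to tighten it up, which is why I'm asking question 2.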
Appreciate any info that can be provided! Thanks in advance!