Nope, there is no simple formula; performance depends on too many different factors to capture them all in one.
The number of records you mentioned (250-350 million) is high but generally manageable.
However, how large is each record?
How complex is the profile you defined?
How large are the sample data sets taken for each profile?
For example, imagine you have 20 columns holding address data (from 50 different countries) for private and work addresses, cell phone and landline numbers, private and work email addresses, and the like.
Now imagine you want to perform a cross-profile for all 20 attributes.
If you do that for 100 records, that should work fine.
If you do that for 100,000 records, things look very different.
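To put rough numbers on that, here is a back-of-envelope sketch in Python. It assumes cross-profiling compares every pair of columns, which is an illustrative simplification on my part, not a statement about how IDQ schedules the work internally; the `cross_profile_workload` helper is hypothetical.

```python
from math import comb

def cross_profile_workload(columns, records):
    """Rough illustration of how cross-column profiling work grows.

    Assumes every pair of columns is compared and each record
    contributes one value pair per column pair -- an illustrative
    simplification, not how IDQ actually schedules the work.
    """
    column_pairs = comb(columns, 2)        # 20 columns -> 190 pairs
    value_pairs = column_pairs * records   # pairings to evaluate
    return column_pairs, value_pairs

for n in (100, 100_000):
    pairs, work = cross_profile_workload(20, n)
    print(f"{n:>7,} records: {pairs} column pairs, {work:,} value pairs")
```

With 20 columns you already get 190 column pairs, so the total work is the record count multiplied by a factor that grows quadratically with the number of columns: about 19 thousand pairings at 100 records, but 19 million at 100,000.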
Another example: if you perform ETL transformations on 250 million records, a medium-sized server will probably be able to process them without much trouble.
However, if you do cross-column profiling on such a data set, even a really HUGE server (64 CPUs, terabytes of RAM) will eventually give up.
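To illustrate why, here is a hedged memory estimate in the same spirit. Every number in it (the distinct-value ratio, the bytes per hash-table entry) is an assumption I picked for illustration, not an IDQ figure; the point is only that cross-column profiling has to hold state that grows with both the record count and the number of column pairs.

```python
def cross_profile_memory_gb(records, column_pairs,
                            distinct_ratio=0.5, bytes_per_entry=64):
    """Very rough memory estimate for cross-column profiling state.

    Assumes each column pair keeps a hash table of distinct value
    combinations. distinct_ratio and bytes_per_entry are guesses
    (address, phone, and email data tends to be highly distinct),
    not IDQ figures.
    """
    entries = records * distinct_ratio * column_pairs
    return entries * bytes_per_entry / 1024**3

# 250 million records across 190 column pairs (20 columns):
print(f"{cross_profile_memory_gb(250_000_000, 190):,.0f} GB")
# ~1,416 GB -- past a terabyte of RAM before any processing overhead
```

An ETL pass, by contrast, touches each record once and discards it, so its memory footprint stays roughly constant no matter how many records flow through; that is why the same 250 million rows are unremarkable for ETL and punishing for cross-column profiling.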
It all depends; that's all I can say. I know this doesn't help you much, so you might want to hire someone experienced with complex setups and requirements to get a useful estimate.
1. Is there an expected performance matrix for IDQ?
No. As explained above, performance depends on too many factors for a general matrix to exist.
2. How much data is too much for IDQ to process?
In theory there is no limit if you have enough resources.
3. What is the Informatica-recommended hardware configuration?
This depends entirely on the tasks you wish to perform. We give only the minimum requirements in the install guide.
In the meantime, one more thing came to mind.
Informatica Professional Services has quite a few people with experience in such estimations. You might consider hiring one of them; they have the resources to facilitate this kind of analysis and can help you out fairly fast.
And no, I don't get any commission for such recommendations; I just know quite a few of my former colleagues, so I know there are people who are really good at such tasks.