I have a performance/capacity related question:
Is it common for an OLTP system to have as many as 300 million records?
Here is my situation -- I receive 300 million records per day into an OLTP system. I need to keep the data for at least three weeks, and I need to be able to search the whole data set by primary key (21 days x 300 million records, roughly 6.3 billion rows).
Let us assume I can buy any needed hardware for this. Can this be accomplished?
Is there any reference material on benchmarks or rules of thumb to follow?
This is common, and Oracle has good features for handling such large data sets. Make sure your tables are partitioned; if they are not, at least think about how you could change the schema to incorporate partitions. Partitioning by the load date would also make the three-week retention cheap, since each day's data can be aged out by dropping its partition instead of deleting billions of rows. This would greatly improve your search time as well. You can search the Oracle site for documentation on partitioning tables: www.oracle.com
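As a rough sketch of what that could look like, the DDL below uses daily interval range partitioning on a date column, with a global hash-partitioned unique index supporting the primary-key lookups. The table name, column names, and partition counts are all illustrative assumptions, not details from the original question; tune them against your own workload.

```sql
-- Hypothetical table: daily interval partitions on the load date.
CREATE TABLE txn_log (
    txn_id    NUMBER(19)    NOT NULL,
    load_date DATE          NOT NULL,
    payload   VARCHAR2(400)
)
PARTITION BY RANGE (load_date)
INTERVAL (NUMTODSINTERVAL(1, 'DAY'))
(PARTITION p_initial VALUES LESS THAN (DATE '2024-01-01'));

-- Global hash-partitioned index so a PK lookup does not have to
-- probe all 21 daily partitions.
CREATE UNIQUE INDEX txn_log_pk_ix ON txn_log (txn_id)
    GLOBAL PARTITION BY HASH (txn_id) PARTITIONS 32;

ALTER TABLE txn_log
    ADD CONSTRAINT txn_log_pk PRIMARY KEY (txn_id)
    USING INDEX txn_log_pk_ix;

-- Aging out a day past the retention window: drop its partition
-- and keep the global index usable.
ALTER TABLE txn_log DROP PARTITION FOR (DATE '2024-01-01')
    UPDATE GLOBAL INDEXES;
```

Note the trade-off: a global index makes the partition drop more expensive (the index must be maintained), but it is usually what keeps single-row PK searches fast on a range-partitioned table.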