I have a performance/capacity-related question:
Is it common for an OLTP database to hold as many as 300 million records?
Here is my situation: I receive 300 million records per day into an OLTP system. I need to keep the data for at least three weeks, and I need to be able to search all of it by primary key (21 days x 300 million records).
Let us assume I can buy any needed hardware for this. Can this be accomplished?
Are there any references to benchmarks or rules of thumb I should follow?
This is common, and Oracle has good features for handling data sets this large. Make sure your tables are partitioned; if they are not, look at ways you can restructure them to incorporate partitions. This will greatly improve your search times, and it also makes purging old data cheap. You can search the Oracle site for documentation on partitioning tables: www.oracle.com
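As a rough sketch of the idea (table and column names here are made up for illustration), a daily range-partitioned table might look like this, with one partition per day of the three-week window:

```sql
-- Hypothetical daily-partitioned table; names and sizes are illustrative.
CREATE TABLE txn_log (
    txn_id    NUMBER         NOT NULL,   -- the PK you search by
    txn_date  DATE           NOT NULL,   -- drives the partitioning
    payload   VARCHAR2(200),
    CONSTRAINT txn_log_pk PRIMARY KEY (txn_id)
)
PARTITION BY RANGE (txn_date) (
    PARTITION p20050601 VALUES LESS THAN (TO_DATE('2005-06-02','YYYY-MM-DD')),
    PARTITION p20050602 VALUES LESS THAN (TO_DATE('2005-06-03','YYYY-MM-DD'))
    -- ...one partition per day of the retention window
);
```

The payoff for your retention requirement is that aging out a day becomes a metadata operation instead of deleting 300 million rows. Since the PK index here is global (it is not on the partition key), you would drop with something like `ALTER TABLE txn_log DROP PARTITION p20050601 UPDATE GLOBAL INDEXES` so PK lookups stay valid afterwards.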