I need to save a lot of information which will be used as search filters. Since there are too many rows in the database, each of these filters has to be a single column of the table so that the search is fast enough. Some of them are numbers, others booleans, others strings, others dates, etc. Is it bad to add as many as 200 columns to the table? How will this affect the performance of my database table? Thanks for your answers.
2007-10-08 03:41:48 · 7 answers · asked by Anonymous in Computers & Internet ➔ Programming & Design
While it is possible that you could have a data model that requires that flat a design, I'm guessing you need to take a look at it again to see if you can normalize things a bit more. Can you break out the information into tables that can be joined when needed?
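As a minimal sketch of what "break out into joinable tables" could look like (table and column names here are invented for illustration, using Python's built-in sqlite3 rather than MySQL):

```python
import sqlite3

# Hypothetical decomposition: instead of one wide table carrying 200
# filter columns, related filters move into their own table and are
# reassembled with a JOIN only when a search needs them.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE items (
        id INTEGER PRIMARY KEY,
        name TEXT
    );
    -- numeric filters live in their own table, one row per item
    CREATE TABLE item_dimensions (
        item_id INTEGER PRIMARY KEY REFERENCES items(id),
        weight_kg REAL,
        height_cm REAL
    );
""")
conn.execute("INSERT INTO items VALUES (1, 'widget')")
conn.execute("INSERT INTO item_dimensions VALUES (1, 2.5, 30.0)")

# the join pulls in only the filter columns this particular search uses
row = conn.execute("""
    SELECT i.name, d.weight_kg
    FROM items i JOIN item_dimensions d ON d.item_id = i.id
    WHERE d.weight_kg < 5
""").fetchone()
print(row)  # ('widget', 2.5)
```

Each narrow table keeps rows small, and searches that filter on only a few attributes never touch the other tables at all.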
2007-10-08 03:46:12 · answer #1 · answered by Jim Maryland
That's way too many columns. The integrity of that database has to be very poor. I'm not sure how many connections you run at a time, but with more than a few your data can easily become inconsistent, or you can cause unnecessary locks that will slow you down. I'm not sure what you are filtering or how the search works, so I can't comment on improving this overall. You may really need the 200 columns for your current design, which would mean everything needs to be redone. You should be at around 2NF - 3NF; with 200 columns I doubt you are even in 1NF.
1NF - no repeating groups of columns
2NF - 1NF + every non-key field depends on the whole primary key
3NF - 2NF + every non-key field depends on the key and only the key (it cannot be determined by other non-key fields)
I think that's how it goes. Haven't done database management in about 2 years now. It's something like that. Might have gotten 3NF confused with BCNF.
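To make the 1NF point concrete, here is a small hedged sketch (sqlite3, with made-up names): numbered columns like tag1, tag2, tag3 are the classic "repeating group", and the 1NF fix is to turn them into rows in a child table.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Violates 1NF: the same kind of value repeats across numbered columns,
# so searching "any tag" means checking every column.
conn.execute(
    "CREATE TABLE product_flat (id INTEGER, tag1 TEXT, tag2 TEXT, tag3 TEXT)")

# 1NF-style fix: one row per value in a child table keyed to the parent.
conn.executescript("""
    CREATE TABLE product (id INTEGER PRIMARY KEY);
    CREATE TABLE product_tag (
        product_id INTEGER REFERENCES product(id),
        tag TEXT
    );
""")
conn.execute("INSERT INTO product VALUES (1)")
conn.executemany("INSERT INTO product_tag VALUES (1, ?)",
                 [("red",), ("metal",), ("small",)])

# now one indexed column answers "which products have tag X"
tags = [t for (t,) in conn.execute(
    "SELECT tag FROM product_tag WHERE product_id = 1 ORDER BY tag")]
print(tags)  # ['metal', 'red', 'small']
```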
2007-10-08 11:57:33 · answer #2 · answered by yanksfan868686
The simple answer to your question - in my opinion - is "yes", 200 columns is too many to have in a table for all practical purposes. When you say there are too many rows in the database (table?), it leads me to wonder if you are storing things in a table that could be generated programmatically. I think a good place to start would be to separate out the data types into their own tables - have a table specifically for strings, one for date types, etc. In the program(s) using the tables, you can decide which table to look at based on datatype, which should increase the speed of the search immensely. Perhaps you could create a special module/class for those apps that access the database, which can contain some of the heavy-lifting logic as well as data parsing and decision structures. Good luck.
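A minimal sketch of the per-datatype idea (names invented; this is essentially an entity-attribute-value layout, which trades schema rigidity for flexibility):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- one narrow table per datatype instead of 200 typed columns
    CREATE TABLE filter_int  (item_id INTEGER, name TEXT, value INTEGER);
    CREATE TABLE filter_text (item_id INTEGER, name TEXT, value TEXT);
    CREATE INDEX ix_fi ON filter_int (name, value);
    CREATE INDEX ix_ft ON filter_text (name, value);
""")
conn.execute("INSERT INTO filter_int VALUES (42, 'year', 2007)")
conn.execute("INSERT INTO filter_text VALUES (42, 'color', 'blue')")

# the application picks the table by the filter's datatype,
# so each (name, value) index stays compact and type-correct
hits = conn.execute(
    "SELECT item_id FROM filter_int WHERE name = 'year' AND value = 2007"
).fetchall()
print(hits)  # [(42,)]
```

The usual caveat with this layout is that filtering on many attributes at once requires one join (or intersection) per attribute, so it's worth benchmarking against the alternatives before committing.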
2007-10-08 11:20:10 · answer #3 · answered by Anonymous
The maximum number of columns in MySQL (5.1) is 4096. But there are other limits, so the effective number of columns may be fewer; for example, the maximum row size is 65,535 bytes.
Personally, I would consider some other approach, because maintaining a table with 200 columns can be quite hard. Have you considered using multiple tables?
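As a rough back-of-the-envelope check on that row-size limit (the column sizes here are assumed for illustration, and real MySQL per-column overheads vary by storage engine and type):

```python
# Hypothetical worst case: 200 VARCHAR(100) columns stored as utf8
# (up to 3 bytes per character in MySQL's utf8 charset), ignoring
# per-column length and NULL-bitmap overhead.
columns = 200
chars_per_column = 100
bytes_per_char = 3

row_bytes = columns * chars_per_column * bytes_per_char
print(row_bytes)          # 60000
print(row_bytes < 65535)  # True - but only just under the 65,535-byte cap
```

So even modest string columns push a 200-column table close to the hard limit, before the data itself grows at all.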
2007-10-08 10:51:22 · answer #4 · answered by Timo J
What is the size of this DB? From your description, your data is not normalized properly. What exactly are you "filtering"? What is the general nature of your database, and what type of data are you seeking to extract? MySQL will handle a 200-column table --- the broader question is why?
2007-10-08 10:54:51 · answer #5 · answered by KarenL
I would say 200 columns sounds incredibly steep, but I'm not a DBA.
Generally speaking, overly normalized data will not perform well, and overly denormalized data will be slow as well. I would venture to guess that the answer lies somewhere in between.
You can always benchmark some queries and see which version of your code performs best; that's what I would do.
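A minimal sketch of that benchmarking idea, using Python's timeit against an in-memory sqlite3 table (purely illustrative - run the same comparison against your real MySQL schema variants and realistic data volumes):

```python
import sqlite3
import timeit

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, val INTEGER)")
conn.executemany("INSERT INTO t (val) VALUES (?)",
                 [(i % 100,) for i in range(10_000)])

def query():
    # the candidate search query you want to compare across schema designs
    return conn.execute("SELECT COUNT(*) FROM t WHERE val = 7").fetchone()[0]

# time the same query repeatedly; repeat for each schema variant
# and compare the elapsed times rather than guessing
elapsed = timeit.timeit(query, number=50)
print(query(), f"{elapsed:.4f}s")
```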
Check out this essay though:
http://www.agiledata.org/essays/databaseRefactoringSmells.html
2007-10-08 10:50:49 · answer #6 · answered by Christian T
As others have said, it doesn't sound as if you have the design quite right.
You ought to look at your data - only you know what you are trying to do. It could be doing exactly what you need, in which case no further re-design is necessary.
However, I think you need to consider how normalisation could help you. Also consider how much redundant data you have. Primary and foreign keys, and multiple tables with appropriate joins, may help with performance and storage issues.
Hope this gives you something to think about.
2007-10-08 17:12:31 · answer #7 · answered by Anonymous