The purpose of normalisation is to make the tables in the database work as efficiently as possible and to reduce or eliminate redundancy.
Normalisation in a database comes in several levels. Each level has specific criteria that a database must meet to be declared "normalised" to that level. Most databases in the corporate world are only normalised to the second or, more rarely, third normal form (NF).
Beyond second NF, in a real-world environment, further normalisation can actually make the database less efficient.
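To make the levels a bit more concrete, here is a minimal sketch of a table that breaks first NF and a redesign that satisfies it. The table and column names are made up purely for illustration, and the snippet uses Python's built-in sqlite3 module:

# Minimal sketch of first normal form, using Python's built-in sqlite3.
# Table and column names here are hypothetical, for illustration only.
import sqlite3

con = sqlite3.connect(":memory:")

# Unnormalised: several phone numbers crammed into one field (violates 1NF).
con.execute("CREATE TABLE customer_flat (id INTEGER PRIMARY KEY, name TEXT, phones TEXT)")
con.execute("INSERT INTO customer_flat VALUES (1, 'Ann', '555-0101, 555-0102')")

# 1NF: one atomic value per field, with the repeating data moved to its own table.
con.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
con.execute("CREATE TABLE phone (customer_id INTEGER REFERENCES customer(id), number TEXT)")
con.execute("INSERT INTO customer VALUES (1, 'Ann')")
con.executemany("INSERT INTO phone VALUES (?, ?)",
                [(1, '555-0101'), (1, '555-0102')])

# Finding a customer by phone number is now a simple equality match.
print(con.execute(
    "SELECT c.name FROM customer c JOIN phone p ON p.customer_id = c.id "
    "WHERE p.number = '555-0102'").fetchall())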
Memory in a computer is stored as a series of bits of information, each of which is either "on" or "off". In the old days there were physical relay switches, like light switches, that were flipped up or down depending on whether that "bit" was on or off. Today the "bit" is magnetised to be on or off, which is why playing with large magnets around your PC is a bad plan.
When it comes to externally stored data, CDs and DVDs are not magnetic, but that's a different discussion.
answer #1 · answered by Kaia 7 · 2006-10-31 21:37:58
Basically, normalisation is the process of efficiently organising data in a database. There are two main objectives of the normalisation process: eliminate redundant data (storing the same data in more than one table) and ensure data dependencies make sense (only storing related data in a table). Both of these are valuable goals, as they reduce the amount of space a database consumes and ensure that data is logically stored.
The process of designing a relational database includes making sure that a table contains only data directly related to the primary key, that each data field contains only one item of data, and that redundant (duplicated and unnecessary) data is eliminated. The task of a database designer is to structure the data in a way that eliminates unnecessary duplication(s) and provides a rapid search path to all necessary information. This process of specifying and defining tables, keys, columns, and relationships in order to create an efficient database is called normalization.
Normalisation is part of successful database design.
Without normalisation, database systems can be inaccurate, slow, and inefficient and they might not produce the data you expect.
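As a rough illustration of those points (all table and column names below are hypothetical, and the snippet uses Python's built-in sqlite3 module), compare a table that repeats customer details on every order with a design that stores them once:

# Sketch of eliminating redundancy: repeated customer details are moved
# out of the orders table into their own table. Names are hypothetical.
import sqlite3

con = sqlite3.connect(":memory:")

# Unnormalised: the customer's address is repeated on every order row,
# so a change of address has to be applied in many places.
con.execute("""CREATE TABLE orders_flat (
    order_id INTEGER PRIMARY KEY, customer_name TEXT,
    customer_address TEXT, item TEXT)""")

# Normalised: customer details are stored once and referenced by key.
con.execute("""CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY, name TEXT, address TEXT)""")
con.execute("""CREATE TABLE orders (
    order_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customer(customer_id), item TEXT)""")

con.execute("INSERT INTO customer VALUES (1, 'Ann', '1 High St')")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(10, 1, 'lamp'), (11, 1, 'desk')])

# The address now lives in exactly one row, so an update touches one place.
con.execute("UPDATE customer SET address = '2 Low Rd' WHERE customer_id = 1")
print(con.execute("SELECT o.order_id, c.address FROM orders o "
                  "JOIN customer c ON c.customer_id = o.customer_id").fetchall())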
answer #2 · answered by Chintan 2 · 2006-11-01 01:59:46
Database normalization is a process by which an existing schema is modified to bring its component tables into compliance with a series of progressive normal forms. The goal of database normalization is to ensure that every non-key column in every table is directly dependent on the key, the whole key, and nothing but the key; with this goal come benefits in the form of reduced redundancy, fewer anomalies, and improved efficiency. While normalization is not the be-all and end-all of good design, a normalized schema provides a good starting point for further development.
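To put that rule in concrete terms, here is a small sketch of a column that depends on something other than the key, and the split that fixes it. The names are hypothetical and the snippet uses Python's built-in sqlite3 module:

# Sketch of "nothing but the key": dept_name depends on dept_id, a non-key
# column, rather than on the employee key, so it is split into its own table.
# All names are hypothetical, for illustration only.
import sqlite3

con = sqlite3.connect(":memory:")

# Violates 3NF: dept_name is determined by dept_id, not by emp_id.
con.execute("""CREATE TABLE employee_flat (
    emp_id INTEGER PRIMARY KEY, emp_name TEXT,
    dept_id INTEGER, dept_name TEXT)""")

# 3NF: every non-key column in each table depends only on that table's key.
con.execute("CREATE TABLE department (dept_id INTEGER PRIMARY KEY, dept_name TEXT)")
con.execute("""CREATE TABLE employee (
    emp_id INTEGER PRIMARY KEY, emp_name TEXT,
    dept_id INTEGER REFERENCES department(dept_id))""")

con.execute("INSERT INTO department VALUES (1, 'Sales')")
con.executemany("INSERT INTO employee VALUES (?, ?, ?)",
                [(100, 'Ann', 1), (101, 'Bob', 1)])

# Renaming the department is now a single-row update, with no risk of the
# same department carrying two different names in different rows.
con.execute("UPDATE department SET dept_name = 'Field Sales' WHERE dept_id = 1")
print(con.execute("SELECT e.emp_name, d.dept_name FROM employee e "
                  "JOIN department d ON d.dept_id = e.dept_id").fetchall())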
answer #3 · answered by Anonymous · 2016-05-23 01:14:02