Differences Between Varchar and Nvarchar
Varchar vs Nvarchar
Varchar is short for Variable Character Field: a column data type, found in database management systems, that holds character data of indeterminate (variable) length. The maximum field size of a Varchar column depends on the database being considered.
In Oracle 9i, the field has a maximum limit of 4,000 bytes. MySQL limits the total size of a row to 65,535 bytes, and Microsoft SQL Server 2005 caps the field at 8,000 bytes. The figure can go higher in SQL Server when Varchar(max) is used, rising to 2 gigabytes. Nvarchar, on the other hand, is a column type that stores Unicode data, using a two-byte (UCS-2) encoding for each character. The maximum size for Varchar is 8,000 bytes, while the maximum size for Nvarchar is 4,000 characters, which likewise occupy 8,000 bytes. Exceeding these column limits is a serious issue: because rows cannot span multiple data pages (SQL Server 2005 relaxes this with row-overflow storage), the limits must be adhered to or errors or truncation will result.
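As a sketch of the SQL Server limits just described, the hypothetical table below (names are illustrative only) declares columns at each maximum:

```sql
-- Illustrative table showing the SQL Server size limits discussed above.
CREATE TABLE dbo.LimitDemo (
    ShortText VARCHAR(8000),   -- non-Unicode, up to 8,000 bytes
    WideText  NVARCHAR(4000),  -- Unicode, up to 4,000 characters (8,000 bytes)
    LargeText VARCHAR(MAX)     -- rises to roughly 2 GB when 8,000 bytes is not enough
);
```

Declaring `VARCHAR(8001)` or `NVARCHAR(4001)` is rejected outright; only the `MAX` specifier goes beyond these limits.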
One of the main differences between Varchar and Nvarchar is that Varchar uses less space. Nvarchar employs Unicode, which requires two bytes of storage for every character, so the same data looks larger when compared with the non-Unicode data that Varchar uses: Varchar requires only one byte for each character stored. More importantly, although Unicode takes up more space, it solves the problems arising from codepage incompatibilities, which are a pain to solve manually.
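The one-byte-versus-two-bytes difference can be observed directly in SQL Server with the built-in `DATALENGTH` function, which reports storage size in bytes:

```sql
-- The same five-character word occupies 5 bytes as VARCHAR
-- but 10 bytes as NVARCHAR (two bytes per character).
SELECT
    DATALENGTH(CAST('hello'  AS VARCHAR(50)))  AS VarcharBytes,   -- 5
    DATALENGTH(CAST(N'hello' AS NVARCHAR(50))) AS NvarcharBytes;  -- 10
```

The `N` prefix on the literal marks it as Unicode; without it, the string is interpreted in the database's default codepage.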
The extra space can thus be accepted in exchange for the time Unicode saves by avoiding these incompatibilities. Disks and memory have also become quite affordable, so the space overhead can often be overlooked, while the longer time it takes to solve codepage problems with Varchar cannot be dismissed so easily.
Development platforms and modern operating systems use Unicode internally, which is why Nvarchar is now employed more often than Varchar: encoding conversions are avoided, reducing the time it takes to read from and write to the database. This also significantly reduces errors, and recovering from the conversion errors that do occur becomes a simple matter.
The benefit of using Unicode also extends to applications with ASCII interfaces, as the database responds well, particularly the operating system and the database collation algorithms. Unicode data avoids conversion-related problems, and the data can always be validated, even if restricted to 7-bit ASCII, irrespective of the legacy system that must be maintained.
Varchar and Nvarchar come with differing character types. Varchar makes use of non-Unicode data while Nvarchar makes use of Unicode data.
Both Varchar and Nvarchar have storage requirements that must be adhered to. Varchar saves each character in a single byte, while Nvarchar saves two bytes for each character.
The maximum length also varies: Varchar is limited to 8,000 bytes, while Nvarchar is limited to 4,000 characters (8,000 bytes).
This follows directly from the storage sizes above: at one byte per character, Varchar fits twice as many characters in the same space as the two-byte Unicode data used by Nvarchar.
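The practical consequence of the character-type difference is worth seeing once. In this sketch (assuming SQL Server with a Latin1 default collation), a character outside the codepage survives in Nvarchar but is lost in Varchar:

```sql
-- Storing a non-Latin character: VARCHAR substitutes '?' for characters
-- outside the column's codepage, while NVARCHAR keeps them intact.
SELECT
    CAST(N'Σ' AS VARCHAR(10))  AS LossyVarchar,  -- '?'
    CAST(N'Σ' AS NVARCHAR(10)) AS SafeNvarchar;  -- 'Σ'
```

This silent substitution is exactly the kind of codepage problem that choosing Nvarchar avoids.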