This is slightly OT, but....
While testing why my aspx page isn't returning the data as expected, I decided to type directly into one of my SQL tables (the column definitions are below).
When I type data into any of the columns I get this error:
"String or binary data would be truncated"
I've tried clearing out all the columns, but no luck. Even if I just change one of the ints to a different number I get the same error.
I believe all the values are under the limits for the column types - unless I've got it wrong with ntext? The table accepts new records from the page without any errors; it's just when I type directly into the columns.
[ID] [int] IDENTITY(1,1) NOT NULL,
[AddedDate] [datetime] NOT NULL,
[AddedBy] [nvarchar](256) NOT NULL,
[IssueNumber] [int] NOT NULL,
[Subject] [nvarchar](256) NOT NULL,
[Abstract] [nvarchar](4000) NULL,
[HtmlBody] [ntext] NOT NULL,
[PlainTextBody] [ntext] NOT NULL,
[StatusID] [int] NOT NULL,
[LastSent] [datetime] NOT NULL,
[RecordClicks] [bit] NOT NULL,
[Listed] [bit] NOT NULL,
[Approved] [bit] NOT NULL,
[OnlyForMembers] [bit] NOT NULL,
[ViewCount] [int],
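For reference, the error the poster describes can be reproduced with a minimal sketch like the following (the table and value here are hypothetical, purely to show the message):

```sql
-- Hypothetical minimal repro: the inserted string is longer than the column allows.
CREATE TABLE #t ([Subject] nvarchar(4) NOT NULL);
INSERT INTO #t VALUES (N'too long');
-- Fails with: "String or binary data would be truncated."
DROP TABLE #t;
```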
"JJ" <ab*@xyz.com> wrote in message
news:eS**************@TK2MSFTNGP02.phx.gbl...
I seem to recall a limitation when retrieving large fields through ODBC. I don't know if it applies to your situation. The solution was to retrieve the large columns in chunks. This link will get you some information:
http://www.google.com/search?hl=en&q=blob+getchunk+site%3Amicrosoft.com
Hi.
Thanks for that - however, as I am typing directly into the tables, I don't think ODBC comes into play.
JJ
"MyndPhlyp" <no****@homeright.now> wrote in message
news:%2****************@TK2MSFTNGP03.phx.gbl...
On May 31, 2:03 pm, "JJ" <a...@xyz.com> wrote:
The maximum number of bytes per row is 8060. When you add a row whose length exceeds 8060 bytes, you get an error. Looking at your table, I see that Abstract is [nvarchar](4000), which means this one column can use 4000 x 2 = 8000 bytes - almost 100% of the row capacity.
Reduce the size of your fields and it will help. Otherwise, add a child table and move the large fields there.
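A minimal sketch of the child-table suggestion (the table names here are assumptions - the thread never states the main table's name - and the foreign key back to the main table's [ID] column is implied):

```sql
-- Hypothetical sketch: move the wide Abstract column into a child table
-- so it no longer counts against the main table's in-row byte budget.
CREATE TABLE [IssueAbstract] (
    [IssueID]  [int] NOT NULL PRIMARY KEY,  -- FK to the main table's [ID]
    [Abstract] [nvarchar](4000) NULL
);
```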
On May 31, 2:27 pm, Alexey Smirnov <alexey.smir...@gmail.com> wrote:
Changing nvarchar to varchar can help too. The N prefix means Unicode support, which needs 2 bytes per character. That is why the size in bytes is doubled.
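The doubling is easy to see with DATALENGTH, which returns the storage size in bytes (LEN returns the character count):

```sql
-- nvarchar stores 2 bytes per character; varchar stores 1.
SELECT LEN(N'hello')        AS Chars,        -- 5
       DATALENGTH(N'hello') AS UnicodeBytes, -- 10
       DATALENGTH('hello')  AS AnsiBytes;    -- 5
```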
If I made the Abstract column 'ntext', would that solve my problem?
Thanks,
JJ
"Alexey Smirnov" <al************@gmail.com> wrote in message
news:11**********************@q69g2000hsb.googlegroups.com...
When I changed the 'HtmlBody' column to nvarchar(MAX), the errors went away.
It seems it's the ntext column (when there's lots of text in it) that's causing the error.
Will nvarchar(MAX) be OK for lots of text? How does it compare to ntext?
JJ
"JJ" <ab*@xyz.com> wrote in message
news:uf**************@TK2MSFTNGP04.phx.gbl...
Interestingly, in the depths of the SQL help docs, I noticed this:
Important:
ntext, text, and image data types will be removed in a future version of
Microsoft SQL Server. Avoid using these data types in new development work,
and plan to modify applications that currently use them. Use nvarchar(max),
varchar(max), and varbinary(max) instead. For more information, see Using
Large-Value Data Types.
So nvarchar(MAX) should cope with a large amount of HTML text, as ntext did... shouldn't it?
JJ
"JJ" <ab*@xyz.com> wrote in message
news:ur**************@TK2MSFTNGP02.phx.gbl...
On May 31, 3:00 pm, "JJ" <a...@xyz.com> wrote:
Ah, you're on 2005....
nvarchar(MAX) is the replacement for the old ntext, and you should use the new type in SQL Server 2005. It's simply handier - the ntext data type cannot be used as a local variable in a procedure, and you cannot concatenate it or take a substring of it. With nvarchar(MAX) you don't have these problems.
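A minimal sketch of the migration and of what it enables (the table name is an assumption, since the thread never states it; try this on a copy of the data first):

```sql
-- Hypothetical table name; adjust to your schema.
ALTER TABLE [Newsletter] ALTER COLUMN [HtmlBody]      nvarchar(MAX) NOT NULL;
ALTER TABLE [Newsletter] ALTER COLUMN [PlainTextBody] nvarchar(MAX) NOT NULL;

-- Operations ntext disallowed but nvarchar(MAX) supports:
DECLARE @body nvarchar(MAX);                     -- local variable
SET @body = N'<p>Hello</p>' + N'<p>World</p>';   -- concatenation
SELECT SUBSTRING(@body, 1, 12);                  -- substring
```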