Wednesday, March 21, 2012
Line/Paragraph Breaks in SQL Server/XML output
I was trying out a stored proc that outputs a huge XML file. It works fine
but my problem is the linebreaks it comes out with. What I did to test it
was to output it as text using Query Analyzer and then saved it as an XML
file with the headers and root elements. It kept on giving me errors. I
found out that it was because of paragraph breaks that split tags and such
thus causing the errors on the XML.
What I did to remedy the problem was to copy and paste to word then Searched
and Replaced for paragraph breaks and removed them. When I saved this to an
XML file, it worked fine. I tried doing a REPLACE on the resulting
textstream from an ASP page but it was giving me errors.
So my question is, how do I remove those linebreaks from the XML output so
that my ASP page is correctly rendered? Is there a setting I missed?
Thanks a bunch guys!
Jeeves
How exactly are you outputting the file from the Stored Proc? The basic
problem is that SQL Server 2000 doesn't have a native XML type, so it treats
the results of a FOR XML query as a string (this changes in SQL Server
2005). A better approach might be to use the SQLOLEDB or SQLXMLOLEDB
provider to retrieve the XML as a stream or DOMDocument in your ASP script.
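For reference, here is a minimal sketch of the kind of procedure being discussed (the table and procedure names are invented for illustration); in SQL Server 2000 the FOR XML result comes back to the client as plain character data rather than as a native XML value, which is why the tool you retrieve it with matters:

-- Hypothetical FOR XML procedure. SQL Server 2000 returns this as a
-- stream of character chunks, not as a single XML document, so a tool
-- that renders it as ordinary rows (like Query Analyzer) will wrap and
-- split it.
CREATE PROCEDURE dbo.usp_GetOrdersXml
AS
SET NOCOUNT ON
SELECT OrderID, CustomerID, OrderDate
FROM dbo.Orders
FOR XML AUTO, ELEMENTS
GO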
--
Graeme Malcolm
Principal Technologist
Content Master Ltd.
www.contentmaster.com
"Jeeves De Veyra" <me@.jeevester.com> wrote in message
news:e5GUZR6rEHA.2016@.TK2MSFTNGP12.phx.gbl...
Hi again,
I was trying out a stored proc that outputs a huge XML file. It works fine
but my problem is the linebreaks it comes out with. What I did to test it
was to output it as text using Query Analyzer and then saved it as an XML
file with the headers and root elements. It kept on giving me errors. I
found out that it was because of paragraph breaks that split tags and such
thus causing the errors on the XML.
What I did to remedy the problem was to copy and paste to word then Searched
and Replaced for paragraph breaks and removed them. When I saved this to an
XML file, it worked fine. I tried doing a REPLACE on the resulting
textstream from an ASP page but it was giving me errors.
So my question is, how do I remove those linebreaks from the XML output so
that my ASP page is correctly rendered? Is there a setting I missed?
Thanks a bunch guys!
Jeeves
Jeeves mentioned that the result was saved through the Query Analyzer.
The QA in SQL Server 2000 should not be used for that since it does not
support the XML stream. In SQL Server 2000, the XML stream is built on top
of the rowset streaming of TDS by sending a 4kB block as a row. Query
Analyzer just shows those rows as it normally would, which is why tags end
up split across line breaks in the saved output.
The best way to save your data from an ASP script is to use the stream
interface of the OLEDB provider, as Graeme points out. If you want to store it
manually, you can either write a VB Script or you can use the SQLXML ISAPI
component and then save it through your browser.
Best regards
Michael
"Graeme Malcolm" <graemem_cm@.hotmail.com> wrote in message
news:%237OwhoEsEHA.376@.TK2MSFTNGP14.phx.gbl...
> How exactly are you outputing the file from the Stored Proc? The basic
> problem is that SQL Server 2000 doesn't have a native XML type, so it
> treats
> the results of a FOR XML query as a string (this changes in SQL Server
> 2005). A better approach might be to use the SQLOLEDB or SQLXMLOLEDB
> provider to retrieve the XML as a stream or DOMDocument in your ASP
> script.
> --
> --
> Graeme Malcolm
> Principal Technologist
> Content Master Ltd.
> www.contentmaster.com
>
> "Jeeves De Veyra" <me@.jeevester.com> wrote in message
> news:e5GUZR6rEHA.2016@.TK2MSFTNGP12.phx.gbl...
> Hi again,
> I was trying out a stored proc that outputs a huge XML file. It works fine
> but my problem is the linebreaks it comes out with. What I did to test it
> was to output it as text using Query Analyzer and then saved it as an XML
> file with the headers and root elements. It kept on giving me errors. I
> found out that it was because of paragraph breaks that split tags and such
> thus causing the errors on the XML.
> What I did to remedy the problem was to copy and paste to word then
> Searched
> and Replaced for paragraph breaks and removed them. When I saved this to
> an
> XML file, it worked fine. I tried doing a REPLACE on the resulting
> textstream from an ASP page but it was giving me errors.
> So my question is, how do I remove those linebreaks from the XML output so
> that my ASP page is correctly rendered? Is there a setting I missed?
> Thanks a bunch guys!
> Jeeves
>
>
Monday, March 19, 2012
line feeds
I'm passing an nvarchar(4000) parameter into a stored proc that uses
xp_sendmail. Is there any way I can include any sort of line feed or CR in
the varchar? Currently, if I pass in 'line 1 line 2' I get
line 1 line 2
but what I want to get is
line 1
line 2
Hi Peter,
You can use SQL Server's CHAR function to insert LF or CR (CHAR(10) or
CHAR(13)) into the text to display it correctly:
-- Set QA to "Results in text" (<ctrl> + T)
SELECT 'line1' + CHAR(10) + 'line2'
HTH
Ami
"Peter Newman" <PeterNewman@.discussions.microsoft.com> wrote in message
news:B63EB308-13E7-411F-BE68-6EF58CDD0816@.microsoft.com...
> Im parming a Nvarchar(4000) into a stored proc that uses xp_sendmail . is
> there any way i can include any sort of line feed or cr in the varchar
> currently if i parm in 'line 1 line 2' i get
> line 1 line 2
> but what i want to get is
> line 1
> line 2
To add to Ami's response, you can include both carriage return and line feed
as your line terminator. This is the Windows convention.
SELECT 'line1' + CHAR(13) + CHAR(10) + 'line2'
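If the string is going straight into xp_sendmail, a minimal sketch along these lines should work (the recipient address is just a placeholder, and SQL Mail is assumed to be already configured):

-- Build the message body with CR+LF separators before passing it on.
DECLARE @msg VARCHAR(4000)
SET @msg = 'line 1' + CHAR(13) + CHAR(10) + 'line 2'

EXEC master.dbo.xp_sendmail
    @recipients = 'someone@example.com',
    @message    = @msg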
Hope this helps.
Dan Guzman
SQL Server MVP
"Ami Levin" <XXX_NOSPAM___XXX__amlevin@.mercury.com> wrote in message
news:%23y4VSqFCFHA.2568@.TK2MSFTNGP10.phx.gbl...
> Hi Peter,
> You can use SQL Server's CHAR function to insert LF or CR (CHAR(10) or
> CHAR(13)) into the text to display it correctly:
> -- Set QA to "Results in text" (<ctrl> + T)
> SELECT 'line1' + CHAR(10) + 'line2'
> HTH
> Ami
> "Peter Newman" <PeterNewman@.discussions.microsoft.com> wrote in message
> news:B63EB308-13E7-411F-BE68-6EF58CDD0816@.microsoft.com...
Line Break in T-SQL
In a Stored Proc, I am building a string variable. I am getting outputs
from 4 different queries and would like the string to have line breaks
to display each entry in a different line in a text area. How can I do
this?
i.e
result = result1 + result2 + result3 + result4.
What characters can I enter so that the output is displayed in the
textarea as
result1
result2
result3
result4
Thanks,
SELECT 'asdfasd' + CHAR(13) + 'ASDF ASF ASDF ASD'
maybe it's CHR not CHAR
The line terminator on a Windows platform is carriage return/line feed
(ASCII 13 and 10). You can concatenate CHAR(13) + CHAR(10) where you want
line breaks.
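Applied to the four results in the question, something like this should do it (the variable names are placeholders, since the original procedure wasn't posted):

DECLARE @result1 VARCHAR(100), @result2 VARCHAR(100),
        @result3 VARCHAR(100), @result4 VARCHAR(100),
        @result  VARCHAR(500)

-- In the real proc these would come from the four queries.
SELECT @result1 = 'result1', @result2 = 'result2',
       @result3 = 'result3', @result4 = 'result4'

-- CR+LF between each entry puts them on separate lines in the textarea.
SET @result = @result1 + CHAR(13) + CHAR(10)
            + @result2 + CHAR(13) + CHAR(10)
            + @result3 + CHAR(13) + CHAR(10)
            + @result4

SELECT @result AS Result   -- use "Results in Text" to see the breaks in QA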
--
Hope this helps.
Dan Guzman
SQL Server MVP
<nashak@.hotmail.com> wrote in message
news:1158285904.128023.10670@.e3g2000cwe.googlegroups.com...
Quote:
Originally Posted by
Hello,
>
In a Stored Proc, I am building a string variable. I am getting outputs
from 4 different queries and would like the string to have line breaks
to display each entry in a different line in a text area. How can I do
this?
>
i.e
result = result1 + result2 + result3 + result4.
What characters can I enter so that the output is displayed in the
textarea as
result1
result2
result3
result4
>
Thanks,
>
Quote:
Originally Posted by
>Hello,
>
>In a Stored Proc, I am building a string variable. I am getting outputs
>from 4 different queries and would like the string to have line breaks
>to display each entry in a different line in a text area. How can I do
>this?
Hi Nashak,
In addition to the answers already given by Alexander and Dan, here's
another option - just use a newline character inside a string constant.
For example:
SELECT 'First line
Second line.'
This will show the output on two lines (make sure to select output to
text, not output to grid - the grid doesn't handle newlines too well).
If the data comes from columns, try
SELECT Column1 + '
' + Column2
FROM YourTable
WHERE ...
--
Hugo Kornelis, SQL Server MVP
Limits to length of Stored Proc
In SQL Server 2000, I've got a rather lengthy stored procedure, which
creates a lot of temporary tables as it processes down through a few
sets of data.
When testing it through Query Analyzer, it runs fine (a bit slow
though). But when I try to run it through the ade, it doesn't do
anything. It runs through the procedure in milliseconds but doesn't
seem to ever actually start it. If I change the calling code in the
ade VBA to refer to a different SP, it will call/run the different SP,
so I don't think its the way I call it.
Is there a limit to the number of lines a stored procedure can have,
or some other limit on memory or transactions?
I doubt the proc size is the issue. Have you included SET NOCOUNT ON at
the beginning of the proc?
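For example, a minimal sketch of where it goes (the procedure and table names here are made up):

CREATE PROCEDURE dbo.usp_ProcessBatch
AS
-- Suppress the "(n row(s) affected)" message that every INSERT/UPDATE/DELETE
-- would otherwise send back; the extra result traffic can confuse ADO/Access
-- clients calling the proc.
SET NOCOUNT ON

CREATE TABLE #work (pk INT PRIMARY KEY)
INSERT INTO #work (pk) SELECT 1
-- ... rest of the processing ...
GO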
--
Hope this helps.
Dan Guzman
SQL Server MVP
--------
SQL FAQ links (courtesy Neil Pike):
http://www.ntfaq.com/Articles/Index...epartmentID=800
http://www.sqlserverfaq.com
http://www.mssqlserver.com/faq
--------
"C Kirby" <ckirby@.mindspring.com> wrote in message
news:psdqlvklfnjtml6v4jr8ambl2c3fns0lk0@.4ax.com...
> In SQL Server 2000, I've got a rather lengthy stored procedure, which
> creates a lot of temporary tables as it processes down through a few
> sets of data.
> When testing it through Query Analyzer, it runs fine (a bit slow
> though). But when I try to run it through the ade, it doesn't do
> anything. It runs through the procedure in milliseconds but doesn't
> seem to ever actually start it. If I change the calling code in the
> ade VBA to refer to a different SP, it will call/run the different SP,
> so I don't think its the way I call it.
> Is there a limit to the number of lines a stored procedure can have,
> or some other limit on memory or transactions?
Hi
You would not be able to compile the Stored procedure if you have hit a size
limit, although it could be the query cost that is somehow behaving
differently and therefore hitting that limit (see sp_configure/ query
governor cost limit in Books online).
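If you want to check that setting, a quick sketch; it is an advanced option, so it has to be made visible first:

-- Show the current query governor cost limit (0 = no limit enforced).
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'query governor cost limit'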
If you can run this through QA, it seems most likely that you are not passing
the parameters correctly in your code, so try adding some debug
statements! If the procedure is as long as you say, it is probably worth
considering modularising it and splitting it into sub-procedures; you may also
be able to re-write the code to be more efficient. This may also help stop
recompilations.
John
"C Kirby" <ckirby@.mindspring.com> wrote in message
news:psdqlvklfnjtml6v4jr8ambl2c3fns0lk0@.4ax.com...
> In SQL Server 2000, I've got a rather lengthy stored procedure, which
> creates a lot of temporary tables as it processes down through a few
> sets of data.
> When testing it through Query Analyzer, it runs fine (a bit slow
> though). But when I try to run it through the ade, it doesn't do
> anything. It runs through the procedure in milliseconds but doesn't
> seem to ever actually start it. If I change the calling code in the
> ade VBA to refer to a different SP, it will call/run the different SP,
> so I don't think its the way I call it.
> Is there a limit to the number of lines a stored procedure can have,
> or some other limit on memory or transactions?
Thanks for the help.. I do have NOCOUNT set to on, so I don't think
that is the problem.
I've tried to set up the debugging by using IF @@ERROR <> 0,
Rollback and return X.
I can't seem to get the front end to actually look at the return value
to tell if the sp ran successfully though, so something isn't quite
right with that.
As far as the parameters go, this sp doesn't use any, so I'm not
sending it any. Could this be why the front end isn't reading the
return parameter?
One question on splitting out the different functions into separate
stored procs. The very first thing the sp does is to create a temp
table holding the records to be manipulated. Right now it is named
#temp (or something like that). In order to reference this table from
another sp, should I use the double # ('##temp')?
On Tue, 9 Sep 2003 09:02:09 +0100, "John Bell"
<jbellnewsposts@.hotmail.com> wrote:
>Hi
>You would not be able to compile the Stored procedure if you have hit a size
>limit, although it could be the query cost that is somehow behaving
>differently and therefore hitting that limit (see sp_configure/ query
>governor cost limit in Books online).
>If you can run this through QA it seems most likely that you are not passing
>the parameters incorrectly in your code, so try adding some debug
>statements! If the procedure is as long as you say, it is probably worth
>consider modularising it and splitting it into sub-procedures; you may also
>be able to re-write the code to be more efficient. This may also help stop
>recompilations.
>John
>
>"C Kirby" <ckirby@.mindspring.com> wrote in message
>news:psdqlvklfnjtml6v4jr8ambl2c3fns0lk0@.4ax.com...
>> In SQL Server 2000, I've got a rather lengthy stored procedure, which
>> creates a lot of temporary tables as it processes down through a few
>> sets of data.
>> When testing it through Query Analyzer, it runs fine (a bit slow
>> though). But when I try to run it through the ade, it doesn't do
>> anything. It runs through the procedure in milliseconds but doesn't
>> seem to ever actually start it. If I change the calling code in the
>> ade VBA to refer to a different SP, it will call/run the different SP,
>> so I don't think its the way I call it.
>> Is there a limit to the number of lines a stored procedure can have,
> >> or some other limit on memory or transactions?
> I've tried to setup the debugging by using the IF @@ERROR <> 0,
> Rollback and return X.
> I can't seem to get the front end to actually look at the return value
> to tell if the sp ran successfully though, so something isn't quite
> right with that.
> As far as the parameters go, this sp doesn't use any, so I'm not
> sending it any. Could this be why the front end isn't reading the
> return parameter?
The return value is essentially an output parameter. Does your
procedure return resultsets? If so, you may need to consume those
before the return value is available.
> One question on splitting out the different functions into separate
> stored procs. The very first thing the sp does is to create a temp
> table holding the records to be manipulated. Right now it is named
> #temp (or something like that). In order to reference this table from
> another sp, should a use the double # ('##temp')?
The local temp table (#temp) is visible to the nested procs so you don't
need to resort to a global (##temp) table. An issue with global temp
tables is that you'll need to uniquely name them to handle concurrency.
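A minimal sketch of that behaviour (the procedure names are invented): the child proc compiles even though #temp doesn't exist yet, and at run time it sees the caller's table:

CREATE PROCEDURE dbo.usp_Child
AS
    -- Resolved at execution time against the caller's #temp.
    UPDATE #temp SET processed = 1
GO

CREATE PROCEDURE dbo.usp_Parent
AS
    SET NOCOUNT ON
    CREATE TABLE #temp (pk INT PRIMARY KEY, processed BIT NOT NULL DEFAULT 0)
    INSERT INTO #temp (pk, processed) VALUES (1, 0)

    EXEC dbo.usp_Child                -- the nested proc sees #temp

    SELECT pk, processed FROM #temp   -- processed = 1
GO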
--
Hope this helps.
Dan Guzman
SQL Server MVP
"C Kirby" <ckirby@.mindspring.com> wrote in message
news:4m7ulvo2b8fbn7d230e30es5pl299c8to3@.4ax.com...
> Thanks for the help.. I do have NOCOUNT set to on, so I don't think
> that is the problem.
> I've tried to setup the debugging by using the IF @@ERROR <> 0,
> Rollback and return X.
> I can't seem to get the front end to actually look at the return value
> to tell if the sp ran successfully though, so something isn't quite
> right with that.
> As far as the parameters go, this sp doesn't use any, so I'm not
> sending it any. Could this be why the front end isn't reading the
> return parameter?
> One question on splitting out the different functions into separate
> stored procs. The very first thing the sp does is to create a temp
> table holding the records to be manipulated. Right now it is named
> #temp (or something like that). In order to reference this table from
> another sp, should a use the double # ('##temp')?
> On Tue, 9 Sep 2003 09:02:09 +0100, "John Bell"
> <jbellnewsposts@.hotmail.com> wrote:
> >Hi
> >You would not be able to compile the Stored procedure if you have hit a size
> >limit, although it could be the query cost that is somehow behaving
> >differently and therefore hitting that limit (see sp_configure/ query
> >governor cost limit in Books online).
> >If you can run this through QA it seems most likely that you are not passing
> >the parameters incorrectly in your code, so try adding some debug
> >statements! If the procedure is as long as you say, it is probably worth
> >consider modularising it and splitting it into sub-procedures; you may also
> >be able to re-write the code to be more efficient. This may also help stop
> >recompilations.
> >John
> >"C Kirby" <ckirby@.mindspring.com> wrote in message
> >news:psdqlvklfnjtml6v4jr8ambl2c3fns0lk0@.4ax.com...
> >> In SQL Server 2000, I've got a rather lengthy stored procedure, which
> >> creates a lot of temporary tables as it processes down through a few
> >> sets of data.
> >> When testing it through Query Analyzer, it runs fine (a bit slow
> >> though). But when I try to run it through the ade, it doesn't do
> >> anything. It runs through the procedure in milliseconds but doesn't
> >> seem to ever actually start it. If I change the calling code in the
> >> ade VBA to refer to a different SP, it will call/run the different SP,
> >> so I don't think its the way I call it.
> >> Is there a limit to the number of lines a stored procedure can have,
> >> or some other limit on memory or transactions?
Dan Guzman (danguzman@.nospam-earthlink.net) writes:
> The return value is essentially an output parameter. Does your
> procedure return resultsets? If so, you may need to consume those
> before the return value is available.
To add to what Dan says here, it depends on whether you are using
client-side or server-side cursor. With client-side cursors you can
access the return value directly.
But we are a bit in the dark here, as we have not seen any of your
code, neither the ADO code, nor the SQL code.
--
Erland Sommarskog, SQL Server MVP, sommar@.algonet.se
Books Online for SQL Server SP3 at
http://www.microsoft.com/sql/techin.../2000/books.asp
You guys have *no* idea how much I appreciate the help!!
Here's the code that I am using to call the sp from the Access ade
'call stored procedure
DoCmd.Hourglass True
Set com = New ADODB.Command
With com
    .ActiveConnection = getadoconnectstring("SM")
    .CommandText = "qryBSTransPost"
    .CommandType = adCmdStoredProc
    .CommandTimeout = 0
    .Execute , , adAsyncExecute
    If .Parameters(0) <> 0 Then
        'operation failed
        MsgBox "Fail"
    Else
        DoCmd.Hourglass False
        MsgBox "Transaction Import Complete", vbOKOnly, _
            "Brokerage Transactions Imported"
    End If
End With
Set com = Nothing
This is all going into a pre-existing software package that has the
getadoconnectionstring function. Since the simple stored procs work
with this call, I'm going to say that the connection is ok.
Due to the length of the sp, I won't post it, but would be glad to let
anyone see it (even though it is *ugly*!!). It does not return any
records. It basically looks at a table, picks the records that meet a
certain criteria, creates a temp table to hold the Primary keys to
those records, then runs through a series of data manipulations using
more temp tables, then adds those created records into a couple of
different tables, then changes some values in the original records so
that they no longer meet the initial criteria.
This is the final section of the sp:
IF @@ERROR <> 0
BEGIN
ROLLBACK TRANSACTION
RETURN 11
END
COMMIT TRANSACTION
GO
Regardless of what actually happens with the sp when I call it from
the ade, the .Parameters(0) value never triggers the 'fail' option..
On Wed, 10 Sep 2003 22:26:36 +0000 (UTC), Erland Sommarskog
<sommar@.algonet.se> wrote:
>Dan Guzman (danguzman@.nospam-earthlink.net) writes:
>> The return value is essentially an output parameter. Does your
>> procedure return resultsets? If so, you may need to consume those
>> before the return value is available.
>To add to what Dan says here, it depends on whether you are using
>client-side or server-side cursor. With client-side cursors you can
>access the return value directly.
>But we are a bit in the dark here, as we have not seen any of your
>code, neither the ADO code, nor the SQL code.
Is there some reason you are using the adAsyncExecute option here? If
not, you might try removing the option from your Execute method.
It looks to me like your code isn't written to handle asynchronous proc
execution. The code is checking the return value even though the proc
may still be executing. The code probably works with your other procs
simply because they complete before you check the result.
Also, note @@ERROR is changed after every SQL statement so you need to
check it after each statement and perform error processing then. For
example:
BEGIN TRAN
INSERT INTO MyTable VALUES(1)
IF @@ERROR <> 0 GOTO ErrorHandler
INSERT INTO MyTable VALUES(2)
IF @@ERROR <> 0 GOTO ErrorHandler
COMMIT
RETURN 0
ErrorHandler:
IF @@TRANCOUNT > 0 ROLLBACK
RETURN 11
--
Hope this helps.
Dan Guzman
SQL Server MVP
"C Kirby" <ckirby@.mindspring.com> wrote in message
news:d2bvlvgj5c2hqv1mm6fqohdsh641pmifp3@.4ax.com...
> You guys have *no* idea how much I appreciate the help!!
> Here's the code that I am using to call the sp from the Access ade
> 'call stored procedure
> docmd.hourglass true
> Set com = New ADODB.Command
> With com
> .ActiveConnection = getadoconnectstring("SM")
> .CommandText = "qryBSTransPost"
> .CommandType = adCmdStoredProc
> .CommandTimeout = 0
> .Execute , , adAsyncExecute
> If .Parameters(0) <> 0 Then
> 'operation failed
> MsgBox "Fail"
> Else
> DoCmd.Hourglass False
> MsgBox "Transaction Import Complete", vbOKOnly, "Brokerage
> Transactions Imported"
> End If
> End With
> Set com = Nothing
> This is all going into a pre-exisiting software package that has the
> getadoconnectionstring function. Since the simple stored procs work
> with this call, I'm going to say that the connection is ok.
> Due to the length of the sp, I won't post it, but would be glad to let
> anyone see it (even though it is *ugly*!!). It does not return any
> records. It basically looks at a table, picks the records that meet a
> certain criteria, creates a temp table to hold the Primary keys to
> those records, then runs through a series of data manipulations using
> more temp tables, then adds those created records into a couple of
> different tables, then changes some values in the original records so
> that they no longer meet the initial criteria.
> This is the final section of the sp:
> IF @@ERROR <> 0
> BEGIN
> ROLLBACK TRANSACTION
> RETURN 11
> END
> COMMIT TRANSACTION
> GO
>
> Regardless of what actually happens with the sp when I call it from
> the ade, the .parameter(0) value never triggers the 'fail' option..
>
> On Wed, 10 Sep 2003 22:26:36 +0000 (UTC), Erland Sommarskog
> <sommar@.algonet.se> wrote:
> >Dan Guzman (danguzman@.nospam-earthlink.net) writes:
> >> The return value is essentially an output parameter. Does your
> >> procedure return resultsets? If so, you may need to consume those
> >> before the return value is available.
> >To add to what Dan says here, it depends on whether you are using
> >client-side or server-side cursor. With client-side cursors you can
> >access the return value directly.
> >But we are a bit in the dark here, as we have not seen any of your
>code, neither the ADO code, nor the SQL code.
Looks like that was the problem, Dan! I removed the adAsync option
and the procedure is running!!!! Thanks for the help!!
I'm going to take a shot at splitting the sp into a few sub
procedures and then see if I can get the return value to work..
Thanks for all the help from everybody!!!!!!!
On Thu, 11 Sep 2003 03:02:13 GMT, "Dan Guzman"
<danguzman@.nospam-earthlink.net> wrote:
>Is there some reason you are using the adAsyncExecute option here? If
>not, you might try removing the option from your Execute method.
>It looks to me like your code isn't written to handle asynchronous proc
>execution. The code is checking the return value even though the proc
>may still be executing. The code probably works with your other procs
>simply because they complete before you check the result.
>Also, note @@ERROR is changed after every SQL statement so you need to
>check it after each statement and perform error processing then. For
>example:
>BEGIN TRAN
>INSERT INTO MyTable VALUES(1)
>IF @@ERROR <> 0 GOTO ErrorHandler
>INSERT INTO MyTable VALUES(2)
>IF @@ERROR <> 0 GOTO ErrorHandler
>COMMIT
>RETURN 0
>ErrorHandler:
>IF @@TRANCOUNT > 0 ROLLBACK
>RETURN 11
Wednesday, March 7, 2012
Limitations of SQL Server 2000?
Hello, we are running SQL 2000 Enterprise on a quad 2.8Ghz proc server
running 2003 Enterprise. Plenty of ram and hard drive space. I've also
just got access to a 8 CPU server which we've just begun using as well.
Now I'm hearing word from my management that we will be moving towards
Oracle because of a limitation with SQL Server 2000. All because an
individual site's SQL Server cannot deal with more than 25 million rows.
I personally have tables with over 150 million records, but I haven't
tried to dump 25 million into
I have no idea where this comes from, and I doubt there is any
supporting evidence. I'm not opposed to Oracle, just wondering why we
are trying to fix what isn't broken.
Can anybody point me towards what this limitation is, or describe a
similar scenario?
Check out "Maximum Capacity Specifications" in the BOL. Your management is
wrong. SQL Server has handled databases over 10.5 TB in size. Most
problems I've seen with SQL Server (since 1993) have been
application-related.
Tom
Thomas A. Moreau, BSc, PhD, MCSE, MCDBA
SQL Server MVP
Columnist, SQL Server Professional
Toronto, ON Canada
www.pinpub.com
..
<MICHAEL_SUNLIN@.COUNTRYWIDE.COM> wrote in message
news:1127156075.664445.58460@.o13g2000cwo.googlegroups.com...
Hello, we are running SQL 2000 Enterprise on a quad 2.8Ghz proc server
running 2003 Enterprise. Plenty of ram and hard drive space. I've also
just got access to a 8 CPU server which we've just begun using as well.
Now I'm hearing word from my management that we will be moving towards
Oracle because of a limitation with SQL Server 2000. All because an
individual sites SQL server cannot deal with more than 25 million rows.
I personally have tables with over 150 million records, but I haven't
tried to dump 25 million into
I have no idea where this comes from, and I doubt there is any
supporting evidence. I'm not opposed to Oracle, just wondering why we
are trying to fix what isn't broken.
Can anybody point me towards what this limitation, or describe a
similar scenario?
On 19 Sep 2005 11:54:35 -0700, MICHAEL_SUNLIN@.COUNTRYWIDE.COM wrote:
>Can anybody point me towards what this limitation, or describe a
>similar scenario?
No, but half the developers in Los Angeles have worked at Countrywide
at some time in the last ten years, and are aware of this Oracle
project. It was begun when the mortgage boom was at its peak about
three years ago and all their current SQLServer systems hit capacity.
In a fit of pique, panic, and good salesmanship from Oracle, a new
project was born, has gone through about three names at last count,
and is now 300% over budget and schedule. Need one say more? Just
that, with new hardware and some extended tuning, they're still
happily running on SQLServer.
J.
In the same vein, I usually tell people that if they have perf problems with
SQL Server, they can migrate to Oracle and if they have perf problems with
Oracle, they can migrate to SQL Server. How so? Because they'll have to
clean up their code in order to do the migration. It isn't the migration
that pays the dividend, it's the code/design cleanup that does. It's much
cheaper than the migration.
Tom
Thomas A. Moreau, BSc, PhD, MCSE, MCDBA
SQL Server MVP
Columnist, SQL Server Professional
Toronto, ON Canada
www.pinpub.com
..
"JXStern" <JXSternChangeX2R@.gte.net> wrote in message
news:brvui11h9g0fbtpctk0540mlicbc5jg3io@.4ax.com...
On 19 Sep 2005 11:54:35 -0700, MICHAEL_SUNLIN@.COUNTRYWIDE.COM wrote:
>Can anybody point me towards what this limitation, or describe a
>similar scenario?
No, but half the developers in Los Angeles have worked at Countrywide
at some time in the last ten years, and are aware of this Oracle
project. It was begun when the mortgage boom was at its peak about
three years ago and all their current SQLServer systems hit capacity.
In a fit of pique, panic, and good salesmanship from Oracle, a new
project was born, has gone through about three names at last count,
and is now 300% over budget and schedule. Need one say more? Just
that, with new hardware and some extended tuning, they're still
happily running on SQLServer.
J.
On Tue, 20 Sep 2005 07:38:45 -0400, "Tom Moreau"
<tom@.dont.spam.me.cips.ca> wrote:
>In the same vein, I usually tell people that if they have perf problems with
>SQL Server, they can migrate to Oracle and if they have perf problems with
>Oracle, they can migrate to SQL Server. How so? Because they'll have to
>clean up their code in order to do the migration. It isn't the migration
>that pays the dividend, it's the code/design cleanup that does. It's much
>cheaper than the migration.
Prezactly.
In this case, there was (and I assume still is) also an extremely
ambitious plan to merge and rationalize a bunch of related databases
into a single company-wide schema, or meta-schema, or ontology, or
phylogeny, or taxonomy, or whatever it is everyone has always thought
they were doing on such projects. It's like Captain Queeg proving the
mess boys took the strawberries, I think, an obsession that takes hold
and distracts from any real progress.
Anybody ever see one of these projects succeed? I haven't, but I
suspect that some, maybe 10%, actually get deployed, at least.
Whether *any* show a positive ROI, I really wonder.
J.
My guess is that an Oracle bigot made it into upper management and simply
decreed that it must be so. Forget about justifying it. Eventually, when
they realize they spent a ton of money and got nothing back, they'll can the
jerk and look at improving their code. Yeah, and maybe I'll win the
lottery. ;-)
Tom
Thomas A. Moreau, BSc, PhD, MCSE, MCDBA
SQL Server MVP
Columnist, SQL Server Professional
Toronto, ON Canada
www.pinpub.com
..
"jxstern" <jxstern@.nowhere.xyz> wrote in message
news:6km0j1hpfndba26oj7adrgaene8f6c688b@.4ax.com...
On Tue, 20 Sep 2005 07:38:45 -0400, "Tom Moreau"
<tom@.dont.spam.me.cips.ca> wrote:
>In the same vein, I usually tell people that if they have perf problems
>with
>SQL Server, they can migrate to Oracle and if they have perf problems with
>Oracle, they can migrate to SQL Server. How so? Because they'll have to
>clean up their code in order to do the migration. It isn't the migration
>that pays the dividend, it's the code/design cleanup that does. It's much
>cheaper than the migration.
Prezactly.
In this case, there was (and I assume still is) also an extremely
ambitious plan to merge and rationalize a bunch of related databases
into a single company-wide schema, or meta-schema, or ontology, or
phylogeny, or taxonomy, or whatever it is everyone has always thought
they were doing on such projects. It's like Captain Queeg proving the
mess boys took the strawberries, I think, an obsession that takes hold
and distracts from any real progress.
Anybody ever see one of these project succeed? I haven't, but I
suspect that some, maybe 10%, actually get deployed, at least.
Whether *any* show a positive ROI, I really wonder.
J.