
Wednesday, March 28, 2012

Link to a record in CRM from SSRS report

Hi All, I have developed a simple report in SSRS, and I would like to link from the report to the corresponding record in CRM.

Example:

The report looks like this:

Id   Name
---  -------
001  ABS Co.

If I click on ABS Co., it should take me to the record details of ABS Co. in CRM.

Please help

Hi,

You have to add the AccountId to your report dataset. Using the AccountId field, you can build the following link for the field's action property (Jump to URL):

="http://CRMSERVER/sfa/accts/edit.aspx?id={" & Fields!AccountId.Value.ToString() & "}"

Best regards

Monday, March 12, 2012

limiting the number of record sets returned by a stored procedure

Hi,
I know that a stored procedure will return as many result sets as there are
SELECT statements in it. I want to mark which SELECTs should be returned as
result sets and which are just internal. Is there a way of doing this?
Thanks,
George.

George,
There is no way to do this in T-SQL. What exactly are you trying to do?
if you are doing this kind of operation:
select col1, col2, ..., coln
from table1
where col1 like 'microsoft%'
if @@rowcount > 0
...
you can use:
if exists(select * from table1 where col1 like 'microsoft%')
...
AMB
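A minimal, self-contained illustration of the difference, with a hypothetical table name; the EXISTS form stops at the first matching row and, unlike the bare SELECT, sends no result set back to the client:

-- streams every matching row to the client just to test for existence
select col1, col2 from dbo.table1 where col1 like 'microsoft%'
if @@rowcount > 0
    print 'found'

-- stops at the first match and produces no client-visible result set
if exists (select * from dbo.table1 where col1 like 'microsoft%')
    print 'found'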
"George Tihenea" wrote:

> Hi,
> I know that a stored procedure will return as many results sets as
> SELECT statements are in. I want to actually mark which selects should be
> returned as results sets and which are just internal. Is there a way of
> doing this?
> Thanks,
> George.
>
>|||Thanks,
Here are some more details. I have a stored procedure like this:
-- start of stored proc, then
....
SELECT c1, c2, c3 from ...
WHERE (condition here)
if ( @returned_rows > 0 ) return 0 /* all ok, return */
/* let the flow continue */
select c1, c2, c3 from...
where (a different condition here)
if ( @returned_rows > 0 ) return 0 /* all ok, return */
/* let the flow continue */
......................
return 0 /* did not find anything */
-- end of stored procedure
All this worked OK and I can get the result set from my OLEDB middle tier
using multiple result sets. Of course I will always get ONLY ONE result set,
but OLEDB needs to use the multiple-result-sets template to work...
Then I had to modify the stored procedure to do an INSERT before finishing.
Here is a skeleton of the code:
-- start of stored proc, then
....
SELECT c1, c2, c3 from ...
WHERE (condition here)
if ( @returned_rows > 0 ) GOTO FINISH /* all ok, jump to the end */
/* let the flow continue */
select c1, c2, c3 from...
where (a different condition here)
if ( @returned_rows > 0 ) GOTO FINISH /* all ok, jump to the end */
/* let the flow continue */
......................
FINISH:
INSERT INTO ....
VALUES (...)
return 0
-- end of stored procedure
The problem is this INSERT. For some reason I cannot understand, my OLEDB
consumer templates think there are 3 result sets instead of 2, and when I
try to read the last one it crashes while binding the columns, which makes
sense because this last result set is bogus.
Thanks,
George.
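(A likely explanation, offered as a guess: the INSERT's rows-affected message surfaces to the OLEDB consumer as an extra, rowset-less result. A minimal sketch of the usual fix, with placeholder table and column names, is to add SET NOCOUNT ON at the top of the procedure so only the real SELECTs produce result sets; @@ROWCOUNT still works with NOCOUNT on.)

-- sketch only: names are hypothetical, conditions are placeholders
CREATE PROCEDURE dbo.FindSomething
AS
BEGIN
    SET NOCOUNT ON  -- suppress 'n rows affected' messages from the INSERT

    SELECT c1, c2, c3 FROM dbo.SourceA WHERE c1 = 'first condition'
    IF @@ROWCOUNT > 0 GOTO FINISH

    SELECT c1, c2, c3 FROM dbo.SourceB WHERE c1 = 'second condition'
    IF @@ROWCOUNT > 0 GOTO FINISH

FINISH:
    INSERT INTO dbo.SearchLog (SearchedAt) VALUES (GETDATE())
    RETURN 0
END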
"Alejandro Mesa" <AlejandroMesa@.discussions.microsoft.com> wrote in message
news:F7B5BE46-FCA1-485B-902C-9F8D37629C78@.microsoft.com...
> George,
> There is noway to do this in t-sql. Why are you doing exactly?
> if you are doing this kind of operation:
> select col1, col2, ..., coln
> from table1
> where col1 like 'microsoft%'
> if @@rowcount > 0
> ...
> you can use:
> if exists(select * from table1 where col1 like 'microsoft%')
> ...
>
> AMB
>
> "George Tihenea" wrote:
>|||If you need these "other" SELECT statements for debugging, what I do is
add a debug variable to each stored procedure:
declare @Debug int
and then in the T-SQL code I test it:
if @Debug = 1
Select * from Scheduler
Of course you can only run these from Query Analyzer interactively, but
typically that is where you draw the line for debugging SPs.
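A minimal sketch of that pattern as a full procedure, with hypothetical object names; the diagnostic SELECT only comes back as a result set when @Debug is explicitly set to 1:

-- sketch only: table, column and procedure names are made up
CREATE PROCEDURE dbo.ProcessSchedule
    @Debug int = 0        -- pass 1 to get the internal diagnostic result set
AS
BEGIN
    SET NOCOUNT ON

    UPDATE dbo.Scheduler
    SET LastRun = GETDATE()
    WHERE IsActive = 1

    IF @Debug = 1
        SELECT * FROM dbo.Scheduler   -- internal SELECT, surfaced on demand
END

-- normal callers: EXEC dbo.ProcessSchedule
-- debugging in Query Analyzer: EXEC dbo.ProcessSchedule @Debug = 1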
On Thu, 17 Mar 2005 08:53:01 -0500, "George Tihenea"
<tihenea@comcast.net> wrote:

> Hi,
> I know that a stored procedure will return as many results sets as
>SELECT statements are in. I want to actually mark which selects should be
>returned as results sets and which are just internal. Is there a way of
>doing this?
> Thanks,
> George.
>

Wednesday, March 7, 2012

Limitations of SQL Server 2000 Personal Edition

Can someone tell me what the record limit or db size limit is for PE of SQL
Server 2000? I'm trying to import 33.6 million records, and I keep getting
an error message that says the data contains an extra column at 454K records.
I've tried the HELP, but I only see the terabyte limits. Is PE less than
that? The db is set to automatically grow.
TIA
Mark
Your error message isn't related to capacity. The system is telling you
that there is an extra column in a record. SQL requires a fixed number of
columns in an import source file. Most likely you are using a comma- or
tab-delimited file. Inside that file there is a character string that has an
extra delimiter character, so SQL interprets that as an extra column. This
is fairly common on a data import from a non-scrubbed source.
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP
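One way to hunt down the bad row, sketched with a hypothetical file path and staging table (this is an outline, not a tested script): load each raw line into a single varchar column using a field terminator that never occurs in the data, then look for lines whose tab count doesn't match the expected number of columns.

-- sketch only: path, table name and expected column count are assumptions
CREATE TABLE dbo.RawLines (Line varchar(8000))

BULK INSERT dbo.RawLines
FROM 'C:\data\import.txt'
WITH (FIELDTERMINATOR = '<<none>>')   -- whole line lands in one column

-- a file with 10 columns should have 9 tabs per line
SELECT Line
FROM dbo.RawLines
WHERE DATALENGTH(Line) - DATALENGTH(REPLACE(Line, CHAR(9), '')) <> 9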
"MChrist" <MChrist@.discussions.microsoft.com> wrote in message
news:41821F63-5257-411B-9C8F-19E0CC3E6FF2@.microsoft.com...
> Can someone tell me what the record limit is or db size is for PE of SQL
> Server 2000. I'm trying to import 33.6 million records, and I keep
> getting
> an error message that says the data contains an extra column at 454K
> records.
> I've tried the HELP, but I only see the Terrabyte limits. Is PE less than
> than. The db is set to automatically grow.
> TIA
> Mark|||Hello Geoff,
Thanks for taking the time to answer my post. I realize the message is
probably correct, although I suspect it's an error because I loaded similar
files to the server at my office, and now when I'm trying to load them on my
PC at home, I'm running into this problem.
It could be a problem in the tab-delimited file as you say, but I've tried 2
of the 3 text files, and I get the same error message at record 454,157.
Since these text files aren't the exact same files as I loaded at work, it's
possible that there is a glitch on my PC creating slightly different versions
of the files than my work PC, but it's not likely to generate the error at the
same point within the file.
Do you know, though, if there are limitations on the size of the db in the PE
version?
TIA
Mark
"Geoff N. Hiten" wrote:

> Your error message isn't related to capacity. The system is telling you
> that there is an extra column in a record. SQL requires a fixed number of
> columns inan import source file. Most likely you are using a comma or tab
> delimited file. Inside that file there is a character string that has an
> extra delimiter character so SQL interprets that as an extra column. This
> is fairly common on a data import from a non-scrubbed source.
> --
> Geoff N. Hiten
> Senior Database Administrator
> Microsoft SQL Server MVP
> "MChrist" <MChrist@.discussions.microsoft.com> wrote in message
> news:41821F63-5257-411B-9C8F-19E0CC3E6FF2@.microsoft.com...
>
>|||There are memory and processor usage limitations, but no database size
limitations in PE. Look up "Maximum Capacity Limitations" in BOL. Also,
pre-expand your SQL data files to hold the entire import. Auto-grow can
sometimes cause timeout issues.
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP
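A minimal sketch of pre-sizing the data file before the import, with hypothetical database and logical file names (check sp_helpfile for the real logical name); growing it once up front avoids repeated auto-grow pauses during the load:

-- sketch only: database and logical file names are assumptions
ALTER DATABASE ImportDB
MODIFY FILE (NAME = ImportDB_Data, SIZE = 20000MB)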
"MChrist" <MChrist@.discussions.microsoft.com> wrote in message
news:6FA717DB-6D9D-4480-ADE8-4B64C9DB5DC6@.microsoft.com...[vbcol=seagreen]
> Hello Geoff,
> Thanks for taking the time to answer my post. I realized the message it
> probably correct, although I suspect it's an error because I loaded
> similar
> files to the server at my office, and now when I'm trying to load them on
> my
> PC at home, I'm running into this problem.
> It could be a problem in the tab delimited file as you say, but I've tried
> 2
> of the 3 text files, and I get the same error message at record 454,157.
> Since these text files aren't the exact same files as I loaded at work,
> it's
> possible that there is a cliche in my PC creating slightly different
> versions
> of the files than my work PC, but not likely to generate the error at the
> same point within the file.
> Do you know though if there are limitations on the size of the db on the
> PE
> version?
> TIA
> Mark
> "Geoff N. Hiten" wrote:
>

Monday, February 20, 2012

limit record

hi everybody,
Besides the TOP statement, what is the best statement to use to limit the data displayed?
Thanks...
|||Where

Having

Look them up in BOL
|||You can also use SET ROWCOUNT, which can also be found in Books Online.
|||Thanks for the replies, guys. One more question: can I use SET ROWCOUNT in my VB app without using a stored procedure?

Thanks
|||Sometimes you can, sometimes you can't. The SET ROWCOUNT command is a SQL Server specific command that affects the current connection. If your application uses connection pooling, it is unsafe to use SET ROWCOUNT outside of a stored procedure.

-PatP
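A minimal sketch of the SET ROWCOUNT pattern, with a hypothetical table name; the key point, given the connection-pooling caveat above, is to reset it to 0 afterwards because the setting stays with the connection:

SET ROWCOUNT 10                 -- cap the following statements at 10 rows
SELECT OrderID, OrderDate
FROM dbo.Orders                 -- hypothetical table
ORDER BY OrderDate DESC
SET ROWCOUNT 0                  -- reset, or later queries on this (possibly
                                -- pooled) connection stay capped at 10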

Limit query results

Hello, I'm using MSSql and ASP.
I need to create an ASP page with record pagination.
I'd like to know which technique gives the best performance.
The ideal way, I think, would be something like the LIMIT clause in MySQL.
In MSSql there is the TOP clause to get the first n records, but it's not
the same as LIMIT.
Is there any other way?
I'm also searching for an ASP pagination class, but I can't find one;
do you guys have one, or can you point me to any website?
Thanks
Check out this site for a number of approaches...
http://www.aspfaq.com/show.asp?id=2120
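A rough sketch of the nested-TOP paging technique that was common before LIMIT-style syntax existed in SQL Server, with hypothetical table and column names; it pulls "page 3" with a page size of 10 by taking the first 30 rows, reversing the sort to keep only the last 10, then restoring the order:

-- sketch only: dbo.Orders and OrderID are made-up names
SELECT *
FROM (
    SELECT TOP 10 *
    FROM (
        SELECT TOP 30 OrderID, OrderDate   -- page_number * page_size rows
        FROM dbo.Orders
        ORDER BY OrderID ASC
    ) AS FirstPages
    ORDER BY OrderID DESC                  -- reverse so the wanted page is on top
) AS LastPage
ORDER BY OrderID ASC                       -- put the page back in original order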
"Deniz" <dzoddi@.mvmnet.com> wrote in message
news:eXAOsi6QGHA.4608@.tk2msftngp13.phx.gbl...
> Hello, I'm using MSSql and ASP
> I need to create an ASP page with record pagination
> I'd like to know which is the best technique to get the best performance
> The ideal way I think should be something like the LIMIT clause for Mysql
> In MSSql there is the TOP clause to get the first top n records but it's
not
> the same as LIMIT
> Is there any other way?
> I'm searching for an asp pagination class also, but I can't find it,
> do you guys have one? or can you point me to any website?
> Thanks
>
>