
Friday, March 23, 2012

Link problem in my script file

I have a script file where I need to change the link (the join) because the view is leaving out some records. The problem description follows the script.

The script is:

CREATE view v_Flight_Check(Flight_No, Doc_No, Flight_Doc_No, Leg_no, Actual_Departing_Date, Actual_Arriving_Date,
Flight_Legs, Actual_Hours, Actual_Over_Load, Actual_Landings, Actual_Air_Time, Flight_Type_A, Flight_Type_B,
Aircraft_Id, Doc_Status,Aircraft_Doc_No,LogBookPage, Actual_From_Airport,Actual_To_Airport, Actual_Flight_Legs, FLActual_Hours, FLActual_Over_Load,
Start_ECU_Airframe_Hrs, End_Ecu_Airframe_Hrs, End_Ecu_Ac_hrs,
Crew,Position, From_Leg_Doc_No, To_Leg_Doc_No,Last_Name, First_Name, Tail_No,HSN,Serial_No,RECORD_TYPE) AS

SELECT DISTINCT Flight_Log.Flight_No,Flight_Log_Leg.Doc_no, Flight_Log_Leg.Flight_Doc_No, Flight_Log_Leg.Leg_no, Flight_Log_Leg.Actual_Departing_Date, Flight_Log_Leg.Actual_Arriving_Date,
Flight_Log_Leg.Flight_Legs, Flight_Log_Leg.Actual_Hours, Flight_Log_Leg.Actual_Over_Load, Flight_Log_Leg.Actual_Landings, Flight_Log_Leg.Actual_Air_Time, Flight_Log_Leg.Flight_Type_A, Flight_Log_Leg.Flight_Type_B,
Flight_Log.Aircraft_Id, Flight_log.Doc_Status, Flight_Log.Aircraft_Doc_No,Flight_Log.LogBookPage, Flight_Log.Actual_From_Airport,Flight_Log.Actual_To_Airport, Flight_Log.Actual_Flight_Legs, Flight_Log.Actual_Hours, Flight_Log.Actual_Over_Load,
Flight_Log.Start_ecu_Airframe_hrs, Flight_Log.End_ECU_Airframe_hrs,Flight_Log.End_ecu_ac_hrs,
Flight_Log_Crew.User_code,Flight_Log_Crew.Position, Flight_Log_Crew.From_Leg_Doc_No, Flight_Log_Crew.To_Leg_Doc_No, Userlist.L_name,UserList.F_Name,Aircraft_hdr.tail_no,Aircraft_hdr.hsn, Aircraft_Hdr.Serial_No,'FL'

FROM GDB_01_4.dbo.FLIGHT_LOG,
GDB_01_4.dbo.Aircraft_HDR,
GDB_01_4.dbo.Flight_Log_leg Right Outer Join
GDB_01_4.dbo.FLIGHT_LOG_CREW ON GDB_01_4.dbo.FLIGHT_LOG_CREW.FLIGHT_LOG_DOC_NO = GDB_01_4.dbo.FLIGHT_LOG_LEG.FLIGHT_DOC_NO,
GDB_01_4.dbo.Userlist

Where Flight_Log.Doc_No = Flight_Log_Leg.Flight_Doc_No and
Flight_Log_Leg.Doc_No >= Flight_Log_Crew.From_Leg_Doc_No and Flight_Log_Leg.Doc_No <= Flight_Log_Crew.To_Leg_Doc_No and
Flight_Log_Crew.User_Code = Userlist.ccode and Flight_Log.Aircraft_Doc_No = Aircraft_Hdr.Doc_No

The problem is that if there was no crew listed on the flight leg, the entire flight and leg are left off. I need all of the records in the Flight Leg table, even if no crew was listed.

Does anyone know how I can fix this? My script writing skills are minimal.

Thanks

SBoyd

You need to move some of the conditions in the WHERE clause to the OUTER JOIN ON clause. This should provide the results you are looking for. It is however hard to tell which conditions without looking at some data and the relationship between the tables in detail.
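For example, here is a minimal sketch of the FROM clause rewritten with explicit joins, keeping the original tables and keys; the crew conditions move into a LEFT OUTER JOIN so legs without crew survive, and USERLIST is outer-joined as well so a missing crew row does not filter the leg back out:

SELECT DISTINCT ... -- the same column list as in the view above
FROM GDB_01_4.dbo.FLIGHT_LOG
INNER JOIN GDB_01_4.dbo.FLIGHT_LOG_LEG
    ON FLIGHT_LOG_LEG.Flight_Doc_No = FLIGHT_LOG.Doc_No
INNER JOIN GDB_01_4.dbo.AIRCRAFT_HDR
    ON AIRCRAFT_HDR.Doc_No = FLIGHT_LOG.Aircraft_Doc_No
LEFT OUTER JOIN GDB_01_4.dbo.FLIGHT_LOG_CREW
    ON FLIGHT_LOG_CREW.Flight_Log_Doc_No = FLIGHT_LOG_LEG.Flight_Doc_No
    AND FLIGHT_LOG_LEG.Doc_No BETWEEN FLIGHT_LOG_CREW.From_Leg_Doc_No AND FLIGHT_LOG_CREW.To_Leg_Doc_No
LEFT OUTER JOIN GDB_01_4.dbo.USERLIST
    ON USERLIST.ccode = FLIGHT_LOG_CREW.User_Code

Note that with the outer joins, the crew and name columns will come back NULL for legs that have no crew.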


Wednesday, March 21, 2012

Lines

I have a report where I need to add a line every 5 records (don't ask!), and I
can't see any way to do this simply using the line control.
Am I missing something, or do I have to look at doing this programmatically?

Found the answer.
I added a border to the bottom of the detail row in a table control and then
set the border's visibility to an expression so that it was only visible every
5 rows.
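For reference, an expression along these lines should do it; this is a sketch that assumes the bottom BorderStyle property of the detail row's textboxes and uses RowNumber(Nothing) to count detail rows:

=IIf(RowNumber(Nothing) Mod 5 = 0, "Solid", "None")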

Monday, March 12, 2012

Limits in SQL 2000

I have a database with 10,000 records and I want to just see records
1000 - 1025. Is there an easy way to do this with a query? I have found
info for MySQL using LIMIT, but it does not work with SQL Server 2000.
Please help.

Thanks in advance.

Robert P (rphilipson@.legaleze.com) writes:

Quote:

Originally Posted by

I have a database with 10,000 records and I want to just see records
1000 - 1025. Is there an easy way to do this with a query? I have found
info for MySQL using LIMIT, but it does not work with SQL Server 2000.


Assuming that you want to do paging for display on a web page,
see SQL Server MVP Aaron Bertrand's article on the topic at
http://www.aspfaq.com/show.asp?id=2120.

--
Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx
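For completeness, the usual SQL Server 2000 workaround is a pair of nested TOP queries; a sketch, assuming a hypothetical table MyTable with a unique key column id (rows 1000 - 1025 inclusive are 26 rows):

SELECT *
FROM (SELECT TOP 26 *
      FROM (SELECT TOP 1025 * FROM MyTable ORDER BY id) AS a
      ORDER BY id DESC) AS b
ORDER BY id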

Limits in Reporting Services

I read a post from one of the MVPs that stated "If you mean that the result
set has 90,000 records or 180,000 records then you have the wrong product. RS
is not designed to generate reports that are 1500+ pages."
Is there documentation available about the limits of RS (i.e. max number of
records, max number of pages, etc.)? I realize that it depends on the amount
of memory available, but I'd like to get the limits for the 2 GB process
maximum (numbers for 3GB with the /3GB option would be nice also).
Additionally, how is this affected by multiple users? Is it a sum total of
all requests or are the requests queued?
I looked in the docs first for this info and wasn't able to find it.
Thanks for your help."brett" wrote:
> I read a post from one of the MVPs that stated "If you mean that the result
> set has 90,000 records or 180,000 records then you have the wrong product. RS
> is not designed to generate reports that are 1500+ pages."
> Is there documentation available about the limits of RS (i.e. max number of
> records, max number of pages, etc.)? I realize that it depends on the amount
> of memory available, but I'd like to get the limits for the 2 GB process
> maximum (numbers for 3GB with the /3GB option would be nice also).
> Additionally, how is this affected by multiple users? Is it a sum total of
> all requests or are the requests queued?
> I looked in the docs first for this info and wasn't able to find it.
> Thanks for your help.
Hi Brett,
Are you outputting to pdf files. I have recently experienced a similar
problem where the reports timeout and don't work if there are over 4000
pages. We tried throttling the memory allocated to SQL on the box as well as
changing memory limits in the config file within Reporting services but that
has not seemed to do the trick.

I am not aware of any document like that. PDF and Excel rendering take a lot
of processing power. Rendering to CSV is much much faster (I had a case
where Excel took 5 minutes and the same data rendered as CSV in about 15
seconds). That is a possible work around for you for large datasets.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"brett" <brett@.discussions.microsoft.com> wrote in message
news:5318C3D8-3051-45C4-AC1B-2D206C7AC2E8@.microsoft.com...
> I read a post from one of the MVPs that stated "If you mean that the result
> set has 90,000 records or 180,000 records then you have the wrong product. RS
> is not designed to generate reports that are 1500+ pages."
> Is there documentation available about the limits of RS (i.e. max number of
> records, max number of pages, etc.)? I realize that it depends on the amount
> of memory available, but I'd like to get the limits for the 2 GB process
> maximum (numbers for 3GB with the /3GB option would be nice also).
> Additionally, how is this affected by multiple users? Is it a sum total of
> all requests or are the requests queued?
> I looked in the docs first for this info and wasn't able to find it.
> Thanks for your help.

Rob,
Our application renders to the gui first, so that's where we are
encountering our problem. It's clearly a memory issue. I'm able to render
over 7K+ pages in the GUI, but then we get the OutOfMemory exception when we
go above that.
When I up the process memory limit to 3GB, I'm able to render more pages so
it's clearly a memory issue as the exception states.
I'm trying to get documentation so we can understand what the limits are so
we can tell our users what to expect. Telling them "don't run big reports"
isn't acceptable:)
"rob" wrote:
>
> "brett" wrote:
> > I read a post from one of the MVPs that stated "If you mean that the result
> > set has 90,000 records or 180,000 records then you have the wrong product. RS
> > is not designed to generate reports that are 1500+ pages."
> >
> > Is there documentation available about the limits of RS (i.e. max number of
> > records, max number of pages, etc.)? I realize that it depends on the amount
> > of memory available, but I'd like to get the limits for the 2 GB process
> > maximum (numbers for 3GB with the /3GB option would be nice also).
> >
> > Additionally, how is this affected by multiple users? Is it a sum total of
> > all requests or are the requests queued?
> >
> > I looked in the docs first for this info and wasn't able to find it.
> >
> > Thanks for your help.
>
> Hi Brett,
> Are you outputting to pdf files. I have recently experienced a similar
> problem where the reports timeout and don't work if there are over 4000
> pages. We tried throttling the memory allocated to SQL on the box as well as
> changing memory limits in the config file within Reporting services but that
> has not seemed to do the trick.

Thanks for the feedback Bruce.
What we are wrestling with is there is a max amount of data the RS can
render and it's tied to memory. When our customers ask how big of a machine
they need to buy, we need to be able to give them some guidance based on
their data sets. We also need to be able to tell them (or prevent them in
code) what size of reports they can run.
If it's not documented, we'll have to test different memory configurations
and datasets...not a great use of resources.
"Bruce L-C [MVP]" wrote:
> I am not aware of any document like that. PDF and Excel rendering take a lot
> of processing power. Rendering to CSV is much much faster (I had a case
> where Excel took 5 minutes and the same data rendered as CSV in about 15
> seconds). That is a possible work around for you for large datasets.
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
> "brett" <brett@.discussions.microsoft.com> wrote in message
> news:5318C3D8-3051-45C4-AC1B-2D206C7AC2E8@.microsoft.com...
> > I read a post from one of the MVPs that stated "If you mean that the result
> > set has 90,000 records or 180,000 records then you have the wrong product. RS
> > is not designed to generate reports that are 1500+ pages."
> >
> > Is there documentation available about the limits of RS (i.e. max number of
> > records, max number of pages, etc.)? I realize that it depends on the amount
> > of memory available, but I'd like to get the limits for the 2 GB process
> > maximum (numbers for 3GB with the /3GB option would be nice also).
> >
> > Additionally, how is this affected by multiple users? Is it a sum total of
> > all requests or are the requests queued?
> >
> > I looked in the docs first for this info and wasn't able to find it.
> >
> > Thanks for your help.

One potential for large Excel data is to render it in CSV and specify ANSI
format (rather than the default of Unicode which Report Manager uses). Excel
doesn't handle Unicode CSV. If rendered in ANSI CSV then Excel will
appropriately put it in the correct columns. Might be a good solution for
people that want a data dump to perform their own analysis on.
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"brett" <brett@.discussions.microsoft.com> wrote in message
news:290F0D39-1C39-436E-9597-75208873EEA3@.microsoft.com...
> Thanks for the feedback Bruce.
> What we are wrestling with is there is a max amount of data the RS can
> render and it's tied to memory. When our customers ask how big of a machine
> they need to buy, we need to be able to give them some guidance based on
> their data sets. We also need to be able to tell them (or prevent them in
> code) what size of reports they can run.
> If it's not documented, we'll have to test different memory configurations
> and datasets...not a great use of resources.
> "Bruce L-C [MVP]" wrote:
> > I am not aware of any document like that. PDF and Excel rendering take a lot
> > of processing power. Rendering to CSV is much much faster (I had a case
> > where Excel took 5 minutes and the same data rendered as CSV in about 15
> > seconds). That is a possible work around for you for large datasets.
> >
> > --
> > Bruce Loehle-Conger
> > MVP SQL Server Reporting Services
> >
> > "brett" <brett@.discussions.microsoft.com> wrote in message
> > news:5318C3D8-3051-45C4-AC1B-2D206C7AC2E8@.microsoft.com...
> > > I read a post from one of the MVPs that stated "If you mean that the result
> > > set has 90,000 records or 180,000 records then you have the wrong product. RS
> > > is not designed to generate reports that are 1500+ pages."
> > >
> > > Is there documentation available about the limits of RS (i.e. max number of
> > > records, max number of pages, etc.)? I realize that it depends on the amount
> > > of memory available, but I'd like to get the limits for the 2 GB process
> > > maximum (numbers for 3GB with the /3GB option would be nice also).
> > >
> > > Additionally, how is this affected by multiple users? Is it a sum total of
> > > all requests or are the requests queued?
> > >
> > > I looked in the docs first for this info and wasn't able to find it.
> > >
> > > Thanks for your help.

We've actually had certain reports bring a server down because it was
taking so much memory and CPU time. Interestingly, the PDF format of
the same report didn't take very long nor did it bring the machine down.
Bruce L-C [MVP] wrote:
> I am not aware of any document like that. PDF and Excel rendering take a lot
> of processing power. Rendering to CSV is much much faster (I had a case
> where Excel took 5 minutes and the same data rendered as CSV in about 15
> seconds). That is a possible work around for you for large datasets.
>

Our customer requirement is for the reports to be rendered in PDF format so
the CSV is not an option. There is also a logo graphic that is on each page
that adds to the problem as well.
"Bruce L-C [MVP]" wrote:
> One potential for large Excel data is to render it in CSV and specify ANSI
> format (rather than the default of Unicode which Report Manager uses). Excel
> doesn't handle Unicode CSV. If rendered in ANSI CSV then Excel will
> appropriately put it in the correct columns. Might be a good solution for
> people that want a data dump to perform their own analysis on.
>
> --
> Bruce Loehle-Conger
> MVP SQL Server Reporting Services
> "brett" <brett@.discussions.microsoft.com> wrote in message
> news:290F0D39-1C39-436E-9597-75208873EEA3@.microsoft.com...
> > Thanks for the feedback Bruce.
> >
> > What we are wrestling with is there is a max amount of data the RS can
> > render and it's tied to memory. When our customers ask how big of a machine
> > they need to buy, we need to be able to give them some guidance based on
> > their data sets. We also need to be able to tell them (or prevent them in
> > code) what size of reports they can run.
> >
> > If it's not documented, we'll have to test different memory configurations
> > and datasets...not a great use of resources.
> >
> > "Bruce L-C [MVP]" wrote:
> >
> > > I am not aware of any document like that. PDF and Excel rendering take a lot
> > > of processing power. Rendering to CSV is much much faster (I had a case
> > > where Excel took 5 minutes and the same data rendered as CSV in about 15
> > > seconds). That is a possible work around for you for large datasets.
> > >
> > > --
> > > Bruce Loehle-Conger
> > > MVP SQL Server Reporting Services
> > >
> > > "brett" <brett@.discussions.microsoft.com> wrote in message
> > > news:5318C3D8-3051-45C4-AC1B-2D206C7AC2E8@.microsoft.com...
> > > > I read a post from one of the MVPs that stated "If you mean that the result
> > > > set has 90,000 records or 180,000 records then you have the wrong product. RS
> > > > is not designed to generate reports that are 1500+ pages."
> > > >
> > > > Is there documentation available about the limits of RS (i.e. max number of
> > > > records, max number of pages, etc.)? I realize that it depends on the amount
> > > > of memory available, but I'd like to get the limits for the 2 GB process
> > > > maximum (numbers for 3GB with the /3GB option would be nice also).
> > > >
> > > > Additionally, how is this affected by multiple users? Is it a sum total of
> > > > all requests or are the requests queued?
> > > >
> > > > I looked in the docs first for this info and wasn't able to find it.
> > > >
> > > > Thanks for your help.

I agree with the MVP - a 4000+ page report is really just a fancy
data-feed.
I often have customers request such reports, and my typical response is
that Sql Reporting Services is not the right tool for the job. Just
because you "Can" do something doesnt mean that you should.
Also, assuming you do get this working, if you legitimately have a
4000+ page PDF report, then I have sympathy for the end-user who tries
to open it in Adobe Acrobat. I have seen Adobe Acrobat opening
sub-500 page PDF documents bring a high-end PC to its knees, so I am
fearful of what a 4000 page doc would be like.
There are plenty of alternatives...split the report into 40 PDF
documents @ 100 pages, extract the data to file, then use a 3rd party
PDF renderer to create the file, convince the users to accept an
ETL-generated CSV file, etc.
TIP: If you are the admin of a Sql Reporting Services instance, set a
report development standard for report developers to limit the number
of rows in a DataSet and the number of Pages rendered.
Typically my max threshold for data in a report DataSet is 5000 rows @
5 columns (or equivalent). Also, my guideline for report-length is to
restrict report output to < 100 pages.
I hope this helps.
Lance Hunt
http://weblogs.asp.net/lhunt/

"There are more reporting requirements in heaven and on earth, Lance,
than are dreamt of in your reporting architecture."

We are working on supporting larger report sizes but yes, if you want to do
a pure data dump, Reporting Services is not the best tool for the job.
--
Brian Welcker
Group Program Manager
Microsoft SQL Server
This posting is provided "AS IS" with no warranties, and confers no rights.
"Lance" <lancehunt@.gmail.com> wrote in message
news:1116022906.085455.213970@.f14g2000cwb.googlegroups.com...
>I agree with the MVP - a 4000+ page report is really just a fancy
> data-feed.
> I often have customers request such reports, and my typical response is
> that Sql Reporting Services is not the right tool for the job. Just
> because you "Can" do something doesnt mean that you should.
> Also, assuming you do get this working, if you legitimately have a
> 4000+ page PDF report, then I have sympathy for the end-user who tries
> to open it in Adobe Acrobat. I have seen Adobe Acrobat opening
> sub-500 page PDF documents bring a high-end PC to its knees, so I am
> fearful of what a 4000 page doc would be like.
> There are plenty of alternatives...split the report into 40 PDF
> documents @ 100 pages, extract the data to file, then use a 3rd party
> PDF renderer to create the file, convince the users to accept an
> ETL-generated CSV file, etc.
> TIP: If you are the admin of a Sql Reporting Services instance, set a
> report development standard for report developers to limit the number
> of rows in a DataSet and the number of Pages rendered.
> Typically my max threshold for data in a report DataSet is 5000 rows @
> 5 columns (or equivalent). Also, my guideline for report-length is to
> restrict report output to < 100 pages.
> I hope this helps.
> Lance Hunt
> http://weblogs.asp.net/lhunt/
>

What other third party PDF rendering tool would you suggest? How do I split
current PDF report into many pages thru Reporting Services? I got the same
report that generates more than 2500 pages and RS threw an 'OutOfMemory'
exception. Thanks.
-Chuck
"Lance" wrote:
> I agree with the MVP - a 4000+ page report is really just a fancy
> data-feed.
> I often have customers request such reports, and my typical response is
> that Sql Reporting Services is not the right tool for the job. Just
> because you "Can" do something doesnt mean that you should.
> Also, assuming you do get this working, if you legitimately have a
> 4000+ page PDF report, then I have sympathy for the end-user who tries
> to open it in Adobe Acrobat. I have seen Adobe Acrobat opening
> sub-500 page PDF documents bring a high-end PC to its knees, so I am
> fearful of what a 4000 page doc would be like.
> There are plenty of alternatives...split the report into 40 PDF
> documents @ 100 pages, extract the data to file, then use a 3rd party
> PDF renderer to create the file, convince the users to accept an
> ETL-generated CSV file, etc.
> TIP: If you are the admin of a Sql Reporting Services instance, set a
> report development standard for report developers to limit the number
> of rows in a DataSet and the number of Pages rendered.
> Typically my max threshold for data in a report DataSet is 5000 rows @
> 5 columns (or equivalent). Also, my guideline for report-length is to
> restrict report output to < 100 pages.
> I hope this helps.
> Lance Hunt
> http://weblogs.asp.net/lhunt/
>

Hi,
Since there's a conclusion here that reporting services is not the right
tool to work around for generating reports that have like more than 150 K
records, is there any solution to this? I mean even if we were to use excel,
excel files only support 65K plus records right?
Thanks
"brett" wrote:
> I read a post from one of the MVPs that stated "If you mean that the result
> set has 90,000 records or 180,000 records then you have the wrong product. RS
> is not designed to generate reports that are 1500+ pages."
> Is there documentation available about the limits of RS (i.e. max number of
> records, max number of pages, etc.)? I realize that it depends on the amount
> of memory available, but I'd like to get the limits for the 2 GB process
> maximum (numbers for 3GB with the /3GB option would be nice also).
> Additionally, how is this affected by multiple users? Is it a sum total of
> all requests or are the requests queued?
> I looked in the docs first for this info and wasn't able to find it.
> Thanks for your help.

There are at least a few different ways to do this, but I recommend
starting at the end-user and working backwards.
If this report ultimately only needs to be machine-readable, then
choose a data-format such as XML, CSV, MDB (access). Then, work
backwards to figure-out the best technology to deliver this file(s).
If your end-users require this to be a human-readable report, then what
format do they want it in? HTML, Excel, XML, PDF, plain-text, etc.
Also, its unlikely that any person would need to read the entire report
top-to-bottom in one session, so figure-out how they plan to use the
report and break it up accordingly. Possibly this report could be
broken-down into several hundred HTML pages with a table-of-contents to
enable navigation. You may be able to use something such as XML-FO to
help automate this, or a simple XSL and a VBS script to chunk the data.
Hope this helps...
Lance Hunt
http://www.lance-hunt.net/

It would be fascinating to learn the motivation behind this rather odd
requirement.

That was me. RS does everything in memory (RAM). This makes sense from a
performance viewpoint but when you get to large amounts of data then that
really makes a difference. The other issue is just efficiency of rendering.
Rendering to PDF and Excel is much slower than HTML or CSV. They have
emphasized fidelity rather than performance. If doing large amounts to Excel
then the following will be very very fast. You need to have a link to do
this because Report Manager defaults to unicode which Excel puts all in one
column. In RS 2005 you will be able to configure Report Manager to use ASCII
CSV export.
Here is an example of a Jump to URL link I use. This causes Excel to come up
with the data in a separate window:
="javascript:void(window.open('" & Globals!ReportServerUrl &
"?/SomeFolder/SomeReport&ParamName=" & Parameters!ParamName.Value &
"&rs:Format=CSV&rc:Encoding=ASCII','_blank'))"
Very nice and very fast.
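For reference, the URL that this expression assembles at run time looks something like the following; the server, folder, report, and parameter names here are placeholders:

http://myserver/ReportServer?/SomeFolder/SomeReport&ParamName=42&rs:Format=CSV&rc:Encoding=ASCII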
Bruce Loehle-Conger
MVP SQL Server Reporting Services
"et_ck" <etck@.discussions.microsoft.com> wrote in message
news:09627F24-3E35-4ABC-A353-DED4A8865DEF@.microsoft.com...
> Hi,
> Since there's a conclusion here that reporting services is not the right
> tool to work around for generating reports that have like more than 150 K
> records, is there any solution to this? I mean even if we were to use
> excel,
> excel files only support 65K plus records right?
> Thanks
> "brett" wrote:
>> I read a post from one of the MVPs that stated "If you mean that the result
>> set has 90,000 records or 180,000 records then you have the wrong product. RS
>> is not designed to generate reports that are 1500+ pages."
>> Is there documentation available about the limits of RS (i.e. max number of
>> records, max number of pages, etc.)? I realize that it depends on the amount
>> of memory available, but I'd like to get the limits for the 2 GB process
>> maximum (numbers for 3GB with the /3GB option would be nice also).
>> Additionally, how is this affected by multiple users? Is it a sum total of
>> all requests or are the requests queued?
>> I looked in the docs first for this info and wasn't able to find it.
>> Thanks for your help.

I'm with the masses here... what the hell would you do with a 4000
page printed report? It's pointless, no-one will read through it. You
need to find out what your customer wants to do with it.
Personally I'm a fan of exception reports where you actually build
logic into your report rather than just spewing data from a
database.
Likewise, with reporting services you could build a summary report
where the user can interactively drill to more detail which would be
a subset of your 4000 pages.

This is probably some braindead legal requirement, even the most
hideous of paper wasters would not ask for something like this.

Limiting user records

Hi,
What is the best way to limit the number of records a user can insert
into a table. Basically, I want to set something like 50 records. If
the user tries to insert the 11th record, I want the action to FAIL,
therefore firing some SQL Error. I want to be able to customize this
error too.
So, what is the best way? Is it possible to get an example please?
Tascien
Your specs are inconsistent. You say you want to limit it to 50 rows but it
should fail on the 11th row. Could you post your DDL, sample data and
expected results?
Tom
Thomas A. Moreau, BSc, PhD, MCSE, MCDBA
SQL Server MVP
Columnist, SQL Server Professional
Toronto, ON Canada
www.pinpub.com
<tascienu@.ecoaches.com> wrote in message
news:1136420757.396966.83240@.o13g2000cwo.googlegroups.com...
Hi,
What is the best way to limit the number of records a user can insert
into a table. Basically, I want to set something like 50 records. If
the user tries to insert the 11th record, I want the action to FAIL,
therefore firing some SQL Error. I want to be able to customize this
error too.
So, what is the best way? Is it possible to get an example please?
Tascien

tascienu@.ecoaches.com wrote:
> Hi,
> What is the best way to limit the number of records a user can insert
> into a table. Basically, I want to set something like 50 records. If
> the user tries to insert the 11th record, I want the action to FAIL,
> therefore firing some SQL Error. I want to be able to customize this
> error too.
> So, what is the best way? Is it possible to get an example please?
> Tascien
Do you mean total rows in a table or the number of rows executed in a
single insert statement? If you want total rows in a table then you'll
have to tag each row with the user name. You can use an instead of
insert trigger to first query the underlying table for the number of
rows owned by the user and add this to the number of rows in the
inserted table. If you want to limit the number in a single insert (as
in INSERT INTO... SELECT FROM) you can also use an instead of insert
trigger, but this time check COUNT(*) from inserted to make sure it's
not over the limit. Or maybe you want something else altogether...
David Gugick
Quest Software
www.quest.com
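To illustrate the per-user variant David describes, here is a minimal sketch; the table name user_data, its owner column, and the limit of 50 are hypothetical stand-ins for the poster's schema:

create table user_data (owner sysname not null, value varchar(100))
go
create trigger user_data_limit on user_data
instead of insert
as
begin
    -- reject the whole insert if any affected user would end up over 50 rows
    if exists (select i.owner
               from inserted i
               group by i.owner
               having count(*) + (select count(*) from user_data d
                                  where d.owner = i.owner) > 50)
    begin
        rollback
        raiserror ('Row limit of 50 per user exceeded', 16, 1)
    end
    else
        insert into user_data (owner, value)
        select owner, value from inserted
end
go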

Sorry, I was giving an example... Let's say I limit to 3 records...
1 First
2 Second
3 Third
4 Raise Error "You have reached the maximum allowed rows"
Tascien

create table test
(value char(100))
go
create trigger testrig
on test
instead of
insert
as
begin
if (select count(*) from test) >= 3
begin
rollback
raiserror ('Number of rows exceeds 3',16,1)
end
else
insert into test select value from inserted
end
go
insert into test values('first')
insert into test values('second')
insert into test values('third')
insert into test values('fourth')
go
select * from test

Wonderful. That is what I was looking for. By the way, adding a
Select Count(*) trigger for each insert... is it a performance issue
when the table gets big?
Tascien

tascienu@.ecoaches.com wrote:
> Wonderful. That is what i was looking for. By the way, adding a
> Select Count(*) trigger for each insert... is it a performance issue
> when the table gets big?
> Tascien
Yes. You might want to consider creating a clustered index on the user
column. But it is a concern you should account for before the table gets
too large.
David Gugick
Quest Software
www.quest.com

Yes, you need to create a clustered index on it.
One more thing that I found about my last post:
Suppose two users want to insert a record and the number of records is 2 in
the table when both of them check it; both of them will issue insert
commands, but only the first of the two should succeed. You have to put a
lock on the table,
so while you check you have to give the HOLDLOCK lock hint:
select count(*) from test (holdlock)
But then only one user at a time can insert into the table.
What is the reason for a table to limit its number of records to some
number of rows?
Instead, you can create a view to filter the number of rows.
Regards
Amish

The reason I am limiting is not a critical one. I just don't want some
of my customers to exceed a number of records I assign to them. If they
exceed by 1 or 2, it should be fine...
Once again, thanks for help.
Tascien

OK, in that case you don't need to put HOLDLOCK if you are ready to
accept 1 or 2 more records than decided. It will not create locks on
the table.
Regards
Amish


Limiting the Records in SELECT Statement

Hi..

As everyone knows, there's a keyword in MySQL that is used to limit the records in a SELECT statement:

SELECT * FROM Table LIMIT 40,10

which means: skip the first 40 records, then select the next 10 records.

In SQL Server it is a long way around with a nested select... can someone provide an alternative, or do we have to use the nested SELECT?

Thanks

Yes

You can specify

select top n * from tablename

Similarly, you can also specify TOP n PERCENT in a SELECT statement.

Check for the keyword TOP in the SQL Server Help.


Nested select (technically called a derived table) is the best way to go in SQL. Here are a few articles that cover the subject.

http://www.sqlmag.com/Article/ArticleID/43922/sql_server_43922.html

http://sqljunkies.com/WebLog/amachanic/archive/2004/11/03/4945.aspx

http://davidhayden.com/blog/dave/archive/2005/12/30/2652.aspx
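For example, a derived-table sketch that emulates LIMIT 40,10 (skip 40 rows, return the next 10), assuming a hypothetical table MyTable with a unique key column id:

SELECT *
FROM (SELECT TOP 10 *
      FROM (SELECT TOP 50 * FROM MyTable ORDER BY id) AS a
      ORDER BY id DESC) AS b
ORDER BY id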

Limiting Records Returned within Groups

Sorry to be a noob here, but I've scoured the rest of the threads, and can't find the answer to my dilemma.

I'm trying to write a report that returns the top 10 depositors (based on current balance) for each of our branches. I've broken out each branch into its own group.

When I apply the 'Top N' function, it limits the first group to 10 records, but pulls nothing for the remaining groups.

I've seen other posts that say I need to write a formula that counts the records in each group, and then in the detail section, suppress records greater than 10, but I have no idea how to create the 'count records' formula.

Again, sorry to be the fresh fish here, but I'm at my wit's end.

Thanks

First, create two formulas.

formula 1:

//this goes in the 'Group Header' section
shared numbervar mycount:=0;


formula 2:

//this goes in the 'Details' section
shared numbervar mycount;
mycount:=mycount+1;


Now right-click on the 'Details' section; it shows a popup menu.
Click on Format Section,
then click on the formula button next to Suppress, and there you have to write

shared numbervar mycount;
if mycount>10 then
true


I haven't tested this, but I think it may help you.

Hensa22, you are my savior. Thank you so much.

Limiting Records Returned

I'm building an Access front end (A2K2 ADP project) to view records on a SQL 2000 Server, and want to ensure the fastest performance for my users. I expect the database to grow to around 25,000 records (after that, they'll be archived), and would like opinions/comments on the best way to access the data. I would guess that a recordset would be fastest, but how do you set up a form to be based on a recordset instead of just hooking directly to the underlying table?
TIA,
Terry Roberts
Check the ADO and ADP references in the Microsoft Office
2000/Visual Basic Programmer's Guide:
http://msdn.microsoft.com/library/de...radobasics.asp
You can also find a list of Access resources here:
http://msdn.microsoft.com/library/en...acacclinks.asp
And... you really should consider getting the book Access
Developer's Guide to SQL Server by
Mary Chipman and Andy Baron.
-Sue
On Mon, 10 May 2004 09:11:08 -0700, "Terry Roberts"
<anonymous@.discussions.microsoft.com> wrote:

>I'm building an Access front end (A2K2 ADP project) to view records on a SQL 2000 Server, and want to ensure the fastest performance for my users. I expect the database to grow to around 25,000 records (after that, they'll be archived), and would like opinions/comments on the best way to access the data. I would guess that a recordset would be fastest, but how do you set up a form to be based on a recordset instead of just hooking directly to the underlying table?
>TIA,
>Terry Roberts


Limiting Matrix Columns

I am dealing with a report which must display records in columns. The number of columns is dynamic as is the number of fields (rows) which will be displayed.
For example, my data is structured as follows:
ID Name Price State Category etc...
1 Widget $100 AL Tools
2 Gadget $45 NY Accessories
3 Thingy $12.50 TX Tools
etc.
In order to convert this to a matrix, I have queried the data to return in the following format:
ID Attribute Value
1 Name Widget
1 Price $100
1 State AL
1 Category Tools
2 Name Gadget
2 Price $45
2 State NY
2 Category Accessories
3 Name Thingy
3 Price $12.50
3 State TX
3 Category Tools
etc...
So, I create a matrix in my report with 1 column and 1 row. The column is grouped on ID. The row is grouped on Attribute. The detail displays the Value. Giving me:
1 2 3 etc...
Name Widget Gadget Thingy
Price $100 $45 $12.50
State AL NY TX
Category Tools Accessories Tools
etc...
This all works great and is very simple. The problem stems from the fact that I only want to display a limited number of columns on each page. Once 8 columns are displayed, I would like the matrix to skip to a new page, which redisplays my header. Unfortunately, the matrix just keeps expanding to the right and never wraps. Is there any way to work around this? Has anyone found a way to force a matrix to wrap and repeat the headers?
Thanks

Hello:

I have just stayed away from matrixes in that I have no control. I have not found a way around your above example - maybe someone else knows but I sure don't - so I have just stayed with simple lists and have manipulated the data via a stored procedure with a "crosstab" or "pivot table" to generate the data (with respective columns) that I want and then within RS just lay out the report visibility similar to a matrix.

A different subject for a different day but the same occurs within Analysis Services in that the horizontal presentation just extrapolates itself forever based upon the dimensions and columns - maybe I did not say that correctly but the same thing occurs.
So here is a nice little "generic" SQL that will allow you to present your row data as column data. A side Note - that is the ONLY issue I have with Reporting Services in that all data is always represented as a ROW - so this Stored Procedure will definitely help you control what you want to present.
There are some initial definitions you should deploy such as storing these parameters for each report within a table in SQL and if you want I can help you - but nevertheless here is the code and I rename this Stored Procedure for the name of my Reporting Services Report.
CREATE procedure Ex_CrossTab (@SQL varchar(2500),
    @PivotCol varchar(100),
    @Summaries varchar(100),
    @GroupBy varchar(100),
    @OtherFields varchar(150) = Null,
    @Debug bit = 0)
AS
set nocount on
set ansi_warnings off

declare @Vals varchar(8000);
declare @Vals2 varchar(8000);
declare @Vals3 varchar(8000);
declare @tmp varchar(1000);
declare @TotalLen int;

set @Vals = '';
set @Vals2 = '';
set @Vals3 = '';
set @TotalLen = len(@SQL) + len(@GroupBy) + Len(ISNULL(@OtherFields,''))
set @OtherFields = isNull(', ' + @OtherFields, '')

-- collect the distinct pivot values from the caller's query
create table #temp (Pivot varchar(100))

insert into #temp
exec ('select distinct convert(varchar(100),' + @PivotCol + ') as Pivot FROM (' + @SQL + ') A')

-- build one CASE expression per pivot value, spread across three
-- varchar(8000) buffers to stay under the string-length cap
select @tmp =
    replace(replace(@Summaries, '(', '(CASE WHEN ' + @PivotCol + '=''' + replace(Pivot, '''', '''''') +
    ''' THEN '), ')[', ' END) as [' + Pivot),
    @TotalLen = @TotalLen + Len(@tmp),
    @Vals = case when @TotalLen < 7800 then @Vals + ', ' + @tmp else @Vals end,
    @Vals2 = case when @TotalLen between 7800 and 15799 then @Vals2 + ', ' + @tmp else @Vals2 end,
    @Vals3 = case when @TotalLen between 15800 and 23799 then @Vals3 + ', ' + @tmp else @Vals3 end
from #temp
order by Pivot

drop table #temp

if (@Debug = 0)
    -- run the generated crosstab query
    exec ('select ' + @GroupBy + @OtherFields + @Vals + @Vals2 +
        @Vals3 + ' from (' + @SQL + ') A GROUP BY ' + @GroupBy)
else
begin
    -- debug mode: return the generated SQL instead of executing it
    create table #Temp2 (SQLText Text);
    insert into #Temp2 (SQLText)
    values ('select ' + @GroupBy + @OtherFields + @Vals + @Vals2 +
        @Vals3 + ' from (' + @SQL + ') A GROUP BY ' + @GroupBy);
    select * from #Temp2
end

set nocount off
set ansi_warnings on
GO

The above code executes your SQL statement as dynamic SQL, so make sure that you get rid of all extra blanks within your SQL because it is passed as a parameter to the above stored procedure: use Query Analyzer, eliminate the "white" space, and then pass your SQL statement as a parameter. Based upon your SQL SELECT you can restrict the data that is returned, thus keeping your columns from extrapolating within Reporting Services.
Definitions on how to use the above stored procedure are below:


Select
- This is the SQL statement or table you are querying. It can be any valid SELECT statement, with JOINS, WHERE clauses, TOP, etc -- anything that you can cut and paste into query analyzer and execute. (And, in fact, you should always do this when testing things out).
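As an illustration only (the table and column names here are hypothetical stand-ins for the question's data), pivoting the ID/Attribute/Value rows through this procedure could look like the following; note that @Summaries uses the '(' and ')[' markers the procedure rewrites, with the trailing '[]' supplying the brackets for the generated column alias:

EXEC Ex_CrossTab
    @SQL = 'SELECT ID, Attribute, Value FROM ProductAttributes',
    @PivotCol = 'ID',
    @Summaries = 'MAX(Value)[]',
    @GroupBy = 'Attribute'

This returns one row per Attribute with one column per ID, which is the matrix layout from the question.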

Wednesday, March 7, 2012

Limitations of SQL Server 2000 Personal Edition

Can someone tell me what the record limit is or db size is for PE of SQL
Server 2000? I'm trying to import 33.6 million records, and I keep getting
an error message that says the data contains an extra column at 454K records.
I've tried the HELP, but I only see the terabyte limits. Is PE less than
that? The db is set to automatically grow.
TIA
Mark

Your error message isn't related to capacity. The system is telling you
that there is an extra column in a record. SQL requires a fixed number of
columns in an import source file. Most likely you are using a comma or tab
delimited file. Inside that file there is a character string that has an
extra delimiter character so SQL interprets that as an extra column. This
is fairly common on a data import from a non-scrubbed source.
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP
"MChrist" <MChrist@.discussions.microsoft.com> wrote in message
news:41821F63-5257-411B-9C8F-19E0CC3E6FF2@.microsoft.com...
> Can someone tell me what the record limit is or db size is for PE of SQL
> Server 2000? I'm trying to import 33.6 million records, and I keep getting
> an error message that says the data contains an extra column at 454K records.
> I've tried the HELP, but I only see the terabyte limits. Is PE less than
> that? The db is set to automatically grow.
> TIA
> Mark

Hello Geoff,
Thanks for taking the time to answer my post. I realize the message is
probably correct, although I suspect it's an error because I loaded similar
files to the server at my office, and now when I'm trying to load them on my
PC at home, I'm running into this problem.
It could be a problem in the tab delimited file as you say, but I've tried 2
of the 3 text files, and I get the same error message at record 454,157.
Since these text files aren't the exact same files as I loaded at work, it's
possible that there is a glitch on my PC creating slightly different versions
of the files than my work PC, but not likely to generate the error at the
same point within the file.
Do you know though if there are limitations on the size of the db on the PE
version?
TIA
Mark
"Geoff N. Hiten" wrote:

> Your error message isn't related to capacity. The system is telling you
> that there is an extra column in a record. SQL requires a fixed number of
> columns in an import source file. Most likely you are using a comma or tab
> delimited file. Inside that file there is a character string that has an
> extra delimiter character so SQL interprets that as an extra column. This
> is fairly common on a data import from a non-scrubbed source.
> --
> Geoff N. Hiten
> Senior Database Administrator
> Microsoft SQL Server MVP
> "MChrist" <MChrist@.discussions.microsoft.com> wrote in message
> news:41821F63-5257-411B-9C8F-19E0CC3E6FF2@.microsoft.com...
>

There are memory and processor usage limitations, but no database size
limitations in PE. Look up "Maximum Capacity Limitations" in BOL. Also,
pre-expand your SQL Data files to hold the entire import. Auto-grow can
sometimes cause timeout issues.
Geoff N. Hiten
Senior Database Administrator
Microsoft SQL Server MVP
"MChrist" <MChrist@.discussions.microsoft.com> wrote in message
news:6FA717DB-6D9D-4480-ADE8-4B64C9DB5DC6@.microsoft.com...
> Hello Geoff,
> Thanks for taking the time to answer my post. I realize the message is
> probably correct, although I suspect it's an error because I loaded similar
> files to the server at my office, and now when I'm trying to load them on my
> PC at home, I'm running into this problem.
> It could be a problem in the tab delimited file as you say, but I've tried 2
> of the 3 text files, and I get the same error message at record 454,157.
> Since these text files aren't the exact same files as I loaded at work, it's
> possible that there is a glitch on my PC creating slightly different versions
> of the files than my work PC, but not likely to generate the error at the
> same point within the file.
> Do you know though if there are limitations on the size of the db on the PE
> version?
> TIA
> Mark
> "Geoff N. Hiten" wrote:
>


Friday, February 24, 2012

Limit total # of records used in a report

For performance purposes, is it possible to limit the # of records
returned for use in a report? I know you can do SELECT TOP 1000, but in
instances where the report contains several subselects, you have to
perform that function with every SELECT statement. This also has the
potential of making maintenance hectic.
Is there a way to globally limit the rendered rows?
Thanks!
Mike
Hi Mike,
There is no direct property for limiting the number of records per page. To control the
records per page, you need to specify the height of the report body, and the number of
records shown is set accordingly.
Thanks
Jasvinder
"Bassist695" wrote:
> For performance purposes, is it possible to limit the # of records
> returned for use in a report? I know you can do SELECT TOP 1000, but in
> instances where the report contains several subselects, you have to
> perform that function with every SELECT statement. This also has the
> potential of making maintenance hectic.
> Is there a way to globally limit the rendered rows?
> Thanks!
> Mike
>
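One session-level option in SQL Server 2000, sketched here rather than taken from the thread: SET ROWCOUNT caps every subsequent statement without editing each SELECT.

-- Sketch only: dbo.SalesDetail is a hypothetical table.
SET ROWCOUNT 1000   -- every following statement returns at most 1000 rows
SELECT * FROM dbo.SalesDetail ORDER BY OrderDate
SET ROWCOUNT 0      -- 0 removes the cap for the rest of the session

Note that SET ROWCOUNT also affects INSERT, UPDATE, and DELETE statements, so it should be reset promptly after the report queries run.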

Limit the number of records returned in Stored procedure.

In my ASP page, when I select an option from the drop-down list, it has to get the records from the database via a stored procedure. There are around 60,000 records to be fetched. It throws an exception when I select this option. I think the application times out due to the large number of records. Could someone tell me how to limit the number of rows to be returned to avoid this problem? Thanks.

Query

SELECT @SQLTier1Select='SELECT * FROM dbo.UDV_Tier1Accounts WHERE CUSTOMER IN (SELECT CUSTOMERNUMBER FROM dbo.UDF_GetUsersCustomers('+CAST(@UserID AS VARCHAR(4))+'))'+ @Criteria +' AND (number IN (SELECT DISTINCT ph1.number FROM Collect2000.dbo.payhistory ph1 LEFT JOIN Collect2000.dbo.payhistory ph2 ON ph1.UID = ph2.ReverseOfUID WHERE (((ph1.batchtype = ''PU'') OR (ph1.batchtype = ''PC'')) AND ph2.ReverseOfUID IS NULL)) OR code IN (SELECT DISTINCT StatusID FROM tbl_APR_Statuses WHERE SearchCategoryPaidPaymentsT1 = 1))'
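If the dynamic-SQL approach stays, one hedged fix is to build a row cap into the string itself. A simplified, self-contained sketch (the WHERE clause from the post is omitted, and @RowLimit is an illustrative addition):

-- Sketch only: @RowLimit and the simplified query are illustrative.
DECLARE @SQL VARCHAR(2000), @RowLimit INT
SET @RowLimit = 1000
SET @SQL = 'SELECT TOP ' + CAST(@RowLimit AS VARCHAR(10))
         + ' * FROM dbo.UDV_Tier1Accounts ORDER BY CUSTOMER'
EXEC (@SQL)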

Select the top 100 records:

SELECT TOP 100 *

Select the top 10% of the total number of records:

SELECT TOP 10 PERCENT *

|||

Or if you want to retrieve a limited number of records, but not necessarily the top X records, you could use the row_number() function. For example, to retrieve records 250-299 you could try

 
SELECT *
FROM (
    SELECT *, row = ROW_NUMBER() OVER (ORDER BY id)
    FROM yourtable
) a
WHERE row BETWEEN 250 AND 299
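The same idea wrapped in a stored procedure, as the original question asked; a sketch with hypothetical procedure, parameter, and table names, and it assumes SQL Server 2005 or later, since ROW_NUMBER() is not available in 2000:

-- Sketch only: procedure, parameter, and table names are hypothetical.
CREATE PROCEDURE dbo.GetRecordsPage
    @PageStart INT,   -- first row to return (1-based)
    @PageSize  INT    -- number of rows in the page
AS
BEGIN
    SELECT *
    FROM (
        SELECT *, row = ROW_NUMBER() OVER (ORDER BY id)
        FROM yourtable
    ) a
    WHERE row BETWEEN @PageStart AND @PageStart + @PageSize - 1
END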

Limit the number of records (with offset)

Hi,

I want to retrieve only a few rows from a table, e.g. rows 20 to 50. The problem is that the starting row 20 is a variable, and so is the count of 30 rows (50 - 20 = 30).
I use this statement:

SELECT * FROM
(SELECT TOP @MaxRecord * FROM
(SELECT TOP @MaxRecord * FROM
(SELECT TOP @Totals * FROM PL4 ORDER BY RailLocation ASC)
AS foo ORDER BY RailLocation DESC )
AS bar )
AS baz ORDER BY RailLocation ASC

@Totals is 50 and
@MaxRecord is 30 in this example.
PL4 was declared earlier in this stored procedure.

This isn't allowed; it only works when I hardcode TOP 30 and TOP 50...
And I don't want it hardcoded. Does someone have a solution for this problem?

Thnx
Try this:

EXEC('SELECT * FROM
(SELECT TOP ' + @MaxRecord + ' * FROM
(SELECT TOP ' + @Totals + ' * FROM PL4 ORDER BY RailLocation ASC)
AS foo ORDER BY RailLocation DESC )
AS baz ORDER BY RailLocation ASC')|||Nice, it works, thanx
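Two notes on this answer: the string concatenation only works when @MaxRecord and @Totals are character types (integer variables would need CAST(... AS VARCHAR)), and from SQL Server 2005 onward the workaround is unnecessary, because TOP accepts a parenthesized variable directly. A sketch of the latter:

-- Sketch only: requires SQL Server 2005 or later, where TOP takes an expression.
DECLARE @MaxRecord INT, @Totals INT
SET @MaxRecord = 30
SET @Totals = 50
SELECT * FROM
    (SELECT TOP (@MaxRecord) * FROM
        (SELECT TOP (@Totals) * FROM PL4 ORDER BY RailLocation ASC)
     AS foo ORDER BY RailLocation DESC)
AS bar
ORDER BY RailLocation ASC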