Showing posts with label date.

Monday, March 19, 2012

Line 1: Incorrect syntax near getlistofspclass.

What am I doing wrong? :(

CREATE PROCEDURE getlistofspclass
(
    @date datetime,
    @s varchar(8000) output
)
AS

select @s = coalesce(@s + ' & ', '') + '"' + convert(varchar(10), dbo.calader.timestart) + ' ' + dbo.classT.classname + '<p>'
FROM dbo.calader INNER JOIN dbo.classT ON dbo.calader.classid = dbo.classT.classcode
WHERE dbo.calader.caledardatedates = @date
exec( @s )
GO

The procedure's parameters should not be enclosed in parentheses. Remove the parentheses that surround the parameters and you should have better luck.

|||

My god!! I spent 4 hours on that and $60 on a Que SQL Server programming book. :(

But now it works. I can hang myself now.

|||

Hello, can you help me again? ...My ASP.NET code is not working.. it will not send the date to the parameter.

Can you make a sample code in ASP.NET VB of how you would send a date to that stored procedure and how you would get the returned value?

Please, thank you very much.

|||

All I have is a C# version I made the other day, sorry. But it's close enough to VB that you should be able to figure out how to port it.

public int InsertNewThread(Thread ThreadToInsert, out int InternalID)
{
    using (SqlConnection sqlConnection = new SqlConnection(GetConnectionString()))
    {
        SqlCommand sqlCommand = new SqlCommand("Thread_Insert", sqlConnection);
        sqlCommand.CommandType = CommandType.StoredProcedure;
        sqlCommand.Parameters.Add("@threadSummary", SqlDbType.NVarChar, 75).Value = ThreadToInsert.ThreadSummary;
        sqlCommand.Parameters.Add("@boardID", SqlDbType.Int).Value = ThreadToInsert.BoardID;
        sqlCommand.Parameters.Add("@internalID", SqlDbType.Int).Direction = ParameterDirection.Output;
        sqlConnection.Open();
        int RowsEffected = sqlCommand.ExecuteNonQuery();
        sqlConnection.Close();
        InternalID = (int)sqlCommand.Parameters["@internalID"].Value;
        return RowsEffected;
    }
}

/****** Object: Stored Procedure dbo.Thread_Insert    Script Date: 7/28/2006 7:24:19 PM ******/
CREATE PROCEDURE Thread_Insert
    @threadSummary NVARCHAR(75),
    @internalID INT OUTPUT
AS
INSERT INTO threads ( threadSummary )
VALUES ( @threadSummary )

SELECT @internalID = SCOPE_IDENTITY()

GO
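Applying the same pattern to your getlistofspclass procedure would look roughly like the sketch below (untested; it assumes the same GetConnectionString() helper and the usual System.Data / System.Data.SqlClient usings, and the value you read back is the varchar(8000) output rather than an int):

public string GetListOfSpClass(DateTime date)
{
    using (SqlConnection sqlConnection = new SqlConnection(GetConnectionString()))
    {
        SqlCommand sqlCommand = new SqlCommand("getlistofspclass", sqlConnection);
        sqlCommand.CommandType = CommandType.StoredProcedure;
        // send the date in, ask for the concatenated string back
        sqlCommand.Parameters.Add("@date", SqlDbType.DateTime).Value = date;
        sqlCommand.Parameters.Add("@s", SqlDbType.VarChar, 8000).Direction = ParameterDirection.Output;
        sqlConnection.Open();
        sqlCommand.ExecuteNonQuery();
        sqlConnection.Close();
        // the output parameter comes back as DBNull when no rows matched the date
        object value = sqlCommand.Parameters["@s"].Value;
        return value == DBNull.Value ? null : (string)value;
    }
}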
|||

Thanks again. You know, the speed of getting the return value was just the same as putting the SQL statement on the web page itself... people say stored procedures are faster.. oh well.

Thanks, everything worked.

|||

Stored procedures are faster. If you coded one page without stored procedures and another with them, and timed both with your watch, you won't notice much of a difference.

Take a web server (or multiple web servers) running complex queries and time the difference when they are getting hit with a few hundred requests per second, and you will notice a difference, especially in older versions of SQL Server. The newer versions have added a few little things that reduce the difference dramatically in most cases (cached query plans, even for dynamic queries). They basically turn even dynamic queries into a kind of "temporary" stored procedure, getting most of a stored procedure's benefits, a very nice addition to SQL Server.
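To illustrate the cached-plan point with a rough, made-up example (the table and column names are invented; it assumes the usual SqlClient usings and the GetConnectionString() helper from above): if the dynamic SQL is sent with parameters instead of values concatenated into the string, SQL Server can cache and reuse the plan much like it does for a stored procedure.

// ADO.NET sends a parameterized text command via sp_executesql,
// so the plan is cached and reused across calls with different values
using (SqlConnection connection = new SqlConnection(GetConnectionString()))
using (SqlCommand command = new SqlCommand(
    "SELECT OrderTotal FROM dbo.Orders WHERE OrderDate = @orderDate", connection))
{
    command.Parameters.Add("@orderDate", SqlDbType.DateTime).Value = DateTime.Today;
    connection.Open();
    using (SqlDataReader reader = command.ExecuteReader())
    {
        while (reader.Read())
            Console.WriteLine(reader.GetDecimal(0));
    }
}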

Friday, March 9, 2012

Limiting log growth during DTS Package

SQL Server 2000 SP4. I built a large DTS package that grabs a number
of tables from an Oracle DB, does some scrubbing and date verification
and loads to a SQL Server DB. Most of the tables are full refresh and
a few are incremental.

Main DW: DwSQL
Staging Area: DwLoadAreaSQL

The DW is about 60 Gigs. The Staging Area is about 80 Gigs. This is
all good.

However, the log file for the staging area is 50 Gigs and I'm trying
to find ways to not require such a large log file. I tried adding a
few "BACKUP LOG DwLoadAreaSQL WITH TRUNCATE_ONLY" statements in the
DTS package but figured out that because it's 1 DTS package it's all 1
transaction. I've thought about breaking it up into multiple DTS
packages and truncating the log between running them but was hoping to
avoid this. To be clear, I know how to shrink DB's and Log
Files...that's not the issue.

Any Ideas? Thanks.

On Jul 24, 3:49 pm, davisutt <davis...@.aol.com> wrote:

(snip)


Make sure both databases are in bulk-logged recovery mode. You should have a step at the end of your DTS package that runs CHECKPOINT and then backs up the log with TRUNCATE_ONLY. Also, manually shrink the log file to your desired size.
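A rough T-SQL sketch of that final step (the logical log file name and the target size below are guesses; check the name with sp_helpfile first):

-- one-time: switch the staging database out of full recovery
ALTER DATABASE DwLoadAreaSQL SET RECOVERY BULK_LOGGED
GO

-- last step of the DTS package
USE DwLoadAreaSQL
GO
CHECKPOINT
GO
-- SQL Server 2000 syntax; discards the inactive portion of the log without backing it up
BACKUP LOG DwLoadAreaSQL WITH TRUNCATE_ONLY
GO
-- 'DwLoadAreaSQL_Log' and the 2000 MB target are assumptions
DBCC SHRINKFILE (DwLoadAreaSQL_Log, 2000)
GO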

Hope it helps...

MNDBA

|||

On Jul 28, 4:55 pm, kmounkh...@.gmail.com wrote:

(snip)


Thanks. I think that's what I was looking for. I changed the
recovery mode and shrunk the log size to about 20% of what it was.
I'll run the process and see what kind of growth occurs.

Limiting data in Indexed Views by date

Hi everyone,
We have a fact table containing 28,000,000 rows and are experiencing
scalability issues when performing queries. It turns out that indexed
views are an excellent way to pre-calculate aggregates and improve
scalability.
The problem is that the table is part of an OLTP solution and is
updated constantly throughout the day. Our indexed view only needs to
include data up to yesterday (which never changes once it's written).
However, when we create an indexed view, it includes today's data as
well, and the overhead required to maintain the view is slowing our
data insert routines down by 90%.
What are our options to limit the data in the indexed view to
yesterday? Ideally, we'd like the index to automatically update at
midnight each night, without the need for replication or batch jobs.
Here are the options we know of:
1. Create indexed view without regard for the date. As stated above,
this slows down insert speeds on the table and is not feasible.
2. Create a copy of the table which contains data only up to yesterday,
and run the indexed query on this table. Run a batch process at 12:01am
every morning to copy the new rows to the new table. The table is very
large, so this will necessitate some downtime however.
Is there another simpler way to do this?
On 1 Dec 2005 09:25:46 -0800, marketing@.adwarereport.com wrote:
(snip)
>What are our options to limit the data in the indexed view to
>yesterday?
(snip)
>Is there another simpler way to do this?
Hi marketing,
Have you considered creating a second database for reporting? Each
night, after the backup of the OLTP database has finished, load the
reporting database from that backup.
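A minimal sketch of that nightly load, with made-up database names, paths, and logical file names (check yours with RESTORE FILELISTONLY):

-- overwrite the reporting copy with last night's full backup of the OLTP database
RESTORE DATABASE ReportingDB
FROM DISK = N'D:\Backups\OltpDb_Full.bak'
WITH REPLACE,
     MOVE N'OltpDb_Data' TO N'D:\Data\ReportingDB.mdf',
     MOVE N'OltpDb_Log'  TO N'D:\Data\ReportingDB_Log.ldf'
GO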
Best, Hugo


Friday, February 24, 2012

Limit the range to select on dateparameter

Hello out there,

I am trying to force my users to select a date not more than 6 months in the past. How can I limit the range of date selection in the report parameter? Any ideas?

Thank you very much!

Kind regards

dollmaker

Up

|||

I'm assuming that you have FromDate and ToDate Report parameters, which have values from a query.

Try doing this:

Set the Available Values for the ToDate parameter to Non-queried and put an expression in the Label and Value columns, evaluating the ToDate parameter against the FromDate parameter with the DateDiff function, allowing only values within 6 months.

I don't know if this is the exact scenario you're facing, but this is how I would do it for my application.
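If you also want the limit enforced in the data itself, separate from the parameter expression above, here is a rough sketch for the top of the dataset query (the @FromDate/@ToDate parameter names and the table are assumptions):

-- refuse to run the expensive query when the range is wider than 6 months
IF DATEDIFF(month, @FromDate, @ToDate) > 6
BEGIN
    RAISERROR('Please select a date range of 6 months or less.', 16, 1)
    RETURN
END

SELECT SaleDate, Amount
FROM dbo.Sales            -- hypothetical table
WHERE SaleDate BETWEEN @FromDate AND @ToDate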

Let me know how it goes. All the best!


Monday, February 20, 2012

limit parameters

Is there any way to limit the date ranges that a user can enter? My reports have a start date and an end date as parameters, and I would like to limit the date range to 6 months so that a user cannot enter a huge date range or something and slow down the database.

Just build a custom prompting mechanism on your own web page, validate the user input there, disable the built-in parameter prompts in the RDL, construct a custom URL query string on your own web page, then simply redirect.
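A rough C# sketch of that approach (the server name, report path, and parameter names are made up and must match what the RDL actually defines):

// validate the range on your own page, then redirect to the report via URL access
public string BuildReportUrl(DateTime startDate, DateTime endDate)
{
    if (endDate < startDate || (endDate - startDate).TotalDays > 183)   // roughly 6 months
        throw new ArgumentException("Please pick a date range of 6 months or less.");

    return "http://myserver/reportserver?/MyFolder/MyReport"
         + "&rs:Command=Render"
         + "&StartDate=" + startDate.ToString("yyyy-MM-dd")
         + "&EndDate=" + endDate.ToString("yyyy-MM-dd");
}

The page that collects the two dates would then just call Response.Redirect with the returned URL.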
--
Oleg Yevteyev,
San Diego, CA
It is OK to contact me with a contracting opportunity.
"myfirstname"001atgmaildotcom.
Replace "myfirstname" with Oleg.
--
"bevarg" <benovarghese@.gmail.com> wrote in message
news:1142261128.051458.241930@.v46g2000cwv.googlegroups.com...
> is there anyway to limit the date ranges that a user can enter?
> my reports have a start date and end date as parameters and I would
> like to limit the date range to 6 months so that a user cannot enter a
> huge date range or something and slow down the database.
>