
Thursday, March 29, 2012

How to get the encrypted password value from a varbinary field

I just converted from SQL 7 to SQL 2000 but have one issue.
My login table contains a password field that is a varbinary type. I
encrypted the value by using the encrypt('password') when inserting users
into this table.
How can I see the actual value of the password field? Is there a way to
decrypt it?
For some reason, all the users' passwords don't work. If I redo each one, it
works, but there are too many to redo manually.

"yodakt" <dev1on11@.gmail.com> wrote in message
news:6E45D35E-970E-4656-AECF-B192BD1035D7@.microsoft.com...
>I just converted from SQL 7 to SQL 2000 but have one issue.
> My login table contains a password field that is a varbinary type. I
> encrypted the value by using the encrypt('password') when inserting users
> into this table.
> How can I see the actual value of the password field. Is there a way to
> decrypt?
> For some reason, all the users passwords don't work. If I redo each one,
> it
> works but there are too many to redo manually.
Try this:
SELECT CAST(myPassword AS NVARCHAR(100))
FROM myTable
Where myPassword is the VARBINARY column containing the passwords.

Tuesday, March 27, 2012

History/Data Change File Approach

I need to record in a table:
Who, When, What Field and New Value of Fields
When changes occur to an existing record.

The purpose is for users to occasionally view the changes. They'll want to be able to see the history of the record - who changed what and when.

I figured I'd add the needed code to the stored procedure that's doing the update for the record.

When the stored procedure is called to do the update, the PK and parameters are sent.

The SP could first read the current state of the record from disk,
then do the update, then "spin" thru the fields comparing the record state prior to the update and after. Differences could be written to a "changes" string, and in the end this string is saved in a history record along with a few other fields:

Name, DateTime, Changes

FK to Changed Record: some int value
Name: Joe Blow
Date: 1/1/05 12:02pm
Changes: Severity: 23 Project: Everest Assigned Lab: 204

How does the above approach sound?

Is there a better way you'd suggest?

Any sample code for a system that spins thru the fields comparing 1 temporary record with another looking for changes?

Thanks,

PeterHave you considered using a trigger? You can use the inserted and deleted tables to compare your data without having to save it manually. Then insert your data into the history table as you suggested.

We often just save the before image to the history table along with the type of operation performed, the id that was used to perform it and a time stamp. Inserted records are not recorded (because all their data is already recorded on the live table) but deleted ones are. The differences for updated records can be determined at any time by comparing the before image to the next or previous stored image or the current actual record. We rarely actually look at this sort of history however unless data disappears or the customer tells us that there is something else wrong with the data and we need to trace what happened to it.|||ejustuss - thanks for the thoughts.

Good idea about simply saving the before image prior to the actual save of the new one.

Our users are used to systems where they can click a button and see essentially the change history of the record. This particular system is a Work Order system. The status of the WO changes over time, etc.

In other non-SQL Server systems I've developed I have a routine that prior to actual save:

1. saves "before save" copy of record
2. updates the record with new values into the DB
3. peels off an "After save" copy of record
4. runs a routine that compares side by side each field. Any changes are noted in a text variable (field name, new value).
5. Once all the fields are spun thru (compared), if there is any information in the text variable, a "change table" record is created with FK to the parent record, Who Changed it, When changed, and a single text field describing all the changes.

Weeks later when a user is viewing the particular record, they can press a button and have a query run against the change table to bring up a simple list of what changed when.

One wrinkle is that a Work Order has a huge text area. Once a WO is accepted by a lab, this text area becomes "frozen". So if we simply peel off a before-save copy each time a user specifies an update, I wouldn't want to needlessly include this particular field due to space considerations.

Bottom line - I was assuming someone might have a canned routine for spinning thru a record comparing all field values against an identically laid-out record. I figured there might be a system function or two to:
1. Identify how many fields are in a table
2. Identify the content of a field - something like @@Field(i)
where i = 1 to the number of fields in the table.

Peter|||I don't know of any function like that although I am sure you can write one.

In triggers you can use IF UPDATE(column) to test whether a column is updated, but you still have to use the column names, which means writing a different trigger for each case. The data in text fields will probably not be accessible within the trigger.

You don't have to save all the fields if you do a before image either - just the fields you want to compare in case they are updated.
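
A minimal sketch of the trigger-based, before-image approach discussed above might look like the following. The table and column names (WorkOrder, WorkOrderHistory, Severity, Project, AssignedLab) are made-up placeholders, not the actual schema:

CREATE TRIGGER trg_WorkOrder_Audit
ON dbo.WorkOrder
AFTER UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    -- Only bother when one of the tracked columns was named in the UPDATE.
    -- The big text column is deliberately left out, both for space and
    -- because text data is not available inside the trigger.
    IF UPDATE(Severity) OR UPDATE(Project) OR UPDATE(AssignedLab)
    BEGIN
        -- Save the "before" image from the deleted table for rows whose
        -- tracked values actually changed (NULL handling omitted for brevity).
        INSERT INTO dbo.WorkOrderHistory
            (WorkOrderID, ChangedBy, ChangedAt, Severity, Project, AssignedLab)
        SELECT d.WorkOrderID,
               SUSER_SNAME(),     -- who made the change
               GETDATE(),         -- when it happened
               d.Severity,        -- before image of the tracked fields
               d.Project,
               d.AssignedLab
        FROM deleted AS d
        JOIN inserted AS i ON i.WorkOrderID = d.WorkOrderID
        WHERE d.Severity <> i.Severity
           OR d.Project <> i.Project
           OR d.AssignedLab <> i.AssignedLab;
    END
END
GO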

Monday, March 26, 2012

Hindi fonts

Hello all

Is it possible to use Hindi fonts in ASP.NET?

In the admin area I add new data, and when I save, all the values should be saved to the database.

In short: using Hindi fonts in the backend.

Hi,

Check this article : http://www.codeproject.com/useritems/localization.asp

Hope this helps,

Vivek

Monday, March 19, 2012

High CPU value in Profiler

We see CPU values of 20000+ in Profiler for some statements. What's the
unit for it? Is it in ms (milliseconds)?
If it's in milliseconds, what does that mean? Does a high CPU duration mean
higher processing power or just a longer time to process? Please help
me understand.
Hassan, do you have Books Online? You should. From there:
"In SQL Server 2005, the server reports the duration of an event in
microseconds (one millionth, or 10^-6, of a second) and the amount of CPU
time used by the event in milliseconds (one thousandth, or 10^-3, of a
second). In SQL Server 2000, the server reported both duration and CPU time
in milliseconds. In SQL Server 2005, the SQL Server Profiler graphical user
interface displays the Duration column in milliseconds by default, but when
a trace is saved to either a file or a database table, the Duration column
value is written in microseconds."
A
"Hassan" <hassan@.hotmail.com> wrote in message
news:u$r6qx%23xHHA.5584@.TK2MSFTNGP02.phx.gbl...
> We see CPU values of 20000 + in Profiler for some statements ? Whats the
> unit for it ? Is it in ms(milliseconds) ?
> If its in milliseconds, what does that mean? Can a high duration of CPU
> means higher processing power or just longer time to process ? Please help
> me understand.
>
|||Thanks Aaron.
What would 20000 ms mean from a CPU perspective?
Is that considered high CPU, or is it just running for 20 secs using some CPU
cycles?
"Aaron Bertrand [SQL Server MVP]" <ten.xoc@.dnartreb.noraa> wrote in message
news:eHYbw5%23xHHA.5980@.TK2MSFTNGP04.phx.gbl...
> Hassan, do you have Books Online? You should. From there:
> "In SQL Server 2005, the server reports the duration of an event in
> microseconds (one millionth, or 10^-6, of a second) and the amount of CPU
> time used by the event in milliseconds (one thousandth, or 10^-3, of a
> second). In SQL Server 2000, the server reported both duration and CPU
> time in milliseconds. In SQL Server 2005, the SQL Server Profiler
> graphical user interface displays the Duration column in milliseconds by
> default, but when a trace is saved to either a file or a database table,
> the Duration column value is written in microseconds."
> A
>
> "Hassan" <hassan@.hotmail.com> wrote in message
> news:u$r6qx%23xHHA.5584@.TK2MSFTNGP02.phx.gbl...
>
|||Hassan
Run those statements and specify SET STATISTICS TIME ON
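For example, wrapped around one of the statements in question (the table name below is just a placeholder):

SET STATISTICS TIME ON;

SELECT COUNT(*)
FROM dbo.SomeLargeTable;    -- placeholder for the statement seen in Profiler

SET STATISTICS TIME OFF;

-- The Messages tab then reports lines such as:
--   SQL Server Execution Times:
--      CPU time = 20000 ms,  elapsed time = 41000 ms.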
"Hassan" <hassan@.hotmail.com> wrote in message
news:eOjtAHAyHHA.4276@.TK2MSFTNGP05.phx.gbl...
> Thanks Aaron.
> What would 20000 ms mean from a CPU perspective ?
> Is that considered a high CPU or is just running for 20 secs using some
> CPU cycles ?
> "Aaron Bertrand [SQL Server MVP]" <ten.xoc@.dnartreb.noraa> wrote in
> message news:eHYbw5%23xHHA.5980@.TK2MSFTNGP04.phx.gbl...
>
|||It means that 20 seconds' worth of CPU cycles were used. That could be
20 seconds of one CPU (or core), 10 seconds on each of 2 CPUs, etc. So if the
system only has one CPU (with one core), and the elapsed time was also
20 seconds, then this query saturated the CPU for 20 seconds. If the
running time was 40 seconds (and still assuming 1 CPU), then the CPU was
(on average) 50% busy with this query.
HTH,
Gert-Jan
Hassan wrote:
> Thanks Aaron.
> What would 20000 ms mean from a CPU perspective ?
> Is that considered a high CPU or is just running for 20 secs using some CPU
> cycles ?
> "Aaron Bertrand [SQL Server MVP]" <ten.xoc@.dnartreb.noraa> wrote in message
> news:eHYbw5%23xHHA.5980@.TK2MSFTNGP04.phx.gbl...


Monday, March 12, 2012

high Compilations/sec value

Hi,
Would someone please explain what causes a high "Compilations/sec" count?
Is it still caused by stored procedures not being cached and not enough memory
allocated to SQL Server?
Thanks

If you have created SPs that are frequently used with the RECOMPILE option, that might
bring Compilations/sec up.
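For example, a procedure created like this (a made-up name) is recompiled on every call, so frequent executions show up directly in the counter:

-- Hypothetical procedure: WITH RECOMPILE forces a fresh compilation
-- on every execution instead of reusing a cached plan.
CREATE PROCEDURE dbo.usp_GetOrders
    @CustomerID int
WITH RECOMPILE
AS
    SELECT OrderID, OrderDate, Total
    FROM dbo.Orders
    WHERE CustomerID = @CustomerID
GO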
"mm" <postto@.news.com> wrote in message
news:u1p$2asuEHA.452@.TK2MSFTNGP09.phx.gbl...
> Hi,
> Would someone please explain what causes a high "Compilations/sec" count?
> Is still caued by stored procedures not being cached and not enough memory
> alloacted to SQL Server.
>
> Thanks
>
>|||Generally speaking, if this figure is over 100 compilations per second, then
you may be experiencing unnecessary compilation overhead. A high number such
as this might indicate that your server is just very busy, or it could mean
that unnecessary compilations are being performed. For example, compilations
can be forced by SQL Server if an object's schema changes, if previously
parallelized execution plans have to run serially, if statistics are
recomputed, or if a number of other things occur.
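As a rough sanity check, the compilation rate can be compared against the batch rate. On SQL Server 2005 the counters are exposed through a DMV (a sketch below; on SQL Server 2000 the same counters are visible in Performance Monitor, and note that the DMV values are cumulative since the server last started):

-- Cumulative compilation and batch counters since server start-up.
SELECT RTRIM(counter_name) AS counter_name,
       cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN ('SQL Compilations/sec',
                       'SQL Re-Compilations/sec',
                       'Batch Requests/sec');

-- A common rule of thumb: compilations running at more than roughly 10%
-- of batch requests are worth a closer look.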
Also, it depends how your SPs are written. For example, if you have the
following SP:
CREATE PROCEDURE dbo.spTest (@query bit) AS
IF @query = 0
    SELECT * FROM authors
ELSE
    SELECT * FROM publishers
GO
Suppose I make my first call to this procedure with the @query parameter set
to 0. The query-plan that SQL Server will generate will be optimized for the
first query ("SELECT * FROM authors"), because the path followed on the first
call will result in that query being executed.
Now, if I next call the stored procedure with @query set to 1, the query
plan that SQL Server has in memory will not be of any use in executing the
second query, since the query-plan is optimized for the authors table, not
the publishers table. Result: SQL Server will have to compile a new query
plan, the one needed for the second query.
Ultimately, you should write the SP as follows:
CREATE PROCEDURE dbo.spTestDelegator (@query bit) AS
IF @query = 0
    EXEC spTestFromAuthors
ELSE
    EXEC spTestFromPublishers
GO
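The two inner procedures are not shown in the post; they would simply hold the individual queries, along these lines:

-- Each delegated procedure contains a single query, so each gets its own
-- cached plan and neither is penalized by the other's parameter path.
CREATE PROCEDURE dbo.spTestFromAuthors AS
    SELECT * FROM authors
GO

CREATE PROCEDURE dbo.spTestFromPublishers AS
    SELECT * FROM publishers
GO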
I hope this helps.
--
Sasan Saidi, MSc in CS
"I saw it work in a cartoon once so I am pretty sure I can do it."
"mm" wrote:
> Hi,
> Would someone please explain what causes a high "Compilations/sec" count?
> Is still caued by stored procedures not being cached and not enough memory
> alloacted to SQL Server.
>
> Thanks
>
>


Friday, March 9, 2012


High ASYNC_NETWORK_IO value

Hello!
I have been analyzing wait stats using get_waitstats_2005 on our production SQL Server (two-node Active/Passive SQL Server 2005 64-bit cluster) and noticed high values of
the ASYNC_NETWORK_IO wait type (around 50% of the total resource waits). From what I can see, our 1GB network cards are all right. Has anyone had experience troubleshooting this wait type? What could be causing a high percentage of this wait type?
Thanks,
Igor
Hi
"imarchenko" wrote:

> Hello!
> I have been analyzing wait stats using get_waitstats_2005 on ou production SQLserver (two node Active/Passive SQL Server 2005 64-bit cluster) and noticed high values of
> ASYNC_NETWORK_IO wait type (around 50% of total resource type). From what I can see, our 1GB Network cards are all right. Has anyone had experience troubleshooting this wait type? What could be causing high percentage of this wait type?
> Thanks,
> Igor
50% of not very much is probably not something to worry about! If perfmon
stats such as output queue length show no problems then you are probably ok.
I assume that your 1GB network cards are on a 1GB network that does not have
any bottlenecks? If you can add an extra card, I don't think it would do
any harm.
John
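As a quick cross-check of the raw numbers, the same wait figures can be read straight from the DMV that get_waitstats_2005 is built on (a sketch, assuming SQL Server 2005):

-- Top waits by accumulated wait time; ASYNC_NETWORK_IO generally means the
-- server is waiting for the client to consume rows it has already produced.
SELECT TOP 10
       wait_type,
       waiting_tasks_count,
       wait_time_ms,
       signal_wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type NOT IN ('LAZYWRITER_SLEEP', 'SQLTRACE_BUFFER_FLUSH',
                        'SLEEP_TASK', 'BROKER_TASK_STOP')   -- common idle waits
ORDER BY wait_time_ms DESC;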
|||Also..
Have you checked for queries that return a large result set? You may want to
use SQL profiler to identify them. If you return an excessive result set the
client may not be able to process this fast enough!
John
|||John,
This could well be the case. We have old legacy report application
generating thousands of queries per report.
Thanks,
Igor
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:%23z%23wpltUHHA.192@.TK2MSFTNGP04.phx.gbl...
> Also..
> Have you checked for queries that return a large result set? You may want
> to use SQL profiler to identify them. If you return an excessive result
> set the client may not be able to process this fast enough!
> John
>
|||Hi Igor
"imarchenko" wrote:

> John,
> This could well be the case. We have old legacy report application
> generating thousands of queries per report.
> Thanks,
> Igor
Profiling the system will certainly show high I/O and duration queries. Look
for instances where the result set is too wide or too many rows are being
returned that are not being used.
If you can't tune or rewrite the queries, maybe a different method of
delivery would be more appropriate, such as a scheduled report or a DTS/SSIS
export. Use of Analysis Services may allow you to process your data in a more
piecemeal way.
John
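On SQL Server 2005 the cached-plan statistics give a quick starting list of the high-duration, high-read statements mentioned above, without running a trace first (a rough sketch):

-- Statements with the highest total elapsed time since their plans were cached;
-- logical reads help separate "returns a lot of data" from "burns a lot of CPU".
SELECT TOP 20
       qs.total_elapsed_time / 1000 AS total_elapsed_ms,
       qs.total_logical_reads,
       qs.execution_count,
       SUBSTRING(st.text,
                 qs.statement_start_offset / 2 + 1,
                 (CASE qs.statement_end_offset
                       WHEN -1 THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                  END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_elapsed_time DESC;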
|||John,
AS is out of the question at the moment, but we are working on improving this
app.
Thanks again,
Igor
"John Bell" <jbellnewsposts@.hotmail.com> wrote in message
news:A4E11028-D4C0-4845-A347-03C5FEB1BE5D@.microsoft.com...
> Hi Igor
> "imarchenko" wrote:
>
> Profiling the system will certainly show high I/O and duration queries.
> Look
> for instances where the result set is too wide or too many rows are being
> returned that are not being used.
> If you can't tune or re-write the queries, maybe a different method of
> delivery would be more approprate such as a scheduled report or DTS/SSIS
> export. Use of Analysis Service may allow you to process your data in a
> more
> piecemeal way?
> John
>

Sunday, February 26, 2012

HIDING TEXT BOX IN RDL

Hi All,
I have a text box with static data, and I need to hide it based on a
condition: if =Sum(Fields!TotalSessions.Value) > 1 then I need to show the
text box, else I need to hide it.
How do I write the condition, and where do I add it to the text box?
Balaji
--
Message posted via http://www.sqlmonster.com

Properties
Visibility
Hidden Expression
Example;
=Iif(Sum(Fields!PO_.Value, "CustomerSubContractsDetail") <> 0, False, True)

Hiding Subtotals

Is there a way to hide subtotals based on a parameter value?
--
Is that a cursor in your code!

Yes! You just have to write a condition statement in the visibility option.
"Ammar" wrote:
> Is there a way to hide subtotals based on a parameter value?
>
> --
> Is that a cursor in your code!|||There is no visibility option associated with the value of the subtotals. The
visibility option pertains to the header cell only. I have tried this and
ended up with subtotal columns with no header. What I am trying to do is to
eliminate the subtotal columns altogether from appearing (disabling the
subtotal) for a given group if a parameter equals a predefined value.
"Asim" wrote:
> Yes! Just have to write a condition statement in visibility option.
> "Ammar" wrote:
> > Is there a way to hide subtotals based on a parameter value?
> >
> >
> >
> > --
> > Is that a cursor in your code!|||Ammar:
For any report item, you should be able to set the "visibility -> Hidden"
option.
In the "Hidden" option use the expression value to set it with your
parameter.For example, if you have a parameter say, 'HideSubtotal' and you
want to hide the subtotal column from displaying when 'HideSubtotal' is 1,
then in the expression of subtotal column visibility -> Hidden you will use:
=(Parameters!HideSubtotal.Value = 1)
That should do the trick.
"Ammar" wrote:
> There is no visibility option associated with the value of the subtotals. The
> visibility option pertains to the header cell only. I have tried this and
> ended up with subtotal columns with no header. What I am trying to do is to
> eliminate the sub total columns all together from appearing (disabling the
> subtotal) for a given group if a parameter equals to a predefined value.
> "Asim" wrote:
> > Yes! Just have to write a condition statement in visibility option.
> >
> > "Ammar" wrote:
> >
> > > Is there a way to hide subtotals based on a parameter value?
> > >
> > >
> > >
> > > --
> > > Is that a cursor in your code!|||Once again, there is no visibility option associated with the actual values
in the subtotal columns. In a matrix control, when you add a subtotal, the
report designer adds a column with a "Total" label. There is a visibility
option when you select the cell, but if you click the green triangle to
modify the properties of the values of the subtotal column, you do not see a
visibility option there. If I add an expression in the visibility option by
selecting the "Total" cell, it only controls the header of the columns,
not the values.
Take some time and try it and you will see.
"sam" wrote:
> Ammar:
> For any report item, you should be able to set the "visibility -> Hidden"
> option.
> In the "Hidden" option use the expression value to set it with your
> parameter.For example, if you have a parameter say, 'HideSubtotal' and you
> want to hide the subtotal column from displaying when 'HideSubtotal' is 1,
> then in the expression of subtotal column visibility -> Hidden you will use:
> =(Parameters!HideSubtotal.Value = 1)
> That should do the trick.
> "Ammar" wrote:
> > There is no visibility option associated with the value of the subtotals. The
> > visibility option pertains to the header cell only. I have tried this and
> > ended up with subtotal columns with no header. What I am trying to do is to
> > eliminate the sub total columns all together from appearing (disabling the
> > subtotal) for a given group if a parameter equals to a predefined value.
> >
> > "Asim" wrote:
> >
> > > Yes! Just have to write a condition statement in visibility option.
> > >
> > > "Ammar" wrote:
> > >
> > > > Is there a way to hide subtotals based on a parameter value?
> > > >
> > > >
> > > >
> > > > --
> > > > Is that a cursor in your code!

Hiding Rows....

I have the following in a cell...
=iif(Fields!TRANSDATE.Value >= Parameters!Report_Parameter_Mon.Value,
Fields!TRANSDATE.Value,Nothing)
What I want it to do is if the TRANSDATE is greater than or equal to the
Parameter display the TRANSDATE... otherwise don't display it... The above
code works to a point... but it leaves huge spaces as if it is displaying
the Nothing Rows... e.g.
Parameter = 01/06/04
02/06/04
05/06/04
How do I remove that massive blank area?

Place your expression on the "rows visibility" instead of the fields'
contents.
=iif(Fields!TRANSDATE.Value >= Parameters!Report_Parameter_Mon.Value,"True"
, "False")
Click on the left table margin button, examine the properties of that row
and place your expression under the rows visibility property
HTH,
Greg|||Thanks that worked great. Now to figure out a way to get the subtotal of
the group to understand what needs to be totalled.
"Greg Rowland" <greg@.waveltd.com> wrote in message
news:euBy4oKdEHA.3916@.TK2MSFTNGP11.phx.gbl...
> Place your expression on the "rows visibility" instead of the fields'
> contents.
> =iif(Fields!TRANSDATE.Value >=Parameters!Report_Parameter_Mon.Value,"True"
> , "False")
> Click on the left table margin button, examine the properties of that row
> and place your expression under the rows visibility property
> HTH,
> Greg
>|||Click on the left table margin button of the row in question.
Right-click, then click Insert Row Below.
Examine the properties of the newly created row.
Select "Data, grouping/sorting" and click the (Ellipsis) button at the right.
Under Name, enter the group name.
In Group on, Expression, enter the field or fields you wish to discriminate
by.
Example;
Group name
Employees
Expression
=Fields!EmployeeNumber_.Value|||Finally
Add Field Expressions to the newly created row's cells.
Example:
=Sum(Fields!GrossPay.Value)
or
=Iif(Sum(Fields!GrossPay.Value, "Employees")<>0, Sum(Fields!GrossPay.Value, "Employees"), "")
Other variations;
Font
Font expression =Iif(Sum(Fields!GrossPay.Value, "Employees")<>0, "Bold",
"Normal")
TextDecoration
=Iif(Sum(Fields!GrossPay.Value, "Employees")<>0, "Underline",
"Normal")|||Fantastic, thanks.
"Greg Rowland" <greg@.waveltd.com> wrote in message
news:uQSQ9RZdEHA.3380@.TK2MSFTNGP12.phx.gbl...
> Finally
> Add Field Expressions to the newly created rows cells.
> Example:
> =Sum(Fields!GrossPay.Value)
> or
> =Iif(Sum(Fields! GrossPay.Value, "Employees")<>0, Sum(Fields!
> GrossPay.Value, "Employees"), "")
> Other variations;
> Font
> Font expression =Iif(Sum(Fields!GrossPay.Value, "Employees")<>0, "Bold",
> "Normal")
> TextDecoration
> =Iif(Fields! Sum(Fields!GrossPay.Value, "Employees")<>0, "Underline",
> "Normal")
>
>
>

Friday, February 24, 2012

Hiding Main Report Items based on SubReport Value

Hi Group,
How do I conditionally hide a main report item based on a subreport
value?
My main report looks like this:

Task Id    Description     Effort
1020       Project Plan    100 (subreport item)

What I want is: if the effort is zero, I should hide 1020, Project Plan, and my
subreport value items. How do I do this?
Respond ASAP.

The main report cannot reach inside the subreport(s) to extract values. You
would need to make sure the relevant value is available in your main query.
--
This post is provided 'AS IS' with no warranties, and confers no rights. All
rights reserved. Some assembly required. Batteries not included. Your
mileage may vary. Objects in mirror may be closer than they appear. No user
serviceable parts inside. Opening cover voids warranty. Keep out of reach of
children under 3.
"Manoj.Pasumarthi" <ManojPasumarthi@.discussions.microsoft.com> wrote in
message news:D0A47874-C1C6-4D0A-88AA-A96BE0E27094@.microsoft.com...
> Hi Group,
> How do i Conditionally Hide the Main Report Iitem based on the SubReport
> Value.
> My Main Report Looks Like This
> Task Id Description Effort
> 1020 Project Plan 100(Subreport Item)
> What i want is if the effort is Zero i Should hide 1020,Project Plan and
My
> Subreport Value,Items How do i do
> Respond ASAP
>|||Hi Chris,
Thank u
"Chris Hays [MSFT]" wrote:
> The main report cannot reach inside the subreport(s) to extract values. You
> would need to make sure the relevant value is available in your main query.
> --
> This post is provided 'AS IS' with no warranties, and confers no rights. All
> rights reserved. Some assembly required. Batteries not included. Your
> mileage may vary. Objects in mirror may be closer than they appear. No user
> serviceable parts inside. Opening cover voids warranty. Keep out of reach of
> children under 3.
> "Manoj.Pasumarthi" <ManojPasumarthi@.discussions.microsoft.com> wrote in
> message news:D0A47874-C1C6-4D0A-88AA-A96BE0E27094@.microsoft.com...
> > Hi Group,
> >
> > How do i Conditionally Hide the Main Report Iitem based on the SubReport
> > Value.
> > My Main Report Looks Like This
> >
> > Task Id Description Effort
> > 1020 Project Plan 100(Subreport Item)
> >
> > What i want is if the effort is Zero i Should hide 1020,Project Plan and
> My
> > Subreport Value,Items How do i do
> >
> > Respond ASAP
> >
> >
>
>|||"Chris Hays [MSFT]" wrote:
> The main report cannot reach inside the subreport(s) to extract values. You
> would need to make sure the relevant value is available in your main query.
> --
> This post is provided 'AS IS' with no warranties, and confers no rights. All
> rights reserved. Some assembly required. Batteries not included. Your
> mileage may vary. Objects in mirror may be closer than they appear. No user
> serviceable parts inside. Opening cover voids warranty. Keep out of reach of
> children under 3.
> "Manoj.Pasumarthi" <ManojPasumarthi@.discussions.microsoft.com> wrote in
> message news:D0A47874-C1C6-4D0A-88AA-A96BE0E27094@.microsoft.com...
> > Hi Group,
> >
> > How do i Conditionally Hide the Main Report Iitem based on the SubReport
> > Value.
> > My Main Report Looks Like This
> >
> > Task Id Description Effort
> > 1020 Project Plan 100(Subreport Item)
> >
> > What i want is if the effort is Zero i Should hide 1020,Project Plan and
> My
> > Subreport Value,Items How do i do
> >
> > Respond ASAP
> >
> >
>
>|||I'm guessing the original question was in regards to RS 2000. Has the answer
changed with RS 2005? Can subreport values be referenced from the master
report in RS 2005?
"Chris Hays [MSFT]" wrote:
> The main report cannot reach inside the subreport(s) to extract values. You
> would need to make sure the relevant value is available in your main query.
> --
> This post is provided 'AS IS' with no warranties, and confers no rights. All
> rights reserved. Some assembly required. Batteries not included. Your
> mileage may vary. Objects in mirror may be closer than they appear. No user
> serviceable parts inside. Opening cover voids warranty. Keep out of reach of
> children under 3.
> "Manoj.Pasumarthi" <ManojPasumarthi@.discussions.microsoft.com> wrote in
> message news:D0A47874-C1C6-4D0A-88AA-A96BE0E27094@.microsoft.com...
> > Hi Group,
> >
> > How do i Conditionally Hide the Main Report Iitem based on the SubReport
> > Value.
> > My Main Report Looks Like This
> >
> > Task Id Description Effort
> > 1020 Project Plan 100(Subreport Item)
> >
> > What i want is if the effort is Zero i Should hide 1020,Project Plan and
> My
> > Subreport Value,Items How do i do
> >
> > Respond ASAP
> >
> >
>
>

Sunday, February 19, 2012

Hiding columns

hi
I have 2 questions. I need to finish some reports by tomorrow:
1. How do I hide certain columns in a report based on, say, a value being returned from an SP?
2. I want all the records in the report to be shown on a single page, without any page breaks...
Any suggestions will be very helpful
Thanks
--
Posted using Wimdows.net NntpNews Component -
Post Made from http://www.SqlJunkies.com/newsgroups Our newsgroup engine supports Post Alerts, Ratings, and Searching.

1. Columns have a visibility property that can be based on any value.
2. Set the page height to something large.
--
Brian Welcker
Group Program Manager
SQL Server Reporting Services
This posting is provided "AS IS" with no warranties, and confers no rights.
"SqlJunkies User" <User@.-NOSPAM-SqlJunkies.com> wrote in message
news:OprEejuVEHA.4092@.TK2MSFTNGP11.phx.gbl...
> hi
> I have 2 doubts . Need to finish some reports by tomorrow :
> 1.How to hide certain columns in a report based on say a value being
returned from a SP.
> 2. I want all the recoreds in the report to be shown in a single page and
without any page breaks...
> Any suggestions will be very helpful
> Thanks
> --
> Posted using Wimdows.net NntpNews Component -
> Post Made from http://www.SqlJunkies.com/newsgroups Our newsgroup engine
supports Post Alerts, Ratings, and Searching.

Hiding columns

Hi
One more query!!
What expression can I use to set the visibility of a list or a matrix?
I tried
=IIf(Fields!count.Value > 20, 0 , 1 )
and
=IIf(Fields!count.Value > 20, "False", "True" )
and
=IIf(Fields!count.Value > 20, "Hidden", "Visible" )
none of these work.
Can anyone tell me what is wrong here?
Thanks
--
Posted using Wimdows.net NntpNews Component -
Post Made from http://www.SqlJunkies.com/newsgroups Our newsgroup engine supports Post Alerts, Ratings, and Searching.

I am suspicious that you are using Fields!count.Value and not
Sum(Fields!count.Value) if you are trying to hide a list or matrix as you
will probably want to do this over multiple rows. The first one should work
as should an expression that returns a boolean, =(Fields!Count.Value>20).
--
Brian Welcker
Group Program Manager
SQL Server Reporting Services
This posting is provided "AS IS" with no warranties, and confers no rights.
"SqlJunkies User" <User@.-NOSPAM-SqlJunkies.com> wrote in message
news:uAZQZYvVEHA.2520@.TK2MSFTNGP12.phx.gbl...
> Hi
> One more query!!
> What expression can I use to set the visibility of a list or a matrix
> I tried
> =IIf(Fields!count.Value > 20, 0 , 1 )
> and
> =IIf(Fields!count.Value > 20, "False", "True" )
> and
> =IIf(Fields!count.Value > 20, "Hidden", "Visible" )
> none of these work.
> Can anyone tell me what is wrong here?
> Thanks
>
> --
> Posted using Wimdows.net NntpNews Component -
> Post Made from http://www.SqlJunkies.com/newsgroups Our newsgroup engine
supports Post Alerts, Ratings, and Searching.

Hiding a zero value

How can I hide or suppress the 0-values (zero) in a report?

Use the expression =iif(Fields!FIELDNAME.Value = 0, True, False) at the Visibility expression field in the properties of the textbox.

Kind regards!

Marius

|||

Try this. . .

=Iif(Fields!FIELDNAME.Value = 0, "", Fields!FIELDNAME.Value)

|||

Both options will work.

Thx a lot!

|||You're welcome!|||

Add this function to the custom code field in the report properties:

Code Snippet

Function NoZero(Number as Double)
If Number = 0 Then
Return Nothing
Else
Return Number
End If
End Function

Now you can use this function to suppress zeros in your expression like this:

Code Snippet

=Code.NoZero(Fields!YourField.Value)
