Sunday, 2 February 2014

How to run SQL scripts on SHR

Log in to the SHR server with administrative privileges.

Go to Start -> Run, then type dbisql.exe.

Make sure the ODBC data source name SHRDB is selected.

Press Connect. (If you are using the default DSN created during installation, there is no need to enter a username and password.)

Once connected to the database, go to File -> Run Script, then browse to and select the script. This runs the script and writes the output to the default location, which in this case is C:\HP-SHR\Sybase\IQ15_4\Bin64.

Cut the output file from this location and send it to HP. (Make sure to clean this location up and remove the output files after the activity.)
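If you prefer the command line, the same script can also be run in batch mode using dbisql's connection-string option. This is a sketch, not an SHR-documented procedure: the DSN name follows this example, and the script path is a placeholder you would adjust to your installation.

```shell
rem Run a SQL script non-interactively against the SHRDB data source.
rem Assumes the default DSN created by the installer, as above.
rem C:\temp\myscript.sql is a hypothetical path to your script.
dbisql.exe -c "DSN=SHRDB" C:\temp\myscript.sql
```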

Monday, 27 January 2014

How to customize the first DayOfWeek in SHR/Sybase IQ?

Setting the first day of the week is not technically a setting in SHR. This setting is at the Sybase IQ level, and changes must be made in the database.
The Sybase IQ option DATE_FIRST_DAY_OF_WEEK is 0 by default, making Sunday the default first day of the week. The option can be set to a different value to choose another first day (for example, 1 for Monday).
The following SQL code may be run in dbisql.exe to change this setting on the pmdb database: 
set option DATE_FIRST_DAY_OF_WEEK = 1
Changing the option this way affects only the current user, so you must connect to the database as the pmdb_admin user.
We must also recalculate the week boundaries so that Sundays and Mondays fall into the proper weeks:
update datetime set
week_boundary = case when (datepart(mi, time_full_date) = 0 and
                           datepart(hh, time_full_date) = 0 and
                           datepart(dw, time_full_date) = 1) then 1
                     else 0
                end
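The logic of that update can be sketched in Python (an illustrative stand-in for the SQL CASE expression, assuming the option change above makes Monday day 1 of the week):

```python
from datetime import datetime

def week_boundary(ts: datetime) -> int:
    # Mirror of the SQL CASE expression: a row marks a week boundary
    # only when it falls exactly at midnight on the first day of the
    # week (Monday here, i.e. datepart(dw, ...) = 1 after the change).
    is_midnight = ts.hour == 0 and ts.minute == 0
    is_first_day = ts.weekday() == 0  # in Python, Monday == 0
    return 1 if (is_midnight and is_first_day) else 0

# Midnight on a Monday starts a new week; anything else does not.
print(week_boundary(datetime(2014, 1, 27, 0, 0)))  # Monday 00:00 -> 1
print(week_boundary(datetime(2014, 1, 27, 0, 5)))  # Monday 00:05 -> 0
print(week_boundary(datetime(2014, 1, 26, 0, 0)))  # Sunday 00:00 -> 0
```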

Monday, 13 January 2014

Data Collection Stops after some time on SHR 9.30

On SHR 9.30 I had an issue: data collection kept stopping every now and then. I raised a case with HP on this, and they identified it as a known issue.

If you are having a similar issue, check for the below error:

2013-12-29 11:20:25,380 ERROR, com.hp.bto.bsmr.collection.utils.RollingFileWriter.forceFlush , Error in flushing file C:\HP-SHR\PMDB/extract/\ALL_VIEWS_0_cmdbviews_0_584358474342500.csv This instance of the CsvWriter class has already been closed.

If this is the case, a hotfix is available. Please get in touch with HP Support for the hotfix.


Oracle Content Pack Error

If you install the Oracle Content Pack on SHR 9.30, over a period of time SHR InfoView will produce a "Partial-Results" error.
At the same time, report generation will be extremely slow.
I had this issue and raised a case with support. It looks like it is a known issue.

QCCR1A171394 - duplicate records in the k_ci_group_bridge table.

Please check the above for more details and raise a case with HP; support will provide a solution.

SHR Utilities

SHR provides a log viewer utility which is very handy when you need to search the log files for a specific word or error level.

This utility is available in C:\HP-SHR\PMDB\bin\LogViewer.exe

If you are not aware of it, please check it out.

SHR Tables and Aggregation - Information

This is just some information I found. This major change is not documented anywhere, so I thought of sharing it.

In SHR 9.30 there are no monthly (or higher) tables. SHR 9.20 had monthly tables, but from 9.30 onwards that aggregation is done in the front end: monthly and yearly data is aggregated and presented at the Business Objects level.


Manual Data Aggregation Commands in SHR

If you ever need to run data aggregation manually in SHR, there are two options, described below.

I have tested and confirmed the second option. The first option didn't work for me initially because I forgot to include the switch execute=true.

I followed the second option and succeeded; later I got an update from support about my errors.

This was tested on SHR 9.30.
Option 1

As per HP support, the aggregation process is to run the following command:
aggregate config=%PMDB_HOME%/scripts/SR_SM_NODE_RES_SH_SM_NODE_RES_Hourly_Resource_Details.xml processall=true execute=true
In the above example I am aggregating the SM Content Pack and metrics like CPU, memory, buffer, network, etc. Similarly, there are many XML files under %PMDB_HOME%/scripts:
Files starting with SR_ hold the initial (raw) collected data.
Files starting with SH_ are hourly.
Files starting with SD_ are daily.
Now, reading the file name, you can see that we are taking values from SR_SM_NODE_RES, aggregating them, and updating SH_SM_NODE_RES, which is the hourly table.
So the person doing the aggregation should first understand the XML files and plan the aggregation well.
Once the Content Pack that needs aggregation is identified, identify the corresponding SR_-to-SH_ and SH_-to-SD_ XML files.
Then run the SR_-to-SH_ aggregation first, and the SH_-to-SD_ aggregation next.
Make sure to follow the hierarchy.
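The naming convention described above can be decoded mechanically. This is only an illustrative sketch: the split rule (an all-caps source table, an all-caps target table, then a mixed-case description) is an assumption inferred from the example file names, not a documented SHR format.

```python
import re

def decode_config_name(filename: str) -> dict:
    # Assumed layout: <SOURCE_TABLE>_<TARGET_TABLE>_<Description>.xml,
    # where the source starts with SR_/SH_, the target with SH_/SD_,
    # tables are ALL CAPS, and the description begins Mixed_Case.
    name = filename.removesuffix(".xml")
    match = re.match(
        r"^(S[RH]_[A-Z0-9_]+?)_(S[HD]_[A-Z0-9_]+?)_([A-Z][a-z].*)$", name)
    if not match:
        raise ValueError(f"unrecognised config name: {filename}")
    source, target, description = match.groups()
    return {"source": source, "target": target, "description": description}

info = decode_config_name(
    "SR_SM_NODE_RES_SH_SM_NODE_RES_Hourly_Resource_Details.xml")
print(info["source"])  # SR_SM_NODE_RES
print(info["target"])  # SH_SM_NODE_RES
```

Reading the parts this way makes it easy to check that you run the SR_-to-SH_ file before the matching SH_-to-SD_ file.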

Option 2

In a similar scenario you can use the below steps to do the aggregation.
Log in to PG Admin and go to the SQL window.
Now we need to identify the data in the aggregate_control table. To do this, you should be familiar with the Content Pack and its tables.
In the below scenario my primary interest is to aggregate system availability, so run the below SQL:

select * from aggregate_control where source_table like 'SH_SM_NODE_AVAIL'
This command provides the output, including max_ta_period.

Once I have the max_ta_period, we need to know from when the data must be aggregated.
This can be found with a query in Sybase: select min(ta_period), max(ta_period) from SR_SM_NODE_AVAIL
Now we know from when data is available in the SR table.
Based on that, we need to update the max_ta_period field of the aggregate_control table to an earlier date. In this case, min(ta_period) from SR_SM_NODE_AVAIL will be that value.

update aggregate_control set max_ta_period ='2013-11-01 00:00:00.000' where source_table like 'SR_SM_NODE_AVAIL'
Run the above command at the PG Admin SQL prompt.
Check that the value is reflected:
select * from aggregate_control where source_table like 'SH_SM_NODE_AVAIL'
Until a commit is issued, the change is not permanent.

After checking the value in PG Admin, run the following command on the command line:

%PMDB_HOME%\bin\aggregate config=C:\HP-SHR\PMDB\scripts\SR_SM_NODE_AVAIL_SH_SM_NODE_AVAIL_HourlyNodeAvailability.xml processAll=true
Repeat the same procedure for the SH_SM-to-SD_SM tables as well.
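The bookkeeping in Option 2 can be sketched end to end. This uses SQLite purely as a stand-in for the real management database: the table and column names follow the text, but the schema, row contents, and dates here are simplified assumptions for illustration.

```python
import sqlite3

# Simplified stand-in schema: the real aggregate_control table lives in
# the SHR management database and has more columns than shown here.
con = sqlite3.connect(":memory:")
con.execute("create table aggregate_control (source_table text, max_ta_period text)")
con.execute("insert into aggregate_control values "
            "('SR_SM_NODE_AVAIL', '2013-12-01 00:00:00.000')")
con.execute("create table SR_SM_NODE_AVAIL (ta_period text)")
con.executemany("insert into SR_SM_NODE_AVAIL values (?)",
                [("2013-11-01 00:00:00.000",), ("2013-12-15 00:00:00.000",)])

# Step 1: find the earliest raw data point; aggregation must restart there.
(earliest,) = con.execute("select min(ta_period) from SR_SM_NODE_AVAIL").fetchone()

# Step 2: rewind max_ta_period so the aggregate job reprocesses from that date.
con.execute("update aggregate_control set max_ta_period = ? "
            "where source_table like 'SR_SM_NODE_AVAIL'", (earliest,))
con.commit()  # until a commit is issued, the change is not permanent

# Step 3: verify the value before running the aggregate command.
(value,) = con.execute("select max_ta_period from aggregate_control "
                       "where source_table like 'SR_SM_NODE_AVAIL'").fetchone()
print(value)  # 2013-11-01 00:00:00.000
```

After the verification step, you would run the %PMDB_HOME%\bin\aggregate command as shown above.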