Sunday, December 28, 2008

What's New in PerformancePoint SP2?


Planning
Support for Windows Server 2008 Hyper-V™
You can now use PerformancePoint Server 2007 SP2 with Windows Server 2008 Hyper-V.
Hyper-V creates new opportunities for server virtualization. You can use Hyper-V to make more efficient use of system hardware and host operating system resources to reduce the overhead that is associated with virtualization.
For more information, see the
PerformancePoint Server 2007 Hyper-V guide.


Monitoring

  1. You can now use SQL Server 2008 with PerformancePoint Server 2007 SP2.
    Important: To use SQL Server 2008 with PerformancePoint Server, you must install PerformancePoint Server 2007 SP2 before you install SQL Server 2008.
  2. You can now use the Show Details action on PerformancePoint reports that use data that is stored in SQL Server 2008 Analysis Services.
    Show Details enables dashboard consumers to right-click in a cell or on a chart value and see the transaction-level details for that value.
  3. PerformancePoint Server 2007 with SP2 now supports Windows Server 2008 Hyper-V.
    Hyper-V creates new opportunities for server virtualization. You can use Hyper-V to make more efficient use of system hardware and host operating system resources to reduce the overhead associated with virtualization.
    For more information, see the PerformancePoint Server 2007 Hyper-V guide (http://go.microsoft.com/?linkid=9639690).
  4. You can now use Dashboard Designer on a computer that is running .NET Framework 3.5 alongside .NET Framework 2.0.
    You must install .NET Framework 2.0 before you install .NET Framework 3.5.
  5. You can now use PerformancePoint Server with domains that have apostrophes in their names.
    In previous versions of PerformancePoint Server, when a domain name included an apostrophe, the configuration tool failed for both Planning Server and Monitoring Server.
  6. Scorecard key performance indicator (KPI) queries are improved.
  7. Timeout errors no longer occur with scorecard KPIs that use data that is stored in SQL Server 2005 Analysis Services.
  8. Time Intelligence Post Formula filters now display the correct number of days for each month.
    In previous versions of PerformancePoint Server, the calendar control for Time Intelligence Post Formula filters sometimes displayed 31 days for each month. This is no longer the case.
  9. Time Intelligence filters now work on scorecard KPIs that use data that is stored in Analysis Services.
    In previous versions of PerformancePoint Server, some Time Intelligence expressions caused filters that were linked to KPIs to fail. For example, when a compound expression such as (Day-7:Day-1) was used in a Time Intelligence Post Formula filter and that filter was linked to a KPI, an error message occurred. In PerformancePoint Server 2007 SP2, single and compound Time Intelligence expressions work with KPIs that use data that is stored in Analysis Services.

Sunday, November 2, 2008

Download SQL Server 2005 Service Pack 3 - CTP

The CTP version of SQL Server 2005 Service Pack 3 (SP3) is now available. You can use these packages to upgrade any of the following SQL Server 2005 editions:

  • Enterprise
  • Enterprise Evaluation
  • Developer
  • Standard
  • Workgroup

For a summary list of What’s new in SQL Server 2005 SP3 CTP, review the What’s New document.

These packages have been made available for general testing purposes only. Do not deploy the CTP software in production.

Download SQL Server 2005 Service Pack 3

Wednesday, July 30, 2008

SQL Server 2008 SSMS Enhancements - Part I

I have just installed SQL Server 2008 RC0 and have been playing a bit with the new Management Studio. I found a few interesting features and will try to cover some of them in this post.
The new splash screen
SQL Server 2008 has an all-new logo and splash screen.
Activity Monitor
The next feature that attracted me is the new "Activity Monitor".
It opens a window that displays the server activities.


What I found most useful is the section that shows "Recent Expensive Queries". One of the tasks that I do quite often is optimizing "bad" queries, or even "well written" queries that do not perform well enough. This panel of the Activity Monitor seems to be a good tool that readily shows me the queries I need to work on.
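Before SQL Server 2008, I would pull similar information out of the DMVs that shipped with SQL Server 2005. A rough equivalent of the "Recent Expensive Queries" panel, ordered by average CPU time here (you could just as easily order by reads or duration), looks something like this:

```sql
-- Top 10 cached query statements by average CPU time per execution.
SELECT TOP 10
    qs.total_worker_time / qs.execution_count AS avg_cpu_time,
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_cpu_time DESC;
```

Note that this only reflects plans currently in the cache, whereas Activity Monitor refreshes its view on a polling interval.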
Missing Index hints in Graphical Execution Plan
Another change that I noticed instantly is the addition of Missing Index Hints in the query execution plan.
You can right-click the index hint, and a context menu appears that takes you to the details of the missing index.
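The "Missing Index Details" option in that context menu generates a CREATE INDEX template for you. The table, column, and index names below are purely illustrative; the actual statement depends on the query you were analyzing:

```sql
-- Illustrative output of "Missing Index Details"; all names are placeholders.
-- SSMS leaves the index name for you to fill in.
CREATE NONCLUSTERED INDEX [IX_Orders_CustomerID]
ON [dbo].[Orders] ([CustomerID])
INCLUDE ([OrderDate], [TotalDue]);
```

As always with missing-index suggestions, evaluate them against your overall workload before creating the index, since each new index adds write overhead.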

Thursday, July 24, 2008

Upgrade live applications to SQL Server 2005 for high availability

Most companies have the luxury of upgrading their applications in place. But for many industries such as credit card, banking, shopping and gaming companies, the prospect of downtime is unacceptable. Imagine eBay/PayPal taking the system down for even one hour for an upgrade. The costs quickly escalate into the millions in just lost immediate revenue, not to mention the customers that would be lost to competitors.
The challenge
Your boss has told you that the ordering application has to be upgraded to version 10 with less than 15 minutes downtime. The database in question is 4+ TB and takes 16 hours to upgrade without any traffic running to it.
Reality check
The best laid plans often fail because the scope of the project is not set early on and strictly adhered to. You will probably need to focus on a few key application areas that are critical to convert live. In eBay's case, the business might say that, throughout the course of the upgrade, activities such as placing auctions, bids, and payments have to stay live, but that it's acceptable if you can't provide feedback or account merges. Try to avoid converting any data live; instead, keep each functional unit (block of tables) intact and transfer it live.

Hypothetical process
Which process you pick is heavily dependent on your budget. In this hypothetical scenario, we'll use two clusters with two nodes each. But you could easily simulate this scenario using two SQL Server instances on the same physical (or clustered) box, or virtual servers running something like VMware's ESX. Let's examine the figure below.

Using two clusters and two nodes for live application upgrade.
Stage 1
In stage 1, we have the live data flowing into the current version of the database application. We created a second conversion database on Cluster 1 that contains any tables we want to convert over to the upgraded database – while the application is live. The most important part at this point is determining how to maintain consistency in the tables. The most common method is saving off the current date/reference number for all conversion data at the time you want to start converting. Then a recurring job fires off stored procedures that bring over rows with dates/reference numbers greater than the values you saved off. You could use other methods, such as triggers or Service Broker, but both would involve modifications to the original database schema and increase your risk and chance of making a mistake.
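A minimal sketch of such a high-water-mark procedure, which the SQL Agent job would call on a schedule, might look like the following. All database, table, and column names here are hypothetical; the approach assumes an ever-increasing key such as OrderID:

```sql
/* Illustrative high-water-mark transfer; OrderAppDB, ConversionDB, and all
   table/column names are placeholders for your own schema.
   ConversionDB.dbo.TransferState holds the last OrderID already copied. */
CREATE PROCEDURE dbo.usp_TransferNewOrders
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @LastId INT, @MaxId INT;

    SELECT @LastId = LastOrderID FROM ConversionDB.dbo.TransferState;

    -- Capture the current ceiling first, so rows inserted while we
    -- run are picked up safely by the next execution.
    SELECT @MaxId = MAX(OrderID) FROM OrderAppDB.dbo.Orders;

    -- Copy only the rows that arrived since the last run.
    INSERT INTO ConversionDB.dbo.Orders (OrderID, CustomerID, OrderDate, TotalDue)
    SELECT OrderID, CustomerID, OrderDate, TotalDue
    FROM OrderAppDB.dbo.Orders
    WHERE OrderID > @LastId AND OrderID <= @MaxId;

    -- Advance the high-water mark.
    UPDATE ConversionDB.dbo.TransferState
    SET LastOrderID = @MaxId;
END
```

The key design point is capturing @MaxId before the copy: it makes the job safe to rerun and guarantees no row is skipped even while the application keeps writing.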
Stage 2
Stage 2 is a second physical cluster (or second instance/virtual server, depending on budget). As far as the schema goes, the conversion database on Cluster 2 exactly matches the conversion database on Cluster 1.
As the application pushes live data into the Order App DB (V9), the SQL Agent job on Cluster 1 fires off stored procedures that move the data into the conversion database on Cluster 1. Once the data reaches the conversion database, you can set up a replication publication on Cluster 1 for the conversion database that has a subscription running on Cluster 2. This causes any data pushed into the conversion database to transfer almost immediately into the conversion database on Cluster 2.
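You can set up that transactional replication publication through the wizard in Management Studio or, roughly, with the replication stored procedures. The publication, subscriber, and database names below are illustrative, and the sketch assumes a Distributor has already been configured:

```sql
/* Run on Cluster 1; all names are placeholders.
   Assumes distribution is already configured for this instance. */
USE ConversionDB;

-- Enable the conversion database for transactional publishing.
EXEC sp_replicationdboption @dbname = N'ConversionDB',
     @optname = N'publish', @value = N'true';

-- Create the publication and add the conversion table as an article.
EXEC sp_addpublication @publication = N'ConversionPub',
     @status = N'active';
EXEC sp_addarticle @publication = N'ConversionPub',
     @article = N'Orders', @source_object = N'Orders';

-- Push the data to the matching conversion database on Cluster 2.
EXEC sp_addsubscription @publication = N'ConversionPub',
     @subscriber = N'CLUSTER2', @destination_db = N'ConversionDB',
     @subscription_type = N'push';
```

In practice you would add every conversion table as an article and verify latency with the Replication Monitor before relying on it for the cutover.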
As the application is live pushing data into Cluster 1's version 9 database and is replicated over to the conversion database on Cluster 2, you can run the database upgrade process on Cluster 2's Order App DB – which you already restored from production at some point. Regardless of the time the database upgrade takes to run, the live data will be flowing into the conversion database on Cluster 2. Once the database upgrade is complete, you can fire off the SQL Server Agent job on Cluster 2; this fires off stored procedures that convert the version 9 data tables in the conversion database into the upgraded version 10 database sitting on Cluster 2. At this point, you'll have a chance to do any necessary configuration/maintenance to the upgraded version 10 database.
Stage 3
Once the conversion database on Cluster 2 is close to empty and the clients are upgraded to version 10, it's time to simply turn off Cluster 1 and point all the clients to Cluster 2, which already has all the critical live data transferred. A quick method of switching clients to version 10 would be SMS or SoftGrid (application virtualization), or some other method of automatically deploying a new version of a client en masse to a large client base. This would allow you to deploy version 10 applications within 15 minutes, and they'd point to the freshly upgraded version 10 database on Cluster 2.
Summary
Upgrading an application while it is live requires that a lot of custom code be written to ensure the tables are consistent between the clusters (servers or instances). Test the upgrade scenario/scripts out several times, verifying reports at all the various stages to make sure the upgrade flows like clockwork. Verifying the various conversion database processes and jobs and documenting and ensuring consistency is fairly boring work, but it will pay off in the end when the upgrade flows smoothly. It beats working out issues within the user's sight.

Tuesday, July 22, 2008

Configure IBM Cognos 8.3 with IIS 7 and Windows Vista

One of the common problems I have faced is configuring IBM Cognos 8.3 with IIS 7 on Windows Vista. I have summarized the configuration in the following steps, hoping they will be useful.

1. In IIS 7, go to your device name.
2. There, in the center screen, open 'ISAPI and CGI Restrictions'.
3. On the right screen, click Add.
4. The path: %windir%\system32\inetsrv\cgi.dll (if you don't have this file, install it from 'Turn Windows features on or off', where you can find it under the IIS component).
5. Name this something like 'dot cgi'.
6. Allow it to execute -> done.
7. Now, on the right screen, click 'Edit Feature Settings' and allow CGI (I allowed both CGI and ISAPI).
8. Click on your device name again and in the center screen open 'Handler Mappings'.
9. Add Module Mapping.
10. Request path: *.cgi
11. Module: CgiModule
12. Name: dot.cgi
Now you can open the welcome page and start navigating: http://"ServerName"/Cognos8

Warning: do not add cgi as MIME type

Tuesday, July 15, 2008

SQL Server Audit in SQL Server 2008

SQL Server 2008 February CTP introduces a new feature, SQL Server Audit. This feature facilitates auditing of Database Engine events by providing simple T-SQL statements to enable, store, and view audits on server and database objects. Compared to auditing with SQL Server trace files, using SQL Server Audit is much easier and cleaner. SQL Server Audit is implemented in the SQL Server Extended Events infrastructure. Extended events are created by the T-SQL statements to record the occurrences of the auditable actions.


To implement SQL Server audits, you first need to create a SQL Server Audit object. This object collects server and/or database-level actions and groups of actions that you want to monitor. It is created at the SQL Server instance level. You can have multiple SQL Server Audit objects per SQL Server instance. In the SQL Server Audit object, you specify the location for the output of the audit results. The results can be written to a binary file, the Windows Application Log, or the Windows Security Log. The file or the log is the Target component and the consumer of the associated extended events. Then you create Server or Database Audit Specification objects that belong to the SQL Server audit object. A Server Audit Specification collects server-level actions grouped in server audit action groups. It cannot collect individual actions.



However, a Database Audit Specification collects both database-level audit action groups and individual actions.




In Part I of this series, we will focus on the server level events. Server audit action groups record logon and logoff operations, server principal changes, server and database role membership changes, database maintenance operations, server object changes, database object changes, server and database statement and object permission changes. Any database related action groups are collected on all the databases on the server instance.

For example, DATABASE_ROLE_MEMBER_CHANGE_GROUP in a Server Audit Specification collects any actions that add or remove a login from a database role in any database on the server. If you want to audit only one database, you will need to define a Database Audit Specification in that specific database.
As part of the auditing requirements, we would like to monitor server logins and their server-level permission changes. To monitor login additions and deletions, we can include the SERVER_PRINCIPAL_CHANGE_GROUP audit action group in our server audit specification. Server-level permissions can be changed by issuing GRANT, REVOKE, or DENY, or through assigning server role memberships. Therefore, we can include the SERVER_PERMISSION_CHANGE_GROUP and SERVER_ROLE_MEMBER_CHANGE_GROUP audit action groups.
In this script, we write the audit to the Application Log.

/* Create a SQL Server Audit Object that writes the audit results to the Windows Application Log every one second.
If the write fails, the instance continues running without stopping. */


CREATE SERVER AUDIT MyServerAudit
TO APPLICATION_LOG
WITH ( QUEUE_DELAY = 1000, ON_FAILURE = CONTINUE);

/* Create a Server Audit Specification object for the server audit.
This object includes three audit action groups related to server principal changes. */


CREATE SERVER AUDIT SPECIFICATION MyServerAuditSpecification
FOR SERVER AUDIT MyServerAudit
ADD (SERVER_PRINCIPAL_CHANGE_GROUP),
ADD (SERVER_PERMISSION_CHANGE_GROUP),
ADD (SERVER_ROLE_MEMBER_CHANGE_GROUP);

/* By default, both the audit and audit specification are created in the disabled state.
We need to enable them before using them to record actions. */
ALTER SERVER AUDIT SPECIFICATION MyServerAuditSpecification
WITH (STATE = ON);
ALTER SERVER AUDIT MyServerAudit
WITH (STATE = ON);



You can confirm that the server audit object and the server audit specification object have been created by selecting from the system views shown below.

SELECT * FROM sys.server_audits

SELECT * FROM sys.server_audit_specifications
To confirm that an extended event session has been created, we can select from the extended events dynamic management views.

select s.name as [session name], se.event_name as [event name], p.name as [package name], t.target_name as [target name]
from sys.dm_xe_sessions s
join sys.dm_xe_session_events se on s.address = se.event_session_address
join sys.dm_xe_packages p on se.event_package_guid = p.guid
join sys.dm_xe_session_targets t on s.address=t.event_session_address



As we can see above, an extended event session called MyServerAudit$A was created. It includes the audit_event from the SecAudit package. The audit is written to the Application Event Log asynchronously.
Let’s test this script by adding a login called testLogin, granting it the View Server State permission, and adding it to the dbcreator role.

CREATE LOGIN testLogin WITH PASSWORD = 'Welcome123'
GRANT VIEW SERVER STATE TO testLogin
EXEC sp_addsrvrolemember 'testLogin', 'dbcreator'
DROP LOGIN testLogin



We can examine the Application Log by running the Windows PowerShell command below.


Get-EventLog -logname "application" -newest 4 | where {$_.Source -eq 'MSSQLSERVER'} | select EntryType, EventID, Message, TimeGenerated | format-list
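When you are done testing, the audit objects can be disabled and dropped in the reverse order of creation:

```sql
/* An audit specification and its audit must each be set to the OFF state
   before they can be dropped; drop the specification first. */
ALTER SERVER AUDIT SPECIFICATION MyServerAuditSpecification WITH (STATE = OFF);
DROP SERVER AUDIT SPECIFICATION MyServerAuditSpecification;

ALTER SERVER AUDIT MyServerAudit WITH (STATE = OFF);
DROP SERVER AUDIT MyServerAudit;
```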





Tuesday, July 1, 2008

Integrating Telephony Services Into .NET Applications


Software developers have historically had limited access to telecom networks and capabilities. Specialized computer-telephony integration protocols are too complex and inflexible to properly link telephony services into enterprise business applications. As a result, integrating person-to-person calling, conferencing, and call-routing capabilities such as find-me/follow-me into applications has always been the preserve of specialized developers. As organizations continually attempt to increase the efficiency of their business processes through communications, the inability to easily tie communications services into the fabric of daily operational processes has emerged as a serious obstacle to business profitability.

With the advent of next-generation IP communications, Session Initiation Protocol (SIP), and Web services, these barriers to interconnectivity are being deconstructed. IT and communications systems now share a common language. However, developers are still faced with a lack of easy-to-use development tools to take advantage of this new interconnection capability. In particular, developers using Microsoft’s .NET Framework are challenged by the lack of tools to discover and incorporate telephony services into their business applications. SIP Objects.NET from Inova IT fills this void.


You can download the white paper from the URL below:
http://lippisreport.com/2007/09/12/integrating-telephony-services-into-net-applications/