In my company we are starting to look at enterprise level reporting. Unfortunately, not all groups use the HP tools, so we need to be able to extract data/reports from HP ALM in order to surface the data in another "dashboard" product.
We are using ALM 11.52 patch 5. The back end database is MS SQL.
The current solution is to create a read-only database user in every project, and anybody can use that identity to extract data. Unfortunately, with that solution we have no control or traceability to know who is executing the queries. In a couple of cases, people have written inefficient queries, or run a query too frequently, and the load on the database server became too great.
So I'm interested in hearing what other people might be doing.
I'm not a SQL management expert so I don't know what options might exist in that arena for knowing more about who is using the generic credentials.
I was thinking maybe the project databases could be mirrored to another server, and reporting could be done from that secondary server. I don't know what load that might put on the production environment, and I know we would have to consider how frequently the data on the mirror got refreshed.
I also thought about creating an API to execute such queries, taking the generic credentials away from the user community. If people wanted to run a query, they would have to use the API, and I could code the API to collect information about who is submitting each query.
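To make that idea concrete, here is a minimal sketch of such a query proxy in Python. It is purely illustrative: it uses the standard-library sqlite3 module as a stand-in for the MS SQL back end (a real version would connect with something like pyodbc using the shared read-only account), and the `run_reporting_query` function and `query_audit` table are names I invented for the example:

```python
import sqlite3
from datetime import datetime, timezone

def run_reporting_query(conn, requester, sql):
    """Execute a query on the caller's behalf and record who asked for it."""
    conn.execute("""CREATE TABLE IF NOT EXISTS query_audit (
                        requester TEXT, submitted_at TEXT, query_text TEXT)""")
    conn.execute("INSERT INTO query_audit VALUES (?, ?, ?)",
                 (requester, datetime.now(timezone.utc).isoformat(), sql))
    conn.commit()
    return conn.execute(sql).fetchall()

# Demo against an in-memory database standing in for a project schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE defects (id INTEGER, severity TEXT)")
conn.executemany("INSERT INTO defects VALUES (?, ?)",
                 [(1, "critical"), (2, "low"), (3, "critical")])

rows = run_reporting_query(conn, "jdoe",
                           "SELECT severity, COUNT(*) FROM defects GROUP BY severity")
audit = conn.execute("SELECT requester, query_text FROM query_audit").fetchall()
```

The key point is that the caller supplies their own identity, so the audit table records who ran what and when, even though the database connection itself still uses the shared read-only credentials.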
I know we could go back to forcing users to use the ALM APIs, but then they would need user credentials that have access to the projects through the UI, and we don't have a single user with access to all projects.
So, what other suggestions are out there?
(Opinions and information contained in this post are wholly my own and do not reflect the opinions of my employer.)
You are right: using a separate database server for reporting may reduce load on the production database. At the same time, a separate server may not be worth it if the reporting load is small, since the additional server may require hardware, maintenance, and licenses. In any case, there are two options for mirroring project databases to a separate database server:
1. Automated/manual export/import
2. Database replication
I used the first approach for several months, until I proved to the QC admin that our SQL queries were properly optimized and that accessing the production server would not affect system performance. The export/import process was mostly automated and ran weekly, and that was the main source of complaints: for four days of every week the data was stale. After pointing the reports at our production database, we didn't notice any performance degradation.
I used the second approach with another system (not QC) and it worked great. Near-real-time replication adds little overhead to the production system and allows querying the latest data. There may be disadvantages depending on the database vendor and their replication implementation: several times the replication broke silently (it was not MS SQL) and for some time nobody noticed that queries were returning stale data.
We also used shared credentials to access the replica database and hit the same problem: if someone wrote a bad query, or ran one too often, the reporting database became unusable. We mitigated the problem by having several replicated databases, each dedicated to an important infrastructure component such as integration systems, monitoring, automation, and reporting. We also limited query execution time, so a bad query would be aborted automatically after running for a minute or so.
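That kind of execution-time limit can also be enforced at the application layer. Here is a minimal sketch, again using sqlite3's progress handler as a stand-in (the database server itself usually offers similar controls, such as client-side command timeouts); the `run_with_time_limit` function is an invented name:

```python
import sqlite3
import time

def run_with_time_limit(conn, sql, limit_seconds):
    """Abort the statement if it runs longer than limit_seconds."""
    deadline = time.monotonic() + limit_seconds
    # The handler is invoked periodically; a truthy return value aborts the query.
    conn.set_progress_handler(lambda: time.monotonic() > deadline, 10_000)
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.set_progress_handler(None, 0)  # remove the handler afterwards

conn = sqlite3.connect(":memory:")
# A deliberately runaway query: an unbounded recursive CTE.
bad_sql = ("WITH RECURSIVE c(x) AS (SELECT 1 UNION ALL SELECT x + 1 FROM c) "
           "SELECT max(x) FROM c")
try:
    run_with_time_limit(conn, bad_sql, limit_seconds=0.2)
    aborted = False
except sqlite3.OperationalError:  # sqlite reports the abort as "interrupted"
    aborted = True
```

The same connection remains usable after the abort, so one runaway query does not take the whole reporting service down with it.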
Personally, I don’t consider direct access to the database or API to be enterprise-level reporting. Most people cannot use such low-level technology effectively. I prefer a mature reporting engine that exposes data in tabular or graphical formats; it makes the data accessible to everyone, while still leaving power users free to use the raw data or the API for ad hoc reporting.
I use MS SQL Server Reporting Services (SSRS) for reporting. This platform makes it easy to control access to any report, send reports by email using subscriptions, store report history, export reports to PDF or Excel, and much more. Creating an SSRS report reminds me of making charts in Excel, mostly clicking through wizards, though it can sometimes be challenging and may require scripting. Also, by using an external reporting engine (rather than the QC/ALM built-in reporting), it is easier to use hierarchical queries in the MS SQL common-table-expression form (WITH name (columns) AS (…) SELECT …).
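As an illustration of that hierarchical-query shape, here is a small recursive CTE run through Python's sqlite3 (SQLite's WITH RECURSIVE follows the same common-table-expression pattern as the MS SQL syntax above); the `folders` table and its contents are invented for the example, loosely modeled on a test-plan folder tree:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE folders (id INTEGER, parent_id INTEGER, name TEXT)")
conn.executemany("INSERT INTO folders VALUES (?, ?, ?)", [
    (1, None, "Subject"),
    (2, 1, "Release 1"),
    (3, 2, "Smoke"),
    (4, 1, "Release 2"),
])

# Walk the tree from the root, building the full path down each branch.
rows = conn.execute("""
    WITH RECURSIVE tree(id, path) AS (
        SELECT id, name FROM folders WHERE parent_id IS NULL
        UNION ALL
        SELECT f.id, tree.path || '/' || f.name
        FROM folders f JOIN tree ON f.parent_id = tree.id
    )
    SELECT path FROM tree ORDER BY path
""").fetchall()
```

This kind of query is awkward to express through the built-in QC/ALM reporting, but trivial once the reporting engine talks to the database in the database's own dialect.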
Regardless of the technology I follow these rules:
* Reports should be actionable. Identifying actionable reports can be a challenge, but I use this trick: find the gaps/problems in the current process and see whether some of them can be addressed with reporting. Even though a report does not solve a problem by itself, it can provide a way to identify and track problems, notify us of an event or status update, summarize/aggregate large amounts of data, and show trends.
* Each report should be designed to solve one major problem. For example, if we need to track both requirements coverage and the resolution time of critical defects, we most likely need two different reports. Dashboards are a kind of exception to this rule, but not quite: most dashboards solve a different type of problem, the general visibility problem.
* Reports should not be too long. Users may be overwhelmed by a 300-page test report that lists every detail from the test plan. Such long reports may still be needed for audit or archiving purposes, though.
Sorry I don’t have any other suggestions, but I thought my experience on the topic might help.