Posts in category "Siebel"

fun with the Siebel data model

The Siebel 7.8 data model supports different industry applications (finance and banking, pharmaceutical, sales, telecommunications, energy, media, motor, public sector, retail) and contains 3,920 tables and even more indexes.

The results are just in from a Friday afternoon poll to find the most popular Siebel table:

  • S_ACCELERATOR - invaluable in performance tuning. 'Oh well - there's nothing for it. Now we are simply going to have to load S_ACCELERATOR'.
  • S_CLUSTER_TNT - images of cluster bombs and high explosives but supposedly related to 'Hospitality'.
  • S_AUDIT_ITEM - typically contains 155 million records of historic audit data dating back to 2003. No one knows why the data is being collected, let alone who is using it.
  • S_DOCK_INITM_43 - Crazy name, crazy table. Docking table for Remote.
  • S_CL_PTCL_LSXM - A real tongue twister related to clinical protocols.

And finally...

  • S_DISEASE - the clear winner.

30 second guide to tuning Siebel

Bottom up

Identify resource-intensive SQL statements using Statspack (or ADDM, custom scripts, Spotlight). Siebel is a black box that pumps out lengthy, complicated SQL statements with lots of (outer) joins. Level 7 snapshots are useful as 'sprepsql' may be used to reveal the full query text, which is often tantalisingly truncated in the summary reports. In addition, the associated query plan for any statement can be retrospectively dumped.

The DBA can determine which queries are executed and how many times. There is little point tuning a query that runs only monthly, whereas shaving fractions of a second from a fundamental query executed thousands of times may prove far more worthwhile.
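The point about execution counts can be sketched with a few made-up figures: ranking statements by *total* elapsed time (executions multiplied by time per execution) shows how a fast query run thousands of times can dwarf a slow monthly report. The statements and timings below are illustrative only, not real Statspack output.

```python
# (executions, elapsed seconds per execution) -- made-up figures for illustration
workload = {
    "SELECT ... FROM S_CONTACT ...": (250_000, 0.04),   # core screen query
    "SELECT ... FROM S_AUDIT_ITEM ...": (1, 1_800.0),   # monthly audit report
    "SELECT ... FROM S_OPTY ...": (40_000, 0.12),
}

def rank_by_total_elapsed(stats):
    """Return statements ordered by executions * elapsed-per-execution, descending."""
    return sorted(stats.items(),
                  key=lambda kv: kv[1][0] * kv[1][1],
                  reverse=True)

for sql, (execs, per_exec) in rank_by_total_elapsed(workload):
    print(f"{execs * per_exec:>10.1f}s total  {sql}")
```

Here the 0.04-second query tops the list at 10,000 total seconds, comfortably ahead of the 30-minute monthly report.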

The disadvantages of this approach are that neither the bind variables nor the username of the session that issued the query are available, which may be important. Secondly, it is not obvious which Siebel screen is associated with the problematic SQL statement(s).

Middle out

Increase Object Manager logging to dump the SQL statements together with timing information for prepare, fetch and execute. Search the log files for any SQL taking more than one second. The advantage here is that the username for the session, the sequence of screens visited and all bind variables are included, so it is easier to reproduce the problem in SQL*Plus.
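Searching the logs for slow statements is easy to script. The sketch below assumes a simplified, hypothetical timing-line format; real Object Manager log layouts vary by Siebel version and event level, so the regular expression would need adapting to your actual files.

```python
import re

# Hypothetical, simplified log lines -- NOT the actual Siebel log format.
sample_log = """\
SQLPROFILING  Statement 14: Prepare: 0.002s Execute: 0.015s Fetch: 0.120s
SQLPROFILING  Statement 15: Prepare: 0.001s Execute: 2.340s Fetch: 0.450s
SQLPROFILING  Statement 16: Prepare: 0.003s Execute: 0.010s Fetch: 1.205s
"""

TIMING = re.compile(
    r"Statement (\d+): Prepare: ([\d.]+)s Execute: ([\d.]+)s Fetch: ([\d.]+)s")

def slow_statements(log_text, threshold=1.0):
    """Yield (statement id, total seconds) where prepare+execute+fetch exceeds threshold."""
    for m in TIMING.finditer(log_text):
        total = sum(float(x) for x in m.group(2, 3, 4))
        if total > threshold:
            yield m.group(1), round(total, 3)

for stmt, secs in slow_statements(sample_log):
    print(f"statement {stmt}: {secs}s")
```

With the one-second threshold, statements 15 and 16 are flagged while statement 14 (0.137s in total) is ignored.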

Patterns of usage and resource intensive queries should start to appear. You may even identify groups of users (by job role or geographic location) who have different types of problems.

Top down

A little radical for most DBAs (developers, integrators and highly paid consultants, for that matter). Seek out the users, sit down and talk with them.

Watch how they actually use the application for 30 minutes. This is often most enlightening: you witness first hand how the system is really used, and sometimes it turns out that users are not using the application efficiently (for various reasons).

You can see the typical business scenarios, the frequently used views (tables), the type of searches users specify, what they like, what they hate, what is fast, what is slow. In fact, your visit may well end up lasting a lot more than 30 minutes.

Or you may need a combination of all three.