Posts tagged with "oracle"

Oracle SQLcl configuration

I use SQLcl a lot and install it on every environment I work on. It's fully compatible with SQL*Plus and has useful extensions for interacting with OCI, Autonomous Databases and Data Pump.

My SQLcl configuration file is named 'login.sql' and located in the '~/work' directory. I also keep my 'tnsnames.ora' file here.

The location of these two Oracle configuration files is configured in '~/.bashrc'.

# Oracle TNS location
export TNS_ADMIN=$HOME/work

# SQLcl login file
export SQLPATH=$HOME/work
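Since TNS_ADMIN points at '~/work', the 'tnsnames.ora' there holds the connection aliases. For reference, a minimal entry might look like this (the alias, host and service name below are placeholders, not my real configuration):

```sql
# hypothetical alias; replace host and service name with your own
MYDB =
  (DESCRIPTION =
    (ADDRESS = (PROTOCOL = TCP)(HOST = dbhost.example.com)(PORT = 1521))
    (CONNECT_DATA = (SERVICE_NAME = mydb.example.com))
  )
```

With that in place, 'sql billy@MYDB' resolves the alias without any further configuration.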

This is my SQLcl configuration file.

set editor emacs
set statusbar on
set statusbar add timing
set sqlformat ansiconsole
set highlighting on
set highlighting keyword foreground green
set highlighting identifier foreground magenta
set highlighting string foreground yellow
set highlighting number foreground cyan
set highlighting comment background white
set highlighting comment foreground black

Sample output

SQL> select count(*) from dba_objects;

COUNT(*)
___________
358710

emacs ¦ 1:0 ¦ BILLY ¦ EDA_DEMO ¦ 00:00:00.953

Watch Your User

Connor McDonald posts an excellent series of articles about tuning a database application.

This analysis from an end-user perspective reminded me of my own experiences as a technical consultant helping customers run a large CRM application, typically in call centres scattered across Europe.

I was often summoned onsite and told to solve the problem that 'The application is slow'. Usually, different people were eager to give me their view on the issue:-

  • Oracle DBAs would often be poring over AWR reports or a monitoring tool, examining wait events in minute detail. 'We can see multiple ITL waits over 700 ms. This means we need to increase the FREELISTS for the ORDERS table but the business won't let us have an outage'.

  • The application developers would also proffer their own diagnosis - 'Oh yes, we already know what causes that. It's a custom workflow written by the previous integrator. It needs refactoring but it will take 3 months'.

  • The CEO brusquely told me - 'This CRM application isn't fit for purpose. If this isn't resolved by Thursday, we're going to evaluate SAP and Safra will be hearing about this'.

Now, this is all detailed technical analysis and background providing useful information to be considered, but I would often ask to see the problem at first hand by talking to an individual who used the application all day, every day, to understand it from his point of view.

This simple request was often met with puzzlement and resistance by the technical team - 'Why do you want to watch a user? We've already told you what the problem is. This will just waste time'.

Sometimes, this resistance was born out of a concern that the user feedback would unearth different, unrelated functional issues and distract me from the performance problems under investigation. Alternatively, a floor supervisor would air the valid concern that my conversation with an agent would distract him from dealing with the customer call. This was easily overcome by letting the agent handle the call with me simply watching and taking notes. Then, after the call was finished, we'd have our chat.

On one occasion, this approach of listening to the users proved particularly beneficial. The client was a utility company but could just as easily have been a bank or a telco. The business scenario in the busy call centre was typical. Essentially, the customer calls in with a query or complaint which is resolved by the agent.

Some call centres use CTI technology, where the application looks up the customer from the inbound telephone number and presents the customer details to the agent on screen so he can start the dialogue, typically with security checks.

However, this call centre didn't use CTI, so the agent had to search for the customer manually before the call could proceed.

I watched the agent process an entire customer enquiry from start to finish and took notes.

The call started and, after the initial exchange, the agent asked for the customer's surname and started a search. In this example, the customer was Mr. Johnson. I watched with interest as the agent typed 'J' into the customer tab and hit 'Search'. This operation took a long time. There are 66 million people in the UK and 38 million of them appear in this client's database. Searching a table for every customer whose surname starts with 'J' is expensive, performance-wise.

The agent didn't seem fazed or perturbed or even irritated as the hourglass popped up. He merely continued to clear security with the customer. By the time this exchange was complete, the search had finally returned.

My eyes widened as the agent then proceeded to sort all these thousands of customers by surname and scrolled down page by page, searching for 'Johnson'. Again, sorting a large data set like this is sub-optimal, performance-wise. This is an online application where users expect each button click to return within 3 seconds - not 3 minutes. The solution isn't for the DBA to increase the PGA to allow larger temporary segments to accommodate the massive sort operation. The solution is not to issue a request to sort thousands of records in the first place.

It would have been marvellous if the agent had uttered the immortal words 'Sorry, Mr. Johnson but the system is really slow today'. Unfortunately, he didn't but you can certainly envisage similar scenarios where this excuse is proffered.

When the agent finally identified 'Mr. David Johnson' of '23 New Street, Canterbury, CT2 6AD', the rest of the customer call went pretty quickly. It was either taking a payment, changing a tariff, lodging a complaint, handling a billing enquiry or recording a change of personal details, tasks common to most agents working on that floor.

After the call ended, I asked the agent why he used that sequence of searches and scrolling to identify that specific customer. The answer, inevitably, was 'We always do it that way and when I joined, that's what Barry showed me...'

Then we revisited the call using a different technique. This time, I recommended he search for the complete surname (he has that available as soon as the customer starts talking). When he searched for 'Johnson', the query ran much quicker, but there are still probably thousands of people called 'Johnson' in the UK.

Instead of sorting and endlessly scrolling to locate the customer in question, I suggested he simply enter the postcode into the 'Address' section. The postcode was known by then, as the customer had given it during the security questions. He could have used the customer number, but that's a long 12-digit number with scope for error when entering it.

[ Ironically, one of his reasons for typing in 'J' instead of 'Johnson' was that 'Hey - I'm pretty lazy and that's a lot of typing' which resonated with me as that's normally my attitude. ]

The agent just needed to type the first element of the postcode ('CT2') to refine the search further, and now we had the customer details on the screen in a fraction of the time it used to take him.
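In SQL terms, the difference between the two approaches looks roughly like this. The table and column names are invented for illustration; the real application generated its own queries behind the scenes.

```sql
-- What the agent's original search effectively asked for:
-- every surname starting with 'J', then sort the lot.
SELECT customer_id, surname, first_name, postcode
FROM   customers
WHERE  surname LIKE 'J%'
ORDER  BY surname;          -- millions of rows fetched and sorted

-- The refined search: the full surname plus the postcode prefix
-- narrows the result to a handful of rows, with no big sort needed.
SELECT customer_id, surname, first_name, postcode
FROM   customers
WHERE  surname = 'JOHNSON'
AND    postcode LIKE 'CT2%';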

I thanked him for his time and told him it had been a very interesting exercise for me to see the application actually in use. He reciprocated and thanked me. As he went to put his headset back on, he smiled and said:-

'You're not going to tell my Supervisor about what we've just done, are you?'

'Well, yes I am. Trying to solve these performance issues is why I've been asked to come in. Why do you say that?'

'If Barry gets to hear about this, our call targets will probably be doubled!'

He smiled and nodded at the electronic rolling ticker display detailing how many calls had been handled, how many were waiting, the average call duration and so on.

Am I an Oracle luddite?

Jeff Moss's article about the commercial and free versions of Toad, and the incredibly tenacious, persistent breed of salesperson bred by Quest Software, got me thinking about the Oracle DBA tools I use.

  • People
  • SQL*Plus
  • Statspack
  • putty

People are important because people have developed the application, people are using the application, people are managing the servers, people are managing the database and intelligent people have configured that very expensive storage array.

These people know a lot about the application, the history of the project, the successes and failures, the lessons learned, the architecture, the infrastructure whereas I may know, quite literally, nothing about the same subjects.

I have seen people using Toad (and similar GUI based Oracle development tools) very effectively, multi-tasking, flipping between windows at breakneck speed. Sometimes it makes me quite tired just watching them.

However, I prefer SQL*Plus to do most of the work because:-

  • SQL*Plus provides a complete report of my session, with timings, query plans, statistics. No need to frantically try to write it all down.
  • The SQL script can be incrementally developed and is repeatable. It is quite easy in Toad to execute an arbitrary set of mouse-clicks so it is not clear which of the six 'alter session' statements were actually in effect when you finally got the optimal results.
  • SQL*Plus is the 'vi' of the Oracle world. It is the one Oracle tool you are guaranteed to have access to, everywhere. The sys admin may not allow you to install commercial software without a valid license or may use another tool you have never seen before.
  • SQL*Plus can use bind variables, provide the query plan (with or without retrieving the 6 million records) and provide the vital statistics (consistent gets) just the same as the GUI tools. Almost everything you need in fact.
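A typical SQL*Plus session along those lines might look like this. The table and bind variable names are illustrative; the point is that the whole run is captured in a repeatable script.

```sql
SET timing ON
-- plan and statistics (consistent gets) without fetching the 6 million rows
SET autotrace traceonly explain statistics

-- bind variables, just like the application would use
VARIABLE order_id NUMBER
EXEC :order_id := 12345

SELECT *
FROM   orders
WHERE  order_id = :order_id;
```

Spool the output and you have the complete record of the session, timings and all, with nothing to write down.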

Statspack is important because it is an Oracle package. Oracle will maintain and develop statspack for the latest features available in 10g. Statspack produces reports in a standard format which can easily be analysed by others (colleagues, DBAs, even Oracle). Statspack can also be configured to run automatically at regular intervals. If there is always a problem with the overnight ETL at 03.30, I would rather statspack gets the overtime and gathers the performance metrics than me sitting there in the middle of the night.
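Scheduling those regular snapshots is what the supplied spauto.sql script does; in essence it submits a dbms_job like this (hourly, on the hour - adjust the interval to suit):

```sql
-- run as PERFSTAT; this is roughly what $ORACLE_HOME/rdbms/admin/spauto.sql submits
VARIABLE jobno NUMBER
BEGIN
  dbms_job.submit(:jobno,
                  'statspack.snap;',
                  trunc(sysdate + 1/24, 'HH'),
                  'trunc(SYSDATE + 1/24, ''HH'')');
  COMMIT;
END;
/
```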

Statspack tells the truth, the whole truth and nothing but the truth. It is incredibly tempting to use an Oracle monitoring tool like OEM or Spotlight, obsessively watching the screen and drilling into what appears to be an expensive query, and your salvation. The problem is that you don't know that the query in question is only spawned once a quarter for the accountant's financial report and runs in batch. The query could take 37 minutes and no-one would care. It is simply not important. It is not what the users are complaining about.

Worse, because your refresh interval is 5 seconds, you are blissfully unaware that you have missed the crucial SQL statement with literal SQL that takes 1.7 seconds but is executed hundreds of times per second. Statspack doesn't miss it.

Setting the default statspack level to level 7 means that statspack can (retrospectively) produce the query plan for any problematic SQL statement identified in the summary report. This is handy when you have long-winded SQL statements where the summary report has tantalisingly truncated the SQL text just as the WHERE clause starts.
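Changing the default level is a one-liner, run as the PERFSTAT owner:

```sql
-- make level 7 the default for all future snapshots
EXEC statspack.modify_statspack_parameter(i_snap_level => 7);

-- or take a one-off level 7 snapshot without changing the default
EXEC statspack.snap(i_snap_level => 7);
```

The supplied sprepsql.sql report then pulls out the full SQL text and plan for a single statement identified in the summary.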

Putty is important so you can run O/S utilities (prstat, glance, topas, top, iostat, vmstat) to monitor the actual database server during the investigations. If the server is a development server hosting multiple Oracle instances with 2 CPUs running at 100% and saturated disks, then the performance of the application will be impacted, no matter what wizardry you weave.