Feed aggregator

Question about sequence with lower nextval than column

Tom Kyte - Tue, 2021-01-19 20:46
Hi, I have a sequence named s1 that starts with 1, increments by 1, and ends with 999. I am inserting the sequence values into a table named 'e', which contains an eno (pk) column: insert into e values(s1.nextval); I have inserted 9 rows, so the sequence's current value is 9, but 10 has already been inserted in the backend. When I try insert into e values(s1.nextval); I get a pk violation error. In this case I want to insert 11 in place of 10.
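One possible way past the duplicate key is simply to step the sequence over the value that was inserted in the backend (a sketch only, assuming the table and sequence are exactly as described and 10 is the only value inserted outside the sequence):
<code>
select max(eno) from e;              -- suppose this returns 10, while s1 last returned 9
select s1.nextval from dual;         -- consumes 10, the value already present in the table
insert into e values (s1.nextval);   -- now inserts 11
</code>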
Categories: DBA Blogs

Choosing a view based on result of view

Tom Kyte - Tue, 2021-01-19 20:46
Hi all, I am struggling with a view. The outcome of the view should come from one of 3 different views with the same fields. Can someone point me in the right direction, please? :) Very basic, this is my problem: <code> /* MAINVIEW */ SELECT * FROM (SELECT * FROM VIEW1) --returns n rows /* IF VIEW1 IS EMPTY THEN SELECT ALL FROM VIEW2 */ (SELECT * FROM VIEW2) -- returns n rows /* IF VIEW2 IS EMPTY THEN SELECT VIEW3 (=HAS ALWAYS DATA) */ (SELECT * FROM VIEW3) -- returns 1 row </code> I don't need full code, just a hint in the right direction ;). Thanks in advance.
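One common pattern for this kind of fall-through is a UNION ALL with NOT EXISTS guards, so each branch only contributes rows when the earlier views are empty (a sketch only, using the view names from the question):
<code>
select * from view1
union all
select * from view2
 where not exists (select 1 from view1)
union all
select * from view3
 where not exists (select 1 from view1)
   and not exists (select 1 from view2);
</code>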
Categories: DBA Blogs

The JSON query results does not return the full path

Tom Kyte - Tue, 2021-01-19 20:46
Hi, I'm trying to evaluate JSON queries. e.g. I have the following JSON string in the table <code>{ "WO": [ { "id": "1", "name": "WO1", "action": "add", "items": [ { "id": "1", "author": "Tom Ask" }, { "id": "2", "author": "Lee Frik" } ] }, { "id": "3", "name": "WO1", "action": "add", "items": [ { "id": "1", "author": "John A" }, { "id": "2", "author": "Jean Pontus" } ] } ] }</code> <code> select d.wo_data.WO.items.id from workorder_json d;</code> I get the following results: <code>["Tom Ask","Lee Frik","John A","Jean Pontus"] </code> What I hope for is a result with the full path, like the following: <code>{ "WO": [ "items": [ { "author": "Tom Ask" }, { "author": "Lee Frik" } ], "items": [ { "author": "John A" }, { "author": "Jean Pontus" } ] ] } </code> Is there any way to return it like the above?
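Simple dot notation flattens the matches into a single array. One thing to experiment with is json_query with a wrapper clause, which at least keeps each items array intact, although it still won't reproduce the exact nested shape shown above (a sketch only, assuming the JSON column is wo_data as in the query):
<code>
select json_query(d.wo_data, '$.WO[*].items' with wrapper) as items_arrays
from   workorder_json d;
</code>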
Categories: DBA Blogs

Partner Webcast – Oracle Cloud Observability & Management Platform Logging Services

The Oracle Cloud Observability and Management Platform brings together a comprehensive set of management, diagnostic, and analytics services that help customers eliminate the complexity, risk, and...

We share our skills to maximize your revenue!
Categories: DBA Blogs

Announcement: Both Of My Oracle Webinars Scheduled For February 2021 !!

Richard Foote - Mon, 2021-01-18 19:56
I’m pleased to announce that both of my highly acclaimed webinars scheduled for February are now confirmed. These webinars are a must for any Oracle DBA or Developer interested in designing, developing or maintaining high performance, highly scalable Oracle-based applications or databases. However only a few places are currently available on each webinar with numbers […]
Categories: DBA Blogs

Supplemental Defect

Jonathan Lewis - Mon, 2021-01-18 11:22

Here’s an anomaly that appeared in a question on the ODC recently about tweaking the output of dbms_metadata.get_ddl(). As the title suggests, though, this posting isn’t about the dbms_metadata package; it’s about supplemental logging and something that should not be possible and may have unfortunate side effects.

We start with a little script that creates a table, inserts some data, adds a couple of constraints, and then introduces some supplemental logging requirements. As the script header indicates I’ve only tested this on 19.3.0.0:

rem
rem     Script:         supplemental_defect.sql
rem     Author:         Jonathan Lewis
rem     Dated:          Jan 2021
rem
rem     Last tested 
rem             19.3.0.0
rem

create table t1 (
        n1      number,
        n2      number,
        n3      number,
        n4      number
);

insert into t1 (n1, n2, n3, n4) values (1,2,3,4);
commit;

alter table t1 add constraint t1_pk primary key(n1, n2)
/

alter table t1 add constraint t1_uk unique(n3) using index (
        create index t1_uk on t1(n3, n4)
)
/

alter table t1 add supplemental log data (primary key, unique) columns
/

alter table t1 add supplemental log group t1_g1 (n1, n2) always
/

alter table t1 add supplemental log group t1_g2 (n1, n2) always
/

There’s nothing particularly special or complex about this code, and every statement runs successfully. You might notice that I’ve created two identical supplemental log groups, of course, and that’s clearly a mistake and I’d say that Oracle should have raised an error when I tried to create group t1_g2.

In fact my script is slightly worse than you might think at first sight because (n1, n2) is the primary key of the table, and I’ve added supplemental logging on the primary key already, so even supplemental log group t1_g1 really ought to raise an error as well.

Side note: you’ll notice that I’ve got a single-column unique constraint protected by a non-unique multi-column index, and part of my supplemental log data specifies unique columns. If you check the syntax, this option can be expressed as “unique” or “unique indexes” – which leaves me wondering whether Oracle would treat this as a command relating to n3 and n4, or just to n3.

So far all I’ve done is create a table in a way that’s left me with a couple of questions – let’s see what we get when we use dbms_metadata.get_ddl() to generate code to recreate the table:

begin
        dbms_metadata.set_transform_param(dbms_metadata.session_transform,'PRETTY',true);
        dbms_metadata.set_transform_param(dbms_metadata.session_transform,'SQLTERMINATOR',true);
        dbms_metadata.set_transform_param(dbms_metadata.session_transform,'SEGMENT_ATTRIBUTES',false);
        dbms_metadata.set_transform_param(dbms_metadata.session_transform,'STORAGE', false);
        dbms_metadata.set_transform_param(dbms_metadata.session_transform,'TABLESPACE',false);
        dbms_metadata.set_transform_param(dbms_metadata.session_transform,'CONSTRAINTS',true);
end;
/

set long 20000
set heading off

select dbms_metadata.get_ddl('TABLE','T1') from dual;

================================================================

  CREATE TABLE "TEST_USER"."T1"
   (	"N1" NUMBER,
	"N2" NUMBER,
	"N3" NUMBER,
	"N4" NUMBER,
	 CONSTRAINT "T1_PK" PRIMARY KEY ("N1", "N2")
  USING INDEX  ENABLE,
	 CONSTRAINT "T1_UK" UNIQUE ("N3")
  USING INDEX  ENABLE,
	 SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS,
	 SUPPLEMENTAL LOG DATA (UNIQUE INDEX) COLUMNS,
	 SUPPLEMENTAL LOG GROUP "T1_G1" ("N1", "N2") ALWAYS,
	 SUPPLEMENTAL LOG GROUP "T1_G2" ("N1", "N2") ALWAYS
   ) ;

The “create table” statement includes 4 supplemental log clauses – notice how my original “two-part” clause for the primary key and unique constraints has been split into two – with the keyword index being added to the latter.

You might note that if you try to execute this statement to recreate the table you’re going to run into an immediate problem – Oracle will create a unique single-column index to protect the unique constraint – it doesn’t know that the constraint should be protected by a two-column non-unique index.

But there’s another problem that will protect you from the indexing issue. Here’s what I get when I try to execute the statement:

	 SUPPLEMENTAL LOG GROUP "T1_G1" ("N1", "N2") ALWAYS,
	                                *
ERROR at line 12:
ORA-02261: such unique or primary key already exists in the table

The funniest thing about this error comes when you look at the error message file ($ORACLE_HOME/rdbms/mesg/oraus.msg in my case), where we can find the “Cause” and “Action” that Oracle supplies for the error:

02261, 00000, "such unique or primary key already exists in the table"
// *Cause: Self-evident.
// *Action: Remove the extra key.

It’s not too difficult to guess why the error has appeared but “Self-evident” seems a little optimistic. In fact your first guess about the error may be wrong. Obviously I could simply delete the lines that create the t1_g1 and t1_g2 logs since they appear to be redundant copies of the (primary key) supplemental log – but what if I just delete the line that creates the (primary key) supplemental log? Oracle still raises the ORA-02261 error. It’s only when I delete the declaration of the primary key (and I can leave the declaration of the (primary key) supplemental log in place) that the error stops appearing and I recreate the table … even if I’ve left the declarations of the two supplemental log groups on (n1, n2) in place.
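For illustration, this is roughly what the edited statement looks like once the primary key declaration has been removed – a sketch based on the generated DDL above, keeping the (primary key) supplemental log clause and both log groups in place:

  CREATE TABLE "TEST_USER"."T1"
   (    "N1" NUMBER,
        "N2" NUMBER,
        "N3" NUMBER,
        "N4" NUMBER,
         CONSTRAINT "T1_UK" UNIQUE ("N3")
  USING INDEX  ENABLE,
         SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS,
         SUPPLEMENTAL LOG DATA (UNIQUE INDEX) COLUMNS,
         SUPPLEMENTAL LOG GROUP "T1_G1" ("N1", "N2") ALWAYS,
         SUPPLEMENTAL LOG GROUP "T1_G2" ("N1", "N2") ALWAYS
   ) ;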

On the plus side – although the export/import utilities expdp and impdp make use of the dbms_metadata package, they don’t run into the same problem. After I’d exported, dropped and imported the t1 table with the sqlfile=xxx.sql option, this is how impdp reported the code it had run to recreate the table (edited to remove the space management bits):

CREATE TABLE "TEST_USER"."T1" 
   (    "N1" NUMBER, 
        "N2" NUMBER, 
        "N3" NUMBER, 
        "N4" NUMBER
   ) 
;

ALTER TABLE "TEST_USER"."T1" ADD CONSTRAINT "T1_PK" PRIMARY KEY ("N1", "N2")
;
ALTER TABLE "TEST_USER"."T1" ADD CONSTRAINT "T1_UK" UNIQUE ("N3")
;
ALTER TABLE "TEST_USER"."T1" ADD SUPPLEMENTAL LOG DATA (UNIQUE INDEX) COLUMNS;
ALTER TABLE "TEST_USER"."T1" ADD SUPPLEMENTAL LOG DATA (PRIMARY KEY) COLUMNS;
ALTER TABLE "TEST_USER"."T1" ADD SUPPLEMENTAL LOG GROUP "T1_G1" ("N1", "N2") ALWAYS;
ALTER TABLE "TEST_USER"."T1" ADD SUPPLEMENTAL LOG GROUP "T1_G2" ("N1", "N2") ALWAYS;

The underlying code for expdp/impdp generates constraints separately from the table and then adds the constraints to the tables – and it views supplemental logs as a variant on constraints (you can stop the supplemental log clauses appearing by using the dbms_metadata.set_transform_param() procedure to disable the generation of constraints) and adds them later.
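For example, re-running the extraction with the constraints transform disabled should suppress both the constraint clauses and the supplemental log clauses – a minimal sketch, reusing the session transform from the earlier block:

begin
        dbms_metadata.set_transform_param(dbms_metadata.session_transform,'CONSTRAINTS',false);
end;
/

select dbms_metadata.get_ddl('TABLE','T1') from dual;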

You might notice in passing that the index created by impdp to support the unique constraint is NOT the index originally specified. This is a very old problem – though the nature of the problem has varied with time – so watch out if you move data from a production system to a test system using export/import.

Summary

The internal code to handle supplemental logging allows you to create redundant supplemental log groups that will cause errors if you use dbms_metadata.get_ddl() in the most straightforward way to generate code to recreate the table.

The error message that appears if you haven’t allowed for the anomaly is a little counter-intuitive.

audit once per session (unified auditing)

Tom Kyte - Mon, 2021-01-18 08:26
Hi, is it possible to audit (unified auditing) access to tables only once per session? I don't need to catch all selects from audited tables in a session, I just want to know if a table was queried at least once. Currently it generates a huge amount of audit data, of which only a fraction is needed. Thanks, Julius
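For context, a minimal unified-audit policy of the kind described might look like the following (the policy, schema and table names are hypothetical; as written it records every SELECT rather than one row per session):
<code>
create audit policy select_t_pol
  actions select on app_owner.t;

audit policy select_t_pol;
</code>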
Categories: DBA Blogs

How to allow in-progress client work survive database corruptions that don't require a database restart.

Tom Kyte - Mon, 2021-01-18 08:26
Good Morning, It seems like no time is spent during development efforts to ensure that an application is able to survive database corruptions that don't require a database restart, like, say, a datafile corruption or even a block corruption. If a corruption of a datafile occurs, for instance, the DBA can restore and recover the datafile, but I don't think the clients or batch processes using the application that access the corrupt datafile will survive the operation and/or the 15-60 minutes it may take to recover. I would assume that the application developers would need to do additional work on their end to even have a chance. Is there any Oracle documentation or a book that you can point me to that an organization can use to make their applications bulletproof against database corruptions that don't require shutting down the database? I would hate to have to reinvent this wheel by figuring out all possible Oracle error codes, timeout settings, etc., since this is something that should be done by lots of application development efforts, but my guess is that 98%+ of the time, nothing is done to try to have in-progress client work survive database corruptions that don't require a database restart. Thank you
Categories: DBA Blogs

Advanced Queueing

Tom Kyte - Mon, 2021-01-18 08:26
Environment: Oracle 18XE 64-bit for Windows. I have a question about dequeueing an array of messages from a persistent queue. It's simple point-to-point messaging. The queue is 'single_consumer', without propagation. I registered my PL/SQL callback function. I need to know the exact size of the array of messages to dequeue in every call of my callback function from the Oracle AQ internal job. And I found only one legal way to do it: register the callback with the <b>qosflags</b> parameter of <b>sys.aq$reg_info</b> equal to <i><u>dbms_aq.NTFN_QOS_PAYLOAD</u></i>. Here is the registration PL/SQL block: <code>declare v_qosflags number := dbms_aq.NTFN_QOS_PAYLOAD; r_info SYS.AQ$_REG_INFO; begin r_info := SYS.AQ$_REG_INFO( 'STERN.FOUNDERS_QUEUE', DBMS_AQ.NAMESPACE_AQ, 'plsql://stern.dosomecalc', HEXTORAW('FF') ); r_info.qosflags := v_qosflags; r_info.ntfn_grouping_class := dbms_aq.NTFN_GROUPING_CLASS_TIME ; r_info.ntfn_grouping_value := 60; r_info.ntfn_grouping_type := dbms_aq.NTFN_GROUPING_TYPE_SUMMARY ; DBMS_AQ.REGISTER ( SYS.AQ$_REG_INFO_LIST( r_info ), 1 ); end;</code> Here is the declaration of the callback procedure. It is a standard declaration: <code>create or replace procedure dosomecalc (context RAW ,reginfo SYS.AQ$_REG_INFO ,descr SYS.AQ$_DESCRIPTOR ,payload raw ,payloadl NUMBER)</code> Now, thanks to the <b>qosflags</b> parameter initialized with <i><u>dbms_aq.NTFN_QOS_PAYLOAD</u></i>, my callback function is registered in such a way that I can always see the real size of the set of messages to dequeue in the callback session. It can be evaluated by counting the size of the <b>descr.msgid_array</b> part of the descr parameter. Without setting <b>qosflags</b> to some value during registration, this part of the descr parameter always comes to the callback procedure call empty. Once I know the real size of the message array, I can use it in <code>Dbms_aq.dequeue_array(..., array_size => descr.msgid_array.count, ...) /*dequeuing call*/</code> inside my callback function. Then, after analyzing the contents of the descr parameter, I found an <b>ntfnsRecdInGrp</b> element in it, and decided that <b>ntfnsRecdInGrp</b> is always equal to <b>descr.msgid_array.count</b>, and is just there for the programmer's convenience, a duplicate of <b>descr.msgid_array.count</b>. The AQ documentation says: <code>msgid_array - Group notification message ID list ntfnsRecdInGrp - Notifications received in group</code> That was why I decided that they are equal by value. It was my mistake. When I use the callback with an array size equal to <b>descr.msgid_array.count</b>, everything is OK. With <b>ntfnsRecdInGrp</b> it is not. Sometimes <b>descr.msgid_array.count</b> and <b>ntfnsRecdInGrp</b> are equal to each other, sometimes not. Now the question is: What is the meaning of the <b>ntfnsRecdInGrp</b> part of the descr parameter? Why is it not the same as <b>msgid_array.count</b>?
Categories: DBA Blogs

[A NEW VERSION OF DP-200 & DP-201] Exam DP-203: Data Engineering on Microsoft Azure (beta)

Online Apps DBA - Mon, 2021-01-18 02:29

Exams DP-200 and DP-201 will be replaced by exam DP-203 on February 23, 2021. You may still be able to earn this certification by passing DP-200 and DP-201 until they retire on June 30, 2021. Azure Data Engineers are responsible for integrating, transforming, and consolidating data from distinct structured and unstructured data […]

The post [A NEW VERSION OF DP-200 & DP-201] Exam DP-203: Data Engineering on Microsoft Azure (beta) appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Connect to Oracle Cloud DB from Python

Andrejus Baranovski - Sat, 2021-01-16 09:19
A quick explanation of how to connect to Oracle Autonomous Cloud Database (Always Free instance) from Python script.

 

Fantasy Software Development – Story Finding

The Anti-Kyte - Sat, 2021-01-16 08:56

As threatened promised, I’m going to look at the first phase of my Application – recording the results and outcomes of Soccer tournaments from the first days of the sport in 1871 through to the culmination of the first Football League Championship in 1889.

I’ll begin with a narrative description of the functional requirements of our application. OK, it’s more like a potted history of the early days of Association Football, but I’ve got to start somewhere.

I’ll use this to extrapolate some user stories, which I’ll then drop into Atlassian Jira, having taken advantage of a free Jira account.

If you’re an Oracle developer and reading this is your first experience of Scrum, then you may feel that it’s not an obvious fit for developing a data-centric application.
On the other hand, if you’re a Scrum aficionado, you may be faintly horrified by the “free-form” way I’m approaching Scrum. So, something to annoy everyone then…

Association Football – The Early Years

Those of you less than enamoured of “the Beautiful Game” will not be unduly surprised to learn that it all started with a bunch of blokes in a pub…

On 26th October 1863, at the Freemason’s Tavern in London, the Football Association (FA) first came into being.
Despite this, the rules observed by its member clubs when playing the game remained somewhat fragmented.
There were several versions to choose from and clubs would agree a set of rules to adhere to when playing each other.

In an attempt to promote the “Association” rules (adopted by the FA), Charles Alcock, Secretary of the FA, came up with the idea of a Cup competition between all of the clubs affiliated to the Association.
All matches in this new competition would be played under this single set of rules.
Thus it was that Association Football (soccer) really came into being several years after its governing body was founded.
To give a flavour of how fundamental this change was, the Rules established such matters as the number of players per side (11), and the duration of a match (90 minutes), which had not been consistent across the various football codes then extant.

The FA CUP

The first Tournament duly took place in the 1871/72 season.
The format can best be described as “sort of knock-out”.
Whilst the winners of a tie between two competing teams would advance to the next stage of the competition, this was not the only method of progression. There was, of course, the possibility of being awarded a bye, a free-pass to the next round if there were an odd number of teams in the current round.
Teams could also be awarded a Walkover, if their designated opponents withdrew.
Additionally, at the discretion of the competition’s organising committee, teams could advance if:
– they drew their fixture (both teams could go through)
– they could not agree a date and venue with the opponents against whom they had been drawn.
Eventually, the 14 entrants were whittled down to Wanderers, who defeated Royal Engineers 1-0 in the Final, played on 16th March 1872 at the Kennington Oval in London.

Originally, the intention was for the cup holders to defend the trophy from challengers, hence the competition’s full name – The Football Association Challenge Cup.
For the 1872/73 tournament, Wanderers were given a bye all the way through to the final, with the remaining matches being essentially an elimination event to find a challenger.
Wanderers were also given choice of venue and – perhaps unsurprisingly – managed to retain the trophy with a 2-0 win over Oxford University.

It’s only from 1873/74 that the competition settles down into a consistent knock-out format.

For the first 10 years of competition, the southern amateur teams dominated, and it was not until 1882 that a team from the north of England appeared in the final.
That year, Old Etonians saw off the challenge of Blackburn Rovers. It was to prove the end of an era.
In subsequent years the centre of power changed radically: Rovers’ local rivals Blackburn Olympic won the trophy the following season, after which Rovers themselves won three consecutive finals.

The Football League

By 1888, football in England was dominated by professional clubs in the North and Midlands. 12 such clubs formed the Football League and participated in the first season.
The League format consisted of each team playing all of the others, once on their home ground and once on the opponents’ ground.
For each match won, 2 points were awarded.
For a draw, one point was awarded.
No points were awarded for a defeat.
Teams finishing level on points would be separated by Goal Average.

Goal Average was calculated by dividing the number of goals scored over the course of the season by the number of goals conceded.

International Football

The first international took place on 30th November 1872 in Glasgow, when Scotland and England drew 0-0.
As the game spread to Wales and Ireland (which was then a single entity and part of the United Kingdom), matches between the four home nations became a regular occurrence. However, each association observed slightly different rules and this was the cause of some friction.
Eventually, the International Football Conference was convened in Manchester in December 1882. It was at this meeting that a common set of rules was agreed.
The first full season in which these rules were applied was 1883/84 and it’s subsequently been acknowledged that the matches played between the Home nations in that season comprised the inaugural British Home Championship – the first international soccer tournament.
The format was a round-robin with each nation playing the other once with home and away fixtures alternating between years.
The outcome of the tournament was decided using the same criteria as for the Football League, with the exception that Goal Average was not applied and teams finishing level on points were considered to be tied.
Given the influence of Scottish players on the development of the game, it’s little surprise that Scotland won the inaugural championship. Indeed, Scotland only failed to win one of the first six tournaments up to and including 1888/89.

I should point out that applying this format to the Home International Championship is slightly anachronistic.
Contemporary reports don’t include league tables and these seem to have been applied several years later, under the influence of the format adopted by the Football League.
For our purposes however, we’re going to proceed on the basis that this format was in place from the outset.

User Stories

To start with, I’ve identified three distinct user roles :

  • Football Administrator – generally runs the FA and sets up Competitions
  • Tournament Organiser – manages tournaments
  • Record Keeper – enters fixture data, including results and maintains league tables

The stories are written from the perspective of these users.

Football Administrator Stories
Title : Create Competition
 As a : Football Administrator 
 I would like to : create a competition
 so that : I can provide a structured environment for teams to participate
Title : Administer Teams
 As a : Football Administrator
 I would like to : Maintain details of teams 
 So that : I know which teams are eligible to enter competitions
Title : Administer Venues
 As a : Football Administrator
 I would like to : Maintain details of venues where matches can be played
 So that : I know where teams can fulfil fixtures.
Tournament Organiser Stories
Title : Create Tournament
 As a : Tournament Organiser
 I would like to : create a tournament
 so that : teams can compete against each other
Title : Specify a format for a tournament
 As a : Tournament Organiser
 I would like to : specify the format for a tournament
 so that : I can anticipate what fixtures may be played
Title : Define Tournament Rules
 As a : Tournament Organiser
 I would like to : Define the rules for the tournament
 So that : I know how to determine the outcome of the tournament
Title : Enter Teams
 As a : Tournament Organiser
 I would like to : Accept the entry of teams into a tournament
 So that : I know which teams are competing in a tournament
Title : Assign Tournament Venues
 As a : Tournament Organiser
 I would like to : Assign venues to a tournament
 So that : I know where tournament fixtures may be played
Title : Assign Players to Tournament Teams
 As a : Tournament Organiser
 I would like to : assign players to a team 
 So that : I know which players are playing for which team in a tournament
Title : Override Tournament Fixture Results
 As a : Tournament Organiser
 I would like to : override results of fixtures
 So that : I can account for exceptional circumstances
Title : Teams remaining in a tournament
 As a : Tournament Organiser
 I would like to : identify the teams still in a knock-out tournament
 so that : I know which teams will be in the draw for the next round
Title : Future fixtures
 As a : Tournament Organiser
 I would like to : add tournament fixtures even when the teams are not known
 So that : I can anticipate which future fixtures are required for the tournament to be completed.
Record keeper stories
As a : Record Keeper
 I would like to : record details of a fixture 
 so that : I can see what happened during a match
As a : Record Keeper
 I would like to : view a league table based on results so far
 So that : I know how teams are performing relative to each other in a League or round-robin tournament
As a : Record Keeper
 I would like to : record which team won and which teams were placed in a competition
 so that : I can maintain a Roll of Honour

Whilst this is all a bit rough-and-ready, it does give us a backlog to start working from.

Project Management Software

Now, how best to track our progress? At this point, I did start to wonder whether I could throw together a quick application, as installing Jira was likely to take time and incur cost that I didn’t really have on this project.

Fortunately, Atlassian saved me the bother as they provide a free Jira cloud account.

Consequently, my backlog is now just waiting for Sprint 1 to start, which will be the subject of my next post on this topic.

Introduction To Google Cloud Platform

Online Apps DBA - Fri, 2021-01-15 07:20

Google Cloud is a suite of Cloud Computing services offered by Google that provides services such as compute, storage, networking, and many more, running on the same infrastructure that Google uses internally for its end-user products such as Gmail, Google Photos, and YouTube. There are many services and tools offered by Google Cloud like Storage, Big Data, […]

The post Introduction To Google Cloud Platform appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

AWS Database Migration Service

Online Apps DBA - Fri, 2021-01-15 07:05

Are you looking for a way to migrate your on-premise database to the cloud? AWS Database Migration Service (DMS) is a managed service that provides a quick and secure way to migrate your on-premise databases to the cloud. Check out this blog at k21academy.com/awssa34 to know more about AWS Database Migration Service: • What is […]

The post AWS Database Migration Service appeared first on Oracle Trainings for Apps & Fusion DBA.

Categories: APPS Blogs

Partner Webcast – Integration Insight in Oracle Integration

Today’s competitive market demands that stakeholders understand, monitor, and react to rapidly changing conditions. Businesses need flexible, dynamic, and detailed insight – and they need...

We share our skills to maximize your revenue!
Categories: DBA Blogs

DIFFERENCE BETWEEN ANALYZE AND DBMS_STATS

Tom Kyte - Thu, 2021-01-14 12:46
DIFFERENCE BETWEEN ANALYZE AND DBMS_STATS
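Broadly, ANALYZE is the legacy statement for gathering optimizer statistics, while DBMS_STATS is the supported package for that purpose; a minimal sketch contrasting the two calls (the schema and table names are hypothetical):
<code>
-- legacy statement, now retained mainly for non-statistics uses such as VALIDATE STRUCTURE
analyze table scott.emp compute statistics;

-- supported approach for optimizer statistics
begin
  dbms_stats.gather_table_stats(
    ownname => 'SCOTT',
    tabname => 'EMP',
    cascade => true);   -- also gathers statistics on the table's indexes
end;
</code>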
Categories: DBA Blogs

Is there a view that a DBA can query to find out if "ORA-02393: exceeded call limit on CPU usage"

Tom Kyte - Thu, 2021-01-14 12:46
Greetings, I've seen that when the "cpu_per_call" limit is reached, ORA-02393 is returned to SQL*Plus. Is there a view that a DBA can query to find out if "ORA-02393: exceeded call limit on CPU usage" occurs for applications using the database, since it isn't written to the alert log? Thanks, John
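If the failed calls are captured by auditing at all, one place to look is the return code recorded in the audit trail – a sketch only, assuming unified auditing is enabled and the relevant actions are being audited:
<code>
select event_timestamp, dbusername, action_name, return_code
from   unified_audit_trail
where  return_code = 2393
order by event_timestamp;
</code>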
Categories: DBA Blogs

MATERIALIZED VIEW Performance Issue!

Tom Kyte - Thu, 2021-01-14 12:46
I have created an MV on a UAT server, and my MV uses a query with remote connectivity to PROD and select-only rights on those tables, which have millions of rows (around 10 lakhs in each table), but after calculation the output of the query is only 139-150 rows. The query alone, without the MV, takes 60 seconds, but when I use CREATE MATERIALIZED VIEW NOCOMPRESS NOLOGGING BUILD IMMEDIATE USING INDEX REFRESH FORCE ON DEMAND NEXT null USING DEFAULT LOCAL ROLLBACK SEGMENT USING ENFORCED CONSTRAINTS DISABLE QUERY REWRITE as "query", the mview creation takes one hour and after that the refresh time is 20-30 minutes, which is surely not acceptable as this data is being used for a dashboard with a 3-minute delay, which is the time the MV refresh should take. I don't have privileges to check anything on the prod DB, but on UAT I have sufficient access. I have tried many options but nothing worked, so please help me find a solution, and if there is no solution, the reason behind this. In addition, when my mview refreshes, the explain plan shows "INSERT /*+ BYPASS_RECURSIVE_CHECK */ INTO abc". Please help me! I am really stuck here and have tried hard to get it resolved or to find a reason that I can explain to the relevant team. 1. I have tried creating a table with the same query and it took less than a minute. 2. An insert statement is also working fine, taking the same time. 3. I tried the MV refresh option with atomic_refresh=false as well, but it didn't work and actually it will not help. Please let me know if you need any further info. Note: My MV query uses prod tables (approx 4 tables) over a db link from UAT. The prod server has one separate user which has been given rights on the tables below: select count(*) from abc@prod; --800000 select count(*) from abc1@prod; --700000 select count(*) from abc2@prod; --200000
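For reference, the non-atomic complete refresh the poster mentions trying would normally be invoked like this (a sketch only; the materialized view name is hypothetical):
<code>
begin
  dbms_mview.refresh(
    list           => 'ABC_MV',
    method         => 'C',        -- complete refresh
    atomic_refresh => false);     -- truncate + insert rather than delete + insert
end;
</code>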
Categories: DBA Blogs

blob to clob on ORDS Handler Definition

Tom Kyte - Thu, 2021-01-14 12:46
Hi! I'm trying to send a POST request with this JSON: <code> { "id": 12344444, "email": "ppppoddddddppp@gmail.com", "first_name": "", "last_name": "", "billing": { "first_name": "22222", "last_name": "", "company": "", "address_1": "", "address_2": "", "city": "", "postcode": "", "country": "", "state": "", "email": "", "phone": "" } } </code> I'm trying to use apex_json to extract information like 'company', which is inside 'billing'. I read the following guide: https://oracle-base.com/articles/misc/apex_json-package-generate-and-parse-json-documents-in-oracle#parsing-json and it works, but not inside an ORDS Handler Definition. I'm trying to use the following code, but it does not insert the data and return "201": <code> DECLARE l_json_payload clob; l_blob_body blob := :body; l_dest_offset integer := 1; l_src_offset integer := 1; l_lang_context integer := dbms_lob.default_lang_ctx; l_warning PLS_INTEGER := DBMS_LOB.warn_inconvertible_char; BEGIN if dbms_lob.getlength(l_blob_body) = 0 then :status_code := 400; --error :errmsg := 'Json is empty'; return; end if; dbms_lob.createTemporary(lob_loc => l_json_payload ,cache => false); dbms_lob.converttoclob( dest_lob => l_json_payload ,src_blob => l_blob_body ,amount => dbms_lob.lobmaxsize ,dest_offset => l_dest_offset ,src_offset => l_src_offset ,blob_csid => dbms_lob.default_csid ,lang_context => l_lang_context ,warning => l_warning); APEX_JSON.parse(l_json_payload); INSERT INTO ACCOUNTS ( wp_id , name , email , f_name , l_name , wp_role , wp_username , woo_is_paying_customer , woo_billing_first_name ) VALUES ( :id, :first_name || ' ' || :last_name, :email, :first_name, :last_name, :role, :username, decode(:is_paying_customer,'false', 'N', 'Y'), APEX_JSON.get_varchar2(p_path => 'billing.first_name') ); :status_code := 201; --created EXCEPTION WHEN OTHERS THEN :status_code := 400; --error :errmsg := SQLERRM; END; </code> Update: after testing, the problem is in this line: <code> l_blob_body blob := :body; </code> When I include it, nothing is inserted into the database. Update 2: after more testing I realized that it is not possible to combine :body with other bind values, so APEX_JSON.get_varchar2 should be used instead (p_path => 'billing.first_name'). So the problem was solved.
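In other words, once :body has been converted to a CLOB and parsed, the values can be read from the parsed document rather than from separate bind variables – a minimal sketch of that approach, reusing l_json_payload from the block above and only a few of the columns, so it is illustrative rather than complete:
<code>
apex_json.parse(l_json_payload);

insert into accounts (wp_id, email, woo_billing_first_name)
values (
  apex_json.get_number  (p_path => 'id'),
  apex_json.get_varchar2(p_path => 'email'),
  apex_json.get_varchar2(p_path => 'billing.first_name')
);
</code>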
Categories: DBA Blogs

How to pass a parameter to a GET Handler in APEX?

Tom Kyte - Thu, 2021-01-14 12:46
Hello, I created a PL/SQL function that returns a list of open balances as a table result, where all amounts are converted to the currency provided as an input parameter: <code>function my_pkg.my_func (pi_currency in NUMBER default NULL) return amount_tab pipelined; </code> I created an Oracle REST Data Service with only a GET handler: <code>select * from table(my_pkg.my_func(:to_currency)) ;</code> I tested it with Advanced REST Client and it works as expected with an additional header for the to_currency parameter. In APEX I declared a REST Data Source related to the above REST service, then I made an APEX page with an IG region based on the above REST source, and it works well as long as I do not try to provide a parameter, i.e. while to_currency is null. When I try to populate <b>{"to_currency":"USD"}</b> in the External Filter attribute, the application crashes. I googled the problem but found nothing. Is there any other standard way to pass a non-column parameter to the GET handler in APEX, or should I write my own procedure to call the REST service, e.g. by using APEX_EXEC? Thank you and best regards, Alex
Categories: DBA Blogs
