1Z0-931 Autonomous Database

Question 1:

Which three statements are true regarding how Autonomous Database provides data security?

a) Network connections from clients to Autonomous Database are encrypted using the client credentials wallet.

b) Data is encrypted at rest using Transparent Data Encryption.

c) Oracle automatically applies security updates to ensure data is not vulnerable to known attack vectors.

d) Users are given OS logons or SYSDBA privileges to prevent phishing attacks.

Explanation

In Autonomous Database, Oracle encrypts your data everywhere by default: in motion in SQL*Net traffic, and at rest in tablespaces and backups. Each Autonomous Database service is automatically configured to use industry-standard TLS 1.2 to encrypt data in transit between the database service and clients or applications. The required client certificates and networking information are automatically packaged for the service consumer when the service is provisioned.

Oracle also encrypts data at rest: by default, Autonomous Database is automatically encrypted using Oracle Transparent Data Encryption in tablespaces and backups.

Question 2:

Which of these database features is NOT part of the Autonomous Database?

  1. Flashback Database
  2. Java in the Database
  3. Real Application Clusters (RAC)
  4. Online Indexing

Explanation

List of Restricted and Removed Oracle Features

Lists the Oracle Database features that are not available in Autonomous Database. Additionally, database features designed for administration are not available.

List of Removed Oracle Features

Oracle Real Application Testing

Oracle Database Vault

Database Resident Connection Pooling (DRCP)

Oracle OLAP

Oracle R capabilities of Oracle Advanced Analytics

Oracle Industry Data Models

Oracle Tuning Pack

Oracle Database Lifecycle Management Pack

Oracle Data Masking and Subsetting Pack

Oracle Cloud Management Pack for Oracle Database

Oracle Multimedia

Java in DB

Oracle Workspace Manager

Question 3:

Which task is NOT automatically performed by the Oracle Autonomous Database?

  1. Automatically optimizing the workload
  2. Backing up the database
  3. Masking your sensitive data
  4. Patching the database

Explanation

  • Oracle patches the full stack, including the firmware, the OS, clusterware, and the database.
  • Oracle Autonomous Database is actually a family of cloud services, with each member of the family optimized by workload: Autonomous Data Warehouse (ADW) is optimized for analytic workloads, and Autonomous Transaction Processing (ATP) is optimized for transaction processing or mixed workloads.
  • It is possible to use Oracle Data Safe data masking capabilities with cloned instances of ADB, but this is done by the customer, not automatically by the service.

Question 4:

Migrating an on-premises database to Autonomous Database (ADB) for large amounts of data involves multiple steps, such as creating a credential object, creating (access to) a storage object/location, running a Data Pump export, and running a Data Pump import.

Which three statements are true for SQL Developer (18.3 and up) in combination with ADB Data Loading?

a) SQL Developer can only export/move/import files using Data Pump from databases running on Linux systems.

b) SQL Developer can import .csv files into ADB which are located on the system where SQL Developer is running.

c) SQL Developer can be used to export/move/import a database to ADB in one set of wizard steps.

d) SQL Developer can be started from the ADB Cloud console, but only for data loading scenarios.

e) SQL Developer can import files (.dmp and .csv, for example) into ADB which are located on Amazon S3 object storage.

Explanation

SQL Developer is a free integrated development environment (IDE) provided by Oracle that simplifies the development and management of Oracle databases. A Java-based platform, this IDE can run on Linux, Mac OS X, and Windows. SQL Developer facilitates database migrations by providing options to use Oracle tools such as Data Pump export, database copy, and SQL*Loader.

Question 5:

What are the two methods that could be used during the migration of your existing Oracle database to Autonomous Database?

  1. CSV files copied to Autonomous Database block storage
  2. Data Pump
  3. GoldenGate
  4. Recovery Manager (RMAN)

Explanation

The main migration tool for migrating to ADB is Data Pump. You can export your schemas and import them into ADB using Data Pump. To sync up the additional/incremental changes on the source database during the export/import process you can use GoldenGate or GoldenGate Cloud Service to replicate those changes to ADB.

In the current release you cannot use physical migration methods like backup/restore, Data Guard, database clones, and transportable tablespaces to move your existing database to ADB.

Question 6:

The default eight-day retention period for Autonomous Database performance data can be modified using which DBMS_WORKLOAD_REPOSITORY subprogram procedure?

  1. MODIFY_SNAPSHOT_SETTINGS
  2. UPDATE_OBJECT_INFO
  3. CREATE_BASELINE_TEMPLATE

Explanation

The retention time can be changed by changing the Automatic Workload Repository retention setting with the PL/SQL procedure

DBMS_WORKLOAD_REPOSITORY.MODIFY_SNAPSHOT_SETTINGS
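As a sketch, the default retention could be extended to 30 days like this (the values are illustrative; both retention and interval are specified in minutes):

```sql
-- Extend AWR snapshot retention to 30 days; interval left at 60 minutes.
BEGIN
  DBMS_WORKLOAD_REPOSITORY.MODIFY_SNAPSHOT_SETTINGS(
    retention => 43200,  -- minutes: 30 days x 24 hours x 60 minutes
    interval  => 60      -- minutes between snapshots
  );
END;
/
```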

Question 7:

How can an Autonomous Database resource be provisioned without logging into the Oracle Cloud Infrastructure console?

  1. Connecting to the Cloud Infrastructure Command console via SSH wallet.
  2. It cannot be done.
  3. Using Database Configuration Assistant (DBCA) on the database server.
  4. Using the Oracle Cloud Infrastructure Command Line Interface (CLI) tool or REST API calls.

Explanation

The CLI is a small-footprint tool that you can use on its own or with the Console to complete Oracle Cloud Infrastructure tasks. The CLI provides the same core functionality as the Console, plus additional commands.

Examples of managing Autonomous Database with the REST API: https://blogs.oracle.com/datawarehousing/managing-autonomous-data-warehouse-using-oci-curl

Configuring the OCI CLI: https://oracle.github.io/learning-library/workshops/autonomous-transactionprocessing/LabGuide900ConfigureOCI-CLI.md

Question 8:

If you need to connect to Autonomous Data Warehouse (ADW) using Java Database Connectivity (JDBC) via an HTTP proxy, where do you set the proxy details?

  1. sso
  2. jks
  3. properties
  4. ora
  5. ora

Explanation

JDBC Thin Connections with an HTTP Proxy

To connect to Autonomous Data Warehouse through an HTTPS proxy, open and update your tnsnames.ora file. Add the HTTP proxy hostname (https_proxy) and port (https_proxy_port) to the connection string.

JDBC Thin client versions earlier than 18.1 do not support connections through an HTTP proxy.
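For illustration, a tnsnames.ora entry with the proxy parameters added might look like the following (the hostnames, ports, service name, and certificate DN are placeholders, not values from a real wallet):

```
adw_high =
  (description =
    (address =
      (https_proxy = proxy.example.com)(https_proxy_port = 80)
      (protocol = tcps)(port = 1522)(host = adb.region.oraclecloud.com))
    (connect_data = (service_name = example_adw_high.adwc.oraclecloud.com))
    (security = (ssl_server_cert_dn = "CN=adwc.example.oraclecloud.com,O=Oracle Corporation,L=Redwood City,ST=California,C=US")))
```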

Question 9:

Which Autonomous Database Service is NOT used to connect to an Autonomous Transaction Processing instance?

  1. TPPERFORMANT
  2. TPURGENT
  3. LOW
  4. MEDIUM
  5. HIGH

Explanation

Predefined Database Service Names for Autonomous Transaction Processing

The tnsnames.ora file provided with the credentials zip file contains five database service names identifiable as tpurgent, tp, high, medium, and low. The predefined service names provide different levels of performance and concurrency for Autonomous Transaction Processing.

  • tpurgent: The highest priority application connection service for time critical transaction processing operations. This connection service supports manual parallelism.
  • tp: A typical application connection service for transaction processing operations. This connection service does not run with parallelism.
  • high: A high priority application connection service for reporting and batch operations. All operations run in parallel and are subject to queuing.
  • medium: A typical application connection service for reporting and batch operations. All operations run in parallel and are subject to queuing. Using this service the degree of parallelism is limited to four (4).
  • low: A lowest priority application connection service for reporting or batch processing operations. This connection service does not run with parallelism.

Question 10:

Where can a user's public ssh key be added on the Oracle Cloud Infrastructure Console in order to execute API calls?

  1. On the Autonomous Database Console.
  2. SSH keys are not required in Oracle Cloud Infrastructure
  3. SSH keys cannot be added from console. They have to be added using REST APIs only.
  4. Navigate to Identity, select Users panel on the console and select "Add Public Key".

Explanation

  • In the Console, click Identity, and then click Users. Locate the user in the list, and then click the user's name to view the details.
  • Click Add Public Key.
  • Paste the key's value into the window and click Add.

Question 11:

Which can be scaled independently of the number of CPUs in an Autonomous Database?

  1. Concurrency
  2. Memory
  3. Parallelism
  4. Storage
  5. Sessions

Explanation

Oracle allows you to scale compute and storage independently; there is no need to do it together. These scaling activities are fully online (no downtime required).

Question 12:

Which method can be used to migrate on-premises databases to Autonomous Databases in cloud?

  1. RMAN backup & restore
  2. Physical migration method like database cloning
  3. Data Pump
  4. Original Import/Export tools

Explanation

The main migration tool for migrating to ADB is Data Pump. You can export your schemas and import them into ADB using Data Pump. To sync up the additional/incremental changes on the source database during the export/import process you can use GoldenGate or GoldenGate Cloud Service to replicate those changes to ADB.

In the current release you cannot use physical migration methods like backup/restore, Data Guard, database clones, and transportable tablespaces to move your existing database to ADB.

Question 13:

Which two system privileges does a user need to create analytic views?

  1. CREATE ATTRIBUTE DIMENSION
  2. CREATE ANALYTIC VIEW
  3. CREATE ANALYTIC MEASURE
  4. CREATE ANALYTIC LEVEL
  5. CREATE ANALYTIC HIERARCHY

Explanation

The following system privileges allow the user to create, alter, or drop analytic view component objects.

CREATE ANALYTIC VIEW

Create an analytic view in the grantee's schema.

CREATE ANY ANALYTIC VIEW

Create analytic views in any schema except SYS.

CREATE ATTRIBUTE DIMENSION

Create an attribute dimension in the grantee's schema.

CREATE ANY ATTRIBUTE DIMENSION

Create attribute dimensions in any schema except SYS.

CREATE HIERARCHY

Create a hierarchy in the grantee's schema.

CREATE ANY HIERARCHY

Create hierarchies in any schema except SYS.

ALTER ANY ANALYTIC VIEW

Rename analytic views in any schema except SYS.

ALTER ANY ATTRIBUTE DIMENSION

Rename attribute dimensions in any schema except SYS.

ALTER ANY HIERARCHY

Rename hierarchies in any schema except SYS.

DROP ANY ANALYTIC VIEW

Drop analytic views in any schema except SYS.

DROP ANY ATTRIBUTE DIMENSION

Drop attribute dimensions in any schema except SYS.

DROP ANY HIERARCHY

Drop hierarchies in any schema except SYS.

SELECT ANY TABLE

Query or view any analytic view or hierarchy in any schema
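For example, a DBA could grant the two privileges from the question (plus CREATE HIERARCHY, which a complete analytic view model typically also needs) to a hypothetical user:

```sql
-- ANALYTICS_DEV is a hypothetical user; the privilege names are the
-- documented system privileges for analytic view component objects.
GRANT CREATE ATTRIBUTE DIMENSION TO analytics_dev;
GRANT CREATE HIERARCHY           TO analytics_dev;
GRANT CREATE ANALYTIC VIEW       TO analytics_dev;
```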

Question 14:

Which three statements are true about procedures in the DBMS_CLOUD package?

a) The DBMS_CLOUD.DELETE_FILE procedure removes the credentials file from the Autonomous Data Warehouse database.

b) The DBMS_CLOUD.CREATE_CREDENTIAL procedure stores Cloud Object Storage credentials in the Autonomous Data Warehouse database.

c) The DBMS_CLOUD.PUT_OBJECT procedure copies a file from Cloud Object Storage to the Autonomous Data Warehouse.

d) The DBMS_CLOUD.VALIDATE_EXTERNAL_TABLE procedure validates the source files for an external table, generates log information, and stores the rows that do not match the format options specified for the external table in a badfile table on Autonomous Data Warehouse.

e) The DBMS_CLOUD.CREATE_EXTERNAL_TABLE procedure creates an external table on files in the cloud. You can run queries on external data from the Autonomous Data Warehouse.

Explanation

DELETE_FILE Procedure

This procedure removes the specified file from the specified directory on Autonomous Data Warehouse.

CREATE_CREDENTIAL Procedure

This procedure stores Cloud Object Storage credentials in the Autonomous Data Warehouse database. Use stored credentials for data loading or for querying external data residing in the Cloud.

PUT_OBJECT Procedure

This procedure copies a file from Autonomous Data Warehouse to Cloud Object Storage. The maximum file size allowed in this procedure is 5 gigabytes (GB).

VALIDATE_EXTERNAL_TABLE Procedure

This procedure validates the source files for an external table, generates log information, and stores the rows that do not match the format options specified for the external table in a badfile table on Autonomous Data Warehouse.

CREATE_EXTERNAL_TABLE Procedure

This procedure creates an external table on files in the Cloud. This allows you to run queries on external data from Autonomous Data Warehouse.
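A minimal sketch tying CREATE_CREDENTIAL and CREATE_EXTERNAL_TABLE together (the credential name, username, token, object storage URL, and table definition are assumed placeholders):

```sql
BEGIN
  -- Store object storage credentials once; referenced by name afterwards.
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'OBJ_STORE_CRED',
    username        => 'cloud_user@example.com',  -- assumed cloud username
    password        => 'auth-token-value'         -- assumed auth token
  );

  -- Create an external table over a CSV file in object storage.
  DBMS_CLOUD.CREATE_EXTERNAL_TABLE(
    table_name      => 'SALES_EXT',
    credential_name => 'OBJ_STORE_CRED',
    file_uri_list   => 'https://objectstorage.region.oraclecloud.com/n/ns/b/bucket/o/sales.csv',
    format          => json_object('type' value 'csv', 'skipheaders' value '1'),
    column_list     => 'PROD_ID NUMBER, AMOUNT NUMBER'
  );
END;
/
```

After this, SELECT statements against SALES_EXT read the object storage data directly.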

Question 15:

A corporation is building a web application to allow its customers to schedule service requests online. There is also a need to run operational reports at times during non-peak hours. The architecture team is debating whether such reports should be run on the OLTP database or in a separate data mart. The DBA Manager does not want to add any more admin responsibility to the team and is looking for a database option that is low to zero maintenance but meets their strict performance requirements as well.

Which Oracle Cloud Infrastructure database service is appropriate for this scenario?

  1. Using 'tpurgent' and 'high' TNS services to separate connection types
  2. ADW, since operational reporting is a higher priority in this scenario.
  3. Since the application needs to be highly available, it should be deployed on a Kubernetes cluster.
  4. It is best to build a separate data warehouse and move the OLTP data on a nightly basis.

Explanation

Autonomous Transaction Processing provides all of the performance of the market-leading Oracle Database in an environment that is tuned and optimized to meet the demands of a variety of applications, including mission-critical transaction processing, mixed transactions and analytics, IoT, and JSON document stores.

As a service Autonomous Transaction Processing does not require database administration. With Autonomous Transaction Processing you do not need to configure or manage any hardware, or install any software. Autonomous Transaction Processing handles creating the database, backing up the database, patching and upgrading the database, and growing or shrinking the database.

Question 16:

Which two PL/SQL functions can be used to validate an analytic view?

  1. VALIDATE_ANALYTIC_VIEW
  2. VALIDATE_LEVELS
  3. VALIDATE_MEASURES
  4. VALIDATE_DIMENSION
  5. VALIDATE_HIERARCHY

Explanation

PL/SQL Package for Analytic Views

You can validate the data for analytic view and hierarchy objects with the following procedures in the DBMS_HIERARCHY package:

CREATE_VALIDATE_LOG_TABLE procedure

VALIDATE_ANALYTIC_VIEW function

VALIDATE_CHECK_SUCCESS function

VALIDATE_HIERARCHY function

Question 17:

As a database architect you are tasked with configuring a high concurrency, production OLTP application to connect to an Autonomous Transaction Processing database with a requirement to have some reporting queries run in parallel mode.

Which connection service is appropriate for such a workload?

  1. MEDIUM
  2. TP
  3. HIGH
  4. TPURGENT

Explanation

tpurgent: The highest priority application connection service for time critical transaction processing operations. This connection service supports manual parallelism.

Concurrent Statements in tpurgent: 100 × OCPUs

Question 18:

What are the two methods that could be used during the migration of your existing Oracle database to Autonomous Database?

  1. CSV files copied to Autonomous Database block storage
  2. Data Pump
  3. Golden Gate
  4. Recovery Manager (RMAN)

Explanation

The main migration tool for migrating to ADB is Data Pump. You can export your schemas and import them into ADB using Data Pump. To sync up the additional/incremental changes on the source database during the export/import process you can use GoldenGate or GoldenGate Cloud Service to replicate those changes to ADB.

In the current release you cannot use physical migration methods like backup/restore, Data Guard, database clones, and transportable tablespaces to move your existing database to ADB.

Question 19:

Which three data dictionary views contain information about analytic view objects?

  1. ALL_ANALYTIC_VIEW_PATHS
  2. ALL_ANALYTIC_VIEW_DIM_CLASS
  3. ALL_ANALYTIC_VIEW_ID_ATTRS
  4. ALL_ANALYTIC_VIEW_LVLGRPS
  5. ALL_ANALYTIC_VIEW_KEYS

Explanation

- Analytic View Views

ALL_ANALYTIC_VIEW_ATTR_CLASS

ALL_ANALYTIC_VIEW_BASE_MEAS

ALL_ANALYTIC_VIEW_CALC_MEAS

ALL_ANALYTIC_VIEW_CLASS

ALL_ANALYTIC_VIEW_COLUMNS

ALL_ANALYTIC_VIEW_DIM_CLASS

ALL_ANALYTIC_VIEW_DIMENSIONS

ALL_ANALYTIC_VIEW_HIER_CLASS

ALL_ANALYTIC_VIEW_HIERS

ALL_ANALYTIC_VIEW_KEYS

ALL_ANALYTIC_VIEW_LEVEL_CLASS

ALL_ANALYTIC_VIEW_LEVELS

ALL_ANALYTIC_VIEW_LVLGRPS

ALL_ANALYTIC_VIEW_MEAS_CLASS

ALL_ANALYTIC_VIEWS
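Once analytic views exist, these dictionary views can be queried like any other; for example (column names per the ALL_ANALYTIC_VIEWS documentation):

```sql
-- List the analytic views visible to the current user.
SELECT owner, analytic_view_name
FROM   all_analytic_views
ORDER  BY owner, analytic_view_name;
```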

Question 20:

Which two options are available to restore an Autonomous Data Warehouse?

  1. Select the snapshot of the backup
  2. Specify the point in time (timestamp) to restore
  3. Select the archived redo logs.
  4. Backup and recovery must be done using Recovery Manager (RMAN).
  5. Select the backup from which restore needs to be done

Explanation

You can initiate recovery for your Autonomous Database using the cloud console. Autonomous Database automatically restores and recovers your database to the point in time you specify.

On the details page, from the Actions drop-down list, select Restore to display the Restore prompt.

You can Specify Timestamp or Select Backup, then click Restore.

Restoring Autonomous Database puts the database in the unavailable state during the restore operation. You cannot connect to the database in that state.

The details page shows Lifecycle State: Restore In Progress...

When the restore operation finishes, your Autonomous Data Warehouse instance opens in read-only mode and the instance details page Lifecycle State shows Available Needs Attention.

At this point you can connect to your Autonomous Database instance and check your data to validate that the restore point you specified was correct.

If the restore point you specified was correct and you want to open your database in read-write mode, click Stop, and after the database stops, click Start to start the database.

After stopping and starting, the database opens in read-write mode.

Question 21:

When exporting a notebook, what type of file is created?

  1. SQL
  2. JSON
  3. ASCII
  4. XML
  5. TXT

Explanation

Export a Notebook

You can export a notebook as a .json (JavaScript Object Notation) file, and later import it into the same or a different environment.

To export a notebook:

  • In the Notebooks page, click the notebook that you want to export.

The notebook opens in the notebook editor.

  • In the top panel of the notebook editor, click export. The notebook is saved to your local folder as a .json file.

Question 22:

Which statement is true regarding database client credentials file required to connect to your Autonomous Database?

  1. When you share the credential files with authorized users, mail the wallet password and the file in the same email.
  2. The Transparent Data Encryption (TDE) wallet can be used for your client credentials to connect to your database.
  3. Place the credential files on a shared drive that all users can use to connect to the database.
  4. Store credential files in a secure location and share the files only with authorized users to prevent unauthorized access to the database.

 

Explanation

Connection to Autonomous DB uses certificate authentication and Secure Sockets Layer (SSL). This ensures that there is no unauthorized access to Autonomous DB and that communications between the client and server are fully encrypted and cannot be intercepted or altered.

Certificate authentication uses an encrypted key stored in a wallet on both the client (where the application is running) and the server (where your database service on the Autonomous DB is running). The key on the client must match the key on the server to make a connection.

A credentials zip file is available for download once the ADB is created. The zip file contains a collection of files that are needed to connect to your Autonomous DB. All communications between the client and the server are encrypted.

Wallet files, along with the Database user ID and password provide access to data in your Autonomous Transaction Processing database. Store wallet files in a secure location. Share wallet files only with authorized users. If wallet files are transmitted in a way that might be accessed by unauthorized users (for example, over public email), transmit the wallet password separately and securely.

For better security, Oracle recommends using restricted permissions on wallet files. This means setting the file permissions to 600 on Linux/Unix.

Similar restrictions can be achieved on Windows by letting the file owner have Read and Write permissions while all other users have no permissions.

Question 23:

When you connect Oracle Analytics Cloud to the Autonomous Data Warehouse, what file needs to be uploaded?

  1. PROPERTIES
  2. SSO
  3. ORA
  4. ORA

Explanation

Create the Autonomous Data Warehouse Connection in Oracle Analytics Cloud.

  • Sign in to Oracle Analytics Cloud.
  • On the Home page, click Connect to Oracle Autonomous Data Warehouse.
  • In Create Connection, enter a Connection Name, for example, MyADW_connection.
  • In Description, enter a brief description.
  • Click Select next to Client Credentials. In File Upload, select the wallet zip file from your download location.

The Client Credentials field is populated with cwallet.sso, and the Service Name field contains a value.

  • Enter your Oracle Autonomous Data Warehouse Username and Password.
  • From the Service list, select the service for your data, and then click Save.

Ref:

https://www.oracle.com/webfolder/technetwork/tutorials/obe/cloud/oac_ee_dv/create_adwc_con nection/html/index.html

Question 24:

Which statement about the Export Wizard used to export database objects and data is NOT correct?

  1. If "Dependents" is checked as a DDL Option, for non-privileged users, only dependent objects in their schema are exported.
  2. Export DDL includes features such as "Show schema," "Storage," and "Terminator."
  3. If "Clipboard" is selected as the "Output," the output will be placed on the system clipboard, so that it can be pasted into a file, a command line, or another location appropriate for the format.
  4. If "Grants" is checked as a DDL Option, GRANT statements are included for any grant objects on the exported objects, including those owned by the SYS schema.

Question 25:

Which two statements are true about Oracle Cloud Infrastructure (OCI)?

a) Because availability domains do not share infrastructure such as power or cooling, or the internal availability domain network, a failure at one availability domain within a region is unlikely to impact the availability of the others within the same region.

b) Regions are dependent on other regions and must be located within 5 thousand kilometers of each other.

c) A single fault domain can be associated with multiple regions and availability domains.

d) An OCI region is a localized geographic area, and an availability domain is one or more data centers located within a region.

Explanation

  • Oracle Cloud Infrastructure is physically hosted in regions and availability domains. A region is a localized geographic area, and an availability domain is one or more data centers located within a region. A region is composed of one or more availability domains.
  • Availability domains are isolated from each other, fault tolerant, and very unlikely to fail simultaneously or be impacted by the failure of another availability domain. When you configure your cloud services, use multiple availability domains to ensure high availability and to protect against resource failure.

Question 26:

Which is NOT required to connect to Autonomous Database from SQL Developer?

  1. Service name
  2. Wallet file
  3. Database name
  4. Username and password

Explanation

This is the information required to connect to ADB from SQL Developer:

  • Username: Enter the database username. You can use the default administrator database account (ADMIN) provided as part of the service, or another database user.
  • Password: Enter the password for the database user.
  • Connection Type: Select Cloud Wallet
  • Configuration File: Click Browse, and select the client credentials zip file.
  • Service: Enter the service name. The client credentials file provides the service names.

Question 27:

How many pre-defined service names are configured in tnsnames.ora for a single Autonomous Transaction Processing database instance, and what are they called?

  1. They are called tpurgent, tp, high, medium and low.
  2. They are called high, medium and low
  3. There are no pre-defined service names in tnsnames.ora.
  4. Two. They are called ATP and ADW.

Explanation

Predefined Database Service Names for Autonomous Transaction Processing

The tnsnames.ora file provided with the credentials zip file contains five database service names identifiable as tpurgent, tp, high, medium, and low. The predefined service names provide different levels of performance and concurrency for Autonomous Transaction Processing.

tpurgent: The highest priority application connection service for time critical transaction processing operations. This connection service supports manual parallelism.

tp: A typical application connection service for transaction processing operations. This connection service does not run with parallelism.

high: A high priority application connection service for reporting and batch operations. All operations run in parallel and are subject to queuing.

medium: A typical application connection service for reporting and batch operations. All operations run in parallel and are subject to queuing. Using this service the degree of parallelism is limited to four (4).

low: A lowest priority application connection service for reporting or batch processing operations. This connection service does not run with parallelism.

Question 28:

When scaling OCPUs in Autonomous Database, which statement is true in regards to active transactions?

  1. Scaling cannot happen while there are active transactions in the database.
  2. Active transactions are terminated and rolled back.
  3. Active transactions are paused.
  4. Active transactions continue running unaffected.

Explanation

Oracle allows you to scale compute and storage independently; there is no need to do it together. These scaling activities are fully online (no downtime required).

Question 29:

Which Autonomous Database Cloud service ignores hints in SQL Statements by default?

  1. Autonomous Data Warehouse.
  2. Neither service ignores hints by default
  3. Autonomous Transaction Processing
  4. Both services ignore hints by default

Explanation

Optimizer hints are ignored by default in ADW, whereas they are honored by default in ATP.

Question 30:

Which is correct about security features that are available in Oracle Autonomous Database?

  1. Neither Data Redaction nor TDE are supported.
  2. Data Redaction but not TDE
  3. TDE but not Data Redaction
  4. Data Redaction and TDE are both supported

Explanation

All data is encrypted at rest using transparent data encryption, and data redaction is part of ADB.
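As an illustration of the Data Redaction side, a full-redaction policy can be added with the DBMS_REDACT package (the schema, table, and column names here are hypothetical):

```sql
BEGIN
  DBMS_REDACT.ADD_POLICY(
    object_schema => 'HR',              -- hypothetical schema
    object_name   => 'EMPLOYEES',       -- hypothetical table
    column_name   => 'SALARY',
    policy_name   => 'redact_salary',
    function_type => DBMS_REDACT.FULL,  -- redact to the default value (0 for numbers)
    expression    => '1=1'              -- apply on every query
  );
END;
/
```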

Question 31:

What REST verb is used to create an Autonomous Database service using REST APIs?

  1. A "POST" REST call
  2. A "GET" REST call
  3. An "INSERT" REST call
  4. A "PUT" REST call

Explanation

Use REST verb POST to create Autonomous Database with REST API
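A sketch of such a POST request against the OCI Database API (the endpoint path and field names follow the CreateAutonomousDatabase operation; the concrete values are placeholders):

```
POST /20160918/autonomousDatabases
Content-Type: application/json

{
  "compartmentId": "ocid1.compartment.oc1..exampleuniqueid",
  "dbName": "adwdemo",
  "cpuCoreCount": 1,
  "dataStorageSizeInTBs": 1,
  "adminPassword": "StrongPassword#123",
  "dbWorkload": "DW"
}
```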

Question 32:

What predefined user is created when an Autonomous Database (ADB) instance is created that you connect to in order to create other users and grant roles?

  1. SCOTT
  2. ADMIN
  3. SYS
  4. DWDEV

Explanation

Administrator account in Autonomous Database is ADMIN
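Connected as ADMIN, creating a new user and granting a role can look like this (the user name and password are placeholders; DWROLE is the predefined role Autonomous Database provides for new users):

```sql
-- Run while connected as ADMIN.
CREATE USER appdev IDENTIFIED BY "StrongPassword#123";
GRANT DWROLE TO appdev;          -- predefined ADB role with common privileges
GRANT UNLIMITED TABLESPACE TO appdev;
```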
