Oracle

Need Help with an Oracle Assignment?
If you are having difficulty with your Oracle assignment, we can help. Just email your Oracle assignment to admin@assignmentcache.com.
We provide Oracle assignment help to students all over the world.

  1. DBM449 LAB1 SqlFile

    DBM 449 LAB 1 Oracle Joins

    Regular Price: $20.00

    Special Price: $5.00

    GENERAL OVERVIEW
    Scenario/Summary
    My colleague, Ann Henry, operates a regional training center for a commercial software organization. She created a database to track client progress so she can analyze the effectiveness of the certification program. CLIENT, COURSE, and COURSE_ACTIVITY are three of the tables in her database. The CLIENT table contains client name, company, client number, pre-test score, certification program, and email address. The COURSE_ACTIVITY table contains client number, course code, grade, and instructor notes. The COURSE table contains the course code, course name, instructor, course date, and location. Although she and her instructors enter much of the data themselves, some of the data are extracted from the corporate database and loaded into her tables.

    Loading the initial data was easy. For grade entry at the end of each course, a former employee created a data entry form for the instructors. Updating most client information and generating statistics on client progress is not easy because Ann does not know much SQL. For now, she exports the three tables into three spreadsheets. To look up a grade in the COURSE_ACTIVITY spreadsheet, she first has to look up client number in the CLIENT spreadsheet. While this is doable, it is certainly not practical. For statistics, she sorts the data in the COURSE_ACTIVITY spreadsheet using multiple methods to get the numbers she needs.

    Every month, Ann's database tables need to be refreshed to reflect changes in the corporate database. Ann describes this unpleasant task. She manually compares the contents of newly extracted data from corporate to the data in her spreadsheets, copies in the new values, and then replaces the database contents with the new values.

    Ann needs our help. Let’s analyze her situation and determine what advanced SQL she could use to make her tasks easier.
         
    L A B O V E R V I E W


    Scenario/Summary


    The purpose of this lab is to explore join operators to determine which, if any, are appropriate for solving Ann's business problems, as described in this week's lecture.


    Since Ann prefers to work from Excel spreadsheets, she wants her CLIENT and COURSE_ACTIVITY tables exported into one spreadsheet rather than the two she currently uses. We need to determine which, if any, of the join operators will provide the data she wants for the single spreadsheet. (Note: we will not perform the export, just determine how to retrieve the necessary data.) Using the spreadsheet, she will be able to determine:



    1. Which course(s) a specific client has taken

    2. What grade(s) a specific client has earned in a specific course

    3. Which clients did not take any courses

    4. Which courses were not taken by any client


    Here are results from DESCRIBE commands that show the structure (columns and their data types) of the CLIENT and COURSE_ACTIVITY tables. You may refer to them while constructing your queries.


    SQL*Plus: Release 10.2.0.1.0 — Production on Thu Jun 14 22:38:52 2007

    Copyright (c) 1982, 2005, Oracle.  All rights reserved.

    Connected to:
    Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 — 64bit Production
    With the Partitioning, OLAP and Data Mining Options

    SQL> desc course_activity



    SQL>


    For this lab you will be creating several documents. First, write your queries in Notepad to create a script file containing all of the queries asked for in lab Steps 4 through 13. You can (and should) test each query as you write it to make sure that it works and returns the correct data. Once you have written all of your queries, create a SPOOL session and run your entire script file. Be sure to execute a SET ECHO ON session command before running the file so that both the query and the output are captured in the SPOOL file. IMPORTANT: If you are using Windows Vista, you will need to create a directory on your C: drive to SPOOL your file into, because Vista will not allow you to write a file directly to the root of the C: drive. This gives you two files for the lab. The third file will be the Lab1 Report document found in Doc Sharing, into which you will put your responses to the questions asked in the various lab steps.


    Now let's begin.


    L A B S T E P S

    STEP 1: Start Oracle SQL*Plus via Citrix


    Log into the Citrix iLab environment. Open your Oracle folder, select SQL Plus and log in to your database instance. Use "sys" as User Name, and "oracle" as the Password. Enter the Host String as "DB9999.world as sysdba" where 9999 is the database number you have been assigned.


    STEP 2: Initialize tables


    Download the pupbld.sql and Lab1_init.sql files associated with the links to your C: drive or to the F: drive in your Citrix environment. You will need to open each of the files and edit the connection string to reflect your instance name. The pupbld.sql file has two connection strings: one at the top of the script and another at the bottom. Be sure to change both of these to reflect your instance name.


    Once you have done this, run the pupbld.sql script first (DO NOT copy and paste it) in your SQL*Plus session. The script creates the product_user_profile synonym in the SYSTEM account, which will be used each time you log in as a normal user.


    Next, run the lab1_init.sql script in your session. The script creates a new user (DBM449_USER) that will be used in various labs in this course. You can find the password for this new user by looking at the CREATE USER statement in the script file. Disregard the DROP TABLE error messages. They occur because the script is designed to work whether or not you have already created the tables; this way, you may run it again if you ever decide to reset the contents of your tables to the original values. When you run the script for the first time, the error messages appear because it attempts to drop tables that do not exist.


    Once the script has finished you will be logged into the new user and ready to start your lab.


    STEP 3: Verify your tables


    You want to verify that everything completed successfully. To do this, execute a SELECT * FROM TAB statement to make sure all five tables were created, and then execute a SELECT COUNT(*) FROM statement for each of the table names. You should find the following numbers of records in each table.



    • CLIENT table - 5 rows

    • COURSE table - 5 rows

    • COURSE_ACTIVITY table - 6 rows

    • CORP_EXTRACT1 table - 3 rows

    • CORP_EXTRACT2 table - 0 rows


    NOTE: In the following steps, when writing your queries, be sure to list the tables in the FROM clause in the same order they are listed in the instructions. Reversing the order of the tables in the FROM clause will produce an incorrect result set.


    STEP 4: Using the FULL OUTER JOIN operator


    Join the CLIENT and COURSE_ACTIVITY tables using a FULL OUTER JOIN.



    • Write and execute the SQL statement that produces the client number and name, course code and grade that the client got in this course.


    Will the FULL OUTER JOIN be helpful to Ann? Place your response in the lab report document for this step.
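    A sketch of what this query could look like. The column names (client_no, client_name, course_code, grade) are assumptions, not confirmed by the lab; check them against your own DESCRIBE output:

    ```sql
    -- Sketch only: column names are assumed.
    -- A FULL OUTER JOIN keeps unmatched rows from BOTH tables, so clients
    -- with no course activity and activity rows with no matching client
    -- both appear, padded with NULLs.
    SELECT c.client_no, c.client_name, ca.course_code, ca.grade
    FROM   client c
    FULL OUTER JOIN course_activity ca
           ON c.client_no = ca.client_no;
    ```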


    STEP 5: Using the RIGHT OUTER JOIN operator


    Join the CLIENT and COURSE_ACTIVITY tables using a RIGHT OUTER JOIN.



    • Write and execute the SQL statement that produces the client number and name, course code and grade that the client got in this course.


    Will the RIGHT OUTER JOIN be helpful to Ann? Place your response in the lab report document for this step.


    STEP 6: Using the LEFT OUTER JOIN operator


    Join the CLIENT and COURSE_ACTIVITY tables using a LEFT OUTER JOIN.



    • Write and execute the SQL statement that produces the client number and name, course code and grade that the client got in this course.


    Will the LEFT OUTER JOIN be helpful to Ann? Place your response in the lab report document for this step.
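    A sketch of the LEFT OUTER JOIN form (column names client_no, client_name, course_code, and grade are assumptions; confirm with DESCRIBE). Because every CLIENT row is kept, clients who took no courses show NULLs in the course columns:

    ```sql
    -- Sketch only: column names are assumed.
    SELECT c.client_no, c.client_name, ca.course_code, ca.grade
    FROM   client c
    LEFT OUTER JOIN course_activity ca
           ON c.client_no = ca.client_no;
    ```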


    STEP 7: Using the NATURAL JOIN operator


    Join the CLIENT and COURSE_ACTIVITY tables using a NATURAL JOIN.



    • Write and execute the SQL statement that produces the client number and name, course code and grade that the client got in this course.

    • Will the NATURAL JOIN be helpful to Ann? Place your response in the lab report document for this step.
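    A sketch of the NATURAL JOIN form. A NATURAL JOIN implicitly matches on every column name the two tables share (presumably client_no here), so no ON clause is written, and the shared column must not be qualified with a table alias:

    ```sql
    -- Sketch only: column names are assumed.
    SELECT client_no, client_name, course_code, grade
    FROM   client NATURAL JOIN course_activity;
    ```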


    STEP 8: Using the INNER JOIN operator


    Join the CLIENT and COURSE_ACTIVITY tables using an INNER JOIN.
    Write and execute the SQL statement that produces the client number and name, course code and grade that the client got in this course.


    Will the INNER JOIN be helpful to Ann? Place your response in the lab report document for this step.


    Write a conclusion based on the five steps above: which join, if any, should Ann use to populate the spreadsheet that can answer her questions?


    STEP 9: Using the UNION operator 


    Examine the clients and courses in Ann’s tables and the CORP_EXTRACT1 table using the UNION operator.



    • Write and execute the SQL statement that examines client numbers in CLIENT and CORP_EXTRACT1.

    • Write and execute the SQL statement that examines client numbers in COURSE_ACTIVITY and CORP_EXTRACT1.

    • Write and execute the SQL statement that examines course names in COURSE and CORP_EXTRACT1.


    Which of these statements, if any, will be helpful to Ann? Place your response in the lab report document for this step.
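    The first bullet could be sketched as follows (column name client_no is an assumption; the other two bullets follow the same shape). UNION returns the combined values from both queries with duplicates removed, and both SELECT lists must have the same number and types of columns:

    ```sql
    -- Sketch only: column name is assumed.
    SELECT client_no FROM client
    UNION
    SELECT client_no FROM corp_extract1;
    ```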


    STEP 10: Using the UNION ALL operator


    Examine the clients and courses in Ann’s tables and the CORP_EXTRACT1 table using the UNION ALL operator.



    • Write and execute the SQL statement that examines client numbers in CLIENT and CORP_EXTRACT1.

    • Write and execute the SQL statement that examines client numbers in COURSE_ACTIVITY and CORP_EXTRACT1.

    • Write and execute the SQL statement that examines course names in COURSE and CORP_EXTRACT1.


    Which of these statements, if any, will be helpful to Ann? Place your response in the lab report document for this step.


    STEP 11: Using the INTERSECT operator


    Examine the clients and courses in Ann’s tables and the CORP_EXTRACT1 table using the INTERSECT operator.



    • Write and execute the SQL statement that examines client numbers in CLIENT and CORP_EXTRACT1.

    • Write and execute the SQL statement that examines client numbers in COURSE_ACTIVITY and CORP_EXTRACT1.

    • Write and execute the SQL statement that examines course names in COURSE and CORP_EXTRACT1.


    Which of these statements, if any, will be helpful to Ann? Place your response in the lab report document for this step.
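    A sketch of the first bullet (column name assumed, as before). INTERSECT returns only values that appear in BOTH queries, i.e. clients present in CLIENT and also in CORP_EXTRACT1:

    ```sql
    -- Sketch only: column name is assumed.
    SELECT client_no FROM client
    INTERSECT
    SELECT client_no FROM corp_extract1;
    ```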


    STEP 12: Using the MINUS operator


    Examine the clients and courses in Ann’s tables and the CORP_EXTRACT1 table using the MINUS operator.



    • Write and execute the SQL statement that examines client numbers in CLIENT and CORP_EXTRACT1.

    • Write and execute the SQL statement that examines client numbers in COURSE_ACTIVITY and CORP_EXTRACT1.

    • Write and execute the SQL statement that examines course names in COURSE and CORP_EXTRACT1.


    Which of these statements, if any, will be helpful to Ann? Place your response in the lab report document for this step.
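    A sketch of the first bullet (column name assumed). MINUS returns values from the first query that do NOT appear in the second, i.e. clients in CLIENT that are missing from CORP_EXTRACT1:

    ```sql
    -- Sketch only: column name is assumed.
    SELECT client_no FROM client
    MINUS
    SELECT client_no FROM corp_extract1;
    ```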


    STEP 13: Using subqueries


    Examine the clients and courses in Ann’s tables and the CORP_EXTRACT1 table using a subquery with the NOT IN operator.



    • Write and execute the SQL statement that examines client numbers in CLIENT and CORP_EXTRACT1.

    • Write and execute the SQL statement that examines client numbers in COURSE_ACTIVITY and CORP_EXTRACT1.

    • Write and execute the SQL statement that examines course names in COURSE and CORP_EXTRACT1.


    Which of these statements, if any, will be helpful to Ann? Place your response in the lab report document for this step.
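    A sketch of the first bullet (column names assumed). The subquery form answers the same "in A but not in B" question as MINUS, but it also lets you select extra columns such as the client name:

    ```sql
    -- Sketch only: column names are assumed.
    SELECT client_no, client_name
    FROM   client
    WHERE  client_no NOT IN (SELECT client_no FROM corp_extract1);
    ```

    One caveat worth noting in your report: NOT IN returns no rows at all if the subquery produces a NULL value.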


    Deliverables
        
    What is Due


    Submit your spooled lab file with the queries and results sets along with the completed Lab 1 Report to the Dropbox. Your report should contain copies of each query and result set outlined in the lab along with the requested explanation of whether or not it satisfied the business requirement outlined for that particular section of the lab.

  2. DBM 449 Lab 2 Sql File

    DBM 449 lab 2 OEM Query optimization

    Regular Price: $20.00

    Special Price: $15.00

    In this lab we will focus on several common performance tuning issues that one might encounter while working with a database.  You will need to refer to both your textbook and this week's lecture material for examples and direction to complete this lab.
    To record your work for this lab, use the LAB2_Report.doc found in Doc Sharing. As in your first lab, you will need to copy/paste your SQL statements and results from SQL*Plus into this document. This will be the main document you submit to the Dropbox for Week 2.


    L A B   S T E P S   
    STEP 1: Examine Query Optimization using OEM


    Oracle Enterprise Manager (OEM) provides a graphical tool for query optimization.  The tables that you will be using in this lab are the same ones that were created in the first lab in the DBM449_USER schema.



    1. Start OEM via Citrix iLab. If you need help or instructions on how to do this you can refer to the How_to_use_OEM_in_Citrix iLab.pdf file associated with this link.

    2. Select the Database Tools icon from the vertical toolbar and select the SQL Scratchpad icon from the expanded toolbar. If you need help or instructions on how to do this, you can refer to the Executing_and_Analyzing_Queries_in_OEM.pdf file associated with this link.

    3. Write a SQL statement to query all data from table COURSE (you will need to connect as the DBM449_USER). Click on Execute. Take a screen shot that shows the results and paste that into the lab document.

    4. Click on Explain Plan. Take a screen shot of the results and paste it into the lab document.

    5. Write a comment on how this query is executed.

    6. Write a SQL statement to query the course_name, client_name and grade from the COURSE, COURSE_ACTIVITY and CLIENT tables and order the results by course name, and within the same course by client name.

    7. Click on Explain Plan. Take a screen shot of the results and paste it into the lab document.

    8. Exit out of OEM at this point.

    9. Write a comment on how this query is executed.


    STEP 2: Examine Query Optimization using SQL*Plus


    In this portion of the lab we are going to use SQL*Plus to replicate what we did in Step one using OEM.  At the end of this part of the lab you will be asked to compare the results between the processes.



    1. Before you can analyze an SQL statement in SQL*Plus you first need to create a Plan Table that will hold the results of your analysis.  To do this you will need to download the UTLXPLAN.SQL file associated with this link and run this script in an SQL*Plus session while logged in as the DBM449_USER user.  Once the script has completed then execute a DESC command on the PLAN_TABLE.

    2. Again you are going to write a SQL statement to query all data from table COURSE.  Remember to make the modifications to the query so that it will utilize the plan table that you just created.

    3. Now write the query that will create a results table similar to the one below by using the DBMS_XPLAN procedure.


    PLAN_TABLE_OUTPUT
    Plan hash value: 1263998123

    | Id | Operation         | Name   | Rows | Bytes | Cost (%CPU) | Time     |
    |----|-------------------|--------|------|-------|-------------|----------|
    |  0 | SELECT STATEMENT  |        |    5 |   345 |     3   (0) | 00:00:01 |
    |  1 | TABLE ACCESS FULL | COURSE |    5 |   345 |     3   (0) | 00:00:01 |

    Note
    - dynamic sampling used for this statement
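    The analyze-then-display sequence might look like this in SQL*Plus (a sketch: EXPLAIN PLAN FOR populates the plan table without running the query, and DBMS_XPLAN.DISPLAY formats the result):

    ```sql
    -- Record the optimizer's plan in PLAN_TABLE (no data is returned).
    EXPLAIN PLAN FOR
    SELECT * FROM course;

    -- Render the plan, producing output like the sample above.
    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- Clearing PLAN_TABLE between statements keeps each analysis clean.
    DELETE FROM plan_table;
    ```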



    1. Now execute the second query you used in Step 1 and then show the results in the plan table for that query.  HINT: Before you run your second query you will need to delete the contents of the plan table so that you will get a clean analysis.

    2. Write a short paragraph comparing the output from OEM to the output from the EXPLAIN PLAN process you just ran.  Be sure to copy/paste all of the queries and results set from this step into the lab report section for this step.


    STEP 3: Dealing With Chained Rows


    In this portion of the lab we are going to create a new table and then manipulate some data to generate a series of chained rows within the table.  After you have generated this problem, we are going to go through the process of correcting it and tuning the table so that the chained rows are gone.  The process is a little tricky and is going to require you to think through your approach to some of the SQL.  Remember that every table has a hidden column named ROWID that is created implicitly by the system when the table is created.  This column can be queried just like any other column.  You will need this information in step 6 of this part of the lab.



    1. For this part of the lab you will need to create a new user named GEORGE.  You can determine your own password, but make sure that the default tablespace is USERS and the temporary tablespace is TEMP.  Grant both the CONNECT and RESOURCE roles to the new user and then log in to create a session for the new user GEORGE.

    2. Once logged in as the new user, write the SQL to create a new table using the column information and storage parameters listed below.  NOTE: the parameters have been chosen intentionally, so please do not change them.


    Table name: NEWTAB
    Columns: Prod_id       NUMBER
         Prod_desc VARCHAR2(30)
         List_price NUMBER(10,2)
         Date_last_upd DATE


    Tablespace:    USERS
         PCTFREE    10
         PCTUSED    90
         Initial and Next extents:    1K
         MinExtents    1
         MaxExtents    121
         PCTINCREASE    0
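    Put together, the specification above might translate into something like this (a sketch; verify each clause against the listed parameters before running it):

    ```sql
    CREATE TABLE newtab (
      prod_id        NUMBER,
      prod_desc      VARCHAR2(30),
      list_price     NUMBER(10,2),
      date_last_upd  DATE
    )
    TABLESPACE users
    PCTFREE  10
    PCTUSED  90
    STORAGE (INITIAL     1K
             NEXT        1K
             MINEXTENTS  1
             MAXEXTENTS  121
             PCTINCREASE 0);
    ```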



    1. Next, you will need to download both the UTLCHAIN.SQL and LAB2_FILL_NEWTAB.SQL scripts from the links shown.  First run the UTLCHAIN script in your SQL*Plus session and then run the LAB2_FILL_NEWTAB script.  Be sure that you run them in the order just described.

    2. Now execute the ANALYZE command on the NEWTAB table to gather any chained rows.  HINT: refer back to this week's lecture material and your textbook.

    3. Write and execute the query that will list the owner_name, table_name and head_rowid columns from the CHAINED_ROWS table.  You will have approximately 200+ rows in your result set so please do not copy/paste all of them into the lab report.  You only need the first 10 or 15 rows as a representation of what was returned.

    4. Now you need to go through the steps of getting rid of all the chained rows using these steps.  


      • You can create your temporary table to hold the chained rows of the NEWTAB table as a select statement based on the existing table.  HINT: CREATE TABLE NEWTAB_TEMP AS SELECT * FROM NEWTAB....  You want to be sure that you only pull data from the existing table that matches the data in the CHAINED_ROWS table.  To ensure this, you will need a WHERE clause that pulls only those records with a HEAD_ROWID value in the CHAINED_ROWS table matching a ROWID value in the NEWTAB table.

      • Now you need to delete the chained rows from the NEWTAB table.  To accomplish this you will need a subquery that pulls the HEAD_ROWID value from the CHAINED_ROWS table to match against the ROWID value in the NEWTAB table.  The number of rows deleted should be the same as the number that you retrieved in the query for part 5 of this section.

      • Now write an insert statement that will insert all of the rows of data in the temporary table that you created above into the NEWTAB table.  Be sure that you explicitly define the rows that you are pulling data from in the NEWTAB_TEMP table.

      • Next, write and execute the statement that will TRUNCATE the chained_rows table.

      • Now run the same ANALYZE statement you did in step 4 and then the query you did in part 5 above.  This time you should get a return message stating no rows selected.
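    The bullet steps above might be sketched as follows. The CHAINED_ROWS table and its HEAD_ROWID / TABLE_NAME columns come from the standard UTLCHAIN script; treat this as an outline to reason from, not a drop-in answer:

    ```sql
    -- 1. Copy the chained rows into a temporary table.
    CREATE TABLE newtab_temp AS
      SELECT * FROM newtab
      WHERE  rowid IN (SELECT head_rowid FROM chained_rows
                       WHERE  table_name = 'NEWTAB');

    -- 2. Delete those same rows from NEWTAB.
    DELETE FROM newtab
    WHERE  rowid IN (SELECT head_rowid FROM chained_rows
                     WHERE  table_name = 'NEWTAB');

    -- 3. Re-insert them; freshly inserted rows are no longer chained.
    INSERT INTO newtab SELECT * FROM newtab_temp;

    -- 4. Empty the tracking table, then re-analyze to confirm
    --    "no rows selected".
    TRUNCATE TABLE chained_rows;
    ANALYZE TABLE newtab LIST CHAINED ROWS INTO chained_rows;
    ```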



    Be sure that you copy/paste all of the above SQL code and returned results sets and messages into the appropriate place in the Lab Report for this week.


    Deliverables     

    What is Due
    Submit your completed Lab 2 Report to the Dropbox as stated below.  Your report should contain copies of each query and result set outlined in the lab along with the requested explanation of whether or not it satisfied the business requirement outlined for that particular section of the lab.
     

  3. DBM 449 Lab 3 Sql File

    DBM 449 Lab 3 Distributed Database

    Regular Price: $20.00

    Special Price: $15.00

    L A B O V E R V I E W    


    Scenario/Summary
    To the end user, working with databases distributed throughout a company's network is no different than working with multiple tables within a single database. The fact that the different databases exist in other locations should be totally transparent to the user. For this lab we are going to take on the role of a database administrator in a company that has three regional offices in the country. You work in the central regional office, but there is also a West Coast Region located in Seattle and an East Coast Region located in Miami. Your role is to gather report information from the other two regions.


    For this lab you are going to work with three different databases. You already have your own database instance. You will also be working with a database named SEATTLE representing the West Coast Region and a database named MIAMI representing the East Coast Region. Login information for these two additional database instances is as follows:


    SEATTLE: Userid - seattle_user
    Password - seattle
    Host String - seattle


    MIAMI: Userid - miami_user
    Password - miami
    Host String - miami


    To record your work for this lab, use the LAB3_Report.doc found in Doc Sharing. As in your previous labs, you will need to copy/paste your SQL statements and results from SQL*Plus into this document. This will be the main document you submit to the Dropbox for Week 3.


    L A B S T E P S    
    STEP 1: Setting up Your Environment



    1. Be sure you are connected to the DBM449_USER schema that was created in lab 1. 

    2. To begin this lab you will need to download the LAB3_DEPTS.SQL script file associated with the link and run the script in the DBM449_USER schema of your database instance. This script contains a single table that you will be using to help pull data from each of the other two database instances.  Notice that the DEPTNO column in this table is the PRIMARY KEY column and can be used to reference or link to the DEPTNO column in the other two database employee tables.

    3. Now you need to create a couple of private database links that will allow you to connect to your other two regional databases. To accomplish this use the connection information listed above in the Lab Overview section. Name your links using your database instance name together with the region name as the name for the link. Separate the two with an underscore (example - DB1000_SEATTLE).

    4. After creating both of your database links, query the USER_DB_LINKS view in the data dictionary to retrieve information about your database links.  The output from your query should look similar to what you see below.  You will need to set your linesize to 132 and format the DB_LINK and HOST columns to be only 25 bytes wide to get the same format that you see.


    DB_LINK                   USERNAME                       HOST                      CREATED
    ------------------------- ------------------------------ ------------------------- ---------
    DB1000_MIAMI              MIAMI_USER                     miami                     09-DEC-08
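    Creating one of the links might look like this (a sketch using the credentials listed in the Lab Overview; DB1000 is a placeholder for your own instance name, per the naming convention in step 3):

    ```sql
    -- DB1000 is a placeholder; substitute your assigned instance name.
    CREATE DATABASE LINK db1000_seattle
      CONNECT TO seattle_user IDENTIFIED BY seattle
      USING 'seattle';
    ```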


    STEP 2: Testing your Database Links
    Each of your remote databases has an employee data table. The tables are named SEATTLE_EMP and MIAMI_EMP, respective to the database they are in. Using the appropriate database link, query each of the two tables to retrieve the employee number, name, job function, and salary. (HINT: you can issue a DESC command on each of the distributed tables to find out the actual column names, just like you would for a table in your own instance.)
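    Querying through a link simply appends @link_name to the table name. The column names below (empno, ename, job, sal) are guesses at classic EMP-style names, and db1000 is a placeholder instance name; use DESC to confirm before submitting:

    ```sql
    -- Sketch only: column names and link name are assumptions.
    SELECT empno, ename, job, sal
    FROM   seattle_emp@db1000_seattle;
    ```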


    STEP 3: Connecting Data in the Seattle Database
    Write a query that will retrieve all employees from the Seattle region who are salespeople working in the marketing department. Show the employee number, name, job function, salary, and department name (HINT: The department name is in the DEPT table) in the result set.


    STEP 4: Connecting Data in the Miami Database
    Write a query that will retrieve all employees from the Miami region who work in the accounting department. Show the employee number, name, job function, salary, and department name (HINT: The department name is in the DEPT table) in the results set.


    STEP 5: Connecting Data in all Three Databases
    Now we need to expand our report. Write a query that will retrieve employees from both the Seattle and Miami regions who work in sales. Show the employee number, employee name, job function, salary, and location name in the result set (HINT: The location name is in the DEPT table).


    STEP 6: Improving Data Retrieval from all Three Databases
    Writing queries like the ones above can be fairly cumbersome. It would be much better to be able to pull this type of data as though it was coming from a single table, and in fact this can be done by creating a view.



    1. Using the query written above as a guide, write and execute the SQL statement that will create a view showing all employees in both the Seattle and Miami regions (you can use your own naming convention for the view name). Show the employee number, name, job, salary, commission, department number, and location name for each employee (HINT: The location name is in the DEPT table).

    2. Now write a query that will retrieve all the data from the view just created.
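    A sketch of such a view, assuming classic EMP/DEPT column names (empno, ename, job, sal, comm, deptno, loc) and hypothetical link names; adjust every name to your own schema and links:

    ```sql
    -- Sketch only: view name, column names, and link names are assumptions.
    CREATE OR REPLACE VIEW all_region_emp AS
    SELECT e.empno, e.ename, e.job, e.sal, e.comm, e.deptno, d.loc
    FROM   seattle_emp@db1000_seattle e
           JOIN dept d ON d.deptno = e.deptno
    UNION ALL
    SELECT e.empno, e.ename, e.job, e.sal, e.comm, e.deptno, d.loc
    FROM   miami_emp@db1000_miami e
           JOIN dept d ON d.deptno = e.deptno;

    -- The whole report then becomes a single-table query:
    SELECT * FROM all_region_emp;
    ```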


    Deliverables
    Submit your completed Lab 3 Report to the Dropbox. Your report should contain copies of each query and result set outlined in the lab along with the requested explanation of whether or not it satisfied the business requirement outlined for that particular section of the lab.

  4. MIS 562 Week 5 Homework Query Optimization

    MIS 562 Week 5 Homework Query Optimization

    Regular Price: $20.00

    Special Price: $15.00

    MIS 562 Week 5 Homework Query Optimization


    Using the student schema from week 2, provide answers to the following questions.


    Question
    SQL statement or Answer
    1. Generate statistics for the student, enrollment, grade, and zipcode tables (15 pts)


    2. Write a query that performs a join, a subquery, a correlated subquery using the student, enrollment, grade, and zipcode tables. Execute each query to show that it produces the same results. (15 pts)


    3. Produce an autotrace output and explain plan for each query. (10 pts)


    4. Analyze the results and state which performs best and why. Write an analysis of what operations are being performed for each query. Determine which query is the most efficient and explain why (10 pts)
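    For question 1, statistics are typically gathered with the DBMS_STATS package (a sketch; passing USER as the owner assumes the tables live in your own schema):

    ```sql
    BEGIN
      DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'STUDENT');
      DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'ENROLLMENT');
      DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'GRADE');
      DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'ZIPCODE');
    END;
    /
    ```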

  5. Oracle Database to Track Students Grade ERD

    Oracle Database to Track Students Grade

    Regular Price: $15.00

    Special Price: $12.00

    Oracle Database to Track Students Grade


    Build an Oracle database to track student grades in a class:
    The database tracks
    *Student Information
    *Instructor Information
    *Class information
    *Grading Breakdown
    *Students' Grades


    Different queries will show
    *List of students with semester grades
    *List of students who received an A
    *Average grades
    *Above average grades

  6. MIS 562 Week 7 Homework

    MIS 562 Week 7 Homework Roles and Privileges

    Regular Price: $15.00

    Special Price: $12.00

    MIS 562 Week 7 Homework Roles and Privileges


    Part 1
    Using the following Data Dictionary views write the statements that will perform the following actions. Be sure to test your statements. (Do not use SELECT *)
    ROLE_ROLE_PRIVS
    ROLE_SYS_PRIVS
    ROLE_TAB_PRIVS
    USER_ROLE_PRIVS
    USER_SYS_PRIVS
    USER_TAB_PRIVS


    Question SQL statement or Answer
    1. Determine what privileges your account has been granted through a role. (10 points)


    2. Determine what system privileges your account has been granted. (10 points)


    3. Execute the following statement, then determine what table privileges your account has been granted. (15 points)
    Grant select on student to public;


    4. Determine what system privileges the DVONLINE role has. (10 points)

    5. Analyze the following query and write a description of the output it produces. (15 points)
    SELECT COUNT(DECODE(SIGN(total_capacity-20), -1, 1, 0, 1)) "<=20",
           COUNT(DECODE(SIGN(total_capacity-21), 0, 1, -1, NULL,
                 DECODE(SIGN(total_capacity-30), -1, 1))) "21-30",
           COUNT(DECODE(SIGN(total_capacity-30), 1, 1)) "31+"
    FROM  (SELECT SUM(capacity) total_capacity, course_no
           FROM   section
           GROUP  BY course_no)


    6. Determine the top three zip codes where most of the students live. Use an analytical function. The query will produce 10 rows. (10 points)
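    One shape such a query can take (a sketch; the zip column and student table are assumptions based on the standard student schema, so verify them). Note that RANK() counts ties, which is how the "top three" ranks can return ten rows:

    ```sql
    -- Sketch only: table and column names are assumptions.
    SELECT zip, student_count
    FROM  (SELECT zip,
                  COUNT(*) AS student_count,
                  RANK() OVER (ORDER BY COUNT(*) DESC) AS rnk
           FROM   student
           GROUP  BY zip)
    WHERE  rnk <= 3;
    ```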



    Part 2
    Open the file from Doc Sharing called utlpwdmg.sql and analyze the code in it. Write a paragraph that describes what the function performs. What are the input parameters and the output parameter, and what does the function do? (30 points)

  7. MSCD640 Assignment 1

    MSCD640 Assignment 1

    Regular Price: $60.00

    Special Price: $50.00

    MSCD640 Assignment 1


    Part 2: Hands-on exercise


    You’ve just been through an eight-week intensive training for database administration. You are now a support DBA and have been assigned to handle tickets for a database environment. Handle each of the trouble tickets below. Capture all of your work as either screen shots or spooled output. You’ll need to hand this output in to the shift manager at the end of your shift to verify that you successfully handled each ticket.


    ---------------------------------------------------------------------------------
    Ticket #1: Production DBA has reported that the database you’re assigned to is not in archive log mode. You need to enable archive logging in your assigned database.



    Verify:
    SQL> archive log list;
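    Enabling archive log mode requires bouncing the instance to the MOUNT state first (a sketch; run connected as SYSDBA):

    ```sql
    SHUTDOWN IMMEDIATE
    STARTUP MOUNT
    ALTER DATABASE ARCHIVELOG;
    ALTER DATABASE OPEN;
    ARCHIVE LOG LIST;   -- should now report Archive Mode
    ```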
    ---------------------------------------------------------------------------------
    Ticket #2: User has reported that they’re receiving an open cursors error from the application. You need to verify the database initialization setting of open_cursors and double the setting (for example, if it’s 50, change it to 100).



    Verify:
    SQL> show parameter open_cursors;
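    A sketch of the check-then-double sequence (SCOPE = BOTH assumes the instance was started from an spfile; the value 100 is just the ticket's 50-to-100 example):

    ```sql
    SHOW PARAMETER open_cursors                       -- note the current value
    ALTER SYSTEM SET open_cursors = 100 SCOPE = BOTH; -- double it (e.g. 50 -> 100)
    ```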
    ---------------------------------------------------------------------------------
    Ticket #3: Production DBA notices abnormal wait times related to the online redo logs. The DBA recommends that you immediately add one online redo log to your database. Make sure you size the new online redo log the same size as the current logs.



    Verify:
    SQL> select group#, member from v$logfile;
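    A sketch of the ticket; the group number, file path, and size are placeholders, and the SIZE must match what v$log reports for the existing groups:

    ```sql
    SELECT group#, bytes FROM v$log;   -- find the current log size first

    -- Group number, path, and size below are placeholders.
    ALTER DATABASE ADD LOGFILE GROUP 4
      ('/u01/oradata/db1000/redo04.log') SIZE 50M;
    ```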
    ---------------------------------------------------------------------------------
    Ticket #4: Development DBA has filed a ticket asking for an additional tablespace. Create a tablespace named AP_DATA and size its datafile at 20M.
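    A sketch (the datafile path is an assumption; use a location valid on your server):

    CREATE TABLESPACE ap_data
      DATAFILE '/u01/oradata/ap_data01.dbf' SIZE 20M;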



    Verify:
    SQL> select tablespace_name from dba_tablespaces;
    ---------------------------------------------------------------------------------
    Ticket #5: Development team has filed a ticket asking for a new user account. Create a user named AP_MGMT with the password of f00b0r and assign it the following:
    • Assign it the default tablespace of AP_DATA
    • Assign it the temporary tablespace of TEMP
    • Grant it connect, create table, and create sequence
    • Alter the user to have a quota of unlimited on the AP_DATA
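    The four bullets above map to three statements, roughly:

    CREATE USER ap_mgmt IDENTIFIED BY f00b0r
      DEFAULT TABLESPACE ap_data
      TEMPORARY TABLESPACE temp;
    GRANT CONNECT, CREATE TABLE, CREATE SEQUENCE TO ap_mgmt;
    ALTER USER ap_mgmt QUOTA UNLIMITED ON ap_data;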


    Verify:
    SQL> connect ap_mgmt/f00b0r
    SQL> select * from user_users;
    ---------------------------------------------------------------------------------
    Ticket #6: Development team is requesting that you create two tables in the AP_MGMT account: EMP and DEPT. Here are the requirements.


    • The DEPT table needs 2 columns: dept_id, dept_name
    • The dept_id is a number.
    • The dept_id is the primary key.
    • The dept_name is varchar2(30)
    • The dept_name needs a check constraint that limits it to the following values (‘HR’, ‘IT’, ‘SECURITY’)


    • The EMP table needs 3 columns: emp_id, emp_name, dept_id
    • The emp_id column is a number.
    • The emp_id column is the primary key.
    • The emp_name column is varchar2(30)
    • The dept_id column is a number.
    • The EMP(dept_id) column needs a foreign key constraint defined that references the DEPT(dept_id) parent table.
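    One way to express these requirements (the constraint names are my own choices):

    CREATE TABLE dept (
      dept_id   NUMBER CONSTRAINT dept_pk PRIMARY KEY,
      dept_name VARCHAR2(30)
        CONSTRAINT dept_name_ck CHECK (dept_name IN ('HR', 'IT', 'SECURITY'))
    );

    CREATE TABLE emp (
      emp_id   NUMBER CONSTRAINT emp_pk PRIMARY KEY,
      emp_name VARCHAR2(30),
      dept_id  NUMBER CONSTRAINT emp_dept_fk REFERENCES dept (dept_id)
    );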


    Verify:
    SQL> desc emp;
    SQL> desc dept;
    ---------------------------------------------------------------------------------
    Ticket #7: The application team needs you to seed the EMP and DEPT tables with data:


    insert into dept (dept_id, dept_name) values (1, 'HR');
    insert into dept (dept_id, dept_name) values (2, 'IT');
    insert into dept (dept_id, dept_name) values (3, 'SECURITY');
    insert into dept (dept_id, dept_name) values (4, 'WAREHOUSE');


    insert into emp (emp_id, emp_name, dept_id) values (50, 'GEORGE', 2);
    insert into emp (emp_id, emp_name, dept_id) values (20, 'JANE', 1);
    insert into emp (emp_id, emp_name, dept_id) values (30, 'JOHN', 3);


    Verify:
    SQL> select * from emp;
    SQL> select * from dept;
    ---------------------------------------------------------------------------------
    Ticket #8: After the EMP and DEPT tables have been created and seeded (prior two steps), an end user is reporting strange locking issues with the DEPT and EMP tables. You need to run a script that validates whether or not the EMP table has an index created on the foreign key column.
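    A sketch of such a validation script (unindexed foreign keys are a classic cause of blocking locks):

    -- foreign-key columns on EMP that have no index
    SELECT cc.column_name
    FROM   user_cons_columns cc
    JOIN   user_constraints  c ON c.constraint_name = cc.constraint_name
    WHERE  c.table_name      = 'EMP'
    AND    c.constraint_type = 'R'
    AND    cc.column_name NOT IN
           (SELECT column_name FROM user_ind_columns WHERE table_name = 'EMP');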


    ---------------------------------------------------------------------------------
    Ticket #9: Development team is reporting that they now require an index be added to the EMP table on the dept_id column. Make its tablespace AP_DATA.
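    A sketch (the index name is my own choice):

    CREATE INDEX emp_dept_id_idx ON emp (dept_id) TABLESPACE ap_data;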


    Verify:
    SQL> select index_name, column_name from user_ind_columns;
    ---------------------------------------------------------------------------------
    Ticket #10: The storage manager is concerned about disk space. The manager wants a report showing how much space all of the tablespaces in your database are consuming. The manager would like to see space free and space consumed in the report.
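    One possible report query (requires access to the DBA views; sizes shown in MB):

    SELECT df.tablespace_name,
           ROUND(df.total_mb - NVL(fs.free_mb, 0)) AS used_mb,
           ROUND(NVL(fs.free_mb, 0))               AS free_mb
    FROM  (SELECT tablespace_name, SUM(bytes)/1024/1024 AS total_mb
           FROM   dba_data_files GROUP BY tablespace_name) df
    LEFT JOIN
          (SELECT tablespace_name, SUM(bytes)/1024/1024 AS free_mb
           FROM   dba_free_space GROUP BY tablespace_name) fs
    ON df.tablespace_name = fs.tablespace_name;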



    ---------------------------------------------------------------------------------
    Ticket #11: The security department is reporting a massive security breach!! They are requesting that you lock all users in your database except for SYS.
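    One common approach is to generate the ALTER USER statements with SQL*Plus and then run the generated script (the spool file name is my own choice; you may also want to leave SYSTEM unlocked):

    SET PAGESIZE 0 FEEDBACK OFF
    SPOOL lock_users.sql
    SELECT 'ALTER USER ' || username || ' ACCOUNT LOCK;'
    FROM   dba_users
    WHERE  username <> 'SYS';
    SPOOL OFF
    @lock_users.sql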



    ---------------------------------------------------------------------------------
    Ticket #12: The application team wants a report showing the space used for each table and index in the AP_MGMT account:
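    A sketch of the report (requires access to DBA_SEGMENTS; sizes shown in MB):

    SELECT segment_name, segment_type, ROUND(bytes/1024/1024, 2) AS mb
    FROM   dba_segments
    WHERE  owner = 'AP_MGMT'
    AND    segment_type IN ('TABLE', 'INDEX')
    ORDER BY segment_type, segment_name;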



    ---------------------------------------------------------------------------------
    Ticket #13: The production DBAs have requested that archive logging be disabled for your database. Disable archive logging in your database:
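    A sketch (the reverse of enabling archive logging; again requires a restart to the MOUNT state as SYSDBA):

    SHUTDOWN IMMEDIATE;
    STARTUP MOUNT;
    ALTER DATABASE NOARCHIVELOG;
    ALTER DATABASE OPEN;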



    Verify:
    SQL> archive log list;
    ---------------------------------------------------------------------------------
    Ticket #14: With a bit of sadness, you realize your shift will end soon. Before you leave, your team leader would like you to write a few sentences indicating whether or not your training prepared you adequately for the job. Do you have any recommendations for your team leader?


     

    Learn More
  8. CMIS420 PROJECT 2 Mail-Order Database DML and DDL statements

    CMIS420 Advanced Relational Database PROJECT 2 Mail-Order Database

    Regular Price: $25.00

    Special Price: $20.00

    CMIS420 Advanced Relational Database PROJECT 2 Mail-Order Database


    Overview:
    Use SQL, PL/SQL, and triggers to design and create a Mail-Order Database System. Please create your own data for testing purposes. Use the attached file "Project 2 Tables" as a guide to creating the tables. You should pre-populate the PARTS, CUSTOMERS, EMPLOYEE and ZIPCODES tables.


    Due Date:
    Check the due date in Syllabus for the exact date for this assignment. No project will be accepted after the due date.


    Deliverables:
    Turn in all SQL scripts in the form of SQL script files. The script files should include:


    1. A script file containing all the DML and DDL statements. That is, the SQL used to create the tables and sequence and the SQL to pre-populate or insert records in the tables. Name this file XXX_PROJ2.sql, where XXX are your initials.
    2. A file containing the PL/SQL package (Specification and Body) that provides the functionality listed in the requirements below. Name this file XXX_PROJ2.pkg, where XXX are your initials.
    3. A file containing the database triggers. Name this file XXX_PROJ2.trg, where XXX are your initials.
    4. Finally, provide a test SQL*Plus routine (PL/SQL anonymous block) that will test the PL/SQL functionality developed. Name this file XXX_PROJ2_tst.sql, where XXX are your initials.


    You should submit your assignment through WebTycho as you did for previous assignments.
    Use winzip or any zip software to package the four (4) files into one file called XXX_project2.zip, where XXX are your initials.


    Requirements:
    The Mail-Order Database consists of the following tables and attributes. Please ensure that all constraints are created when creating the tables. All constraints other than NOT NULL constraints must be named.
    1. EMPLOYEE(ENO, ENAME, ZIP, HDATE, CREATION_DATE, CREATED_BY, LAST_UPDATE_DATE, LAST_UPDATED_BY)
    2. PARTS(PNO, PNAME, QOH, PRICE, REORDER_LEVEL, CREATION_DATE, CREATED_BY, LAST_UPDATE_DATE, LAST_UPDATED_BY)
    3. CUSTOMERS(CNO, CNAME, STREET, ZIP, PHONE, CREATION_DATE, CREATED_BY, LAST_UPDATE_DATE, LAST_UPDATED_BY)
    4. ORDERS(ONO, CNO, ENO, RECEIVED, SHIPPED, CREATION_DATE, CREATED_BY, LAST_UPDATE_DATE, LAST_UPDATED_BY)
    5. ODETAILS(ONO, PNO, QTY, CREATION_DATE, CREATED_BY, LAST_UPDATE_DATE, LAST_UPDATED_BY)
    6. ZIPCODES(ZIP, CITY)
    7. ORDERS_ERRORS(TRANSACTION_DATE, ONO, MESSAGE)
    8. ODETAILS_ERRORS(TRANSACTION_DATE, ONO, PNO, MESSAGE)
    9. RESTOCK(TRANSACTION_DATE, PNO)


    • The EMPLOYEE table contains information about the employees of the company. The ENO (Employee Number) attribute is the primary key. The ZIP attribute is a foreign key referring to the ZIPCODES table.
    • The PARTS table keeps a record of the inventory of the company. The record for each part includes its number (PNO) and name (PNAME) as well as the quantity on hand (QOH), the unit price (PRICE) and the reorder level (REORDER_LEVEL). PNO is the primary key for this table.
    • The CUSTOMERS table contains information about the customers of the mail-order company. Each customer is assigned a customer number (CNO), which serves as the primary key. The ZIP attribute is a foreign key referring to the ZIPCODES table.
    • The ORDERS table contains information about the orders placed by customers, the employee who took the orders, and the dates the orders were received and shipped. Order number (ONO) is the primary key. The customer number (CNO) attribute is a foreign key referring to the CUSTOMERS table, and the ENO attribute is a foreign key referring to the EMPLOYEE table.
    • The ODETAILS table contains information about the various parts ordered by the customer within a particular order. The combination of the ONO and PNO attributes forms the primary key. The ONO attribute is a foreign key referring to the ORDERS table, and the PNO attribute is a foreign key referring to the PARTS table.
    • The ZIPCODES table maintains information about the zip codes for various cities. ZIP is the primary key.
    • The ORDERS_ERRORS table contains information about any errors that occurred when an order is processed. Transaction date is the date of the transaction.
    • The ODETAILS_ERRORS table contains information about all errors that occur when processing an order detail. Transaction date is the date of the transaction.
    • The RESTOCK table contains information about all parts (PNO) that are below the reorder level. Transaction date is the date of the transaction.


    1. Write a package called Process_Orders to process customer orders. This package should contain four procedures and a function, namely:
    Add_order. This procedure takes as input customer number, employee number, and received date and tries to insert a new row in the Orders table. If the received date is null, the current date is used. The shipped date is left as null. If any errors occur, an entry is made in the Orders_errors table. A sequence called Order_number_seq should be used to populate the order number (ONO) column.
    Add_order_details. This procedure receives as input an order number, part number, and quantity and attempts to add a row to the Odetails table. If the quantity on hand for the part is less than what is ordered, an error message is sent to the Odetails_errors table. Otherwise, the part is sold by subtracting the quantity ordered from the quantity on hand for this part. A check is also made for the reorder level. If the updated quantity for the part is below the reorder level, an entry is made to the Restock table.
    Ship_order. This procedure takes as input an order number and a shipped date and tries to update the shipped value for the order. If the shipped date is null, the current date is used. If any errors occur, an entry is made in the Orders_errors table.
    Delete_order. This procedure takes as input an order number and tries to delete records from both the Orders and Odetails tables that match this order number. If any errors occur or there is no record that matches this order number, an entry is made in the Orders_errors table.
    Total_emp_sales. This function takes as input an employee number. It computes and returns the total sales for that employee.
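    A possible package specification matching the five subprograms above (parameter names are my own choices; the package body is left to the student):

    CREATE OR REPLACE PACKAGE process_orders AS
      PROCEDURE add_order         (p_cno IN NUMBER, p_eno IN NUMBER,
                                   p_received IN DATE DEFAULT NULL);
      PROCEDURE add_order_details (p_ono IN NUMBER, p_pno IN NUMBER,
                                   p_qty IN NUMBER);
      PROCEDURE ship_order        (p_ono IN NUMBER,
                                   p_shipped IN DATE DEFAULT NULL);
      PROCEDURE delete_order      (p_ono IN NUMBER);
      FUNCTION  total_emp_sales   (p_eno IN NUMBER) RETURN NUMBER;
    END process_orders;
    /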


    2. Create triggers on the PARTS, ORDERS, and ODETAILS tables to populate the CREATION_DATE, CREATED_BY, LAST_UPDATE_DATE, and LAST_UPDATED_BY columns when an insert or update is made. Use SYSDATE and the USER pseudocolumn to populate these columns.
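    One way to write such a trigger, sketched here for PARTS (repeat the pattern for ORDERS and ODETAILS):

    CREATE OR REPLACE TRIGGER parts_audit_trg
    BEFORE INSERT OR UPDATE ON parts
    FOR EACH ROW
    BEGIN
      IF INSERTING THEN
        :NEW.creation_date := SYSDATE;
        :NEW.created_by    := USER;
      END IF;
      :NEW.last_update_date := SYSDATE;
      :NEW.last_updated_by  := USER;
    END;
    /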


    3. Write a trigger that fires when a row in the Orders table is updated or deleted. The trigger should record the affected order records in another table called deleted_orders. The deleted_orders table should also contain a date attribute that keeps track of the date and time the action (update or delete) was performed. This date is quite different from the CREATION_DATE and LAST_UPDATE_DATE columns of the ORDERS table; do not copy those dates to the deleted_orders table. Please include the table creation script for the deleted_orders table in the script file.
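    A sketch of the deleted_orders table and trigger (the column list mirrors the ORDERS key columns; the names are my own choices, so adjust them to your design):

    CREATE TABLE deleted_orders (
      ono         NUMBER,
      cno         NUMBER,
      eno         NUMBER,
      received    DATE,
      shipped     DATE,
      action_date DATE
    );

    CREATE OR REPLACE TRIGGER orders_hist_trg
    BEFORE UPDATE OR DELETE ON orders
    FOR EACH ROW
    BEGIN
      INSERT INTO deleted_orders (ono, cno, eno, received, shipped, action_date)
      VALUES (:OLD.ono, :OLD.cno, :OLD.eno, :OLD.received, :OLD.shipped, SYSDATE);
    END;
    /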


    4. Create a sequence called order_number_seq that will be used to populate the order number (ONO) column.


    5. Write a PL/SQL anonymous block to test the above.

    Learn More
  9. CMIS 420 homework 1 spool file

    CMIS 420 Homework 1 Online Vehicle Sales Database

    Regular Price: $15.00

    Special Price: $12.00

    CMIS 420 Homework 1 Online Vehicle Sales Database


    You are part of a development team with Ace Software, Inc., which has recently been contracted to develop various database capabilities for Online Vehicle Sales, Inc. (OVS). OVS is a startup "dotcom" with about 10 dealership locations in Maryland, Virginia and Washington, D.C. They sell new and used cars (compacts, midsizes and full-sizes), sport utility vehicles (SUVs) and light trucks. Currently their business is based on customers visiting one of the 10 dealership locations in person, but soon they plan to move the bulk of their business to the Internet. Initially they have expressed a desire to have a custom OLTP database and a custom DSS database designed and built by your company. Each dealership has a staff of salespersons who assist customers in the purchase of different types of vehicles, for which various financing plans are available. New and used vehicles are provided to each dealership based on sales and inventory needs.
    An ERD for a 3NF normalized online transaction processing (OLTP) relational database for this application is provided.
    Using an SQL script file, create Oracle tables for the entities shown in the ERD. Use the plural form of the entity name for your table names.
    You can create your database on Nova or any other Oracle system you wish.
    Populate the VEHICLES and CUSTOMERS tables in your OLTP database with at least five rows each. Your DEALERSHIPS, SALESPERSONS and FINANCING_PLANS tables should have at least three rows each. Your SALES table should have at least 10 rows, using a variety of different customers, vehicles, salespersons, dealerships and financing plans. All other tables should have at least one row each. Run SELECT * statements on all your tables after they are populated to show all contents.
    Submit the following in either an SQL*Plus SPOOL file or screen snapshots of the output if using iSQL*Plus:
    1) The names of all your tables from the output of the SQL statement "SELECT table_name FROM user_tables".
    2) A DESC (i.e. DESCRIBE) of all tables to show the column names.
    3) The contents of all tables from SELECT * statements
    4) Do NOT submit your SQL script files. Only submit the output specified in Steps #1 through #3 above.

    Learn More
  10. CMIS 320 Lab 4 Homework 4 Choose Oracle datatypes

    CMIS 320 Lab 4 Homework 4 Choose Oracle datatypes

    Regular Price: $8.00

    Special Price: $5.00

    CMIS 320 Lab 4 Homework 4 Choose Oracle datatypes


    Instructions
    Choose Oracle datatypes for the following attributes from a normalized relation including:
    Vendor(VendorID, Address, ContactName)
    Item (ItemID, Description)
    PriceQuote(VendorID, ItemID, Price)
    Describe why you selected the datatypes for each attribute.
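    One reasonable set of choices (the precisions and lengths below are assumptions you should justify in your write-up):

    CREATE TABLE vendor (
      vendorid    NUMBER(10)    PRIMARY KEY,   -- surrogate key, whole numbers
      address     VARCHAR2(100),               -- variable-length text
      contactname VARCHAR2(60)
    );
    CREATE TABLE item (
      itemid      NUMBER(10)    PRIMARY KEY,
      description VARCHAR2(200)
    );
    CREATE TABLE pricequote (
      vendorid NUMBER(10) REFERENCES vendor (vendorid),
      itemid   NUMBER(10) REFERENCES item (itemid),
      price    NUMBER(10,2),                   -- fixed-point currency value
      PRIMARY KEY (vendorid, itemid)
    );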

    Learn More
